CN115729392A - Device interaction method, electronic device, and system

Device interaction method, electronic device, and system

Info

Publication number
CN115729392A
CN115729392A (application CN202210270366.5A)
Authority
CN
China
Prior art keywords
service
control
user
interface
scenario
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210270366.5A
Other languages
Chinese (zh)
Inventor
Chen Shaojun (陈绍君)
Hu Jianrong (胡建荣)
Peng Feng (彭峰)
Xiong Zhangliang (熊张亮)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to PCT/CN2022/115212 (published as WO2023030196A1)
Publication of CN115729392A

Classifications

    • G PHYSICS → G06 COMPUTING; CALCULATING OR COUNTING → G06F ELECTRIC DIGITAL DATA PROCESSING → G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer → G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 GUI techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0484 GUI techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486 Drag-and-drop
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a device interaction method, an electronic device, and a system. The application provides a unified entry for interaction between a core device and multiple associated devices: when multiple devices initiate a connection, the super terminal system presents one or more scenario service options to the user, where different scenario service options are entries to the corresponding services in different application scenarios. In addition, different atomic service/function combinations, companion-device recommendations, and the like can be suggested to the user. By implementing the method provided by the application, the user can quickly select a scenario service and/or an atomic service according to personal needs; once the user selects a device combination, the scenario services and/or atomic services available under that combination are offered in a targeted manner, detailed and rich usage scenarios are listed for the user, and the user is guided to use the super terminal's functions and fully explore more application scenarios, achieving the effect of recommending scenario services according to the device combination.

Description

Device interaction method, electronic device, and system
The present application claims priority to Chinese patent application No. 202111016484.5, entitled "A device interaction method, electronic device and system", filed with the China National Intellectual Property Administration on August 31, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a device interaction method, an electronic device, and a system.
Background
With the ongoing upgrade in consumption, the number of smart terminal devices owned by a user keeps growing. In some usage scenarios, a user often needs two or more terminal devices to work together, or cooperate, to provide certain services and implement certain functions; these may also be referred to as cross-device services. For example, a mobile phone and a smart screen cooperate to project video onto the screen, or a mobile phone shares files, pictures, and the like with other devices.
Generally, the same device vendor or the same application vendor may set different entries and different trigger modes for different cross-device services, while different device vendors or different application vendors may set different entries and different trigger modes for the same cross-device service. For example, suppose a user needs to share a picture on electronic device A with electronic device B. The user selects the picture to be shared on electronic device A and taps the share control; electronic device A then starts to search for surrounding devices and displays a device list; the user selects electronic device B in the list, and electronic device A sends the selected picture to electronic device B. For another example, suppose a user needs to project the screen content of electronic device A onto electronic device B. The user must enable the wireless functions of both devices, pair them, and establish a connection; the user then opens a screen-projection application on electronic device A and operates the projection control in the application interface, after which electronic device A sends projection data to electronic device B, and electronic device B displays the screen image of electronic device A.
Consequently, when multiple devices are made to work cooperatively, user operations are complex, service entries are scattered, human-computer interaction performance is poor, and the user experience suffers.
Disclosure of Invention
The application provides a device interaction method, electronic devices, and a system. The device interaction method provides a unified entry for interaction between a core device and multiple associated devices: when the devices initiate a connection, the super terminal system presents one or more scenario service options to the user, where different scenario service options are entries to the corresponding services in different application scenarios. In addition, different atomic service/function combinations, companion devices, and the like can be recommended to the user.
The above and other objects are achieved by the features of the independent claims. Further implementations are presented in the dependent claims, the description and the drawings.
In a first aspect, an embodiment of the present application provides a device interaction method. The method includes: the first device displays a first interface including a first control and a second control, where the first control indicates a device option of the first device, the second control indicates a device option of a second device, and the second device is a device discovered by the first device. The first device detects a first user operation for associating the first device with the second device. The first device displays a second interface in which service options of a first scenario service are displayed, the first scenario service being a scenario service cooperatively supported by the first device and the second device associated by the first user operation. The first device detects a second user operation selecting the first scenario service. The first device then instructs the first scenario service selected by the second user operation to run.
By implementing the method of the first aspect, human-computer interaction performance can be improved: the user can quickly select a scenario service and/or an atomic service according to personal needs; when the user selects a device combination, the scenario services and/or atomic services available under that combination are offered in a targeted manner; detailed and rich usage scenarios are listed for the user; and the user is guided to use the super terminal's functions and fully explore more application scenarios, achieving the effect of recommending scenario services according to the device combination.
With reference to the first aspect, in some embodiments, in the first interface the second controls are distributed on a circle centered on the first control, and the first control and the second controls are in a separated state. Referring to the example shown in fig. 5, the first device is a core device, such as a mobile phone, and the second device is an associated device, which may be a smart watch, a smart television, or the like.
In conjunction with the first aspect, in some embodiments, the positional relationship of the first control to the second control in the first interface may indicate one or more of the following relationships between the first device and the second device: a connection relationship, an orientation relationship, a distance relationship, or a signal strength relationship. A separated first control and second control indicate that the first device is not connected to the second device; a joined first control and second control indicate that the two devices are connected. And/or the second control being to the left of, to the right of, above, or below the first control indicates that the second device is to the left of, to the right of, in front of, or behind the first device, respectively. And/or a greater distance between the first control and the second control indicates a greater distance, or a weaker signal, between the first device and the second device, while a smaller distance between the controls indicates a smaller distance, or a stronger signal, between the devices.
The device connection state diagram in the first interface indicates each device vividly and concisely and shows the relationship between the core device and the associated devices, which makes user operations convenient, keeps the interface understandable, makes the user interface design more user-friendly, and yields a good user experience.
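As an illustrative aside (not part of the claims), the mapping from the detected relationships to the icon layout described above could be sketched as follows; the Kotlin class, field names, and the RSSI-to-radius mapping are assumptions chosen for illustration only.
```kotlin
import kotlin.math.cos
import kotlin.math.sin

// Hypothetical model of a discovered associated device.
data class AssociatedDevice(
    val name: String,
    val bearingDegrees: Double, // direction relative to the core device
    val rssiDbm: Int            // signal strength; stronger = closer icon
)

// Place each associated-device icon on a circle around the core-device icon,
// at an angle given by the device's bearing, with a radius that shrinks as
// the signal gets stronger (i.e. as the device gets closer).
fun iconPosition(center: Pair<Float, Float>, device: AssociatedDevice): Pair<Float, Float> {
    val minRadius = 120f
    val maxRadius = 360f
    // Map RSSI in [-90, -30] dBm to a radius in [maxRadius, minRadius].
    val t = ((device.rssiDbm + 90) / 60.0).coerceIn(0.0, 1.0)
    val radius = maxRadius - t * (maxRadius - minRadius)
    val rad = Math.toRadians(device.bearingDegrees)
    return Pair(
        center.first + (radius * cos(rad)).toFloat(),
        center.second + (radius * sin(rad)).toFloat()
    )
}
```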
With reference to the first aspect, in some embodiments, the first user operation includes an operation in which the second control is selected, or the first control is selected, or both the first control and the second control are selected.
With reference to the first aspect, in some embodiments, the first user operation is that the user selects the second control, drags it toward the first control into a designated area, and then releases it. Alternatively, the first user operation is that the user selects the first control and the second control at the same time, drags them toward each other until the distance between them is less than a certain distance, and then releases both. The designated area may be a circular area centered on the first control with a radius equal to a first radial distance. Alternatively, the first user operation is that the user selects the first control, drags it toward the second control, and releases it once the distance between the two controls is less than a certain distance.
The second interface also displays the first control and the second control snapped together, representing that the first device and the second device are connected and can cooperate to provide system-level services. Refer to the embodiments shown in fig. 6 and 7. The drag operation matches the user's operating habits, has a low learning cost, and is easy to perform. Two device options being snapped together may mean that the edges of the corresponding icons are tangent, or that the two device icons wholly or partially overlap. This embodiment does not limit the visual form of the snapping.
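A minimal sketch of the release-time snap test, assuming the designated area is a circle of the first radial distance around the first control; the function and parameter names are hypothetical.
```kotlin
import kotlin.math.hypot

// Returns true if a dragged device option released at (releaseX, releaseY)
// falls inside the designated circular area around the core-device control,
// i.e. the two device options should snap together and trigger association.
fun shouldSnap(
    releaseX: Float, releaseY: Float,
    coreX: Float, coreY: Float,
    firstRadius: Float
): Boolean = hypot(releaseX - coreX, releaseY - coreY) <= firstRadius
```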
In some embodiments, the second control in the second interface may differ in appearance from the second control in the first interface. For example, in fig. 7, the smart watch icon 702 of a watch that has connected to the mobile phone to form a super terminal may carry a distinguishing mark, making it easy to tell apart from the icon of an associated device in the unconnected state. For instance, the smart watch icon 702 may be displayed with a dark fill to indicate that the smart watch and the mobile phone are connected, and its size may be smaller than that of the smart watch icon 506 shown when the watch is not connected.
In some embodiments, the first user operation is an operation in which the user clicks the second control, or clicks both the first control and the second control.
With reference to the first aspect, in some embodiments, a floating control is displayed in the second interface, and the floating control includes a first area that displays the service options of the first scenario service. The first user operation may be the user pressing and holding the second control. The second user operation may be the user dragging the second control to the first area and releasing it, or dragging the second control through the first area toward the first control into a designated area and then releasing it, where the designated area may be a circular area centered on the first control with a radius equal to the first radial distance. Refer to the embodiments described in fig. 11, 12, 13A, 13B, 13C, and 13D. In this embodiment, a single second user operation performs two functions, associating the second device with the first device and selecting the first scenario service, which improves user operation efficiency.
In conjunction with the first aspect, in some embodiments, the selected device option may change location with the trajectory of the user's finger dragging across the screen. The present embodiment does not limit the motion trajectory of the finger when the user drags the device icon, and the user dragging trajectory may be an arbitrary curve.
With reference to the first aspect, in some embodiments, the floating control includes a plurality of circular regions, rectangular regions, sector-ring regions, annular regions, polygonal regions, or the like.
With reference to the first aspect, in some embodiments, the second interface includes the first control; in other embodiments, the second interface may not include the first control.
A scenario service refers to a functional service based on a usage scenario: a set of multiple functions or services that can be provided for a particular scenario. A scenario service may be supported by one or more atomic services. The scenario service options displayed on the second interface may be the scenario services that the current super terminal system, that is, the device combination of the first device and the second device, can provide, where the atomic capabilities corresponding to the scenario service are already installed and deployed on the first device or the second device.
In some embodiments, a scenario service may be based on the same application, under which different devices provide different atomic capabilities; the scenario service is then actually provided by the same service provider. For example, in the example shown in fig. 8, both the mobile phone and the smart watch have the exercise health application installed. After the user selects the scenario service option for outdoor running, both devices jump to the outdoor running service in the exercise health application: the smart watch provides positioning, user exercise data statistics, and the like, and uploads the user data to the application server; the application server synchronously delivers the user data to the mobile phone; and the mobile phone provides services such as displaying the user's exercise track. For the same scenario service, even with the same client application, the interfaces displayed and the services provided by different devices can differ.
With reference to the first aspect, in some embodiments, the first scenario service is a scenario service queried from a first database that records one or more scenario services supported by one or more device combinations, including the combination of the first device and its associated second device. A scenario service may be determined by one or more of the following parameters: the device types of the device combination, device characteristics, product positioning, device usage, the environment or scenario in which the devices are located, the state of the devices, recently running applications, and the like. Refer to Table 1 in the subsequent embodiments. The first database may be stored locally or on a cloud server.
In conjunction with the first aspect, in some embodiments, the first scenario service is a scenario service queried from a second database that records one or more scenario services supported by one or more atomic service combinations, including the combination of atomic services made up of the atomic capabilities possessed by the first device and its associated second device. The atomic capabilities may include one or more of: audio output capability, audio input capability, display capability, camera capability, touch input capability, and keyboard-and-mouse input capability. The second database may be stored locally or on a cloud server.
For example, the audio output capability may include whether the device supports mono or multi-channel playback, the supported sound effects, the supported frequency response range, noise reduction capability, audio resolution capability, and the like; the audio input capability may include the sound pickup range, noise reduction capability, and so on; the display capability may include the device's screen size, display resolution, refresh rate, color rendering, and the like; the camera capability may include the device's camera type, photo resolution, night-scene shooting capability, image adjustment capability, and so on; and the touch input capability and keyboard-and-mouse input capability refer to whether the device supports touch input or keyboard-and-mouse input.
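The two lookups described above could be sketched as follows; the table contents, key names, and fallback behavior are illustrative assumptions, not taken from the patent.
```kotlin
// First database: device-type combination -> supported scenario services.
val deviceComboDb: Map<Set<String>, List<String>> = mapOf(
    setOf("phone", "smartWatch") to listOf("Outdoor running", "Indoor running"),
    setOf("phone", "smartTv") to listOf("Screen mirroring", "Home theater")
)

// Second database: atomic-capability combination -> supported scenario services.
val atomicComboDb: Map<Set<String>, List<String>> = mapOf(
    setOf("displayLarge", "audioOutput") to listOf("Home theater"),
    setOf("positioning", "heartRate") to listOf("Motion monitoring")
)

// Query the scenario services supported by the current device combination,
// falling back to the atomic-capability combination when the exact device
// combination is not recorded.
fun queryScenarioServices(devices: Set<String>, capabilities: Set<String>): List<String> =
    deviceComboDb[devices] ?: atomicComboDb[capabilities] ?: emptyList()
```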
In conjunction with the first aspect, in some embodiments, the probability that the first scenario service is selected may be calculated based on one or more of the following parameters: the frequency with which the user uses the first scenario service, the ordering of first scenario services set by the user, the environment or scenario in which the first device and/or the second device is located, the state of the first device and/or the second device, and the applications recently run by the first device and/or the second device. Service options of first scenario services with a higher probability of being selected by the user are recommended and displayed with priority.
For example, different device types have different device characteristics. A smart television has a larger screen, gives the user a better visual viewing experience, and is better suited to watching videos. A smart speaker has strong audio output capability and a good loudspeaker effect, and is better suited to playing audio. Earphones are small, easy to carry, need no external placement, do not disturb people nearby, and offer noise reduction that shields ambient noise, so they suit listening to audio in public places, such as calls and music. Compared with a mobile phone, a smart watch or smart band can measure the user's exercise data, heart rate, and other health information more accurately. A notebook computer has strong processing capability and is better suited to office work. Device types and device characteristics are not enumerated exhaustively here, and the above examples do not limit other embodiments.
In some embodiments, when listing the scenario service options, options with high usage may be displayed first according to the counted frequency with which the user has used each scenario service option over a certain period. The period may be the entire history, or a certain span of time, such as the last three months. The higher the historical usage of a scenario service option, the earlier it is displayed.
In other embodiments, the first device or the second device may detect the environment in which it is currently located and sort the scenario service options according to that environment. If the user is detected to be outdoors, outdoor-related service options, such as outdoor running, are recommended first. If the user is detected to be indoors, indoor-related service options, such as indoor running, are recommended first. This embodiment does not limit the manner of detecting the user's environment: for example, the environment may be determined from the device's current positioning information; or a camera may capture a picture of the current surroundings; or ultrasonic waves, infrared rays, and the like may measure the depth or size of the current space to judge the environment the user is in.
In other embodiments, the first device or the second device may also determine the environment or scenario it is in according to an application recently opened or used by the user or the currently running application, and recommend scenario service options accordingly. For example, when a mobile phone and a smart television form a super terminal, if the application currently running on the phone is detected to be an office presentation application, the multi-screen collaboration service option can be recommended first; if it is a video application, the home theater service option can be recommended first; or, when the webpage the user is browsing is detected to be game-related, service options related to gaming and entertainment are recommended first, and so on.
In other embodiments, scenario services may also be recommended based on the operating state of the device, such as screen-on or screen-off. For example, when a mobile phone connects with a smart television to form a super terminal, if the smart television is detected to be in the screen-on state, screen mirroring is recommended first as the scenario service; if it is detected to be in the screen-off state, a voice service is recommended first.
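As a sketch of how the ranking signals in the preceding paragraphs might be combined into a selection probability, the weights and feature names below are assumptions chosen purely for illustration.
```kotlin
data class ServiceContext(
    val usageFrequency: Double,      // normalized historical usage in [0, 1]
    val userRank: Int,               // user-set ordering, 0 = highest
    val matchesEnvironment: Boolean, // e.g. outdoor service while outdoors
    val matchesRunningApp: Boolean,  // e.g. video app -> home theater
    val matchesDeviceState: Boolean  // e.g. TV screen-on -> screen mirroring
)

// Score each candidate scenario service; higher scores are displayed first.
fun score(ctx: ServiceContext): Double {
    var s = 0.5 * ctx.usageFrequency
    s += 0.2 / (1 + ctx.userRank)
    if (ctx.matchesEnvironment) s += 0.1
    if (ctx.matchesRunningApp) s += 0.1
    if (ctx.matchesDeviceState) s += 0.1
    return s
}

// Rank the candidate scenario services for display in the floating control.
fun rank(candidates: Map<String, ServiceContext>): List<String> =
    candidates.entries.sortedByDescending { score(it.value) }.map { it.key }
```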
With reference to the first aspect, in some embodiments, different layouts of the floating control lead to different arrangement orders of the recommended scenario services. Developers can design the region layout of the floating control according to the desired user experience.
Optionally, referring to the embodiments shown in fig. 13C and 13D, the shape of the floating control is a ring or a sector, where multiple annular or sector-ring regions respectively indicate multiple first scenario services, and the first scenario service indicated by the region closer to the center of the circle is the recommended scenario service with higher priority.
Optionally, referring to the embodiments shown in fig. 11, fig. 12, fig. 13A, and fig. 13B, the shape of the floating control is a circle or a rectangle divided into four regions: region A in the first quadrant, region B in the second quadrant, region C in the third quadrant, and region D in the fourth quadrant. When the recommendation order of the scenario service options is option a, option b, option c, option d, the options may be mapped onto the four regions in different orders, for example into regions B, A, C, D, respectively; other layouts may assign the order among the regions differently.
With reference to the first aspect, in some embodiments, the second interface further displays a device option of a third device, the third device being a device recommended after evaluating the device usage requirements of the first device and its associated second device. This can help the user expand into more and richer usage scenarios and better experience the super terminal ecosystem.
With reference to the first aspect, in some embodiments, if the third device is a device discovered by the first device, then when the device option of the third device is tapped, the first device displays the service options of a second scenario service. The service options of the second scenario service may be scenario service options added on top of the service options of the first scenario service, or they may be freshly updated scenario services different from the service options of the first scenario service. This makes the user's operation convenient.
With reference to the first aspect, in some embodiments, if the third device is not a device discovered by the first device, tapping the device option of the third device may jump to a fifth interface. For example, the fifth interface may be a purchase interface for the third device, including multiple item options from multiple suppliers, among them an item option indicating the third device. The fifth interface may include a function introduction, a usage introduction, and an introduction to accompanying usage scenarios for the third device, providing more convenient service to the user.
With reference to the first aspect, in some embodiments, the third device is a device recommended based on one or more of the following parameters: the device type, device characteristics, product positioning, device usage, environment or scenario, state, or recently running applications of the first device and/or its associated second device; the first scenario service; a predicted expandable usage scenario; and the like.
After a recommended device is added, the system can recommend a further new device according to the newly formed super terminal. The user can choose to add more devices to the super terminal, making the usage scenarios richer. After more associated devices are added, the scenario services the super terminal provides become more numerous and richer, improving the super terminal ecosystem.
In conjunction with the first aspect, in some embodiments, the first scenario service runs supported by a first atomic service combination, the first atomic service combination including at least one first service provided by the first device and/or at least one second service provided by the second device. An atomic service refers to the minimum capability unit that can run independently; it is an abstract encapsulation of a single function/capability and may be a hardware service or a software service.
In conjunction with the first aspect, in some embodiments, the first device may display the first atomic service combination on a seventh interface; refer to the embodiment shown in fig. 10. The first atomic service combination may be selected by the user, or it may be an atomic service combination determined according to one or more of the following parameters: the device type and device characteristics of the first device and/or the second device; an atomic service combination commonly used by the user; the atomic service combination set by the user as the default; the atomic service combination determined by analysis to best fit the first scenario service; or the atomic service combination, obtained from a server, that is most frequently used under the first scenario service.
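An illustrative selection policy for the first atomic service combination, reflecting the preference order listed above; all names are hypothetical, and a real system may weigh these sources differently.
```kotlin
data class AtomicCombo(val services: List<String>)

// Pick the atomic service combination with which to run a scenario service,
// in the sketched order of preference: an explicit user choice first, then
// the user's default, then the locally analyzed best fit, then the
// server-reported most frequently used combination.
fun chooseAtomicCombo(
    userSelected: AtomicCombo?,
    userDefault: AtomicCombo?,
    analyzedBestFit: AtomicCombo?,
    serverMostUsed: AtomicCombo?
): AtomicCombo? = userSelected ?: userDefault ?: analyzedBestFit ?: serverMostUsed
```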
In some embodiments, in combination with the first aspect, the first device runs the first scenario service and/or the second device associated with the first device runs the first scenario service.
With reference to the first aspect, in some embodiments, the third interface displayed by the first device when running the first scenario service differs from the fourth interface displayed by the second device when running the first scenario service.
With reference to the first aspect, in some embodiments, if the first scenario service is a screen projection service and the second device is a large-screen device, the fourth interface may be a video or image picture, and the third interface may be a control interface for controlling the video or image display function. Refer to the embodiment shown in fig. 18. In this way, the second device can be used together with the first device, making full use of each device's characteristics and providing more convenience to the user. Of course, the user may also choose to have the third interface display a chat interface, chatting on the first device while watching the video on the second device. Optionally, the third interface and the fourth interface may display the same picture content; referring to the embodiment shown in fig. 19, both the mobile phone and the television display the same video picture.
With reference to the first aspect, in some embodiments, if the first scenario service is a motion monitoring service and the second device is a wearable sports device, the third interface may be a picture of the motion track, and the fourth interface may show the detected motion data of the user, where the motion data includes movement speed, movement distance, movement duration, and heart rate. Refer to the embodiment shown in fig. 8. In this way, the second device can be used together with the first device, making full use of each device's characteristics and providing more convenience to the user. For example, after associating the mobile phone and the smart watch, the user leaves the phone at home and wears the smart watch to go outdoor running. The smart watch has positioning and speed measurement functions and continuously uploads positioning information to the cloud server; the cloud server delivers the data to the mobile phone, and the user can view the running track on the phone after returning home.
When the user drags the second control to snap onto the first control from different directions and different areas, the resulting effect and the triggered service may differ.
With reference to the first aspect, in some embodiments, in a screen projection scenario, the first device determines, according to the relative position of the second control and the first control when the drag operation is released, the position at which the screen interface of the first device is projected within the screen of the second device.
Refer to the embodiments shown in fig. 15, 16, and 17. If the second control is released to the left of the first control, the screen interface of the first device is projected onto the right side of the second device's screen. If the second control is released to the right of the first control, the screen interface of the first device is projected onto the left side of the second device's screen. If the second control is released onto the middle of the first control, the screen interface of the first device is projected full-screen on the second device.
As another example, for audio devices such as speakers and earphones, different audio channels may be associated with different snap orientations. Suppose a user has two or more speakers. If the user drags the first speaker icon from the left area of the phone icon until it snaps into place, the first speaker can provide the left-channel audio output; if the user drags the second speaker icon to the right area of the phone icon until it snaps into place, the second speaker can provide the right-channel audio output. Other multi-channel arrangements work in the same way.
In other embodiments, interchanging which control is dragged onto which may trigger different services or interactive functions, such as a different data transmission direction. Whether a device acts as the snapping party or the snapped-to party may indicate whether it is the sender or the receiver of the data; for example, it may be specified that the snapping party's device sends the data stream to the snapped-to party's device. When sharing a file, if the second control is pressed and dragged to snap onto the first control, the triggered service sends the second device's file to the first device; if the first control is pressed and dragged to snap onto the second control, the triggered service sends the first device's file to the second device.
In other embodiments, the scenario services triggered may differ depending on whether a device option is the snapping party or the snapped-to party. For example, if the second control is pressed and dragged to snap onto the first control, the triggered service sends the second device's file to the first device, whereas if the first control is pressed and dragged to snap onto the second control, the triggered service projects the first device's screen onto the second device's screen.
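The direction-dependent dispatch in the preceding paragraphs could be sketched roughly as follows; the enum values and the returned service names are placeholders, not the patent's actual identifiers.
```kotlin
enum class RelativeSide { LEFT, RIGHT, CENTER }

// Screen-projection scene: the position of the second control relative to
// the first control at release determines where the first device's screen
// lands on the second device's screen.
fun projectionTarget(secondControlSide: RelativeSide): String = when (secondControlSide) {
    RelativeSide.LEFT -> "rightHalfOfSecondScreen"
    RelativeSide.RIGHT -> "leftHalfOfSecondScreen"
    RelativeSide.CENTER -> "fullScreenOfSecondScreen"
}

// File-sharing scene: the dragged (snapping) control's device is the sender;
// the snapped-to control's device is the receiver.
fun fileTransferDirection(draggedIsSecondControl: Boolean): Pair<String, String> =
    if (draggedIsSecondControl) "secondDevice" to "firstDevice"
    else "firstDevice" to "secondDevice"
```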
In combination with the first aspect, in some embodiments, the method may further include: the first device detects a third user operation on the second interface, the third user operation being used to associate a fourth device with the first device and including selecting a third control that indicates the fourth device. In response to the third user operation, the first device displays a sixth interface showing the first control, the second control, and the third control snapped together; the sixth interface also displays a third scenario service option, which is determined according to the first device, the second device, and the fourth device and differs from the first scenario service option. Refer to the embodiment shown in fig. 9.
With reference to the first aspect, in some embodiments, after the first device runs the first scenario service selected by the second user operation, the first device detects a fourth user operation, which includes an operation separating the first control from the second control. In response to the fourth user operation, the first device instructs the first scenario service selected by the second user operation to stop running.
In some embodiments, in combination with the first aspect, in response to the fourth user operation, the first device stops running the first scenario service selected by the second user operation, and/or the second device stops running the first scenario service selected by the second user operation.
With reference to the first aspect, in some embodiments, the fourth user operation may be the user selecting the second control, dragging it away from the first control, and releasing it once it has moved out of the designated area. The second control returns to its position before the connection, the snapped state is released, the second device is disconnected from the first device, and the atomic services provided by the first device or the second device are terminated. Alternatively, when the user taps the second control in the connected state, or presses both the second control and the first control and drags them in opposite directions to separate them, the first device and the second device can likewise be triggered to disconnect; the second control separates from the first control on the interface and returns to its unconnected position.
In some embodiments, a one-key disconnection control may also be placed on the interface of the first device. Tapping it disconnects all connected associated devices from the first device and returns to a state in which the first device stands alone with no associated device connected; the services provided by each associated device are then terminated.
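A minimal sketch of the two disconnection paths just described, assuming a simple in-memory model of the super terminal's connection state; the class and function names are hypothetical.
```kotlin
// Hypothetical super-terminal state: the set of associated devices
// currently connected to the core device.
class SuperTerminal(private val connected: MutableSet<String> = mutableSetOf()) {

    // Drag-apart gesture on one device option: disconnect that device and
    // terminate the atomic services it was providing.
    fun disconnect(device: String) {
        if (connected.remove(device)) stopServicesOf(device)
    }

    // One-key disconnection control: disconnect every associated device,
    // returning the core device to its standalone state.
    fun disconnectAll() {
        connected.toList().forEach { disconnect(it) }
    }

    private fun stopServicesOf(device: String) {
        println("stopping atomic services provided by $device")
    }
}
```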
In combination with the first aspect, in some embodiments, the first device includes any one of: a mobile phone, tablet computer, portable/non-portable computer, personal computer, smart television, and the like. The second device includes any one of: a mobile phone, tablet computer, portable mobile computer, desktop personal computer, smart speaker, smart watch, smart band, smart television, earphone, smart glasses, in-vehicle head unit, smart cockpit, game console, treadmill, spinning bike, weighing scale, body fat scale, water heater, lamp, air conditioner, blood glucose meter, blood oxygen saturation meter, heart rate meter, AR/VR device, cloud host/cloud server, smart wearable device, smart home device, printer, and the like.
With reference to the first aspect, in some embodiments, the first scenario service may be a printing service and the second device a printer. When the first device and the second device cooperate to run the first scenario service, the first device may display a print floating control, which may be displayed floating on the user interface of the first device.
That is, while running the first scenario service, the first device may keep the print floating control displayed on top of the user interface: the content displayed on the user interface changes, but the print floating control remains floating above it. In this way, the user can conveniently and quickly print the content displayed on the current user interface through the print floating control.
In some embodiments, in combination with the first aspect, the first device detects a fifth user operation on the print floating control. Based on the fifth user operation, the first device may send a first print file to the second device for printing, the first print file being the content displayed on the first device when the fifth user operation is detected.
In some embodiments, when the content displayed on the first device's user interface changes from the first print file to a second print file, the first device may send the second print file to the second device for printing in response to the operation on the print floating control.
In some embodiments, in combination with the first aspect, the first device detects a sixth user operation on the print floating control. Based on the sixth user operation, the first device may delete the print floating control and instruct the first scenario service to stop running.
It can be seen that the user can directly trigger the first device and the second device to stop running the first scenario service through the print floating control, without returning to the first interface and using the first control and the second control to do so.
The user operations for the first device and the printer to provide the printing service, and for stopping their cooperation, are simple, which can improve the user experience.
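The print floating control's behavior could be modeled as in the following sketch; the class shape, field names, and callbacks are assumptions for illustration only.
```kotlin
// Hypothetical sketch of the print floating control: it stays on top of
// whatever the user interface currently shows, and printing always targets
// the currently displayed content.
class PrintFloatingControl(
    private val sendToPrinter: (String) -> Unit,
    private val stopScenarioService: () -> Unit
) {
    // Updated whenever the content shown on the user interface changes;
    // the control itself keeps floating above the new content.
    var currentContent: String = ""

    // Fifth user operation: tap the control to print what is displayed now.
    fun onTap() = sendToPrinter(currentContent)

    // Sixth user operation: remove the control and stop the print scenario
    // service, without going back to the first interface.
    fun onDismiss() = stopScenarioService()
}
```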
In a second aspect, an embodiment of the present application provides a device interaction method. The first device displays a first interface including a first control and a second control, where the first control indicates a device option of the first device, the second control indicates a device option of a second device, and the second device is a device discovered by the first device. The first device detects a first user operation for associating the first device with the second device. According to the first user operation, the first device sends a first binding request to the second device and receives a first consent message from the second device. According to the first consent message, the first device displays a second interface in which service options of a first scenario service are displayed, the first scenario service being a scenario service cooperatively supported by the first device and the second device.
By implementing the method of the second aspect, human-computer interaction performance can be improved: the user can quickly select a scenario service and/or an atomic service according to personal needs; when the user selects a device combination, the scenario services and/or atomic services available under that combination are offered in a targeted manner; detailed and rich usage scenarios are listed for the user; and the user is guided to use the super terminal's functions and fully explore more application scenarios, achieving the effect of recommending scenario services according to the device combination.
With reference to the second aspect, in some embodiments, the first device may display a connection code setting box according to the first user operation, the connection code setting box being used to set a connection code for binding the first device and the second device. The first device receives the first connection code entered in the connection code setting box and sends the first binding request to the second device. The first consent message includes the first connection code.
In some embodiments, after the second device receives the first binding request, it may ask its user whether to agree to the second device cooperating with the first device. If an operation agreeing to the cooperation is detected, the second device may display a connection code input box for the user to enter the connection code. Upon receiving the connection code entered in the input box, the second device may send the first consent message. After receiving the first consent message, the first device may check whether the connection code it contains matches the connection code set in the connection code setting box. If so, the first device and the second device can complete device binding and establish cooperation, thereby providing the scenario service to the user.
As can be seen from the foregoing embodiments, before the first device and the second device run the first scenario service, device binding may be performed first. In this binding process, the first device sets a connection code and the second device inputs the connection code for binding; the process may also ask the user to whom the second device belongs whether to agree to the cooperation. When the first device and the second device belong to different users, this embodiment reduces the probability that a user's electronic device is used by other users without the owner's agreement or knowledge. Moreover, the connection code verification ensures the correctness of the electronic device bound to the first device, improving the user experience.
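A hedged sketch of the connection-code binding handshake described above; the message shapes and names are hypothetical, and the transport between the devices is omitted.
```kotlin
// Hypothetical message shapes for the binding handshake.
data class BindRequest(val fromDevice: String)
data class ConsentMessage(val connectionCode: String)

class BindingSession(private val expectedCode: String) {
    // First device: build the binding request to send after the user has
    // typed a connection code into the connection code setting box.
    fun createBindRequest(): BindRequest = BindRequest(fromDevice = "firstDevice")

    // First device: check that the code typed on the second device (carried
    // back in the consent message) matches the code that was set.
    fun verifyConsent(consent: ConsentMessage): Boolean =
        consent.connectionCode == expectedCode
}

// Second device side: after its user agrees to cooperate and enters the code
// in the connection code input box, reply with a consent message.
fun buildConsent(enteredCode: String): ConsentMessage = ConsentMessage(enteredCode)
```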
In some embodiments, in combination with the second aspect, in the first interface, the second controls are distributed on a circle having a radius of the first length and centered on the first control, and the first control and the second controls are in a separated state.
The first user operation may be the user selecting the second control, dragging it toward the first control into a designated area, and then releasing it. Alternatively, the first user operation may be the user selecting the first control and the second control at the same time, dragging them toward each other until the distance between them is less than a certain distance, and then releasing both. The designated area may be a circular area centered on the first control with a radius equal to a first radial distance. Alternatively, the first user operation may be the user selecting the first control, dragging it toward the second control, and releasing it once the distance between the two controls is less than a certain distance.
In some embodiments, the first control and the second control are also displayed in the second interface. In the second interface, the first control and the second control may be snapped together to indicate that the first device and the second device have established a connection and can cooperate to provide system-level services. Two device options being snapped together may mean that the edges of the corresponding icons are tangent, or that the two device icons wholly or partially overlap.
The first user operation conforms to the user's operating habits, has a low learning cost, and is easy to perform.
With reference to the second aspect, in some embodiments, the first interface further includes a fourth control indicating a device option of a fifth device, the fifth device being a device discovered by the first device whose associated account is the same as the account associated with the first device. The fourth controls are distributed on a circle centered on the first control with a radius of a second length, the first control and the fourth control are in a separated state, and the first length differs from the second length.
In some embodiments, in combination with the second aspect, the account associated with the second device differs from the account associated with the first device, or the second device has no associated account.
As can be seen from the above embodiments, device options indicating cross-account devices (or devices without an account) and device options indicating same-account devices may be displayed on circles of different radii centered on the first control. For example, device options indicating same-account devices may be displayed on one circle, while device options indicating cross-account devices and/or no-account devices may be displayed on another. In this way, the user can quickly discern which devices are their own and which belong to other users.
A same-account device refers to a device whose associated account is the same as that of the first device. A cross-account device refers to a device whose associated account differs from that of the first device. A no-account device refers to a device without an associated account.
In some embodiments, when the second device is a cross-account device or a no-account device, the first device may perform device binding with the second device in response to the first user operation. If the first device and the second device are being bound for the first time, they can be bound through the connection code. Illustratively, the first device may display the connection code setting box, receive the connection code set by the user, and send a binding request to the second device. Upon receiving the binding request, the second device may ask its user whether to agree to binding the second device with the first device. If an operation agreeing to the binding is detected, the second device may display a connection code input box and ask the user for the connection code used to bind the second device with the first device. Upon receiving the connection code entered in the input box, the second device may send a consent message to the first device; the consent message also includes the connection code received in the input box. The user operations for a non-first-time device binding (e.g., the second or third time) may be simpler than those for the first binding. For example, when the first device and the second device have already been bound once, the first device may send a binding request to the second device in response to the first user operation; upon receiving it, the second device may ask its user whether to agree to the binding, and if an agreeing operation is detected, the second device may send a consent message to the first device.
In some embodiments, the first device and the second device may perform device binding upon first receiving the first user operation associating them. After they are bound, when the first device again receives a user operation associating the first device and the second device, it can connect to the second device directly and cooperate to provide the scenario service.
With reference to the second aspect, in some embodiments, the first interface further includes a fifth control, the fifth control being a device option indicating a sixth device, the sixth device being a device discovered by the first device, and the fifth control being in a separated state from the fourth control. The first device detects a seventh user operation for associating the fifth device and the sixth device. According to the seventh user operation, the first device instructs the fifth device to request binding with the sixth device, and receives a second consent message indicating that the sixth device agrees to bind with the fifth device. According to the second consent message, the first device displays an eighth interface in which the fourth control and the fifth control are snapped together.
The snapping together of the fourth control and the fifth control may indicate that the fifth device and the sixth device have established a connection and can cooperate to provide system-level services.
As can be seen from the above embodiment, the core device in the super terminal may also, in response to a user operation, associate another same-account device and a cross-account device in the super terminal, so that the same-account device and the cross-account device can cooperate to provide a scenario service. The same-account device and the cross-account device can be bound before working cooperatively.
In some embodiments, the first device detects a user operation acting on the second control and removes the second control from the first interface.
With reference to the second aspect, in some embodiments, a recycle bin control is further displayed in the first interface. The first device detects a user operation acting on the recycle bin control and may display a recycle bin list. The recycle bin list may include one or more device options. A device indicated by a device option in the recycle bin list may be a device whose device option was removed from the first interface. Illustratively, when the user removes the second control from the first interface, the user may view the recycle bin list through the recycle bin control. The recycle bin list may include a device option indicating the second device. In response to an operation on the device option indicating the second device in the recycle bin list, the first device may display the second control on the first interface again.
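As a minimal sketch of this remove/restore behavior, the interface can be modeled as two lists of device options, one shown on the first interface and one in the recycle bin; all names below are hypothetical.

    // Minimal sketch of the recycle bin behavior described above; names are illustrative.
    class DeviceOptionBoard {
        private val interfaceOptions = mutableListOf<String>()  // device options shown on the first interface
        private val recycleBin = mutableListOf<String>()        // device options removed by the user

        /** Removing a control from the first interface moves it into the recycle bin. */
        fun remove(option: String) {
            if (interfaceOptions.remove(option)) recycleBin.add(option)
        }

        /** Operating an entry in the recycle bin list puts the control back on the interface. */
        fun restore(option: String) {
            if (recycleBin.remove(option)) interfaceOptions.add(option)
        }

        fun recycleBinList(): List<String> = recycleBin.toList()
    }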
In conjunction with the second aspect, in some embodiments, the representation of the second control in the second interface may differ from the representation of the second control in the first interface.
In some embodiments, in combination with the second aspect, a floating control is displayed in the second interface, the floating control includes a first area, and the first area displays a service option of the first scenarized service. The first user operation may be an operation of the user pressing and holding the second control. The second user operation may be an operation of releasing the second control after the user drags it to the first area, or an operation of releasing the second control after the user drags it through the first area, toward the first control, into a specified area, where the specified area may be a circular area centered on the first control with the first radius as its radius. Reference may be made to the embodiments described with reference to fig. 11, 12, 13A, 13B, 13C, and 13D. In this embodiment, a single second user operation performs two functions, associating the second device with the first device and selecting the first scenarized service, which improves user operation efficiency.
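The release test for the specified area reduces to a point-in-circle check: the drag ends successfully if the release point lies within the first radius of the first control's center. A short Kotlin sketch, with hypothetical names:

    import kotlin.math.hypot

    // Sketch of the release test described above: the second user operation succeeds
    // if the drag is released inside a circle of radius firstRadius centered on the
    // first control. All names are illustrative.
    data class Point(val x: Float, val y: Float)

    fun isReleaseInsideSpecifiedArea(release: Point, firstControlCenter: Point, firstRadius: Float): Boolean {
        val distance = hypot(release.x - firstControlCenter.x, release.y - firstControlCenter.y)
        return distance <= firstRadius
    }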
In combination with the second aspect, in some embodiments, the floating control includes a plurality of circular areas, or a plurality of rectangular areas, or a plurality of fan-shaped annular areas, or a plurality of ring-shaped areas, or a plurality of polygonal areas, etc.
In some embodiments, in combination with the second aspect, the second interface includes a first control. Alternatively, the first control may not be included in the second interface.
In conjunction with the second aspect, in some embodiments, after receiving the first agreement message of the second device, the first device may run the first scenarized service in cooperation with the second device. The first scenarized service may be the scenarized service with the highest probability of being selected by the user among the K1 scenarized services. The K1 scenarized services may be the scenarized services that the first device and the second device can provide cooperatively, where K1 is a positive integer. Among the K1 scenarized services, the probability that a scenarized service is selected by the user may be calculated according to one or more of the following parameters: the frequency with which the user uses the scenarized service, the ranking of the scenarized service set by the user, the environment or scenario in which the first device and/or the second device is located, the state of the first device and/or the second device, and the applications recently run by the first device and/or the second device. The K1 scenarized services also include a fourth scenarized service, and the second interface may display a service option of the fourth scenarized service. The user may use the service option of the fourth scenarized service to adjust the scenarized service cooperatively provided by the first device and the second device from the first scenarized service to the fourth scenarized service.
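The paragraph lists the inputs to the selection probability but no formula, so any concrete scoring is an assumption. The Kotlin sketch below combines the listed parameters into a weighted score and picks the highest-scoring candidate as the first scenarized service; the weights and names are illustrative only.

    // Illustrative scoring of the K1 scenarized services; the weights are assumptions,
    // since the patent lists the input parameters but prescribes no formula.
    data class ServiceSignals(
        val usageFrequency: Double,   // how often the user has used this service recently, normalized to [0, 1]
        val userRank: Int,            // ranking set by the user, 0 = highest
        val contextMatch: Double,     // fit with the devices' environment/state, in [0, 1]
        val recentAppMatch: Double    // fit with recently run applications, in [0, 1]
    )

    fun score(s: ServiceSignals): Double =
        0.4 * s.usageFrequency + 0.3 * s.contextMatch +
        0.2 * s.recentAppMatch + 0.1 / (1 + s.userRank)

    /** Returns the candidate with the highest estimated selection probability. */
    fun pickFirstScenarizedService(candidates: Map<String, ServiceSignals>): String? =
        candidates.maxByOrNull { (_, signals) -> score(signals) }?.key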
It can be seen from the foregoing embodiments that the first device may, according to the user operation associating the first device and the second device, directly cooperate with the second device to provide the scenarized service with the highest probability of being selected by the user. If the scenarized service required by the user is not the one that the first device and the second device determine to have the highest selection probability, the user can select the desired scenarized service through the service options of the other scenarized services.
In a third aspect, an embodiment of the present application provides an electronic device, which may include: a communication device, a display device, a memory, a processor coupled to the memory, a plurality of application programs, and one or more programs. The communication device is used for communication, the display device is used for displaying an interface, and the memory stores computer-executable instructions that, when executed by the processor, enable the electronic device to implement any of the functions of the first device as described in the first aspect.
In a fourth aspect, an embodiment of the present application provides an electronic device, which may include: a communication device, a display device, a memory, a processor coupled to the memory, a plurality of application programs, and one or more programs. The communication device is used for communication, the display device is used for displaying an interface, and the memory stores computer-executable instructions that, when executed by the processor, enable the electronic device to implement any of the functions of the first device as described in the second aspect.
In a fifth aspect, the present application provides a computer storage medium, in which a computer program is stored, where the computer program includes executable instructions, and when the executable instructions are executed by a processor, the processor is caused to perform operations corresponding to the method provided in the first aspect.
In a sixth aspect, the present application provides a computer storage medium, in which a computer program is stored, where the computer program includes executable instructions, and when the executable instructions are executed by a processor, the processor is caused to perform operations corresponding to the method provided in the second aspect.
In a seventh aspect, an embodiment of the present application provides a computer program product, which, when run on an electronic device, causes the electronic device to perform any one of the implementation manners as described in the first aspect.
In an eighth aspect, the present application provides a computer program product, which when run on an electronic device, causes the electronic device to perform any one of the possible implementations as in the second aspect.
In a ninth aspect, an embodiment of the present application provides a chip system, where the chip system may be applied to an electronic device and includes one or more processors, and the processors are configured to invoke computer instructions to enable the electronic device to implement any possible implementation manner of the first aspect.
In a tenth aspect, the present application provides a chip system, which may be applied to an electronic device and includes one or more processors, where the processors are configured to invoke computer instructions to enable the electronic device to implement any possible implementation manner of the second aspect.
In an eleventh aspect, an embodiment of the present application provides a communication system, where the communication system includes a first device and a second device, where the first device implements any implementation manner as in the first aspect.
By implementing the aspects provided by the application, a user can quickly select a scenarized service and/or an atomized service according to personal requirements. By having the user select a device combination, the available scenarized services and/or atomized services under that device combination are provided in a targeted manner, detailed and rich usage scenarios are listed for the user, the user is guided to use the functions of the super terminal to fully explore more application scenarios, and the effect of recommending scenarized services according to the device combination is achieved.
Drawings
FIG. 1 is a schematic diagram of a communication system according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a software architecture of an electronic device according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 9 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 10 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 11 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 12 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 13A is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 13B is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 13C is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 13D is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 14 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 15 is a schematic diagram of a user interaction interface provided by an embodiment of the present application;
FIG. 16 is a schematic diagram of a user interaction interface provided by an embodiment of the present application;
FIG. 17 is a schematic diagram of a user interaction interface provided by an embodiment of the present application;
FIG. 18 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 19 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 20 is a flowchart of a device interaction method according to an embodiment of the present application;
FIG. 21 is a schematic diagram of a functional module according to an embodiment of the present application;
FIGS. 22A-22J are schematic diagrams of some user interfaces provided by embodiments of the present application;
FIGS. 23A-23D are schematic diagrams of some user interfaces provided by embodiments of the present application;
FIGS. 24A-24D are schematic diagrams of some user interfaces provided by embodiments of the present application;
FIGS. 25A-25H are schematic diagrams of some user interfaces provided by embodiments of the present application;
FIGS. 26A-26C are schematic diagrams of some user interfaces provided by embodiments of the present application;
FIGS. 27A-27E are schematic diagrams of some user interfaces provided by embodiments of the present application;
FIGS. 28A-28K are schematic diagrams of some user interfaces provided by embodiments of the present application;
FIGS. 29A and 29B are schematic diagrams of some user interfaces provided by embodiments of the present application;
FIGS. 30A-30C are schematic diagrams of some user interfaces provided by embodiments of the present application;
FIGS. 31A-31C are schematic diagrams of some user interfaces provided by embodiments of the present application;
FIGS. 32A and 32B are schematic diagrams of some user interfaces provided by embodiments of the present application;
FIGS. 33A-33C are schematic diagrams of some user interfaces provided by embodiments of the present application;
FIGS. 34A-34O are schematic diagrams of some user interfaces provided by embodiments of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. In the description of the embodiments herein, unless otherwise specified, "/" means "or"; for example, A/B may mean A or B. "And/or" in the text merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone.
In the following, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. Further, in the description of the embodiments of the application, "plurality" means two or more than two.
The term "User Interface (UI)" in the following embodiments of the present application is a media interface for interaction and information exchange between an Application (APP) or an Operating System (OS) and a user, and implements conversion between an internal form of information and a form acceptable to the user. The user interface is source code written by java, extensible markup language (XML) and other specific computer languages, and the interface source code is analyzed and rendered on the electronic equipment and finally presented as content which can be identified by a user. A common presentation form of the user interface is a Graphical User Interface (GUI), which refers to a user interface related to computer operations and displayed in a graphical manner. It may be a visual interface element such as text, an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc. displayed in a display of the electronic device.
As consumption upgrades, the number of intelligent terminal devices owned by users keeps increasing. In some usage scenarios, users often need two or more terminal devices to work together or cooperate to provide certain services and perform certain functions; such services may also be referred to as cross-device services. For example, a mobile phone and a smart screen cooperate to realize video projection, or a mobile phone shares files, pictures, and the like with other devices. In practical applications, when multiple devices are started to work cooperatively, the user operations are complex, the entrances are scattered, the human-computer interaction performance is poor, and the user experience suffers.
In some implementations, when multiple devices work together, only one default service is provided, but a single service can no longer meet the requirements of diverse scenarios. For example, when a mobile phone cooperates with a smart screen, a multi-screen collaboration service can be provided. However, the mobile phone and the smart screen could cooperate to provide many more services, such as services in application scenarios like home theater, motion-sensing games, and karaoke, yet the mobile phone does not provide an entry for the service options corresponding to these scenarios. If the user wants to switch to a service for another application scenario, the user may need to exit the current interface, open a new application program, and start the corresponding service, which is cumbersome for the user.
The application provides a device interaction method, which can be applied to a communication system formed by a plurality of devices; in some embodiments, the communication system may also be called a super terminal or a super terminal system. The method aims to explore a brand-new human-computer interaction mode, provide a more intuitive and more convenient device interaction mode, carry the user's requirements for cooperation among multiple devices, and provide the user with entries for the service options corresponding to multiple scenarios.
In the following embodiments of the present application, a plurality of terminal devices may form a "super terminal" (or a super terminal system). The super terminal integrates the capabilities of a plurality of terminals through a distributed technology, stores the capabilities in a virtual hardware resource pool, and uniformly manages, schedules and integrates the capabilities of the terminals according to service requirements to provide services for the outside, so that quick connection, capability mutual assistance and resource sharing are realized among different terminals.
The super terminal may include a core device/center device and a cooperative device/associated device. The core device is a control center of the super terminal system and bears the functions of device management, service scheduling and the like. The core device is connected with one or more associated devices to form the super terminal. The associated device may be an electronic device located on the same communication network as the core device (e.g., a cell phone), for example, an electronic device located on the same Wi-Fi network. The associated device may also be an electronic device whose login account and the core device login account are the same account (e.g., hua is an account), or an electronic device whose login account and the core device login account belong to the same group (e.g., the same family account). The association device may also be an electronic device that establishes a trusted relationship (or called pairing) with the core device in other manners, such as an electronic device that has paired bluetooth, an electronic device that connects a hotspot shared by a mobile phone, or a hotspot that has been connected by a mobile phone, or a Wi-Fi P2P connection between a mobile phone and the electronic device. The embodiment of the present application does not limit the specific association manner between the association device and the core device.
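The association criteria above amount to a disjunction of independent checks. A minimal Kotlin sketch, where every field name is hypothetical:

    // Sketch of the association check described above; all fields are illustrative.
    data class DeviceInfo(
        val id: String,
        val wifiNetworkId: String?,       // e.g., the Wi-Fi network the device is on
        val accountId: String?,           // login account
        val familyGroupId: String?,       // account group, e.g., a family account
        val pairedDeviceIds: Set<String>  // trusted via Bluetooth pairing, hotspot, Wi-Fi P2P, ...
    )

    fun isAssociatedDevice(core: DeviceInfo, candidate: DeviceInfo): Boolean =
        (core.wifiNetworkId != null && core.wifiNetworkId == candidate.wifiNetworkId) ||
        (core.accountId != null && core.accountId == candidate.accountId) ||
        (core.familyGroupId != null && core.familyGroupId == candidate.familyGroupId) ||
        candidate.id in core.pairedDeviceIds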
Different associated devices are connected with the core device and cooperatively work, so that different functions can be correspondingly realized. For example, a super terminal is composed of a core device mobile phone and a related device notebook computer, and a screen projection function of projecting a mobile phone interface onto the notebook computer can be realized; the core equipment mobile phone and the associated equipment intelligent sound box form a super terminal, so that the function of remotely controlling the intelligent sound box by the mobile phone can be realized, and the like.
It should be noted that, in the following embodiments of the present application, "super terminal" is used to collectively refer to the communication system formed by a core device and one or more associated devices having an association relationship with the core device. That is, "super terminal" expresses a set of electronic devices having the above-described association relationship. It should be understood that the term "super terminal" is merely an illustrative word and may alternatively be expressed by other words, such as "intelligent cooperation" or "multi-device cooperation", which do not constitute a specific limitation on the communication system in the embodiments of the present application. The present application also does not impose any limitation on the device types of the core device and the associated devices.
The electronic device in the application may be a mobile phone, a tablet computer, a notebook computer, a Personal Computer (PC), a smart television (also called smart screen, large screen, etc.), or a wearable device such as a smart watch, a smart bracelet, etc., which is not limited by the application. For example, with the mobile phone as a core device/center device, the mobile phone may cooperate with one or more other associated devices (e.g., a laptop, a PC, a tablet, a smart watch, etc.) to provide a multifunctional, personalized, more interesting, and more convenient service for consumers.
The device interaction method provides a unified entrance for interaction between the core device and multiple associated devices, namely a super terminal interface, in which the user can trigger any two or more electronic devices to form a super terminal. When multiple devices initiate cooperation, the super terminal system can judge the usage scenario according to conditions such as the device type, device characteristics, product positioning, common functions, device state, currently running applications, and current environment of each device, and provide one or more scenarized service options for the user based on the usage scenario. Different scenarized service options are the corresponding service entrances under different application scenarios, which solves the problem that rich usage scenarios cannot be provided for users under different device combinations. The present application supports the electronic device in intelligently recommending, for the user, the scenarized service options corresponding to various application scenarios, and different atomized service/function combinations or interaction function options can be recommended according to different application scenarios or the types, characteristics, and product positioning of different devices. In addition, to expand more scenarios with better user experience, devices may also be recommended to the user for pairing, and so on.
By implementing the method provided by the application, human-computer interaction performance can be enhanced, and a user can quickly select a scenarized service and/or an atomized service according to personal requirements without complicated, scattered settings. This solves the problems that the setting entrances for interaction functions among electronic devices are scattered, that setting is complicated, and that rich usage scenarios cannot be listed for the user. By having the user select a device combination, the available scenarized services and/or atomized services under that device combination are provided in a targeted manner, detailed and rich usage scenarios are listed for the user, the user is guided to use the functions of the super terminal to fully explore more application scenarios, and the effect of recommending scenarized services according to the device combination is achieved. Moreover, a more intuitive and concise user interface is displayed for the user, which improves the intuitiveness, readability, and operability of the user interface, simplifies user operations, enables the electronic device to quickly respond to and open different application services, and improves user experience.
The following describes a communication system 10 provided in an embodiment of the present application.
Fig. 1 illustrates a communication system 10 provided in an embodiment of the present application.
As shown in fig. 1, the communication system 10 includes a plurality of terminal devices, which may include a mobile phone 101, a smart watch 102, a notebook computer 103, a tablet computer 104, a smart band 105, an earphone 106, a smart TV (also referred to as a smart screen, large screen, etc.) 107, a smart speaker 108, and the like. The communication system 10 may further include other types of electronic devices, such as a personal computer (PC), a desktop computer, a laptop computer, a handheld computer, an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a game console, a treadmill, a cloud host/cloud server, and other intelligent wearable devices, and may further include internet of things (IoT) devices such as vehicle head units, smart lights, smart air conditioners, and smart scales; the present application does not limit the device types. In this embodiment, a terminal device may also be referred to as a terminal for short; a terminal device is generally an intelligent electronic device that can provide a user interface, interact with a user, and provide service functions for the user.
In the communication system 10, the mobile phone 101 serves as the central device/core device and can establish communication connections with other cooperative devices/associated devices to form a "super terminal". The super terminal integrates the capabilities of multiple terminals through distributed technology, stores them in a virtual hardware resource pool, and uniformly manages, schedules, and integrates the terminal capabilities according to business requirements to provide services externally, so that quick connection, capability mutual assistance, and resource sharing are realized among different terminals. For example, the mobile phone 101 and the notebook computer 103 form a super terminal that can realize the screen projection function of projecting the interface of the mobile phone 101 onto the notebook computer 103; the mobile phone 101 and the smart speaker 108 form a super terminal that can realize the function of the mobile phone 101 remotely controlling the smart speaker 108, and so on.
In the following embodiments of the present application, the "super terminal" is collectively referred to as a mobile phone 101 as a core device, and is a system formed by association devices having the association relationship with the mobile phone 101. That is, "super terminal" is used to express a set of electronic devices having the above-described association relationship. It should be understood that the term "super terminal" is merely an illustrative word, and may be alternatively expressed as other words, such as "intelligent cooperation", "multi-device cooperation", and the like, which do not constitute a specific limitation of the communication system in the embodiment of the present application.
The communication connection between the terminal devices in the communication system 10 may be a wired connection or a wireless connection, and the embodiment is not limited thereto. Data or instructions can be transmitted between the terminals via the established communication connection.
The communication connection may be a short-range communication connection, such as a wired connection, e.g., a universal serial bus (USB) connection, a high definition multimedia interface (HDMI) connection, a display port (DP) connection, etc. It may also be a wireless connection, such as a Bluetooth (BT) connection, a wireless fidelity (Wi-Fi) connection, a hotspot connection, near field communication (NFC), ZigBee, and the like, which enables communication between terminals without accounts or with different accounts. A wireless connection is not bound by a connecting line and gives the user a higher degree of freedom of movement. The embodiment of the present application does not limit the type of the communication connection. A Bluetooth (BT) module and/or a wireless local area network (WLAN) module may be configured in the terminal device. The Bluetooth module may provide a solution including one or more of classic Bluetooth (Bluetooth 2.1) or Bluetooth low energy (BLE) communication, and the WLAN module may provide a solution including one or more of wireless fidelity peer-to-peer (Wi-Fi P2P), wireless fidelity local area network (Wi-Fi LAN), or wireless fidelity software access point (Wi-Fi softAP). In some embodiments, Wi-Fi P2P refers to a technology that allows devices in a wireless network to connect to each other in a point-to-point fashion without going through a wireless router; it may also be referred to as wireless fidelity direct (Wi-Fi Direct). Devices establishing a Wi-Fi P2P connection can directly exchange data through Wi-Fi (they must be on the same frequency band) without connecting to a network or a hotspot, thereby realizing point-to-point communication, such as transmission of files, pictures, videos, and other data. Compared with Bluetooth, Wi-Fi P2P has advantages such as faster searching speed, faster transmission speed, and longer transmission distance.
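For illustration, on Android this discovery-and-connect flow is exposed through the WifiP2pManager API; below is a minimal Kotlin sketch. Permission checks and the broadcast receiver that delivers the peer list are omitted for brevity, and connecting straight to a known MAC address is a simplification.

    import android.content.Context
    import android.net.wifi.p2p.WifiP2pConfig
    import android.net.wifi.p2p.WifiP2pManager

    // Minimal sketch of Wi-Fi P2P peer discovery and connection on Android.
    fun connectDirect(context: Context, peerMacAddress: String) {
        val manager = context.getSystemService(Context.WIFI_P2P_SERVICE) as WifiP2pManager
        val channel = manager.initialize(context, context.mainLooper, null)

        manager.discoverPeers(channel, object : WifiP2pManager.ActionListener {
            override fun onSuccess() {
                // Discovered peers arrive via WIFI_P2P_PEERS_CHANGED_ACTION broadcasts;
                // here we connect directly to a known MAC address for brevity.
                val config = WifiP2pConfig().apply { deviceAddress = peerMacAddress }
                manager.connect(channel, config, object : WifiP2pManager.ActionListener {
                    override fun onSuccess() { /* group negotiation started */ }
                    override fun onFailure(reason: Int) { /* connection request failed */ }
                })
            }
            override fun onFailure(reason: Int) { /* discovery did not start */ }
        })
    }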
The communication connection may also be a long-range communication connection. For example, the terminals may log in to the same account and connect and communicate via the Internet. Multiple electronic devices in communication system 10 may also log in to different accounts but connect in a binding manner. For example, the mobile phone 101 and the smart watch 102 may log in to different accounts; the mobile phone 101 binds the smart watch 102 to itself in the device management application, and they then connect through the device management application.
In addition, a plurality of terminal devices in the communication system 10 may be connected and communicate in any of the above manners, which is not limited in this embodiment of the application. For example, the communication connection between the mobile phone 101 and the notebook computer 103 may be a combination of multiple connections: the mobile phone 101 or the notebook computer 103 may access a network by establishing a connection with a router through Wi-Fi, or by establishing a connection with a base station through a cellular signal, and the mobile phone 101 and the notebook computer 103 may then communicate through the network, e.g., the mobile phone 101 sends information to a cloud server through the network, and the cloud server sends the information to the notebook computer 103 through the network.
In some embodiments, when two terminal devices are trusted devices, for example, when they have been paired or connected before, the two terminal devices will automatically establish a communication connection and then perform data interaction when they are to be connected again, without the user manually performing the connection or pairing operation again, which saves time and effort.
Each terminal device in the communication system 10 may run an operating system (the specific operating system names appear as images in the original publication). The operating system of each terminal device in communication system 10 may be the same or different, which is not limited in this application. In some embodiments, each terminal connected in the communication system 10 runs HarmonyOS, and the communication system 10 may be referred to as a HarmonyOS super virtual device (also called a HarmonyOS super terminal).
In some embodiments, the super terminal application is installed on the mobile phone 101 or other terminal devices, so that the user can manage each terminal device conveniently.
For example, the smart watch may detect the user's athletic data, such as number of steps taken, length of time run, length of time swim, etc., and synchronize the user's athletic data to the treadmill and/or cell phone. Likewise, the treadmill and/or the cell phone may synchronize the detected user data to the smart watch. The multiple terminals are matched for use, so that more accurate detection of the motion data of the user can be realized.
In this embodiment, the communication system 10 may analyze the current user's usage scenario based on information such as the types and/or characteristics of the currently connected devices, product positioning, and commonly used functions, or according to the detected current environment of the user or the operating state of the devices.
In this embodiment, the communication system 10 may comprehensively determine one or more scenarized service options according to the device type, device characteristics, product positioning, device usage, the collaborative services the devices can provide, the current environment of the devices, the device running state, applications recently used by the user, the currently running application, direction information input by user operations, and the like, and display these scenarized service options on the user interface, so that the user can quickly start the related services or functions.
In some embodiments, when listing the scenarized service options, the scenarized service providing module may calculate the probability that each scenarized service option will be selected, according to the user's recent frequency of use of the different collaborative services, the detected environment of the current device, the applications the user has recently run, and the like, and preferentially display, at the front of the scenarized service option area, the options presumed to have a high probability of being selected.
In some embodiments, communication system 10 may also analyze the current scenario based on the types of devices currently making up the system, and list and recommend to the user other devices appropriate for the current scenario, to help the user obtain a better scenario experience. Alternatively, it may infer more extensible usage scenarios and then list and recommend other devices to the user; if the user adopts the devices recommended by the system, more and richer usage scenarios can be further expanded.
In some embodiments, the communication system 10 may further automatically set the atomized service combination in a scenario, based on the current device combination and the queried atomized services that each device can provide, and according to the scenarized service actually selected by the user, a service combination commonly used by the user, a default service combination set by the user, the service combination most fitting the scenario in everyday cognition, or the service combination obtained from the server with the highest frequency of use in the scenario, thereby saving the user tedious selection operations. Communication system 10 may also support personalized setting of the atomized service combination: the user may manually modify the combination according to personal preference or the actual situation, to adjust the atomized service to be provided by a given device.
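Read as a fallback chain, the automatic selection of the atomized service combination might look like the following Kotlin sketch; the priority order and all names are assumptions, since the paragraph lists the candidate sources without ranking them.

    // One possible fallback chain for the default atomized service combination;
    // the order is an assumption, as the sources are not ranked in the text.
    data class AtomizedCombo(val services: List<String>)

    fun chooseAtomizedCombo(
        userSelectedSceneCombo: AtomizedCombo?,  // derived from the scenarized service the user selected
        userCommonCombo: AtomizedCombo?,         // combination the user commonly uses
        userDefaultCombo: AtomizedCombo?,        // default combination set by the user
        sceneTypicalCombo: AtomizedCombo?,       // combination most fitting the scene in everyday cognition
        serverMostUsedCombo: AtomizedCombo?      // most-used combination for this scene, from the server
    ): AtomizedCombo? =
        userSelectedSceneCombo ?: userCommonCombo ?: userDefaultCombo
            ?: sceneTypicalCombo ?: serverMostUsedCombo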
The description of the specific embodiments should be understood in conjunction with the following examples.
It should be noted that the communication system 10 shown in fig. 1 is only used to assist in describing the technical solutions provided in the embodiments of the present application, and does not limit other embodiments of the present application. In an actual service scenario, the communication system 10 may include more or fewer terminal devices, and the present application does not limit the types of the terminal devices, the number of the terminal devices, the connection mode, and the like in the communication system 10.
An exemplary electronic device 100 provided by embodiments of the present application is described below.
Fig. 2 is a schematic diagram of a hardware structure of the electronic device 100 according to an embodiment of the present disclosure. The exemplary electronic device 100 provided in the embodiment of the present application may be, but is not limited to, a mobile phone, a notebook computer, a tablet computer (PAD), a smart bracelet, a smart watch, a headset, a Personal Computer (PC), a smart tv, a smart speaker, and the like, and may also be a desktop computer, a laptop computer, a handheld computer, an Augmented Reality (AR) device, a Virtual Reality (VR) device, an Artificial Intelligence (AI) device, a car machine, a game machine, a treadmill, a cloud host/cloud server, other intelligent wearable devices, and the like, or other types of electronic devices such as an internet of things (IOT) device or an intelligent home device, e.g., an intelligent water heater, an intelligent light, an intelligent air conditioner, an intelligent weight scale, and the like.
It is to be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, may add or remove parts of the hardware configuration, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Referring to fig. 2, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The processor 110 is generally used to control the overall operation of the electronic device 100 and may include one or more processing units. For example: the processor 110 may include a Central Processing Unit (CPU), an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a Video Processing Unit (VPU), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), and the like. Wherein, the different processing units may be independent devices or may be integrated in one or more processors. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can realize applications such as intelligent cognition of electronic equipment, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, a Serial Peripheral Interface (SPI) interface, and the like.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement a touch function of the electronic device.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 through an I2S interface.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, audio module 170 may also pass audio signals to wireless communication module 160 through a PCM interface. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through the UART interface to realize the function of playing the audio.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the shooting function of the electronic device. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device, and may also be used to transmit data between the electronic device and a peripheral device. The interface can also be used to connect other electronic devices, such as a mobile phone, a PC, or a smart TV. The USB interface may be USB 3.0 and compatible with high-speed DisplayPort (DP) signaling, allowing it to transmit high-speed video and audio data.
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may also be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication applied to the electronic device 100, including second generation (2G), third generation (3G), fourth generation (4G), and fifth generation (5G) networks, and the like. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor and convert it into electromagnetic waves via the antenna 1 for radiation. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves via the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), general Packet Radio Service (GPRS), code Division Multiple Access (CDMA), wideband Code Division Multiple Access (WCDMA), time division code division multiple access (time-division multiple access, TD-SCDMA), long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device 100 may implement display functions via the GPU, the display screen 194, and the application processor, among others. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement a photographing function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, and the application processor, etc.
The ISP is used to process the data fed back by the camera 193. For example, when a user takes a picture, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, an optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and converting into an image visible to the naked eye. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to be converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV and other formats. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1. The camera 193 may include, but is not limited to, a conventional color camera (RGB camera), a depth camera (RGB depth camera), a Dynamic Vision Sensor (DVS) camera, and the like. In some embodiments, camera 193 may be a depth camera. The depth camera can acquire spatial information of a real environment.
In some embodiments, the electronic device 100 may capture an image of a user through the camera 193, identify different users through faces, correspondingly enable different user accounts, and store information of different users through the accounts of different users, so as to ensure that the accounts of different users are not confused, and further protect data privacy of the users.
In some embodiments, the camera 193 may capture hand images or body images of the user, and the processor 110 may be configured to analyze the images captured by the camera 193 to identify hand movements or body movements input by the user. For example, the camera 193 may recognize a hand motion of the user to realize the user gesture control.
The internal memory 121 may be used to store computer executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area can store data (such as audio data, phone book and the like) created in the using process of the electronic device.
In some embodiments of the present application, the internal memory 121 may be used to store application programs of one or more applications, including instructions. The application program, when executed by the processor 110, causes the electronic device 100 to generate content for presentation to a user. Illustratively, the applications may include applications for managing the electronic device 100, such as game applications, conferencing applications, video applications, desktop applications, or other applications, among others.
The internal memory 121 may include one or more Random Access Memories (RAMs) and one or more non-volatile memories (NVMs).
The random access memory has the characteristics of high read/write speed and volatility. Volatility means that once power is lost, the data stored in the RAM disappears. In general, RAM has very low static power consumption and relatively high operating power consumption. Data in the RAM is working memory data that can be read at any time and is lost when the power is off.
The nonvolatile memory stores data stably and retains it without power. Non-volatility means that the stored data does not disappear after power-off and can be kept for a long time. Data in the NVM includes application data, which can be stably stored in the NVM for a long time. Application data refers to content written during the running of an application program or service process, such as photos or videos captured by a camera application, or text edited by a user in a document application.
The random access memory may include static random-access memory (SRAM), dynamic random-access memory (DRAM), synchronous dynamic random-access memory (SDRAM), double data rate synchronous dynamic random-access memory (DDR SDRAM), such as fifth generation DDR SDRAM generally referred to as DDR5 SDRAM, and the like.
The nonvolatile memory may include a magnetic disk storage device (magnetic disk storage), a flash memory (flash memory), and the like.
The magnetic disk storage device is a storage device using a magnetic disk as a storage medium, and has the characteristics of large storage capacity, high data transmission rate, long-term storage of stored data and the like.
The flash memory may include NOR flash, NAND flash, 3D NAND flash, etc. according to its operating principle; single-level cell (SLC), multi-level cell (MLC), triple-level cell (TLC), quad-level cell (QLC), etc. according to the number of bits stored per memory cell; and universal flash storage (UFS), embedded multimedia card (eMMC), etc. according to the storage specification.
The random access memory may be read directly by the processor 110, may be used to store executable programs (e.g., machine instructions) for an operating system or other programs that are running, and may also be used to store data for user and application programs, etc.
The nonvolatile memory may also store executable programs, data of users and application programs, and the like, and may be loaded in advance into the random access memory for the processor 110 to directly read and write.
The external memory interface 120 may be used to connect an external nonvolatile memory to extend the storage capability of the electronic device. The external non-volatile memory communicates with the processor 110 through the external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are saved in an external nonvolatile memory.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert an audio electrical signal into a sound signal. The electronic device can play music or output the audio of a hands-free call through the speaker 170A.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into a sound signal. When the electronic device answers a call or voice information, it can answer the voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mic" or "mouthpiece", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 170C by speaking close to it. The electronic device may be provided with at least one microphone 170C. In other embodiments, the electronic device may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device may further include three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The electronic device 100 may include one or more keys 190, and the keys 190 may control the electronic device 100 to provide a user with access to functions on the electronic device 100. The keys 190 may be in the form of mechanical buttons, switches, dials, etc., or may be touch or near touch sensing devices (e.g., touch sensors). The electronic apparatus 100 may receive a key input, and generate a key signal input related to user setting and function control of the electronic apparatus 100. The keys 190 may include a power-on key, a volume key, and the like.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration prompts as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the electronic device 100. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or may be used to indicate a message, notification, or the like.
Electronic device 100 may also include other input and output interfaces, and other apparatus may be connected to electronic device 100 via the appropriate input and output interfaces. The components may include, for example, audio/video jacks, data connectors, and the like.
The electronic device 100 is equipped with one or more sensors including, but not limited to, a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
The pressure sensor 180A is used to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, and the like. A capacitive pressure sensor may comprise at least two parallel plates of electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A. The electronic device 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation with intensity less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation with intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed. In some embodiments, touch operations that act on the same touch position but last for different durations may correspond to different operation instructions. For example, when a touch operation lasting less than a first time threshold acts on the pressure sensor 180A, a confirm instruction is executed; when a touch operation lasting greater than or equal to the first time threshold acts on the pressure sensor 180A, a power on/off instruction is executed.
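As an illustrative sketch only, the threshold dispatch described above may be expressed in Java as follows; the class, the concrete threshold values, and the instruction names are assumptions for illustration and are not specified by the embodiment.

// Hypothetical sketch of the pressure/duration dispatch described above.
// Threshold values and instruction names are illustrative assumptions.
public class TouchDispatcher {
    static final float FIRST_PRESSURE_THRESHOLD = 0.5f; // normalized pressure, assumed
    static final long FIRST_TIME_THRESHOLD_MS = 800;    // assumed long-press boundary

    // Instruction for a touch on the short message application icon.
    static String dispatchByPressure(float pressure) {
        return (pressure < FIRST_PRESSURE_THRESHOLD)
                ? "VIEW_SHORT_MESSAGE"   // light press: view the short message
                : "NEW_SHORT_MESSAGE";   // firm press: create a new short message
    }

    // Instruction for a touch acting on the pressure sensor itself.
    static String dispatchByDuration(long touchMillis) {
        return (touchMillis < FIRST_TIME_THRESHOLD_MS)
                ? "CONFIRM"              // short touch: confirm
                : "POWER_TOGGLE";        // long touch: power on/off
    }

    public static void main(String[] args) {
        System.out.println(dispatchByPressure(0.3f)); // VIEW_SHORT_MESSAGE
        System.out.println(dispatchByDuration(1200)); // POWER_TOGGLE
    }
}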
The gyroscope sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of the electronic device about three axes (i.e., the x, y, and z axes) may be determined by the gyroscope sensor 180B. The gyroscope sensor 180B may be used for anti-shake during photographing. Illustratively, when the shutter is pressed, the gyroscope sensor 180B detects the shake angle of the electronic device, calculates the distance the lens module needs to compensate according to the shake angle, and lets the lens counteract the shake of the electronic device through reverse motion, thereby achieving anti-shake. The gyroscope sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device 100 calculates altitude from the barometric pressure value measured by the air pressure sensor 180C, to assist in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the opening and closing of the flip cover may be detected by the magnetic sensor 180D. Then, according to the detected open/closed state of the holster or flip cover, features such as automatic unlocking upon flip-open can be set.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to identify the posture of the electronic device, and is applied to landscape/portrait screen switching, pedometers, and the like.
The distance sensor 180F is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a shooting scene, the electronic device 100 may use the distance sensor 180F to measure distance to achieve fast focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 may determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear for a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can use the collected fingerprint characteristics to unlock with the fingerprint, access the application lock, take a photo with a fingerprint press, answer an incoming call with a fingerprint press, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold, to avoid abnormal shutdown of the electronic device 100 due to low temperature. In other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
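A minimal Java sketch of the layered temperature strategy just described follows; the concrete threshold values and action names are assumptions, since the embodiment names only "a threshold", "another threshold", and "a further threshold".

// Illustrative sketch of the temperature processing strategy above.
// Threshold values and action names are assumptions.
public class ThermalPolicy {
    static final float HIGH_TEMP_C = 45.0f;      // assumed throttling threshold
    static final float LOW_TEMP_C = 0.0f;        // assumed battery-heating threshold
    static final float VERY_LOW_TEMP_C = -10.0f; // assumed voltage-boost threshold

    static String decide(float reportedTempC) {
        if (reportedTempC > HIGH_TEMP_C) return "THROTTLE_NEARBY_PROCESSOR";
        if (reportedTempC < VERY_LOW_TEMP_C) return "BOOST_BATTERY_OUTPUT_VOLTAGE";
        if (reportedTempC < LOW_TEMP_C) return "HEAT_BATTERY";
        return "NO_ACTION";
    }

    public static void main(String[] args) {
        System.out.println(decide(50f));  // THROTTLE_NEARBY_PROCESSOR
        System.out.println(decide(-5f));  // HEAT_BATTERY
        System.out.println(decide(-15f)); // BOOST_BATTERY_OUTPUT_VOLTAGE
    }
}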
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the bone mass that vibrates when a person speaks. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in an earphone, integrated into a bone conduction earphone. The audio module 170 may parse out a voice signal based on the vibration signal of the vocal-part bone mass acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be inserted into the SIM card interface 195 or pulled out of it to attach it to or detach it from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the types of the cards may be the same or different. The SIM card interface 195 is also compatible with different types of SIM cards, and may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calling and data communication. In some embodiments, the electronic device 100 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. The embodiment of the invention takes the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100.
Fig. 3 is a block diagram of the software configuration of the electronic apparatus 100 according to the embodiment of the present invention.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 3, the application packages may include camera, gallery, calendar, phone call, map, navigation, WLAN, Bluetooth, music, video, settings, and other applications. The settings application can be used to set the font size, weight, and the like.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 3, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and the like.
Content providers are used to store and retrieve data and make it accessible to applications. Such data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions, for example, management of call states (including connected, hung up, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables the application to display notification information in the status bar, can be used to convey notification-type messages, can disappear automatically after a brief dwell, and does not require user interaction. Such as a notification manager used to notify download completion, message alerts, etc. The notification manager may also be a notification that appears in the form of a chart or scroll bar text at the top status bar of the system, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, flashing an indicator light, etc.
The Android runtime comprises a core library and a virtual machine. The Android runtime is responsible for the scheduling and management of the Android system.
The core library comprises two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing the functions of object life cycle management, stack management, thread management, safety and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide a fusion of the 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files, etc. The media library may support multiple audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
The above description of the software architecture of the electronic device 100 is only an example, and it should be understood that the software architecture illustrated in the embodiment of the present invention does not constitute a specific limitation on the present application. In other embodiments of the present application, the software architecture of the electronic device 100 may include more or fewer modules than shown, or combine some modules, or split some modules, or arrange the modules differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The technical solution of the present application is described below with reference to some examples of related schematic user interfaces.
After the core device (e.g., a cell phone) and the associated device (e.g., a smart watch) establish a communication connection, the core device and the associated device may send instructions, data, etc. to each other over the communication connection. The communication connection may be a USB connection, a bluetooth connection, a Wi-Fi P2P connection, etc., and the embodiment is not limited.
In the following embodiments of the present application, a super terminal is described as an example, where a mobile phone is used as a core device, and other devices, such as a smart watch and a smart television, are connected to the mobile phone as associated devices, and the super terminal is formed.
The same account may be logged in the core device and the associated device, or different accounts may also be logged in the core device and the associated device, which is not limited in this embodiment.
The core device is the control center of the super terminal system, and has functions such as device management and service scheduling.
The associated device may be an electronic device located on the same communication network as the core device (e.g., a cell phone), for example, an electronic device located on the same Wi-Fi network. The associated device may also be an electronic device whose login account is the same as the core device's login account (e.g., a Huawei account), or whose login account belongs to the same group as the core device's login account (e.g., the same family account). The associated device may also be an electronic device that establishes a trusted relationship (or pairing) with the core device in another manner, for example, a paired Bluetooth electronic device, an electronic device connected to a hotspot shared by the mobile phone, a mobile phone connected to a hotspot shared by the electronic device, or a mobile phone and an electronic device connected via Wi-Fi P2P. The embodiment of the present application does not limit the specific association manner between the associated device and the core device.
It can be understood that, in the following example interfaces provided in the embodiments of the present application, the core device is mainly used to present a relevant user interface for a mobile phone, but this does not limit the present application. The following example interfaces may also be migrated for use on other types of devices, all within the scope of the present application, based on the same inventive concepts provided herein.
It should be understood that the user interfaces described in the following embodiments of the present application are only exemplary interfaces and are not intended to limit the present application. In other embodiments, different interface layouts can be used in the user interface, more or fewer controls can be included, and other functional options can be added or subtracted, which are within the scope of the present application based on the same inventive concept provided in the present application.
Referring to the embodiment shown in FIG. 4, FIG. 4 illustrates that a user may enter the Superterminal interface by pulling down the control center interface.
As shown in fig. 4, the user interface 400 is a schematic diagram of a main interface (or home screen interface, etc.) displayed on the mobile phone, and the user interface 400 may include a top status bar 401, a desktop 402, a bottom tray bar 403, and the like. The top status bar 401 includes a mobile signal indicator, a wireless network signal indicator, a power indicator, a time indicator, etc. Desktop 402 may display icons, application functional components, etc. of one or more applications. The bottom tray bar 403 may display icons of commonly used applications, facilitating quick opening of these applications by the user. In the example of the user interface 400, icons of the phone, text message, address book, and camera applications are displayed in the bottom tray bar 403.
As shown in FIG. 4, a user may bring up the control center interface via user operation 404. The user operation 404 shown in fig. 4 may be a touch operation in which the user performs a downward slide from the top of the display interface. In other embodiments, the user operation 404 may also be a touch operation performed by the user to slide up from the bottom of the display interface.
Upon detecting user action 404, the handset displays a control center interface, as shown by user interface 410. A plurality of functionality controls, which may also be referred to as cards or modules, etc., may be displayed in the user interface 410. Such as date and time display control 411, setting control 412, shortcut setting card 413, super terminal function card 414, music card 415, desk lamp shortcut control card 416, smart screen shortcut control card 417, etc. shown in user interface 410.
The date and time display control 411 displays the current time and date. The settings control 412 may be used for quick entry of the user into the settings interface. The shortcut setup card 413 includes a plurality of controls, which may be used for a user to quickly turn on/off or adjust a corresponding function, such as quickly turning on/off a Wi-Fi network, bluetooth, a mobile network, a sound, an alarm clock, and quickly adjusting screen display brightness. The super terminal function card 414 displays icons of one or more devices, where the devices may include devices that have been connected with a mobile phone to form a super terminal, and may also include devices that have not been connected, which are detected by the mobile phone, and the devices that have not been connected may be devices that have been paired before, or devices that have not been paired before, which is not limited in this embodiment. The user clicks on the super terminal function card 414 and the handset may display the user interface 500 shown in fig. 5. The music card 415 may display information of currently played music and singer, and further include controls such as previous, play/pause, next, song list, etc. The desk lamp quick control card 416 can be used for a user to quickly turn on or off the desk lamp. The smart screen shortcut control card 417 may be used for a user to quickly open or close a smart screen.
In other embodiments, the user may also enter the super-terminal interface by selecting a super-terminal option in the setting page, that is, the user operation 404 may include a series of operations, for example, an operation of clicking a "setting" application icon in the main interface by the user, an operation of selecting a corresponding function option in the "setting" application interface, and the like. Or the user may also enter the super terminal interface from an entry of a negative screen, a pull-up toolbar, a pull-down notification bar, a left-pull/right-pull shortcut icon bar, a desktop card, or an application (e.g., an intelligent home application) of the mobile phone, and the like.
The user clicks on the super terminal function card 414 and the handset may display the user interface 500 shown in fig. 5. FIG. 5 is a diagram of a device interaction interface associated with a super-terminal function.
In some embodiments, the handset may turn on the wireless communication function, search for surrounding electronic devices, and display the names of the searched electronic devices in a device list. The user then selects, by name, the electronic device to interact with from the device list, and a connection is initiated to that electronic device. In some embodiments, when the user has not set a special remark, the name of an electronic device displayed in the device list of the mobile phone is generally the type and/or model of the electronic device, for example, HUAWEI Vision or HUAWEI MateBook, and sometimes two or more electronic devices have the same name, so the user cannot determine which name corresponds to which device.
The embodiment of the present application provides a new interactive interface, which can provide a user with an easier-to-understand interface and a more humanized and friendly interactive experience, such as the super terminal user interface 500 shown in fig. 5.
A super terminal title bar 501, a return control 502 for returning to the previous page, a hint 503, a device connection status diagram 504, and the like may be displayed in the user interface 500.
The hint 503 displays the prompt text "drag device to local machine, and establish collaboration".
The device connection status diagram 504 may be represented as a ring diagram, and a cell phone icon 505 is displayed in the center of the device connection status diagram 504 to indicate that the core device is a cell phone. The device icons and names surrounding the cell phone are the surrounding devices searched by the cell phone, such as smart watch icon 506, sound X icon, vision icon, freebooks icon, smart band icon, matePad icon, mateBook icon, etc. shown in fig. 5.
It may be noted that in the device connection status diagram 504, the layout relationship of the core device and the associated devices may be visually indicated. The relative positions of the associated device icon and the core device icon, as displayed in the device connection state diagram 504, may indicate the position relationship of the associated device and the core device in the real space, such as relative orientation information, relative distance information, and the like, which facilitates the user to quickly find and distinguish the electronic devices. For example, the mobile phone searches for two devices named as smart watches, and the user can distinguish the devices to be selected according to the relative position (relative direction or distance) of the identifier and the mobile phone in the super terminal interface of the mobile phone. For example, in a real space, the smart watch a is located on the left side of the mobile phone, the smart watch B is located on the right side of the mobile phone, and in the device connection state diagram, the user can distinguish the smart watch a from the smart watch icon on the left side of the mobile phone icon, and the smart watch B from the smart watch icon on the right side of the mobile phone icon.
In some embodiments, in the device connection status diagram 504, the distances between different associated device icons and the core device icon may also differ, and the distance between an associated device icon and the core device icon in the interface may indicate the distance in real space; for example, the farther an associated device is from the core device in real space, the farther apart the associated device icon and the core device icon are displayed in the interface. The specific technology for implementing this function is not limited in this embodiment; for example, the mobile phone may obtain the azimuth and the measured distance of an associated device based on ultra-wideband (UWB) technology, so as to obtain a spatial mapping relationship between the associated device and the mobile phone in real space.
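A minimal Java sketch of this mapping follows, assuming the UWB layer already yields an azimuth (in radians, 0 meaning directly in front of the phone) and a distance in metres for each associated device; the scaling factor and coordinate convention are assumptions for illustration.

// Maps a device's real-space polar coordinates (azimuth, distance) to an
// icon offset around the centred core-device icon. Scaling is assumed.
public class RingLayout {
    static final float PX_PER_METRE = 60f; // assumed screen scaling factor

    // Returns {dx, dy} in pixels relative to the core-device icon centre.
    static float[] iconOffset(double azimuthRad, double distanceMetres) {
        float r = (float) (distanceMetres * PX_PER_METRE);
        return new float[] {
                (float) (r * Math.sin(azimuthRad)), // x: positive to the right
                (float) (-r * Math.cos(azimuthRad)) // y: negative upward in screen coordinates
        };
    }

    public static void main(String[] args) {
        // A watch 1.5 m directly in front of the phone appears straight above its icon.
        float[] p = iconOffset(0.0, 1.5);
        System.out.printf("dx=%.0f dy=%.0f%n", p[0], p[1]); // dx=0 dy=-90
    }
}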
In other embodiments, the distance between the associated device icon and the core device icon in the interface may indicate the signal strength of the associated device detected by the core device, such as bluetooth signal strength or WiFi signal strength, for example, the stronger the signal strength of the associated device detected by the core device, the closer the associated device icon and the core device icon are displayed in the interface.
The layout, representation, and meaning of the indications of the device connection status diagram 504 are not limited by this application. In other embodiments, other layouts or representations may be used to indicate the detected associated devices.
In some embodiments, the handset may automatically and periodically detect surrounding associated devices; for example, the handset may display an interface in the form of a radar scan, indicating that the handset is currently performing periodic detection tasks. Alternatively, the handset may respond to a user operation acting on a refresh control and then detect the surrounding associated devices. For example, the handset may broadcast a query message using the wireless communication function to search for associated devices around it. When a nearby electronic device receives the query message from the mobile phone, it can return a response message, and the response message may carry information such as the media access control address (MAC address), device type, and name of the electronic device.
The mobile phone can determine whether the electronic device is associated with the mobile phone according to a response message of the electronic device and the like, namely whether the electronic device is an associated device of the mobile phone. For example, the mobile phone may query, according to the MAC address of the electronic device carried in the response message, the account number logged in by the electronic device from the account number management server, so as to determine whether the account number logged in by the electronic device is the same as the account number logged in by the mobile phone or belongs to the same group of account numbers. Or, the mobile phone may query a router of the Wi-Fi network where the mobile phone is located according to the MAC address of the electronic device carried in the response message, whether the electronic device and the mobile phone are located in the same Wi-Fi network, and the like. Or, the mobile phone determines whether the electronic device is in a trusted relationship with the mobile phone according to a history record stored by the mobile phone. The embodiment of the present application does not limit a specific method for determining the device associated with the mobile phone.
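By way of a hedged sketch (the embodiment does not specify concrete interfaces to the account management server, the router, or the trust history, so the types below are hypothetical stand-ins, and the same-group/family-account check is simplified to account equality), the three checks just described can be combined as follows:

// Hypothetical sketch of the association checks described above.
// AccountServer and Router are stand-in interfaces, not real APIs.
import java.util.Set;

public class AssociationChecker {
    interface AccountServer { String accountFor(String macAddress); }
    interface Router { boolean onSameWifi(String macAddress); }

    private final AccountServer accountServer;
    private final Router router;
    private final Set<String> trustedMacs; // locally stored trusted-relationship history
    private final String localAccount;

    AssociationChecker(AccountServer s, Router r, Set<String> trusted, String account) {
        accountServer = s; router = r; trustedMacs = trusted; localAccount = account;
    }

    // A device counts as an associated device if any one of the checks holds.
    boolean isAssociated(String macAddress) {
        if (localAccount.equals(accountServer.accountFor(macAddress))) return true;
        if (router.onSameWifi(macAddress)) return true;
        return trustedMacs.contains(macAddress);
    }

    public static void main(String[] args) {
        AssociationChecker c = new AssociationChecker(
                mac -> "account-A",          // stub: same account as the phone
                mac -> false,                // stub: not on the same Wi-Fi network
                Set.of("AA:BB:CC:DD:EE:FF"), // stub: previously paired device
                "account-A");
        System.out.println(c.isAssociated("11:22:33:44:55:66")); // true (same account)
    }
}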
The wireless communication function includes, but is not limited to, wireless communication implemented by technologies such as WLAN, radio frequency identification (RFID), Wi-Fi P2P, hotspot, infrared, ultrasonic, Bluetooth, ZigBee, and UWB, which is not limited in this embodiment.
It will be appreciated that many wireless communication technologies support a positioning function. Examples include, without limitation, positioning based on Wi-Fi, positioning based on Bluetooth iBeacon, positioning based on the Bluetooth 5.1 signal angle of arrival (AoA), and positioning based on UWB; this embodiment is not limited to these. Wi-Fi-based positioning and Bluetooth iBeacon-based positioning can measure the distance between the mobile phone and another electronic device. Bluetooth 5.1 AoA positioning and UWB positioning can measure both the distance between the mobile phone and another electronic device and the direction of the other electronic device relative to the mobile phone.
After searching for surrounding associated devices, the handset may further utilize the location capability of the wireless communication function to measure the relative direction and/or relative distance of the associated device from the handset in real space, and then display a device connection status diagram 504 as shown in fig. 5. The layout of the icons (and names) used to characterize the respective electronic devices in the device connection status diagram 504 may be consistent with the positional layout of the respective electronic devices in real space. For example, in the embodiment provided by the present application, fig. 1 shows a positional relationship (a distance relationship is not shown) between a plurality of associated devices (such as the smart watch 102, the notebook computer 103, the tablet computer 104, the smart band 105, the earphone 106, the smart tv 107, and the smart sound box 108) and the mobile phone 101, and the layout between each associated device icon and the mobile phone icon shown in the device connection state diagram 504 in fig. 5 is in one-to-one correspondence. As fig. 1 shows that the smart watch 102 is located directly in front of the cell phone 101, and correspondingly, the device connection status diagram 504 in fig. 5 shows that the smart watch icon 506 is directly above the cell phone icon 505.
The associated device icons shown in fig. 5 float freely at the periphery of the core device icon and are not attached to, overlapped with, or adsorbed onto it, which indicates that these associated devices are not currently connected to the core device. That is, in the process of the core device detecting which associated devices are around, the mobile phone and each associated device need to turn on the wireless communication function, but a wireless connection between the mobile phone and each associated device does not need to be established.
If a user wants a super terminal composed of a certain associated device and a core device mobile phone to realize some cooperative functions, in one implementation manner, the user may select the icon of the associated device on the super terminal interface shown in fig. 5, and drag the icon corresponding to the associated device to be close to the icon of the core device, referring to the user interface 600 shown in fig. 6.
Assume that, before the devices cooperate, the distance between the associated device icon and the core device icon is a first distance. When the associated device icon is dragged close to the core device icon so that the distance between them is smaller than or equal to a second distance, the user releases the finger, and the associated device icon is adsorbed around the core device icon, as in the user interface 700 shown in fig. 7.
As shown in fig. 6, if the user wants the smart watch and the phone to form a super terminal, and work together, the user can press the smart watch icon 506 and drag the smart watch icon 506 close to the phone icon 505. In the user interface 600, a prompt caption 602 "smart watch selected" may also be displayed, stating the current user's selection status.
When the distance between the smart watch icon 506 and the mobile phone icon 505 is smaller than or equal to the second distance, or the smart watch icon 506 enters a circular area with the center of the mobile phone icon 505 as the center and the radius as the first radius distance R, the user releases the finger to display the device connection state diagram 701 shown in fig. 7, and the smart watch icon 702 is adsorbed at the edge of the mobile phone icon 505. In the user interface 700, a prompt description 703 "collaborated with the smart watch" may also be displayed, which indicates the current connection state of the super terminal.
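A minimal Java sketch of this attach rule, evaluated when the user lifts the finger; the radius value, names, and screen coordinates are assumptions for illustration.

// Sketch of the drag-to-attach rule: on release, the associated device icon
// snaps to the core icon when their centre distance is within radius R.
public class SnapLogic {
    static final float R = 120f; // assumed "first radius distance" in pixels

    enum State { FREE, ATTACHED }

    static State onRelease(float iconX, float iconY, float coreX, float coreY) {
        double centreDistance = Math.hypot(iconX - coreX, iconY - coreY);
        return centreDistance <= R ? State.ATTACHED : State.FREE;
    }

    public static void main(String[] args) {
        System.out.println(onRelease(80f, 60f, 0f, 0f)); // ATTACHED (distance 100 <= R)
        System.out.println(onRelease(200f, 0f, 0f, 0f)); // FREE (distance 200 > R)
    }
}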
Of course, if the smart watch has already established a connection with the mobile phone before the super terminal interface is opened, when the mobile phone detects that the connection has already been established, the interface where the smart watch icon 702 is attached to the edge of the mobile phone icon 505 is directly displayed, so as to express the situation that the smart watch has already established a connection with the mobile phone, such as the user interface 700 shown in fig. 7.
It will be appreciated that being able to establish a communication connection with the mobile phone is a necessary but not sufficient condition for being an associated device. A device that can establish a communication connection with the mobile phone is not necessarily an associated device that can form a super terminal, whereas an associated device is necessarily a device that can establish a communication connection with the mobile phone.
In fig. 7, the smart watch icon 702, which has been connected to the handset to form a super terminal, may display a distinguishing mark to facilitate distinguishing it from associated device icons in the unconnected state. For example, the smart watch icon 702 may be displayed in a dark filled form, indicating that the smart watch and the mobile phone are in a connected state; the size of the smart watch icon 702 is smaller than that of the smart watch icon 506 in the unconnected state, the diameter of the circular smart watch icon 702 is R2, and the icon is adsorbed on the edge of the mobile phone icon 505, tangent to the edge of the mobile phone icon 505.
In other embodiments, the two device icons being attached together may also mean that the two device icons are wholly or partially overlapped together. The present embodiment does not limit the expression form of adsorption.
Correspondingly, if a user wants to make a certain associated device depart from the super terminal system, the associated device icon can be selected and dragged to be far away from the core device icon when the associated device icon and the core device are in an adsorption state. When the mobile phone detects that the user drags the associated device icon to be away from the core device icon by more than a third distance, when the user releases the finger, the associated device icon returns to the position before the associated device icon is connected, the adsorption state is released, the associated device is disconnected from the core device at the same time, the associated device is separated from the super terminal system, and the service provided by the device in the super terminal is also terminated. In some examples, the core device icon remains stationary, the user selects the associated icon and drags it further away from the core device icon than a certain distance, and the two devices can be disconnected. In other examples, the user may select the associated icon and the core device icon at the same time, press the two icons, respectively, drag the two icons in opposite directions for more than a certain distance, and then the two devices may be disconnected.
The present embodiment is not limited to the conditions for triggering the adsorption and desorption or the user operation.
In other embodiments, when the user drags the associated device icon to be close to the core device icon by less than the second distance, the user finger does not release the associated device icon, but the associated device icon and the core device icon may be adsorbed together after the stay time at the position exceeds the first preset time period. Correspondingly, the user can drag the associated device icon to be away from the core device icon by more than a third distance, the associated device icon is not released by the finger of the user, but after the staying time at the position exceeds a second preset time length, the associated device icon returns to the position before the connection, the adsorption state is released, the associated device is disconnected from the core device at the same time, the associated device is separated from the super terminal system, and the service provided by the device in the super terminal is also terminated.
Or, in other embodiments, when the user clicks an icon of a certain associated device in an unconnected state, the connection between the associated device and the core device may be triggered, and the associated device icon and the core device icon are shown to be adsorbed together on the interface. Accordingly, when a user clicks a related device in a connected state, the related device can be triggered to be disconnected from the core device, the related device icon is separated from the core device icon on the interface, and the related device icon returns to the position in the unconnected state.
In some embodiments, the selected device icon may change position with the trajectory of the user's finger dragging across the screen. The present embodiment does not limit the motion trajectory of the finger when the user drags the device icon, and the user dragging trajectory may be an arbitrary curve. For example, after a user selects a certain associated device icon, a finger of the user drags the associated device icon for a distance on the screen or stays at a certain position on the screen for a second preset time, and the mobile phone may detect whether the distance between the current associated device icon and the mobile phone icon is smaller than or equal to the certain distance, for example, the second distance, and if so, the associated device icon and the mobile phone icon are adsorbed together. As the user's finger moves, the associated device icon moves with the user's finger trajectory.
In other embodiments, the selected device icon may not change position with the trajectory of the user's finger dragging across the screen. For example, when a user selects a certain associated device icon, the user's finger slides on the screen for a certain distance or stays at a certain position on the screen for a second preset time, and the mobile phone may determine whether the distance between the current position of the user's finger and the mobile phone icon is less than or equal to a certain distance, such as the second distance, and if so, the associated device icon and the mobile phone icon are adsorbed together. The associated device icon does not move as the user's finger moves.
In other embodiments, the user may select the core device icon next to the associated device icon, which may also trigger the two devices to connect. If the user holds the cell-phone icon and drags the cell-phone icon and be close to the smart watch icon, when the cell-phone icon pasted or had the overlap region with the smart watch icon mutually, the user loosened the finger, this smart watch icon also can adsorb with the cell-phone icon together, represents that the cell-phone is connected with the smart watch, can the collaborative work.
In some embodiments, a key disconnection control may also be set on the super terminal interface, and clicking the key disconnection control may disconnect all the associated devices from the core device, and recover to a state where the core device is independent and temporarily does not have any associated device connected, and the service provided by each associated device is terminated accordingly.
It may be understood that, without being limited to the connection between the associated device and the core device to form the super terminal, in other embodiments, the user may also connect a certain associated device and another associated device in the super terminal interface to form the super terminal. That is to say, a user may connect any two or more devices, the devices having a connection relationship are not limited to associated devices or core devices, for example, the smart sound box and the smart television of the associated devices may be directly connected to form a super terminal, and various functions, schematic interfaces, or user operations of the super terminal may refer to the descriptions in the foregoing or the following embodiments, which are not described herein again.
In the embodiment provided by the application, when a plurality of devices are connected to form a super terminal, the super terminal system may analyze the usage scenario of the current user based on the currently connected device type and/or device characteristics, or according to the detected current environment of the user, or the running state of the device, and the like, further enumerate the scenario services or functions that can be provided by the super terminal system, and display them on the user interface, thereby facilitating the user to quickly start the relevant services or functions.
Specifically, different device types have different device characteristics. For example, a smart television has a larger screen, giving the user a better visual viewing experience, and is more suitable for watching videos. A smart speaker has strong audio output capability and a good loudspeaker effect, and is more suitable for playing audio. An earphone is small and easy to carry, keeps audio from being played aloud so as not to disturb people nearby, can have a noise reduction effect that shields ambient noise, and is suitable for listening to audio in public places, such as taking calls or listening to music in public. Compared with a mobile phone, a smart watch or smart band can more accurately measure a user's exercise data, heart rate, and other health information. A notebook computer has strong processing capability and is more suitable for office work. Device types and device characteristics are not described exhaustively here, and the above examples do not limit other embodiments.
Or, the device can also detect the current environment of the user and analyze the use scene of the current user. The embodiment does not limit the manner of detecting the environment where the user is located, and for example, it may be determined whether the device is indoors or outdoors according to the current positioning information of the device; or the camera is used for collecting the photo of the current environment, or the ultrasonic wave, the infrared ray and the like are used for collecting the space depth or size of the current environment so as to judge whether the current user is indoors or outdoors.
In some embodiments, the super-terminal system may partition out a variety of consumer usage scenarios. The super terminal system can match the relevant use scenes based on the information of the equipment type, the equipment characteristics, the product positioning, the common functions and the like of the equipment in the current system.
In one example, consumer usage scenarios may include the following five categories: mobile office, smart home, audio-visual entertainment, sports health, and intelligent travel. Of course, the classification of consumer usage scenarios is not fixed, and the super terminal system may continuously update, expand, or refine more usage scenarios; this embodiment does not limit them, and they may change according to actual situations. For example, new usage scenarios may arise as associated devices or functions that did not exist before are added; for instance, a new smart car may form a super terminal with a mobile phone as an associated device, and the usage scenario formed by the mobile phone and the smart car is an intelligent travel scenario.
After the super terminal system matches a usage scenario according to the devices, the scenarized services, atomization services, and the like that the device combination can provide can be analyzed and queried based on that usage scenario.
A scenarized service refers to a functional service based on a usage scenario, and is a set of multiple functions or services that can be provided for a particular scenario. An atomization service refers to the smallest capability unit that can run independently; it is an abstract encapsulation of a single function/capability and may be a hardware service or a software service. Generally, a scenarized service is supported by multiple atomization services, and a combination of atomization services comprises atomization services formed from the atomic capabilities of multiple devices. The atomic capabilities may include one or more of: audio output capability, audio input capability, display capability, camera capability, touch input capability, keyboard-and-mouse input capability, and the like. For example, the audio output capability may include whether the device supports mono or multi-channel playback, supported sound effects, the supportable frequency response range, noise reduction capability, audio resolution capability, and the like; the audio input capability may include the sound pickup range, noise reduction capability, etc.; the display capability may include the screen size of the device, display resolution parameters, refresh rate, color rendering, etc.; the camera capability may include the camera type of the device, pixel count, night mode, image adjustment, etc.; touch input capability and keyboard-and-mouse input capability refer to whether the device can support touch input or keyboard-and-mouse input, etc.
For example, a multi-screen collaborative scenarized service between a mobile phone and a large screen is supported by the following atomization services: display service, audio input service, audio output service, touch input service, keyboard and mouse input service, camera shooting service, and the like. The display service can be provided by a large screen, the audio input service can be provided by a mobile phone, the audio output service can be provided by the large screen, the touch input service can be provided by the mobile phone, the keyboard and mouse input service can be provided by the large screen, and the like.
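As an illustrative data model only (all type and instance names below are assumptions, not terms from the embodiment), the relationship between a scenarized service, its atomization services, and the providing devices can be sketched in Java as:

// Sketch of the service model described above: a scenarized service is a
// named set of atomization services, each backed by one device's capability.
import java.util.List;

public class ServiceModel {
    record AtomizationService(String capability, String providerDevice) {}
    record ScenarizedService(String name, List<AtomizationService> parts) {}

    public static void main(String[] args) {
        // Mirrors the multi-screen collaboration example in the text.
        ScenarizedService multiScreen = new ScenarizedService(
                "multi-screen collaboration",
                List.of(new AtomizationService("display", "large screen"),
                        new AtomizationService("audio input", "mobile phone"),
                        new AtomizationService("audio output", "large screen"),
                        new AtomizationService("touch input", "mobile phone"),
                        new AtomizationService("keyboard/mouse input", "large screen")));
        multiScreen.parts().forEach(p ->
                System.out.println(p.capability() + " <- " + p.providerDevice()));
    }
}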
It is understood that the scenario service and the atomization service are only words used in this embodiment, and the meaning of the words is described in this embodiment, and the name of the words does not limit this embodiment in any way. For example, in some other embodiments, a scenario service may also be referred to as a scenario function, a scenario service, a business function, or other terms; an atomic service may also be referred to as a meta-capability, an atomic capability (availability), an atomic service, a functional component, and other terms. In the embodiment of the present application, the description is mainly given by "scenario service" and "atomization service".
Taking a super terminal consisting of a mobile phone and a smart watch as an example, the usage scenario of the super terminal may be matched as a sports health scenario, and the super terminal system may query the scenario services and atomization services that the combination of the mobile phone and the smart watch can provide. For example, the services that this combination can provide include: supporting exercise services, supporting health services, counting exercise amounts, measuring heart rate, measuring blood oxygen saturation, and the like. Combined with different scenarios, the scenario services recommended for the combination of the mobile phone and the smart watch include: indoor running, outdoor running, heart health, blood oxygen health, and the like. The queried atomization services supported by the smart watch include: GPS positioning, sound output, vibration alerts, and the like.
In one example, referring to Table 1, Table 1 lists the usage scenario categories matched when the core device is a mobile phone and various different associated devices form a super terminal with it, together with the scenario services recommended after analyzing and querying the device capabilities. It should be noted that Table 1 only shows cases where the associated device is a single device, but this is not limiting; in other embodiments, multiple associated devices may form a super terminal with the mobile phone, for example a smart television and a smart speaker together with the mobile phone. The corresponding operations and function descriptions may refer to the foregoing or following embodiments and are not repeated here.
Table 1
[Table 1 is reproduced as an image in the original publication; it lists, for each associated device paired with the mobile phone, the matched usage scenario category and the recommended scenario services.]
A scenario service option may be obtained by the super terminal system through comprehensive statistics over the device type, device usage, collaborative work content commonly used by the user, the current environment of the device, the device running state, applications recently used by the user, the currently running application, and the like, of the core device and/or the associated devices.
In some embodiments, the super terminal may recommend scenario services according to device type, product positioning, device usage, and the like. For example, if the combination of a mobile phone and a smart watch is commonly used for exercise statistics, the super terminal system may classify this combination into a sports health scenario based on its device types, device usage, and common collaborative work content. The scenario service options that may be provided in a sports health scenario include outdoor running, indoor running, heart health, vascular health, and the like.
In some embodiments, when the scenario service options are listed, the probability of each option being selected may be calculated according to the user's recent frequency of use or the detected environment of the current device, and the options with a high probability of being selected are displayed preferentially in the scenario service option area.
The super terminal system may display frequently used options first according to statistics of how often the user has used each scenario service option within a certain time. The certain time may be the entire history, or a certain period, such as the last three months. The higher an option's historical usage, the earlier it is displayed.
In other embodiments, the core device or an associated device may detect the current environment of the device and sort the scenario service options according to that environment. If the core device or an associated device detects that the user is outdoors, outdoor-related service options such as outdoor running are recommended preferentially. If it detects that the user is indoors, indoor-related service options such as indoor running are recommended preferentially. This embodiment does not limit the manner of detecting the user's environment; for example, the environment may be determined according to the device's current location information, or a camera may capture a picture of the current environment, or ultrasonic waves, infrared rays, and the like may measure the depth or size of the current space to judge the user's current environment.
In other embodiments, the super terminal system may further determine the environment or scenario of the device according to applications recently opened or used by the user, or the currently running application, and recommend scenario service options accordingly. For example, when a mobile phone and a smart television form a super terminal, if the currently running application on the mobile phone is detected to be an office presentation application, a multi-screen collaboration service option may be recommended preferentially; if it is a video application, a home theater service option may be recommended preferentially; and if the webpage the user is browsing is detected to be game-related, game and entertainment service options are recommended preferentially.
In other embodiments, the super terminal system may recommend scenario services according to the running states of the devices. For example, when a mobile phone connects with a smart television to form a super terminal, if the smart television is detected to be in a screen-on state, screen mirroring is recommended preferentially as the scenario service; if it is in a screen-off state, a voice service is recommended preferentially.
In other embodiments, the super terminal system may also recommend the scenario service according to the atomization service that the device can provide.
In some embodiments, a user may add a custom scenario service, for example by naming a certain usage scenario and associating it with other application services or system services. The newly added scenario service is displayed in the super terminal interface alongside the other scenario services.
The above examples do not limit other embodiments; the super terminal system may comprehensively consider one or more of the above factors to recommend scenario service options reasonably.
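One way to realize this comprehensive recommendation is a simple weighted scoring over the signals the embodiment lists (historical use frequency, detected environment, currently running application). The Kotlin sketch below is illustrative only; the weights, signal names, and scoring function are assumptions, not part of the embodiment.

```kotlin
// Illustrative scoring of scenario service options; weights are assumptions.
data class Context(val environment: String?,          // e.g. "outdoor", "indoor"
                   val runningApp: String?,           // e.g. "sports-health"
                   val useCount90d: Map<String, Int>) // recent per-option use counts

fun rankOptions(options: List<String>,
                relatedEnv: Map<String, String>,   // option -> environment it suits
                relatedApp: Map<String, String>,   // option -> application it relates to
                ctx: Context): List<String> =
    options.sortedByDescending { opt ->
        var score = (ctx.useCount90d[opt] ?: 0).toDouble()  // frequency signal
        if (ctx.environment != null && relatedEnv[opt] == ctx.environment) score += 10.0
        if (ctx.runningApp != null && relatedApp[opt] == ctx.runningApp) score += 20.0
        score
    }

fun main() {
    val options = listOf("indoor running", "outdoor running", "heart health", "vascular health")
    val ctx = Context(environment = "outdoor", runningApp = null,
        useCount90d = mapOf("outdoor running" to 12, "heart health" to 3))
    val ranked = rankOptions(options,
        relatedEnv = mapOf("outdoor running" to "outdoor", "indoor running" to "indoor"),
        relatedApp = mapOf("outdoor running" to "sports-health", "indoor running" to "sports-health"),
        ctx = ctx)
    println(ranked) // outdoor running first: frequently used and matches the outdoor environment
}
```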
The super terminal system may automatically set the atomization service combination for a scenario based on the current device combination, the queried atomization services that each device can provide, and the scenario service actually selected by the user, together with the user's commonly used service combination, a default service combination set by the user, the service combination best suited to the scenario based on everyday common sense, the most frequently used service combination for the scenario obtained from a server, and the like, thereby sparing the user tedious selection operations.
For example, after a super terminal is formed by a mobile phone and a smart television, if the scene service selected by the user is a cross-screen conference, the atomization service combination is automatically set as follows: the intelligent television is used for providing an audio output function, the mobile phone is used for providing an audio input (microphone) function, the intelligent television is used for providing a camera function, and the intelligent television is used for providing a display function. For another example, after the mobile phone, the smart watch, the earphone and the treadmill form a super terminal, if the scenario service selected by the user is indoor running, the atomization service combination is automatically set as follows: the audio output function is provided by the headset, the audio input (microphone) function is provided by the headset, the camera function is provided by the cell phone, and the display function is provided by the treadmill.
In some embodiments, the super terminal system may analyze the optimal atomization service combination according to the device characteristics of different device types. For example, a smart television has a larger screen, gives the user a better visual viewing experience, and is better suited to watching videos, so when a smart television and a mobile phone form a super terminal, the smart television is preferentially set to provide the display function. A smart speaker has stronger audio output capability and a better loudspeaker effect and is better suited to playing audio, so when a smart speaker and a mobile phone form a super terminal, the smart speaker is preferentially set to provide the audio output function. Earphones are small, easy to carry, do not play sound out loud, do not disturb people nearby, may have a noise reduction effect that shields external environmental noise, and are suitable for listening to audio in public places, such as making calls or listening to music, so when earphones and a mobile phone form a super terminal, the earphones are preferentially set to provide the audio output function. Compared with a mobile phone, a smart watch or smart bracelet can measure a user's exercise data or other health information such as heart rate more accurately, so when a smart watch or smart bracelet and a mobile phone form a super terminal, the exercise data collected by the smart watch or smart bracelet is preferentially used. A notebook computer has strong processing capability and is better suited to office work, so when a notebook computer and a mobile phone form a super terminal, the notebook computer is preferentially set to provide the display function, keyboard input function, and the like. Device types and device characteristics are not enumerated exhaustively here, and the above examples do not limit other embodiments.
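These device-characteristic preferences can be expressed as a per-capability priority over device types, with each atomization service then assigned to the best available provider. In the Kotlin sketch below, the device-type rankings merely restate the examples in the paragraph above; the data structures and function names are hypothetical.

```kotlin
enum class Capability { DISPLAY, AUDIO_OUTPUT, AUDIO_INPUT, CAMERA }

data class Dev(val name: String, val type: String, val caps: Set<Capability>)

// Preference order per capability, restating the paragraph above:
// TV preferred for display, speaker then earphones for audio output, etc.
val preference: Map<Capability, List<String>> = mapOf(
    Capability.DISPLAY to listOf("tv", "laptop", "phone"),
    Capability.AUDIO_OUTPUT to listOf("speaker", "earphones", "tv", "phone"),
    Capability.AUDIO_INPUT to listOf("earphones", "phone", "tv"),
    Capability.CAMERA to listOf("phone", "tv", "laptop"))

// Assign each needed capability to the most preferred device that has it.
fun assign(devices: List<Dev>, needed: Set<Capability>): Map<Capability, Dev?> =
    needed.associateWith { cap ->
        val order = preference[cap] ?: emptyList()
        devices.filter { cap in it.caps }
            .minByOrNull { d -> order.indexOf(d.type).let { if (it < 0) order.size else it } }
    }

fun main() {
    val devices = listOf(
        Dev("HuaWei P50", "phone", setOf(Capability.DISPLAY, Capability.AUDIO_OUTPUT,
            Capability.AUDIO_INPUT, Capability.CAMERA)),
        Dev("Vision", "tv", setOf(Capability.DISPLAY, Capability.AUDIO_OUTPUT, Capability.CAMERA)),
        Dev("Sound X", "speaker", setOf(Capability.AUDIO_OUTPUT)))
    assign(devices, Capability.values().toSet())
        .forEach { (cap, dev) -> println("$cap -> ${dev?.name}") }
    // DISPLAY -> Vision, AUDIO_OUTPUT -> Sound X,
    // AUDIO_INPUT -> HuaWei P50, CAMERA -> HuaWei P50
}
```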
In some embodiments, the super terminal supports personalized setting of the atomization service combination: the user may manually modify the combination according to personal preference or actual conditions, adjusting a certain atomization service to be provided by another device. For example, in a cross-screen conference, if the surrounding environment is noisy, the user may switch the audio output function from the smart television to earphones.
In the embodiments provided by this application, the super terminal system may also analyze the current scenario based on the types of devices currently forming the super terminal, and enumerate and recommend to the user other devices suitable for the current scenario, helping the user obtain a better scenario experience; or it may infer further extensible usage scenarios and then list and recommend other devices to the user. If the user adopts the devices recommended by the system, more and richer usage scenarios can be further unlocked.
For example, after the mobile phone cooperates with the smart television, the user's wireless microphone, wireless gamepad, and other devices may be recommended, correspondingly extending scenario services such as karaoke and gaming. As another example, after the mobile phone cooperates with the smart watch, devices such as the user's treadmill, heart rate belt, and blood glucose meter may be recommended, correspondingly extending scenario services such as indoor running, heart health, and blood glucose health. After a recommended device is added, the system may recommend further devices according to the newly formed super terminal. The user may choose to add more devices to the super terminal, making the usage scenarios richer. As more associated devices are added, the scenario services that the super terminal can provide become more numerous and richer, improving the super terminal ecosystem.
The functions provided by the above embodiments are explained below by way of example with reference to the drawings.
Referring to the user interface 700 shown in fig. 7: in the example of fig. 7, when the mobile phone and the smart watch form a super terminal system, a scenario service option area 705 may be displayed below the device connection state diagram 701, listing one or more scenario services recommended by the super terminal system according to the current device combination. A recommended devices area 704 is also displayed, listing devices that may be linked with the current super terminal to bring the user richer scenarios or a better experience; the user may select them as desired. The user interface 700 also displays an atomization service entry option 706; when it is selected, the atomization service combination can be displayed and the user can choose the providing device of each atomization service, referring to the embodiment shown in fig. 10.
In the example of fig. 7, when the mobile phone and the smart watch form a super terminal system, a plurality of scenario service options, such as outdoor running, indoor running, heart health, and vascular health, are displayed in the scenario service option area 705. The user may also slide a finger left and right on the scenario service option area 705 to view more options not shown.
The scenario service options may be obtained by the super terminal system through comprehensive statistics over the device types, device usage, collaborative work content commonly used by the user, the current environment of the devices, the device running states, applications recently used by the user, the currently running application, and other conditions of the mobile phone and the smart watch. Moreover, the probability of each scenario service option being selected may be calculated according to the user's recent frequency of use or the detected environment of the current device, and options with a high probability of being selected are displayed preferentially in the scenario service option area 705.
If the combination of the mobile phone and the smart watch is commonly used for counting exercise amounts, the super terminal system may classify the combination into a sports health scenario based on its device types, device usage, and commonly used collaborative work content. The scenario service options that may be provided in the sports health scenario, including outdoor running, indoor running, heart health, vascular health, and the like, are displayed in the scenario service option area 705.
The super terminal system may display frequently used options first according to statistics of how often the user has used each scenario service option within a certain time. The certain time may be the entire history, or a certain period, such as the last three months. The higher an option's historical usage, the earlier it is displayed. For example, if the scenario service options most frequently used by the user are outdoor running, then indoor running, and so on, then in the scenario service option area 705 the option displayed first is outdoor running, and the option displayed second is indoor running.
In other embodiments, the mobile phone or the smart watch may detect the current environment of the device and sort the scenario service options according to that environment. If the mobile phone or the smart watch detects that the user is outdoors, the outdoor running service option is recommended preferentially. If it detects that the user is indoors, the indoor running service option is recommended preferentially. This embodiment does not limit the manner of detecting the user's environment; for example, whether the device is indoors or outdoors may be determined according to the device's current positioning information, or a camera may capture a photo of the current environment, or ultrasonic waves, infrared rays, and the like may measure the depth or size of the current space to judge whether the user is indoors or outdoors.
In other embodiments, the super terminal system may further determine the environment or scenario of the device according to applications recently opened or used by the user, or the currently running application, and recommend scenario service options accordingly. For example, if before opening the super terminal interface to connect the smart watch and the mobile phone, the user was running a sports health application and using its outdoor running sub-service, then when the user connects the smart watch and the mobile phone to form the super terminal system, the first scenario service option displayed in the scenario service option area 705 is outdoor running.
The above examples do not limit other embodiments; the super terminal system may comprehensively consider one or more of the above factors to recommend scenario service options reasonably.
In some embodiments, when the devices connect to form a super terminal, a certain scenario service may be selected automatically as a default service. For example, the scenario service option ranked first in the scenario service option area 705 is set as the default service, and when the devices connect to form the super terminal, each device is configured by default according to the atomization service combination corresponding to that scenario service. If the user wants another scenario service, other options may be selected. Setting a default scenario service facilitates seamless service switching between devices and spares user operations to a certain extent. For example, when earphones connect with a mobile phone, the default service is set so that the earphones provide the audio output and audio input functions; then, the moment the earphones connect successfully, the audio output and input functions switch to the earphones immediately, with no extra operation required from the user. The default service may be one selected by the user, including a scenario service and/or an atomization service, or it may be a service the user commonly uses, obtained by system statistics, and the like; this embodiment is not limited in this respect.
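The default-service selection just described can be sketched as a small fallback chain: an explicit user default, else the statistically most-used service, else the top-ranked recommendation, applied the moment the super terminal forms. This is a minimal illustrative sketch with hypothetical names; the embodiment does not mandate this order beyond the alternatives it lists.

```kotlin
// Hypothetical sketch of default-service selection at connection time.
data class Scenario(val name: String)

fun pickDefault(userDefault: Scenario?,
                usageCounts: Map<Scenario, Int>,
                ranked: List<Scenario>): Scenario? =
    userDefault                                        // 1. explicit user default
        ?: usageCounts.maxByOrNull { it.value }?.key   // 2. statistically most-used
        ?: ranked.firstOrNull()                        // 3. top-ranked recommendation

fun onSuperTerminalFormed(ranked: List<Scenario>, userDefault: Scenario?,
                          usage: Map<Scenario, Int>, activate: (Scenario) -> Unit) {
    // Apply immediately so the switch is seamless and needs no extra user action.
    pickDefault(userDefault, usage, ranked)?.let(activate)
}

fun main() {
    val ranked = listOf(Scenario("audio projection"), Scenario("voice service"))
    onSuperTerminalFormed(ranked, userDefault = null, usage = emptyMap()) {
        println("Activating default scenario service: ${it.name}")
    }
}
```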
In other embodiments, when the devices connect to form the super terminal, no scenario service is selected at first; instead, one or more scenario service options are listed, and only after the user selects a certain scenario service is that scenario service, or the corresponding atomization service, switched to the configured state. This fully respects the user's intention and avoids the trouble or confusion caused by automatically switching to the wrong service. For example, when a mobile phone connects with a PC and the user intends to transfer files, if the default service were screen mirroring, the phone screen would be projected onto the PC and the user would then have to switch services. In particular, if the PC is currently used for another purpose, such as presenting a document, the projection would block the presentation and trouble the user; or if the user's phone shows private content that should not appear on the PC, switching directly to screen mirroring would also trouble the user. Therefore, when the mobile phone connects with the PC, the scenario service options may be displayed on the super terminal interface, and the switching is executed only after the user selects the desired scenario service option and/or atomization service combination option.
It is noted that when two or more devices form a super terminal and the user selects a scenario service, the devices may acquire the capability or state for executing the interactive function, but are not required to execute it immediately. For example, if the user drags a device icon and thereby triggers the audio projection function between the mobile phone and the smart speaker, but the mobile phone has no audio playing task, or its audio playing task is paused, the mobile phone does not immediately send audio data to the smart speaker. Only when the mobile phone has an audio playing task does it send the audio data to be played to the smart speaker, which then plays it.
In the user interface 700 of fig. 7, icons of a treadmill and a spinning bike are displayed in the recommended devices area 704, prompting the user that the treadmill or spinning bike can be added to the super terminal for a better scenario experience. In the example shown in fig. 7, the super terminal system may analyze the usage scenario as a sports health scenario based on the devices currently forming the super terminal being a mobile phone and a smart watch, and the recommended devices area 704 may display recommended exercise or health monitoring devices, such as a treadmill, a spinning bike, a blood glucose meter, a heart rate monitor, and the like.
When the user selects (clicks or drags) a device icon in the recommended devices area 704, if the device is an associated device detected by the current core device, that is, a device displayed in the device connection state diagram 701, a connection may be initiated to join the device to the super terminal. If the device is not displayed in the device connection state diagram 701, a jump may be made to a page related to that device product, for example introducing the recommended device's functions and user experience, how it can be used with other devices to extend the super terminal, purchase links, purchase channels, and so on.
Adding recommended devices can give the user a better scenario experience or extend more usage scenarios. After a recommended device is added, the system may recommend further devices according to the newly formed super terminal, and the user may choose to add more devices, making the usage scenarios richer. For example, a mobile phone and a smart watch connect to form a super terminal and the system recommends a treadmill; after the user connects the treadmill to the super terminal, the system may recommend earphones, suggesting that the user can listen to music while running. As more devices are added, a more complete super terminal ecosystem can be formed.
In some embodiments, the mobile phone may display a prompt message about a recommended device on the user interface. For example, after detecting that the mobile phone is cooperating with the smart television and finding that a speaker is also nearby, the phone may display a prompt such as "Add speaker Sound X to the super terminal?" with the options "Yes" and "No". If the user selects "Yes", the speaker Sound X is added to the super terminal; if the user selects "No", no further operation is performed. Alternatively, only the option "Yes" is displayed with the prompt: if the user selects "Yes", the speaker Sound X is connected into the super terminal, and if the user does not operate within a certain time, such as two seconds, the prompt is dismissed automatically and no further operation is performed.
In the embodiments of the present application, the services or functions between devices that the super terminal can provide include, but are not limited to, system-level services or functions and application-level services or functions. By way of illustration, system-level services or functions may include, but are not limited to, audio projection, screen projection, and the like, and application-level services or functions may include, but are not limited to, application relay, content sharing, and the like.
Audio projection means that an audio stream being played on one electronic device is sent to another electronic device and played there, for example projection between a mobile phone and a smart speaker, between a mobile phone and a smart screen in the screen-off state, between a mobile phone and a vehicle, and the like.
Screen projection includes streaming (stream) projection based on the Digital Living Network Alliance (DLNA) technology (also referred to as heterogeneous projection), screen mirroring based on the Miracast technology (also referred to as homogeneous projection), and projection based on other technologies (e.g., AirPlay), for example projection between a mobile phone and a smart television/PC/tablet. If the user wants to project video or audio played on the mobile phone to another electronic device, stream projection can be used: the mobile phone sends the playback address of the video or audio to the other electronic device (e.g., a PC), and the PC plays the corresponding video or audio according to the received address; or the mobile phone sends the video or audio data it is playing to the PC, and the PC plays directly from the received data. During such projection, the interface displayed on the mobile phone can differ from the content displayed on the PC. If the user wants to project the display interface of the mobile phone screen to another electronic device, screen mirroring can be used: the mobile phone sends its interface data to the other electronic device (e.g., a PC), and the PC displays the mobile phone's interface. During screen mirroring, the interface displayed on the mobile phone is the same as that displayed on the PC.
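The two projection modes can be summarized as: stream projection hands the receiver a playback address (or the media data) and lets it render independently, while screen mirroring continuously sends the sender's composed interface. The Kotlin sketch below contrasts the two under a hypothetical transport interface; it does not show the actual DLNA or Miracast protocol APIs.

```kotlin
// Hypothetical sketch contrasting the two projection modes; the real DLNA /
// Miracast protocol APIs are intentionally not modeled here.
sealed interface Projection
data class StreamProjection(val playbackUrl: String) : Projection        // heterogeneous
data class ScreenMirroring(val frameSource: () -> ByteArray) : Projection // homogeneous

fun project(p: Projection, sendToReceiver: (String) -> Unit) = when (p) {
    is StreamProjection ->
        // Receiver fetches and plays the media itself; the phone's own
        // interface can meanwhile show something entirely different.
        sendToReceiver("PLAY ${p.playbackUrl}")
    is ScreenMirroring ->
        // Receiver displays exactly what the phone displays, frame by frame.
        sendToReceiver("FRAME of ${p.frameSource().size} bytes")
}

fun main() {
    project(StreamProjection("https://example.com/video.mp4")) { println(it) }
    project(ScreenMirroring { ByteArray(1920 * 1080 * 4) }) { println(it) }
}
```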
Application relay means that one electronic device can take over and continue running an application that is running on another electronic device, for example application relay between a mobile phone and a smart television/PC/tablet; relayed applications include, for example, video call applications, video applications, music applications, and document applications.
Content sharing includes sharing content such as files, photos, videos, audio, and text among different electronic devices.
As can be seen from the above, the exemplary user interface 700 shown in fig. 7 can be a unified entry provided by the core device (the mobile phone) for interacting with multiple associated devices, and the user may trigger any two or more electronic devices in the user interface 700 to form a super terminal. The super terminal can judge the usage scenario according to each device's type, characteristics, product positioning, common functions, device state, currently running application, current environment, and the like, and provide one or more scenario service options, atomization service combination options, or interactive function options based on that scenario. The user can quickly select a scenario service and/or atomization service according to personal needs, without tedious, scattered settings. This solves the problems that the setting entries for interactive functions between electronic devices are scattered, that configuration is cumbersome, and that rich usage scenarios cannot be listed for the user.
According to the method and devices of this application, once the user selects a device combination, the scenario services and/or atomization services available under that combination are provided in a targeted manner, detailed and rich usage scenarios are listed for the user, and the user is guided to use the super terminal's functions and fully explore more application scenarios, achieving the effect of recommending scenario services according to the device combination.
Referring to fig. 8, when the mobile phone detects that the user clicks the outdoor running option 707 listed in the scenario service option area 705 in the user interface 700, the mobile phone may jump to an outdoor-running-related page, as shown in the user interface 800. The outdoor running option 707 may be highlighted when selected.
The user interface 800 is a sports health application page; sub-options are displayed below the sports health title bar: an outdoor running sub-option 801, an indoor running sub-option, a heart health sub-option, and a vascular health sub-option. Since the user selected the outdoor running option 707 in the user interface 700, the jump lands on the page corresponding to the outdoor running sub-option 801 in the user interface 800, showing, for example, a map, a user positioning icon, the user's movement route, exercise amount statistics, and the like.
Accordingly, the smart watch displays an outdoor-running-related interface, see the smart watch user interface 802. The smart watch user interface 802 need not be identical to the mobile phone's outdoor running user interface 800, and the smart watch and the mobile phone may share exercise data. Exercise data detected by the smart watch, such as pace, distance, heart rate, and exercise time, can be displayed on the smart watch user interface 802.
Fig. 9 shows that the scenario service options and recommended devices may change correspondingly with different device combinations; the super terminal may provide more appropriate scenario service options and recommended devices according to the device combination.
Referring to the user interface 900 shown in fig. 9: in the user interface 900, the device connection state diagram indicates that the mobile phone and the smart television Vision are connected to form a super terminal; the indication area 901 shows the mobile phone icon and the smart television Vision icon adsorbed together, and "coordination with Vision" is described below the super terminal title.
In the user interface 900, the scenario service option area 903 includes scenario service options such as multi-screen collaboration, screen extension, cross-screen conference, and motion-sensing games. The recommended devices area 902 includes device icons of a speaker, a notebook computer, and a gamepad. These scenario service options and recommended device options are recommendations derived from the devices included in the super terminal, namely the mobile phone and the smart television Vision.
If the user selects the speaker icon in the recommended devices area 902 to join the super terminal, or drags the speaker Sound X icon in the device connection state diagram to attach it around the mobile phone icon, the user interface 910 may be displayed.
In the user interface 910, the device connection state diagram indicates that the mobile phone is connected with the smart television Vision and the speaker Sound X to form the super terminal; the indication area 911 shows the mobile phone icon, the smart television Vision icon, and the speaker Sound X icon adsorbed together, and "cooperate with Vision and Sound X" is described below the super terminal title.
In the user interface 910, the scenario service option area 913 includes scenario service options such as home theater, motion-sensing games, karaoke, and cross-screen conference. The recommended devices area 912 includes the device icons of the notebook computer and the gamepad. These scenario service options and recommended device options are recommendations derived from the devices included in the super terminal, namely the mobile phone, the smart television Vision, and the speaker Sound X. In this example, the super terminal system may infer that the combination of the mobile phone with the smart television Vision and the speaker Sound X leans more toward entertainment than the combination of the mobile phone with the smart television Vision alone, and thus preferentially recommends entertainment-oriented scenario services.
If the user selects the atomization service entry option in the user interface 910, one or more atomization service combinations may be displayed, as in the user interface 1000 and the user interface 1010 shown in fig. 10, and the user may select different devices to provide different atomization services.
Referring to fig. 10, a service setting window 1001 is displayed in the user interface 1000, showing a plurality of atomization service options, such as picture display, sound output, sound input, and camera. The picture display service/function is provided by the smart television Vision, the sound output service/function by the smart speaker Sound X, the sound input service/function by the mobile phone HuaWei P50, the camera service/function by the mobile phone HuaWei P50, and so on.
In this example, the option bar 1002 corresponding to sound output includes the sound output service/function identifier, together with the icon and name of the device currently providing that service/function, the smart speaker Sound X.
In some embodiments, the super terminal supports personalized setting of the atomization service combination: the user may manually modify the combination according to personal preference or actual conditions, adjusting a certain atomization service to be provided by another device. For example, the user may switch the audio playing service from the mobile phone to the smart speaker.
If the user wants to change the device providing the service, the user can click on the control 1003, and the mobile phone can display the user interface 1010.
Inside the service setting window 1011 in the user interface 1010, the option bar 1012 corresponding to sound output displays a plurality of device options, namely the mobile phone HuaWei P50, the smart television Vision, and the speaker Sound X in the super terminal. The user may click the selection control 1013 to the right of a device name to confirm that the sound output service/function is provided by that device.
In some embodiments, the super terminal system may analyze the optimal atomization service combination according to the device characteristics of different device types, allocating different devices to carry different services. For example, in this example the smart television, the smart speaker, and the mobile phone form a super terminal: the smart television has a larger screen, gives a better visual viewing experience, and is better suited to watching videos, so it is preferentially set to provide the display function; the smart speaker has strong audio output capability and a good loudspeaker effect and is better suited to playing audio, so it is preferentially set to provide the audio output function; and the mobile phone is light, easy to hold, and convenient for microphone pickup, so it is preferentially set to provide the audio input and camera functions.
In other embodiments, the atomization service combination for a scenario may be set automatically based on a service combination the user commonly uses, a default service combination set by the user, the service combination best suited to the scenario based on everyday common sense, or the most frequently used service combination for the scenario obtained from a server, thereby sparing the user tedious selection operations.
In some embodiments, a blank option may also be set in the service settings window 1011, i.e., the atomized service is not provided by any one device.
In some embodiments, a corresponding prompt may be displayed on the mobile phone interface to tell the user which device can best provide an atomization service, for example that the current speaker is the best device for playing audio.
In some embodiments, when the user has finished setting, the service setting window may be slid down to close it and return to the previous interface.
Fig. 11, fig. 12, fig. 13A, fig. 13B, fig. 13C, and fig. 13D show other interaction modes and schematic interfaces for selecting a scenario-based service option.
As shown in the user interface 1100 of fig. 11, the scenario service options may be displayed in the form of floating controls, with different options corresponding to different areas; this makes the interaction more engaging and intuitive, the interface simpler, and the corresponding user operation more convenient.
In the user interface 1100, when the user presses and holds the smart watch icon, a circular floating area 1101 may be displayed on the super terminal interface, centered on the mobile phone icon. The circular floating area 1101 may be divided into a plurality of areas, each corresponding to one scenario service option. In the user interface 1100, the circular floating area 1101 is divided into four areas corresponding respectively to the scenario service options: outdoor running, indoor running, heart health, and vascular health.
If the user wants to select a certain scenario service option, such as the outdoor running option, the user may drag the smart watch icon into the area corresponding to that option, then continue dragging inward within that area toward the mobile phone icon; the user's finger trajectory may refer to the schematic trajectory in the user interface 1100. When the smart watch icon reaches the adsorption area, the user releases the finger, the service corresponding to outdoor running starts, and the mobile phone jumps to display the outdoor running interface shown in the user interface 800. Of course, if there is only one scenario service option, the circular floating area 1101 need not be partitioned, and approaching the mobile phone icon from any direction selects that same scenario service. The selected scenario service option may be highlighted, as in fig. 11 where the area corresponding to the outdoor running option is highlighted.
One operation in fig. 11 is equivalent to the two operations in fig. 6 and fig. 8; that is, a single user operation in fig. 11 combines the two functions of connecting the associated device to the core device and selecting the scenario service, which improves user operation efficiency.
It should be noted that the scenario service options presented in the floating area may also follow a recommendation order; this embodiment does not limit how the recommendation order is displayed. For example, with four scenario service options corresponding to four areas and the center as the origin, call the first-quadrant direction area A, the second-quadrant direction area B, the third-quadrant direction area C, and the fourth-quadrant direction area D. With the recommendation order outdoor running, indoor running, heart health, vascular health, the options may be assigned to the areas as A, B, C, D; or B, A, C, D; or B, C, A, D; or B, A, D, C; or A, B, D, C; or B, C, D, A; or A, D, B, C; or A, D, C, B; and so on. This embodiment is not limited, and the assignment may be set according to actual situations, such as the user's operating habits or frequently touched regions.
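The mapping from the drag position to a quadrant area can be computed from the angle of the dragged icon relative to the center icon. A minimal Kotlin sketch follows; the function name, coordinates, and the particular option-to-quadrant assignment are illustrative, since, as noted above, the assignment is configurable.

```kotlin
import kotlin.math.atan2

// Map a drag point to one of four quadrant areas around the center icon.
// Here areas A..D (quadrants I..IV) hold the options in recommendation order,
// one of the many assignments the text allows.
fun quadrantOf(cx: Float, cy: Float, x: Float, y: Float): Int {
    // Screen coordinates grow downward, so negate dy for the math convention.
    val angle = atan2(-(y - cy).toDouble(), (x - cx).toDouble()) // in (-PI, PI]
    val deg = Math.toDegrees(angle).let { if (it < 0) it + 360 else it } // [0, 360)
    return (deg / 90).toInt() // 0 = area A (quadrant I), 1 = B, 2 = C, 3 = D
}

fun main() {
    val options = listOf("outdoor running", "indoor running", "heart health", "vascular health")
    // Drag point up and to the right of the center icon -> quadrant I.
    val area = quadrantOf(cx = 540f, cy = 960f, x = 700f, y = 800f)
    println("Highlighted option: ${options[area]}") // outdoor running
}
```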
Referring to the user interface 1200: when the user presses and holds the smart watch icon, a circular floating area 1201 may be displayed on the super terminal interface, centered on the mobile phone icon. The circular floating area 1201 may be divided into a plurality of areas, each corresponding to one scenario service option. In the user interface 1200, the circular floating area 1201 is divided into four areas corresponding respectively to the scenario service options: outdoor running, indoor running, heart health, and vascular health.
If the user wants to select a certain scenario service option, such as the outdoor running option, the user may drag the smart watch icon into the area corresponding to that option and then release the finger; this selects the outdoor running option, the corresponding service starts, and the mobile phone jumps to the outdoor running interface shown in the user interface 800. The finger trajectory may refer to the gesture trajectory in the user interface 1200. Of course, if there is only one scenario service option, the circular floating area 1201 need not be partitioned: dragging the device icon to any position in the floating area 1201 and releasing selects that same scenario service. The selected option may be highlighted, as in fig. 12 where the area corresponding to the outdoor running option is highlighted. Likewise, this embodiment does not limit how the recommendation order of the scenario service options is displayed.
Compared with the interaction mode shown in fig. 11, the interaction mode shown in fig. 12 is easier for the user to operate and has a lower probability of mis-selection.
The circular floating areas shown in fig. 11 and fig. 12 are schematic; in other embodiments, the areas representing the scenario service options may take other forms.
Referring to the user interface 1300 shown in fig. 13A, the scenario service option area is a square floating area. The user interface 1300 displays, centered on the mobile phone, a square floating area 1301 divided into four areas corresponding respectively to the scenario service options: outdoor running, indoor running, heart health, and vascular health. If the user wants to select the outdoor running service, the user may, in the interaction mode of fig. 11, drag the smart watch icon into the area corresponding to the outdoor running option and then continue dragging inward toward the mobile phone icon; when the smart watch icon reaches the adsorption area, the user releases the finger, the outdoor running service starts, and the mobile phone jumps to the outdoor running interface shown in the user interface 800. Or, in the interaction mode of fig. 12, the user drags the smart watch icon into the area corresponding to the outdoor running option and then releases the finger, selecting that option, starting the service, and jumping to the outdoor running interface shown in the user interface 800. The selected option may be highlighted, as in fig. 13A where the area corresponding to the outdoor running option is highlighted. Likewise, this embodiment does not limit how the recommendation order of the scenario service options is displayed.
In some examples, as in the user interface 1310 shown in fig. 13B, compared with fig. 12, the mobile phone icon may not be displayed at the center; instead, after the user selects the smart watch icon, a circular floating area 1311 containing only the scenario service options is displayed. The circular floating area 1311 shown in fig. 13B is divided into four areas corresponding respectively to the scenario service options: outdoor running, indoor running, heart health, and vascular health. If the user wants to select the outdoor running service, the user may drag the selected smart watch icon into the area corresponding to the outdoor running option, which is then highlighted, and release the finger; this selects the option, starts the service, and the mobile phone jumps to the outdoor running interface shown in the user interface 800. Likewise, this embodiment does not limit how the recommendation order of the scenario service options is displayed.
Another display style is shown in the user interface 1320 of fig. 13C: after the user selects the smart watch icon, the mobile phone displays a circular floating area 1321 comprising a plurality of concentric ring areas, each corresponding to one scenario service option. As shown in fig. 13C, the circular floating area 1321 has four ring areas corresponding respectively to the scenario service options: outdoor running, indoor running, heart health, and vascular health. If the user wants to select the outdoor running service, the user may drag the selected smart watch icon into the ring corresponding to the outdoor running option, which is then highlighted, and release the finger; this selects the option, starts the service, and the mobile phone jumps to the outdoor running interface shown in the user interface 800. Likewise, this embodiment does not limit how the recommendation order of the scenario service options is displayed. In one implementation, the more preferentially recommended options are displayed closer to the inner ring, where a ring with a smaller diameter, i.e., closer to the center, is called an inner ring relative to a ring with a larger diameter. For example, the outdoor running option, recommended first, may be displayed in the innermost ring; the indoor running option, recommended second, in the second ring; the heart health option, recommended third, in the third ring; and the vascular health option, recommended fourth, in the outermost ring.
Fig. 13D shows another display style, the user interface 1330: after the user selects the smart watch icon, the mobile phone displays a fan-shaped floating area 1331 comprising a plurality of fan-ring areas, each corresponding to one scenario service option. As shown in fig. 13D, the fan-shaped floating area 1331 has four fan-ring areas corresponding respectively to the scenario service options: outdoor running, indoor running, heart health, and vascular health. If the user wants to select the outdoor running service, the user may drag the selected smart watch icon into the area corresponding to the outdoor running option, which is then highlighted, and release the finger; this selects the option, starts the service, and the mobile phone jumps to the outdoor running interface shown in the user interface 800.
In the example shown in fig. 13D, the fan-shaped floating area 1331 is a quarter-circle fan whose center is at the lower right corner, considering that users often operate the mobile phone with the right hand; this design is reasonable for user convenience. Of course, without being limited to the example of fig. 13D, the fan-shaped floating area 1331 may also be a quarter-circle fan centered at the lower left corner, convenient for left-handed users. The mobile phone may switch between display control styles for the right or left hand according to the dominant hand set by the user, or may judge whether the user is currently operating with the left or right hand to determine the display style of the floating control, and so on.
Similarly, this embodiment does not limit how the recommendation order of the scenario service options is displayed. In one implementation, as shown in the user interface 1330, considering that users often hold the mobile phone with the right hand over its lower half, the prioritized options are displayed closer to the inner ring, where the fan ring closer to the center, i.e., with the smaller radius, is called an inner ring. If the outdoor running option is the first recommended option, it may be displayed in the innermost fan ring; the indoor running option, recommended second, in the second fan ring; the heart health option, recommended third, in the third fan ring; and the vascular health option, recommended fourth, in the outermost fan ring.
The above embodiments are only examples; the embodiments of the present application do not limit the form in which the scenario service options and atomization service options are selected, and technical solutions of the same inventive concept fall within the protection scope of the present application.
In the embodiments shown in fig. 11, 12, 13A, 13B, 13C, and 13D, after the user selects the outdoor running option, the corresponding service starts and the mobile phone jumps to the outdoor running service interface shown in the user interface 800. When the user chooses to return to the super terminal page, referring to the user interface 1400 shown in fig. 14, the device connection state diagram area on the super terminal page shows the smart watch icon attached around the mobile phone icon, indicating that the smart watch has established a connection with the mobile phone and they can work cooperatively.
In some embodiments, when multiple devices form a super terminal and a scenario service is started for cooperative work, the interaction orientation information in the interface can also serve as an input parameter, with different orientation inputs corresponding to different feedback. For example, when a mobile phone connects with an associated device such as a pad, PC, or smart television to start the multi-screen collaboration service, dragging the associated device icon to attach and adsorb from the left, right, or middle area of the mobile phone icon produces different effects. If the user drags the associated device icon to attach to the left area of the mobile phone icon and they adsorb together, the mobile phone screen is projected to the right side of the associated device screen; if to the right area, the mobile phone screen is projected to the left side of the associated device screen; and if to the middle area, the mobile phone screen is projected full screen on the associated device screen.
Of course, in some embodiments, swapping which icon is dragged (the adsorbing side versus the adsorbed side) executes the same service or interactive function. The user may equally press the mobile phone icon and drag it to the left, right, or middle area of the associated device icon for attachment and adsorption, with service feedback identical to the effects described in the previous paragraph: if the user drags the mobile phone icon to attach from the right area of the associated device icon, the mobile phone screen is projected to the right side of the associated device screen; if from the left area, to the left side; and if to the middle area, full screen. In summary, the effect exhibited by the scenario service may vary according to the orientation in which the core device icon is connected to the associated device icon.
In another example, for audio devices such as speakers and earphones, different adsorption orientations may correspond to different sound channels. Assuming the user has two or more speakers: dragging the first speaker icon to attach and adsorb from the left area of the mobile phone icon lets that speaker provide the left-channel audio output, and dragging the second speaker icon to attach and adsorb from the right area lets it provide the right-channel audio output. Other multi-channel arrangements work in the same way.
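The orientation-dependent behavior in the preceding paragraphs amounts to a mapping from the drop zone of the dragged icon (left, right, or middle of the target icon) to an effect. A compact Kotlin sketch follows; the names are hypothetical, and the middle-zone behavior for audio is an assumption, since the text specifies only the left and right channels.

```kotlin
// Hypothetical mapping from the drop zone of a dragged device icon to the
// triggered effect, restating the examples in the text.
enum class DropZone { LEFT, MIDDLE, RIGHT }

fun screenEffect(zone: DropZone): String = when (zone) {
    DropZone.LEFT -> "project phone screen on the RIGHT side of the target screen"
    DropZone.RIGHT -> "project phone screen on the LEFT side of the target screen"
    DropZone.MIDDLE -> "project phone screen FULL SCREEN on the target"
}

fun audioChannel(zone: DropZone): String = when (zone) {
    DropZone.LEFT -> "assign the LEFT channel to this speaker"
    DropZone.RIGHT -> "assign the RIGHT channel to this speaker"
    // Assumption: the text does not specify a middle-zone behavior for audio.
    DropZone.MIDDLE -> "assign stereo output to this speaker"
}

fun main() {
    println(screenEffect(DropZone.LEFT))  // matches the fig. 15 example below
    println(audioChannel(DropZone.RIGHT)) // second speaker provides the right channel
}
```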
In other embodiments, swapping which icon is dragged (the adsorbing side versus the adsorbed side) may execute different services or interactive functions. For example, the data transmission direction may differ: whether a device acts as the adsorbing side or the adsorbed side indicates whether it is the sender or the receiver of the data, for instance if the transmission direction is defined as the adsorbing-side device sending a data stream to the adsorbed-side device. When sharing a file, pressing and dragging the PC icon to attach to the mobile phone icon corresponds to sending the PC's file to the mobile phone, while pressing and dragging the mobile phone icon to attach to the PC icon corresponds to sending the mobile phone's file to the PC.
For another example, the device icon may be different from the adsorption party or the adsorbed party, which triggers the scenized service. For example, if the PC icon is pressed and dragged to be attached to the mobile phone icon, the triggered service is to send the file of the PC to the mobile phone, and if the mobile phone icon is pressed and dragged to be attached to the PC icon, the triggered service is to project the screen of the mobile phone onto the screen of the PC.
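For illustration only, the following minimal sketch (written in Kotlin; the names Trigger, resolveTrigger, and the "phone"/"pc" identifiers are invented for this description and are not part of the claimed method) shows one way the adsorbing/adsorbed roles in the examples above could select both the triggered service and the data direction:

    // Hypothetical sketch: the adsorbing (dragged) device is treated as the data sender,
    // and dragging the phone onto the PC triggers screen projection instead of file sharing.
    enum class Service { SEND_FILE, PROJECT_SCREEN }

    data class Trigger(val service: Service, val senderId: String, val receiverId: String)

    fun resolveTrigger(draggedId: String, targetId: String): Trigger {
        val service = if (draggedId == "phone" && targetId == "pc") Service.PROJECT_SCREEN
                      else Service.SEND_FILE
        return Trigger(service, senderId = draggedId, receiverId = targetId)
    }

Under this assumed convention, resolveTrigger("pc", "phone") yields a file transfer from the PC to the phone, while resolveTrigger("phone", "pc") yields screen projection, matching the two examples above.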
Reference is made to the examples shown in fig. 15, fig. 16, and fig. 17, which illustrate that when a user drags the notebook computer MateBook icon close to the mobile phone icon to establish a multi-screen cooperative connection, adsorption in different directions and different areas produces different effects.
As shown in fig. 15, in the user interface 1500 of the mobile phone, the user drags the notebook computer MateBook icon to attach to the left area 1501 of the mobile phone icon. When the notebook computer MateBook icon is located in the left area 1501, a prompt "screen projection on the right side" may be displayed on the interface, prompting the user that selecting this option causes the mobile phone screen to be projected on the right side of the notebook computer screen. When the user drags the notebook computer MateBook icon to the left area 1501 and then releases the finger, the mobile phone icon and the notebook computer MateBook icon may be adsorbed together. It can be seen that, on the notebook computer screen, the display interface 1502 of the mobile phone is projected to the right side of the display interface of the notebook computer.
As shown in fig. 16, in the user interface 1600 of the mobile phone, the user drags the notebook computer MateBook icon to attach to the right area 1601 of the mobile phone icon. When the notebook computer MateBook icon is located in the right area 1601, a prompt "screen projection on the left side" may be displayed on the interface, prompting the user that selecting this option causes the mobile phone screen to be projected on the left side of the notebook computer screen. When the user drags the notebook computer MateBook icon to the right area 1601 and then releases the finger, the mobile phone icon and the notebook computer MateBook icon may be adsorbed together. It can be seen that, on the notebook computer screen, the display interface 1602 of the mobile phone is projected to the left side of the display interface of the notebook computer.
As shown in fig. 17, in the user interface 1700 of the mobile phone, the user drags the notebook computer MateBook icon to the middle area 1701 of the mobile phone icon. When the notebook computer MateBook icon is located in the middle area 1701, a prompt "full screen projection" may be displayed on the interface, prompting the user that selecting this option causes the mobile phone to be displayed in full screen on the notebook computer screen. When the user drags the notebook computer MateBook icon to the middle area 1701 and then releases the finger, the mobile phone icon may be adsorbed onto or overlapped with the notebook computer MateBook icon. It can be seen that, on the notebook computer screen, the display interface 1702 of the mobile phone is projected and displayed in full screen on the notebook computer.
When the core device and the associated device are connected to execute the interactive function, the interface contents shown by the core device and the associated device may be the same or different.
In some embodiments, although the interface content shown by the core device differs from that shown by the associated device, the displayed interfaces are related to each other rather than separately displaying unrelated content.
As shown in fig. 18, in the user interface 1800 shown on the mobile phone, the mobile phone is connected with the smart television, and the scenarized service area displays service options such as multi-screen collaboration, home theater, cross-screen conference, and motion sensing game. The user selects the home theater service option 1801, and the mobile phone and the smart television then display pages related to the home theater service.
As shown in fig. 18, the smart television plays a video: a video picture is displayed in the interface 1806, and the current application title "home theater", a prompt "This device is collaborating with HuaWei P50", the weather, the time, and the like are displayed on top of the video picture. The mobile phone user interface 1810 displays the remote control interface of the home theater application, including the title "home theater", a prompt 1802 "This device is collaborating with Vision; tap to remote control Vision", a remote control operation area 1803, a switching control 1804, and the like. The remote control operation area 1803 may be used by the user to control the smart television, and may include a homepage control, a switch control, a volume-up control, a volume-down control, a return control, a menu control, a voice control, a confirmation control, up/down/left/right direction controls, and the like.
The switching control 1804 is used for switching the mobile phone interface between the video picture and the remote control interface; on the remote control interface, the switching control 1804 prompts "Switch to display picture". As shown in fig. 19, when the user clicks the switching control 1804, the mobile phone can display the video picture 1901 in the user interface 1900, together with a switching control 1902. The switching control 1902 prompts "Switch to remote interface" on the video playing interface of the mobile phone; if the user clicks the switching control 1902, the mobile phone may return to displaying the remote control interface, i.e., the user interface 1810.
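As a sketch of the toggle behavior just described (the names HomeTheaterController and PhoneView are hypothetical; this is one possible reading of figs. 18 and 19, not the actual implementation):

    // One control flips the phone between the remote-control interface and the video picture,
    // and its label always names the view the user would switch to.
    enum class PhoneView { REMOTE_CONTROL, VIDEO_PICTURE }

    class HomeTheaterController(private var view: PhoneView = PhoneView.REMOTE_CONTROL) {
        fun toggleLabel(): String = when (view) {
            PhoneView.REMOTE_CONTROL -> "Switch to display picture"
            PhoneView.VIDEO_PICTURE -> "Switch to remote interface"
        }
        fun onToggleClicked() {
            view = if (view == PhoneView.REMOTE_CONTROL) PhoneView.VIDEO_PICTURE
                   else PhoneView.REMOTE_CONTROL
        }
    }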
In other embodiments, after the core device cooperates with the associated device, some functions of the core device may be migrated to the associated device so that the associated device provides the service, while the core device can hide the service in the background and run another application in the foreground. For example, after a mobile phone cooperates with a smart television, the video playing service is provided by the smart television; the user can hide the service in the background and open a chat application in the foreground, watching the video while chatting.
It should be understood that the user interfaces described in fig. 4 to fig. 19 are only exemplary interfaces for assisting the reader in understanding the technical solution described in the present application, and do not limit the user interfaces of other embodiments of the present application. In other embodiments, more or fewer user interfaces may be added or subtracted according to actual situations, or more or fewer controls may be added or subtracted, or different man-machine interaction operations may be designed, so that the user interfaces are more suitable for the user experience.
In conjunction with the foregoing embodiments shown in fig. 1 to fig. 19, a device interaction method provided by the embodiments of the present application is described below.
The following method embodiments may be applied in a communication system formed by a plurality of devices, such as the aforementioned communication system 10. The communication system may also be referred to as a super terminal. The plurality of electronic devices may be mobile phones, tablet computers, notebook computers, personal Computers (PCs), smart televisions (also called smart screens, large screens, etc.), or wearable devices such as smart watches, smart bracelets, etc., and the application does not limit the types of the devices.
And a communication connection is established between the core device and the associated device, and communication can be carried out through the communication connection to transmit data or instructions. The communication connection may be a wired connection, a wireless connection, or a combination of these, and the embodiment is not limited thereto.
Each terminal device in the communication system may be equipped with its own operating system. The operating systems of the terminal devices in the communication system may be the same or different, and this application does not limit this. In some embodiments, when HarmonyOS is installed on each terminal, the communication system may be referred to as a HarmonyOS super virtual device, and may also be referred to as a HarmonyOS super terminal.
In the following method embodiment, a core device/center device (e.g., a mobile phone) is a control center of a super terminal system, bears functions of device management, service scheduling, and the like, and may be connected to and cooperate with one or more other associated devices (e.g., a notebook computer, a PC, a tablet computer, a smart watch, a smart television, and the like), so as to provide a plurality of scenario services and a plurality of atomic service combinations for a user.
In some embodiments, the super terminal application is installed on the core device or the associated device, so that a user can conveniently manage each terminal device in the super terminal system.
It will be appreciated that the user may interact on any device on which the super-terminal application is installed. The instructions and data input by the user in the super terminal application on any one device can be synchronously migrated to other devices in the super terminal system.
Fig. 20 is a flowchart of the device interaction method provided in this embodiment, which specifically includes the following steps:
s101, the first device displays a first interface, and the first interface comprises a first control indicating the first device and a second control indicating the second device.
The first control is a device option indicating a first device and the second control is a device option indicating a second device.
The first device may be the core device, the second device may be the associated device, and the second device is a peripheral device discovered by the first device.
The first interface is the super terminal interactive interface; through user operations acting on the first interface, the user can control and manage each device in the super terminal system. This embodiment does not limit the representation form of the super terminal interactive interface; for one representation, refer to the embodiment described in fig. 5.
The first control is an icon indicating a core device, such as a cell phone icon shown in fig. 5. The second control is an icon indicating an associated device, such as a smart watch icon as shown in fig. 5.
Referring to fig. 5, a device connection status diagram may be displayed in the super terminal interactive interface, where the device connection status diagram has a core device icon at the center and one or more associated device icons surrounded by the core device icon.
The device connection status diagram may indicate a connection relationship, a relative positional relationship, a signal strength relationship, and the like between the core device and the associated device.
As shown in fig. 7, the mobile phone icon and the smart watch icon are adsorbed together, indicating that the mobile phone and the smart watch are connected and can cooperate to provide system-level services. The smart band icon floats free at the periphery of the mobile phone icon and is not adsorbed onto, overlapped with, or attached to the mobile phone icon; this separated state indicates that the mobile phone is not connected with the smart band.
In some embodiments, the relative positions of the associated device icon and the core device icon displayed in the device connection status diagram may indicate the positional relationship of the associated device and the core device in real space, such as relative orientation information and relative distance information, which helps the user quickly find and distinguish the electronic devices. That is, the positions of the core device icon and the associated device icon in the device connection status diagram are consistent with the positions of the core device and the associated device in real space, and the farther the associated device is from the core device in real space, the farther the associated device icon is displayed from the core device icon in the interface. This embodiment does not limit the specific technology for implementing this function; for example, the mobile phone may obtain the azimuth and the measured distance of the associated device based on UWB technology to derive the spatial mapping relationship between the associated device and the mobile phone in real space.
In other embodiments, the distance between the associated device icon and the core device icon in the interface may indicate the signal strength of the associated device detected by the core device, such as bluetooth signal strength or WiFi signal strength, for example, the stronger the signal strength of the associated device detected by the core device, the closer the associated device icon and the core device icon are displayed in the interface.
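For illustration, the following sketch (in Kotlin; the functions posFromUwb and radiusFromRssi and all constants are assumptions made for this description, not disclosed formulas) shows one plausible way the two icon layouts above could be computed:

    import kotlin.math.cos
    import kotlin.math.sin

    data class IconPos(val x: Float, val y: Float)

    // Spatial layout: map a UWB measurement (azimuth in radians, distance in meters)
    // to screen coordinates around the core device icon, so that the icon arrangement
    // mirrors the devices' real-space arrangement.
    fun posFromUwb(centerX: Float, centerY: Float, azimuth: Double, distanceM: Double,
                   pxPerMeter: Float): IconPos =
        IconPos(centerX + (distanceM * pxPerMeter * cos(azimuth)).toFloat(),
                centerY + (distanceM * pxPerMeter * sin(azimuth)).toFloat())

    // Signal-strength layout: map RSSI (e.g. Bluetooth, in dBm) to a radius,
    // so that a stronger signal places the icon closer to the core device icon.
    fun radiusFromRssi(rssiDbm: Int, minRadius: Float, maxRadius: Float): Float {
        val t = ((rssiDbm + 100) / 70f).coerceIn(0f, 1f)  // assume roughly -100..-30 dBm
        return maxRadius - t * (maxRadius - minRadius)
    }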
S102, the first device detects a first user operation.
If the user wants the second device and the first device to form a super terminal and realize certain cooperative functions, the user may perform a first user operation of selecting the second device. This embodiment does not set any limit on the specific form of the first user operation. The first user operation is used for the user to associate the first device with the second device.
In one implementation, the first user operation may be a user operation acting on the second control; for example, the user presses and holds the second control, drags it close to the first control, and releases it when it reaches the designated area. In some embodiments, the designated area may be a circular area centered on the first control with a radius equal to a first distance.
Referring to the embodiment shown in fig. 6, the user may select the smart watch icon 506 corresponding to the smart watch on the super terminal interface and drag the smart watch icon 506 close to the cell phone icon 505. When the distance between the smart watch icon 506 and the cell phone icon 505 is less than or equal to a second distance (R2 shown in fig. 6), or when the smart watch icon 506 enters the designated area (the annular area closest around the cell phone icon in fig. 6, that is, a circular area centered on the cell phone icon 505 with radius R), the user may release the finger; the device connection status diagram 701 shown in fig. 7 is then displayed, with the smart watch icon 702 attached to the edge of the cell phone icon 505.
In some embodiments, the selected device icon may change position following the track of the user's finger dragging on the screen. This embodiment does not limit the motion trajectory of the finger when the user drags the device icon; the drag trajectory may be an arbitrary curve. For example, after the user selects a certain associated device icon and the finger drags it for a distance on the screen or stays at a certain position on the screen for a second preset duration, the mobile phone may detect whether the distance between the current associated device icon and the mobile phone icon is less than or equal to a certain distance, for example, the second distance; if so, the associated device icon and the mobile phone icon are adsorbed together. As the user's finger moves, the associated device icon moves along the finger's trajectory.
In other embodiments, the selected device icon may not change position with the trajectory of the user's finger dragging across the screen. For example, after the user selects a certain associated device icon and the finger slides for a distance on the screen or stays at a certain position on the screen for a second preset duration, the mobile phone may determine whether the distance between the current position of the user's finger and the mobile phone icon is less than or equal to a certain distance, for example, the second distance; if so, the associated device icon and the mobile phone icon are adsorbed together. In this case, the associated device icon does not move as the user's finger moves.
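A minimal sketch of the distance tests described in the two preceding paragraphs (function names are placeholders; the second and third distances are the thresholds referred to in this document, with the third distance used by the disconnection flow described later in this embodiment):

    import kotlin.math.hypot

    fun distance(x1: Float, y1: Float, x2: Float, y2: Float): Float =
        hypot(x2 - x1, y2 - y1)

    // Attach when the dragged icon (or the finger position) comes within the second distance.
    fun shouldAdsorb(dragX: Float, dragY: Float, coreX: Float, coreY: Float,
                     secondDistance: Float): Boolean =
        distance(dragX, dragY, coreX, coreY) <= secondDistance

    // Detach when the icon is dragged beyond the third, larger distance.
    fun shouldDetach(dragX: Float, dragY: Float, coreX: Float, coreY: Float,
                     thirdDistance: Float): Boolean =
        distance(dragX, dragY, coreX, coreY) > thirdDistance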
In another implementation manner, the first user operation may be that the user clicks the second control in the unconnected state, or clicks the first control and the second control at the same time, which may trigger the connection between the second device and the first device, represented on the interface as the second device icon being attached to the first device icon.
In another implementation manner, the first user operation may also be a user operation that acts on the first control, for example, the user presses the first control and drags the first control to be close to the second control, and releases the first control when the first control reaches the designated area, which may also implement that the first control and the second control are adsorbed together.
Optionally, in another implementation, the first user operation may also be a user operation of selecting the second control, such as pressing and holding the second control, with reference to the embodiments described in fig. 11, fig. 12, fig. 13A, fig. 13B, fig. 13C, and fig. 13D.
S103, responding to the operation of the first user, the first device displays a second interface, wherein one or more service options of the scene service are displayed in the second interface, and the scene service is a service which can be cooperatively provided by the first device and the second device.
Optionally, a first control connected to a second control is displayed in the second interface. In response to the first user operation, a second control and the first control are attracted together in the second interface. The adsorption of two device icons together may mean that the edges of the icons corresponding to the two devices are tangent, or the adsorption of two device icons together may mean that the two device icons are completely or partially overlapped. The present embodiment is not particularly limited to the form of expression of adsorption.
The second interface in this embodiment may refer to the user interface 700 shown in fig. 7.
In some embodiments, the second control in the second interface may have a different representation than the second control in the first interface. For example, in fig. 7, the smart watch icon 702 that has been connected with the cell phone to form a super terminal may display a distinguishing mark to facilitate distinguishing it from an associated device icon in the unconnected state. For example, the smart watch icon 702 is displayed in a dark-filled style, indicating that the smart watch and the mobile phone are in a connected state; the size of the smart watch icon 702 is smaller than that of the smart watch icon 506 shown when the smart watch and the mobile phone are not connected, the diameter of the circular smart watch icon 702 is R2, and the icon is attached to and tangent with the edge of the mobile phone icon 505.
Optionally, in another implementation manner, in response to the first user operation of selecting the second control, the embodiment described with reference to fig. 11, 12, 13A, 13B, 13C, and 13D displays a floating control in the second interface, where the floating control has one or more regions, and each region corresponds to one scenarized service option. The floating control is represented by a plurality of circular areas, or a plurality of rectangular areas, or a plurality of fan-shaped annular areas, or a plurality of circular annular areas, or a plurality of polygonal areas. In the second interface, the first control may be displayed or may not be displayed. The present embodiment does not limit the layout manner of the floating control.
Referring to the user interface 1100 shown in fig. 11, the user interface 1200 shown in fig. 12, or the embodiments shown in fig. 13A, fig. 13B, fig. 13C, and fig. 13D, a floating control is displayed in the second interface. The floating control may be divided into one or more areas, each area corresponding to one scenarized service option, with different scenarized service options corresponding to different areas; this interaction manner is more intuitive and engaging, and keeps the interface simpler. This embodiment limits neither the shape of the floating control nor the manner of dividing its regions.
For example, in the user interface 1100, when the user presses the smart watch icon, the super terminal interface may display a circular floating area 1101 centered on the mobile phone icon. The circular floating area 1101 may be divided into a plurality of areas, each corresponding to one scenarized service option. In the user interface 1100, the circular floating area 1101 is divided into four areas, which respectively correspond to the scenarized service options: outdoor running, indoor running, heart health, and vascular health.
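For illustration, one way to hit-test such a four-region circular floating area is to map the drag point's angle around the core icon to a quadrant (the function name and the quadrant convention below are assumptions made for this sketch):

    import kotlin.math.atan2

    // options must list the four scenarized service options in quadrant order I..IV,
    // e.g. listOf("Outdoor running", "Indoor running", "Heart health", "Vascular health").
    fun serviceForDragPoint(x: Float, y: Float, centerX: Float, centerY: Float,
                            options: List<String>): String {
        // Screen y grows downward, so negate dy to recover conventional math angles.
        var deg = Math.toDegrees(atan2((centerY - y).toDouble(), (x - centerX).toDouble()))
        if (deg < 0) deg += 360.0
        val quadrant = (deg / 90.0).toInt().coerceIn(0, 3)  // 0..3 -> quadrants I..IV
        return options[quadrant]
    }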
One or more scenarized service options are displayed in the second interface, where the scenarized service is a service that can be cooperatively provided by the first device and the second device.
Referring to the description in the foregoing embodiments, a scenario service refers to a functional service based on a usage scenario, and is a set of multiple functions or services that can be provided for a particular scenario. A scenarized service may be supported by one or more atomization services. The service option of the scenario service displayed on the second interface may be a scenario service that can be provided by the current super terminal system, that is, a device combination of the first device and the second device, and the atomic capability corresponding to the scenario service is already installed and deployed on the first device or the second device.
In some embodiments, a certain scenarized service may be based on the same application, with different devices providing different atomic capabilities under that application; the scenarized service is thus actually provided by the same service provider. For example, in the example shown in fig. 8, the sports health application is installed on both the mobile phone and the smart watch. After the user selects the scenarized service option of outdoor running, both the mobile phone and the smart watch jump to the outdoor running service in the sports health application: the smart watch can provide positioning services, user movement data statistics services, and the like, and can upload user data to the application server; the application server synchronously issues the user data to the mobile phone; and the mobile phone can provide services such as displaying the user's movement track. For the same scenarized service, even if the same client is used, the interfaces displayed and the services provided by different devices can differ.
The scenarized service options may be obtained by comprehensively considering conditions of the super terminal system, such as the device type, device characteristics, product positioning, device usage, the cooperative services the devices can provide, the current environment of the devices, the device running state, the applications recently used by the user, the frequency with which the user uses each scenarized service, the currently running application, and the ordering or default options of scenarized services set by the user, and are displayed on the user interface so that the user can conveniently and quickly start related services or functions. Service options of scenarized services that the super terminal system calculates as having a higher probability of being selected by the user are recommended and displayed with higher priority.
In some embodiments, the super terminal may recommend scenarized services according to the device type, product positioning, device usage, and the like. For example, since the combination of a mobile phone and a smart watch is commonly used for counting the amount of exercise, the super terminal system may classify this combination into a sports health scenario based on the device types, device usages, and the commonly used cooperative work content of the combination. The scenarized service options that may be provided in a sports health scenario include outdoor running, indoor running, heart health, vascular health, and the like.
For another example, different device types have different device characteristics. A smart television has a larger screen, giving the user a better visual viewing experience, and is more suitable for watching videos. A smart sound box has strong audio output capability and a good sound amplification effect, and is more suitable for playing audio. An earphone is small, easy to carry, worn rather than played out loud so that it does not disturb surrounding people, and may have a noise reduction effect that shields external environmental noise, making it suitable for listening to audio in public places, such as taking calls or listening to music in public. Compared with a mobile phone, a smart watch or smart band can more accurately measure the user's exercise data or other health information such as heart rate. A notebook computer has strong processing capability and is more suitable for office work. Device types and device characteristics are not described in further detail here, and the above examples do not limit other embodiments.
In some embodiments, when the scenarized service options are listed, the probability of each scenarized service option being selected may be calculated according to the frequency with which the user has recently used different cooperative services, or according to the detected environment of the current device, and the scenarized service options with a high probability of being selected are displayed preferentially in the scenarized service option area.
The super terminal system may display the options with high usage first according to the counted frequency with which the user has used each scenarized service option within a certain time. The certain time may be the entire history, or a certain period, such as the last three months. The higher the historical usage of a scenarized service option, the earlier it is displayed.
In other embodiments, the core device or the associated device may detect the environment in which the current device is located, and sort the scenized service options according to the environment in which the current device is located. If the core device or the associated device detects that the environment where the user is located is outdoor, the outdoor related service option is preferentially recommended, such as outdoor running. If the core device or the associated device detects that the environment in which the user is located is indoor, indoor related service options such as indoor running are preferentially recommended. The embodiment does not limit the manner of detecting the environment where the user is located, and for example, the environment where the user is located may be determined according to the current location information of the device; or a camera is used for collecting the picture of the current environment, or ultrasonic waves, infrared rays and the like are used for collecting the depth or the size of the current space so as to judge the current environment of the user.
In other embodiments, the super terminal system may further determine the environment or the scene where the device is located according to an application recently opened or used by the user or a currently running application, and further recommend the scenario service option. For example, when a super terminal is formed by a mobile phone and a smart television, if it is detected that an application currently running by the mobile phone is an office demonstration application, a multi-screen collaborative service option can be preferentially recommended; or, if the application currently running by the mobile phone is detected to be a video application, preferentially recommending a home theater service option; or, when the condition that the webpage browsed by the user is the game-related webpage is detected, service options related to game entertainment are preferentially recommended.
In other embodiments, the super terminal system may recommend scenarized services according to the running state of the devices. For example, when a mobile phone is connected with a smart television to form a super terminal, if the smart television is detected to be in a screen-on state, screen mirroring is preferentially recommended as the scenarized service; if the smart television is detected to be in a screen-off state, a voice service is preferentially recommended.
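The ranking signals described in the preceding paragraphs could be combined, for example, as a simple score (the weights and field names below are invented for illustration; the document does not specify a formula, and further terms such as the device's screen state could be added in the same way):

    data class Context(val environment: String, val foregroundApp: String?)

    data class Candidate(val name: String, val usageCount: Int,
                         val environments: Set<String>, val relatedApps: Set<String>)

    // Higher score means recommended earlier: historical usage frequency, plus boosts
    // when the detected environment (e.g. "outdoor") or the foreground application matches.
    fun rank(candidates: List<Candidate>, ctx: Context): List<Candidate> =
        candidates.sortedByDescending { c ->
            var score = c.usageCount.toDouble()
            if (ctx.environment in c.environments) score += 10.0
            if (ctx.foregroundApp != null && ctx.foregroundApp in c.relatedApps) score += 20.0
            score
        }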
Optionally, in some embodiments, the super terminal system may include a first database, where the first database records one or more scenarized services supported by one or more device combinations, the device combinations including a device combination formed by the first device and the second device associated with it, and the first scenarized service is a scenarized service queried from the first database. The first database may be stored locally or on a cloud server. The scenarized services are determined by one or more of the following parameters: the device types of the device combination, device characteristics, product positioning, device usage, the environment or scenario in which the devices are located, the state of the devices, recently running applications, and the like.
Optionally, in some embodiments, a second database may be included in the super terminal system, where the second database records one or more scenario services supported by one or more atomic service combinations, and the first scenario service is a scenario service queried from the second database. The second database may be stored locally or on a cloud server. The atomic service composition includes an atomic service composition comprised of atomic capabilities possessed by a first device and its associated second device. Wherein the atomic capability includes one or more of: audio output capability, audio input capability, display capability, camera capability, touch input capability, keyboard and mouse input capability, and the like. For example, the audio output capability may include that the device supports mono or multi-channel, supported sound effect, supportable frequency response range, noise reduction capability, audio resolution capability, and the like when playing audio; audio input capabilities may include a range of sound reception, noise reduction capabilities, etc.; the display capabilities may include the screen size of the device, display resolution parameters, refresh rate, color rendering, etc.; the camera shooting capability can comprise the camera type of the equipment, the shooting pixel, the shooting night scene capability, the image adjusting capability and the like; the touch input capability and the keyboard and mouse input capability refer to whether the device can support touch input or keyboard and mouse input and the like.
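As an illustrative data model (the structure below is assumed; the document only states that such databases exist and what they record), the two lookups might be keyed as follows:

    enum class AtomicCapability { AUDIO_OUT, AUDIO_IN, DISPLAY, CAMERA, TOUCH_INPUT, KEYBOARD_MOUSE }

    // First database: scenarized services supported by a device combination.
    val byDeviceCombo: Map<Set<String>, List<String>> = mapOf(
        setOf("phone", "watch") to listOf("Outdoor running", "Indoor running", "Heart health"),
        setOf("phone", "tv") to listOf("Multi-screen collaboration", "Home theater")
    )

    // Second database: scenarized services supported by an atomic capability combination.
    val byCapabilityCombo: Map<Set<AtomicCapability>, List<String>> = mapOf(
        setOf(AtomicCapability.DISPLAY, AtomicCapability.AUDIO_OUT, AtomicCapability.CAMERA)
            to listOf("Cross-screen conference")
    )

    fun queryScenarizedServices(devices: Set<String>,
                                capabilities: Set<AtomicCapability>): List<String> =
        (byDeviceCombo[devices].orEmpty() +
         byCapabilityCombo.filterKeys { capabilities.containsAll(it) }.values.flatten()).distinct()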
In this embodiment, the floating controls have different layouts, and the recommended scenarized services are arranged in different orders.
Optionally, referring to the embodiments shown in fig. 13C and 13D, the shape of the floating control is a circular ring or a sector ring, where a plurality of circular ring areas or sector ring areas respectively indicate a plurality of scenarized services, and the scenarized service indicated by a circular ring area or sector ring area closer to the center of the circle is recommended with higher priority.
Optionally, referring to the embodiments shown in fig. 11, 12, 13A, and 13B, the shape of the floating control is a circle or a rectangle divided into four regions: the region in the first quadrant is region A, the region in the second quadrant is region B, the region in the third quadrant is region C, and the region in the fourth quadrant is region D. When the recommendation order of the scenarized service options is scenarized service option a, scenarized service option b, scenarized service option c, scenarized service option d, the regions in which they are respectively placed may be region B, region A, region C, region D; or region A, region D, region B, region C; or region A, region B, region D, region C; or region A, region B, region C, region D.
Of course, the scenarized service options are not fixed; the super terminal system may continuously update, expand, or refine more scenarized service options, which this embodiment does not limit and which may change according to the actual situation. In some embodiments, a user may add a scenarized service in a customized manner, such as setting a name for a certain usage scenario and associating the scenario with other application services or system services. A newly added scenarized service option is displayed in the super terminal interface alongside the other scenarized service options.
Other embodiments are not limited hereby; the super terminal system may comprehensively consider one or more of the above factors to reasonably recommend scenarized service options.
For a specific example, reference may be made to the description of the embodiments shown in fig. 7 and fig. 9, which is not described herein again.
In some embodiments, when the devices are connected to form the super terminal, a certain scenarized service may be automatically selected as the default service. For example, the scenarized service option ranked first in the scenarized service option area is set as the default service, and when the devices are connected to form the super terminal, each device is configured by default according to the atomization service combination corresponding to that scenarized service. If the user wants a different scenarized service, another scenarized service option may be selected. Setting a default scenarized service facilitates seamless service switching between devices and, to a certain extent, saves user operations. For example, when an earphone is connected with a mobile phone and the default service is set such that the earphone provides the audio output and audio input functions, then at the moment the earphone successfully connects to the mobile phone, the audio output and audio input functions are immediately switched to be provided by the earphone, so the user does not need to perform any additional operation. The default service setting may be a default service selected by the user, including a scenarized service and/or an atomization service, or the default service setting may be a service commonly used by the user as determined by system statistics, and the like; this embodiment does not limit this.
In other embodiments, when the devices are connected to form the super terminal, no scenarized service is selected at first; instead, one or more scenarized service options are listed, and only after the user selects a certain scenarized service is the scenarized service or the corresponding atomization service switched to the set state. This approach fully respects the user's intent and avoids the trouble or confusion caused by automatically switching to the wrong service. For example, when a mobile phone is connected with a PC and the user intends to transfer files, if the default service were screen mirroring, the mobile phone screen would be projected to the PC and the user would then have to switch services. In particular, if the PC is currently being used for other purposes, such as presenting documents, the screen projection would interrupt the display of the presentation and trouble the user; or, if the user's mobile phone contains private content that the user does not want displayed on the PC, switching directly to the screen mirroring service would also trouble the user. Therefore, when the mobile phone is connected with the PC, the scenarized service options can be displayed on the super terminal interface, and the switching is executed only after the user selects the required scenarized service option and/or atomization service combination option.
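The two connection policies described in the last two paragraphs can be summarized in a small sketch (all names are assumed for illustration):

    sealed class ConnectPolicy {
        data class AutoStart(val defaultService: String) : ConnectPolicy()
        object AskUser : ConnectPolicy()
    }

    // Either start the default scenarized service immediately (e.g. earphone audio takeover),
    // or present the options and wait for an explicit choice (e.g. phone + PC, where an
    // unwanted screen mirroring could disturb the user).
    fun onDevicesConnected(policy: ConnectPolicy, options: List<String>,
                           start: (String) -> Unit, present: (List<String>) -> Unit) {
        when (policy) {
            is ConnectPolicy.AutoStart -> start(policy.defaultService)
            ConnectPolicy.AskUser -> present(options)
        }
    }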
It should be noted that when two or more devices form a super terminal and the user selects a scenarized service, the devices enter a capability or state for performing the interactive function, but the two or more devices are not required to perform the interactive function immediately. For example, suppose that by dragging a device icon the user triggers the sound projection function between a mobile phone and a smart sound box. If the mobile phone has no audio playing task, or its audio playing task is paused, the mobile phone does not immediately send audio data to the smart sound box. Once the mobile phone has an audio playing task, however, it sends the audio data to be played to the smart sound box, and the audio is played by the smart sound box.
In the second interface, a recommended associated device, which may also be referred to as a third device, may also be displayed. Based on the device types, device characteristics, product positioning, device usages, the environment or scenario in which the devices are located, the states of the devices, and the recently run applications of the devices currently forming the super terminal, the super terminal system can analyze the current scenario and enumerate and recommend to the user other devices suited to that scenario, helping the user obtain a better scenario experience; or it can infer more extensible usage scenarios and then enumerate and recommend other devices to the user, so that if the user adopts the devices recommended by the system, more and richer usage scenarios can be further expanded.
Optionally, the third device is a device discovered by the first device, and the device option of the third device is clicked, and the first device displays the service option of the second scenarized service. The service option of the second scenario service may be some scenario service options added on the basis of the service option of the first scenario service, or the service option of the second scenario service is a scenario service updated again and is different from the service option of the first scenario service.
Optionally, the third device is not the device discovered by the first device, the device option of the third device is clicked, and the first device jumps to display a fifth interface, for example, the fifth interface may be a purchase interface of the third device, the fifth interface includes a plurality of item options, the plurality of item options are from a plurality of providing sources, and the plurality of item options include an item option indicating the third device.
For example, after the mobile phone is cooperated with the smart television, the wireless microphone, the wireless game handle and other devices of the user can be recommended, and the scene services such as karaoke, games and the like can be correspondingly expanded. For another example, after the mobile phone is cooperated with the smart watch, the devices such as a user treadmill, a heart rate belt and a blood glucose meter can be recommended, and the scene services such as indoor running, heart health and blood glucose health can be correspondingly expanded. After a recommendation device is added, the system can also recommend a new device according to the newly formed super terminal. The user can choose to add more devices to form the super terminal, so that the use scene of the user is richer. After more associated devices are added, the scene services provided by the super terminal are more and richer, and the ecosystem of the super terminal is improved.
In the example shown in fig. 7, the super terminal system may analyze the user usage scenario as a sports health scenario based on the devices currently constituting the super terminal being a mobile phone and a smart watch, and the recommended device area 704 may display recommended exercise or health monitoring devices, such as a treadmill, a spinning bike, a blood glucose meter, and a heart rate meter. When the user selects (clicks or drags) a device icon in the recommended device area 704, if the device is an associated device detected by the current core device, that is, a device displayed in the device connection status diagram 701, the device may initiate a connection to join the super terminal. If the device is not displayed in the device connection status diagram 701, a jump may be made to a page related to the device product, for example to introduce to the user the functionality of the recommended device, the usage experience, and how the super terminal can be used and extended when the device is combined with other devices.
S104, the first device detects a second user operation of selecting the first scenarized service, where the first scenarized service indicates that the first device provides a first service and/or the second device provides a second service.
The first scenario service corresponds to a first atomization service combination, and the first atomization service combination provides at least one first service for the first device and/or provides at least one second service for the second device. A scenarized service may be supported by one or more atomization services.
The atomic service refers to a minimum capacity unit which can run independently, is a concept of abstract packaging of single function/capacity, and can be a hardware service or a software service.
The second user operation is not limited in this embodiment. Referring to the embodiment illustrated in fig. 8, the second user action may be the user clicking on the first scenized service option.
Alternatively, referring to the embodiments described in fig. 11, 12, 13A, 13B, 13C, and 13D, the second user operation may be that the user drags the second control to the area corresponding to the first scenarized service option and then releases the second control; the second user operation may also be that the user drags the second control, via the area corresponding to the service option of the first scenarized service, close to the first device icon. This embodiment likewise does not limit the motion track of the user's drag.
In some embodiments, the aforementioned floating control may include a first region that displays the service option of the first scenarized service. The second user operation is that the user releases the second control after dragging it to the first region; or the second user operation is that the user drags the second control through the first region, moves it toward the first control into a designated area, and then releases it, where the designated area may be a circular area centered on the first control with a radius equal to the first distance.
For example, in the example shown in fig. 11, if the user wants to select a certain scenarized service option, such as the outdoor running service option, the user may drag the smart watch icon to the area corresponding to the outdoor running service option and then move it toward the cell phone icon via that area; for the user's finger movement track, refer to the schematic track in the user interface 1100. When the smart watch icon reaches the adsorption area, the user releases the finger, the service corresponding to outdoor running is started, and the mobile phone jumps to the outdoor running interface shown in the user interface 800. Of course, if there is only one scenarized service option, the circular floating area 1101 does not need to be divided into regions, and the user selects that same scenarized service when approaching the phone icon from any direction.
As shown in fig. 12, if the user wants to select a certain scenarized service option, such as the outdoor running service option, the user may drag the smart watch icon to the area corresponding to the outdoor running service option and then release the finger; the user has thereby selected the outdoor running service option, the service corresponding to outdoor running is started, and the mobile phone jumps to the outdoor running interface shown in the user interface 800. For the trajectory of the user's finger movement, refer to the gesture trajectory in the user interface 1200. Of course, if there is only one scenarized service option, the circular floating area 1201 does not need to be divided into regions, and the same scenarized service is selected when the user drags the device icon to any position in the floating area 1201 and releases it.
Compared with the interaction mode shown in fig. 11, the interaction mode shown in fig. 12 is easier for the user to operate and has lower probability of false selection.
Based on the atomic services that each device in the current device combination can provide, as determined by query, the super terminal system can automatically set the atomic service combination for the scenario according to the scenarized service actually selected by the user, or the user's commonly used service combination, or the default service combination set by the user, or the service combination best suited to the scenario based on common-sense knowledge, or the service combination with the highest usage frequency in the scenario as obtained from the server, thereby saving the user tedious selection operations.
For example, after a super terminal is formed by a mobile phone and a smart television, if the scene service selected by the user is a cross-screen conference, the atomization service combination is automatically set as follows: the intelligent television is used for providing an audio output function, the mobile phone is used for providing an audio input (microphone) function, the intelligent television is used for providing a camera function, and the intelligent television is used for providing a display function. For another example, after the mobile phone, the smart watch, the earphone and the treadmill form a super terminal, if the scenario service selected by the user is indoor running, the atomization service combination is automatically set as follows: the audio output function is provided by the headset, the audio input (microphone) function is provided by the headset, the camera function is provided by the cell phone, and the display function is provided by the treadmill.
In some embodiments, the super terminal system may analyze the optimal atomic service combination according to the device characteristics of different device types. For example, a smart television has a larger screen, giving the user a better visual viewing experience, and is more suitable for watching videos; when a smart television and a mobile phone form a super terminal, the smart television is therefore preferentially set to provide the display function. A smart sound box has strong audio output capability and a good sound amplification effect, and is more suitable for playing audio; when a smart sound box and a mobile phone form a super terminal, the smart sound box is preferentially set to provide the audio output function. An earphone is small, easy to carry, worn rather than played out loud so that it does not disturb surrounding people, and may have a noise reduction effect that shields external environmental noise, making it suitable for listening to audio in public places, such as taking calls or listening to music in public; when an earphone and a mobile phone form a super terminal, the earphone is preferentially set to provide the audio output function. Compared with a mobile phone, a smart watch or smart band can more accurately measure the user's exercise data or other health information such as heart rate, so when a smart watch or smart band and a mobile phone form a super terminal, the exercise data collected by the smart watch or smart band is preferentially used. A notebook computer has strong processing capability and is more suitable for office work; when a notebook computer and a mobile phone form a super terminal, the notebook computer is preferentially set to provide the display function, the keyboard input function, and the like. Device types and device characteristics are not described in further detail here, and the above examples do not limit other embodiments.
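One simple way to realize such capability-based assignment is sketched below (the numeric scores are invented for illustration; the document ranks devices only qualitatively):

    enum class AtomicService { AUDIO_OUT, AUDIO_IN, DISPLAY, CAMERA }

    data class Device(val name: String, val capabilityScore: Map<AtomicService, Int>)

    // Give each required atomic service to the device scoring highest for it, e.g. a TV
    // might score DISPLAY higher than a phone and so receive the display role.
    fun assign(required: Set<AtomicService>, devices: List<Device>): Map<AtomicService, Device> =
        required.associateWith { service ->
            devices.maxByOrNull { it.capabilityScore[service] ?: 0 }
                ?: error("no device available for $service")
        }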
In some embodiments, the super terminal supports personalized setting of the atomic service combination by the user: the user can manually modify the atomic service combination according to personal preference or the actual situation, adjusting a certain atomic service to be provided by another device. For example, if the surrounding environment is noisy during a cross-screen conference, the user may change the audio output function from being provided by the smart television to being provided by the earphones.
Optionally, the first device displays the first atomic service combination on the seventh interface, where the first atomic service combination is selected by the user, or the first atomic service combination is determined according to one or more of the following parameters: the method comprises the steps of obtaining a device type and device characteristics of the first device and/or the second device, or an atomization service combination commonly used by a user, or an atomization service combination set by the user to be default, or an atomization service combination which is obtained through analysis and is most suitable for the first scenarized service, or an atomization service combination which is obtained from a server and has the highest use frequency under the first scenarized service.
Referring to the foregoing example shown in FIG. 10, if a user selects an atomic service entry option in user interface 910, one or more atomic service combinations may be displayed, such as user interface 1000 and user interface 1010 shown in FIG. 10, and the user may select different atomic services to be provided by different devices on their own.
It is understood that the scenario service and the atomization service are only words used in this embodiment, and the meaning of the words is described in this embodiment, and the name of the words does not limit this embodiment in any way. For example, in some other embodiments, a scenario service may also be referred to as a scenario function, a scenario service, a business function, or other terms; an atomic service may also be referred to as a meta-capability, an atomic capability (Ability), an atomic service, a functional component, and other terms. In the embodiment of the present application, description is mainly made of "scenario service" and "atomization service".
And S105, the first device and/or the second device run the first scenario-based service, that is, the first device runs the first service, and/or the second device runs the second service.
This embodiment does not limit the way in which the first device and/or the second device learns that it should run the scenarized service. In some embodiments, the first device may directly notify the second device through the communication connection to run the atomization service corresponding to the first scenarized service. In other embodiments, the first device may send an instruction indicating that the first scenarized service is to be run to the cloud server, or the first device may report an instruction or message it has obtained, or its own running state, to the cloud server, and the cloud server then generates an instruction for running the first scenarized service and issues it to the second device. After receiving the instruction to run the first scenarized service, the second device runs the atomization service indicated by the first scenarized service.
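The two notification paths could look like the following (the message shape and interface names are assumptions made for this sketch):

    data class RunInstruction(val scenarizedService: String, val atomicServices: List<String>)

    interface Transport { fun send(deviceId: String, msg: RunInstruction) }

    // Either notify the second device directly over the established communication
    // connection, or report to a cloud server that generates and relays the instruction.
    fun startOnSecondDevice(direct: Boolean, connection: Transport, cloud: Transport,
                            secondDeviceId: String, instruction: RunInstruction) {
        if (direct) connection.send(secondDeviceId, instruction)
        else cloud.send(secondDeviceId, instruction)
    }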
The third interface displayed when the first device runs the first scenarized service may be different from or the same as the fourth interface displayed when the second device runs the first scenarized service.
Optionally, the first scenarized service is a motion monitoring service, the second device is a wearable motion device, the third interface may be a picture of a motion trajectory, and the fourth interface may be detected motion data of the user, where the motion data may include: speed of movement, distance of movement, time of movement, heart rate, etc.
Referring to the example shown in fig. 8, when the cell phone detects that the user clicks the outdoor run option 707 listed in the scenarized service option area 705 of the user interface 700, the cell phone may jump to an outdoor-running-related page, as shown in the user interface 800. The user interface 800 is a sports health application page, in which sub-options are displayed below the sports health title bar: an outdoor run sub-option 801, an indoor run sub-option, a heart health sub-option, and a vascular health sub-option. Since the user selected the outdoor run option 707 in the user interface 700, the current jump to the user interface 800 displays the page corresponding to the outdoor run sub-option 801, including, for example, a map, a user positioning icon, the user's movement route, and statistics of the user's amount of exercise.
Accordingly, the smart watch displays an outdoor-running-related interface; refer to the smart watch user interface 802. The smart watch user interface 802 need not be consistent with the mobile phone's outdoor-running-related user interface 800, and the smart watch and the mobile phone may share sports data. The movement data detected by the smart watch, such as pace, movement distance, heart rate, and movement time, can be displayed in the smart watch user interface 802.
Optionally, referring to the embodiment shown in fig. 18, the first scenarized service is a screen projection service, the second device is a large-screen device, the fourth interface is a video or image picture, and the third interface is a control interface for controlling a display function of the video or image picture.
Alternatively, the third interface and the fourth interface may display the same picture content, and referring to the embodiment shown in fig. 19, both the mobile phone and the television display the same video picture.
Optionally, when the user selects to return to the page of the super terminal, it may be displayed that the second control is adsorbed together with the first control. The adsorption of two device icons together may mean that the edges of the icons corresponding to the two devices are tangent, or the adsorption of two device icons together may mean that the two device icons are completely or partially overlapped. The present embodiment is not particularly limited in the form of expression of adsorption. For example, referring to the user interface 1400 shown in fig. 14, in the device connection state diagram area in the super terminal page, the smart watch icon is displayed to be attached around the cell phone icon, which indicates that the smart watch has established a connection with the cell phone, and may cooperate with the cell phone icon.
In some embodiments, when a user drags the associated device icon to attach to the core device icon from different directions and different areas, the resulting effect and triggered service may also be different.
In a screen projection scene, the first device determines the screen projection position of its screen interface within the screen of the second device according to the relative position of the second control and the first control when the dragging operation is released. For details, reference may be made to the embodiments described in fig. 15, fig. 16, and fig. 17.
If the second control is on the left side of the first control, the screen interface of the first device is projected onto the right side of the screen of the second device; if the second control is on the right side of the first control, the screen interface of the first device is projected onto the left side of the screen of the second device; and if the second control is in the middle of the first control, the screen interface of the first device is projected onto the screen of the second device in full screen.
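For illustration only, the left/right/full-screen rule above can be sketched as a small function. The following Kotlin sketch assumes the relative position is reduced to a horizontal offset between the two controls; the enum name, parameter names, and the centerTolerance threshold are illustrative assumptions, not part of this embodiment.

```kotlin
// Illustrative sketch of the left/right/full-screen projection rule.
// All names and the centerTolerance threshold are hypothetical.
enum class ProjectionPosition { LEFT_HALF, RIGHT_HALF, FULL_SCREEN }

fun projectionPosition(
    secondControlX: Float,        // horizontal center of the second (target device) control
    firstControlX: Float,         // horizontal center of the first (source device) control
    centerTolerance: Float = 24f  // how close counts as "in the middle"
): ProjectionPosition {
    val offset = secondControlX - firstControlX
    return when {
        offset < -centerTolerance -> ProjectionPosition.RIGHT_HALF // second control on the left
        offset > centerTolerance -> ProjectionPosition.LEFT_HALF   // second control on the right
        else -> ProjectionPosition.FULL_SCREEN                     // roughly centered
    }
}

fun main() {
    println(projectionPosition(secondControlX = 100f, firstControlX = 300f)) // RIGHT_HALF
    println(projectionPosition(secondControlX = 300f, firstControlX = 100f)) // LEFT_HALF
    println(projectionPosition(secondControlX = 205f, firstControlX = 200f)) // FULL_SCREEN
}
```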
If the user wants to remove an associated device from the super terminal system or end its service, a fourth user operation may be performed on the super terminal interface while the associated device icon and the core device icon are in the attached state, where the fourth user operation includes an operation of separating the first control from the second control, for example, selecting the second control, dragging it away from the first control, and releasing it once it has moved out of the designated area. In one example, the user may select the associated device icon and drag it away from the core device icon. When the mobile phone detects that the user has dragged the associated device icon more than a third distance away from the core device icon and then releases the finger, the associated device icon returns to its pre-connection position and the attached state is released; at the same time, the associated device is disconnected from the core device, leaves the super terminal system, and the service that device provided in the super terminal is terminated. Alternatively, when the user clicks an associated device in the connected state, or presses and holds the associated device icon and the core device icon and drags them in opposite directions until they separate, the associated device and the core device may be triggered to disconnect; on the interface, the associated device icon separates from the core device icon and returns to its unconnected position. This embodiment does not limit the conditions or user operations that trigger attachment and detachment.
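As an illustrative sketch of the detach rule just described (release beyond the third distance detaches the device; otherwise the icon stays attached), assuming simple two-dimensional icon coordinates; the concrete threshold value is a placeholder:

```kotlin
import kotlin.math.hypot

// Hypothetical sketch of the release-to-detach rule described above.
data class Point(val x: Float, val y: Float)

fun distance(a: Point, b: Point): Double =
    hypot((a.x - b.x).toDouble(), (a.y - b.y).toDouble())

/** Returns true if releasing the drag at [releasePos] should detach the associated device. */
fun shouldDetach(releasePos: Point, coreIconPos: Point, thirdDistance: Double = 200.0): Boolean =
    distance(releasePos, coreIconPos) > thirdDistance

fun main() {
    val core = Point(0f, 0f)
    println(shouldDetach(Point(250f, 0f), core)) // true: icon returns, device disconnects
    println(shouldDetach(Point(80f, 60f), core)) // false: icon stays attached
}
```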
In some embodiments, a one-touch disconnect control may also be provided on the super terminal interface. Clicking it may disconnect all associated devices from the core device at once, restoring the core device to a standalone state with no associated device connected for the time being, and the service provided by each associated device is terminated accordingly.
It may be understood that connections are not limited to those between an associated device and the core device; in other embodiments, the user may also connect one associated device with another associated device on the super terminal interface to form a super terminal. That is, a user may connect any two or more devices, and the devices having a connection relationship are not limited to the associated-device-to-core-device case. For example, a smart speaker and a smart television, both associated devices, may be directly connected to form a super terminal. For the functions, schematic interfaces, and user operations of such a super terminal, reference may be made to the descriptions in the foregoing embodiments, which are not repeated herein.
In some embodiments, the first device detects a third user operation on the second interface, where the third user operation associates a fourth device with the first device and includes a selection of a third control indicating the fourth device. In response to the third user operation, the first device displays a sixth interface in which the first control, the second control, and the third control are displayed attached together; a third scenarized service option is also displayed in the sixth interface. The third scenarized service option is determined according to the first device, the second device, and the fourth device, and is different from the first scenarized service option. Reference may be made to the embodiment shown in fig. 9, which is not repeated herein.
As can be seen from the above, the embodiments of the present application provide a unified entry for interaction between a core device and multiple associated devices, namely the super terminal interface, where a user can trigger any two or more electronic devices to form a super terminal. The super terminal can infer the usage scenario from conditions such as the device type, device characteristics, product positioning, common functions, device state, currently running applications, and current device environment of each device, and provide one or more scenarized service options, atomic service combination options, or interactive function options based on that usage scenario. The user can quickly select a scenarized service and/or atomic service according to personal needs, without tedious and scattered configuration. This solves the problems that the setting entries for interactive functions between electronic devices are scattered, that configuration is cumbersome, and that rich usage scenarios cannot be enumerated for the user.
In the embodiments of the present application, the user selects a device combination, and the available scenarized services and/or atomic services under that combination are provided in a targeted manner, listing detailed and rich usage scenarios for the user, guiding the user to fully explore more application scenarios with the super terminal, and achieving the effect of recommending scenarized services according to the device combination.
In conjunction with the foregoing embodiments, the functional modules of the communication system 10 provided in the embodiments of the present application are described below.
Referring to fig. 21, fig. 21 shows the functional modules of the communication system 10. The communication system 10 may include a device combination generation module 2101, a scenarized service decision module 2102, a scenarized service providing module 2103, an atomic service providing module 2104, and the like. The communication system 10 may also be referred to as a super terminal, which may include a core device and one or more associated devices. For the explanation of the super terminal, the core device, the associated device, and the like, reference may be made to the foregoing embodiments, which are not described herein again.
The device combination generation module 2101 may be used to combine the core device and associated devices into a super terminal, including the functions of the core device discovering surrounding associated devices, indicating on the interface the location relationship or signal strength relationship between the associated devices and the core device, and connecting the core device with the associated devices. A super terminal interface may be displayed on the core device, and the discovered surrounding associated devices may be displayed on that interface. In some embodiments, the user may drag an associated device icon close to the core device icon; when it reaches the designated area, the associated device icon may be attached to the edge of the core device icon, indicating that the core device has established a connection with the associated device. For descriptions of the connection conditions between the core device and the associated devices, user operations, schematic interfaces, and the like, reference may be made to the foregoing embodiments in fig. 5, fig. 6, fig. 15, fig. 16, fig. 17, and the like, which are not described herein again.
The scenarized service decision module 2102 may be configured to analyze the current user's usage scenario based on information such as the currently connected device types and/or device characteristics, product positioning, and common functions, or according to the detected current environment of the user or the running state of the devices. The scenarized service decision module 2102 may define a variety of consumer usage scenarios. In one example, consumer usage scenarios may include the following five categories: mobile office, smart home, audio-visual entertainment, sports health, and smart travel. Of course, the classification of consumer usage scenarios is not fixed, and the scenarized service decision module 2102 may continuously update, expand, or refine more usage scenarios. For example, newly added associated devices or functions may yield new usage scenarios. For a specific description, reference may be made to the foregoing embodiments, which are not detailed herein.
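For illustration, a decision module of this kind might map a device combination to one of the five example categories roughly as follows; the device types and mapping rules in this Kotlin sketch are assumptions for illustration, not the actual decision logic of this embodiment:

```kotlin
// Illustrative only: a toy decision rule mapping device combinations to
// the five example usage scenario categories named above.
enum class UsageScenario { MOBILE_OFFICE, SMART_HOME, AV_ENTERTAINMENT, SPORTS_HEALTH, SMART_TRAVEL }
enum class DeviceType { PHONE, TABLET, LAPTOP, TV, SPEAKER, WATCH, BAND, EARPHONES, CAR_KIT }

fun decideScenario(devices: Set<DeviceType>): UsageScenario = when {
    DeviceType.WATCH in devices || DeviceType.BAND in devices -> UsageScenario.SPORTS_HEALTH
    DeviceType.CAR_KIT in devices -> UsageScenario.SMART_TRAVEL
    DeviceType.TV in devices || DeviceType.SPEAKER in devices -> UsageScenario.AV_ENTERTAINMENT
    DeviceType.LAPTOP in devices || DeviceType.TABLET in devices -> UsageScenario.MOBILE_OFFICE
    else -> UsageScenario.SMART_HOME
}

fun main() {
    println(decideScenario(setOf(DeviceType.PHONE, DeviceType.WATCH))) // SPORTS_HEALTH
    println(decideScenario(setOf(DeviceType.PHONE, DeviceType.TV)))    // AV_ENTERTAINMENT
}
```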
The scenarized service providing module 2103 is configured to, when multiple devices are connected, derive one or more scenarized service options by comprehensively considering the device types, device characteristics, product positioning, device usage, collaborative services the devices can provide, the current environment of the devices, the device running state, applications recently used by the user, currently running applications, orientation information input by user operations, and the like of the core device and/or associated devices, and to display the scenarized service options on a user interface so that the user can conveniently and quickly start the related services or functions.
In some embodiments, when listing the scenarized service options, the scenarized service providing module 2103 may calculate the probability that each scenarized service option is selected according to the user's recent frequency of use of different collaborative services, the detected environment of the current device, or applications recently run by the user, and preferentially display the scenarized service options with a high probability of being selected at the front of the scenarized service option area. Of course, the scenarized service options are not fixed; the scenarized service providing module 2103 may continuously update, expand, or refine more scenarized service options, which is not limited in this embodiment and may change according to the actual situation. In some embodiments, a user may add a custom scenarized service, for example by naming a certain usage scenario and associating it with other application services or system services; the newly added scenarized service option is displayed in the super terminal interface alongside the other scenarized service options. The scenarized service providing module 2103 may comprehensively consider one or more of the above factors to reasonably recommend scenarized service options.
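The ranking behavior described above amounts to scoring each candidate option and sorting. A minimal sketch follows; the weights given to recent-use frequency, environment matches, and recent-application matches are assumptions chosen only to illustrate the idea:

```kotlin
// Illustrative scoring-and-sorting sketch; the weights are assumptions.
data class ScenarizedOption(
    val name: String,
    val recentUseCount: Int,         // how often the user recently used this collaborative service
    val matchesEnvironment: Boolean, // e.g. a detected noisy room, driving, etc.
    val matchesRecentApp: Boolean    // e.g. a video app was just running
)

fun rankOptions(options: List<ScenarizedOption>): List<ScenarizedOption> =
    options.sortedByDescending { opt ->
        opt.recentUseCount * 1.0 +
            (if (opt.matchesEnvironment) 2.0 else 0.0) +
            (if (opt.matchesRecentApp) 3.0 else 0.0)
    }

fun main() {
    val ranked = rankOptions(listOf(
        ScenarizedOption("Multi-screen collaboration", recentUseCount = 5,
            matchesEnvironment = false, matchesRecentApp = false),
        ScenarizedOption("Screen projection", recentUseCount = 1,
            matchesEnvironment = true, matchesRecentApp = true)
    ))
    ranked.forEach { println(it.name) } // higher-scored options are listed first
}
```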
The scenarized service providing module 2103 may also be used to recommend associated devices. It may analyze the current scenario based on the types of devices currently constituting the super terminal system, and enumerate and recommend to the user other devices suitable for the current scenario, helping the user obtain a better scenario experience; or it may infer more extensible usage scenarios and then recommend other devices to the user, so that if the user adopts the recommended devices, more and richer usage scenarios can be unlocked.
For descriptions of the functions that the scenarized service providing module 2103 can provide, reference may be made to the foregoing embodiments in fig. 7, fig. 8, fig. 9, fig. 11, fig. 12, fig. 13A, fig. 13B, fig. 13C, fig. 13D, fig. 15, fig. 16, fig. 17, fig. 18, fig. 19, and the like, which are not repeated herein.
The atomic service providing module 2104 may be configured to automatically set an atomic service combination for a scenario based on the current device combination and the atomic services that the queried devices can provide, according to the scenarized service actually selected by the user, a service combination commonly used by the user, a default service combination set by the user, the service combination generally considered most suitable for the scenario, or the service combination with the highest frequency of use in the scenario as obtained from a server, thereby saving the user tedious selection operations. An atomic service is the smallest capability unit that can run independently; it is an abstract encapsulation of a single function or capability, and may be a hardware service or a software service.
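For illustration, the default-combination logic can be sketched as: for each atomic service required by the selected scenarized service, pick a providing device, preferring the user's saved default, then the historically most used assignment, then any capable device. All names and the preference order in this sketch are assumptions:

```kotlin
// Hypothetical sketch of choosing a default provider per atomic service.
data class Device(val name: String, val atomicServices: Set<String>)

fun defaultCombination(
    required: List<String>,                                   // atomic services the scenarized service needs
    devices: List<Device>,                                    // devices in the current super terminal
    userDefaults: Map<String, String> = emptyMap(),           // service -> preferred device name
    usageCounts: Map<Pair<String, String>, Int> = emptyMap()  // (service, device) -> past uses
): Map<String, Device?> = required.associateWith { service ->
    val capable = devices.filter { service in it.atomicServices }
    capable.firstOrNull { it.name == userDefaults[service] }       // user's saved default first
        ?: capable.maxByOrNull { usageCounts[service to it.name] ?: 0 } // else the most used capable device
}

fun main() {
    val devices = listOf(
        Device("phone", setOf("audio-out", "display", "mic")),
        Device("smart-tv", setOf("audio-out", "display")),
        Device("earphones", setOf("audio-out"))
    )
    println(defaultCombination(listOf("display", "audio-out"), devices,
        userDefaults = mapOf("audio-out" to "earphones")))
}
```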
In some embodiments, the atomic service providing module 2104 supports user-customized atomic service combinations: the user can manually modify the combination according to personal preference or the actual situation, reassigning an atomic service to another device. For example, in a cross-screen conference, if the surrounding environment is noisy, the user may have earphones, instead of the smart television, provide the audio output function.
For more description of the functions of the atomic service providing module 2104, reference may be made to the description of the foregoing embodiment in fig. 10, and details are not described herein.
In some embodiments, the electronic devices that can join a super terminal may include: same-account devices, cross-account devices, and no-account devices.
A same-account device may be an electronic device whose login account is the same as the login account of the core device of the super terminal. For example, the accounts logged in on Zhang San's mobile phone and Zhang San's tablet computer are the same account, namely Zhang San's account. When the core device of the super terminal is Zhang San's mobile phone, Zhang San's tablet computer can join the super terminal as a same-account device.
A same-account device may further include an electronic device that is paired with the core device of the super terminal and is thereby associated with the login account on the core device. In some embodiments, some electronic devices (e.g., headsets, smart watches, smart bands) cannot independently log in to an account. The user may pair such an electronic device with a mobile phone (or another electronic device capable of logging in to an account independently, such as a tablet computer), so that the electronic device is associated with the account logged in on the mobile phone, which facilitates controlling and managing the electronic device through the mobile phone. The pairing method may include Bluetooth pairing, Wi-Fi pairing, and the like; the embodiment of the present application does not limit the pairing method. For example, Zhang San's headset cannot log in to an account independently. Zhang San's mobile phone is paired with Zhang San's headset, so the headset is associated with the account logged in on Zhang San's mobile phone (i.e., Zhang San's account). That is, Zhang San's headset is an electronic device managed under Zhang San's account. When the core device of the super terminal is Zhang San's mobile phone, Zhang San's headset can join the super terminal as a same-account device.
A cross-account device may be an electronic device whose login account is different from the login account of the core device of the super terminal. For example, the account logged in on Zhang San's mobile phone is Zhang San's account, and the account logged in on Li Si's tablet computer is Li Si's account; the two login accounts are different. When the core device of the super terminal is Zhang San's mobile phone, Li Si's tablet computer can join the super terminal as a cross-account device.
A cross-account device may also be an electronic device that is associated with an account by being paired with another electronic device, where the associated account is different from the login account of the core device of the super terminal. For example, the account logged in on Li Si's mobile phone is Li Si's account. Li Si's headset is paired with Li Si's mobile phone and is thus associated with Li Si's account. When the core device of the super terminal is Zhang San's mobile phone (whose login account is Zhang San's account), Li Si's headset can join the super terminal as a cross-account device.
A no-account device may be an electronic device that has neither a login account nor an associated account. After establishing a communication connection with a no-account device, other electronic devices (such as a mobile phone or a tablet computer) can cooperate with it to provide scenarized services for the user. A no-account device may be thought of as a public device. For example, a printer has no login account and no associated account. When the core device of the super terminal is Zhang San's mobile phone (whose login account is Zhang San's account), the printer can join the super terminal as a no-account device.
The embodiment of the present application does not limit the types of same-account devices, cross-account devices, and no-account devices.
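The three categories above reduce to comparing a device's login or associated account with the core device's account. A minimal sketch, where the account representation is an assumption for illustration:

```kotlin
// Illustrative classification of devices relative to the core device's account.
enum class AccountRelation { SAME_ACCOUNT, CROSS_ACCOUNT, NO_ACCOUNT }

data class DeviceInfo(
    val loginAccount: String?,      // account logged in on the device, if any
    val associatedAccount: String?  // account associated via pairing, if any
)

fun classify(device: DeviceInfo, coreAccount: String): AccountRelation {
    val account = device.loginAccount ?: device.associatedAccount
    return when (account) {
        null -> AccountRelation.NO_ACCOUNT          // e.g. a printer
        coreAccount -> AccountRelation.SAME_ACCOUNT // e.g. Zhang San's tablet or paired headset
        else -> AccountRelation.CROSS_ACCOUNT       // e.g. Li Si's tablet or headset
    }
}

fun main() {
    val core = "zhangsan"
    println(classify(DeviceInfo("zhangsan", null), core)) // SAME_ACCOUNT
    println(classify(DeviceInfo(null, "lisi"), core))     // CROSS_ACCOUNT
    println(classify(DeviceInfo(null, null), core))       // NO_ACCOUNT
}
```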
In some embodiments, when a cross-account device joins the super terminal, device binding may be performed first. The core device of the super terminal may request to bind with the cross-account device. After the cross-account device agrees, the core device may bind with it, so that the cross-account device can join the super terminal and the super terminal can use the cross-account device to provide scenarized services for the user.
It can be understood that the user to whom a cross-account device belongs is not the same user as the user to whom the core device of the super terminal belongs. If the core device's user wants to use the cross-account device to provide a scenarized service, the consent of the cross-account device's user is needed. Therefore, when the cross-account device joins the super terminal, the core device may perform device binding with it, requesting that the cross-account device's user agree to let the core device's user use the device. This reduces the chance that a user's electronic device is used by other users without the user's consent or knowledge, and improves the user experience.
In some embodiments, when a same-account device joins the super terminal, the device binding process may be skipped. It can be understood that the user to whom the same-account device belongs and the user to whom the core device of the super terminal belongs are generally the same user. When that user wants other devices of his own to cooperate with the core device, the core device can cooperate with the same-account device by the methods shown in fig. 5 to fig. 8 to provide scenarized services. Thus, when a user uses the cooperation of multiple electronic devices that all belong to him, no device binding is needed, which simplifies the operations for using the scenarized services provided by the super terminal and improves the user experience.
In some embodiments, when a no-account device joins the super terminal, device binding may or may not be performed.
A scenario in which an electronic device provided by the embodiment of the present application joins a super terminal and performs device binding is specifically described below.
Fig. 22A to fig. 22H are schematic diagrams illustrating a scenario in which an electronic device joins a super terminal and performs device binding.
As shown in fig. 22A, the electronic device 100 may display a user interface 2210. Here, the electronic device 100 is exemplified as a mobile phone; it is not limited to a mobile phone and may also be another type of electronic device (e.g., a tablet computer or a notebook computer). For the user interface 2210, reference may be made to the user interface 500 shown in fig. 5.
Among other things, the user interface 2210 may contain a device connection state illustration 2211 and an add control 2212. The device connection status diagram 2211 may contain a cell phone icon 505, which may indicate that the core device is a mobile phone, i.e., the electronic device 100. The device icons and names surrounding the cell phone icon 505 may represent the electronic devices discovered by the electronic device 100, for example the Sound X icon, FreeBuds icon, MatePad icon, and MateBook icon in the device connection status diagram 2211.
In some embodiments, when an account is logged in on an electronic device, the name of the electronic device may be the account name plus the type and/or model of the electronic device, and this name may be displayed when another electronic device discovers the device. For example, the type of an electronic device is MatePad, and the account logged in on it is named Zhang San; the device name may then be "Zhang San's MatePad". When the electronic device 100 discovers this device, the device connection status diagram 2211 shown in fig. 22A may display a MatePad icon together with the device name "Zhang San's MatePad".
In some embodiments, the name of an electronic device may be a user-set remark. The user can manage electronic devices paired with the electronic device 100 through the electronic device 100, for example, modifying their names. Illustratively, the electronic device 100 is paired with a headset, and the user may rename the headset "Zhang San's FreeBuds" through the electronic device 100. When the electronic device discovers the headset, the device connection status diagram 2211 shown in fig. 22A may display a FreeBuds icon together with the device name "Zhang San's FreeBuds".
The method for determining the name of the electronic device is not limited in the embodiment of the present application.
The add control 2212 can be used to trigger the electronic device 100 to search for other electronic devices. For example, electronic devices located around the electronic device 100, electronic devices connected and communicating through the Internet, and the like.
In some embodiments, in the device connection state diagram 2211, the electronic devices represented by the device icons around the cell phone icon 505 may include same-account devices, cross-account devices that have completed device binding with the electronic device 100, and no-account devices. That is, the user can directly make the electronic device represented by a device icon work cooperatively with the electronic device 100 by dragging that device icon close to the cell phone icon 505.
In addition to the electronic devices represented by the device icons in the device connection status diagram 2211, there may be other electronic devices that can cooperate with the electronic device 100. In response to an operation on the add control 2212, the electronic device 100 may search for electronic devices whose device icons are not displayed in the device connection state diagram 2211, for example cross-account devices that have not completed device binding with the electronic device 100, no-account devices, and the like.
When electronic devices whose device icons are not displayed in the device connection state diagram 2211 are found, the electronic device 100 may display the device list 2213 of fig. 22B on the user interface 2210. The device list 2213 may display the device identifiers of the electronic devices found by the electronic device 100, so that the user can select an electronic device for device binding. A device identifier in the device list 2213 may include a device icon and a device name.
As shown in fig. 22B, the device list 2213 may include a smart band identifier 2213B, a Vision identifier 2213C, a Li Si's MatePad identifier 2213D, and a Li Si's FreeBuds identifier 2213E. The device identifier of any one of these electronic devices may be used to trigger the electronic device 100 to perform device binding with that electronic device. The device list 2213 may also include a search control 2213A and a cancel control 2213F. The search control 2213A may be used to trigger the electronic device 100 to continue searching for other electronic devices. The cancel control 2213F may be used to trigger the electronic device 100 to stop displaying the device list 2213.
As shown in fig. 22B, in response to an operation on the Li Si's MatePad identifier 2213D, the electronic device 100 may display the connection code setting box 2214 and input keyboard 2215 shown in fig. 22C. Wherein:
the connection code setting box 2214 may be used to set a connection code for device binding between the electronic device 100 and the electronic device named Li Si's MatePad. The connection code setting box 2214 may contain a setting prompt, for example the text message: "Link to Li Si's MatePad. Please enter a connection code to be verified on the peer device." The connection code setting box 2214 may also contain a cancel control 2214A and a confirm control 2214B. The cancel control 2214A may be used to cancel the device binding between the electronic device 100 and the electronic device named Li Si's MatePad. The confirm control 2214B may be used to trigger the electronic device 100 to send a binding request for device binding to the electronic device named Li Si's MatePad.
The input keyboard 2215 may be used by the user to enter a connection code in the connection code setting box 2214. The input keyboard 2215 may include numeric keys, letter keys, character keys, and the like; that is, the connection code may include one or more of the following: numbers, letters, and characters. The embodiment of the present application does not limit the content of the connection code. For example, the connection code may consist of six digits: 710710.
As shown in fig. 22C, with a connection code entered in the connection code setting box 2214, the electronic device 100 may, in response to an operation on the confirm control 2214B, send a binding request for device binding to the electronic device 200. The electronic device 200 may be the electronic device named Li Si's MatePad above. A communication connection may be established between the electronic device 100 and the electronic device 200.
As shown in fig. 22D, while the electronic device 100 is requesting device binding with the electronic device 200, the electronic device 100 may display a waiting prompt box 2216. The waiting prompt box 2216 may be used to prompt the user that the electronic device 200 is being added to the super terminal whose core device is the electronic device 100, and that the consent of the user to whom the electronic device 200 belongs is awaited. The waiting prompt box 2216 may contain the text prompt: waiting for the other party to agree. The content of the waiting prompt box 2216 is not limited in the embodiment of the present application.
Upon receiving the binding request sent by the electronic device 100, the electronic device 200 may display the binding prompt box 2221 shown in fig. 22D. The binding prompt box 2221 may be used to prompt the user that the electronic device 100 requests to add the electronic device 200 to the super terminal so that the electronic device 200 can provide scenarized services in the super terminal, and to ask whether the user agrees.
The binding prompt box 2221 may contain a decline control 2221A and a consent control 2221B. The decline control 2221A may be used to refuse to add the electronic device 200 to the super terminal whose core device is the electronic device 100. In response to an operation on the decline control 2221A, the electronic device 200 may send a refusal message to the electronic device 100, and the electronic device 100 may then prompt the user that the electronic device 200 refused to join the super terminal. The consent control 2221B may be used to agree to add the electronic device 200 to the super terminal whose core device is the electronic device 100. In response to an operation on the consent control 2221B, the electronic device 200 may display the connection code input box 2222 shown in fig. 22E, which is used to enter a connection code. When the connection code entered in the connection code input box 2222 is the same as the connection code set by the user in the connection code setting box 2214 shown in fig. 22C, the electronic device 100 and the electronic device 200 can complete the binding.
As shown in fig. 22E, the connection code entered in the connection code input box 2222 is 710710, the same as the connection code set in the connection code setting box 2214 shown in fig. 22C. In response to an operation on the confirm control 2222A in the connection code input box 2222, the electronic device 200 may send a consent message to the electronic device 100. The consent message may include the connection code entered in the connection code input box 2222. Upon receiving the consent message, the electronic device 100 may verify whether the connection code it contains is correct. Upon detecting that the connection code is correct, the electronic device 100 may display the user interface 2210 shown in fig. 22F.
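The exchange in fig. 22C to fig. 22F reduces to: the requesting device sets a code and sends a binding request; the peer's user either declines or enters a code that is carried back in the consent message; the requester then compares the codes. A transport-agnostic sketch follows; the message shapes are illustrative assumptions:

```kotlin
// Transport-agnostic sketch of the connection-code binding handshake.
// Message shapes and results are illustrative assumptions.
sealed class BindingReply
object Declined : BindingReply()
data class Consent(val enteredCode: String) : BindingReply()

class CoreDevice(private val expectedCode: String) {
    /** Returns true when the peer consents with the matching connection code. */
    fun handleReply(reply: BindingReply): Boolean = when (reply) {
        is Declined -> false                            // prompt: peer refused to join
        is Consent -> reply.enteredCode == expectedCode // verify the code in the consent message
    }
}

fun main() {
    val core = CoreDevice(expectedCode = "710710")
    println(core.handleReply(Consent("710710"))) // true: binding completes, icon shown
    println(core.handleReply(Consent("000000"))) // false: verification fails
    println(core.handleReply(Declined))          // false: user declined
}
```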
As shown in fig. 22F, the electronic device 100 may display a Li Si's MatePad icon 2211A in the device connection status illustration 2211 of the user interface 2210. The Li Si's MatePad icon 2211A may represent the electronic device named Li Si's MatePad, namely the electronic device 200.
It can be seen that when the electronic device 200 agrees to join the super terminal, the electronic device 100 may display the identifier of the electronic device 200 in the device connection state diagram 2211, so that the user can quickly make the electronic device 100 and the electronic device 200 work cooperatively.
As shown in fig. 22G, in response to an operation of dragging the Li Si's MatePad icon 2211A close to the cell phone icon 505, the electronic device 100 may display the user interface 2210 shown in fig. 22H.
As shown in fig. 22H, the Li Si's MatePad icon may be displayed on the user interface 2210 in the style of icon 2211B, attached to the edge of the cell phone icon 505.
The user interface 2210 may contain a prompt 2217: collaborating with Li Si's MatePad. The prompt 2217 can inform the user that the electronic device 100 and the electronic device 200 are currently connected and can cooperate to provide scenarized services for the user.
The user interface 2210 shown in fig. 22H may also contain a recommended device area 2218A, a scenarized service option area 2218B, and an atomic service entry option 2219. The recommended device area 2218A may be used to list devices that can be linked with the super terminal composed of the currently cooperating devices (e.g., the super terminal composed of the electronic device 100 and the electronic device 200) to provide the user with richer scenarios or a better experience. The scenarized service option area 2218B may be used to list one or more scenarized services recommended by the super terminal according to the current device combination. For example, the scenarized services that the mobile phone and Li Si's MatePad (i.e., a tablet computer) can cooperatively provide include: multi-screen collaboration, camera collaboration, cross-screen conference, screen extension, and the like. The atomic service entry option 2219 can be used to display atomic service combinations, from which the user can freely choose the device providing each atomic service.
For the electronic device 100 and the electronic device 200 cooperating to provide scenarized services, reference may be made to the embodiments described in fig. 6 to fig. 10, which are not detailed herein.
As can be seen from the scenarios shown in fig. 22A to 22H, a super terminal whose core device is the electronic device 100 may admit cross-account electronic devices. In this way, the user to whom the electronic device 100 belongs can cooperate not only with his own electronic devices but also with the electronic devices of other users, enjoying more convenient and richer scenarized services. By setting a connection code, the user can request a cross-account device to complete device binding and thereby establish a connection relationship for cooperative work. This reduces the chance that a user's device is used by other users without consent or knowledge; moreover, the connection code verification process ensures that the electronic device bound to the electronic device 100 is the intended one, improving the user experience.
Fig. 22I and fig. 22J illustrate a unified entry for interaction between electronic devices in some super terminals.
As shown in fig. 22I, the electronic device 100 may display a user interface 2210. The user interface 2210 may include a device connection status illustration 2231. Device icons for one or more electronic devices may be displayed in the device connection status diagram 2231. Such as cell phone icon 505. The cell phone icon 505 may represent a core device of the super terminal, i.e., the electronic device 100. The device icon of the electronic device searched by the electronic device 100 may also be included in the device connection state diagram 2231.
In some embodiments, the electronic device 100 may display the device icons of same-account devices on a circle centered on the cell phone icon 505 with a radius of distance A2, for example Zhang San's MateBook icon, Zhang San's MatePad icon, and Zhang San's FreeBuds icon. The electronic device 100 may display the device icons of cross-account devices and no-account devices on a circle centered on the cell phone icon 505 with a radius of distance A3, for example Li Si's FreeBuds icon, Li Si's MatePad icon, and the smart watch icon. The distance A3 may be greater than the distance A2.
When a device icon is dragged into the circular area centered on the cell phone icon 505 with radius A1 and the user releases the finger, the icon can snap to the cell phone icon 505, and the corresponding electronic device can cooperate with the electronic device 100. For this process, reference may be made to the scenarios shown in fig. 6 and fig. 7. The distance A1 may be smaller than the distance A2.
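For illustration, the layout and snap rules of fig. 22I can be sketched as three radii with A1 < A2 < A3: same-account icons on the A2 ring, other icons on the A3 ring, and attachment when a drag is released inside the A1 circle. The concrete distance values below are placeholders:

```kotlin
import kotlin.math.cos
import kotlin.math.sin

// Illustrative radial placement of device icons around the core icon.
// A1 < A2 < A3 as described; the concrete values are placeholders.
const val A1 = 60f   // snap zone radius
const val A2 = 150f  // ring for same-account devices
const val A3 = 240f  // ring for cross-account and no-account devices

data class IconPos(val x: Float, val y: Float)

fun ringRadius(sameAccount: Boolean): Float = if (sameAccount) A2 else A3

/** Evenly spaces [count] icons on the ring chosen by account relation. */
fun placeOnRing(index: Int, count: Int, sameAccount: Boolean): IconPos {
    val r = ringRadius(sameAccount)
    val angle = 2 * Math.PI * index / count
    return IconPos((r * cos(angle)).toFloat(), (r * sin(angle)).toFloat())
}

fun snapsToCore(dragX: Float, dragY: Float): Boolean =
    dragX * dragX + dragY * dragY <= A1 * A1 // inside the A1 circle: attach on release

fun main() {
    println(placeOnRing(0, 3, sameAccount = true))  // on the A2 ring
    println(placeOnRing(1, 3, sameAccount = false)) // on the A3 ring
    println(snapsToCore(30f, 30f))                  // true
}
```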
As shown in fig. 22J, the electronic device 100 may display a user interface 2210. The user interface 2210 may include the device connection status diagram 2211 and a non-same-account device display area 2232. For the device connection status diagram 2211, reference may be made to the description of the foregoing embodiments. The non-same-account device display area 2232 may be used to display the device icons of cross-account devices and no-account devices, for example Li Si's FreeBuds icon, Li Si's MatePad icon, and the smart watch icon. The electronic devices represented by the device icons in the non-same-account device display area 2232 may all have been discovered by the electronic device 100.
In some embodiments, in response to an operation of dragging a device icon in the device connection status diagram 2211 to within a preset distance of a device icon in the non-same-account device display area 2232, the electronic device 100 may snap the two device icons together, and the electronic devices represented by the two device icons can establish cooperation to provide scenarized services. Likewise, in response to an operation of dragging a device icon in the non-same-account device display area 2232 to within a preset distance of a device icon in the device connection status diagram 2211, the electronic device 100 may snap the two device icons together, and the electronic devices represented by the two device icons can establish cooperation to provide scenarized services.
That is, the user may drag a device icon in either area close to a device icon in the other area, so that the two icons are snapped together and the electronic devices they represent establish cooperation.
The embodiment of the present application does not limit the presentation form of the unified entry for interaction between electronic devices in the super terminal. The electronic device 100 may also display the core device and associated devices of the super terminal in other presentation forms.
In some embodiments, the electronic devices represented by the device icons in the device connection state diagram 2211 shown in fig. 22A may further include cross-account devices that have been discovered by the electronic device 100 but have not been bound to it. In response to an operation of making such an unbound cross-account device cooperate with the electronic device 100, the electronic device 100 may first perform device binding with it; the two devices may then cooperate to provide scenarized services.
Another scenario in which an electronic device joins the super terminal and performs device binding is described below.
Fig. 23A to 23D are schematic diagrams illustrating a scenario in which an electronic device joins a super terminal and performs device binding.
In response to an operation to open the super terminal function related device interaction interface (e.g., a touch operation on the super terminal function card 414 shown in FIG. 4 described above), the electronic device 100 may display a user interface 2210 as shown in FIG. 23A.
The user interface 2210 may include the device connection status diagram 2211, for which reference may be made to the foregoing embodiments. In the device connection status diagram 2211 of fig. 23A, a Li Si's MatePad icon 2311 may be displayed around the cell phone icon 505. The icon 2311 may represent the electronic device named Li Si's MatePad, such as the electronic device 200 shown in fig. 22D. The account logged in on the electronic device 200 may be Li Si's account, which is different from the account logged in on the electronic device 100. The electronic device 100 may have discovered the electronic device 200 and therefore displays the Li Si's MatePad icon 2311 in the device connection status diagram 2211; the electronic device 100 is not yet device-bound with the electronic device 200.
The user interface 2210 may also contain an add control 2212. In response to an operation on the add control 2212, the electronic device 100 may search for more electronic devices that the user can choose to add to the super terminal.
As shown in fig. 23B, in response to an operation of making the electronic device 200 cooperate with the electronic device 100, for example an operation of dragging the Li Si's MatePad icon 2311 close to the cell phone icon 505, the electronic device 100 may perform device binding with the electronic device 200.
In some embodiments, in response to the above operation of making the electronic device 200 cooperate with the electronic device 100, the electronic device 100 may first determine whether the electronic device 200 is a cross-account device. If the electronic device 200 is a cross-account device, the electronic device 100 may then determine whether it is already device-bound with the electronic device 200. If not, the electronic device 100 may perform device binding with the electronic device 200.
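The check sequence in this paragraph can be written as a short guard chain, sketched below with assumed flags for the account relation and binding state:

```kotlin
// Hypothetical sketch of the cooperate-or-bind-first decision in fig. 23B.
data class PeerDevice(val isCrossAccount: Boolean, val isBound: Boolean)

fun onCooperateRequested(peer: PeerDevice): String = when {
    !peer.isCrossAccount -> "cooperate directly"     // same-account path, no binding needed
    peer.isBound -> "cooperate directly"             // binding was already completed earlier
    else -> "perform device binding, then cooperate" // unbound cross-account device
}

fun main() {
    println(onCooperateRequested(PeerDevice(isCrossAccount = false, isBound = false)))
    println(onCooperateRequested(PeerDevice(isCrossAccount = true, isBound = true)))
    println(onCooperateRequested(PeerDevice(isCrossAccount = true, isBound = false)))
}
```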
In some embodiments, when the Li Si's MatePad icon 2311 is dragged close to the cell phone icon 505, the electronic device 100 may display the Li Si's MatePad icon in the style of the icon 2312 shown in fig. 23B. The icon 2312 differs from the icon 2311 so that the user can distinguish which electronic devices have established a cooperative connection with the electronic device 100.
As shown in fig. 23C, during the device binding between the electronic device 100 and the electronic device 200, the electronic device 100 may display a connection code setting box 2313 and an input keyboard 2314 for the user to enter a connection code for device binding with the electronic device 100. The electronic device 100 and the electronic device 200 may perform device binding according to the embodiments described in fig. 22C to fig. 22E, which are not detailed herein.
When the electronic device 100 and the electronic device 200 complete the device binding, the electronic device 100 may display the user interface 2210 shown in fig. 23D, for which reference may be made to the user interface shown in fig. 22H; details are not repeated herein.
In some embodiments, the electronic device 100 may display the device icons of discovered electronic devices in the device connection status diagram 2211 according to a preset rule, for example when the number of device icons that can be displayed there is limited. The electronic device 100 may rank devices by the priority same-account device > no-account device > cross-account device, and display the icons of higher-priority devices. Alternatively, the electronic device 100 may select devices closer to the electronic device 100 according to the distance of each discovered device. Alternatively, the electronic device 100 may select devices with higher signal strength according to the signal strength corresponding to each discovered device. The embodiment of the present application does not limit the preset rule for displaying device icons in the device connection status diagram 2211.
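Each example rule above is simply a different sort key over the discovered devices, truncated to the display limit. In the following sketch, the priority ordering follows the text, while the remaining names and values are assumptions:

```kotlin
// Illustrative selection of which discovered devices get an icon when
// the diagram can only show a limited number.
enum class AccountRelationKind(val priority: Int) {
    SAME_ACCOUNT(0), NO_ACCOUNT(1), CROSS_ACCOUNT(2) // same > none > cross, per the text
}

data class Discovered(
    val name: String,
    val relation: AccountRelationKind,
    val distanceMeters: Double,
    val signalStrength: Int
)

fun byPriority(devs: List<Discovered>, limit: Int) =
    devs.sortedBy { it.relation.priority }.take(limit)

fun byDistance(devs: List<Discovered>, limit: Int) =
    devs.sortedBy { it.distanceMeters }.take(limit)

fun bySignal(devs: List<Discovered>, limit: Int) =
    devs.sortedByDescending { it.signalStrength }.take(limit)

fun main() {
    val devs = listOf(
        Discovered("Li Si's MatePad", AccountRelationKind.CROSS_ACCOUNT, 2.0, -40),
        Discovered("printer", AccountRelationKind.NO_ACCOUNT, 8.0, -70),
        Discovered("Zhang San's MateBook", AccountRelationKind.SAME_ACCOUNT, 5.0, -55)
    )
    println(byPriority(devs, 2).map { it.name }) // [Zhang San's MateBook, printer]
}
```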
As can be seen from the scenarios shown in fig. 23A to 23D, the electronic device 100 may display the identifiers of discovered electronic devices (such as same-account devices, bound cross-account devices, unbound cross-account devices, and no-account devices) around the core device icon of the super terminal. The user can thus quickly establish a cooperative connection between different electronic devices by dragging device icons. When a dragged device icon represents an unbound cross-account device, the electronic device 100 may first establish a binding relationship with it and then establish the cooperative connection. In this way, the user to whom the electronic device 100 belongs can cooperate not only with his own electronic devices but also with the electronic devices of other users, enjoying more convenient and richer scenarized services.
In some embodiments, when an electronic device (e.g., a cross-account device) that has already been device-bound with the core device of a super terminal joins the super terminal again, the device binding process may be simplified. For example, if the electronic device 200 has been device-bound with the electronic device 100, then when the electronic device 200 joins the super terminal whose core device is the electronic device 100 again, the electronic device 100 and the electronic device 200 may verify each other's identities to complete the binding, without requiring the user to enter a connection code again. This simplifies the user operations for rejoining the super terminal, lets the user quickly use the scenarized services the super terminal provides, and improves the user experience.
Another scenario in which an electronic device provided in the embodiment of the present application joins a super terminal and performs device binding is described below.
Fig. 24A to 24D are schematic diagrams illustrating a scenario in which an electronic device joins a super terminal and performs device binding.
As shown in fig. 24A, the electronic device 100 may display a user interface 2210. The user interface 2210 may be as described above with reference to the embodiment depicted in FIG. 22A.
In fig. 24A, the user interface 2210 may include the device connection status diagram 2211, in which a Li Si's MatePad icon 2411 and a Li Si's FreeBuds icon 2412 may be displayed around the cell phone icon 505. The Li Si's MatePad icon 2411 may represent the electronic device named Li Si's MatePad, such as the electronic device 200 shown in fig. 22D; the account logged in on the electronic device 200 may be Li Si's account, which is different from the account logged in on the electronic device 100 represented by the cell phone icon 505. The Li Si's FreeBuds icon 2412 may represent the electronic device named Li Si's FreeBuds, which may be an electronic device associated with Li Si's account.
It can be seen that both the electronic device named Li Si's MatePad and the electronic device named Li Si's FreeBuds may join, as cross-account devices, the super terminal whose core device is the electronic device 100.
In some embodiments, the device icon of a cross-account device that has been device-bound with the electronic device 100 and that of a cross-account device that has not may be presented differently. For example, the electronic device named Li Si's MatePad (hereinafter Li Si's MatePad) has been device-bound with the electronic device 100, while the electronic device named Li Si's FreeBuds (hereinafter Li Si's FreeBuds) has not. The Li Si's MatePad icon 2411 may be displayed with a dark fill as shown in fig. 24A, and the Li Si's FreeBuds icon 2412 with a light fill. The different presentations help the user distinguish which cross-account devices have been device-bound with the electronic device 100 and which have not.
In some embodiments, when a cross-account device joins the super terminal whose core device is the electronic device 100 for the first time, the cross-account device and the electronic device 100 may perform the device binding illustrated in fig. 22C to 22E. During the binding, the cross-account device and the electronic device 100 may store each other's device identifiers. When the cross-account device leaves the super terminal and later joins it again, the electronic device 100 and the cross-account device may use the stored device identifiers to verify each other's identities and thereby complete the binding. That is, when the user rejoins a cross-account device that has previously joined the super terminal, there is no need to set a connection code for device binding again.
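The simplified rejoin flow reduces to: on the first, connection-code-based binding both sides store each other's device identifier; on rejoin, a binding request carrying that identifier is accepted after the owner's consent, without a new connection code. A sketch, with the identifier handling shown being an assumption:

```kotlin
// Illustrative sketch of skipping the connection code on rejoin.
class BoundPeers {
    private val knownIds = mutableSetOf<String>()

    /** Called after a successful first, connection-code-based binding. */
    fun remember(deviceId: String) { knownIds += deviceId }

    /**
     * Returns true if [deviceId] was bound before: identity is verified by the
     * stored identifier, and only the owner's consent is still required,
     * with no new connection code.
     */
    fun canRejoinWithoutCode(deviceId: String): Boolean = deviceId in knownIds
}

fun main() {
    val peers = BoundPeers()
    peers.remember("lisi-matepad-01")                       // first join used code 710710
    println(peers.canRejoinWithoutCode("lisi-matepad-01"))  // true: no code needed
    println(peers.canRejoinWithoutCode("lisi-freebuds-02")) // false: full binding flow
}
```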
Exemplarily, Li Si's MatePad (i.e., the electronic device 200) has been device-bound with the electronic device 100. In response to the operation shown in fig. 24B of dragging the Li Si's MatePad icon 2411 close to the cell phone icon 505 (i.e., the operation of rejoining the electronic device 200 to the super terminal whose core device is the electronic device 100), the electronic device 100 may send the electronic device 200 a binding request, which may contain the device identifier used for identity verification.
While the electronic device 100 requests the electronic device 200 to complete the device binding, the electronic device 100 may display the user interface shown in fig. 24C, prompting the user to wait for the consent of the user to whom the electronic device 200 belongs.
Upon receiving the binding request from the electronic device 100, the electronic device 200 may determine, according to the device identifier in the binding request, that it has previously completed connection-code device binding with the electronic device 100. The electronic device 200 may further display the user interface 2220 shown in fig. 24C to ask the user whether to allow the user to whom the electronic device 100 belongs to use the electronic device 200 for cooperative work. For the user interface 2220 shown in fig. 24C, reference may be made to the description of the user interface 2220 shown in fig. 22D; details are not repeated herein.
Upon detecting an operation in which the user agrees to bind the electronic device 200 with the electronic device 100, the electronic device 200 may send a consent message to the electronic device 100. Upon receiving the consent message, the electronic device 100 may add the electronic device 200 to the super terminal, thereby establishing a cooperative connection with the electronic device 200 and providing scenarized services for the user. After the electronic device 200 joins the super terminal, the electronic device 100 may display the user interface 2210 shown in fig. 24D, through which the user can control and manage the scenarized services provided by the cooperation between the electronic device 100 and the electronic device 200. For the user interface 2210 shown in fig. 24D, reference may be made to the user interface 2210 described in fig. 22H; details are not repeated herein.
It can be understood that the user to whom the electronic device 100 belongs and the user to whom the electronic device 200 belongs may be different. Even though the electronic device 200 has previously joined the super terminal whose core device is the electronic device 100, when the user of the electronic device 100 wants to use the electronic device 200 for cooperative work, the user of the electronic device 200 should still be informed. This reduces the chance that a cross-account device is used by other users without its owner's consent or knowledge.
As can be seen from the scenarios shown in fig. 24A to 24D, when an electronic device has previously joined the super terminal, the user operations for rejoining it and performing device binding can be simpler, which improves the experience of using the super terminal.
In some embodiments, a cross-account device joining the super terminal is paired with another electronic device and is thereby associated with the account logged in on that paired electronic device. When such a cross-account device joins the super terminal whose core device is the electronic device 100, the electronic device 100 may complete device binding with the cross-account device through the electronic device paired with it.
Another such scenario in which an electronic device joins the super terminal and performs device binding is described below.
Fig. 25A to 25F are schematic diagrams illustrating a scenario in which an electronic device joins a super terminal and performs device binding.
As shown in fig. 25A, the electronic device 100 may display a user interface 2210. The user interface 2210 may include the device connection status diagram 2211, in which a Li Si's FreeBuds icon 2511 may be displayed around the cell phone icon 505. The Li Si's FreeBuds icon 2511 may represent the electronic device named Li Si's FreeBuds (hereinafter Li Si's FreeBuds). Li Si's FreeBuds may be paired with an electronic device logged in with Li Si's account, such as Li Si's mobile phone. Li Si's account is different from the account logged in on the electronic device 100; that is, Li Si's FreeBuds may join the super terminal whose core device is the electronic device 100 as a cross-account device.
Other contents of the user interface 2210 may refer to the description of the foregoing embodiments. And will not be described in detail herein.
In response to the operation of dragging the Li Si's FreeBuds icon 2511 close to the cell phone icon 505, the electronic device 100 may perform device binding with Li Si's FreeBuds. The electronic device 100 may display the connection code setting box 2513 and input keyboard 2514 shown in fig. 25B for the user to set a connection code for device binding. For the user interface shown in fig. 25B, reference may be made to the description of the user interface shown in fig. 22C; details are not repeated herein.
Upon detecting that the connection code has been set in the connection code setting box 2513, the electronic device 100 may send a binding request for device binding to Li Si's FreeBuds and display the user interface 2210 shown in fig. 25C.
In some embodiments, Li Si's FreeBuds is paired with the electronic device 300. The electronic device 300 may be, for example, Li Si's mobile phone, and the account logged in on it may be Li Si's account. Upon receiving the binding request from the electronic device 100, Li Si's FreeBuds can forward the binding request to the electronic device 300. The electronic device 300 may then display the user interface 2520 shown in fig. 25D.
As shown in fig. 25D, the user interface 2520 may include a binding prompt box 2521, which may contain a decline control 2521A and a consent control 2521B; for the binding prompt box, reference may be made to the binding prompt box 2221 shown in fig. 22D. In response to an operation on the consent control 2521B, the electronic device 300 may display the connection code input box 2522 shown in fig. 25E, for the user to enter the connection code for device binding between Li Si's FreeBuds and the electronic device 100. The connection code input box 2522 may include a confirm control 2522A; for the connection code input box 2522, reference may be made to the connection code input box 2222 shown in fig. 22E.
As shown in fig. 25E, the connection code input in the connection code input box 2522 is 710710, which is the same as the connection code set in the connection code setting box 2513 shown in fig. 25B. Then, in response to an operation on the determination control 2522A, the electronic device 300 can send a consent message to the electronic device 100 via Li Si's FreeBuds. The consent message may include the connection code input in the connection code input box 2522. Upon receiving the consent message, the electronic device 100 may verify whether the connection code included in the consent message is accurate. When detecting that the connection code included in the consent message is accurate, the electronic device 100 may display the user interface 2210 shown in fig. 25F. In the user interface shown in fig. 25F, Li Si's FreeBuds icon 2512 may be attached to the mobile phone icon 505. Li Si's FreeBuds icon 2512 can be presented differently from Li Si's FreeBuds icon 2511 shown in fig. 25A, thereby helping the user distinguish whether Li Si's FreeBuds has cooperated with the electronic device 100.
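The connection code check above amounts to comparing a shared secret: the initiating device keeps the code it set, and accepts the binding only if the consent message echoes the same code. A minimal sketch of this verification step follows; the message type, field names, and device identifier are illustrative assumptions, not interfaces defined in this application.

```kotlin
// Hypothetical message and session types; names and fields are assumptions.
data class ConsentMessage(val deviceId: String, val connectionCode: String)

class BindingSession(private val expectedCode: String) {
    // The binding is accepted only when the consent message echoes the
    // connection code set in the connection code setting box.
    fun verify(consent: ConsentMessage): Boolean =
        consent.connectionCode == expectedCode
}

fun main() {
    val session = BindingSession(expectedCode = "710710")
    val consent = ConsentMessage(deviceId = "FreeBuds", connectionCode = "710710")
    if (session.verify(consent)) {
        println("connection code accurate: complete device binding")
    } else {
        println("connection code mismatch: reject binding")
    }
}
```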
The user interface 2210 shown in fig. 25F may also contain a prompt 2531: "Cooperating with Li Si's FreeBuds". The prompt 2531 can inform the user that the electronic device 100 and Li Si's FreeBuds have established a cooperative communication connection to provide the user with a scenarized service.
The user interface 2210 shown in fig. 25F may also include a scenarized service option area 2532 and an atomized service entry option 2533. The scenarized service option area 2532 may be used to list one or more scenarized services recommended by the super terminal (e.g., the super terminal composed of the electronic device 100 and Li Si's FreeBuds) for the current device combination. For example, the scenarized services that the mobile phone and Li Si's FreeBuds can provide include: audio sharing and audio transfer. Audio sharing can be used to share one or more audio streams played by the electronic device 100 to Li Si's FreeBuds. Li Si's FreeBuds can play the audio shared by the electronic device 100, and the electronic device 100 can continue to play the shared audio while sharing it. Audio transfer can be used to transfer one or more audio streams played by the electronic device 100 to Li Si's FreeBuds. Li Si's FreeBuds can play the audio transferred by the electronic device 100. After the electronic device 100 transfers the audio to Li Si's FreeBuds, the electronic device 100 no longer plays the transferred audio.
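The difference between the two services is whether the source device keeps playing: sharing duplicates the audio stream, while transfer hands it off. A minimal sketch of that distinction; the interface and class names are illustrative assumptions.

```kotlin
// Illustrative sketch; AudioSink and the device classes are assumed names.
interface AudioSink {
    fun play(stream: String)
    fun stop(stream: String)
}

class LoggingSink(private val name: String) : AudioSink {
    override fun play(stream: String) = println("$name plays $stream")
    override fun stop(stream: String) = println("$name stops $stream")
}

class SourceDevice(private val local: AudioSink) {
    // Audio sharing: the remote device plays, and the source keeps playing.
    fun share(stream: String, remote: AudioSink) {
        remote.play(stream)
        local.play(stream) // continue playing the shared audio locally
    }

    // Audio transfer: the remote device plays, and the source stops playing.
    fun transfer(stream: String, remote: AudioSink) {
        remote.play(stream)
        local.stop(stream) // the transferred audio is no longer played locally
    }
}

fun main() {
    val phone = SourceDevice(LoggingSink("phone"))
    val earbuds = LoggingSink("FreeBuds")
    phone.share("song.mp3", earbuds)    // both devices play
    phone.transfer("song.mp3", earbuds) // only the earphones play
}
```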
As can be seen from the scenarios shown in fig. 25A to fig. 25F, when the electronic device 100 performs device binding with a cross-account device, the device binding may be completed through an electronic device paired with the cross-account device, so that the cross-account device can join the super terminal whose core device is the electronic device 100. The electronic device paired with the cross-account device can better remind the user to whom the cross-account device belongs that another user wants to use the cross-account device for cooperative work, and makes it convenient for that user to confirm whether the cross-account device is allowed to join the super terminal.
In some embodiments, the electronic device paired with a cross-account device is itself paired with other electronic devices. Those other electronic devices may also display the message inviting the cross-account device to join the super terminal for device binding.
As can be seen from the foregoing scenarios shown in fig. 25A to 25C, the electronic device 100 may send a binding request for device binding to the cross-account device, Li Si's FreeBuds. As shown in fig. 25G, Li Si's FreeBuds is paired with the electronic device 300. The electronic device 300 may be Li Si's mobile phone. In addition to Li Si's FreeBuds described above, the electronic devices paired with the electronic device 300 may include the smart watch shown in fig. 25G, which may be Li Si's smart watch. Li Si's FreeBuds can forward the binding request from the electronic device 100 to the electronic device 300. Upon receiving the binding request, the electronic device 300 may display the binding prompt box 2521 shown in fig. 25D described above. In addition, the electronic device 300 may also send the binding request to the smart watch paired with itself. The smart watch may display the user interface 2520 shown in fig. 25G. The user interface 2520 may be used to prompt the user that Zhang San's mobile phone (i.e., the electronic device 100) requests to use Li Si's FreeBuds so that Li Si's FreeBuds can provide a scenarized service in the super terminal, and to ask the user whether to agree.
A decline control 2521 and a consent control 2522 may be included in the user interface 2520. The decline control 2521 may be used to decline Li Si's FreeBuds joining the super terminal whose core device is the electronic device 100. The consent control 2522 may be used to agree to Li Si's FreeBuds joining the super terminal whose core device is the electronic device 100. In response to an operation on the consent control 2522, the smart watch may display the user interface 2530 illustrated in fig. 25H. The user interface 2530 may be used to input a connection code to complete the device binding with the electronic device 100. The user interface 2530 may include a determination control 2531.
As shown in fig. 25H, in the event that a connection code (e.g., 710710) is input in the user interface 2530, the smart watch may send a consent message to Li Si's FreeBuds in response to an operation on the determination control 2531. The consent message may be sent to Li Si's FreeBuds via the electronic device 300. Li Si's FreeBuds may then send the consent message to the electronic device 100. The electronic device 100 and Li Si's FreeBuds can thereby complete the device binding.
It can be seen from the above embodiments that the user can use the electronic device 300 and the smart watch to complete the device binding between the electronic device 100 and Li Si's FreeBuds.
Another scenario, provided in an embodiment of the present application, in which an electronic device joins a super terminal and performs device binding is described below.
Fig. 26A to 26C are schematic diagrams illustrating a scenario in which an electronic device joins a super terminal and performs device binding.
As shown in fig. 26A, the electronic device 100 may display the user interface 2210. The user interface 2210 may refer to the description of the user interface 2210 shown in fig. 25A and will not be described in detail herein. In response to the operation of dragging Li Si's FreeBuds icon 2511 close to the mobile phone icon 505, the electronic device 100 can perform device binding with Li Si's FreeBuds.
In some embodiments, during the device binding between the electronic device 100 and Li Si's FreeBuds, the electronic device 100 can display the binding prompt box 2611 shown in fig. 26B and send a binding request for device binding to Li Si's FreeBuds.
As shown in fig. 26B, the binding prompt box 2611 may be used to prompt the user that completing the device binding with Li Si's FreeBuds may require an operation on Li Si's FreeBuds. For example, the binding prompt box 2611 may contain a text prompt: "Requesting to establish a cooperative connection with Li Si's FreeBuds. The cooperative connection can be established after the other party agrees. To agree, long press the earphone handle of the left or right earphone." The embodiment of the present application does not limit the content of the prompt in the binding prompt box 2611.
As shown in fig. 26B, Li Si's FreeBuds may send a consent message to the electronic device 100 in response to a long press operation on Li Si's FreeBuds (e.g., on the earphone handle of the right earphone). Upon receiving the consent message, the electronic device 100 may display the user interface 2210 shown in fig. 26C. The user interface 2210 shown in fig. 26C may refer to the description of the user interface 2210 shown in fig. 25F and will not be described in detail herein.
In some embodiments, when receiving the binding request sent by the electronic device 100, Li Si's FreeBuds can prompt the user, through voice playback, blinking of an indicator light, and the like, that the electronic device 100 requests device binding, and guide the user to complete the user operation required for the device binding.
In some embodiments, the device binding process shown in fig. 26A to 26C may be a process in which Li Si's FreeBuds rejoins the super terminal and binds with the electronic device 100 again. It can be seen that the device binding procedure shown in fig. 26A to 26C is simpler than the device binding procedure shown in fig. 25A to 25F. When the electronic device 100 performs device binding with Li Si's FreeBuds for the first time, the device binding can be performed according to the process shown in fig. 25A to 25F, so that the electronic device 100 and Li Si's FreeBuds can verify each other's identity more accurately through the connection code, reducing the possibility of device binding errors. When Li Si's FreeBuds, having already been device-bound, joins the super terminal again and performs device binding with the electronic device 100 again, the device binding may be performed according to the procedure shown in fig. 26A to 26C. This can simplify the user operation of device binding when the user joins the cross-account device to the super terminal again, and improve the user experience.
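The choice between the two binding flows can be sketched as a lookup of binding history: a first binding goes through connection code verification (fig. 25A to 25F), while rejoining only needs a lightweight confirmation on the peer (fig. 26A to 26C). The store and flow names below are illustrative assumptions.

```kotlin
// Sketch of selecting a binding flow from binding history; names assumed.
enum class BindingFlow { CONNECTION_CODE, SIMPLE_CONFIRM }

class BindingHistory {
    private val bound = mutableSetOf<String>()

    fun record(deviceId: String) { bound += deviceId }

    // First binding: set and verify a connection code.
    // Rejoining: a lightweight confirmation on the peer suffices,
    // e.g., a long press on the earphone handle.
    fun flowFor(deviceId: String): BindingFlow =
        if (deviceId in bound) BindingFlow.SIMPLE_CONFIRM
        else BindingFlow.CONNECTION_CODE
}

fun main() {
    val history = BindingHistory()
    println(history.flowFor("FreeBuds")) // CONNECTION_CODE (first binding)
    history.record("FreeBuds")
    println(history.flowFor("FreeBuds")) // SIMPLE_CONFIRM (rejoining)
}
```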
As can be seen from the foregoing embodiments, the device connection state diagram 2211 shown in fig. 22A may provide a unified entry for interaction among multiple electronic devices, making it convenient for users to perform cooperative work with different electronic devices. In some embodiments, the user may also remove one or more device icons from the device connection state diagram 2211 as desired.
A scenario, provided in an embodiment of the present application, in which cooperative work between electronic devices is stopped and an electronic device is removed from the super terminal is described below.
Fig. 27A to 27E are schematic views illustrating a scenario in which cooperative work between electronic devices is stopped and an electronic device is removed from the super terminal.
As shown in fig. 27A, the electronic device 100 may display the user interface 2210. The user interface 2210 may refer to the description of fig. 25F and will not be described in detail herein. Li Si's FreeBuds icon 2512 is attached to the mobile phone icon 505, which may indicate that Li Si's FreeBuds and the electronic device 100 have established a connection relationship capable of cooperation. In response to an operation, such as a long press operation, acting on Li Si's FreeBuds icon 2512, the electronic device 100 can move the position of Li Si's FreeBuds icon 2512 according to the operation that continues to act on it.
As shown in fig. 27B, in response to an operation of long-pressing and dragging Li Si's FreeBuds icon 2512, the electronic device 100 may display Li Si's FreeBuds icon 2512 at the position indicated by the operation. The electronic device 100 may also display a delete control 2711 in the user interface 2210. The delete control 2711 may be used to remove a device icon dragged to the location of the delete control 2711 from the device connection state diagram 2211.
In some embodiments, in response to dragging Li Si's FreeBuds icon 2512 to the position shown in fig. 27B and releasing it, the electronic device 100 may display the user interface 2210 shown in fig. 27C. Releasing Li Si's FreeBuds icon 2512 may mean that the finger operating the icon leaves the screen of the electronic device 100.
As shown in fig. 27C, the device icon of Li Si's FreeBuds may change from Li Si's FreeBuds icon 2512 to Li Si's FreeBuds icon 2511. Li Si's FreeBuds icon 2512, attached to the mobile phone icon 505, can indicate that Li Si's FreeBuds is cooperating with the electronic device 100 and can provide a scenarized service for the user. Li Si's FreeBuds icon 2511 is not attached to the mobile phone icon 505, which may indicate that Li Si's FreeBuds is not cooperating with the electronic device 100.
Illustratively, Li Si's FreeBuds has cooperated with the electronic device 100, with the electronic device 100 sharing audio to Li Si's FreeBuds, and Li Si's FreeBuds playing the audio shared by the electronic device 100. In response to an operation of dragging Li Si's FreeBuds icon 2512 away from the mobile phone icon 505, the electronic device 100 may stop cooperating with Li Si's FreeBuds. That is, the electronic device 100 can stop sharing audio to Li Si's FreeBuds.
In other words, in the case where Li Si's FreeBuds has cooperated with the electronic device 100, the user can stop the cooperative work between the electronic devices by dragging Li Si's FreeBuds icon away from the mobile phone icon 505.
As shown in fig. 27D, in response to an operation of dragging Li Si's FreeBuds icon 2512 to the location of the delete control 2711 described above, the electronic device 100 may display the user interface 2210 shown in fig. 27E, in which the electronic device 100 removes the device icon of Li Si's FreeBuds from the device connection state diagram 2211. In this way, the user can remove an electronic device that has joined the super terminal from the super terminal.
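The drag gestures in fig. 27A to 27E can be sketched as a release handler that compares the drop position with the mobile phone icon and the delete control. The geometry, radii, and result names below are illustrative assumptions.

```kotlin
import kotlin.math.hypot

// Geometry, radii, and result names are illustrative assumptions.
data class Point(val x: Float, val y: Float)

enum class DragResult { STOP_COOPERATION, REMOVE_FROM_SUPER_TERMINAL, NO_CHANGE }

fun onIconReleased(
    icon: Point,          // where the device icon was dropped
    phoneIcon: Point,     // center of the mobile phone icon 505
    deleteControl: Point, // center of the delete control 2711
    attachRadius: Float = 120f,
    deleteRadius: Float = 80f
): DragResult {
    fun dist(a: Point, b: Point) = hypot(a.x - b.x, a.y - b.y)
    return when {
        // Dropped on the delete control: remove the icon from the diagram.
        dist(icon, deleteControl) <= deleteRadius -> DragResult.REMOVE_FROM_SUPER_TERMINAL
        // Dragged away from the phone icon: stop cooperative work.
        dist(icon, phoneIcon) > attachRadius -> DragResult.STOP_COOPERATION
        else -> DragResult.NO_CHANGE
    }
}
```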
As can be seen from the scenarios shown in fig. 27A to 27E, the user operations for stopping cooperative work between electronic devices and for removing an electronic device from the super terminal are simple. The user can rapidly manage the electronic devices in the super terminal and control the cooperative work among them.
In some embodiments, when a cross-account device has joined the super terminal whose core device is the electronic device 100, the user to whom the cross-account device belongs may stop the cross-account device from working in cooperation with other electronic devices in the super terminal, and remove the cross-account device from the super terminal.
Another scenario, provided in an embodiment of the present application, for stopping cooperative work between electronic devices and removing an electronic device from the super terminal is described below.
Fig. 28A to 28E are schematic diagrams illustrating a scenario in which the cooperative work between the electronic devices is stopped and the electronic device is removed from the super terminal.
As shown in fig. 28A, the electronic device 100 may display the user interface 2210. The user interface 2210 may refer to the description of the user interface illustrated in fig. 25F described above. Li Si's FreeBuds icon 2512 is attached to the mobile phone icon 505, which may indicate that Li Si's FreeBuds is cooperating with the electronic device 100. Li Si's FreeBuds is a cross-account device in the super terminal whose core device is the electronic device 100.
As shown in fig. 28B, the electronic device 300 may display a user interface 2810. The electronic device 300 may be an electronic device paired with Li Si's FreeBuds. For example, the electronic device 300 may be a mobile phone on which Li Si's account is logged in. The user interface 2810 may be a device interaction interface, related to the super terminal function, of the super terminal whose core device is the electronic device 300.
The user interface 2810 may include a device connection state diagram 2811. The device connection state diagram 2811 may appear as a ring diagram, with a mobile phone icon 2801 displayed at its center. The mobile phone icon 2801 may indicate that the core device is the mobile phone, i.e., the electronic device 300. Device identifiers of the electronic devices searched by the electronic device 300 may surround the mobile phone icon 2801. A device identifier may include a device icon and a device name. For example, the following may be displayed around the mobile phone icon 2801: Li Si's FreeBuds icon 2811A, a Vision icon, a smart watch icon, Li Si's MateBook icon, Li Si's MatePad icon, and so on. Li Si's FreeBuds icon 2811A can represent Li Si's FreeBuds.
As shown in fig. 28C, in response to an operation of dragging Li Si's FreeBuds icon 2811A close to the mobile phone icon 2801, the electronic device 300 may display the user interface 2810 shown in fig. 28D. Li Si's FreeBuds can establish a connection relationship capable of cooperation with the electronic device 300, and stop the cooperation with the electronic device 100 shown in fig. 28A.
As shown in fig. 28D, when attached to the mobile phone icon 2801, the device icon of Li Si's FreeBuds can be displayed in the form shown by Li Si's FreeBuds icon 2811B. Li Si's FreeBuds icon 2811B attached to the mobile phone icon 2801 may indicate that Li Si's FreeBuds has cooperated with the electronic device 300.
When Li Si's FreeBuds stops working in cooperation with the electronic device 100, the electronic device 100 may display the user interface 2210 shown in fig. 28E. In fig. 28E, Li Si's FreeBuds icon 2511 is not attached to the mobile phone icon 505. That is, the cooperation between the electronic device 100 and Li Si's FreeBuds has ceased. As can be seen from a comparison between fig. 28A and fig. 28E, through the process, shown in fig. 28B to fig. 28D, of making Li Si's FreeBuds cooperate with the electronic device 300, Li Si's FreeBuds can stop cooperating with the electronic device 100 and exit the super terminal whose core device is the electronic device 100.
In some embodiments, when Li Si's FreeBuds stops cooperating with the electronic device 100, the electronic device 100 can also display a prompt message to inform the user that Li Si's FreeBuds has stopped cooperating.
Illustratively, while Li Si's FreeBuds cooperates with the electronic device 100 and plays the audio shared by the electronic device 100, Li Si's FreeBuds receives an instruction to cooperate with the electronic device 300 with which it is paired. Li Si's FreeBuds can then stop cooperating with the electronic device 100 and cooperate with the electronic device 300 instead. Li Si's FreeBuds can stop playing the audio shared by the electronic device 100 and play the audio transferred by the electronic device 300.
The embodiment of the present application does not limit the user operation for stopping an electronic device from cooperating with other electronic devices in the super terminal. For example, in response to an operation (e.g., a long press operation) acting on Li Si's FreeBuds icon 2811A shown in fig. 28B above, the electronic device 300 can display a control for stopping the cooperation between Li Si's FreeBuds and the electronic device 100. In this way, the user to whom Li Si's FreeBuds belongs can stop other users from using his earphones for cooperative work.
As can be seen from the above-described scenarios shown in fig. 28A to 28E, when a user's electronic device has been added to a super terminal by another user to provide a cooperative service, the user can, at any time and as needed, stop the electronic device from being used by the other user.
In some embodiments, a user may actively join his or her own electronic device, as a cross-account device, to another user's super terminal. In this way, the user can make his own electronic device cooperate with other users' electronic devices. For example, Li Si may request to join his own earphones, as a cross-account device, to the super terminal whose core device is Zhang San's mobile phone. After joining the super terminal, Li Si's earphones can cooperate with Zhang San's mobile phone. Zhang San's mobile phone can share audio to Li Si's earphones. In this way, Li Si can listen, through his own earphones, to the audio shared by Zhang San's mobile phone.
Illustratively, as shown in fig. 28F, the electronic device 300 may display the user interface 2810. The user interface 2810 may contain the device connection state diagram 2811 and a non-same-account device display area 2821. The mobile phone icon 2801 may be displayed in the device connection state diagram 2811. The mobile phone icon 2801 may represent the core device of the super terminal, i.e., the electronic device 300. Here, the description is given taking as an example that Li Si's account is logged in on the electronic device 300; that is, the electronic device 300 may be Li Si's mobile phone. The device connection state diagram 2811 may also include device icons of the same-account devices searched by the electronic device 300, for example, Li Si's MateBook icon, Li Si's MatePad icon, Li Si's FreeBuds icon 2811A, a Vision icon, and a smart watch icon. The non-same-account device display area 2821 may be used to display the device icons of the cross-account devices and non-account devices searched by the electronic device 300, for example, Zhang San's MateBook icon, Zhang San's MatePad icon, and Zhang San's P50 icon 2822. Zhang San's P50 icon 2822 may represent an electronic device named Zhang San's P50, which may be the aforementioned electronic device 100. That is, the electronic device 100 may be Zhang San's mobile phone.
In response to the operation, shown in fig. 28F, of dragging Li Si's FreeBuds icon 2811A close to Zhang San's P50 icon 2822, the electronic device 300 may instruct Li Si's FreeBuds to perform device binding with the electronic device 100. It is understood that the user (e.g., Li Si) to whom Li Si's FreeBuds belongs is different from the user (e.g., Zhang San) to whom the electronic device 100 belongs. Therefore, when the electronic device 300 instructs Li Si's FreeBuds to cooperate with the electronic device 100, the user to whom the electronic device 100 belongs should be asked whether he agrees. That is, before establishing the cooperation, Li Si's FreeBuds may first perform device binding with the electronic device 100.
During the device binding process between Li Si's FreeBuds and the electronic device 100, the electronic device 300 may display the connection code setting box 2831 and the input keyboard 2832 shown in fig. 28G. The connection code setting box 2831 may be used to set a connection code for the device binding between Li Si's FreeBuds and the electronic device 100. The input keyboard 2832 may be used for the user to input the connection code in the connection code setting box 2831.
As shown in fig. 28G, when the connection code has been set in the connection code setting box 2831, the electronic device 300 may send a binding request to the electronic device 100 in response to an operation on the determination control in the connection code setting box 2831. The binding request may be used to indicate that Li Si's FreeBuds requests to cooperate with the electronic device 100. In some embodiments, the binding request may be sent to the electronic device 100 via Li Si's FreeBuds.
Upon receiving the binding request, the electronic device 100 may display the binding prompt box 2833 shown in fig. 28H. The binding prompt box 2833 may refer to the binding prompt box 2221 shown in fig. 22D described above. In response to an operation acting on the consent control in the binding prompt box 2833, the electronic device 100 may display the connection code input box 2834 illustrated in fig. 28I. The connection code input box 2834 may refer to the connection code input box 2222 shown in fig. 22E described above.
As shown in fig. 28I, when the connection code has been input in the connection code input box 2834, the electronic device 100 may send a consent message to the electronic device 300 in response to an operation on the determination control in the connection code input box 2834. The consent message may contain the connection code input in the connection code input box 2834 (e.g., 710710). Upon receiving the consent message, the electronic device 300 may verify whether the connection code in the consent message is the same as the connection code set in the connection code setting box 2831 shown in fig. 28G. If so, the electronic device 300 may instruct Li Si's FreeBuds to cooperate with the electronic device 100.
When Li Si's FreeBuds cooperates with the electronic device 100, the electronic device 100 can display the user interface 2210 shown in fig. 28J, and the electronic device 300 can display the user interface 2810 shown in fig. 28K. In fig. 28J, Li Si's FreeBuds icon is attached to the mobile phone icon 505. In fig. 28K, Li Si's FreeBuds icon 2823 is attached to Zhang San's P50 icon 2822.
In some embodiments, in response to an operation, acting on Li Si's FreeBuds icon shown in fig. 28J, of dragging Li Si's FreeBuds icon away from the mobile phone icon 505, the electronic device 100 can stop the cooperation between the electronic device 100 and Li Si's FreeBuds.
In some embodiments, in response to an operation, acting on Li Si's FreeBuds icon 2823 shown in fig. 28K, of dragging Li Si's FreeBuds icon 2823 away from Zhang San's P50 icon 2822, the electronic device 300 may stop the cooperation between the electronic device 100 and Li Si's FreeBuds.
According to the above embodiments, a cross-account device may join the super terminal either by invitation from the core device of the super terminal or by its own application. When the core device of the super terminal invites the cross-account device to join, the core device may ask whether the cross-account device agrees to join the super terminal. When the cross-account device applies to join, the cross-account device may ask the core device whether it is allowed to join the super terminal.
It should be noted that in some embodiments, one electronic device may cooperate with multiple electronic devices simultaneously. For example, Li Si's FreeBuds discussed above may also cooperate with the electronic device 300 while cooperating with the electronic device 100. Li Si's FreeBuds may contain two earphones (a left earphone and a right earphone). One earphone of Li Si's FreeBuds can play audio from the electronic device 100, while at the same time the other earphone plays audio from the electronic device 300. As another example, a printer may join different super terminals at the same time. The printer may receive print requests from a plurality of electronic devices and provide corresponding print services to them.
In some embodiments, in a case where the electronic devices cooperating in the super terminal include a cross-account device, the cross-account device may, in response to a user operation, stop cooperating with the other electronic devices in the super terminal at any time. For example, the super terminal whose core device is the electronic device 100 includes Li Si's FreeBuds. Cooperating with the electronic device 100 in the super terminal, Li Si's FreeBuds can play audio from the electronic device 100. Li Si's FreeBuds can stop playing the audio in response to an operation, acting on the FreeBuds, for stopping playback. Li Si's FreeBuds may also receive a cooperation stop instruction sent by the electronic device paired with it (e.g., Li Si's mobile phone), exit the super terminal according to the cooperation stop instruction, and stop cooperating with the electronic device 100. That is, when Li Si's FreeBuds has joined the super terminal as a cross-account device and cooperates with other electronic devices in the super terminal, the user to whom Li Si's FreeBuds belongs can withdraw Li Si's FreeBuds at any time and stop other users from using his electronic device.
In some embodiments, a non-account device may only be able to join one super terminal at a time and cooperate with one electronic device. If the non-account device has already joined a super terminal and receives another cooperation request for joining a different super terminal, the non-account device may send a waiting message to the device that sent the cooperation request. The waiting message can be used to prompt the user that the non-account device is occupied and can be used after the other user's cooperation ends.
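This single-occupancy rule can be sketched as a device that remembers which super terminal currently occupies it and answers later requests with a waiting message. The class name and reply strings below are assumptions for illustration.

```kotlin
// Sketch of a non-account device that serves one super terminal at a time.
class NonAccountDevice {
    private var occupiedBy: String? = null

    // Reply to a cooperation request from the super terminal terminalId.
    fun onCooperationRequest(terminalId: String): String =
        when (occupiedBy) {
            null, terminalId -> { occupiedBy = terminalId; "accepted" }
            else -> "waiting: device occupied, available after the current cooperation ends"
        }

    fun onCooperationEnd() { occupiedBy = null }
}

fun main() {
    val device = NonAccountDevice()
    println(device.onCooperationRequest("terminal-A")) // accepted
    println(device.onCooperationRequest("terminal-B")) // waiting message
    device.onCooperationEnd()
    println(device.onCooperationRequest("terminal-B")) // accepted
}
```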
In some embodiments, the electronic device 100 may record, in a device recycle bin, the electronic devices removed from the super terminal. In this way, the user can view, in the device recycle bin, the electronic devices removed from the super terminal, and can add a removed electronic device to the super terminal again.
A scenario for managing a super terminal according to an embodiment of the present application is described below.
Fig. 29A and 29B exemplarily show scene diagrams for managing a super terminal.
As shown in fig. 29A, the electronic device 100 may display the user interface 2210. The user interface 2210 may include a device recycle bin control 2901. Additional details contained in the user interface 2210 may refer to the description of the user interface previously described in connection with fig. 22A.
In response to an operation on the device recycle bin control 2901, the electronic device 100 may display the recycle bin list 2902 shown in fig. 29B. The recycle bin list 2902 may be used to display the device icons removed from the device connection state diagram 2211, so that the user can view which electronic devices he has removed.
Illustratively, the electronic device 100 may remove Li Si's FreeBuds icon from the device connection state diagram 2211 according to the operation illustrated in fig. 27D described previously. The electronic device 100 may then display Li Si's FreeBuds icon in the recycle bin list 2902.
As shown in fig. 29B, the recycle bin list 2902 may include Li Si's FreeBuds icon 2902A and a restore control 2902B. Li Si's FreeBuds icon 2902A may represent Li Si's FreeBuds. The restore control 2902B may be used to trigger the electronic device 100 to add Li Si's FreeBuds back to the super terminal whose core device is the electronic device 100. For example, in response to an operation (e.g., a touch operation) on the restore control 2902B, the electronic device 100 may display Li Si's FreeBuds icon in the device connection state diagram 2211. The recycle bin list 2902 may also include device icons of other electronic devices and corresponding restore controls. The embodiment of the present application does not limit the way an electronic device is represented in the recycle bin list. For example, an electronic device may be represented in the recycle bin list by its device name only.
In some embodiments, when the device icon of an electronic device is removed from the device connection state diagram 2211, the electronic device 100 may no longer display that device icon in the device connection state diagram 2211. That is, the device icons of the electronic devices in the recycle bin list 2902 are not displayed in the device connection state diagram 2211. When the user restores an electronic device in the recycle bin list 2902 through its restore control, the electronic device 100 may again display the device identifier of that electronic device in the device connection state diagram 2211.
In some embodiments, when a cross-account device in the recycle bin list 2902 is added to the super terminal, the device binding process between the electronic device 100 and the cross-account device may refer to the processes shown in fig. 24A to 24D. It is to be appreciated that the cross-account devices in the recycle bin list 2902 were once device-bound to the electronic device 100. The user may therefore join a cross-account device in the recycle bin list 2902 to the super terminal without setting a connection code. This can simplify the operation of joining a cross-account device in the recycle bin list 2902 to the super terminal, improving the user experience.
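The recycle bin behavior can be sketched as a record of removed devices together with their past binding state, so that restoring a once-bound cross-account device skips the connection code. The class and method names below are illustrative assumptions.

```kotlin
// Sketch of the device recycle bin; names are assumptions.
class DeviceRecycleBin {
    private val removed = mutableSetOf<String>()
    private val onceBound = mutableSetOf<String>()

    fun remove(deviceId: String, wasBound: Boolean) {
        removed += deviceId
        if (wasBound) onceBound += deviceId // remember past bindings
    }

    // Restoring a device that was once bound skips the connection code step.
    fun restore(deviceId: String): String? {
        if (deviceId !in removed) return null
        removed -= deviceId
        return if (deviceId in onceBound) "rebind without connection code"
        else "bind with connection code"
    }
}

fun main() {
    val bin = DeviceRecycleBin()
    bin.remove("FreeBuds", wasBound = true)
    println(bin.restore("FreeBuds")) // rebind without connection code
}
```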
It should be noted that the embodiment of the present application takes a cross-account device as an example to describe the device binding process of an electronic device joining a super terminal and the process of removing an electronic device from the super terminal. Other electronic devices, for example, non-account devices, may also perform device binding with the core device of the super terminal when joining the super terminal; the binding process may refer to the descriptions of the foregoing embodiments.
Another scenario for managing a super terminal according to an embodiment of the present application is described below.
Fig. 30A to 30C exemplarily show a scenario of managing a super terminal.
As shown in fig. 30A, the electronic device 100 may display a user interface 3010. The user interface 3010 may be a control center interface of the electronic device 100. The user interface 3010 may include a super terminal function card 3011.
The super terminal function card 3011 may display the device icons of one or more electronic devices. The one or more electronic devices may be searched by the electronic device 100. The super terminal function card 3011 may also contain a settings control 3012. The settings control 3012 may be used to open a settings interface of the super terminal.
Additional content contained within the user interface 3010 may refer to the user interface 410 previously described in connection with fig. 4 and will not be described in detail herein.
In response to an operation of the settings control 3012, the electronic device 100 may display a user interface 3020 illustrated in fig. 30B. The user interface 3020 is a setting interface of the super terminal.
As shown in fig. 30B, the user interface 3020 may include a super terminal portal 3021, a video card 3022A, a usage prompt 3022B, a native control 3023, and a same-account device control 3024. Wherein:
the super terminal portal 3021 may be used to trigger the electronic device 100 to provide a unified entry for interaction between the core device and the multiple associated devices in the super terminal. For example, in response to an operation on the super terminal portal 3021, the electronic device 100 may display the user interface illustrated in fig. 22A described earlier.
The video card 3022A may contain a video introducing the functions, usage methods, and the like of the super terminal. This can improve the user's understanding of the super terminal and help the user make better use of it.
The usage prompt 3022B may contain text information introducing the functions, usage methods, and the like of the super terminal, thereby helping the user learn about the super terminal. The content of the usage prompt 3022B is not limited in the embodiment of the present application.
The native control 3023 may be used to view the device information of the electronic device 100. The electronic device 100 may be the core device of the super terminal. For example, the electronic device 100 may be a mobile phone of model Huawei P50, and the login account on the electronic device 100 may be 123xxxxxx. The model and login account of the electronic device 100 are merely exemplary and should not constitute a limitation on this application. The electronic device 100 may also display more device information about the electronic device 100 in response to an operation on the native control 3023.
The same-account device control 3024 may be used to view the electronic devices in the super terminal that have the same account as the electronic device 100 (i.e., the core device). For example, the account logged in on the electronic device 100 may be Zhang San's account. The accounts logged in on Zhang San's MateBook and Zhang San's MatePad are also Zhang San's account. Zhang San's MateBook and Zhang San's MatePad may therefore be same-account devices in the super terminal whose core device is the electronic device 100. The same-account device control 3024 may include the device identifier of Zhang San's MateBook and the device identifier of Zhang San's MatePad. In response to an operation on one of the device identifiers in the same-account device control 3024, the electronic device 100 may display a device management interface. The device management interface can facilitate the user in managing an electronic device whose login account (or associated account) is Zhang San's account, for example, removing the electronic device from the super terminal, or canceling the association between the electronic device and Zhang San's account.
The user interface 3020 may also contain more content.
For example, in response to an upward slide operation on the user interface 3020 shown in fig. 30B, the electronic device 100 may display the user interface 3020 illustrated in fig. 30C.
As shown in fig. 30C, user interface 3020 may also contain a shared device control 3025, a cross-account device control 3026, a binding control 3027, and a more settings control 3028. Wherein:
the shared device control 3025 may be used to view home group devices and shared devices. A home group device may be an electronic device located in the same home group as the electronic device 100. A shared device may be an electronic device that is commonly managed and controlled by a plurality of electronic devices including the electronic device 100. For example, in fig. 30C, the MatePad icon in the shared device control 3025 may represent a home group device and the MateBook icon may represent a shared device. Home group devices and shared devices can be associated devices in the super terminal and can cooperate with other electronic devices in the super terminal.
In some embodiments, in response to an operation on the shared device control 3025, the electronic device 100 may display a user interface of a smart life application. The smart life application may be an APP for managing home group devices and shared devices.
The cross-account device control 3026 may be used to view cross-account devices. For example, the cross-account devices Li Si's FreeBuds and Li Si's MatePad have both performed device binding with the electronic device 100 and joined the super terminal whose core device is the electronic device 100. The electronic device 100 may display Li Si's FreeBuds icon and Li Si's MatePad icon in the cross-account device control 3026. The cross-account device control 3026 may also contain unbinding controls, for example, an unbinding control corresponding to Li Si's FreeBuds icon and an unbinding control corresponding to Li Si's MatePad icon. An unbinding control can be used to release the device binding relationship between the corresponding electronic device and the electronic device 100. When the binding is released, the corresponding electronic device may exit the super terminal whose core device is the electronic device 100.
The binding control 3027 may be used to trigger the electronic device 100 to search for other electronic devices and perform device binding.
The more settings control 3028 may include a service flow recommendation option and a multi-device task center option. The service flow recommendation option can be used to turn on or off the function of recommending scenarized services cooperatively provided by the electronic devices in the super terminal. For example, when the service flow recommendation option is in the on state, the super terminal may comprehensively consider conditions such as the device types, device characteristics, product positioning, device usage, the cooperative services that the devices can provide, the current environments of the devices, the device operating states, the applications recently used by the user, the currently running applications, and the direction information input by user operations, obtain one or more scenarized service options accordingly, and display the scenarized service options on the user interface, making it convenient for the user to quickly start a related service or function. The multi-device task center option may be used to turn on or off the function of distributing the atomized services provided by the electronic devices cooperating in the super terminal. For example, when the multi-device task center option is in the on state, the super terminal may display, on the user interface, options for selecting an atomized service according to the atomized services that the electronic devices in the super terminal can provide. In this way, the user can select, as needed, the electronic device that provides an atomized service in the scenarized service provided by the super terminal.
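The effect of the two switches can be sketched as configuration flags that gate which options the super terminal surfaces. The flag and function names below are illustrative assumptions.

```kotlin
// Sketch of the two toggles under the more settings control 3028.
data class SuperTerminalSettings(
    var serviceFlowRecommendation: Boolean = true, // recommend scenarized services
    var multiDeviceTaskCenter: Boolean = true      // offer atomized service choices
)

fun optionsToDisplay(
    settings: SuperTerminalSettings,
    recommendedScenarized: List<String>,
    availableAtomized: List<String>
): List<String> {
    val shown = mutableListOf<String>()
    if (settings.serviceFlowRecommendation) shown += recommendedScenarized
    if (settings.multiDeviceTaskCenter) shown += availableAtomized
    return shown
}

fun main() {
    val settings = SuperTerminalSettings(multiDeviceTaskCenter = false)
    println(optionsToDisplay(settings, listOf("multi-screen collaboration"), listOf("camera", "microphone")))
    // prints [multi-screen collaboration]: atomized options are hidden
}
```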
The user interface 3020 is not limited to the controls for managing the super terminal shown in fig. 30B and 30C; it may include more or fewer controls for managing the super terminal. The layout of the user interface 3020 is merely an exemplary illustration of the embodiments of the present application and should not constitute a limitation on the present application.
As can be seen from the scenarios shown in fig. 30A to 30C, the electronic device 100 can provide rich management controls, making it convenient for the user to manage the super terminal and improving the user's experience of using the super terminal.
In some embodiments, when a plurality of electronic devices in the super terminal cooperate, the core device of the super terminal may provide options of one or more scenarized services for the user to select. The one or more scenarized services may be obtained by the super terminal through comprehensive consideration of the cooperating electronic devices' device types, device characteristics, product positioning, device usage, the cooperative services they can provide, their current environments, their operating states, the applications recently used by the user, the currently running applications, the direction information input by user operations, and the like. The cooperating electronic devices can provide the corresponding scenarized service according to the option selected by the user. For specific scenarios, refer to the scenarios shown in fig. 6 to 8.
In other embodiments, when a plurality of electronic devices in the super terminal cooperate, the plurality of electronic devices may directly provide a scenarized service. The scenarized service may be the one most likely required by the user, obtained by the super terminal through comprehensive consideration of the cooperating electronic devices' device types, device characteristics, product positioning, device usage, the cooperative services they can provide, their current environments, their operating states, the applications recently used by the user, the currently running applications, the direction information input by user operations, and the like.
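The "comprehensive consideration" above can be thought of as scoring candidate scenarized services against the device context and picking the highest-scoring one. The features, weights, and names in the toy sketch below are illustrative assumptions, not the scoring actually used by the super terminal.

```kotlin
// Toy ranking sketch; features and weights are assumptions.
data class DeviceContext(
    val deviceType: String,  // e.g., "tablet", "earphones"
    val runningApp: String?, // currently running application
    val recentApp: String?   // application recently used by the user
)

fun score(service: String, ctx: DeviceContext): Int {
    var s = 0
    if (service == "multi-screen collaboration" && ctx.deviceType == "tablet") s += 2
    if (service == "audio sharing" && ctx.deviceType == "earphones") s += 2
    if (service == "camera collaboration" && ctx.runningApp == "camera") s += 1
    if (service == "audio sharing" && ctx.recentApp == "music") s += 1
    return s
}

// The highest-scoring candidate is provided directly after cooperation is
// established; the others can still be chosen in a service selection box.
fun predict(candidates: List<String>, ctx: DeviceContext): String? =
    candidates.maxByOrNull { score(it, ctx) }

fun main() {
    val ctx = DeviceContext(deviceType = "tablet", runningApp = null, recentApp = null)
    println(predict(listOf("multi-screen collaboration", "camera collaboration"), ctx))
    // prints multi-screen collaboration
}
```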
A scenario of cooperation of electronic devices in a super terminal according to an embodiment of the present application is described below.
Fig. 31A to 31C exemplarily show scene diagrams of electronic device cooperation in the super terminal.
As shown in fig. 31A, the electronic device 100 may display the user interface 2210. The user interface 2210 may include the device connection state diagram 2211. The device connection state diagram 2211 may display the mobile phone icon 505 and Zhang San's MatePad icon 3111. In response to the operation of dragging Zhang San's MatePad icon 3111 close to the mobile phone icon 505, the electronic device 100 may cooperate with an electronic device named Zhang San's MatePad (hereinafter Zhang San's MatePad). Through comprehensive consideration, the electronic device 100 may determine the scenarized service most likely required by the user when cooperating with Zhang San's MatePad, for example, a multi-screen collaboration service. Then, in response to the operation illustrated in fig. 31A, the electronic device 100 may cooperate with Zhang San's MatePad to provide the multi-screen collaboration service.
In response to the operation shown in fig. 31A, the electronic device 100 may also display the user interface 2210 shown in fig. 31B.
As shown in fig. 31B, Zhang San's MatePad icon 3112 attached to the mobile phone icon 505 may indicate that Zhang San's MatePad has cooperated with the electronic device 100. The user interface 2210 shown in fig. 31B may also contain a service setting control 3114. In response to an operation on the service setting control 3114, the electronic device 100 may display the service selection box 3115 illustrated in fig. 31C.
As shown in fig. 31C, the service selection box 3115 may be used for selecting the scenarized service cooperatively provided by Zhang San's MatePad and the electronic device 100. The service selection box 3115 may include a multi-screen collaboration option 3115A and a camera collaboration option 3115B. When one option in the service selection box 3115 is in the selected state, it may indicate that the scenarized service corresponding to that option is the scenarized service being provided by Zhang San's MatePad. According to the above embodiments, the multi-screen collaboration service is the scenarized service most likely required by the user when Zhang San's MatePad cooperates with the electronic device 100. Then, after receiving the operation of attaching Zhang San's MatePad icon to the mobile phone icon 505, the electronic device 100 may provide the multi-screen collaboration service with Zhang San's MatePad, and the multi-screen collaboration option 3115A in the service selection box 3115 is in the selected state. If the scenarized service required by the user is not the multi-screen collaboration service, the user may adjust, through the options in the service selection box 3115, the scenarized service cooperatively provided by the electronic device 100 and Zhang San's MatePad. For example, the user may select the camera collaboration option 3115B, so that the electronic device 100 provides a camera collaboration service with Zhang San's MatePad.
As can be seen from the scenarios shown in fig. 31A to 31C, the super terminal may predict the scenarized service required by the user, and directly provide the predicted scenarized service after detecting the user operation by which the user triggers the electronic devices to establish a cooperative connection. In this way, the user can quickly make the electronic devices cooperate to provide the required service simply by dragging device icons together. If the service the user needs is not the scenarized service predicted by the super terminal, the user can also readjust the service cooperatively provided by the electronic devices.
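The readjustment described above can be modeled as stopping the currently active scenarized service and starting the newly selected one. A minimal sketch, with assumed class and method names:

```kotlin
// Sketch of switching the cooperatively provided scenarized service
// through a service selection box; names are assumptions.
class Cooperation(private var activeService: String) {
    fun onOptionSelected(service: String) {
        if (service == activeService) return // option already selected
        stop(activeService)                  // e.g., multi-screen collaboration
        start(service)                       // e.g., camera collaboration
        activeService = service
    }

    private fun stop(service: String) = println("stop $service")
    private fun start(service: String) = println("start $service")
}

fun main() {
    val cooperation = Cooperation("multi-screen collaboration")
    cooperation.onOptionSelected("camera collaboration")
}
```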
Another scenario of electronic device cooperation in a super terminal provided in an embodiment of this application is described below.
Fig. 32A and 32B exemplarily show scene diagrams of electronic device cooperation in a super terminal.
As illustrated in fig. 32A, the electronic device 100 may display the user interface 3010. The user interface 3010 may refer to the description of fig. 30A above. The super terminal function card 3011 in the user interface 3010 may include the device icons of one or more electronic devices, for example, Zhang San's MateBook icon 3011A, Zhang San's MatePad icon, Zhang San's FreeBuds icon, and a Sound X icon.
The device icon of an electronic device in the super terminal function card 3011 can be used to trigger the electronic device 100 to cooperate with that electronic device.
For example, the electronic device 100 may cooperate with Zhang San's MateBook in response to an operation, such as a touch operation, on Zhang San's MateBook icon 3011A. The electronic device 100 may predict the scenarized service (e.g., a multi-screen collaboration service) most likely required by the user when cooperating with Zhang San's MateBook, and the electronic device 100 and Zhang San's MateBook cooperate to provide the predicted scenarized service. When the electronic device 100 cooperates with Zhang San's MateBook, the electronic device 100 may display the user interface 3010 shown in fig. 32B.
As shown in fig. 32B, Zhang San's MateBook icon 3011B and a collaboration prompt control 3011C may be displayed in the super terminal function card 3011. Zhang San's MateBook icon 3011B may be represented differently from Zhang San's MateBook icon 3011A shown in fig. 32A. For example, Zhang San's MateBook icon 3011B may be displayed in a dark filled form. This makes it convenient for the user to distinguish, among the device icons presented in the super terminal function card 3011, which icons correspond to electronic devices that are cooperating with other electronic devices in the super terminal. The collaboration prompt control 3011C may also prompt the user that Zhang San's MateBook is cooperating with the electronic device 100. The embodiment of the present application does not limit the representation form of the device icons in the super terminal function card 3011.
In some embodiments, if the user wants to adjust the service cooperatively provided by Zhang San's MateBook and the electronic device 100, the user may open, through the super terminal function card, the device interaction interface related to the super terminal function (refer to the user interface 2210 shown in fig. 22A), and then adjust, on that device interaction interface, the service cooperatively provided by Zhang San's MateBook and the electronic device 100.
As can be seen from the scenarios shown in fig. 32A and 32B, the user can quickly trigger electronic devices in the super terminal to cooperate at the control center interface of the electronic device 100, so as to provide a scenarized service. Triggering the electronic devices in the super terminal to cooperate is simple and convenient, which can improve the user's experience of interacting with multiple electronic devices.
Another scenario of electronic device cooperation in a super terminal provided in the embodiment of the present application is described below.
Fig. 33A to 33C exemplarily show scene diagrams of electronic device cooperation in the super terminal.
As shown in fig. 33A, the electronic device 100 may display the user interface 3010. The user interface 3010 may refer to the description of fig. 30A above. The super terminal function card 3011 in the user interface 3010 may include a Mate 40 icon 3011D. The Mate 40 icon 3011D may represent an electronic device named Mate 40 (hereinafter Mate 40). In response to an operation, for example, a touch operation, on the Mate 40 icon 3011D, the electronic device 100 may display the service selection box 3311 illustrated in fig. 33B. The service selection box 3311 may be used for the user to select the scenarized service that the Mate 40 is able to cooperatively provide with the electronic device 100.
As shown in fig. 33B, the service selection box 3311 may include a camera collaboration option 3311A and a microphone collaboration option 3311B. The options of the scenarized services included in the service selection box 3311 may be determined according to the atomized services of the electronic device 100 and the Mate 40. The service selection box 3311 is not limited to the camera collaboration option 3311A and the microphone collaboration option 3311B described above; it may include more or fewer options. As shown in fig. 33B, in response to an operation on the camera collaboration option 3311A, the electronic device 100 and the Mate 40 may cooperate to provide a camera collaboration service. For example, the electronic device 100 may invoke the Mate 40 to turn on its camera to capture an image.
When the electronic device 100 and the Mate 40 cooperate, the electronic device 100 may display the user interface 3010 shown in fig. 33C. In fig. 33C, a Mate 40 icon 3011E and a collaboration prompt control 3011F may be displayed in the super terminal function card 3011. The Mate 40 icon 3011E may be represented differently from the Mate 40 icon 3011D shown in fig. 33A. For example, the Mate 40 icon 3011E may be displayed in a dark filled form. The collaboration prompt control 3011F may be used to prompt the user that the Mate 40 is cooperating with the electronic device 100.
As can be seen from the scenarios shown in fig. 33A to 33C, the user can quickly trigger electronic devices in the super terminal to cooperate at the control center interface of the electronic device 100, so as to provide a scenarized service, and the scenarized service provided by the electronic devices in the super terminal can be selected by the user. Triggering the electronic devices in the super terminal to cooperate is simple and convenient, which can improve the user's experience of interacting with multiple electronic devices.
In some embodiments, a printer may be added to the super terminal. The electronic device 100 may cooperate with the printer to print files.
The following describes a scenario in which a super terminal provided in an embodiment of the present application provides a cooperative service by using a printer.
Fig. 34A to 34O are diagrams illustrating a scenario in which a super terminal provides a cooperative service using a printer.
As shown in fig. 34A, the electronic device 100 may display a user interface 2210. The electronic device 100 may display the device icon of the searched electronic device in the device connection state diagram 2211 of the user interface 2210.
For example, the electronic device 100 may search for an electronic device named PixLabX1 printer (hereinafter PixLabX1 printer). A PixLabX1 printer icon 3401 may be displayed in the device connection state diagram 2211. In response to the operation of dragging the PixLabX1 printer icon 3401 close to the mobile phone icon 505 shown in fig. 34A, the electronic device 100 may display the print setting option box 3402 of fig. 34B.
As shown in fig. 34B, the setting option box 3402 may contain one or more setting options for instructing the printer how to print a file, for example, a number of copies option, a print area option, a print paper single/double-sided option, a print paper landscape/portrait option, a print paper size option, a margin option, and the like.
The number of copies option can be used to set the number of copies of the received file that the printer prints, for example, 1 copy or 2 copies. The print area option may be used to set the area of the file that needs to be printed; for example, the area to be printed may be the entire document or only the first page of the document. The print paper single/double-sided option can be used to set whether to print single-sided or double-sided. The print paper landscape/portrait option may be used to set whether to print on the paper horizontally or vertically. The print paper size option may be used to set the size of the paper used for printing, for example, A3 or A4. The margin option may be used to set the margins of the printed content on the paper. The setting option box 3402 is not limited to the setting options shown in fig. 34B and may contain more or fewer options.
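Taken together, these options form a small print-settings structure that the electronic device 100 could pass to the printer along with the file. A sketch with assumed field names and defaults:

```kotlin
// Sketch of the settings collected in the setting option box 3402;
// field names and defaults are assumptions.
enum class Sides { SINGLE, DOUBLE }
enum class Orientation { PORTRAIT, LANDSCAPE }

data class PrintSettings(
    val copies: Int = 1,             // number of copies option
    val pageRange: IntRange? = null, // null means print the entire document
    val sides: Sides = Sides.SINGLE,
    val orientation: Orientation = Orientation.PORTRAIT,
    val paperSize: String = "A4",    // e.g., "A3", "A4"
    val marginMm: Int = 10           // margins of the printed content
)

fun main() {
    // Two double-sided A4 copies of the first page only.
    val settings = PrintSettings(copies = 2, pageRange = 1..1, sides = Sides.DOUBLE)
    println(settings)
}
```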
When the electronic device 100 has cooperated with the PixLabX1 printer described above, the electronic device 100 may display the user interface 2210 shown in fig. 34C. In the user interface 2210 shown in fig. 34C, the PixLabX1 printer icon 3402 is attached to the mobile phone icon 505, which may indicate that the PixLabX1 printer has cooperated with the electronic device 100. The PixLabX1 printer icon 3402 may be represented differently from the PixLabX1 printer icon 3401 shown in fig. 34A described previously. For example, the PixLabX1 printer icon 3402 may be displayed in a dark filled form. The user interface 2210 may also contain a print hover ball 3405. The print hover ball 3405 may include a print control 3405A. The print control 3405A may be used to trigger the electronic device 100 to send a file to the PixLabX1 printer for printing. The print hover ball 3405 may float on the top layer of the user interface.
In some embodiments, while the electronic device 100 cooperates with the PixLabX1 printer, if no operation on the print hover ball 3405 is detected within a preset time period, the electronic device 100 may display the print hover ball 3406 shown in fig. 34D. The print hover ball 3405 and the print hover ball 3406 may have different representations; for example, the print hover ball 3406 may be displayed with greater transparency than the print hover ball 3405, and a portion of it may be hidden. That is, when the user has not used the cooperative service provided by the PixLabX1 printer and the electronic device 100 for a long time, the print hover ball may still hover in the user interface, but its opacity and the display area it occupies may be reduced, thereby lessening its interference with the user's use of other controls in the user interface.
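The idle collapse behavior described above could be realized, for example, with a simple inactivity timer. In the following sketch the timeout, alpha, and scale values are assumptions; any interaction restores the ball and rearms the timer:

```kotlin
import android.os.Handler
import android.os.Looper
import android.view.View

// A minimal sketch; the timeout, alpha, and scale values are assumptions. With
// no interaction for idleMs, the hover ball shrinks and turns translucent.
class HoverBallIdleCollapser(private val ball: View, private val idleMs: Long = 5_000L) {
    private val handler = Handler(Looper.getMainLooper())
    private val collapse = Runnable {
        ball.animate()
            .alpha(0.4f)                       // more transparent
            .scaleX(0.6f).scaleY(0.6f)         // smaller footprint
            .translationX(ball.width * 0.4f)   // tuck part of it off-screen
            .setDuration(200L)
            .start()
    }

    /** Call whenever the user touches the hover ball. */
    fun onInteraction() {
        handler.removeCallbacks(collapse)
        ball.animate().alpha(1f).scaleX(1f).scaleY(1f).translationX(0f).start()
        handler.postDelayed(collapse, idleMs)
    }
}
```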
As shown in fig. 34E, the print hover ball may continue to hover over the user interface while the electronic device 100 cooperates with the PixLabX1 printer. The user may open, on the electronic device 100, a user interface that displays the file to be printed.
Illustratively, the electronic device 100 may display the user interface 3410 shown in fig. 34E. The user interface 3410 may be a user interface of a gallery application and may include a picture 3411. The print hover ball 3406 may be suspended on the user interface 3410. In response to an operation on the print hover ball 3406, the electronic device 100 may display the print hover ball 3407 illustrated in fig. 34F. The print hover ball 3406 may be obtained by collapsing the print hover ball 3405 shown in fig. 34C, and is equivalent to a collapsed hover ball; the print hover ball 3407 may be obtained by expanding the print hover ball 3405, and is equivalent to an expanded hover ball.
As shown in fig. 34F, the print hover ball 3407 may contain the print control 3405A. The print hover ball 3407 may also contain a text prompt, such as "tap the control on the left to print", to prompt the user how to use the print hover ball 3407. In response to an operation, such as a touch operation, on the print control 3405A, the electronic device 100 may send the file displayed on the current user interface 3410 (i.e., the picture 3411) to the PixLabX1 printer for printing.
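As one possible realization of this step (the embodiments do not prescribe a print API), the androidx.print support library can submit the currently displayed picture as a print job:

```kotlin
import android.app.Activity
import android.graphics.Bitmap
import androidx.print.PrintHelper

// One possible realization, not the prescribed method: androidx.print submits
// the picture shown on screen as a print job handled by the print service.
fun printCurrentPicture(activity: Activity, picture: Bitmap, jobName: String) {
    val helper = PrintHelper(activity)
    helper.scaleMode = PrintHelper.SCALE_MODE_FIT // keep the whole image on the page
    helper.printBitmap(jobName, picture)          // e.g. jobName = "cat.jpg"
}
```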
In a case where the electronic device 100 has sent a file to the PixLabX1 printer for printing, the electronic device 100 may display the print hover ball 3408 illustrated in fig. 34G. The print hover ball 3408 may also include the print control 3405A. However, the text prompt contained in the print hover ball 3408, such as "printing; tap to view print jobs", is different from the text prompt in the print hover ball 3407. The text prompt in the print hover ball 3408 may prompt the user how to view print jobs through the print hover ball. It is understood that the electronic device 100 may display the print hover ball 3407 shown in fig. 34F when there is no print job between the electronic device 100 and the PixLabX1 printer, and may display the print hover ball 3408 shown in fig. 34G when there is a print job between them. The embodiment of the present application does not limit the representation of the print hover ball in these different cases; for example, the representation of the print hover ball may also be the same in both cases.
As shown in fig. 34F, the user may view different pictures through a user operation of sliding left or right on the user interface 3410. Illustratively, in response to a user operation of sliding left on the user interface 3410, the electronic device 100 may display the user interface 3410 shown in fig. 34H, in which a picture 3412 may be displayed. In response to an operation on the print control 3405A, the electronic device 100 may send the file currently displayed on the user interface (i.e., the picture 3412) to the PixLabX1 printer for printing.
As shown in fig. 34I, in response to an operation, for example a touch operation, on an area of the print hover ball 3408 other than the print control 3405A, the electronic device 100 may display the print job list 3413 shown in fig. 34J. The print job list 3413 may include a file name 3414, a print status 3415, a file name 3416, and a pause control 3417. The file name 3414 may be, for example, cat.jpg, and may represent the print job for printing the picture 3411 shown in fig. 34F. The print status 3415 may be used to reflect the status of the print job corresponding to the file name 3414; for example, the print status 3415 may be "finished", which may indicate that the PixLabX1 printer has finished printing the picture 3411.
The file name 3416 may be, for example, dog.jpg, and may represent the print job for printing the picture 3412 shown in fig. 34H. The pause control 3417 may indicate that the print job corresponding to the file name 3416 is printing and has not yet been completed. The pause control 3417 may be used to pause the print job corresponding to the file name 3416. Illustratively, in response to an operation on the pause control 3417, the electronic device 100 may send an instruction to the PixLabX1 printer to pause printing the picture 3412. When the picture 3412 has been printed, the electronic device 100 may change the pause control 3417 into the control shown as the print status 3415, to prompt the user that the PixLabX1 printer has finished printing the picture 3412.
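Purely as an illustration of the job list and pause behavior just described, the following sketch models the list state; the PrinterLink interface and its pause message are hypothetical stand-ins for whatever transport the electronic device 100 and the printer actually use:

```kotlin
// Illustrative model of the print job list in fig. 34J; PrinterLink and its
// pause message are hypothetical stand-ins for the real phone-printer transport.
enum class JobStatus { PRINTING, PAUSED, FINISHED }

data class PrintJob(val fileName: String, var status: JobStatus)

interface PrinterLink {
    fun pause(fileName: String) // ask the printer to hold this job
}

class PrintJobList(private val link: PrinterLink) {
    val jobs = mutableListOf(
        PrintJob("cat.jpg", JobStatus.FINISHED),
        PrintJob("dog.jpg", JobStatus.PRINTING)
    )

    /** Invoked when the user taps the pause control next to a job. */
    fun onPauseClicked(job: PrintJob) {
        if (job.status == JobStatus.PRINTING) {
            link.pause(job.fileName)
            job.status = JobStatus.PAUSED
        }
    }
}
```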
The embodiment of the present application does not limit the implementation of the printing service provided by the electronic device 100 in cooperation with the PixLabX1 printer. For example, in response to an operation on the print control 3405A, the electronic device 100 may send the file displayed on the current user interface to the PixLabX1 printer for printing; the file displayed on the current user interface is not limited to a picture, and may also be a text document, a table, and the like. For another example, in response to an operation on the print control 3405A, the electronic device 100 may perform a screen capture operation on the currently displayed user interface and send the file obtained by the screen capture to the PixLabX1 printer for printing.
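The screen capture variant mentioned above could, for instance, rasterize the current view hierarchy and submit the resulting bitmap; a minimal sketch under that assumption:

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas
import android.view.View

// A minimal sketch of the screen-capture variant: render the current root view
// into a bitmap, which can then be submitted as the print file.
fun captureForPrinting(root: View): Bitmap {
    val bitmap = Bitmap.createBitmap(root.width, root.height, Bitmap.Config.ARGB_8888)
    root.draw(Canvas(bitmap)) // draws the visible view hierarchy into the bitmap
    return bitmap
}
```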
In some embodiments, the user may stop the cooperation between the electronic device 100 and the PixLabX1 printer by dragging the PixLabX1 printer icon 3403 away from the cell phone icon 505 in the user interface shown in fig. 34C.
In some embodiments, the user may also stop the cooperation between the electronic device 100 and the PixLabX1 printer through the print hover ball. Illustratively, as shown in fig. 34K, in response to an operation on the print hover ball 3408, such as a long-press operation followed by a drag, the electronic device 100 may move the print hover ball on the user interface along the trajectory of the user's finger.
While being dragged, the print hover ball may be displayed as the print hover ball 3409 illustrated in fig. 34L. The print hover ball 3409 may have a representation different from that of the print hover ball 3408. While the display position of the print hover ball 3409 changes along the user's drag trajectory, the electronic device 100 may display the deletion control 3431 illustrated in fig. 34L on the user interface 3410.
As shown in fig. 34M, in response to an operation of dragging the print hover ball 3409 to the position of the deletion control 3431, the electronic device 100 may stop cooperating with the PixLabX1 printer and stop displaying the print hover ball on the user interface.
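For illustration, the drag-to-delete gesture can be reduced to a bounds intersection test between the hover ball and the deletion control at release time (all names here are hypothetical):

```kotlin
import android.graphics.Rect
import android.view.View

// Hypothetical sketch of the drag-to-delete gesture: on release, test whether
// the hover ball's bounds overlap the deletion control; if so, end cooperation.
fun overlapsDeleteControl(ball: View, deleteControl: View): Boolean {
    val ballRect = Rect().also { ball.getGlobalVisibleRect(it) }
    val deleteRect = Rect().also { deleteControl.getGlobalVisibleRect(it) }
    return Rect.intersects(ballRect, deleteRect)
}

fun onBallReleased(ball: View, deleteControl: View, stopCooperation: () -> Unit) {
    if (overlapsDeleteControl(ball, deleteControl)) {
        stopCooperation()           // disconnect from the printer
        ball.visibility = View.GONE // stop displaying the hover ball
    }
}
```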
In some embodiments, upon receiving an operation on the above-described print control 3405A, the electronic device 100 may retrieve a printable file from the currently displayed user interface. If there is no printable file on the currently displayed user interface, the electronic device 100 may prompt the user that there is no printable content on the current page.
Illustratively, as shown in fig. 34N, the electronic device 100 may display the user interface 3440 while the electronic device 100 cooperates with the PixLabX1 printer. The user interface 3440 may be the desktop of the electronic device 100, with the print hover ball 3407 suspended on it. In response to an operation on the print control 3405A in the print hover ball 3407, the electronic device 100 may display the prompt box 3441 illustrated in fig. 34O. The prompt box 3441 may be used to prompt the user that no printable file exists on the user interface 3440 and that a printing operation cannot be performed. The prompt box 3441 may include a print instructions control 3442. The print instructions control 3442 may be used to trigger the electronic device 100 to display instructions for printing with the PixLabX1 printer, thereby helping the user learn how to use the printing service cooperatively provided by the electronic device 100 and the PixLabX1 printer. The content of the prompt box 3441 is not limited in the embodiment of the present application.
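The printable content check described in this embodiment amounts to a guard before submitting a job. In the following sketch the extraction hook is hypothetical, since how a page exposes its printable file is implementation specific:

```kotlin
import android.graphics.Bitmap

// A hedged sketch of the "no printable content" guard; the extraction hook is
// hypothetical (a picture here, but it could be a document or a table).
interface PrintableSource {
    /** Returns the printable content of the current page, or null if none. */
    fun extractPrintable(): Bitmap?
}

fun onPrintControlTapped(
    page: PrintableSource,
    print: (Bitmap) -> Unit,
    showNoContentPrompt: () -> Unit // e.g. show the prompt box 3441
) {
    val content = page.extractPrintable()
    if (content != null) print(content) else showNoContentPrompt()
}
```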
In some embodiments, the PixLabX1 printer may be bound to the electronic device 100 before cooperating with the electronic device 100. For the device binding procedure, reference may be made to the description in the aforementioned embodiments related to device binding, and details are not repeated here.
As can be seen from the scenarios shown in fig. 34A to 34O, a printer may be added to the super terminal and provide a printing service to the user in cooperation with other electronic devices in the super terminal. This allows the user to quickly set up cooperation between the electronic device 100 and the printer. In addition, while the electronic device 100 cooperates with the printer, the user can browse for the file to be printed and use the print hover ball to print it quickly. The printing operations described above are simple and convenient, and can improve the user's experience of using the printer cooperatively. When all print jobs are finished, the user can also quickly stop the cooperation between the electronic device 100 and the printer through the print hover ball.
The implementation manners described in the above embodiments are only examples and do not limit other embodiments of the present application. The specific internal implementation may differ with the type of electronic device, the operating system loaded, the programs used, and the interfaces called; the embodiments of the present application impose no limitation in this regard, provided that the feature functions described in the embodiments of the present application can be implemented.
As used in the above embodiments, the term "when…" may be interpreted to mean "if…", "after…", "in response to determining…", or "in response to detecting…", depending on the context. Similarly, the phrase "upon determining…" or "if (a stated condition or event) is detected" may be interpreted to mean "if it is determined…", "in response to determining…", "upon detecting (the stated condition or event)", or "in response to detecting (the stated condition or event)", depending on the context.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), a semiconductor medium (e.g., solid state disk), or the like.
Those skilled in the art can understand that all or part of the processes in the methods of the above embodiments can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed, can include the processes of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a ROM, a RAM, a magnetic disk, or an optical disk.
The above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced, and such modifications or replacements do not depart from the scope of the technical solutions of the embodiments of the present application.

Claims (43)

1. A device interaction method, the method comprising:
a first device displays a first interface, wherein the first interface comprises a first control and a second control, the first control indicates a device option of the first device, the second control indicates a device option of a second device, and the second device is a device discovered by the first device;
the first device detects a first user operation, wherein the first user operation is used for associating the first device with the second device;
the first device displays a second interface, wherein a service option of a first scenarized service is displayed in the second interface, and the first scenarized service is a scenarized service cooperatively supported by the first device and the second device that are associated through the first user operation;
the first device detects a second user operation for selecting the first scenarized service;
and the first device indicates to run the first scenarized service selected by the second user operation.
2. The method of claim 1, wherein in the first interface, the second controls are distributed on a circle centered on the first control, and the first control and the second control are in a separated state.
3. The method according to claim 1 or 2, wherein the first user operation comprises an operation in which the second control is selected, or the first control is selected, or both the first control and the second control are selected simultaneously.
4. The method according to any one of claims 1-3, wherein the first user operation is an operation in which the user selects the second control, drags it toward the first control into a designated area, and then releases it, wherein the designated area is a circular area with the first control as the center and the first radius distance as the radius; the second interface further displays that the first control and the second control are adsorbed together.
5. The method of any of claims 1-3, wherein a hover control is displayed in the second interface, the hover control including a first region that displays the service option of the first scenarized service; the second user operation is an operation in which the user drags the second control to the first region and then releases it, or an operation in which the user drags the second control through the first region toward the first control into a designated area and then releases it, wherein the designated area is a circular area with the first control as the center and the first radius distance as the radius.
6. The method of claim 5, wherein the hover control comprises a plurality of circular regions, or a plurality of rectangular regions, or a plurality of annular sector-shaped regions, or a plurality of ring-shaped regions, or a plurality of polygonal regions.
7. The method of any of claims 1-6, wherein the first control is included in the second interface.
8. The method of any one of claims 1-7, wherein the positional relationship of the first control to the second control in the first interface indicates one or more of the following relationships between the first device and the second device: a connection relation, or an orientation relation, or a distance relation, or a signal strength relation;
the first control being separated from the second control indicates that the first device is not connected with the second device, and the first control being connected with the second control indicates that the first device is connected with the second device;
and/or the second control being located on the left side, the right side, the upper side, or the lower side of the first control indicates that the second device is located on the left side, the right side, the front side, or the rear side of the first device, respectively;
and/or a greater distance between the first control and the second control indicates a greater distance, or a weaker signal, between the first device and the second device, and a smaller distance between the first control and the second control indicates a smaller distance, or a stronger signal, between the first device and the second device.
9. The method according to any one of claims 1-8, wherein the first scenarized service is a scenarized service queried from a first database that records one or more scenarized services supported by one or more device combinations comprising a device combination of the first device and its associated second device;
the scenarized service is determined by one or more of the following parameters: the device type, device characteristics, product positioning, device usage, environment or scene in which the devices are located, state of the devices, and recently running applications of the devices in the device combination.
10. The method according to any one of claims 1-8, wherein the first scenarized service is a scenarized service queried from a second database that records one or more scenarized services supported by one or more atomic service combinations comprising an atomic service combination composed of atomic capabilities possessed by the first device and its associated second device;
the atomic capability includes one or more of: audio output capability, audio input capability, display capability, camera capability, touch input capability, keyboard and mouse input capability.
11. The method of any of claims 1-10, wherein the probability that the first scenarized service is selected is calculated based on one or more of the following parameters: the frequency with which the user uses the first scenarized service, the order of the first scenarized service set by the user, the environment or scenario in which the first device and/or the second device is located, the state of the first device and/or the second device, and the application recently run by the first device and/or the second device; a service option of a first scenarized service with a higher probability of being selected by the user is displayed with higher recommendation priority.
12. The method according to any one of claims 5 to 11, wherein the hover control is ring-shaped or sector-shaped, wherein a plurality of ring-shaped regions or sector-shaped regions respectively indicate a plurality of first scenarized services, and the first scenarized service indicated by a ring-shaped region or sector-shaped region closer to the center is a more preferentially recommended scenarized service.
13. The method of any of claims 1-12, wherein a device option of a third device is also displayed in the second interface, the third device being a device recommended by evaluating the device usage requirements of the first device and its associated second device.
14. The method of claim 13, wherein the third device is a device discovered by the first device, and wherein clicking on a device option of the third device causes the first device to display a service option of a second scenarized service.
15. The method of claim 13, wherein the third device is not a device discovered by the first device, and clicking on the device option of the third device jumps to display a fifth interface, wherein the fifth interface comprises a plurality of item options, the plurality of item options come from a plurality of sources, and the plurality of item options comprise an item option indicating the third device.
16. The method of any of claims 13-15, wherein the third device is a device recommended based on one or more of the following parameters: the device type, device characteristics, product positioning, device usage, environment or scenario, device state, or recently running application of the first device and/or its associated second device, or the first scenarized service, or a predicted usage scenario in which the devices can be deployed.
17. The method according to any one of claims 1-16, wherein the first scenarized service is supported by a first combination of atomized services, the first combination of atomized services comprising at least one first service provided by the first device and/or at least one second service provided by the second device.
18. The method according to any one of claims 1-17, further comprising:
the first device runs the first scenarized service and/or the second device associated with the first device runs the first scenarized service.
19. The method of any of claims 1-18, wherein a third interface displayed by the first device running the first scenarized service is different from a fourth interface displayed by the second device running the first scenarized service.
20. The method of claim 19, wherein if the first scenarized service is a screen projection service, the second device is a large screen device, the fourth interface is a video or image picture, and the third interface is a control interface for controlling a display function of the video or image picture.
21. The method of claim 19, wherein if the first scenarized service is a motion monitoring service, the second device is a wearable motion device, the third interface is a frame of a motion trail, and the fourth interface is detected motion data of the user, the motion data comprising: speed of movement, distance of movement, time of movement, heart rate.
22. The method of any one of claims 1-20, further comprising:
in a screen projection scenario, the first device determines, according to the relative position of the second control to the first control when the drag operation is released, the position at which the screen interface of the first device is projected on the screen of the second device.
23. The method of claim 22,
if the relative position is that the second control is to the left of the first control, then the screen interface of the first device is projected to the right in the screen of the second device;
if the relative position is that the second control is to the right of the first control, then the screen interface of the first device is projected to the left in the screen of the second device;
if the relative position is that the second control is in the middle of the first control, the screen interface of the first device is projected on the screen of the second device in a full screen mode.
24. The method of any one of claims 4, 7-11, 13-23, further comprising:
the first device detects a third user operation acting on the second interface, wherein the third user operation is used for associating a fourth device with the first device, and the third user operation comprises an operation of selecting a third control indicating the fourth device;
in response to the third user operation, the first device displays a sixth interface, wherein the sixth interface displays that the first control, the second control, and the third control are adsorbed together; the sixth interface further displays a service option of a third scenarized service, wherein the service option of the third scenarized service is determined according to the first device, the second device, and the fourth device, and is different from the service option of the first scenarized service.
25. The method according to any one of claims 1-24, wherein after the first device indicates execution of the first scenarized service selected by the second user operation, the method further comprises:
the first device detecting a fourth user operation comprising an operation that separates the first control from the second control;
the first device indicates to stop running the first scenarized service selected by the second user operation.
26. The method of claim 25, further comprising:
and the first device stops running the first scenarized service selected by the second user operation, and/or the second device stops running the first scenarized service selected by the second user operation.
27. The method of claim 25 or 26, wherein the fourth user operation is an operation in which the user selects the second control and drags the second control away from the first control, releasing the second control after it moves out of the designated area.
28. The method according to any one of claims 1-27, wherein the first device comprises any one of: a mobile phone, a tablet computer, a portable/non-portable computer, a personal computer, a smart television; and the second device comprises any one of: a mobile phone, a tablet computer, a portable mobile computer, a desktop personal computer, a smart speaker, a smart watch, a smart band, a smart television, an earphone, smart glasses, an in-vehicle device, a smart cockpit, a game console, a treadmill, a spinning bike, a body weight scale, a body fat scale, a water heater, a lamp, an air conditioner, a blood glucose meter, a pulse oximeter, a heart rate monitor, an AR/VR device, a cloud host/cloud server, a smart wearable device, a smart home device, a printer.
29. The method of any of claims 1-28, wherein the first scenarized service is a print service and the second device is a printer, the method further comprising:
the first device displays a print hover control, wherein the print hover control is displayed while the first device runs the first scenarized service, and the print hover control is displayed floating on a user interface of the first device.
30. The method of claim 29, further comprising:
the first device detects a fifth user operation on the print hover control;
and the first device sends a first print file to the second device for printing based on the fifth user operation, wherein the first print file is the content displayed on the first device when the first device detects the fifth user operation.
31. The method of claim 29 or 30, further comprising:
the first device detects a sixth user operation on the print hover control;
and the first device deletes the print hover control based on the sixth user operation, and indicates to stop running the first scenarized service.
32. A device interaction method, the method comprising:
a first device displays a first interface, wherein the first interface comprises a first control and a second control, the first control indicates a device option of the first device, the second control indicates a device option of a second device, and the second device is a device discovered by the first device;
the first device detects a first user operation, and the first user operation is used for associating the first device and the second device;
the first device sends a first binding request to the second device according to the first user operation, and receives a first agreement message from the second device;
and the first device displays a second interface according to the first agreement message, wherein a service option of a first scenarized service is displayed in the second interface, and the first scenarized service is a scenarized service cooperatively supported by the first device and the second device.
33. The method according to claim 32, wherein the first device sends a first binding request to the second device according to the first user operation, specifically comprising:
the first device displays a connection code setting box according to the first user operation, wherein the connection code setting box is used for setting a connection code for binding the first device and the second device;
the first device receives a first connection code input in the connection code setting box and sends the first binding request to the second device;
wherein the first agreement message includes the first connection code.
34. The method according to claim 32 or 33, wherein in the first interface, the second controls are distributed on a circle with a radius of a first length centered on the first control, and the first control and the second control are in a separated state.
35. The method of claim 34, wherein the first interface further comprises a fourth control, wherein the fourth control indicates a device option of a fifth device, wherein the fifth device is a device discovered by the first device, and an account associated with the fifth device is the same as an account associated with the first device; the fourth control is distributed on a circle which takes the first control as a center and has a radius of a second length, the first control and the fourth control are in a separated state, and the first length is different from the second length.
36. The method of any of claims 32-35, wherein the account associated with the first device is different from the account associated with the second device, or the second device is not associated with an account.
37. The method of claim 35 or 36, wherein the first interface further comprises a fifth control, wherein the fifth control indicates a device option of a sixth device, the sixth device is a device discovered by the first device, and the fifth control and the fourth control are in a separated state, and wherein the method further comprises:
the first device detects a seventh user operation for associating the fifth device with the sixth device;
the first device instructs, according to the seventh user operation, the fifth device to request binding with the sixth device, and receives a second agreement message indicating that the sixth device agrees to bind with the fifth device;
and the first equipment displays an eighth interface according to the second agreement message, wherein the fourth control and the fifth control are adsorbed together in the eighth interface.
38. The method according to any one of claims 32-37, further comprising:
the first device detects a user operation acting on the second control, and removes the second control from the first interface.
39. An electronic device, characterized in that the electronic device comprises: a communication device, a display device, a memory, and a processor coupled to the memory, and one or more programs; the communication device is used for communication, the display device is used for displaying an interface, and the memory stores computer-executable instructions which when executed by the processor cause the electronic equipment to realize the method according to any one of claims 1 to 17, 19 to 25 and 27 to 31.
40. A computer-readable storage medium comprising instructions that, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-17, 19-25, 27-31.
41. A communication system comprising a first device and a second device, wherein the first device performs the method of any of claims 1 to 17, 19 to 25, 27 to 31.
42. An electronic device, characterized in that the electronic device comprises: a communication device, a display device, a memory, and a processor coupled to the memory, and one or more programs; the communication device is for communication, the display device is for displaying an interface, and the memory has stored therein computer-executable instructions that, when executed by the processor, cause the electronic device to implement the method of any one of claims 32 to 38.
43. A computer-readable storage medium comprising instructions that, when executed on an electronic device, cause the electronic device to perform the method of any of claims 32-38.
CN202210270366.5A 2021-08-31 2022-03-18 Equipment interaction method, electronic equipment and system Pending CN115729392A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/115212 WO2023030196A1 (en) 2021-08-31 2022-08-26 Device interaction method, electronic device and system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2021110164845 2021-08-31
CN202111016484 2021-08-31

Publications (1)

Publication Number Publication Date
CN115729392A true CN115729392A (en) 2023-03-03

Family

ID=85292348

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210270366.5A Pending CN115729392A (en) 2021-08-31 2022-03-18 Equipment interaction method, electronic equipment and system

Country Status (2)

Country Link
CN (1) CN115729392A (en)
WO (1) WO2023030196A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105302285B (en) * 2014-08-01 2019-03-01 福州瑞芯微电子股份有限公司 Multi-display method, equipment and system
CN111580764B (en) * 2020-04-18 2023-08-29 广州视源电子科技股份有限公司 Screen sharing method, device, equipment and storage medium of intelligent interaction tablet
CN112083867A (en) * 2020-07-29 2020-12-15 华为技术有限公司 Cross-device object dragging method and device
CN112698761A (en) * 2020-12-30 2021-04-23 维沃移动通信有限公司 Image display method and device and electronic equipment

Also Published As

Publication number Publication date
WO2023030196A1 (en) 2023-03-09


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination