CN117424772A - Display control method, electronic device, and computer-readable storage medium


Info

Publication number: CN117424772A
Application number: CN202210818264.2A
Authority: CN (China)
Prior art keywords: user, electronic device, central control, data, control device
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventors: 张程蛟, 孙继强
Current Assignee: Huawei Technologies Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Huawei Technologies Co Ltd
Application filed by: Huawei Technologies Co Ltd
Priority application: CN202210818264.2A
Related PCT application: PCT/CN2023/106604 (WO2024012413A1)

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L12/2803 Home automation networks
    • H04L12/2816 Controlling appliance services of a home automation network by calling their functionalities
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 User authentication
    • G06F21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to the technical field of smart home, and in particular to a display control method, an electronic device, and a computer-readable storage medium. The method includes the following steps: a first electronic device identifies a first user who is currently using it; the first electronic device obtains, from a second electronic device according to the identification result, first display data corresponding to the first user, and displays a first control interface based on the first display data, where the first display data is associated with the first user's usage data on the second electronic device. With the method and the device, once a user's identity is identified, a personalized desktop matching that identity can be displayed. The user can therefore perform control operations on any central control device in the home environment through a personalized desktop that matches his or her preferences or usage habits, which improves the user experience.

Description

Display control method, electronic device, and computer-readable storage medium
Technical Field
The application relates to the technical field of smart home, and in particular to a display control method, an electronic device, and a computer-readable storage medium.
Background
With the development of smart home technology, in some smart home scenarios a user can operate smart home devices in the home, such as a smart television, a smart speaker, an electric kettle, a water dispenser, and curtains, through a home control panel (central control screen). The desktop provided by the central control screen for implementing these control functions may also be referred to as a super desktop (Super Desktop).
However, when the user controls the smart home devices through the central control screen, the current central control screen cannot flexibly switch its desktop display mode to adapt to the preferences and usage habits of different users.
Disclosure of Invention
The embodiments of the present application provide a display control method, an electronic device, and a computer-readable storage medium. Based on the solution of the present application, a user can perform control operations on any central control device in the home environment through a personalized desktop that matches the user's preferences or usage habits, which improves the user experience.
In a first aspect, an embodiment of the present application provides a display control method applied to a smart home system, where the smart home system includes a first electronic device and a second electronic device having a screen. The method includes: the first electronic device identifies a first user who is currently using it; the first electronic device obtains, from the second electronic device according to the identification result, first display data corresponding to the first user, and displays a first control interface based on the first display data, where the first display data is associated with the first user's usage data on the second electronic device.
That is, when the first electronic device identifies the first user currently using it, it can display a control interface matching the first user's identity. The first electronic device and the second electronic device may each be, for example, a central control device such as a central control screen in the smart home system. The second electronic device may be, for example, the central control device commonly used by the user, corresponding to the central control device 100-1 in the detailed description below, and the first electronic device is the central control device the user currently needs to use, corresponding to the central control device 100-2 in the detailed description below. The first display data obtained by the first electronic device and corresponding to the identified first user is display data generated by the user's common central control device, that is, display data generated by the second electronic device.
The first display data is related to the first user's usage data on the second electronic device. For example, the common central control device may adjust the display style of the control interface corresponding to the user according to usage data such as the user's usage habits or the user's setting operations, and generate the corresponding display data.
It can be appreciated that the first electronic device may obtain the first display data directly from the second electronic device, or may obtain, from a cloud or a server, the first display data uploaded by the second electronic device; this is not limited herein.
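By way of illustration only, the following Kotlin sketch outlines the flow of the first aspect: recognize the current user, obtain the matching display data (either directly from the second electronic device or from a cloud/server holding the data it uploaded), and display the corresponding control interface. All type and function names (UserRecognizer, DisplayDataSource, FirstDevice, and so on) are assumptions made for this sketch and are not defined by the present application.

```kotlin
// Illustrative only: all names here are hypothetical, not part of the patent.
data class DisplayData(val sourceDeviceId: String, val desktopStyle: String)

interface UserRecognizer {                       // face / fingerprint / voiceprint recognition
    fun recognize(): String?                     // returns a user identifier, or null if unknown
}

interface DisplayDataSource {                    // either the second electronic device itself,
    fun fetch(userId: String): DisplayData?      // or a cloud/server holding its uploaded data
}

class FirstDevice(
    private val recognizer: UserRecognizer,
    private val directSource: DisplayDataSource, // direct retrieval from the second device
    private val cloudSource: DisplayDataSource   // retrieval of data the second device uploaded
) {
    fun onUserInteraction() {
        val userId = recognizer.recognize() ?: return showDefaultDesktop()
        val data = directSource.fetch(userId) ?: cloudSource.fetch(userId)
        if (data != null) showPersonalizedDesktop(data) else showDefaultDesktop()
    }
    private fun showPersonalizedDesktop(d: DisplayData) =
        println("first control interface: ${d.desktopStyle} (from ${d.sourceDeviceId})")
    private fun showDefaultDesktop() = println("third control interface: default desktop")
}

fun main() {
    val device = FirstDevice(
        recognizer = object : UserRecognizer { override fun recognize() = "user-A" },
        directSource = object : DisplayDataSource {
            override fun fetch(userId: String) = DisplayData("central-control-100-1", "simple-mode")
        },
        cloudSource = object : DisplayDataSource { override fun fetch(userId: String) = null }
    )
    device.onUserInteraction()   // prints the personalized (first) control interface
}
```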
In a possible implementation of the first aspect, the first control interface includes a plurality of controls, where the controls are used to control smart home devices in the smart home system.
The user can control and operate the intelligent household equipment in the intelligent household system through the control interface displayed by the first electronic equipment.
In a possible implementation of the first aspect, the smart home system further includes a third electronic device having a screen, and the method further includes: the first electronic equipment identifies the current second user; the first electronic device obtains second display data corresponding to the second user from the third electronic device according to the identification result, and displays a second control interface based on the second display data, wherein the second display data is associated with the use data of the third electronic device by the second user.
That is, when the first electronic device recognizes the second user currently using it, it can display a control interface matching the second user's identity. For details, refer to the above description of displaying a control interface matching the first user, which is not repeated here.
In one possible implementation manner of the first aspect, the first electronic device obtains, from the second electronic device, first display data corresponding to the first user according to the identification result, including: acquiring first user data for identifying a first user; determining device information of the second electronic device based on a first association relationship between the first user data and the second electronic device; and sending a data acquisition request to the second electronic device based on the device information of the second electronic device to acquire the first display data.
Using the obtained first user data, the first electronic device can look up the association relationship between the first user data and the second electronic device, so as to determine the device information of the second electronic device that can provide the first display data. The first electronic device can then request the first display data from the second electronic device.
In other embodiments, the first electronic device may also send the obtained first user data to a cloud or a server, where it is matched against the first display data uploaded by the second electronic device, and then receive the first display data returned by the cloud or the server. This is not limited herein.
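A minimal sketch of these steps is given below: the collected user data is reduced to a digest, the digest is looked up in the first association relationship to resolve the second electronic device's information, and a data acquisition request is built. The record and field names are hypothetical and the digest strings are only placeholders.

```kotlin
// Hypothetical shapes for the steps above; none of these names come from the patent.
data class DeviceInfo(val deviceId: String, val address: String)

data class DataAcquisitionRequest(
    val requesterDeviceId: String,   // first electronic device (currently in use)
    val targetDeviceId: String,      // second electronic device (user's common device)
    val userDataDigest: String       // digest of the collected first user data
)

// First association relationship, obtained in advance from the second electronic device.
val firstAssociation: Map<String, DeviceInfo> =
    mapOf("face-digest-A" to DeviceInfo("central-control-100-1", "192.168.1.21"))

fun resolveAndRequest(requesterId: String, userDataDigest: String): DataAcquisitionRequest? {
    val target = firstAssociation[userDataDigest] ?: return null  // unknown user: no request sent
    return DataAcquisitionRequest(requesterId, target.deviceId, userDataDigest)
}

fun main() {
    println(resolveAndRequest("central-control-100-2", "face-digest-A"))        // request built
    println(resolveAndRequest("central-control-100-2", "face-digest-unknown"))  // null
}
```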
In one possible implementation of the first aspect, the first electronic device obtains the first user data by: detecting user interaction, and collecting first user data of a first user; alternatively, the first user data is received from a third electronic device.
In a possible implementation of the first aspect described above, the user interaction is detected, including any of the following: detecting that the user approaches; detecting touch operation of a user on a screen of the first electronic equipment; and detecting the pressing operation of the key on the first electronic equipment by the user.
In one possible implementation of the first aspect, the first user data includes: any one of face recognition data, fingerprint recognition data, voiceprint recognition data.
The first user data may be, for example, the collected face information, fingerprint information, or voiceprint information of the user.
In a possible implementation of the first aspect, determining device information of the second electronic device based on a first association relationship between the first user data and the second electronic device includes: and determining the equipment information of the second electronic equipment based on the first association relation acquired in advance from the second electronic equipment.
In a possible implementation manner of the first aspect, the manner in which the second electronic device generates the first association relationship includes any one of the following: generating a first association relationship based on the detected operation of the first user to input the first user data and the operation of setting the second electronic device as a common device associated with the first user; generating a first association relation based on the frequency of use of the second electronic equipment by the first user; and generating a first association relation based on the statistical time length of the position of the first user and the position of the second electronic equipment in the first space, wherein the position of the first user is determined through position data acquired by the fourth electronic equipment.
For example, the second electronic device is the central control device commonly used by the user. When this central control device detects that the user has registered identity information on it and has performed an operation setting it as the user's common central control device, it can establish an association relationship with the user. Alternatively, if the central control device identifies that the user uses it more frequently than any other central control device, or that the user stays in the same room as it for the longest time, it establishes an association relationship with the user as the user's common central control device.
It will be appreciated that, in other embodiments, the association between the user and the common central control device may be established in other manners, which is not limited herein.
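The three ways of generating the first association relationship listed above can be pictured as a simple decision rule, sketched below. The statistics record, its field names, and the idea of comparing raw counts and durations are assumptions made for the example; the application does not prescribe concrete thresholds or data structures.

```kotlin
// Sketch of the three association-creation rules described above; illustrative only.
import java.time.Duration

data class UsageStats(
    val explicitlyRegistered: Boolean,          // user enrolled identity data and set this device as common
    val useCountPerDevice: Map<String, Int>,    // how often the user used each central control device
    val coLocation: Map<String, Duration>       // time the user spent in the same room as each device
)

fun shouldBecomeCommonDevice(thisDeviceId: String, stats: UsageStats): Boolean {
    if (stats.explicitlyRegistered) return true
    val mostUsed = stats.useCountPerDevice.entries.maxByOrNull { it.value }?.key
    if (mostUsed == thisDeviceId) return true
    val longestStay = stats.coLocation.entries.maxByOrNull { it.value }?.key
    return longestStay == thisDeviceId
}

fun main() {
    val stats = UsageStats(
        explicitlyRegistered = false,
        useCountPerDevice = mapOf("central-control-100-1" to 42, "central-control-100-2" to 3),
        coLocation = mapOf("central-control-100-1" to Duration.ofHours(6))
    )
    // true -> this device records the first association relationship for the user
    println(shouldBecomeCommonDevice("central-control-100-1", stats))
}
```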
In one possible implementation manner of the first aspect, the method for the first electronic device to obtain the first association from the second electronic device in advance includes: the first electronic equipment receives a first association relation synchronized by the second electronic equipment through an open Internet of things communication protocol; or the first electronic equipment sends the acquisition request to the second electronic equipment through the open internet of things communication protocol, and receives the first association relation returned by the second electronic equipment in response to the acquisition request.
In one possible implementation of the first aspect, the open internet of things communication protocol is a HiLink communication protocol.
For example, the association relationship between the user and the common device, that is, the first association relationship, may be synchronized between the first electronic device and the second electronic device through a HiLink channel. It will be appreciated that, in other embodiments, the first electronic device and the second electronic device may synchronize the association relationship through other communication manners, which is not limited herein.
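The two exchange patterns described above (the second electronic device pushing the association relationship, or returning it in response to an acquisition request) are sketched below. The channel interface and the text payload format are placeholders invented for the sketch; they are not the HiLink protocol itself, which is named here only as the open Internet of Things communication protocol used by the embodiment.

```kotlin
// Schematic only: this does not use the real HiLink protocol.
data class AssociationRecord(val userDataDigest: String, val commonDeviceId: String)

interface IotChannel {                                   // stand-in for the open IoT channel
    fun send(peerDeviceId: String, payload: String)
}

// Pattern 1: the second electronic device pushes (synchronizes) the record proactively.
fun pushAssociation(channel: IotChannel, peerId: String, record: AssociationRecord) =
    channel.send(peerId, "ASSOC ${record.userDataDigest} ${record.commonDeviceId}")

// Pattern 2: the first electronic device asks for it and the second device replies.
fun requestAssociation(channel: IotChannel, peerId: String) =
    channel.send(peerId, "GET_ASSOC")

fun parseAssociation(payload: String): AssociationRecord? {
    val parts = payload.split(" ")
    return if (parts.size == 3 && parts[0] == "ASSOC") AssociationRecord(parts[1], parts[2]) else null
}

fun main() {
    println(parseAssociation("ASSOC face-digest-A central-control-100-1"))
}
```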
In a possible implementation of the first aspect, the second electronic device includes a second control interface, and the first user's usage data on the second electronic device includes at least one of the following: data acquired according to the first user's operation habits; or data acquired according to the first user's adjustment of the display content of the second control interface.
In a possible implementation of the first aspect, the method further includes: before the first electronic device displays the first control interface based on the first display data, the first electronic device displays a third control interface; displaying a first control interface based on the first display data, comprising: and based on the first display data, displaying the third control interface in a switching mode as a first control interface.
For example, the first electronic device is the central control device currently used by the user, and it can display a default desktop, namely the third control interface, when its screen is turned on. For example, when the central control device has not recognized the user's identity or has not acquired the user's face information, it can display the default desktop if the screen needs to be lit. When the central control device recognizes the identity of the user currently using it, the displayed third control interface can be switched to the first control interface matching that user.
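As a small illustration of this switching, the sketch below keeps track of which control interface is currently shown and switches from the default (third) interface to a user-specific one once recognition succeeds. The enum values and user identifiers are purely illustrative.

```kotlin
// Minimal sketch of the interface switching described above; names are illustrative only.
enum class ControlInterface { THIRD_DEFAULT, FIRST_USER_A, SECOND_USER_B }

class DesktopState {
    var current: ControlInterface = ControlInterface.THIRD_DEFAULT
        private set

    // Called once the woken-up screen has recognized (or failed to recognize) a user.
    fun onRecognition(userId: String?) {
        current = when (userId) {
            null -> ControlInterface.THIRD_DEFAULT       // no identity: keep the default desktop
            "user-A" -> ControlInterface.FIRST_USER_A    // e.g. the elderly user's simple mode
            else -> ControlInterface.SECOND_USER_B       // another registered user
        }
    }
}

fun main() {
    val state = DesktopState()
    state.onRecognition(null)
    println(state.current)       // THIRD_DEFAULT
    state.onRecognition("user-A")
    println(state.current)       // FIRST_USER_A
}
```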
In a possible implementation manner of the first aspect, the first electronic device and the second electronic device establish a data transmission channel based on any one of WiFi, bluetooth or a distributed soft bus, and the data transmission channel is used for transmitting the first display data from the second electronic device to the first electronic device.
In a possible implementation of the first aspect, the first control interface is an interface of a desktop application, and the first display data includes at least one of: application data of a desktop application, application data of a first application interacting with the desktop application, and device data of intelligent home devices correspondingly controlled by a control on a first control interface.
For example, the personalized desktop displayed after the central control screen recognizes the identity of the user may be the first control interface.
In a second aspect, an embodiment of the present application provides a display control method, which is applied to an intelligent home system, where the intelligent home system includes a first electronic device and a second electronic device with a screen, and the method includes: the second electronic equipment receives an acquisition request of the first electronic equipment for first display data, wherein the first display data are used for displaying a first control interface of the first electronic equipment, and the first display data are generated by the second electronic equipment according to the use data of the first electronic equipment by a first user; in response to the acquisition request, the second electronic device transmits the first display data to the first electronic device.
In a possible implementation of the second aspect, the second electronic device includes a second control interface, and the usage data of the second electronic device includes: the second electronic equipment obtains data based on the operation habit of the first user; or the second electronic device obtains data based on the adjustment operation of the display content of the second control interface by the user.
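From the second electronic device's side, the exchange described in the second aspect amounts to storing the display data it generated from the first user's usage data and returning it when an acquisition request arrives. The following sketch uses hypothetical request/response classes (similar in spirit to the earlier sketch) and an in-memory map; it is not an API defined by this application.

```kotlin
// Sketch of the second electronic device's side of the exchange; illustrative only.
data class DataAcquisitionRequest(
    val requesterDeviceId: String,
    val targetDeviceId: String,
    val userDataDigest: String
)
data class DataAcquisitionResponse(val found: Boolean, val displayData: String?)

class CommonDeviceService {
    // Display data the device generated from the first user's usage data (habits, interface edits).
    private val displayDataByUser = mutableMapOf("face-digest-A" to """{"style":"simple-mode"}""")

    fun recordUsage(userDataDigest: String, generatedDisplayData: String) {
        displayDataByUser[userDataDigest] = generatedDisplayData
    }

    fun handle(request: DataAcquisitionRequest): DataAcquisitionResponse {
        val data = displayDataByUser[request.userDataDigest]
        return DataAcquisitionResponse(found = data != null, displayData = data)
    }
}

fun main() {
    val service = CommonDeviceService()
    val request = DataAcquisitionRequest("central-control-100-2", "central-control-100-1", "face-digest-A")
    println(service.handle(request))
}
```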
In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors; one or more memories; the one or more memories store one or more programs that, when executed by the one or more processors, cause the electronic device to perform the display control method provided in the first aspect or the display control method provided in the second aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having stored thereon instructions that, when executed on a computer, cause the computer to perform the display control method provided in the first aspect or perform the display control method provided in the second aspect.
In a fifth aspect, embodiments of the present application provide a computer program product comprising a computer program/instruction which, when executed by a processor, implements the display control method provided in the first aspect or the second aspect.
Drawings
Fig. 1 is a schematic view of an intelligent home scene.
Fig. 2a is a schematic diagram of the default desktop displayed when a central control device wakes up its screen upon detecting a user's approach.
Fig. 2b is a schematic diagram of the personalized desktop corresponding to a user, displayed after a central control device wakes up its screen and identifies the user, according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of a central control device according to an embodiment of the present application.
Fig. 4 is a schematic block diagram of an operating system architecture of a central control device according to an embodiment of the present application.
Fig. 5 is a schematic diagram of a display data circulation process between central control devices in an intelligent home scene according to an embodiment of the present application.
Fig. 6 is a schematic flow chart of an implementation of a display control method according to an embodiment of the present application.
Fig. 7a is a schematic diagram of a personalized desktop style according to an embodiment of the present application.
Fig. 7b is a schematic diagram of another personalized desktop style provided in an embodiment of the present application.
Fig. 8 is a schematic diagram of an implementation flow of registering user identity information on a common central control device and setting the common central control device associated with the user identity information according to an embodiment of the present application.
Fig. 9 is a schematic diagram of an association relationship between a sub-account of a family member, face data registered by a user, and a common central control device according to an embodiment of the present application.
Fig. 10 is a schematic diagram of an interaction flow of a display control method according to an embodiment of the present application.
Detailed Description
Fig. 1 shows a schematic view of a smart home scenario according to an embodiment of the present application.
As shown in fig. 1, the smart home scene 10 includes a plurality of central control screens, such as central control screens 100-1, 100-2, 100-3, 100-4, 100-5 in each space shown in fig. 1, and a user may operate the central control screen not only in the space where the user is often active, such as central control screen 100-1, but also in other spaces, such as central control screen 100-2.
Different users may have different demands on the desktop style of the central control screen. For example, a young or middle-aged user may prefer to have as many control buttons and as much device data of each home device as possible displayed on the desktop for operation; an elderly person may prefer a simple-mode desktop with large fonts that only displays the control buttons of a few commonly used smart devices; and the desktop of a central control screen used by a child may automatically block operations by which the child could endanger himself or herself with certain electrical appliances, and instead display content recommendations and control buttons for learning-related applications or devices.
However, the current central control screen can only display its local, generic desktop when it detects a user approaching, and cannot display a matching personalized desktop according to the preferences or usage habits of different users.
In order to solve the problem that the electronic devices executing the central control function, such as the central control screen, in the smart home scene 10 shown in fig. 1 cannot display the corresponding personalized desktops in a matching manner according to the preference or the use habit of different users, the embodiment of the application provides a display control method which is applied to the smart home scene. The electronic device executing the central control function is hereinafter referred to as a central control device, for example, a central control screen in the scene shown in fig. 1.
Specifically, in this method, the association relationship between user identity information and the corresponding common central control device is stored in each central control device in advance. If a user sets personal display data on his or her common central control device, that is, a personalized desktop style matching the user's preferences or usage habits, the common central control device stores the personal display data set by the user. When the user moves from the common central control device to the central control device currently being used, the current central control device starts to identify the user's identity information upon detecting user interaction. The user interaction may be, for example, the user approaching the current central control device, or a wake-up operation such as the user touching the screen of the current central control device, which is not limited herein. Based on the stored association relationship between user identity information and the corresponding common central control device, the current central control device can obtain the display data from the common central control device corresponding to the identified user identity information, and display a personalized desktop matching the user's preferences or usage habits.
Based on the control method, the user can use the personalized desktop which accords with the user preference or the use habit to perform control operation on any central control device in the home environment, and therefore the user experience is improved.
In some embodiments, the association relationship between the user identity information and the corresponding common central control device may be stored in each central control device. The manner in which the current central control device obtains the display data on the common central control device corresponding to the identified user identity information may include: the current central control device can determine the device information of the common central control device corresponding to the user identity information from the association relationship stored by the current central control device after the identity information of the user is identified, and send a request for acquiring display data to the corresponding common central control device, and the corresponding common central control device returns corresponding display data based on the request.
Optionally, the current central control device may also receive notification and user identity information sent by other electronic devices in the home environment, so as to obtain display data matched with the received user identity information, and display a corresponding personalized desktop, which is not limited herein. The notification and the user identity information may be information sent by other electronic devices when they are close to the current central control device and recognize the user identity. For example, when the smart watch worn by the user detects that the user is close to the current central control device, a notification of the user's approach and identity information of the user may be sent to the current central control device.
In some embodiments, the association relationship between the user identity information and the corresponding common central control device may be stored in the central device connected to each central control device or pre-stored in the cloud. The current central control equipment can send the user identity information to the central equipment or the cloud after the identity information of the user is identified, the central equipment or the cloud determines the equipment information of the common central control equipment corresponding to the user identity information and returns the equipment information to the current central control equipment, and then the current central control equipment sends a request for acquiring display data to the corresponding common central control equipment, and the corresponding common central control equipment returns corresponding display data based on the request.
In some embodiments, the association relationship between the user identity information and the corresponding common central control device, and the personalized desktop display data set by the user in the common central control device are stored in the cloud. The current central control equipment can send the user identity information to the cloud after identifying the user identity information, and the cloud directly calls display data of the common central control equipment corresponding to the user identity information according to the stored association relationship and sends the display data to the current central control equipment for use. In order to protect the user privacy data, the association relationship, the display data and the like stored in the cloud in advance may be subjected to corresponding encryption processing, which is not limited herein.
In other embodiments, the manner in which the current central control device quickly obtains the display data on the common central control device corresponding to the identity information of the user after identifying the identity information of the user may be other manners, which is not limited herein.
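For the cloud-hosted variant described above, the application only notes that the stored association relationship and display data "may be subjected to corresponding encryption processing" without naming a scheme. The sketch below shows one possible choice, AES-GCM from the standard javax.crypto API, purely as an illustration of encrypting display data before upload and decrypting it on the current central control device; key management is deliberately omitted.

```kotlin
import javax.crypto.Cipher
import javax.crypto.KeyGenerator
import javax.crypto.SecretKey
import javax.crypto.spec.GCMParameterSpec
import java.security.SecureRandom
import java.util.Base64

// Illustrative only: AES-GCM is one possible encryption choice, not one mandated by the patent.
fun newKey(): SecretKey = KeyGenerator.getInstance("AES").apply { init(256) }.generateKey()

fun encryptForCloud(plain: ByteArray, key: SecretKey): Pair<ByteArray, ByteArray> {
    val iv = ByteArray(12).also { SecureRandom().nextBytes(it) }
    val cipher = Cipher.getInstance("AES/GCM/NoPadding")
    cipher.init(Cipher.ENCRYPT_MODE, key, GCMParameterSpec(128, iv))
    return iv to cipher.doFinal(plain)
}

fun decryptFromCloud(iv: ByteArray, cipherText: ByteArray, key: SecretKey): ByteArray {
    val cipher = Cipher.getInstance("AES/GCM/NoPadding")
    cipher.init(Cipher.DECRYPT_MODE, key, GCMParameterSpec(128, iv))
    return cipher.doFinal(cipherText)
}

fun main() {
    val key = newKey()
    val displayData = """{"user":"face-digest-A","style":"simple-mode"}""".toByteArray()
    val (iv, blob) = encryptForCloud(displayData, key)          // uploaded by the common device
    println(Base64.getEncoder().encodeToString(blob))           // what the cloud stores
    println(String(decryptFromCloud(iv, blob, key)))            // what the current device recovers
}
```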
It will be appreciated that the identity information includes, but is not limited to, face data, fingerprint data, voiceprint data, and the like. The display data set by the user includes the display data corresponding to the personalized desktop style configured by the user on the common central control device, including, but not limited to, desktop application (Launcher) data of the common central control device, the application data or device status data displayed in each card window of the desktop application, and data of system applications that can set global desktop styles such as the display font and font size of the desktop application; such a system application may be, for example, the Settings application.
It will be appreciated that, in some embodiments, a data transfer channel may be established between the central control devices in the smart home scenario to transmit the display data, requests for acquiring the corresponding display data, and so on. The data transfer channel may be established based on the same wireless local area network that the devices are connected to, for example a near-field data transfer channel implemented based on WiFi, Bluetooth, ZigBee, or a distributed soft bus, which is not limited herein. In this way, the personalized desktop display data generated based on the user's settings flows only among the central control devices in the home environment, which helps protect the user's private data.
In other embodiments, the data flow channel may be established based on a wireless network and a cloud, for example, display data on a common central control device may be transmitted to the cloud through a wireless local area network or a mobile data communication network, and the current central control device acquires the display data from the cloud when identifying the identity of the user. Therefore, the transmission of the display data is not limited to the wireless local area network, and the display control method provided by the embodiment of the application can also have greater network scene adaptability.
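Purely as an illustration of the near-field case, the toy sketch below moves display data between two endpoints over a plain TCP socket on the shared local network. A real implementation would use the WiFi, Bluetooth, or distributed-soft-bus channels mentioned above; the payload and function names are invented for the example.

```kotlin
import java.net.ServerSocket
import java.net.Socket
import kotlin.concurrent.thread

// Toy near-field channel: one device serves its display data over the shared LAN, the other reads it.
fun serveDisplayData(server: ServerSocket, displayData: String) = thread {
    server.use { s -> s.accept().use { it.getOutputStream().write(displayData.toByteArray()) } }
}

fun fetchDisplayData(host: String, port: Int): String =
    Socket(host, port).use { it.getInputStream().readBytes().decodeToString() }

fun main() {
    val server = ServerSocket(0)                              // bind first so the client cannot race ahead
    serveDisplayData(server, """{"style":"simple-mode"}""")   // common central control device side
    println(fetchDisplayData("127.0.0.1", server.localPort))  // current central control device side
}
```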
Fig. 2a shows a schematic diagram of the default control interface displayed when a central control device wakes up its screen upon detecting a user's approach. It will be appreciated that the central control device 100-2 shown in Fig. 2a may be the current central control device described above, that is, the central control device the user is currently interacting with.
As shown in Fig. 2a, when the central control device 100-2 detects that a user approaches, it can wake up the screen and display the control interface 210 shown in Fig. 2a. The control interface 210 contains many icons, and the font size of the function description under each icon is moderate, which suits the usage habits of most users. However, if the user operating the central control device 100-2 at this time is an elderly person in the home, he or she may not be able to read the function description of each icon clearly.
Fig. 2b shows a personalized desktop schematic corresponding to a user after identifying the identity of the user when the central control device wakes up a screen according to an embodiment of the present application.
As shown in fig. 2b, the personalized desktop may be, for example, a simple mode desktop 220 that conforms to the usage habits of elderly people. The desktop 220 has larger character size, fewer functional controls and a layout which is more convenient for the old. In other embodiments, the personalized desktop shown in fig. 2b may be other style that matches the preferences of other users, without limitation.
Referring to the scenario shown in Fig. 1, suppose the user is an elderly person in the home, the user's common central control device is the central control device 100-1 located in the bedroom, and the personalized desktop of the central control device 100-1 has accordingly been set to a simple-mode desktop matching the usage habits of elderly users. If the user walks from the bedroom to the living room and wants to drink water, he or she can walk up to the central control device 100-2 located in the living room. When the central control device 100-2 detects that the user is approaching, it can wake up the screen and identify the user by means of face recognition or fingerprint recognition. When the central control device 100-2 confirms that the user is the elderly person in the home, it can acquire, from the central control device 100-1, the display data of the simple-mode desktop matching the usage habits of elderly users, and then display the simple-mode desktop 220 shown in Fig. 2b. The user can tap the "hot water one key" control displayed in large characters on the desktop 220 to meet his or her drinking-water need.
The simple-mode desktop 220, as the control interface displayed when the central control device 100-2 recognizes the identity information of the elderly person in the home, is denoted as the first control interface. When the central control device 100-2 recognizes another user's identity, for example the identity information of a child in the home, it can also display a control interface corresponding to that child's settings, which is denoted as the second control interface. Before the central control device 100-2 displays the first control interface or the second control interface, for example when it has not recognized the identity information of any user, it can display the default desktop shown in Fig. 2a, which is denoted as the third control interface.
In this way, upon recognizing the identity information of the elderly person in the home, the central control device 100-2 switches from displaying the third control interface to displaying the first control interface. Similarly, upon recognizing the identity information of the child in the home, the central control device 100-2 displays the second control interface, that is, it switches from displaying the third control interface to displaying the second control interface.
In addition, the simple-mode desktop 220 displayed by the central control device 100-2 may also show an interface prompt 221, for example "identify owner, enter simple mode" as shown in Fig. 2b. It can be understood that, based on the display control method provided by the embodiments of the present application, whether the user is the elderly person in the home or another user, he or she can perform control operations on the home devices, on any central control device in the home environment, through a desktop that meets his or her personalized requirements, for example tapping the "full-house dehumidification" function shown in Fig. 2b to start the home air conditioner for dehumidification, which provides an excellent user experience.
It may be appreciated that the electronic devices to which the display control method provided in the embodiments of the present application is applicable (i.e., the above-mentioned central control devices) may include, but are not limited to, mobile phones, tablet computers, desktop computers, laptop computers, handheld computers, netbooks, augmented reality (AR) or virtual reality (VR) devices, smart televisions, wearable devices such as smart watches, in-vehicle devices, portable game consoles, portable music players, reader devices, televisions with one or more processors embedded or coupled therein, or other electronic devices capable of accessing a network.
Fig. 3 shows a schematic structural diagram of a central control device 100 according to an embodiment of the present application.
The central control device 100 may include a processor 110, an external memory interface 120, an internal memory 130, a power management module 140, a wireless communication module 150, a display 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an ear speaker interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, and the like. The sensor modules 180 may include a pressure sensor 180A, a gas pressure sensor 180B, a distance sensor 180C, a proximity light sensor 180D, a fingerprint sensor 180E, a temperature sensor 180F, a touch sensor 180G, an ambient light sensor 180H, and the like.
It should be understood that the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the central control apparatus 100. In other embodiments of the present application, the central control device 100 may include more or fewer components than shown, or certain components may be combined, certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution. In the embodiment of the present application, the processor 110 may control the instruction fetch and execute the instruction through the controller to implement the display control method provided in the embodiment of the present application.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, and the like.
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present invention is only schematically illustrated, and does not constitute a structural limitation of the central control apparatus 100. In other embodiments of the present application, the central control device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The power management module 140 is connected to the processor 110 and is used for power management of the processor 110, the internal memory 130, the display 160, the camera 193, the wireless communication module 150, and so on. In other embodiments, the power management module 140 may also be provided in the processor 110.
The wireless communication function of the center control device 100 may be implemented by an antenna, the wireless communication module 150, and the like.
The antenna is used for transmitting and receiving electromagnetic wave signals. Each antenna in the central control device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The wireless communication module 150 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., for use on the central control device 100. In some embodiments, the antenna of the central control device 100 is coupled to the wireless communication module 150 such that the central control device 100 may communicate over a wireless communication network as well as other devices.
The central control apparatus 100 implements display functions through a GPU, a display screen 160, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 160 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 160 is used to display images, videos, and the like. The display screen 160 includes a display panel.
The central control apparatus 100 may implement a function of collecting user face information for confirming the identity of a user through an ISP, a camera 193, an application processor, and the like, without limitation.
Wherein the ISP may be used to process data fed back by the camera 193. The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the central control device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions.
The internal memory 130 may be used to store computer-executable program code, which includes instructions. The internal memory 130 may include a program storage area and a data storage area. The program storage area may store an operating system, application programs required for at least one function, and the like. The data storage area may store application data generated during use of the central control device 100, such as desktop application data and the data of other applications displayed on the desktop application interface. In this embodiment of the present application, the application program stored in the program storage area may be, for example, a desktop application (Launcher) for managing and controlling the various smart devices in the home environment, and the data stored in the data storage area may be, for example, the personalized display data, set through user operations, that is collected while the central control device 100 runs the desktop application, as well as identity-related data used to compare the face information or fingerprint information collected when a user approaches against the registered personal data, and so on. This is not limited herein.
In addition, the internal memory 130 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the central control apparatus 100 and data processing by executing instructions stored in the internal memory 130 and/or instructions stored in a memory provided in the processor. In this embodiment of the present application, for example, the processor 110 of the central control device 100 may execute relevant instructions stored in the internal memory 130 for executing steps of an implementation flow of the display control method described below, so as to implement the switching display of the personalized desktop of the user.
The center control device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as voice recording, voiceprint recognition, etc.
The fingerprint sensor 180E is used to collect a fingerprint. In some embodiments of the present application, the central control device 100 may utilize the collected fingerprint characteristics to compare registered user identity information to identify a user identity.
A distance sensor 180C for measuring a distance. The central control device 100 may measure the distance by infrared or laser light. In some embodiments of the present application, the central control device 100 may utilize the distance sensor 180C to measure distance to detect approach or separation of the user, so as to determine whether to wake up the screen, collect face information of the user, etc. for identification, and display a personalized desktop corresponding to the identified user.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The central control apparatus 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the central control apparatus 100. The motor 191 may generate a vibration cue. The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
Fig. 4 shows a schematic block diagram of an operating system architecture of the central control device 100 according to an embodiment of the present application.
The central control device 100 may have a distributed operating system installed on it, such as HarmonyOS™, or another operating system such as Android™, which is not limited herein. It is understood that, in the smart home scenario, there may be multiple central control devices 100, and distributed interaction between the central control devices 100 can be implemented based on the installed distributed operating system. The distributed operating system adopted by the central control device 100 is described below by taking the central control device 100-1 and the central control device 100-2 shown in Fig. 4 as an example. The central control device 100-1 may be the central control device commonly used by the user, and the central control device 100-2 may be the central control device currently used by the user.
In some embodiments, the central control device 100-1 and the distributed operating system installed on the central control device 100-2 employ a hierarchical architecture. The hierarchical architecture divides the distributed operating system into several layers, each with distinct roles and branches. The layers communicate with each other through a software interface. In some embodiments, the distributed operating system is divided into four layers, an application layer, an application framework layer, a system service layer, and a kernel layer, from top to bottom. In other embodiments, the distributed operating system may be divided into other numbers of hierarchies, without limitation.
As shown in Fig. 4, the application layer may include a series of application packages. The application packages may include applications such as a desktop (Launcher) application, Settings, Music, Calendar, WLAN, Bluetooth, Camera, Gallery, Phone, Maps, and Video, which is not limited herein.
The application layer may also include some functional software development kits (Software Development Kit, SDK), such as an information hub SDK, used to store and manage static device information, for example the member account information with which the user logs in on the central control device, the registered user identity information, and the association between the user identity information and the device information of the central control device, that is, the association information that marks the central control device as the user's common device, and so on.
The desktop application can display the corresponding personalized desktop in response to the user's setting operations. The user's setting operations may include, for example, setting personalized wallpaper or extra-large fonts, subscribing to a Feature Ability (FA) card for tracking express deliveries, adding an FA card for a private device in the user's own room, and the like. The device status shown by an FA card displayed on the desktop needs to be provided by the corresponding device or application. For example, the logistics status shown on the express-delivery FA card can be provided by the application data of the corresponding shopping app, and the ambient air humidity shown on the FA card of a private device such as a humidifier can be provided by the associated humidifier, and so on.
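The relationship between an FA card and the application or device that feeds it can be sketched as follows. The provider interface, class names, and sample status strings are invented for the illustration; they are not HarmonyOS FA APIs.

```kotlin
// Illustrative sketch of a desktop card backed by its providing application or device.
interface StatusProvider { fun currentStatus(): String }

class ShoppingAppProvider : StatusProvider {                              // feeds the express-delivery card
    override fun currentStatus() = "Parcel out for delivery"
}

class HumidifierProvider(private val humidity: Int) : StatusProvider {    // feeds the humidifier card
    override fun currentStatus() = "Ambient humidity: $humidity%"
}

data class FaCard(val title: String, val provider: StatusProvider) {
    fun render() = "$title | ${provider.currentStatus()}"
}

fun main() {
    val desktopCards = listOf(
        FaCard("Express delivery", ShoppingAppProvider()),
        FaCard("Bedroom humidifier", HumidifierProvider(humidity = 46))
    )
    desktopCards.forEach { println(it.render()) }
}
```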
The application framework layer provides a distributed framework for the application layer, and a light device proxy service. The distributed frameworks comprise functional frameworks for realizing intercommunication (including starting, calling and migration) in the system and between the systems, such as remote installation, remote starting, migration management, remote calling, distributed system service management, FA association management and the like, and application programming interfaces (application programming interface, APIs) corresponding to the frameworks. The light device proxy service may provide a light device hosting service, a master-slave service call proxy, and the like, where the light device in the HarmonyOS refers to a screen device, and in this embodiment, the light device hosted by the light device proxy service proxy may be, for example, an electric water heater, an air conditioner, a sweeping robot, a humidifier, and other home intelligent devices in an intelligent home environment, and a wearable device such as intelligent glasses, and the like.
The system service layer is the core of the distributed operating system and provides services to the applications and SDKs in the application layer through the application framework layer. The system service layer includes a HiLink communication component, distributed data management, a distributed soft bus, and the like.
The HiLink communication component is used to establish a communication channel for synchronizing static device information between the central control device 100-1 and the central control device 100-2. The static device information includes the user identity information registered on each central control device, the association relationship between user identity information and device IDs, information about the home devices associated with and managed by each central control device, and the like. The static device information of each central control device can be synchronized over the HiLink channel provided by the HiLink communication component.
A distributed soft bus, which is an example structure of the underlying network, is used to transmit the display data and the like that flow between the central control device commonly used by the user and the central control device currently used by the user, that is, to implement HarmonyOS distributed flow. The distributed soft bus can be understood by analogy with a computer hardware bus. For example, the distributed soft bus is an "intangible" bus built between 1+8+N devices (1 is the mobile phone; 8 refers to the head unit, speaker, earphones, watch, band, tablet, large screen, personal computer (PC), and augmented reality (AR)/virtual reality (VR) devices; N generally refers to other Internet of Things (IoT) devices), and has characteristics such as automatic discovery, plug-and-play use, ad hoc networking of heterogeneous networks, high bandwidth, low latency, and high reliability. That is, through the distributed soft bus technology, not only can all data be shared between the above-mentioned central control devices, but instant interconnection can also be achieved with any device connected to the central control devices through Bluetooth or the same local area network. In addition, the distributed soft bus can also transfer files across heterogeneous networks such as Bluetooth and Wireless Fidelity (WiFi) (for example, receiving a file over Bluetooth on one side and sending it over WiFi on the other).
The kernel layer is a layer between hardware and software. The kernel layer of the distributed operating system includes a kernel subsystem and a driver subsystem. The kernel subsystem may adopt a multi-kernel design, which allows a suitable OS kernel to be selected for devices with different resource constraints; for example, in the embodiment of the present application, LiteOS may be selected for a central control device.
A kernel abstraction layer (Kernel Abstraction Layer, KAL) above the kernel subsystem masks the differences between kernels and provides basic kernel capabilities to the upper layers, including process/thread management, memory management, file system, network management, peripheral management, and so on.
The driver framework provided by the driver subsystem is the foundation of the open hardware ecosystem of the distributed system, and provides unified peripheral access capabilities as well as a driver development and management framework. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
Based on the above-mentioned central control device structure shown in fig. 3 and the distributed operating system architecture shown in fig. 4, fig. 5 shows a schematic diagram of a display data transfer process between central control devices in a smart home scenario according to an embodiment of the present application.
As shown in fig. 5, the smart home scenario includes a central control device 100-1, a central control device 100-2, and a central control device 100-3. The central control device 100-1 may be a common central control device commonly used by the user, the central control device 100-2 may be a second electronic device currently used by the user, and the central control device 100-3 may be a central control device commonly used by other users. Each central control device at least comprises the following functional modules:
The approach interaction module may be implemented based on a distance sensor and is used for detecting the approach of a user, so that the central control device is woken up to display a desktop, that is, a control interface.
The face recognition module can be realized based on a camera and a corresponding image processor (such as an ISP), and is used for collecting face information of a user as user identity information when the user registers common central control equipment and collecting the face information of the user to perform identification verification of the user identity when the central control equipment wakes up.
The central control device can identify the device information, such as the device ID, of the target central control device based on the previously acquired or stored static device information, so that the current central control device, or the desktop (host) application on it, can send a request for acquiring display data based on the target central control device ID. The information center can synchronize the static device information among the central control devices based on the Hilink channel provided by the Hilink communication module; the request for acquiring display data, sent by the current central control device based on the target central control device ID, can be transmitted over a data channel such as the distributed soft bus. This is described in the related content of the distributed database and is not repeated here.
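As a minimal sketch of this look-up-and-request step, assuming the synchronized static information is available as plain dictionaries; the message format and the send callback are illustrative assumptions, not the disclosed protocol.

from typing import Optional

def resolve_common_device(user_id: str, static_info: dict) -> Optional[str]:
    # Look up the common central control device bound to the identified user
    # in the previously synchronized static device information.
    for record in static_info.values():
        device_id = record.get("user_device_links", {}).get(user_id)
        if device_id:
            return device_id
    return None

def request_display_data(user_id: str, static_info: dict, send) -> bool:
    # Send a display-data request to the resolved target device; return False
    # when no common device is bound, so the caller can fall back to the
    # local desktop (step 606 below).
    target_id = resolve_common_device(user_id, static_info)
    if target_id is None:
        return False
    send(target_id, {"type": "get_display_data", "user_id": user_id})
    return True

# Example with plain dictionaries standing in for the synchronized records.
static_info = {"cc-100-1": {"user_device_links": {"user-a": "cc-100-1"}}}
request_display_data("user-a", static_info, lambda dev, msg: print(dev, msg))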
It will be appreciated that in some embodiments, different users may respectively establish an association with different central control devices in the home environment, for example, the user a may register identity information with the central control device 100-1 in his own room to establish an association with the central control device 100-1. The user B may also register identity information through the central control apparatus 100-3 of his own room to establish an association relationship or the like. In other embodiments, the central control device may actively establish an association relationship with the user identity information with the highest frequency of use according to the frequency of using the central control device by each user. Or the central control equipment can acquire the position of the user through the handheld equipment and the like of the user, and accordingly the duration of the user in the room of the central control equipment is counted, so that an association relationship is actively established with the user in the room of the central control equipment, namely the identity information of the user is actively associated. There is no limitation in this regard. The process of establishing the association between the specific user and the central control device will be described in detail below, and will not be described in detail here.
The distributed database, namely an application database that can be shared by all applications in each central control device system, is used for storing desktop application data, data of other application programs that display corresponding state data on the desktop through interaction with the desktop application, data reported by associated devices, and the like. In some embodiments, the distributed database in the central control device may also be a database provided for the desktop application, which is not limited herein. The distributed database in the central control device can be implemented based on the distributed data management capability (i.e., DataAbility) provided by the Data template, which enables each application program to manage access to its own stored data and that of other application programs, and provides a method for sharing data with other application programs. The data can be shared among different applications on the same device, and data sharing among different applications across devices is also supported. It will be appreciated that the transmission of display data between the respective central control devices may be implemented based on the distributed soft bus or based on other wireless communication technologies, which is not limited herein.
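For illustration only, the shared application database can be pictured as a key-value store addressed by application and key. The sketch below is not the DataAbility interface; it is an assumption-laden stand-in that only shows same-device sharing between the desktop application and another application.

import threading

class SharedAppDatabase:
    # Hypothetical shared store: each application writes under its own id and
    # can read what other applications expose for the desktop to render.
    def __init__(self):
        self._data = {}
        self._lock = threading.Lock()

    def put(self, app_id: str, key: str, value) -> None:
        with self._lock:
            self._data[(app_id, key)] = value

    def get(self, app_id: str, key: str, default=None):
        with self._lock:
            return self._data.get((app_id, key), default)

# The desktop application stores its display data; a weather application
# stores the state it wants shown on a desktop card.
db = SharedAppDatabase()
db.put("desktop", "layout", {"cards": ["whole-house-ventilation", "music"]})
db.put("weather", "card_state", {"temp_c": 24})
print(db.get("weather", "card_state"))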
Referring to the scenario shown in fig. 1, if the central control device 100-1 is a common device of a user, for example, a central control device in the user's own room, the device stores display data of a personalized desktop set by the user. When the user enters the living room and needs to perform control operation through the central control device 100-2 in the living room, the central control device 100-2 can send a data acquisition request to the central control device 100-1 through a data flow channel such as a distributed soft bus. The central control device 100-1 may send the personalized desktop display data of the user to the central control device 100-2 through the distributed soft bus based on the received request, referring to the display data circulation process implemented based on the distributed soft bus shown in fig. 5. In this way, the personalized desktop of the user can be displayed on the central control device 100-2 of the living room. On this basis, the central control device 100-2 may further display the status of some public home devices located in the living room, for example, display the water temperature, water quality, etc. of the water dispenser, so as to be selected by the user. There is no limitation in this regard.
In other embodiments, the display data may flow between the central control device 100-1 and the central control device 100-2 in other manners. For example, the central control device 100-1 may upload the display data to the cloud through a wireless local area network. When the user enters the living room and needs to perform a control operation through the central control device 100-2 in the living room, the central control device 100-2 can acquire the display data of the central control device 100-1 from the cloud. The present application is not limited in this regard.
Fig. 6 is a schematic flow chart of an implementation of a display control method according to an embodiment of the application. It can be understood that the main execution body of each step in the flowchart shown in fig. 6 is a central control device currently operated by a user, and may be, for example, the central control device 100-2 described above. For convenience of description, the execution bodies of the following steps will not be repeated.
It should be understood that the flow shown in fig. 6 does not constitute a specific limitation on the implementation of the embodiment of the present application, and in other embodiments, the execution sequence of the steps shown in fig. 6 may be modified according to implementation needs, which is not limited herein.
As shown in fig. 6, the flow includes the steps of:
601: user interaction is detected, and personal information of the user to be identified is collected.
The user interaction may be, for example, a wake-up operation performed by the user approaching the current central control device or the user touching the screen of the current central control device, and the like, which is not limited herein. The central control device 100-2 may detect the approach of the user, for example, based on the distance sensor 180C, etc., at which time the central control device 100-2 wakes up and may collect personal information of the user currently approaching before displaying the desktop to identify the user.
The personal information of the user collected by the central control device 100-2 at this time may be any one of face information, fingerprint information, and voiceprint information of the user. It should be noted that the type of personal information of the user collected when the central control device 100-2 performs this step may correspond to the type of identity information that the user registers on the common central control device 100-1 and synchronizes to the central control device 100-2. The central control device 100-1 is, for example, a common device of a user, and may be, for example, a central control screen of a bedroom of the user in the scenario shown in fig. 1; the central control device 100-2 may be a central control device of other rooms in the home environment, for example, a central control screen of a living room in the scenario shown in fig. 1, which is not limited herein.
If the identity information of the user registered on the common central control device 100-1 and synchronized to the central control device 100-2 is face information, the central control device 100-2 may collect face information of the user currently approaching and extract face features when executing the step, so as to match with the face data registered by the user, and further identify the user identity. Similarly, if the identity information of the user registered on the common central control device 100-1 and synchronized to the central control device 100-2 is fingerprint information, the central control device 100-2 may collect fingerprint information of the user currently approaching for identifying the user identity when performing this step, which is not limited herein.
602: and judging whether the acquired user personal information has matched user identity information. If the determination result is yes, it indicates that the user has registered identity information on the common central control device, the central control device 100-2 may further perform the determination in step 603 described below, to determine common central control device information bound with the user; if the determination result is no, indicating that the user has not registered the identity information, the central control apparatus 100-2 does not need to match the common apparatus of the user, and performs step 606 described below to display the desktop based on the local display data.
Illustratively, the central control device 100-2 may compare the collected personal information of the user, such as face information, with the identity information of each user synchronized by the central control device 100-2 from other central control devices (such as the central control device 100-1). If there is identity information matching the user's personal information, it may be determined that the user is a user who has registered identity information on a common central control device; if there is no identity information matching the user's personal information, it may be determined that the user is not registered with a common central control device, for example, the user may be a visitor, which is not limited herein.
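A hypothetical sketch of this matching step, assuming face features are compared by cosine similarity against the synchronized templates; the threshold value and the feature representation are assumptions for illustration only.

import math
from typing import Optional

def cosine_similarity(a, b) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def match_identity(face_feature, registered: dict, threshold: float = 0.8) -> Optional[str]:
    # Return the user id of the best match above the threshold, else None
    # (None corresponds to the "visitor" branch of step 602).
    best_user, best_score = None, 0.0
    for user_id, template in registered.items():
        score = cosine_similarity(face_feature, template)
        if score > best_score:
            best_user, best_score = user_id, score
    return best_user if best_score >= threshold else None

# Example: the collected feature is close to user-a's synchronized template.
registered = {"user-a": [0.9, 0.1, 0.3]}
print(match_identity([0.88, 0.12, 0.29], registered))  # -> "user-a"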
603: and judging whether the matched user identity information has associated common central control equipment. If the determination result is yes, it indicates that the identity information of the user is already bound to the common central control device, then the central control device 100-2 may acquire the bound central control device information, for example, a device ID (device ID) of the central control device 100-1, and continue to execute step 604 described below; if the determination result is no, which indicates that the identity information of the user is not already bound to the common central control device, the central control device 100-2 may execute step 606 described below, and display the desktop based on the local display data.
In some embodiments, the association relationship between the user identity information and the corresponding common central control device may be stored in each central control device. Therefore, when the central control device 100-2 performs the above steps and matches the registered user identity information based on the collected user personal information, the device information of the central control device 100-1 corresponding to the user identity information can be determined from the association relationship stored by itself, and a request for obtaining display data is sent to the central control device 100-1, and the corresponding central control device 100-1 returns corresponding display data based on the request.
In other embodiments, the association between the user identity information and the corresponding common central control device may be stored in a central device connected to each central control device, or pre-stored in the cloud. The central device may be, for example, a designated electronic device in the smart home scenario. After identifying the user identity information, the central control device 100-2 may send the user identity information to the central device or the cloud, and the central device or the cloud determines the device information of the central control device 100-1 corresponding to the user identity information and returns it to the central control device 100-2; the central control device 100-2 then sends a request for acquiring display data to the corresponding central control device 100-1, and the central control device 100-1 returns the corresponding display data based on the request.
In some embodiments, the association relationship between the user identity information and the corresponding common central control device, and the personalized desktop display data set by the user in the common central control device are stored in the cloud. The central control device 100-2 can send the user identity information to the cloud after identifying the user identity information, and the cloud directly calls the display data of the central control device 100-1 corresponding to the user identity information according to the stored association relationship and sends the display data to the central control device 100-2 for use. In order to protect the user privacy data, the association relationship, the display data and the like stored in the cloud in advance may be subjected to corresponding encryption processing, which is not limited herein.
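For illustration, the cloud-hosted variant could be sketched as follows, where the association relationship and the display data are kept server-side and the display data is stored encrypted. The toy XOR cipher below only stands in for a real encryption scheme, and all names are assumptions, not part of this disclosure.

import json, hashlib

def _toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Placeholder cipher for the sketch only; a real deployment would use a
    # proper authenticated encryption scheme.
    pad = hashlib.sha256(key).digest()
    return bytes(b ^ pad[i % len(pad)] for i, b in enumerate(plaintext))

_toy_decrypt = _toy_encrypt  # XOR with the same pad reverses it

class CloudStore:
    def __init__(self, key: bytes):
        self._key = key
        self._associations = {}   # user id -> common central control device id
        self._display_data = {}   # device id -> encrypted personalized display data

    def upload(self, user_id: str, device_id: str, display_data: dict) -> None:
        self._associations[user_id] = device_id
        blob = json.dumps(display_data).encode()
        self._display_data[device_id] = _toy_encrypt(blob, self._key)

    def fetch_for_user(self, user_id: str) -> dict:
        # Called when a central control device reports an identified user id.
        device_id = self._associations[user_id]
        blob = _toy_decrypt(self._display_data[device_id], self._key)
        return json.loads(blob)

cloud = CloudStore(key=b"demo-key")
cloud.upload("user-a", "cc-100-1", {"theme": "dark", "font_size": "large"})
print(cloud.fetch_for_user("user-a"))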
604: and acquiring the matched personalized desktop display data on the common central control equipment.
Illustratively, the central control device 100-2 may send a data acquisition request, i.e., a request for acquiring display data corresponding to the personalized desktop, to the central control device 100-1 based on the device ID of the central control device 100-1. Accordingly, the central control device 100-1 may send the locally stored personalized desktop display data according to the preference or the usage habit of the user to the central control device 100-2 in response to the acquisition request sent by the central control device 100-2. It will be appreciated that in some embodiments, the data acquisition request sent by the central control device 100-2 to the central control device 100-1 may also be used to request that a display data flow channel be established with the central control device 100-1 to acquire personalized desktop display data from the central control device 100-1 in real time during the duration of the flow channel maintenance.
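A minimal sketch of the acquisition request and its handling on the common device, assuming a simple message format with an optional flag for keeping the flow channel open; all field names are illustrative assumptions.

def build_acquisition_request(target_device_id: str, user_id: str,
                              keep_channel_open: bool = False) -> dict:
    # Built on the current device (e.g. 100-2) using the device ID of the
    # common central control device (e.g. 100-1).
    return {
        "target": target_device_id,
        "type": "get_display_data",
        "user_id": user_id,
        "stream": keep_channel_open,   # True: keep the flow channel for live updates
    }

def handle_acquisition_request(request: dict, local_display_data: dict) -> dict:
    # Runs on the common central control device (e.g. 100-1) and returns the
    # locally stored personalized desktop display data for that user.
    data = local_display_data.get(request["user_id"], {})
    return {"display_data": data, "stream_accepted": request.get("stream", False)}

req = build_acquisition_request("cc-100-1", "user-a", keep_channel_open=True)
print(handle_acquisition_request(req, {"user-a": {"theme": "dark"}}))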
Optionally, the central control device 100-2 may also obtain the user's usage data from the central control device 100-1 commonly used by the user, where the usage data includes data generated according to the user's operation habits on the central control device 100-1, or data generated when the user sets or adjusts the desktop style of the central control device 100-1, and the like, which is not limited herein.
The central control device 100-1 sends personalized desktop display data to the central control device 100-2, and display data transmission/circulation can be realized through the data circulation channel, such as a distributed soft bus. In other embodiments, the transmission of the display data may be accomplished by other communication methods, which are not limited herein.
As an example, the user may edit the display style of the desktop displayed by the commonly used central control device 100-1 according to his or her own preferences. Editing operations performed by the user on the desktop of the commonly used central control device 100-1 may include: 1) adjusting the order and positions of the whole-house service cards displayed on the desktop, where the whole-house service cards may be whole-house constant temperature, whole-house dehumidification, whole-house ventilation, and the like; 2) adding or deleting commonly used applications in the common application folder, adjusting their order, and the like; 3) setting the desktop font size, the dark/light display mode, and the like; 4) adding subscribed official or third-party value-added services, for example, by adding FA cards of the corresponding services; 5) setting commonly executed scenes, adding commonly controlled devices, and the like, which is not limited herein.
Accordingly, the central control device 100-1 may generate corresponding display data in response to the editing operations of the user, and store the display data in the local storage space, where the local storage space may be, for example, a distributed database of the central control device 100-1 related to the data circulation process shown in fig. 5.
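For illustration only, the editing operations listed above could be reduced to a small set of updates applied to a stored display-data structure before it is persisted; the operation names and the structure of the display data are assumptions.

def apply_edit(display_data: dict, edit: dict) -> dict:
    op = edit["op"]
    if op == "reorder_cards":
        display_data["cards"] = edit["order"]          # 1) whole-house service cards
    elif op == "set_font_size":
        display_data["font_size"] = edit["size"]       # 3) desktop font size
    elif op == "set_theme":
        display_data["theme"] = edit["theme"]          # 3) dark/light display mode
    elif op == "add_service_card":
        display_data.setdefault("value_added", []).append(edit["card"])  # 4) subscribed services
    return display_data

desktop = {"cards": ["ventilation", "constant-temp"], "font_size": "normal"}
desktop = apply_edit(desktop, {"op": "set_font_size", "size": "large"})
desktop = apply_edit(desktop, {"op": "set_theme", "theme": "dark"})
# The resulting structure would then be persisted in the device's local
# (distributed) database, as described above.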
605: based on the received display data, the personalized desktop is displayed. After the central control apparatus 100-2 performs this step, the following step 607 may be continued, and an operation input of the user on the desktop displayed by the central control apparatus 100-2 is received.
Illustratively, after receiving the personalized desktop display data sent by the central control device 100-1 commonly used by the user, the central control device 100-2 performs desktop rendering display based on the display data, so as to display the personalized desktop according to the personal preference and the use habit of the user. It will be understood that the personalized desktop displayed by the central control device 100-2 at this time may include, but is not limited to, font size, wallpaper, a dark and light display mode set by the user on the central control device 100-1, layout of each associated device or application card on the desktop, and official or third party value added services subscribed by the user.
As an example, fig. 7a to 7b show personalized desktop style schematics of different user settings according to embodiments of the present application. It will be appreciated that the interfaces shown in fig. 7 a-7 b do not limit the style of the personalized desktop, and in other embodiments, the personalized desktops provided by different users may be in other styles.
If the user is an elderly person, for example, the desktop style can be set to the simple mode desktop 710 shown in fig. 7a; the simple mode desktop 710 uses a larger font size and a flatter desktop layout, making the controls easy for the user to identify and use. Accordingly, the personalized desktop displayed by the central control device 100-2 based on the received display data may be the simple mode desktop 710 shown in fig. 7a. Referring to fig. 7a, when the central control device 100-2 displays the simple mode desktop 710, a prompt 711 indicating that the user identification is successful may also be displayed on the desktop, for example, "owner identified, simple mode entered" shown in fig. 7a.
If the user is a child, the child mode desktop 720 shown in fig. 7b may be set. On the child mode desktop 720, operations by the child that could pose risks on certain electrical devices may be automatically blocked and operation boundaries may be set; for example, some home device function cards 721 may be locked, and prompts such as "device is locked" and "please ask dad or mom to help unlock it" shown in fig. 7b may be displayed. The display style of the child mode desktop 720 may be cartoon-like, a learning channel 722 may be displayed for the child to watch or listen to, and some smart home devices may be automatically controlled to enter a child safety mode; for example, the upper temperature limit of the water dispenser at home may be adjusted to 37 °C or below, so that the child can safely drink water by tapping "I want to drink water" 723. Accordingly, the personalized desktop displayed by the central control device 100-2 based on the received display data may be the child mode desktop 720 shown in fig. 7b, and the like, which is not limited herein.
Referring to fig. 7b, when the central control device 100-2 displays the child mode desktop 720, a prompt 724 indicating that the user identification is successful may also be displayed on the desktop, for example, "little owner identified, child mode entered" shown in fig. 7b. There is no limitation in this regard.
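A hypothetical sketch of such a child-mode policy, combining locked device cards, a temperature cap pushed to the water dispenser, and extra child-oriented cards; the device names, the policy structure, and the send_command callback are assumptions for illustration.

CHILD_MODE_POLICY = {
    "locked_cards": ["oven", "range-hood", "washing-machine"],
    "device_limits": {"water-dispenser": {"max_temp_c": 37}},
    "extra_cards": ["learning-channel", "i-want-to-drink-water"],
}

def apply_child_mode(desktop_cards: list, send_command) -> list:
    # Return the adjusted card list and push safety limits to home devices.
    cards = []
    for card in desktop_cards:
        locked = card in CHILD_MODE_POLICY["locked_cards"]
        cards.append({"card": card, "locked": locked})   # locked cards show "device is locked"
    for device, limits in CHILD_MODE_POLICY["device_limits"].items():
        send_command(device, limits)                     # e.g. cap the water dispenser at 37 °C
    return cards + [{"card": c, "locked": False} for c in CHILD_MODE_POLICY["extra_cards"]]

# Example: the oven card gets locked and the temperature cap is pushed out.
result = apply_child_mode(["oven", "music"], lambda dev, limits: print(dev, limits))
print(result)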
It will be appreciated that, in some embodiments, the central control device 100-2 may also display card information or control windows of associated home devices in the space where the central control device 100-2 is currently located when performing desktop rendering display based on the received personalized display data, which is not limited herein.
606: based on the local display data, the local desktop is displayed. After the central control apparatus 100-2 performs this step, the following step 607 may be continued, and an operation input of the user on the desktop displayed by the central control apparatus 100-2 is received.
Illustratively, if the central control device 100-2 confirms that the collected user personal information does not match the registered user identity information when performing the above step 602, or the central control device 100-2 confirms that the matched user identity information does not associate with the common central control device when performing the above step 603, the native desktop, that is, the desktop that is rendered and displayed based on the native display data, may be displayed. It may be understood that the native desktop of the central control device 100-2 may be a factory default desktop of the device, or may be a desktop that dynamically adjusts display based on the associated home device in the space where the central control device 100-2 is currently located, which is not limited herein.
607: responsive to user related operations on the desktop, and performing corresponding processing.
For example, the user may view subscribed value-added service information on a desktop displayed by the central control device 100-2, operate to play music, control to turn on a large-screen television, or click on a full-house service card such as a full-house ventilation, etc., and the central control device 100-2 may display a corresponding application interface (e.g. display a music playing interface) in response to the operation of the user, or send related control instructions to the associated home device, such as sending an on instruction to the large-screen television, and an instruction to play specified video content, etc.
608: and detecting that the user leaves for more than a preset time, disconnecting the data stream and extinguishing the screen.
Illustratively, the central control device 100-2 may detect, based on the distance sensor, that the user has moved away. When the user remains beyond a preset distance for more than a preset time, the central control device 100-2 may confirm that the user no longer needs the current desktop; at this time, the central control device 100-2 may disconnect the display data flow channel with the central control device 100-1 and turn off the screen to enter the sleep state.
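For illustration, step 608 could be sketched as a presence monitor that tears down the flow channel and turns the screen off once the user stays beyond a preset distance for longer than a preset time; the thresholds and the callback names are assumptions.

import time

PRESET_DISTANCE_M = 2.0
PRESET_LEAVE_SECONDS = 30.0

def monitor_user_presence(read_distance, disconnect_channel, screen_off, poll_s=1.0):
    # Poll the distance sensor; once the user has stayed beyond the preset
    # distance for the preset time, close the flow channel and sleep the screen.
    away_since = None
    while True:
        if read_distance() > PRESET_DISTANCE_M:
            if away_since is None:
                away_since = time.monotonic()
            if time.monotonic() - away_since >= PRESET_LEAVE_SECONDS:
                disconnect_channel()   # tear down the display data flow channel
                screen_off()           # enter the sleep state
                return
        else:
            away_since = None          # the user came back; reset the timer
        time.sleep(poll_s)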
It can be understood that, based on the implementation flow shown in fig. 6, when the user performs a control operation on the unaccustomed central control device 100-2, that is, when the user is currently in front of the central control device 100-2, the central control device 100-2 can identify the user identity and acquire the personalized desktop display data from the user's commonly used central control device 100-1, and the central control device 100-2 can then display a personalized desktop that matches the user's preferences or usage habits. In this way, the user's operations are facilitated and the user experience is improved.
Fig. 8 is a schematic diagram of an implementation flow of registering user identity information on a common central control device and setting the common central control device associated with the user identity information according to an embodiment of the present application. It should be understood that the flow shown in fig. 8 does not constitute a specific limitation on the implementation flow of the embodiment of the present application, and in other embodiments, the implementation sequence of the steps shown in fig. 8 may be adjusted according to the needs, which is not limited herein.
As shown in fig. 8, the flow includes the steps of:
801: initializing devices, and logging in a main account on any central control device.
For example, after the central control devices are installed in the whole house, the user may connect each central control device to a unified trusted network and log in to the main account, for example, connect them to the same WiFi network in a unified manner; a trusted relationship, a communication channel (for example, Hilink), and a data transmission channel (for example, a distributed soft bus) may then be established between the central control devices.
It can be understood that in some full-house smart home scenarios, a main account may be configured in a full-house scenario, so that each central control device in the full-house scenario is managed, and relevant operation result data corresponding to a setting operation or a device management operation performed by a user after logging in the main account may be synchronized to each central control device. For example, after logging in the main account on any central control device, the user can uniformly set a default desktop for each central control device, add home devices to be controlled, and the like. In other embodiments, the user may log in the main account on an electronic device other than the central control device to manage each central control device, for example, log in the main account on a tablet computer or a notebook computer, which is not limited herein.
802: a family group is created and sub-accounts for family members are created.
Illustratively, after logging into the main account, the user may create a family group to add/set up multiple family member sub-accounts, e.g., create M sub-accounts for M family members, respectively. In this manner, each family member may make some personalized settings by logging into a sub-account.
It will be appreciated that in the embodiments of the present application, for convenience of description, the family members in the flow shown in fig. 8 are the users of the central control devices, and the two terms are not distinguished unless otherwise specified.
803: setting sub-account information of common family members of each central control device.
Illustratively, in a full-house smart home scenario, each central control device may be set with the sub-account information of the family member who commonly uses it. For example, if N central control devices in a full-house scenario are distributed in different rooms, the central control device in each room may be set with the sub-account information of the family member residing in that room as its common sub-account, thereby establishing an association relationship between each central control device and the family member.
It will be appreciated that in other embodiments, a room in a full-house scenario may have multiple common family members, where multiple family member sub-account information may also be provided on a central control device, which is not limited herein.
804: and establishing an association relationship between the equipment information of each central control equipment and the sub-account information of the family member.
The device information of each central control device is, for example, a device ID, and the association relationship between each central control device and the sub-account information of the family member may be, for example, a one-to-one or one-to-many relationship, which is not limited herein.
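A minimal sketch of step 804, assuming the association is kept as a mapping from device ID to one or more family-member sub-accounts; the identifiers below are illustrative only.

from collections import defaultdict

device_to_subaccounts = defaultdict(set)   # device ID -> set of sub-accounts (one-to-one or one-to-many)

def associate(device_id: str, sub_account: str) -> None:
    device_to_subaccounts[device_id].add(sub_account)

# Bedroom screen used by one member, living-room screen shared by two members.
associate("cc-bedroom-1", "sub-account-dad")
associate("cc-living-room", "sub-account-dad")
associate("cc-living-room", "sub-account-mom")
print(dict(device_to_subaccounts))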
805: each family member registers identity information on the associated common central control device.
For example, each family member may log in to the sub-account on the associated common device and register an item such as face data, fingerprint data, or voiceprint data for identifying the user identity as identity information, which is used to dynamically identify the user identity so as to match the device information of the corresponding common central control device when the user uses any central control device.
806: the identity information of each family member is associated with a set sub-account of the family member.
It can be understood that after each family member logs in the sub-account on the common central control device and registers the identity information, the identity information can be associated with the corresponding logged-in sub-account information of the family member, and further can be associated with the corresponding central control device information.
It will be appreciated that the process of establishing the association between the user and the common central control device is not limited to the processes described in steps 804 to 806. In other embodiments, the central control device may actively establish an association relationship with the user identity information with the highest frequency of use according to the frequency of using the central control device by each user. Or the central control equipment can acquire the position of the user through the handheld equipment and the like of the user, and accordingly the duration of the user in the room of the central control equipment is counted, so that an association relationship is actively established with the user in the room of the central control equipment, namely the identity information of the user is actively associated. There is no limitation in this regard.
Referring to fig. 9, taking registered identity information as face data as an example, each family member sub-account may be associated with face data of each family member on the one hand, and central control equipment commonly used by each family member on the other hand. Thus, when a user uses a certain central control device to perform face recognition to determine the identity of the user, the central control device can be quickly matched with the central control device corresponding to the associated family sub-account based on the association corresponding relation shown in fig. 9, and further the device information of the central control device is obtained.
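The correspondence of fig. 9 can be illustrated, under the same assumptions, as two look-ups chained together: recognized face data to sub-account, then sub-account to the common central control device. The mapping names below are illustrative, not the disclosed data model.

face_to_subaccount = {"face-template-dad": "sub-account-dad"}
subaccount_to_device = {"sub-account-dad": "cc-bedroom-1"}

def common_device_for_face(face_template_id: str):
    # Map a recognized face template to the device ID of the member's
    # commonly used central control device, if any.
    sub_account = face_to_subaccount.get(face_template_id)
    return subaccount_to_device.get(sub_account) if sub_account else None

print(common_device_for_face("face-template-dad"))  # -> "cc-bedroom-1"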
807: and storing and synchronizing the information of each central control device, the sub-account information of the associated family member and the identity information of the associated family member on each central control device.
For example, after the user performs the above steps 804 to 806 to complete the establishment of the association relationship among the central control device information, the sub-account information of the family member, and the identity information of the family member, the associated central control device information, family member sub-account information, and family member identity information may be stored in the distributed database of the corresponding central control device, and synchronization between the central control devices may then be completed based on the Hilink channel and the like. In this way, the security of the privacy data in the information, such as the user identity information, synchronized between the central control devices can be ensured.
In other embodiments, the information of each central control device, the sub-account information of the associated family member, and the identity information of the associated family member may also be synchronized among the central control devices in other manners, for example, uniformly sent to a certain central control device in the smart home scene for management and synchronization among the central control devices, or uploaded to the cloud for uniform management and synchronization among the central control devices, which is not limited herein.
Fig. 10 is a schematic diagram illustrating an interaction flow of a display control method according to an embodiment of the present application. It will be appreciated that the flow illustrated in fig. 10 relates to interactions between the central control device 100-1 and the central control device 100-2, wherein the central control device 100-1 is, for example, a common device of a user, such as a central control screen of a bedroom of the user; the central control device 100-2 may be a central control device of other rooms in the home environment, such as a central control screen of a living room, which is not limited herein.
Specifically, as shown in fig. 10, the flow includes the steps of:
1001: the center control device 100-1 stores face information registered by an associated user and sets as a common center control device for the user.
By way of example, a user may register personal identity information on a common central control device, such as central control device 100-1, which may include, for example, face information, fingerprint information, voiceprint information, etc., without limitation. Taking face information as an example, when a user registers face information on the central control device 100-1, association relation binding between the face information of the user and device information (for example, device id) of the central control device 100-1 can be completed, which is equivalent to marking the central control device 100-1 in a home environment as a common central control device of the user.
The detailed process of registering the user identity information on the central control device 100-1 and establishing the association relationship between the user identity and the common central control device may refer to the related descriptions in steps 803 to 806 in the flow shown in fig. 8, which are not described herein.
1002: the central control apparatus 100-1 synchronizes registered user identity information with the central control apparatus 100-2, and an association relationship between the central control apparatus 100-1 and the user identity information.
Illustratively, the central control device 100-1 may implement synchronization of static device information based on the Hilink channel between information hubs, including synchronization of registered user identity information, association between the central control device 100-1 and the user identity information, and the like.
In other embodiments, the central control device 100-1 synchronizes the registered user identity information and the static device information such as the association relationship between the central control device 100-1 and the user identity information with the central control device 100-2, which may also be implemented by other communication technologies, for example, by WiFi, bluetooth, etc., which is not limited herein.
It may be appreciated that, in the embodiment of the present application, the central control device 100-1 synchronizing the registered user identity information and the association relationship between the central control device 100-1 and the user identity information to the central control device 100-2 is merely an example. In other embodiments, the central control device 100-1 may also upload the acquired user identity information and the association relationship between the central control device 100-1 and the user identity information to the cloud, so that the cloud determines the device information of the central control device 100-1 corresponding to the user personal information collected by the central control device 100-2 and feeds it back to the central control device 100-2, and the like, which is not limited herein.
1003: the central control apparatus 100-1 stores personalized display data set for a desktop by a user.
Specifically, the process of editing the desktop of the common central control device 100-1 by the user may refer to the related description in step 604, and the personalized desktop display style set by the user on the common central control device 100-1 may refer to the description in fig. 7a or fig. 7b, which are not described herein.
1004: the central control apparatus 100-2 detects that the user is approaching.
For example, the central control device 100-2 may detect that the user approaches based on the distance sensor 180C, etc., and the specific process may refer to the related description in the above step 601, which is not repeated herein.
1005: the central control apparatus 100-2 triggers face recognition.
Illustratively, the central control apparatus 100-2 may activate the camera 193 to collect face information for face recognition after detecting that the user is approaching. In other embodiments, the central control device 100-2 may trigger other identification methods, such as fingerprint identification, after detecting the approach of the user.
1006: the central control apparatus 100-2 detects face information.
Illustratively, the central control apparatus 100-2 may detect the currently approaching user face information after activating the camera 193, i.e., a process of collecting face information to be recognized. The specific process of collecting the face information may refer to the related description in step 601, which is not described herein.
It should be noted that the type of personal information of the user collected when the central control device 100-2 performs this step may correspond to the type of identity information of the user registered on the central control device 100-1 and synchronized to the central control device 100-2 in the above steps 1001 to 1002.
1007: the center control device 100-2 determines whether the detected face information is registered. If yes, indicating that the detected face information is registered, continuing to execute a step 1008 to match the common central control equipment; if the determination result is no, indicating that the detected face information is not registered, the following step 1012 may be executed to display the local desktop.
Illustratively, the central control device 100-2 may compare the detected face information with the face data of the registered family members (or users) acquired from the distributed database to determine whether the detected face information is registered. The specific determination process may refer to the description of step 602, which is not repeated here.
1008: the center control device 100-2 determines whether there is a common center control device that matches the detected face information. If the judgment result is yes, indicating that the current user has set the common central control equipment, continuing to execute the following step 1009 to acquire personalized display data on the common central control equipment; if the result of the determination is negative, indicating that the current user does not set the common central control device, the following step 1012 may be executed to display the local desktop.
The specific determination process may refer to the description of step 603, which is not repeated herein.
1009: the central control device 100-2 transmits a request for acquiring display data corresponding to the personalized desktop to the central control device 100-1.
1010: the central control device 100-1 transmits personalized desktop display data to the central control device 100-2.
For example, after determining that the current user has registered face data and has set the common central control device, the central control device 100-2 may acquire device information of the common central control device, for example, a device ID of the central control device 100-1, and then may send a display data acquisition request to the central control device 100-1 based on the device information, and then the central control device 100-1 may return, in response to the acquisition request, display data corresponding to the personalized desktop set by the user to the central control device 100-2.
The process for specifically acquiring the personalized display data may refer to the description of step 604, which is not described herein.
1011: based on the received display data, the personalized desktop is displayed.
For example, the central control device 100-2 may perform rendering based on the received display data corresponding to the personalized desktop, so as to display the personalized desktop that accords with the preference and the use habit of the current user for the current user, thereby effectively improving the use experience of the user. The process of displaying the personalized desktop specifically may also refer to the related description in step 605, which is not described herein.
1012: based on the local display data, the local desktop is displayed.
Illustratively, the central control device 100-2 may display the local desktop based on the local display data, such as displaying the universal desktop based on a default setting, after determining that the current user does not register face data, or that the common central control device is not set.
The process of displaying the local desktop specifically may also refer to the related description in step 606, which is not repeated here.
1013: responding to the related operation of the user on the desktop, and executing corresponding processing.
The process of responding to the related operation performed by the user on the displayed desktop by the central control device 100-2 may refer to the related description in step 607, which is not described herein.
It will be appreciated that, in some embodiments, the user may also adjust the desktop display style on the desktop displayed by the central control device 100-2. The central control device 100-2 may generate a corresponding operation instruction from the received user operation and send it to the central control device 100-1 through the distributed soft bus for response, that is, remotely control the central control device 100-1 to respond to the operation instruction so as to change the display data of the corresponding personalized desktop. If the desktop displayed by the central control device 100-2 is a personalized desktop displayed based on the received display data, the central control device 100-2 may save the display data changed by the user's adjustment of the desktop display style to the distributed database of the central control device 100-1 in real time, based on the display data flow channel (for example, the distributed soft bus) between the central control device 100-2 and the central control device 100-1, so as to update the data source rendered and displayed on the central control device 100-2, that is, the personalized desktop display data stored on the central control device 100-1. There is no limitation in this regard.
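A hypothetical sketch of this write-back path, assuming a flow-channel object on the current device and a plain dictionary standing in for the stored display data on the common device; both interfaces are assumptions for illustration.

def on_desktop_adjusted(edit: dict, flow_channel) -> None:
    # Runs on the current device (100-2): forward the user's adjustment in real time.
    flow_channel.send({"type": "update_display_data", "edit": edit})

def on_update_received(message: dict, stored_display_data: dict) -> dict:
    # Runs on the common device (100-1): apply the edit and persist the result,
    # so that 100-1 remains the data source rendered on 100-2.
    stored_display_data.update(message["edit"])
    return stored_display_data

# Example: the user switches to dark mode on 100-2; 100-1 updates its stored data.
stored = {"theme": "light", "font_size": "large"}
print(on_update_received({"type": "update_display_data", "edit": {"theme": "dark"}}, stored))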
1014: and detecting that the user leaves for more than a preset time, disconnecting the data stream and extinguishing the screen.
The process of disconnecting the data stream and extinguishing the screen by the central control apparatus 100-2 may refer to the related description in step 608, and will not be described herein.
Reference in the specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one example implementation or technique disclosed in accordance with embodiments of the present application. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment.
The disclosure of the embodiments of the present application also relates to an apparatus for performing the operations herein. The apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer-readable medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magneto-optical disks, Read-Only Memories (ROMs), Random Access Memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, Application Specific Integrated Circuits (ASICs), or any type of media suitable for storing electronic instructions, each of which may be coupled to a computer system bus. Furthermore, the computers referred to in the specification may include a single processor or may be architectures employing multiple processors for increased computing power.
Additionally, the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the disclosed subject matter. Accordingly, the present application example disclosure is intended to be illustrative, but not limiting, of the scope of the concepts discussed herein.

Claims (19)

1. A display control method applied to an intelligent home system, wherein the intelligent home system comprises a first electronic device and a second electronic device with screens, and the method is characterized by comprising the following steps:
the first electronic device identifies the current first user;
the first electronic device obtains first display data corresponding to the first user from the second electronic device according to the identification result, and displays a first control interface based on the first display data, wherein the first display data is associated with the use data of the second electronic device by the first user.
2. The method of claim 1, wherein the first control interface comprises a plurality of controls for controlling smart home devices in a smart home system.
3. The method of claim 1, wherein the smart home system further comprises a third electronic device having a screen, and the method further comprises:
the first electronic device identifies the current second user;
and the first electronic device acquires second display data corresponding to the second user from the third electronic device according to the identification result, and displays a second control interface based on the second display data, wherein the second display data is associated with the use data of the third electronic device by the second user.
4. The method of claim 1, wherein the first electronic device obtaining first display data corresponding to the first user from a second electronic device according to the recognition result, comprising:
acquiring first user data for identifying the first user;
determining device information of the second electronic device based on a first association relationship between the first user data and the second electronic device;
and sending a data acquisition request to the second electronic device based on the device information of the second electronic device so as to acquire the first display data.
5. The method of claim 4, wherein the first electronic device obtains the first user data by:
detecting user interaction, and collecting first user data of the first user; or,
The first user data is received from a third electronic device.
6. The method of claim 5, wherein the detecting of the user interaction comprises any one of:
detecting that the user approaches;
detecting touch operation of a user on the screen of the first electronic device;
and detecting a pressing operation by the user on a key of the first electronic device.
7. The method of any of claims 4 to 6, wherein the first user data comprises:
any one of face recognition data, fingerprint recognition data, voiceprint recognition data.
8. The method of claim 4, wherein the determining device information for the second electronic device based on a first association between the first user data and the second electronic device comprises:
and determining the equipment information of the second electronic equipment based on the first association relation obtained in advance from the second electronic equipment.
9. The method of claim 8, wherein the manner in which the second electronic device generates the first association comprises any one of:
generating the first association relation based on the detected operation of the first user for inputting first user data and the operation of setting the second electronic device as a common device associated with the first user;
generating the first association relation based on the frequency of use of the second electronic device by the first user;
and generating the first association relation based on a statistical duration for which the position of the first user and the position of the second electronic device are both within a first space, wherein the position of the first user is determined through position data collected by a fourth electronic device.
10. The method of claim 7, wherein the first electronic device pre-acquiring the first association from the second electronic device comprises:
the first electronic equipment receives the first association relation synchronized by the second electronic equipment through an open internet of things communication protocol; or,
the first electronic device sends an acquisition request to the second electronic device through an open internet of things communication protocol, and receives the first association relation returned by the second electronic device in response to the acquisition request.
11. The method of claim 10, wherein the open internet of things communication protocol is a HiLink communication protocol.
12. The method of any of claims 1 to 11, wherein the second electronic device comprises a second control interface, and wherein the first user's usage data for the second electronic device comprises at least one of:
data acquired according to the operation habit of the first user; or,
data acquired according to the first user's adjustment of the display content of the second control interface.
13. The method according to any one of claims 1 to 12, further comprising:
before the first electronic device displays a first control interface based on the first display data, the first electronic device displays a third control interface;
the displaying a first control interface based on the first display data includes: and based on the first display data, displaying the third control interface as the first control interface in a switching mode.
14. The method of any one of claims 1 to 13, wherein the first electronic device and the second electronic device establish a data transmission channel based on any one of WiFi, Bluetooth, or a distributed soft bus, the data transmission channel being used to transmit the first display data from the second electronic device to the first electronic device.
15. The method of any of claims 1 to 13, wherein the first control interface is an interface of a desktop application and the first display data comprises at least one of:
application data of the desktop application, application data of a first application that interacts with the desktop application, and device data of a smart home device correspondingly controlled by a control on the first control interface.
16. A display control method applied to an intelligent home system, wherein the intelligent home system comprises a first electronic device and a second electronic device with screens, and the method is characterized by comprising the following steps:
the method comprises the steps that a second electronic device receives a request for acquiring first display data from a first electronic device, wherein the first display data are used for displaying a first control interface of the first electronic device, and the first display data are generated by the second electronic device according to usage data of a first user on the second electronic device;
and responding to the acquisition request, and sending the first display data to the first electronic device by the second electronic device.
17. The method of claim 16, wherein the second electronic device includes a second control interface, and wherein the usage data of the second electronic device includes:
the second electronic equipment is based on the data acquired by the operation habit of the first user; or,
And the second electronic equipment is based on the data acquired by the user for adjusting the display content of the second control interface.
18. An electronic device, comprising: one or more processors; one or more memories; the one or more memories store one or more programs that, when executed by the one or more processors, cause the electronic device to perform the display control method of any of claims 1-15, or the display control method of claim 16 or 17.
19. A computer-readable storage medium having stored thereon instructions that, when executed on a computer, cause the computer to perform the display control method of any one of claims 1 to 15 or to perform the display control method of claim 16 or 17.