CN116257201A - Content collaboration method, electronic device, and computer-readable storage medium

Info

Publication number
CN116257201A
Authority
CN
China
Prior art keywords
electronic device
target content
electronic
state
electronic equipment
Prior art date
Legal status
Pending
Application number
CN202111510555.7A
Other languages
Chinese (zh)
Inventor
徐杰
龙嘉裕
胡靓
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202111510555.7A
Priority to PCT/CN2022/136632 (WO2023103974A1)
Publication of CN116257201A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 Multiprogramming arrangements
    • G06F 9/50 Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F 9/5005 Allocation of resources, e.g. of the central processing unit [CPU] to service a request
    • G06F 9/5027 Allocation of resources, e.g. of the central processing unit [CPU] to service a request, the resource being a machine, e.g. CPUs, Servers, Terminals
    • G06F 9/5038 Allocation of resources, e.g. of the central processing unit [CPU] to service a request, the resource being a machine, considering the execution order of a plurality of tasks, e.g. taking priority or time dependency constraints into consideration
    • G06F 2209/00 Indexing scheme relating to G06F 9/00
    • G06F 2209/50 Indexing scheme relating to G06F 9/50
    • G06F 2209/5021 Priority

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application is applicable to the technical field of terminals, and particularly relates to a content collaboration method, an electronic device, and a computer-readable storage medium. In the method, when a first electronic device displays target content, the first electronic device determines a second electronic device and sends a processing request to the second electronic device, to request collaborative processing of the target content through the second electronic device. The second electronic device may display prompt information based on the processing request. When the user wants to collaboratively process the target content through the second electronic device, a corresponding operation may be performed on the second electronic device. Based on that operation, the second electronic device sends an acquisition request to the first electronic device, obtains the target content or an identifier of the target content, and can quickly display the target content based on what it receives, without the user having to search for the target content on the second electronic device. This simplifies the operation process, makes it convenient to collaboratively process the target content through the second electronic device, and improves the user experience.

Description

Content collaboration method, electronic device, and computer-readable storage medium
Technical Field
The application belongs to the technical field of terminals, and particularly relates to a content collaboration method, electronic equipment and a computer readable storage medium.
Background
At present, small-screen devices such as smart watches have increasingly comprehensive functions, such as message viewing, sports and health monitoring with data statistics, and electronic payment. However, due to size limitations, the display screen of a small-screen device such as a smart watch is generally small, its display and touch capabilities are weak, and many complex tasks cannot be performed on it. When a user views content on a small-screen device such as a smart watch and wants to process that content (for example, reply to a message), the user needs to open the corresponding application on a large-screen device such as a mobile phone or tablet computer to find the content, or enter the notification center to find it, and only then process it. The operation process is cumbersome, which affects the user experience.
Disclosure of Invention
The embodiments of the application provide a content collaboration method, an electronic device, and a computer-readable storage medium, which can address the problem in the prior art that processing a message received by a small-screen device is cumbersome and degrades the user experience.
In a first aspect, an embodiment of the present application provides a content collaboration method, applied to a first electronic device, where the method may include:
the first electronic device displays target content;
the first electronic device determines a second electronic device and sends a processing request to the second electronic device, wherein the processing request is used for requesting cooperative processing of the target content through the second electronic device;
in response to an acquisition request from the second electronic device, the first electronic device sends the target content or an identifier of the target content to the second electronic device; where, after the second electronic device obtains the processing request, prompt information is displayed in a display interface of the second electronic device.
Through this content collaboration method, when the first electronic device displays the target content, the first electronic device can determine the second electronic device and send a processing request to it, so as to request collaborative processing of the target content through the second electronic device. After the second electronic device receives the processing request, prompt information can be displayed in its display interface to prompt the user to collaboratively process the target content through the second electronic device. When the user wants to do so, the user can perform a corresponding operation on the second electronic device. When the second electronic device detects that operation, it may send an acquisition request to the first electronic device to obtain the target content or an identifier of the target content. After receiving the acquisition request, the first electronic device may send the target content or its identifier to the second electronic device. After the second electronic device receives the target content or the identifier, it can quickly display the received target content, or quickly display the target content it already stores according to the identifier, so the user does not need to search for the target content on the second electronic device. The operation process is thereby simplified, which makes it convenient for the user to collaboratively process the target content through the second electronic device and improves the user experience.
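For illustration only, the following is a minimal Kotlin sketch of the first-device flow described above. The message classes, the Transport interface, and the selection callback are assumptions made for this sketch; the embodiments do not prescribe any particular programming language, transport, or API.

```kotlin
// Hypothetical sketch of the first-device flow: display target content, determine a
// second device, send a processing request, and answer the later acquisition request.
data class ProcessingRequest(val contentId: String, val appPackage: String)
data class AcquisitionRequest(val contentId: String, val wantIdentifierOnly: Boolean)

interface Transport {
    fun send(deviceId: String, message: Any)   // assumed device-to-device channel
}

class FirstDeviceCollaborator(
    private val transport: Transport,
    private val selectSecondDevice: () -> String?   // device-selection policy, sketched further below
) {
    private var displayedContentId: String? = null

    // Called when the first electronic device starts displaying the target content.
    fun onTargetContentDisplayed(contentId: String, appPackage: String) {
        displayedContentId = contentId
        val secondDevice = selectSecondDevice() ?: return
        // Request collaborative processing of the target content through the second device.
        transport.send(secondDevice, ProcessingRequest(contentId, appPackage))
    }

    // Called when the second electronic device answers with an acquisition request.
    fun onAcquisitionRequest(
        fromDeviceId: String,
        request: AcquisitionRequest,
        loadContent: (String) -> ByteArray
    ) {
        val contentId = displayedContentId ?: return
        if (request.wantIdentifierOnly) {
            transport.send(fromDeviceId, contentId)               // second device already stores the content
        } else {
            transport.send(fromDeviceId, loadContent(contentId))  // send the target content itself
        }
    }
}
```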
Illustratively, the first electronic device determining the second electronic device may include:
the first electronic device obtains a display duration of the target content;
when the display duration reaches a first preset duration, the first electronic device determines the second electronic device.
In the content collaboration method provided by this implementation, when the first electronic device displays the target content, it can obtain the display duration of the target content and use it to identify the user's operation intention, that is, to determine from the display duration whether the user wants to process the target content. This reduces invalid content transmission by the first electronic device, reduces interference caused to the user, and improves the user experience.
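A possible sketch of this display-duration trigger, using a coroutine timer; the threshold value and the callback names are assumptions.

```kotlin
// Illustrative display-duration trigger: the second device is only determined after the
// target content has stayed on screen for a first preset duration.
import kotlinx.coroutines.*

class DisplayDurationTrigger(
    private val firstPresetDurationMs: Long = 3_000,   // assumed example value
    private val determineSecondDevice: () -> Unit
) {
    private var job: Job? = null

    fun onContentShown(scope: CoroutineScope) {
        job?.cancel()
        job = scope.launch {
            delay(firstPresetDurationMs)   // display duration reached the first preset duration
            determineSecondDevice()        // the user likely intends to process the content
        }
    }

    fun onContentDismissed() {
        job?.cancel()                      // content closed early: no collaboration request is sent
    }
}
```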
In one possible implementation manner, the determining, by the first electronic device, the second electronic device may include:
the first electronic device obtains a distance to at least one third electronic device, where the at least one third electronic device is an electronic device communicatively connected to the first electronic device;
the first electronic device determines the second electronic device from the at least one third electronic device according to the distance.
In the content collaboration method provided by this implementation, the first electronic device may determine the second electronic device from one or more electronic devices whose distance is less than or equal to a preset distance threshold, so as to select, according to distance, the electronic device the user is most likely to use as the second electronic device, which makes it convenient for the user to collaboratively process the target content through the second electronic device. The preset distance threshold can be set by a technician according to the actual scenario, or customized by the user according to actual needs. For example, the user may set the preset distance threshold to 10 cm, 15 cm, or 30 cm according to their own habits of using electronic devices.
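A minimal sketch of this distance-based selection, assuming a simple candidate record and an example threshold value:

```kotlin
// Keep only connected devices within the preset distance threshold and pick the nearest.
data class CandidateDevice(val id: String, val distanceMeters: Double)

fun pickByDistance(
    candidates: List<CandidateDevice>,
    presetDistanceThresholdMeters: Double = 0.3   // e.g. 30 cm; user-configurable per the description
): CandidateDevice? =
    candidates
        .filter { it.distanceMeters <= presetDistanceThresholdMeters }
        .minByOrNull { it.distanceMeters }
```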
For example, the determining, by the first electronic device, the second electronic device from the at least one third electronic device according to the distance may include:
the first electronic equipment acquires the type of at least one third electronic equipment and determines a first priority corresponding to the type of each third electronic equipment;
the first electronic device determines the second electronic device from the at least one third electronic device according to the distance and the first priority.
In the content collaboration method provided by the implementation manner, the first electronic device can determine the electronic device which is most convenient for the user to operate as the second electronic device according to the first priority corresponding to the type, so that the user can conveniently conduct collaboration processing on the target content through the second electronic device.
It should be appreciated that the first priority corresponding to a type may be set by a technician according to the actual situation, or customized by the user. For example, the user may customize the first priorities for the types as follows: the first priority corresponding to a mobile phone is higher than the first priority corresponding to a tablet computer, and the first priority corresponding to a notebook computer is higher than the first priority corresponding to a desktop computer.
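A sketch of combining the first priority with distance; the numeric priority table is an assumed encoding (the description only gives partial orderings between device types):

```kotlin
// Prefer the device type with the highest first priority; break ties by distance.
data class TypedCandidate(val id: String, val type: String, val distanceMeters: Double)

val firstPriorityByType = mapOf(   // higher value = higher first priority (assumed encoding)
    "phone" to 4, "tablet" to 3, "laptop" to 2, "desktop" to 1
)

fun pickByTypePriority(candidates: List<TypedCandidate>): TypedCandidate? =
    candidates.sortedWith(
        compareByDescending<TypedCandidate> { firstPriorityByType[it.type] ?: 0 }
            .thenBy { it.distanceMeters }
    ).firstOrNull()
```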
For example, the determining, by the first electronic device, the second electronic device from the at least one third electronic device according to the distance may include:
the first electronic device determines a user account logged in on the third electronic device;
the first electronic device determines the second electronic device from the at least one third electronic device according to the distance and the user account; or,
the first electronic device determines the second electronic device from the at least one third electronic device according to the distance, the user account, and the first priority corresponding to the type of the third electronic device.
In the content collaboration method provided by this implementation, when determining the second electronic device according to the distance, or according to the distance and the priority corresponding to the type, the first electronic device can also take into account the user account logged in on the third electronic device. The finally determined second electronic device is then an electronic device belonging to the same user as the first electronic device, or an electronic device authorized by the user of the first electronic device to control the first electronic device. This prevents the target content from being sent to other users' electronic devices, ensures the privacy of the target content, and protects the user's privacy.
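A sketch of adding the account check on top of distance and type priority; the field names and the set of authorized accounts are assumptions:

```kotlin
// Only devices logged in to the same account (or an authorized account) remain candidates.
data class AccountCandidate(
    val id: String,
    val account: String,
    val typePriority: Int,
    val distanceMeters: Double
)

fun pickSameAccountDevice(
    candidates: List<AccountCandidate>,
    ownAccount: String,
    authorizedAccounts: Set<String> = emptySet()
): AccountCandidate? =
    candidates
        .filter { it.account == ownAccount || it.account in authorizedAccounts }
        .sortedWith(
            compareByDescending<AccountCandidate> { it.typePriority }
                .thenBy { it.distanceMeters }
        )
        .firstOrNull()
```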
In another possible implementation manner, the determining, by the first electronic device, the second electronic device may include:
the first electronic device obtains a usage state of at least one third electronic device, where the at least one third electronic device is an electronic device communicatively connected to the first electronic device;
the first electronic device determines the second electronic device from the at least one third electronic device according to the use state.
For example, the determining, by the first electronic device, the second electronic device from the at least one third electronic device according to the usage state may include:
the first electronic device determines, among the at least one third electronic device, an electronic device whose usage state is a preset state, and determines the second electronic device according to the electronic device whose usage state is the preset state;
the preset state is any one of an operation state, a pickup state, a first screen-on state, a service state, and a second screen-on state. The operation state is a state in which the third electronic device has key input, touch input, keyboard input, or mouse input; the pickup state is a state in which the third electronic device is picked up; the first screen-on state is a state in which the screen-on time of the third electronic device is less than a second preset duration; the service state is a state in which the third electronic device is being used, other than the operation state; and the second screen-on state is a state in which the screen-on time of the third electronic device exceeds the second preset duration but the third electronic device is not in the operation state, the service state, or the pickup state.
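For illustration, the usage states above can be modeled as an enum together with a simple classifier; the signal fields and the example second preset duration are assumptions, since the embodiments do not fix how the states are detected:

```kotlin
enum class UsageState { OPERATION, PICKUP, FIRST_SCREEN_ON, SERVICE, SECOND_SCREEN_ON, IDLE }

data class DeviceSignals(
    val hasRecentInput: Boolean,          // key, touch, keyboard or mouse input
    val isPickedUp: Boolean,              // e.g. inferred from motion sensors
    val screenOn: Boolean,
    val screenOnDurationMs: Long,
    val runningForegroundService: Boolean // "being used" other than direct input
)

fun classify(signals: DeviceSignals, secondPresetDurationMs: Long = 10_000): UsageState = when {
    signals.hasRecentInput -> UsageState.OPERATION
    signals.isPickedUp -> UsageState.PICKUP
    signals.runningForegroundService -> UsageState.SERVICE
    signals.screenOn && signals.screenOnDurationMs < secondPresetDurationMs -> UsageState.FIRST_SCREEN_ON
    signals.screenOn -> UsageState.SECOND_SCREEN_ON
    else -> UsageState.IDLE
}
```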
For example, the determining, by the first electronic device, the electronic device whose usage state is the preset state in the at least one third electronic device, and determining, by the second electronic device according to the electronic device whose usage state is the preset state, may include:
when an electronic device whose usage state is the pickup state exists among the at least one third electronic device, the first electronic device determines whether the electronic device in the pickup state is located in the same hand as the first electronic device;
when the electronic device in the pickup state is located in the same hand as the first electronic device, the first electronic device determines the electronic device in the pickup state as the second electronic device.
For example, the determining, by the first electronic device, the electronic device whose usage state is the preset state in the at least one third electronic device, and determining, by the second electronic device according to the electronic device whose usage state is the preset state, may include:
when a plurality of electronic devices whose usage states are preset states exist among the at least one third electronic device, the first electronic device determines a second priority corresponding to each usage state, and determines the electronic device with the highest second priority as the second electronic device.
In the content collaboration method provided by the implementation manner, the first electronic device can determine the second electronic device according to the use state so as to determine the electronic device most likely to be used by the user currently as the second electronic device, so that the user can conveniently conduct collaboration processing on the target content in time through the second electronic device.
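A sketch of selection by usage state, including the same-hand pickup special case; the second-priority ordering and the same-hand signal are assumptions:

```kotlin
enum class State { OPERATION, PICKUP, FIRST_SCREEN_ON, SERVICE, SECOND_SCREEN_ON }

data class StateCandidate(val id: String, val state: State, val inSameHandAsFirstDevice: Boolean = false)

// Assumed second-priority order: earlier in the list = higher priority.
val secondPriority = listOf(
    State.OPERATION, State.PICKUP, State.FIRST_SCREEN_ON, State.SERVICE, State.SECOND_SCREEN_ON
)

fun pickByUsageState(candidates: List<StateCandidate>): StateCandidate? {
    // Special case: a picked-up device held in the same hand as the first device
    // (e.g. a watch) is taken directly as the second device.
    candidates.firstOrNull { it.state == State.PICKUP && it.inSameHandAsFirstDevice }
        ?.let { return it }
    // Otherwise pick the candidate whose usage state has the highest second priority.
    return candidates.minByOrNull { secondPriority.indexOf(it.state) }
}
```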
For example, the determining, by the first electronic device, the second electronic device from the at least one third electronic device according to the usage state may include:
the first electronic device obtains the distance between the first electronic device and the at least one third electronic device;
the first electronic device determines the second electronic device from the at least one third electronic device according to the distance and the use state.
In the content collaboration method provided by the implementation manner, the first electronic device can determine the second electronic device according to the use state and the distance, so that the electronic device which is most likely to be used by the user and is closest to the user is determined as the second electronic device, and therefore the user can conveniently conduct collaboration processing on the target content through the second electronic device.
For example, the determining, by the first electronic device, the second electronic device from the at least one third electronic device according to the usage state may include:
the first electronic device obtains the type of the at least one third electronic device and determines a first priority corresponding to the type of each third electronic device;
the first electronic device determines the second electronic device from the at least one third electronic device according to the usage state and the first priority; or,
the first electronic device determines the second electronic device from the at least one third electronic device according to the usage state, the first priority, and the distance between the first electronic device and the third electronic device.
In the content collaboration method provided by the implementation manner, the first electronic device may determine the second electronic device according to the priority corresponding to the use state and the type, or according to the priority corresponding to the use state, the distance and the type, so as to determine the electronic device which is most likely to be used by the user currently, is closest to the user and is most convenient for the user to operate as the second electronic device, thereby facilitating the user to conduct collaborative processing on the target content through the second electronic device.
For example, the determining, by the first electronic device, the second electronic device from the at least one third electronic device according to the usage state may include:
the first electronic device determines a user account logged in on the at least one third electronic device;
the first electronic device determines the second electronic device from the at least one third electronic device according to the usage state and the user account; or,
the first electronic device determines the second electronic device from the at least one third electronic device according to the usage state, the user account, and a first priority corresponding to the type of the third electronic device; or,
the first electronic device determines the second electronic device from the at least one third electronic device according to the usage state, the user account, and the distance between the first electronic device and the third electronic device; or,
the first electronic device determines the second electronic device from the at least one third electronic device according to the usage state, the user account, the first priority corresponding to the type of the third electronic device, and the distance between the first electronic device and the third electronic device.
In the content collaboration method provided by this implementation, when determining the second electronic device according to the usage state, or according to the usage state and the distance, or according to the usage state, the distance, and the priority corresponding to the type, the first electronic device can also take into account the user account logged in on the third electronic device. The finally determined second electronic device is then an electronic device belonging to the same user as the first electronic device, or an electronic device authorized by the user of the first electronic device to control the first electronic device. This prevents the target content from being sent to other users' electronic devices, ensures the privacy of the target content, and protects the user's privacy.
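The combinations above can be sketched as one ranking policy; the ordering of the tie-breakers is an assumption, since the description allows several different combinations of account, usage state, type priority, and distance:

```kotlin
// Filter by account, then rank by usage-state priority, then type priority, then distance.
data class FullCandidate(
    val id: String,
    val account: String,
    val statePriority: Int,   // second priority: higher = more likely currently in use
    val typePriority: Int,    // first priority: higher = more convenient to operate
    val distanceMeters: Double
)

fun pickSecondDevice(candidates: List<FullCandidate>, ownAccount: String): FullCandidate? =
    candidates
        .filter { it.account == ownAccount }
        .sortedWith(
            compareByDescending<FullCandidate> { it.statePriority }
                .thenByDescending { it.typePriority }
                .thenBy { it.distanceMeters }
        )
        .firstOrNull()
```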
In a second aspect, an embodiment of the present application provides a content collaboration method, applied to a first electronic device, where the method may include:
the first electronic device displays target content;
the first electronic device determines a second electronic device and sends a processing request and the target content to the second electronic device, or sends the processing request and an identifier of the target content to the second electronic device; the processing request is used to request collaborative processing of the target content through the second electronic device, where, after the second electronic device receives the processing request, prompt information is displayed on the second electronic device, and the prompt information is used to prompt collaborative processing of the target content through the second electronic device.
Through the content collaboration method, when the first electronic device displays the target content, the first electronic device can determine the second electronic device and can send the processing request and the target content to the second electronic device or send the processing request and the identification of the target content to the second electronic device. After the second electronic device receives the processing request, prompt information can be displayed in the display interface so as to prompt a user to cooperatively process the target content in the second electronic device. If the second electronic device detects the corresponding operation, the second electronic device can directly and quickly display the received target content, or can quickly display the target content stored by the second electronic device according to the identification of the target content, so that the user does not need to search the target content in the second electronic device, and the operation process can be simplified, thereby facilitating the user to cooperatively process the target content through the second electronic device and improving the user experience.
Illustratively, the first electronic device determining the second electronic device may include:
the first electronic equipment acquires the display duration of the target content;
and when the display time length reaches a first preset time length, the first electronic device determines the second electronic device.
In one possible implementation manner, the determining, by the first electronic device, the second electronic device may include:
the first electronic device obtains a distance to at least one third electronic device, where the at least one third electronic device is an electronic device communicatively connected to the first electronic device;
the first electronic device determines the second electronic device from the at least one third electronic device according to the distance.
For example, the determining, by the first electronic device, the second electronic device from the at least one third electronic device according to the distance may include:
the first electronic equipment acquires the type of at least one third electronic equipment and determines a first priority corresponding to the type of each third electronic equipment;
the first electronic device determines the second electronic device from the at least one third electronic device according to the distance and the first priority.
For example, the determining, by the first electronic device, the second electronic device from the at least one third electronic device according to the distance may include:
the first electronic device determines a user account logged in on the third electronic device;
the first electronic device determines the second electronic device from the at least one third electronic device according to the distance and the user account; or,
the first electronic device determines the second electronic device from the at least one third electronic device according to the distance, the user account, and the first priority corresponding to the type of the third electronic device.
In another possible implementation manner, the determining, by the first electronic device, the second electronic device may include:
the first electronic device obtains a usage state of at least one third electronic device, where the at least one third electronic device is an electronic device communicatively connected to the first electronic device;
the first electronic device determines the second electronic device from the at least one third electronic device according to the use state.
For example, the determining, by the first electronic device, the second electronic device from the at least one third electronic device according to the usage state may include:
the first electronic device determines, among the at least one third electronic device, an electronic device whose usage state is a preset state, and determines the second electronic device according to the electronic device whose usage state is the preset state;
the preset state is any one of an operation state, a pickup state, a first screen-on state, a service state, and a second screen-on state. The operation state is a state in which the third electronic device has key input, touch input, keyboard input, or mouse input; the pickup state is a state in which the third electronic device is picked up; the first screen-on state is a state in which the screen-on time of the third electronic device is less than a second preset duration; the service state is a state in which the third electronic device is being used, other than the operation state; and the second screen-on state is a state in which the screen-on time of the third electronic device exceeds the second preset duration but the third electronic device is not in the operation state, the service state, or the pickup state.
For example, the determining, by the first electronic device, the electronic device whose usage state is the preset state in the at least one third electronic device, and determining, by the second electronic device according to the electronic device whose usage state is the preset state, may include:
when an electronic device whose usage state is the pickup state exists among the at least one third electronic device, the first electronic device determines whether the electronic device in the pickup state is located in the same hand as the first electronic device;
when the electronic device in the pickup state is located in the same hand as the first electronic device, the first electronic device determines the electronic device in the pickup state as the second electronic device.
For example, the determining, by the first electronic device, the electronic device whose usage state is the preset state in the at least one third electronic device, and determining, by the second electronic device according to the electronic device whose usage state is the preset state, may include:
when a plurality of electronic devices whose usage states are preset states exist among the at least one third electronic device, the first electronic device determines a second priority corresponding to each usage state, and determines the electronic device with the highest second priority as the second electronic device.
For example, the determining, by the first electronic device, the second electronic device from the at least one third electronic device according to the usage state may include:
The first electronic device obtains the distance between the first electronic device and the at least one third electronic device;
the first electronic device determines the second electronic device from the at least one third electronic device according to the distance and the use state.
For example, the determining, by the first electronic device, the second electronic device from the at least one third electronic device according to the usage state may include:
the first electronic device obtains the type of the at least one third electronic device and determines a first priority corresponding to the type of each third electronic device;
the first electronic device determines the second electronic device from the at least one third electronic device according to the usage state and the first priority; or,
the first electronic device determines the second electronic device from the at least one third electronic device according to the usage state, the first priority, and the distance between the first electronic device and the third electronic device.
For example, the determining, by the first electronic device, the second electronic device from the at least one third electronic device according to the usage state may include:
the first electronic device determines a user account logged in on the at least one third electronic device;
the first electronic device determines the second electronic device from the at least one third electronic device according to the usage state and the user account; or,
the first electronic device determines the second electronic device from the at least one third electronic device according to the usage state, the user account, and a first priority corresponding to the type of the third electronic device; or,
the first electronic device determines the second electronic device from the at least one third electronic device according to the usage state, the user account, and the distance between the first electronic device and the third electronic device; or,
the first electronic device determines the second electronic device from the at least one third electronic device according to the usage state, the user account, the first priority corresponding to the type of the third electronic device, and the distance between the first electronic device and the third electronic device.
In a third aspect, an embodiment of the present application provides a content collaboration method, applied to a second electronic device, where the method may include:
the second electronic device obtains a processing request from the first electronic device, where the processing request is used to request collaborative processing, through the second electronic device, of target content displayed by the first electronic device;
the second electronic device displays prompt information according to the processing request;
the second electronic device detects a first preset operation on the prompt information and displays the target content.
Through this content collaboration method, when the second electronic device receives the processing request from the first electronic device, requesting that the second electronic device process the target content displayed by the first electronic device, the second electronic device can display prompt information in its display interface to prompt the user to collaboratively process the target content on the second electronic device. When the user wants to collaboratively process the target content through the second electronic device, the user can perform a first preset operation on the prompt information on the second electronic device. When the second electronic device detects the first preset operation, it can quickly display the target content, and the user does not need to search for the target content on the second electronic device. The operation process is thereby simplified, which makes it convenient for the user to collaboratively process the target content through the second electronic device and improves the user experience.
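A minimal Kotlin sketch of this second-device flow; the transport and UI callbacks are assumptions standing in for platform facilities:

```kotlin
// On a processing request, show a prompt; on the first preset operation (e.g. a tap),
// send an acquisition request back and display whatever is returned.
class SecondDeviceCollaborator(
    private val sendAcquisitionRequest: (firstDeviceId: String, contentId: String) -> Unit,
    private val showPrompt: (text: String, onFirstPresetOperation: () -> Unit) -> Unit,
    private val showContent: (content: ByteArray) -> Unit
) {
    fun onProcessingRequest(firstDeviceId: String, contentId: String) {
        showPrompt("Continue handling this content on this device?") {
            // First preset operation detected: fetch the target content or its identifier.
            sendAcquisitionRequest(firstDeviceId, contentId)
        }
    }

    fun onContentReceived(content: ByteArray) = showContent(content)
}
```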
Illustratively, the second electronic device displaying the target content may include:
the second electronic device sends an acquisition request to the first electronic device, where the acquisition request is used to obtain the target content or an identifier of the target content;
the second electronic device receives the target content sent by the first electronic device and displays the target content; or,
the second electronic device receives the identifier of the target content sent by the first electronic device and, according to the identifier, displays the target content stored in the second electronic device, where the target content stored in the second electronic device is content that the second electronic device already has.
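A sketch of the two display paths (content payload versus identifier lookup); the reply types and the local store are assumptions:

```kotlin
sealed interface AcquisitionReply
data class ContentPayload(val bytes: ByteArray) : AcquisitionReply
data class ContentIdentifier(val id: String) : AcquisitionReply

fun displayReply(
    reply: AcquisitionReply,
    localStore: Map<String, ByteArray>,   // target content the second device already has
    render: (ByteArray) -> Unit
) {
    when (reply) {
        is ContentPayload -> render(reply.bytes)                  // show the received content directly
        is ContentIdentifier -> localStore[reply.id]?.let(render) // look up the locally stored copy
    }
}
```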
For example, the second electronic device obtaining the processing request of the first electronic device may include:
the second electronic device obtains the processing request of the first electronic device together with the target content, or obtains the processing request of the first electronic device together with an identifier of the target content;
the second electronic device displaying the target content includes:
the second electronic device displays the target content, or displays the target content stored in the second electronic device according to the identifier of the target content, where the target content stored in the second electronic device is content that the second electronic device already has.
For example, the displaying, by the second electronic device, the prompt information may include:
the second electronic device displays the prompt information through a hover ball.
In the content collaboration method provided by this implementation, the second electronic device can display the prompt information through a small hover ball, which reduces the interference that displaying the prompt information causes to the user.
The hover ball can be any graphic. For example, to make it easy for the user to clearly identify the source of the target content to be processed, the hover ball may be an icon of the application corresponding to the target content.
It should be appreciated that the hover ball may be displayed at any location on the display interface, such as at any location on the lower left, upper left, lower right, upper right, or middle of the display interface. In one example, to reduce the impact of the display of the hover ball on the user operation, the second electronic device may also determine a position of the hover ball in the display interface based on the user's operational position in the display interface to display the hover ball in a position other than the operational position. For example, the user's operational position in the display interface may be determined from a touch event or may be determined from content displayed in the display interface. The determination manner of the operation position of the user in the display interface may be specifically set by a technician according to an actual scene, which is not specifically limited in the embodiment of the present application.
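One way to sketch the position choice is to place the hover ball in the screen corner farthest from the user's last operation position; the coordinate model and the default corner are assumptions:

```kotlin
data class Point(val x: Float, val y: Float)
enum class Corner { TOP_LEFT, TOP_RIGHT, BOTTOM_LEFT, BOTTOM_RIGHT }

fun chooseHoverBallCorner(lastTouch: Point?, screenWidth: Float, screenHeight: Float): Corner {
    if (lastTouch == null) return Corner.BOTTOM_RIGHT   // assumed default position
    val left = lastTouch.x < screenWidth / 2
    val top = lastTouch.y < screenHeight / 2
    return when {                                       // pick the corner farthest from the touch
        left && top -> Corner.BOTTOM_RIGHT
        !left && top -> Corner.BOTTOM_LEFT
        left && !top -> Corner.TOP_RIGHT
        else -> Corner.TOP_LEFT
    }
}
```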
Illustratively, the second electronic device displaying the target content may include:
the second electronic device displays the target content through a window.
In the content collaboration method provided by this implementation, the second electronic device may display the target content through a pop-up window. The size of the window may be set by default by the second electronic device, or the second electronic device may resize the window according to a related user operation (e.g., stretching the window edge). It should be understood that when the target content is displayed through a window, the second electronic device may perform layout adaptation on the target content according to the size of the window, so as to improve the display effect of the target content and make it convenient for the user to collaboratively process the target content through the second electronic device.
In one example, the second electronic device detecting the first preset operation on the prompt information and displaying the target content may include:
when a first preset operation on the prompt information is detected, the second electronic device determines its current device state;
when the device state is an unlocked state, the second electronic device displays the target content;
when the device state is a screen-locked state, the second electronic device outputs an unlocking prompt, where the unlocking prompt is used to prompt unlocking of the second electronic device;
in response to an unlocking operation on the second electronic device, the second electronic device displays the target content.
In the content collaboration method provided by this implementation, when the second electronic device detects the first preset operation, it can first determine whether it is currently on the lock-screen interface. When the second electronic device is not on the lock-screen interface, it can regard the current user as a legitimate user and directly display the target content in the display interface. When the second electronic device is on the lock-screen interface, it can display an unlocking prompt interface to prompt the user to unlock it. If the second electronic device detects the correct unlock password, it can regard the current user as a legitimate user and display the target content in the display interface after unlocking; if it does not detect the correct unlock password, it can regard the current user as not legitimate and refrain from displaying the target content. This prevents the target content from being viewed by other users, ensures the privacy of the target content, and protects the user's privacy.
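A sketch of the lock-state check before the target content is shown; the callbacks are assumptions standing in for platform keyguard APIs:

```kotlin
class GuardedContentDisplay(
    private val isLocked: () -> Boolean,
    private val requestUnlock: (onResult: (success: Boolean) -> Unit) -> Unit,
    private val showContent: () -> Unit
) {
    fun onFirstPresetOperation() {
        if (!isLocked()) {
            showContent()                      // unlocked: treat the current user as legitimate
        } else {
            requestUnlock { success ->
                if (success) showContent()     // correct password: show the content after unlocking
                // otherwise: do not display the target content
            }
        }
    }
}
```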
In one possible implementation, the method may further include:
when it is detected that the display duration of the prompt information reaches a preset duration, or a hiding operation on the prompt information is detected, the second electronic device hides the prompt information.
For example, after the second electronic device hides the prompt information, the method may further include:
the second electronic device obtains a notification corresponding to the target content and pins the notification to the notification bar of the second electronic device.
In the content collaboration method provided by this implementation, when the display duration of the prompt information reaches the preset duration, the second electronic device can hide the prompt information to reduce the interference its display causes to the user. Alternatively, when the second electronic device detects a second preset operation on the prompt information, it can also hide the prompt information, so that the user can actively hide it according to actual needs, which improves the user experience. Hiding the prompt information may mean deleting it, or covering it with other content.
After the prompt information is hidden, the second electronic device can obtain the notification corresponding to the target content and pin it to the notification bar of the second electronic device, to remind the user to collaboratively process the target content in time. It should be understood that the notification corresponding to the target content may be a notification already stored in the second electronic device, or a notification generated by the second electronic device according to the prompt information.
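A sketch of the prompt lifecycle: hide after a preset display duration or on an explicit hide operation, then pin a corresponding notification; the timer and the notification hook are assumptions:

```kotlin
import kotlinx.coroutines.*

class PromptLifecycle(
    private val hidePrompt: () -> Unit,
    private val pinNotification: (contentId: String) -> Unit,
    private val presetDisplayMs: Long = 10_000   // assumed example value
) {
    private var job: Job? = null

    fun onPromptShown(scope: CoroutineScope, contentId: String) {
        job = scope.launch {
            delay(presetDisplayMs)
            dismiss(contentId)                   // display duration reached: hide automatically
        }
    }

    fun onHideOperation(contentId: String) {     // the user actively hides the prompt
        job?.cancel()
        dismiss(contentId)
    }

    private fun dismiss(contentId: String) {
        hidePrompt()
        pinNotification(contentId)               // remind the user via the notification bar
    }
}
```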
In a fourth aspect, an embodiment of the present application provides a content coordination apparatus, applied to a first electronic device, where the apparatus may include:
the target content display module is used for displaying target content;
the processing request sending module is used for determining second electronic equipment and sending a processing request to the second electronic equipment, wherein the processing request is used for requesting cooperative processing of the target content through the second electronic equipment;
a target content sending module, configured to send, in response to an acquisition request from the second electronic device, the target content or an identifier of the target content to the second electronic device; where, after the second electronic device obtains the processing request, prompt information is displayed in a display interface of the second electronic device.
Illustratively, the processing request sending module may include:
a display time length obtaining unit, configured to obtain a display time length of the target content;
and the first determining unit is used for determining the second electronic equipment when the display time length reaches a first preset time length.
In one possible implementation manner, the processing request sending module may include:
a distance obtaining unit, configured to obtain a distance between the first electronic device and at least one third electronic device, where the at least one third electronic device is an electronic device communicatively connected to the first electronic device;
and the second determining unit is used for determining the second electronic equipment from the at least one third electronic equipment according to the distance.
The second determining unit may include:
a type acquisition sub-unit, configured to acquire a type of the at least one third electronic device, and determine a first priority corresponding to the type of each third electronic device;
a first determining subunit, configured to determine the second electronic device from the at least one third electronic device according to the distance and the first priority.
The second determining unit may include:
an account number determining sub-unit, configured to determine the user account logged in on the third electronic device;
a second determining sub-unit, configured to determine the second electronic device from the at least one third electronic device according to the distance and the user account; or,
the second determining unit may include:
a third determining sub-unit, configured to determine the second electronic device from the at least one third electronic device according to the distance, the user account, and the first priority corresponding to the type of the third electronic device.
In another possible implementation manner, the processing request sending module may include:
the using state acquisition unit is used for acquiring the using state of at least one third electronic device, wherein the at least one third electronic device is an electronic device in communication connection with the first electronic device;
and a third determining unit, configured to determine the second electronic device from the at least one third electronic device according to the usage state.
The third determining unit is further configured to determine, among the at least one third electronic device, an electronic device whose usage state is a preset state, and to determine the second electronic device according to the electronic device whose usage state is the preset state;
the preset state is any one of an operation state, a pickup state, a first screen-on state, a service state, and a second screen-on state. The operation state is a state in which the third electronic device has key input, touch input, keyboard input, or mouse input; the pickup state is a state in which the third electronic device is picked up; the first screen-on state is a state in which the screen-on time of the third electronic device is less than a second preset duration; the service state is a state in which the third electronic device is being used, other than the operation state; and the second screen-on state is a state in which the screen-on time of the third electronic device exceeds the second preset duration but the third electronic device is not in the operation state, the service state, or the pickup state.
The third determining unit is further configured to determine, when an electronic device whose use state is a pickup state exists in the at least one third electronic device, whether the electronic device in the pickup state is located in the same hand as the first electronic device; and when the electronic device in the picking-up state is positioned in the same hand as the first electronic device, determining the electronic device in the picking-up state as the second electronic device.
The third determining unit is further configured to determine, when a plurality of electronic devices whose usage states are preset states exist in the at least one third electronic device, a second priority corresponding to each usage state, and determine, as the second electronic device, an electronic device with a highest second priority.
The third determining unit is further configured to obtain a distance between the first electronic device and the at least one third electronic device, and to determine the second electronic device from the at least one third electronic device according to the distance and the usage state.
The third determining unit is further configured to obtain the type of the at least one third electronic device, determine a first priority corresponding to the type of each third electronic device, and determine the second electronic device from the at least one third electronic device according to the usage state and the first priority; or,
the third determining unit is further configured to determine the second electronic device from the at least one third electronic device according to the usage state, the first priority, and the distance between the first electronic device and the third electronic device.
The third determining unit is further configured to determine a user account logged in on the at least one third electronic device, and determine the second electronic device from the at least one third electronic device according to the usage state and the user account; or,
the third determining unit is further configured to determine the second electronic device from the at least one third electronic device according to the usage state, the user account, and a first priority corresponding to the type of the third electronic device; or,
the third determining unit is further configured to determine the second electronic device from the at least one third electronic device according to the usage state, the user account, and the distance between the first electronic device and the third electronic device; or,
the third determining unit is further configured to determine the second electronic device from the at least one third electronic device according to the usage status, the user account, a first priority corresponding to a type of the third electronic device, and a distance between the first electronic device and the third electronic device.
In a fifth aspect, an embodiment of the present application provides a content coordination apparatus, applied to a first electronic device, where the apparatus may include:
the target content display module is used for displaying target content;
a processing request sending module, configured to determine a second electronic device and send a processing request and the target content to the second electronic device, or send the processing request and an identifier of the target content to the second electronic device; the processing request is used to request collaborative processing of the target content through the second electronic device, where, after the second electronic device receives the processing request, prompt information is displayed on the second electronic device, and the prompt information is used to prompt collaborative processing of the target content through the second electronic device.
Illustratively, the processing request sending module may include:
a display time length obtaining unit, configured to obtain a display time length of the target content;
and the first determining unit is used for determining the second electronic equipment when the display time length reaches a first preset time length.
In one possible implementation manner, the processing request sending module may include:
a distance obtaining unit, configured to obtain a distance between the first electronic device and at least one third electronic device, where the at least one third electronic device is an electronic device communicatively connected to the first electronic device;
and the second determining unit is used for determining the second electronic equipment from the at least one third electronic equipment according to the distance.
The second determining unit may include:
a type acquisition sub-unit, configured to acquire a type of the at least one third electronic device, and determine a first priority corresponding to the type of each third electronic device;
a first determining subunit, configured to determine the second electronic device from the at least one third electronic device according to the distance and the first priority.
The second determining unit may include:
a user account determining sub-unit, configured to determine the user account logged in on the third electronic device;
a second determining sub-unit, configured to determine the second electronic device from the at least one third electronic device according to the distance and the user account; alternatively, the second determining unit may include:
a third determining sub-unit, configured to determine the second electronic device from the at least one third electronic device according to the distance, the user account, and the first priority corresponding to the type of the third electronic device.
In another possible implementation manner, the processing request sending module may include:
the using state acquisition unit is used for acquiring the using state of at least one third electronic device, wherein the at least one third electronic device is an electronic device in communication connection with the first electronic device;
and a third determining unit, configured to determine the second electronic device from the at least one third electronic device according to the usage state.
The third determining unit is further configured to determine, among the at least one third electronic device, an electronic device whose usage state is a preset state, and to determine the second electronic device according to the electronic device whose usage state is the preset state;
the preset state is any one of an operation state, a pickup state, a first screen-on state, a service state, and a second screen-on state. The operation state is a state in which the third electronic device has key input, touch input, keyboard input, or mouse input; the pickup state is a state in which the third electronic device is picked up; the first screen-on state is a state in which the screen-on time of the third electronic device is less than a second preset duration; the service state is a state in which the third electronic device is being used, other than the operation state; and the second screen-on state is a state in which the screen-on time of the third electronic device exceeds the second preset duration but the third electronic device is not in the operation state, the service state, or the pickup state.
The third determining unit is further configured to: when an electronic device whose usage state is the pickup state exists in the at least one third electronic device, determine whether the electronic device in the pickup state is located in the same hand as the first electronic device; and when the electronic device in the pickup state is located in the same hand as the first electronic device, determine the electronic device in the pickup state as the second electronic device.
The third determining unit is further configured to determine, when a plurality of electronic devices whose usage states are preset states exist in the at least one third electronic device, a second priority corresponding to each usage state, and determine, as the second electronic device, an electronic device with a highest second priority.
The third determining unit is further configured to obtain a distance between the first electronic device and the at least one third electronic device, and determine the second electronic device from the at least one third electronic device according to the distance and the usage state.
The third determining unit is further configured to obtain a type of the at least one third electronic device, determine a first priority corresponding to the type of each third electronic device, and determine the second electronic device from the at least one third electronic device according to the usage state and the first priority; or
The third determining unit is further configured to determine the second electronic device from the at least one third electronic device according to the usage state, the first priority, and a distance between the first electronic device and the third electronic device.
The third determining unit is further configured to determine a user account with which the at least one third electronic device is logged in, and determine the second electronic device from the at least one third electronic device according to the usage state and the user account; or
the third determining unit is further configured to determine the second electronic device from the at least one third electronic device according to the usage state, the user account, and a first priority corresponding to a type of the third electronic device; or
the third determining unit is further configured to determine the second electronic device from the at least one third electronic device according to the usage state, the user account, and a distance between the first electronic device and the third electronic device; or
the third determining unit is further configured to determine the second electronic device from the at least one third electronic device according to the usage state, the user account, a first priority corresponding to a type of the third electronic device, and a distance between the first electronic device and the third electronic device.
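For illustration only, the usage-state-based selection described above can be sketched roughly as follows in Kotlin; the state names, the relative "second priority" values, and the use of distance as a tie-breaker are assumptions made for this sketch and are not specified by this application.

// Illustrative sketch only; state names and priority values are assumptions.
enum class UsageState { OPERATING, PICKED_UP, SCREEN_ON_RECENT, IN_SERVICE, SCREEN_ON_IDLE, NONE }

data class ThirdDevice(
    val id: String,
    val state: UsageState,
    val distanceMeters: Double,
    val sameAccountAsFirstDevice: Boolean
)

// Assumed "second priority" per usage state: an actively operated device ranks highest.
private val statePriority = mapOf(
    UsageState.OPERATING to 5,
    UsageState.PICKED_UP to 4,
    UsageState.SCREEN_ON_RECENT to 3,
    UsageState.IN_SERVICE to 2,
    UsageState.SCREEN_ON_IDLE to 1
)

fun selectByUsageState(candidates: List<ThirdDevice>): ThirdDevice? =
    candidates
        .filter { it.sameAccountAsFirstDevice }                 // same user account as the first device
        .filter { it.state != UsageState.NONE }                 // keep only devices in a preset state
        .sortedWith(
            compareByDescending<ThirdDevice> { statePriority[it.state] ?: 0 }
                .thenBy { it.distanceMeters }                   // closer device wins on equal priority
        )
        .firstOrNull()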
In a sixth aspect, an embodiment of the present application provides a content collaboration apparatus applied to a second electronic device, where the apparatus may include:
the processing request acquisition module is used for acquiring a processing request of the first electronic equipment, wherein the processing request is used for requesting cooperative processing of target content displayed by the first electronic equipment through the second electronic equipment;
the prompt information display module is used for displaying prompt information according to the processing request;
and the target content display module is used for detecting a first preset operation on the prompt information and displaying the target content.
Illustratively, the target content display module may include:
an acquisition request sending unit, configured to send an acquisition request to the first electronic device, where the acquisition request is used to acquire the target content or an identifier of the target content;
a first target content display unit, configured to receive the target content sent by the first electronic device and display the target content; or
the target content display module may include:
a second target content display unit, configured to receive the identifier of the target content sent by the first electronic device, and display the target content stored in the second electronic device according to the identifier of the target content, where the target content stored in the second electronic device is target content possessed by the second electronic device itself.
The processing request obtaining module is further configured to obtain the processing request of the first electronic device and the target content, or obtain the processing request of the first electronic device and an identifier of the target content;
the target content display module is further configured to display the target content, or display the target content stored in the second electronic device according to the identifier of the target content, where the target content stored in the second electronic device is target content possessed by the second electronic device itself.
The prompt information display module is specifically configured to display the prompt information through a floating ball.
The target content display module is further configured to display the target content through a window.
In one example, the target content display module may further include:
a device state determining unit, configured to determine, by the second electronic device, the current device state when the first preset operation on the prompt information is detected;
a third target content display unit configured to display the target content when the device state is an unlocked state;
the unlocking prompt unit is used for outputting an unlocking prompt when the equipment state is a screen locking state, and the unlocking prompt is used for prompting unlocking of the second electronic equipment;
and a fourth target content display unit configured to display the target content in response to an unlocking operation of the second electronic device.
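For illustration only, the device-state handling described by these units can be sketched as follows in Kotlin; the function names and callback shape are assumptions for this sketch and do not correspond to any platform API.

// Illustrative only; function names are placeholders, not a platform API.
enum class DeviceState { UNLOCKED, LOCKED }

fun onPromptConfirmed(
    deviceState: DeviceState,
    showTargetContent: () -> Unit,
    promptToUnlock: () -> Unit
) {
    when (deviceState) {
        DeviceState.UNLOCKED -> showTargetContent()   // third unit: show the target content immediately
        DeviceState.LOCKED -> promptToUnlock()        // unlocking prompt unit: ask the user to unlock first
        // After a successful unlock, the fourth unit would call showTargetContent().
    }
}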
In one possible implementation, the apparatus may further include:
and a prompt information hiding module, used by the second electronic device to hide the prompt information when it is detected that the display duration of the prompt information reaches a preset duration, or when a hiding operation on the prompt information is detected.
Illustratively, the apparatus may further include:
and a notification top setting module, configured to acquire the notification corresponding to the target content and pin the notification to the top of the notification bar of the second electronic device.
In a seventh aspect, an embodiment of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the computer program to cause the electronic device to implement the content collaboration method in any one of the first aspect or the second aspect.
In an eighth aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program, where the computer program when executed by a computer causes the computer to implement the content collaboration method according to any one of the first aspect or the second aspect.
In a ninth aspect, embodiments of the present application provide a computer program product, which when run on an electronic device, causes the electronic device to perform the content collaboration method of any one of the first aspect or the second aspect described above.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device to which a content collaboration method according to an embodiment of the present application is applicable;
FIG. 2 is a schematic diagram of a software architecture to which a content collaboration method according to an embodiment of the present application is applicable;
FIG. 3 is a flow chart of a content collaboration method according to an embodiment of the present application;
FIG. 4 is a flow chart of a content collaboration method according to another embodiment of the present application;
fig. 5 is a schematic diagram of application scenario one provided in an embodiment of the present application;
fig. 6 and fig. 7 are schematic diagrams of application scenario two provided in an embodiment of the present application;
fig. 8 is a schematic diagram of application scenario three provided in an embodiment of the present application;
fig. 9 is a schematic diagram of application scenario four provided in an embodiment of the present application.
Detailed Description
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "once", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
In addition, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
Furthermore, references to "a plurality of" in the examples of this application should be interpreted as two or more.
The steps involved in the content collaboration method provided in the embodiments of the present application are merely examples; not all steps are necessarily performed, and the content of each information or message is not mandatory and may be increased or decreased as needed during use. The same step, or messages having the same function, in the embodiments of the present application may be cross-referenced between different embodiments.
The service scenario described in the embodiments of the present application is for more clearly describing the technical solution of the embodiments of the present application, and does not constitute a limitation to the technical solution provided in the embodiments of the present application, and as a person of ordinary skill in the art can know that, with the evolution of the network architecture and the appearance of a new service scenario, the technical solution provided in the embodiments of the present application is equally applicable to similar technical problems.
The functions of small-screen devices such as smart watches are becoming more and more comprehensive; for example, they can be used to view messages, perform sports and health monitoring and data statistics, make electronic payments, and so on. Due to size limitations, the display screens of small-screen devices such as smart watches are small, and their display and touch capabilities are weak, so many complex tasks cannot be performed on them. When a user views content on a small-screen device such as a smart watch and the content needs to be processed (for example, a message needs to be replied to), the user has to open the corresponding application on a large-screen device such as a mobile phone or a tablet computer to find the message, or enter the notification center to find the message, and only then process it. The operation process is cumbersome and affects the user experience.
To solve the above problem, an embodiment of the present application provides a content collaboration method. When a first electronic device displays target content, the first electronic device may determine a second electronic device that is communicatively connected to the first electronic device and has a screen larger than that of the first electronic device, and send a processing request to the second electronic device to request that the second electronic device cooperatively process the target content. The second electronic device displays corresponding prompt information according to the processing request sent by the first electronic device. When the user wants to cooperatively process the target content through the second electronic device, the user can perform a first preset operation on the prompt information in the second electronic device. Based on the first preset operation, the second electronic device can quickly display the target content, without the user having to search for the target content in the second electronic device. This simplifies the operation process, makes it convenient for the user to cooperatively process the target content through the second electronic device, improves user experience, and has strong usability and practicability.
In this embodiment of the present application, the first electronic device may be a wearable device with a small-sized screen, such as a smart watch, a smart bracelet, and the second electronic device may be an electronic device with a large-sized screen, such as a mobile phone, a tablet computer, a notebook computer, and a desktop computer, and the specific type of the electronic device (including the first electronic device and the second electronic device) is not limited in this embodiment of the present application.
The following first describes an electronic device according to an embodiment of the present application. Referring to fig. 1, fig. 1 shows a schematic structural diagram of an electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, an antenna 1, an antenna 2, a mobile communication module 140, a wireless communication module 150, a sensor module 160, keys 190, a camera 170, a display 180, and the like. The sensor module 160 may include a pressure sensor 160A, a gyro sensor 160B, a magnetic sensor 160C, an acceleration sensor 160D, a distance sensor 160E, a proximity light sensor 160F, a fingerprint sensor 160G, a touch sensor 160H, and the like.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 160H, the camera 170, etc., respectively, through different I2C bus interfaces. For example: the processor 110 may be coupled to the touch sensor 160H through an I2C interface, so that the processor 110 communicates with the touch sensor 160H through the I2C bus interface to implement the touch function of the electronic device 100.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 150. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 150 through a UART interface to implement a bluetooth function.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as the display 180, the camera 170, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 170 communicate through a CSI interface to implement the photographing function of electronic device 100. The processor 110 and the display 180 communicate via a DSI interface to implement the display functionality of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 170, the display 180, the wireless communication module 150, the sensor module 160, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB type c interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. The interface may also be used to connect other electronic devices, such as AR devices, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 140, the wireless communication module 150, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 140 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied on the electronic device 100. The mobile communication module 140 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 140 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 140 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 140 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 140 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor displays images or video through the display 180. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 140 or other functional module, independent of the processor 110.
The wireless communication module 150 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 150 may be one or more devices that integrate at least one communication processing module. The wireless communication module 150 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 150 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2.
In some embodiments, antenna 1 and mobile communication module 140 of electronic device 100 are coupled, and antenna 2 and wireless communication module 150 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 180, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 180 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display 180 is used to display images, videos, and the like. The display 180 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 180, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 170, a video codec, a GPU, a display 180, an application processor, and the like.
The ISP is used to process the data fed back by the camera 170. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 170.
The camera 170 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the electronic device 100 may include 1 or N cameras 170, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (moving picture experts group, MPEG)1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The pressure sensor 160A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 160A may be disposed on the display 180. There are various types of pressure sensors 160A, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. A capacitive pressure sensor may comprise at least two parallel plates of conductive material. When a force is applied to the pressure sensor 160A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display 180, the electronic device 100 detects the touch operation intensity according to the pressure sensor 160A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 160A. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is smaller than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
The gyro sensor 160B may be used to determine the motion posture of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 160B. The gyro sensor 160B may be used for image stabilization during photographing. For example, when the shutter is pressed, the gyro sensor 160B detects the shake angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and lets the lens counteract the shake of the electronic device 100 through reverse motion, thereby achieving image stabilization. The gyro sensor 160B may also be used for navigation and motion-sensing game scenarios.
The magnetic sensor 160C includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip cover using the magnetic sensor 160C. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 160C. Features such as automatic unlocking upon flipping open can then be set according to the detected open or closed state of the leather case or of the flip.
The acceleration sensor 160D may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. It can also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
A distance sensor 160E for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 160E to achieve quick focus.
The proximity light sensor 160F may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 may determine that there is no object near it. The electronic device 100 can use the proximity light sensor 160F to detect that the user is holding the electronic device 100 close to the ear, so as to automatically turn off the screen to save power. The proximity light sensor 160F may also be used in leather case mode and pocket mode to automatically unlock and lock the screen.
The fingerprint sensor 160G is used to collect a fingerprint. The electronic device 100 may use the collected fingerprint characteristics to implement fingerprint unlocking, access an application lock, take a photo with a fingerprint, answer an incoming call with a fingerprint, and the like.
The touch sensor 160H is also referred to as a "touch device". The touch sensor 160H may be disposed on the display screen 180, and the touch sensor 160H and the display screen 180 form a touch screen, which is also referred to as a "touch screen". The touch sensor 160H is used to detect a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 180. In other embodiments, the touch sensor 160H may also be disposed on the surface of the electronic device 100 at a different location than the display 180.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In this embodiment, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated.
Fig. 2 is a software configuration block diagram of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime (Android Runtime) and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 2, the application package may include camera, calendar, calls, map, navigation, music, video, short message, and other applications (several application names that appear only as icons in the original figure are omitted here).
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar, and can be used to convey notification-type messages that automatically disappear after a short stay without requiring user interaction. For example, the notification manager is used to notify that a download is complete, to give a message reminder, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system top status bar, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or an indicator light blinks.
The Android Runtime includes a core library and a virtual machine. The Android Runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
The workflow of the electronic device 100 software and hardware is illustrated below in connection with content display scenarios.
When the touch sensor 160H receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into an original input event (including information such as touch coordinates and a time stamp of the touch operation). The original input event is stored at the kernel layer. The application framework layer acquires the original input event from the kernel layer and identifies the control corresponding to the original input event. Taking as an example a touch click operation whose corresponding control is the short message application icon: the short message application calls an interface of the application framework layer to start the short message application, then starts the display driver by calling the kernel layer, and displays the content of the short message application through the display screen 180.
The content collaboration method provided in the embodiments of the present application will be described in detail below with reference to the accompanying drawings and specific application scenarios.
Referring to fig. 3, fig. 3 is a schematic flowchart of a content collaboration method according to an embodiment of the present application. As shown in fig. 3, the method may include:
S301, the first electronic device displays target content and determines the second electronic device.
S302, the first electronic device sends a processing request to the second electronic device.
S303, the second electronic equipment displays prompt information according to the processing request.
S304, the second electronic equipment detects a first preset operation on the prompt information.
S305, the second electronic device sends an acquisition request to the first electronic device.
S306, the first electronic device sends the target content or the identification of the target content to the second electronic device based on the acquisition request.
S307, the second electronic device displays the target content.
As can be seen from the above, when a user opens a target content in a first electronic device, the first electronic device may display the target content, and the first electronic device may determine a second electronic device and send a processing request to the second electronic device to request cooperative processing of the target content by the second electronic device. After the second electronic device receives the processing request, prompt information can be displayed in the display interface so as to prompt a user to cooperatively process the target content through the second electronic device. When the user wants to cooperatively process the target content through the second electronic device, the user can execute a first preset operation in the second electronic device. When the second electronic device detects the first preset operation, an acquisition request can be sent to the first electronic device to acquire the target content. After receiving the acquisition request, the first electronic device may send the target content to the second electronic device. After the second electronic device receives the target content, the target content can be displayed, so that a user can cooperatively process the target content through the second electronic device.
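For illustration only, the pull-style exchange of S301 to S307 described above can be sketched roughly as follows in Kotlin; the message names, fields, and callback shapes are assumptions made for this sketch and are not defined by this application.

// Illustrative message types for the flow of Fig. 3; all names and fields are assumptions.
data class ProcessingRequest(val contentSummary: String)                  // corresponds to S302
object AcquireRequest                                                     // corresponds to S305
data class ContentResponse(val content: String?, val contentId: String?)  // S306: the content or its identifier

// Sketch of the second device's side: after the user confirms the prompt (S304),
// pull the content from the first device (S305/S306) and display it (S307).
fun pullAndDisplayTargetContent(
    requestContent: (AcquireRequest) -> ContentResponse,   // assumed transport to the first device
    displayContent: (String) -> Unit,
    openLocalContent: (String) -> Unit                     // open locally stored content by its identifier
) {
    val response = requestContent(AcquireRequest)
    when {
        response.content != null -> displayContent(response.content)
        response.contentId != null -> openLocalContent(response.contentId)
    }
}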
It should be understood that the second electronic device may also have installed the application corresponding to the target content, and the second electronic device and the first electronic device may be logged in to the application through the same application account; therefore, when the first electronic device receives the target content, the second electronic device may also receive the target content.
Therefore, when the second electronic device detects the first preset operation, an acquisition request may be sent to the first electronic device to acquire the identifier of the target content. After receiving the acquisition request, the first electronic device may send the identifier of the target content to the second electronic device. After receiving the identifier of the target content, the second electronic device may acquire the target content stored by itself and display it; for example, it may open the application corresponding to the target content and jump to the page corresponding to the target content, so that the user can cooperatively process the target content through the second electronic device.
For example, when the target content is opened from the notification of the first electronic device, the identifier corresponding to the target content may be the notification corresponding to the target content. For example, the identifier corresponding to the target content may be an application to which the target content belongs, a page corresponding to the target content in the application, and the like.
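As a rough illustration of what the "identifier of the target content" mentioned above might carry, the following Kotlin data class is an assumption for this sketch only; the actual form of the identifier is not limited by this application.

// Illustrative only; the fields are assumptions based on the examples given above.
data class TargetContentIdentifier(
    val appPackage: String,             // application to which the target content belongs
    val pagePath: String? = null,       // page corresponding to the target content within the application
    val notificationKey: String? = null // set when the target content was opened from a notification
)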
Referring to fig. 4, fig. 4 is a schematic flowchart of a content collaboration method according to another embodiment of the present application. As shown in fig. 4, the method may include:
S401, the first electronic device displays target content and determines the second electronic device.
S402, the first electronic device sends the processing request and the target content to the second electronic device or sends the processing request and the identification of the target content to the second electronic device.
S403, the second electronic equipment displays prompt information according to the processing request.
S404, when the first preset operation on the prompt information is detected, the second electronic equipment displays the target content.
As can be seen from the above, when the first electronic device displays the target content, the first electronic device may determine the second electronic device, and may send the processing request and the target content to the second electronic device. After the second electronic device receives the processing request, prompt information can be displayed in the display interface so as to prompt a user to cooperatively process the target content in the second electronic device. If the second electronic device detects the first preset operation, the second electronic device can directly display the target content, so that the user can cooperatively process the target content through the second electronic device.
Alternatively, when the second electronic device has installed the application corresponding to the target content and is logged in to the application through the same application account as the first electronic device, the first electronic device may, after determining the second electronic device, send the processing request and the identifier of the target content to the second electronic device. After the second electronic device receives the processing request, prompt information can be displayed in the display interface to prompt the user to cooperatively process the target content in the second electronic device. If the second electronic device detects the first preset operation, the second electronic device may determine the target content stored by itself according to the identifier of the target content and display it; for example, the second electronic device may open the application corresponding to the target content and jump to the page corresponding to the target content, so that the user can cooperatively process the target content through the second electronic device.
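For illustration only, the choice between sending the target content itself and sending only its identifier (the push-style flow of Fig. 4) can be sketched as follows in Kotlin; the types, field names, and the decision criterion shown here are assumptions for this sketch.

// Illustrative only; names and fields are assumptions.
data class PushProcessingRequest(
    val prompt: String,          // used by the second device to display prompt information (S403)
    val content: String?,        // the target content itself, or
    val contentId: String?       // only its identifier when the second device already has the content
)

fun buildPushProcessingRequest(
    secondDeviceHasSameAppAndAccount: Boolean,
    content: String,
    contentId: String
): PushProcessingRequest =
    if (secondDeviceHasSameAppAndAccount)
        PushProcessingRequest(prompt = "Continue on this device?", content = null, contentId = contentId)
    else
        PushProcessingRequest(prompt = "Continue on this device?", content = content, contentId = null)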
It should be understood that the first electronic device is an electronic device with a small screen on which it is inconvenient to process the target content directly, while the second electronic device has a larger screen than the first electronic device and is an electronic device on which the target content can be processed conveniently. For example, the first electronic device may be a wearable device with a small screen, such as a smart watch or a smart band, and the second electronic device may be an electronic device with a large screen, such as a mobile phone, a tablet computer, a notebook computer, or a desktop computer.
In one possible implementation, when the first electronic device displays the target content, the first electronic device may acquire the display duration of the target content and use it to identify the user's operation intention, that is, to determine according to the display duration whether the user wants to process the target content. This reduces invalid content transmission by the first electronic device, reduces interference to the user, and improves user experience.
For example, when the display duration is greater than or equal to the first preset duration, it indicates that the user is interested in the target content; the first electronic device may then consider that the user wants to process the target content and therefore determine the second electronic device, so that the user can cooperatively process the target content through the second electronic device. When the display duration is less than the first preset duration, it indicates that the user has only briefly viewed the target content and is not interested in it; the first electronic device may consider that the user does not want to process the target content, and may therefore skip determining the second electronic device and not send the target content, reducing interference to the user.
The first preset duration can be set by a technician according to the actual situation, or customized by the user according to actual needs. For example, the user may set the first preset duration to 5 s. In that case, when the display duration of the target content in the first electronic device reaches 5 s, the first electronic device may consider that the user wants to process the target content and determine the second electronic device, so that the user can cooperatively process the target content through the second electronic device; when the display duration of the target content in the first electronic device is only 3 s, the first electronic device may consider that the user does not want to process the target content and may not determine the second electronic device, reducing interference to the user.
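For illustration only, the display-duration check described above can be expressed as a small Kotlin helper; the 5-second default merely mirrors the example in the text and is an assumption, not a fixed threshold.

// Illustrative only; the default threshold mirrors the 5 s example above and is an assumption.
fun shouldDetermineSecondDevice(
    displayDurationMillis: Long,
    firstPresetDurationMillis: Long = 5_000L
): Boolean = displayDurationMillis >= firstPresetDurationMillis

Under this sketch, a 3 s display would return false and no processing request would be sent, matching the example above.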
In another possible implementation, the first electronic device is provided with an application white list of applications whose content can be cooperatively processed through the second electronic device, and/or with one or more preset keywords; when a piece of content contains at least one preset keyword, the content does not need to be cooperatively processed through the second electronic device. Filtering junk content with the application white list and/or the preset keywords reduces interference to the user and improves user experience. The application white list and the preset keywords may be user-defined or set by default in the first electronic device.
For example, the user may customize the application white list to include the short message application, the navigation application, and several other applications (whose names appear only as icons in the original figure and are omitted here). Thus, when the first electronic device displays the target content, the first electronic device may determine the application to which the target content corresponds. When the application corresponding to the target content is in the application white list, the first electronic device may determine the second electronic device; when it is not in the application white list, the first electronic device may not perform the determination of the second electronic device.
For example, the first electronic device may set the preset keywords according to the user's historical behavior or big data, for example, promotional or advertising words or phrases such as "low-price rush purchase", "limited-quantity hot item", and "surprise gift pack". Therefore, when the first electronic device displays the target content, the first electronic device can recognize the target content to determine whether it contains a preset keyword. When the target content contains a preset keyword, the first electronic device may not determine the second electronic device; when the target content does not contain a preset keyword, the first electronic device may determine the second electronic device.
It should be appreciated that when both the application white list and the preset keywords are set in the first electronic device, the priority between the two may be set by default in the first electronic device or customized by the user. For example, the user may set the priority of the application white list to be higher than that of the preset keywords; in that case, when the application corresponding to the target content is in the application white list but the target content contains a preset keyword, the first electronic device may still determine that the target content needs to be processed through the second electronic device, and therefore proceed to determine the second electronic device.
In this embodiment of the present application, the first electronic device may also determine whether to process the target content through the second electronic device according to the display duration of the target content together with the application white list, the preset keywords, or both. For example, when the display duration of the target content reaches the first preset duration, the first electronic device may determine whether the application corresponding to the target content is in the application white list and/or whether the target content contains a preset keyword; when the application corresponding to the target content is in the application white list, or the target content does not contain a preset keyword, the first electronic device may determine the second electronic device.
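For illustration only, the combined gating on display duration, application white list, and preset keywords can be sketched as follows in Kotlin; the sample entries and the rule that the white list outranks the keywords are assumptions taken from the examples above, and only one of several policies the text allows.

// Illustrative only; sample entries and the priority rule are assumptions.
data class CollaborationFilter(
    val appWhiteList: Set<String> = setOf("com.example.sms", "com.example.navigation"), // hypothetical packages
    val presetKeywords: Set<String> = setOf("low-price rush purchase", "surprise gift pack"),
    val whiteListOutranksKeywords: Boolean = true
)

fun shouldCollaborate(
    displayDurationMillis: Long,
    firstPresetDurationMillis: Long,
    appPackage: String,
    content: String,
    filter: CollaborationFilter
): Boolean {
    if (displayDurationMillis < firstPresetDurationMillis) return false     // user not interested yet
    val inWhiteList = appPackage in filter.appWhiteList
    val containsKeyword = filter.presetKeywords.any { it in content }
    return when {
        inWhiteList && !containsKeyword -> true
        inWhiteList && containsKeyword -> filter.whiteListOutranksKeywords   // white list has higher priority
        else -> !containsKeyword                                             // not white-listed: keywords alone decide
    }
}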
The process in which the first electronic device determines the second electronic device is described in detail below.
In one possible implementation, the first electronic device may determine the second electronic device based on distance. In one example, when the first electronic device displays the target content, the first electronic device may determine one or more electronic devices communicatively connected to the first electronic device (hereinafter, an electronic device communicatively connected to the first electronic device is referred to as electronic device a) and acquire the distance between each electronic device a and the first electronic device. Subsequently, from these electronic devices a, the first electronic device may determine the electronic devices whose distances are less than or equal to a preset distance threshold (hereinafter, an electronic device whose distance from the first electronic device is less than or equal to the preset distance threshold is referred to as electronic device B), and may determine one or more of the electronic devices B as the second electronic device. The preset distance threshold may be set by a technician according to the actual scenario, or set by the user in a user-defined manner according to actual needs. For example, the user may set the preset distance threshold to 10 cm, 15 cm, or 30 cm according to the user's own habit of using electronic devices.
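The distance-based screening described above can be sketched as follows; the device model, the source of the distance values, and the 30 cm threshold are assumptions made only for the example.

```python
# Illustrative only: filtering communicatively connected devices (a) down to
# those within the preset distance threshold (B), nearest first.

from dataclasses import dataclass

PRESET_DISTANCE_THRESHOLD_M = 0.30      # assumed threshold, e.g. 30 cm

@dataclass
class Device:
    name: str
    distance_m: float                   # measured distance to the first electronic device

def select_by_distance(devices_a: list[Device]) -> list[Device]:
    devices_b = [d for d in devices_a if d.distance_m <= PRESET_DISTANCE_THRESHOLD_M]
    return sorted(devices_b, key=lambda d: d.distance_m)    # smallest distance first

if __name__ == "__main__":
    candidates = [Device("phone", 0.12), Device("tablet", 0.45), Device("laptop", 0.25)]
    print([d.name for d in select_by_distance(candidates)])  # ['phone', 'laptop']
```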
It should be understood that, in order to reduce the invalid transmission of the target content, the electronic devices a in communication with the first electronic device in the embodiment of the present application refer to electronic devices that are powered on. For example, the electronic device a may include an electronic device in an unlocked state and an electronic device in a locked state.
In another example, to avoid sending the target content to another user's electronic device and thus disclosing the user's privacy, after determining the electronic devices B, the first electronic device may determine the electronic devices among the electronic devices B that are logged in to the same user account as the first electronic device or that can control the first electronic device (hereinafter, an electronic device that is logged in to the same user account as the first electronic device or that can control the first electronic device is referred to as electronic device C), and determine one or more of the electronic devices C as the second electronic device. In this way, the second electronic device is an electronic device belonging to the same user as the first electronic device, or an electronic device that is authorized by the user to whom the first electronic device belongs and that can control the first electronic device, which ensures the privacy of the target content and protects the user's privacy.

Alternatively, after determining the electronic devices a communicatively connected to the first electronic device, the first electronic device may determine the electronic devices C among the electronic devices a that are logged in to the same user account as the first electronic device or that can control the first electronic device, and may obtain the distance between each electronic device C and the first electronic device. Then, the first electronic device may determine the electronic devices B among the electronic devices C whose distances from the first electronic device are less than or equal to the preset distance threshold, and determine one or more of the electronic devices B as the second electronic device. In this way, the second electronic device is an electronic device belonging to the same user as the first electronic device, or an electronic device that is authorized by the user to whom the first electronic device belongs and that can control the first electronic device, which ensures the privacy of the target content and protects the user's privacy.
It should be noted that, because an electronic device that is logged in to the same user account or that can control the first electronic device may also be a public device, and a public device often offers no privacy, using such a device easily leads to disclosure of the user's privacy. Therefore, after determining the electronic devices C, the first electronic device may acquire the type of each electronic device C and determine the second electronic device according to the type, so that the second electronic device is a private device of the user, which ensures the privacy of the target content and protects the user's privacy. A public device refers to an electronic device that can be used by multiple people at the same time, and may include a smart television, a smart large screen, a smart speaker, and the like. A private device refers to a device that is usually used by a single user only, and may include, for example, a mobile phone, a tablet computer, a notebook computer, a desktop computer, and the like.
In this embodiment, when determining the second electronic device according to the electronic devices B, if there is only one electronic device B, the first electronic device may directly determine that electronic device B as the second electronic device. If there are a plurality of electronic devices B, the first electronic device may determine the one or more electronic devices B with the smallest distance as the second electronic device, so that the user can conveniently cooperatively process the target content through the second electronic device; or the first electronic device may acquire the types of the electronic devices B and determine the second electronic device according to the types and the first priorities corresponding to the types, so as to determine the one or more electronic devices most convenient for the user to operate as the second electronic device. It should be understood that the basic principle of determining the second electronic device according to the electronic devices C is the same, and details are not described herein again.
The first priority corresponding to each type may be set by a technician according to the actual situation, or set by the user in a user-defined manner. For example, the user may set the first priorities such that the first priority corresponding to the mobile phone is higher than that corresponding to the tablet computer, which is higher than that corresponding to the notebook computer, which in turn is higher than that corresponding to the desktop computer. Thus, when the electronic devices B include a mobile phone, the first electronic device may determine the mobile phone as the second electronic device; when the electronic devices B include no mobile phone but include a tablet computer, the first electronic device may determine the tablet computer as the second electronic device; when the electronic devices B include neither a mobile phone nor a tablet computer but include a notebook computer, the first electronic device may determine the notebook computer as the second electronic device; and when the electronic devices B include no mobile phone, tablet computer, or notebook computer but include a desktop computer, the first electronic device may determine the desktop computer as the second electronic device.
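As a non-authoritative sketch of selection by the first priority corresponding to the device type, with an assumed priority mapping following the ordering in the example above:

```python
# Assumed mapping from device type to the first priority; the ordering
# phone > tablet > notebook > desktop follows the example in the text.

TYPE_FIRST_PRIORITY = {"phone": 4, "tablet": 3, "notebook": 2, "desktop": 1}

def select_by_type_priority(devices_b: list[tuple[str, str]]) -> list[str]:
    """devices_b is a list of (device_name, device_type) pairs; returns the
    names of the devices whose type has the highest first priority."""
    if not devices_b:
        return []
    best = max(TYPE_FIRST_PRIORITY.get(device_type, 0) for _, device_type in devices_b)
    return [name for name, device_type in devices_b
            if TYPE_FIRST_PRIORITY.get(device_type, 0) == best]

if __name__ == "__main__":
    print(select_by_type_priority([("my tablet", "tablet"), ("my phone", "phone")]))  # ['my phone']
```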
In the embodiment of the application, the first electronic device may acquire the distance between the first electronic device and the electronic device a through a ranging function of the first electronic device. For example, when the first electronic device displays the target content, the first electronic device may initiate a ranging function of the first electronic device to measure a distance between the first electronic device and the electronic device a. Or the first electronic device may acquire the distance between the first electronic device and the electronic device a through the ranging function of the electronic device a. For example, when the first electronic device displays the target content, the first electronic device may send a ranging request to electronic device a, such as a bluetooth broadcast with a ranging identification. After receiving the bluetooth broadcast, the electronic device a may start a ranging function of the electronic device a to measure a distance between the first electronic device and the electronic device a, and send the measured distance to the first electronic device. Similarly, the distance between the first electronic device and the electronic device C may also be determined by the ranging function of the first electronic device or the ranging function of the electronic device C. It should be understood that the ranging function may be Ultra Wide Band (UWB) ranging, or may be ultrasonic ranging, or may be bluetooth ranging, and the implementation manner of the ranging function is not particularly limited in this embodiment of the present application.
In another possible implementation manner, the first electronic device may determine the second electronic device according to the use state of the electronic device, so as to determine the electronic device most likely to be used by the user currently as the second electronic device, so that the user can conveniently and timely perform cooperative processing on the target content through the second electronic device.
It should be understood that the use state describes how the electronic device is currently being used. For example, when the electronic device currently has a key input, its use state may be the operation state; when the electronic device is picked up, its use state may be the picked-up state; and when the electronic device is playing a video, its use state may be the service state.
In one example, while the first electronic device is displaying the target content, the first electronic device may acquire one or more electronic devices a communicatively connected to the first electronic device and determine a current usage status of each electronic device a. Subsequently, from these electronic devices a, the first electronic device may determine an electronic device whose usage state is a preset state (hereinafter, the electronic device whose usage state is the preset state is referred to as an electronic device D), and may determine one or more electronic devices from the one or more electronic devices D as the second electronic device.
The preset state may be any one of the operation state, the picked-up state, the first bright screen state, the service state, the second bright screen state, and other states. The operation state refers to a state in which the electronic device has any input event such as a key input, a touch input, a keyboard input, or a mouse input. The picked-up state refers to a state in which the electronic device is picked up. The first bright screen state refers to a state in which the display screen of the electronic device has just been lit, that is, a state in which the display screen has been lit for less than a second preset duration. The service state refers to a state, other than the operation state, in which the electronic device is being used, for example, a state in which the electronic device is playing video or music. The second bright screen state refers to a state in which the display screen has been lit for longer than the second preset duration but the electronic device is not in the operation state, the service state, or the picked-up state. The other states refer to states other than the operation state, the picked-up state, the first bright screen state, the service state, and the second bright screen state.
It may be appreciated that the electronic device may detect whether it is picked up through an inertial measurement unit (inertial measurement unit, IMU) in the electronic device, where the specific detection manner may be set by a technician according to the actual scenario, which is not specifically limited in the embodiments of the present application. The second preset duration may also be set by the technician according to the actual scenario; for example, the second preset duration may be set to 5 s. Therefore, in a scenario where the screen of the electronic device is merely lit, when the screen has been lit for 3 s, the first electronic device may determine that the use state of the electronic device is the first bright screen state; when the screen has been lit for 6 s, the first electronic device may determine that the use state of the electronic device is the second bright screen state.
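A minimal sketch of classifying the use state from a few assumed signals (input events, IMU pick-up detection, media playback, and screen-on time) is shown below; the signal names and the 5 s value are illustrative only.

```python
# Hypothetical classification of the use state; the signal names and the 5 s
# second preset duration are assumptions, and the checks follow the ordering
# of the states described above.

from typing import Optional

SECOND_PRESET_DURATION_S = 5.0

def classify_use_state(has_input: bool, is_picked_up: bool,
                       is_rendering_service: bool,
                       screen_on_time_s: Optional[float]) -> str:
    if has_input:
        return "operation"              # key/touch/keyboard/mouse input present
    if is_picked_up:
        return "picked_up"              # e.g. reported by the IMU
    if screen_on_time_s is not None and screen_on_time_s < SECOND_PRESET_DURATION_S:
        return "first_bright_screen"    # screen lit for less than the second preset duration
    if is_rendering_service:
        return "service"                # playing video, music, and so on
    if screen_on_time_s is not None:
        return "second_bright_screen"   # screen lit for longer, but otherwise idle
    return "other"

if __name__ == "__main__":
    print(classify_use_state(False, False, False, 3.0))   # first_bright_screen
    print(classify_use_state(False, False, False, 6.0))   # second_bright_screen
```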
In another example, to avoid sending the target content to the electronic device of the other user, thereby causing disclosure of privacy of the user, after determining the electronic device D according to the above example, the first electronic device may determine the electronic device C in the electronic device D that logs in to the same user account as the first electronic device or may control the first electronic device. Then, the first electronic device may determine one or more electronic devices from the one or more electronic devices C as a second electronic device, so that the second electronic device is an electronic device belonging to the same user as the first electronic device, or is an electronic device authorized by the user to which the first electronic device belongs and capable of controlling the first electronic device, thereby ensuring privacy of the target content and protecting privacy of the user.
Or after determining the electronic device a in communication connection with the first electronic device, the first electronic device may determine the electronic device C in the electronic device a that logs in the same user account as the first electronic device or may control the first electronic device, and may obtain the current use state of the electronic device C. Then, the first electronic device may determine, from among the electronic devices C, the electronic device D whose usage status is a preset status, and determine, from one or more electronic devices D, one or more electronic devices as a second electronic device, so that the second electronic device is an electronic device belonging to the same user as the first electronic device, or is an electronic device authorized by the user to which the first electronic device belongs and capable of controlling the first electronic device, thereby ensuring privacy of the target content and protecting privacy of the user.
In this embodiment, when determining the second electronic device according to the electronic device D, if there is only one electronic device D, the first electronic device may directly determine the electronic device D as the second electronic device.
For example, when the electronic device D includes a plurality of electronic devices and there is an electronic device in a picked-up state in the electronic device D, the first electronic device may determine whether the electronic device in the picked-up state is located in the same hand as the first electronic device. When the electronic device in the picked-up state is located in the same hand as the first electronic device, the first electronic device may determine the electronic device in the picked-up state as the second electronic device.
It should be appreciated that the first electronic device may determine whether the electronic device in the picked-up state is located in the same hand as the first electronic device based on the acceleration measured by the acceleration sensor in the first electronic device and the acceleration measured by the acceleration sensor in the electronic device in the picked-up state, and/or may determine whether the electronic device in the picked-up state is located in the same hand as the first electronic device based on the angular velocity measured by the gyro sensor in the first electronic device and the angular velocity measured by the gyro sensor in the electronic device in the picked-up state.
For example, when the electronic device D includes a plurality of electronic devices, the first electronic device may determine a second priority corresponding to the usage state, and determine the second electronic device according to the second priority. It should be understood that the basic principle of the first electronic device determining the second electronic device according to the electronic device C is the same, and will not be described herein.
The second priority corresponding to each use state may be set by a technician according to the actual scenario, or set by the user in a user-defined manner. For example, the technician may set the second priorities according to the actual scenario such that the second priority corresponding to the operation state is higher than that corresponding to the picked-up state, which is higher than that corresponding to the first bright screen state, followed by the service state, the second bright screen state, and the other states in descending order.
Thus, when an electronic device having an operation state among the electronic devices D, the first electronic device can determine the electronic device having the operation state as the second electronic device. When the electronic device D has no electronic device in the operation state but has an electronic device in the pickup state, the first electronic device may determine the electronic device in the pickup state as the second electronic device. When the electronic device D has no electronic device in the operation state and has no electronic device in the pickup state but has the first bright screen state, the first electronic device may determine the electronic device in the first bright screen state as the second electronic device. When the electronic device D has no electronic device in the operation state, no electronic device in the pickup state, and no electronic device in the first bright screen state, but has an electronic device in the service state, the first electronic device may determine the electronic device in the service state as the second electronic device. When the electronic device D has no electronic device in the operation state, no electronic device in the pickup state, no electronic device in the first bright screen state, and no electronic device in the service state, but has the electronic device in the second bright screen state, the first electronic device may determine the electronic device in the second bright screen state as the second electronic device. When the electronic device D has no electronic device in the operation state, no electronic device in the pickup state, no electronic device in the first bright screen state, no electronic device in the service state, and no electronic device in the second bright screen state, but has an electronic device in another state, the first electronic device may determine the electronic device in the other state as the second electronic device.
It should be noted that, when the electronic devices determined according to the second priorities corresponding to the usage states include a plurality of electronic devices, that is, when the electronic devices in the same second priority still include a plurality of electronic devices, the first electronic device may determine the second electronic device according to the first priority corresponding to the type of the electronic device. For example, the first electronic device may determine the second electronic device according to the first priority corresponding to the mobile phone > the first priority corresponding to the tablet computer > the first priority corresponding to the notebook computer > the first priority corresponding to the desktop computer. For example, when the electronic device D determined according to the second priority corresponding to the usage state includes a mobile phone and a tablet computer, the first electronic device may determine the mobile phone as the second electronic device according to the first priority corresponding to the type.
Alternatively, when the electronic devices determined according to the second priority corresponding to the use state include a plurality of electronic devices, that is, when a plurality (for example, M) of electronic devices fall under the same second priority, the first electronic device may determine N of the M electronic devices as second electronic devices. N and M are integers, with 2 ≤ N ≤ M and M ≥ 2. It should be understood that the specific value of N may be set by a technician according to the actual scenario, or set by the user in a user-defined manner.
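The selection logic described above (second priority by use state, a tie-break by the first priority of the device type, and a cap of N devices) might be sketched as follows; all priority values, state names, and the value of N are assumptions.

```python
# Sketch combining the second priority (use state), the first priority (device
# type) as a tie-breaker, and the cap of N devices; all values are assumptions.

STATE_SECOND_PRIORITY = {"operation": 6, "picked_up": 5, "first_bright_screen": 4,
                         "service": 3, "second_bright_screen": 2, "other": 1}
TYPE_FIRST_PRIORITY = {"phone": 4, "tablet": 3, "notebook": 2, "desktop": 1}
N_MAX = 2                                # assumed upper bound N on second electronic devices

def select_second_devices(devices_d: list[dict]) -> list[dict]:
    """devices_d: dicts with 'name', 'type', and 'state' keys."""
    if not devices_d:
        return []
    best_state = max(STATE_SECOND_PRIORITY[d["state"]] for d in devices_d)
    same_state = [d for d in devices_d if STATE_SECOND_PRIORITY[d["state"]] == best_state]
    if len(same_state) == 1:
        return same_state
    best_type = max(TYPE_FIRST_PRIORITY.get(d["type"], 0) for d in same_state)
    same_type = [d for d in same_state if TYPE_FIRST_PRIORITY.get(d["type"], 0) == best_type]
    return same_type[:N_MAX]             # keep at most N devices among the remaining ties

if __name__ == "__main__":
    pool = [{"name": "phone-1", "type": "phone", "state": "picked_up"},
            {"name": "tablet-1", "type": "tablet", "state": "picked_up"},
            {"name": "laptop-1", "type": "notebook", "state": "service"}]
    print([d["name"] for d in select_second_devices(pool)])   # ['phone-1']
```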
In another example, when the first electronic device displays the target content, the first electronic device may acquire one or more electronic devices a communicatively connected to the first electronic device, and acquire the use state of each electronic device a within a third preset duration. Then, from these electronic devices a, the first electronic device may determine the electronic device D whose use state first enters the preset state, and may determine one or more of the electronic devices D as the second electronic device, so as to determine, as the second electronic device, the electronic device most likely to be in use by the user within the third preset duration, so that the user can conveniently and timely cooperatively process the target content through the second electronic device.
For example, when a plurality of electronic devices D whose use states first enter the preset state are determined, the first electronic device may determine the second electronic device according to the second priority corresponding to the use state. When the electronic devices determined according to the second priority still include a plurality of electronic devices, the first electronic device may determine the second electronic device according to the first priority corresponding to the type of the electronic device, or may determine N of the plurality of electronic devices as second electronic devices. It should be appreciated that the third preset duration may be set by the technician according to the actual scenario; for example, the third preset duration may be set to 3 s.
In another possible implementation, the first electronic device may determine the second electronic device based on both the distance and the use state of the electronic device. In one example, the second electronic device may be an electronic device whose distance from the first electronic device is less than or equal to the preset distance threshold and whose use state is the preset state. In another example, to avoid sending the target content to another user's electronic device and thus disclosing the user's privacy, the second electronic device may be an electronic device whose distance from the first electronic device is less than or equal to the preset distance threshold, whose use state is the preset state, and that is logged in to the same user account as the first electronic device or can control the first electronic device. The specific determination process can be deduced by analogy from the foregoing description and is not described herein again.
For example, when the first electronic device displays the target content, the first electronic device may determine one or more electronic devices a communicatively connected to the first electronic device and obtain the distance between each electronic device a and the first electronic device. Then, from these electronic devices a, the first electronic device may determine the electronic devices B whose distances are less than or equal to the preset distance threshold, and determine, from the electronic devices B, the electronic devices C that are logged in to the same user account as the first electronic device or that can control the first electronic device. Then, the first electronic device may acquire the current use state of each electronic device C, determine the electronic device D whose use state is the preset state (or whose use state first enters the preset state) among the electronic devices C, and determine one or more of the electronic devices D as the second electronic device.
It may be understood that when the electronic device determined according to the distance and the usage state of the electronic device includes a plurality of electronic devices, or when the electronic device determined according to the distance, the usage state of the electronic device, and the user account number includes a plurality of electronic devices, the first electronic device may determine the second electronic device in combination with the type of the electronic device, that is, the first electronic device may determine the second electronic device in combination with the first priority corresponding to the type of the electronic device. The specific determination process may also refer to the analogy described above, and will not be described herein.
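One possible way to express the combined screening pipeline (distance, then user account or controllability, then use state) is sketched below; the candidate fields, the threshold, and the set of preset states are hypothetical.

```python
# Hypothetical screening pipeline: distance -> same account or controllable ->
# use state; the candidate fields, threshold, and preset states are assumptions.

from dataclasses import dataclass

PRESET_DISTANCE_THRESHOLD_M = 0.30
PRESET_STATES = {"operation", "picked_up", "first_bright_screen"}   # assumed preset states

@dataclass
class Candidate:
    name: str
    distance_m: float
    same_account_or_controllable: bool
    state: str

def screen_candidates(devices_a: list[Candidate]) -> list[Candidate]:
    devices_b = [d for d in devices_a if d.distance_m <= PRESET_DISTANCE_THRESHOLD_M]
    devices_c = [d for d in devices_b if d.same_account_or_controllable]
    devices_d = [d for d in devices_c if d.state in PRESET_STATES]
    return devices_d                     # one or more of these become the second electronic device

if __name__ == "__main__":
    pool = [Candidate("my phone", 0.10, True, "picked_up"),
            Candidate("living-room screen", 2.00, True, "service"),
            Candidate("colleague's phone", 0.20, False, "operation")]
    print([d.name for d in screen_candidates(pool)])   # ['my phone']
```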
In this embodiment of the present application, after determining the second electronic device, the first electronic device may send a processing request to the second electronic device to request that the target content be cooperatively processed by the second electronic device. The second electronic device may display a prompt message in a display interface of the second electronic device based on the processing request of the first electronic device, so as to prompt a user to cooperatively process the target content in the first electronic device through the second electronic device.
The second electronic device may display the prompt information in any display manner. In one example, the second electronic device may display the prompt information through a pop-up window. In another example, to reduce the interference that displaying the prompt information causes to the user, the second electronic device may display the prompt information through a small-sized hover ball. That is, the second electronic device may directly display the small-sized hover ball, using the hover ball as the prompt information to prompt the user to cooperatively process the target content through the second electronic device. The hover ball may be any graphic. For example, to help the user clearly identify the source of the target content to be processed, the hover ball may be the icon of the application corresponding to the target content. The following description takes, as an example, the case where the hover ball is the icon of the application corresponding to the target content.
It will be appreciated that the hover ball may be displayed at any position on the display interface, such as the lower left, upper left, lower right, upper right, or middle of the display interface. In one example, to reduce the impact of displaying the hover ball on the user's operations, the second electronic device may also determine the position of the hover ball in the display interface based on the user's operation position in the display interface, so as to display the hover ball at a position other than the operation position. For example, the user's operation position in the display interface may be determined from a touch event, or may be determined from the content displayed in the display interface. The manner of determining the user's operation position in the display interface may be set by a technician according to the actual scenario, which is not specifically limited in the embodiments of the present application.
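As an illustrative sketch of placing the hover ball away from the user's operation position, assuming the operation position is given by the latest touch coordinates:

```python
# Illustrative placement of the hover ball in the corner farthest from the
# user's latest touch position; corner margins and coordinates are assumptions.

from typing import Optional

def pick_hover_ball_position(screen_w: int, screen_h: int,
                             touch_x: Optional[int], touch_y: Optional[int]) -> tuple[int, int]:
    margin = 40
    corners = [(margin, margin), (screen_w - margin, margin),
               (margin, screen_h - margin), (screen_w - margin, screen_h - margin)]
    if touch_x is None or touch_y is None:
        return corners[2]                # default: lower left, as in the fig. 5 (c) example
    def squared_distance(corner: tuple[int, int]) -> int:
        return (corner[0] - touch_x) ** 2 + (corner[1] - touch_y) ** 2
    return max(corners, key=squared_distance)

if __name__ == "__main__":
    print(pick_hover_ball_position(1080, 2340, 900, 2200))   # a corner far from the touch point
```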
It should be noted that the second electronic device may be an electronic device in a bright screen state or an electronic device in a screen-off state. For example, when the processing request is received, if the second electronic device is in the bright screen state, the second electronic device may directly display the prompt information in the display interface; if the second electronic device is in the screen-off state, the second electronic device may first light up the display screen and then display the prompt information in the lit display interface. The display interface may be any interface, such as a lock screen interface, a main interface, or an application interface.
Referring to fig. 5, fig. 5 shows a schematic view of an application scenario provided in an embodiment of the present application. In this application scenario, the first electronic device may be a smart watch 502. As shown in fig. 5 (a), the smart watch 502 may display a short message from TOM: "The weather has been nice lately. Let's go on an outing together this weekend." As shown in fig. 5 (b), the second electronic device may be a mobile phone 503, and the current display interface of the mobile phone 503 is the main interface. When the mobile phone 503 receives a processing request sent by the smart watch 502, the processing request being used to request that the mobile phone 503 cooperatively process the short message, the mobile phone 503 may pop up the prompt window 500 in the main interface, and the prompt window 500 may display the prompt information "The smart watch requests to cooperatively process a short message through this device. Process it?" together with a "Process" button and an "Ignore" button. Icons of applications such as clock, calendar, gallery, memo, file management, email, music, sports health, camera, phone, and short message may also be included in the main interface of the mobile phone 503. When the user wants to cooperatively process the short message through the mobile phone 503, the user may click or touch the "Process" button; when the user does not want to process the short message through the mobile phone 503, the user may click or touch the "Ignore" button.
Alternatively, as shown in fig. 5 (c), the second electronic device may be a tablet computer 504, and the current display interface of the tablet computer 504 is a lock screen interface. When the tablet computer 504 receives the processing request sent by the smart watch 502, the tablet computer 504 may display the hover ball 501 in the lower left corner of the lock screen interface, where the hover ball 501 may be the icon of the short message application, so as to prompt the user to cooperatively process, through the tablet computer 504, the short message being displayed by the smart watch 502. When the user wants to cooperatively process the short message through the tablet computer 504, the user may click or touch the hover ball 501.
In the embodiment of the application, when the user wants to perform collaborative processing on the target content through the second electronic device, the user can execute the first preset operation in the second electronic device. When the second electronic device detects the first preset operation, an acquisition request can be sent to the first electronic device to request to acquire target content, and the acquired target content can be displayed, so that a user can cooperatively process the target content through the second electronic device.
Or when the second electronic device detects the first preset operation, the target content stored in the second electronic device can be directly displayed, so that the user can cooperatively process the target content through the second electronic device. The target content stored in the second electronic device may be target content that the first electronic device sends to the second electronic device. For example, when the first electronic device sends a processing request to the second electronic device, the first electronic device may send the target content at the same time, so that when the second electronic device detects the first preset operation, the second electronic device may directly display the target content, without acquiring the target content, thereby improving the display speed and improving the user experience.
Alternatively, the target content stored in the second electronic device may be target content received by the second electronic device itself. For example, the second electronic device may also install an application corresponding to the target content, and the second electronic device and the first electronic device may log in to the application through the same application account, so when the first electronic device receives the target content, the second electronic device may also receive the target content.
As shown in fig. 5 (b), when the second electronic device displays the prompt information through the prompt window 500, the first preset operation may be an operation of clicking or touching the "Process" button. Thus, when the second electronic device detects that the "Process" button is clicked or touched, the second electronic device may display the target content in the display interface.
As shown in fig. 5 (c), when the second electronic device displays the hover ball 501 as the prompt information, the first preset operation may be a click or touch operation on the hover ball 501. Thus, when the second electronic device detects an operation of clicking or touching the hover ball 501, the second electronic device may display the target content in the display interface.
In one example, the second electronic device may display the target content through a pop-up window. The size of the window may be set by default by the second electronic device. Illustratively, the second electronic device may also resize the window according to a related user operation (e.g., stretching the window edge). It should be understood that when the target content is displayed through the window, the second electronic device may adapt the layout of the target content to the size of the window, so as to improve the display effect of the target content and facilitate the user's cooperative processing of the target content through the second electronic device.
In another example, when the second electronic device and the first electronic device are both installed with an application corresponding to the target content, and the second electronic device and the first electronic device log in the application through the same application account, the second electronic device may also directly start the application corresponding to the target content and jump to the interface where the target content is located, so that the user may directly process the target content in the application.
In the embodiment of the application, the user may delete, reply to, or copy the target content in the second electronic device.
In one possible implementation, when the user processes the target content in the second electronic device, a prompt window may pop up in the display interface of the first electronic device, where the content "processing the target content in the second electronic device" may be displayed in the prompt window. After the processing of the target content is completed in the second electronic device, the processed content can be synchronously displayed in the first electronic device.
Alternatively, when the user processes the target content in the second electronic device, the processing state of the second electronic device may be synchronized in the first electronic device, for example, the content being input in the second electronic device is displayed in synchronization, the content being copied in the second electronic device is displayed in synchronization, and so on.
Referring to fig. 6 and fig. 7, fig. 6 and fig. 7 show a second application scenario schematic diagram provided in an embodiment of the present application. In this application scenario, the first electronic device may be a smart watch 502, and the second electronic device may be a mobile phone 503. As shown in fig. 5 (a), the smart watch 502 may display a short message from TOM: "The weather has been nice lately. Let's go on an outing together this weekend." As shown in fig. 6 (a), the interface currently displayed by the mobile phone 503 may be the main interface. When the mobile phone 503 receives a processing request sent by the smart watch 502, the mobile phone 503 may display the hover ball 501 in the main interface. When the user wants to cooperatively process the short message through the mobile phone 503, the user may click or touch the hover ball 501. As shown in fig. 6 (b), when the mobile phone 503 detects a click or touch operation on the hover ball 501, the mobile phone 503 may display the short message through the floating window 600; alternatively, as shown in fig. 6 (c), when the mobile phone 503 detects a click or touch operation on the hover ball 501, the mobile phone 503 may start the short message application and jump to the interface corresponding to the short message.
As shown in fig. 7 (a), after jumping to the interface corresponding to the short message, the user may reply to the short message in that interface, for example, by typing "OK, this weekend" in the input box. As shown in fig. 7 (b), a prompt window 700 may be displayed in the display interface of the smart watch 502, and "Processing the target content in the mobile phone" may be displayed in the prompt window 700; alternatively, as shown in fig. 7 (c), an input box 701 may also appear in the smart watch 502, and "OK, this weekend" may be displayed synchronously in the input box 701. As shown in fig. 7 (d) and fig. 7 (e), after the user completes the reply "OK, let's go together this weekend" in the mobile phone 503, the reply content may be synchronously displayed in the smart watch 502.
In another possible implementation manner, after the second electronic device displays the target content based on the processing request of the first electronic device, the first electronic device may exit the currently displayed target content, for example, may return to the main interface, or may still display the target content, but the first electronic device and the second electronic device are independent from each other, so when the user processes the target content in the second electronic device, the first electronic device may not respond to the processing of the second electronic device, that is, a prompt window of "processing the target content in the second electronic device" may not be popped up in the display interface of the first electronic device, and the processing state of the second electronic device may not be synchronized.
In order to maintain the synchronicity of the contents of the first electronic device and the second electronic device, after the user completes processing the target content in the second electronic device, the second electronic device may send the processed content to the first electronic device, so that the user may see the processed content in the second electronic device when viewing the target content in the first electronic device.
In this embodiment of the present application, when the second electronic device detects the first preset operation, it may first determine whether the second electronic device is currently displaying the lock screen interface. When the second electronic device is not currently on the lock screen interface, the second electronic device may regard the user currently using it as a legitimate user, and may directly display the target content in the display interface. When the second electronic device is currently on the lock screen interface, the second electronic device may display an unlock prompt interface to prompt the user to unlock the second electronic device. When the second electronic device detects the correct unlock password, the second electronic device may regard the current user as a legitimate user and, after unlocking, display the target content in the display interface; when the second electronic device does not detect the correct unlock password, the second electronic device may regard the current user as not being a legitimate user and may refrain from displaying the target content, so as to prevent the target content from being viewed by other users, thereby ensuring the privacy of the target content and protecting the user's privacy. The unlock password may be any one of a number, a fingerprint, a pattern, a face, and the like.
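The unlock-gated display flow described above can be sketched as follows; the callback names are placeholders rather than a real device API.

```python
# Placeholder callbacks stand in for real device behaviour; this only shows
# the unlock-gated branching described above.

def handle_first_preset_operation(on_lock_screen: bool, verify_unlock,
                                  show_unlock_prompt, show_target_content) -> bool:
    """Returns True if the target content ends up being displayed."""
    if not on_lock_screen:
        show_target_content()            # current user is treated as legitimate
        return True
    show_unlock_prompt()
    if verify_unlock():                  # number / fingerprint / pattern / face
        show_target_content()            # displayed after unlocking
        return True
    return False                         # wrong password: the target content stays hidden

if __name__ == "__main__":
    displayed = handle_first_preset_operation(
        on_lock_screen=True,
        verify_unlock=lambda: True,
        show_unlock_prompt=lambda: print("please unlock the device"),
        show_target_content=lambda: print("displaying target content"))
    print(displayed)
```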
Referring to fig. 8, fig. 8 shows a third application scenario diagram provided in an embodiment of the present application. In this application scenario, the first electronic device may be a smart watch 502, and the second electronic device may be a mobile phone 503. As shown in fig. 5 (a), the smart watch 502 may display a short message from TOM: "The weather has been nice lately. Let's go on an outing together this weekend." As shown in fig. 8 (a), the current display interface of the mobile phone 503 may be a lock screen interface. When the mobile phone 503 receives the processing request sent by the smart watch 502, the mobile phone 503 may display the hover ball 501 in the lock screen interface. When the user wants to cooperatively process the short message through the mobile phone 503, the user may click or touch the hover ball 501. Since the mobile phone 503 is still on the lock screen interface when the click or touch operation on the hover ball 501 is detected, as shown in fig. 8 (b), the mobile phone 503 may display an unlock prompt interface to prompt the user to unlock the mobile phone 503. When the user inputs the correct unlock password, as shown in fig. 6 (b), the mobile phone 503 may display the short message through the floating window 600 in the unlocked main interface; or, as shown in fig. 6 (c), the mobile phone 503 may start the short message application and jump directly to the interface where the short message is located to display the short message.
In one possible implementation manner, when the display duration of the prompt information reaches a fourth preset duration, the second electronic device may hide the prompt information, so as to reduce the interference that displaying the prompt information causes to the user. That is, when the user does not act on the prompt information within the fourth preset duration, the second electronic device may automatically hide the prompt information. The fourth preset duration may be set by a technician according to the actual scenario, or set by the user in a user-defined manner. For example, the user may set the fourth preset duration to 5 s, so that when the prompt information has been displayed in the display interface for 5 s, the prompt information automatically disappears from the display interface. Hiding the prompt information may mean deleting the prompt information, or covering the prompt information with other content.
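A minimal simulation of hiding the prompt information after the fourth preset duration is sketched below; a real device would use its own UI timer mechanism, and the 5 s value is an assumption.

```python
# Simulated auto-hide of the prompt information after the fourth preset
# duration; a real device would use its own UI timer, and 5 s is an assumption.

import threading

FOURTH_PRESET_DURATION_S = 5.0

def show_prompt_with_auto_hide(hide_prompt, duration_s: float = FOURTH_PRESET_DURATION_S) -> threading.Timer:
    timer = threading.Timer(duration_s, hide_prompt)   # fires only if the user never responds
    timer.start()
    return timer          # call timer.cancel() when the user taps the prompt in time

if __name__ == "__main__":
    t = show_prompt_with_auto_hide(lambda: print("prompt hidden"), duration_s=0.1)
    t.join()              # in this demo, the prompt is hidden after 0.1 s
```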
In another possible implementation manner, when the second electronic device detects the second preset operation on the prompt information, the second electronic device can also hide the prompt information, so that the user can actively hide the prompt information according to actual needs, and user experience is improved.
For example, as shown in fig. 5 (b), when the second electronic device displays the prompt information through the prompt window 500, the second preset operation may be an operation of clicking or touching the "Ignore" button. Thus, when the mobile phone 503 detects an operation of clicking or touching the "Ignore" button, the mobile phone 503 may hide the prompt information. Alternatively, a "Close" button may be displayed in the prompt window 500, and the second preset operation may be an operation of clicking or touching the "Close" button, so that when the mobile phone 503 detects an operation of clicking or touching the "Close" button, the mobile phone 503 may also hide the prompt information.
For example, as shown in fig. 5 (c), when the second electronic device displays the hover ball 501 as the prompt information, the second preset operation may be an operation of sliding the hover ball 501 to the left. Thus, when the mobile phone 503 detects an operation of sliding the hover ball 501 to the left, the mobile phone 503 may hide the prompt information. It should be understood that setting the second preset operation to the operation of sliding the hover ball 501 to the left in this application scenario is only a schematic explanation and should not be construed as limiting the embodiments of the present application; in the embodiments of the present application, the second preset operation may also be set to any other operation, such as sliding the hover ball 501 to the right, sliding the hover ball 501 upward, sliding the hover ball 501 downward, or double-clicking the hover ball 501.
In this embodiment of the present application, after the second electronic device hides the prompt information, the second electronic device may obtain a notification corresponding to the target content and may place the notification at the top of the notification bar of the second electronic device, so as to remind the user to cooperatively process the target content in time. In one example, the second electronic device may also add source information to the notification, where the source information may include device information of the first electronic device and the request time of the cooperative processing. The device information of the first electronic device may be the name or number of the first electronic device.
In one possible implementation, the notification corresponding to the target content may be a notification already stored in the second electronic device. For example, when the second electronic device and the first electronic device both have the application corresponding to the target content installed and both log in to the application through the same application account, both the first electronic device and the second electronic device may generate a notification corresponding to the target content when the application receives the target content. When the first electronic device displays the target content, the second electronic device may keep the notification corresponding to the target content, so that the notification does not disappear from the second electronic device.
At this time, when the first electronic device requests the second electronic device to cooperatively process the target content but the second electronic device does not cooperatively process the target content based on the processing request of the first electronic device, the second electronic device may hide the prompt information and may place the notification corresponding to the target content at the top of the notification bar of the second electronic device. For example, when the second electronic device displays the hover ball on the display interface based on the processing request of the first electronic device, if the user does not click or touch the hover ball within the fourth preset duration, the second electronic device may hide the prompt information and may place the notification corresponding to the target content at the top of the notification bar of the second electronic device.
It should be understood that when the first electronic device displays the target content, if the first electronic device does not request the second electronic device to cooperatively process the target content, or the second electronic device has already cooperatively processed the target content based on the processing request of the first electronic device, the second electronic device may hide the notification corresponding to the target content. Hiding the notification may mean deleting the notification, or covering the notification with other content, or the like.
In another possible implementation manner, the notification corresponding to the target content may also be a notification generated by the second electronic device according to the prompt information. That is, when no notification corresponding to the target content is stored in the second electronic device, the second electronic device may generate, according to the prompt information, a notification for cooperatively processing the target content, and may place the notification at the top of the notification bar of the second electronic device, so as to remind the user to cooperatively process the target content in time.
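The notification set-top behaviour described above might look roughly like the following sketch; the notification model and field names are assumptions, not a real notification API.

```python
# Hypothetical notification model showing the set-top behaviour with added
# source information; this is not a real notification API.

from dataclasses import dataclass, field

@dataclass
class Notification:
    app: str
    text: str
    source: dict = field(default_factory=dict)

def pin_collaboration_notification(notification_bar: list[Notification], target_app: str,
                                   prompt_text: str, source_device: str,
                                   request_time: str) -> list[Notification]:
    existing = next((n for n in notification_bar if n.app == target_app), None)
    if existing is None:
        existing = Notification(app=target_app, text=prompt_text)   # generated from the prompt
    else:
        notification_bar = [n for n in notification_bar if n is not existing]
    existing.source = {"device": source_device, "time": request_time}
    return [existing] + notification_bar        # set-top: placed first in the notification bar

if __name__ == "__main__":
    bar = [Notification("weather", "Sunny tomorrow"), Notification("sms", "New message from TOM")]
    bar = pin_collaboration_notification(bar, "sms", "New message from TOM", "smart watch", "now")
    print([(n.app, n.source) for n in bar])
```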
Referring to fig. 9, fig. 9 shows a fourth application scenario diagram provided in an embodiment of the present application. In this application scenario, the first electronic device may be a smart watch 502, and the second electronic device may be a mobile phone 503. As shown in fig. 5 (a), the smart watch 502 may display a short message from TOM: "The weather has been nice lately. Let's go on an outing together this weekend." The mobile phone 503 may display the hover ball 501 as the prompt information. As shown in fig. 9 (a), before the mobile phone 503 performs the notification set-top operation, the notification bar of the mobile phone 503 may include, from top to bottom, a notification corresponding to the weather application, a notification corresponding to the application market application, a notification corresponding to another application, and a notification corresponding to the short message application. As shown in fig. 9 (b), when the user does not click or touch the hover ball 501 within the fourth preset duration, the mobile phone 503 may hide the hover ball 501, and at the same time, the mobile phone 503 may place the notification corresponding to the short message at the top of the notification bar of the mobile phone 503 and may add source information to the notification, such as the device name "smart watch" and the request time "now", so as to remind the user to process the short message in time.
In the embodiment of the application, when the first electronic device displays the target content, the first electronic device may determine the second electronic device and may send a processing request to the second electronic device to request that the second electronic device cooperatively process the target content; the second electronic device displays the corresponding prompt information according to the processing request sent by the first electronic device; and when the user wants to cooperatively process the target content through the second electronic device, the user may perform the first preset operation on the second electronic device. The second electronic device can then rapidly display the target content based on the first preset operation, without requiring operations such as searching for the target content in the second electronic device, which simplifies the operation process, makes it convenient for the user to cooperatively process the target content through the second electronic device, and improves user experience.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation processes of the embodiments of the present application.
Corresponding to the content collaboration method described in the foregoing embodiments, the embodiments of the present application further provide a content collaboration device, where each module of the device may correspond to each step of implementing the content collaboration method.
It should be noted that, because the content of information interaction and execution process between the above devices/units is based on the same concept as the method embodiment of the present application, specific functions and technical effects thereof may be referred to in the method embodiment section, and will not be described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
The embodiment of the application also provides an electronic device, which comprises at least one memory, at least one processor and a computer program stored in the at least one memory and capable of running on the at least one processor, wherein the processor executes the computer program to enable the electronic device to realize the steps in any of the method embodiments. The structure of the electronic device may be as shown in fig. 1, for example.
Embodiments of the present application also provide a computer-readable storage medium storing a computer program that, when executed by a computer, causes the computer to implement the steps of any of the various method embodiments described above.
Embodiments of the present application provide a computer program product for causing an electronic device to carry out the steps of any of the various method embodiments described above when the computer program product is run on the electronic device.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the procedures in the methods of the foregoing embodiments of the present application may be implemented by a computer program instructing related hardware. The computer program may be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of each of the method embodiments described above may be implemented. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable storage medium may include at least: any entity or device capable of carrying the computer program code to an apparatus/electronic device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (random access memory, RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In some jurisdictions, in accordance with legislation and patent practice, the computer-readable storage medium may not be an electrical carrier signal or a telecommunications signal.
In the foregoing embodiments, the description of each embodiment has its own emphasis. For parts that are not described or detailed in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/electronic device and method may be implemented in other manners. For example, the apparatus/electronic device embodiments described above are merely illustrative; for instance, the division into modules or units is merely a logical function division, and there may be other divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections via interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application and are intended to be included within the protection scope of the present application.

Claims (20)

1. A content collaboration method, applied to a first electronic device, the method comprising:
the first electronic device displays target content;
the first electronic device determines a second electronic device and sends a processing request to the second electronic device, wherein the processing request is used for requesting cooperative processing of the target content through the second electronic device;
in response to an acquisition request from the second electronic device, the first electronic device sends the target content or an identification of the target content to the second electronic device, wherein, after the second electronic device obtains the processing request, prompt information is displayed on a display interface of the second electronic device.
2. A content collaboration method, applied to a first electronic device, the method comprising:
the first electronic device displays target content;
the first electronic device determines a second electronic device and sends a processing request and the target content to the second electronic device, or sends the processing request and an identification of the target content to the second electronic device, wherein the processing request is used for requesting the second electronic device to cooperatively process the target content, and wherein, after the second electronic device receives the processing request, prompt information is displayed on the second electronic device, the prompt information being used for prompting the second electronic device to cooperatively process the target content.
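To make the two first-device-side flows of claims 1 and 2 concrete, the following Python sketch models them side by side: in the claim 1 flow the first electronic device sends only the processing request and supplies the target content when the second electronic device issues an acquisition request; in the claim 2 flow the processing request already carries the content or its identification. This is a minimal illustrative sketch; all class, method, and variable names are assumptions made for this example and do not appear in the application.

from dataclasses import dataclass, field
from typing import Optional


@dataclass
class ProcessingRequest:
    """Request asking a second device to cooperatively process target content."""
    target_content: Optional[str] = None      # claim 2 variant: the content itself travels with the request
    target_content_id: Optional[str] = None   # claim 2 variant: or only an identification of the content


@dataclass
class FirstDevice:
    displayed_content: str   # the target content currently displayed
    content_id: str          # an identification of that content

    def send_request_only(self, second: "SecondDevice") -> None:
        # Claim 1 flow: only the processing request is sent; content follows on demand.
        second.on_processing_request(ProcessingRequest(), origin=self)

    def send_request_with_content(self, second: "SecondDevice", by_id: bool = False) -> None:
        # Claim 2 flow: the processing request carries the content or its identification.
        req = (ProcessingRequest(target_content_id=self.content_id) if by_id
               else ProcessingRequest(target_content=self.displayed_content))
        second.on_processing_request(req, origin=self)

    def on_acquisition_request(self) -> str:
        # Claim 1 flow: respond to the second device's acquisition request with the content.
        return self.displayed_content


@dataclass
class SecondDevice:
    local_store: dict = field(default_factory=dict)  # content the device already has, keyed by identification

    def on_processing_request(self, req: ProcessingRequest, origin: FirstDevice) -> None:
        print("prompt: a nearby device requests cooperative processing of its content")
        if req.target_content is not None:
            content = req.target_content
        elif req.target_content_id is not None:
            content = self.local_store[req.target_content_id]   # content already stored locally
        else:
            content = origin.on_acquisition_request()            # claim 1: fetch on demand
        print("display:", content)


if __name__ == "__main__":
    phone = FirstDevice(displayed_content="draft document", content_id="doc-42")
    tablet = SecondDevice(local_store={"doc-42": "draft document"})
    phone.send_request_only(tablet)                      # claim 1 flow
    phone.send_request_with_content(tablet, by_id=True)  # claim 2 flow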
3. The method of claim 1 or 2, wherein the first electronic device determining a second electronic device comprises:
the first electronic device acquires a display duration of the target content;
and when the display duration reaches a first preset duration, the first electronic device determines the second electronic device.
4. The method according to any one of claims 1 to 3, wherein the first electronic device determining a second electronic device comprises:
the first electronic device acquires a distance from at least one third electronic device, wherein the at least one third electronic device is an electronic device that has established a communication connection with the first electronic device;
the first electronic device determines the second electronic device from the at least one third electronic device according to the distance.
5. The method of claim 4, wherein the first electronic device determining the second electronic device from the at least one third electronic device according to the distance comprises:
the first electronic device acquires a type of the at least one third electronic device and determines a first priority corresponding to the type of each third electronic device;
the first electronic device determines the second electronic device from the at least one third electronic device according to the distance and the first priority.
6. The method according to claim 4 or 5, wherein the first electronic device determining the second electronic device from the at least one third electronic device according to the distance comprises:
the first electronic device determines a user account logged in on the third electronic device;
the first electronic device determines the second electronic device from the at least one third electronic device according to the distance and the user account; or
the first electronic device determines the second electronic device from the at least one third electronic device according to the distance, the user account, and the first priority corresponding to the type of the third electronic device.
7. The method according to any one of claims 1 to 3, wherein the first electronic device determining a second electronic device comprises:
the first electronic device acquires a use state of at least one third electronic device, wherein the at least one third electronic device is an electronic device that has established a communication connection with the first electronic device;
the first electronic device determines the second electronic device from the at least one third electronic device according to the use state.
8. The method of claim 7, wherein the first electronic device determining the second electronic device from the at least one third electronic device according to the use state comprises:
the first electronic device determines, among the at least one third electronic device, an electronic device whose use state is a preset state, and determines the second electronic device according to the electronic device whose use state is the preset state;
wherein the preset state is any one of an operation state, a picked-up state, a first screen-on state, a service state, and a second screen-on state; the operation state is a state in which the third electronic device has key input, touch input, keyboard input, or mouse input; the picked-up state is a state in which the third electronic device has been picked up; the first screen-on state is a state in which a screen-on duration of the third electronic device is less than a second preset duration; the service state is a state, other than the operation state, in which the third electronic device is being used; and the second screen-on state is a state in which the screen-on duration of the third electronic device exceeds the second preset duration but the third electronic device is not in the operation state, the service state, or the picked-up state.
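The five preset states enumerated in claim 8 can be pictured as a simple classifier over raw device signals. The Python sketch below is one way to express them; the threshold value, the signal names, and the precedence applied when several conditions hold at once are assumptions made for illustration (claim 10 leaves the relative ranking of the states to a configurable second priority).

from enum import Enum
from typing import Optional


class UseState(Enum):
    OPERATION = "operation"                 # key, touch, keyboard, or mouse input is occurring
    PICKED_UP = "picked_up"                 # the device has been picked up
    FIRST_SCREEN_ON = "first_screen_on"     # screen-on duration below the second preset duration
    SERVICE = "service"                     # in use in some way other than direct operation
    SECOND_SCREEN_ON = "second_screen_on"   # screen on longer than the preset duration, otherwise idle
    NONE = "none"                           # none of the preset states


SECOND_PRESET_DURATION_S = 30.0  # hypothetical threshold for the screen-on duration


def classify_use_state(has_input: bool, picked_up: bool, in_service: bool,
                       screen_on_s: Optional[float]) -> UseState:
    """Map raw signals of a third electronic device to the preset states of claim 8."""
    if has_input:
        return UseState.OPERATION
    if picked_up:
        return UseState.PICKED_UP
    if in_service:
        return UseState.SERVICE
    if screen_on_s is not None and screen_on_s < SECOND_PRESET_DURATION_S:
        return UseState.FIRST_SCREEN_ON
    if screen_on_s is not None:
        return UseState.SECOND_SCREEN_ON
    return UseState.NONE


print(classify_use_state(has_input=False, picked_up=True, in_service=False, screen_on_s=5.0))
# -> UseState.PICKED_UP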
9. The method of claim 8, wherein the first electronic device determining, among the at least one third electronic device, an electronic device whose use state is a preset state, and determining the second electronic device according to the electronic device whose use state is the preset state comprises:
when an electronic device whose use state is the picked-up state exists among the at least one third electronic device, the first electronic device determines whether the electronic device in the picked-up state is located in the same hand as the first electronic device;
when the electronic device in the picked-up state is located in the same hand as the first electronic device, the first electronic device determines the electronic device in the picked-up state as the second electronic device.
10. The method of claim 8, wherein the first electronic device determining, among the at least one third electronic device, an electronic device whose use state is a preset state, and determining the second electronic device according to the electronic device whose use state is the preset state comprises:
when a plurality of electronic devices whose use states are preset states exist among the at least one third electronic device, the first electronic device determines a second priority corresponding to each use state, and determines the electronic device with the highest second priority as the second electronic device.
11. The method according to any one of claims 7 to 10, wherein the first electronic device determining the second electronic device from the at least one third electronic device according to the use state comprises:
the first electronic device acquires a distance between the first electronic device and the at least one third electronic device;
the first electronic device determines the second electronic device from the at least one third electronic device according to the distance and the use state.
12. The method according to any one of claims 7 to 11, wherein the first electronic device determining the second electronic device from the at least one third electronic device according to the use state comprises:
the first electronic device acquires a type of the at least one third electronic device and determines a first priority corresponding to the type of each third electronic device;
the first electronic device determines the second electronic device from the at least one third electronic device according to the use state and the first priority; or
the first electronic device determines the second electronic device from the at least one third electronic device according to the use state, the first priority and the distance between the first electronic device and the third electronic device.
13. The method according to any one of claims 7 to 12, wherein the first electronic device determining the second electronic device from the at least one third electronic device according to the use state comprises:
the first electronic device determines a user account logged in on the at least one third electronic device;
the first electronic device determines the second electronic device from the at least one third electronic device according to the use state and the user account; or
the first electronic device determines the second electronic device from the at least one third electronic device according to the use state, the user account, and a first priority corresponding to the type of the third electronic device; or
the first electronic device determines the second electronic device from the at least one third electronic device according to the use state, the user account, and the distance between the first electronic device and the third electronic device; or
the first electronic device determines the second electronic device from the at least one third electronic device according to the use state, the user account, the first priority corresponding to the type of the third electronic device and the distance between the first electronic device and the third electronic device.
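Claims 3 to 13 combine several signals, namely display duration, distance, a first priority by device type, a second priority by use state, and the logged-in user account, without fixing how they are weighed against one another. The Python sketch below shows one possible composition of these factors; the priority tables, the distance threshold, and the tie-breaking order are assumptions chosen only for illustration.

from dataclasses import dataclass
from typing import List, Optional

# Hypothetical first priorities by device type (claim 5) and second priorities by use state (claim 10).
FIRST_PRIORITY = {"tablet": 3, "pc": 2, "watch": 1}
SECOND_PRIORITY = {"operation": 5, "picked_up": 4, "first_screen_on": 3, "service": 2, "second_screen_on": 1}


@dataclass
class Candidate:
    """A third electronic device that has a communication connection with the first device."""
    name: str
    device_type: str
    use_state: str
    distance_m: float
    account: str


def pick_second_device(candidates: List[Candidate], own_account: str,
                       max_distance_m: float = 3.0) -> Optional[Candidate]:
    """Illustrative selection: keep devices on the same account, within range, and in a preset
    state, then rank by use-state priority, device-type priority, and shortest distance."""
    eligible = [c for c in candidates
                if c.account == own_account
                and c.distance_m <= max_distance_m
                and c.use_state in SECOND_PRIORITY]
    if not eligible:
        return None
    return max(eligible, key=lambda c: (SECOND_PRIORITY[c.use_state],
                                        FIRST_PRIORITY.get(c.device_type, 0),
                                        -c.distance_m))


devices = [
    Candidate("tablet-A", "tablet", "picked_up", 1.2, "user@example.com"),
    Candidate("pc-B", "pc", "second_screen_on", 2.0, "user@example.com"),
    Candidate("watch-C", "watch", "operation", 0.3, "other@example.com"),
]
print(pick_second_device(devices, own_account="user@example.com").name)  # -> tablet-A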
14. A content collaboration method, applied to a second electronic device, the method comprising:
the second electronic device acquires a processing request from a first electronic device, wherein the processing request is used for requesting cooperative processing, through the second electronic device, of target content displayed by the first electronic device;
the second electronic device displays prompt information according to the processing request;
the second electronic device detects a first preset operation on the prompt information and displays the target content.
15. The method of claim 14, wherein the second electronic device displaying the target content comprises:
the second electronic device sends an acquisition request to the first electronic device, wherein the acquisition request is used for acquiring the target content or the identification of the target content;
the second electronic device receives the target content sent by the first electronic device and displays the target content; or
the second electronic device receives the identification of the target content sent by the first electronic device, and displays, according to the identification of the target content, target content stored in the second electronic device, wherein the target content stored in the second electronic device is content already possessed by the second electronic device.
16. The method of claim 14, wherein the second electronic device obtaining the processing request of the first electronic device comprises:
the second electronic device obtains the processing request of the first electronic device and the target content, or obtains the processing request of the first electronic device and the identification of the target content;
the second electronic device displaying the target content, comprising:
the second electronic device displays the target content, or displays, according to the identification of the target content, target content stored in the second electronic device, wherein the target content stored in the second electronic device is content already possessed by the second electronic device.
17. The method of any of claims 14 to 16, wherein the second electronic device displaying the prompt information comprises:
the second electronic device displays the prompt information through a floating ball.
18. The method of any of claims 14 to 17, wherein the second electronic device displaying the target content comprises:
the second electronic device displays the target content in a window.
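On the second-device side, claims 14 to 18 describe a small event-driven sequence: receive the processing request, display prompt information (for example through a floating ball), and, upon a first preset operation, display the target content in a window, either as received, resolved from locally stored content by its identification, or fetched from the first device via an acquisition request. The Python sketch below is one minimal way to model that sequence; the callback, class, and method names are assumptions made for this example.

from dataclasses import dataclass, field
from typing import Callable, Dict, Optional


@dataclass
class IncomingRequest:
    content: Optional[str] = None      # claim 16: the content may arrive together with the request
    content_id: Optional[str] = None   # or only an identification of the content


@dataclass
class SecondDeviceUI:
    local_content: Dict[str, str] = field(default_factory=dict)  # content the device already stores
    fetch_from_first: Optional[Callable[[], str]] = None         # claim 15: acquisition-request callback
    pending: Optional[IncomingRequest] = None

    def on_processing_request(self, req: IncomingRequest) -> None:
        # Claims 14 and 17: display prompt information, modelled here as a floating ball.
        self.pending = req
        print("floating ball: tap to cooperatively process content from the first device")

    def on_first_preset_operation(self) -> None:
        # Claims 14 and 18: the user acts on the prompt; resolve and display the content in a window.
        req = self.pending
        if req is None:
            return
        if req.content is not None:
            content = req.content
        elif req.content_id is not None and req.content_id in self.local_content:
            content = self.local_content[req.content_id]    # claims 15 and 16: content already stored locally
        elif self.fetch_from_first is not None:
            content = self.fetch_from_first()                # claim 15: request it from the first device
        else:
            return
        print("window:", content)


ui = SecondDeviceUI(local_content={"doc-42": "draft document"})
ui.on_processing_request(IncomingRequest(content_id="doc-42"))
ui.on_first_preset_operation()  # -> window: draft document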
19. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein, when the processor executes the computer program, the electronic device implements the content collaboration method of any one of claims 1 to 13 or claims 14 to 18.
20. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a computer, causes the computer to implement the content collaboration method of any one of claims 1 to 13 or claims 14 to 18.
CN202111510555.7A 2021-12-10 2021-12-10 Content collaboration method, electronic device, and computer-readable storage medium Pending CN116257201A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111510555.7A CN116257201A (en) 2021-12-10 2021-12-10 Content collaboration method, electronic device, and computer-readable storage medium
PCT/CN2022/136632 WO2023103974A1 (en) 2021-12-10 2022-12-05 Content collaboration method, electronic device, and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111510555.7A CN116257201A (en) 2021-12-10 2021-12-10 Content collaboration method, electronic device, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
CN116257201A true CN116257201A (en) 2023-06-13

Family ID: 86681382

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111510555.7A Pending CN116257201A (en) 2021-12-10 2021-12-10 Content collaboration method, electronic device, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN116257201A (en)
WO (1) WO2023103974A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109740055A (en) * 2018-12-27 2019-05-10 上海掌门科技有限公司 Reading information collaboration, methods of exhibiting, device, electronic equipment and medium
CN111028052A (en) * 2019-11-28 2020-04-17 维沃移动通信有限公司 Interface operation method and electronic equipment
CN111404802A (en) * 2020-02-19 2020-07-10 华为技术有限公司 Notification processing system and method and electronic equipment
CN115428413A (en) * 2020-02-19 2022-12-02 华为技术有限公司 Notification processing method, electronic equipment and system
CN113496426A (en) * 2020-04-02 2021-10-12 华为技术有限公司 Service recommendation method, electronic device and system
CN113286191B (en) * 2021-05-20 2022-08-12 Oppo广东移动通信有限公司 Content collaboration method, device, electronic equipment and storage medium

Also Published As

Publication number Publication date
WO2023103974A1 (en) 2023-06-15

Similar Documents

Publication Publication Date Title
CN109814766B (en) Application display method and electronic equipment
CN113645351B (en) Application interface interaction method, electronic device and computer-readable storage medium
WO2020134869A1 (en) Electronic device operating method and electronic device
CN110456951B (en) Application display method and electronic equipment
US11930130B2 (en) Screenshot generating method, control method, and electronic device
CN111543042B (en) Notification message processing method and electronic equipment
WO2021036770A1 (en) Split-screen processing method and terminal device
WO2021063237A1 (en) Control method for electronic device, and electronic device
CN110543287A (en) Screen display method and electronic equipment
CN111258700A (en) Icon management method and intelligent terminal
WO2021238370A1 (en) Display control method, electronic device, and computer-readable storage medium
CN115348350B (en) Information display method and electronic equipment
WO2021218429A1 (en) Method for managing application window, and terminal device and computer-readable storage medium
CN113805797B (en) Processing method of network resource, electronic equipment and computer readable storage medium
CN112130788A (en) Content sharing method and device
WO2022143180A1 (en) Collaborative display method, terminal device, and computer readable storage medium
CN112068907A (en) Interface display method and electronic equipment
CN113805487A (en) Control instruction generation method and device, terminal equipment and readable storage medium
WO2022048453A1 (en) Unlocking method and electronic device
WO2023103974A1 (en) Content collaboration method, electronic device, and computer readable storage medium
WO2022022381A1 (en) Method and apparatus for generating graffiti patterns, electronic device, and storage medium
CN116204093B (en) Page display method and electronic equipment
CN114764300B (en) Window page interaction method and device, electronic equipment and readable storage medium
WO2023221895A1 (en) Target information processing method and apparatus, and electronic device
WO2023109636A1 (en) Application card display method and apparatus, terminal device, and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination