CN114356187A - Content acquisition method and device

Content acquisition method and device

Info

Publication number
CN114356187A
Authority
CN
China
Prior art keywords
content
screen
equipment
content acquisition
acquisition instruction
Prior art date
Legal status
Pending
Application number
CN202110887743.5A
Other languages
Chinese (zh)
Inventor
李轩恺
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Publication of CN114356187A

Landscapes

  • Telephone Function (AREA)

Abstract

A content acquisition method and device. The method may include: a first device triggers a content acquisition instruction to a second device, where the content acquisition instruction is used to instruct the second device to perform a screen capture or screen recording operation on the content currently displayed on the display screen of the second device and to feed back the content obtained by the screen capture or screen recording operation to the first device; and the first device receives and displays the content, corresponding to the content acquisition instruction, sent by the second device. With this method, the first device can control the second device to perform a screen capture or screen recording operation, so that a user can conveniently obtain screen capture or screen recording content from the second device, improving the user experience.

Description

Content acquisition method and device
This application claims priority to Chinese Patent Application No. 202011048184.0, filed with the Chinese Patent Office on September 29, 2020 and entitled "A content acquisition method and apparatus", the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a content acquisition method and apparatus.
Background
Currently, an electronic device can acquire desired content through a screen capture or screen recording function. Taking screen capture as an example, a user can trigger a screen capture operation of the electronic device by sliding down with three fingers or by knocking with a knuckle; that is, the electronic device can only perform a screen capture operation on itself and cannot perform a screen capture operation on another device.
As the variety of electronic devices increases, linkage between different electronic devices is an inevitable trend. Therefore, how to perform a screen capture or screen recording operation on another device from one electronic device has become a problem to be solved.
Disclosure of Invention
The present application provides a content acquisition method and a content acquisition device, which are used to control another device from one device so as to acquire content on the other device.
In a first aspect, the present application provides a content acquisition method, including: a first device triggers a content acquisition instruction to a second device, where the content acquisition instruction is used to instruct the second device to perform a screen capture or screen recording operation on the content currently displayed on the display screen of the second device and to feed back the content obtained by the screen capture or screen recording operation to the first device; and the first device receives and displays the content, corresponding to the content acquisition instruction, sent by the second device.
Through this technical solution, the first device can control the second device to perform a screen capture or screen recording operation, and can receive and display the content obtained by that operation. In this way, one device can control one or more other devices to perform screen capture or screen recording operations, which strengthens the linkage between different devices, makes it convenient for users to obtain information quickly, and improves the user experience.
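As a minimal sketch only, with names that do not come from the application itself, the content acquisition instruction exchanged between the first device and the second device can be modeled as a small message carrying the requested operation and the identities of the two devices:

```kotlin
// Hypothetical message model for the content acquisition instruction;
// all type, field, and identifier names are illustrative, not from the application.
enum class Operation { SCREEN_CAPTURE, SCREEN_RECORD }

data class ContentAcquisitionInstruction(
    val requesterId: String,   // identity of the first device
    val targetId: String,      // identity of the second device
    val operation: Operation   // screen capture or screen recording
)

fun main() {
    // The first device triggers an instruction aimed at the second device.
    val instruction = ContentAcquisitionInstruction(
        requesterId = "phone-001",
        targetId = "smart-screen-001",
        operation = Operation.SCREEN_CAPTURE
    )
    println("First device triggers: $instruction")
}
```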
In this application, the triggering, by the first device, of the content acquisition instruction to the second device may be implemented in the following manners:
The first manner: the first device responds to a first operation of a user, where the first operation is used to trigger the content acquisition instruction to the second device.
The second manner: the first device triggers the content acquisition instruction to the second device by performing a bump (collision) operation with the second device.
The third manner: the first device sends a notification message to a third device, where the notification message is used to notify the third device to trigger the content acquisition instruction to the second device.
It should be noted that the solution of the present application may be applied to a short-range communication scenario, that is, the communication between the first device and the second device is a short-range communication.
Through the above three manners, a user can independently select the controlled device, can trigger the content acquisition instruction to the second device by bumping ("touching") the second device, or can indirectly trigger the content acquisition instruction to the second device through an intermediate device, so that a screen capture or screen recording operation on another device is controlled from one device. These three trigger paths are sketched below.
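A minimal illustration, with invented names, of how the three trigger manners might be distinguished in code:

```kotlin
// Illustrative only: the three ways the first device may trigger the
// content acquisition instruction, per the manners described above.
enum class TriggerMode { USER_OPERATION, BUMP_AGAINST_TARGET, VIA_THIRD_DEVICE }

fun trigger(mode: TriggerMode): String = when (mode) {
    // Manner 1: the user performs a first operation on the first device.
    TriggerMode.USER_OPERATION -> "send the instruction directly to the second device"
    // Manner 2: the first device bumps ("touches") the second device, e.g. over a short-range link.
    TriggerMode.BUMP_AGAINST_TARGET -> "send the instruction over the short-range link set up by the bump"
    // Manner 3: the first device notifies a third device, which triggers the instruction.
    TriggerMode.VIA_THIRD_DEVICE -> "send a notification message to the third device"
}

fun main() {
    TriggerMode.values().forEach { mode -> println("${mode.name}: ${trigger(mode)}") }
}
```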
In one possible design, before the first device responds to the first operation of the user, the first device may further respond to a second operation of the user, where the second operation is used to trigger display of a connectable device list, and the currently connectable device list is displayed on the display screen; and the first device responds to a third operation of the user by selecting the second device in the displayed connectable device list, where the third operation is an operation in which the user selects the corresponding device in the displayed connectable device list.
Through the technical scheme, the user can select the controlled device from the multiple devices, and then triggers a content acquisition instruction to the selected device on the first device, so that the second device is controlled to execute the operation.
In a possible design, in the process of receiving and displaying the content corresponding to the content acquisition instruction sent by the second device, the first device may also synchronously display a first interface, where the first interface includes a transmission progress of the content.
Through the technical scheme, the transmission progress of the content corresponding to the content acquisition instruction can be displayed on the first equipment, so that a user can know the receiving condition of the content of the opposite-end equipment for screen capture or screen recording, and the user experience is improved.
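One way the first interface might surface the transmission progress, sketched without any platform UI code; the byte counts and the text-bar rendering are placeholders:

```kotlin
// Placeholder sketch: render the transfer progress of the received content
// as a simple text bar; a real implementation would update a UI element instead.
fun renderProgress(receivedBytes: Long, totalBytes: Long): String {
    val percent = if (totalBytes == 0L) 0 else (receivedBytes * 100 / totalBytes).toInt()
    val filled = percent / 10
    return "[" + "#".repeat(filled) + "-".repeat(10 - filled) + "] $percent%"
}

fun main() {
    val totalBytes = 4_000_000L
    listOf(0L, 1_000_000L, 2_000_000L, 4_000_000L).forEach { received ->
        println(renderProgress(received, totalBytes))
    }
}
```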
In one possible design, the method further includes: the first device displays first prompt information, where the first prompt information is used to indicate the state in which the second device performs the operation corresponding to the content acquisition instruction.
Through the technical scheme, when the second device executes the operation corresponding to the content acquisition instruction, the first device can display the prompt information for prompting the second device to execute the operation state corresponding to the content acquisition instruction, so that a user can conveniently check whether the operation executed by the second device is correct.
In one possible design, the method further includes: the first device displays second prompt information, where the second prompt information is used to prompt the user that the content corresponding to the content acquisition instruction, sent by the second device, has been received.
Through the technical scheme, when the second device feeds back the content corresponding to the content acquisition instruction to the first device, the prompt message can be displayed on the first device, so that the user can timely receive the feedback content of the second device.
In a second aspect, the present application further provides a content obtaining method, including: the method comprises the steps that a second device receives a content obtaining instruction triggered by a first device, wherein the content obtaining instruction is used for indicating the second device to carry out screen capture or screen recording operation on content currently displayed on a display screen of the second device; the second equipment carries out screen capture or screen recording operation on the content currently displayed on the display screen of the second equipment according to the content acquisition instruction to obtain the content corresponding to the content acquisition instruction; and the second equipment sends the content corresponding to the content acquisition instruction to the first equipment.
In the technical scheme, the second device can execute the corresponding operation after receiving the content acquisition instruction triggered by the first device, and send the content corresponding to the content acquisition instruction to the first device, so that the content displayed on the second device can be captured or recorded in a mode that the first device controls the second device, user experience is improved, and the purpose of quickly sharing information between devices is achieved.
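A sketch of the second-device side of this flow, with the screen capture, screen recording, and transport layers stubbed out; none of the function names come from the application:

```kotlin
// Hypothetical second-device handler: parse the instruction, perform the
// corresponding operation, and return the resulting content to the requester.
enum class Op { CAPTURE, RECORD }

data class Instruction(val requesterId: String, val op: Op)

// Stubs standing in for the real screen capture / recording and transport code.
fun captureScreen(): ByteArray = ByteArray(16) { it.toByte() }   // pretend screenshot
fun recordScreen(seconds: Int): ByteArray = ByteArray(seconds)   // pretend video file
fun sendTo(deviceId: String, content: ByteArray) =
    println("sending ${content.size} bytes of content back to $deviceId")

fun handleInstruction(instruction: Instruction) {
    val content = when (instruction.op) {
        Op.CAPTURE -> captureScreen()
        Op.RECORD -> recordScreen(seconds = 15)
    }
    sendTo(instruction.requesterId, content)
}

fun main() = handleInstruction(Instruction(requesterId = "phone-001", op = Op.CAPTURE))
```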
In one possible design, the method further includes: and the second equipment displays third prompt information, wherein the third prompt information is used for prompting a user whether to execute a content acquisition instruction sent by the first equipment.
Through the technical scheme, the second device can display the prompt message on the display screen after receiving the content acquisition command triggered by the first device, so that the user can confirm the prompt message, and after the user confirms the prompt message, the second device executes corresponding operation, so that the content acquisition safety can be improved, and the user information safety can be protected.
In a possible design, in the process that the second device sends the content corresponding to the content obtaining instruction to the first device, a second interface may be further displayed, where the second interface includes a transmission progress of the content.
Through the technical scheme, the second device can display the transmission progress of the content on the display screen of the second device in the process of sending the content corresponding to the content acquisition instruction to the first device, so that a user can know the transmission condition, and the problem of transmission interruption caused by other factors such as network interruption is avoided.
In a third aspect, the present application further provides a content acquiring apparatus, including: a display screen; one or more processors; a memory; a plurality of applications; and one or more computer programs; wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when invoked for execution by the one or more processors, cause the content obtaining apparatus to perform any of the solutions of the first aspect and any possible design thereof.
In a fourth aspect, the present application further provides a content acquisition apparatus comprising means for performing the method of the first aspect or any one of the possible designs of the first aspect; these modules/units may be implemented by hardware, or by hardware executing corresponding software.
In a fifth aspect, the present application further provides a content acquiring apparatus, including: a display screen; one or more processors; a memory; a plurality of applications; and one or more computer programs; wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when invoked for execution by the one or more processors, cause the content obtaining apparatus to perform the solution as set forth in any one of the second aspects above and their possible designs.
In a sixth aspect, the present application further provides a content acquisition apparatus comprising means/units for performing the method of the second aspect or any one of the possible designs of the second aspect; these modules/units may be implemented by hardware, or by hardware executing corresponding software.
In a seventh aspect, an embodiment of the present application further provides a chip, where the chip is coupled with a memory in an electronic device, and implements a technical solution of any one of the first aspect and the first aspect of the embodiment of the present application; "coupled" in the context of this application means that two elements are joined to each other either directly or indirectly.
In an eighth aspect, an embodiment of the present application further provides a chip, where the chip is coupled to a memory in an electronic device, and executes a technical solution of any one of the second aspect and its possible design; "coupled" in the context of this application means that two elements are joined to each other either directly or indirectly.
In a ninth aspect, an embodiment of the present application further provides a computer-readable storage medium, where the computer-readable storage medium includes a computer program, and when the computer program runs on an electronic device, the electronic device is enabled to execute the technical solution of the first aspect of the embodiment of the present application and any one of the possible designs of the first aspect of the embodiment of the present application.
In a tenth aspect, an embodiment of the present application further provides a computer-readable storage medium, where the computer-readable storage medium includes a computer program, and when the computer program runs on an electronic device, the electronic device is enabled to execute a technical solution of any one of the second aspect and the possible designs of the second aspect of the embodiment of the present application.
In an eleventh aspect, an embodiment of the present application further provides a computer program product, which when run on an electronic device, causes the electronic device to execute the technical solution of the first aspect of the embodiment of the present application and any one of the possible designs of the first aspect of the embodiment of the present application.
In a twelfth aspect, an embodiment of the present application further provides a computer program product, which, when run on an electronic device, causes the electronic device to execute the technical solution of the second aspect of the embodiments of the present application and any one of the possible designs of the second aspect.
For each of the third to twelfth aspects and possible technical effects of each aspect, please refer to the description of the possible technical effects for each of the possible solutions in the first and second aspects, and no repeated description is given here.
Drawings
FIG. 1A is a schematic diagram of a user interface provided by an embodiment of the present application;
fig. 1B is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 3A is a schematic flow chart of a content acquisition method according to an embodiment of the present disclosure;
fig. 3B is a flowchart of a content obtaining method according to an embodiment of the present application;
FIG. 4A is a schematic diagram of a user interface provided by an embodiment of the present application;
fig. 4B is a schematic software module diagram of a content obtaining method according to an embodiment of the present application;
FIGS. 4C-4D are schematic diagrams of user interfaces provided by embodiments of the present application;
fig. 5A is a schematic flow chart of a content acquisition method according to an embodiment of the present application;
fig. 5B is a flowchart of a content obtaining method according to an embodiment of the present application;
FIG. 6A is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 6B is a schematic diagram of an operation provided by an embodiment of the present application;
fig. 6C is a schematic diagram of software modules of a content obtaining method according to an embodiment of the present application;
FIGS. 6D-7B are schematic diagrams of user interfaces provided by embodiments of the present application;
FIG. 7C is a schematic diagram of an operation provided by an embodiment of the present application;
FIG. 7D is a schematic diagram of a user interface provided by an embodiment of the present application;
fig. 7E is a schematic software module diagram of a content obtaining method according to an embodiment of the present application;
fig. 8A is a schematic flow chart of a content acquisition method according to an embodiment of the present application;
fig. 8B is a flowchart of a content obtaining method according to an embodiment of the present application;
FIG. 9A is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 9B is a schematic diagram of an operation provided by an embodiment of the present application;
FIG. 9C is a schematic view of a user interface provided by an embodiment of the present application;
fig. 9D is a schematic software module diagram of a content obtaining method according to an embodiment of the present application;
FIG. 10 is a schematic view of a user interface provided by an embodiment of the present application;
fig. 11 is a flowchart of a content obtaining method according to an embodiment of the present application;
fig. 12 is a schematic diagram of a content acquiring apparatus according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of another content acquiring apparatus according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of another content acquiring apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described in detail below with reference to the drawings in the following embodiments of the present application.
For ease of understanding, an explanation of concepts related to the present application is given by way of example for reference, as follows:
1. screen shot: also known as screenshots, screen elements such as desktops, windows, dialog boxes, tabs, etc., displayed by a display screen of an electronic device are saved as pictures.
2. Screen recording: also called screen recording, records the operation process of the user on the display screen of the electronic equipment and saves the operation process as a file in a video format.
Illustratively, fig. 1A is a schematic diagram of a user interface provided in an embodiment of the present application. Referring to fig. 1A, assume that a user opens a notification bar through a gesture operation, for example sliding down from the top of the display screen, so that the notification bar is displayed on the display screen of the mobile phone; the user can then click a function button in the notification bar to control the mobile phone to perform a corresponding operation. For example, the user may click the "screen capture" button 11, and the mobile phone may perform a screen capture operation in response to the user clicking the "screen capture" button 11. For another example, the user clicks the "screen recording" button 12, and the mobile phone may perform a screen recording operation in response to the user clicking the "screen recording" button 12.
Currently, taking a mobile phone as an example, the mobile phone may capture a currently displayed content on a display screen of its own device by performing a gesture operation on the touch screen (for example, a finger joint taps the touch screen), but the mobile phone cannot capture or record a currently displayed content on a display screen of another device. Therefore, how to control another device to perform screen capture or recording operation through one device to acquire the content currently displayed on the display screen of the other device is a problem to be solved.
In order to solve the above technical problem, an embodiment of the present application provides a content obtaining method, where one device sends an instruction for triggering screen capture or screen recording to another device or another devices to control the another device or the another devices to perform screen capture or screen recording, so as to obtain content currently displayed on a display screen of the another device or the another devices.
It is to be understood that the "content" in the embodiment of the present application may include an image, a video, a text, an icon, or the like currently displayed on a display screen of another device or another plurality of devices, and the present application is not particularly limited thereto.
An application (App) related to the embodiment of the present application is, for short, an application, and is a software program capable of implementing one or more specific functions. Generally, a plurality of applications may be installed in an electronic device. Such as camera applications, short message applications, mailbox applications, information applications, galleries, maps, Tencent videos, and the like. The application mentioned below may be an application installed in the electronic device when the electronic device is shipped from a factory, or an application downloaded from a network or acquired by another electronic device during use of the electronic device by a user.
It should be noted that the content obtaining method provided in the embodiment of the present application may be applicable to any electronic device with a display screen, such as a mobile phone, a tablet computer, a wearable device (e.g., a watch, a bracelet, an intelligent helmet, etc.), an in-vehicle device, a smart home, an Augmented Reality (AR)/Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), and the like, and the embodiment of the present application is not limited.
For ease of description, the following takes two devices, a first device and a second device, as an example to describe the process in which one device sends an instruction for triggering screen capture or screen recording to another device so as to control the other device to perform the screen capture or screen recording and thereby acquire the content currently displayed on the other device's display screen; the principle by which one device acquires content displayed on the display screens of multiple other devices is similar. Assume that the first device controls the second device to perform a screen capture or screen recording operation to obtain the display content on the display screen of the second device. The first device may be a mobile phone, a tablet computer, or the like, and the second device may be a watch, a large-screen device (e.g., a smart screen), or the like, which is not limited in this application.
The structure of the electronic device will be described below by taking the first device and the second device as mobile phones as an example.
As shown in fig. 1B, the mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. The controller may be a neural center and a command center of the cell phone 100, among others. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution. A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the mobile phone 100, and may also be used to transmit data between the mobile phone 100 and peripheral devices. The charging management module 140 is configured to receive charging input from a charger. The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the mobile phone 100 can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the handset 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the mobile phone 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the handset 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the handset 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), Long Term Evolution (LTE), LTE, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The display screen 194 is used to display a display interface of an application and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the mobile phone 100 may include 1 or N display screens 194, where N is a positive integer greater than 1. In this embodiment, the display screen 194 may be used to display multiple application interfaces simultaneously.
The camera 193 is used to capture still images or video. The cameras 193 may include a front camera and a rear camera.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 implements various functional applications and data processing of the mobile phone 100 by executing the instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, software code of at least one application program (such as an iQIYI application or a WeChat application), and the like. The data storage area may store data (such as images and videos) generated during use of the mobile phone 100. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the mobile phone 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as pictures, videos, and the like are saved in an external memory card.
The mobile phone 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The pressure sensor 180A is used to sense a pressure signal and convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The gyroscope sensor 180B may be used to determine the motion posture of the mobile phone 100. In some embodiments, the angular velocity of the mobile phone 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyroscope sensor 180B.
The gyroscope sensor 180B may also be used for image stabilization during photographing. The air pressure sensor 180C is used to measure air pressure. In some embodiments, the mobile phone 100 calculates the altitude from the air pressure measured by the air pressure sensor 180C to assist positioning and navigation. The magnetic sensor 180D includes a Hall sensor. The mobile phone 100 can detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the mobile phone 100 is a flip phone, the mobile phone 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D, and features such as automatic unlocking upon flip opening can then be set according to the detected opening or closing state of the holster or the flip cover. The acceleration sensor 180E can detect the magnitude of the acceleration of the mobile phone 100 in various directions (typically along three axes), and can detect the magnitude and direction of gravity when the mobile phone 100 is stationary. It can also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
The distance sensor 180F is used to measure distance. The mobile phone 100 may measure distance by infrared or laser. In some embodiments, in a shooting scenario, the mobile phone 100 may use the distance sensor 180F to measure distance for fast focusing. The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The mobile phone 100 emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the mobile phone 100; when insufficient reflected light is detected, the mobile phone 100 can determine that there is no object nearby. The mobile phone 100 can use the proximity light sensor 180G to detect that the user is holding the mobile phone 100 close to the ear for a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. The handset 100 may adaptively adjust the brightness of the display 194 according to the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the mobile phone 100 is in a pocket to prevent accidental touches. The fingerprint sensor 180H is used to collect a fingerprint. The mobile phone 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, take a photograph of the fingerprint, answer an incoming call with the fingerprint, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the handset 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the mobile phone 100 performs a reduction in performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the cell phone 100 heats the battery 142 when the temperature is below another threshold to avoid an abnormal shutdown of the cell phone 100 due to low temperatures. In other embodiments, when the temperature is lower than a further threshold, the mobile phone 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown due to low temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the mobile phone 100, different from the position of the display 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The cellular phone 100 may receive a key input, and generate a key signal input related to user setting and function control of the cellular phone 100. The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc. The SIM card interface 195 is used to connect a SIM card. The SIM card can be attached to and detached from the cellular phone 100 by being inserted into the SIM card interface 195 or being pulled out from the SIM card interface 195.
It will be understood that the components shown in fig. 1B are not intended to be limiting, and that the handset may include more or fewer components than those shown, or some components may be combined, or some components may be split, or a different arrangement of components. In the following embodiments, the mobile phone 100 shown in fig. 1B is taken as an example for description.
Furthermore, the embodiments described below relate to at least one, including one or more; wherein a plurality means greater than or equal to two. In addition, it is to be understood that the terms first, second, etc. in the description of the present application are used for distinguishing between the descriptions and not necessarily for describing a sequential or chronological order.
The content acquisition method provided in the embodiments of the present application can be applied to a scenario in which multiple electronic devices are interconnected over a communication network. As shown in fig. 2, the scenario includes a plurality of electronic devices such as a mobile phone, an electronic device 1, an electronic device 2, an electronic device 3, and an electronic device 4. The electronic devices may be a watch, a tablet computer, a smart screen, a remote controller, and the like, and the mobile phone can interact with the watch, the smart screen, and so on. The communication network may be a local area network or a wide area network, which is not specifically limited in the embodiments of the present application. Illustratively, the communication network may be a short-range communication network such as a Wi-Fi hotspot network, a Wi-Fi P2P network, a Bluetooth network, a ZigBee network, or an NFC network.
In the scenario shown in fig. 2, the mobile phone may establish a connection with another electronic device, and the mobile phone may then control the other electronic device to perform a screen capture or screen recording operation; the other electronic device may transmit the content obtained by the operation (for example, a picture or a video) to the mobile phone through the local area network, so that the mobile phone obtains the content displayed on the display screen of the other electronic device. For example, the mobile phone may "touch" the electronic device 3 (a watch), establish an NFC communication connection with the watch through the "touch", and trigger an instruction for a screen capture operation to the watch. After receiving the screen capture instruction, the watch can capture its currently displayed content and, after completing the screen capture operation, transmit the captured content (which may be information such as a screenshot of a picture or of text) to the mobile phone through Bluetooth or Wi-Fi. Of course, the above takes the first device controlling the second device to perform the screen capture or screen recording operation as an example; in actual implementation, the second device may also control the first device to perform the screen capture or screen recording operation. Continuing the above example, the watch may also trigger an instruction for a screen capture operation to the mobile phone; after receiving the instruction, the mobile phone captures its currently displayed content and, after completing the screen capture operation, transmits the captured content (which may be information such as a screenshot of a picture or of text) to the watch through Bluetooth or Wi-Fi.
Further, in this embodiment of the application, an authority table may be stored in the originating-side electronic device, where the authority table is used to characterize to which electronic devices the originating-side electronic device may send the screen capture operation instruction or the screen recording operation instruction and/or to which electronic devices the electronic device may not send the screen capture operation instruction or the screen recording operation instruction. In other words, the authority table may be used to characterize which electronic devices may be controlled by the originating electronic device to perform screen capture or recording. For example, the authority table can be referred to as table 1.
TABLE 1
[Table 1 appears in the original publication as an image and is not reproduced here; it records, for pairs of electronic devices, whether screen capture authority and screen recording authority are present, marked "1" or "0".]
In table 1, "1" indicates the presence of an authority, and "0" indicates the absence of an authority. For example, the electronic device a may capture a screen of the electronic device C, but may not record the screen of the electronic device C. It is to be understood that table 1 is merely illustrative and may be represented in other forms in the present application without limitation.
Even when the initiating electronic device has the authority to send a screen capture or screen recording instruction to other electronic devices, the other electronic devices, upon receiving the instruction, may accept or decline the operation instruction sent by the initiating device. Only after another electronic device agrees to accept the operation instruction can the initiating device control it to perform the screen capture or screen recording operation. For example, after electronic device A sends a screen capture instruction to electronic device B, electronic device A can control electronic device B to perform the screen capture operation only when electronic device B allows electronic device A to capture its screen.
It should be noted that the above description uses the granularity of the electronic device as an example; that is, when one electronic device allows another electronic device to perform a screen capture or screen recording operation, all content displayed on that device may be captured or recorded by the other electronic device. In practical applications, permissions may also be set for different content on one electronic device. For example, content displayed on the electronic device that includes a user's private information (such as the user's identification number or bank card number) may not be allowed to be captured or recorded by other electronic devices, while content that does not include the user's private information may be allowed to be captured or recorded by other electronic devices, and so on.
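The authority table and the per-content restriction could be checked along the following lines; this is one possible reading of Table 1 with assumed values, not the application's own data format:

```kotlin
// Illustrative authority check. "1"/"0" in Table 1 map to true/false here.
data class Permission(val canCapture: Boolean, val canRecord: Boolean)

// Authorities held by the initiating device over other devices (assumed values).
val authorityTable = mapOf(
    "electronic device B" to Permission(canCapture = true, canRecord = true),
    "electronic device C" to Permission(canCapture = true, canRecord = false)
)

// Per-content restriction: content flagged as containing private information
// (e.g. an ID number or bank card number) is excluded regardless of the table.
fun mayCapture(targetId: String, contentIsPrivate: Boolean): Boolean =
    !contentIsPrivate && (authorityTable[targetId]?.canCapture ?: false)

fun main() {
    println(mayCapture("electronic device C", contentIsPrivate = false)) // true: capture allowed
    println(mayCapture("electronic device C", contentIsPrivate = true))  // false: private content
}
```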
The content acquisition method related in the embodiments of the present application is described in detail below with reference to specific scenarios.
Scene one: the main device selects the target device, and interaction between the main device and the target device is further achieved. It should be noted that the master device may be understood as a first device, such as a mobile phone, and the target device may be understood as a second device, such as an electronic device, such as a smart screen or a watch.
Fig. 3A shows a content acquisition method according to an embodiment of the present application. On the main device side, the main device receives a screen capture or screen recording event, then queries and displays an available device list; the user can select a target device from the available device list and send a screen capture or screen recording instruction to the target device. On the target device side, the target device parses the screen capture or screen recording instruction sent by the main device, starts to perform the screen capture or screen recording operation, and, after the operation is completed, transmits the content obtained by the screen capture or screen recording to the main device for storage.
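A compressed, purely illustrative walk-through of this flow on both sides (device discovery and selection, instruction, capture, transfer back); every name and value below is an assumption:

```kotlin
// Hypothetical end-to-end walk-through of scene one.
fun queryAvailableDevices(): List<String> = listOf("smart screen", "watch")

fun main() {
    // Main device: show the available device list and let the user pick one.
    val devices = queryAvailableDevices()
    val target = devices.first()                 // user selects "smart screen"
    println("user selected target device: $target")

    // Main device: send the screen capture instruction to the target device.
    println("sending screen capture instruction to $target")

    // Target device: parse the instruction and capture the displayed content.
    val screenshot = ByteArray(1024)             // stand-in for the captured image
    println("$target captured ${screenshot.size} bytes of displayed content")

    // Target device: transmit the captured content back to the main device for storage.
    println("transferring the screenshot back to the main device for storage")
}
```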
The content acquiring method related to the present application will be described in detail below with reference to a schematic user interface diagram. As shown in fig. 3B, a flowchart of a content obtaining method provided in an embodiment of the present application is shown, and referring to fig. 3B, the method may include the following steps:
step 301: the master device responds to the user's operation of selecting the target device.
For example, the main device is a mobile phone and the target device is a smart screen. Suppose a user is viewing a map on the smart screen and wants to share with other users a screenshot of the geographical location of a certain place on the map (for ease of description, the image to be captured may be referred to as the "target content"). Referring to fig. 4A, the mobile phone 100 displays a main interface 400. The main interface 400 may include a plurality of applications such as email, clock, and gallery. The user may call up a control bar through a gesture operation on the main interface 400 and then select the target device in the control bar. Illustratively, the gesture operation may be a sliding operation from the lower left corner to the upper left corner on the main interface 400 of the mobile phone 100. When the mobile phone 100 detects the gesture operation, a user interface 410 may be displayed on the main interface. The user interface 410 may include a control bar 411, and the control bar 411 includes a connectable device list with at least one device, such as "smart screen" 412 and "MateBook X" 413. The user may select "smart screen" 412 in the connectable device list, and the mobile phone 100 may display a user interface 420 on the display screen in response to the user clicking "smart screen" 412. The user interface 420 may include a plurality of control function options, for example a "screen capture" button 421, a "screen recording" button 422, and so on.
It should be noted that, in the embodiment of the present application, the electronic devices in the connectable device list displayed in the user interface 410 may be other electronic devices in the same local area network as the host device (mobile phone).
In other embodiments, the gesture operation may also be another operation, for example a sliding operation between other corners of the display screen (such as from the lower right corner upward or from the upper left corner downward). In still other embodiments, the gesture operation may be such a sliding operation performed with a certain pressure; for example, when the mobile phone 100 detects a sliding operation from the upper left corner to the lower left corner and a pressure sensor provided on the display screen detects that the pressure value generated by the sliding operation is greater than a threshold, the mobile phone 100 displays the control bar 411.
In other embodiments, the gesture operation may also be a sliding operation that stops at the end position and remains pressed there for a preset duration. For example, when the mobile phone 100 detects a downward sliding operation from the upper left corner and detects that the sliding operation has reached the end position and stays or presses there for a preset duration (for example, 2 seconds), the mobile phone 100 displays the control bar 411.
In other embodiments, the gesture operation may also be other operations, such as an operation of drawing a circle, drawing a polygon, etc. on the display screen; alternatively, the gesture operation may also be an operation such as "shake", and the embodiment of the present application is not limited.
Step 302: the master device triggers a content acquisition instruction.
For example, the user may click the "screen capture" button 421 in the user interface 420, and the mobile phone 100 may respond to the user's click operation on the "screen capture" button 421 and send a screen capture instruction to the smart screen. It is understood that if the user clicks the "record screen" button 422 in the interface 420, the mobile phone 100 may respond to the user's click operation of the "record screen" button 422 and send a record screen instruction to the smart screen.
For example, referring to fig. 4B, the user may select the smart screen from the available device list displayed on the mobile phone; the control signaling module then passes the screen capture instruction, triggered by the user for the smart screen, to the soft bus (SB) of the mobile phone, and the soft bus of the mobile phone sends the screen capture instruction to the soft bus of the smart screen over a near-field network.
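The routing from the control signaling module through the two soft buses could be pictured roughly as below; the classes and methods are invented for illustration and do not correspond to any real soft-bus API:

```kotlin
// Invented-for-illustration routing chain: control signaling module ->
// local soft bus -> near-field network -> remote soft bus -> remote module.
class SoftBus(val deviceName: String) {
    fun forwardOverNearFieldNetwork(peer: SoftBus, message: String) {
        println("$deviceName soft bus -> near-field network -> ${peer.deviceName} soft bus")
        peer.deliverLocally(message)
    }
    fun deliverLocally(message: String) =
        println("$deviceName control signaling module receives: $message")
}

fun main() {
    val phoneBus = SoftBus("mobile phone")
    val smartScreenBus = SoftBus("smart screen")
    // The phone's control signaling module hands the instruction to its soft bus,
    // which forwards it to the smart screen's soft bus over the near-field network.
    phoneBus.forwardOverNearFieldNetwork(smartScreenBus, "screen capture instruction")
}
```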
Step 303: and the target equipment executes corresponding operation according to the content acquisition instruction to obtain the content corresponding to the content acquisition instruction.
After the user clicks the "screen capture" button 421 on the user interface 420, the mobile phone 100 may trigger a screen capture instruction to the smart screen in response to the user clicking the "screen capture" button 421. Correspondingly, the soft bus on the smart screen can send the received screen capture instruction triggered by the mobile phone to the control signaling module on the smart screen, and then the control signaling module of the smart screen can control the smart screen to perform screen capture operation on the target content to obtain the screen capture of the target content displayed on the smart screen.
If the user clicks the "screen recording" button 422 in the interface 420, the mobile phone 100 may trigger a screen recording instruction to the smart screen in response to the user clicking the "screen recording" button 422. After receiving the screen recording instruction, the smart screen may perform a screen recording operation on the displayed target content for a certain duration (for example, 15 s) to obtain a video file of the target content displayed on the smart screen during that duration.
As a possible implementation, after receiving the screen recording instruction, the smart screen may perform a screen recording operation on the content it displays, and at the same time the recording duration may be displayed on the display screen of the mobile phone 100. As shown in fig. 4C, the user interface 450 may be displayed on the display screen of the mobile phone 100 while the smart screen is recording, where the recording duration can be represented by a shaded portion of the "screen recording" button. When the display screen of the mobile phone 100 displays the user interface 460, that is, when the shading of the "screen recording" button is completely filled, the smart screen has stopped recording.
As another possible implementation, the smart screen may record the screen for a duration determined by the user. Illustratively, with continued reference to fig. 4A, the user may click the "screen recording" button 422 in the user interface 420 of the mobile phone, and the smart screen then begins the screen recording operation; if the user wants to stop the recording, the user may click the "screen recording" button 422 in the user interface 420 again, at which point the smart screen stops the screen recording operation. That is, the user can choose the screen recording duration according to their own needs, which improves the user experience. The two behaviours are sketched below.
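A rough sketch, with placeholder names and durations, of the two recording behaviours described above (a fixed duration versus stopping when the user taps the button a second time):

```kotlin
// Illustrative recording controller supporting both behaviours described above.
class RecordingSession {
    var running = false
        private set
    fun start() { running = true; println("screen recording started") }
    fun stop() { running = false; println("screen recording stopped") }
}

fun main() {
    // Behaviour 1: record for a fixed duration (e.g. 15 s), then stop automatically.
    val fixedDuration = RecordingSession()
    fixedDuration.start()
    println("... records for a fixed duration, e.g. 15 seconds ...")
    fixedDuration.stop()

    // Behaviour 2: the user taps the "screen recording" button again to stop.
    val userControlled = RecordingSession()
    userControlled.start()            // first tap of the button
    println("... records until the user taps the button again ...")
    userControlled.stop()             // second tap of the button
}
```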
Step 304: and the target equipment transmits the content corresponding to the content acquisition instruction to the main equipment.
For example, with continued reference to fig. 4A and 4B, after the smart screen captures the displayed interface, the control signaling module on the smart screen may send the captured image to the file transfer module. The file transfer module then transfers the obtained content to the mobile phone 100 through the soft bus, and the user interface 430 may be displayed on the mobile phone 100 during the image transfer. The user interface 430 may display the screenshot with a transmission progress of 50% (i.e., the black half ring around "screen capture" in the figure). When the transmission progress reaches 100%, that is, after the transmission is completed, a user interface 440, for example the map shown in the figure, may be displayed on the display screen of the mobile phone 100.
Of course, in an actual product, the transmission progress may be displayed in colors such as blue or red, or in other display forms; for example, the transmission progress may also be represented by the size of the shaded area. This is not limited in this application.
As another example, as shown in fig. 4C, after the smart screen completes the screen recording operation, the smart screen may transmit the recorded file to the mobile phone 100, and during the file transmission the user interface 470 may be displayed on the display screen of the mobile phone 100. The user interface 470 may display a file transfer progress 471, for example in text form, such as the "30% transferred" illustrated in the figure. It should be noted that the file transfer progress may also be presented in other forms, which is not limited in this application.
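For illustration only, the following Kotlin sketch shows one way the sending device could report transfer progress in chunks, so that the receiving device can render the progress as a shaded ring, a colored bar, or text such as "30% transferred". The chunked channel and the callbacks are hypothetical.

```kotlin
// Illustrative sketch only: the transfer channel and the progress callback are hypothetical.
fun transferWithProgress(
    data: ByteArray,
    sendChunk: (ByteArray) -> Unit,      // hands one chunk to the underlying channel
    onProgress: (percent: Int) -> Unit,  // reported to the receiving device's UI
    chunkSize: Int = 64 * 1024
) {
    var sent = 0
    while (sent < data.size) {
        val end = minOf(sent + chunkSize, data.size)
        sendChunk(data.copyOfRange(sent, end))
        sent = end
        // The UI can render this value as a shaded ring, a colored bar,
        // or text such as "30% transferred".
        onProgress(sent * 100 / data.size)
    }
}
```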
Fig. 4A and 4C are schematic diagrams of user interfaces for content acquisition on the main device (the mobile phone). The following describes the process in which the controlled-end device, i.e., the target device, such as the smart screen, is controlled to execute the operation instruction.
Referring to fig. 4D, before the mobile phone triggers the operation execution instruction, the smart screen displays the user interface 40, for example the map shown in the figure. The user may select the smart screen from the connectable device list displayed by the mobile phone and choose to perform a screen capture operation on the smart screen, and the mobile phone may then trigger a screen capture instruction to the smart screen. After receiving the screen capture instruction, the smart screen may display the user interface 41. The user interface 41 may include a prompt box "HUAWEI Mate30 requests a screen capture. Allow?" 4101, an "allow" button 4102, and a "deny" button 4103.
After the user selects "allow" 4102 in the user interface 41 of the smart screen, the smart screen may perform a screen capture operation, at which time the user interface 42 may be displayed on the smart screen. For example, a prompt box "taking a screen" 4201 may be included on the user interface 42. After the intelligent screen finishes the screen capturing operation, the picture obtained by screen capturing can be sent to the mobile phone. When a picture of a screenshot is sent to a mobile phone, a user interface 43 can be displayed on a smart screen, and a prompt box "sending a file to the hua wei Mate30, and transmitting 30%" 4301 can be included in the user interface 43, that is, the transmission progress of the file can be displayed on the display screen of the smart screen. It is understood that the prompt box 4301 may disappear after the file transfer is complete.
It should be noted that, assuming that the display screen of the smart screen is a touch screen, in the present application the user may select the "allow" button 4102 through a remote controller, or through a tap operation on the touch screen, which is not limited in this application.
Of course, it can be understood that the above user interface diagrams are only schematic illustrations, and in an actual product the content displayed on the user interface may be other content; for example, the content of the prompt box 4101 in fig. 4D may also be "Allow HUAWEI Mate30 to capture the screen?" or the like, which is not specifically limited in this application.
Scene two: the main device interacts with the target device in a touch or scanning mode.
As shown in fig. 5A, a schematic block diagram of a content acquisition method according to an embodiment of the present application. The main device may establish a connection with the target device by means of a "bump" or "scan" operation, and send a screen capture or screen recording instruction to the target device. On the target device side, the target device parses the screen capture or screen recording instruction sent by the main device, then starts to perform the screen capture or screen recording operation, and after the operation is completed, the content obtained by the screen capture or screen recording may be transmitted to the main device for storage.
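For illustration only, the following Kotlin sketch summarizes the target-device side of this scenario: the received instruction is parsed, the corresponding screen capture or screen recording operation is performed, and the result is transmitted back to the main device. All interfaces and the 15 s default recording duration are hypothetical assumptions rather than an actual implementation.

```kotlin
// Illustrative sketch only: all types below are hypothetical.
enum class CaptureType { SCREENSHOT, SCREEN_RECORD }

data class ContentAcquisitionInstruction(
    val sourceDeviceId: String,
    val targetDeviceId: String,
    val type: CaptureType
)

interface ScreenCapturer {
    fun captureScreenshot(): ByteArray
    fun recordScreen(durationMillis: Long): ByteArray
}

interface FileTransferModule {
    fun sendFile(targetDeviceId: String, file: ByteArray)
}

// Target-device side: parse the received instruction, perform the corresponding
// operation, and transmit the result back to the main device for storage.
class TargetDeviceHandler(
    private val capturer: ScreenCapturer,
    private val fileTransfer: FileTransferModule
) {
    fun onInstructionReceived(instruction: ContentAcquisitionInstruction) {
        val content = when (instruction.type) {
            CaptureType.SCREENSHOT -> capturer.captureScreenshot()
            CaptureType.SCREEN_RECORD -> capturer.recordScreen(15_000)  // assumed default duration
        }
        fileTransfer.sendFile(instruction.sourceDeviceId, content)
    }
}
```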
Two implementations of this scenario will be described in detail below in conjunction with a user interface diagram. As shown in fig. 5B, a flowchart of a content obtaining method provided in an embodiment of the present application is shown, and referring to fig. 5B, the method may include the following steps:
Step 501: the main device triggers a content acquisition instruction to the target device.
Step 502: the target device performs a corresponding operation according to the content acquisition instruction, to obtain the content corresponding to the content acquisition instruction.
Step 503: the target device transmits the content corresponding to the content acquisition instruction to the main device.
It should be noted that step 502 and step 503 are the same as step 303 and step 304 in the embodiment shown in fig. 3B; for the specific process, reference may be made to the description in the embodiment shown in fig. 3B, and details are not described here again.
As a possible implementation manner, the main device may establish a connection with the target device by performing a "bump" operation with the target device, and at the same time trigger the content acquisition instruction to the target device. Referring to fig. 6A, for example, when the mobile phone 100 displays the main interface 600, after the mobile phone 100 and the watch (HUAWEI Watch 1) are "bumped" (for example, see the "bump" operation diagram shown in fig. 6B), the mobile phone 100 may respond to the "bump" operation and trigger a screen capture instruction to the watch. Meanwhile, a user interface 610 may be displayed on the display screen of the mobile phone 100, and a prompt box 611 may be included in the user interface 610. The content of the prompt box 611 may be "HUAWEI Watch 1 is performing a screen capture operation". Of course, it can be understood that the prompt box 611 may be displayed for a certain duration, for example, 10 s.
Illustratively, fig. 6C is a schematic diagram of software module interaction provided in an embodiment of the present application. After the mobile phone and the watch perform the "bump" operation, the control signaling module on the mobile phone may send the screen capture instruction to the soft bus on the mobile phone, and the soft bus on the mobile phone may then send the screen capture instruction to the soft bus on the watch through the near-field network.
Correspondingly, after the soft bus on the watch sends the received screen capture instruction triggered by the mobile phone 100 to the control signaling module on the watch, the control signaling module on the watch may respond to the instruction and capture the target content accordingly. After the watch completes the screen capture operation, the captured image may be transmitted to the file transfer module on the mobile phone 100 through the watch's file transfer module and the soft bus, and the file transfer module on the mobile phone 100 then stores the captured image in the gallery application. Illustratively, before the watch transmits the image to the mobile phone 100, a user interface 620 may be displayed on the display screen of the mobile phone 100. A prompt box 621 may be included in the user interface 620, and the prompt box 621 may include a text prompt "Receive a file from HUAWEI Watch 1?" 622, a "yes" selection button 623, and a "no" selection button 624. When the user clicks the "yes" button 623, the watch may transmit the image to the mobile phone 100. During the image transmission, a user interface 630 may be displayed on the display screen of the mobile phone 100; the user interface 630 may display the progress of the image transmission, for example a prompt box 631 whose content may be "transferring file, 70% transferred".
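For illustration only, the following Kotlin sketch shows a possible receiving-side flow on the main device: the user is asked whether to accept the incoming file, and an accepted file is stored in the gallery application. The `Gallery` abstraction and the confirmation callback are hypothetical.

```kotlin
// Illustrative sketch only: the confirmation callback and the gallery abstraction are hypothetical.
interface Gallery {
    fun save(fileName: String, content: ByteArray)
}

class FileReceiver(
    private val gallery: Gallery,
    private val askUser: (prompt: String) -> Boolean   // e.g. a "Yes"/"No" dialog on the phone
) {
    fun onIncomingFile(senderName: String, fileName: String, content: ByteArray) {
        // Corresponds to the prompt box asking whether to receive the file.
        if (askUser("Receive a file from $senderName?")) {
            gallery.save(fileName, content)   // stored into the gallery application
        }
    }
}
```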
It can be understood that the user can view the transmission progress of the image by pulling down "notification management" in the notification bar. For example, after the image transmission is completed, the user may slide from top to bottom on the display screen of the mobile phone 100, and the mobile phone 100 may respond to the sliding operation of the user's finger on the display screen and display a user interface 640, which may include a file transmission progress notification 641, for example "the file has been transmitted". After the user clicks the transmission progress notification 641, a user interface 650 may be displayed on the display screen of the mobile phone 100, and the image captured from the display screen of the watch may be displayed in the user interface 650.
It should be noted that, in the embodiments shown in fig. 4A and 4C, after the smart screen completes the screen capture operation or the screen recording operation, a user interface similar to the user interface 620 shown in fig. 6A may also be displayed; for example, a prompt box "Receive a file from the smart screen?" may be displayed, and after the user clicks the "yes" button, the smart screen may send the file to the mobile phone 100. Of course, considering that the screen capture or screen recording operation is itself triggered from the mobile phone 100 to the smart screen, in this embodiment of the application the user interface 440 shown in fig. 4A may be displayed directly, that is, the screenshot picture or the screen recording file may be displayed directly without the user clicking the "yes" button.
In other embodiments of the present application, after the mobile phone 100 "bumps" the watch (HUAWEI Watch 1), a user interface 660 as shown in fig. 6D may also be displayed on the display screen of the mobile phone 100, and the user interface 660 may include a prompt box. The prompt box may include a text prompt "Capture the screen of HUAWEI Watch 1?" 661, a "yes" selection button 662, and a "no" selection button 663. After the user clicks the "yes" button 662, the mobile phone 100 may display the user interface 610 of fig. 6A on the display screen in response to the click operation. Of course, it can be understood that after the user clicks the "yes" button 662, the display screen of the mobile phone 100 may also skip displaying the user interface 610 of fig. 6A; the watch then executes the screen capture instruction, and after the screen capture operation on the watch is completed, the display screen of the mobile phone 100 directly displays the user interface 620 of fig. 6A.
It should be noted that fig. 6A and 6D are described by taking the example of a mobile phone controlling a watch to perform a screen capture operation; for the screen recording operation, reference may be made to the schematic diagrams in the foregoing embodiments, and details are not described here again.
Referring to fig. 6E, a schematic diagram of a user interface for the watch to perform a screen capture operation according to an embodiment of the present application. Illustratively, assuming that the watch displays the user interface 60, the user interface 61 may be displayed on the display screen of the watch after the mobile phone (HUAWEI Mate 30) "bumps" the watch. The user interface 61 may include a prompt box "HUAWEI Mate30 requests a screen capture. Allow?" 6101, an "allow" selection button 6102, and a "deny" selection button 6103. When the user clicks the "allow" button 6102, the watch may perform the screen capture operation, at which time the watch may display the user interface 62 on its display screen; for example, a prompt box "capturing screen" 6201 may be included in the user interface 62. After the watch completes the screen capture operation, the captured picture may be sent to the HUAWEI Mate 30; for example, the watch may display the user interface 63 as shown. The user interface 63 may include a prompt box "transferring file, 70% transferred" 6301, that is, the display screen of the watch may display the transmission progress of sending the screenshot file to the mobile phone HUAWEI Mate 30. Note that the prompt box 6301 may disappear when the file transfer is completed.
As another possible implementation manner, the main device may trigger the content acquisition instruction to the target device by "scanning", so that the target device executes the operation corresponding to the content acquisition instruction. Taking a mobile phone and a smart screen as an example, a user may open the "menu bar" of the smart screen, for example as shown in (a) of fig. 7A, through a button on the remote controller of the smart screen, and the diagram shown in (b) of fig. 7A may then be displayed on the smart screen. The user may select the "screen capture (code scanning)" item in the "menu bar" through the touch screen or a remote controller button, and a two-dimensional code may then pop up on the smart screen, for example as shown in (c) of fig. 7A. The two-dimensional code is used to represent the Internet Protocol (IP) address information of the smart screen in the local area network and the operation execution instruction (the screen capture instruction) selected by the user.
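For illustration only, the following Kotlin sketch shows one possible layout of the information carried by such a two-dimensional code: the LAN address of the smart screen plus the selected instruction, encoded as a short text string that an actual product would render as a QR code with a suitable library. The payload format is a hypothetical assumption.

```kotlin
// Illustrative sketch only: the payload layout is hypothetical; an actual product
// would encode this string into a two-dimensional code with a QR library.
data class ScanPayload(val ip: String, val port: Int, val instruction: String)

// The smart screen encodes its LAN address and the instruction selected by the user.
fun encodePayload(p: ScanPayload): String = "${p.ip}:${p.port}|${p.instruction}"

// The mobile phone decodes the scanned text, connects to the address,
// and triggers the instruction carried in the code.
fun decodePayload(text: String): ScanPayload {
    val (address, instruction) = text.split("|", limit = 2)
    val (ip, port) = address.split(":", limit = 2)
    return ScanPayload(ip, port.toInt(), instruction)
}
```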
Referring to fig. 7B, the user can scan the two-dimensional code on the smart screen (for example, see the operation diagram in fig. 7C) through the system-level "scan" function of the mobile phone 100, establish a connection with the smart screen, and then trigger a screen capture instruction to the smart screen according to the screen capture instruction carried in the two-dimensional code. For example, the mobile phone 100 may display a "minus one screen" interface 700 on the display screen, and the interface 700 may include a plurality of function controls, such as "scan" 701. The user may click "scan" 701, and the mobile phone 100 may display the user interface 710 on the display screen in response to the click operation. A scan box 711 may be included in the user interface 710.
The user can scan the two-dimensional code on the smart screen shown in (c) of fig. 7A through the scan box 711. After the scanning, the mobile phone 100 can send a screen capture instruction to the smart screen, and at the same time the display screen of the mobile phone 100 can display the user interface 720. The user interface 720 may include a prompt 721, which is used to prompt the user that the smart screen is about to perform a screen capture operation. The user interface 730 may then be displayed on the display screen of the mobile phone 100 after a set duration, for example after 5 s. The user interface 730 may include a prompt message 731, which is used to prompt the user that the smart screen is capturing the screen. Correspondingly, after the smart screen receives the screen capture instruction, it may execute the operation corresponding to the screen capture instruction; for the following steps and the schematic diagrams of the user interfaces, reference may be made to the description in the embodiment shown in fig. 6A, and details are not repeated here.
In other embodiments, referring to fig. 7D, the user can scan the two-dimensional code on the smart screen through the scan box 711, and after the scanning, the user interface 740 can also be displayed on the display screen of the mobile phone 100. The user interface 740 may include a text prompt "Capture the screen of the smart screen?" 741, a "yes" selection button 742, and a "no" selection button 743. When the user clicks the "yes" button 742, the user interface 730 of fig. 7B may be displayed on the display screen of the mobile phone 100. Then, after the smart screen receives the screen capture instruction, it may execute the operation corresponding to the screen capture instruction; for the following steps and the schematic diagrams of the user interfaces, reference may be made to the description in the embodiment shown in fig. 6A, and details are not repeated here.
It can be understood that, in an actual product, the user may also open the "scan" function in an application program such as WeChat or a browser to scan the two-dimensional code, which is not limited in this application.
Illustratively, fig. 7E is a schematic diagram of software module interaction provided in an embodiment of the present application. After the mobile phone scans the code on the smart screen, the control signaling module on the mobile phone can send the screen capture instruction to the soft bus on the mobile phone, and the soft bus on the mobile phone can then send the screen capture instruction to the soft bus on the smart screen through the near-field network. Correspondingly, the soft bus on the smart screen can send the screen capture instruction sent by the mobile phone to the control signaling module on the smart screen, and the control signaling module of the smart screen can then control the smart screen to perform the screen capture operation on the target content, to obtain the screenshot of the target content displayed on the smart screen. For the following steps, reference may be made to the foregoing embodiments, for example the descriptions of fig. 4B and fig. 6C, and details are not repeated here.
Scene three: the master device interacts with the target device through the intermediate device.
As shown in fig. 8A, a schematic block diagram of a content acquisition method according to an embodiment of the present application. On the main device side, the mobile phone can interact with the intermediate device (a remote controller); the remote controller processes the notification message triggered by the mobile phone, and the remote controller can then trigger a screen capture or screen recording instruction to the target device. On the target device side, the target device can parse the screen capture or screen recording instruction sent by the intermediate device, then start to perform the screen capture or screen recording operation, and after the operation is completed, the content obtained by the screen capture or screen recording can be transmitted to the main device for storage.
As shown in fig. 8B, a flowchart of a content obtaining method provided in an embodiment of the present application is shown, and referring to fig. 8B, the method may include the following steps:
step 801: the master device sends a notification message to the intermediate device. The notification message is used for triggering a content acquisition instruction to the target device.
Taking the main device being a mobile phone and the target device being a smart screen as an example, in this embodiment of the application the remote controller of the smart screen has an NFC module; for example, fig. 9A shows a remote controller with an NFC module according to an embodiment of the present application. For example, the mobile phone 100 may "bump" the remote controller of the smart screen (for example, refer to the operation diagram shown in fig. 9B), so that the mobile phone 100 establishes a connection with the remote controller and at the same time triggers the sending of a notification message to the remote controller, where the notification message is used to notify the remote controller that it can trigger a screen capture instruction to the smart screen.
Note that the NFC mark in the schematic diagram of fig. 9A is only intended to illustrate the location of the NFC module; in an actual product, the NFC module may be built into the remote controller without displaying the NFC mark.
As an example, as shown in fig. 9C, after the mobile phone 100 "bumps" the remote controller, the user interface 900 may be displayed on the display screen of the mobile phone 100. The user interface 900 may include a prompt box, which may specifically include a text prompt "Perform a screen capture operation on the smart screen?" 901, a "yes" selection button 902, and a "no" selection button 903. After the user clicks the "yes" button 902, the mobile phone 100 may, in response to the click operation, notify the remote controller to send a screen capture instruction to the smart screen.
It can be understood that the mobile phone may touch the area where the NFC module on the remote controller is located, so that the remote controller can detect the "bump" operation between the mobile phone and the remote controller.
Step 802: the intermediate device triggers a content acquisition instruction to the target device.
After receiving the notification message triggered by the mobile phone 100, the remote controller may send a screen capture instruction to the smart screen, so that the smart screen executes the screen capture operation corresponding to the screen capture instruction. Illustratively, fig. 9D is a schematic diagram of software module interaction provided in an embodiment of the present application. After the mobile phone and the remote controller perform the "bump" operation, the soft bus on the remote controller can send the screen capture instruction to the soft bus on the smart screen through the near-field network.
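For illustration only, the following Kotlin sketch shows how the intermediate device (the remote controller) might relay the request: upon receiving the notification message delivered by the "bump", it constructs a screen capture instruction on behalf of the mobile phone and sends it to the target device over the soft bus. All types are hypothetical and are redefined here only to keep the sketch self-contained.

```kotlin
// Illustrative sketch only: all types are hypothetical.
enum class CaptureType { SCREENSHOT, SCREEN_RECORD }

data class ContentAcquisitionInstruction(
    val sourceDeviceId: String,
    val targetDeviceId: String,
    val type: CaptureType
)

interface SoftBus {
    fun send(targetDeviceId: String, instruction: ContentAcquisitionInstruction)
}

data class NotificationMessage(val requesterDeviceId: String, val targetDeviceId: String)

// Intermediate device (remote controller): after the "bump" with the mobile phone
// delivers the notification message, it triggers the screen capture instruction
// to the target device on the main device's behalf.
class RemoteControllerRelay(private val softBus: SoftBus) {
    fun onNotification(message: NotificationMessage) {
        val instruction = ContentAcquisitionInstruction(
            sourceDeviceId = message.requesterDeviceId,  // the result is returned to the phone
            targetDeviceId = message.targetDeviceId,
            type = CaptureType.SCREENSHOT
        )
        softBus.send(message.targetDeviceId, instruction)
    }
}
```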
Step 803: and the target equipment executes corresponding operation according to the content acquisition instruction to obtain the content corresponding to the content acquisition instruction.
Step 804: and the target equipment transmits the content corresponding to the content acquisition instruction to the main equipment.
Correspondingly, the soft bus on the smart screen can send the received screen capture instruction triggered by the remote controller to the control signaling module on the smart screen, and the control signaling module of the smart screen can then control the smart screen to perform the screen capture operation on the target content, to obtain the screenshot of the target content displayed on the smart screen. After the screen capture is completed, the control signaling module on the smart screen can send the captured image to the file transfer module. The file transfer module transmits the obtained content to the file transfer module of the mobile phone through the soft bus, and the file transfer module of the mobile phone can store the received screenshot in the gallery of the corresponding application program.
It should be noted that, steps 803 and 804 are the same as steps 303 and 304 in the embodiment shown in fig. 3B, and the specific process can refer to the description in the embodiment shown in fig. 3B, and will not be described in detail here.
Further, the above embodiments are described only by taking the case in which the initiating electronic device has the authority to control the target electronic device to perform a screen capture or screen recording operation as an example. In this embodiment of the application, if the initiating electronic device does not have the authority to control the target electronic device to perform the screen capture or screen recording operation, prompt information can be displayed on the initiating electronic device to prompt the user that the initiating electronic device cannot control the target electronic device to perform the screen capture or screen recording operation.
For example, referring to fig. 10, assume that the mobile phone (HUAWEI Mate 30) does not have the authority to control the watch (HUAWEI Watch 2) to perform the screen capture operation. After the mobile phone triggers the screen capture instruction to the watch, the user interface 1000 may be displayed on the display screen of the mobile phone. A prompt box 1001 may be included on the user interface 1000, and the content of the prompt box 1001 may be "HUAWEI Mate30 cannot capture the HUAWEI Watch 2". It can be understood that the prompt box 1001 may disappear after being displayed for a certain duration, for example, 10 s.
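For illustration only, the following Kotlin sketch shows a possible permission check performed on the initiating device before the instruction is triggered; when the target is not authorized, a prompt such as the one in the prompt box 1001 is displayed and the operation is aborted. The permission store and the prompt callback are hypothetical.

```kotlin
// Illustrative sketch only: the permission store and the prompt callback are hypothetical.
class CapturePermissionChecker(
    private val authorizedTargets: Set<String>,
    private val showPrompt: (String) -> Unit
) {
    // Returns true if the instruction may be sent; otherwise shows a prompt such as
    // "HUAWEI Mate30 cannot capture the HUAWEI Watch 2" and aborts the operation.
    fun checkBeforeTrigger(localName: String, targetName: String): Boolean {
        if (targetName in authorizedTargets) return true
        showPrompt("$localName cannot capture the $targetName")
        return false
    }
}
```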
Based on the foregoing embodiment, as shown in fig. 11, an embodiment of the present application further provides a content obtaining method, and referring to fig. 11, the method may include the following steps:
step 1101: the first device triggers a content acquisition instruction to the second device.
The content acquisition instruction is used for instructing the second equipment to perform screen capture or screen recording operation on the content currently displayed on the display screen of the second equipment, and feeding back the content obtained through the screen capture or screen recording operation to the first equipment.
Step 1102: and the second equipment carries out screen capture or screen recording operation on the content currently displayed on the display screen of the second equipment according to the content acquisition instruction to obtain the content corresponding to the content acquisition instruction.
After receiving the content acquisition instruction, the second device may perform the operation corresponding to the content acquisition instruction to obtain the content corresponding to the content acquisition instruction, and then transmit the obtained content to the first device. Correspondingly, the first device may receive the content, corresponding to the content acquisition instruction, sent by the second device.
Step 1103: and the first equipment receives the content which is sent by the second equipment and corresponds to the content acquisition instruction.
Step 1104: and the first equipment displays the content which is sent by the second equipment and corresponds to the content acquisition instruction.
For the specific implementation of the above steps, reference may be made to the detailed description in the foregoing embodiments, which are not described herein again.
In the embodiments provided in the present application, the method provided in the embodiments of the present application is described from the perspective of an electronic device as an execution subject. In order to implement the functions in the method provided by the embodiments of the present application, the electronic device may include a hardware structure and/or a software module, and the functions are implemented in the form of a hardware structure, a software module, or a hardware structure and a software module. Whether any of the above-described functions is implemented as a hardware structure, a software module, or a hardware structure plus a software module depends upon the particular application and design constraints imposed on the technical solution.
Referring to fig. 12, a schematic diagram of a content acquisition apparatus according to an embodiment of the present application; referring to fig. 12, the apparatus may include a screen capture and recording module, a capture module, a manager, and a soft bus.
The screen capture and recording module is used to adapt to different devices and perform screen capture and recording processing in combination with factors such as the screen shape, size, and resolution of the device; the capture module is used to capture screen-capture events, such as touch, scanning, and gesture operations; the manager is used to provide an interface and to handle cross-device command transmission (for example, the mobile phone requesting a screen capture from the smart screen), file transmission (for example, transmitting a screen capture file of the smart screen to the mobile phone), and business logic (for example, querying the list of devices that can be captured); the soft bus is the underlying interaction channel between the devices. For example, when an image obtained by the screen capture operation performed by the smart screen is transmitted between the mobile phone (device A) and the smart screen (device B), an interface of the manager may be called to send the image to the soft bus, and the image is transmitted to the mobile phone through the soft bus.
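For illustration only, the following Kotlin sketch expresses the module division described above as a set of interfaces: an event capturer for screen-capture trigger events, a manager for cross-device commands, file transfer and business logic, and a screen adapter that accounts for screen shape, size, and resolution. All names and signatures are hypothetical.

```kotlin
// Illustrative sketch only: a possible decomposition of the modules described above;
// all names and signatures are hypothetical.
enum class CaptureType { SCREENSHOT, SCREEN_RECORD }

// Captures screen-capture trigger events such as "bump", scan, or gesture operations.
interface EventCapturer {
    fun onTriggerEvent(handler: (targetDeviceId: String, type: CaptureType) -> Unit)
}

// Cross-device interface: command transmission, file transmission, and business logic.
interface DeviceManager {
    fun listCaptureCapableDevices(): List<String>
    fun requestCapture(targetDeviceId: String, type: CaptureType)
    fun transferFile(targetDeviceId: String, file: ByteArray)
}

// Adapts the capture or recording processing to the screen shape, size, and resolution.
interface ScreenAdapter {
    fun capture(widthPx: Int, heightPx: Int): ByteArray
}
```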
As shown in fig. 13, a schematic structural diagram of a content acquisition apparatus according to an embodiment of the present application, where the apparatus may be an electronic device having a display screen. Illustratively, the electronic device may be the first device, such as a mobile phone. Referring to fig. 13, the apparatus 1300 includes: a display screen 1301; one or more processors 1302; a memory 1303; a plurality of applications 1304 (not shown); and one or more computer programs 1305 (not shown), where the foregoing components may be connected by one or more communication buses 1306.
The display screen 1301 is used for displaying a display interface of an application in the electronic device, and the like.
Wherein memory 1303 has stored therein one or more computer programs comprising instructions; the processor 1302 invokes the instructions stored in the memory 1303, so that the content obtaining apparatus 1300 performs the following steps:
triggering a content acquisition instruction to a second device, wherein the content acquisition instruction is used for instructing the second device to perform screen capturing or screen recording operation on the content currently displayed on a display screen of the second device, and feeding back the content obtained by the screen capturing or screen recording operation to the first device; and receiving and displaying the content which is sent by the second equipment and corresponds to the content acquisition instruction.
In one possible implementation manner, the triggering a content obtaining instruction to the second device includes: responding to a first operation of a user, wherein the first operation is used for triggering a content acquisition instruction to the second equipment.
In one possible implementation manner, the triggering a content obtaining instruction to the second device includes: and triggering a content acquisition instruction to the second equipment by executing a collision operation with the second equipment.
In one possible implementation manner, triggering a content acquisition instruction to a second device includes: and sending a notification message to a third device, wherein the notification message is used for notifying the third device to trigger a content acquisition instruction to the second device.
In one possible implementation, when the instructions are invoked for execution by the one or more processors 1302, the content acquisition apparatus is further caused to perform the following steps before responding to the first operation of the user: in response to a second operation of the user, displaying a current connectable device list, where the second operation is used to trigger displaying of the connectable device list; and in response to a third operation of the user, selecting the second device in the displayed connectable device list, where the third operation is an operation of the user selecting a corresponding device in the displayed connectable device list.
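For illustration only, the following Kotlin sketch shows the second and third operations described above: displaying the list of currently connectable devices and selecting the second device from that list. The discovery callback is hypothetical.

```kotlin
// Illustrative sketch only: the device discovery callback is hypothetical.
class DevicePicker(private val discover: () -> List<String>) {
    // Second operation: display the list of currently connectable devices.
    fun showConnectableDevices(): List<String> = discover()

    // Third operation: the user selects the second device from the displayed list.
    fun selectDevice(devices: List<String>, index: Int): String = devices[index]
}
```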
In one possible implementation, when the instructions are invoked for execution by the one or more processors 1302, the content obtaining apparatus 1300 is further caused to perform the following steps: and displaying a first interface in the process of receiving and displaying the content corresponding to the content acquisition instruction sent by the second equipment, wherein the first interface comprises the transmission progress of the content.
In one possible implementation, when the instructions are invoked for execution by the one or more processors 1302, the content obtaining apparatus 1300 is further caused to perform the following steps: and displaying first prompt information, wherein the first prompt information is used for prompting the second equipment to execute the state of the operation corresponding to the content acquisition instruction.
In one possible implementation, when the instructions are invoked for execution by the one or more processors 1302, the content obtaining apparatus 1300 is further caused to perform the following steps: and displaying second prompt information, wherein the second prompt information is used for prompting a user to receive the content which is sent by the second equipment and corresponds to the content acquisition instruction.
As shown in fig. 14, a schematic structural diagram of another content acquisition apparatus provided in an embodiment of the present application, where the apparatus may be an electronic device having a display screen. Illustratively, the electronic device may be the second device, such as a smart screen or a watch. Referring to fig. 14, the apparatus 1400 includes: a display screen 1401; one or more processors 1402; a memory 1403; a plurality of applications 1404 (not shown); and one or more computer programs 1405 (not shown), where the foregoing components may be connected by one or more communication buses 1406.
The display 1401 is used to display a display interface of an application in the electronic device.
Wherein memory 1403 has stored therein one or more computer programs comprising instructions; the processor 1402 invokes the instructions stored in the memory 1403 causing the electronic device 1400 to perform the following steps:
receiving a content acquisition instruction triggered by a first device, where the content acquisition instruction is used to instruct the second device to perform a screen capture or screen recording operation on the content currently displayed on the display screen of the second device; performing a screen capture or screen recording operation on the content currently displayed on its own display screen according to the content acquisition instruction, to obtain the content corresponding to the content acquisition instruction; and sending the content corresponding to the content acquisition instruction to the first device.
In one possible implementation, the instructions, when executed by the one or more processors 1402, cause the content obtaining apparatus 1400 to further perform the following steps: and displaying third prompt information, wherein the third prompt information is used for prompting whether to determine to execute the content acquisition instruction sent by the first equipment.
In one possible implementation, the instructions, when invoked for execution by the one or more processors 1402, cause the electronic device 1400 to further perform the steps of: and displaying a second interface in the process of sending the content corresponding to the content acquisition instruction to the first equipment, wherein the second interface comprises the transmission progress of the content.
In the embodiments of the present application, the processors 1302, 1402 may be general-purpose processors, digital signal processors, application specific integrated circuits, field programmable gate arrays or other programmable logic devices, discrete gate or transistor logic, discrete hardware components, or the like, which may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general purpose processor may be a microprocessor or any conventional processor or the like. The steps of a method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or may be implemented by a combination of hardware and software modules in a processor. The software modules may be located in the memory 1303 and the memory 1403, and the processor 1302 and the processor 1402 read the program instructions in the memory 1303 and the memory 1403, and complete the steps of the above method by combining with hardware thereof.
In the embodiment of the present application, the memories 1303 and 1403 may be nonvolatile memories, such as hard disk drives (HDDs) or solid-state drives (SSDs), or may be volatile memories, such as random-access memories (RAMs). The memory may also be, but is not limited to, any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory in the embodiments of this application may also be a circuit or any other apparatus capable of implementing a storage function, and is configured to store instructions and/or data.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Based on the above embodiments, an embodiment of the present application further provides a chip, where the chip is coupled with a memory in an electronic device, and executes the content obtaining method provided in the embodiment of the present application; "coupled" in the context of this application means that two elements are joined to each other either directly or indirectly.
Based on the above embodiments, the present application also provides a computer storage medium, in which a computer program is stored, and when the computer program is executed by a computer, the computer is enabled to execute the content acquisition method provided by the above embodiments.
The embodiment of the present application also provides a computer program product, which includes instructions that, when run on a computer, cause the computer to execute the content obtaining method provided in the above embodiment.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by instructions. These instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.

Claims (23)

1. A content acquisition method, comprising:
the method comprises the steps that a first device triggers a content acquisition instruction to a second device, wherein the content acquisition instruction is used for instructing the second device to perform screen capture or screen recording operation on content currently displayed on a display screen of the second device, and the content obtained through the screen capture or screen recording operation is fed back to the first device;
and the first equipment receives and displays the content which is sent by the second equipment and corresponds to the content acquisition instruction.
2. The method of claim 1, wherein the first device triggering a content acquisition instruction to the second device comprises:
the first device responds to a first operation of a user, and the first operation is used for triggering a content acquisition instruction to the second device.
3. The method of claim 1, wherein the first device triggering a content acquisition instruction to a second device comprises:
the first device triggers a content acquisition instruction to the second device by executing a collision operation with the second device.
4. The method of claim 1, wherein the first device triggering a content acquisition instruction to a second device comprises:
the first device sends a notification message to a third device, wherein the notification message is used for notifying the third device to trigger a content acquisition instruction to the second device.
5. The method of claim 2, wherein prior to the first device responding to the first operation by the user, the method further comprises:
the first equipment responds to a second operation of a user and displays a current connectable equipment list, and the second operation is used for triggering and displaying the connectable equipment list;
and the first equipment responds to a third operation of the user, selects the second equipment in the displayed connectable equipment list, and the third operation is an operation of selecting the corresponding equipment in the displayed connectable equipment list by the user.
6. The method of any one of claims 1-5, further comprising:
and the first equipment displays a first interface in the process of receiving and displaying the content corresponding to the content acquisition instruction sent by the second equipment, wherein the first interface comprises the transmission progress of the content.
7. The method of any one of claims 1-6, further comprising:
and the first equipment displays first prompt information, wherein the first prompt information is used for prompting the second equipment to execute the state of the operation corresponding to the content acquisition instruction.
8. The method of any one of claims 1-7, further comprising:
and the first equipment displays second prompt information, wherein the second prompt information is used for prompting a user to receive the content which is sent by the second equipment and corresponds to the content acquisition instruction.
9. A content acquisition method, comprising:
the method comprises the steps that a second device receives a content obtaining instruction triggered by a first device, wherein the content obtaining instruction is used for indicating the second device to carry out screen capture or screen recording operation on content currently displayed on a display screen of the second device;
the second equipment carries out screen capture or screen recording operation on the content currently displayed on the display screen of the second equipment according to the content acquisition instruction to obtain the content corresponding to the content acquisition instruction;
and the second equipment sends the content corresponding to the content acquisition instruction to the first equipment.
10. The method of claim 9, wherein the method further comprises:
and the second equipment displays third prompt information, wherein the third prompt information is used for prompting whether to determine to execute the content acquisition instruction sent by the first equipment.
11. The method of any one of claims 9-10, further comprising:
and displaying a second interface by the second equipment in the process of sending the content corresponding to the content acquisition instruction to the first equipment, wherein the second interface comprises the transmission progress of the content.
12. A content acquisition device is applied to a first device and is characterized by comprising a display screen; one or more processors; a memory; a plurality of applications; and one or more computer programs;
wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions that, when executed by the one or more processors, cause the content acquisition device to perform the steps of:
triggering a content acquisition instruction to a second device, wherein the content acquisition instruction is used for instructing the second device to perform screen capturing or screen recording operation on the content currently displayed on a display screen of the second device, and feeding back the content obtained by the screen capturing or screen recording operation to the first device;
and receiving and displaying the content which is sent by the second equipment and corresponds to the content acquisition instruction.
13. The apparatus of claim 12, wherein triggering a content acquisition instruction to the second device comprises:
responding to a first operation of a user, wherein the first operation is used for triggering a content acquisition instruction to the second equipment.
14. The apparatus of claim 12, wherein triggering a content acquisition instruction to the second device comprises:
and triggering a content acquisition instruction to the second equipment by executing a collision operation with the second equipment.
15. The apparatus of claim 12, wherein triggering a content acquisition instruction to a second device comprises:
and sending a notification message to a third device, wherein the notification message is used for notifying the third device to trigger a content acquisition instruction to the second device.
16. The apparatus of claim 13, wherein the instructions, when executed by the one or more processors, further cause the content obtaining apparatus to perform the following steps prior to responding to a first operation by a user:
responding to a second operation of the user, and displaying a current connectable device list, wherein the second operation is used for triggering the display of the connectable device list;
and responding to a third operation of the user, selecting a second device in the displayed connectable device list, wherein the third operation is an operation of the user selecting a corresponding device in the displayed connectable device list.
17. The apparatus of any one of claims 12-16, wherein the instructions, when executed by the one or more processors, cause the content acquisition apparatus to further perform the steps of:
and displaying a first interface in the process of receiving and displaying the content corresponding to the content acquisition instruction sent by the second equipment, wherein the first interface comprises the transmission progress of the content.
18. The apparatus of any one of claims 12-17, wherein the instructions, when executed by the one or more processors, cause the content acquisition apparatus to further perform the steps of:
and displaying first prompt information, wherein the first prompt information is used for prompting the second equipment to execute the state of the operation corresponding to the content acquisition instruction.
19. The apparatus of any one of claims 12-18, wherein the instructions, when executed by the one or more processors, cause the content acquisition apparatus to further perform the steps of:
and displaying second prompt information, wherein the second prompt information is used for prompting a user to receive the content which is sent by the second equipment and corresponds to the content acquisition instruction.
20. A content acquisition device is applied to a second device and is characterized by comprising a display screen; one or more processors; a memory; a plurality of applications; and one or more computer programs;
wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions that, when executed by the one or more processors, cause the content acquisition device to perform the steps of:
receiving a content acquisition instruction triggered by first equipment, wherein the content acquisition instruction is used for instructing the second equipment to perform screen capture or screen recording operation on content currently displayed on a display screen of the second equipment;
performing screen capture or screen recording operation on the content currently displayed on the display screen of the content acquisition device according to the content acquisition instruction to obtain the content corresponding to the content acquisition instruction;
and sending the content corresponding to the content acquisition instruction to the first equipment.
21. The apparatus of claim 20, wherein the instructions, when executed by the one or more processors, cause the content acquisition apparatus to further perform the steps of:
and displaying third prompt information, wherein the third prompt information is used for prompting whether to determine to execute the content acquisition instruction sent by the first equipment.
22. The apparatus of any one of claims 20-21, wherein the instructions, when executed by the one or more processors, cause the content acquisition apparatus to further perform the steps of:
and displaying a second interface in the process of sending the content corresponding to the content acquisition instruction to the first equipment, wherein the second interface comprises the transmission progress of the content.
23. A computer storage medium comprising computer instructions that, when run on an electronic device, cause the electronic device to perform the content acquisition method of any one of claims 1-8 or 9-11.
CN202110887743.5A 2020-09-29 2021-08-03 Content acquisition method and device Pending CN114356187A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2020110481840 2020-09-29
CN202011048184 2020-09-29

Publications (1)

Publication Number Publication Date
CN114356187A true CN114356187A (en) 2022-04-15

Family

ID=81095426

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110887743.5A Pending CN114356187A (en) 2020-09-29 2021-08-03 Content acquisition method and device

Country Status (1)

Country Link
CN (1) CN114356187A (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102404641A (en) * 2011-12-06 2012-04-04 深圳Tcl新技术有限公司 Method and system for remotely controlling television by using smart phone
CN105338399A (en) * 2015-10-29 2016-02-17 小米科技有限责任公司 Image acquisition method and device
CN105592364A (en) * 2016-02-25 2016-05-18 腾讯科技(深圳)有限公司 Trans-terminal screen-shot image acquiring method and trans-terminal screen-shot image acquiring device
CN110581919A (en) * 2018-06-11 2019-12-17 阿里巴巴集团控股有限公司 Information transmission and data processing method, device, system and storage medium
CN109684025A (en) * 2019-01-08 2019-04-26 深圳市网心科技有限公司 A kind of remote communication method and relevant apparatus
CN111062224A (en) * 2019-10-30 2020-04-24 华为终端有限公司 Content transmission method and terminal equipment

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117135385A (en) * 2023-03-23 2023-11-28 荣耀终端有限公司 Multi-device collaborative screen recording and sharing method, electronic device and communication system

Similar Documents

Publication Publication Date Title
CN110865744B (en) Split-screen display method and electronic equipment
WO2020259452A1 (en) Full-screen display method for mobile terminal, and apparatus
CN109766066B (en) Message processing method, related device and system
CN110727382A (en) Split-screen display method and electronic equipment
CN113885759B (en) Notification message processing method, device, system and computer readable storage medium
WO2020134869A1 (en) Electronic device operating method and electronic device
WO2020000448A1 (en) Flexible screen display method and terminal
WO2021213164A1 (en) Application interface interaction method, electronic device, and computer readable storage medium
CN110809297B (en) Data transmission method and electronic equipment
WO2021052279A1 (en) Foldable screen display method and electronic device
WO2021047567A1 (en) Callback stream processing method and device
CN114363462B (en) Interface display method, electronic equipment and computer readable medium
EP4236434A1 (en) Channel switching method, electronic device, and storage medium
CN112771900A (en) Data transmission method and electronic equipment
WO2021036898A1 (en) Application activation method for apparatus having foldable screen, and related device
CN111208925A (en) Method for establishing application combination and electronic equipment
WO2021238370A1 (en) Display control method, electronic device, and computer-readable storage medium
CN111835530A (en) Group joining method and device
EP4184905A1 (en) Device recognition method and related apparatus
CN112445276A (en) Folding screen display application method and electronic equipment
CN114257671B (en) Image display method and electronic equipment
CN113973398A (en) Wireless network connection method, electronic equipment and chip system
WO2022166435A1 (en) Picture sharing method and electronic device
WO2022100219A1 (en) Data transfer method and related device
WO2022022674A1 (en) Application icon layout method and related apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination