CN115734021A - Screen recording method, electronic equipment and computer readable storage medium - Google Patents

Screen recording method, electronic equipment and computer readable storage medium Download PDF

Info

Publication number
CN115734021A
CN115734021A
Authority
CN
China
Prior art keywords
screen recording
screen
electronic device
application
media stream
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111006119.6A
Other languages
Chinese (zh)
Inventor
罗芊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202111006119.6A priority Critical patent/CN115734021A/en
Priority to PCT/CN2022/113756 priority patent/WO2023030057A1/en
Publication of CN115734021A publication Critical patent/CN115734021A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/854Content authoring
    • H04N21/8547Content authoring involving timestamps for synchronizing content
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Security & Cryptography (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of this application provides a screen recording method in the field of terminals. In response to a first operation that starts the screen recording function, an electronic device obtains identification information of the applications associated with the first operation, where those applications include applications running in the background of the electronic device. The electronic device acquires the media stream of each such application according to the identification information and generates a screen recording file for the application from its media stream. Embodiments of this application also provide an electronic device and a computer-readable storage medium. With the method and device, an application running in the background of the electronic device can be recorded, improving the user's screen recording experience.

Description

Screen recording method, electronic equipment and computer readable storage medium
Technical Field
The present application relates to the field of terminals, and in particular, to a screen recording method, an electronic device, and a computer-readable storage medium.
Background
Screen recording is a common function of electronic devices such as mobile phones and tablet computers. Existing screen recording methods capture the interface currently displayed on the screen and save it as a screenshot or a video file. However, these methods cannot record the interface of an application that is not displayed on the screen, which degrades the user experience.
Disclosure of Invention
In view of this, it is necessary to provide a screen recording method that can record the interface of an application running in the background, or record, on the local device, an application window that is projected onto and displayed by another electronic device.
A first aspect of an embodiment of this application discloses a screen recording method applied to a first electronic device. The method includes: in response to a first operation that starts a screen recording function, acquiring identification information of the applications associated with the first operation, where those applications include applications running in the background of the first electronic device; acquiring the media stream of each application according to the identification information; and generating a screen recording file for the application from its media stream.
With this technical solution, the identification information of the application to be recorded is obtained, and the screen recording file is generated from the media stream retrieved by that identification information. Compared with the prior art, which can record only the interface displayed in the foreground, this enables screen recording of one or more background applications of the electronic device.
In some embodiments, the identification information includes a process Identity (ID) of the application or a package name of the application.
In some embodiments, obtaining the media stream of the application based on the identification information comprises: determining a screen data buffer corresponding to the application program according to the identification information; and acquiring the media stream of the application program from the screen data buffer area.
With this technical solution, the media stream is extracted from the screen data buffer rather than captured from the interface currently displayed on the screen, so the interfaces of one or more applications can be recorded, and a recorded application may run in either the foreground or the background of the electronic device.
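A minimal, purely illustrative sketch of the claimed flow (identification info → screen data buffer → media stream → recording file). All names here (`screen_buffers`, `MediaStream`, the package name) are hypothetical, and dictionaries stand in for real per-application buffers:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MediaStream:
    package_name: str
    frames: List[str] = field(default_factory=list)

# Hypothetical per-application screen data buffers, keyed by the
# identification information (here: the application's package name).
screen_buffers = {
    "com.example.video": MediaStream("com.example.video", ["f0", "f1", "f2"]),
}

def start_recording(identification_info: str) -> dict:
    # Step 1: the identification info of the app associated with the
    # first operation (the app may be running in the background).
    # Step 2: locate the app's screen data buffer and take its media stream.
    stream = screen_buffers[identification_info]
    # Step 3: generate a screen-recording "file" from the media stream.
    return {"app": identification_info, "file": "|".join(stream.frames)}

record = start_recording("com.example.video")
```

Because the lookup goes through the buffer rather than the display, nothing in this sketch depends on whether the application is visible on screen.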
In some embodiments, generating a screen recording file corresponding to an application program according to a media stream of the application program includes: and carrying out image synthesis on the media stream of the application program to generate a screen recording file corresponding to the application program.
With this technical solution, the electronic device performs image synthesis on the application's media stream, for example by invoking SurfaceFlinger or a Direct Digital Synthesizer (DDS) algorithm, and generates the screen recording file.
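The synthesis step can be pictured with a toy compositor: each layer of the media stream is a character grid where `.` means transparent, and layers later in the list are drawn on top. This is a drastically simplified stand-in for what a real compositor such as SurfaceFlinger does with graphics buffers, not the patent's implementation:

```python
def composite(layers):
    # All layers share one size; start from a fully transparent frame.
    rows, cols = len(layers[0]), len(layers[0][0])
    out = [["."] * cols for _ in range(rows)]
    for layer in layers:                 # bottom to top
        for i, row in enumerate(layer):
            for j, ch in enumerate(row):
                if ch != ".":
                    out[i][j] = ch       # opaque pixel overwrites the one below
    return ["".join(r) for r in out]

# An opaque background layer with a small overlay on top.
frame = composite([["aaa", "aaa"], [".b.", "..."]])
```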
In some embodiments, the application corresponds to a first screen data buffer and a second screen data buffer, and generating the screen recording file for the application from its media stream includes: performing image synthesis on a first media stream acquired from the first screen data buffer to generate a first screen recording subfile; performing image synthesis on a second media stream acquired from the second screen data buffer to generate a second screen recording subfile; and splicing the first screen recording subfile with the second screen recording subfile to generate the screen recording file for the application.
With this technical solution, two or more screen display contents in a distributed display scenario can be recorded on the electronic device side. Because the same device records all of the screen contents, the recording clocks stay synchronized, and the multiple screen recording subfiles can be spliced on the device into a single screen recording file, so the user can view the contents of multiple display interfaces in one file.
In some embodiments, splicing the first screen recording subfile with the second screen recording subfile includes: performing time alignment processing on the first screen recording subfile and the second screen recording subfile according to the time stamps of the first screen recording subfile and the second screen recording subfile; and splicing the first screen recording subfile and the second screen recording subfile which are subjected to the time alignment processing.
With this technical solution, because the same device records multiple screen display contents, clock synchronization is guaranteed; the subfiles can then be spliced on the electronic device side according to their timestamps into a single screen recording file, so the user can view time-synchronized contents of multiple display interfaces in one file.
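As a hedged illustration of the timestamp-based alignment, treat each subfile as a list of `(timestamp_ms, frame)` pairs: alignment clips both subfiles to their overlapping time window, and splicing then pairs frames with equal timestamps. A real recorder would match timestamps within a tolerance rather than exactly; this sketch assumes both streams share a frame clock:

```python
def align(sub_a, sub_b):
    # Keep only frames inside the time window covered by both subfiles.
    start = max(sub_a[0][0], sub_b[0][0])
    end = min(sub_a[-1][0], sub_b[-1][0])
    clip = lambda sub: [(t, f) for t, f in sub if start <= t <= end]
    return clip(sub_a), clip(sub_b)

def splice(sub_a, sub_b):
    a, b = align(sub_a, sub_b)
    # Pair frames by timestamp into one combined recording.
    return [(ta, fa, fb) for (ta, fa), (tb, fb) in zip(a, b)]

# 30 fps-style timestamps; the second subfile starts and ends later.
first_sub = [(0, "A0"), (33, "A1"), (66, "A2"), (99, "A3")]
second_sub = [(33, "B0"), (66, "B1"), (99, "B2"), (132, "B3")]
merged = splice(first_sub, second_sub)
```

Frames outside the shared window (`A0`, `B3`) are dropped, so every entry of the merged recording shows both interfaces at the same instant.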
In some embodiments, the application corresponds to a first screen data buffer and a second screen data buffer, and generating the screen recording file for the application from its media stream includes: obtaining a first media stream from the first screen data buffer and a second media stream from the second screen data buffer; and performing image synthesis and splicing on the first media stream and the second media stream to generate the screen recording file for the application.
With this technical solution, two or more screen display contents in a distributed display scenario can be recorded on the electronic device side with synchronized recording clocks; the multiple media streams can be time-aligned on the device and then synthesized and spliced into a single screen recording file, so the user can view the contents of multiple display interfaces in one file.
In some embodiments, image compositing and splicing the first media stream with the second media stream comprises: time alignment processing is carried out on the first media stream and the second media stream according to the time stamps of the first media stream and the second media stream; and carrying out image synthesis and splicing on the first media stream and the second media stream which are subjected to the time alignment processing.
By adopting the technical scheme, the synchronization of the recording clocks can be ensured by recording a plurality of screen display contents by the same equipment, meanwhile, a plurality of media streams can be time-aligned on the electronic equipment side based on the timestamps of the media streams, and then image synthesis and splicing are carried out to obtain a screen recording file, so that a user can watch the contents of a plurality of time-synchronized display interfaces in the screen recording file.
In some embodiments, the screen recording file includes a first interface and a second interface, where the first interface and the second interface are displayed in a left-right split screen manner, or the first interface and the second interface are displayed in an up-down split screen manner, or a part of or all of the area of the first interface is displayed on the second interface in a suspended manner.
By adopting the technical scheme, a plurality of interfaces in the screen recording file can be set to be displayed in an upper-lower split screen mode, a left-right split screen mode or a suspended window mode according to actual needs of a user, and the use experience of the user is improved.
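The three layout options can be sketched with tiny character grids standing in for frames. The split-screen helpers assume equal-sized grids; all names and sizes here are illustrative, not the patent's rendering code:

```python
def side_by_side(a, b):          # left-right split screen
    return [ra + rb for ra, rb in zip(a, b)]

def stacked(a, b):               # up-down split screen
    return a + b

def floating(a, b, row, col):    # interface `a` floats over interface `b`
    out = [list(r) for r in b]
    for i, r in enumerate(a):
        for j, ch in enumerate(r):
            out[row + i][col + j] = ch
    return ["".join(r) for r in out]

ui1 = ["11", "11"]
ui2 = ["2222", "2222", "2222"]
lr = side_by_side(ui1, ["22", "22"])   # left-right split screen
fl = floating(ui1, ui2, 0, 2)          # ui1 floats at ui2's top-right corner
```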
In some embodiments, the screen recording method further comprises: and responding to the screen recording instruction sent by the second electronic equipment, and acquiring identification information of the application program associated with the screen recording instruction.
With this technical solution, in a screen projection scenario, the user can select screen recording on either the source device side or the target device side and designate which application on the source device is recorded.
In some embodiments, the applications associated with the first operation further include an application that is projected for display onto the second electronic device.
With this technical solution, an application projected for display onto the second electronic device can be recorded on the first electronic device side, and the projected application may run in either the foreground or the background of the first electronic device.
In a second aspect, an embodiment of this application provides a screen recording method applied to a second electronic device, where a first electronic device projects one or more application windows for display on the second electronic device. The method includes: receiving the media streams sent by the first electronic device; in response to a first operation that starts a screen recording function, acquiring identification information of the application window associated with the first operation; acquiring, from the received media streams, the media stream corresponding to that identification information; and generating a screen recording file from the acquired media stream.
With this technical solution, by obtaining the identification information of the application window to be recorded and extracting the corresponding media stream from the received media streams, the projected application interface can be recorded on the second electronic device side.
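A sketch of the second aspect on the sink device: from the received media streams, keep only the one whose window identification matches the recording operation, then build the file. The `(device ID, process ID)` pair used as the key here is one plausible unique identification; all other names are illustrative:

```python
# Streams received from the source device; each carries its window's
# identification (a (device-ID, process-ID) pair) plus its frames.
received = [
    {"window_id": ("dev-A", 1001), "frames": ["x0", "x1"]},
    {"window_id": ("dev-A", 1002), "frames": ["y0", "y1"]},
]

def record_projected_window(streams, window_id):
    # Keep only the media stream matching the window associated with the
    # first operation, then build the recording "file" from its frames.
    stream = next(s for s in streams if s["window_id"] == window_id)
    return "|".join(stream["frames"])

recording = record_projected_window(received, ("dev-A", 1002))
```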
In some embodiments, the identification information includes a process ID of the application window and a device ID of the first electronic device.
By adopting the technical scheme, the process ID of the application program window and the equipment ID of the first electronic equipment can be used as the unique identification of the application window for recording the screen on the second electronic equipment side.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium, which includes computer instructions, and when the computer instructions are executed on an electronic device, the electronic device executes a screen recording method according to the first aspect or the second aspect.
In a fourth aspect, an embodiment of the present application provides an electronic device, where the electronic device includes a processor and a memory, where the memory is used to store instructions, and the processor is used to call the instructions in the memory, so that the electronic device executes a screen recording method according to the first aspect or the second aspect.
In a fifth aspect, the present application provides a computer program product, which when run on a computer, causes the computer to execute the screen recording method according to the first aspect or the second aspect.
In a sixth aspect, an apparatus is provided, where the apparatus has a function of implementing the behavior of the first electronic device in the method provided by the first aspect or the second aspect. The functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above.
It should be understood that the computer-readable storage medium of the third aspect, the electronic device of the fourth aspect, the computer program product of the fifth aspect, and the apparatus of the sixth aspect all correspond to the method of the first aspect or the second aspect, and therefore, the beneficial effects achieved by the apparatus can refer to the beneficial effects in the corresponding methods provided above, and are not described herein again.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 2 is a schematic diagram of a software structure of an electronic device according to an embodiment of the present application;
fig. 3 is a schematic view of an application scenario of a screen recording method according to an embodiment of the present application;
fig. 4 is a schematic view of an application scenario of a screen recording method according to another embodiment of the present application;
fig. 5 is a schematic view of an application scenario of a screen recording method according to another embodiment of the present application;
fig. 6 is a schematic flowchart of a screen recording method according to an embodiment of the present application;
fig. 7 is a schematic flowchart of a screen recording method according to another embodiment of the present application;
fig. 8 is a schematic structural diagram of a possible first electronic device according to an embodiment of the present disclosure.
Detailed Description
In the present application, "at least one" means one or more, "and" a plurality "means two or more. "and/or" describes an association relationship of associated objects, meaning that three relationships may exist, e.g., A and/or B may represent: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. The terms "first," "second," "third," "fourth," and the like in the description and in the claims and drawings of the present application, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
In the embodiments of the present application, words such as "exemplary" or "for example" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present relevant concepts in a concrete fashion.
For ease of understanding, some descriptions of concepts related to the embodiments of the present application are given by way of illustration and reference.
The plurality of electronic devices can form the super terminal through the communication network, the super terminal can be defined as presenting the plurality of electronic devices as a unified whole, and the plurality of electronic devices are organically integrated in the use experience, so that the user experience under the use environment of the plurality of electronic devices is improved. Each electronic device is a component of the super terminal, and a background application program of a certain electronic device may be used on other electronic devices in the super terminal, or some running information of other electronic devices in the super terminal may be viewed on a certain electronic device.
The electronic device may be at least one of a mobile phone, a foldable electronic device, a tablet computer, a Personal Computer (PC), a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a Personal Digital Assistant (PDA), an Augmented Reality (AR) device, a Virtual Reality (VR) device, an Artificial Intelligence (AI) device, a wearable device, a smart car device, an intelligent home device, or a city device, and the embodiment of the present application does not particularly limit the specific type of the electronic device. The communication network may be a wired network or a wireless network. For example, the communication network may be a Local Area Network (LAN) or a Wide Area Network (WAN), such as the internet. When the communication network is a local area network, the communication network may be a wifi hotspot network, a wifi P2P network, a bluetooth network, a zigbee network, or a Near Field Communication (NFC) network, for example. When the communication network is a wide area network, the communication network may be, for example, a third generation mobile communication technology (3 rd-generation wireless telephone technology, 3G) network, a fourth generation mobile communication technology (4G) network, a fifth generation mobile communication technology (5G) network, a future-evolution Public Land Mobile Network (PLMN), the internet, or the like.
The electronic device may install one or more applications. An application program, or simply an application, is a software program that implements one or more specific functions, for example an instant messaging application, a video application, an audio application, an image capture application, or a cloud desktop application. Instant messaging applications may include, for example, short message applications and similar apps; photo sharing applications are another example (specific application names appeared as trademark images in the original). Image capture applications may include, for example, a camera application (the system camera or a third-party camera application). Video applications and audio applications may likewise include various players (also shown as trademark images in the original). An application mentioned in the following embodiments may be a system application installed when the electronic device leaves the factory, or a third-party application downloaded from the network or obtained from another electronic device while the user uses the device.
The electronic device includes, but is not limited to, a device running any of several operating systems (shown as trademark images in the original) or another operating system.
Fig. 1 illustrates a schematic structure of an electronic device 10.
The electronic device 10 may include a processor 110, an external memory interface 120, an internal memory 121, an antenna 1, an antenna 2, a mobile communication module 130, a wireless communication module 140, an audio module 150, a sensor module 160, a camera module 170, a display screen 180, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation to the electronic device 10. In other embodiments of the present application, the electronic device 10 may include more or fewer components than illustrated, or combine certain components, or split certain components, or arrange different components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), among others. Wherein, the different processing units may be independent devices or may be integrated in one or more processors.
The processor can generate an operation control signal according to the instruction operation code and the time sequence signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 may be a cache memory. The memory may store instructions or data that the processor 110 has recently used or uses frequently. If the processor 110 needs the instructions or data again, it can call them directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and improves system efficiency.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc. The processor 110 may be connected to the audio module, the wireless communication module, the display, the camera, and the like through at least one of the above interfaces.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only an exemplary illustration, and does not limit the structure of the electronic device 10. In other embodiments of the present application, the electronic device 10 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The wireless communication function of the electronic device 10 may be implemented by the antenna 1, the antenna 2, the mobile communication module 130, the wireless communication module 140, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 10 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 130 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied to the electronic device 10. The mobile communication module 130 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 130 can receive the electromagnetic wave from the antenna 1, and filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 130 can also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 130 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 130 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 180. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 130 or other functional modules, independent of the processor 110.
The wireless communication module 140 may provide solutions for wireless communication applied to the electronic device 10, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Bluetooth Low Energy (BLE), ultra wide band (UWB), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 140 may be one or more devices integrating at least one communication processing module. The wireless communication module 140 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 140 may also receive a to-be-sent signal from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves via the antenna 2 for radiation.
In some embodiments, antenna 1 of electronic device 10 is coupled to mobile communication module 130 and antenna 2 is coupled to wireless communication module 140 so that electronic device 10 can communicate with networks and other electronic devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), general Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device 10 may implement display functionality via a GPU, a display screen 180, and an application processor, among other things. The GPU is a microprocessor for image processing, connected to the display screen 180 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The sensor module comprises a touch sensor, a pressure sensor, a fingerprint sensor, and the like. The camera module 170 includes a camera. The display screen 180 is used to display images, videos, and the like. The display screen 180 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 10 may include one or more display screens 180.
The electronic device 10 may implement a camera function via the camera module 170, the ISP, the video codec, the GPU, the display screen 180, the application processor (AP), the neural-network processing unit (NPU), and the like.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 10 performs frequency bin selection, the digital signal processor is used to perform a Fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 10 may support one or more video codecs. In this way, the electronic device 10 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 10 can be implemented by the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 10. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card. Or files such as music, video and the like are transmitted from the electronic equipment to the external memory card.
The internal memory 121 may be used to store computer executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, and the like) required by at least one function, and the like. The data storage area may store data created during use of the electronic device 10 (e.g., audio data, phone book, etc.), and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 performs various functional methods or data processing of the electronic device 10 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The audio module 150 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 150 may also be used to encode and decode audio signals. In some embodiments, the audio module 150 may be disposed in the processor 110, or some functional modules of the audio module 150 may be disposed in the processor 110.
The software system of the electronic device 10 may employ a hierarchical architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the present application, a software structure of the electronic device 10 is exemplarily described by taking an Android system with a layered architecture as an example.
Fig. 2 is a block diagram of a software configuration of the electronic device 10 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into five layers, from top to bottom: an application layer, an application framework layer, an Android Runtime (ART) and native C/C++ library layer, a hardware abstraction layer (HAL), and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layers may include a window manager, content provider, view system, resource manager, notification manager, activity manager, input manager, and the like.
The window manager provides a window manager service (WMS), which may be used for window management, window animation management, surface management, and as a relay station for the input system.
Content providers are used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and answered, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without requiring user interaction. For example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as a notification of an application running in the background, or present notifications on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or an indicator light flashes.
The Activity Manager may provide an Activity Manager Service (AMS), which may be used for the start-up, switching, scheduling of system components (e.g., activities, services, content providers, broadcast receivers), and the management and scheduling work of application processes.
The Input Manager may provide an Input Manager Service (IMS) that may be used to manage inputs to the system, such as touch screen inputs, key inputs, sensor inputs, and the like. The IMS takes events from the input device nodes and assigns them to the appropriate windows through interaction with the WMS.
The Android runtime layer comprises a core library and the Android runtime (ART). The Android runtime is responsible for converting bytecode into machine code, mainly by using ahead-of-time (AOT) compilation and just-in-time (JIT) compilation.
The core library is mainly used for providing the functions of basic Java class libraries, such as basic data structure, mathematics, IO, tools, database, network and the like libraries. The core library provides an API for android application development of users.
The native C/C++ library may include a plurality of functional modules, for example: surface manager, Media Framework, libc, OpenGL ES, SQLite, WebKit, and the like.
The surface manager is used to manage the display subsystem and provides fusion of 2D and 3D layers for multiple applications. The media framework supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media framework may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG. OpenGL ES provides drawing and manipulation of 2D and 3D graphics in applications. SQLite provides a lightweight relational database for applications of the electronic device 10.
The hardware abstraction layer runs in a user space (user space), encapsulates the kernel layer driver, and provides a calling interface for an upper layer.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, and a sensor driver.
The following describes exemplary workflow of the software and hardware of the electronic device 10 in connection with capturing a photo scene.
When the display screen receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as touch coordinates and a timestamp of the touch operation). The raw input event is stored at the kernel layer. The application framework layer acquires the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking an example in which the touch operation is a touch click operation and the control corresponding to the click operation is the control of the camera application icon: the camera application calls an interface of the application framework layer to start the camera application, which in turn starts the camera driver by calling the kernel layer and captures a still image or video through the camera.
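The workflow above reduces to a lookup from touch coordinates to a control, and from the control to an application to start. The following is a minimal, illustrative Python sketch of that flow; the function names and data shapes are hypothetical and do not correspond to actual Android framework APIs:

```python
def dispatch_touch_event(raw_event, control_map):
    """Toy model of the flow described above: the kernel wraps a touch into a
    raw input event carrying coordinates, the framework layer looks up which
    control the coordinates fall on, and the corresponding application starts."""
    x, y = raw_event["coords"]
    for control, (left, top, right, bottom) in control_map.items():
        if left <= x < right and top <= y < bottom:
            return f"start {control}"
    return "ignored"

# Hypothetical on-screen controls with bounding boxes (left, top, right, bottom).
controls = {"camera_icon": (0, 0, 100, 100), "gallery_icon": (100, 0, 200, 100)}
event = {"coords": (50, 40), "timestamp": 1693400000}
assert dispatch_touch_event(event, controls) == "start camera_icon"
assert dispatch_touch_event({"coords": (300, 300)}, controls) == "ignored"
```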
An application scenario diagram of the screen recording method provided by an embodiment of the present invention is exemplarily described below with reference to fig. 3.
This embodiment may include two electronic devices (e.g., electronic device 10 as shown in fig. 1). For convenience of description differentiation, the two electronic apparatuses are hereinafter referred to as a first electronic apparatus 100 and a second electronic apparatus 200. The first electronic device 100 and the second electronic device 200 may be located under the same local area network, for example, the first electronic device 100 and the second electronic device 200 establish a communication connection through Wi-Fi, the first electronic device 100 includes a first display 1001, and the second electronic device 200 includes a second display 2001. The first electronic device 100 is a source device, the second electronic device 200 is a destination device, and a certain application running on the first electronic device 100 may be displayed on the second electronic device 200 by projecting a screen, where the application may be a desktop application, a gallery application, a call application, a map application, a music application, a video application, and the like.
The first electronic device 100 and the second electronic device 200 include, but are not limited to, a mobile phone, a tablet computer, a PC, a notebook computer, and the like. The following description will be given by taking the first electronic device 100 as a mobile phone and the second electronic device 200 as a notebook computer as an example.
The first electronic device 100 may open n applications APP1 to APPn, where n is a positive integer. When the first electronic device 100 projects m applications APP1 to APPm onto the second electronic device 200, the second electronic device 200 can generate, through the existing distributed display technology, m application windows corresponding to APP1 to APPm, where m is a positive integer. For example, the frame data output by SurfaceFlinger while the applications APP1 to APPm are running, for example data in YUV format, can be separately encoded and compressed by a video encoder to generate the encoded data of the applications APP1 to APPm. The first electronic device 100 can send the encoded data of the applications APP1 to APPm output by the video encoder in real time to the second electronic device 200, and the second electronic device 200 can draw the application windows of the applications APP1 to APPm according to the decoded encoded data.
Through reverse control of the projected screen, the user can perform control operations on the one or more applications APP1 to APPm displayed on the second electronic device side. For example, the user can perform control operations on APP1 to APPm via the mouse and keyboard of the second electronic device 200.
For example, the first electronic device 100 has started a calculator application APP1, a gallery application APP2, and a chat application APP3. The user can invoke a screen-projection function icon through a touch control operation on the interface of the gallery application APP2. When the user clicks the screen-projection function icon, a list of devices available for screen projection pops up, and the device list may include information such as a device identity (ID) or a device name. When the user selects the second electronic device 200 as the destination device in the device list, the first electronic device 100 may project the application window of the gallery application APP2 to the second electronic device 200, and the second electronic device 200 may generate an application window corresponding to the gallery application APP2. The user can project the application window of the chat application APP3 to the second electronic device 200 in a screen-projection operation manner similar to that used for the gallery application APP2 on the first electronic device 100 side.
As shown in FIG. 3, the calculator application APP1 is displayed in the foreground of the first electronic device 100, the gallery application APP2 and the chat application APP3 run in the background of the first electronic device 100, and the gallery application APP2 and the chat application APP3 are projected onto and displayed on the second electronic device 200.
When screen recording on the first electronic device 100 is required, the user may select an application requiring screen recording on the first electronic device 100 or the second electronic device 200. For example, the operating system of the first electronic device 100 has a screen recording function, and the user may click a screen recording icon on the first electronic device 100 to start the screen recording function. When the user clicks the screen recording icon, the operating system may acquire information of an application currently opened by the first electronic device 100, and pop up an application list that may be selected for screen recording, where the application list may display names or thumbnails of the applications. When the user selects one or more applications in the application list, the first electronic device 100 may obtain identification information of the one or more applications according to an operation of the user, where the identification information may refer to a process ID of the application or a package name of the application, and the first electronic device 100 may start to record a screen of the one or more applications according to the identification information of the application. The process ID is a value used by the kernel of the device operating system to uniquely identify the process of the application.
In some embodiments, each application running on an electronic device has a unique process ID and package name, and each electronic device has a unique device ID, so that a process in an electronic device can be uniquely determined by the process ID together with the device ID as the unique identification information of the application. For example, the device ID of the electronic device is an international mobile equipment identity (IMEI), and the applications installed in the electronic device include QQ and other applications (the remaining application names appear as an image in the original document).
The device ID of the electronic device, the process ID of each application, and the package name are shown in table 1 below:
TABLE 1
(The content of Table 1 is shown as an image in the original document.)
In some embodiments, when the screen recording scene only includes the first electronic device 100, and the first electronic device 100 records a screen of a foreground application or a background application, only a process ID or a package name of the application may be used as the unique identification information of the application. When the screen recording scene includes a plurality of electronic devices, for example, a scene displayed by the application of the first electronic device 100 on the second electronic device 200 when the screen of the application is projected, the first electronic device 100 may also record the foreground application or the background application, and only the process ID or the package name of the application may be used as the unique identification information of the application. When the second electronic device 200 records the screen projection application, the process ID and the device ID of the first electronic device 100 may be used as unique identification information of the screen projection application.
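The identification rules above can be summarized as follows: on a single device, the process ID (or package name) alone suffices, while a (device ID, process ID) pair is needed to identify a projected application across devices. A minimal Python sketch of this logic, with hypothetical names and values for illustration only:

```python
def app_identifier(process_id, device_id=None, cross_device=False):
    """Build the unique identification information of an application, following
    the rules described above: the process ID (or package name) alone is unique
    within one device; recording a projected application on another device
    additionally requires the source device ID."""
    if cross_device:
        if device_id is None:
            raise ValueError("device ID required to identify a projected application")
        return (device_id, process_id)
    return (process_id,)

# Single-device recording: the process ID alone is unique identification information.
assert app_identifier(1024) == (1024,)
# The destination device identifies a projected application by (device ID, process ID).
assert app_identifier(1024, device_id="IMEI-001", cross_device=True) == ("IMEI-001", 1024)
```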
The first electronic device 100 may also be installed with a third-party screen recording application or a screen recording application pre-installed when the first electronic device leaves a factory, and the first electronic device 100 may start the screen recording application. The screen recording application may acquire application information currently started by the first electronic device 100, when a user selects one or more applications for screen recording in the screen recording application of the first electronic device 100, the first electronic device 100 may acquire identification information of the one or more applications according to an operation of the user, and the first electronic device 100 may start to record the one or more applications according to the identification information of the applications.
In some embodiments, the user may also specify, on the second electronic device 200, the screen-projection applications (such as the gallery application APP2 and the chat application APP3) that require screen recording, and the screen recording operation is specifically performed by the first electronic device 100. For example, the second electronic device 200 may generate a screen recording instruction in response to an operation of the user, and the second electronic device 200 may send the screen recording instruction to the first electronic device 100 through a Wi-Fi channel, so that the first electronic device 100 records the screen of the application designated by the user. For example, the second electronic device 200 is installed with a screen recording application, and the screen recording application of the second electronic device 200 may acquire information about the applications of the first electronic device 100 currently displayed on the screen of the second electronic device 200. When the user selects one or more applications for screen recording in the screen recording application of the second electronic device 200, the second electronic device 200 may, in response to the user operation, generate a screen recording instruction associated with the one or more applications, and the screen recording instruction may include the identification information of the applications. The second electronic device 200 may send the screen recording instruction to the first electronic device 100, so as to control the first electronic device 100 to start recording the one or more applications selected by the user.
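As a rough illustration of the screen recording instruction described above, the sketch below models it as a small message carrying the identification information of the selected applications; the field names and the JSON encoding are assumptions made for illustration and are not specified by the patent:

```python
import json

def build_recording_instruction(device_id, app_ids):
    """Assemble a screen recording instruction: it carries the identification
    information (here, process IDs) of the applications the user selected,
    to be sent from the destination device to the source device over Wi-Fi.
    Field names are illustrative, not part of the patent."""
    return json.dumps({"type": "start_recording",
                       "source_device": device_id,
                       "apps": app_ids})

def parse_recording_instruction(message):
    """On the source device side, recover which applications to record."""
    data = json.loads(message)
    return data["apps"]

msg = build_recording_instruction("IMEI-001", [2001, 2002])
assert parse_recording_instruction(msg) == [2001, 2002]
```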
When the first electronic device 100 opens an application, the process of the application may request the WMS to create a Surface for the application window. Multiple processes of an application may correspond to multiple Surfaces. A Surface may refer to a data buffer provided by a screen data consumer to a screen data producer: the screen data producer may produce image content on the Surface, and the screen data consumer may draw the data produced on the Surface to a screen or convert it into the data it requires. For example, in a screen recording scenario of the first electronic device 100, a screen data producer such as a virtual display (VirtualDisplay) may generate data on the Surface, a screen data consumer such as a multimedia recorder (MediaRecorder) may obtain the data from the Surface, and after obtaining the data, the multimedia recorder may encode the data into a video using an encoding tool such as FFmpeg (Fast Forward MPEG).
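The producer/consumer relationship around a Surface described above can be thought of as a shared frame buffer between the two sides. The following toy Python model (not the Android API; all names are illustrative) shows a producer queuing frames and a consumer draining them for encoding:

```python
from collections import deque

class Surface:
    """Toy model of the data buffer described above: a producer (e.g. a
    virtual display) queues frames, and a consumer (e.g. a multimedia
    recorder) drains them to encode into a video."""
    def __init__(self):
        self._frames = deque()

    def queue_frame(self, frame):
        """Producer side: produce image content on the Surface."""
        self._frames.append(frame)

    def acquire_frame(self):
        """Consumer side: take the oldest frame, or None if empty."""
        return self._frames.popleft() if self._frames else None

surface = Surface()
surface.queue_frame("frame-0")
surface.queue_frame("frame-1")
encoded = []
while (f := surface.acquire_frame()) is not None:
    encoded.append(f"encoded({f})")
assert encoded == ["encoded(frame-0)", "encoded(frame-1)"]
```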
Suppose that the first electronic device 100 has started the calculator application APP1, the gallery application APP2, and the chat application APP3; the calculator application APP1 is displayed in the foreground of the first electronic device 100, the gallery application APP2 and the chat application APP3 run in the background of the first electronic device 100, and the user selects the gallery application APP2 and the chat application APP3 for screen recording. The first electronic device 100 may obtain the identification information of the gallery application APP2 and the identification information of the chat application APP3 according to the screen recording operation of the user. The first electronic device 100 obtains the media stream of the gallery application APP2 from the Surface corresponding to the gallery application APP2 according to its identification information, and the first electronic device 100 may generate, from the media stream of the gallery application APP2, a screen recording file f1 corresponding to the gallery application APP2. For example, the first electronic device 100 may invoke SurfaceFlinger or a Direct Digital Synthesis (DDS) algorithm to perform image synthesis on the media stream of the gallery application APP2 to generate the screen recording file f1.
Similarly, the first electronic device 100 may obtain the media stream of the chat application APP3 from the Surface corresponding to the chat application APP3 according to the identification information of the chat application APP3, and the first electronic device 100 may generate, from the media stream of the chat application APP3, a screen recording file f2 corresponding to the chat application APP3.
In some embodiments, the first electronic device 100 may encode the media stream on the Surface to form a screen-recording file during or before the media stream in the Surface is sent to the second electronic device 200.
Suppose the user selects the calculator application APP1 for screen recording. The first electronic device 100 may obtain the identification information of the calculator application APP1 according to the screen recording operation of the user. The first electronic device 100 may obtain the media stream of the calculator application APP1 from the Surface corresponding to the calculator application APP1 according to its identification information, and the first electronic device 100 may generate, from the media stream of the calculator application APP1, a screen recording file f3 corresponding to the calculator application APP1. For example, the first electronic device 100 invokes SurfaceFlinger or the DDS algorithm to perform image synthesis on the media stream of the calculator application APP1 to generate the screen recording file f3.
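The per-application recording described above (screen recording files f1, f2, f3) amounts to demultiplexing media streams by identification information and compositing each selected stream independently. A minimal sketch under that reading, with hypothetical data shapes standing in for Surfaces and files:

```python
def record_selected_apps(streams, selected_ids):
    """streams: mapping from an application's identification information (e.g.
    process ID) to the frames produced on that application's Surface.
    Returns one 'screen recording file' per selected application (modeled as a
    list of composited frames), mirroring how f1, f2, and f3 are generated
    independently of one another above."""
    files = {}
    for app_id in selected_ids:
        frames = streams.get(app_id, [])
        files[app_id] = [f"composited({frame})" for frame in frames]
    return files

streams = {"APP1": ["a0", "a1"], "APP2": ["b0"], "APP3": ["c0", "c1"]}
files = record_selected_apps(streams, ["APP2", "APP3"])
assert files == {"APP2": ["composited(b0)"],
                 "APP3": ["composited(c0)", "composited(c1)"]}
```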
In some embodiments, the second electronic device 200 may also record the screen of the screen-projection applications (e.g., the gallery application APP2 and the chat application APP3). For example, the operating system of the second electronic device 200 has a screen recording function, and the user may click a screen recording icon on the second electronic device 200 to start the screen recording function. When the user clicks the screen recording icon, the operating system may acquire information about the applications currently displayed on the screen of the second electronic device 200, and pop up an application list from which applications can be selected for screen recording, where the application list may display the names or thumbnails of the applications. When the user selects one or more applications in the application list, the second electronic device 200 may obtain the identification information of the one or more applications according to the operation of the user, and the second electronic device 200 may record the one or more applications according to the identification information of the applications.
In a screen projection display scenario, the first electronic device 100 sends the media stream of the gallery application APP2 and the media stream of the chat application APP3 to the second electronic device 200, and the second electronic device 200 decodes and displays the received media streams, so as to display the gallery application APP2 and the chat application APP3 on the second display screen 2001.
The second electronic device 200 may obtain the media stream of the chat application APP3 from the received media streams according to the identification information of the chat application APP3; the identification information may include a process ID and a device ID (the device ID determines that the media stream is sent by the first electronic device 100). The second electronic device 200 may invoke SurfaceFlinger or the DDS algorithm to synthesize the media stream of the chat application APP3 into a screen recording file corresponding to the chat application APP3. For example, the second electronic device 200 may perform image synthesis on the media stream of the chat application APP3 while displaying the chat application APP3.
The second electronic device 200 may also obtain the media stream of the gallery application APP2 from the received media streams according to the identification information of the gallery application APP2, and may invoke SurfaceFlinger or a DDS algorithm to perform image synthesis on that media stream, generating a screen recording file corresponding to the gallery application APP2.
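The selection described above can be sketched as follows. This is a minimal, hypothetical illustration, assuming each received media-stream frame carries its identification information as a (device ID, process ID) pair; the names `select_app_stream`, `device_id`, and `process_id` are placeholders, not an actual API of the devices.

```python
# Illustrative sketch: picking the frames of one screen-cast application out of
# the media streams received by the second electronic device. The frame dicts
# and field names below are assumptions for illustration only.

def select_app_stream(frames, device_id, process_id):
    """Keep only the frames whose identification matches the chosen application."""
    return [f for f in frames
            if f["device_id"] == device_id and f["process_id"] == process_id]

# Media streams received from the first electronic device 100: the gallery
# application (process 21) and the chat application (process 37), interleaved.
received = [
    {"device_id": "dev100", "process_id": 21, "ts": 0,  "data": "gallery-frame-0"},
    {"device_id": "dev100", "process_id": 37, "ts": 0,  "data": "chat-frame-0"},
    {"device_id": "dev100", "process_id": 21, "ts": 33, "data": "gallery-frame-1"},
    {"device_id": "dev100", "process_id": 37, "ts": 33, "data": "chat-frame-1"},
]

# Only the chat application's frames go into its screen recording file.
chat_stream = select_app_stream(received, "dev100", 37)
```

The device ID distinguishes streams originating from the first electronic device 100 from any locally produced content, while the process ID distinguishes the individual screen-cast applications within those streams.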
An application scenario diagram of a screen recording method according to another embodiment of the present invention is exemplarily described below with reference to fig. 4.
This embodiment includes a first electronic device 100 and a second electronic device 200. The following description takes the first electronic device 100 being a mobile phone and the second electronic device 200 being a vehicle-mounted device as an example. The first electronic device 100 may establish a communication connection with the second electronic device 200 through Bluetooth, and the first electronic device 100 and the second electronic device 200 establish a communication channel between the mobile phone and the automobile through a vehicle-mounted phone mapping scheme, thereby extending the applications and services of the mobile phone to the automobile. The phone mapping scheme includes, but is not limited to, HiCar, CarPlay, CarLife, and the like.
As shown in FIG. 4, the first electronic device 100 projects the map application APP11, the phone application APP12, and the music application APP13 onto the second electronic device 200. In an existing vehicle-mounted screen projection scene, when an application on the mobile phone is projected onto the screen of the vehicle-mounted device, the application runs in the background of the mobile phone; the mobile phone cannot open the application again, nor can it switch the application to the foreground for screen recording. Because vehicle-mounted devices generally do not support a screen recording function, the user cannot record the screen-cast application on either the mobile phone side or the vehicle side.
In this application, in order to record a screen-cast application in a vehicle-mounted screen projection scene, the first electronic device 100 may obtain, according to a screen recording operation of the user (an operation on the first electronic device 100 side), the identification information of the application that needs to be recorded, where the identification information may be the process ID of the application or the package name of the application. The first electronic device 100 may obtain the media stream of the application from the corresponding Surface according to the identification information of the application, and may generate a screen recording file corresponding to the application from that media stream. For example, the first electronic device 100 invokes SurfaceFlinger or a DDS algorithm to perform image synthesis on the application's media stream and generate the screen recording file.
When the first electronic device 100 starts the map application APP11, the phone application APP12, and the music application APP13, the process of each application may request the WMS to create a Surface for its application window. The following takes screen recording of the map application APP11 and the phone application APP12 as an example. Assuming that the operating system of the first electronic device 100 has a screen recording function, the user may click a screen recording icon on the first electronic device 100 to start it. When the user clicks the screen recording icon, the operating system may acquire information about the applications currently opened on the first electronic device 100 and pop up an application list available for screen recording; for example, the list may display the names of the applications. When the user selects the map application APP11 and the phone application APP12 in the list, the first electronic device 100 may obtain the identification information of the map application APP11 and the phone application APP12. The first electronic device 100 obtains the media stream of the map application APP11 from the corresponding Surface according to its identification information, performs image synthesis on that media stream, and generates a screen recording file corresponding to the map application APP11.
The first electronic device 100 may also obtain the media stream of the phone application APP12 from the corresponding Surface according to its identification information, perform image synthesis on that media stream, and generate a screen recording file corresponding to the phone application APP12.
In some embodiments, the first electronic device 100 may also have a third-party screen recording application installed, or a screen recording application pre-installed at the factory, and the user may start that screen recording application to record the screen in the vehicle-mounted screen projection scene. After the screen recording application is started, it may acquire information about the applications currently running on the first electronic device 100. When the user selects the map application APP11 and the phone application APP12 for screen recording in the screen recording application, the first electronic device 100 may obtain the identification information of the map application APP11 and the phone application APP12 according to the user's operation, obtain their media streams from the corresponding Surfaces according to that identification information, and then perform image synthesis on the media stream of the map application APP11 and the media stream of the phone application APP12 respectively, generating a screen recording file corresponding to the map application APP11 and a screen recording file corresponding to the phone application APP12.
An application scenario diagram of a screen recording method according to another embodiment of the present invention is exemplarily described below with reference to fig. 5.
This embodiment includes a first electronic device 100 and a second electronic device 200. The first electronic device 100 may be a small-screen terminal device such as a mobile phone or a tablet, and the second electronic device 200 may be a large-screen display device (e.g., a television or a smart screen). The following takes the first electronic device 100 being a mobile phone and the second electronic device 200 being a television as an example. The first electronic device 100 may establish a communication connection with the second electronic device 200 through Wi-Fi. In a distributed game scenario, the first electronic device 100 may serve as a game controller, and the second electronic device 200 may serve as the display end, displaying the game picture projected by the first electronic device 100.
When recording the screen in an existing distributed game scene, the screen display content of the first electronic device 100 is recorded on the first electronic device 100 side, and the screen display content of the second electronic device 200 is recorded on the second electronic device 200 side; the screen display contents on the two sides cannot be recorded simultaneously. Currently, the two sides can only be recorded separately and the two screen recording files spliced afterwards. Because of network delay, the clocks of the two devices cannot be accurately synchronized, so the spliced screen recording file has a time difference.
In the present application, in order to record the screen display contents on both sides in a clock-synchronized manner in a distributed game scene, the screen display contents of the first electronic device 100 and the second electronic device 200 are both recorded on the first electronic device 100 side. As shown in fig. 5, in a distributed game scene, the mobile phone serves as a game controller and the television serves as a game display, and the device IDs of the two devices (the mobile phone and the television) are different. Because the game applications run by the two devices belong to the same game developer, they are named with the same package name; that is, the package names of the game applications on the two devices are the same. The operating-system kernel of each device generates a different value to uniquely identify each process of the applications running on it; that is, the process IDs of the game applications on the two devices are different.
In a distributed game scenario, when the first electronic device 100 projects a game application to the second electronic device 200, the first electronic device 100 may automatically open a game controller; at this time, the first electronic device 100 displays a game control interface, and the second electronic device 200 displays the game picture. The user can customize the position and size of the keys in the game control interface. The WMS creates two mutually independent Surfaces for the process of the game application in the first electronic device 100. The two mutually independent Surfaces may correspond to the same process ID: a screen data producer may produce the content of the game control interface on one Surface and the content of the game picture on the other Surface.
In some embodiments, the first electronic device 100 may display the handle function icon when the first electronic device 100 casts the game application to the second electronic device 200. When the user clicks the handle function icon, the first electronic device 100 enters a "handle mode", and the first electronic device 100 displays a game manipulation interface.
In response to a screen recording operation of the user, the first electronic device 100 may acquire the media streams from the two Surfaces, time-align the two media streams according to their timestamps, and then perform image synthesis and splicing on the time-aligned media streams to generate a screen recording file containing both the game operation interface and the game picture interface. The user can thus view a screen recording in which the game operation interface and the game picture interface are time-synchronized, improving user experience. For example, the splicing may arrange the game operation interface and the game picture interface in the screen recording file in a vertical split-screen manner or a horizontal split-screen manner.
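The time alignment and splicing step above can be sketched as follows. This is an illustrative model, assuming each media stream is a list of (timestamp, frame) pairs; the nearest-timestamp pairing and the string concatenation stand in for the SurfaceFlinger/DDS image synthesis, and the function name and tolerance value are assumptions.

```python
# Sketch of time alignment and vertical split-screen splicing of two media
# streams (game control interface and game picture), purely for illustration.

def align_and_splice(control_stream, picture_stream, tolerance_ms=16):
    """Pair each control-interface frame with the picture frame whose timestamp
    is nearest, within a tolerance, producing composite split-screen frames."""
    spliced = []
    for ts, control_frame in control_stream:
        # Find the picture frame closest in time to this control frame.
        nearest_ts, picture_frame = min(picture_stream,
                                        key=lambda p: abs(p[0] - ts))
        if abs(nearest_ts - ts) <= tolerance_ms:
            # "Image synthesis": combine the two frames into one composite.
            spliced.append((ts, f"{picture_frame}|{control_frame}"))
    return spliced

control = [(0, "ctl0"), (33, "ctl1"), (66, "ctl2")]
picture = [(1, "pic0"), (34, "pic1"), (70, "pic2")]
recording = align_and_splice(control, picture)
```

Because both streams are produced and timestamped on the first electronic device 100, no cross-device clock synchronization is needed for this pairing to be accurate.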
In some embodiments, the game operation interface in the screen recording file may also be displayed in a floating manner in a designated area of the game picture interface, and the position of that area may be customized by the user. For example, when the user performs the screen recording operation on the first electronic device 100, the operation may include setting the game operation interface to be displayed in a floating manner in the lower left corner area or the lower right corner area of the game picture interface.
In some embodiments, the first electronic device 100 may further acquire the media streams from the two Surfaces in response to a screen recording operation of the user, and may perform image synthesis on the two media streams respectively to generate a first screen recording subfile corresponding to the game operation interface and a second screen recording subfile corresponding to the game picture interface. The first electronic device 100 may perform time alignment and splicing on the first screen recording subfile and the second screen recording subfile according to their timestamps, generating a screen recording file containing both the game operation interface and the game picture interface, so that the user can view a screen recording of the time-synchronized game operation interface and game picture interface.
Assuming that the operating system of the first electronic device 100 has a screen recording function, the user may click a screen recording icon on the first electronic device 100 to start it. When the user clicks the screen recording icon, the operating system may acquire information about the game application currently opened on the first electronic device 100, and the first electronic device 100 acquires the media stream of the game application from the corresponding Surface according to the identification information of the game application. For example, the identification information of the game application includes its process ID, and the first electronic device 100 may determine the two Surfaces corresponding to the game application according to that process ID. The first electronic device 100 may acquire the media streams from the two Surfaces, time-align the two media streams according to their timestamps, and then perform image synthesis and splicing on the time-aligned media streams to generate a screen recording file containing both the game operation interface and the game picture interface.
Referring to fig. 6, an embodiment of the present application provides a screen recording method applied to a first electronic device 100. In this embodiment, the screen recording method may include:
61. In response to a first operation for starting the screen recording function, acquire identification information of an application associated with the first operation.
In some embodiments, when the user wishes to record the screen, the user may start the screen recording function of the first electronic device 100, for example, by performing a first operation. The first operation may be any one of pressing physical keys (for example, pressing the power key and the volume-up key simultaneously), performing a predefined gesture (for example, sliding three fingers down the screen, knocking the screen with a knuckle, or shaking the mobile phone), operating a corresponding on-screen button (for example, clicking a screen recording control in the notification bar), inputting a voice command, operating a screen recording application, and the like. The embodiment of the present application does not specifically limit the screen recording operation.
In some embodiments, the application associated with the first operation may include one or more applications, and the one or more applications may be running in the foreground of the first electronic device 100, in the background, or partially in the foreground and partially in the background. The identification information refers to information that can uniquely identify the application; for example, it may include the process ID of the application or the package name of the application. In other embodiments, the identification information may further include the process ID of the application together with the device ID of the device running the application, or the package name of the application together with that device ID.
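The identification information above can be modeled as a small record. This is a hypothetical sketch, not the devices' actual data structure: it illustrates why, in a distributed scene, the package name alone is not unique (the same game runs on both devices) while the (device ID, process ID) pair is.

```python
# Illustrative model of the identification information; all names are assumed.
from dataclasses import dataclass

@dataclass(frozen=True)
class AppIdentification:
    device_id: str      # ID of the device running the application
    process_id: int     # process ID, unique only within one device's kernel
    package_name: str   # may be shared by the same app on different devices

# The same game application on the mobile phone and on the television:
phone_game = AppIdentification("dev100", 1234, "com.example.game")
tv_game = AppIdentification("dev200", 5678, "com.example.game")

# Same package name, yet the identifications remain distinct.
same_package = phone_game.package_name == tv_game.package_name
distinct = phone_game != tv_game
```

This mirrors the distributed game scene described earlier: identical package names, different device IDs and process IDs.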
In some embodiments, in a screen projection scene, when one or more applications running on the first electronic device 100 are projected to be displayed on the second electronic device 200, the user may further specify a screen projection application that needs screen recording on the second electronic device 200, and a specific screen recording operation is performed by the first electronic device 100. The first electronic device 100 may further acquire identification information of an application associated with the screen recording instruction in response to the screen recording instruction sent by the second electronic device 200. For example, the second electronic device 200 may generate screen recording instructions associated with one or more applications in response to a user operation, and the screen recording instructions may include identification information of the applications. The second electronic device 200 may send the screen recording instruction to the first electronic device 100, and the first electronic device 100 may further obtain the identification information of the application requiring screen recording.
62. Acquire the media stream of the application according to the identification information.
In some embodiments, when the first electronic device 100 opens an application, a process of the application may request the WMS to create a Surface for its application window. Each process of the application may correspond to at least one Surface. A Surface may refer to a data buffer provided by a screen data consumer to a screen data producer: the screen data producer produces image content into the buffer, and the screen data consumer consumes the produced data, for example by drawing it on the screen or converting it into the data it requires. The first electronic device 100 may determine the Surface corresponding to the application according to the identification information of the application, and may then acquire the media stream of the application from that Surface.
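The lookup from identification information to Surface can be sketched as a registry keyed by process ID. The `Surface` class and `surface_registry` below are hypothetical stand-ins for the WMS's internal bookkeeping, shown only to illustrate the one-process-to-many-Surfaces relation described above.

```python
# Hypothetical sketch of WMS-style bookkeeping: each application process maps
# to the Surface buffer(s) created for its windows.

class Surface:
    """A data buffer into which a screen data producer writes image content."""
    def __init__(self, name):
        self.name = name
        self.frames = []   # produced media stream accumulates here

# One process may own more than one Surface (e.g., the distributed game case).
surface_registry = {
    1234: [Surface("game-control"), Surface("game-picture")],
    2345: [Surface("map-window")],
}

def surfaces_for(process_id):
    """Return the Surfaces from which the app's media stream can be acquired."""
    return surface_registry.get(process_id, [])

names = [s.name for s in surfaces_for(1234)]
```

Given the identification information of a recorded application, the recorder reads the media stream directly from these Surfaces, regardless of whether the application is in the foreground or the background.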
63. Generate a screen recording file corresponding to the application according to the media stream of the application.
In some embodiments, when the first electronic device 100 acquires the media stream of the application, the first electronic device 100 may invoke SurfaceFlinger or a DDS algorithm to perform image synthesis on the media stream of the application and generate a screen recording file corresponding to the application.
In some embodiments, in some application scenarios, the first electronic device 100 may acquire two media streams according to the identification information of a single application. For example, in a distributed game scenario, when the first electronic device 100 projects a game application to the second electronic device 200, the first electronic device 100 may automatically open a game controller; at this time, the first electronic device 100 displays a game control interface, and the second electronic device 200 displays the game picture. The WMS creates two mutually independent Surfaces (e.g., a first Surface and a second Surface) for the process of the game application in the first electronic device 100. The two Surfaces may correspond to the same process ID: a screen data producer may produce the content of the game control interface on the first Surface and the content of the game picture on the second Surface.
In response to a screen recording operation of the user, the first electronic device 100 may acquire the media streams from the first Surface and the second Surface, time-align the two media streams according to their timestamps, and then perform image synthesis and splicing on the time-aligned media streams to generate a screen recording file containing both the game operation interface and the game picture interface. The user can thus view a screen recording in which the game operation interface and the game picture interface are time-synchronized, improving user experience. For example, the splicing may arrange the game operation interface and the game picture interface in the screen recording file in a vertical split-screen manner or a horizontal split-screen manner.
In some embodiments, the game operation interface in the screen recording file may also be displayed in a floating manner in a certain designated area of the game screen interface, and the position of the area may be customized by the user. For example, when the user performs the screen recording operation on the first electronic device 100, the screen recording operation includes setting the game operation interface to be displayed in a lower left corner area or a lower right corner area of the game screen interface in a floating manner.
In some embodiments, the first electronic device 100 may further obtain media streams from the first Surface and the second Surface in response to a screen recording operation of the user, and the first electronic device 100 may perform image synthesis on the two media streams to generate a first screen recording subfile corresponding to the game operation interface and a second screen recording subfile corresponding to the game screen interface. The first electronic device 100 may perform time alignment processing and splicing processing on the first screen recording subfile and the second screen recording subfile according to the timestamp of the first screen recording subfile and the timestamp of the second screen recording subfile, generate a screen recording file including a game operation interface and a game picture interface, and then the user may view the screen recording of the time-synchronized game operation interface and the game picture interface.
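The subfile variant above can be sketched as follows: two already-synthesized screen recording subfiles are aligned by their start timestamps before splicing, so the later-starting subfile defines the common starting point. This is a simplified illustration assuming each subfile is a list of (timestamp, frame) pairs recorded against the same local clock; the function name is an assumption.

```python
# Illustrative alignment of two screen recording subfiles by timestamp.

def align_subfiles(subfile_a, subfile_b):
    """Each subfile is a list of (timestamp_ms, frame) pairs. Drop the frames
    recorded before both subfiles were running, then pair the remainders for
    splicing into one screen recording file."""
    common_start = max(subfile_a[0][0], subfile_b[0][0])
    a = [(ts, f) for ts, f in subfile_a if ts >= common_start]
    b = [(ts, f) for ts, f in subfile_b if ts >= common_start]
    return list(zip(a, b))

control_sub = [(0, "c0"), (33, "c1"), (66, "c2")]
picture_sub = [(33, "p1"), (66, "p2")]   # picture recording started later
merged = align_subfiles(control_sub, picture_sub)
```

Since both subfiles are produced on the first electronic device 100, their timestamps share one clock and the alignment introduces no cross-device time difference.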
According to the screen recording method described above, the identification information of the application that needs screen recording is obtained, and the screen recording file is generated from the media stream obtained via that identification information. Compared with existing screen recording of the local desktop, this method can record one or more application interfaces of the electronic device, and the recorded application may run in the foreground or the background of the electronic device. Meanwhile, the method can record two or more screen display contents of a distributed display scene on the local electronic device side; recording multiple screen display contents on the same device guarantees that the recording clocks are synchronized, so the user can view the time-synchronized display interfaces in one screen recording file.
Referring to fig. 7, an embodiment of the present application provides a screen recording method applied to a first electronic device 100. The first electronic device 100 and the second electronic device 200 may be located in the same local area network; for example, the first electronic device 100 and the second electronic device 200 establish a communication connection through Wi-Fi or Bluetooth. The first electronic device 100 includes a first display 1001, and the second electronic device 200 includes a second display 2001. The first electronic device 100 is the source device, the second electronic device 200 is the destination device, and one or more applications running on the first electronic device 100 can be projected and displayed on the second electronic device 200. In this embodiment, the screen recording method may include:
71. When the first electronic device 100 detects a first operation for starting the screen recording function, the first electronic device 100 acquires identification information of an application associated with the first operation, where the application associated with the first operation includes an application projected to the second electronic device.
In some embodiments, in a screen projection scenario, when the user wishes to record the screen, the user may start the screen recording function of the first electronic device 100, for example, by performing a first operation. The first operation may be, for example, any one of pressing a physical key, performing a predefined gesture, operating a corresponding on-screen button, inputting a voice command, operating a screen recording application, and the like. The embodiment of the present application does not specifically limit the screen recording operation.
In some embodiments, the applications associated with the first operation may include one or more applications that may be running in the foreground of the first electronic device 100, or in the background (e.g., screen-cast on the second electronic device 200), or partially in the foreground and partially in the background (e.g., screen-cast on the second electronic device 200). The identification information may refer to information that may uniquely identify the application, for example, the identification information may include a process ID of the application or a package name of the application. In other embodiments, the identification information may also include a device ID for running the application.
72. When the first electronic device 100 receives a screen recording instruction sent by the second electronic device 200, the first electronic device 100 acquires identification information of an application associated with the screen recording instruction, where the application associated with the screen recording instruction includes an application that is projected to the second electronic device.
In some embodiments, the user may also specify, on the second electronic device 200, a screen-cast application that requires screen recording, while the specific screen recording operation is performed by the first electronic device 100. When the first electronic device 100 receives the screen recording instruction sent by the second electronic device 200, the first electronic device 100 acquires the identification information of the application associated with the screen recording instruction. For example, the second electronic device 200 may generate, in response to a user operation, a screen recording instruction associated with one or more applications, and the screen recording instruction may include the identification information of those applications. The second electronic device 200 may send the screen recording instruction to the first electronic device 100, so that the first electronic device 100 obtains the identification information of the applications requiring screen recording.
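One possible shape for that screen recording instruction is a small JSON message carried over the existing Wi-Fi/Bluetooth connection. The command name and field names below are assumptions for illustration; the source does not specify a wire format.

```python
# Hypothetical sketch of the screen recording instruction exchanged between the
# second electronic device (sender) and the first electronic device (receiver).
import json

def build_recording_instruction(app_ids):
    """Device 200 side: package the identification info of the selected apps."""
    return json.dumps({"command": "start_screen_recording",
                       "applications": app_ids})

def parse_recording_instruction(message):
    """Device 100 side: recover the identification info used to locate the
    Surfaces of the applications to be recorded."""
    payload = json.loads(message)
    assert payload["command"] == "start_screen_recording"
    return payload["applications"]

msg = build_recording_instruction([{"process_id": 1234,
                                    "package_name": "com.example.map"}])
apps = parse_recording_instruction(msg)
```

The instruction carries only identification information; the media streams themselves never leave the first electronic device 100 during recording.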
In some embodiments, the first electronic device 100 may choose to perform step 71 or choose to perform step 72.
73. The first electronic device 100 acquires the media stream of the application according to the identification information.
In some embodiments, the first electronic device 100 may determine a Surface corresponding to the application according to the identification information of the application, and the first electronic device 100 may obtain a media stream of the application from the Surface corresponding to the application.
74. The first electronic device 100 generates a screen recording file corresponding to the application according to the media stream of the application.
In some embodiments, when the first electronic device 100 acquires the media stream of the application, the first electronic device 100 may invoke SurfaceFlinger or a DDS algorithm to perform image synthesis on the media stream of the application and generate a screen recording file corresponding to the application.
In some embodiments, in some application scenarios, the first electronic device 100 may acquire the two media streams according to identification information of a certain application, for example, in a distributed game scenario, the first electronic device 100 displays a game control interface, and the second electronic device 200 displays a game screen. The WMS creates two separate surfaces (e.g., a first Surface and a second Surface) for the process of the game application in the first electronic device 100. The two mutually independent surfaces can correspond to the same process ID, and a screen data producer can produce the content of the game control interface on the first Surface and produce the content of the game picture on the second Surface.
In response to a screen recording operation of the user, the first electronic device 100 may acquire the media streams from the first Surface and the second Surface, time-align the two media streams according to their timestamps, and then perform image synthesis and splicing on the time-aligned media streams to generate a screen recording file containing both the game operation interface and the game picture interface. The user can thus view a screen recording in which the game operation interface and the game picture interface are time-synchronized, improving user experience. For example, the splicing may arrange the game operation interface and the game picture interface in the screen recording file in a vertical split-screen manner or a horizontal split-screen manner.
In some embodiments, the game operation interface in the screen recording file may also be displayed in a floating manner in a certain designated area of the game screen interface, and the position of the area may be customized by the user. For example, when the user performs a screen recording operation on the first electronic device 100, the screen recording operation includes setting the game operation interface to be displayed in a lower left corner area or a lower right corner area of the game screen interface in a floating manner.
In some embodiments, the first electronic device 100 may further obtain media streams from the first Surface and the second Surface in response to a screen recording operation of the user, and the first electronic device 100 may perform image synthesis on the two media streams respectively to generate a first screen recording subfile corresponding to the game operation interface and a second screen recording subfile corresponding to the game screen interface. The first electronic device 100 may perform time alignment processing and splicing processing on the first screen recording subfile and the second screen recording subfile according to the time stamp of the first screen recording subfile and the time stamp of the second screen recording subfile, generate a screen recording file including a game operation interface and a game picture interface, and then the user may view screen recording of the time-synchronized game operation interface and the game picture interface.
In some embodiments, the second electronic device 200 may also record the screen projection application. In an existing screen-projection display scenario, the second electronic device 200 may receive the media stream sent by the first electronic device 100, and decode and display the received media stream. When the user wishes to record the screen on the second electronic device 200 side, the user may start the screen recording function of the second electronic device 200, for example, by performing a second operation. The second operation may be, for example, any one of pressing a physical key, performing a predefined gesture, operating a corresponding on-screen button, inputting a voice command, operating a screen recording application, and the like. The embodiment of the present application does not specifically limit the screen recording operation.
When the second electronic device 200 detects the second operation for starting the screen recording function, the second electronic device 200 acquires identification information of the application program associated with the second operation. The identification information may include the process ID of the application program and the device ID of the first electronic device 100 (the device ID of the first electronic device 100 indicates that the media stream actually originates from the first electronic device 100). The second electronic device 200 may acquire, from the received media stream, the media stream corresponding to this identification information, and generate a screen recording file from the acquired media stream. For example, the second electronic device 200 may perform image synthesis on the media stream to generate the screen recording file before displaying the received media stream.
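The selection of the media stream by identification information can be sketched as a filter over received packets. The packet structure (a dict carrying `pid`, `device_id`, and a payload) is an assumption for illustration; the patent does not specify the wire format.

```python
def select_streams(packets, target_pid, target_device_id):
    """Filter received media-stream packets down to one application window.

    Each packet is assumed (for illustration) to carry the source
    application's process ID and the originating device's ID alongside
    its payload. Only packets matching both identifiers are kept for
    image synthesis into the screen recording file.
    """
    return [p for p in packets
            if p["pid"] == target_pid and p["device_id"] == target_device_id]
```

In the scenario above, the second electronic device 200 would apply such a filter before image synthesis, so that the generated screen recording file contains only the projected application window the user selected.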
According to the above screen recording method, the identification information of the application to be recorded is acquired, and the screen recording file is then generated from the media stream obtained based on that identification information. Compared with existing screen recording methods, which cannot record a screen projection application, this method can record a background application of the electronic device, or an application projected for display on another electronic device. It can also record two or more screen display contents in a distributed display scenario on the local electronic device side; recording multiple screen display contents on the same device ensures that the recording clocks are synchronized, so the user can view the contents of multiple display interfaces, synchronized in time, in a single screen recording file.
Referring to fig. 8, a hardware structure diagram of the first electronic device 100 according to an embodiment of the present application is provided. As shown in fig. 8, the first electronic device 100 may include a first display 1001, a first processor 1002, a first memory 1003, and a first communication bus 1004. The first memory 1003 is used to store one or more first computer programs 1005, which are configured to be executed by the first processor 1002. The one or more first computer programs 1005 include instructions that may be used to implement the screen recording method described in fig. 6 or fig. 7 in the first electronic device 100.
It is to be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the first electronic device 100. In other embodiments, the first electronic device 100 may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently.
The present embodiment further provides a computer storage medium, where computer instructions are stored in the computer storage medium, and when the computer instructions are run on an electronic device, the electronic device is caused to execute the above related method steps to implement the screen recording method in the above embodiments.
The present embodiment further provides a computer program product, which when running on a computer, causes the computer to execute the relevant steps described above, so as to implement the screen recording method in the foregoing embodiments.
In addition, an apparatus is further provided, which may specifically be a chip, a component, or a module, and may include a processor and a memory connected to each other. When the apparatus runs, the processor may execute the computer-executable instructions stored in the memory, so that the chip executes the screen recording method in the above method embodiments.
The first electronic device, the computer storage medium, the computer program product, or the chip provided in this embodiment are all configured to execute the corresponding method provided above, and therefore, the beneficial effects that can be achieved by the first electronic device, the computer storage medium, the computer program product, or the chip may refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the division into the foregoing functional modules is merely used as an example. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the described apparatus embodiments are merely illustrative. The division into modules or units is merely a logical function division and may be implemented in other ways in actual practice; for example, a plurality of units or components may be combined or integrated into another apparatus, or some features may be omitted or not implemented. In addition, the shown or discussed mutual couplings or direct couplings or communication connections may be indirect couplings or communication connections through some interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, that is, may be located in one place, or may be distributed to a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented as a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present application essentially, or the part thereof contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for enabling a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application.

Claims (15)

1. A screen recording method is applied to first electronic equipment and is characterized by comprising the following steps:
in response to a first operation of starting a screen recording function, acquiring identification information of application programs associated with the first operation, wherein the application programs associated with the first operation comprise application programs running in a background of the first electronic device;
acquiring a media stream of the application program according to the identification information;
and generating a screen recording file corresponding to the application program according to the media stream of the application program.
2. The screen recording method according to claim 1, wherein the identification information includes a process identification number ID of the application program or a package name of the application program.
3. The screen recording method according to claim 1 or 2, wherein the acquiring the media stream of the application program according to the identification information comprises:
determining a screen data buffer area corresponding to the application program according to the identification information;
and acquiring the media stream of the application program from the screen data buffer area.
4. The screen recording method according to any one of claims 1 to 3, wherein the generating a screen recording file corresponding to the application program according to the media stream of the application program comprises:
and carrying out image synthesis on the media stream of the application program to generate a screen recording file corresponding to the application program.
5. The screen recording method according to any one of claims 1 to 3, wherein the application program corresponds to a first screen data buffer and a second screen data buffer, and the generating of the screen recording file corresponding to the application program according to the media stream of the application program comprises:
performing image synthesis on a first media stream acquired from the first screen data buffer area to generate a first screen recording subfile;
performing image synthesis on a second media stream acquired from the second screen data buffer area to generate a second screen recording subfile;
and splicing the first screen recording subfile and the second screen recording subfile to generate a screen recording file corresponding to the application program.
6. The screen recording method of claim 5, wherein said splicing the first screen recording subfile with the second screen recording subfile comprises:
performing time alignment processing on the first screen recording subfile and the second screen recording subfile according to the time stamps of the first screen recording subfile and the second screen recording subfile;
and splicing the first screen recording subfile and the second screen recording subfile which are subjected to the time alignment processing.
7. The screen recording method according to any one of claims 1 to 3, wherein the application program corresponds to a first screen data buffer and a second screen data buffer, and the generating of the screen recording file corresponding to the application program according to the media stream of the application program comprises:
acquiring a first media stream from the first screen data buffer and a second media stream from the second screen data buffer;
and carrying out image synthesis and splicing on the first media stream and the second media stream to generate a screen recording file corresponding to the application program.
8. The screen recording method of claim 7, wherein the image compositing and splicing the first media stream with the second media stream comprises:
time-aligning the first media stream and the second media stream according to the time stamps of the first media stream and the second media stream;
and carrying out image synthesis and splicing on the first media stream and the second media stream which are subjected to the time alignment processing.
9. The screen recording method according to any one of claims 5 to 8, wherein the screen recording file includes a first interface and a second interface, and the first interface and the second interface are displayed in a left-right split screen manner, or the first interface and the second interface are displayed in an up-down split screen manner, or part or all of the area of the first interface is displayed on the second interface in a floating manner.
10. The screen recording method according to any one of claims 1 to 9, wherein the screen recording method further comprises:
and responding to a screen recording instruction sent by the second electronic equipment, and acquiring identification information of an application program associated with the screen recording instruction.
11. The screen recording method according to any one of claims 1 to 9, wherein the application program associated with the first operation further comprises an application program projected for display on a second electronic device.
12. A screen recording method is applied to second electronic equipment, and one or more application program windows are projected and displayed on the second electronic equipment by first electronic equipment, and the screen recording method is characterized by comprising the following steps of:
receiving a media stream sent by the first electronic equipment;
responding to a first operation of starting a screen recording function, and acquiring identification information of an application program window associated with the first operation;
acquiring a media stream corresponding to the identification information of the application program window associated with the first operation from the received media stream;
and generating a screen recording file according to the acquired media stream.
13. The screen recording method of claim 12, wherein the identification information includes a process ID of the application window and a device ID of the first electronic device.
14. A computer-readable storage medium storing computer instructions that, when executed on an electronic device, cause the electronic device to perform the screen recording method of any one of claims 1-13.
15. An electronic device, wherein the electronic device comprises a processor and a memory, the memory is used for storing instructions, and the processor is used for calling the instructions in the memory to enable the electronic device to execute the screen recording method according to any one of claims 1 to 13.
CN202111006119.6A 2021-08-30 2021-08-30 Screen recording method, electronic equipment and computer readable storage medium Pending CN115734021A (en)

Publications (1)

CN115734021A — published 2023-03-03


Also Published As

WO2023030057A1 — published 2023-03-09


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination