WO2022161006A1 - Photo synthesis method and apparatus, and electronic device and readable storage medium - Google Patents

Photo synthesis method and apparatus, and electronic device and readable storage medium

Info

Publication number
WO2022161006A1
WO2022161006A1, PCT/CN2021/138892, CN2021138892W
Authority
WO
WIPO (PCT)
Prior art keywords
media file
parameter
shot
receiving end
media
Prior art date
Application number
PCT/CN2021/138892
Other languages
English (en)
Chinese (zh)
Inventor
鲍鑫东
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Publication of WO2022161006A1

Classifications

    • H04L 65/1066: Session management (network arrangements, protocols or services for supporting real-time applications in data packet communication)
    • H04L 65/1101: Session protocols
    • H04L 65/60: Network streaming of media packets
    • H04L 65/80: Responding to QoS
    • H04L 67/06: Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
    • H04L 69/24: Negotiation of communication capabilities
    • H04N 23/60: Control of cameras or camera modules comprising electronic image sensors
    • H04N 7/14: Systems for two-way working (television systems)
    • H04N 7/141: Systems for two-way working between two video terminals, e.g. videophone

Definitions

  • The present application relates to the field of electronic technologies, and in particular to a co-shooting method and apparatus, an electronic device, and a readable storage medium.
  • Communication software plays an important role in people's life and work and makes interpersonal communication increasingly rich, so the richness of its functions greatly affects how conveniently people can communicate. People in different locations often want to take photos together as part of that interaction.
  • In the traditional approach, each party takes its own photo, one party then forwards its photo to the other through a file server, and the other party composites the received photo with its own to generate a group photo of the two parties in different locations.
  • However, the two ends shoot according to their respective parameters, so the parameters of the images to be composited are not uniform, which results in a poor picture effect for the generated group photo.
  • The present application provides a co-shooting method and apparatus, an electronic device, and a readable storage medium, which can improve the quality of co-shot images.
  • In a first aspect, a co-shooting method is provided, comprising: a receiving end receives a co-shooting request sent by a requesting end, where the co-shooting request carries a first parameter, and the first parameter is a media file parameter supported by the requesting end; the receiving end determines a co-shot parameter according to the first parameter and a second parameter, where the second parameter is a media file parameter supported by the receiving end, the media file quality corresponding to the co-shot parameter is lower than or equal to the media file quality corresponding to the first parameter, and the media file quality corresponding to the co-shot parameter is lower than or equal to the media file quality corresponding to the second parameter; the receiving end sends the co-shot parameter to the requesting end, where the co-shot parameter is used by the requesting end to generate a first media file whose parameters are the same as the co-shot parameter; and the receiving end generates a second media file according to the co-shot parameter, where the parameters of the second media file are the same as the co-shot parameter, and the second media file is used to synthesize a co-shot file with the first media file.
  • The first media file may be a photo or a video recorded by the requesting end specifically for the co-shot, or may be an image or video captured by the requesting end from the video stream recorded during a video call with the receiving end; similarly, the second media file may be a photo or a video recorded by the receiving end specifically for the co-shot, or may be an image or video captured by the receiving end from the video stream recorded during a video call with the requesting end, which is not limited in this embodiment.
  • Optionally, the co-shot parameter may be the one of the first parameter and the second parameter that corresponds to the lower media file quality, where media file quality may be image resolution and/or video smoothness.
  • For example, the co-shot parameter may be the smaller of the highest resolution supported by the requesting end and the highest resolution supported by the receiving end, and/or the smaller of the highest frame rate supported by the requesting end and the highest frame rate supported by the receiving end.
  • Alternatively, the media file quality corresponding to the co-shot parameter may be lower than the media file quality corresponding to both the first parameter and the second parameter.
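  • To make the negotiation rule above concrete, the following is a minimal sketch (hypothetical types and field names, not code from this application) of how a receiving end might derive the co-shot parameters from its own capabilities and the first parameter carried in the co-shooting request, taking the smaller value for each quality dimension.

```java
// Minimal sketch of the co-shot parameter negotiation described above.
// The class and field names are hypothetical; only the "take the smaller value" rule is from the text.
final class MediaParams {
    final int maxWidth;      // maximum image width in pixels
    final int maxHeight;     // maximum image height in pixels
    final int maxFrameRate;  // maximum frame rate in frames per second

    MediaParams(int maxWidth, int maxHeight, int maxFrameRate) {
        this.maxWidth = maxWidth;
        this.maxHeight = maxHeight;
        this.maxFrameRate = maxFrameRate;
    }

    /** Co-shot parameters: no higher than either end's capability. */
    static MediaParams negotiate(MediaParams requestingEnd, MediaParams receivingEnd) {
        return new MediaParams(
                Math.min(requestingEnd.maxWidth, receivingEnd.maxWidth),
                Math.min(requestingEnd.maxHeight, receivingEnd.maxHeight),
                Math.min(requestingEnd.maxFrameRate, receivingEnd.maxFrameRate));
    }
}
```

  • For example, if the requesting end supports at most 1920x1080 at 60 fps and the receiving end supports at most 1280x720 at 30 fps, the co-shot parameter negotiated under this rule would be 1280x720 at 30 fps, which neither end exceeds.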
  • In this way, the requesting end and the receiving end negotiate, through a signaling exchange before shooting, the co-shot parameter that governs co-shot image quality and agree on a single co-shot parameter, so that the media files to be synthesized (i.e., the first media file and the second media file) can each be generated with identical parameters according to the negotiated co-shot parameter. The requesting end or the receiving end can then synthesize these media files, which have the same parameters, into a co-shot file, avoiding the poor picture effect caused by inconsistent parameters of the files to be synthesized and thereby improving the quality of the co-shot file.
  • Optionally, the first parameter is the highest-specification media file parameter supported by the requesting end, and the second parameter is the highest-specification media file parameter supported by the receiving end. When the co-shot parameter is the one of the first parameter and the second parameter corresponding to the lower media file quality, the co-shot parameter represents the maximum shooting capability shared by both ends, so media files of the highest specification supported by both ends can be generated, making the quality of the co-shot file as high as possible.
  • Optionally, the first media file and the second media file are both videos, the first parameter includes the maximum frame rate and maximum image resolution supported by the requesting end, the second parameter includes the maximum frame rate and maximum image resolution supported by the receiving end, and the co-shot parameters include a co-shot frame rate and a co-shot image resolution. The co-shot frame rate is less than or equal to the smaller of the maximum frame rate supported by the requesting end and the maximum frame rate supported by the receiving end, and the co-shot image resolution is less than or equal to the smaller of the maximum image resolution supported by the requesting end and the maximum image resolution supported by the receiving end. The frame rate of the first media file and the frame rate of the second media file are both the same as the co-shot frame rate, and the image resolution of the first media file and the image resolution of the second media file are both the same as the co-shot image resolution.
  • In this way, the requesting end and the receiving end negotiate the co-shot frame rate and co-shot resolution so that the parameters of the video recorded at the two ends are unified, coordinated control of picture quality at both ends is achieved, and both the picture quality of the co-shot video and the user experience of video co-shooting are improved.
  • Optionally, both the first media file and the second media file are media files obtained from the video call between the receiving end and the requesting end, the co-shot frame rate is greater than the initial frame rate of the video call, and the co-shot image resolution is smaller than the initial image resolution of the video call.
  • Before co-shooting, the two parties conduct the video call at the initial frame rate and the initial image resolution. For co-shooting, the two parties can negotiate the co-shot parameters according to a strategy that prioritizes video smoothness at the expense of image clarity: the receiving end takes a frame rate increased from the initial frame rate as the co-shot frame rate, and takes an image resolution reduced from the initial image resolution as the co-shot image resolution, thereby ensuring the smoothness of the co-shot video.
  • In this case, the parameters of the video stream transmitted in the video call are the same as the co-shot parameters, so the picture quality of the video call displayed at both ends is the same as the picture quality of the co-shot video file. The video each user sees on their own device interface is the same as the finally generated co-shot video, that is, what you see is what you get, which improves the user's co-shooting experience.
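  • As a purely illustrative sketch of this smoothness-first strategy (the step sizes and method names below are assumptions, not values stated in this application), the co-shot frame rate is raised above the call's initial frame rate and the co-shot resolution is lowered below the call's initial resolution, with the frame rate still capped by the negotiated maximum:

```java
// Hypothetical smoothness-first adjustment: raise the frame rate and lower the
// resolution relative to the video call's initial values for the co-shot recording.
final class SmoothnessFirstPolicy {
    /** Co-shot frame rate: e.g. double the initial rate, capped by the negotiated maximum. */
    static int coShotFrameRate(int initialFps, int negotiatedMaxFps) {
        return Math.min(initialFps * 2, negotiatedMaxFps);
    }

    /** Co-shot resolution: e.g. halve each dimension to free bandwidth for the higher rate. */
    static int[] coShotResolution(int initialWidth, int initialHeight) {
        return new int[] { initialWidth / 2, initialHeight / 2 };
    }
}
```

  • Under this hypothetical policy, a call running at 1280x720 and 15 fps could, for instance, switch to 640x360 at 30 fps for the duration of the co-shot recording, provided 30 fps does not exceed the negotiated maximum frame rate.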
  • Optionally, the first media file and the second media file are both images, the first parameter includes the maximum image resolution supported by the requesting end, the second parameter includes the maximum image resolution supported by the receiving end, and the co-shot parameter includes a co-shot image resolution. The co-shot image resolution is less than or equal to the smaller of the maximum image resolution supported by the requesting end and the maximum image resolution supported by the receiving end. For example, after the requesting end and the receiving end negotiate the co-shot image resolution, the requesting end captures the first media file at the co-shot image resolution and the receiving end captures the second media file at the co-shot image resolution, so the image resolution of the first media file and the image resolution of the second media file are both the same as the co-shot image resolution.
  • In this way, the requesting end and the receiving end negotiate the co-shot image resolution to unify the definition of the co-shot images, achieve coordinated control of image quality at both ends, and improve the quality of the co-shot photo.
  • Optionally, both the first media file and the second media file are media files obtained from the video call between the receiving end and the requesting end, the co-shot frame rate is less than the initial frame rate of the video call, and the co-shot image resolution is greater than the initial image resolution of the video call.
  • Before co-shooting, the two parties conduct the video call at the initial frame rate and the initial image resolution. For co-shooting, the two parties can negotiate the co-shot parameters according to a strategy that prioritizes image clarity at the expense of video smoothness: the receiving end takes a frame rate reduced from the initial frame rate as the co-shot frame rate, and takes an image resolution increased from the initial image resolution as the co-shot image resolution, thereby ensuring the clarity of the group photo.
  • In this case, the resolution of the video frames transmitted in the video call is the same as the co-shot image resolution, so the image quality of the video frames displayed at both ends is also the same as that of the co-shot photo. The image each user sees on their own device interface is the same as the finally generated image, that is, what you see is what you get, which further improves the user's co-shooting experience.
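  • This clarity-first strategy is simply the mirror image of the smoothness-first sketch above: as a purely illustrative example (the numbers are not from this application), a video call running at 1280x720 and 30 fps could drop to 15 fps while the resolution of the transmitted frames is raised to 1920x1080, so that the frames used for the co-shot photo are as sharp as the negotiated maximum resolution allows.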
  • Optionally, the method further includes: the receiving end sends the second media file to the requesting end through the media path of the video call; or, the receiving end receives, through the media path of the video call, the first media file sent by the requesting end.
  • In this way, media files are transmitted between the receiving end and the requesting end by multiplexing the media path of the existing video call, which reduces delay compared with transmitting files across servers through a message file server, improves the real-time performance of co-shooting, and further improves the user's co-shooting experience.
  • Optionally, the method further includes: when the receiving end sends the second media file to the requesting end through the media path of the video call, the receiving end increases the maximum bit rate of the media path.
  • When the receiving end needs to multiplex the existing media channel of the video call to transmit media files, it increases the channel capacity by raising the maximum bit rate to the level the current network state can accept, thereby improving the transmission efficiency of the media files. With higher transmission efficiency, whether the co-shot frame rate or the co-shot image resolution is increased, the display effect of the co-shot file can be further improved.
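  • One common way to express a media path's maximum bit rate is the SDP bandwidth line (b=AS:<kbps>) exchanged over the call's signaling; the sketch below is an assumption made for illustration, not a mechanism named in this application, and simply rewrites that line to raise the cap while a media file is being sent over the path.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical illustration: raise the session-level "b=AS" (kbps) bandwidth line
// in an SDP description so the media path can carry the co-shot file faster.
final class SdpBandwidth {
    private static final Pattern AS_LINE = Pattern.compile("(?m)^b=AS:\\d+$");

    static String raiseMaxBitrate(String sdp, int newKbps) {
        Matcher m = AS_LINE.matcher(sdp);
        if (m.find()) {
            // Replace the existing bandwidth line with the higher cap.
            return m.replaceFirst("b=AS:" + newKbps);
        }
        // No bandwidth line present: append one after the session name ("s=") line.
        return sdp.replaceFirst("(?m)^(s=.*)$", "$1\r\nb=AS:" + newKbps);
    }
}
```

  • How the new cap takes effect (for example through a renegotiation of the call) depends on the particular call stack and is outside this sketch.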
  • In a second aspect, a co-shooting method is provided, comprising: the requesting end sends a co-shooting request to the receiving end, where the co-shooting request carries a first parameter, and the first parameter is a media file parameter supported by the requesting end; the requesting end receives the co-shot parameter sent by the receiving end, where the media file quality corresponding to the co-shot parameter is lower than or equal to the media file quality corresponding to the first parameter and lower than or equal to the media file quality corresponding to a second parameter, and the second parameter is a media file parameter supported by the receiving end; and the requesting end generates a first media file according to the co-shot parameter, where the parameters of the first media file are the same as the co-shot parameter, the co-shot parameter is also used by the receiving end to generate a second media file whose parameters are the same as the co-shot parameter, and the second media file is used to synthesize a co-shot file with the first media file.
  • Optionally, the first parameter is the highest-specification media file parameter supported by the requesting end, and the second parameter is the highest-specification media file parameter supported by the receiving end.
  • Optionally, the first media file and the second media file are both videos, the first parameter includes the maximum frame rate and maximum image resolution supported by the requesting end, the second parameter includes the maximum frame rate and maximum image resolution supported by the receiving end, and the co-shot parameters include a co-shot frame rate and a co-shot image resolution.
  • Optionally, both the first media file and the second media file are media files obtained from the video call between the receiving end and the requesting end, the co-shot frame rate is greater than the initial frame rate of the video call, and the co-shot image resolution is smaller than the initial image resolution of the video call.
  • Optionally, the first media file and the second media file are both images, the first parameter includes the maximum image resolution supported by the requesting end, the second parameter includes the maximum image resolution supported by the receiving end, and the co-shot parameter includes a co-shot image resolution.
  • Optionally, both the first media file and the second media file are media files obtained from the video call between the receiving end and the requesting end, the co-shot frame rate is less than the initial frame rate of the video call, and the co-shot image resolution is greater than the initial image resolution of the video call.
  • Optionally, the method further includes: the requesting end receives, through the media path of the video call, the second media file sent by the receiving end; or, the requesting end sends the first media file to the receiving end through the media path of the video call.
  • Optionally, the method further includes: when the requesting end sends the first media file to the receiving end through the media path of the video call, the requesting end increases the maximum bit rate of the media path.
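  • Putting the requesting end's steps above together, a minimal end-to-end sketch might look as follows, reusing the hypothetical MediaParams type from the earlier sketch; the messaging and capture helpers (sendCoShotRequest, awaitCoShotParams, recordVideo, sendOverMediaPath) are placeholders, not interfaces named in this application.

```java
// Hypothetical sketch of the requesting end's side of a co-shoot: advertise its
// capabilities, receive the negotiated co-shot parameters, generate the first media
// file with exactly those parameters, and exchange files over the call's media path.
final class RequestingEnd {
    byte[] coShoot(PeerConnection peer) {
        // 1. Carry the first parameter (own supported media file parameters) in the request.
        MediaParams firstParameter = new MediaParams(1920, 1080, 60); // example capability values
        peer.sendCoShotRequest(firstParameter);
        // 2. Receive the co-shot parameters determined by the receiving end.
        MediaParams coShotParams = peer.awaitCoShotParams();
        // 3. Generate the first media file with parameters equal to the co-shot parameters.
        byte[] firstMediaFile = recordVideo(coShotParams);
        // 4. Exchange media files over the video call's media path and synthesize them.
        peer.sendOverMediaPath(firstMediaFile);
        byte[] secondMediaFile = peer.receiveOverMediaPath();
        return synthesizeCoShotFile(firstMediaFile, secondMediaFile);
    }

    // Device capture and compositing are outside the scope of this sketch.
    byte[] recordVideo(MediaParams params) { return new byte[0]; }
    byte[] synthesizeCoShotFile(byte[] first, byte[] second) { return new byte[0]; }

    interface PeerConnection {
        void sendCoShotRequest(MediaParams firstParameter);
        MediaParams awaitCoShotParams();
        void sendOverMediaPath(byte[] mediaFile);
        byte[] receiveOverMediaPath();
    }
}
```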
  • A co-shooting apparatus is provided, including a unit composed of software and/or hardware, where the unit is configured to execute any one of the methods in the technical solutions described in the first aspect.
  • A co-shooting apparatus is provided, including a unit composed of software and/or hardware, where the unit is configured to execute any one of the methods in the technical solutions described in the second aspect.
  • An electronic device is provided, including a processor and a memory, where the memory is used to store a computer program, and the processor is used to call and run the computer program from the memory, so that the electronic device executes any one of the methods in the technical solutions described in the first aspect.
  • An electronic device is provided, including a processor and a memory, where the memory is used to store a computer program, and the processor is used to call and run the computer program from the memory, so that the electronic device executes any one of the methods in the technical solutions described in the second aspect.
  • A computer-readable storage medium is provided, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the processor is caused to execute any one of the methods in the technical solutions described in the first aspect.
  • A computer-readable storage medium is provided, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the processor is caused to execute any one of the methods in the technical solutions described in the second aspect.
  • A computer program product is provided, including computer program code; when the computer program code is run on an electronic device, the electronic device is caused to execute any one of the methods in the technical solutions described in the first aspect.
  • A computer program product is provided, including computer program code; when the computer program code is run on an electronic device, the electronic device is caused to execute any one of the methods in the technical solutions described in the second aspect.
  • FIG. 1 is a schematic structural diagram of an example of a terminal device 100 provided by an embodiment of the present application
  • FIG. 2 is a software structural block diagram of a terminal device 100 provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a co-shooting system to which an example co-shooting method provided by an embodiment of the present application is applied;
  • FIG. 4 is a schematic diagram of the internal structure of an example of a receiving end and a requesting end provided by an embodiment of the present application;
  • FIG. 5 is an example signaling interaction diagram of co-shooting between a receiving end and a requesting end according to an embodiment of the present application;
  • FIG. 6 is a schematic structural diagram of an example of a co-shooting apparatus provided by an embodiment of the present application;
  • FIG. 7 is a schematic structural diagram of another example of a co-shooting apparatus provided by an embodiment of the present application.
  • The terms "first", "second", "third", and "fourth" are only used for descriptive purposes and should not be understood as indicating or implying relative importance or implying the number of technical features indicated.
  • A feature defined as "first", "second", "third", or "fourth" may explicitly or implicitly include one or more of that feature.
  • The co-shooting method provided by the embodiments of the present application can be applied to terminal devices such as mobile phones, tablet computers, wearable devices, in-vehicle devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, and personal digital assistants (PDA); the embodiments of the present application do not impose any restriction on the specific type of the terminal device.
  • FIG. 1 is a schematic structural diagram of an example of a terminal device 100 provided by an embodiment of the present application.
  • The terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, and ambient light. Sensor 180L, bone conduction sensor 180M, etc.
  • the terminal device 100 may include more or less components than shown, or some components are combined, or some components are separated, or different components are arranged.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the terminal device 100 .
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • The memory in the processor 110 is a cache memory. This memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
  • the processor 110 may include one or more interfaces.
  • The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, among others.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate with each other through the I2C bus interface, so as to realize the touch function of the terminal device 100 .
  • the I2S interface can be used for audio communication.
  • the processor 110 may contain multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • The PCM interface can also be used for audio communication, to sample, quantize, and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through the CSI interface, so as to realize the shooting function of the terminal device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to implement the display function of the terminal device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the terminal device 100, and can also be used to transmit data between the terminal device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones. This interface can also be used to connect other terminal devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation of the terminal device 100 .
  • the terminal device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through the wireless charging coil of the terminal device 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the terminal device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110 , the internal memory 121 , the external memory, the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the terminal device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • the structures of the antenna 1 and the antenna 2 in FIG. 1 are only an example.
  • Each antenna in terminal device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide a wireless communication solution including 2G/3G/4G/5G, etc. applied on the terminal device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied on the terminal device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the terminal device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the terminal device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code Division Multiple Access (WCDMA), Time Division Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
  • The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
  • the terminal device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • the terminal device 100 may include one or N display screens 194 , where N is a positive integer greater than one.
  • the terminal device 100 can realize the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194 and the application processor.
  • the ISP is used to process the data fed back by the camera 193 .
  • When the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object is projected through the lens to generate an optical image onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the terminal device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the terminal device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy, and so on.
  • Video codecs are used to compress or decompress digital video.
  • the terminal device 100 may support one or more video codecs.
  • The terminal device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the terminal device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the terminal device 100 .
  • The external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes various functional applications and data processing of the terminal device 100 by executing the instructions stored in the internal memory 121 .
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the terminal device 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the terminal device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playback, recording, etc.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • Speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the terminal device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • The receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals. When the terminal device 100 answers a call or plays a voice message, the voice can be heard by placing the receiver 170B close to the human ear.
  • The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals. A user can speak with the mouth close to the microphone 170C to input a sound signal into the microphone 170C.
  • the terminal device 100 may be provided with at least one microphone 170C.
  • the terminal device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals.
  • the terminal device 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the earphone jack 170D is used to connect wired earphones.
  • The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • A capacitive pressure sensor may comprise at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the terminal device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the terminal device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the terminal device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold acts on the short message application icon, the instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
  • the gyro sensor 180B may be used to determine the motion attitude of the terminal device 100 .
  • The angular velocities of the terminal device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyroscope sensor 180B detects the shaking angle of the terminal device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to offset the shaking of the terminal device 100 through reverse motion to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
  • the air pressure sensor 180C is used to measure air pressure.
  • the terminal device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • The terminal device 100 can detect the opening and closing of a flip holster or a flip cover by using the magnetic sensor 180D; further, features such as automatic unlocking of the flip cover can be set according to the detected opening or closing state of the holster or the flip cover.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the terminal device 100 in various directions (generally three axes).
  • the magnitude and direction of gravity can be detected when the terminal device 100 is stationary. It can also be used to identify the posture of terminal devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
  • The distance sensor 180F is used to measure distance; the terminal device 100 can measure distance by infrared or laser. In some embodiments, when shooting a scene, the terminal device 100 can use the distance sensor 180F to measure distance to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the terminal device 100 emits infrared light to the outside through the light emitting diode.
  • the terminal device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the terminal device 100 . When insufficient reflected light is detected, the terminal device 100 may determine that there is no object near the terminal device 100 .
  • the terminal device 100 can use the proximity light sensor 180G to detect that the user holds the terminal device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the terminal device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the terminal device 100 is in a pocket, so as to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the terminal device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking photos with fingerprints, answering incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • the terminal device 100 uses the temperature detected by the temperature sensor 180J to execute the temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the terminal device 100 reduces the performance of the processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • In some other embodiments, when the temperature is lower than another threshold, the terminal device 100 heats the battery 142 to avoid abnormal shutdown of the terminal device 100 caused by the low temperature.
  • the terminal device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
  • Touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the terminal device 100 , which is different from the position where the display screen 194 is located.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice.
  • the bone conduction sensor 180M can also contact the pulse of the human body and receive the blood pressure beating signal.
  • the bone conduction sensor 180M can also be disposed in the earphone, combined with the bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vocal vibration bone block obtained by the bone conduction sensor 180M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M, and realize the function of heart rate detection.
  • The keys 190 include a power key, volume keys, and the like. The keys 190 may be mechanical keys or touch keys.
  • the terminal device 100 may receive key input and generate key signal input related to user settings and function control of the terminal device 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 194 .
  • Different application scenarios for example: time reminder, receiving information, alarm clock, games, etc.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state, the change of the power, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • A SIM card can be brought into contact with or separated from the terminal device 100 by inserting it into the SIM card interface 195 or pulling it out of the SIM card interface 195.
  • the terminal device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the terminal device 100 interacts with the network through the SIM card to realize functions such as calls and data communication.
  • the terminal device 100 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the terminal device 100 and cannot be separated from the terminal device 100 .
  • the software system of the terminal device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiments of the present application take an Android system with a layered architecture as an example to exemplarily describe the software structure of the terminal device 100 .
  • FIG. 2 is a block diagram of a software structure of a terminal device 100 according to an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and a system library, and a kernel layer.
  • the application layer can include a series of application packages.
  • the application package can include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message and so on.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include window managers, content providers, view systems, telephony managers, resource managers, notification managers, and the like.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • the data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
  • the telephony manager is used to provide the communication function of the terminal device 100 .
  • for example, the management of call status (including connecting, hanging up, and the like).
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can automatically disappear after a brief pause without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll bar text, such as notifications of applications running in the background, and notifications that appear on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the terminal device vibrates, and the indicator light flashes.
  • the Android runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: surface manager (surface manager), media library (media library), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display drivers, camera drivers, audio drivers, and sensor drivers.
  • the methods of the embodiments of the present application can be applied to the co-shooting system shown in FIG. 3 , where the co-shooting system includes two terminal devices capable of interacting with each other, one serving as a requesting end and the other serving as a receiving end.
  • the requesting end and the receiving end can perform signaling interaction through the signaling server, negotiate the shooting quality of the two parties, and respectively shoot media files according to the negotiation result. Since the co-shot file is obtained by synthesizing two media files based on the same negotiation result, the parameters of the co-shot file are unified, the display effect of the co-shot file is improved, and the user's co-shooting experience is further improved.
  • the first parameter is used to describe the media file parameters supported by the requesting end.
  • the first parameter may be the highest-standard media file parameter that the requesting end can generate.
  • the first parameter may include the highest resolution of images captured by the requesting end, and may also include the highest frame rate of videos captured by the requesting end.
  • the first parameter may further include parameters of dimensions such as brightness, beautification, style, and filters of the generated media file.
  • the second parameter may be the highest-standard media file parameter that the receiving end can generate.
  • the second parameter can reflect the shooting capability of the receiving end.
  • for the types of parameters included in the second parameter, reference may be made to the description of the first parameter, which is not repeated here. (A minimal sketch of such a capability descriptor follows below.)
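The following is a minimal, illustrative sketch (in Kotlin) of how the first parameter and the second parameter could be modeled as capability descriptors. The class and field names, as well as the example values, are assumptions introduced for illustration and are not defined by this application.

```kotlin
// Illustrative capability descriptor; all names and values are assumptions.
data class MediaCapability(
    val maxImageWidth: Int,   // highest supported image width, in pixels
    val maxImageHeight: Int,  // highest supported image height, in pixels
    val maxFrameRate: Int,    // highest supported video frame rate, in fps
    val extras: Map<String, String> = emptyMap() // optional dimensions: brightness, beautification, style, filter
)

// The requesting end advertises its capability as the "first parameter";
// the receiving end holds its own capability as the "second parameter".
val firstParameter = MediaCapability(maxImageWidth = 3840, maxImageHeight = 2160, maxFrameRate = 60)
val secondParameter = MediaCapability(maxImageWidth = 1920, maxImageHeight = 1080, maxFrameRate = 30)
```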
  • the user triggers the co-shooting process by operating the requesting end, for example, the user taps a co-shooting button on the screen of the requesting end to initiate co-shooting, or the user inputs a co-shooting command to the requesting end by voice.
  • the requesting end may send a co-shooting request to the receiving end through the signaling server based on the user operation.
  • after receiving the co-shooting request, the receiving end determines, according to the first parameter carried in the co-shooting request combined with the second parameter representing the receiving end's own shooting capability, the co-shot parameters applicable to both the requesting end and the receiving end; the co-shot parameters can characterize the quality of the media files required for co-shooting.
  • the receiving end can generate a second media file according to the co-shot parameters, and feed the co-shot parameters back to the requesting end.
  • the requesting end may generate the first media file according to the co-shot parameters.
  • the receiving end may send the second media file to the requesting end, and the requesting end synthesizes the first media file and the received second media file to obtain a co-production file.
  • the requesting end may also send the generated first media file to the receiving end, and the receiving end synthesizes the first media file and the second media file to obtain a co-production file.
  • the parameters of the first media file and the parameters of the second media file are the same as the co-shot parameters, so the generated co-shot file also has the same parameters as the co-shot parameters. (The sketch after this paragraph illustrates such an exchange.)
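As a hedged illustration of the exchange just described, the signaling payloads could be modeled as below. CoShotRequest, CoShotResponse, and CoShotParameter are hypothetical names building on the MediaCapability sketch above, not message formats defined by this application.

```kotlin
// Illustrative signaling payloads; class and field names are assumptions.
data class CoShotParameter(
    val imageWidth: Int,
    val imageHeight: Int,
    val frameRate: Int? = null  // present only when the co-shot file is a video
)

// Sent by the requesting end through the signaling server, carrying the first parameter.
data class CoShotRequest(val sessionId: String, val firstParameter: MediaCapability)

// Returned by the receiving end, carrying the negotiated co-shot parameter.
data class CoShotResponse(val sessionId: String, val coShotParameter: CoShotParameter)
```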
  • the first media file may be a photo or a video recorded by the requesting end for co-shooting, or may be an image or a video captured by the requesting end from a video stream recorded during a video call with the receiving end;
  • the second media file may be a photo or a video recorded by the receiving end for co-shooting, or may be an image or a video captured by the receiving end from a video stream recorded during a video call with the requesting end, which is not limited in this embodiment.
  • the above-mentioned co-shot parameter may be a parameter corresponding to a lower media file quality among the first parameter and the second parameter, wherein the media file quality may be image resolution and/or video fluency.
  • the co-shot parameter may be the smaller of the highest resolution supported by the requesting end and the highest resolution supported by the receiving end, and/or the co-shot parameter may be the smaller of the highest frame rate supported by the requesting end and the highest frame rate supported by the receiving end.
  • in this case, the co-shot parameter is the highest-standard media file parameter that can adapt to the shooting capabilities of the requesting end and the receiving end at the same time; therefore, if the capabilities of the devices at both ends allow, the picture quality of the co-shot file can reach the optimum.
  • the co-shot parameters can also be adjusted according to network conditions. For example, when the network is busy, fewer network resources can be allocated to the co-shooting process, and the quality of the media file corresponding to the co-shot parameters may be lower than the lower of the media file qualities corresponding to the first parameter and the second parameter.
  • for example, the co-shot resolution in the co-shot parameters may be lower than the smaller of the highest resolution supported by the requesting end and the highest resolution supported by the receiving end, or the co-shot frame rate in the co-shot parameters may be lower than the smaller of the highest frame rate supported by the requesting end and the highest frame rate supported by the receiving end. (A sketch of this negotiation follows below.)
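A minimal sketch of this negotiation, assuming the descriptor types from the sketches above, is given below. Taking the smaller of the two capabilities and optionally capping the result when the network is busy is one possible reading of the behavior described, not the only implementation.

```kotlin
// Sketch: take the lower of the two capabilities, optionally capped by network conditions.
fun negotiateCoShotParameter(
    first: MediaCapability,              // requesting end capability (first parameter)
    second: MediaCapability,             // receiving end capability (second parameter)
    networkCap: CoShotParameter? = null  // optional cap derived from current network conditions
): CoShotParameter {
    var width = minOf(first.maxImageWidth, second.maxImageWidth)
    var height = minOf(first.maxImageHeight, second.maxImageHeight)
    var fps = minOf(first.maxFrameRate, second.maxFrameRate)
    if (networkCap != null) {
        // When the network is busy, the negotiated quality may drop below
        // the smaller of the two capabilities.
        width = minOf(width, networkCap.imageWidth)
        height = minOf(height, networkCap.imageHeight)
        fps = minOf(fps, networkCap.frameRate ?: fps)
    }
    return CoShotParameter(imageWidth = width, imageHeight = height, frameRate = fps)
}
```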
  • through signaling interaction before co-shooting, the requesting end and the receiving end negotiate the co-shot parameters governing co-shot picture quality and determine one common co-shot parameter, so that media files with the same parameters (that is, the first media file and the second media file) can be respectively generated according to the negotiated co-shot parameters.
  • the requesting end or the receiving end can then synthesize the media files with the same parameters into a co-shot file, which avoids the problem of a poor picture effect in the co-shot file caused by non-uniform parameters of the media files to be synthesized. This improves the display effect of the co-shot file and thus the user's co-shooting experience.
  • when the first media file and the second media file are videos, the first parameter may include the maximum frame rate and the maximum image resolution of each frame supported by the requesting end, the second parameter may include the maximum frame rate and the maximum image resolution of each frame supported by the receiving end, and the determined co-shot parameters may include the co-shot frame rate and the co-shot image resolution.
  • the co-shot frame rate is less than or equal to the smaller of the maximum frame rate supported by the requesting end and the maximum frame rate supported by the receiving end, and the co-shot image resolution is less than or equal to the smaller of the maximum image resolution supported by the requesting end and the maximum image resolution supported by the receiving end.
  • the requesting end records the first media file according to the co-shot frame rate and the co-shot image resolution, and the receiving end records the second media file according to the co-shot frame rate and the co-shot image resolution. Therefore, the frame rates of the first media file and the second media file are the same as the co-shot frame rate, and the image resolutions of the first media file and the second media file are the same as the co-shot image resolution.
  • the requesting end and the receiving end negotiate the co-production frame rate and co-production resolution to ensure that the parameters of the video recorded at both ends are unified, realize the coordinated control of the double-end picture quality, and improve the picture quality of the video co-production.
  • the user experience of video co-shooting is improved.
  • the two parties conduct a video call according to the initial frame rate and initial image resolution.
  • the initial frame rate is the frame rate of the video call at the moment before the co-production
  • the initial image resolution is the image resolution of the video call at any moment before the co-production.
  • the receiving end increases the frame rate based on the initial frame rate as the co-shot frame rate, and reduces the image resolution of the video frame based on the initial image resolution as the co-shot image resolution, thereby ensuring the smoothness of the co-shot video.
  • the parameters of the video file transmitted by the video call are the same as the co-production parameters.
  • the picture quality of the video call displayed on both ends is the same as the picture quality of the co-production video file.
  • the video seen on the respective device interface is the same as the final generated co-production video, which ensures real-time performance, that is, what you see is what you get, which further improves the user experience of co-production video.
  • when the first media file and the second media file are images, the first parameter may include the maximum image resolution supported by the requesting end, the second parameter may include the maximum image resolution supported by the receiving end, and the determined co-shot parameter may include the co-shot image resolution.
  • the co-shot image resolution is less than or equal to the smaller of the maximum image resolution supported by the requesting end and the maximum image resolution supported by the receiving end. Specifically, after the requesting end and the receiving end negotiate the co-shot image resolution, the requesting end captures the first media file according to the co-shot image resolution, and the receiving end captures the second media file according to the co-shot image resolution.
  • therefore, the image resolutions of the first media file and the second media file are the same as the co-shot image resolution.
  • the requesting end and the receiving end negotiate the resolution of the co-shot image to ensure the unity of the definition of the co-shot image, realize the coordinated control of the image quality at both ends, and improve the quality of the co-shot photo.
  • the two parties conduct a video call according to the initial frame rate and initial image resolution.
  • the two parties can negotiate the parameters of the co-shooting according to the strategy of prioritizing image clarity and sacrificing video fluency.
  • the receiving end reduces the frame rate of the video call on the basis of the initial frame rate, and increases the image resolution of the video frames on the basis of the initial image resolution as the co-shot image resolution, thereby ensuring the clarity of the co-shot photo.
  • the parameters of the video images transmitted in the video call are the same as the co-shot image resolution.
  • the video frames displayed at both ends have the same quality as the co-shot photo.
  • the images that users at both ends see on their respective device interfaces are the same as the finally generated co-shot photo, that is, what you see is what you get, which further improves the user experience of co-shot photos. (A sketch of this scene-dependent adjustment follows below.)
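The scene-dependent adjustment described in the last two passages (smoothness first for a co-shot video, clarity first for a co-shot photo) could be sketched as follows; the scaling factors are purely illustrative and are not values taken from this application.

```kotlin
// Illustrative adjustment of the ongoing call's parameters according to the co-shooting scene.
enum class CoShotScene { VIDEO, PHOTO }

fun adjustCallForCoShot(initial: CoShotParameter, scene: CoShotScene): CoShotParameter =
    when (scene) {
        // Co-shot video: raise the frame rate, lower the resolution (smoothness first).
        CoShotScene.VIDEO -> initial.copy(
            frameRate = (initial.frameRate ?: 30) * 2,
            imageWidth = initial.imageWidth / 2,
            imageHeight = initial.imageHeight / 2
        )
        // Co-shot photo: lower the frame rate, raise the resolution (clarity first).
        CoShotScene.PHOTO -> initial.copy(
            frameRate = ((initial.frameRate ?: 30) / 3).coerceAtLeast(1),
            imageWidth = initial.imageWidth * 2,
            imageHeight = initial.imageHeight * 2
        )
    }
```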
  • a media server is usually used to transmit the video stream.
  • the receiving end can use the existing media server and the media path of the existing video call to send the second media file to the requesting end.
  • the receiving end transmits media files by multiplexing the media path of the existing video call. Compared with a cross-server transmission method that uses a message file server to transmit files, this reduces the delay and improves the real-time performance of co-shooting, which further improves the user's co-shooting experience.
  • the requesting end and the receiving end may also adjust the parameters during the video call according to the network state.
  • the sender's end encodes the media files according to a certain encoding strategy, that is, the current maximum bit rate, and then sends the encoded media stream to the receiver's end.
  • the receiver's end decodes the received media stream (ie video stream, photo stream).
  • the requesting end or the receiving end as the encoding end can encode the media file according to the current maximum bit rate and transmit it.
  • for example, when the requesting end acts as the encoding end, the receiving end acts as the decoding end; when the receiving end acts as the encoding end, the requesting end acts as the decoding end.
  • whichever of the requesting end or the receiving end acts as the encoding end can increase the capacity of the media path by raising the maximum bit rate to a level that the current network state can accept.
  • the transmission efficiency of media files is improved.
  • the display effect of the co-shot file can be further improved, whether it is to increase the co-shot frame rate or the co-shot image resolution.
  • the encoding end of the requesting end or the receiving end can also adaptively adjust the encoding strategy according to how busy the network is (that is, adjust the maximum bit rate that the network can allow) to transmit media files, for example by adjusting the co-shot frame rate, the co-shot image resolution, and the maximum bit rate.
  • the encoding end of the requesting end or the receiving end can also make full use of the maximum bit rate that the network can allow at that moment, switching from the traditional video-call strategy of ensuring smoothness to a strategy of ensuring clarity. For example, when starting a video call, both ends transmit video files with parameters of 720P resolution, a 30 fps frame rate, and a bit rate of 1035.
  • for co-shooting, the requesting end and the receiving end can increase the current resolution to 1080P, change the frame rate to 10 fps, and change the bit rate to 1800 to transmit video files, making full use of the network conditions to ensure the clarity of the co-shot. (A sketch of such a profile switch follows below.)
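A sketch of the profile switch using the example values quoted above is shown below; EncoderConfig and the selection logic are assumptions for illustration, not a real codec API or a mandated strategy.

```kotlin
// Illustrative encoder profiles; the values mirror the example in the text above.
data class EncoderConfig(val width: Int, val height: Int, val frameRate: Int, val maxBitRate: Int)

val smoothnessFirstCallProfile = EncoderConfig(width = 1280, height = 720, frameRate = 30, maxBitRate = 1035)
val clarityFirstCoShotProfile  = EncoderConfig(width = 1920, height = 1080, frameRate = 10, maxBitRate = 1800)

// Switch to the clarity-first profile only when co-shooting and the network allows a higher bit rate.
fun selectEncoderConfig(coShooting: Boolean, networkAllowsHigherBitRate: Boolean): EncoderConfig =
    if (coShooting && networkAllowsHigherBitRate) clarityFirstCoShotProfile else smoothnessFirstCallProfile
```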
  • the requesting end and the receiving end each install the video call application, and each includes transmission protocol components supporting communication (including the transmission control protocol and/or the user datagram protocol) and a media component capable of collecting and editing (or compositing) media files; both ends are equipped with cameras.
  • the requesting end and the receiving end perform signaling interaction through the signaling server, such as sending a co-shooting request and returning a co-shooting response.
  • the requesting end and the receiving end respectively include an intelligent encoding and decoding module, and the intelligent encoding and decoding module includes a network adaptive module, which is used to determine the encoding strategy.
  • the intelligent encoding and decoding module further includes an encoding and decoding module, which is used for encoding and decoding the media file according to the encoding and decoding strategy.
  • the receiving end uses an intelligent encoding and decoding module to encode the video stream or photographing stream collected and generated by the camera according to the encoding strategy, and then sends it to the requesting end through the media server.
  • the requesting end uses the intelligent encoding and decoding module to decode the received video stream or photo stream, and then uses its own media component to synthesize the media file it collected with the received media file to obtain a co-shot file. (A structural sketch of this module follows below.)
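The composition of the intelligent encoding and decoding module described above (a network-adaptive submodule that decides the encoding strategy, plus a codec submodule that encodes and decodes according to it) might look like the following sketch; all interfaces are hypothetical and reuse the EncoderConfig type from the previous sketch.

```kotlin
// Structural sketch of the intelligent encoding/decoding module; interfaces are illustrative.
interface NetworkAdaptiveModule {
    fun currentStrategy(): EncoderConfig  // e.g. derived from measured bandwidth and latency
}

interface CodecModule {
    fun encode(rawFrames: ByteArray, strategy: EncoderConfig): ByteArray
    fun decode(mediaStream: ByteArray): ByteArray
}

class IntelligentCodec(
    private val adaptive: NetworkAdaptiveModule,
    private val codec: CodecModule
) {
    fun encodeForSending(rawFrames: ByteArray): ByteArray =
        codec.encode(rawFrames, adaptive.currentStrategy())

    fun decodeReceived(mediaStream: ByteArray): ByteArray =
        codec.decode(mediaStream)
}
```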
  • a video call is established between the requester and the receiver through the media server.
  • the requester sends a co-production request to the receiver through the signaling server.
  • based on the co-shooting request, the receiving end identifies the co-shooting scene and determines whether the scene is a co-shot photo or a co-shot video.
  • the receiving end generates or adjusts the co-shot parameters according to the co-shot scene. For example, if the co-production scene is a co-production video, the receiver can increase the co-production frame rate, reduce the co-production image resolution, and adjust the coding strategy adaptively according to the network conditions, that is, adjust the maximum bit rate.
  • the receiving end can increase the maximum bit rate. Afterwards, the receiving end returns a co-shooting response to the requesting end, and the co-shooting response may carry the co-shot parameters.
  • the requesting end adjusts its acquisition strategy according to the co-shot parameters, collects a media file, receives through the media server the media file that the receiving end collected according to the co-shot parameters, decodes the received media file, and synthesizes the decoded media file with the media file it collected to obtain a co-shot file. (An end-to-end sketch of the requesting end's side of this flow follows below.)
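An end-to-end sketch of the requesting end's side of this flow is given below; SignalingClient, MediaChannel, Camera, and Compositor are hypothetical interfaces introduced only for illustration, and the sketch reuses the request and response types from the earlier sketches.

```kotlin
// Illustrative requesting-end flow; all interfaces and names are assumptions.
interface SignalingClient { fun sendCoShotRequest(req: CoShotRequest): CoShotResponse }
interface MediaChannel    { fun receiveAndDecode(): ByteArray }  // remote file over the existing call's media path
interface Camera          { fun capture(params: CoShotParameter): ByteArray }
interface Compositor      { fun compose(first: ByteArray, second: ByteArray): ByteArray }

fun coShootAsRequester(
    signaling: SignalingClient,
    media: MediaChannel,
    camera: Camera,
    compositor: Compositor,
    localCapability: MediaCapability
): ByteArray {
    // 1. Send the co-shooting request carrying the first parameter.
    val response = signaling.sendCoShotRequest(CoShotRequest("session-1", localCapability))
    // 2. Capture the first media file locally with the negotiated co-shot parameter.
    val firstMediaFile = camera.capture(response.coShotParameter)
    // 3. Receive and decode the second media file sent over the existing call's media path.
    val secondMediaFile = media.receiveAndDecode()
    // 4. Synthesize the two files into the co-shot file.
    return compositor.compose(firstMediaFile, secondMediaFile)
}
```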
  • the corresponding apparatuses include corresponding hardware structures and/or software modules for performing each function.
  • the present application can be implemented in hardware or a combination of hardware and computer software with the units and algorithm steps of each example described in conjunction with the embodiments disclosed herein. Whether a function is performed by hardware or computer software driving hardware depends on the specific application and design constraints of the technical solution. Skilled artisans may implement the described functionality using different methods for each particular application, but such implementations should not be considered beyond the scope of this application.
  • the present application may divide the co-shooting apparatus into function modules according to the above method examples. For example, each function may be assigned its own function module, or two or more functions may be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware, and can also be implemented in the form of software function modules. It should be noted that the division of modules in this application is schematic, and is only a logical function division, and other division methods may be used in actual implementation.
  • FIG. 6 shows a schematic structural diagram of a co-shooting apparatus provided by the present application.
  • Apparatus 600 includes:
  • the first receiving module 601 is configured to control the receiving end to receive a co-production request sent by the requesting end, the co-production request carrying a first parameter, and the first parameter is a media file parameter supported by the requesting end;
  • a determination module 602 configured to control the receiving end to determine a co-shot parameter according to a second parameter and the first parameter, where the second parameter is a media file parameter supported by the receiving end, and the quality of the media file corresponding to the co-shot parameter is lower than or equal to the quality of the media file corresponding to the first parameter, and the quality of the media file corresponding to the co-production parameter is lower than or equal to the quality of the media file corresponding to the second parameter;
  • the first sending module 603 is configured to control the receiving end to send the co-shot parameters to the requesting end, where the co-shot parameters are used by the requesting end to generate a first media file, and the parameters of the first media file are the same as the above-mentioned co-shot parameters;
  • the first generation module 604 is configured to control the receiving end to generate a second media file according to the co-shot parameters, where the parameters of the second media file are the same as the co-shot parameters, and the second media file is used to be synthesized with the first media file into a co-shot file. (A minimal interface sketch of these modules follows below.)
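A minimal interface sketch mirroring the module split of apparatus 600 (first receiving module 601, determination module 602, first sending module 603, first generation module 604) is shown below; the interface and method names are illustrative only and reuse the types from the earlier sketches.

```kotlin
// Illustrative receiving-end module split; names are assumptions, not the apparatus itself.
interface ReceivingEndApparatus {
    fun onCoShotRequest(firstParameter: MediaCapability)             // first receiving module 601
    fun determineCoShotParameter(
        firstParameter: MediaCapability,
        secondParameter: MediaCapability
    ): CoShotParameter                                               // determination module 602
    fun sendCoShotParameter(params: CoShotParameter)                 // first sending module 603
    fun generateSecondMediaFile(params: CoShotParameter): ByteArray  // first generation module 604
}
```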
  • the first parameter is the highest standard media file parameter supported by the requesting end
  • the second parameter is the highest standard media file parameter supported by the receiving end.
  • the first media file and the second media file are both videos
  • the first parameter includes the maximum frame rate and the maximum image resolution supported by the requester
  • the second parameter includes the maximum frame rate and maximum image resolution supported by the receiving end
  • the co-shot parameters include the co-shot frame rate and co-shot image resolution.
  • both the first media file and the second media file are media files obtained from the video call between the receiving end and the requesting end, the co-shot frame rate is greater than the initial frame rate of the video call, and the co-shot image resolution is smaller than the initial image resolution of the video call.
  • both the first media file and the second media file are images
  • the first parameter includes the maximum image resolution supported by the requesting end
  • the second parameter includes the maximum image resolution supported by the receiving end.
  • the co-shot parameter includes the co-shot image resolution.
  • both the first media file and the second media file are media files obtained from the video call between the receiving end and the requesting end, the co-shot frame rate is less than the initial frame rate of the video call, and the co-shot image resolution is greater than the initial image resolution of the video call.
  • the apparatus 600 further includes a third sending module, configured to control the receiving end to send the second media file to the requesting end through the media path of the video call.
  • the apparatus 600 further includes a third receiving module, configured to control the receiving end to receive the first media file sent by the requesting end through the media path of the video call.
  • the apparatus 600 further includes a first control module, configured to control the receiving end to increase the maximum bit rate of the media path when the receiving end sends the second media file to the requesting end through the media path of the video call.
  • FIG. 7 shows a schematic structural diagram of a co-shooting apparatus provided by the present application.
  • Apparatus 700 includes:
  • the second sending module 701 is configured to control the requesting end to send a co-shot request to the receiving end, where the co-shot request carries a first parameter, and the first parameter is a media file parameter supported by the requesting end;
  • the second receiving module 702 is configured to control the requesting end to receive the co-shot parameters sent by the receiving end, where the quality of the media file corresponding to the co-shot parameters is lower than or equal to the quality of the media file corresponding to the first parameter, the quality of the media file corresponding to the co-shot parameters is lower than or equal to the quality of the media file corresponding to the second parameter, and the second parameter is a media file parameter supported by the receiving end;
  • the second generating module 703 is configured to control the requesting end to generate a first media file according to the co-shot parameters, where the parameters of the first media file are the same as the co-shot parameters, the co-shot parameters are used by the receiving end to generate a second media file, the parameters of the second media file are the same as the co-shot parameters, and the second media file is used for synthesizing a co-shot file with the first media file.
  • the first parameter is the highest standard media file parameter supported by the requesting end
  • the second parameter is the highest standard media file parameter supported by the receiving end.
  • the first media file and the second media file are both videos
  • the first parameter includes the maximum frame rate and the maximum image resolution supported by the requester
  • the second parameter includes the maximum frame rate and maximum image resolution supported by the receiving end
  • the co-shot parameters include the co-shot frame rate and co-shot image resolution.
  • both the first media file and the second media file are media files obtained from the video call between the receiving end and the requesting end, the co-shot frame rate is greater than the initial frame rate of the video call, and the co-shot image resolution is smaller than the initial image resolution of the video call.
  • both the first media file and the second media file are images
  • the first parameter includes the maximum image resolution supported by the requesting end
  • the second parameter includes the maximum image resolution supported by the receiving end.
  • the co-shot parameter includes the co-shot image resolution.
  • both the first media file and the second media file are media files obtained from the video call between the receiving end and the requesting end, the co-shot frame rate is less than the initial frame rate of the video call, and the co-shot image resolution is greater than the initial image resolution of the video call.
  • the apparatus 700 further includes a fourth receiving module, configured to control the requesting end to receive the second media file sent by the receiving end through the media path of the video call.
  • the apparatus 700 further includes a fourth sending module, configured to control the requesting end to send the first media file to the receiving end through the media path of the video call.
  • the apparatus 700 further includes a second control module, configured to control the requesting end to increase the maximum bit rate of the media path when the requesting end sends the first media file to the receiving end through the media path of the video call.
  • An embodiment of the present application further provides an electronic device, including the above-mentioned processor.
  • the electronic device provided in this embodiment may be the terminal device 100 shown in FIG. 1 , and is configured to execute the above-mentioned method for co-timing.
  • the terminal device may include a processing module, a storage module and a communication module.
  • the processing module may be used to control and manage the actions of the terminal device, for example, may be used to support the terminal device to perform steps performed by the display unit, the detection unit, and the processing unit.
  • the storage module can be used to support the terminal device to execute stored program codes and data.
  • the communication module can be used to support the communication between the terminal device and other devices.
  • the processing module may be a processor or a controller. It may implement or execute the various exemplary logical blocks, modules and circuits described in connection with this disclosure.
  • the processor may also be a combination that implements computing functions, such as a combination of one or more microprocessors, a combination of digital signal processing (DSP) and a microprocessor, and the like.
  • the storage module may be a memory.
  • the communication module may specifically be a device that interacts with other terminal devices, such as a radio frequency circuit, a Bluetooth chip, and a Wi-Fi chip.
  • the terminal device involved in this embodiment may be a device having the structure shown in FIG. 1 .
  • Embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the processor is made to execute the description in any of the foregoing embodiments. the matching method.
  • Embodiments of the present application further provide a computer program product, which, when the computer program product runs on a computer, causes the computer to execute the above-mentioned relevant steps, so as to implement the method for in-step in the above-mentioned embodiments.
  • the electronic device, computer-readable storage medium, computer program product, or chip provided in this embodiment is used to execute the corresponding method provided above. Therefore, for the beneficial effects that can be achieved, reference may be made to the beneficial effects of the corresponding method provided above, which are not repeated here.
  • the disclosed apparatus and method may be implemented in other manners.
  • the apparatus embodiments described above are only illustrative.
  • the division of modules or units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in electrical, mechanical or other forms.
  • Units described as separate components may or may not be physically separated, and components shown as units may be one physical unit or multiple physical units, that is, may be located in one place, or may be distributed in multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium.
  • a readable storage medium includes several instructions to cause a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods in the various embodiments of the present application.
  • the aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Computer Security & Cryptography (AREA)
  • Studio Devices (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present application relates to the technical field of electronics. A co-shooting (photography synthesis) method and apparatus, an electronic device, and a readable storage medium are disclosed. The method comprises the following steps: a receiving end receives a co-shooting request sent by a requesting end, the co-shooting request carrying a first parameter, the first parameter being a media file parameter supported by the requesting end; the receiving end determines a co-shooting parameter according to a second parameter and the first parameter, the second parameter being a media file parameter supported by the receiving end, and the quality of a media file corresponding to the co-shooting parameter being lower than or equal to the quality of a media file corresponding to the first parameter and lower than or equal to the quality of a media file corresponding to the second parameter; the receiving end sends the co-shooting parameter to the requesting end, the co-shooting parameter being used by the requesting end to generate a first media file; and the receiving end generates a second media file according to the co-shooting parameter, a parameter of the first media file and a parameter of the second media file both being the same as the co-shooting parameter, and the second media file being synthesized with the first media file to form a co-shooting file. By means of the method, the quality of a co-shooting file can be improved.
PCT/CN2021/138892 2021-01-28 2021-12-16 Procédé et appareil de synthèse de photographie, et dispositif électronique et support de stockage lisible WO2022161006A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110120273.XA CN114827098A (zh) 2021-01-28 2021-01-28 合拍的方法、装置、电子设备和可读存储介质
CN202110120273.X 2021-01-28

Publications (1)

Publication Number Publication Date
WO2022161006A1 true WO2022161006A1 (fr) 2022-08-04

Family

ID=82526985

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/138892 WO2022161006A1 (fr) 2021-01-28 2021-12-16 Procédé et appareil de synthèse de photographie, et dispositif électronique et support de stockage lisible

Country Status (2)

Country Link
CN (1) CN114827098A (fr)
WO (1) WO2022161006A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11875556B2 (en) * 2020-06-12 2024-01-16 Beijing Bytedance Network Technology Co., Ltd. Video co-shooting method, apparatus, electronic device and computer-readable medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104333693A (zh) * 2014-11-03 2015-02-04 深圳市中兴移动通信有限公司 拍摄方法、拍摄系统和拍摄装置
CN106488106A (zh) * 2015-08-31 2017-03-08 杭州华为数字技术有限公司 一种图像处理方法及装置
CN106657791A (zh) * 2017-01-03 2017-05-10 广东欧珀移动通信有限公司 一种合成图像的生成方法及装置
US20170272644A1 (en) * 2016-03-18 2017-09-21 Altek Semiconductor Corp. Multi-camera electronic device and control method thereof
CN107995420A (zh) * 2017-11-30 2018-05-04 努比亚技术有限公司 远程合影控制方法、双面屏终端及计算机可读存储介质
CN110944109A (zh) * 2018-09-21 2020-03-31 华为技术有限公司 一种拍照方法、装置与设备

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102131248B (zh) * 2010-01-19 2013-12-04 华为技术有限公司 一种速率协商方法及数据传输系统以及相关设备
CN102223201B (zh) * 2010-04-15 2014-01-01 中兴通讯股份有限公司 一种编解码器能力协商方法及终端
CN101986648B (zh) * 2010-11-24 2012-12-12 北京星网锐捷网络技术有限公司 一种tcp选项的协商方法、装置及网络设备
CN103067151B (zh) * 2013-02-04 2016-04-13 恒为科技(上海)股份有限公司 一种保持被串接链路两端状态同步的装置及其方法
CN104426870B (zh) * 2013-08-29 2019-03-15 中兴通讯股份有限公司 远程无线屏幕共享方法、装置及系统
WO2018227346A1 (fr) * 2017-06-12 2018-12-20 华为技术有限公司 Système d'accès intégré, procédé de configuration et unité de traitement de bande de base
CN111050072B (zh) * 2019-12-24 2022-02-01 Oppo广东移动通信有限公司 一种异地合拍方法、设备以及存储介质

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104333693A (zh) * 2014-11-03 2015-02-04 深圳市中兴移动通信有限公司 拍摄方法、拍摄系统和拍摄装置
CN106488106A (zh) * 2015-08-31 2017-03-08 杭州华为数字技术有限公司 一种图像处理方法及装置
US20170272644A1 (en) * 2016-03-18 2017-09-21 Altek Semiconductor Corp. Multi-camera electronic device and control method thereof
CN106657791A (zh) * 2017-01-03 2017-05-10 广东欧珀移动通信有限公司 一种合成图像的生成方法及装置
CN107995420A (zh) * 2017-11-30 2018-05-04 努比亚技术有限公司 远程合影控制方法、双面屏终端及计算机可读存储介质
CN110944109A (zh) * 2018-09-21 2020-03-31 华为技术有限公司 一种拍照方法、装置与设备

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11875556B2 (en) * 2020-06-12 2024-01-16 Beijing Bytedance Network Technology Co., Ltd. Video co-shooting method, apparatus, electronic device and computer-readable medium

Also Published As

Publication number Publication date
CN114827098A (zh) 2022-07-29

Similar Documents

Publication Publication Date Title
WO2020253719A1 (fr) Procédé de d'enregistrement d'écran et dispositif électronique
CN111316598B (zh) 一种多屏互动方法及设备
WO2020093988A1 (fr) Procédé de traitement d'image et dispositif électronique
WO2022127787A1 (fr) Procédé d'affichage d'image et dispositif électronique
CN111628916B (zh) 一种智能音箱与电子设备协作的方法及电子设备
WO2022143128A1 (fr) Procédé et appareil d'appel vidéo basés sur un avatar, et terminal
CN112119641B (zh) 通过转发模式连接的多tws耳机实现自动翻译的方法及装置
CN113722058B (zh) 一种资源调用方法及电子设备
WO2022100610A1 (fr) Procédé et appareil de projection d'écran, ainsi que dispositif électronique et support de stockage lisible par ordinateur
WO2022033320A1 (fr) Procédé de communication bluetooth, équipement terminal et support d'enregistrement lisible par ordinateur
CN113448382B (zh) 多屏幕显示电子设备和电子设备的多屏幕显示方法
WO2022007862A1 (fr) Procédé de traitement d'image, système, dispositif électronique et support de stockage lisible par ordinateur
WO2022001258A1 (fr) Procédé et appareil d'affichage à écrans multiples, dispositif terminal et support de stockage
CN114040242A (zh) 投屏方法和电子设备
CN111316604B (zh) 一种数据传输方法及电子设备
CN114827581A (zh) 同步时延测量方法、内容同步方法、终端设备及存储介质
WO2022170856A1 (fr) Procédé d'établissement de connexion et dispositif électronique
CN111372329B (zh) 一种连接建立方法及终端设备
WO2021052388A1 (fr) Procédé de communication vidéo et appareil de communication vidéo
WO2022161006A1 (fr) Procédé et appareil de synthèse de photographie, et dispositif électronique et support de stockage lisible
US20230335081A1 (en) Display Synchronization Method, Electronic Device, and Readable Storage Medium
WO2022170854A1 (fr) Procédé d'appel vidéo et dispositif associé
WO2022135195A1 (fr) Procédé et appareil permettant d'afficher une interface de réalité virtuelle, dispositif, et support de stockage lisible
EP4293997A1 (fr) Procédé d'affichage, dispositif électronique et système
WO2022033344A1 (fr) Procédé de stabilisation vidéo, dispositif de terminal et support de stockage lisible par ordinateur

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21922596

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21922596

Country of ref document: EP

Kind code of ref document: A1