CN115499619A - Image transmission method and device, electronic equipment and readable storage medium - Google Patents

Image transmission method and device, electronic equipment and readable storage medium

Info

Publication number
CN115499619A
Authority
CN
China
Prior art keywords
images
image
terminal
terminals
target terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211122532.3A
Other languages
Chinese (zh)
Inventor
李莹莹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202211122532.3A
Publication of CN115499619A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/15 Conference systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/268 Signal distribution or switching

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The application discloses an image transmission method and apparatus, an electronic device, and a readable storage medium, which belong to the technical field of data transmission. The image transmission method includes: receiving images uploaded by at least two terminals; obtaining a first image and at least two second images according to the images uploaded by the at least two terminals, where the first image includes the images uploaded by the at least two terminals and each of the at least two second images includes the image uploaded by the corresponding terminal; and sending the first image or the at least two second images to a target terminal for display, according to the conference architecture mode in which the target terminal operates; the target terminal is any one of the at least two terminals.

Description

Image transmission method and device, electronic equipment and readable storage medium
Technical Field
The application belongs to the technical field of data transmission, and particularly relates to an image transmission method and device, electronic equipment and a readable storage medium.
Background
In related technical solutions, the conference architecture modes include an MCU (Multipoint Control Unit) architecture mode and an SFU (Selective Forwarding Unit) architecture mode. Both architecture modes are star-shaped structures formed by a server and a plurality of terminals, and the two architecture modes operate as follows:
Under the MCU architecture mode, the MCU terminals upload images to the server, the server mixes the images uploaded by the MCU terminals together, and then forwards the mixed image to each MCU terminal. Under the SFU architecture mode, the SFU terminals upload images to the server, and the server forwards the uploaded images directly to the other SFU terminals without processing them. Here, a terminal in the MCU architecture mode is an MCU terminal and a terminal in the SFU architecture mode is an SFU terminal; an MCU terminal can only receive one image at a time, whereas an SFU terminal can receive a plurality of images at a time.
In a large-scale open conference scene, the terminals participating in the conference may be MCU terminals or SFU terminals. When the server supports MCU terminals, SFU terminals cannot participate in the conference; conversely, when the server supports SFU terminals, MCU terminals cannot participate in the conference.
Disclosure of Invention
An object of the embodiments of the present application is to provide an image transmission method, an image transmission device, an electronic device, and a readable storage medium, which can solve the problem that in the related art, when a server supports an MCU terminal, an SFU terminal cannot participate in a conference, and conversely, when the server supports the SFU terminal, the MCU terminal cannot participate in the conference.
In order to solve the technical problem, the present application is implemented as follows:
In a first aspect, an embodiment of the present application provides an image transmission method, which is executed by a server, where the server is in communication with at least two terminals, the at least two terminals include a first type terminal and a second type terminal, and the conference architecture modes in which the first type terminal and the second type terminal operate are different, and the image transmission method includes: receiving images uploaded by at least two terminals; obtaining a first image and at least two second images according to the images uploaded by the at least two terminals, wherein the first image comprises the images uploaded by the at least two terminals, and each of the at least two second images comprises the image uploaded by the corresponding terminal; according to a conference architecture mode operated by a target terminal, sending the first image or the at least two second images to the target terminal for the target terminal to display; the target terminal is any one of the at least two terminals.
In a second aspect, an embodiment of the present application provides an image transmission apparatus, applied to a server, where the server communicates with at least two terminals, the at least two terminals include a first type terminal and a second type terminal, and the conference architecture modes in which the first type terminal and the second type terminal operate are different, and the image transmission apparatus includes: the receiving module is used for receiving images uploaded by the at least two terminals; the distribution module is used for obtaining a first image and at least two second images according to the images uploaded by the at least two terminals, the first image comprises the images uploaded by the at least two terminals, and each of the at least two second images comprises the image uploaded by the corresponding terminal; the sending module is used for sending the first image or the at least two second images to the target terminal according to the conference architecture mode in which the target terminal operates, so as to be displayed by the target terminal; the target terminal is any one of the at least two terminals.
In a third aspect, embodiments of the present application provide an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor, implement the steps of the method according to the first aspect.
In the embodiment of the application, in a conference scene, a server can interact with terminals operating in different conference architecture modes, receive images uploaded by each terminal, process the received images, generate a first image and at least two second images adapted to the different conference architecture modes, and select to send the first image or the at least two second images according to the conference architecture mode operated by a target terminal.
In the process, the server can automatically distribute images suitable for being displayed on the target terminal according to the conference architecture mode running on the terminal, so that the coexistence of terminals in different conference architecture modes is realized, the conference can be carried out in the same conference scene even if the terminals in different conference architecture modes run, and the influence of the conference architecture mode of the terminal on the video conference is eliminated.
Drawings
Fig. 1 is a schematic flowchart of an image transmission method according to an embodiment of the present application;
fig. 2 is a schematic diagram illustrating correspondence between an image uploaded by a terminal, a first image, and at least two second images according to an embodiment of the present application;
fig. 3 is a schematic diagram of an area for storing a first image and at least two second images according to an embodiment of the present application;
FIG. 4 is a schematic flowchart of processing a second image according to an embodiment of the present disclosure;
fig. 5 is a schematic diagram illustrating packetization of a network packet according to ssrc-rtp provided in an embodiment of the present application;
fig. 6 is a schematic flowchart of processing a first image and at least two second images according to an embodiment of the present application;
fig. 7 is a schematic block diagram of an image transmission apparatus provided in an embodiment of the present application;
fig. 8 is a hardware schematic diagram of an electronic device according to an embodiment of the present disclosure;
fig. 9 is a second hardware schematic diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below clearly with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived from the embodiments in the present application by a person skilled in the art, are within the scope of protection of the present application.
The terms "first", "second" and the like in the description and in the claims of the present application are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It should be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application may be practiced in sequences other than those illustrated or described herein. Objects distinguished by "first", "second" and the like are generally of one type, and the number of objects is not limited; for example, the first object may be one or more than one. In addition, "and/or" in the specification and claims indicates at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
The following describes in detail an image transmission method, an image transmission apparatus, an electronic device, and a readable storage medium provided in the embodiments of the present application with specific embodiments and application scenarios thereof in conjunction with the accompanying drawings.
The embodiment of the application provides an image transmission method, which is executed by a server. The server communicates with at least two terminals, the at least two terminals comprise a first type terminal and a second type terminal, and the first type terminal and the second type terminal operate in different conference architecture modes.
The conference architecture mode is the communication architecture adopted between the server and the terminals, and each terminal operates in one conference architecture mode at any given time.
As shown in fig. 1, the image transmission method includes:
and 102, receiving images uploaded by at least two terminals.
In one embodiment, in a video conference, each terminal acquires an image of a user according to a preset acquisition frequency, encodes acquired image data, and uploads the encoded image to a server.
In one embodiment, each image is one frame of a video, and a terminal continuously uploading such frames to the server over a period of time may be regarded as the terminal sending a video to the server. Specifically, a video, that is, video data, is composed of a plurality of pictures; owing to the persistence of vision of the human eye, a video generally has 24 pictures (that is, 24 frames) per second, which the human eye perceives as continuous, and the video can be transmitted by using a video transmission technology such as video streaming.
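For illustration only, the receive path of step 102 can be sketched as follows (an illustrative Python sketch, not part of the disclosed embodiments; the decode_frame hook, the queue sizes and the per-terminal queue layout are assumptions). Each uploaded frame is decoded and stored in two per-terminal data queues that later feed the first image and the second images:

import queue
from collections import defaultdict

# One pair of bounded queues per terminal: the first data queue feeds the
# mixed (first) image, the second data queue feeds the per-terminal
# (second) images, matching the data-queue embodiment described later.
first_data_queues = defaultdict(lambda: queue.Queue(maxsize=64))
second_data_queues = defaultdict(lambda: queue.Queue(maxsize=64))

def on_frame_uploaded(terminal_id, encoded_frame, decode_frame):
    # decode_frame is an assumed codec hook that turns the uploaded
    # bitstream into a raw image (for example an RGB array).
    image = decode_frame(encoded_frame)
    for q in (first_data_queues[terminal_id], second_data_queues[terminal_id]):
        try:
            q.put_nowait(image)
        except queue.Full:
            q.get_nowait()        # drop the oldest frame to keep latency low
            q.put_nowait(image)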
Step 104, obtaining a first image and at least two second images according to the images uploaded by the at least two terminals, wherein the first image comprises the images uploaded by the at least two terminals, and each of the at least two second images comprises the image uploaded by the corresponding terminal.
In one embodiment, the server may be regarded as a computer in which a memory space is used to deploy a canvas thread. The canvas thread takes the images uploaded by one or more terminals, continuously synthesizes new images (such as the first image and the plurality of second images), forms a video as the synthesized images accumulate, and sends the video to the terminals participating in the conference for rendering, so that the terminals participating in the conference can display it.
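A possible shape of such a canvas thread is sketched below (illustrative only; the mix_images hook, the downlink_queues and stop_event objects and the 25 fps pacing are assumptions, and only the mixed-image path is shown):

import time

def canvas_thread(first_data_queues, mix_images, downlink_queues, stop_event, fps=25):
    # Pull the newest frame of every terminal, mix the frames into one
    # composite (first) image and push it to every downlink queue.
    latest = {}                      # terminal_id -> most recent frame
    period = 1.0 / fps
    while not stop_event.is_set():
        for terminal_id, q in first_data_queues.items():
            while not q.empty():     # keep only the newest frame
                latest[terminal_id] = q.get_nowait()
        if latest:
            composite = mix_images(list(latest.values()))
            for q in downlink_queues.values():
                q.put(composite)
        time.sleep(period)

The thread can be started with threading.Thread(target=canvas_thread, args=(...)) and stopped by setting the event.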
Step 106, sending the first image or the at least two second images to the target terminal according to the conference architecture mode in which the target terminal operates, for the target terminal to display.
The target terminal is any one of at least two terminals.
In the embodiment of the application, in a conference scene, a server can interact with terminals operating in different conference architecture modes, receive images uploaded by each terminal, process the received images, generate a first image and at least two second images adapted to the different conference architecture modes, and select to send the first image or the at least two second images according to the conference architecture mode operated by a target terminal.
In the process, the server can automatically distribute images suitable for being displayed on the target terminal according to the conference architecture mode running on the terminal, so that the coexistence of terminals in different conference architecture modes is realized, the conference can be carried out in the same conference scene even if the terminals in different conference architecture modes run, and the influence of the conference architecture mode of the terminal on the video conference is eliminated.
In one embodiment, the sending a first image or at least two second images to a target terminal for the target terminal to display according to a conference architecture mode in which the target terminal operates specifically includes: under the condition that the conference architecture mode operated by the target terminal is an MCU architecture mode, sending a first image to the target terminal; and under the condition that the conference architecture mode operated by the target terminal is the SFU architecture mode, at least two second images are sent to the target terminal.
In this embodiment, when the conference architecture mode in which the terminal operates is the MCU architecture mode, that is, when the terminal is the MCU terminal, since the MCU terminal can only receive one image at a time, the first image is transmitted to the target terminal, and the characteristic that the first image includes images uploaded by at least two terminals is utilized, so that the target terminal can display images of all users to the users by using the received first image.
Similarly, when the conference architecture mode in which the terminal operates is the SFU architecture mode, that is, when the terminal is the SFU terminal, since the SFU terminal can receive a plurality of images at one time, by sending at least two second images to the target terminal, the target terminal can display images of all the participants to the user based on the at least two second images under the condition that the target terminal receives the at least two second images.
In the above embodiment, the first image or at least two second images are selected and transmitted according to the conference architecture mode of the target terminal, so that the target terminal can receive and display the images of all participants.
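The selection logic of this embodiment can be summarized by the following illustrative sketch (the ConferenceMode enumeration, the Terminal record and its send hook are assumptions introduced only to make the example self-contained):

from dataclasses import dataclass
from enum import Enum
from typing import Callable, List

class ConferenceMode(Enum):
    MCU = "mcu"   # the terminal accepts one image at a time
    SFU = "sfu"   # the terminal accepts several images at a time

@dataclass
class Terminal:
    terminal_id: int
    mode: ConferenceMode
    send: Callable[[bytes], None]    # assumed transport hook

def dispatch(target: Terminal, first_image: bytes, second_images: List[bytes]) -> None:
    # Send the image(s) matching the conference architecture mode in
    # which the target terminal operates.
    if target.mode is ConferenceMode.MCU:
        target.send(first_image)           # one mixed image
    else:
        for img in second_images:          # one image per participating terminal
            target.send(img)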
Specifically, for example, as shown in fig. 2, there are four terminals, namely terminal 1, terminal 2, terminal 3 and terminal 4. After receiving the images uploaded by terminal 1, terminal 2, terminal 3 and terminal 4, the server obtains a first image and at least two second images according to the images uploaded by terminal 1, terminal 2, terminal 3 and terminal 4.
In one embodiment, as shown in fig. 3, M first areas and N second areas are preset in the server, where the first areas are used to store first images obtained after processing and the second areas are used to store second images obtained after processing, and the number of enabled first areas and the number of enabled second areas are selected according to actual usage requirements.
For example, M is 8 and N is 12, that is, first region 1, first region 2, …, first region 8, second region 9, second region 10, …, second region 20.
In one embodiment, the sending at least two second images to the target terminal specifically includes: coding each of the at least two second images to obtain a coding result; matching a corresponding identification value for each coding result, wherein the identification value is associated with the corresponding second image; and sending the coding result and the identification value to the target terminal.
In this embodiment, after the target terminal receives the at least two second images, it needs to display them respectively. In this process, without identification, the correspondence between the newly received second images and the previously received second images cannot be known, second images from the same terminal cannot be combined, and the displayed picture is very likely to be disordered.
In the embodiment of the application, after each second image is coded, the coding result is matched with an identification value, and the coding result corresponding to the second image and the identification value are sent to the target terminal together, so that after the target terminal receives the data, the received second images are distinguished and combined according to the identification value, the accuracy of a display picture is ensured, and the probability of disorder of the display picture is reduced.
In one embodiment, the encoding result and the identification value are associated and then sent to the target terminal, so that after the target terminal receives one of the encoding result and the identification value, the other one of the encoding result and the identification value can be known according to the association relationship, and the probability of disorder in matching of the encoding result and the identification value is reduced.
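An illustrative way to pair every encoding result with its identification value is sketched below (the encode and identification_value_of hooks and the dict layout of second_images are assumptions; the embodiment does not prescribe this data structure):

def tag_encoding_results(second_images, encode, identification_value_of):
    # second_images maps terminal_id -> raw second image; the result pairs
    # every encoding result with the identification value of the terminal
    # it came from, so the receiver can group frames of the same terminal.
    tagged = []
    for terminal_id, image in second_images.items():
        tagged.append((identification_value_of(terminal_id), encode(image)))
    return tagged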
Specifically, as shown in fig. 4, the server encodes the second image, and then inserts the encoded result of the second image (i.e., video image, frame) into the downstream queue of the terminals (e.g., terminal 1 and terminal 2) viewing the second image.
In one embodiment, the encoding result and the identification value are transmitted by using a real-time transmission protocol; matching a corresponding identification value for each coding result specifically comprises: acquiring an identification value of each of at least two second images; and modifying the synchronous source identifier of each encoding result into an identification value based on the real-time transmission protocol.
In this embodiment, a specific transmission scheme of the identification value is defined. In this embodiment, the Real-time Transport Protocol (RTP) is used to transmit the second image; under RTP, the ssrc field refers to the synchronization source, that is, the above synchronization source identifier, and different images use different ssrc values, so that different images can be distinguished according to their ssrc values.
Specifically, as shown in fig. 5, a network packet of the real-time transport protocol, that is, an RTP packet, has a plurality of RTP sub-packets, wherein each RTP sub-packet has an ssrc, such as ssrc2 and ssrc3, and thus, the plurality of RTP sub-packets in the RTP packet can be divided into RTP packet 1 and RTP packet 2 based on ssrc2 and ssrc3.
Wherein ssrc of the RTP sub-packet in the RTP packet 1 is ssrc2, and ssrc of the RTP sub-packet in the RTP packet 2 is ssrc3.
In the embodiment, the identification value is represented by using the field in the transmission protocol, and the identification value does not need to be sent independently, so that the data volume of interaction between the server and the terminal is reduced, and the probability of abnormal matching between the identification value and the second image caused by disorder of the identification value during transmission is reduced.
Specifically, for example, the value of the ssrc field of the real-time transport protocol that transmits the video of the 1 st second image is 1; the value of the ssrc field of the real-time transport protocol for transmitting the video of the 2 nd second image is 2, so that whether the second image is the second image of the same terminal can be judged according to the ssrc values of network packets of a plurality of real-time transport protocols.
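Because RTP (RFC 3550) places the 32-bit ssrc field at byte offset 8 of the fixed header, the synchronization source identifier of an already packetized frame can be rewritten in place. The following sketch shows one such rewrite; it is an illustration, not the patented implementation:

import struct

RTP_SSRC_OFFSET = 8   # fixed RTP header: V/P/X/CC, M/PT, sequence (2 bytes), timestamp (4 bytes), ssrc (4 bytes)

def set_ssrc(rtp_packet: bytes, identification_value: int) -> bytes:
    # Return a copy of the RTP packet whose ssrc field carries the
    # identification value associated with the corresponding second image.
    if len(rtp_packet) < RTP_SSRC_OFFSET + 4:
        raise ValueError("packet shorter than an RTP fixed header")
    return (rtp_packet[:RTP_SSRC_OFFSET]
            + struct.pack("!I", identification_value & 0xFFFFFFFF)
            + rtp_packet[RTP_SSRC_OFFSET + 4:])

A receiving terminal can then group RTP sub-packets whose ssrc values match, as illustrated by ssrc2 and ssrc3 in fig. 5.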
Specifically, as shown in fig. 6, there are five terminals, namely terminal 1, terminal 2, terminal 3, terminal 4 and terminal 5. After receiving the images uploaded by terminals 1, 2, 3, 4 and 5, the server encodes them, modifies the ssrc values of the encoding results to 2, 3, 4 and 5, and then inserts each frame into the downlink queues of the terminals viewing that second image (such as terminals 1, 2, 3 and 5).
The input/output (IO) thread of the server takes images from each terminal's downlink queue and sends them, and the terminal receives the network packets of the real-time transport protocol and renders the video according to the service rules.
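An illustrative IO-thread loop is sketched below (the downlink_queues and terminal_addresses structures and the use of plain UDP sockets are assumptions made for the example; RTP is commonly carried over UDP):

import socket
import time

def io_thread(downlink_queues, terminal_addresses, stop_event):
    # Drain every terminal's downlink queue and send the queued packets
    # to that terminal's address.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        while not stop_event.is_set():
            for terminal_id, q in downlink_queues.items():
                while not q.empty():
                    sock.sendto(q.get_nowait(), terminal_addresses[terminal_id])
            time.sleep(0.001)     # avoid a busy loop
    finally:
        sock.close()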
In one embodiment, the server has at least two first data queues and at least two second data queues, where the at least two first data queues and the at least two second data queues correspond to the at least two terminals one to one, and the first data queues and the second data queues are used to store images uploaded by the corresponding terminals respectively. Obtaining a first image according to the images uploaded by the at least two terminals specifically includes: respectively reading images from the first data queue corresponding to each terminal; and performing image splicing on the read images to obtain the first image. Obtaining at least two second images according to the images uploaded by the at least two terminals specifically includes: respectively reading images from the second data queue corresponding to each terminal; and obtaining the at least two second images according to the read images.
In the embodiment, a first data queue and a second data queue are matched for each terminal, wherein the first data queue and the second data queue respectively store images uploaded by the terminal, so that when a first image and at least two second images are generated, data can be taken from the corresponding data queues, in the process, the generation of the first image and the generation of the at least two second images cannot influence each other, and the probability of image reading disorder is reduced.
Specifically, the canvas thread may take images from the video image queues of the terminals (that is, the first data queue, such as video_queue1, and the second data queue, such as video_queue2, mentioned above) and mix them into new images. In the process of obtaining the first image, images are read from the first data queue corresponding to each terminal and stitched together to obtain the mixed image; in the process of obtaining the at least two second images, images are read from the second data queue corresponding to each terminal, and the second images are obtained according to the read images.
In one embodiment, the second data queue may be understood as a copy of the first data queue, and the first data queue may be understood as a plurality of images arranged in order of their upload time.
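For completeness, an illustrative stitching function is given below; it could serve as the mix_images hook assumed in the earlier canvas-thread sketch (the near-square grid layout and the NumPy frame format are assumptions, not requirements of the embodiment):

import math
import numpy as np

def stitch_first_image(frames):
    # Tile one frame per terminal onto a near-square grid to form the
    # composite first image; empty cells remain black. Assumes all frames
    # are equally sized arrays of shape (height, width, channels).
    if not frames:
        raise ValueError("no frames to stitch")
    h, w, c = frames[0].shape
    cols = math.ceil(math.sqrt(len(frames)))
    rows = math.ceil(len(frames) / cols)
    canvas = np.zeros((rows * h, cols * w, c), dtype=frames[0].dtype)
    for i, frame in enumerate(frames):
        r, col = divmod(i, cols)
        canvas[r * h:(r + 1) * h, col * w:(col + 1) * w] = frame
    return canvas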
For the image transmission method provided by the embodiments of the application, the execution subject may be an image transmission apparatus. In the embodiments of the present application, an image transmission apparatus performing the image transmission method is taken as an example to describe the image transmission apparatus provided in the embodiments of the present application.
In one embodiment, as shown in fig. 7, there is provided an image transmission apparatus 700, applied to a server, the server communicating with at least two terminals, the at least two terminals including a first type terminal and a second type terminal, the first type terminal and the second type terminal operating in different conference architecture modes, the image transmission apparatus 700 comprising: a receiving module 702, configured to receive images uploaded by at least two terminals; a distribution module 704, configured to obtain a first image and at least two second images according to the images uploaded by the at least two terminals, wherein the first image comprises the images uploaded by the at least two terminals, and each of the at least two second images comprises the image uploaded by the corresponding terminal; and a sending module 706, configured to send the first image or the at least two second images to the target terminal according to the conference architecture mode in which the target terminal operates, so that the target terminal can display the first image or the at least two second images; the target terminal is any one of the at least two terminals.
In the embodiment of the application, in a conference scene, a server can interact with terminals operating in different conference architecture modes, receive images uploaded by each terminal, process the received images, generate a first image and at least two second images adapted to the different conference architecture modes, and select to send the first image or the at least two second images according to the conference architecture mode operated by a target terminal.
In the process, the server can automatically distribute images suitable for being displayed on the target terminal according to the conference architecture mode running on the terminal, so that the coexistence of terminals in different conference architecture modes is realized, the conference can be carried out in the same conference scene even if the terminals in different conference architecture modes run, and the influence of the conference architecture mode of the terminal on the video conference is eliminated.
In one embodiment, the sending module 706 is specifically configured to: under the condition that the conference architecture mode operated by the target terminal is an MCU architecture mode, transmit the first image to the target terminal; and under the condition that the conference architecture mode operated by the target terminal is the SFU architecture mode, send the at least two second images to the target terminal.
In this embodiment, when the conference architecture mode in which the terminal operates is the MCU architecture mode, that is, when the terminal is the MCU terminal, since the MCU terminal can only receive one image at a time, the first image is transmitted to the target terminal, and the characteristic that the first image includes images uploaded by at least two terminals is utilized, so that the target terminal can display images of all users to the users by using the received first image.
Similarly, when the conference architecture mode in which the terminal operates is the SFU architecture mode, that is, when the terminal is the SFU terminal, since the SFU terminal can receive a plurality of images at a time, by sending the at least two second images to the target terminal, the target terminal can display the images of all the participants to the user based on the at least two second images under the condition that the target terminal receives the at least two second images.
In the embodiment, the first image or the at least two second images are selected and transmitted according to the conference architecture mode of the target terminal, so that the target terminal can receive and display the images of all the participants.
In one embodiment, the sending module 706 is specifically configured to: coding each of the at least two second images to obtain a coding result; matching a corresponding identification value for each coding result, wherein the identification value is associated with the corresponding second image; and sending the coding result and the identification value to the target terminal.
In this embodiment, after receiving the at least two second images, the target terminal needs to display them separately. In this process, without identification, the correspondence between the newly received second images and the previously received second images cannot be known, second images from the same terminal cannot be combined, and the displayed picture is likely to be disordered.
In the embodiment of the application, after each second image is coded, the coding result is matched with an identification value, and the coding result corresponding to the second image and the identification value are sent to the target terminal together, so that after the target terminal receives the data, the received second images are distinguished and combined according to the identification value, the accuracy of a display picture is ensured, and the probability of disorder of the display picture is reduced.
In one embodiment, the coding result and the identification value are associated and then sent to the target terminal, so that after the target terminal receives one of the coding result and the identification value, the other one of the coding result and the identification value can be known according to the association relationship, and the probability of disorder in matching of the coding result and the identification value is reduced.
In one embodiment, the encoding result and the identification value are transmitted by using a real-time transmission protocol; the sending module 706 is specifically configured to: acquiring an identification value of each of at least two second images; and modifying the synchronous source identifier of each encoding result into an identification value based on the real-time transmission protocol.
In the embodiment, the identification value is represented by using the field in the transmission protocol, and the identification value does not need to be sent independently, so that the data volume of interaction between the server and the terminal is reduced, and meanwhile, the probability of abnormal matching between the identification value and the second image caused by disorder of the identification value during transmission is reduced.
In one embodiment, the server is provided with at least two first data queues and at least two second data queues, the at least two first data queues and the at least two second data queues correspond to the at least two terminals one by one, and the first data queues and the second data queues are used for respectively storing images uploaded by the corresponding terminals; the distribution module is specifically used for: respectively reading images from the first data queue corresponding to each terminal; performing image splicing on the read images to obtain the first image; respectively reading images from the second data queue corresponding to each terminal; and obtaining the at least two second images according to the read images.
In the embodiment, a first data queue and a second data queue are matched for each terminal, wherein the first data queue and the second data queue respectively store the images uploaded by the terminal, so that when a first image and at least two second images are generated, the data can be taken from the corresponding data queues, in the process, the generation of the first image and the generation of the at least two second images cannot influence each other, and the probability of image reading disorder is reduced.
The image transmission apparatus 700 in the embodiment of the present application may be an electronic device, or may be a component in an electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. The electronic device may be, for example, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted electronic device, a Mobile Internet Device (MID), an Augmented Reality (AR)/Virtual Reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and may also be a server, a Network Attached Storage (NAS), a personal computer (PC), a Television (TV), a teller machine, a self-service machine, and the like; the embodiments of the present application are not limited in this regard.
The image transmission device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiment of the present application.
In one embodiment, as shown in fig. 8, an electronic device 800 is further provided in the present embodiment of the application, and includes a processor 802 and a memory 804, where the memory 804 stores a program or an instruction that can be executed on the processor 802, and when the program or the instruction is executed by the processor 802, the steps of the foregoing image transmission method embodiment are implemented, and the same technical effects can be achieved, and are not described again here to avoid repetition.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic device and the non-mobile electronic device described above.
Fig. 9 is a schematic diagram of a hardware structure of an electronic device implementing the embodiment of the present application.
As shown in fig. 9, the electronic device 900 includes, but is not limited to: a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, a display unit 906, a user input unit 907, an interface unit 908, a memory 909, and a processor 910. Those skilled in the art will appreciate that the electronic device 900 may further include a power source (e.g., a battery) for supplying power to various components, and the power source may be logically connected to the processor 910 through a power management system, so as to manage charging, discharging, and power consumption management functions through the power management system. The electronic device structure shown in fig. 9 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is not repeated here.
In one embodiment, the processor 910 is configured to: receiving images uploaded by at least two terminals; obtaining a first image and at least two second images according to images uploaded by at least two terminals, wherein the first image comprises the images uploaded by the at least two terminals, and each second image in the at least two second images comprises the image uploaded by the corresponding terminal; according to a conference architecture mode operated by a target terminal, sending a first image or at least two second images to the target terminal for the target terminal to display; the target terminal is any one of at least two terminals.
In one embodiment, the processor 910 is specifically configured to: under the condition that the conference architecture mode operated by the target terminal is an MCU architecture mode, transmitting a first image to the target terminal; and under the condition that the conference architecture mode operated by the target terminal is the SFU architecture mode, at least two second images are sent to the target terminal.
In one embodiment, the processor 910 is specifically configured to: coding each of the at least two second images to obtain a coding result; matching a corresponding identification value for each coding result, the identification value being associated with a corresponding second image; and sending the coding result and the identification value to the target terminal.
In one embodiment, the encoding result and the identification value are transmitted by using a real-time transmission protocol; the processor 910 is configured to: acquire an identification value of each of the at least two second images; and modify the synchronization source identifier of each encoding result into the identification value based on the real-time transmission protocol.
In one embodiment, the server is provided with at least two first data queues and at least two second data queues, the at least two first data queues and the at least two second data queues correspond to the at least two terminals one to one, and the first data queues and the second data queues are used for respectively storing images uploaded by the corresponding terminals; the processor 910 is specifically configured to: respectively read images from the first data queue corresponding to each terminal; perform image splicing on the read images to obtain the first image; respectively read images from the second data queue corresponding to each terminal; and obtain the at least two second images according to the read images.
In the embodiment of the application, in a conference scene, a server can interact with terminals operating in different conference architecture modes, receive images uploaded by each terminal, process the received images, generate a first image and at least two second images adaptive to the different conference architecture modes, and select to transmit the first image or the at least two second images according to the conference architecture mode operated by a target terminal.
In the process, the server can automatically distribute canvas suitable for being displayed on the target terminal according to the conference architecture mode running on the terminal, coexistence of terminals in different conference architecture modes is achieved, even if the terminals in different conference architecture modes run, a conference can be carried out in the same conference scene, and the influence of the conference architecture mode of the terminal on the video conference is eliminated.
It should be understood that, in the embodiment of the present application, the input Unit 904 may include a Graphics Processing Unit (GPU) 9041 and a microphone 9042, and the Graphics Processing Unit 9041 processes image data of a still picture or a video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 906 may include a display panel 9061, and the display panel 9061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 907 includes at least one of a touch panel 9071 and other input devices 9072. A touch panel 9071, also called a touch screen. The touch panel 9071 may include two parts, a touch detection device and a touch controller. Other input devices 9072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
The memory 909 may be used to store software programs as well as various data. The memory 909 may mainly include a first storage area storing a program or an instruction and a second storage area storing data, wherein the first storage area may store an operating system, an application program or an instruction (such as a sound playing function, an image playing function, and the like) required for at least one function, and the like. Further, the memory 909 may include volatile memory or nonvolatile memory, or the memory 909 may include both volatile and nonvolatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), a Static Random Access Memory (Static RAM, SRAM), a Dynamic Random Access Memory (Dynamic RAM, DRAM), a Synchronous Dynamic Random Access Memory (Synchronous DRAM, SDRAM), a Double Data Rate Synchronous Dynamic Random Access Memory (Double Data Rate SDRAM, DDR SDRAM), an Enhanced Synchronous SDRAM (ESDRAM), a Synchronous Link DRAM (SLDRAM), or a Direct Rambus RAM (DRRAM). The memory 909 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
Processor 910 may include one or more processing units; optionally, the processor 910 integrates an application processor, which mainly handles operations related to the operating system, user interface, and applications, and a modem processor, which mainly handles wireless communication signals, such as a baseband processor. It is to be appreciated that the modem processor described above may not be integrated into processor 910.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the embodiment of the image transmission method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device in the above embodiment. Readable storage media, including computer readable storage media such as computer read only memory ROM, random access memory RAM, magnetic or optical disks, and the like.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the above-mentioned embodiment of the image transmission method, and can achieve the same technical effect, and in order to avoid repetition, the description is omitted here.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as a system-on-chip, or a system-on-chip.
Embodiments of the present application provide a computer program product, where the program product is stored in a storage medium, and the program product is executed by at least one processor to implement the processes of the foregoing image transmission method embodiments, and can achieve the same technical effects, and in order to avoid repetition, details are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the method of the foregoing embodiments may be implemented by software plus a necessary general hardware platform, and certainly may also be implemented by hardware, but in many cases, the former is a better implementation. Based on such understanding, the embodiments of the present application or portions thereof contributing to the prior art may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (which may be a mobile phone, a computer, a server, or a network device) to execute the method of the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the present embodiments are not limited to those precise embodiments, which are intended to be illustrative rather than restrictive, and that various changes and modifications may be effected therein by one skilled in the art without departing from the scope of the appended claims.

Claims (12)

1. A method for transmitting an image, performed by a server, wherein the server is in communication with at least two terminals, the at least two terminals include a first type terminal and a second type terminal, and the conference architecture modes in which the first type terminal and the second type terminal operate are different, the method for transmitting an image comprising:
receiving images uploaded by the at least two terminals;
obtaining a first image and at least two second images according to the images uploaded by the at least two terminals, wherein the first image comprises the images uploaded by the at least two terminals, and each of the at least two second images comprises the image uploaded by the corresponding terminal;
according to a conference architecture mode in which a target terminal operates, sending the first image or the at least two second images to the target terminal for the target terminal to display;
wherein the target terminal is any one of the at least two terminals.
2. The method according to claim 1, wherein the sending the first image or the at least two second images to a target terminal for display by the target terminal according to a conference architecture mode in which the target terminal operates specifically includes:
under the condition that the conference architecture mode of the target terminal is an MCU architecture mode, transmitting the first image to the target terminal;
and sending the at least two second images to the target terminal under the condition that the conference architecture mode operated by the target terminal is an SFU architecture mode.
3. The method for transmitting an image according to claim 2, wherein sending the at least two second images to the target terminal specifically includes:
coding each second image of the at least two second images to obtain a coding result;
matching a corresponding identification value for each encoding result, the identification value being associated with a corresponding second image;
and sending the coding result and the identification value to the target terminal.
4. The image transmission method according to claim 3, wherein the encoding result and the identification value are transmitted using a real-time transmission protocol;
the matching of the corresponding identification value for each coding result specifically includes:
acquiring an identification value of each of the at least two second images;
and modifying the synchronous source identifier of each encoding result into the identification value based on a real-time transmission protocol.
5. The method for transmitting the image according to any one of claims 1 to 4, wherein the server has at least two first data queues and at least two second data queues, the at least two first data queues and the at least two second data queues are in one-to-one correspondence with the at least two terminals, and the first data queues and the second data queues are used for respectively storing the images uploaded by the corresponding terminals;
obtaining a first image according to the images uploaded by the at least two terminals, which specifically comprises:
respectively reading images from a first data queue corresponding to each terminal;
performing image splicing on the read image to obtain the first image;
obtaining at least two second images according to the images uploaded by the at least two terminals, which specifically comprises:
respectively reading images from a second data queue corresponding to each terminal;
and obtaining the at least two second images according to the read images.
6. An image transmission apparatus, applied to a server, wherein the server communicates with at least two terminals, the at least two terminals include a first type terminal and a second type terminal, and the first type terminal and the second type terminal operate in different conference architecture modes, the image transmission apparatus comprising:
the receiving module is used for receiving the images uploaded by the at least two terminals;
the distribution module is used for obtaining a first image and at least two second images according to the images uploaded by the at least two terminals, wherein the first image comprises the images uploaded by the at least two terminals, and each of the at least two second images comprises the image uploaded by the corresponding terminal;
the sending module is used for sending the first image or the at least two second images to the target terminal according to the conference architecture mode in which the target terminal operates, so as to be displayed by the target terminal;
wherein the target terminal is any one of the at least two terminals.
7. The image transmission apparatus according to claim 6, wherein the sending module is specifically configured to:
under the condition that the conference architecture mode of the target terminal is an MCU architecture mode, transmitting the first image to the target terminal;
and sending the at least two second images to the target terminal under the condition that the conference architecture mode operated by the target terminal is an SFU architecture mode.
8. The image transmission apparatus according to claim 7, wherein the sending module is specifically configured to:
coding each second image of the at least two second images to obtain a coding result;
matching a corresponding identification value for each encoding result, the identification value being associated with a corresponding second image;
and sending the coding result and the identification value to the target terminal.
9. The apparatus for transmitting image according to claim 8, wherein the encoding result and the identification value are transmitted using a real-time transmission protocol;
the sending module is specifically used for:
acquiring an identification value of each of the at least two second images;
and modifying the synchronous source identifier of each encoding result into the identification value based on a real-time transmission protocol.
10. The image transmission apparatus according to any one of claims 6 to 9, wherein the server has at least two first data queues and at least two second data queues, the at least two first data queues and the at least two second data queues are in one-to-one correspondence with the at least two terminals, and the first data queues and the second data queues are used for respectively storing images uploaded by the corresponding terminals;
the distribution module is specifically used for:
respectively reading images from a first data queue corresponding to each terminal;
performing image stitching on the read image to obtain the first image;
respectively reading images from a second data queue corresponding to each terminal;
and obtaining the at least two second images according to the read images.
11. An electronic device, comprising: a processor and a memory, the memory storing a program or instructions executable on the processor, the program or instructions when executed by the processor implementing the steps of the method of any one of claims 1 to 5.
12. A readable storage medium, characterized in that it stores thereon a program or instructions which, when executed by a processor, implement the steps of the method according to any one of claims 1 to 5.
CN202211122532.3A 2022-09-15 2022-09-15 Image transmission method and device, electronic equipment and readable storage medium Pending CN115499619A (en)

Priority Applications (1)

Application Number: CN202211122532.3A; Priority Date: 2022-09-15; Filing Date: 2022-09-15; Title: Image transmission method and device, electronic equipment and readable storage medium

Applications Claiming Priority (1)

Application Number: CN202211122532.3A; Priority Date: 2022-09-15; Filing Date: 2022-09-15; Title: Image transmission method and device, electronic equipment and readable storage medium

Publications (1)

Publication Number: CN115499619A; Publication Date: 2022-12-20

Family

ID=84467876

Family Applications (1)

Application Number: CN202211122532.3A; Title: Image transmission method and device, electronic equipment and readable storage medium; Status: Pending; Priority Date: 2022-09-15; Filing Date: 2022-09-15

Country Status (1)

Country Link
CN (1) CN115499619A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination