WO2020251250A1 - Image streaming method and electronic device supporting same - Google Patents

Image streaming method and electronic device supporting same

Info

Publication number
WO2020251250A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
server
processor
video
camera
Prior art date
Application number
PCT/KR2020/007498
Other languages
English (en)
Korean (ko)
Inventor
박선민
채상원
허재영
김무영
김민정
이정은
Original Assignee
Samsung Electronics Co., Ltd. (삼성전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd. (삼성전자 주식회사)
Publication of WO2020251250A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/21805Source of audio or video content, e.g. local disk arrays enabling multiple viewpoints, e.g. using a plurality of cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/61Network physical structure; Signal processing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/61Network physical structure; Signal processing
    • H04N21/6106Network physical structure; Signal processing specially adapted to the downstream path of the transmission network
    • H04N21/6131Network physical structure; Signal processing specially adapted to the downstream path of the transmission network involving transmission via a mobile phone network
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/61Network physical structure; Signal processing
    • H04N21/6156Network physical structure; Signal processing specially adapted to the upstream path of the transmission network
    • H04N21/6181Network physical structure; Signal processing specially adapted to the upstream path of the transmission network involving transmission via a mobile phone network
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/647Control signaling between network components and server or clients; Network processes for video distribution between server and clients, e.g. controlling the quality of the video stream, by dropping packets, protecting content from unauthorised alteration within the network, monitoring of network load, bridging between two different networks, e.g. between IP and wireless
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95Computational photography systems, e.g. light-field imaging systems
    • H04N23/951Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing

Definitions

  • Various embodiments of the present document relate to a method of streaming image data of a broadcast transmission device and an electronic device supporting the same.
  • Live broadcasting systems provided so far had to continue broadcasting with the camera of the smartphone that started the broadcast, and could not provide images captured from different angles or with the smartphone's other cameras. This may be because the smartphone's functions are limited, or because simultaneous use of a plurality of cameras is restricted by the limited resources available for image processing on the smartphone.
  • A method and apparatus can be provided that simultaneously stream images from various angles and various special-effect images (processed images), using a plurality of cameras included in one electronic device or in a plurality of electronic devices.
  • An image transmission apparatus includes a first camera and a processor connected to the first camera. The processor transmits a first image captured by the first camera to a server device; while transmitting the first image, it receives a user input requesting a processed image and, in response to the request, transmits to the server device a second image together with control information on the second image, the control information including the information needed to produce the processed image.
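The two-phase behavior described above can be sketched as follows. This is a minimal illustrative model, not the patent's implementation: the server transport is replaced by in-memory queues, and all names (`ImageTransmitter`, `ControlInfo`, the field names) are assumptions made for the example.

```python
from dataclasses import dataclass, field

@dataclass
class ControlInfo:
    effect_id: str        # identifier of the processing effect
    section: tuple        # (start, end) section the effect applies to
    playback_time: float  # playback time of the processed image, seconds
    quality: str          # requested image quality

@dataclass
class ImageTransmitter:
    first_stream: list = field(default_factory=list)   # stands in for stream 1
    second_stream: list = field(default_factory=list)  # stands in for stream 2

    def send_first_image(self, frame):
        # The live image from the first camera is transmitted continuously.
        self.first_stream.append(frame)

    def request_processed_image(self, frame, info: ControlInfo):
        # On a user request, the second image and its control information
        # are sent without interrupting the first stream.
        self.second_stream.append((frame, info))

tx = ImageTransmitter()
for i in range(3):
    tx.send_first_image(f"frame-{i}")
tx.request_processed_image(
    "frame-slow", ControlInfo("slow_motion", (0, 2), 4.0, "1080p"))
```

Keeping the two transmissions on separate queues mirrors the claim that the processed-image request does not interrupt the ongoing first stream.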
  • the server device may include a first server device and a second server device, at least one of which may be a multi-access edge computing (MEC) server. The processor may be configured to transmit the first image to the first server device and the second image to the second server device.
  • the electronic device may further include a second camera, and the processor may be configured to capture the second image with the second camera.
  • the processor may be configured to transmit the first image to the server device as a first stream, and to transmit the second image to the server device as a second stream distinct from the first stream.
  • the first camera may be a front camera positioned on the front of the image transmission device, and the second camera may be a rear camera positioned on the rear side of the image transmission device.
  • the second image may be an image captured in a processed-image mode; the second image may be included in the first image, or the processor may be configured to transmit the second image to the server device separately.
  • the control information for the second image may include at least one of an identifier of a processing effect, information on an application section of the processing effect, information on a playback time of an image to which the processing effect is applied, and information on image quality.
  • the processing effect may include at least one of a highlight video effect, a slow-motion video effect, a panorama video effect, a screen-sharing video effect of the video transmission device, a chroma-key video effect, a style-transfer video effect, a video effect including a 3D object, an enlarged-image effect, a blur-correction effect, and an image-quality-improvement effect.
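The control-information fields listed above can be serialized as a small message. The JSON encoding, the field names, and the effect identifiers below are illustrative assumptions for the sketch; the patent does not define a wire format.

```python
import json

# Hypothetical effect identifiers derived from the effect list above.
KNOWN_EFFECTS = {
    "highlight", "slow_motion", "panorama", "screen_share", "chroma_key",
    "style_transfer", "3d_object", "zoom", "blur_correction", "quality_boost",
}

def make_control_info(effect_id, section, playback_time, quality):
    """Build a control-information message for a processed-image request."""
    if effect_id not in KNOWN_EFFECTS:
        raise ValueError(f"unknown processing effect: {effect_id}")
    start, end = section
    if end <= start:
        raise ValueError("application section must be a non-empty interval")
    return json.dumps({
        "effect_id": effect_id,          # identifier of the processing effect
        "section": [start, end],         # application section of the effect
        "playback_time": playback_time,  # playback time of the processed image
        "quality": quality,              # requested image quality
    })

msg = make_control_info("slow_motion", (2.0, 5.0), 12.0, "1080p")
```

Validating the effect identifier and the application section before transmission keeps the server from receiving a request it cannot produce.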
  • a server device includes a communication module and a processor connected to the communication module. The processor transmits, to an image receiving device, a first image received from an image transmission device or a first processed image produced from the first image; while transmitting the first image or the first processed image, it receives control information on a second image; it may be configured to produce a second processed image based on the second image received from the image transmission device and the control information on the second image, and to transmit the second processed image to the image receiving device.
  • the server device may be a multi-access edge computing (MEC) server.
  • the second image may be an image captured in a processed-image mode, and the control information on the second image may include at least one of an identifier of a processing effect, information on the application section of the effect, information on the playback time of the image to which the effect is applied, and information on image quality. The processor may be configured to produce the second processed image based on the second image and the control information.
  • the processor may be configured to give the second processed image an image size and resolution different from those of the first image or the first processed image, and to transmit the second processed image to the image receiving device.
  • the processor may receive the first image through a first stream, and may be configured to receive the second image through a second stream that is distinct from the first stream.
  • the processor may be configured to produce a picture-in-picture (PIP) image in which the first image (or the first processed image) and the second processed image are combined on a single screen, and then transmit it to the image receiving device.
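A minimal sketch of the PIP composition step the server performs: the second processed image is overlaid in a corner of the first image. Frames are modeled as 2D lists of pixel values purely for illustration; a real server would operate on decoded video frames.

```python
def compose_pip(main_frame, sub_frame, offset_row=0, offset_col=0):
    """Overlay sub_frame onto a copy of main_frame at the given offset."""
    out = [row[:] for row in main_frame]  # copy the main (first) image
    for r, row in enumerate(sub_frame):
        for c, px in enumerate(row):
            out[offset_row + r][offset_col + c] = px  # paste the sub image
    return out

main = [[0] * 4 for _ in range(4)]  # 4x4 first image, all zeros
sub = [[9] * 2 for _ in range(2)]   # 2x2 second processed image, all nines
pip = compose_pip(main, sub, offset_row=2, offset_col=2)  # bottom-right corner
```

Doing this composition on the server rather than on the transmitting phone is what frees the phone's limited resources, per the motivation stated earlier in the document.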
  • the processor may produce the second processed image as at least one of a highlight image, a slow-motion image, a panorama image, a screen-sharing image of the image transmission device, a chroma-key image, a style-transfer image, an image including a 3D object, an enlarged image, a shake-correction image, and an image-quality-improvement image.
  • the processor may be configured to provide the image receiving device with an additional-image list that includes the second processed image.
  • An image receiving apparatus includes a display and a processor connected to the display. The processor receives, from a server device, a first image or a first processed image produced from the first image and plays it on the display; while playing the first image, it receives from the server device a second processed image produced by the server device based on a second image and control information on the second image, and plays it on the display.
  • the second image may be an image captured in a processed-image mode, and the control information on the second image may include at least one of an identifier of a processing effect, information on the application section of the effect, information on the playback time of the image to which the effect is applied, and information on image quality. The second processed image may be an image produced by the server device based on the second image and the control information.
  • the processor may be configured to combine the second processed image and the first image and reproduce them as a picture-in-picture (PIP) image.
  • the server device may be a multi-access edge computing (MEC) server, and the processor may be configured to receive the first image through a first stream and to receive the second processed image through a second stream distinct from the first stream.
  • the second processed image may be at least one of a highlight image, a slow-motion image, a panorama image, a screen-sharing image of the image transmission device, a chroma-key image, a style-transfer image, an image including a 3D object, an enlarged image, a shake-correction image, and an image-quality-improvement image.
  • a streaming method and apparatus can be provided that broadcast a plurality of images in real time using a plurality of camera sources of an electronic device, and that allow other functions of the electronic device to be used without resource limitations even during real-time broadcasting.
  • a method and apparatus can be provided that offer various special-effect images (processed images) together through streaming, by enabling processing applied with a time difference within one image, or processing that takes a long time, even during real-time broadcasting.
  • FIG. 1 is a block diagram of an electronic device in a network environment, according to an embodiment.
  • FIG. 2 is a block diagram of an electronic device for supporting legacy network communication and 5G network communication according to an embodiment.
  • FIG. 3 shows an embodiment of the structure of the third antenna module described with reference to FIG. 2.
  • FIG. 4 is a cross-sectional view taken along line B-B' of the third antenna module 300a of FIG. 3.
  • FIG. 5 is a diagram illustrating an environment of the second network of FIG. 2 according to an embodiment.
  • FIG. 6 is a diagram illustrating a cover area of a server according to an embodiment.
  • FIG. 7 is a flowchart illustrating operations of a broadcast transmission device, a server, and a broadcast reception device according to an embodiment.
  • FIG. 8 is a flowchart illustrating operations between a broadcast transmission device, a server, and a broadcast reception device according to an embodiment.
  • FIG. 9 is a diagram illustrating a broadcast transmission screen of a broadcast transmission device and a broadcast reception screen of a broadcast reception device according to an embodiment.
  • FIG. 10 is a diagram illustrating a broadcast transmission screen of a broadcast transmission device and a broadcast reception screen of a broadcast reception device according to an embodiment.
  • FIG. 11 is a flowchart illustrating operations between a first broadcast transmission device, a second broadcast transmission device, a server, and a broadcast reception device according to an embodiment.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to exemplary embodiments.
  • the electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or with the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network).
  • the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • the electronic device 101 may include a processor 120, a memory 130, an input device 150, a sound output device 155, a display device 160, an audio module 170, a sensor module 176, an interface 177, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • at least one of these components (e.g., the connection terminal 178) may be omitted from the electronic device 101, or one or more other components may be added.
  • the processor 120 may, for example, execute software (e.g., a program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 120 may store commands or data received from other components (e.g., the sensor module 176 or the communication module 190) in the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the nonvolatile memory 134.
  • the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) and a secondary processor 123 (e.g., a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of or together with the main processor 121. Additionally or alternatively, the secondary processor 123 may be set to use less power than the main processor 121 or to be specialized for a designated function. The secondary processor 123 may be implemented separately from the main processor 121 or as part of it.
  • the secondary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display device 160, the sensor module 176, or the communication module 190), on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application-executing) state. According to an embodiment, the secondary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the memory 130 may store various data used by at least one component of the electronic device 101 (eg, the processor 120 or the sensor module 176).
  • the data may include, for example, software (eg, the program 140) and input data or output data for commands related thereto.
  • the memory 130 may include a volatile memory 132 or a nonvolatile memory 134.
  • the program 140 may be stored as software in the memory 130, and may include, for example, an operating system 142, middleware 144, or an application 146.
  • the input device 150 may receive a command or data to be used for a component of the electronic device 101 (eg, the processor 120) from an outside (eg, a user) of the electronic device 101.
  • the input device 150 may include, for example, a microphone, a mouse, a keyboard, or a digital pen (eg, a stylus pen).
  • the sound output device 155 may output a sound signal to the outside of the electronic device 101.
  • the sound output device 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback, and the receiver can be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from the speaker or as part of it.
  • the display device 160 may visually provide information to the outside of the electronic device 101 (eg, a user).
  • the display device 160 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the device.
  • the display device 160 may include touch circuitry set to sense a touch, or a sensor circuit (e.g., a pressure sensor) set to measure the strength of a force generated by a touch.
  • the audio module 170 may convert sound into an electrical signal or, conversely, convert an electrical signal into sound. According to an embodiment, the audio module 170 may acquire sound through the input device 150, or output sound through the sound output device 155 or through an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
  • the sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and generate an electrical signal or data value corresponding to the detected state.
  • the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that may be used for the electronic device 101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 102 ).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that a user can perceive through a tactile or motor sense.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture a still image and a video.
  • the camera module 180 may include a plurality of cameras.
  • the plurality of cameras may be located on the front side and/or the rear side of the electronic device 101.
  • the plurality of cameras may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101.
  • the power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101.
  • the battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • the communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and communication through the established channel.
  • the communication module 190 may operate independently of the processor 120 (e.g., an application processor) and may include one or more communication processors that support direct (e.g., wired) communication or wireless communication.
  • the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power-line communication module).
  • a corresponding communication module may communicate with the external electronic device 104 through the first network 198 (e.g., a short-range communication network such as Bluetooth, WiFi Direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a cellular network, the Internet, or a computer network such as a LAN or WAN).
  • These various types of communication modules may be integrated into one component (eg, a single chip), or may be implemented as a plurality of separate components (eg, multiple chips).
  • the wireless communication module 192 may identify and authenticate the electronic device 101 in a communication network such as the first network 198 or the second network 199, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • the antenna module 197 may transmit a signal or power to the outside (eg, an external electronic device) or receive from the outside.
  • the antenna module 197 may include one antenna including a conductor formed on a substrate (eg, a PCB) or a radiator formed of a conductive pattern.
  • the antenna module 197 may include a plurality of antennas. In this case, at least one antenna suitable for the communication method used in a communication network such as the first network 198 or the second network 199 may be selected from the plurality of antennas by, for example, the communication module 190.
  • the signal or power may be transmitted or received between the communication module 190 and an external electronic device through the at least one selected antenna.
  • other components (eg, an RFIC) other than the radiator may be additionally formed as part of the antenna module 197.
  • at least some of the components may be connected to each other through a communication method between peripheral devices (eg, a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (eg, commands or data) with each other.
  • commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
  • Each of the external electronic devices 102 and 104 may be a device of the same or different type as the electronic device 101.
  • all or part of the operations executed by the electronic device 101 may be executed by one or more of the external electronic devices 102, 104, or 108.
  • when the electronic device 101 needs to perform a function or service automatically or in response to a request from a user or another device, the electronic device 101 may, instead of executing the function or service by itself, request one or more external electronic devices to perform at least part of it.
  • One or more external electronic devices receiving the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit the execution result to the electronic device 101.
  • the electronic device 101 may process the result as it is or additionally and provide it as at least part of a response to the request.
  • to this end, cloud computing, distributed computing, or client-server computing technology may be used.
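The offloading flow described above (the device forwards a request, an external device executes it, and the device may post-process the returned result) can be sketched as follows. This is a minimal illustration only; the `ExternalDevice` and `offload` names are hypothetical and not part of the document.

```python
# Minimal sketch of the request offloading described above: when the device
# cannot (or chooses not to) run a function itself, it forwards the request
# to one or more external devices and post-processes the returned result.
# All names here (ExternalDevice, offload) are illustrative, not from the patent.

class ExternalDevice:
    """Stands in for external electronic device 102/104 or server 108."""
    def __init__(self, name):
        self.name = name

    def execute(self, function, payload):
        # The external device runs the requested function (or a part of it)
        # and returns the execution result.
        return function(payload)

def offload(request_payload, function, devices, postprocess=None):
    """Run `function` on the first external device that returns a result,
    optionally processing the result further on the requesting device."""
    for device in devices:
        result = device.execute(function, request_payload)
        if result is not None:
            # The device may use the result as-is, or process it additionally
            # before providing it as at least part of the response.
            return postprocess(result) if postprocess else result
    return None

result = offload([1, 2, 3], sum, [ExternalDevice("server-108")],
                 postprocess=lambda r: r * 2)
```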
  • Electronic devices may be devices of various types.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance.
  • phrases such as “at least one of A, B, or C” may include any one of the items listed together in the corresponding phrase, or all possible combinations thereof.
  • terms such as “first” and “second” may be used simply to distinguish a component from other corresponding components, and do not limit the components in other aspects (eg, importance or order).
  • when a certain (eg, first) component is referred to as “coupled” or “connected” to another (eg, second) component, with or without the terms “functionally” or “communicatively”, it means that the component can be connected to the other component directly (eg, by wire), wirelessly, or via a third component.
  • the term “module” used in this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, part, or circuit.
  • the module may be an integrally configured component or a minimum unit of the component or a part thereof that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • various embodiments of the present document may be implemented as software (eg, the program 140) including one or more instructions stored in a storage medium (eg, the internal memory 136 or the external memory 138) readable by a machine (eg, the electronic device 101).
  • for example, the processor (eg, the processor 120) of the device (eg, the electronic device 101) may call and execute at least one of the one or more instructions stored in the storage medium, which makes it possible for the device to be operated to perform at least one function according to the at least one instruction called.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • “non-transitory” only means that the storage medium is a tangible device and does not contain a signal (eg, electromagnetic waves); this term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where data is temporarily stored.
  • a method according to various embodiments disclosed in the present document may be provided by being included in a computer program product.
  • Computer program products can be traded between sellers and buyers as commodities.
  • the computer program product may be distributed in the form of a device-readable storage medium (eg, compact disc read only memory (CD-ROM)), or distributed (eg, downloaded or uploaded) online through an application store (eg, Play Store™) or directly between two user devices (eg, smartphones).
  • at least a portion of the computer program product may be temporarily stored or temporarily generated in a storage medium that can be read by a device such as a server of a manufacturer, a server of an application store, or a memory of a relay server.
  • each component (eg, module or program) of the above-described components may include a singular number or a plurality of entities.
  • one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • according to various embodiments, a plurality of components (eg, modules or programs) may be integrated into one component. In this case, the integrated component may perform one or more functions of each of the plurality of components in the same or similar manner as performed by the corresponding component prior to the integration.
  • operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
  • the wireless communication module 192 of the electronic device 101 may include a first communication processor 212, a second communication processor 214, a first radio frequency integrated circuit (RFIC) 222, a second RFIC 224, a third RFIC 226, a fourth RFIC 228, a first radio frequency front end (RFFE) 232, a second RFFE 234, a first antenna 242, a second antenna 244, and a third antenna 248.
  • the electronic device 101 may further include a processor 120 and a memory 130.
  • the second network 199 may include a first cellular network 292 and a second cellular network 294.
  • the electronic device 101 may further include at least one of the components illustrated in FIG. 1, and the second network 199 may further include at least one other network.
  • the first communication processor 212, the second communication processor 214, the first RFIC 222, the second RFIC 224, the fourth RFIC 228, the first RFFE 232, And the second RFFE 234 may form at least a part of the wireless communication module 192.
  • the fourth RFIC 228 may be omitted or included as a part of the third RFIC 226.
  • the first communication processor 212 may support establishment of a communication channel of a band to be used for wireless communication with the first cellular network 292 and communication of a legacy network through the established communication channel.
  • the first cellular network 292 may be a legacy network including a second generation (2G), a third generation (3G), a fourth generation (4G), and/or a long term evolution (LTE) network.
  • the second communication processor 214 may establish a communication channel corresponding to a designated band (eg, about 6 GHz to about 60 GHz) among bands to be used for wireless communication with the second cellular network 294, and support 5G network communication through the established communication channel.
  • the second cellular network 294 may be a 5G network defined by 3GPP. Additionally, according to an embodiment, the first communication processor 212 or the second communication processor 214 may establish a communication channel corresponding to another designated band (eg, about 6 GHz or less) among the bands to be used for wireless communication with the second cellular network 294, and support 5G network communication through the established communication channel. According to an embodiment, the first communication processor 212 and the second communication processor 214 may be implemented in a single chip or a single package. According to various embodiments, the first communication processor 212 or the second communication processor 214 may be formed in a single chip or a single package with the processor 120, the auxiliary processor 123 of FIG. 1, or the communication module 190.
  • when transmitting, the first RFIC 222 may convert a baseband signal generated by the first communication processor 212 into a radio frequency (RF) signal of about 700 MHz to about 3 GHz used for the first cellular network 292 (eg, a legacy network).
  • when receiving, an RF signal may be obtained from the first cellular network 292 (eg, a legacy network) through an antenna (eg, the first antenna 242) and preprocessed through an RFFE (eg, the first RFFE 232).
  • the first RFIC 222 may convert the preprocessed RF signal into a baseband signal so that it can be processed by the first communication processor 212.
  • when transmitting, the second RFIC 224 may convert a baseband signal generated by the first communication processor 212 or the second communication processor 214 into an RF signal of the Sub6 band (eg, about 6 GHz or less) used for the second cellular network 294 (eg, a 5G network) (hereinafter, a 5G Sub6 RF signal).
  • when receiving, a 5G Sub6 RF signal may be obtained from the second cellular network 294 (eg, a 5G network) through an antenna (eg, the second antenna 244) and preprocessed through an RFFE (eg, the second RFFE 234).
  • the second RFIC 224 may convert the preprocessed 5G Sub6 RF signal into a baseband signal so that it can be processed by a corresponding one of the first communication processor 212 or the second communication processor 214.
  • the third RFIC 226 may convert a baseband signal generated by the second communication processor 214 into an RF signal of the 5G Above6 band (eg, about 6 GHz to about 60 GHz) to be used for the second cellular network 294 (eg, a 5G network) (hereinafter, a 5G Above6 RF signal).
  • the 5G Above6 RF signal may be obtained from the second cellular network 294 (eg, 5G network) through an antenna (eg, the third antenna 248) and preprocessed through the third RFFE 236.
  • the third RFFE 236 may perform signal preprocessing using the phase converter 238.
  • the third RFIC 226 may convert the preprocessed 5G Above6 RF signal into a baseband signal to be processed by the second communication processor 214.
  • the third RFFE 236 may be formed as part of the third RFIC 226.
  • the electronic device 101 may include a fourth RFIC 228 separately or at least as a part of the third RFIC 226.
  • the fourth RFIC 228 may convert a baseband signal generated by the second communication processor 214 into an RF signal of an intermediate frequency band (eg, about 9 GHz to about 11 GHz) (hereinafter, an intermediate frequency (IF) signal), and transfer the IF signal to the third RFIC 226.
  • the third RFIC 226 may convert the IF signal into a 5G Above6 RF signal.
  • when receiving, the 5G Above6 RF signal may be received from the second cellular network 294 (eg, a 5G network) through an antenna (eg, the antenna 248) and converted into an IF signal by the third RFIC 226.
  • the fourth RFIC 228 may convert the IF signal into a baseband signal so that the second communication processor 214 can process it.
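The band-dependent signal paths described above for FIG. 2 can be summarized as a small routing function. The document describes hardware paths, not software, so the sketch below is only an illustration of the selection logic; the function name and path labels are assumptions.

```python
# Sketch of the band-based routing described for FIG. 2: a baseband signal is
# handled by a different RFIC chain depending on the target band. Illustrative
# only; the patent describes hardware signal paths, not a software dispatcher.

def select_rf_path(freq_ghz, use_if_stage=False):
    """Return the (illustrative) RFIC chain handling a carrier of freq_ghz GHz."""
    if freq_ghz <= 3.0:
        return ["first RFIC 222"]              # legacy network, ~0.7-3 GHz
    if freq_ghz <= 6.0:
        return ["second RFIC 224"]             # 5G Sub6, <= ~6 GHz
    if freq_ghz <= 60.0:
        if use_if_stage:
            # Optional intermediate-frequency stage (~9-11 GHz IF signal)
            # provided by the fourth RFIC 228 before the third RFIC 226.
            return ["fourth RFIC 228 (IF)", "third RFIC 226"]
        return ["third RFIC 226"]              # 5G Above6, ~6-60 GHz
    raise ValueError("band not covered in this sketch")

legacy_path = select_rf_path(0.9)
mmwave_path = select_rf_path(28.0, use_if_stage=True)
```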
  • the first RFIC 222 and the second RFIC 224 may be implemented as a single chip or at least part of a single package.
  • the first RFFE 232 and the second RFFE 234 may be implemented as a single chip or at least part of a single package.
  • at least one of the first antenna 242 and the second antenna 244 may be omitted or combined with another antenna module to process RF signals of a plurality of corresponding bands.
  • the third RFIC 226 and the third antenna 248 may be disposed on the same substrate to form the third antenna module 246.
  • the wireless communication module 192 or the processor 120 may be disposed on a first substrate (eg, a main PCB).
  • according to an embodiment, the third RFIC 226 may be disposed in a partial area (eg, a lower surface) of a second substrate (eg, a sub PCB) separate from the first substrate, and the third antenna 248 may be disposed in another partial area (eg, an upper surface), so that the third antenna module 246 may be formed.
  • the third antenna 248 may include, for example, an antenna array that can be used for beamforming.
  • the electronic device 101 may improve the quality or speed of communication with the second cellular network 294 (eg, a 5G network).
  • the second cellular network 294 may be operated independently from the first cellular network 292 (eg, a legacy network) (eg, Stand-Alone (SA)), or connected to it and operated (eg, Non-Stand-Alone (NSA)).
  • a 5G network may have only an access network (eg, 5G radio access network (RAN) or next generation RAN (NG RAN)) and no core network (eg, next generation core (NGC)).
  • the electronic device 101 may access an external network (eg, the Internet) under the control of the core network (eg, evolved packet core (EPC)) of the legacy network.
  • protocol information (eg, LTE protocol information) for communication with a legacy network or protocol information (eg, New Radio (NR) protocol information) for communication with a 5G network may be stored in the memory 230 and accessed by other components (eg, the processor 120, the first communication processor 212, or the second communication processor 214).
  • hereinafter, a module including components used for wireless communication with the first cellular network 292, which is a legacy network including a second generation (2G), third generation (3G), fourth generation (4G), and/or long term evolution (LTE) network, may be referred to as a legacy module.
  • the legacy module may include the first RFIC 222, the first RFFE 232, and the first antenna 242.
  • a module including components used for wireless communication with the second cellular network 294, which is a 5G network, may be referred to as a 5G module.
  • the 5G module may be a first 5G module or a second 5G module.
  • the first 5G module may include a second RFIC 224, a second RFFE 234 and a second antenna 244.
  • the second 5G module may be a third antenna module 246.
  • 300a of FIG. 3 is a perspective view of the third antenna module 246 viewed from one side
  • 300b of FIG. 3 is a perspective view of the third antenna module 246 viewed from the other side
  • 300c of FIG. 3 is a cross-sectional view of the third antenna module 246 taken along line A-A'.
  • the third antenna module 246 may include a printed circuit board 310, an antenna array 330, a radio frequency integrated circuit (RFIC) 352, a power management integrated circuit (PMIC) 354, and a module interface (not shown).
  • the third antenna module 246 may further include a shield member 390.
  • at least one of the aforementioned parts may be omitted, or at least two of the parts may be integrally formed.
  • the printed circuit board 310 may include a plurality of conductive layers, and a plurality of non-conductive layers alternately stacked with the conductive layers.
  • the printed circuit board 310 may provide electrical connection between the printed circuit board 310 and/or various electronic components disposed outside by using wires and conductive vias formed on the conductive layer.
  • the antenna array 330 may include a plurality of antenna elements 332, 334, 336, or 338 arranged to form a directional beam.
  • the antenna elements may be formed on the first surface of the printed circuit board 310 as shown.
  • the antenna array 330 may be formed inside the printed circuit board 310.
  • the antenna array 330 may include a plurality of antenna arrays (eg, a dipole antenna array and/or a patch antenna array) of the same or different shape or type.
  • the RFIC 352 (eg, the third RFIC 226 of FIG. 2) may be disposed in another area (eg, a second surface opposite to the first surface) of the printed circuit board 310, spaced apart from the antenna array 330.
  • the RFIC 352 may be configured to process a signal of a selected frequency band transmitted/received through the antenna array 330.
  • the RFIC 352 may convert a baseband signal obtained from a communication processor (not shown) into an RF signal of a designated band during transmission.
  • the RFIC 352 may convert an RF signal received through the antenna array 330 into a baseband signal and transmit it to a communication processor.
  • according to an embodiment, when transmitting, the RFIC 352 may up-convert an IF signal (eg, about 9 GHz to about 11 GHz) obtained from an intermediate frequency integrated circuit (IFIC) (eg, the fourth RFIC 228 of FIG. 2) into an RF signal of the selected band.
  • the RFIC 352 may down-convert the RF signal obtained through the antenna array 330, convert it into an IF signal, and transmit it to the IFIC.
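The up-conversion and down-conversion steps above amount to shifting a signal between the IF band (about 9 GHz to about 11 GHz) and the selected RF band. A toy sketch of the frequency arithmetic follows; the function names are illustrative, and the real conversion is performed by mixer hardware in the RFIC, not software.

```python
# Toy sketch of the frequency translation the RFIC performs: an IF signal is
# up-converted to RF by adding the local-oscillator (LO) frequency, and an RF
# signal is down-converted back to IF by subtracting it. Purely illustrative;
# real mixers also produce image frequencies that must be filtered out.

def up_convert(f_if_ghz, f_lo_ghz):
    """IF -> RF: f_RF = f_IF + f_LO (simplified, low-side injection)."""
    return f_if_ghz + f_lo_ghz

def down_convert(f_rf_ghz, f_lo_ghz):
    """RF -> IF: f_IF = f_RF - f_LO."""
    return f_rf_ghz - f_lo_ghz

# Example: a ~10 GHz IF signal shifted into the Above6 band (~28 GHz) and back
f_rf = up_convert(10.0, 18.0)
f_if = down_convert(f_rf, 18.0)
```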
  • the PMIC 354 may be disposed in another partial area (eg, the second surface) of the printed circuit board 310, spaced apart from the antenna array.
  • the PMIC 354 may receive a voltage from a main PCB (not shown) and provide power required for various components (eg, RFIC 352) on the antenna module.
  • the shielding member 390 may be disposed on a part (eg, the second surface) of the printed circuit board 310 to electromagnetically shield at least one of the RFIC 352 and the PMIC 354.
  • the shielding member 390 may include a shield can.
  • the third antenna module 246 may be electrically connected to another printed circuit board (eg, a main circuit board) through a module interface.
  • the module interface may include a connecting member, for example, a coaxial cable connector, a board to board connector, an interposer, or a flexible printed circuit board (FPCB).
  • FIG. 4 shows a cross-section taken along line B-B' of the third antenna module 246 of 300a of FIG. 3.
  • the printed circuit board 310 of the embodiment illustrated in FIG. 3 may include an antenna layer 411 and a network layer 413.
  • the antenna layer 411 may include at least one dielectric layer 437-1, and an antenna element 336 and/or a power feeding unit 425 formed on or inside the outer surface of the dielectric layer.
  • the feeding part 425 may include a feeding point 427 and/or a feeding line 429.
  • the network layer 413 includes at least one dielectric layer 437-2, at least one ground layer 433 formed on or inside the outer surface of the dielectric layer, at least one conductive via 435, and a transmission line 423, and/or a signal line 429 may be included.
  • according to an embodiment, the third RFIC 226 may be electrically connected to the network layer 413 through, for example, first and second connection parts (eg, solder bumps) 440-1 and 440-2. In other embodiments, various connection structures (eg, soldering or a ball grid array (BGA)) may be used instead of the connection parts.
  • the third RFIC 226 may be electrically connected to the antenna element 336 through the first connection part 440-1, the transmission line 423, and the feeding part 425.
  • the third RFIC 226 may also be electrically connected to the ground layer 433 through the second connection portion 440-2 and the conductive via 435.
  • the third RFIC 226 may also be electrically connected to the module interface mentioned above through the signal line 429.
  • FIG. 5 is a diagram illustrating an environment of the second network of FIG. 2 according to an embodiment.
  • FIG. 6 is a diagram illustrating a cover area of a server according to an embodiment.
  • the electronic device 500a may access the Internet (data network) through a mobile network (cellular network).
  • the electronic device 500a may be the electronic device 101 described above with reference to FIGS. 1 to 4.
  • the mobile network is composed of an access network and a core network.
  • the access network may include a plurality of base stations each having a service provision area.
  • the electronic device 500a may be connected to the access network by wireless communication, and the access network may be connected to the data network through the core network by wired communication.
  • the server 500b may be an edge computing server.
  • the edge computing server may be a multi-access edge computing (MEC) server.
  • the MEC network including the MEC server is a distributed cloud network connected to and deployed in a mobile network, and may provide cloud computing resources (eg, CPU, storage, software) to third-party service providers.
  • the MEC server providing the cloud computing resource may be located in a location geographically close to the electronic device 500a in the communication network, for example, in or near the access network.
  • the server 500b may be a server based on the OpenFog Consortium or EdgeX Foundry.
  • the client applications 501 and 502 of the electronic device 500a may separately access two server applications.
  • hereinafter, an application running in the electronic device 500a is referred to as a client application, an application running in the server 500b is referred to as a server application (a MEC application when the server 500b is a MEC server), and a server application running on a server in an external data network is referred to as a remote application.
  • the first client application 501 inside the electronic device 500a may use various content services by accessing a remote application of a server located in an external data network (eg, the Internet) through the mobile network.
  • the second client application 502 of the electronic device 500a, which requires low latency, may directly connect to a server application and/or service of the server 500b deployed (distributed) in a base station providing service to the electronic device 500a, and may perform, for example, Internet communication (eg, HTTP communication and/or socket communication).
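The direct client-to-server-application exchange described above (eg, socket communication) can be sketched with a minimal echo round trip. The server below is only a stand-in for a server application deployed near the base station; the names and the "ack" reply format are hypothetical, not from the document.

```python
# Minimal sketch of a low-latency client application talking to a nearby
# server application over socket communication, as described above. The echo
# server is a stand-in for the MEC server application; names are illustrative.

import socket
import threading

def run_edge_server(host="127.0.0.1", port=0):
    """Tiny stand-in for a server application deployed near the base station."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))      # port 0: let the OS pick a free port
    srv.listen(1)

    def serve():
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(1024)
            conn.sendall(b"ack:" + data)   # server application response
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return srv.getsockname()[1]            # actual port chosen by the OS

def client_request(port, payload):
    """Client application sending a request directly to the nearby server."""
    with socket.create_connection(("127.0.0.1", port)) as sock:
        sock.sendall(payload)
        return sock.recv(1024)

port = run_edge_server()
reply = client_request(port, b"frame-0001")
```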
  • the server 600b may include a processor 611, and the processor 611 may include a video manager 612 and a codec 613.
  • the server 600b of FIG. 6 may be an edge computing server.
  • the edge computing server may be a multi-access edge computing (MEC) server.
  • the server 600b may be a server based on the OpenFog Consortium or EdgeX Foundry.
  • the processor 611 may process the image received from the broadcast transmission device 600a into an image suitable for broadcasting.
  • specifically, the video manager 612 included in the processor 611 may process the image received from the broadcast transmission device 600a into an image suitable for broadcasting.
  • the video manager 612 may synthesize or re-encode a video suitable for broadcasting based on a policy of a streaming server such as YouTube and a policy of the video manager 612.
  • the video manager 612 may manage the image so that the image transmitted from the broadcast transmission device 600a can be processed into an image to be provided by the processor 611 to the broadcast reception device 600c.
  • the broadcast transmission device 600a may transmit control information such as image quality, special effect (processing effect), duration, and playback time along with the actual image, and such control information may be received and managed by the video manager 612.
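A minimal sketch of such a control-information record follows. The dataclass and its field names are assumptions for illustration; the document only lists image quality, special effect, duration, and playback time as examples of what the control information may carry.

```python
# Sketch of the control information transmitted alongside the actual image:
# processing effect, image quality, duration, and playback time. The class
# and field names are illustrative, not defined by the patent.

from dataclasses import dataclass, asdict

@dataclass
class ControlInfo:
    effect: str            # processing effect, e.g. "slow_motion"
    quality: str           # requested image quality, e.g. "1080p"
    duration_s: float      # duration of the effect section
    playback_at_s: float   # time at which the processed image is played back

info = ControlInfo(effect="slow_motion", quality="1080p",
                   duration_s=5.0, playback_at_s=42.0)

# The video manager on the server side would receive and manage such a record
# together with the uploaded image stream; here we just serialize it.
received = asdict(info)
```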
  • the video manager 612 may be separate hardware controlled by the processor 611.
  • the video manager 612 may be configured with a server application (eg, MEC application) and/or a service (eg, MEC service).
  • the video manager 612 (eg, a video application included in the video manager 612) may perform data communication with the broadcast transmission devices 600a and 600f and/or the broadcast receiving device 600c through Internet communication (eg, HTTP communication and/or socket communication).
  • the processor 611 may synthesize, for example, a front camera image and/or a rear camera image received from the broadcast transmission device 600a into an image suitable for broadcasting. In addition, an operation of processing a separate image may be performed based on control information transmitted from the broadcast transmission device 600a. For example, the processor 611 may generate a slow motion image, a GIF image, a fast-forward image, an image-quality-improved image of an enlarged portion, and a best shot image.
  • the processor 611 may encode the generated image for broadcasting using the codec 613.
  • the processor 611 may generate images of various quality using a plurality of codecs.
  • the codec 613 may be separate hardware under the control of the processor 611.
  • the processor may transmit the generated image according to the situation of the connected broadcast receiving device 600c.
  • the processor (for example, the video manager 612) may provide a communication status of the base station to the broadcast reception device 600c in advance to quickly control the quality of the transmitted image.
  • the broadcast transmission device 600a may acquire a plurality of images through the plurality of cameras 601.
  • the broadcast transmission device 600a may have the same configuration as the electronic device 101 described with reference to FIGS. 1 to 4.
  • the broadcast transmission device 600a may transmit each of a plurality of images to the server 600b through a communication module.
  • the server 600b may receive a plurality of images from a plurality of broadcast transmission devices 600a and 600f.
  • the broadcast receiving device 600c in the area of the server 600b may receive and display an image transmitted from the server 600b.
  • the broadcast reception device 600c may have the same configuration as the electronic device 101 described with reference to FIGS. 1 to 4.
  • the broadcast receiving device 600e outside the area of the server 600b may receive and display the image transmitted from the server 600b through a broadcasting service server 600d outside the area of the server 600b.
  • FIG. 7 is a flowchart illustrating operations of an image transmission device (broadcast transmission device), a server, and an image reception device (broadcast reception device) according to an exemplary embodiment.
  • the operation of the image transmission device may be performed through the processor of the image transmission device
  • the operation of the server may be performed through the processor of the server
  • the operation of the image reception device may be performed through the processor of the image reception device.
  • the server of FIG. 7 may be an edge computing server.
  • the edge computing server may be a MEC server.
  • the server may be a server based on the OpenFog Consortium or EdgeX Foundry.
  • the image transmission device 700a may have the same configuration as the electronic device 101 described with reference to FIGS. 1 to 4.
  • the image transmission device 700a may connect to a server for broadcasting (701).
  • the image transmission device 700a may connect to a server for broadcasting when receiving an input for starting broadcasting from a user who is a broadcasting subject.
  • the image transmission device 700a may prepare a camera necessary for broadcasting to capture an image.
  • the camera of the broadcast transmission device 700a may start capturing an image.
  • the broadcast transmission device 700a may start capturing an image through a front camera located on the front side of the broadcast transmission device 700a, for example, a surface on which the display screen is located.
  • camera shooting of the image transmission device 700a may be omitted.
  • the image transmission device 700a may transmit (upload) the first image to the server 700b (703).
  • the first image may be an image stored in the image transmission device 700a or an image captured by one or more cameras included in the image transmission device 700a.
  • the image transmission device 700a may upload a captured image to a server in real time.
  • the image transmission device 700a may upload an image currently being captured to the server 700b in real time through a front camera located in front of the image transmission device 700a.
  • the image transmission apparatus 700a may transmit each image as a separate stream.
  • the image transmission device 700a may receive an input for a request for a processed image (special image) from a user who is a broadcasting subject (704).
  • the processed video may include a slow motion video, a highlight video, a panorama video, a screen sharing video of the broadcast transmission device 700a, a chroma-key video, a style transfer video (a video converted to the style of a specific work or painter), a video including a 3D object (a video in which at least one of the objects of the video is converted into 3D), an enlarged video, a shake-corrected video, and an image-quality-improved video, and the user may select at least one processed video to be provided.
  • the image transmission device 700a may capture an image for a processed image in order to provide the selected processed image (705).
  • photographing of an image for a processed image may be referred to as a processed image mode photographing.
  • for example, an image for a slow motion video may be captured.
  • the object currently being photographed may be photographed as a slow motion image using the front camera.
  • the video for slow motion may be a video in which a larger number of frames are captured during the same time period than a normal video.
  • alternatively, another object may be photographed as a video for slow motion using another camera (eg, a rear camera) of the image transmission device 700a.
  • the image transmission device 700a may capture a screen image of the image transmission device 700a and use it as an image for producing a processed image.
  • alternatively, without performing separate shooting for a special image, a part marked by the user in the image currently being uploaded may be used as a video for processed image production.
  • the image transmission device 700a may upload control information required for production of a processed image to the server 700b (706).
  • the image transmission device 700a may transmit the image for producing the processed image (second image) to the server 700b together with the first image; when the second image is included in the first image, only information on the section corresponding to the second image (e.g., the part marked by the user during shooting) may be transmitted to the server 700b, and transmission of the second image itself may be omitted.
  • the image transmission device 700a may transmit the copied image of the section corresponding to the second image to the server 700b.
  • the control information required to produce a processed image may include an identifier of a processing effect, a temporal section to which the processing effect is applied or a spatial section on an image, a time at which the processing effect image is reproduced, and information on image quality.
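As an illustrative, non-limiting sketch, the control information enumerated above can be modeled as a small record. The field names below are assumptions introduced for the example, not terms defined in the disclosure:

```python
from dataclasses import dataclass, asdict
from typing import Optional, Tuple

@dataclass
class ProcessingControlInfo:
    # Field names are illustrative; the description only enumerates the
    # kinds of information carried: effect identifier, temporal/spatial
    # section, reproduction time, and image quality.
    effect_id: str                                # identifier of the processing effect
    time_section: Tuple[float, float]             # temporal section of the first image (s)
    region: Optional[Tuple[int, int, int, int]]   # optional spatial section (x, y, w, h)
    playback_time: float                          # time at which the processed image is reproduced
    quality: str                                  # requested image quality

info = ProcessingControlInfo("slow_motion", (12.0, 15.0), None, 20.0, "1080p")
payload = asdict(info)  # plain dict, ready to serialize and upload alongside the stream
```

A real device would serialize `payload` (e.g., as JSON) and send it with the first image stream; the serialization format is likewise an assumption.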
  • when transmitting the second image, if the amount of data that must be transmitted in real time is not large, the image transmission device 700a may transmit the second image through the legacy module of the wireless communication module 192 of FIG. 2, which is used for wireless communication with the legacy network. Alternatively, when the amount of data that must be transmitted in real time is large, the image transmission device 700a may transmit the second image through the 5G module of the wireless communication module 192 of FIG. 2, which is used for wireless communication with the 5G network. Since the 5G module may not always be running, the image transmission device 700a may turn the 5G module on before using it.
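The choice described above between the legacy module and the 5G module can be sketched as a simple policy. The function name and the size threshold are assumptions made for illustration:

```python
def choose_transport(upload_mb: float, realtime: bool, threshold_mb: float = 100.0) -> str:
    # Small uploads go through the legacy module; large uploads that must
    # arrive in real time use the 5G module, which the device turns on
    # first since it may not always be running. The threshold is an
    # illustrative assumption, not a value from the disclosure.
    if realtime and upload_mb > threshold_mb:
        return "5g"
    return "legacy"

assert choose_transport(10.0, realtime=True) == "legacy"   # small second image
assert choose_transport(500.0, realtime=True) == "5g"      # large real-time upload
```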
  • the server 700b may receive the first image transmitted from the image transmission device 700a (711).
  • the server 700b may transmit the image received from the image transmission device 700a to the image reception device 700c in real time.
  • the server 700b may check whether there is control information for producing a processed image. When there is control information for the processed image, the server 700b may process the second image based on the control information (713). As described above for the operation of the image transmission device 700a, the second image may be included in the first image and transmitted, or may be transmitted separately to the server 700b.
  • when the server 700b receives a video for slow motion (second video) and information on the slow motion video (control information), the server 700b may produce a slow motion video by controlling the number of frames played per unit time of the video for slow motion, based on the information on the slow motion video.
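The frame-count adjustment described above (frames captured at a high rate played back at a normal rate) can be sketched as a retiming calculation. The function below is an illustrative assumption, not the disclosed implementation:

```python
def retime_for_slow_motion(n_frames: int, capture_fps: float, playback_fps: float):
    # Assign presentation timestamps at the (lower) playback rate to frames
    # captured at the higher rate: 240 fps footage played at 30 fps turns
    # 1 s of capture into 8 s of slow motion.
    factor = capture_fps / playback_fps                 # slowdown factor
    timestamps = [i / playback_fps for i in range(n_frames)]
    duration = n_frames / playback_fps                  # stretched playback duration (s)
    return timestamps, factor, duration

# 240 frames captured in one second, replayed at 30 fps
ts, factor, duration = retime_for_slow_motion(240, capture_fps=240.0, playback_fps=30.0)
```

A production server would instead rewrite the container's frame timestamps (or duplicate/drop frames), but the arithmetic is the same.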
  • the server 700b may cut the image based on the information on the received processed image to produce a highlight image, or may produce, based on the information on the received processed image, a panorama image, a chroma-key image, a style transfer image, an image including a 3D object (an image obtained by converting at least one of the objects of the image into 3D), a shake-corrected image, or a quality-improved image.
  • the server 700b may transmit the produced processed image to the image receiving device 700c.
  • the server 700b may transmit an additional image list including the produced processed image to the image receiving device 700c, and may transmit, to the image receiving device 700c, the image that the image receiving device 700c selects from the additional image list and requests from the server 700b.
  • the image receiving apparatus 700c may have the same configuration as the electronic device 101 described with reference to FIGS. 1 to 4.
  • the image receiving device 700c may connect to a server for viewing a broadcast (721 ). According to an embodiment, when receiving an input for viewing a broadcast, the image receiving device 700c may connect to a server for viewing a broadcast.
  • the image receiving device 700c may receive the first image transmitted by the image transmitting device 700a from the server 700b (722 ). Alternatively, according to an embodiment, the image receiving device 700c may receive an image processed from the first image transmitted by the image transmitting device 700a from the server 700b (722 ).
  • the image receiving device 700c may receive an additional image list when the server 700b transmits it (723).
  • the additional image list may include processed images processed by the server 700b. However, this step may be omitted depending on the embodiment.
  • the video receiving apparatus 700c may receive an input for selection of an additional video that the viewer wants to watch (operation 724). However, this step may be omitted depending on the embodiment.
  • the image receiving device 700c may request a processed image from the server 700b (725). According to an embodiment, when receiving an input for selecting an additional image, the image receiving apparatus 700c may request the selected additional image from the server 700b. When requesting a processed image (additional image), the image receiving device 700c may turn on a 5G module used for wireless communication with a 5G network in the wireless communication module 192 of FIG. 2.
  • the image receiving device 700c may reproduce the processed image received from the server 700b (726).
  • the image receiving apparatus 700c may receive a processed image through a 5G module used for wireless communication with a 5G network in the wireless communication module 192 of FIG. 2.
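The receiver-side sequence in operations 721 to 726 can be sketched as a small client loop against a stand-in server object. All names below are assumptions made for the example, not an interface defined in the disclosure:

```python
class StubServer:
    """Stand-in for server 700b; method names are illustrative assumptions."""
    def first_image_stream(self):
        return ["frame0", "frame1"]           # first image relayed in real time (722)
    def additional_image_list(self):
        return ["slow_motion", "enlarged"]    # processed images on offer (723)
    def fetch_processed(self, name):
        return "processed:" + name            # requested additional image (725)

def watch_broadcast(server, choose):
    frames = list(server.first_image_stream())    # receive the first image (722)
    extras = server.additional_image_list()       # receive the additional image list (723)
    selected = choose(extras)                     # viewer selects an additional image (724)
    processed = server.fetch_processed(selected)  # request it, e.g. over 5G (725), then play (726)
    return frames, processed

frames, processed = watch_broadcast(StubServer(), lambda options: options[0])
```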
  • FIG. 8 is a flowchart illustrating an operation between an image transmission device (broadcast transmission device), a server, and an image reception device (broadcast reception device) according to an embodiment.
  • FIG. 9 is a diagram illustrating a broadcast transmission screen of a broadcast transmission device and a broadcast reception screen of a broadcast reception device according to an embodiment.
  • the operation of the broadcast transmission device may be performed through the processor of the broadcast transmission device, and the operation of the server may be performed through the processor of the server.
  • the operation of the broadcast receiving device may be performed through the processor of the broadcast receiving device.
  • the servers of FIGS. 8 and 9 may be edge computing servers.
  • the edge computing server may be a MEC server.
  • the server may be a server conforming to the OpenFog Consortium or EdgeX Foundry edge computing specifications.
  • the broadcast transmission device 800a may receive an input for starting a broadcast from a user who is a broadcast subject (801 ).
  • when the broadcast transmission device 800a receives an input for starting broadcasting from a user who is the broadcasting subject, it may upload the first camera image 903 captured by the first camera 901 included in the broadcast transmission device 800a to the server 800b in real time, and the server 800b may broadcast the received first camera image 903 to the broadcast reception device 800c in real time (802).
  • the broadcast transmission device 800a and the broadcast reception device 800c may have the same configuration as the electronic device 101 described with reference to FIGS. 1 to 4.
  • the broadcast reception device 800c may display the first camera image 921 received from the server 800b.
  • the server 800b may process the first camera image 903 received from the broadcast transmission device 800a and broadcast it to the broadcast reception device 800c.
  • the server 800b may broadcast the first camera image 903 received from the broadcast transmission device 800a to the broadcast reception device 800c after performing processing such as shake correction and image quality improvement.
  • when the broadcast receiving device 800c requests a high-definition image, the first camera image 903, which has been transmitted over a legacy session, may be switched to a 5G session and transmitted.
  • the broadcast reception device 800c may turn on the 5G module of the wireless communication module 192 of FIG. 2, which is used for wireless communication with the 5G network, in order to use it.
  • the broadcast transmission device 800a may receive an input for slow motion mode shooting from a user who is a broadcast subject (803).
  • the broadcast transmission device 800a photographs a second object in a slow motion mode using the second camera 902 included in the broadcast transmission device 800a and uploads the slow motion mode original image 905 to the server 800b. Can do it (804).
  • the second camera 902 may be located on a rear surface of a surface on which the display screen is located in the broadcast transmission device 800a.
  • the slow motion mode original image 905 may be an image in which a larger number of frames are captured during the same time period than that of a general image.
  • the broadcast transmission device 800a may transmit the slow motion mode original image 905 to the server 800b (804) and transmit information about the slow motion mode original image 905 together.
  • the information on the slow motion mode original video 905 may include the identifier of the special effect (processing effect), the section to which the special effect is applied, the time at which the special-effect video is played, and control information on the image quality.
  • the slow motion mode original image 905 may be transmitted through the legacy module of the wireless communication module 192 of FIG. 2, which is used for wireless communication with the legacy network.
  • alternatively, the slow motion mode original image 905 may be transmitted through the 5G module of the wireless communication module 192 of FIG. 2, which is used for wireless communication with the 5G network.
  • when the slow motion mode original image 905 is not temporally long, it may be transmitted through the legacy module as a separate stream from the first camera image 903.
  • the broadcast transmission device 800a may transmit the first camera image 903 as a first stream, and the slow motion mode original image 905 may be transmitted as a second stream different from the first stream.
  • the first camera image 903 may be transmitted through the legacy module and the slow motion mode original image 905 may be transmitted through the 5G module.
  • the broadcast transmission device 800a may transmit, to the server 800b, only control information including information on the temporal section of the first camera image 903 captured in the slow motion mode, and may omit transmission of the slow motion mode original image 905 itself.
  • the control information may include information on special effects, application period, playback time, and image quality.
  • the broadcast transmission device 800a may transmit, to the server 800b, a copied image of the temporal section of the first camera image 903 captured in the slow motion mode as the slow motion mode original image 905.
  • the server 800b may produce a slow motion image based on information on the slow motion mode original image 905 and the slow motion mode original image 905 received from the broadcast transmission device 800a (805 ). For example, the server 800b may produce a slow motion image by adjusting the number of frames played per time of the slow motion mode original image 905 based on information on the slow motion mode original image 905.
  • the broadcast transmission device 800a may receive an input for the screen addition 904 from a user who is a broadcast subject (806). When the broadcast transmission device 800a receives an input for the screen addition 904 from a user who is a broadcast subject, the broadcast transmission device 800a may transmit an image processing request 807 to the server 800b.
  • the image processing request 807 may transmit information for adding a screen and/or an image for adding a screen together.
  • the information on adding a screen may include control information on a special effect, an application section, a playback time, and image quality.
  • the server 800b may process an image based on the information on adding a screen. For example, if the information on adding a screen is information on an enlarged image in which a part of the screen of the first camera image 903 is enlarged, the server 800b may produce an enlarged image by enlarging a part of the screen of the first camera image 903. When producing the enlarged image, the server may also apply processing to improve the image quality. FIG. 9 shows the image of the added screen 904 as an enlarged image of a part of the screen of the first camera image 903, but the image of the added screen 904 may instead be an image captured using an internal or external third camera of the broadcast transmission device 800a.
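The enlarged-image production described above (enlarging a part of the screen of the first camera image) can be sketched as a crop followed by nearest-neighbour upscaling. This is a minimal illustration over a toy frame representation, not the server's actual processing:

```python
def enlarge_region(frame, x, y, w, h, scale=2):
    # frame is a list of pixel rows; crop the (x, y, w, h) region and
    # replicate each pixel scale x scale times (nearest-neighbour zoom).
    crop = [row[x:x + w] for row in frame[y:y + h]]
    out = []
    for row in crop:
        wide = [px for px in row for _ in range(scale)]  # widen the row
        out.extend([wide[:] for _ in range(scale)])      # duplicate it vertically
    return out

# 4x4 toy frame whose pixels record their own (row, col) coordinates
frame = [[(r, c) for c in range(4)] for r in range(4)]
zoom = enlarge_region(frame, x=1, y=1, w=2, h=2, scale=2)  # 2x2 patch -> 4x4
```

A real server would use a proper resampling filter (bilinear, Lanczos) plus the quality-improvement step; nearest-neighbour is used here only to keep the sketch self-contained.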
  • the server 800b may transmit an additional video list including the produced slow motion video and the produced enlarged video to the broadcast receiving device 800c (808).
  • the broadcast receiving device 800c receiving the additional image list may display the additional image list as a plurality of thumbnails 922 and 923.
  • although the additional image list is shown as a plurality of thumbnails 922 and 923, the additional image list may instead be displayed as separate objects such as icons or text.
  • the broadcast reception device 800c may receive a picture-in-picture (PIP) video request input from a viewer (809).
  • the broadcast reception device 800c may receive, from a viewer, an input for selecting an image to be played back as a picture in picture (PIP) from among the plurality of thumbnails 922 and 923.
  • the broadcast reception device 800c may request the selected image from the server 800b (810).
  • the server 800b may transmit the requested image to the broadcast receiving device 800c (811).
  • the broadcast reception device 800c may turn on the 5G module of the wireless communication module 192 of FIG. 2, which is used for wireless communication with the 5G network, and receive the video through the 5G module.
  • the broadcast reception device 800c may receive an image from the server 800b and display it together with the first camera image 921.
  • the broadcast transmission device 800a may request a screen change (812).
  • the broadcast transmission device 800a may transmit, to the server 800b, an image processing request including an image for a screen to be changed and/or information on an image for a screen to be changed (813 ).
  • the information on the image for the screen to be changed may include control information on a special effect, an application section, a playback time, and image quality.
  • the server 800b may receive an image for a screen to be changed and/or information on an image for a screen to be changed, and may produce a second camera image that is an image for the screen to be changed.
  • based on the information on the image for the screen to be changed, the server 800b may produce the second camera image through processing into a slow motion image, a highlight image, a screen-sharing image of the broadcast transmission device 800a, a chroma-key image, a style transfer image (an image converted to the style of a specific work or painter), an image including a 3D object (an image obtained by converting at least one of the objects of the image into 3D), an enlarged image, a shake-corrected image, or a quality-improved image.
  • the server 800b may transmit the produced second camera image to the broadcast receiving device 800c (814).
  • the broadcast receiving device 800c may receive the second camera image and display it in place of the first camera image 921.
  • when the broadcast receiving device 800c is outside the area of the server 800b, the broadcast receiving device 800c may receive and display the image transmitted from the server 800b through a broadcast service server (broadcasting service server) 800e outside the area of the server 800b.
  • FIGS. 8 and 9 illustrate operations of electronic devices according to an exemplary embodiment, and the present invention is not limited thereto, and operations of FIGS. 8 and 9 may be omitted or changed.
  • the additional image is described as a slow motion image and an enlarged image, but the additional image is not limited thereto, and various special effect images may be included, and the number of additional images may be one or more.
  • the screen change operation of FIG. 8 may be omitted according to embodiments.
  • when a personal broadcast provided by combining a plurality of camera sources is performed, a broadcast image may be processed or produced, based on control information on the broadcast image, in an interworking server (e.g., a MEC server). Since the electronic device transmits the broadcast image to the server and the server performs the post-processing of the video, the electronic device can utilize other camera sources even during real-time broadcasting, unlike the conventional method of processing the broadcast image in the electronic device itself, and can use other functions of the electronic device without resource limitations.
  • a general server may be used according to the embodiment, but when a MEC server located close to an electronic device is used, it is easy to process a large-capacity image in real time, thereby improving the quality of broadcasting.
  • the video manager of the MEC server can provide the communication status of the base station to the electronic device in advance to control the quality of the transmitted broadcast image faster than that of the existing server.
  • FIG. 10 is a diagram illustrating a broadcast transmission screen of an image transmission device (broadcast transmission device) and a broadcast reception screen of an image reception device (broadcast reception device) according to an exemplary embodiment.
  • the operation of the broadcast transmission device may be performed through the processor of the broadcast transmission device, and the operation of the broadcast reception device may be performed through the processor of the broadcast reception device.
  • the broadcast transmission device 1000a may receive a user input (eg, a touch input) for a live streaming start icon 1001 from a user who is a broadcast subject.
  • when the broadcast transmission device 1000a receives a real-time streaming start input, it may photograph the first object 1 using the front camera 1002 and transmit the first camera image 1003 being captured by the front camera 1002 to a server (not shown) in real time.
  • the broadcast reception device 1000c may receive a user input for a channel entering icon 1020 from a user.
  • the broadcast receiving device 1000c may receive, from the server (not shown), the first camera image 1003 transmitted by the broadcast transmitting device 1000a.
  • the broadcast transmission device 1000a may activate the rear camera 1005 when receiving a user input for adding a screen. In addition, when receiving a user input for adding a screen, the broadcast transmission device 1000a may divide its display area to display the screen 1006 captured by the rear camera 1005 together with the first camera image 1003 currently being captured.
  • the broadcast transmission device 1000a may display user interfaces 1004 and 1007 for controlling the start, pause, and stop of shooting with the front camera 1002 and the rear camera 1005, and the user may operate the front camera 1002 and the rear camera 1005 through the user interfaces 1004 and 1007.
  • upon receiving a user input on the recording start user interface 1007 of the rear camera 1005, the broadcast transmission device 1000a may capture the second object 2 with the rear camera 1005 and transmit the captured image to the server.
  • the second camera image 1006 may be transmitted to the server in real time and post-processed in the server if necessary.
  • the server may transmit the second camera image, or a processed version of the second camera image, to the broadcast reception device 1000c, and the broadcast reception device 1000c may display the corresponding image in a PIP format 1023.
  • the viewer can watch the PIP image 1023 on a small screen at the same time as the broadcast viewing of the first camera image 1021.
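Displaying an image in a PIP format, as described above, can be sketched as compositing a small frame over the main frame. The frame representation (lists of pixel rows) is a toy assumption for the illustration:

```python
def composite_pip(main, pip, top=0, left=0):
    # Copy the main frame and overwrite the rectangle starting at
    # (top, left) with the smaller PIP frame, as the receiving device
    # does when showing the second camera image over the first.
    out = [row[:] for row in main]
    for r, pip_row in enumerate(pip):
        for c, px in enumerate(pip_row):
            out[top + r][left + c] = px
    return out

main = [["M"] * 6 for _ in range(4)]   # main (first camera) frame
pip = [["P"] * 2 for _ in range(2)]    # small second-camera frame
framed = composite_pip(main, pip, top=0, left=4)  # PIP in the top-right corner
```

Real implementations composite per video frame on the GPU, but the per-pixel overwrite shown here is the same operation.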
  • the broadcast reception device 1000c may provide a chat screen 1022 while watching a real-time broadcast.
  • FIG. 11 is a flowchart illustrating operations among a first broadcast transmission device (first image transmission device), a second broadcast transmission device (second image transmission device), a server, and a broadcast reception device (image reception device) according to an embodiment.
  • the operation of the first broadcast transmission device may be performed through the processor of the first broadcast transmission device, the operation of the second broadcast transmission device may be performed through the processor of the second broadcast transmission device, the operation of the server may be performed through the processor of the server, and the operation of the broadcast receiving device may be performed through the processor of the broadcast receiving device.
  • the server of FIG. 11 may be an edge computing server.
  • the edge computing server may be a MEC server.
  • the server may be a server conforming to the OpenFog Consortium or EdgeX Foundry edge computing specifications.
  • the first broadcast transmission device 1100a may receive an input for starting a broadcast from a user who is a broadcast subject (1101).
  • when the first broadcast transmission device 1100a receives an input for starting a broadcast from a user who is the broadcasting subject, it may upload the first camera image captured by the first camera included in the first broadcast transmission device 1100a to the server 1100c in real time, and the server 1100c may broadcast the received first camera image to the broadcast receiving device 1100d in real time (1102).
  • the broadcast reception device 1100d may display the first camera image received from the server 1100c.
  • the first broadcast transmission device 1100a, the second broadcast transmission device 1100b, and the broadcast reception device 1100d may have the same configuration as the electronic device 101 described with reference to FIGS. 1 to 4.
  • the first broadcast transmission device 1100a may receive an input for slow motion mode shooting from a user who is a broadcast subject (1103).
  • the first broadcast transmission device 1100a may photograph a second object in a slow motion mode using a camera included in the first broadcast transmission device 1100a and upload the slow motion mode original image to the server 1100c (1104).
  • the first broadcast transmission device 1100a may transmit the slow motion mode original image to the server 1100c and transmit control information on the slow motion mode original image as well.
  • the server 1100c may produce a slow motion image based on the information on the slow motion mode original image and the slow motion mode original image received from the first broadcast transmission device 1100a (1105).
  • the first broadcast transmission device 1100a may receive an input for adding a screen from a user who is the broadcasting subject (1106). When the first broadcast transmission device 1100a receives the input for adding a screen, it may transmit, to the server 1100c, an image processing request including information for adding a screen and/or an image for adding a screen (1107). The server 1100c may process an image based on the information on adding a screen.
  • the server 1100c may transmit an additional video list including the produced additional videos to the broadcast receiving device 1100d (1108).
  • the broadcast reception device 1100d may receive a PIP video request input from a viewer (1109).
  • the broadcast receiving device 1100d may request an image selected from a list of additional images from the server 1100c (1110).
  • the server 1100c may transmit the requested image to the broadcast receiving device 1100d (1111).
  • the second broadcast transmission device 1100b may receive an input for capturing an additional image from the user (1112).
  • the second broadcast transmission device 1100b may transmit an additionally captured second camera image to the server 1100c (1113).
  • the server may provide an additional image list including the received second camera image or the processed image of the second camera image to the broadcast reception device 1100d (1114).
  • the first broadcast transmission device 1100a may request a screen change (1115).
  • the first broadcast transmission device 1100a may transmit, to the server 1100c, an image processing request including an image for a screen to be changed and/or information on an image for a screen to be changed (1116).
  • the server 1100c may receive an image for a screen to be changed and/or information on an image for a screen to be changed, and may produce a third camera image, which is an image for the screen to be changed.
  • the server 1100c may transmit the produced third camera image to the broadcast receiving device 1100d (1117).
  • the broadcast receiving device 1100d may receive the third camera image and display a third camera image together with the first camera image or instead of the first camera image.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Databases & Information Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Telephone Function (AREA)

Abstract

According to one embodiment, the present invention relates to an image transmission device comprising: a first camera; and a processor connected to the first camera, wherein the processor transmits a first image captured by the first camera to a server device and receives, from a user, an input of a request for a processed image, and is configured to transmit control information on a second image in response to the request for the processed image while transmitting the first image to the server device, the control information including information enabling the server device to produce the processed image on the basis of the second image provided by the image transmission device. Various other embodiments are also possible, in accordance with the description.
PCT/KR2020/007498 2019-06-14 2020-06-10 Image streaming method and electronic device supporting same WO2020251250A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020190070929A KR20200143076A (ko) 2019-06-14 2019-06-14 Image streaming method and electronic device supporting same
KR10-2019-0070929 2019-06-14

Publications (1)

Publication Number Publication Date
WO2020251250A1 true WO2020251250A1 (fr) 2020-12-17

Family

ID=73781821

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/007498 WO2020251250A1 (fr) 2019-06-14 2020-06-10 Image streaming method and electronic device supporting same

Country Status (2)

Country Link
KR (1) KR20200143076A (fr)
WO (1) WO2020251250A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024080672A1 * 2022-10-13 2024-04-18 엘지전자 주식회사 User equipment, server, and system for generating image content, and method therefor

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003143598A * 2001-11-07 2003-05-16 Victor Co Of Japan Ltd Image distribution system
KR100892433B1 * 2007-05-18 2009-04-10 (주)비디오스퀘어글로벌 Video relay system using mobile communication terminal, and method therefor
KR101291765B1 * 2013-05-15 2013-08-01 (주)엠비씨플러스미디어 Ball trajectory providing system for real-time relay broadcasting
KR20180043712A * 2016-10-20 2018-04-30 삼성전자주식회사 Image display method and electronic device therefor
KR20180092873A * 2017-02-10 2018-08-20 주식회사 시어스랩 Method and apparatus for generating live streaming video, method and apparatus for providing live service, and live streaming system


Also Published As

Publication number Publication date
KR20200143076A (ko) 2020-12-23

Similar Documents

Publication Publication Date Title
WO2019066176A1 Electronic device
WO2016039496A1 Mobile terminal and control method therefor
WO2021157954A1 Video recording method using plurality of cameras, and device therefor
WO2022154387A1 Electronic device and operation method therefor
WO2022102972A1 Electronic device comprising image sensor and method of operating same
WO2020251250A1 Image streaming method and electronic device supporting same
WO2022005000A1 Electronic device and operation method therefor
WO2020171657A1 Display device and image display method thereof
WO2020141753A1 Electronic device for performing communication with beamforming, and method therefor
WO2019107769A1 Electronic device for selectively compressing image data according to read-out speed of image sensor, and method for operating same
WO2022098204A1 Electronic device and method for providing virtual reality service
WO2021256709A1 Electronic device and method for operating electronic device
WO2022149812A1 Electronic device comprising camera module, and operating method of electronic device
WO2022005116A1 Method and device for controlling transmission or reception of data in wireless communication system
WO2023085564A1 Electronic device and method by which electronic device obtains content of external electronic device
WO2024090803A1 Method for obtaining image, and electronic device supporting same
WO2020159192A1 Electronic device for processing file including multiple related pieces of data
WO2024053824A1 Electronic device for providing image, and operation method thereof
WO2024085673A1 Electronic device for obtaining multiple exposure images, and operation method thereof
WO2023033396A1 Electronic device for processing continuous shooting input, and operation method thereof
WO2024034856A1 Method for providing image, and electronic device supporting same
WO2023043057A1 Electronic device and method for sharing screens and audio signals corresponding to screens
WO2024128628A1 Electronic device and method for playing video in reverse by same electronic device
WO2023136438A1 Content sharing system and electronic device
WO2022114809A1 Electronic device for providing video conference, and method therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20822753

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20822753

Country of ref document: EP

Kind code of ref document: A1