WO2020204401A1 - Electronic device and method for processing a streaming application in an electronic device - Google Patents

Electronic device and method for processing a streaming application in an electronic device

Info

Publication number
WO2020204401A1
WO2020204401A1 (PCT/KR2020/003621)
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
processor
data
streaming
pdu session
Prior art date
Application number
PCT/KR2020/003621
Other languages
English (en)
Korean (ko)
Inventor
김무영
김민정
박선민
이정은
채상원
허재영
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd.
Publication of WO2020204401A1 publication Critical patent/WO2020204401A1/fr

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/50 Network services
    • H04L 67/60 Scheduling or organising the servicing of application requests, e.g. requests for application data transmissions using the analysis and optimisation of the required network resources
    • H04L 67/61 Scheduling or organising the servicing of application requests, e.g. requests for application data transmissions using the analysis and optimisation of the required network resources taking into account QoS or priority requirements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/14 Session management
    • H04L 67/141 Setup of application sessions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/50 Network services
    • H04L 67/75 Indicating network or usage conditions on the user display
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 69/00 Network arrangements, protocols or services independent of the application payload and not provided for in the other groups of this subclass
    • H04L 69/10 Streamlined, light-weight or high-speed protocols, e.g. express transfer protocol [XTP] or byte stream
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 69/00 Network arrangements, protocols or services independent of the application payload and not provided for in the other groups of this subclass
    • H04L 69/30 Definitions, standards or architectural aspects of layered protocol stacks
    • H04L 69/32 Architecture of open systems interconnection [OSI] 7-layer type protocol stacks, e.g. the interfaces between the data link level and the physical level
    • H04L 69/322 Intralayer communication protocols among peer entities or protocol data unit [PDU] definitions

Definitions

  • Various embodiments relate to an electronic device and a method of processing a streaming application in the electronic device.
  • A service or function may be executed on a cloud-computing-based server, and the resulting screen may be transmitted to a user terminal using streaming technology. In this way, services or functions that require high-end specifications or capabilities can be used through the user terminal.
  • A technology that executes a specific service or function at the request of a user terminal and transmits the result back to the terminal, so that various services or functions can be used on the terminal, is called a cloud streaming service.
  • Screen virtualization technologies virtualize an application by executing the program on a cloud server and streaming the result to an electronic device, without a separate program installation process on the electronic device.
  • The screen virtualization technology delivers control information input by a user through an electronic device to a cloud server, and the cloud server provides streaming data (eg, graphic streaming data) processed based on that control information to the user's electronic device.
  • Because the server must generate graphic streaming data according to the control information input by the user and then transmit it to the user's electronic device, screen virtualization can be very sensitive to the stability of the network quality of service (QoS).
  • The graphic quality (eg, resolution) of a cloud streaming service may also be affected by the network transmission speed (bandwidth, throughput).
  • According to various embodiments, an electronic device capable of seamlessly supporting a high-end streaming service over 5G communication, and a streaming application processing method in the electronic device, may be provided.
  • According to various embodiments, the electronic device includes a touch screen; a first processor that executes a streaming application and controls streaming data received from a server to be output through the touch screen according to the execution of the streaming application; and a second processor that processes user input data received through the touch screen during execution of the streaming application so that the data is transmitted to the server using a first protocol data unit (PDU) session, receives from the server streaming data processed using a second PDU session based on the transmitted user input data, and provides the received streaming data to the first processor.
  • An operation method of an electronic device includes executing a streaming application in a first processor, receiving user input data for the streaming application through a touch screen, processing the user input data in a second processor so that the data is transmitted to a server using a first protocol data unit (PDU) session, receiving from the server streaming data processed using a second PDU session based on the user input data, and displaying the received streaming data through the touch screen.
  • Accordingly, the QoS of an application (eg, a service or a game) can be maintained, and high resolution and a high frame rate (frames per second, FPS) may be guaranteed.
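  • As an informal illustration of the split described above, the following Kotlin sketch models the device side: control data derived from user input goes out over a low-latency uplink (the first, URLLC-like PDU session) while rendered frames come back over a high-bandwidth downlink (the second, eMBB-like PDU session). The class and field names (UrllcUplink, EmbbDownlink, ControlData, StreamFrame) are illustrative assumptions, not the disclosed implementation.

```kotlin
// Hypothetical sketch only: the session classes stand in for the first
// (low-latency) and second (high-bandwidth) PDU sessions described above.
data class ControlData(val x: Int, val y: Int, val action: String)
data class StreamFrame(val sequence: Int, val payload: ByteArray)

class UrllcUplink {
    // First PDU session: low delay, carries user-input control data to the server.
    fun send(control: ControlData) = println("URLLC -> server: $control")
}

class EmbbDownlink(private val frames: Iterator<StreamFrame>) {
    // Second PDU session: high throughput, carries rendered streaming data back.
    fun receive(): StreamFrame? = if (frames.hasNext()) frames.next() else null
}

class StreamingClient(private val uplink: UrllcUplink, private val downlink: EmbbDownlink) {
    fun onTouch(x: Int, y: Int) = uplink.send(ControlData(x, y, "TAP"))

    fun pumpDisplay() {
        // First-processor (AP) side: display whatever the server has streamed back.
        downlink.receive()?.let { println("display frame #${it.sequence} (${it.payload.size} bytes)") }
    }
}

fun main() {
    val fakeFrames = generateSequence(0) { it + 1 }
        .map { StreamFrame(it, ByteArray(1024)) }
        .take(3)
        .iterator()
    val client = StreamingClient(UrllcUplink(), EmbbDownlink(fakeFrames))
    client.onTouch(120, 480)            // uplink over the first (URLLC) PDU session
    repeat(3) { client.pumpDisplay() }  // downlink over the second (eMBB) PDU session
}
```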
  • FIG. 1 is a diagram illustrating a network environment according to various embodiments.
  • FIG. 2 is a block diagram illustrating a structure of an electronic device according to various embodiments.
  • FIG. 3 is a block diagram illustrating a structure of an electronic device according to various embodiments.
  • FIG. 4 is a diagram illustrating PDU types in a 4G communication scheme according to various embodiments.
  • FIG. 5 is a diagram illustrating PDU types of a 5G communication scheme according to various embodiments.
  • FIG. 6 is a block diagram illustrating a structure of an electronic device according to various embodiments.
  • FIG. 7 is a block diagram illustrating a structure of an electronic device according to various embodiments.
  • FIG. 8 is a diagram illustrating a processing method between an AP and a CP according to various embodiments.
  • FIG. 9 is a diagram for comparing processing methods according to types of applications according to various embodiments.
  • FIG. 10 is a flowchart illustrating a method of processing a streaming application in an electronic device according to various embodiments of the present disclosure.
  • FIG. 11 is a flowchart illustrating a method of processing a streaming application in an electronic device according to various embodiments of the present disclosure.
  • FIG. 12 is a signal flow diagram illustrating a method of processing a streaming application in an electronic device according to various embodiments of the present disclosure.
  • FIG. 13 is a block diagram illustrating a program according to various embodiments.
  • the input device may include a touch panel included in the touch screen or various sensors.
  • a touch screen panel (TSP) or a touch screen display may be understood as the same term as a touch screen or a concept included in the touch screen.
  • the display device may include at least one display.
  • the touch screen may include a touch panel and a display panel, and the touch panel may function as an input device, and the display panel may function as a display device.
  • The input device or display device in the various embodiments described below is not limited to a touch screen (or a touch screen panel); any device that receives a user input in the electronic device may serve as the input device, and any device that displays output may serve as the display device.
  • In the following description, a touch event through a touch screen is described as an example of an input event, but various input events other than touch events may be processed in the same or a similar manner.
  • For example, input data about a user's motion obtained through various sensors (e.g., a gyro sensor, a gravity sensor, a geomagnetic sensor, or an image sensor) can be processed as an input event, and input data received through a wired/wireless interface between an external input device (e.g., a joystick or a VR (virtual reality) input device) and the electronic device may also be processed as an input event, as modeled in the sketch below.
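  • Since touch input, sensor input, and external-device input are all said to be handled as input events, one way to model that uniformly is a small sealed hierarchy, sketched below in Kotlin. The type names are illustrative assumptions only.

```kotlin
// Illustrative only: a unified input-event model covering the touch, sensor,
// and external-device inputs mentioned above.
sealed class InputEvent {
    data class Touch(val x: Int, val y: Int, val pressure: Float = 1f) : InputEvent()
    data class Motion(val sensor: String, val values: FloatArray) : InputEvent()   // e.g. gyro, gravity
    data class External(val device: String, val payload: ByteArray) : InputEvent() // e.g. joystick, VR controller
}

fun describe(event: InputEvent): String = when (event) {
    is InputEvent.Touch -> "touch at (${event.x}, ${event.y}), pressure ${event.pressure}"
    is InputEvent.Motion -> "motion from ${event.sensor}: ${event.values.joinToString()}"
    is InputEvent.External -> "external input from ${event.device} (${event.payload.size} bytes)"
}

fun main() {
    listOf(
        InputEvent.Touch(100, 240),
        InputEvent.Motion("gyro", floatArrayOf(0.1f, 0.0f, 9.8f)),
        InputEvent.External("joystick", byteArrayOf(0x01, 0x7F)),
    ).forEach { println(describe(it)) }
}
```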
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to various embodiments.
  • The electronic device 101 may communicate with the electronic device 102 through a first network 198 (for example, a short-range wireless communication network), or may communicate with the electronic device 104 or the server 108 through a second network 199 (for example, a long-range wireless communication network).
  • the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • The electronic device 101 may include a processor 120, a memory 130, an input device 150, a sound output device 155, a display device 160, an audio module 170, a sensor module 176, an interface 177, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • at least one of these components may be omitted or one or more other components may be added to the electronic device 101.
  • some of these components may be implemented as one integrated circuit.
  • For example, the sensor module 176 (eg, a fingerprint sensor, an iris sensor, or an illuminance sensor) may be implemented while embedded in the display device 160 (eg, a display).
  • The processor 120 may, for example, execute software (eg, a program 140) to control at least one other component (eg, a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 120 may load commands or data received from another component (eg, the sensor module 176 or the communication module 190) into the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the nonvolatile memory 134.
  • The processor 120 may include a main processor 121 (eg, a central processing unit or an application processor) and an auxiliary processor 123 (eg, a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of or together with the main processor 121. Additionally or alternatively, the auxiliary processor 123 may be set to use less power than the main processor 121 or to be specialized for a designated function. The auxiliary processor 123 may be implemented separately from the main processor 121 or as a part thereof.
  • The auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (for example, the display device 160, the sensor module 176, or the communication module 190) on behalf of the main processor 121 while the main processor 121 is in an inactive (eg, sleep) state, or together with the main processor 121 while the main processor 121 is in an active (eg, application execution) state. According to an embodiment, the auxiliary processor 123 (eg, an image signal processor or a communication processor) may be implemented as a part of another functionally related component (eg, the camera module 180 or the communication module 190).
  • the memory 130 may store various data used by at least one component of the electronic device 101 (eg, the processor 120 or the sensor module 176).
  • the data may include, for example, software (eg, the program 140) and input data or output data for commands related thereto.
  • the memory 130 may include a volatile memory 132 or a nonvolatile memory 134.
  • the program 140 may be stored as software in the memory 130, and may include, for example, an operating system 142, middleware 144, or an application 146.
  • the input device 150 may receive a command or data to be used for a component of the electronic device 101 (eg, the processor 120) from an outside (eg, a user) of the electronic device 101.
  • the input device 150 may include, for example, a microphone, a mouse, a keyboard, or a digital pen (eg, a stylus pen).
  • The sound output device 155 may output a sound signal to the outside of the electronic device 101.
  • the sound output device 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback, and the receiver can be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from the speaker or as part of it.
  • the display device 160 may visually provide information to the outside of the electronic device 101 (eg, a user).
  • the display device 160 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the device.
  • The display device 160 may include touch circuitry set to sense a touch, or a sensor circuit (eg, a pressure sensor) set to measure the intensity of a force generated by the touch.
  • The sensor module 176 may detect an operating state (eg, power or temperature) of the electronic device 101 or an external environmental state (eg, a user state), and generate an electrical signal or data value corresponding to the detected state.
  • The sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that a user can perceive through a tactile or motor sense.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture a still image and a video.
  • the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101.
  • The power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • The communication module 190 may support establishment of a direct (eg, wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (eg, the electronic device 102, the electronic device 104, or the server 108), and communication through the established communication channel.
  • The communication module 190 may include one or more communication processors that operate independently of the processor 120 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • The wireless communication module 192 may identify and authenticate the electronic device 101 in a communication network, such as the first network 198 or the second network 199, using subscriber information (eg, an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • the antenna module 197 may transmit a signal or power to the outside (eg, an external electronic device) or receive from the outside.
  • The antenna module 197 may include an antenna including a radiator formed of a conductor or a conductive pattern formed on a substrate (eg, a PCB).
  • According to an embodiment, the antenna module 197 may include a plurality of antennas. In this case, at least one antenna suitable for the communication scheme used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190.
  • the signal or power may be transmitted or received between the communication module 190 and an external electronic device through the at least one selected antenna.
  • According to an embodiment, other components (eg, an RFIC) other than the radiator may be additionally formed as a part of the antenna module 197.
  • commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
  • Each of the electronic devices 102 and 104 may be a device of the same or different type as the electronic device 101.
  • all or part of the operations executed by the electronic device 101 may be executed by one or more of the external electronic devices 102, 104, or 108.
  • When the electronic device 101 needs to perform a function or service automatically or in response to a request from a user or another device, the electronic device 101, instead of executing the function or service by itself or in addition thereto, may request one or more external electronic devices to perform at least a part of the function or service.
  • the server 690 may generate streaming data (eg, video information) based on the transmitted user input information and transmit it to the streaming app 624 through an eMBB PDU session.
  • The server 690 may communicate with the streaming app 624 of the electronic device 101 on a 4G communication basis (eg, a single IP communication) or on a 5G communication basis (eg, two IP communications, eMBB and URLLC).
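  • For completeness, the following is a hypothetical server-side counterpart to the description above: control information arrives (over the low-latency session), a frame is rendered based on it, and the result is streamed back (over the high-bandwidth eMBB session). Everything here, including the rendering stub and the names, is an assumption made for illustration.

```kotlin
// Hypothetical server-side sketch: control data in over the low-latency session,
// rendered streaming frames out over the high-bandwidth (eMBB) session.
data class ServerControl(val x: Int, val y: Int)
data class RenderedFrame(val sequence: Int, val description: String)

class CloudStreamingServer {
    private var sequence = 0

    // Stand-in for executing the streamed application and rendering its screen.
    private fun render(control: ServerControl): RenderedFrame =
        RenderedFrame(sequence++, "screen after input at (${control.x}, ${control.y})")

    fun onControlData(control: ServerControl, sendFrame: (RenderedFrame) -> Unit) {
        val frame = render(control)
        sendFrame(frame) // would be transmitted over the second (eMBB) PDU session
    }
}

fun main() {
    val server = CloudStreamingServer()
    listOf(ServerControl(10, 20), ServerControl(200, 300)).forEach { control ->
        server.onControlData(control) { frame -> println("stream -> device: $frame") }
    }
}
```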
  • the second PDU session may have a higher transmission rate than the first PDU session.
  • The second processor may receive the user input data, determine a touch event based on a pattern of the touch data in the touch input data included in the received user input data, and generate control data based on the determined touch event.
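  • A minimal sketch of what determining a touch event from a pattern of touch data could look like, assuming the raw data is a sequence of (x, y) samples and the event type is inferred from how far the contact point moved. The threshold and the classification rule are assumptions for illustration, not the disclosed algorithm.

```kotlin
import kotlin.math.hypot

// Illustrative assumption: touch data is a list of (x, y) samples from
// touch-down to touch-up; the event type is inferred from total travel.
data class TouchSample(val x: Float, val y: Float)
enum class TouchEvent { TAP, DRAG }

fun classify(samples: List<TouchSample>, dragThresholdPx: Float = 24f): TouchEvent {
    val start = samples.first()
    val end = samples.last()
    val travelled = hypot(end.x - start.x, end.y - start.y)
    return if (travelled < dragThresholdPx) TouchEvent.TAP else TouchEvent.DRAG
}

// Control data derived from the determined touch event, to be sent to the server
// over the low-latency (first) PDU session.
fun toControlData(event: TouchEvent, last: TouchSample): String =
    "${event.name}:${last.x.toInt()},${last.y.toInt()}"

fun main() {
    val swipe = listOf(TouchSample(10f, 10f), TouchSample(80f, 12f), TouchSample(200f, 15f))
    val event = classify(swipe)
    println(toControlData(event, swipe.last()))  // prints DRAG:200,15
}
```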
  • The touch data or sensor data acquired through the first TSP driver 621 or the first sensor driver 622 may be transferred to the second TSP driver 635 or the second sensor driver 636 of the CP 630, or the touch event or sensor event value determined by the input framework 623 may be transmitted to the input processing module 634 of the CP 630, without being delivered to the streaming app 624.
  • the second TSP driver 635 or the second sensor driver 636 of the CP 630 may transmit touch data or sensor data values transmitted from the AP 620 to the streaming app 624.
  • the input processing module 634 of the CP 630 may directly transmit the touch event or sensor event value received from the input framework 623 of the AP 620 to the server 690 through the URLLC network interface.
  • The delivery path of the user input information may be changed again so that the AP 620 can process it.
  • FIG. 9 is a diagram for comparing processing methods according to types of applications according to various embodiments. Referring to FIG. 9, it may be determined whether the first app 910 operating in the foreground is a streaming app.
  • If the first app 910 is not a streaming app, the user input information may be processed by the AP 620.
  • The AP 620 may control various operations of the first app 910 (eg, input of text, selection of an object, or movement of a cursor) based on the user input information.
  • If the first app 910 is a streaming app, the user input information may be handled by the CP 630, as described in detail with reference to FIG. 6.
  • In that case, the electronic device 101 may switch to processing the user input data in a second processor (eg, the CP 630 of FIG. 6).
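  • A hedged sketch of the input-path switch described above: when the foreground application is a streaming app, the AP hands the input path over to the CP, and subsequent input goes to the CP for direct transmission to the server; otherwise the AP keeps processing input locally. The ApSide/CpSide types and the notification call are illustrative assumptions, not the disclosed AP-CP interface.

```kotlin
// Illustrative sketch of the input-path switch; names and message shapes are assumptions.
enum class InputPath { AP_LOCAL, CP_DIRECT }

class CpSide {
    var path: InputPath = InputPath.AP_LOCAL
        private set

    // Corresponds loosely to the AP notifying the CP of the delivery-path change.
    fun onPathChange(newPath: InputPath) {
        path = newPath
        println("CP: input delivery path is now $newPath")
    }

    fun handleInput(x: Int, y: Int) =
        println("CP: sending ($x, $y) to the server over the low-latency PDU session")
}

class ApSide(private val cp: CpSide) {
    fun onForegroundAppChanged(isStreamingApp: Boolean) =
        cp.onPathChange(if (isStreamingApp) InputPath.CP_DIRECT else InputPath.AP_LOCAL)

    fun dispatchTouch(x: Int, y: Int) = when (cp.path) {
        InputPath.CP_DIRECT -> cp.handleInput(x, y)  // streaming app in foreground
        InputPath.AP_LOCAL -> println("AP: handling ($x, $y) locally for a non-streaming app")
    }
}

fun main() {
    val cp = CpSide()
    val ap = ApSide(cp)
    ap.dispatchTouch(30, 40)                          // default: processed on the AP
    ap.onForegroundAppChanged(isStreamingApp = true)  // streaming app comes to the foreground
    ap.dispatchTouch(30, 40)                          // now routed through the CP
}
```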
  • The electronic device 101 may transmit control data processed using a first PDU session (eg, a URLLC PDU session) to a server (eg, the server 690 of FIG. 6).
  • The electronic device 101 may receive, from the server 690, streaming data processed using a second PDU session (eg, an eMBB PDU session) based on the user input data, and may display the streaming data received through the second PDU session.
  • FIG. 12 is a signal flow diagram illustrating a method of processing a streaming application in an electronic device according to various embodiments of the present disclosure.
  • In operation 1204, the electronic device 101 (eg, the touch screen) may transmit user input data (eg, data including X and Y coordinates) to the AP 620.
  • the AP 620 may process user input data.
  • the AP 620 may notify the CP 630 of a change in the input information delivery path in operation 1210.
  • An operation method of the electronic device 101 may include executing a streaming application in a first processor, receiving user input data for the streaming application through a touch screen, processing the user input data in a second processor so that the data is transmitted to a server using a first protocol data unit (PDU) session, receiving from the server streaming data processed using a second PDU session based on the user input data, and displaying the received streaming data through the touch screen.
  • the first processor may be an application processor
  • the second processor may be a communication processor
  • the first PDU session may have a lower delay characteristic than the second PDU session.
  • the first PDU session may be an ultra-reliable and low latency communication (URLLC) PDU session.
  • the second PDU session may be an enhanced mobile broadband (eMBB) PDU session.
  • the program 140 includes an operating system 142 for controlling one or more resources of the electronic device 101, a middleware 144, or an application 146 executable in the operating system 142.
  • The operating system 142 may include, for example, Android™, iOS™, Windows™, Symbian™, Tizen™, or Bada™.
  • At least a part of the program 140 may be, for example, preloaded on the electronic device 101 at the time of manufacture, or may be downloaded from or updated by an external electronic device (eg, the electronic device 102 or 104, or the server 108) when used by a user.
  • the operating system 142 may control management (eg, allocation or collection) of one or more system resources (eg, process, memory, or power) of the electronic device 101.
  • Additionally or alternatively, the operating system 142 may include one or more driver programs for driving other hardware devices of the electronic device 101, for example, the input device 150, the sound output device 155, the display device 160, the audio module 170, the sensor module 176, the interface 177, the haptic module 179, the camera module 180, the power management module 188, the battery 189, the communication module 190, the subscriber identification module 196, or the antenna module 197.
  • the middleware 144 may provide various functions to the application 146 so that a function or information provided from one or more resources of the electronic device 101 can be used by the application 146.
  • The middleware 144 may include, for example, an application manager 1301, a window manager 1303, a multimedia manager 1305, a resource manager 1307, a power manager 1309, a database manager 1311, a package manager 1313, a connectivity manager 1315, a notification manager 1317, a location manager 1319, a graphic manager 1321, a security manager 1323, a telephony manager 1325, or a voice recognition manager 1327.
  • the application manager 1301 may manage the life cycle of the application 146, for example.
  • the window manager 1303 may manage one or more GUI resources used on a screen, for example.
  • The multimedia manager 1305 may, for example, identify one or more formats required for playback of media files, and encode or decode a corresponding media file among the media files using a codec suitable for the corresponding format.
  • the resource manager 1307 may manage the source code of the application 146 or a memory space of the memory 130, for example.
  • The power manager 1309 may, for example, manage the capacity, temperature, or power of the battery 189, and determine or provide related information necessary for the operation of the electronic device 101 using the corresponding information. According to an embodiment, the power manager 1309 may interwork with a basic input/output system (BIOS) (not shown) of the electronic device 101.
  • the database manager 1311 may create, search, or change a database to be used by the application 146, for example.
  • the package manager 1313 may manage installation or update of an application distributed in the form of, for example, a package file.
  • the connectivity manager 1315 may manage a wireless connection or a direct connection between the electronic device 101 and the external electronic device, for example.
  • the notification manager 1317 may provide, for example, a function for notifying the user of the occurrence of a designated event (eg, incoming call, message, or alarm).
  • the location manager 1319 may manage location information of the electronic device 101, for example.
  • the graphic manager 1321 may manage, for example, one or more graphic effects to be provided to a user or a user interface related thereto.
  • the security manager 1323 may provide system security or user authentication, for example.
  • the telephony manager 1325 may manage a voice call function or a video call function provided by the electronic device 101, for example.
  • The voice recognition manager 1327 may, for example, transmit a user's voice data to the server 108, and receive from the server 108 a command corresponding to a function to be performed in the electronic device 101 based at least in part on the voice data, or text data converted based at least in part on the voice data.
  • The middleware 144 may dynamically delete some of the existing components or add new components.
  • at least a portion of the middleware 144 may be included as a part of the operating system 142 or implemented as separate software different from the operating system 142.
  • The information exchange application may include, for example, a notification relay application configured to deliver specified information (eg, a call, a message, or an alarm) to an external electronic device, or a device management application configured to manage an external electronic device.
  • The notification relay application may, for example, transmit notification information corresponding to a specified event (eg, mail reception) generated by another application (eg, an e-mail application 1369) of the electronic device 101 to an external electronic device. Additionally or alternatively, the notification relay application may receive notification information from an external electronic device and provide it to the user of the electronic device 101.
  • The term "module" used in this document may mean, for example, a unit including one or a combination of two or more of hardware, software, or firmware.
  • "Module" may be used interchangeably with terms such as, for example, unit, logic, logical block, component, or circuit.
  • the “module” may be the smallest unit of integrally configured parts or a part thereof.
  • the “module” may be a minimum unit or a part of one or more functions.
  • the “module” can be implemented mechanically or electronically.
  • a “module” is one of known or future developed application-specific integrated circuit (ASIC) chips, field-programmable gate arrays (FPGAs), or programmable-logic devices that perform certain operations. It may include at least one.
  • ASIC application-specific integrated circuit
  • FPGAs field-programmable gate arrays
  • programmable-logic devices that perform certain operations. It may include at least one.
  • At least a part of a device (eg, modules or functions thereof) or a method (eg, operations) according to various embodiments may be implemented as an instruction stored in a computer-readable storage medium, for example, in the form of a program module. When the instruction is executed by a processor (for example, the processor 120), the processor may perform a function corresponding to the instruction.
  • the computer-readable storage medium may be, for example, the memory 130.
  • Computer-readable recording media include hard disks, floppy disks, magnetic media (eg, magnetic tape), optical media (eg, compact disc read-only memory (CD-ROM) and digital versatile disc (DVD)), magneto-optical media (eg, floptical disks), and hardware devices (eg, read-only memory (ROM), random access memory (RAM), or flash memory).
  • The program instructions may include not only machine-language code such as that produced by a compiler but also high-level language code that can be executed by a computer using an interpreter. The above-described hardware device may be configured to operate as one or more software modules to perform the operations of the various embodiments, and vice versa.
  • The instructions are set to cause the at least one processor, when executed by the at least one processor, to perform at least one operation, and the at least one operation may include executing a streaming application in the first processor, receiving user input data for the streaming application through a touch screen, processing the user input data in the second processor so that the data is transmitted to a server using a first protocol data unit (PDU) session, receiving from the server streaming data processed by the server using a second PDU session based on the user input data, and displaying the received streaming data through the touch screen.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Information Transfer Between Computers (AREA)
  • Communication Control (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

An electronic device and a method for processing a streaming application in the electronic device are disclosed. An electronic device according to various embodiments may include: a touch screen; a first processor for executing a streaming application and controlling streaming data, received from a server according to the execution of the streaming application, to be output through the touch screen; and a second processor for processing user input data, input through the touch screen during execution of the streaming application, so that the data is transmitted to the server using a first protocol data unit (PDU) session, receiving, from the server, streaming data processed using a second PDU session based on the user input data, and providing the received streaming data to the first processor. In addition to the various embodiments of the present invention, other embodiments are possible.
PCT/KR2020/003621 2019-03-29 2020-03-17 Electronic device and method for processing a streaming application in an electronic device WO2020204401A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2019-0036893 2019-03-29
KR1020190036893A KR102710747B1 (ko) 2019-03-29 2019-03-29 전자 장치 및 전자 장치에서의 스트리밍 어플리케이션 처리 방법

Publications (1)

Publication Number Publication Date
WO2020204401A1 true WO2020204401A1 (fr) 2020-10-08

Family

ID=72666383

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/003621 WO2020204401A1 (fr) 2019-03-29 2020-03-17 Electronic device and method for processing a streaming application in an electronic device

Country Status (2)

Country Link
KR (1) KR102710747B1 (fr)
WO (1) WO2020204401A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20230142987A (ko) * 2022-04-04 2023-10-11 삼성전자주식회사 무선 통신 시스템에서 평행한 네트워크 슬라이스들에 의하여 스트리밍 서비스의 지연을 제거하기 위한 장치 및 방법

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101066349B1 (ko) * 2006-05-24 2011-09-21 타임 워너 케이블, 인크. 개인용 콘텐츠 서버 장치 및 방법
KR20130012420A (ko) * 2011-07-25 2013-02-04 에스케이플래닛 주식회사 화면 가상화 기반 어플리케이션 실행 시스템 및 방법
KR20160034339A (ko) * 2013-09-17 2016-03-29 인텔 아이피 코포레이션 타깃 미디어 콘텐츠의 전송기법
KR20170097548A (ko) * 2016-02-18 2017-08-28 에스케이텔레콤 주식회사 이종 네트워크상에서의 컨텐츠 전송 방법 및 이를 위한 장치
KR20180033667A (ko) * 2016-09-26 2018-04-04 삼성전자주식회사 스트리밍 서비스를 제공하는 방법 및 장치

Also Published As

Publication number Publication date
KR102710747B1 (ko) 2024-09-27
KR20200114707A (ko) 2020-10-07

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20783315

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20783315

Country of ref document: EP

Kind code of ref document: A1