WO2022230726A1 - Moving image receiving device, moving image transmitting device, moving image transmitting/receiving system, control method, and program
- Publication number: WO2022230726A1 (application PCT/JP2022/018208)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- moving image
- image
- operation data
- unit
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/63—Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
- H04N21/637—Control signals issued by the client directed to the server or network components
- H04N21/6373—Control signals issued by the client directed to the server or network components for rate control, e.g. request to the server to modify its transmission rate
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/30—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
- A63F13/35—Details of game servers
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/70—Game security or game management aspects
- A63F13/77—Game security or game management aspects involving data related to game devices or game servers, e.g. configuration data, software version or amount of memory
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/85—Providing additional services to players
- A63F13/86—Watching games played by other players
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44209—Monitoring of downstream path of the transmission network originating from a server, e.g. bandwidth variations of a wireless network
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/61—Network physical structure; Signal processing
- H04N21/6106—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network
- H04N21/6131—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network involving transmission via a mobile phone network
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/65—Transmission of management data between client and server
- H04N21/658—Transmission by the client directed to the server
- H04N21/6587—Control parameters, e.g. trick play commands, viewpoint selection
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/478—Supplemental services, e.g. displaying phone caller identification, shopping application
- H04N21/4781—Games
Definitions
- the present invention relates to a moving image receiving device, a moving image transmitting device, a moving image transmitting/receiving system, a control method and a program.
- operation data corresponding to the user's input operation on the terminal is wirelessly transmitted from the terminal to the cloud server.
- a frame image representing the game play situation is generated based on the operation data.
- image data obtained by encoding the frame image is wirelessly transmitted from the cloud server to the terminal, and the frame image obtained by decoding the image data is displayed on the terminal.
- it is desirable that the video displayed on the terminal be of as high quality as possible, and therefore that the data size of the image data generated based on the frame images that make up the video be as large as possible.
- on the other hand, in order to reduce the user's discomfort during operation, it is important that each frame image be displayed on the terminal smoothly and with low delay, even if the data size of the image data must be reduced to achieve this.
- in particular, special care should be taken to ensure that each frame image is displayed on the terminal with low latency.
- the present invention has been made in view of the above circumstances, and one of its objects is to provide a moving image receiving device, a moving image transmitting device, a moving image transmitting/receiving system, a control method, and a program capable of generating image data of an appropriate data size in consideration of delay.
- a moving image receiving device according to the present invention is a moving image receiving device that sequentially receives image data representing frame images constituting a moving image from a moving image transmitting device, and includes: an operation data transmission unit that transmits operation data corresponding to a user's input operation to the moving image transmitting device; a packet reception unit that receives a packet associated with the operation data, transmitted from the moving image transmitting device in response to the start of generation of a frame image based on that operation data; and a data size control unit that controls the data size of image data to be transmitted from now on by the moving image transmitting device, based on the packet reception time, which is the time from the timing at which the operation data is transmitted to the timing at which the packet associated with the operation data is received.
- the data size control unit controls the data size based on the time required to receive the image data transmitted from the moving image transmission device.
- the data size control unit may control the data size based on the packet reception time for the latest packet and the packet reception time for at least one packet received before the latest packet.
- the data size control unit performs control to reduce the data size when it is determined that the packet reception failure continues based on a predetermined condition.
- a moving image transmitting device according to the present invention is a moving image transmitting device that sequentially transmits image data representing frame images constituting a moving image to a moving image receiving device, and includes: an operation data reception unit that receives operation data corresponding to a user's input operation from the moving image receiving device; an image generation unit that generates the frame image based on the operation data; a packet transmission unit that transmits a packet associated with the operation data to the moving image receiving device in response to the start of generation of the frame image; an encoding processing execution unit that encodes the frame image to generate image data representing the frame image; an image data transmission unit that transmits the image data to the moving image receiving device; and a data size control unit that controls the data size of image data to be transmitted from now on by the image data transmission unit, based on the time from the timing at which the moving image receiving device transmits the operation data to the timing at which it receives the packet associated with the operation data.
- a moving image transmitting/receiving system includes a moving image transmitting device for sequentially transmitting image data representing frame images constituting a moving image, and a moving image receiving device for sequentially receiving the image data.
- the moving image receiving device includes: an operation data transmission unit that transmits operation data corresponding to a user's input operation to the moving image transmitting device; a packet receiving unit that receives a packet associated with the operation data, transmitted from the moving image transmitting device; and a unit that transmits control data based on the packet reception time, which is the time from the timing of transmitting the operation data to the timing of receiving the packet associated with the operation data.
- the moving image transmitting device includes: an operation data receiving unit that receives the operation data from the moving image receiving device; an image generation unit that generates the frame image based on the operation data; a packet transmission unit that, in response to the start of generation of the frame image, transmits a packet associated with the operation data to the moving image receiving device; an encoding processing execution unit that encodes the frame image to generate image data representing the frame image; an image data transmission unit that transmits the image data to the moving image receiving device; a control data reception unit that receives the control data; and a data size control unit that controls the data size based on the control data.
- a control method according to the present invention includes: a step in which a moving image receiving device that sequentially receives image data representing frame images constituting a moving image from a moving image transmitting device transmits operation data corresponding to a user's input operation to the moving image transmitting device; a step in which the moving image receiving device receives a packet associated with the operation data, transmitted from the moving image transmitting device in response to the start of generation of a frame image based on the operation data; and a step of controlling the data size of image data to be transmitted from now on by the moving image transmitting device, based on the packet reception time, which is the time from the timing at which the operation data is transmitted to the timing at which the packet associated with the operation data is received by the moving image receiving device.
- another control method according to the present invention includes: a step in which a moving image transmitting device that sequentially transmits image data representing frame images constituting a moving image to a moving image receiving device receives operation data corresponding to a user's input operation from the moving image receiving device; a step in which the moving image transmitting device generates the frame image based on the operation data; a step of transmitting a packet associated with the operation data to the moving image receiving device in response to the start of generation of the frame image; a step in which the moving image transmitting device encodes the frame image to generate image data representing the frame image; a step in which the moving image transmitting device transmits the image data to the moving image receiving device; and a step of controlling the data size of image data to be transmitted by the moving image transmitting device, based on the time from the timing at which the moving image receiving device transmits the operation data to the timing at which it receives the packet associated with the operation data.
- a program according to the present invention causes a computer that sequentially receives image data representing frame images constituting a moving image from a moving image transmitting device to execute: a procedure for transmitting operation data corresponding to a user's input operation to the moving image transmitting device; a procedure for receiving a packet associated with the operation data, transmitted from the moving image transmitting device in response to the start of generation of a frame image based on the operation data; and a procedure for controlling the data size of image data to be transmitted by the moving image transmitting device, based on the packet reception time, which is the time from the timing at which the operation data is transmitted to the timing at which the packet associated with the operation data is received.
- another program according to the present invention causes a computer that sequentially transmits image data representing frame images constituting a moving image to a moving image receiving device to execute a procedure for receiving operation data corresponding to a user's input operation from the moving image receiving device.
- FIG. 1 is a diagram showing an example of the overall configuration of a cloud gaming system according to one embodiment of the present invention.
- FIG. 2 is a diagram schematically showing an example of the timing of processing that occurs in the cloud server according to one embodiment of the present invention.
- FIG. 3 is a diagram schematically showing an example of communication that occurs in the cloud gaming system according to one embodiment of the present invention.
- FIG. 4 is an explanatory diagram illustrating an example of bit rate control performed in the cloud gaming system according to one embodiment of the present invention.
- FIG. 5 is an explanatory diagram illustrating an example of pad sync control performed in the cloud gaming system according to one embodiment of the present invention.
- FIG. 6 is a functional block diagram showing an example of functions implemented in the cloud gaming system according to one embodiment of the present invention.
- FIG. 7 is a flow chart showing an example of the flow of processing performed in the cloud server according to one embodiment of the present invention.
- FIG. 8 is a flow chart showing an example of the flow of processing performed in the cloud server according to one embodiment of the present invention.
- FIG. 9 is a flow chart showing an example of the flow of processing performed in the cloud server according to one embodiment of the present invention.
- FIG. 10 is a flow chart showing an example of the flow of processing performed in the cloud server according to one embodiment of the present invention.
- FIG. 1 is a diagram showing an example of the overall configuration of a cloud gaming system 1 according to one embodiment of the present invention.
- a cloud gaming system 1 according to this embodiment includes a cloud server 10 and a terminal 12, both of which are mainly composed of computers.
- the cloud server 10 and the terminal 12 are connected to a computer network 14 including mobile communication systems such as the 4th generation mobile communication system (4G) and the 5th generation mobile communication system (5G) and the Internet.
- the cloud server 10 is connected to the Internet
- the terminal 12 is connected to mobile communication systems such as 4G and 5G.
- the cloud server 10 and the terminal 12 can communicate with each other via the computer network 14 .
- the cloud server 10 is, for example, a server computer that executes a game program related to a cloud gaming service.
- the cloud server 10 includes, for example, a processor 10a, a storage unit 10b, a communication unit 10c, and an encoder/decoder unit 10d.
- the processor 10a is, for example, a program-controlled device such as a CPU, and executes various information processing according to programs stored in the storage unit 10b.
- the processor 10a according to this embodiment also includes a GPU (Graphics Processing Unit) that draws an image in a frame buffer based on graphics commands and data supplied from the CPU.
- the storage unit 10b is, for example, a storage element such as ROM or RAM, or a solid state drive (SSD).
- the storage unit 10b stores programs and the like executed by the processor 10a. Further, in the storage unit 10b according to the present embodiment, a frame buffer area in which an image is drawn by the GPU included in the processor 10a is secured.
- the communication unit 10c is a communication interface for exchanging data with a computer such as the terminal 12 via the computer network 14, for example.
- the encoder/decoder unit 10d includes, for example, an encoder and a decoder.
- the encoder generates image data representing an input image by encoding the image.
- the decoder also decodes input image data and outputs an image represented by the image data.
- the terminal 12 is, for example, a computer such as a smartphone or a tablet terminal used by a user who uses the cloud gaming service.
- the terminal 12 may be an electronic device capable of communicating with the cloud server 10 via a communication dongle, such as a television including a communication dongle.
- the terminal 12 includes, for example, a processor 12a, a storage unit 12b, a communication unit 12c, a display unit 12d, an operation unit 12e, a sensor unit 12f, an audio output unit 12g, and an encoder/decoder unit 12h.
- the processor 12a is, for example, a program-controlled device such as a CPU, and executes various types of information processing according to programs stored in the storage unit 12b.
- the storage unit 12b is, for example, a storage element such as ROM or RAM, or a solid state drive (SSD).
- the storage unit 12b stores programs and the like executed by the processor 12a.
- the communication unit 12c is a communication interface for exchanging data with a computer such as the cloud server 10 via the computer network 14, for example.
- the display unit 12d is a display device such as a liquid crystal display or an organic EL display.
- the operation unit 12e is, for example, an operation member for performing operation input to the processor 12a.
- the sensor unit 12f is a sensor such as a motion sensor capable of detecting acceleration and angular velocity, for example.
- the audio output unit 12g is, for example, an audio output device such as a speaker that outputs audio represented by audio data.
- the encoder/decoder unit 12h includes, for example, an encoder and a decoder.
- the encoder generates image data representing an input image by encoding the image.
- the decoder also decodes input image data and outputs an image represented by the image data.
- the terminal 12 may have a touch panel.
- the touch panel serves as both the display unit 12d and the operation unit 12e.
- when the user performs an input operation on the operation unit 12e while playing a game on the cloud gaming service, the terminal 12 generates operation data corresponding to the input operation and transmits the operation data to the cloud server 10.
- the operation data will be called pad data P hereinafter.
- the cloud server 10 executes game processing according to the received pad data P. Then, the cloud server 10 generates a play image, which is a frame image representing the play status of the game, based on the result of the game processing, and draws the play image in the frame buffer of the cloud server 10 . In this embodiment, game processing and generation of play images are repeatedly executed.
- the cloud server 10 acquires the play image drawn in the frame buffer and encodes the play image to generate image data representing the play image.
- the cloud server 10 then transmits the generated image data to the terminal 12 .
- the terminal 12 decodes the image data received from the cloud server 10 and causes the display unit 12d to display the play image generated by the decoding.
- the cloud server 10 streams moving images generated according to the game play situation to the terminal 12 used by the user playing the game.
- FIG. 2 is a diagram schematically showing an example of the timing of processing that occurs in the cloud server 10 according to this embodiment.
- the terminal 12 transmits pad data P representing an input operation received by the operation unit 12e at a predetermined cycle (for example, a cycle of 4 milliseconds) to the cloud server 10 at that timing.
- the pad data P according to the present embodiment is associated with an order number indicating the order in which the pad data P is transmitted.
- FIG. 2 shows the reception timing in the cloud server 10 of the pad data P(0) to P(10) transmitted in this way.
- the order numbers associated with the received pad data P are shown as numbers in parentheses in P(0) to P(10).
- the cloud server 10 does not always receive the pad data P at intervals of 4 milliseconds.
- in this embodiment, play images are generated at a predetermined cycle (e.g., a 16-millisecond cycle).
- the cloud server 10 generates a play image based on the input operation indicated by the most recently received pad data P (the latest pad data P) at the timing of starting play image generation, and draws the generated play image to the frame buffer.
- the period G shown in FIG. 2 represents the play image generation period.
- the play image generation period based on the pad data P(0) corresponds to the period indicated by G(1,0).
- the play image generation period based on the pad data P(4) corresponds to the period indicated by G(2, 4).
- the play image generation period based on the pad data P(8) corresponds to the period indicated by G(3,8).
- in this way, the transmission cycle of the pad data P in the terminal 12 and the generation cycle of the play image in the cloud server 10 differ.
- in this example, the play image generation cycle is four times the pad data P transmission cycle, so not all of the pad data P received by the cloud server 10 is used to generate play images.
- in the example of FIG. 2, pad data P(1) to P(3), P(5) to P(7), P(9), and P(10) are not used for generating play images.
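The relationship between the 4-millisecond pad data cycle and the 16-millisecond play image cycle described above can be sketched as follows (an illustrative Python sketch, not part of the disclosure; function and variable names are assumptions):

```python
# Illustrative sketch: with pad data arriving every 4 ms and play images
# generated every 16 ms, each generation tick uses only the most recently
# received pad data P.

PAD_PERIOD_MS = 4     # pad data P transmission cycle (example from the text)
FRAME_PERIOD_MS = 16  # play image generation cycle (example from the text)

pad_arrivals = [(n * PAD_PERIOD_MS, n) for n in range(11)]  # (time, order number)

def latest_pad_before(t, arrivals):
    """Order number of the most recent pad data received at or before time t."""
    candidates = [n for (arrival, n) in arrivals if arrival <= t]
    return max(candidates) if candidates else None

# In this sketch, generation of play image m starts at t = m * FRAME_PERIOD_MS.
used = [latest_pad_before(m * FRAME_PERIOD_MS, pad_arrivals) for m in range(3)]
print(used)  # [0, 4, 8] -> matches G(1,0), G(2,4), G(3,8) in FIG. 2
```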
- the cloud server 10 transmits a packet indicating that play image generation has started to the terminal 12 at the timing when the play image generation based on the pad data P is started.
- the packet will be called a video sync packet (VSP).
- the VSP contains the order number m of the play image whose generation has started and the order number n of the pad data P on which the play image is based; such a VSP is hereinafter denoted VSP(m, n).
- the VSP according to the present embodiment is associated with the play image on a one-to-one basis.
- FIG. 2 shows that VSP(1, 0) is transmitted at the timing when generation of the play image with order number 1 starts, VSP(2, 4) at the timing when generation of the play image with order number 2 starts, and VSP(3, 8) at the timing when generation of the play image with order number 3 starts.
- in this embodiment, when a play image has been drawn in the frame buffer, the play image is encoded and the image data generated by the encoding is transmitted.
- the frame buffer of the cloud server 10 according to the present embodiment is configured as a multi-buffer so that the next play image can be drawn in parallel with the encoding of the play image that has been drawn.
- the period S shown in FIG. 2 corresponds to the period during which the play image is encoded and the image data is transmitted.
- hereinafter, the period in which the play image with order number m, generated based on the pad data P with order number n, is encoded and the resulting image data is transmitted is denoted S(m, n). That is, in FIG. 2, the period in which the play image with order number 1 is encoded and its image data transmitted corresponds to the period indicated as S(1, 0), and the corresponding period for the play image with order number 2 corresponds to the period indicated as S(2, 4).
- the image data generated by encoding a play image is associated with the order number of the pad data P on which the play image is based (n above) and the order number of the play image (m above).
- the cloud server 10 measures the time from the timing of receiving the pad data P to the timing of starting to generate the play image based on the pad data P. Then, the cloud server 10 generates interval data indicating the measured time. In this embodiment, the image data generated based on the play image is associated with the interval data generated in this way.
- FIG. 3 is a diagram schematically showing an example of the communication that occurs between the cloud server 10 and the terminal 12 between the transmission of the pad data P(4), whose order number is 4, and the reception of the image data generated based on the pad data P(4). In the example of FIG. 3, the image data transmitted during the period indicated by S(2, 4) in FIG. 2 is indicated by D(2, 4).
- the terminal 12 specifies the packet reception time, which is the time from the timing when the pad data P is transmitted to the timing when the VSP associated with the pad data P is received.
- the packet reception time is also expressed as PadVspRTT, as shown in FIG.
- the terminal 12 according to this embodiment identifies PadVspRTT, for example, based on the order number of the pad data P included in the VSP received from the cloud server 10 .
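The PadVspRTT measurement described above can be sketched as follows (an illustrative Python sketch, not part of the disclosure; the class and method names are assumptions). The terminal records the send time of each pad data P by order number, and on receiving a VSP looks up the send time of the pad data P whose order number the VSP carries:

```python
import time

class PadVspClock:
    """Tracks pad data send times and derives PadVspRTT from received VSPs."""

    def __init__(self):
        self.send_times = {}  # order number n -> send timestamp (seconds)

    def on_pad_sent(self, n, now=None):
        self.send_times[n] = time.monotonic() if now is None else now

    def on_vsp_received(self, vsp_pad_number, now=None):
        """Return PadVspRTT for the pad data P the VSP refers to."""
        now = time.monotonic() if now is None else now
        return now - self.send_times[vsp_pad_number]

clock = PadVspClock()
clock.on_pad_sent(4, now=100.000)            # pad data P(4) sent at t = 100.000 s
rtt = clock.on_vsp_received(4, now=100.035)  # VSP(2, 4) received 35 ms later
print(round(rtt * 1000))  # 35 (PadVspRTT in milliseconds)
```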
- hereinafter, the time from the transmission of the pad data P at the terminal 12 to the reception of the first segment of the image data generated based on that pad data P is called PadFirstFragRTT.
- the time from the transmission of the pad data P at the terminal 12 to the reception of the last segment of the image data generated based on that pad data P is called PadFrameRTT.
- the time taken to receive the image data itself (from the first segment to the last segment) is called TransferTime, hereinafter expressed as TT.
- the terminal 12 identifies the value of TT based on the image data received from the cloud server 10.
- the terminal 12 may specify the value of PadFirstFragRTT and the value of PadFrameRTT based on the order number of the pad data P associated with the image data, and may then specify the value of TT by subtracting the value of PadFirstFragRTT from the value of PadFrameRTT.
- when one frame is divided into a plurality of slices, the value of TT may be specified in consideration of the number of slices per frame.
- for example, the TT may be specified based on the fact that a predetermined non-communication time occurs from the end of transmission of the image data for one slice to the start of transmission of the image data for the next slice.
- in this case, the value obtained by subtracting the value of PadFirstFragRTT from the value of PadFrameRTT, and then subtracting the product of (the number of slices per frame - 1) and the above-described non-communication time, is specified as the value of TT.
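The TT computation described above can be sketched as follows (an illustrative Python sketch, not part of the disclosure; the function name and example numbers are assumptions):

```python
# TT = PadFrameRTT - PadFirstFragRTT, optionally corrected for the fixed
# non-communication gap between slices when a frame is sent as multiple slices.

def transfer_time(pad_frame_rtt, pad_first_frag_rtt,
                  slices_per_frame=1, gap=0.0):
    """TT in seconds, per the subtraction described in the text."""
    tt = pad_frame_rtt - pad_first_frag_rtt
    # Subtract the known idle time between consecutive slices.
    return tt - (slices_per_frame - 1) * gap

print(round(transfer_time(0.040, 0.020), 6))                              # 0.02
print(round(transfer_time(0.040, 0.020, slices_per_frame=4, gap=0.002), 6))  # 0.014
```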
- the time from the timing at which the cloud server 10 receives the pad data P to the timing at which the play image is generated based on the pad data P is called PRecvGStartTime.
- the terminal 12 identifies PRecvGStartTime based on the interval data associated with the image data.
- the time from the timing at which the terminal 12 receives the VSP to the timing at which the next pad data P is transmitted is called DiffPadVsp.
- in the cloud gaming service provided by the cloud gaming system 1, it is desirable that moving images displayed on the terminal 12 be of as high quality as possible, and therefore that the data size of the image data generated based on the play images forming the moving image be as large as possible.
- on the other hand, in order to reduce the user's sense of incongruity in operation, it is important that each play image be displayed on the terminal 12 smoothly and with low delay, even if the data size of the image data must be reduced to achieve this.
- in this embodiment, the quality of uplink communication from the terminal 12 to the cloud server 10 and the quality of downlink communication from the cloud server 10 to the terminal 12 do not necessarily change together.
- in communication environments such as 4G and 5G, it is likely that one of the uplink and the downlink has good communication quality while the other has poor communication quality.
- the bit rate control shown in FIG. 4 is performed so that the cloud server 10 generates image data of an appropriate data size considering the delay.
- FIG. 4 is an explanatory diagram illustrating an example of bit rate control performed in the cloud gaming system 1 according to this embodiment.
- As shown in FIG. 4, in the bit rate control according to this embodiment, two types of control are performed: band fluctuation follow-up control and packet clogging reduction control.
- In the band fluctuation follow-up control, the bit rate is controlled so that the TT value of each frame approaches half the reciprocal of the frame rate (FPS) of the moving image to be transmitted, i.e., 1/(FPS × 2). For example, when the frame rate of a moving image to be transmitted is 60 fps, the bit rate in communication of the moving image is controlled so that TT approaches approximately 8 milliseconds.
- Specifically, the bit rate value R is determined by PID control in which 1/(FPS × 2) is the target value, TT is the current value, and the bit rate value R is the manipulated variable.
- That is, R = PID(TT - (1/(FPS × 2))).
- The value of R is controlled to decrease as the value of TT - (1/(FPS × 2)) increases.
- That is, when the value of TT is larger than the value of 1/(FPS × 2), the value of R is controlled to be small, and when the value of TT is smaller than the value of 1/(FPS × 2), the value of R is controlled to be large.
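A minimal sketch of this follow-up step is given below. The PID gains, the time step, and the incremental update form are illustrative assumptions; the patent specifies only the target, current value, and manipulated variable.

```python
class Pid:
    """Minimal textbook PID controller; gains are illustrative assumptions."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt):
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

def follow_band(pid, bitrate_r, tt, fps, dt=0.016):
    """One band-fluctuation-follow-up step: R = PID(TT - 1/(FPS * 2)).
    A positive error (the frame took too long) lowers R; a negative error raises it."""
    error = tt - 1.0 / (fps * 2)
    return bitrate_r - pid.step(error, dt)
```

For 60 fps the target TT is 1/120 s, roughly 8.3 milliseconds, so a measured TT of 10 ms pushes R down and a TT of 5 ms lets it grow.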
- In addition, the value FilPadVspRTT is specified by passing the PadVspRTT time-series data through a predetermined low-pass filter to remove fine noise.
- bit rate reduction amount value D is determined by PD control using the EstBtmLatency value described later as the target value, the FilPadVspRTT value as the current value, and the bit rate reduction amount value D as the manipulated variable.
- PD control is performed instead of PID control in order to quickly follow band fluctuations.
- That is, D = PD(FilPadVspRTT - EstBtmLatency).
- The value of D is controlled to increase as the value of FilPadVspRTT - EstBtmLatency increases.
- That is, when the value of FilPadVspRTT is larger than the value of EstBtmLatency, the value of D is controlled to be large, and when the value of FilPadVspRTT is smaller than the value of EstBtmLatency, the value of D is controlled to be small.
- The value of EstBtmLatency is set to a predetermined value in the initial state. Each time PadVspRTT is specified, the absolute value V of the difference between the latest PadVspRTT (PadVspRTT[n]) and the immediately preceding PadVspRTT (PadVspRTT[n-1]) is specified. Then, for a predetermined number N and a predetermined threshold Th1, the value of EstBtmLatency is updated when the state in which the absolute value V is less than Th1 occurs N times consecutively.
- the value of PadVspRTT when the band is stable to some extent is set as the value of EstBtmLatency. Therefore, the greater the divergence between the current value of PadVspRTT and the value of PadVspRTT in the stable state, the larger the value of D to be determined.
- the value B obtained by subtracting the bit rate reduction amount value D from the bit rate value R determined as described above is specified as the final bit rate value.
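Putting these pieces together (the low-pass filter, the EstBtmLatency update, the reduction term D, and B = R - D), a minimal sketch might look as follows. The filter constant, N, Th1, and the gain are all assumptions, and a pure proportional term stands in for the PD control:

```python
class ClogTerm:
    """Sketch of the bit rate reduction amount D and the final bitrate B = R - D.
    alpha (filter), n (N), th1 (Th1) and kp are illustrative values."""
    def __init__(self, est_btm_latency=0.020, n=8, th1=0.002, alpha=0.2, kp=5e8):
        self.est = est_btm_latency   # EstBtmLatency: PadVspRTT of a stable link
        self.n, self.th1, self.alpha, self.kp = n, th1, alpha, kp
        self.filtered = None         # FilPadVspRTT
        self.prev_rtt = None
        self.stable_count = 0

    def update(self, pad_vsp_rtt):
        # FilPadVspRTT: exponential moving average as a simple low-pass filter.
        self.filtered = (pad_vsp_rtt if self.filtered is None else
                         self.alpha * pad_vsp_rtt + (1 - self.alpha) * self.filtered)
        # Refresh EstBtmLatency after N consecutive near-constant samples.
        if self.prev_rtt is not None and abs(pad_vsp_rtt - self.prev_rtt) < self.th1:
            self.stable_count += 1
            if self.stable_count >= self.n:
                self.est = self.filtered
                self.stable_count = 0
        else:
            self.stable_count = 0
        self.prev_rtt = pad_vsp_rtt
        # P-only stand-in for the PD control: D grows with the divergence.
        return max(self.kp * (self.filtered - self.est), 0.0)

def final_bitrate(r, d):
    return r - d                     # B = R - D
```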
- Band fluctuation tracking control may be executed at a predetermined timing. For example, band fluctuation tracking control may be performed at a predetermined cycle (eg, 16 millisecond cycle). Also, band fluctuation tracking control may be executed in response to occurrence of a predetermined event (for example, reception of the last segment of image data).
- In this embodiment, the terminal 12 monitors reception of the VSP, which is expected at the transmission cycle of the pad data P. When the number M of VSPs received in the most recent predetermined time t1 is smaller than a predetermined threshold Th2, the band fluctuation follow-up control is interrupted. For example, when the number of receptions M in the most recent 100 milliseconds is less than 5, or when no VSP is received in the most recent 80 milliseconds, the band fluctuation follow-up control is interrupted.
- In this case, packet clogging reduction control, which is a process of multiplying the bit rate value B described above by a predetermined ratio r (r is less than 1), is executed instead. Note that the processing executed in the packet clogging reduction control need not be multiplication of the value B by the ratio r; any processing that reduces the value B may be used.
- However, when the value B reaches a predetermined lower limit b1, the value B is controlled so as not to become smaller than b1.
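One clogging-reduction step with the lower limit b1 might be sketched as follows; the ratio r and the limit b1 are illustrative values, not from the patent:

```python
def clog_reduce(b, r=0.9, b1=1_000_000):
    """Multiply B by r (r < 1) but never let it fall below the lower limit b1."""
    return max(b * r, b1) if b > b1 else b
```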
- the packet clogging reduction control may be executed at a predetermined timing.
- packet clogging reduction control may be executed at a predetermined cycle (for example, the transmission cycle of pad data P).
- the packet clogging reduction control may be executed in response to occurrence of a predetermined event (for example, transmission of pad data P).
- retransmission and buffering control are performed generously, so data tends to accumulate in the buffers of relay devices such as base stations.
- data transmitted from the cloud server 10 tends to accumulate in a relay device such as a base station.
- the data stored in this way is sent all at once when downlink communication returns to normal.
- Loss of received data at the terminal 12 is then caused by an increase in delay due to the discharge of the stored data, and by an overflow of the reception buffer of the terminal 12 due to the terminal 12 receiving a large amount of data at once.
- With the packet clogging reduction control described above, the amount of data retained in the computer network 14 (packet clogging) when communication becomes impossible due to degradation of the communication environment is reduced, so that the occurrence of loss of received data can be suppressed.
- In this embodiment, the cloud server 10 sets the bit rate of the moving image to be transmitted to the value B determined by the band fluctuation follow-up control or to the value B updated by the packet clogging reduction control, and changes the compression ratio in the encoding of the frame images accordingly.
- the data size of generated image data is controlled.
- the cloud server 10 generates image data having an appropriate data size in consideration of delay.
- the data size of image data to be transmitted from now on by the cloud server 10 is controlled based on the packet reception time (PadVspRTT). By doing so, it is possible to immediately reduce the bit rate of the moving image in response to a sudden drop in throughput.
- It is desirable that the time from the reception of the operation data in the cloud server 10 to the start of generation of the play image be as constant as possible in order to reduce the user's sense of incongruity in the operation.
- the time it takes for the operation data sent from the terminal 12 to reach the cloud server 10 may vary. This is particularly noticeable in wireless communication using mobile communication systems such as 4G and 5G, which have large band fluctuations.
- FIG. 5 is an explanatory diagram illustrating an example of pad sync control performed in the cloud gaming system 1 according to this embodiment.
- In the pad sync control, the transmission timing of the pad data P is controlled so that the value indicating DiffPadVsp (see FIG. 2) approaches a predetermined value T1 (for example, a value that is 1.5 times the transmission cycle of the pad data P).
- the transmission period T of the pad data P is determined by PD control using the value T1 as the target value, the value of DiffPadVsp as the current value, and the transmission period T of the pad data P as the manipulated variable.
- That is, T = PD(DiffPadVsp - T1).
- The value of T is controlled to decrease as the value of (DiffPadVsp - T1) increases.
- That is, when the value of DiffPadVsp is greater than the value of T1, the value of T is controlled to be decreased, and when the value of DiffPadVsp is less than the value of T1, the value of T is controlled to be increased.
- the pad data P may be transmitted at the determined transmission cycle T.
- the pad data P may be transmitted at the timing when the time corresponding to the determined transmission period T has elapsed from the latest transmission timing of the pad data P.
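One step of this first control could be sketched as below. The gains, the time step, and the incremental update form are assumptions; the patent specifies only the PD relation between DiffPadVsp, T1, and T.

```python
def first_control(t, diff_pad_vsp, prev_error, t1, kp=0.1, kd=0.0, dt=0.016):
    """One step of the first pad sync control: steer the transmission period T
    so that DiffPadVsp approaches the fixed target T1.
    Returns (new T, error to feed into the next call)."""
    error = diff_pad_vsp - t1
    correction = kp * error + kd * (error - prev_error) / dt   # PD term
    # DiffPadVsp above T1 -> shorten the period (send pads sooner); below -> lengthen.
    return t - correction, error
```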
- When a predetermined condition is satisfied, for example when a state in which a value indicating variation in the value of DiffPadVsp (for example, the absolute value of DiffPadVsp - T1) is less than a predetermined threshold Th3 has continued for a predetermined time t2 or longer, the pad sync control shifts from the first control to the second control.
- In the second control, the value T1, which was a fixed value in the first control, is made variable so that the value indicating the time (PRecvGStartTime) from the timing at which the cloud server 10 receives the pad data P to the timing at which the play image is generated based on the pad data P approaches a predetermined value T2 (for example, 1 millisecond).
- the value T1 in this case is expressed as T1_adj.
- the value T1_adj is determined by PD control with the value T2 as the target value, the value of PRecvGStartTime as the current value, and the value T1_adj as the manipulated variable.
- That is, T1_adj = PD(PRecvGStartTime - T2).
- the transmission period T of the pad data P is determined by PD control using the value T1_adj as the target value, the value of DiffPadVsp as the current value, and the transmission period T of the pad data P as the manipulated variable.
- That is, T = PD(DiffPadVsp - T1_adj).
- In other words, T = PD(DiffPadVsp - PD(PRecvGStartTime - T2)).
- the pad data P may be transmitted at the transmission cycle T determined as described above.
- the pad data P may be transmitted at the timing when the time corresponding to the determined transmission period T has elapsed from the latest transmission timing of the pad data P.
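The nested relation T = PD(DiffPadVsp - PD(PRecvGStartTime - T2)) can be sketched as below, with the derivative terms dropped for brevity; all gains are assumptions, not values from the patent:

```python
def second_control(t, diff_pad_vsp, precv_gstart_time, t2=0.001,
                   kp_inner=0.5, kp_outer=0.1):
    """Second pad sync control: the inner loop turns the fixed target T1 into
    a moving target T1_adj that drives PRecvGStartTime toward T2 (~1 ms);
    the outer loop then steers the transmission period T toward T1_adj."""
    t1_adj = kp_inner * (precv_gstart_time - t2)   # T1_adj = PD(PRecvGStartTime - T2)
    return t - kp_outer * (diff_pad_vsp - t1_adj)  # T = PD(DiffPadVsp - T1_adj)
```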
- When a predetermined condition is satisfied, for example when a state in which a value indicating variation in the value of DiffPadVsp (for example, the absolute value of DiffPadVsp - T1) is equal to or greater than the predetermined threshold Th3 occurs, the pad sync control shifts from the second control to the first control.
- Pad sync control may be executed at a predetermined timing. For example, pad sync control may be performed at a predetermined cycle (for example, a 16 millisecond cycle). Also, pad sync control may be executed in response to occurrence of a predetermined event (for example, reception of the last segment of image data).
- delay reduction can be achieved by synchronizing the transmission timing of the pad data P and the generation timing of the play image by pad sync control. In this way, according to the present embodiment, the user's sense of discomfort in a situation where a moving image generated by the cloud server 10 is displayed on the terminal 12 in response to an operation on the terminal 12 is reduced.
- PRecvGStartTime can be stabilized by executing the above-described first control and second control.
- FIG. 6 is a functional block diagram showing an example of functions implemented in the cloud gaming system 1 according to this embodiment. Note that the cloud gaming system 1 according to the present embodiment does not need to implement all the functions shown in FIG. 6, and functions other than the functions shown in FIG. 6 may be installed.
- As shown in FIG. 6, the cloud server 10 functionally includes, for example, a server-side control data storage unit 20, an operation data reception unit 22, a frame image generation unit 24, a VSP transmission unit 26, an encoding processing execution unit 28, an image data transmission unit 30, and a server-side traffic control unit 32.
- the cloud server 10 plays a role as a moving image transmission device that sequentially transmits image data representing frame images that constitute a moving image.
- the server-side control data storage unit 20 is mainly implemented by the storage unit 10b.
- the operation data reception unit 22, the VSP transmission unit 26, and the image data transmission unit 30 are mainly implemented by the communication unit 10c.
- the frame image generator 24 and the server-side traffic controller 32 are mainly implemented by the processor 10a.
- the encoding processing execution unit 28 is mainly implemented by a processor 10a and an encoder/decoder unit 10d.
- the above functions may be implemented by causing the processor 10a to execute a program containing instructions corresponding to the above functions, installed in the cloud server 10, which is a computer.
- This program may be supplied to the cloud server 10 via computer-readable information storage media such as optical discs, magnetic discs, magnetic tapes, magneto-optical discs, and flash memory, or via the Internet.
- As shown in FIG. 6, the terminal 12 functionally includes, for example, a terminal-side control data storage unit 40, an operation data generation unit 42, an operation data transmission unit 44, a VSP reception unit 46, an image data reception unit 48, a decoding processing execution unit 50, a frame image display control unit 52, a terminal-side traffic control unit 54, and a transmission timing control unit 56.
- the terminal 12 plays a role as a moving image receiving device that sequentially receives image data representing frame images that constitute a moving image.
- the terminal-side control data storage unit 40 is mainly implemented by the storage unit 12b.
- the operation data generation unit 42 is mainly implemented by the processor 12a and the operation unit 12e.
- the operation data transmission unit 44, the VSP reception unit 46, and the image data reception unit 48 are mainly implemented by the communication unit 12c.
- the decoding processing execution unit 50 is mainly implemented by the processor 12a and the encoder/decoder unit 12h.
- the frame image display control section 52 is mainly implemented by the processor 12a and the display section 12d.
- the terminal-side traffic control unit 54 and the transmission timing control unit 56 are mainly implemented by the processor 12a.
- the above functions may be implemented by causing the processor 12a to execute a program containing instructions corresponding to the above functions, installed in the terminal 12, which is a computer.
- This program may be supplied to the terminal 12 via a computer-readable information storage medium such as an optical disk, magnetic disk, magnetic tape, magneto-optical disk, flash memory, or the like, or via the Internet.
- the server-side control data storage unit 20 stores, for example, control data indicating the bit rate value B described above.
- the operation data receiving unit 22 receives, for example, operation data (for example, pad data P described above) corresponding to the user's input operation.
- the operation data receiving unit 22 receives, for example, operation data corresponding to input operations during game play.
- the frame image generation unit 24 generates frame images based on the operation data received by the operation data reception unit 22, for example.
- the frame image generation unit 24 generates, based on the pad data P, for example, a play image representing the play situation of the game being played by the user.
- the VSP transmission unit 26 transmits the VSP, which is a packet associated with the operation data, to the terminal 12 in response to the start of frame image generation based on the operation data.
- the VSP transmission unit 26 transmits the VSP, for example, at the timing when frame image generation is started.
- the encoding processing execution unit 28 encodes the frame image generated by the frame image generation unit 24 to generate image data representing the frame image.
- the encoding processing execution unit 28 may determine a compression ratio such that the value indicating the bit rate of the moving image to be transmitted becomes the value B indicated by the control data stored in the server-side control data storage unit 20 . Then, the encoding process execution unit 28 may generate image data by encoding the frame image at the determined compression rate.
- the encoding process execution unit 28 may generate the interval data described above. Then, the encoding process execution unit 28 may associate the generated interval data with the generated image data.
- the image data transmission unit 30 transmits image data generated by the encoding processing execution unit 28 to the terminal 12 in this embodiment, for example.
- the image data transmission unit 30 may transmit the image data associated with the interval data to the terminal 12 .
- the server-side traffic control unit 32 controls the data size of image data to be transmitted by the image data transmission unit 30 from now on.
- For example, the server-side traffic control unit 32 may control the data size of the image data to be transmitted by the image data transmission unit 30 based on the value of TT, which indicates the time required for the terminal 12 to receive the image data.
- the server-side traffic control unit 32 may control the data size of image data to be transmitted by the image data transmission unit 30 based on the packet reception time (PadVspRTT) described above.
- the terminal-side control data storage unit 40 stores, for example, control data indicating the bit rate value B described above.
- the operation data generation unit 42 generates, for example, the above-described operation data according to the user's input operation.
- the operation data generation unit 42 may generate operation data associated with control data stored in the terminal-side control data storage unit 40 .
- the operation data transmission unit 44 transmits, for example, the operation data generated by the operation data generation unit 42 to the cloud server 10 in this embodiment.
- the operation data transmission unit 44 may transmit operation data associated with control data.
- the server-side traffic control unit 32 may acquire control data associated with the operation data received by the operation data reception unit 22 . Then, the server-side traffic control unit 32 may update the control data stored in the server-side control data storage unit 20 with the acquired control data.
- the operation data transmission unit 44 need not transmit the control data in association with the operation data, and may transmit the control data to the cloud server 10 independently of the operation data. Then, the operation data receiving section 22 may receive the control data transmitted in this manner. In this case, the server-side traffic control section 32 may update the control data stored in the server-side control data storage section 20 to the control data received by the operation data reception section 22 .
- The VSP receiving unit 46 receives, from the cloud server 10, the VSP, which is a packet associated with the operation data and is transmitted from the cloud server 10 in response to the start of frame image generation based on the operation data.
- the image data receiving unit 48 receives image data transmitted from the cloud server 10 in this embodiment, for example. As described above, the image data receiver 48 may receive image data associated with interval data.
- the decoding processing execution unit 50 decodes image data received by the image data receiving unit 48 to generate a frame image (for example, a play image) represented by the image data.
- the frame image display control unit 52 causes the display unit 12d to display a frame image (for example, a play image) generated by the decoding processing execution unit 50, for example.
- the terminal-side traffic control unit 54 controls the data size of image data to be transmitted by the cloud server 10 from now on.
- For example, the terminal-side traffic control unit 54 may control the data size of the image data to be transmitted by the image data transmission unit 30 based on the value of TT, which indicates the time required to receive the image data transmitted from the cloud server 10.
- For example, the terminal-side traffic control unit 54 may specify the above-described packet reception time (PadVspRTT), which is the time from the timing when the terminal 12 transmits the operation data to the timing when the VSP associated with the operation data is received. Then, the terminal-side traffic control unit 54 may control the data size of the image data to be transmitted by the image data transmission unit 30 based on the packet reception time.
- For example, the terminal-side traffic control unit 54 may execute the above-described band fluctuation follow-up control. Then, the terminal-side traffic control unit 54 may update the control data stored in the terminal-side control data storage unit 40 to control data indicating the value B determined by executing the band fluctuation follow-up control.
- Also, the terminal-side traffic control unit 54 may execute the above-described packet clogging reduction control. Then, the terminal-side traffic control unit 54 may update the control data stored in the terminal-side control data storage unit 40 to control data indicating the value B updated by executing the packet clogging reduction control.
- When it is determined, based on a predetermined condition, that VSP reception failure continues, the terminal-side traffic control unit 54 may control the image data transmission unit 30 so as to reduce the data size of the image data to be transmitted from now on.
- the packet clogging reduction control described above may be executed when the number M of received VSPs at the most recent predetermined time t1 is smaller than the predetermined threshold Th2.
- In this embodiment, the terminal-side traffic control unit 54 may hold a bit rate control mode flag that indicates the mode of bit rate control. Then, for example, the band fluctuation follow-up control may be executed when the value of the bit rate control mode flag is 0, and the packet clogging reduction control may be executed when the value of the bit rate control mode flag is 1.
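The flag handling described above (and detailed later in S301 to S307) could be dispatched as follows; the function name is illustrative, and Th2 = 5 matches the 100 millisecond example given earlier:

```python
def next_mode_flag(mode_flag, m, th2=5):
    """Flip the bit rate control mode flag from the recent VSP count M:
    0 = band fluctuation follow-up, 1 = packet clogging reduction."""
    if mode_flag == 0 and m < th2:
        return 1    # VSPs stopped arriving: switch to clogging reduction
    if mode_flag == 1 and m >= th2:
        return 0    # VSPs flowing again: resume follow-up control
    return mode_flag
```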
- the transmission timing control unit 56 controls, for example, the transmission timing of operation data in this embodiment.
- the transmission timing control unit 56 may control the time from the timing when the VSP is received until the next operation data is transmitted.
- the transmission timing control unit 56 executes the first control for controlling the timing at which the operation data is transmitted so that the time from the timing at which the VSP is received until the next operation data is transmitted achieves the first target.
- the first target corresponds to satisfying a predetermined condition such as a state in which the value indicating the variation in the value of DiffPadVsp is less than the predetermined threshold Th3 for a predetermined time t2 or longer.
- the transmission timing control unit 56 may control the timing at which the next operation data is transmitted based on the interval data.
- the transmission timing control section 56 may control the timing at which the next operation data is transmitted based on interval data associated with the image data received by the image data reception section 48 .
- the transmission timing control unit 56 may perform second control to control the timing of transmitting the operation data so that the time indicated by the interval data achieves the second target.
- the value of PRecvGStartTime being equal to the predetermined value T2 (eg, 1 millisecond) corresponds to the second target.
- the transmission timing control unit 56 may start the second control in response to the achievement of the above-described first target in the first control.
- the transmission timing control unit 56 may control the transmission cycle T of the operation data to be transmitted based on the interval data.
- the image data transmission unit 30 does not need to transmit the image data associated with the interval data.
- For example, the image data transmission unit 30 may transmit the interval data to the terminal 12 independently of the image data. Then, the image data receiving section 48 may receive the interval data transmitted in this manner. Then, based on the interval data received from the cloud server 10 in this way, the transmission timing control unit 56 may control the timing at which the next operation data is transmitted.
- the transmission timing control unit 56 may hold a pad sync control mode flag indicating the pad sync control mode. Then, for example, the first control may be executed when the value of the pad sync control mode flag is 0, and the second control may be executed when the value of the pad sync control mode flag is 1.
- the transmission timing control unit 56 may, for example, output a transmission command to the operation data transmission unit 44 at the transmission cycle T determined as described above. Then, the operation data transmission unit 44 may transmit the operation data to the cloud server 10 in response to reception of the transmission command.
- the terminal-side traffic control unit 54 waits until a predetermined execution timing related to bandwidth fluctuation follow-up control arrives (S101).
- the terminal-side traffic control unit 54 identifies the value of TT for the latest image data received by the image data receiving unit 48 (S102).
- the terminal-side traffic control unit 54 identifies the above-described value R based on the TT identified in the processing shown in S102 (S103).
- the terminal-side traffic control unit 54 identifies the latest PadVspRTT value (S104).
- the terminal-side traffic control unit 54 identifies the absolute value V of the difference between the latest PadVspRTT value and the previous PadVspRTT value (S105).
- the terminal-side traffic control unit 54 confirms whether or not a state in which the absolute value V is less than Th1 has occurred N times consecutively (S106).
- the terminal-side traffic control unit 54 updates the value of EstBtmLatency (S107).
- the terminal-side traffic control unit 54 specifies the value of FilPadVspRTT (S108).
- the terminal-side traffic control unit 54 identifies the above value D based on the value of FilPadVspRTT identified in the processing shown in S108 and the latest value of EstBtmLatency (S109).
- the terminal-side traffic control unit 54 identifies the value B based on the value R identified in the processing shown in S103 and the value D identified in the processing shown in S109 (S110).
- Then, the terminal-side traffic control unit 54 updates the control data stored in the terminal-side control data storage unit 40 so that the value B indicated by the control data becomes the value B specified in the processing shown in S110 (S111), and the process returns to S101.
- the terminal-side traffic control unit 54 waits until a predetermined execution timing related to packet clogging reduction control arrives (S201).
- the terminal-side traffic control unit 54 specifies the bit rate value B indicated by the control data stored in the terminal-side control data storage unit 40 (S202).
- the terminal-side traffic control unit 54 confirms whether or not the value B specified in the processing shown in S202 is smaller than the lower limit b1 (S203).
- In this case, the terminal-side traffic control unit 54 specifies, as a new value B, the value obtained by multiplying the value B specified in the processing shown in S202 by the predetermined ratio r (r is less than 1) (S204).
- Then, the terminal-side traffic control unit 54 updates the control data stored in the terminal-side control data storage unit 40 so that the value B indicated by the control data becomes the value B specified in the processing shown in S204 (S205), and the process returns to S201.
- the terminal-side traffic control unit 54 waits until a predetermined execution timing for switching the bit rate control mode arrives (S301).
- the execution timing may come at a predetermined cycle (for example, the transmission cycle T of the operation data).
- the execution timing may come in accordance with the occurrence of a predetermined event (for example, transmission of operation data).
- the terminal-side traffic control unit 54 identifies the number M of received VSPs at the most recent predetermined time t1 (S302).
- the terminal-side traffic control unit 54 checks the current value of the bit rate control mode flag (S303).
- Then, the terminal-side traffic control unit 54 confirms whether or not the number of receptions M specified in the processing shown in S302 is smaller than the predetermined threshold Th2 (S304).
- reception number M is equal to or greater than the predetermined threshold Th2 (S304: N)
- the process returns to S301.
- In this case, the terminal-side traffic control unit 54 changes the value of the held bit rate control mode flag to 1 (S305), and the process returns to S301.
- Then, the terminal-side traffic control unit 54 checks whether or not the number of receptions M specified in the processing shown in S302 is equal to or greater than the predetermined threshold Th2 (S306).
- reception number M is smaller than the predetermined threshold Th2 (S306: N)
- the process returns to S301.
- In this case, the terminal-side traffic control unit 54 changes the value of the held bit rate control mode flag to 0 (S307), and the process returns to S301.
- the transmission timing control unit 56 waits until a predetermined execution timing related to pad sync control arrives (S401).
- the transmission timing control unit 56 identifies the latest DiffPadVsp value (S402).
- the transmission timing control unit 56 checks the current value of the pad sync control mode flag (S403).
- In this case, the transmission timing control unit 56 determines a new transmission period T so that the value of DiffPadVsp specified in the processing shown in S402 becomes T1 (S404).
- the transmission timing control unit 56 identifies the absolute value V of the difference between the latest DiffPadVsp value and the value T1 (S405).
- the transmission timing control unit 56 identifies the time during which the state in which the absolute value V is less than the predetermined threshold Th3 continues (S406).
- the transmission timing control unit 56 confirms whether or not the time specified in the processing shown in S406 has reached a predetermined time (S407).
- In this case, the transmission timing control unit 56 changes the value of the held pad sync control mode flag to 1 (S408), and the process returns to S401.
- The transmission timing control unit 56 identifies the value of the interval data associated with the latest image data (S409).
- the transmission timing control unit 56 determines a new transmission period T based on the value of the interval data specified in the processing shown in S409 (S410).
- the transmission timing control unit 56 identifies the absolute value V of the difference between the latest DiffPadVsp value and the value T1 (S411).
- the transmission timing control unit 56 confirms whether or not the specified absolute value V is equal to or greater than a predetermined threshold Th3 (S412).
- In this case, the transmission timing control unit 56 changes the value of the held pad sync control mode flag to 0 (S413), and the process returns to S401.
- the interval data may be associated with the VSP instead of being associated with the image data.
- the frame image generation unit 24 may generate a frame image with a data size corresponding to the bit rate.
- the scope of application of the present invention is not limited to the computer network 14 including mobile communication systems such as 4G and 5G.
- the present invention can also be applied to the computer network 14 in which wireless communication is performed by Wi-Fi (registered trademark) instead of wireless communication via mobile communication systems such as 4G and 5G.
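The mode-switching behavior described in S403 through S413 can be sketched roughly as follows. This is a minimal illustration only: the target value T1, the threshold Th3, the required stable duration, and the control law for deriving the transmission period are all assumptions for the sketch, not the actual implementation.

```python
# Sketch of the pad sync control mode switching described in S403-S413.
# T1, Th3, and REQUIRED_STABLE_MS are illustrative assumptions.

T1 = 16.7                      # target DiffPadVsp value (ms), assumed
Th3 = 2.0                      # stability threshold (ms), assumed
REQUIRED_STABLE_MS = 500.0     # how long V must stay below Th3 before switching, assumed

class TransmissionTimingController:
    def __init__(self):
        self.pad_sync_mode = 0    # pad sync control mode flag (S403)
        self.stable_since = None  # time at which |DiffPadVsp - T1| first fell below Th3

    def step(self, diff_pad_vsp, interval_data, now_ms):
        """Return a new transmission period T for this iteration."""
        v = abs(diff_pad_vsp - T1)  # absolute value V (S405 / S411)
        if self.pad_sync_mode == 0:
            # S404: steer DiffPadVsp toward T1 (placeholder control law)
            new_t = max(1.0, 16.7 + (diff_pad_vsp - T1) * 0.5)
            # S406-S408: switch to mode 1 once V stays below Th3 long enough
            if v < Th3:
                if self.stable_since is None:
                    self.stable_since = now_ms
                if now_ms - self.stable_since >= REQUIRED_STABLE_MS:
                    self.pad_sync_mode = 1
            else:
                self.stable_since = None
        else:
            # S409-S410: derive T from the interval data of the latest image data
            new_t = interval_data
            # S412-S413: fall back to mode 0 if V reaches Th3 or more
            if v >= Th3:
                self.pad_sync_mode = 0
                self.stable_since = None
        return new_t
```

Under these assumptions, the controller stays in mode 0 while DiffPadVsp is still converging, and only trusts the sender-supplied interval data (mode 1) after the error has remained small for the required duration.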
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Business, Economics & Management (AREA)
- Computer Security & Cryptography (AREA)
- General Business, Economics & Management (AREA)
- Databases & Information Systems (AREA)
- Data Exchanges In Wide-Area Networks (AREA)
- Information Transfer Between Computers (AREA)
Abstract
Description
(S104).
(S106: N), or when the processing shown in S107 has been executed, the terminal-side traffic control unit 54 identifies the value of FilPadVspRTT (S108).
Claims (10)
- A moving image reception device that sequentially receives, from a moving image transmission device, image data representing frame images constituting a moving image, the moving image reception device comprising:
an operation data transmission unit that transmits operation data corresponding to a user's input operation to the moving image transmission device;
a packet reception unit that receives a packet associated with the operation data, the packet being transmitted from the moving image transmission device in response to the start of generation of the frame image based on the operation data; and
a data size control unit that controls the data size of image data to be transmitted by the moving image transmission device, based on a packet reception time that is the time from the timing at which the operation data was transmitted until the timing at which the packet associated with the operation data was received.
- The moving image reception device according to claim 1, wherein the data size control unit controls the data size based on the time required to receive the image data transmitted from the moving image transmission device.
- The moving image reception device according to claim 1 or 2, wherein the data size control unit controls the data size based on the packet reception time for the latest reception of the packet and the packet reception time for at least one reception of the packet prior to the latest reception.
- The moving image reception device according to any one of claims 1 to 3, wherein the data size control unit performs control so that the data size is reduced when it is determined, based on a predetermined condition, that reception of the packet continues to fail.
- A moving image transmission device that sequentially transmits, to a moving image reception device, image data representing frame images constituting a moving image, the moving image transmission device comprising:
an operation data reception unit that receives operation data corresponding to a user's input operation from the moving image reception device;
an image generation unit that generates the frame image based on the operation data;
a packet transmission unit that transmits a packet associated with the operation data to the moving image reception device in response to the start of generation of the frame image;
an encoding processing execution unit that generates image data representing the frame image by encoding the frame image;
an image data transmission unit that transmits the image data to the moving image reception device; and
a data size control unit that controls the data size of image data to be transmitted by the image data transmission unit, based on the time from the timing at which the moving image reception device transmitted the operation data until the timing at which the moving image reception device received the packet associated with the operation data.
- A moving image transmission/reception system including a moving image transmission device that sequentially transmits image data representing frame images constituting a moving image, and a moving image reception device that sequentially receives the image data, wherein
the moving image reception device includes:
an operation data transmission unit that transmits operation data corresponding to a user's input operation to the moving image transmission device;
a packet reception unit that receives a packet associated with the operation data, the packet being transmitted from the moving image transmission device in response to the start of generation of a frame image based on the operation data; and
a control data transmission unit that transmits, to the moving image transmission device, control data for controlling the data size of the image data to be transmitted by the moving image transmission device, based on a packet reception time that is the time from the timing at which the operation data was transmitted until the timing at which the packet associated with the operation data was received, and
the moving image transmission device includes:
an operation data reception unit that receives the operation data from the moving image reception device;
an image generation unit that generates the frame image based on the operation data;
a packet transmission unit that transmits the packet associated with the operation data to the moving image reception device in response to the start of generation of the frame image;
an encoding processing execution unit that generates image data representing the frame image by encoding the frame image;
an image data transmission unit that transmits the image data to the moving image reception device;
a control data reception unit that receives the control data; and
a data size control unit that controls, based on the control data, the data size of image data to be transmitted by the image data transmission unit.
- A control method comprising:
a step in which a moving image reception device that sequentially receives, from a moving image transmission device, image data representing frame images constituting a moving image transmits operation data corresponding to a user's input operation to the moving image transmission device;
a step in which the moving image reception device receives a packet associated with the operation data, the packet being transmitted from the moving image transmission device in response to the start of generation of the frame image based on the operation data; and
a step in which the moving image reception device controls the data size of image data to be transmitted by the moving image transmission device, based on a packet reception time that is the time from the timing at which the operation data was transmitted until the timing at which the packet associated with the operation data was received.
- A control method comprising:
a step in which a moving image transmission device that sequentially transmits, to a moving image reception device, image data representing frame images constituting a moving image receives operation data corresponding to a user's input operation from the moving image reception device;
a step in which the moving image transmission device generates the frame image based on the operation data;
a step in which the moving image transmission device transmits a packet associated with the operation data to the moving image reception device in response to the start of generation of the frame image;
a step in which the moving image transmission device generates image data representing the frame image by encoding the frame image;
a step in which the moving image transmission device transmits the image data to the moving image reception device; and
a step in which the moving image transmission device controls the data size of image data to be transmitted by the moving image transmission device, based on the time from the timing at which the moving image reception device transmitted the operation data until the timing at which the moving image reception device received the packet associated with the operation data.
- A program for causing a computer that sequentially receives, from a moving image transmission device, image data representing frame images constituting a moving image to execute:
a procedure of transmitting operation data corresponding to a user's input operation to the moving image transmission device;
a procedure of receiving a packet associated with the operation data, the packet being transmitted from the moving image transmission device in response to the start of generation of the frame image based on the operation data; and
a procedure of controlling the data size of image data to be transmitted by the moving image transmission device, based on a packet reception time that is the time from the timing at which the operation data was transmitted until the timing at which the packet associated with the operation data was received.
- A program for causing a computer that sequentially transmits, to a moving image reception device, image data representing frame images constituting a moving image to execute:
a procedure of receiving operation data corresponding to a user's input operation from the moving image reception device;
a procedure of generating the frame image based on the operation data;
a procedure of transmitting a packet associated with the operation data to the moving image reception device in response to the start of generation of the frame image;
a procedure of generating image data representing the frame image by encoding the frame image;
a procedure of transmitting the image data to the moving image reception device; and
a procedure of controlling the data size of image data to be transmitted by the computer, based on the time from the timing at which the moving image reception device transmitted the operation data until the timing at which the moving image reception device received the packet associated with the operation data.
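The receiver-side control in claim 1 measures the time from sending operation data until receiving the packet associated with it, and uses that delay to decide how large the next image data may be. The following sketch illustrates one way such a controller could look; the thresholds, the linear control law, and all names are assumptions for illustration, not the claimed implementation.

```python
# Illustrative sketch of packet-reception-time-based data size control (claim 1).
# good_ms / bad_ms thresholds and the linear interpolation are assumed values.

import time

class DataSizeController:
    def __init__(self, max_bitrate_kbps=20000, min_bitrate_kbps=2000):
        self.max_bitrate = max_bitrate_kbps
        self.min_bitrate = min_bitrate_kbps
        self.pending = {}  # operation-data id -> send timestamp (seconds)

    def on_operation_data_sent(self, op_id, now=None):
        # Record when the operation data left the reception device.
        self.pending[op_id] = time.monotonic() if now is None else now

    def on_packet_received(self, op_id, now=None):
        """Compute the packet reception time and derive a target bit rate."""
        sent_at = self.pending.pop(op_id, None)
        if sent_at is None:
            return None  # no matching operation data recorded
        now = time.monotonic() if now is None else now
        rtt_ms = (now - sent_at) * 1000.0
        return self.target_bitrate(rtt_ms)

    def target_bitrate(self, rtt_ms, good_ms=30.0, bad_ms=150.0):
        # Full rate at/below good_ms, minimum rate at/above bad_ms,
        # linear interpolation in between.
        if rtt_ms <= good_ms:
            return self.max_bitrate
        if rtt_ms >= bad_ms:
            return self.min_bitrate
        frac = (rtt_ms - good_ms) / (bad_ms - good_ms)
        return int(self.max_bitrate - frac * (self.max_bitrate - self.min_bitrate))
```

Because the packet is sent when frame generation starts, before encoding and image data transfer, this measurement approximates the network round-trip delay alone, which is why it can drive the data size of image data that has not yet been sent.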
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22795639.8A EP4331693A1 (en) | 2021-04-30 | 2022-04-19 | Moving image reception device, moving image transmission device, moving image transmission/reception system, control method, and program |
JP2023517462A JPWO2022230726A1 (ja) | 2021-04-30 | 2022-04-19 | |
CN202280019271.4A CN117063477A (zh) | 2021-04-30 | 2022-04-19 | 运动图像接收设备、运动图像发送设备、运动图像发送/接收系统、控制方法和程序 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-077838 | 2021-04-30 | ||
JP2021077838 | 2021-04-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022230726A1 (ja) | 2022-11-03 |
Family
ID=83846872
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/018208 WO2022230726A1 (ja) | Moving image reception device, moving image transmission device, moving image transmission/reception system, control method, and program |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP4331693A1 (ja) |
JP (1) | JPWO2022230726A1 (ja) |
CN (1) | CN117063477A (ja) |
WO (1) | WO2022230726A1 (ja) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017179230A1 (ja) * | 2016-04-14 | 2017-10-19 | Sony Interactive Entertainment Inc. | Reception device, transmission device, control method, transmission method, and program |
WO2021039983A1 (ja) * | 2019-08-29 | 2021-03-04 | Sony Interactive Entertainment Inc. | Transmission device, transmission method, and program |
2022
- 2022-04-19 WO PCT/JP2022/018208 patent/WO2022230726A1/ja active Application Filing
- 2022-04-19 JP JP2023517462A patent/JPWO2022230726A1/ja active Pending
- 2022-04-19 EP EP22795639.8A patent/EP4331693A1/en active Pending
- 2022-04-19 CN CN202280019271.4A patent/CN117063477A/zh active Pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017179230A1 (ja) * | 2016-04-14 | 2017-10-19 | Sony Interactive Entertainment Inc. | Reception device, transmission device, control method, transmission method, and program |
WO2021039983A1 (ja) * | 2019-08-29 | 2021-03-04 | Sony Interactive Entertainment Inc. | Transmission device, transmission method, and program |
Also Published As
Publication number | Publication date |
---|---|
CN117063477A (zh) | 2023-11-14 |
JPWO2022230726A1 (ja) | 2022-11-03 |
EP4331693A1 (en) | 2024-03-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11695974B2 (en) | Methods, systems, and media for adjusting quality level during synchronized media content playback on multiple devices | |
US10511666B2 (en) | Output data providing server and output data providing method | |
US8719883B2 (en) | Stream transmission server and stream transmission system | |
EP3503570A1 (en) | Method of transmitting video frames from a video stream to a display and corresponding apparatus | |
CN106488265A (zh) | Method and apparatus for sending a media stream | |
WO2008076537A1 (en) | Method and system for providing adaptive trick play control of streaming digital video | |
JP6585831B2 (ja) | Reception device, transmission device, control method, transmission method, and program | |
CN113992967A (zh) | Screen-casting data transmission method and apparatus, electronic device, and storage medium | |
CN113242436B (zh) | Live streaming data processing method and apparatus, and electronic device | |
JPH10336626A (ja) | Video data transfer method and transfer device | |
CN104581340A (zh) | Client, streaming media data receiving method, and streaming media data transmission system | |
US20190369738A1 (en) | System and method for wireless audiovisual transmission | |
WO2022230726A1 (ja) | Moving image reception device, moving image transmission device, moving image transmission/reception system, control method, and program | |
WO2022230727A1 (ja) | Moving image reception device, control method, and program | |
US20130094571A1 (en) | Low latency video compression | |
US20240226754A9 (en) | Moving image reception device, control method, and program | |
US20240223638A1 (en) | Moving image reception device, moving image transmission device, moving image transmission/reception system, control method, and program | |
US11410700B2 (en) | Video playback buffer adjustment | |
CN116156233A (zh) | Display image synchronization method and system, and electronic device | |
JP6711120B2 (ja) | Video playback device, video playback method, and video playback program | |
WO2023045838A1 (zh) | Information receiving and information reporting methods, apparatuses, devices, and computer storage medium | |
WO2023098800A1 (zh) | Information transmission method and apparatus, terminal, and network-side device | |
US20220277686A1 (en) | Display control apparatus, transmission apparatus, display control method, and program | |
US11229082B2 (en) | Controlled interruptions to wireless signalling | |
WO2023066107A1 (zh) | Data transmission method, apparatus, and terminal | |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22795639; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 202280019271.4; Country of ref document: CN |
WWE | Wipo information: entry into national phase | Ref document number: 18557310; Country of ref document: US |
WWE | Wipo information: entry into national phase | Ref document number: 2023517462; Country of ref document: JP |
WWE | Wipo information: entry into national phase | Ref document number: 2022795639; Country of ref document: EP |
ENP | Entry into the national phase | Ref document number: 2022795639; Country of ref document: EP; Effective date: 20231130 |
NENP | Non-entry into the national phase | Ref country code: DE |