WO2013168547A1 - Application management device, control method for application management device, and control program for application management device - Google Patents

Application management device, control method for application management device, and control program for application management device Download PDF

Info

Publication number
WO2013168547A1
WO2013168547A1 (application PCT/JP2013/061745, JP2013061745W)
Authority
WO
WIPO (PCT)
Prior art keywords
user terminal
moving image
user
application
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2013/061745
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
泰正 金子
直也 木原
貴裕 吉田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Konami Digital Entertainment Co Ltd
Original Assignee
Konami Digital Entertainment Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konami Digital Entertainment Co Ltd filed Critical Konami Digital Entertainment Co Ltd
Publication of WO2013168547A1 publication Critical patent/WO2013168547A1/ja
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Images

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35 - Details of game servers
    • A63F13/355 - Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an encoded video stream for transmitting to a mobile phone or a thin client
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 - Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428 - Processing input control signals of video game devices involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 - Controlling the output signals based on the game progress
    • A63F13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00 - Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 - Animation
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/40 - Features of games using an electronically generated display having two or more dimensions characterised by details of platform network

Definitions

  • the present invention relates to an application management apparatus, an application management apparatus control method, and an application management apparatus control program.
  • This application claims priority based on Japanese Patent Application No. 2012-109515 filed in Japan on May 11, 2012, the contents of which are incorporated herein by reference.
  • an application program is installed in the user terminal in advance, the user terminal performs arithmetic processing based on the application program, and the displayed screen is changed based on the calculation result.
  • there is also an application system in which an application is used via a network such as the Internet without installing an application program in the user terminal.
  • a user terminal connects to an application server via a network such as the Internet, and transmits instruction information input from the user to the application server.
  • the application server performs predetermined calculation processing according to the instruction information transmitted from the user terminal, and distributes a screen reflecting the result based on the calculation result to the user terminal in real time.
  • Some aspects of the present invention provide an application management apparatus, a control method for an application management apparatus, and a control program for an application management apparatus that enable an application to be used via a network.
  • the game system includes at least a plurality of application processing units, a synthesis unit, and a transmission unit.
  • the plurality of application processing units correspond to the user terminals on a one-to-one basis, and may be configured to perform individual calculation processing based on individual instruction information transmitted from the plurality of user terminals and to generate a moving image corresponding to each user terminal based on the calculation result of the individual calculation processing.
  • the synthesizing unit may be configured to generate at least one synthesized moving image obtained by synthesizing the moving images generated by the plurality of application processing units so as to be displayed for each predetermined region on one screen.
  • the transmitting unit may be configured to transmit the at least one synthesized moving image to at least one user terminal among the plurality of user terminals.
  • the application management system includes at least a plurality of application processing units, a synthesis unit, and a transmission unit.
  • the plurality of application processing units correspond to the user terminals on a one-to-one basis, and may be configured to perform individual calculation processing based on individual instruction information transmitted from the plurality of user terminals and to generate a moving image corresponding to each user terminal based on the calculation result of the individual calculation processing.
  • the synthesizing unit may be configured to generate at least one synthesized moving image obtained by synthesizing the moving images generated by the plurality of application processing units so as to be displayed for each predetermined region on one screen.
  • the transmitting unit may be configured to transmit the at least one synthesized moving image to at least one user terminal among the plurality of user terminals.
  • the synthesizing unit may generate the at least one synthesized moving image such that the size of a first area, in which the moving image generated based on instruction information from a first user terminal among the plurality of user terminals is displayed on one screen, is relatively larger than the size of a second area, in which the moving image generated based on instruction information from a second user terminal among the plurality of user terminals is displayed on the same screen.
  • the transmission unit may be configured to transmit the synthesized moving image to the first user terminal.
  • a synthesis definition information storage unit may be added to the application management system; it stores synthesis definition information that defines the attributes of the regions in which the moving images generated based on instruction information from the other user terminals among the plurality of user terminals are synthesized within one screen of the at least one synthesized moving image to be transmitted to the at least one user terminal.
  • the synthesis unit may be configured to generate the synthesized moving image to be transmitted to the at least one user terminal based on the synthesis definition information.
  • the combining unit may be configured to generate the synthesized moving image to be transmitted to the at least one user terminal based on synthesis definition information that defines the attributes of the regions in which the moving images generated based on instruction information from the other user terminals among the plurality of user terminals are synthesized within one screen of the at least one combined moving image.
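The synthesis definition information described above can be pictured as a small lookup table. The following Python sketch is purely illustrative (the field names, terminal identifiers, and coordinates are assumptions, not taken from the publication): for each recipient terminal it records the position and size of the region where each terminal's moving image is placed on the one-screen composite.

```python
# Hypothetical synthesis definition information: for a given recipient
# terminal, list the region (x, y, width, height) assigned to each source
# terminal's moving image on the single composite screen.
synthesis_definition = {
    "terminal-1": [
        ("terminal-1", 0,   0,   640, 360),  # recipient's own screen
        ("terminal-2", 640, 0,   640, 360),
        ("terminal-3", 0,   360, 640, 360),
        ("terminal-4", 640, 360, 640, 360),
    ],
}

def regions_for(recipient: str) -> list:
    """Return the region layout used when synthesizing the video for one terminal."""
    return synthesis_definition[recipient]

layout = regions_for("terminal-1")
```

The combining unit would consult such a table when deciding where, and at what size, to place each generated moving image.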
  • Each of the plurality of application processing units may be configured to generate each moving image based on a calculation result of the calculation process that is a game process.
  • Another aspect of the present invention provides a method for controlling an application management system.
  • in this control method, each of a plurality of application processing units corresponding to each of a plurality of user terminals performs individual calculation processing based on individual instruction information transmitted from the corresponding user terminal and generates a moving image corresponding to that user terminal based on the calculation result of the individual calculation processing; at least one synthesized moving image is generated by synthesizing the moving images generated by the plurality of application processing units so that each is displayed in a predetermined area on one screen; and the synthesized moving image is transmitted to at least one user terminal among the plurality of user terminals.
  • Another aspect of the present invention provides a non-transitory storage medium that stores a computer program that is executed to perform a control method of an application management system.
  • in this control method, each of a plurality of application processing units corresponding to each of a plurality of user terminals performs individual calculation processing based on individual instruction information transmitted from the corresponding user terminal and generates a moving image corresponding to that user terminal based on the calculation result of the individual calculation processing; at least one synthesized moving image is generated by synthesizing the moving images generated by the plurality of application processing units so that each is displayed in a predetermined area on one screen; and the synthesized moving image is transmitted to at least one user terminal among the plurality of user terminals.
  • the application management system may include at least a combining unit.
  • the synthesizing unit may be configured to generate at least one synthesized moving image in which the moving images corresponding to the user terminals, each generated based on the calculation result of individual calculation processing performed based on individual instruction information transmitted from each of a plurality of user terminals, are synthesized so that each is displayed in a predetermined area on one screen.
  • the combining unit may be configured to generate the at least one synthesized moving image based on synthesis definition information that defines the attributes of the regions in which the moving images generated based on instruction information from the other user terminals among the plurality of user terminals are synthesized within one screen of the at least one combined moving image transmitted to the at least one user terminal.
  • the method includes at least generating at least one synthesized moving image in which the moving images corresponding to the user terminals, each generated based on the calculation result of individual calculation processing performed based on individual instruction information transmitted from each of a plurality of user terminals, are synthesized so that each is displayed in a predetermined area on one screen.
  • the at least one composite video may be generated based on synthesis definition information that defines the attributes of the regions in which the moving images generated based on instruction information from the other user terminals among the plurality of user terminals are synthesized within one screen of the at least one composite video to be transmitted to the at least one user terminal.
  • the method includes at least generating at least one synthesized moving image in which the moving images corresponding to the user terminals, each generated based on the calculation result of individual calculation processing performed based on individual instruction information transmitted from each of a plurality of user terminals, are synthesized so that each is displayed in a predetermined area on one screen.
  • the at least one composite video may be generated based on synthesis definition information that defines the attributes of the regions in which the moving images generated based on instruction information from the other user terminals among the plurality of user terminals are synthesized within one screen of the at least one composite video to be transmitted to the at least one user terminal.
  • the application management system includes at least a transmission unit configured to transmit, to at least one user terminal among the plurality of user terminals, at least one synthesized moving image in which the moving images corresponding to the user terminals, each generated based on the calculation result of individual calculation processing performed based on individual instruction information transmitted from each of a plurality of user terminals, are synthesized so that each is displayed in a predetermined area on one screen.
  • Another aspect of the present invention provides an application management method.
  • the method includes at least transmitting, to at least one user terminal among the plurality of user terminals, at least one synthesized moving image in which the moving images corresponding to the user terminals, each generated based on the calculation result of individual calculation processing performed based on individual instruction information transmitted from each of a plurality of user terminals, are synthesized so that each is displayed in a predetermined area on one screen.
  • Another aspect of the present invention provides a computer program product that is executed to perform an application management method.
  • the program causes a computer to execute at least transmitting, to at least one user terminal among the plurality of user terminals, at least one synthesized moving image in which the moving images corresponding to the user terminals, each generated based on the calculation result of individual calculation processing performed based on individual instruction information transmitted from each of a plurality of user terminals, are synthesized so that each is displayed in a predetermined area on one screen.
  • FIG. 1 is a block diagram showing a configuration of an application system 1 according to the present embodiment.
  • the application system 1 includes a plurality of user terminals 10 (user terminal 10-1, user terminal 10-2, user terminal 10-3, ...) and an application server 30 (an example of an "application management device") connected to the plurality of user terminals 10 via the network 5.
  • since the plurality of user terminals 10 have the same configuration, the suffixes "-1", "-2", etc. are omitted and they are referred to simply as the user terminal 10 unless otherwise distinguished.
  • the network 5 is an information communication network configured by the Internet, a WAN (Wide Area Network), a LAN (Local Area Network), a dedicated line, or a combination thereof.
  • the user terminal 10 is a computer device used by a user.
  • for example, a PC (Personal Computer).
  • the user terminal 10 includes input devices such as a keyboard, a mouse, and a touch panel, and accepts input of instruction information from the user.
  • the user terminal 10 includes a communication unit that communicates with the application server 30 via the network 5, a storage unit that stores various types of information, a display unit that is a display that displays information, and the like.
  • the user terminal 10 will be described as a PC.
  • the user terminal 10 uses an application service provided by the application server 30 connected via the network 5.
  • the user terminal 10 transmits instruction information input by the user to the application server 30, receives from the application server 30, as a moving image in streaming format, a screen reflecting the result of the application calculation processing performed by the application server 30 in accordance with the transmitted instruction information, and displays it on its display.
  • the instruction information transmitted from the user terminal 10 to the application server 30 is information indicating, for example, an application start request and operations corresponding to various functions of the application. Accordingly, the user can use the application without installing the application program in the user terminal 10.
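As a rough illustration of this thin-client exchange, the sketch below models a terminal that holds no application program: it only forwards instruction information and displays whatever rendered frame the server returns. All class names, method names, and the toy "application state" are invented for illustration and are not part of the publication.

```python
class ApplicationServer:
    """Performs the calculation processing and returns a rendered 'frame'."""

    def __init__(self):
        self.position = 0  # toy application state

    def handle_instruction(self, instruction: str) -> str:
        # Predetermined calculation processing based on the instruction information.
        if instruction == "move_right":
            self.position += 1
        elif instruction == "move_left":
            self.position -= 1
        # The server renders the screen; the terminal only displays it.
        return f"frame(position={self.position})"


class UserTerminal:
    """Holds no application program; it forwards input and shows frames."""

    def __init__(self, server: ApplicationServer):
        self.server = server
        self.display = ""

    def send_instruction(self, instruction: str) -> None:
        self.display = self.server.handle_instruction(instruction)


terminal = UserTerminal(ApplicationServer())
terminal.send_instruction("move_right")
terminal.send_instruction("move_right")
```

The point of the arrangement is visible in the sketch: all state and computation live on the server, so the terminal needs nothing installed beyond the ability to send input and render received video.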
  • in the following description, the user U1 uses the user terminal 10-1, the user U2 uses the user terminal 10-2, the user U3 uses the user terminal 10-3, and the user U4 uses the user terminal 10-4.
  • the application server 30 is a computer device that provides an application service allowing the user terminal 10 to use an application via the network 5, and includes a storage unit 31, a reception unit 33, an application control unit 34, a synthesis unit 36, and a transmission unit 37.
  • the storage unit 31 stores various types of information that are referred to for the application server 30 to operate.
  • the storage unit 31 stores in advance an application program for executing an application.
  • the receiving unit 33 receives instruction information transmitted from the user terminal 10.
  • the application control unit 34 includes a plurality of application processing units 35 (application processing unit 35-1, application processing unit 35-2, application processing unit 35-3, ...), each corresponding to one user terminal 10, which perform predetermined calculation processing based on the instruction information transmitted from each of the plurality of user terminals 10 and generate a moving image based on the calculation result.
  • since the plurality of application processing units 35 have the same configuration, the suffixes "-1", "-2", etc. are omitted and they are referred to simply as the application processing unit 35 unless otherwise distinguished.
  • when the application control unit 34 receives instruction information that is an application use request from the user terminal 10 via the reception unit 33, it reads the application program stored in advance in the storage unit 31 and generates an application processing unit 35, which is an application execution instance, in its own storage area.
  • the application control unit 34 generates as many application processing units 35 as there are user terminals 10, one application processing unit 35 for each user terminal 10 that uses the application.
  • the application processing unit 35 performs predetermined calculation processing based on the instruction information transmitted from the user terminal 10 corresponding to itself, and generates a moving image based on the calculation result.
  • the arithmetic processing performed by the application processing unit 35 is, for example, processing for providing an application such as a game or a desktop service.
  • as the application, for example, a sports game, a racing game, an action game such as a fighting game, a shooting game such as an FPS (First-Person Shooter), a role-playing game, a puzzle game, a simulation game, and the like can be applied.
  • the application processing unit 35 may execute an application such as office software such as word processing software, spreadsheet software, or presentation software.
  • the application processing unit 35 renders an application screen using a GPU (Graphics Processing Unit) or the like included in the application server 30, and outputs to the synthesis unit 36 a moving image, i.e., a series of frame screens that change over time.
  • the synthesizing unit 36 generates a synthesized moving image by synthesizing the moving images that the plurality of application processing units 35 generated for the plurality of user terminals 10, so that each is displayed in a predetermined area on one screen. Specifically, the synthesizing unit 36 buffers each moving image generated by an application processing unit 35 and encodes it at a bit rate suitable for real-time moving image distribution. It then combines the moving images corresponding to the plurality of user terminals 10 to generate the synthesized moving image.
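The synthesis step can be sketched as tiling the per-user frames into one screen. The toy example below represents each frame as a list of pixel rows and combines four frames into a 2x2 grid; buffering and bit-rate encoding are omitted, and the frame representation is an assumption made only for illustration.

```python
def make_frame(value, width=4, height=2):
    """A stand-in for one frame rendered by an application processing unit."""
    return [[value] * width for _ in range(height)]

def synthesize(frames):
    """Tile four frames into one screen: two regions per row (2x2 grid)."""
    top = [left + right for left, right in zip(frames[0], frames[1])]
    bottom = [left + right for left, right in zip(frames[2], frames[3])]
    return top + bottom

# One frame per user terminal, filled with a distinct pixel value.
frames = [make_frame(v) for v in (1, 2, 3, 4)]
composite = synthesize(frames)  # 4 rows of 8 pixels
```

In the actual system this tiling would be done per frame of each buffered stream, and the resulting composite frames would then be encoded for streaming delivery.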
  • FIG. 2 is a diagram illustrating a screen example of a synthesized moving image generated by the synthesis unit 36.
  • the screen corresponding to the instruction information from the user terminal 10-1 corresponding to the user U1 is reduced and displayed in the area indicated by reference sign d1-1.
  • in the area indicated by reference sign d1-2, the screen corresponding to the instruction information from the user terminal 10-2 corresponding to the user U2 is reduced and displayed; in the area indicated by reference sign d1-3, the screen corresponding to the instruction information from the user terminal 10-3 corresponding to the user U3 is reduced and displayed; and in the area indicated by reference sign d1-4, the screen corresponding to the instruction information from the user terminal 10-4 corresponding to the user U4 is reduced and displayed.
  • for example, each user terminal 10 may transmit to the application server 30 a moving image of its user playing a musical instrument, captured with a web camera or the like, and the application server 30 can generate a synthesized moving image combining these moving images.
  • since the application server 30 generates all the screens corresponding to the plurality of user terminals 10, it can generate a composite moving image efficiently. By contrast, if application processing were performed on the user terminals, with a first user terminal and a second user terminal communicating by P2P (Peer-to-Peer) via the network 5, generating a composite video would require the first user terminal to transmit its generated screen to the second user terminal, and the second user terminal to receive that screen and combine it with its own screen for display.
  • the transmitting unit 37 transmits the synthesized moving image generated by the synthesizing unit 36 to the user terminal 10.
  • the transmission unit 37 transmits the composite moving image in a streaming format.
  • Streaming is a transfer and playback method in which moving image data is played back sequentially, in time-series order, while being received. Thereby, a screen that changes according to the instruction information can be displayed on the user terminal 10 in real time.
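The streaming behavior described here, sequential playback while receiving, can be modeled with a generator: the server yields time-ordered chunks and the client plays each one as soon as it arrives, without waiting for the whole moving image. The chunk layout below is hypothetical.

```python
def stream_chunks(frames):
    """Server side: yield encoded chunks one by one, in time-series order."""
    for i, frame in enumerate(frames):
        yield {"seq": i, "data": frame}

played = []
for chunk in stream_chunks(["f0", "f1", "f2"]):
    # Client side: play back each chunk as soon as it is received,
    # rather than downloading the full moving image first.
    played.append(chunk["data"])
```

This is why the user terminal can show a screen that tracks the instruction information in real time: latency is bounded by one chunk, not by the length of the video.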
  • FIG. 3 is a diagram illustrating an operation example of the application system 1 according to the present embodiment.
  • the user terminal 10-1 connects to the application server 30 and transmits a use request (step S1).
  • when the reception unit 33 of the application server 30 receives the use request transmitted from the user terminal 10-1, the application control unit 34 generates an application (AP) processing unit 35-1 corresponding to the user terminal 10-1.
  • the application processing unit 35-1 performs predetermined calculation processing according to the instruction information transmitted from the user terminal 10-1, generates a moving image based on the calculation result, and starts processing to output it to the synthesis unit 36 (step S2).
  • the synthesizing unit 36 buffers the moving image output from the application processing unit 35-1, encodes it at a bit rate suitable for moving image distribution, and starts processing to generate a streaming-format moving image (step S4).
  • the synthesis unit 36 outputs only the moving image corresponding to the user terminal 10-1 to the transmission unit 37 (step S5).
  • the transmission unit 37 starts processing for transmitting the moving image generated by the synthesis unit 36 in a streaming format (step S6).
  • the transmission unit 37 transmits the moving image to the user terminal 10-1 (Step S7).
  • the user terminal 10-2 connects to the application server 30 and transmits a use request (step S8).
  • when the reception unit 33 of the application server 30 receives the use request transmitted from the user terminal 10-2, the application control unit 34 generates an application (AP) processing unit 35-2 corresponding to the user terminal 10-2.
  • the application processing unit 35-2 performs predetermined calculation processing according to the instruction information transmitted from the user terminal 10-2, generates a moving image based on the calculation result, and starts processing to output to the synthesis unit 36 (step S9).
  • the synthesizing unit 36 buffers the moving image output from the application processing unit 35-2, encodes it at a bit rate suitable for moving image distribution, and generates a streaming-format moving image. It then generates a synthesized moving image combining the moving image output from the application processing unit 35-1 and the moving image output from the application processing unit 35-2, and outputs it to the transmission unit 37 (step S11). The transmission unit 37 transmits the synthesized moving image generated by the synthesizing unit 36 to the user terminal 10-1 and the user terminal 10-2 in streaming format (step S12).
  • the user terminal 10-3 connects to the application server 30 and transmits a use request (step S13).
  • when the reception unit 33 of the application server 30 receives the use request transmitted from the user terminal 10-3, the application control unit 34 generates an application (AP) processing unit 35-3 corresponding to the user terminal 10-3.
  • the application processing unit 35-3 performs predetermined calculation processing according to the instruction information transmitted from the user terminal 10-3, generates a moving image based on the calculation result, and starts processing to output to the synthesis unit 36 (step S14).
  • the synthesizing unit 36 buffers the moving image output from the application processing unit 35-3, encodes it at a bit rate suitable for moving image distribution, and generates a streaming-format moving image. It then generates a synthesized moving image combining the moving images output from the application processing unit 35-1, the application processing unit 35-2, and the application processing unit 35-3, and outputs it to the transmission unit 37 (step S16).
  • the transmitting unit 37 transmits the synthesized moving image generated by the synthesizing unit 36 to the user terminal 10-1, the user terminal 10-2, and the user terminal 10-3 in a streaming format (step S17).
  • the user terminal 10-4 connects to the application server 30 and transmits a use request (step S18).
  • when the reception unit 33 of the application server 30 receives the use request transmitted from the user terminal 10-4, the application control unit 34 generates an application (AP) processing unit 35-4 corresponding to the user terminal 10-4.
  • the application processing unit 35-4 performs predetermined calculation processing according to the instruction information transmitted from the user terminal 10-4, generates a moving image based on the calculation result, and starts processing to output to the synthesis unit 36 (step S19).
  • the synthesizing unit 36 buffers the moving image output from the application processing unit 35-4, encodes it at a bit rate suitable for moving image distribution, and generates a streaming-format moving image. It then generates a synthesized moving image combining the moving images output from the application processing units 35-1 through 35-4, and outputs it to the transmission unit 37 (step S21).
  • the transmission unit 37 transmits the synthesized moving image generated by the synthesizing unit 36 to the user terminal 10-1, the user terminal 10-2, the user terminal 10-3, and the user terminal 10-4 in streaming format (step S22).
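The connection sequence of FIG. 3 follows one simple pattern: each use request creates one application processing unit, and the composite moving image is then sent to every terminal connected so far. A minimal sketch of that bookkeeping, with invented class and method names:

```python
class ApplicationControl:
    """Tracks one application processing unit per connected user terminal."""

    def __init__(self):
        self.processing_units = {}

    def handle_use_request(self, terminal_id: str) -> None:
        # Generate an application processing unit for this terminal
        # (a stand-in string here; an execution instance in the patent).
        self.processing_units[terminal_id] = f"AP-unit-for-{terminal_id}"

    def recipients(self):
        # The synthesized moving image is transmitted to every terminal
        # connected so far.
        return sorted(self.processing_units)


control = ApplicationControl()
for tid in ("10-1", "10-2", "10-3", "10-4"):
    control.handle_use_request(tid)
```

After the fourth use request, the recipient set contains all four terminals, matching steps S1 through S22 in the described operation.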
  • as described above, each of the plurality of user terminals 10 receives a synthesized moving image in which the screen corresponding to its own instruction information is combined with the screens corresponding to the instruction information input at the other user terminals 10, so each user can grasp the status of the other users' applications. In addition, since the screens corresponding to the user terminals 10 are all generated in the same application server 30, the synthesized moving image can be generated by combining them without communicating with other devices, and can therefore be generated and transmitted without excessively increasing the processing load or the network load.
  • FIG. 4 is a block diagram showing the configuration of the application system 2 according to the present embodiment. Since the application system 2 has largely the same configuration as the application system 1 of the first embodiment, descriptions of the shared components are omitted and only the differing components are described.
  • the application provided by the application server 30 is a multi-user FPS game application in which a plurality of users form a team and the teams battle each other.
  • the application processing unit 35 of the application server 30 generates a moving image based on the calculation result of a calculation process that is a game process.
  • the application system 2 of this embodiment includes a matching server 20 connected to the application server 30.
  • the matching server 20 includes a user information storage unit 21, a matching information storage unit 22, and a matching processing unit 23.
  • a plurality of users who play the game as a team or as opponents are associated with one another for the application processing unit 35.
  • the application server 30 and the matching server 20 are an example of an application management apparatus.
  • the user information storage unit 21 stores user information indicating user attributes.
  • FIG. 5 is a diagram illustrating a data example of user information stored in the user information storage unit 21.
  • the user information includes information such as a user ID (Identifier), a win rate, a ranking order, and a registration date.
  • the user ID is information for identifying the user.
  • the win rate is information indicating the rate at which the user has defeated opposing teams in the game.
  • the ranking order is the user's rank based on in-game score, win rate, and the like.
  • the registration date is information indicating the date when the corresponding user registered an account or the like to use the game.
  • the user information only needs to include at least the user ID.
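As an illustration only (the patent does not specify any data model), the user information of FIG. 5 could be held as records keyed by user ID. The class and field names below are hypothetical, and only `user_id` is mandatory, mirroring the note above:

```python
from dataclasses import dataclass

@dataclass
class UserInfo:
    """One row of the user information in FIG. 5 (field names are assumed)."""
    user_id: str            # identifies the user, e.g. "U1"; the only required field
    win_rate: float = 0.0   # rate of wins against opposing teams
    ranking: int = 0        # rank based on in-game score, win rate, etc.
    registered: str = ""    # date the user registered an account, e.g. "2012-05-11"

class UserInfoStore:
    """Minimal in-memory stand-in for the user information storage unit 21."""
    def __init__(self):
        self._users = {}

    def put(self, info):
        self._users[info.user_id] = info

    def get(self, user_id):
        return self._users[user_id]
```

In a real deployment this storage unit would of course be backed by a database rather than a dictionary.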
  • the matching information storage unit 22 stores matching information indicating a correspondence between a plurality of users who play a game as a team or an opponent.
  • FIG. 6 is a diagram illustrating a data example of matching information stored in the matching information storage unit 22.
  • the matching information associates a user ID with matching user IDs (own team) and matching user IDs (opponent team).
  • the user ID is information for identifying a user who uses the application.
  • the matching user ID (own team) is the user ID of another user associated with the same team as the corresponding user ID.
  • a matching user ID (opponent team) is the user ID of another user associated with the team opposing the team of the corresponding user ID.
  • in the example of FIG. 6, the user whose user ID is U1, the user whose user ID is U2, the user whose user ID is U3, and the user whose user ID is U4 belong to the same team, and their opponent team includes the user whose user ID is U5, the user whose user ID is U6, the user whose user ID is U7, and the user whose user ID is U8. If the application is not a competitive game, the "matching user ID (opponent team)" field is unnecessary.
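A minimal sketch of the matching information of FIG. 6, under the assumption that each record simply lists teammate and opponent IDs (the class and field names are hypothetical, not taken from the patent):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MatchingInfo:
    """One row of the matching information in FIG. 6 (field names are assumed)."""
    user_id: str
    own_team: List[str] = field(default_factory=list)       # matching user IDs (own team)
    opponent_team: List[str] = field(default_factory=list)  # matching user IDs (opponent team)

# The FIG. 6 example: U1-U4 form one team, U5-U8 the opposing team.
row_u1 = MatchingInfo("U1",
                      own_team=["U2", "U3", "U4"],
                      opponent_team=["U5", "U6", "U7", "U8"])
```

For a non-competitive application, `opponent_team` would simply stay empty.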
  • the matching processing unit 23 generates matching information based on a matching request from the application processing unit 35 and stores it in the matching information storage unit 22. For example, the matching processing unit 23 reads the user information stored in the user information storage unit 21 and generates matching information in which users with similar win rates and ranking orders are associated as the same team and as opposing teams. Alternatively, for example, the matching processing unit 23 can transmit to the user terminal 10, for presentation, a list of users waiting to start and a list of teams waiting for an opponent, accept a selection of the own team or the opponent team, and generate matching information based on the accepted selection.
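One way (of many; the patent only requires that users with similar win rates be associated) to realize the win-rate-based matching could be to sort the waiting users by win rate and slice them into two opposing teams. The function and parameter names below are illustrative assumptions:

```python
def make_matches(users, team_size=4):
    """Group waiting users with similar win rates into pairs of opposing teams.

    `users` is a list of (user_id, win_rate) tuples of users waiting to start;
    sorting by win rate keeps similarly skilled users in the same match.
    Returns a list of (own_team, opponent_team) ID lists.
    """
    ranked = sorted(users, key=lambda u: u[1], reverse=True)
    matches = []
    # Consume users in blocks of 2 * team_size: first half vs. second half.
    for i in range(0, len(ranked) - 2 * team_size + 1, 2 * team_size):
        block = [user_id for user_id, _ in ranked[i:i + 2 * team_size]]
        matches.append((block[:team_size], block[team_size:]))
    return matches
```

Leftover users that do not fill a full match would simply keep waiting for the next request.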
  • the matching processing unit 23 stores the generated matching information in the matching information storage unit 22, and transmits the generated matching information to the application server 30 in which the application processing units 35 corresponding to the users included in that matching information are activated.
  • when the application server 30 receives the matching information, the application processing units 35 corresponding to each of the plurality of users associated with the matching information can perform game control such as battle processing.
  • the combining unit 36 of the application server 30 can combine the moving images based on the user ID associated with the matching information as described later.
  • the storage unit 31 of the application server 30 in the present embodiment includes a synthesis definition information storage unit 32.
  • the composition definition information storage unit 32 stores composition definition information that defines, for each user, the attributes of the areas in which the moving images generated based on instruction information from the other user terminals 10 are combined within the screen of the synthesized moving image transmitted to that user's terminal 10.
  • FIG. 7 is a diagram illustrating a data example of the synthesis definition information stored in the synthesis definition information storage unit 32.
  • the composition definition information includes information such as a user ID, composition target user IDs, composition positions, scales, and transparencies.
  • the user ID is information for identifying the user.
  • the composition target user ID is information indicating the user ID of another user whose screen is combined into the composite screen, for example the user ID of a user on the same team.
  • the composition position is information indicating where another user's screen is combined; a coordinate position on the screen is associated with each composition target user ID.
  • the scale indicates the scale of the screen of another user to be synthesized.
  • the transparency indicates the transparency of the screen of another user to be synthesized.
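Purely as a sketch, the composition definition information of FIG. 7 could be represented per user as a list of entries. The names and the transparency convention below are assumptions, not taken from the patent:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class CompositionEntry:
    """Attributes for combining one other user's screen (FIG. 7, names assumed)."""
    target_user_id: str        # composition target user ID, e.g. a teammate
    position: Tuple[int, int]  # (x, y) coordinate where the screen is placed
    scale: float               # reduction ratio applied to the combined screen
    transparency: float        # 0.0 = opaque, 1.0 = fully transparent (assumed)

# Hypothetical definition for user U1: overlay U2's reduced screen at the top-left.
composition_definitions = {
    "U1": [CompositionEntry("U2", (0, 0), 0.25, 0.3)],
}
```

Each main user would have one such list, with one entry per composition target user ID.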
  • composition definition information is generated by the composition unit 36 at the start of an application, for example, and stored in the composition definition information storage unit 32.
  • the synthesizing unit 36 acquires the matching information stored in the matching information storage unit 22 of the matching server 20, and generates composition definition information in which the user IDs indicated by the matching user IDs (own team) associated with a specific user ID in the acquired matching information are associated with that specific user ID as composition target user IDs.
  • the synthesis unit 36 may store default values for the composition position, scale, and transparency in its own storage area in advance, and generate composition definition information in which those default values are associated.
  • alternatively, instruction information designating the composition position, scale, and transparency may be received from the user terminal 10 at the start of or during the application, and composition definition information may be generated based on the received instruction information and stored in the synthesis definition information storage unit 32. The synthesizing unit 36 generates the synthesized moving image based on such composition definition information.
  • in this way, the user can specify the position, transparency, and so on with which the other users' screens are displayed, and can play the game while grasping those screens and the other users' situations.
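Combining the two generation paths described above, default-value generation from matching information might be sketched as follows. The slot coordinates and values are invented defaults, not the patent's:

```python
# Invented default slots for up to three teammates: (position, scale, transparency).
DEFAULT_SLOTS = [((0, 0), 0.25, 0.3),
                 ((0, 180), 0.25, 0.3),
                 ((0, 360), 0.25, 0.3)]

def build_definitions(own_team_by_user):
    """Build per-user composition definitions from matching information.

    `own_team_by_user` maps each user ID to its matching user IDs (own team),
    as read from the matching information storage unit.
    """
    definitions = {}
    for user_id, teammates in own_team_by_user.items():
        definitions[user_id] = [
            {"target": mate, "position": pos, "scale": s, "transparency": t}
            for mate, (pos, s, t) in zip(teammates, DEFAULT_SLOTS)
        ]
    return definitions
```

Instruction information received from a user terminal would then simply overwrite that user's default entries.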
  • the composition unit 36 generates a synthesized moving image in which the size of the area displaying the moving image generated based on the instruction information from the user terminal 10 corresponding to the main user is relatively large compared to the size of the areas displaying the moving images generated based on the instruction information from the other user terminals 10.
  • the main user is a user of the user terminal 10 that transmits the synthesized video, and the synthesis unit 36 generates a different synthesized video for each user.
  • FIG. 8 to FIG. 10 are diagrams showing examples of composite video screens with the user U1 as the main user.
  • FIG. 8 is a diagram illustrating a first screen example of a synthesized moving image generated based on the synthesis definition information by the synthesis unit 36 of the present embodiment.
  • in FIG. 8, the screen corresponding to the instruction information from the user terminal 10-1 corresponding to the user U1 is displayed on the entire screen indicated by reference sign d2-1; in the area indicated by reference sign d2-2, the screen corresponding to the instruction information from the user terminal 10-2 corresponding to the user U2 is reduced and displayed; in the area indicated by reference sign d2-3, the screen corresponding to the instruction information from the user terminal 10-3 corresponding to the user U3 is reduced and displayed; and the screen corresponding to the instruction information from the user terminal 10-4 corresponding to the user U4 is likewise reduced and displayed.
  • FIG. 9 is a diagram illustrating a second screen example of the synthesized moving image generated by the synthesizing unit 36 based on the synthesis definition information.
  • in the screen indicated by reference sign d3 in FIG. 9, the screen corresponding to the instruction information from the user terminal 10-1 corresponding to the user U1 is reduced and displayed in the area indicated by reference sign d3-1; the screen corresponding to the instruction information from the user terminal 10-2 corresponding to the user U2 is reduced and displayed; the screen corresponding to the instruction information from the user terminal 10-3 corresponding to the user U3 is reduced and displayed; and the screen corresponding to the instruction information from the user terminal 10-4 corresponding to the user U4 is reduced and displayed in the area indicated by reference sign d3-4.
  • FIG. 10 is a diagram illustrating a third screen example of the synthesized moving image generated by the synthesis unit 36 based on the synthesis definition information.
  • in FIG. 10, the screen corresponding to the instruction information from the user terminal 10-1 corresponding to the user U1 is displayed on the entire screen indicated by reference sign d4-1; in the area indicated by reference sign d4-2, the screen corresponding to the instruction information from the user terminal 10-2 corresponding to the user U2 is reduced and displayed; in the area indicated by reference sign d4-3, the screen corresponding to the instruction information from the user terminal 10-3 corresponding to the user U3 is reduced and displayed; and the screen corresponding to the instruction information from the user terminal 10-4 corresponding to the user U4 is likewise reduced and displayed.
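The layouts of FIG. 8 to FIG. 10 amount to pasting reduced, partially transparent teammate frames onto the main user's full-size frame. As a rough sketch only (the patent does not prescribe any algorithm, and a real system would blend video frames, not integer grids), using nearest-neighbour reduction and alpha blending on 2-D brightness grids:

```python
def shrink(frame, scale):
    """Nearest-neighbour reduction of a frame given as a list of rows."""
    h, w = len(frame), len(frame[0])
    nh, nw = max(1, int(h * scale)), max(1, int(w * scale))
    return [[frame[int(y / scale)][int(x / scale)] for x in range(nw)]
            for y in range(nh)]

def compose(main_frame, overlays):
    """Overlay reduced teammate frames onto the main user's frame.

    `overlays` is a list of (frame, (x, y), scale, transparency) tuples,
    mirroring the composition definition entries; transparency 0.0 means
    the overlay fully covers the main frame at that pixel.
    """
    out = [row[:] for row in main_frame]   # the main user's screen fills the canvas
    for frame, (x, y), scale, transparency in overlays:
        small = shrink(frame, scale)
        for dy, row in enumerate(small):
            for dx, value in enumerate(row):
                if 0 <= y + dy < len(out) and 0 <= x + dx < len(out[0]):
                    old = out[y + dy][x + dx]
                    out[y + dy][x + dx] = round(
                        value * (1 - transparency) + old * transparency)
    return out
```

Running `compose` once per main user, with that user's own composition definition entries, yields the per-user composite frames described above.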
  • the transmission unit 37 transmits the synthesized moving image generated for each user to the corresponding user terminal 10.
  • FIG. 11 is a diagram illustrating an operation example of the application system 2 according to the present embodiment.
  • Step S30 and step S31 are the same as step S1 and step S2 in the first embodiment.
  • the application processing unit 35-1 transmits a matching request to the matching server 20 based on the instruction information transmitted from the user terminal 10-1 (Step S32).
  • the combining unit 36 buffers the moving image output from the application processing unit 35-1, encodes it at a bit rate suitable for moving image distribution, and generates and outputs a streaming moving image (step S34).
  • the transmission unit 37 starts processing for transmitting the moving image generated by the synthesis unit 36 in a streaming format (step S35).
  • the transmission unit 37 transmits the moving image to the user terminal 10-1 (Step S36).
  • the matching processing unit 23 of the matching server 20 generates matching information in which the user U1, the user U2, the user U3, and the user U4 are associated, and stores the matching information in the matching information storage unit 22 (step S55).
  • the matching processing unit 23 of the matching server 20 transmits the generated matching information to the application server 30.
  • the application processing unit 35-1, the application processing unit 35-2, the application processing unit 35-3, and the application processing unit 35-4 receive the matching information transmitted from the matching server 20.
  • the composition unit 36 of the application server 30 receives the matching information (step S56).
  • upon receiving the matching information transmitted from the matching server 20, the synthesizing unit 36 generates synthesis definition information for each user based on the received matching information and stores it in the synthesis definition information storage unit 32. The synthesizing unit 36 then starts generating, based on the synthesis definition information for each user, a composite moving image in which the moving image output from the application processing unit 35-1, the moving image output from the application processing unit 35-2, the moving image output from the application processing unit 35-3, and the moving image output from the application processing unit 35-4 are combined (step S57). When the combining unit 36 generates and outputs a combined moving image (step S58), the transmission unit 37 transmits the combined moving image generated for each user to each user terminal 10 (step S59).
  • as described above, the user can grasp the situation of the other users on the same team, which widens the user's field of view and allows the game to be played while taking the other users' conditions into account, improving game performance. Moreover, since the movements of the other users on the team can be grasped more intuitively and concretely than through communication by voice chat or the like, an increase in the strategic depth and immersiveness of the game can be expected.
  • FIG. 12 is a block diagram showing the configuration of the application system 3 according to the present embodiment.
  • the application system 3 according to the present embodiment is similar in configuration to the application system 2 according to the second embodiment, except for the points described below.
  • the application control system 130 includes a storage server 131, a reception server 133, a plurality of application control servers 134 (application control server 134-1, application control server 134-2, ...), a composition server 136, and a transmission server 137.
  • the application control system 130 and the matching server 20 are examples of an application management apparatus.
  • the storage server 131 has the same configuration as the storage unit 31 in the first embodiment.
  • the reception server 133 has the same configuration as that of the reception unit 33 in the first embodiment.
  • the application control server 134 has the same configuration as that of the application control unit 34 in the first embodiment.
  • the composition server 136 has the same configuration as that of the composition unit 36 in the first embodiment.
  • the transmission server 137 has the same configuration as the transmission unit 37 in the first embodiment. Even with such a configuration, it is possible to provide an application similar to that of the first embodiment or the second embodiment.
  • a program for realizing the functions of the processing units in the present invention may be recorded on a computer-readable recording medium, and application control may be performed by loading the program recorded on the recording medium into a computer system and executing it.
  • the “computer system” includes an OS and hardware such as peripheral devices.
  • the “computer system” includes a WWW system provided with a website providing environment (or display environment).
  • the “computer-readable recording medium” refers to a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD-ROM, or a storage device such as a hard disk incorporated in a computer system.
  • the “computer-readable recording medium” also includes a medium that holds the program for a certain period of time, such as a volatile memory (RAM) inside a computer system serving as a server or a client when the program is transmitted via a network such as the Internet or a communication line such as a telephone line.
  • the program may be transmitted from a computer system storing the program in a storage device or the like to another computer system via a transmission medium or by a transmission wave in the transmission medium.
  • the “transmission medium” for transmitting the program refers to a medium having a function of transmitting information, such as a network (communication network) such as the Internet or a communication line (communication line) such as a telephone line.
  • the program may realize only some of the functions described above. Furthermore, the program may be a so-called differential file (differential program) that realizes the functions described above in combination with a program already recorded in the computer system.

PCT/JP2013/061745 2012-05-11 2013-04-22 Application management device, control method for application management device, and control program for application management device Ceased WO2013168547A1 (ja)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012109515A JP2013239766A (ja) 2012-05-11 2012-05-11 Application management device, control method for application management device, and control program for application management device
JP2012-109515 2012-05-11

Publications (1)

Publication Number Publication Date
WO2013168547A1 true WO2013168547A1 (ja) 2013-11-14

Family

ID=49550598

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/061745 Ceased WO2013168547A1 (ja) 2012-05-11 2013-04-22 Application management device, control method for application management device, and control program for application management device

Country Status (2)

Country Link
JP (1) JP2013239766A (enExample)
WO (1) WO2013168547A1 (enExample)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11478710B2 (en) 2019-09-13 2022-10-25 Square Enix Co., Ltd. Information processing device, method and medium

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018164227A (ja) * 2017-03-27 2018-10-18 NTT Plala Inc. Video distribution system and video distribution method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007301371A (ja) * 2006-05-10 2007-11-22 Konami Gaming Inc ゲーミング装置に対してゲームおよびサービスをストリームするためのシステムおよび方法
WO2012053273A1 (ja) * 2010-10-20 2012-04-26 株式会社ソニー・コンピュータエンタテインメント 画像処理システム、画像処理方法、動画像送信装置、動画像受信装置、情報記憶媒体及びプログラム

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001157782A (ja) * 1999-12-02 2001-06-12 Dowango:Kk 対戦相手決定システム
JP2011087649A (ja) * 2009-10-20 2011-05-06 Konami Digital Entertainment Co Ltd ゲームシステム


Also Published As

Publication number Publication date
JP2013239766A (ja) 2013-11-28


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13788250

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13788250

Country of ref document: EP

Kind code of ref document: A1