JP6568246B2 - Game program, method, and information processing device - Google Patents

Game program, method, and information processing device

Info

Publication number
JP6568246B2
JP6568246B2 (Application JP2018015868A)
Authority
JP
Japan
Prior art keywords
characters
event
user
game program
game
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2018015868A
Other languages
Japanese (ja)
Other versions
JP2019130118A (en)
Inventor
さと美 藤井 (Satomi Fujii)
Original Assignee
株式会社コロプラ (COLOPL, Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社コロプラ
Priority to JP2018015868A
Publication of JP2019130118A
Application granted
Publication of JP6568246B2
Legal status: Active
Anticipated expiration

Description

  The present disclosure relates to a game program, a method for executing the game program, and an information processing apparatus.
  Conventionally, a game executed using a plurality of characters is known (see, for example, Patent Document 1).
JP 2014-128454 A (published July 10, 2014)
  In a game that uses a plurality of characters, the game becomes more engaging when it progresses based on the relationships between the characters.
  An object of one embodiment of the present disclosure is to make a game more engaging through the relationships between characters.
  The game program according to the present disclosure is executed by a computer including a processor and a memory. The game program causes the processor to execute the steps of: organizing a group of characters from a plurality of characters that can be used in a game based on the game program; selecting two or more characters from the group; and, when a first parameter representing the relationship between the two or more characters satisfies a first condition, executing a first event whose contents correspond to the two or more characters.
  The method according to the present disclosure is a method in which a computer including a processor, a memory, an operation unit, and a display unit executes a game program. The method includes a processor performing the steps of claim 1.
  An information processing apparatus according to the present disclosure includes a storage unit that stores the above-described game program, and a control unit that controls the operation of the information processing apparatus by executing the game program.
  According to one aspect of the present disclosure, the relationships between characters have the effect of making the game more engaging.
The drawings are briefly described as follows.
A diagram showing the hardware configuration of the game system according to the embodiment of the present invention.
A block diagram showing the functional configurations of the user terminal and the server according to the embodiment of the present invention.
A diagram showing a specific example of the element selection screen in the embodiment of the present invention.
A diagram showing a specific example of the group screen in the embodiment of the present invention.
(A) shows a specific example of the play screen of the first part in the embodiment of the present invention; (B) shows a specific example of the result screen of the first part.
(A) is a schematic diagram explaining the first effect in the first part of the embodiment of the present invention; (B) is a schematic diagram explaining the second effect.
(A) and (B) show specific examples of images that constitute the music video in the first part of the embodiment of the present invention.
(A) to (E) show specific examples of images that each realize an effect included in the music video in the first part of the embodiment of the present invention.
(A) to (F) are a series of diagrams showing a specific example of the screen transition of the music video in the first part of the embodiment of the present invention.
(A) shows a specific example of the training character selection screen in the embodiment of the present invention; (B) shows a specific example of the execution screen of the second event executed in the second part; (C) shows a specific example of the result screen of the second part.
A flowchart showing the flow of the processing, performed based on the game program according to the embodiment of the present invention, that advances the first part.
A flowchart showing the flow of the processing, performed based on the game program according to the embodiment of the present invention, that advances the second part.
Embodiment
A game system according to the present disclosure is a system for providing a game to a plurality of users. Hereinafter, the game system will be described with reference to the drawings. The present invention is not limited to these exemplifications; it is defined by the scope of the claims and is intended to include all modifications within the meaning and scope equivalent to the claims. In the following description, the same reference numerals are given to the same elements in the drawings, and their description is not repeated.
<Outline of game system 1>
The game system 1 is a system that executes a game that includes executing a first event. In the present embodiment, the first event is assumed to be executed in the first part of the game; however, the first event is not limited to one executed in the first part. Moreover, the game executed by the game system 1 only needs to include executing the first event and is not necessarily limited to one containing the first part.
  The first part is a part that proceeds using a group composed of a plurality of characters. In the present embodiment, the first part consists of: selecting the elements constituting the first event; organizing the group used in the first part; selecting two or more main characters from the group; executing the first event; and presenting the progress result of the first part. However, the first part only needs to include at least the execution of the first event, and is not limited to the configuration described above. Here, an example will be described in which the first event is a concert with special contents performed by the group (hereinafter referred to as a live). The concert includes images and videos of music videos described later. The special contents are contents corresponding to two or more characters selected from the group. Therefore, an example of a concert with special contents is an image or video of a music video corresponding to the friendship between two or more characters selected from the group. An example will be described in which the organization of the group used in the first event is a deck organization. That is, the user selects a plurality of cards imitating characters from the virtual cards that he or she owns, and organizes a deck used in the first event. An example will also be described in which the element constituting the first event is a music video representing a song performed at the live, and in which the first part is a music game operated in accordance with the rhythm of the song played as the music video. In this case, the first part includes selecting the music constituting the live, organizing the group performing the live, selecting two or more main characters from the group, performing the live, and presenting the results of the music game.
However, the first event is not limited to a live with special contents, the elements constituting the first event are not limited to music videos, and the first part is not limited to a music game.
  The game executed by the game system 1 may include a second part. The second part updates a first parameter between two or more characters by executing a second event for a combination of two or more of the characters available in the game. The first parameter will be described later. The second event is an event for updating the first parameter between two or more characters; for example, it may be an event for preparing for the live, or an event for deepening the friendship between characters. However, the second event is not limited to these examples.
  In the present embodiment, the second part includes selection of two or more characters, execution of the second event, update of the first parameter, and presentation of the progress result of the second part. However, the second part only needs to include at least execution of the second event and update of the first parameter, and is not limited to the above-described configuration. Note that the game executed by the game system 1 does not necessarily include the second part.
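The interplay between the two parts described above can be sketched in code. The following is a minimal illustrative sketch, not the actual game program: the character names, the threshold for the first condition, and the per-event gain are all assumptions introduced for illustration.

```python
# Sketch: second events raise the first parameter between a pair of
# characters; once the first condition is satisfied, a first event with
# contents corresponding to that pair becomes available.
from itertools import combinations

FIRST_CONDITION_THRESHOLD = 100  # hypothetical first-condition value


def run_second_event(params: dict, pair: tuple, gain: int = 10) -> None:
    """Second part: update the first parameter for a pair of characters."""
    key = frozenset(pair)
    params[key] = params.get(key, 0) + gain


def select_first_event(params: dict, group: list):
    """First part: find two characters in the group whose first
    parameter satisfies the first condition."""
    for pair in combinations(group, 2):
        if params.get(frozenset(pair), 0) >= FIRST_CONDITION_THRESHOLD:
            return pair
    return None


params: dict = {}
group = ["alice", "bella", "chika"]       # hypothetical characters
for _ in range(10):                        # ten second events for one pair
    run_second_event(params, ("alice", "bella"))
print(select_first_event(params, group))   # ('alice', 'bella')
```

In this sketch the first parameter is stored per unordered pair, so the order in which the two characters are selected does not matter.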
<Hardware configuration of game system 1>
FIG. 1 is a diagram illustrating the hardware configuration of the game system 1. As illustrated, the game system 1 includes a plurality of user terminals 100 and a server 200. Each user terminal 100 is connected to the server 200 via the network 2. The network 2 includes the Internet and various mobile communication systems constructed with radio base stations (not shown). Examples of the mobile communication systems include so-called 3G and 4G mobile communication systems, LTE (Long Term Evolution), and wireless networks (for example, Wi-Fi (registered trademark)) that can connect to the Internet through a predetermined access point.
  The server 200 (computer, information processing apparatus) may be a general-purpose computer such as a workstation or a personal computer. The server 200 includes a processor 20, a memory 21, a storage 22, a communication IF 23, and an input / output IF 24. These components included in the server 200 are electrically connected to each other via a communication bus.
  The user terminal 100 (computer, information processing apparatus) may be a mobile terminal such as a smartphone, a feature phone, a PDA (Personal Digital Assistant), or a tablet computer. The user terminal 100 may also be a game device suitable for game play. As illustrated, the user terminal 100 includes a processor 10, a memory 11, a storage 12, a communication interface (IF) 13, an input / output IF 14, a touch screen 15 (display unit), a camera 17, and a distance measuring sensor 18. These components included in the user terminal 100 are electrically connected to each other via a communication bus. Further, as shown in FIG. 1, the user terminal 100 may be configured to be able to communicate with one or more controllers 1020. The controller 1020 establishes communication with the user terminal 100 in accordance with a communication standard such as Bluetooth (registered trademark). The controller 1020 may have one or more buttons and the like, and transmits to the user terminal 100 an output value based on the user's input operation on those buttons. The controller 1020 may also include various sensors such as an acceleration sensor and an angular velocity sensor, and transmits the output values of those sensors to the user terminal 100.
  Note that the controller 1020 may include the camera 17 and the distance measuring sensor 18 instead of or in addition to the user terminal 100 including the camera 17 and the distance measuring sensor 18.
  For example, at the start of a game, the user terminal 100 desirably prompts a user who uses the controller 1020 to input user identification information, such as the user's name or login ID, via the controller 1020. As a result, the user terminal 100 can associate the controller 1020 with the user, and can identify which user a received output value belongs to based on its transmission source (the controller 1020).
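The association step above can be sketched as a small registry. This is an illustrative assumption about the data shapes only; the patent does not specify how the terminal stores the mapping.

```python
# Sketch: associate each controller with a user at game start, then
# resolve which user a received output value belongs to from its
# transmission source (the controller).
class ControllerRegistry:
    def __init__(self):
        self._users = {}  # controller id -> user identification info

    def register(self, controller_id: str, user_id: str) -> None:
        # Called when the user enters a name or login ID via the controller.
        self._users[controller_id] = user_id

    def resolve(self, controller_id: str):
        # Identify the user from the output value's transmission source.
        return self._users.get(controller_id)


registry = ControllerRegistry()
registry.register("ctrl-1", "user_a")
registry.register("ctrl-2", "user_b")
print(registry.resolve("ctrl-2"))  # user_b
```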
  When the user terminal 100 communicates with a plurality of controllers 1020, each user holds his or her own controller 1020, so that the single user terminal 100 can realize multiplay without communicating with other devices such as the server 200 via the network 2. Further, the user terminals 100 can also realize multiplay locally by connecting to each other using a wireless standard such as a wireless LAN (Local Area Network) standard (that is, without communication via the server 200). When the above-described multiplay is realized locally by one user terminal 100, the user terminal 100 may further include at least some of the functions of the server 200 described later. When the above-described multiplay is realized locally by a plurality of user terminals 100, those functions of the server 200 may be distributed among the plurality of user terminals 100.
  Note that the user terminal 100 may communicate with the server 200 even when the above-described multiplay is realized locally. For example, information indicating a play result, such as a result or a win / loss in a certain game, may be associated with the user identification information and transmitted to the server 200.
  The controller 1020 may be configured to be detachable from the user terminal 100. In this case, a coupling portion with the controller 1020 may be provided on at least one surface of the housing of the user terminal 100. When the user terminal 100 and the controller 1020 are coupled by wire through the coupling unit, the user terminal 100 and the controller 1020 transmit and receive signals via the wire.
  As shown in FIG. 1, the user terminal 100 may accept the mounting of a storage medium 1030 such as an external memory card via the input / output IF 14. Accordingly, the user terminal 100 can read the program and data recorded in the storage medium 1030. The program recorded in the storage medium 1030 is a game program, for example.
  The user terminal 100 may store the game program acquired by communicating with an external device such as the server 200 in the memory 11 of the user terminal 100, or store the game program acquired by reading from the storage medium 1030 in the memory 11. May be stored.
  As described above, the user terminal 100 includes the communication IF 13, the input / output IF 14, the touch screen 15, the camera 17, and the distance measuring sensor 18 as an example of a mechanism for inputting information to the user terminal 100. Each of the above-described units serving as an input mechanism can be regarded as an operation unit configured to accept a user input operation.
  For example, when the operation unit is configured by at least one of the camera 17 and the distance measuring sensor 18, the operation unit detects an object 1010 in the vicinity of the user terminal 100 and identifies an input operation from the detection result. As an example, a user's hand, a marker having a predetermined shape, or the like is detected as the object 1010, and an input operation is identified based on the color, shape, movement, or type of the object 1010 obtained as the detection result. More specifically, when a user's hand is detected from an image captured by the camera 17, the user terminal 100 identifies and accepts a gesture (a series of movements of the user's hand) detected from the captured image as the user's input operation. The captured image may be a still image or a moving image.
  Alternatively, when the operation unit includes the touch screen 15, the user terminal 100 identifies and accepts an operation performed on the input unit 151 of the touch screen 15 as the user's input operation. When the operation unit is configured by the communication IF 13, the user terminal 100 identifies and accepts a signal (for example, an output value) transmitted from the controller 1020 as the user's input operation. When the operation unit includes the input / output IF 14, a signal output from an input device (not shown) other than the controller 1020 connected to the input / output IF 14 is identified and accepted as the user's input operation.
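The paragraphs above describe several operation units whose signals are all accepted as user input operations. A hedged sketch of normalizing those sources into one stream follows; the source names and payload shapes are assumptions for illustration, not part of the disclosed program.

```python
# Sketch: identify an input operation regardless of which operation
# unit produced it (touch screen, controller via communication IF,
# or another input device via the input/output IF).
def identify_input_operation(source: str, payload: dict) -> dict:
    if source == "touch_screen":   # operation on input unit 151
        return {"kind": payload["gesture"], "pos": payload["pos"]}
    if source == "controller":     # output value via communication IF 13
        return {"kind": "button", "button": payload["button"]}
    if source == "io_interface":   # signal via input / output IF 14
        return {"kind": "raw", "value": payload["value"]}
    raise ValueError(f"unknown operation unit: {source}")


print(identify_input_operation("touch_screen",
                               {"gesture": "tap", "pos": (12, 34)}))
```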
<Hardware components of each device>
The processor 10 controls the operation of the entire user terminal 100. The processor 20 controls the operation of the entire server 200. The processors 10 and 20 include a central processing unit (CPU), a micro processing unit (MPU), and a graphics processing unit (GPU).
  The processor 10 reads a program from a storage 12 described later and develops it in a memory 11 described later. The processor 20 reads a program from a storage 22 described later and develops it in a memory 21 described later. The processor 10 and the processor 20 execute the developed program.
  The memories 11 and 21 are main storage devices. The memories 11 and 21 include storage devices such as a ROM (Read Only Memory) and a RAM (Random Access Memory). The memory 11 provides a work area to the processor 10 by temporarily storing a program and various data read from the storage 12 described later by the processor 10. The memory 11 also temporarily stores various data generated while the processor 10 is operating according to the program. The memory 21 provides the work area to the processor 20 by temporarily storing various programs and data read from the storage 22 described later by the processor 20. The memory 21 temporarily stores various data generated while the processor 20 is operating according to the program.
  In the present embodiment, the program may be a game program for realizing the game by the user terminal 100. Alternatively, the program may be a game program for realizing the game by cooperation between the user terminal 100 and the server 200. Alternatively, the program may be a game program for realizing the game by cooperation of a plurality of user terminals 100. The various data includes data relating to the game such as user information and game information, and instructions or notifications transmitted / received between the user terminal 100 and the server 200 or between the plurality of user terminals 100.
  The storages 12 and 22 are auxiliary storage devices. The storages 12 and 22 are constituted by a storage device such as a flash memory or an HDD (Hard Disk Drive). Various data relating to the game is stored in the storage 12 and the storage 22.
  The communication IF 13 controls transmission / reception of various data in the user terminal 100. The communication IF 23 controls transmission / reception of various data in the server 200. The communication IFs 13 and 23 control communication using, for example, communication via a wireless local area network (LAN), Internet communication via a wired LAN, a wireless LAN, or a cellular phone network, and short-range wireless communication.
  The input / output IF 14 is an interface for the user terminal 100 to accept data input, and is an interface for the user terminal 100 to output data. The input / output IF 14 may input / output data via a USB (Universal Serial Bus) or the like. The input / output IF 14 may include, for example, a physical button of the user terminal 100, a camera, a microphone, a speaker, or the like. The input / output IF 24 of the server 200 is an interface for the server 200 to accept data input, and is an interface for the server 200 to output data. The input / output IF 24 may include, for example, an input unit that is an information input device such as a mouse or a keyboard, and a display unit that is a device that displays and outputs an image.
  The touch screen 15 of the user terminal 100 is an electronic component that combines an input unit 151 and a display unit 152. The input unit 151 is a touch-sensitive device, for example, and is configured by a touch pad, for example. The display unit 152 is configured by, for example, a liquid crystal display or an organic EL (Electro-Luminescence) display.
  The input unit 151 has a function of detecting the position at which a user's operation (a physical contact operation such as a touch operation, a slide operation, a swipe operation, or a tap operation) is input on the input surface, and of transmitting information indicating that position as an input signal. The input unit 151 may include a touch sensing unit (not shown). The touch sensing unit may adopt any method, such as a capacitance method or a resistive film method.
  Although not shown, the user terminal 100 may include one or more sensors for identifying the holding posture of the user terminal 100, such as an acceleration sensor or an angular velocity sensor. When the user terminal 100 includes such a sensor, the processor 10 can identify the holding posture of the user terminal 100 from the sensor output and perform processing according to that posture. For example, when the user terminal 100 is held vertically, the processor 10 may perform a vertical screen display in which a vertically long image is displayed on the display unit 152; when the user terminal 100 is held horizontally, it may perform a horizontal screen display in which a horizontally long image is displayed. In this way, the processor 10 may be able to switch between the vertical screen display and the horizontal screen display according to the holding posture of the user terminal 100.
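The posture-dependent switching above can be sketched with a 3-axis acceleration sensor: gravity dominates whichever in-plane axis points downward. The axis convention and the display labels are illustrative assumptions.

```python
# Sketch: decide portrait vs. landscape from the two in-plane
# acceleration components (gravity dominates the downward axis).
def holding_posture(ax: float, ay: float) -> str:
    return "portrait" if abs(ay) >= abs(ax) else "landscape"


def select_display(ax: float, ay: float) -> str:
    if holding_posture(ax, ay) == "portrait":
        return "vertical screen display (vertically long image)"
    return "horizontal screen display (horizontally long image)"


# Terminal held upright: gravity (~9.8 m/s^2) mostly on the y axis.
print(select_display(0.1, 9.7))
# Terminal held sideways: gravity mostly on the x axis.
print(select_display(9.6, 0.3))
```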
  The camera 17 includes an image sensor and the like, and generates a captured image by converting incident light incident from a lens into an electric signal.
  The distance measuring sensor 18 is a sensor that measures the distance to the measurement object. The distance measuring sensor 18 includes, for example, a light source that emits pulse-converted light and a light receiving element that receives the light. The distance measuring sensor 18 measures the distance to the measurement object based on the light emission timing from the light source and the light reception timing of the reflected light generated when the light emitted from the light source is reflected by the measurement object. The distance measuring sensor 18 may include a light source that emits light having directivity.
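The pulse time-of-flight principle described above reduces to a simple computation: the distance is half the round-trip time multiplied by the speed of light. The timing values below are illustrative.

```python
# Sketch: distance from emission/reception timing of a light pulse.
SPEED_OF_LIGHT = 299_792_458.0  # m/s


def distance_m(emit_time_s: float, receive_time_s: float) -> float:
    # The pulse travels to the object and back, so halve the round trip.
    round_trip = receive_time_s - emit_time_s
    return SPEED_OF_LIGHT * round_trip / 2.0


# Reflected pulse received 10 nanoseconds after emission -> about 1.5 m.
print(round(distance_m(0.0, 10e-9), 3))  # 1.499
```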
  Here, an example will be further described in which the user terminal 100 uses the camera 17 and the distance measuring sensor 18 to detect an object 1010 in the vicinity of the user terminal 100 and accepts the detection result as the user's input operation. The camera 17 and the distance measuring sensor 18 may be provided, for example, on a side surface of the housing of the user terminal 100, with the distance measuring sensor 18 in the vicinity of the camera 17. As the camera 17, for example, an infrared camera can be used. In this case, the camera 17 may be provided with an illumination device that emits infrared light, a filter that blocks visible light, and the like. This can further improve the accuracy of object detection based on the image captured by the camera 17, whether outdoors or indoors.
  The processor 10 may perform one or more of the following processes (1) to (5) on the image captured by the camera 17.
(1) The processor 10 performs image recognition processing on the captured image to identify whether it includes the user's hand. The processor 10 may use a technique such as pattern matching as the analysis technique employed in this image recognition processing.
(2) The processor 10 detects the user's gesture from the shape of the user's hand. For example, the processor 10 identifies the number of extended fingers from the shape of the hand detected in the captured image, and then identifies the gesture performed by the user from that number. For example, when the number of fingers is five, the processor 10 determines that the user has performed a "par" (paper) gesture; when the number of fingers is zero (no finger is detected), that the user has performed a "goo" (rock) gesture; and when the number of fingers is two, that the user has performed a "choki" (scissors) gesture.
(3) The processor 10 performs image recognition processing on the captured image to detect whether only the user's index finger is raised, and whether the user's finger has moved.
(4) The processor 10 detects the distance between an object 1010 in the vicinity of the user terminal 100 (such as the user's hand) and the user terminal 100, based on at least one of the image recognition result of the captured image and the output value of the distance measuring sensor 18. For example, depending on the size of the hand shape identified in the captured image, the processor 10 detects whether the user's hand is near (for example, at a distance less than a predetermined value) or far (for example, at a distance equal to or greater than the predetermined value). When the captured image is a moving image, the processor 10 may detect whether the user's hand is approaching or moving away from the user terminal 100.
(5) When it is determined, based on the image recognition result of the captured image, that the distance between the user terminal 100 and the user's hand is changing while the hand is detected, the processor 10 recognizes that the user is waving his or her hand in the shooting direction of the camera 17. When no object is detected by the distance measuring sensor 18, whose directivity is higher than the shooting range of the camera 17, the processor 10 recognizes that the user is waving his or her hand in a direction perpendicular to the shooting direction of the camera.
  In this manner, the processor 10 detects, by image recognition on the image captured by the camera 17, whether or not the user is clenching his or her hand (a "goo" gesture or another gesture such as "par"). The processor 10 also detects how the user is moving the hand along with its shape, and whether the hand is approaching or moving away from the user terminal 100. Such operations can correspond to operations using a pointing device such as a mouse or a touch panel. For example, the user terminal 100 moves a pointer on the touch screen 15 according to the movement of the user's hand, and when it detects the user's "goo" gesture, it recognizes that the user is continuing a selection operation. The continuation of the selection operation corresponds, for example, to keeping a mouse button clicked and held, or to maintaining a touched state after a touchdown operation on a touch panel. Further, when the user moves his or her hand while the "goo" gesture is detected, the user terminal 100 can recognize such a series of gestures as an operation corresponding to a swipe operation (or a drag operation). In addition, when the user terminal 100 detects, from the detection result of the user's hand based on the captured image, a gesture of the user flicking a finger, it may recognize that gesture as an operation corresponding to a click or a tap on the touch panel.
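The classification steps above (finger count to gesture, apparent hand size to proximity) can be sketched as follows. The thresholds are illustrative assumptions; the actual recognition pipeline is not specified in the disclosure.

```python
# Sketch: map a detected hand shape to the rock-paper-scissors
# gestures named above ("goo" = rock, "choki" = scissors,
# "par" = paper), and decide near/far from the apparent hand size.
def classify_gesture(finger_count: int) -> str:
    if finger_count == 0:
        return "goo"    # no fingers extended -> rock
    if finger_count == 2:
        return "choki"  # two fingers extended -> scissors
    if finger_count == 5:
        return "par"    # five fingers extended -> paper
    return "unknown"


def hand_proximity(hand_area_px: int, threshold_px: int = 5000) -> str:
    # A larger hand shape in the captured image means a nearer hand.
    return "near" if hand_area_px >= threshold_px else "far"


print(classify_gesture(2), hand_proximity(8000))  # choki near
```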
<Functional configuration of game system 1>
FIG. 2 is a block diagram illustrating functional configurations of the server 200 and the user terminal 100 included in the game system 1. Each of the server 200 and the user terminal 100 may include a functional configuration necessary for realizing a known function in a game, and a functional configuration necessary for functioning as a general computer (not shown).
  The user terminal 100 has a function as an input device that receives a user input operation and a function as an output device that outputs a game image and sound. The user terminal 100 functions as the control unit 110 and the storage unit 130 by the cooperation of the processor 10, the memory 11, the storage 12, the communication IF 13, the input / output IF 14, and the like.
  The server 200 communicates with each user terminal 100 and supports the user terminal 100 in advancing the game; for example, it sells valuable data and provides services. When the game is a multiplayer game, the server 200 may communicate with each user terminal 100 participating in the game and mediate exchanges between the user terminals 100. The server 200 functions as the control unit 210 and the storage unit 230 by the cooperation of the processor 20, the memory 21, the storage 22, the communication IF 23, the input / output IF 24, and the like.
  The storage unit 130 and the storage unit 230 store a game program 131, game information 132, and user information 133. The game program 131 is a game program that is executed by the user terminal 100 and the server 200. The game information 132 is data that the control unit 110 and the control unit 210 refer to when executing the game program 131. The user information 133 is data relating to the user account. In the storage unit 230, game information 132 and user information 133 are stored for each user terminal 100.
(Functional configuration of server 200)
The control unit 210 performs overall control of the server 200 by executing the game program 131 stored in the storage unit 230. For example, the control unit 210 transmits various data and programs to the user terminal 100. The control unit 210 receives part or all of the game information or user information from the user terminal 100. When the game is a multiplayer game, the control unit 210 may receive a multiplayer synchronization request from the user terminal 100 and transmit data for synchronization to the user terminal 100.
  The control unit 210 functions as a transmission / reception unit 211, a server processing unit 213, and a data management unit 212 according to the description of the game program 131. The control unit 210 can also function as other functional blocks (not shown) in order to support the progress of the game on the user terminal 100 according to the nature of the game to be executed.
  The transmission / reception unit 211 transmits / receives various data. For example, the transmission / reception unit 211 receives various data and program transmission requests from the user terminal 100, multiplay synchronization requests, synchronization data, and the like, and sends them to the server processing unit 213. For example, the transmission / reception unit 211 transmits various data and programs to the user terminal 100 in accordance with instructions from the server processing unit 213.
  The server processing unit 213 performs arithmetic processing necessary for providing a game. The server processing unit 213 executes a calculation process described in the game program 221 in response to a request from the user terminal 100 or the like. For example, the server processing unit 213 performs various determination processes related to the progress of the game.
  For example, the server processing unit 213 instructs the data management unit 212 to add, update, or delete the game information 122 or the user information 123. For example, the server processing unit 213 instructs the transmission / reception unit 211 to transmit various data or programs.
  The data management unit 212 manages various data stored in the storage unit 230 in accordance with instructions from the server processing unit 213. For example, the data management unit 212 adds, updates, or deletes the game information 122 or the user information 123 according to an instruction from the server processing unit 213.
  Further, for example, the data management unit 212 reads at least one of the game information 222 and the user information 223 from the storage unit 230 in accordance with an instruction from the server processing unit 213, and transmits it to the user terminal 100 via the transmission / reception unit 211.
  Further, for example, the data management unit 212 reads from the storage unit 230 a part of the game program 121 to be executed by the user terminal 100 in accordance with an instruction from the server processing unit 213, and transmits the program to the user terminal 100 via the transmission / reception unit 211.
(Functional configuration of user terminal 100)
The control unit 110 performs overall control of the user terminal 100 by executing the game program 131 stored in the storage unit 130. For example, the control unit 110 advances the game according to the game program 131 and the user's operation. In addition, the control unit 110 communicates with the server 200 and transmits / receives information as necessary while the game is in progress.
  In accordance with the description of the game program 131, the control unit 110 functions as an operation receiving unit 111, a display control unit 112, a user interface (hereinafter, UI) control unit 113, an animation generation unit 114, an element selection unit 115, a group organization unit 116, a first part progression unit 117, and a second part progression unit 118. The control unit 110 can also function as other functional blocks (not shown) in order to advance the game according to the nature of the game to be executed.
  The operation receiving unit 111 detects and receives a user's input operation on the input unit 151. The operation receiving unit 111 determines what input operation has been performed based on the user's action on the console via the touch screen 15 and the other input / output IF 14, and outputs the result to each element of the control unit 110.
  For example, the operation receiving unit 111 receives an input operation on the input unit 151, detects the coordinates of the input position of the input operation, and identifies the type of the input operation. The operation receiving unit 111 identifies, for example, a touch operation, a slide operation, a swipe operation, a tap operation, and the like as the types of input operations. Further, when a continuously detected input is interrupted, the operation receiving unit 111 detects that the contact input from the touch screen 15 has been released.
  The UI control unit 113 controls a UI object displayed on the display unit 152 in order to construct a UI. The UI object is a tool for the user to make an input necessary for the progress of the game to the user terminal 100 or a tool for obtaining information output during the progress of the game from the user terminal 100. The UI object is not limited to this, but is, for example, an icon, a button, a list, a menu screen, or the like.
  The animation generation unit 114 generates an animation indicating the motion of various objects based on the control mode of the various objects.
  The display control unit 112 outputs a game screen on which the processing results executed by the above-described elements are reflected on the display unit 152 of the touch screen 15. The display control unit 112 may display a game screen including the animation generated by the animation generation unit 114 on the display unit 152. Further, the display control unit 112 may draw the UI object described above by superimposing it on the game screen.
  Note that the functions of the server 200 and the user terminal 100 shown in FIG. 2 are merely examples. The server 200 may include at least a part of functions included in the user terminal 100. Further, the user terminal 100 may include at least a part of the functions included in the server 200. Furthermore, a device other than the user terminal 100 and the server 200 may be used as a component of the game system 1, and the other device may be caused to execute part of the processing in the game system 1. That is, the computer that executes the game program in the present embodiment may be any of the user terminal 100, the server 200, and other devices, or may be realized by a combination of these devices.
(Details of information that can be stored in storage unit 130)
The storage unit 130 can store the following information as part of the user information 133. These pieces of information are generated or updated by at least one of the functional blocks of the control unit 110 and stored. These pieces of information are read and used by at least one of the functional blocks of the control unit 110.
  (1) Information on each character assigned to the user: Information on each character assigned to the user among the plurality of characters available in the game system 1. For example, one or more characters may be given to the user in the initial state of the game, and additional characters may be given to the user as the game progresses. The information on each character includes various parameters used in the first part, which can be updated as the game progresses. The information on each character also includes information representing the character's attribute. The attribute may be, for example, an attribute related to the appearance of the character, and can be changed by the progress of the game or by the user's settings.
  (2) First parameter: Information representing the relationship between any two of the characters assigned to the user. The relationship may be, for example, the degree of friendship between the two characters. The first parameter may be stored in association with the combination of the two characters. Alternatively, the first parameters between a character and the other characters may be included in the information on each character described above. The first parameter can be updated as the game progresses.
  (3) Second parameter: information relating to a user who uses the user terminal 100. The second parameter can be updated as the game progresses. The second parameter may represent a user experience value, for example.
  (4) Information on group: Information representing members constituting the group used in the first part. The information about the group also includes information about two or more main characters among the members and information about the role of each main character. Information about the group can be updated according to user settings.
  (5) Information regarding elements: Information regarding elements given to the user among elements that can be used in the game system 1 as elements that can constitute the first event. For example, one or more elements may be given to the user in the initial state of the game. Further, additional elements may be given to the user as the game progresses. In the present embodiment, as described above, the element that can constitute the first event is a music video representing a song performed live. The music video may include sound and images. The music video may be composed of a plurality of layers.
  Further, the information on an element includes information on the type of the element. The type of the element is reflected in the contents of the effects described later. The information on an element also includes information on the combination of characters defined for the element. The information on the combination of characters is reflected in the contents of the effects described later.
  (6) Information about production: Production is a device for exciting the progress of the first part, and is executed in the first part. In the present embodiment, the effect is realized as an image representing the relationship between two or more main characters. For example, the information related to the performance may include information representing an image that can be inserted into a music video. Further, the effects may include, for example, a first effect and a second effect. The first effect is a content determined for each combination of characters. The second effect is content determined according to the attributes of each character in addition to the combination of characters. Further, the second effect may be contents determined according to the role of the character. An example of the attribute is an attribute related to the appearance, but is not limited to this. When the music video is composed of a plurality of layers, the information representing the first effect may be represented as a layer image including the characters in the same scene for each combination of characters. Further, the information representing the first effect may be represented as a combination of layer images prepared for each character. Moreover, the information representing the second effect may be represented as a combination of layer images prepared for each character according to the attribute and role. Further, the information representing the first effect or the information representing the second effect may further include audio information.
  The first event described above may include at least the second effect as special content. In the present embodiment, an event other than the first event may be executed in the first part. Hereinafter, such an event is also referred to as a normal event. A normal event does not include the special content. For example, the normal event may include the first effect without including the second effect.
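As a minimal illustration of the stored information described in (1) to (3) above (this sketch is not part of the present disclosure; all names and the use of an unordered pair as a key are assumptions), the first parameter between any two characters can be held symmetrically as follows.

```python
# Hypothetical sketch: the first parameter (e.g., degree of friendship) is
# stored per unordered pair of characters, so lookups and updates are
# symmetric in the two characters; the second parameter is a per-user value.
class UserInfo:
    def __init__(self):
        self.characters = {}    # character id -> {"attribute": ..., "params": {...}}
        self.first_params = {}  # frozenset({id_a, id_b}) -> int
        self.second_param = 0   # e.g., user experience value

    def get_first_param(self, char_a, char_b):
        """Relationship value between two characters (0 if never set)."""
        return self.first_params.get(frozenset((char_a, char_b)), 0)

    def add_first_param(self, char_a, char_b, delta):
        """Raise (or lower) the relationship between two characters."""
        key = frozenset((char_a, char_b))
        self.first_params[key] = self.first_params.get(key, 0) + delta
```

Using a `frozenset` as the key reflects that the combination, not the order, of the two characters identifies the first parameter.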
(Details of group organization unit 116)
The group organization unit 116 forms a group used in the first part using a plurality of characters that can be used in the game system 1. Specifically, the group organization unit 116 organizes the members of the group performing the live in the first part. Characters that can be organized into groups may be characters assigned to the user among a plurality of characters that can be used in the game system 1.
  In addition, the group organization unit 116 selects two or more characters from the characters constituting the group used in the first part. Hereinafter, the two or more characters selected from the group are also referred to as main characters. Specifically, the group organization unit 116 selects, from the members of the group performing the live in the first part, two or more main characters who are the center of the first event. Hereinafter, it is assumed that two main characters are selected, but three or more may be selected. When it is necessary to distinguish the two main characters, they are also referred to as a first main character and a second main character.
  The group organization unit 116 may assign a role to each of the two main characters selected as described above. Examples of roles include master-slave relationships. For example, the group organization unit 116 may assign the first role to one of the main characters and assign the second role to the other. As the first role, for example, the master of the master-slave relationship, and as the second role, the slave of the master-slave relationship can be cited. The first role and the second role are reflected in the effect at the first event executed by the first part progression unit 117.
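The group organization described above can be sketched as follows (this is a hypothetical illustration, not part of the present disclosure; the class and role names are assumptions). Two members are designated as main characters with a first role and a second role, and the roles can later be exchanged, as with the exchange button described later.

```python
# Hypothetical sketch: a group of members with two main characters, one
# assigned the first role (e.g., master) and one the second role (e.g.,
# servant); swap_roles() exchanges the two assignments.
class Group:
    def __init__(self, members, first_main, second_main):
        assert first_main in members and second_main in members
        self.members = list(members)
        self.roles = {"first": first_main, "second": second_main}

    def swap_roles(self):
        """Exchange the roles assigned to the two main characters."""
        self.roles["first"], self.roles["second"] = (
            self.roles["second"], self.roles["first"])
```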
(Details of the first part progression unit 117)
The first part progression unit 117 proceeds the first part using a group. In proceeding with the first part, the first part progression unit 117 refers to various parameters possessed by each character constituting the group. Specifically, the first part progression unit 117 reproduces a music video as a group live in the first part. Then, the first part progression unit 117 proceeds with the music game by determining whether or not an operation has been accepted in accordance with the rhythm of the song included in the music video with reference to the various parameters described above.
  The first part progression unit 117 also executes, in the first part, an effect representing the relationship between the main characters. Specifically, the first part progression unit 117 reproduces the music video as the live by the group while including, in the music video, a video serving as an effect representing the relationship between the main characters. For example, one or both of the first effect and the second effect described above are included in the music video according to various conditions described later. A live music video including both the first effect and the second effect is an example of the first event described above, and a live music video including the first effect but not the second effect is an example of the normal event described above. As the first effect, the first part progression unit 117 may insert a layer image including the two selected main characters in the same scene at a predetermined point in the music video, or may insert the layer images of the two selected main characters individually at predetermined points in the music video. As the second effect, the first part progression unit 117 may insert layer images based on the attributes of the two selected main characters into different layers of the music video based on their roles. It is desirable that the layer images based on the attributes of the two main characters be inserted with a time difference based on the roles and then have a period in which they are displayed simultaneously. Details of the first effect and the second effect will be described later.
  Thus, in the first part, the first effect or the second effect provides the user with an opportunity to enjoy the relationship between the main characters. With the first effect, the user can enjoy the relationship between the main characters. With the second effect, in which the main characters, each reflecting its attribute, appear with a time difference based on their roles, the user can enjoy the relationship between the main characters further.
  In addition, when the first parameter representing the relationship between the main characters satisfies the first condition, the first part progression unit 117 executes a first event having contents corresponding to the main character in the first part. Here, the first condition is a condition related to the first parameter, and may be, for example, that the first parameter is equal to or greater than a threshold value. Hereinafter, the condition that the first parameter satisfies the first condition is simply referred to as the first condition.
  In this way, the first parameter, which is the relationship between characters, is involved in the execution of the first event, so that the fun of the game is improved.
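The first condition described above can be sketched as a simple threshold test (a hypothetical illustration, not part of the present disclosure; the function name and the threshold value are assumptions).

```python
# Hypothetical sketch of the first condition: the first parameter between
# the two main characters must be at least a threshold value.
FIRST_PARAM_THRESHOLD = 10  # hypothetical value

def first_condition(first_param, threshold=FIRST_PARAM_THRESHOLD):
    """True if the relationship parameter satisfies the first condition."""
    return first_param >= threshold
```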
  The first part progression unit 117 may apply a second condition, in addition to the first condition, as a condition for executing the first event. The second condition is a condition that the plurality of characters constituting the organized group is a predetermined combination, for example, the combination determined for the element selected by the element selection unit 115 described later.
  A case where the second condition is not satisfied will be described. In this case, the first part progression unit 117 does not execute the first event. In this case, for example, the first part progression unit 117 may execute the normal event described above.
  A case where the second condition is satisfied but the first condition is not satisfied will be described. In this case, the first part progression unit 117 does not execute the first event. In this case, for example, the first part progression unit 117 may not start the live in the first part. In other words, for example, the first part progression unit 117 may not execute either the first event or the normal event.
  This improves the user's motivation to configure the members of the group with the predetermined combination of characters so that the first event is executed in the first part (that is, so that the second effect can be appreciated in the first event).
  Further, the first part progression unit 117 may apply a third condition as a condition for executing the first event in addition to the first condition and the second condition. The third condition represents a condition that the selected element is a predetermined type.
  For example, it is assumed that the music videos available in the game system 1 are classified as either type A or type B. In this case, the first part progression unit 117 may execute the first event when the selected music video is of type A (the third condition), the members of the organized group are the combination determined for the music video (the second condition), and the first parameter between the two main characters satisfies the first condition.
  A case where the third condition is not satisfied will be described. In this case, the first part progression unit 117 does not execute the first event. In this case, for example, the first part progression unit 117 may execute the normal event described above.
  This improves the user's motivation to configure the members of the group with the predetermined combination of characters determined for a predetermined element so that the first event is executed in the first part (that is, so that the second effect can be appreciated at the first event).
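One possible reading of the condition handling described above can be sketched as follows (a hypothetical illustration, not part of the present disclosure; the ordering of the checks when several conditions fail at once is an assumption). If the second or third condition fails, the normal event may run instead of the first event; if those hold but the first condition fails, neither event runs and the live may not start.

```python
# Hypothetical sketch of event selection in the first part, per the cases
# described above: returns "first", "normal", or None (live does not start).
def decide_event(first_ok, second_ok, third_ok):
    if not second_ok or not third_ok:
        return "normal"   # first event not executed; normal event may run
    if not first_ok:
        return None       # neither the first event nor the normal event
    return "first"        # all three conditions satisfied
```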
(Details of the element selection unit 115)
The element selection unit 115 selects any element from one or more elements that can constitute the first event. For example, the one or more elements that can constitute the first event may be elements given to the user among one or more elements that can be used in the game system 1. Further, the number of element choices may change according to the second parameter related to the user. The second parameter will be described later. In the present embodiment, the element that can constitute the first event is the music video described above.
(Details of the second part progression unit 118)
The second part progression unit 118 selects two or more characters whose parameters are to be updated in the second part. Hereinafter, the two or more selected characters are also referred to as breeding characters. In the following description, it is assumed that there are two breeding characters to be selected, but there may be three or more.
  In addition, the second part progression unit 118 executes the second event for the combination of two breeding characters. Further, the second part progression unit 118 updates the first parameter between the two breeding characters in accordance with the execution of the second event. For example, the second part progression unit 118 may update the first parameter between the breeding characters to be larger by executing the second event.
  Further, the second part progression unit 118 may update the second parameter related to the user in accordance with the execution of the second event. For example, the second part progression unit 118 may update the second parameter to be larger by executing the second event. In this case, the larger the second parameter, the greater the number of element options. For example, every time the second parameter exceeds a predetermined value, an element as an option may be added to the user.
  Thereby, the user can enjoy variations of the first event in the first part.
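The updates performed by the second part progression unit 118, as described above, can be sketched as follows (a hypothetical illustration, not part of the present disclosure; the gain amounts and the unlock step are assumed values).

```python
# Hypothetical sketch: executing a second event raises the first parameter
# between the two breeding characters and the user's second parameter; the
# number of selectable elements grows as the second parameter exceeds each
# multiple of UNLOCK_STEP.
UNLOCK_STEP = 100  # hypothetical value

def run_second_event(first_param, second_param, gain_first=1, gain_second=10):
    """Return the updated (first parameter, second parameter) pair."""
    return first_param + gain_first, second_param + gain_second

def unlocked_elements(second_param, base=1, step=UNLOCK_STEP):
    """Number of element choices available for the first part."""
    return base + second_param // step
```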
<Screen example of game system 1>
(Specific example of element selection screen)
FIG. 3 is a specific example of an element selection screen generated by the element selection unit 115. The element selection screen is an example of a screen displayed in the first part, and is a screen for selecting a music video that can constitute the first event executed in the first part. In FIG. 3, the music video is described as MV. In FIG. 3, the element selection screen includes element objects 301a to 301c and a next button 302. The next button 302 is a UI object for proceeding to the display of the group organization screen.
  Element objects 301a to 301c represent element choices. FIG. 3 shows an example in which the number of element options is three, but the number of element options is not limited to three. Further, when the number of element choices is larger than the number of elements that can be displayed simultaneously on the element selection screen, the element selection screen may further include a scroll bar (not shown). In this case, an element object representing another element that has not been displayed until then is displayed in response to an input operation on the scroll bar. Note that, as described above, the number of element options changes according to the second parameter. In the example of FIG. 3, the element object 301a represents the music video MV1, the element object 301b represents the music video MV2, and the element object 301c represents the music video MV3.
  For example, the element selection unit 115 selects a corresponding element in accordance with an input operation on any of the element objects 301a to 301c. In the present embodiment, the number of elements to be selected is one, but may be two or more.
(Details of group organization screen)
FIG. 4 is a specific example of the group organization screen generated by the group organization unit 116. The group organization screen is an example of a screen displayed in the first part, and is a screen for organizing the group performing the live in the first part. In FIG. 4, the group organization screen includes a first main character frame 401a, a second main character frame 401b, three sub character frames 401c to 401e, and a parameter display area 402. Hereinafter, the first main character frame 401a, the second main character frame 401b, and the three sub character frames 401c to 401e are collectively referred to as character frames 401a to 401e. In this specific example, the number of members constituting the group is five. A sub character is a member who is not a main character. The first main character frame 401a is a UI object for setting the first main character, to which the first role is assigned. The second main character frame 401b is a UI object for setting the second main character, to which the second role is assigned. The sub character frames 401c to 401e are UI objects for setting the sub characters.
  For example, the group organization unit 116 may display a list screen (not shown) of the characters assigned to the user in response to an input operation on the first main character frame 401a. Note that "display" refers to display on the display unit 152 using the display control unit 112. Then, when any character is selected from the list screen by an input operation, the group organization unit 116 may close the list screen and display the selected character in the first main character frame 401a. An "input operation" refers to an operation on the input unit 151 received by the operation receiving unit 111. The input operation may be, for example, a tap operation on the touch screen 15, but is not limited thereto. In this way, the first main character, to which the first role is assigned, is selected from the characters assigned to the user. Similarly, a character selected from the characters assigned to the user is displayed in the second main character frame 401b and in each of the sub character frames 401c to 401e. Thereby, the members of the group are organized on the group organization screen.
  Further, when a character is selected by an input operation on any of the character frames 401a to 401e, the group organization unit 116 displays the parameters of that character in the parameter display area 402.
  Further, the group organization screen shown in FIG. 4 further includes an exchange button 403, an automatic organization button 404, and a start button 405.
  The exchange button 403 is a UI object for exchanging the roles assigned to the two main characters. The group organization unit 116 switches the roles of the main characters in response to an input operation on the exchange button 403. That is, the group organization unit 116 accepts an input operation on the exchange button 403 in a state where characters are displayed in the first main character frame 401a and the second main character frame 401b. Then, in response to the input operation, the group organization unit 116 switches the first main character displayed in the first main character frame 401a and the second main character displayed in the second main character frame 401b. Thereby, the character selected as the first main character is selected as the second main character, and the character selected as the second main character is selected as the first main character.
  The automatic organization button 404 is a UI object for automatically organizing group members. The group organization unit 116 selects a member of the group from the characters assigned to the user in response to an input operation on the automatic organization button 404. That is, the group organization unit 116 selects group members from among characters assigned to the user. For example, the group organization unit 116 may select a member of a group that is advantageous for proceeding with the first part in response to an input operation on the automatic organization button 404. The members of the group having such an advantageous configuration may be selected in consideration of various parameters of each character.
  The start button 405 is a UI object for starting live in the first part by the members of the organized group. In other words, the start button 405 is a UI object for starting execution of the first event. However, a normal event may be started in response to an operation on the start button 405. The input operation to the start button 405 is detected by the first part progression unit 117.
  As described above, since the exchange button 403 is included in the group organization screen, there is an advantage that the roles can be easily exchanged while performing the group organization. Further, since the automatic organization button 404 is included in the group organization screen, there is an advantage that even a beginner user can easily perform group organization.
(Specific example of the first event screen)
FIG. 5A is a specific example of the first event screen displayed in the first part. As shown in FIG. 5A, the first event screen includes an effect area 501 and a first part progression area 502. The effect area 501 is an area in which a music video, which is an example of an element that can constitute the first event, is reproduced including at least one of the first effect and the second effect.
  The first part progression area 502 in FIG. 5A is an area in which a timing index 503 is moved and displayed in accordance with the rhythm of the music in the music video being played in the effect area 501. The timing index 503 is a UI object that is, for example, moved and displayed from right to left on the screen. A plurality of timing indexes 503 may be sequentially moved and displayed. When a timing index 503 overlaps the predetermined area 504 in the first part progression area 502, the first part progression unit 117 determines whether an input operation has been accepted for the timing index 503. At this time, the first part progression unit 117 advances the music game by making this determination while referring to the various parameters of each character constituting the organized group. Thereby, the user can enjoy the music game while also enjoying the relationship between the main characters through the first effect or the second effect reproduced in the effect area 501.
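The timing judgment described above can be sketched as a time-window test (a hypothetical illustration, not part of the present disclosure; the window width, function names, and the greedy note-matching strategy are assumptions).

```python
# Hypothetical sketch: an input counts as a hit when it arrives while the
# timing index overlaps the judgment area, modelled here as a window of
# +/- `window` seconds around each note time.
def judge(note_time, input_time, window=0.1):
    """True if the input falls within the judgment window of the note."""
    return abs(input_time - note_time) <= window

def score(note_times, input_times, window=0.1):
    """Count hits, consuming each input for at most one note (greedy)."""
    hits = 0
    remaining = sorted(input_times)
    for note in note_times:
        for i, t in enumerate(remaining):
            if judge(note, t, window):
                hits += 1
                del remaining[i]
                break
    return hits
```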
(Specific example of the result screen)
FIG. 5B is a specific example of a result screen showing the progress result of the first part, and is a screen showing the result of the music game progressed in the first part progress area 502. The result screen includes a display area 505 in which information representing the selected element (for example, music video) is displayed, and a display area 506 in which information representing the score is displayed. Further, the first part progression unit 117 may update at least a part of various parameters of each character constituting the group based on the result of the music game. In this case, the screen representing the result may further include information on the updated parameter.
(Example of realization of the first effect and the second effect)
FIGS. 6 to 9 are diagrams for explaining realization examples of the first effect and the second effect in the effect area 501.
  First, with reference to FIGS. 6A and 6B, the layer structure of a music video and the layer images arranged in each layer will be described. In FIGS. 6A and 6B, arrows 601 to 603 indicate the passage of the playback time t in the layers constituting the music video. Times t0 to t6 represent predetermined points in the music video playback period and are reached in this order during playback. In this example, the music video is composed of three layers (a front layer, a middle layer, and a rear layer). Hereinafter, instead of referring to the arrows 601 to 603, these are also described as the front layer 601, the middle layer 602, and the rear layer 603, and collectively as the layers 601 to 603. Line segments 611 to 615 drawn near the arrows of the layers 601 to 603 represent layer images inserted into the corresponding layer during the period indicated by each line segment; hereinafter, these are also described as the layer images 611 to 615. Note that other layer images (not shown) may be inserted into the layers 601 to 603 outside the periods in which the layer images 611 to 615 are inserted. In this specific example, the middle layer 602 is superimposed on the rear layer 603, the front layer 601 is superimposed on the middle layer 602, and the result is displayed on the display unit 152. The layer images displayed in the layers 601 to 603 are composed of pixels for which transparency is set, and each pixel is combined with the pixels of the layers behind it according to its transparency before being displayed on the display unit 152. A known technique can be employed for integrating a plurality of layers and displaying them on the display unit 152.
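The per-pixel combination described above can be sketched with the standard "over" operator (a hypothetical illustration, not part of the present disclosure; the function names, the use of "over" compositing, and treating the rear layer as opaque are assumptions).

```python
# Hypothetical sketch: composite three layers back-to-front (rear 603, then
# middle 602, then front 601); each pixel carries an alpha value and is
# blended over the layers behind it.
def over(front_rgb, front_alpha, back_rgb):
    """Composite one front pixel over an opaque back pixel."""
    return tuple(front_alpha * f + (1 - front_alpha) * b
                 for f, b in zip(front_rgb, back_rgb))

def composite(rear, middle, front):
    """Each argument is (rgb, alpha); the rear layer is treated as opaque."""
    out = over(middle[0], middle[1], rear[0])
    return over(front[0], front[1], out)
```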
  FIG. 6A schematically shows a music video including the first effect. In other words, FIG. 6A is an example of a music video that is played when at least one of the second condition and the third condition described above is not satisfied.
  In FIG. 6A, layer images 611 to 615 are inserted into any of the layers 601 to 603. Note that each of the layer images 611 to 615 is not necessarily a single still image, and each of the layer images 611 to 615 may be a moving image including a plurality of still images.
  The layer image 611 includes a music video title, lyrics, decorations, and the like. The layer image 611 is arranged on the previous layer 601 during the time t0 to t6.
  The layer image 612 includes a background common to the music video. The layer image 612 is arranged on the rear layer 603 during the period from time t0 to t6.
  In FIG. 6A, the layer images 613 to 615 constitute the first effect. The layer image 613 is a layer image including the first main character. The layer image 614 is a layer image including the second main character. The layer image 615 is a layer image including the two main characters in the same scene. The layer image 613 is inserted into the middle layer 602 during the period from time t1 to t2. The layer image 614 is inserted into the middle layer 602 during the period from time t2 to t3. The layer image 615 is inserted into the middle layer 602 during the period from time t3 to t4. Thus, in the first effect, the two main characters first appear sequentially in non-overlapping periods and then appear simultaneously.
  With such a first effect, the user can enjoy a combination of main characters.
  FIG. 6B schematically shows a music video including the first effect and the second effect. FIG. 6B is an example of a music video that is played when all of the first condition, the second condition, and the third condition described above are satisfied.
  In FIG. 6B, the layer images 611 and 612 are as described above. However, the layer image 611 is arranged in the front layer 601 not during the period from time t0 to t6 but during the period from time t0 to t4. Similarly, the layer image 612 is arranged in the rear layer 603 not during the period from time t0 to t6 but during the period from time t0 to t4. Since the layer images 613 to 615 constitute the first effect as described above, their detailed description will not be repeated.
  The layer images 616 and 617 constitute the second effect. The layer image 616 is a layer image including the first main character, to whom the first role is assigned, with an appearance corresponding to the set attribute. The layer image 617 is a layer image including the second main character, to whom the second role is assigned, with an appearance corresponding to the set attribute. The layer image 616 is inserted into the rear layer 603 during the period from playback time t4 to t6. The layer image 617 is inserted into the front layer 601 during the period from time t5 to t6. As described above, in the second effect, the layer images corresponding to the attributes of the two main characters start to be inserted with a time difference based on their roles, and end their insertion at the same time. As a result, the second effect is realized in which the first main character, to whom the first role is assigned, appears first, and the second main character, to whom the second role is assigned, then additionally appears in the scene in which the first main character is already appearing. In FIG. 6B, an example has been described in which the title, lyrics, decorations, and the like of the music video are not displayed during the second effect. The present invention is not limited to this, and the title, lyrics, decorations, and the like of the music video may be displayed during the second effect. In that case, for example, the layer image 617 may be inserted into the middle layer 602 instead of the front layer 601, and the layer image 611 may continue to be inserted into the front layer 601 even during the second effect.
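The insertion timing of FIG. 6B can be represented as a simple schedule. The layer and image numbers follow the description above; treating times t0 to t6 as the integers 0 to 6, and the data layout itself, are assumptions for illustration.

```python
# Playback times t0..t6 are taken as 0..6 for illustration.
SCHEDULE_6B = [
    # (layer, image, start, end)
    ("front",  "611_title_lyrics", 0, 4),
    ("rear",   "612_common_bg",    0, 4),
    ("middle", "613_first_main",   1, 2),
    ("middle", "614_second_main",  2, 3),
    ("middle", "615_both_mains",   3, 4),
    # second effect: staggered starts by role, shared end at t6
    ("rear",   "616_first_attr",   4, 6),
    ("front",  "617_second_attr",  5, 6),
]

def visible_at(schedule, t):
    """Images inserted at playback time t, ordered rear -> front."""
    order = {"rear": 0, "middle": 1, "front": 2}
    active = [e for e in schedule if e[2] <= t < e[3]]
    return [e[1] for e in sorted(active, key=lambda e: order[e[0]])]
```

At t4 only the layer image 616 with the first role is shown, and from t5 the layer image 617 with the second role joins it, matching the staggered start and simultaneous end described above.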
  With such a second effect, the user can enjoy a relationship between the two main characters in which at least one of their attributes and roles is reflected. In addition, the second effect, which reflects at least one of the attributes and roles of the two main characters, is considered to be of higher value to the user than the first effect. Therefore, in order to view the second effect, the user is more motivated to progress the game so as to satisfy the first to third conditions. For example, the user's motivation to acquire all of the relevant characters is increased so that a group can be formed with the predetermined combination of members for a certain music video. In addition, the user's motivation to play the second part is increased so that the first parameter between desired characters satisfies the first condition.
  FIGS. 7A and 7B are examples of the layer images 611 and 612 used in FIG. 6B. For example, the layer image 611 shown in FIG. 7A includes the title, lyrics, and the like of the music video. The layer image 611 may be a moving image composed of a plurality of still images following the illustrated still image, and each still image may include, for example, a portion of the lyrics divided along the flow of the song.
  Further, the layer image 612 illustrated in FIG. 7B includes a common background. The layer image 612 may be a moving image composed of a plurality of still images following the illustrated still image.
  FIGS. 8A to 8E are examples of the layer images 613 to 617 used in FIG. 6A or 6B. For example, FIG. 8A represents the layer image 613, which includes the first main character. The attribute is not reflected in the first main character included in the layer image 613. Here, the attribute set for a character determines the appearance (for example, clothes) of the character. That is, the first main character included in the layer image 613 is wearing the default clothes regardless of the set attribute. The layer image 613 may be a moving image composed of a plurality of still images following the illustrated still image.
  FIG. 8B represents the layer image 614, which includes the second main character. The attribute is not reflected in the second main character included in the layer image 614. That is, the second main character included in the layer image 614 is wearing the default clothes regardless of the set attribute. Note that the layer image 614 may be a moving image composed of a plurality of still images following the illustrated still image.
  FIG. 8C shows a layer image 615 and includes the first main character and the second main character in the same scene. Further, the attributes are not reflected in the first main character and the second main character included in the layer image 615. The layer image 615 may be a moving image composed of a plurality of still images following the illustrated still image.
  FIG. 8D represents the layer image 616, which includes the first main character in the upper-left triangular half region. The attribute is reflected in the first main character included in the layer image 616. Specifically, the first main character included in the layer image 616 is dressed differently from that in FIG. 8A. The layer image 616 may be a moving image composed of a plurality of still images following the illustrated still image.
  FIG. 8E represents the layer image 617, which includes the second main character in the lower-right half region. The attribute is reflected in the second main character included in the layer image 617. Specifically, the second main character included in the layer image 617 is dressed differently from that in FIG. 8B. The upper-left half region of the layer image 617 consists of transparent pixels. The layer image 617 may be a moving image composed of a plurality of still images following the illustrated still image.
  FIGS. 9A to 9F represent states of the music video being played back on the display unit 152 when the layer images shown in FIGS. 7 and 8 are sequentially inserted along the timing chart shown in FIG. 6B.
  FIG. 9A is an example of the screen at t0 in FIG. 6B. As shown in FIG. 9A, at t0, the layer image 612 is arranged on the rear layer 603 and the layer image 611 is arranged on the front layer 601, so that a screen in which the title, lyrics, and the like are superimposed on the common background is displayed.
  FIG. 9B is an example of the screen at t1 in FIG. 6B. As shown in FIG. 9B, by arranging the layer image 613 on the middle layer 602, the first main character appears in the default clothes on the common background, with the lyrics superimposed on top.
  FIG. 9C is an example of the screen at t2 in FIG. 6B. As shown in FIG. 9C, by arranging the layer image 614 on the middle layer 602, the second main character appears in the default clothes on the common background in place of the first main character, with the lyrics following FIG. 9B superimposed on top.
  FIG. 9D is an example of the screen at t3 in FIG. 6B. As shown in FIG. 9D, by arranging the layer image 615 on the middle layer 602, the first main character and the second main character appear in the same scene in their default clothes on the common background, in place of the second main character alone.
  FIG. 9E is an example of the screen at t4 in FIG. 6B. As shown in FIG. 9E, by arranging the layer image 616 on the rear layer 603, the first main character appears wearing clothes that reflect the attribute, in place of the common background.
  FIG. 9F is an example of the screen at t5 in FIG. 6B. As shown in FIG. 9F, by arranging the layer image 617 on the front layer 601, the second main character additionally appears, wearing clothes that reflect the attribute, in the scene in which the first main character is appearing. In this specific example, the roles of the first main character and the second main character are reflected in the order of appearance in the second effect.
  In addition, since the region of the layer image 617 that corresponds to the region where the first main character is arranged in the layer image 616 consists of transparent pixels, the two main characters are displayed simultaneously when the layers are superimposed. As a result, the display of the main character to whom the second role is assigned, who appears later, does not cover the display of the main character to whom the first role is assigned, who appeared first, and a more effective second effect is realized.
  The specific examples of the first effect and the second effect described with reference to FIGS. 6 to 9 are merely examples. The music video does not necessarily have to be composed of a plurality of layers. In that case, the first effect only needs to have contents corresponding to the combination of the two main characters, and may be realized by a method other than those described above. Similarly, the second effect only needs to have contents corresponding to the respective roles and attributes in addition to the combination of the two main characters, and may be realized by a method other than those described above.
(Specific example of the second part screen)
FIG. 10A is a specific example of a breeding character selection screen for selecting the two breeding characters to be trained in the second part. In FIG. 10A, the breeding character selection screen includes character objects 801a to 801e, a display area 802, and a start button 803.
  Character objects 801a to 801e represent a list of characters assigned to the user. In this example, five characters are given to the user. When the number of characters given to the user is larger than the number of breeding characters that can be displayed simultaneously on the breeding character selection screen, the breeding character selection screen may further include a scroll bar (not shown). In that case, in response to an input operation on the scroll bar, a character object representing another character that has not been displayed is displayed.
  When an input operation is accepted for each of the character objects 801a to 801e, the corresponding character is selected as a breeding character. In this example, the number of breeding characters that can be selected simultaneously is two.
  The display area 802 is an area for displaying the first parameter between the two selected breeding characters. The display area 802 allows the user to select the two breeding characters whose first parameter is to be increased while confirming the first parameter between the characters.
  The start button 803 is a UI object for starting a second event for two selected breeding characters.
  The second part progression unit 118 starts the second event in response to the input operation on the start button 803 for the two breeding characters selected by the input operation on the character objects 801a to 801e.
  Here, the second part progression unit 118 may display a second event selection screen (not shown) for selecting the content of the second event when starting the second event. In this case, the second part progression unit 118 starts a second event with the selected content. For example, such a second event selection screen may be configured such that the contents of a stage where the second event is virtually executed, items used in the second event, and the like can be selected. Specific examples of the contents of the stage include, but are not limited to, “a candy-eating stage”. In this case, a specific example of the item is “inserted candy”, but is not limited thereto.
  Note that the breeding character selection screen shown in FIG. 10A is not necessarily displayed in the second part. The selection of the breeding character may be executed in a part other than the first part and the second part, for example.
  FIG. 10B is a diagram showing a specific example of the execution screen for the second event. In this specific example, as the second event, the second part progression unit 118 reproduces a video representing “a state in which a breeding character eats a candy that is put in at a stage where a candy is eaten”.
  FIG. 10C is a diagram illustrating a specific example of the result screen of the second event. In this specific example, the result screen includes a first parameter display area 804 and a second parameter display area 805. When the execution of the second event ends, the second part progression unit 118 displays, in the first parameter display area 804, the values of the first parameter between the two breeding characters before and after the execution of the second event. Further, the second part progression unit 118 displays, in the second parameter display area 805, the values of the second parameter related to the user before and after the execution of the second event. In this example, the updates of the first parameter and the second parameter each increase the respective value.
  Note that the second part progression unit 118 may vary the change width of at least one of the first parameter and the second parameter in accordance with the content of the second event (eg, stage, item, etc.).
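One way to vary the change widths by the content of the second event is a lookup keyed by stage and item. The stage and item names echo the specific example above; the numeric values, the table layout, and the function name are purely illustrative assumptions.

```python
# Hypothetical change widths; values are illustrative only.
STAGE_DELTAS = {"candy_eating_stage": (10, 2)}  # (first, second) parameter
ITEM_BONUS = {"inserted_candy": 5}              # added to the first parameter

def second_event_deltas(stage, item=None):
    """Change widths of the first and second parameters for one
    execution of the second event, varied by its content."""
    first_delta, second_delta = STAGE_DELTAS.get(stage, (5, 1))
    first_delta += ITEM_BONUS.get(item, 0)
    return first_delta, second_delta
```

With this shape, a stage or item that is harder to obtain can simply be mapped to a larger change width.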
  In this way, by playing the second part, the user can update the first parameter between the breeding characters so as to satisfy the first condition for the first event to be executed in the first part (that is, the first condition for viewing the second effect). In addition, by playing the second part, the user can update the second parameter related to himself or herself in order to increase the number of options of elements that can constitute the first event.
<Processing flow of game system 1>
FIG. 11 to FIG. 12 are flowcharts showing the flow of processing executed by the user terminal 100 based on the game program 131.
  FIG. 11 is a flowchart illustrating an operation executed by the user terminal 100 in the first part.
  In step S901, the element selection unit 115 selects an element (for example, a music video) that can constitute the first event. For example, the element selection unit 115 may select the element in response to an input operation on the element selection screen.
  In steps S902 to S905, the group organization unit 116 organizes a group using a plurality of characters assigned to the user.
  Specifically, in step S902, the group organization unit 116 selects two main characters from the plurality of characters assigned to the user and assigns a role to each. For example, the group organization unit 116 may select the first main character, to whom the first role is assigned, in response to an input operation on the first main character frame 401a on the group organization screen. Similarly, the group organization unit 116 may select the second main character, to whom the second role is assigned, in response to an input operation on the second main character frame 401b.
  In step S903, the group organization unit 116 selects each sub character from a plurality of characters assigned to the user. For example, the group organization unit 116 may select each sub character in response to an input operation on the sub character frames 401c to 401e on the group organization screen.
  Note that the group organization unit 116 may execute steps S902 and S903 in response to an input operation on the automatic organization button 404 on the group organization screen. In this case, the group organization unit 116 may select the two main characters and each sub character from among the characters assigned to the user so that the resulting organization is advantageous for progressing the first part. Such a selection process may be executed based on, for example, the parameters of each character, the first parameter between characters, and the like.
  In step S904, the group organization unit 116 determines whether an input operation for switching the roles of the two main characters has been received. For example, the group organization unit 116 may determine whether an input operation on the exchange button 403 on the group organization screen has been accepted. If accepted, in step S905, the group organization unit 116 swaps the roles of the two main characters. If not accepted, the process of step S905 is not performed.
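The role exchange of step S905 amounts to swapping the two main-character slots. A minimal sketch, assuming a dictionary-based group representation (the description does not specify a data model):

```python
def swap_roles(group):
    """Step S905: exchange the first and second roles between the two
    selected main characters (triggered by the exchange button 403)."""
    group["first_main"], group["second_main"] = (
        group["second_main"],
        group["first_main"],
    )
    return group
```

Applying the swap twice restores the original assignment, which matches the toggle-like behavior of the exchange button.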
  Note that the processes in steps S902 to S905 are not necessarily performed in this order.
  In step S906, the first part progression unit 117 determines whether an input operation for starting execution of the first event has been received. For example, the first part progression unit 117 may determine whether or not an input operation to the start button 405 on the group organization screen has been accepted. If not accepted, at least one of the processes of steps S902 to S905 may be repeated. If accepted, step S907 is executed.
  As can be seen from the flow of steps S902 to S906, since the group organization screen includes the exchange button 403 and the start button 405, there is an advantage that the user can easily change the organization of the group or switch the main characters and their roles.
  In step S907, the first part progression unit 117 determines whether or not the element (eg, music video) selected in step S901 is of a predetermined type (whether or not the third condition is satisfied). If it is not the predetermined type, step S911 described later is executed. If it is the predetermined type, the next step S908 is executed.
  In step S908, the first part progression unit 117 determines whether or not the members of the selected group are the combination determined for the selected element (whether or not the second condition is satisfied). If they are not the determined combination, step S911 described later is executed. If they are the determined combination, the next step S909 is executed.
  In step S909, the first part progression unit 117 determines whether or not the first parameter between the two selected main characters satisfies the first condition. Here, when the first condition is not satisfied, the first part progression unit 117 does not execute the first event. In this case, at least one process from step S902 to S905 may be repeated, and then the process from step S906 may be repeated. If it is determined that the first condition is satisfied, the next step S910 is executed.
  In step S910, the first part progression unit 117 executes the first event to advance the first part. For example, the first part progression unit 117 includes the first effect and the second effect in the selected music video. In step S910, the first part progression unit 117 may include at least the second effect, and does not necessarily include the first effect. Then, the first part progression unit 117 proceeds with the music game while reproducing the music video.
  On the other hand, consider the case where the element selected in step S901 is determined in step S907 not to be of the predetermined type, or the case where the plurality of characters constituting the group are determined in step S908 not to be the combination determined for the selected element. In this case, the next step S911 is executed.
  In step S911, the first part progression unit 117 executes the normal event without executing the first event. For example, the first part progression unit 117 includes the first effect in the music video corresponding to the selected element. In the normal event, the music video does not include the second effect and does not necessarily have to include the first effect. Then, the first part progression unit 117 progresses the music game while reproducing the music video.
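The decision flow of steps S907 to S910 can be sketched as a chain of condition checks. The types, field names, and threshold below are assumptions; the description defines only the order of the checks and their outcomes.

```python
from dataclasses import dataclass

@dataclass
class Element:
    is_predetermined_type: bool   # third condition (S907)
    required_members: frozenset   # combination for the second condition (S908)

def choose_event(element, members, first_param, threshold):
    """Returns "first" for the first event (S910), "normal" for the
    normal event (S911), or None when re-organization is needed."""
    if not element.is_predetermined_type:               # S907: third condition
        return "normal"                                 # -> S911
    if frozenset(members) != element.required_members:  # S908: second condition
        return "normal"                                 # -> S911
    if first_param < threshold:                         # S909: first condition
        return None                                     # repeat S902-S905
    return "first"                                      # S910
```

The None branch corresponds to the base flow in which neither event runs and the group is re-organized; a variant described below instead returns the normal event there.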
  In step S912, the first part progression unit 117 displays a music game result screen as the progression result of the first part in step S910 or S911.
  In the description of the above operation, at least one of the element selection process in step S901 and the group organization process in steps S902 to S905 does not necessarily have to be performed in the first part, and may be performed in a part other than the first part.
  In the description of the operation of the first part described above, it was explained that, when the first condition is not satisfied in step S909, neither the first event nor the normal event is executed, and at least one of the processes of steps S902 to S905 may be repeated. The present invention is not limited to this, and step S911 may be executed when the first condition is not satisfied in step S909. That is, the first part progression unit 117 may execute the normal event when the first parameter between the two main characters does not satisfy the first condition.
  In the description of the operation of the first part described above, it was explained that the normal event is executed instead of the first event when the third condition is not satisfied in step S907 or when the second condition is not satisfied in step S908. The present invention is not limited to this, and neither the first event nor the normal event may be executed when the third condition is not satisfied in step S907 or when the second condition is not satisfied in step S908. In this case, at least one of the processes of steps S902 to S905 may be repeated.
  FIG. 12 is a flowchart showing an operation performed by the user terminal 100 in the second part.
  In step S1001, the second part progression unit 118 selects two breeding characters from the characters assigned to the user. For example, the second part progression unit 118 may select two breeding characters in response to an input operation on the character objects 801a to 801e on the breeding character selection screen shown in FIG.
  In step S1002, the second part progression unit 118 determines whether or not an input operation for starting the second part has been received. For example, the second part progression unit 118 may determine whether or not an input operation to the start button 803 on the breeding character selection screen has been accepted. If not accepted, the process may be repeated from step S1001. If accepted, step S1003 is executed.
  In step S1003, the second part progression unit 118 executes the second event for the two breeding characters.
  In step S1004, the second part progression unit 118 updates the first parameter between the two breeding characters.
  In step S1005, the second part progression unit 118 updates the second parameter related to the user.
  Note that the processing of step S1004 and step S1005 does not necessarily have to be performed in this order.
  In step S1006, the second part progression unit 118 displays information on the updated first parameter and second parameter as a result of the second part.
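Steps S1003 to S1006 can be sketched as a single update routine. The data shapes (a pair-keyed mapping for the first parameter, a user dictionary for the second parameter) and the change widths are assumptions for illustration.

```python
def run_second_part(first_params, user, pair, deltas=(10, 1)):
    """Execute the second event for the two breeding characters (S1003),
    then raise the first parameter between them (S1004) and the user's
    second parameter (S1005), returning the values for the result
    screen (S1006)."""
    key = frozenset(pair)  # order-independent character pair
    first_params[key] = first_params.get(key, 0) + deltas[0]         # S1004
    user["second_param"] = user.get("second_param", 0) + deltas[1]   # S1005
    return first_params[key], user["second_param"]                   # S1006
```

Because the pair key is order-independent, selecting the same two breeding characters in either order accumulates onto the same first parameter.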
  In the description of the above operation, the breeding character selection process in step S1001 is not necessarily performed in the second part, and may be performed in a part other than the second part.
[Modification]
In the present embodiment, the first part progression unit 117 has been described as executing the first event based on whether or not at least the first parameter satisfies the first condition. However, the first part progression unit 117 may execute the first event without necessarily considering whether or not the first parameter satisfies the first condition. For example, the first part progression unit 117 may play back the selected music video as a live performance by the organized group members, including an effect representing the relationship between the selected main characters (for example, one or both of the first effect and the second effect).
  In this case, the user terminal 100 may not include the second part progression unit 118. That is, the game based on the game program 131 does not necessarily include the second part. Even in such a modification, the user can enjoy the relationship between the selected main characters through an effect representing that relationship.
  In the present embodiment, the first part progression unit 117 may progress the first part in exchange for a play right item held by the user. The play right item is managed, for example, in units of one play of the first part. In this case, the user can hold one or more play right items. For example, the first part progression unit 117 may perform the processing from step S901 in FIG. 11 after performing a process of consuming one play right item held by the user. If the user does not hold a play right item, the first part progression unit 117 may end the operation without performing the processing from step S901.
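The play right modification can be sketched as a guard executed before step S901. The field name and return convention are assumptions:

```python
def try_start_first_part(user):
    """Consume one play right item before step S901; end the operation
    if the user holds none."""
    if user.get("play_rights", 0) < 1:
        return False            # end without executing step S901
    user["play_rights"] -= 1    # one item per play of the first part
    return True                 # proceed to step S901
```

A second-part reward, as described below in the base text, would simply increment the same counter.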
  In this case, the second part progression unit 118 may give a play right item to the user in addition to updating the first parameter and the second parameter in accordance with the execution of the second event.
  With this modification, the user's motivation to play the second part can be further enhanced.
[Example of software implementation]
The control blocks of the control unit 210 (particularly, the transmission/reception unit 211, the data management unit 212, and the server processing unit 213) and the control blocks of the control unit 110 (particularly, the operation reception unit 111, the display control unit 112, the UI control unit 113, the animation generation unit 114, the element selection unit 115, the group organization unit 116, the first part progression unit 117, and the second part progression unit 118) may be realized by a logic circuit (hardware) formed on an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).
  In the latter case, an information processing apparatus provided with the control unit 210 and/or the control unit 110 includes a CPU that executes instructions of a program, which is software realizing each function, a ROM (Read Only Memory) or storage device (referred to as a "recording medium") in which the program and various data are recorded so as to be readable by a computer (or CPU), and a RAM (Random Access Memory) into which the program is expanded. The object of the present invention is achieved when the computer (or CPU) reads the program from the recording medium and executes it. As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used. The program may be supplied to the computer via an arbitrary transmission medium (such as a communication network or a broadcast wave) capable of transmitting the program. Note that one embodiment of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
  The present invention is not limited to the above-described embodiments, and various modifications are possible within the scope shown in the claims. Embodiments obtained by appropriately combining technical means disclosed in different embodiments are also included in the technical scope of the present invention.
[Additional Notes]
The contents according to one aspect of the present invention are listed as follows.
  (Item 1) The game program (131) has been described. According to an aspect of the present disclosure, the game program is executed by a computer (user terminal 100, server 200) including a processor (10, 20) and a memory (11, 21). The game program causes the processor to execute a step of forming a group using a plurality of characters that can be used in a game based on the game program, a step of selecting two or more characters from the group, and a step of executing a first event with contents corresponding to the two or more characters when a first parameter representing a relationship between the two or more characters satisfies a first condition.
  According to the above configuration, the first event is executed when the relationship between two or more characters selected from the group satisfies the first condition. As a result, the interest of the game is improved by the relationship between the characters.
  (Item 2) In (Item 1), the step of selecting two or more characters may select two or more characters for the first event. Thereby, the user's motivation to select two or more characters from the group is improved in order to enjoy the first event.
  (Item 3) In (Item 1) or (Item 2), the game program may cause the processor to further execute a step of progressing the first part of the game using a group. In this case, the step of progressing the first part includes a step of forming a group, a step of selecting two or more characters, and a step of executing a first event. This improves the motivation for the user to play the first part in order to enjoy the first event.
  (Item 4) In any one of (Item 1) to (Item 3), the step of executing the first event may execute the first event when, in addition to the first parameter satisfying the first condition, the plurality of characters constituting the group are a predetermined combination. Thereby, the user can enjoy the relationship between the two or more characters selected from the group when the first event is executed using a group composed of a predetermined combination of characters. As a result, the user's motivation to form a group with a predetermined combination of characters is improved.
  (Item 5) In (Item 4), the game program may cause the processor to further execute a step of selecting any element from one or more elements that can constitute the first event. In this case, the step of executing the first event may execute the first event when the plurality of characters constituting the group are a predetermined combination associated with the element selected in the step of selecting the element. Thereby, by executing the first event on the selected element using a group composed of the predetermined combination of characters, the user can enjoy the relationship between the two or more characters selected from the group. As a result, the user's motivation to form a group with the predetermined combination of characters for the selected element is improved.
  (Item 6) In any one of (Item 1) to (Item 5), the game program may cause the processor to further execute a step of selecting any element from one or more elements that can constitute the first event. In this case, the step of executing the first event may execute the first event with contents corresponding to the element selected in the step of selecting an element in addition to contents corresponding to the two or more characters. Thereby, the user can enjoy variations of the first event.
  (Item 7) In (Item 6), the number of elements that are options in the step of selecting an element may change according to the second parameter related to the user. Thereby, since the user can increase the number of elements that can constitute the first event, the user can enjoy further variations of the first event.
  (Item 8) In any one of (Item 1) to (Item 7), the game program may further cause the processor to execute a step of proceeding with a second part that includes executing a second event for a combination of two or more characters among the plurality of characters available in the game. In this case, the step of proceeding with the second part may update the first parameter related to the two or more characters in accordance with the execution of the second event. Thus, in order to execute the first event based on two or more characters, the user can play the second part and change the relationship between those characters. Making the relationship between the characters changeable in this way improves the interest of the game.
  (Item 9) In (Item 7), the game program may further cause the processor to execute a step of proceeding with a second part that includes executing a second event for a combination of two or more characters among the plurality of characters available in the game. In this case, the step of proceeding with the second part may update the second parameter in accordance with the execution of the second event. Thus, by playing the second part and updating the second parameter, the user can increase the number of elements that can constitute the first event. Making the number of variations of the first event variable in this way improves the fun of the game.
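The update loop of (Item 8) and (Item 9) can be sketched as follows; the save-data model, field names, and gain values are assumptions made for illustration only.

```python
# Sketch of Items 8-9: playing the second part updates the first
# parameter (kept per character pair, Item 8) and the second parameter
# (kept per user, Item 9). Data model and amounts are hypothetical.

from dataclasses import dataclass, field

@dataclass
class SaveData:
    relationship: dict = field(default_factory=dict)  # first parameter per pair
    second_parameter: int = 0                         # per-user value

def run_second_event(save, pair, gain=5):
    """Execute a second event for a character pair: raise the pair's
    relationship and the user's second parameter, which later unlocks
    more first-event conditions and elements."""
    key = frozenset(pair)
    save.relationship[key] = save.relationship.get(key, 0) + gain
    save.second_parameter += 1
    return save

save = SaveData()
run_second_event(save, ("alice", "bob"))
run_second_event(save, ("alice", "bob"))
print(save.relationship[frozenset({"alice", "bob"})])  # 10
print(save.second_parameter)                           # 2
```

This mirrors the loop the items describe: the second part is where the relationship values that gate the first event are earned.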
  (Item 10) In any one of (Item 1) to (Item 9), the step of selecting two or more characters may assign a role to each of the two or more characters, and the step of executing the first event may execute the first event based on the role of each of the two or more characters. Thus, the user can enjoy a first event based on the role of each of the two or more characters selected from the group.
  (Item 11) In (Item 10), the step of selecting two or more characters may display, on a screen for selecting the two or more characters, a first operation object for switching the roles of the two selected characters and a second operation object for starting execution of the first event using the two selected characters. In the step of selecting two or more characters, when an operation on the first operation object is accepted, the roles assigned to the two selected characters may be switched. The step of executing the first event may start execution of the first event when an operation on the second operation object is accepted. Thus, the user can enjoy a first event in which the roles of the two or more characters selected from the group are exchanged.
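The role assignment and role-swap interaction of (Item 10) and (Item 11) can be sketched as below; the role names, class, and method names are hypothetical stand-ins for the "first operation object" (swap) and "second operation object" (start).

```python
# Sketch of Items 10-11: each selected character holds a role; the
# first operation object exchanges the two roles, and the second
# operation object starts the first event. All names are assumptions.

class FirstEventSetup:
    def __init__(self, char_a, char_b):
        # Item 10: a role is assigned to each selected character.
        self.roles = {"lead": char_a, "support": char_b}

    def on_first_operation_object(self):
        """Item 11: an operation on the swap object exchanges the roles."""
        self.roles["lead"], self.roles["support"] = (
            self.roles["support"], self.roles["lead"])

    def on_second_operation_object(self):
        """Item 11: an operation on the start object begins the first
        event, whose content reflects the current role assignment."""
        return f"lead={self.roles['lead']}, support={self.roles['support']}"

setup = FirstEventSetup("alice", "bob")
setup.on_first_operation_object()
print(setup.on_second_operation_object())  # lead=bob, support=alice
```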
  (Item 12) A method for executing a game program has been described. According to an aspect of the present disclosure, the game program is executed by a computer including a processor and a memory, and the method is a method in which the processor executes each of the steps described in (Item 1). The method according to (Item 12) has the same effects as the game program according to (Item 1).
  (Item 13) An information processing apparatus has been described. According to an aspect of the present disclosure, the information processing apparatus includes a storage unit (120) that stores the game program according to (Item 1), and a control unit (110) that controls the operation of the information processing apparatus (user terminal 100) by executing the game program. The information processing apparatus according to (Item 13) has the same effects as the game program according to (Item 1).
1 game system, 2 network, 10, 20 processor, 11, 21 memory, 12, 22 storage, 13, 23 communication IF (operation unit), 14, 24 input/output IF (operation unit), 15 touch screen (display unit, operation unit), 17 camera (operation unit), 18 distance sensor (operation unit), 100 user terminal (information processing device), 110, 210 control unit, 111 operation reception unit, 112 display control unit, 113 UI control unit, 114 animation generation unit, 115 element selection unit, 116 group organization unit, 117 first part progression unit, 118 second part progression unit, 130, 230 storage unit, 131 game program, 132 game information, 133 user information, 151 input unit (operation unit), 152 display unit, 200 server, 211 transceiver unit, 212 data management unit, 213 server processing unit, 1010 object, 1020 controller (operation unit), 1030 storage medium

Claims (14)

  1. A game program,
    The game program is executed by a computer including a processor and a memory,
    The game program causes the processor to execute:
    a step of causing a user to select, by an input operation, two or more characters from a plurality of characters available in the game based on the game program;
    a step of selecting one or more characters from the plurality of characters;
    a step of determining, in response to an input operation for executing a first event being performed, whether a first parameter representing a relationship between the two or more characters satisfies a first condition;
    a step of, when the first parameter satisfies the first condition, executing the first event based on a group formed to include the two or more characters and the one or more characters; and
    a step of, when the first parameter does not satisfy the first condition, re-executing the step of causing the user to select the two or more characters.
  2. The game program according to claim 1, wherein the step of executing the first event proceeds with the first event based on parameters of the plurality of characters included in the group, and produces an effect in the first event with content according to the two or more selected characters.
  3. The game program according to claim 1 or 2, wherein the step of causing the user to select the two or more characters causes the user to select the two or more characters for the first event.
  4. The game program according to any one of claims 1 to 3, wherein the game program further causes the processor to execute:
    a step of proceeding with a first part of the game using the group,
    wherein the step of proceeding with the first part includes the step of causing the user to select the two or more characters, the step of selecting the one or more characters, and the step of executing the first event.
  5. The game program according to any one of claims 1 to 4, wherein the step of executing the first event executes the first event when, in addition to the first parameter satisfying the first condition, the plurality of characters constituting the group are a predetermined combination.
  6. The game program according to claim 5, wherein the game program further causes the processor to execute:
    a step of selecting any element from one or more elements that can constitute the first event,
    wherein the step of executing the first event executes the first event when the plurality of characters included in the group are a predetermined combination associated with the element selected in the step of selecting the element.
  7. The game program according to any one of claims 1 to 6, wherein the game program further causes the processor to execute:
    a step of selecting any element from one or more elements that can constitute the first event,
    wherein the step of executing the first event produces an effect in the first event with content according to the element selected in the step of selecting the element, in addition to content according to the two or more selected characters.
  8. The game program according to claim 7, wherein the number of elements offered as options in the step of selecting the element changes according to a second parameter related to the user.
  9. The game program according to any one of claims 1 to 8, wherein the game program further causes the processor to execute:
    a step of proceeding with a second part including executing a second event for a combination of two or more characters among the plurality of characters available in the game,
    wherein the step of proceeding with the second part updates the first parameter relating to the two or more characters in accordance with execution of the second event.
  10. The game program according to claim 8, wherein the game program further causes the processor to execute:
    a step of proceeding with a second part including executing a second event for a combination of two or more characters among the plurality of characters available in the game,
    wherein the step of proceeding with the second part updates the second parameter in accordance with execution of the second event.
  11. The game program according to any one of claims 1 to 9, wherein the step of causing the user to select the two or more characters assigns a role to each of the two or more characters, and
    the step of executing the first event produces an effect in the first event based on the role of each of the two or more characters.
  12. The game program according to claim 11, wherein the step of causing the user to select the two or more characters displays, on a screen for selecting the two or more characters, a first operation object for exchanging the roles of the two selected characters and a second operation object for starting execution of the first event using the two selected characters, and switches the roles assigned to the two selected characters when an operation on the first operation object is accepted, and
    the step of executing the first event starts execution of the first event when an operation on the second operation object is accepted.
  13. A method for a computer to execute a game program,
    the computer including a processor and a memory,
    the method comprising the processor executing each of the steps recited in claim 1.
  14. An information processing apparatus comprising:
    a storage unit that stores the game program according to claim 1; and
    a control unit that controls an operation of the information processing apparatus by executing the game program.
JP2018015868A 2018-01-31 2018-01-31 Game program, method, and information processing device Active JP6568246B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2018015868A JP6568246B2 (en) 2018-01-31 2018-01-31 Game program, method, and information processing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2018015868A JP6568246B2 (en) 2018-01-31 2018-01-31 Game program, method, and information processing device

Publications (2)

Publication Number Publication Date
JP2019130118A JP2019130118A (en) 2019-08-08
JP6568246B2 true JP6568246B2 (en) 2019-08-28

Family

ID=67546949

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2018015868A Active JP6568246B2 (en) 2018-01-31 2018-01-31 Game program, method, and information processing device

Country Status (1)

Country Link
JP (1) JP6568246B2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6826171B1 (en) * 2019-08-29 2021-02-03 株式会社バンダイ Programs, terminals, server devices and systems

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012187207A (en) * 2011-03-09 2012-10-04 Konami Digital Entertainment Co Ltd Game apparatus, control method of game apparatus and program
JP6144877B2 (en) * 2011-11-11 2017-06-07 株式会社カプコン GAME PROGRAM AND GAME DEVICE
JP5814300B2 (en) * 2013-05-31 2015-11-17 株式会社コナミデジタルエンタテインメント GAME MANAGEMENT DEVICE AND PROGRAM
JP6239876B2 (en) * 2013-06-28 2017-11-29 株式会社バンダイナムコエンターテインメント Computer system and program
JP2017176511A (en) * 2016-03-30 2017-10-05 株式会社バンダイナムコエンターテインメント Medium data generation system

Also Published As

Publication number Publication date
JP2019130118A (en) 2019-08-08

Similar Documents

Publication Publication Date Title
CN107294838B (en) Animation generation method, device and system for social application and terminal
CN104603719B (en) Augmented reality surface is shown
KR20170094279A (en) Methods for generating a 3d virtual body model of a person combined with a 3d garment image, and related devices, systems and computer program products
US20160155256A1 (en) Avatar personalization in a virtual environment
CN103971401A (en) Information Processing Device, Terminal Device, Information Processing Method, And Programme
EP2777022A1 (en) Rendering system, rendering server, control method thereof, program, and recording medium
JP2016137106A (en) Construction method, program, information processing system, and information processing device
CN110147231B (en) Combined special effect generation method and device and storage medium
US11144187B2 (en) Storage medium having stored therein game program, information processing system, information processing apparatus, and game processing method
JP6568246B2 (en) Game program, method, and information processing device
JP6453500B1 (en) GAME PROGRAM, METHOD, AND INFORMATION PROCESSING DEVICE
JP2019130291A (en) Game program, method, and information processing device
JP2019205514A (en) Program, method, and information terminal device
CN112118397B (en) Video synthesis method, related device, equipment and storage medium
JP2021013095A (en) Image processing device, display method, and program
JP2020110352A (en) Game program, game method, and information processor
JP6334043B1 (en) GAME PROGRAM, METHOD, AND INFORMATION PROCESSING DEVICE
JP2021053466A (en) Game program, method for executing game program, and information processor
US9384013B2 (en) Launch surface control
JP6609084B1 (en) GAME DEVICE, CONTROL METHOD, AND CONTROL PROGRAM
JP2020048604A (en) Game program, method for executing game program, and information processor
JP6514376B1 (en) Game program, method, and information processing apparatus
JP6523378B2 (en) Game program, method for executing game program, and information processing apparatus
JP6960972B2 (en) Game device, control method, and control program
JP2020110449A (en) Game program, method, and information processing device

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20180524

A871 Explanation of circumstances concerning accelerated examination

Free format text: JAPANESE INTERMEDIATE CODE: A871

Effective date: 20180702

A975 Report on accelerated examination

Free format text: JAPANESE INTERMEDIATE CODE: A971005

Effective date: 20180723

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20181029

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20181120

A601 Written request for extension of time

Free format text: JAPANESE INTERMEDIATE CODE: A601

Effective date: 20190118

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20190205

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20190416

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20190523

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20190702

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20190801

R150 Certificate of patent or registration of utility model

Ref document number: 6568246

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150