WO2022186142A1 - Program

Program

Info

Publication number
WO2022186142A1
Authority
WO
WIPO (PCT)
Prior art keywords
game
image
monster
user
information
Application number
PCT/JP2022/008304
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
モアナ 佐藤
大輔 錦
透 西山
Original Assignee
株式会社コロプラ
Application filed by 株式会社コロプラ
Publication of WO2022186142A1

Classifications

    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/5252: Changing parameters of virtual cameras using two or more virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character changes room or displaying a rear-mirror view in a car-driving game
    • A63F13/5258: Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • A63F13/53: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/533: Controlling the output signals based on the game progress involving additional visual information provided to the game scene for prompting the player, e.g. by displaying a game menu
    • A63F13/537: Controlling the output signals based on the game progress involving additional visual information provided to the game scene using indicators, e.g. showing the condition of a game character on screen
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/822: Strategy games; Role-playing games
    • A63F13/825: Fostering virtual characters
    • G06T19/00: Manipulating 3D models or images for computer graphics

Description

  • Embodiments of the present invention relate to programs.
  • a game is known in which a user (player) raises a game object in a virtual game space.
  • A game object (hereinafter simply an "object") changes its status, such as level and abilities, as it is raised.
  • A game is also known in which a plurality of objects are arranged in a wide game space and can be raised simultaneously.
  • A technique is also known in which a player character operated by the user picks up each object, whereupon the status information of that object is displayed so that the user can check its state (see Non-Patent Document 1).
  • the present invention has been made in view of the above circumstances, and an object of the present invention is to provide a technique for easily confirming the state of an object placed in a game space.
  • A program according to one aspect of the present invention is a program executed by a computer, the program causing the computer to execute: placing, in a game space, a first object that can move according to a user's operation and a second object that can move regardless of the user's operation; displaying a first image of the game space including the first object; and displaying, in a predetermined area of the first image, a live image of the second object placed in the game space.
  • a program according to an aspect of the present invention can make it possible to easily check the state of an object placed in a game space.
  • FIG. 1 is a diagram showing an example of the overall configuration of a game system related to a game program according to one embodiment.
  • FIG. 2 is a diagram showing an example of a functional configuration of a server in the game system shown in FIG. 1.
  • FIG. 3 is a diagram showing an example of a functional configuration of a user terminal in the game system shown in FIG. 1.
  • FIG. 4 is a diagram showing an example of a configuration of a game realized by a game program according to one embodiment.
  • FIG. 5 is a flow chart showing an example of information processing operation of the user terminal shown in FIG.
  • FIG. 6 is a diagram showing a first example of a game image realized by a game program according to one embodiment.
  • FIG. 7 is a diagram showing a second example of a game image realized by the game program according to one embodiment.
  • FIG. 8 is a diagram showing a third example of a game image realized by the game program according to one embodiment.
  • FIG. 9 is a diagram showing a first example of a game image that can transition from the game image of FIG.
  • FIG. 10 is a diagram showing a second example of a game image that can transition from the game image of FIG.
  • FIG. 11 is a diagram showing a first example of a game image that can transition from the game image of FIG.
  • FIG. 12 is a diagram showing a second example of a game image that can transition from the game image of FIG.
  • a program according to one embodiment is a program for realizing a game on a terminal device (hereinafter referred to as "user terminal") used by a game player (hereinafter referred to as "user").
  • a game realized by a game program is a game in which a user can grow objects in a game space.
  • Games implemented by the game program include, but are not limited to, breeding games, simulation games, role-playing games, action games, adventure games, shooting games, sports games, puzzle games, and the like.
  • a game may also be part of such a game (such as a mini-game or an event).
  • a game program can place various objects in a game space to realize a game.
  • a game space is a space for arranging various objects implemented by a game program.
  • a game program may construct a different game space for each part of the game.
  • The game space is mainly a three-dimensional space, and the objects placed in the game space are also mainly displayed in three dimensions; however, a two-dimensional game space and two-dimensional display are also applicable.
  • a character is, for example, a human character, an animal character, an anthropomorphic character, or the like, but is not limited to these.
  • a character object is an object that acts (moves).
  • Character objects include user-operable character objects and non-user-operable NPC (Non Player Character) objects.
  • Various objects include tool objects.
  • A tool object is an object that can be used by a character object.
  • a tool object is, for example, an item or equipment.
  • Various objects also include background objects. Background objects are objects that constitute the background of the game, such as trees, rocks, grass, sky, rivers, ponds, buildings, and equipment.
  • Various objects include visible objects and invisible objects.
  • a set value is a numerical value (also referred to as "parameter") representing position, size, shape, color, ability, attribute, and other various information.
  • the setting value of the character object is particularly called a status.
  • The status includes, but is not limited to, information such as ability values (e.g., level, physical strength, attack power, defense power), health status, attributes (e.g., sex, age, occupation), possessed skills, possessed items, and possessed money.
  • the status includes a status that can change due to training and a status that does not change.
  • An example of a status that changes depending on training is the ability value. Training can also be rephrased as enhancement of ability values.
  • the objects placed in the game space by the game program include a first object that can be moved according to the user's operation and a second object that can be moved without the user's operation.
  • An example of a game implemented by a game program is a game in which a user raises a second object in a game space through manipulation of a first object.
  • An example of the first object is a player character object (hereinafter referred to as "player character”).
  • An example of the second object (the object to be raised) is a monster character object (hereinafter simply referred to as a "monster").
  • a monster is, for example, a fictional creature character.
  • The game program may be a game program that realizes the game locally on a single user terminal, a game program that realizes the game through cooperation between a user terminal and a server, or a game program that realizes the game through cooperation among a plurality of user terminals (either via a server or not via a server), among others. In the following, as an example, a game system is described in which a server comprehensively manages the game program, transmits the game program and related data in response to requests from user terminals, and the game actually progresses mainly on the user terminal.
  • FIG. 1 shows an example of the overall configuration of a game system 1 related to a game program according to one embodiment.
  • the game system 1 includes multiple user terminals 100 and a server 200 .
  • Each user terminal 100 can communicate with the server 200 via the network NW.
  • The network NW is, for example, the Internet, and can include access networks such as a LAN (Local Area Network), a WAN (Wide Area Network), a mobile communication network, a wired telephone network, an FTTH (Fiber To The Home) network, and a CATV (Cable Television) network.
  • a user terminal 100 is a terminal used by a user who plays a game.
  • the user terminal 100 realizes a game by executing a game program, and realizes a game screen by displaying a game image according to the progress of the game.
  • a game image may include an image that depicts a game space.
  • An image depicting the game space may include images of multiple objects placed in the game space.
  • Game images include still images or moving images.
  • the game image can also include images of UI (User Interface) components expressed two-dimensionally or three-dimensionally.
  • the user terminal 100 is, for example, a mobile terminal such as a smart phone, a tablet terminal, or a notebook personal computer.
  • the user terminal 100 may be a stationary computer such as a desktop personal computer.
  • User terminal 100 may also be a dedicated gaming terminal suitable for playing games.
  • the user terminal 100 is a smart phone with a touch screen.
  • the server 200 is a device operated and managed by, for example, a game developer.
  • the server 200 comprehensively manages game programs and user information, and supports progress of the game on the user terminal 100 .
  • Server 200 may be a general-purpose computer such as a workstation or personal computer.
  • the server 200 receives user information and various requests from the user terminal 100 via the network NW, and transmits game programs and related data to the user terminal 100 via the network NW.
  • the server 200 can transmit and receive information to and from multiple user terminals 100 at the same time, and can support the progress of games in multiple user terminals 100 in parallel.
  • the user terminal 100 may receive the game program and necessary setting data from the server 200 at once, and proceed with the game without communicating with the server 200 thereafter.
  • the user terminal 100 may communicate with the server 200 from time to time as the game progresses, and receive the necessary game program or data from the server 200 each time.
  • As hardware, the server 200 includes a processor 2001, a memory 2002, a storage 2003, a communication interface (communication I/F) 2004, and an input/output interface (input/output I/F) 2005, which are electrically connected to each other via a bus 2006.
  • a processor 2001 controls the overall operation of the server 200 .
  • the processor 2001 includes, for example, a general-purpose processor such as a CPU (Central Processing Unit), MPU (Micro Processing Unit), GPU (Graphics Processing Unit).
  • the processor 2001 is not limited to a general-purpose processor, and may be a dedicated processor such as ASIC (Application Specific Integrated Circuit) or FPGA (Field-Programmable Gate Array).
  • the memory 2002 is a main storage device and includes ROM (Read Only Memory), RAM (Random Access Memory), and the like.
  • the storage 2003 is an auxiliary storage device and includes non-volatile storage devices such as hard disk drives (HDD) and solid state drives (SSD).
  • the storage 2003 stores programs executed by the processor 2001, setting data necessary for executing the programs, and the like. Part of the program may be stored in ROM.
  • the processor 2001 can implement the processing functions described later by reading a program from the storage 2003, expanding it in the memory 2002, and interpreting and executing the expanded program.
  • a communication interface (communication I/F) 2004 is a module for communicating with an external device such as the user terminal 100 via the network NW, and includes a signal processing circuit for transmission and reception, an optical connector, and the like.
  • Communication interface 2004 may include, for example, an optical communication module.
  • An input/output interface (input/output I/F) 2005 captures operation data input by an operator through an input device such as a keyboard or mouse, and outputs output data to an output device such as a liquid crystal or organic EL (Electro Luminescence) display or a speaker.
  • As hardware, the user terminal 100 includes a processor 1001, a memory 1002, a storage 1003, a sensor 1004, a communication interface (communication I/F) 1005, an input/output interface (input/output I/F) 1006, and a touch screen 1007, which are electrically connected to each other via a bus 1008.
  • a processor 1001 controls the overall operation of the user terminal 100 .
  • the processor 1001 includes, for example, general-purpose processors such as CPU, MPU, and GPU.
  • the processor 1001 is also not limited to a general-purpose processor, and may be a dedicated processor such as ASIC or FPGA.
  • the processor 1001 reads a program from the storage 1003, expands it in the memory 1002, and interprets and executes the expanded program, thereby realizing processing functions to be described later.
  • a memory 1002 is a main storage device and includes ROM, RAM, and the like.
  • the storage 1003 is an auxiliary storage device and includes internal or external semiconductor memory (for example, flash memory).
  • the storage 1003 stores programs executed by the processor 1001, setting data necessary for executing the programs, and the like. Part of the program may be stored in ROM.
  • the sensor 1004 is, for example, an image sensor, sound sensor, acceleration sensor, angular velocity sensor, geomagnetic sensor, GPS sensor, proximity sensor, ambient light sensor, or the like.
  • the sensor 1004 converts various sensed information into electrical signals and outputs them.
  • a communication interface (communication I/F) 1005 is a module for communicating with an external device such as the server 200 via the network NW, and includes a signal processing circuit for transmission/reception, an antenna, a LAN terminal, and the like.
  • the communication interface 1005 can include a module for mobile communication, a module for wireless/wired LAN, a module for short-range wireless communication, and the like.
  • An input/output interface (input/output I/F) 1006 takes in input data from an external device and outputs output data to the external device.
  • the input/output interface 1006 can include, for example, physical buttons of the user terminal 100, a speaker built into the user terminal 100, a USB (Universal Serial Bus) port, and the like.
  • the touch screen 1007 includes an input unit 1071 and a display unit 1072, and has functions of receiving user input operations and displaying various images to the user.
  • the input unit 1071 is, for example, a capacitive or resistive touch panel.
  • the input unit 1071 detects a contact position touched by a user's finger or a touch pen (stylus pen), and generates coordinate information of the contact position.
  • the display unit 1072 is, for example, a liquid crystal display or an organic EL display, and displays various images based on display data.
  • the display unit 1072 realizes a game screen by displaying a game image.
  • the user terminal 100 can also accept user operations from external input devices such as a keyboard, mouse, and controller connected via the input/output interface 1006 .
  • the user terminal 100 may acquire game programs and related data from an external storage device such as a memory card connected via the input/output interface 1006 .
  • the user terminal 100 can also output display information to an external output device such as a display or speaker connected via the input/output interface 1006 .
  • the user terminal 100 may use an external storage device such as a memory card connected via the input/output interface 1006 as the storage 1003 .
  • User terminal 100 may also accept a signal from sensor 1004 as a user operation.
  • FIG. 2 shows an example of the functional configuration of the server 200 in the game system 1 shown in FIG. 1. Illustrations and descriptions of the functional configuration of a general computer and of well-known configurations required to realize the game are omitted.
  • the server 200 has a function of communicating with each user terminal 100 and supporting the progress of the game on the user terminal 100 .
  • the server 200 has a function of transmitting a game program, related data necessary for executing the game program, etc. to each user terminal 100 in response to a request from each user terminal 100 .
  • the server 200 includes a control section 210 and a storage section 220 .
  • Storage unit 220 is mainly implemented by storage 2003 .
  • the storage unit 220 includes a game program storage unit 221 , a game information storage unit 222 and a user information database (DB) 223 .
  • DB user information database
  • the game program storage unit 221 stores game programs.
  • the game information storage unit 222 stores various game information referred to when executing the game program.
  • the user information database (DB) 223 is a database that stores information about users associated with each user terminal 100, and stores user account information, for example.
  • Control unit 210 is mainly implemented by processor 2001 and memory 2002 .
  • the control unit 210 controls the functions of the server 200 as a whole.
  • Control unit 210 can function as reception control unit 211 , transmission control unit 212 and game progression unit 213 by executing the game program stored in game program storage unit 221 .
  • the reception control unit 211 receives, from each user terminal 100, a transmission request for user information, a program or related data, information regarding the progress of the game in each user terminal 100, and the like.
  • the transmission control unit 212 transmits the requested program or related data, updated program, or the like to each user terminal 100 .
  • the transmission control unit 212 can also request each user terminal 100 to transmit information about the progress of the game.
  • The game progression unit 213 refers to the game information stored in the game information storage unit 222 and the account information stored in the user information database 223 according to the code described in the game program, and manages the progress of the game on each user terminal 100.
  • the game progression unit 213 can also determine whether information collection from each user terminal 100 is necessary and whether data transmission to each user terminal 100 is necessary.
  • As an example, in the control unit 210, the reception control unit 211 receives a game program transmission request from the user terminal 100, the game progression unit 213 determines the game program and related data to be transmitted to the user terminal 100, and the transmission control unit 212 transmits the determined game program and related data to the user terminal 100.
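  • The flow just described can be pictured with the following minimal TypeScript sketch. The interfaces and function names (ReceptionControl, GameProgression, TransmissionControl, handleProgramRequests) are illustrative assumptions introduced only to show the division of work among the units 211, 213, and 212; they are not part of the specification.

```typescript
// Illustrative sketch only: how a request might flow through the reception
// control unit (211), game progression unit (213), and transmission control
// unit (212). All names and types are assumptions, not the patented design.

interface ProgramRequest { userId: string; requestedItem: string; }
interface ProgramPayload { program: Uint8Array; relatedData: Record<string, unknown>; }

interface ReceptionControl { nextRequest(): ProgramRequest | null; }               // unit 211
interface GameProgression { decidePayload(req: ProgramRequest): ProgramPayload; }  // unit 213
interface TransmissionControl { send(userId: string, payload: ProgramPayload): void; } // unit 212

function handleProgramRequests(
  reception: ReceptionControl,
  progression: GameProgression,
  transmission: TransmissionControl,
): void {
  let req: ProgramRequest | null;
  // Reception control receives each request; game progression decides which
  // program/data the user terminal needs; transmission control sends it.
  while ((req = reception.nextRequest()) !== null) {
    const payload = progression.decidePayload(req);
    transmission.send(req.userId, payload);
  }
}
```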
  • FIG. 3 shows an example of the functional configuration of a user terminal 100 that can be used in the game system 1 shown in FIG. 1. Illustrations and descriptions of the functional configuration of a general computer and of well-known configurations required to realize the game are omitted.
  • User terminal 100 includes a control unit 110 and a storage unit 120 .
  • the storage unit 120 is mainly implemented by the storage 1003 .
  • Storage unit 120 includes game program storage unit 121 , game information storage unit 122 , and UI information storage unit 123 .
  • the game program storage unit 121 stores game programs.
  • the game information storage unit 122 stores various game information referred to when executing the game program.
  • the game information includes, for example, setting values of various objects arranged in the game space, various other information related to the game space, and the like.
  • the game information includes information about the trained monster, such as name, status, growth conditions, and the like.
  • the UI information storage unit 123 stores information on various UI components used for generating game images.
  • Control unit 110 is mainly implemented by processor 1001 and memory 1002 .
  • the control unit 110 controls the functions of the user terminal 100 as a whole.
  • By executing the game program, the control unit 110 can function as a game progression unit 111, an object control unit 112, a display control unit 113, a virtual camera control unit 114, a farm management unit 115, and an operation reception unit 116.
  • the game progression unit 111 refers to the game information stored in the game information storage unit 122 according to the code described in the game program, arranges objects in the game space, and progresses the game.
  • the game progression unit 111 can reflect the coordinate information about the user's operation from the operation reception unit 116 and the user's instruction specified from the user's operation in the progress of the game.
  • the game progression unit 111 arranges in the game space a first object (player character) that can move according to a user's operation, and a second object (monster) that can move regardless of the user's operation.
  • the object control unit 112 controls actions or statuses of various objects placed in the game space. For example, the object control unit 112 moves the player character according to the user's operation. The user can move the player character or cause the player character to perform an action by inputting an operation to the input unit 1071 . Actions of the player character include actions accompanied by positional movement and actions not accompanied by positional movement. The object control unit 112 also causes the monster to act according to the code written in the game program, regardless of the user's operation. The object control unit 112 can also move the monster according to the user's operation. Actions of the monster include actions accompanied by positional movement and actions not accompanied by positional movement.
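  • A minimal sketch of how this per-object control might be organized is shown below. The class and method names (ObjectControl, updatePlayer, updateMonster) and the movement math are illustrative assumptions, not the actual implementation: the point is only that the first object moves in response to user input, while the second object is driven by program code.

```typescript
// Illustrative sketch: the player character (first object) moves only in
// response to the user's operation, while a monster (second object) acts
// according to program code regardless of input. All names are assumptions.

interface Vec2 { x: number; y: number; }
interface PlayerInput { moveDir: Vec2 | null; }   // null = no movement input this frame

type MonsterAction = "run" | "play" | "walk" | "sleep";

class PlayerCharacter {
  constructor(public position: Vec2, public speed: number) {}
}

class Monster {
  action: MonsterAction = "walk";
  constructor(public position: Vec2, public speed: number) {}
}

class ObjectControl {
  // Move the first object according to the user's operation.
  updatePlayer(player: PlayerCharacter, input: PlayerInput, dt: number): void {
    if (input.moveDir) {
      player.position.x += input.moveDir.x * player.speed * dt;
      player.position.y += input.moveDir.y * player.speed * dt;
    }
  }

  // Move the second object according to program code, with no user input involved.
  updateMonster(monster: Monster, dt: number): void {
    if (monster.action === "run" || monster.action === "walk") {
      const step = (monster.action === "run" ? 2 : 1) * monster.speed * dt;
      monster.position.x += step * (Math.random() * 2 - 1); // simple random wandering
      monster.position.y += step * (Math.random() * 2 - 1);
    }
    // "play" and "sleep" are treated here as actions without positional movement.
  }
}
```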
  • Display control unit 113 generates display data for displaying various images on display unit 1072 .
  • the display data generated by the display control unit 113 includes display data for displaying an image (first image) of the game space including the player character (first object).
  • the first image is an image in which an image of the UI component is superimposed on a live video (first live video) of the player character in the game space.
  • the display data generated by the display control unit 113 also includes display data for displaying a live image (second live image) of a monster (second object) in the game space in a predetermined area of the first image.
  • the predetermined area of the first image may be an area arbitrarily set by a game designer or the like. The predetermined area is located, for example, on the right side in the first image as seen from the user playing the game.
  • live video refers to images of the game space displayed in real time or near real time.
  • a live video is, for example, an image (video) captured by a virtual camera movably arranged corresponding to each character.
  • the first live image is an image captured by a virtual camera tracking a player character
  • the second live image is an image captured by a virtual camera tracking a monster.
  • the first live image does not need to include the whole body of the player character, and may be an image including only a part of the player character such as the face and upper body.
  • the second live video also does not need to include the whole body of the monster, and may include only a part of the monster such as the face or upper body.
  • live video is realized as a combination of multiple still images obtained from each virtual camera.
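  • Conceptually, each live video is a stream of still frames from one virtual camera, drawn either full screen (first live video) or into a wipe region of the first image (second live video). The sketch below illustrates this under assumed names (VirtualCamera, Frame, Compositor, WipeRegion); it is not taken from the specification.

```typescript
// Illustrative sketch: composite the player camera's latest frame as the first
// image, then draw each monster camera's latest frame into its wipe region.
// All interfaces are hypothetical placeholders for engine-side functionality.

interface Frame { width: number; height: number; pixels: Uint8Array; }

interface VirtualCamera {
  capture(): Frame;                   // one still image of the tracked object
}

interface WipeRegion { x: number; y: number; width: number; height: number; }

interface Compositor {
  drawFullScreen(frame: Frame): void;                    // first live video
  drawInRegion(frame: Frame, region: WipeRegion): void;  // second live video
  drawUiOverlay(): void;                                 // buttons, scene title, etc.
}

function renderGameImage(
  playerCamera: VirtualCamera,
  monsterCameras: VirtualCamera[],
  wipeRegions: WipeRegion[],
  out: Compositor,
): void {
  // First image: live video of the game space including the player character, plus UI components.
  out.drawFullScreen(playerCamera.capture());
  out.drawUiOverlay();

  // Second live videos: one wipe region per monster camera, in the predetermined area.
  monsterCameras.forEach((camera, index) => {
    const region = wipeRegions[index];
    if (region) out.drawInRegion(camera.capture(), region);
  });
}
```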
  • The display data generated by the display control unit 113 also includes display data for displaying first information representing the status of the monster (second object) when a user's operation of selecting the predetermined area is received.
  • The display data generated by the display control unit 113 may also be configured such that, when the user's operation of selecting the predetermined area is received, an image corresponding to the live image of the monster and an image of the monster are displayed in place of the first image.
  • a user's operation of selecting a predetermined area of the first image includes an operation of tapping the predetermined area or its periphery via the touch screen 1007 .
  • An operation of selecting a predetermined area of the first image can also be rephrased as an operation of selecting a monster or an operation of selecting a live video.
  • the display data generated by the display control unit 113 also includes data for displaying second information indicating that the monster has transitioned to a specific state.
  • the second information includes, for example, characters, symbols, pictures, animations, icons, or pictograms.
  • the second information is displayed in association with the live video.
  • Specific states include, for example, a state in which the monster's ability value is increased, a state in which the monster possesses a present, or a state in which the monster is sick.
  • the second information may include a third image visually indicating that the monster has transitioned to a specific state.
  • the third image is, for example, a pictogram or icon.
  • the third image may contain text information.
  • the data for displaying the second information may include information for removing the display of the second information after displaying the second information for a certain period of time.
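  • The timed removal mentioned above can be realized by remembering when each piece of second information was shown and clearing it once a fixed display time has elapsed. The following is a small sketch under assumed names (BadgeManager, DISPLAY_DURATION_MS); the 5-second value is an arbitrary example, not a value from the specification.

```typescript
// Illustrative sketch: show transition information ("second information") for a
// monster's wipe UI and remove it after a fixed display time. Names and the
// duration are assumptions for illustration only.

interface TransitionBadge {
  text: string;        // e.g. "gift", "disease", "attack power rising"
  shownAtMs: number;   // timestamp when the badge was displayed
}

const DISPLAY_DURATION_MS = 5000; // assumed display time

class BadgeManager {
  private badges = new Map<string, TransitionBadge>(); // keyed by monster id

  show(monsterId: string, text: string, nowMs: number): void {
    this.badges.set(monsterId, { text, shownAtMs: nowMs });
  }

  // Called every cycle: drop badges whose display time has elapsed.
  update(nowMs: number): void {
    for (const [id, badge] of this.badges) {
      if (nowMs - badge.shownAtMs >= DISPLAY_DURATION_MS) this.badges.delete(id);
    }
  }

  badgeFor(monsterId: string): TransitionBadge | undefined {
    return this.badges.get(monsterId);
  }
}
```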
  • the virtual camera control unit 114 arranges a plurality of virtual cameras in the game space and acquires images captured by each virtual camera.
  • a virtual camera is a virtual camera for obtaining an image from the game space.
  • a virtual camera obtains an image of an object placed in the game space.
  • Shooting includes the meaning of imaging.
  • the virtual camera includes a virtual camera arranged movably corresponding to the player character.
  • the virtual camera also includes a movably positioned virtual camera corresponding to each monster.
  • the virtual camera control unit 114 moves each virtual camera corresponding to each shooting target by setting each virtual camera as an accompanying object (child element) of each shooting target.
  • the virtual camera control unit 114 controls, for example, the position (X, Y, Z) of each virtual camera in the world coordinate system of the virtual space or the rotation angle (roll, pitch, yaw) on each axis.
  • The virtual camera control unit 114 can also control the movement of each virtual camera according to conditions set for each shooting target, such as the distance range from the shooting target, the angle range with respect to the shooting target's median sagittal plane or horizontal plane, or the viewpoint.
  • The movement of the virtual camera is not limited to a mode in which the shooting target is tracked from behind; it includes any mode of moving in response to the shooting target, such as a mode of moving while facing the shooting target. Tracking may also be rephrased as pursuing or following.
  • The virtual camera control unit 114 can further control, according to the user's operation, the position, zoom, tilt, focus, angle of view, and magnification of each virtual camera, the relative distance or relative angle between each shooting target and each virtual camera, and the like. A virtual camera can also be rephrased as a virtual viewpoint.
  • the virtual camera includes a virtual camera that is fixed in the game space and does not track the shooting target.
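  • The camera placement described above (tracking the player character from behind, or a monster from in front, aimed at the head) can be sketched with simple vector arithmetic, as below. The interfaces and the restriction to the horizontal plane are illustrative assumptions, not the patented camera control.

```typescript
// Illustrative sketch: compute a tracking camera pose a fixed distance behind
// (or in front of) a target, looking at the target's head. Hypothetical names.

interface Vec3 { x: number; y: number; z: number; }

interface Tracked {
  position: Vec3;      // target position in the world coordinate system
  facing: Vec3;        // unit vector the target is facing (horizontal)
  headHeight: number;  // vertical offset of the head above the position
}

interface CameraPose { position: Vec3; lookAt: Vec3; }

// Player-character style: follow from a given distance behind the target.
function followFromBehind(target: Tracked, distance: number): CameraPose {
  const head: Vec3 = {
    x: target.position.x,
    y: target.position.y + target.headHeight,
    z: target.position.z,
  };
  return {
    position: {
      x: head.x - target.facing.x * distance,
      y: head.y,
      z: head.z - target.facing.z * distance,
    },
    lookAt: head,
  };
}

// Monster style: follow from in front (a negative "behind" distance).
function followFromFront(target: Tracked, distance: number): CameraPose {
  return followFromBehind(target, -distance);
}
```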
  • the farm management unit 115 manages information on a farm field (hereinafter simply referred to as "farm") as an example of a game space.
  • a farm is a game space in which a user can raise monsters.
  • a farm may include a virtual outdoor space (eg, a space resembling a meadow or a farm) or a virtual indoor space (eg, a space resembling a facility such as a gymnasium).
  • the farm management section 115 manages the state of each monster in cooperation with the object control section 112 .
  • the farm manager 115 also manages the environment on the farm (eg, weather, day/night, season), equipment on the farm (eg, training equipment, feeding stations, items), or other events on the farm.
  • the operation accepting unit 116 accepts a user's operation input via the input unit 1071 .
  • user's operation refers to the user's operation input via the input unit 1071 .
  • User operations include various types of operations via the input unit 1071 such as tap operations and flick operations.
  • a tap operation is an example of an operation in which one contact position is detected on the input unit 1071 and is not detected within a predetermined time.
  • a tap operation is input by, for example, the user lightly touching the touch screen with a fingertip and releasing it immediately.
  • a flick operation is an example of an operation of moving a single contact position on the input unit 1071 in a short period of time in chronological order.
  • a flick operation is input by, for example, the user touching the touch screen with a fingertip and lightly flicking the finger in an arbitrary direction while touching the touch screen.
  • The operation reception unit 116 determines which object in the image displayed on the display unit 1072 has been operated, based on the position coordinates. As an example, when the operation reception unit 116 receives a tap operation at the position of the image of a UI component, it generates a command indicating that the UI component has been selected and passes it to the appropriate functional unit, such as the display control unit 113, the virtual camera control unit 114, or the farm management unit 115.
  • the operation reception unit 116 also receives a user's operation in a predetermined range of the image as an operation for the action of the player character. For example, when receiving a flick operation at a position around the player character, the operation receiving unit 116 receives the flick operation as a movement operation in the direction corresponding to the start position and end position.
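  • One common way to realize the distinction between a tap and a flick, and to turn a flick into a movement direction, is to compare the touch-down and touch-up samples against duration and distance thresholds. The sketch below is illustrative only; the threshold values (200 ms, 24 px) and names are assumptions.

```typescript
// Illustrative sketch: classify a single-contact gesture as a tap or a flick
// from its start/end samples, and derive a normalized movement direction from
// a flick. Thresholds and names are assumptions for illustration.

interface TouchSample { x: number; y: number; timeMs: number; }

type Gesture =
  | { kind: "tap"; x: number; y: number }
  | { kind: "flick"; dirX: number; dirY: number }
  | { kind: "none" };

const TAP_MAX_DURATION_MS = 200;   // assumed upper bound for a tap
const FLICK_MIN_DISTANCE_PX = 24;  // assumed minimum travel for a flick

function classifyGesture(down: TouchSample, up: TouchSample): Gesture {
  const dx = up.x - down.x;
  const dy = up.y - down.y;
  const dist = Math.hypot(dx, dy);
  const duration = up.timeMs - down.timeMs;

  if (duration <= TAP_MAX_DURATION_MS && dist < FLICK_MIN_DISTANCE_PX) {
    return { kind: "tap", x: down.x, y: down.y };
  }
  if (dist >= FLICK_MIN_DISTANCE_PX) {
    // Movement direction corresponding to the start and end positions.
    return { kind: "flick", dirX: dx / dist, dirY: dy / dist };
  }
  return { kind: "none" };
}
```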
  • the object control unit 112 further cooperates with the farm management unit 115 to control the actions and status of each monster placed within the farm.
  • the object control unit 112 randomly selects an action from a plurality of actions such as running, playing, walking, and sleeping, and causes each monster to act according to the corresponding program code without user's operation.
  • the object control unit 112 can also change the action of each monster according to the passage of time or changes in the state of the farm.
  • the object control unit 112 may reflect the individuality of each monster when selecting an action mode, such as increasing the running ratio.
  • the object control unit 112 may also reflect the environment of the farm in the behavior of each monster, such as being active when the weather is fine.
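  • The random action selection with individuality and environment influences described above can be pictured as weighted random sampling, as in the sketch below. The weight values, the single runBias trait, and the weather handling are all illustrative assumptions.

```typescript
// Illustrative sketch: pick a monster's next action at random, biasing the
// weights by the monster's individuality and the farm environment (e.g. a
// monster that likes running, fine weather making it more active). Assumed names.

type ActionName = "run" | "play" | "walk" | "sleep";
interface ActionWeights { run: number; play: number; walk: number; sleep: number; }

interface Individuality { runBias: number; }          // > 1 means this monster prefers running
interface FarmEnvironment { weather: "fine" | "rain"; }

function chooseAction(base: ActionWeights, traits: Individuality, env: FarmEnvironment): ActionName {
  const w: ActionWeights = { ...base };
  w.run *= traits.runBias;                            // reflect individuality
  if (env.weather === "fine") {                       // reflect environment: active in fine weather
    w.run *= 1.5;
    w.play *= 1.5;
  } else {
    w.sleep *= 1.5;
  }

  // Weighted random pick.
  const total = w.run + w.play + w.walk + w.sleep;
  let r = Math.random() * total;
  for (const action of ["run", "play", "walk", "sleep"] as const) {
    r -= w[action];
    if (r <= 0) return action;
  }
  return "sleep";
}
```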
  • the object control unit 112 causes the monster for which "training” is set to perform preset actions (for example, strength training actions, meditation actions, etc.). "Training” is set for each monster by user selection.
  • the object control unit 112 also changes the ability value of the monster set to "training”. For example, the object control unit 112 increases parameters such as attack power and defense power according to set conditions.
  • The object control unit 112 can also control the action or status of the monster according to the user's operation. For example, when the player character feeds a monster through a user's operation, the object control unit 112 causes the monster to perform a happy action for a certain period of time and raises the monster's friendliness parameter or mood (kibun) parameter.
  • The object control unit 112 can control the presence or absence of effect displays, such as heart marks and musical note marks that represent the mood of a monster, in addition to the actions of the monster itself. Further, for example, when the user does not feed a monster for a certain period of time, the object control unit 112 causes the monster to perform an angry action and lowers the monster's friendliness parameter or mood (kibun) parameter.
  • the object control unit 112 further controls state transitions of each monster.
  • The object control unit 112 monitors the status of each monster and causes a state transition between a "normal state" and a "specific state" when a predetermined condition is met. For example, if the friendliness parameter exceeds a predetermined threshold, the object control unit 112 causes the monster to transition to the specific state of "possessing a present." When the present is received by the user, the monster transitions to the normal state of "not possessing a present."
  • a gift is, for example, an item that is advantageous in advancing the game.
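  • The present-possession transition described above amounts to a small two-state machine driven by the friendliness parameter. The sketch below illustrates it; the threshold value, the reset of the parameter, and the placeholder item are assumptions, not values fixed by the specification.

```typescript
// Illustrative sketch: when friendliness exceeds a threshold, the monster enters
// the specific state "possessing a present"; receiving the present via the
// player character returns it to the normal state. Values are assumptions.

const FRIENDLINESS_THRESHOLD = 100; // assumed threshold

interface MonsterState {
  friendliness: number;
  hasPresent: boolean;
}

// Called after actions that raise friendliness, such as feeding.
function updatePresentState(m: MonsterState): void {
  if (!m.hasPresent && m.friendliness >= FRIENDLINESS_THRESHOLD) {
    m.hasPresent = true;          // transition to the specific state
  }
}

// Called when the player character receives the present from the monster.
function receivePresent(m: MonsterState): string | null {
  if (!m.hasPresent) return null;
  m.hasPresent = false;           // transition back to the normal state
  m.friendliness = 0;             // assumed reset; the specification does not fix this
  return "present-item";          // placeholder for the item the user obtains
}
```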
  • the object control unit 112 repeatedly executes the above control at fixed time intervals (for example, at intervals of 1/30 second).
  • a control result by the object control unit 112 is, for example, passed to the display control unit 113 for image display, passed to the virtual camera control unit 114 for virtual camera control, and stored in the storage unit 120 .
  • FIG. 4 shows an example of a game configuration realized by a game program.
  • the game includes a training part 2 and a quest part 3.
  • the game program builds or uses a first game space for training part 2 and a second game space for quest part 3.
  • The training part 2 contains a "farm" element and a "breeding" element.
  • The user can raise monsters in the farm (farm element).
  • a user can possess one or more monsters in the game, and raises the monsters by placing the possessed monsters on the farm.
  • Monsters placed on farms change their status over time, for example. Changes in status include changes in ability values, changes in possessed items, changes in health, and the like.
  • the user can also perform various operations such as giving food to the monsters placed on the farm.
  • the user can also set training for the monsters in the farm. When a monster is trained, its stats will greatly improve.
  • An upper limit is set for the number of monsters that can be placed on the farm. Each monster's ability value is also capped. Therefore, the user can strategically select whether to place the owned monsters in the farm (to raise them) or put them in a standby state.
  • the user can breed a plurality of monsters in order to create new monsters in the breeding field (breeding element).
  • The farm and the breeding field are included in the first game space.
  • the farm and breeding field may be implemented as a single game space or as separate game spaces.
  • the user can, for example, breed using monsters whose ability values have reached the upper limit, and grow the monsters newly created by the breeding on the farm.
  • the quest part 3 is an adventure part aiming at some goal achievement, and includes an "action” element and a "monster acquisition” element.
  • The user can operate the player character in the second game space to perform actions, for example fighting enemy characters, acquiring items, and avoiding traps or gimmicks (action element).
  • the user can cause the player character to acquire a monster in the second game space (monster acquisition element).
  • the user acquires a monster by having the player character fight against a monster as an enemy character and winning the battle.
  • the user causes the player character to obtain a dedicated item, and obtains a monster as a reward for obtaining the dedicated item.
  • the training part 2 and the quest part 3 are related to each other.
  • The user can use the monsters raised in the training part 2 in the quest part 3.
  • The user can operate a monster together with, or in place of, the player character to cause it to perform actions.
  • The user can raise the monsters acquired in the quest part 3 in the training part 2.
  • the user can switch between the training part 2 and the quest part 3 by operating UI components displayed on the game screen.
  • the first game space for training part 2 and the second game space for quest part 3 are separate game spaces.
  • The game implemented by the game program thus makes it possible to train, in the first game space, monsters acquired in quests in the second game space, which is different from the first game space.
  • the first game space and the second game space may be an integrated game space.
  • The following description mainly relates to the training part 2 (the first game space), and in particular to the farm element.
  • FIG. 5 is a flow chart showing an example of the information processing operation of the user terminal 100 shown in FIG.
  • the control unit 110 and the storage unit 120 of the user terminal 100 cooperate to progress the game according to the code described in the game program and generate the display data.
  • The game progression unit 111 arranges various objects, including the player character and the monsters to be trained, in the game space (farm). Under the control of the virtual camera control unit 114, the game progression unit 111 also places in the game space a virtual camera movable corresponding to the player character and a virtual camera movable corresponding to each of the one or more monsters.
  • the object control unit 112 also controls the actions and statuses of monsters and stores the latest information.
  • In step S1, the control unit 110 acquires information on the objects in the game space through cooperation between the object control unit 112 and the display control unit 113.
  • the acquired object information may include position information and status information of the player character and position information and status information of the monster.
  • the position information is represented, for example, as position coordinates within the game space.
  • the control unit 110 acquires information on each of the plurality of monsters.
  • The control unit 110, in cooperation with the display control unit 113 and the virtual camera control unit 114, generates display data for displaying a first image of the game space including the player character, and causes the display unit 1072 to display the first image.
  • the first image is an image in which a UI component image is superimposed on a live image from a virtual camera arranged movably corresponding to the player character.
  • the virtual camera is controlled by the virtual camera control unit 114 so as to track the player character at a position behind the player character by a predetermined distance, with the point of view set to the head of the player character.
  • the virtual camera captures an image at a predetermined frame rate and passes it to display control section 113 .
  • As a result, a live image of the player character viewed from behind, that is, from a so-called third-person viewpoint, is obtained.
  • control unit 110 repeats the process of displaying an image including a live image of the monster in a predetermined area of the first image for each monster in the farm in steps S3 to S5.
  • the control unit 110 in cooperation with the display control unit 113 and the virtual camera control unit 114, generates display data for displaying a live image of a monster in a predetermined area within the first image, and displays the data.
  • a live image is a live image from a virtual camera movably arranged corresponding to each of one or more monsters in the game space.
  • each virtual camera is controlled by the virtual camera control unit 114 so as to track each monster at a position a predetermined distance in front of each monster, with the viewpoint at the head of each monster.
  • Each virtual camera captures an image at a predetermined frame rate and transfers it to the display control unit 113 .
  • a live image including a part of each monster's face is obtained.
  • the frame rate of the virtual camera that shoots the player character and the frame rate of each virtual camera that shoots each monster may be the same or different.
  • FIG. 6 shows a first example of a game image realized by a game program according to one embodiment.
  • a game image 50A in FIG. 6 is a first example of the first image, and is displayed on the display unit 1072 of the user terminal 100.
  • the game image 50A includes an image of the game space including the player character 11.
  • the game image 50A also includes images of UI components 51A, 51B, 51C, UI component 52, and UI components 71A, 71B, 71C, 71D.
  • the image of the game space including the player character 11 is the captured image of the virtual camera that tracks the player character 11 within the game space.
  • the player character 11 is placed in a farm having the appearance of grassland, and displayed with background objects such as grass, trees, rocks, fences, mountains, and the sky.
  • the player character 11 is an example of a first object that can move according to a user's operation, and is represented as a human game character in the game image 50A.
  • UI components 51A, 51B, and 51C are operation buttons that perform predetermined functions when selected by the user.
  • When the UI component 51A is tapped, the display switches to a dedicated image (not shown) for farm settings, allowing the user to configure the farm.
  • Farm settings include, for example, settings for the monsters placed on the farm, training settings for those monsters, and settings for other objects placed on the farm (for example, trainers that support training, training tools, and playthings).
  • When the UI component 51B is tapped, the display switches to a breeding field image (not shown) so that the user can play the breeding element of the game.
  • When the UI component 51C is tapped, the display switches to a dedicated image (not shown) for various settings, allowing the user to configure settings such as screen display and volume.
  • the number, appearance, display position or function of the UI components 51A, 51B, 51C can be arbitrarily set by the game designer.
  • the UI component 52 has a scene title function, and includes a display that describes the scene displayed by the game image 50A.
  • the UI component 52 includes the characters "farm" to indicate that the player character 11 is on the farm.
  • Scene titles 52 may also be optionally set by the game designer.
  • the UI components 71A, 71B, 71C, and 71D are arranged within a predetermined area 70 of the game image 50A.
  • the predetermined area 70 is located on the right side of the game image 50A as seen from the user playing the game.
  • the right area in the game image 50A is an example of an area that is easily tapped with the right hand when the user holds both ends of the user terminal 100 in the longitudinal direction with both hands.
  • the predetermined area 70 may be another position on the game image 50A.
  • the UI components 71A, 71B, 71C, and 71D respectively have a function of displaying live images 20A, 20B, 20C, and 20D (collectively referred to as "live images 20") of monsters.
  • the UI components 71A, 71B, 71C, and 71D are hereinafter referred to as "wipe UI” respectively, and also collectively referred to as “wipe UI 71".
  • Images related to the wipe UI 71 including the live video 20 are collectively referred to as wipe images.
  • The wipe UIs 71A, 71B, 71C, and 71D respectively display live images 20A, 20B, 20C, and 20D of four different monsters 21A, 21B, 21C, and 21D (collectively referred to as "monsters 21") arranged in the farm.
  • a live image 20 is an image captured by a virtual camera that tracks each monster 21 in the game space.
  • The monster 21 is an example of a second object that can act without a user's operation, and is represented as a fictitious creature character in FIG. 6.
  • the monster 21 may also be able to move according to the user's operation.
  • Each monster 21 is a monster selected by the user as a breeding target and placed on the farm.
  • the appearance, shape, number, size and position of the wipe UI 71 may be set arbitrarily.
  • the number of wipe UIs 71 may be fixed or may change.
  • The number of wipe UIs 71 may change according to the number of monsters 21 placed in the farm during the progress of the game, or may be a fixed number equal to the maximum number of monsters 21 that can be placed in the farm.
  • the size of the displayed wipe UI 71 may be fixed or may change during the course of the game.
  • The wipe UIs 71 may be displayed side by side in the predetermined area 70 as shown in FIG. 6, or may partially overlap each other.
  • For example, when the monster 21A is no longer placed in the farm, the display of the wipe UI 71A may be turned off, the live image within the wipe UI 71A may be switched to a dummy image, or only the outline of the wipe UI 71A may be maintained so that the first live video can be seen through it.
  • In that case, the positions and sizes of the remaining wipe UIs 71B, 71C, and 71D may be maintained, or their positions (height), sizes, or spacing may be changed so that they are arranged evenly within the predetermined area 70.
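  • Arranging the wipe UIs evenly within the predetermined area according to the current number of monsters can be done with a small layout calculation such as the following. The vertical stacking, sizes, and example coordinates are illustrative assumptions only.

```typescript
// Illustrative sketch: stack square wipe UIs vertically inside the predetermined
// area, distributing the leftover space evenly. All numbers are assumptions.

interface Rect { x: number; y: number; width: number; height: number; }

function layoutWipeUis(area: Rect, count: number, wipeSize: number): Rect[] {
  const rects: Rect[] = [];
  if (count === 0) return rects;
  // Distribute the remaining vertical space evenly above, between, and below the wipes.
  const gap = (area.height - count * wipeSize) / (count + 1);
  for (let i = 0; i < count; i++) {
    rects.push({
      x: area.x + (area.width - wipeSize) / 2,  // centered horizontally in the area
      y: area.y + gap + i * (wipeSize + gap),
      width: wipeSize,
      height: wipeSize,
    });
  }
  return rects;
}

// Example: four monsters on the farm, an area along the right edge of the screen.
const wipeRects = layoutWipeUis({ x: 780, y: 60, width: 300, height: 900 }, 4, 180);
console.log(wipeRects.length); // 4
```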
  • the player character 11 can move (for example, walk around) within the farm in response to a user's operation.
  • A monster 21 may also be reflected in the captured image of the virtual camera that tracks the player character 11, in which case the player character 11 and the monster 21 are both displayed in the game image 50A.
  • the user can communicate with the monster 21 through the game image 50A.
  • However, moving the player character 11 closer to a monster 21 in order to check its state requires complicated operations.
  • In particular, when the user wants to check the state of a specific monster 21, the user must first identify the monster 21 of interest and then identify its position, which can be even more complicated.
  • In this embodiment, the wipe UIs 71 are displayed together with the live image of the player character 11, and the live image of each monster 21 is displayed within the corresponding wipe UI 71.
  • This allows the user to grasp the current state of each monster 21 at a glance while operating the player character 11, without complicated operations.
  • the wipe UI 71A displays a live image 20A of the monster 21A, and the user can visually recognize the monster 21A walking around the farm while operating the player character 11.
  • A live image 20B of the monster 21B is displayed on the wipe UI 71B, and the state of the monster 21B closing its eyes can be visually recognized.
  • a live image 20C of the monster 21C is displayed on the wipe UI 71C, and the musical note mark 22 displayed as an effect corresponding to the mood of the monster is reflected, so that the monster 21C can be visually recognized as being in a good mood.
  • a live image 20D of the monster 21D is displayed on the wipe UI 71D, and the sitting state of the monster 21D can be visually recognized.
  • the monster 21 can appear in the wipe UI 71 as well as appearing in the live image of the player character 11 .
  • the player character 11 itself may appear in the live video within the wipe UI 71 .
  • When two or more monsters 21 are close to each other, it is possible that two or more monsters 21 appear in one wipe UI 71, or that the same monster 21 appears redundantly in two or more wipe UIs 71.
  • In step S4, the control unit 110 determines, in cooperation with the object control unit 112, the farm management unit 115, and the like, whether or not each monster 21 has transitioned to a specific state. If it is determined that the monster 21 has transitioned to the specific state (YES), the process proceeds to step S5; if it is determined that it has not (NO), the process ends.
  • each monster 21 placed in the farm has two states, a "normal state” and a "specific state", and undergoes a state transition when predetermined conditions are met.
  • An example of a specific state is a state in which monster 21 is in possession of a present.
  • the normal state is the state in which the monster 21 does not possess any presents.
  • When the monster 21 is given food in the farm, its friendliness parameter increases, and when the friendliness parameter reaches a certain value, the monster 21 transitions to the state of possessing a present (the specific state).
  • The user obtains the item that is the content of the present by receiving the present from the monster 21 via the player character 11.
  • When the present has been received, the monster 21 transitions to the normal state.
  • Another example of the specific state is a state in which the monster 21 is sick. In this case, the normal state is a state in which the monster 21 is not sick (a healthy state).
  • For example, the mood (kibun) parameter of the monster 21 is set to decrease under certain conditions.
  • The kibun parameter may also be set to change due to the frequency of communication with the player character 11, changes in the environment within the farm, or the like.
  • When the kibun parameter has decreased, the monster 21 transitions to the sick state with a certain probability. Furthermore, if the sick state continues for a certain period of time, the ability values of the monster 21 decrease, or the monster 21 can no longer be taken to the quest part 3.
  • The user can restore the condition of the sick monster 21 through the player character 11, for example by giving it food, taking it to the quest part 3, administering medicine, or increasing communication. When a predetermined recovery condition is satisfied, the monster 21 transitions to the normal state.
  • Another example of the specific state is a state in which the ability value of the monster 21 is increased.
  • the normal state is a state in which the ability value of the monster 21 has not increased.
  • If the monster 21 is set to "training" in the farm and its ability value has not reached the upper limit, the monster 21 transitions to the state in which its ability value is increased.
  • When this condition is no longer satisfied, the monster 21 transitions to the normal state.
  • The ability values include HP (hit points) representing physical strength, SP (skill points) representing the capacity to use skills (special moves), attack power, defense power, attack hit rate, luck, attack evasion rate, movement speed, and the like.
  • a state in which the ability value is increased may include a state in which the monster 21 is about to acquire a new skill or a state in which the monster has acquired a new skill.
  • Each monster 21 can transition to different specific states at the same time.
  • For each monster 21, a combination of two or more of the state of possessing a present, the sick state, and the state in which the ability value is increasing can occur simultaneously.
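  • Because several specific states can apply at once, one simple (hypothetical) representation is a set of states per monster rather than a single state value, as sketched below:

```typescript
// Sketch of tracking multiple simultaneous specific states as a set.
// This is an illustration, not the patent's data model.

type SpecificState = "hasPresent" | "sick" | "abilityRising";

interface MonsterStatus {
  specificStates: Set<SpecificState>;
}

function isInSpecificState(s: MonsterStatus): boolean {
  return s.specificStates.size > 0; // "normal" simply means the set is empty
}

// Example: a monster can hold a present while also being sick.
const status: MonsterStatus = { specificStates: new Set<SpecificState>(["hasPresent", "sick"]) };
console.log(isInSpecificState(status)); // true
```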
  • In step S5, the control unit 110 causes the display control unit 113 to generate and display display data for displaying transition information indicating that the monster 21 has transitioned to a specific state.
  • Transition information is an example of second information.
  • the transition information may include characters such as "gift”, “disease”, “defense power rising”, “attack power rising”, and "new skill acquisition”.
  • the transition information may include an image (third image) visually indicating that the monster 21 has transitioned to a specific state, such as a picture of a gift box, a skull mark, or an arrow mark.
  • the display control unit 113 generates display data for displaying transition information, for example, by reading UI components that match the conditions from the UI information storage unit 123 .
  • the transition information is displayed in association with each live video 20 or each wipe UI 71 .
  • The transition information may be displayed as a badge on the frame of the wipe UI 71, overlaid on the wipe UI 71, or displayed near the wipe UI 71.
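  • As an illustrative sketch of step S5, the badge shown on each wipe UI 71 could be derived from the monster's current specific states; the state-to-badge mapping and identifiers below are assumptions:

```typescript
// Sketch of rebuilding the transition-information badges for one wipe UI
// from the monster's current specific states. Names are hypothetical.

type SpecificState = "hasPresent" | "sick" | "abilityRising";

interface WipeUi {
  monsterId: string;
  badges: string[]; // identifiers of UI components to overlay on the frame
}

const BADGE_FOR_STATE: Record<SpecificState, string> = {
  hasPresent: "badge_gift_box",
  sick: "badge_skull",
  abilityRising: "badge_up_arrow",
};

function updateTransitionBadges(ui: WipeUi, states: Set<SpecificState>): void {
  // Rebuild the badge list every cycle; an empty list means no transition info.
  ui.badges = Array.from(states).map((s) => BADGE_FOR_STATE[s]);
}
```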
  • the flow shown in FIG. 5 is repeatedly executed at regular intervals (for example, every 1/30th of a second).
  • The frame rate of the virtual camera that captures the player character or each monster is set to a value (e.g., one update every 1/10th of a second) that keeps the live video sufficiently smooth while reducing the processing load.
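  • A minimal sketch of running the flow and the wipe-UI camera capture at different intervals; setInterval and the callback names are illustrative stand-ins for the game engine's actual tick mechanism:

```typescript
// Sketch of driving the state-check flow and the virtual-camera capture at
// different rates, as described above.

function runFlowOnce(): void {
  // corresponds to steps S4/S5: check each monster's state and update badges
}

function captureWipeCameras(): void {
  // render each tracking virtual camera into its wipe UI
}

const FLOW_INTERVAL_MS = 1000 / 30; // the flow repeats roughly every 1/30 s
const CAMERA_INTERVAL_MS = 100;     // live video refreshed every 1/10 s to save processing

setInterval(runFlowOnce, FLOW_INTERVAL_MS);
setInterval(captureWipeCameras, CAMERA_INTERVAL_MS);
```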
  • FIG. 7 shows a second example of a game image realized by a game program according to one embodiment.
  • a game image 50B in FIG. 7 is a second example of the first image.
  • the game image 50B in FIG. 7 is the same as the game image 50A in FIG. 6 except that the transition information is displayed, so the differences from FIG. 6 will be mainly described below.
  • Game image 50B in FIG. 7 includes present display 72 and disease display 73 .
  • the present display 72 and the disease display 73 are examples of display of transition information.
  • a present display 72 indicates that the monster 21 has transitioned to possessing a present.
  • the present representation 72 is an image containing a picture of a gift box.
  • the present display 72 is superimposed on the frame of the wipe UI 71A in a so-called badge display mode, indicating that the monster 21A possesses a present for the player character 11.
  • The present display 72 also suggests to the user that an operation to receive a present from the monster 21A is required.
  • The present display 72 is continuously displayed while the monster 21 is in possession of the present. For example, when the user moves the player character 11 near the monster 21A and performs an operation to receive a present from the monster 21A, the monster 21A transitions from the "specific state" to the "normal state". Therefore, in the processing of the next cycle, the process ends without displaying the present display 72 due to the determination of "NO" in step S4. When "NO" is determined in step S4, a process of turning off the display of any transition information that is already displayed may be executed.
  • a disease display 73 indicates that the monster 21 has transitioned to a sick state.
  • the disease display 73 is an image including a skull.
  • the disease display 73 is also superimposed on the frame of the wipe UI 71D in the form of a badge display, indicating that the monster 21D is in a sick state.
  • The disease display 73 also suggests to the user that an operation is required to recover the monster 21D from the sick state.
  • The disease display 73 is continuously displayed while the monster 21 is in the sick state. For example, when the user moves the player character 11 near the sick monster 21D and performs an operation to give it medicine, the monster 21D transitions from the "specific state" to the "normal state". Therefore, in the processing of the next cycle, the process ends without displaying the disease display 73 due to the determination of "NO" in step S4. As described above, when "NO" is determined in step S4, a process of turning off the display of any transition information that is already displayed may be executed.
  • the present display 72 or the disease display 73 may include character information, or may consist of characters only.
  • the present display 72 or the disease display 73 may be displayed within the wipe UI 71 or may be displayed outside the wipe UI 71 .
  • The present display 72 or the disease display 73 may have a function of displaying, when tapped by the user, character information explaining what state the monster 21 is in, what kind of operation the user should perform, and so on.
  • FIG. 8 shows a third example of a game image realized by the game program according to one embodiment.
  • a game image 50C in FIG. 8 is a third example of the first image.
  • the game image 50C in FIG. 8 is the same as the game image 50A in FIG. 6 except that the transition information is displayed, so the differences from FIG. 6 will be mainly described below.
  • A game image 50C of FIG. 8 includes a first ability increase display 74 and a second ability increase display 75.
  • the first ability increase display 74 and the second ability increase display 75 are respectively examples of display of transition information (second information or third image).
  • the first ability increase display 74 indicates that the monster 21 has transitioned to a state in which the ability value has increased.
  • a first ability increase display 74 may include a picture of an arrow pointing upwards and the word "defense" as a shorthand for defense power.
  • a first ability increase display 74 is displayed within wipe UI 71B and wipe UI 71C, respectively, to indicate that the defense power of monsters 21B and 21C, respectively, is increasing.
  • the second ability increase display 75 also indicates that the monster 21 has transitioned to a state in which the ability value has increased.
  • a second ability increase display 75 may include a picture of an arrow pointing upwards and the word "attack” as an abbreviation for attack power. In FIG. 8, the second ability increase display 75 is displayed within the wipe UI 71C to indicate that the attack power of the monster 21C is increasing.
  • The display control unit 113 starts displaying the transition information and also starts a timer; when a certain period of time (for example, 3 seconds, 5 seconds, or 10 seconds) has elapsed, processing is performed to erase the display of the transition information (or to display an image that does not include the transition information). Whether to continue displaying the transition information while the monster 21 is in the specific state, or to turn off the display after a certain period of time from the start of the display, may be set for each piece of transition information. When the display is turned off after a certain period of time, the time until it is turned off may also be set for each piece of transition information. As an example, the first ability increase display 74 and the second ability increase display 75 are erased after being displayed for a certain period of time; alternatively, they may be continuously displayed while the ability value of the monster 21 is increasing.
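  • The per-item display timer could be sketched as follows; the durations and the hide callback are assumptions for illustration:

```typescript
// Sketch of hiding transition information after a per-item duration.
// Durations and the hide callback are hypothetical.

interface TransitionDisplay {
  id: string;
  durationMs?: number; // undefined = keep showing while the state lasts
}

const activeTimers = new Map<string, ReturnType<typeof setTimeout>>();

function showTransitionInfo(d: TransitionDisplay, hide: (id: string) => void): void {
  // (drawing the badge/overlay itself is omitted)
  if (d.durationMs !== undefined) {
    // Auto-erase after the configured time, e.g. 3, 5 or 10 seconds.
    const t = setTimeout(() => {
      hide(d.id);
      activeTimers.delete(d.id);
    }, d.durationMs);
    activeTimers.set(d.id, t);
  }
}

// Example: an ability-increase display vanishes after 5 seconds, while a
// present badge stays until the present is received (no duration given).
showTransitionInfo({ id: "atk_up_21C", durationMs: 5000 }, (id) => console.log("hide", id));
showTransitionInfo({ id: "present_21A" }, (id) => console.log("hide", id));
```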
  • As described above, transition information indicating that a monster 21 (second object) displayed in the predetermined area 70 has transitioned to a specific state is further displayed.
  • the transition information includes characters or images and is superimposed on the predetermined area 70 .
  • In particular, the transition information can be displayed within the predetermined area 70, on the frame of each wipe UI 71, within each wipe UI 71, or around each wipe UI 71, so as to indicate which monster 21 has transitioned to a specific state. Accordingly, while operating the player character 11, the user can grasp at a glance whether or not each monster 21 has transitioned to a specific state, without complicated operations.
  • the wipe UI 71 further has an operation button function for accepting user operations.
  • the control unit 110 displays detailed status information of the monster 21 related to the tapped wipe UI 71 under the control of the display control unit 113 .
  • An operation of tapping the wipe UI 71 is an example of an operation of selecting the predetermined area 70 .
  • The control unit 110 causes the operation reception unit 116 to monitor whether or not the user performs an operation of tapping a wipe UI 71. For example, when the position of a tap operation is included in one of the wipe UIs 71, the operation reception unit 116 determines that an operation of tapping that wipe UI 71 has been detected. When an operation of tapping any of the wipe UIs 71 is detected, the control unit 110 interrupts the processing of FIG. 5, and a second image including an image corresponding to the live video of the monster 21 appearing on the tapped wipe UI 71 and the first information representing the status of that monster is generated and displayed in place of the first image.
  • the image corresponding to the live image includes an enlarged image of the live image displayed in the first image, or an image in which the live image displayed in the first image is displayed on the entire game screen.
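  • A minimal sketch of the tap handling described above: the tap position is hit-tested against the on-screen bounds of each wipe UI 71, and a hit triggers the switch to the second image. The rectangle representation and showSecondImage are hypothetical:

```typescript
// Sketch of detecting a tap on a wipe UI and switching to the second image.
// Screen-space rectangles and the switch function are assumptions.

interface Rect { x: number; y: number; width: number; height: number; }

interface WipeUiRegion {
  monsterId: string;
  rect: Rect; // screen-space bounds of the wipe UI inside the predetermined area
}

function contains(r: Rect, px: number, py: number): boolean {
  return px >= r.x && px <= r.x + r.width && py >= r.y && py <= r.y + r.height;
}

// Returns the tapped wipe UI, if any; the caller then interrupts the normal
// flow and displays the second image (enlarged live video + status info).
function hitTestWipeUis(regions: WipeUiRegion[], tapX: number, tapY: number): WipeUiRegion | undefined {
  return regions.find((r) => contains(r.rect, tapX, tapY));
}

function onTap(regions: WipeUiRegion[], tapX: number, tapY: number): void {
  const hit = hitTestWipeUis(regions, tapX, tapY);
  if (hit) {
    showSecondImage(hit.monsterId); // hypothetical: builds the detailed view
  }
}

declare function showSecondImage(monsterId: string): void; // assumed to exist elsewhere
```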
  • FIG. 9 shows a first example of a game image that can transition from the game image of FIG.
  • a game image 60A in FIG. 9 is an example of a game image displayed on the display unit 1072 of the user terminal 100 particularly when an operation of tapping the wipe UI 71A is received.
  • a game image 60A in FIG. 9 is a first example of the second image, and is displayed instead of the game image 50A (first image) in FIG.
  • a similar game image can be displayed when the wipe UI 71A shown in FIG. 7 or 8 is tapped.
  • a game image 60A in FIG. 9 includes a live image of a monster 21A and UI components 52, 53, 54A, 55A, 56A, 57 and 58.
  • the live image of monster 21A included in game image 60A is an image corresponding to the live image of monster 21A included in game image 50A (first image).
  • the live video included in the second image is, like the live video within the wipe UI 71A, the live video of the virtual camera tracking the monster 21A.
  • the live video included in the second image can also be rephrased as an enlarged version of the live video displayed on the tapped wipe UI 71A.
  • the game image 60A shows the monster 21A walking around the farm.
  • UI component 54A displays the first status information.
  • the first status information includes the name “monster A” of the monster 21A, the symbol “ ⁇ ” indicating male and female, and the level “level 20" of the monster 21A.
  • UI component 55A displays the second status information.
  • the second status information includes information indicating the monster 21A's friendliness level “level 3", feeling “good (smile mark)", and training setting status "unselected”.
  • UI component 56A displays the third status information.
  • The third status information includes information "HP 445", "SP 100", "attack power 80", and "defense power 70" indicating the ability values of the monster 21A.
  • the first status information, second status information and third status information are examples of first information representing the status of the monster 21A (second object).
  • the first status information, the second status information, and the third status information do not need to be divided and displayed as shown in FIG. 9, and may be collectively displayed in one UI component, for example.
  • the first status information, the second status information, and the third status information are merely examples, and a part of the information may be omitted, a part of the information may be replaced with other information, or other information may be used. information may be added.
  • Although FIG. 9 shows an example in which the first information is displayed superimposed on the live video, the first information and the live video may be displayed in other manners, for example, side by side vertically or horizontally.
  • the UI component 52 has a scene title function as in FIG. 6, and includes a display for explaining the scene displayed by the game image 60A.
  • the UI component 52 includes the characters "monster” and indicates that the scene is to display the monster 21 .
  • the UI component 53 is an operation button having a so-called “return” function, and when a tap operation is received, the game image 60A is switched to the previously displayed image (for example, the game image 50A).
  • The UI component 57 is an operation button having a "swap" function, and when a tap operation is received, it switches to an image for performing a process of swapping out the monster 21A.
  • the UI component 58 is an operation button having a function of "give food”, and when a tap operation is received, the UI component 58 switches to an image for performing processing of giving food to the monster 21A.
  • FIG. 10 shows a second example of a game image that can transition from the game image of FIG.
  • a game image 60B in FIG. 10 is an example of a game image displayed on the display unit 1072 of the user terminal 100 particularly when an operation of tapping the wipe UI 71B is received.
  • a game image 60B in FIG. 10 is a second example of the second image, and is also displayed instead of the game image 50A (first image) in FIG.
  • a similar game image can be displayed when the wipe UI 71B shown in FIG. 7 or 8 is tapped.
  • the game image 60B of FIG. 10 includes video corresponding to the live video of the monster 21B and UI components 52, 53, 54B, 55B, 56B, 57 and 58.
  • Differences from FIG. 9 will be mainly described below.
  • the live image of the monster 21B included in the game image 60B is an image corresponding to the live image of the monster 21B included in the game image 50A (first image).
  • the live video included in the second image is, like the live video within the wipe UI 71B, the live video of the virtual camera tracking the monster 21B.
  • the live video included in the second image can also be rephrased as an enlarged version of the live video displayed on the tapped wipe UI 71B.
  • the game image 60B shows the monster 21B meditating with its eyes closed in the farm.
  • UI component 54B displays the first status information.
  • the first status information includes the name “monster B” of the monster 21B, the symbol “ ⁇ ” indicating male and female, and the level “level 1" of the monster 21B.
  • UI component 55B displays the second status information.
  • the second status information includes information indicating the monster 21B's friendliness level “level 5", feeling “good (smiley mark)", and training setting status "meditation”.
  • UI component 56B displays the third status information.
  • the third status information includes information "HP 270", “SP 55”, “attack power 45” and “defense power 40” indicating the ability value of the monster 21B.
  • the first status information, second status information, and third status information are examples of first information representing the status of the monster 21B (second object).
  • the first status information, the second status information, and the third status information do not need to be divided and displayed as shown in FIG. 10, and may be collectively displayed in one UI component, for example.
  • the first status information, the second status information, and the third status information are merely examples, and a part of the information may be omitted, a part of the information may be replaced with other information, Other information may be added.
  • monster 21B is in a state of being set for "meditation” training.
  • "Training” is not limited to “meditation”, and other training content may be selectable. It may be set so that the change in the ability value differs according to the content of the training. For example, when “meditation” is set, the "defense power" of the monster 21 is set to increase by a constant value at regular intervals.
  • The game image 60B may also include the first ability increase display 74 illustrated in FIG. 8.
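  • The training effect described in the preceding items (defense rising by a constant value at regular intervals while "meditation" is set) could be sketched as follows; the interval, step, and cap are assumed values:

```typescript
// Sketch of a periodic training tick; the numbers are illustrative only.

interface Trainee {
  training: "none" | "meditation";
  defense: number;
}

const DEFENSE_CAP = 999;   // assumed upper limit
const MEDITATION_STEP = 1; // assumed constant increase per tick

// Called at a regular interval (e.g. once per in-game minute).
function trainingTick(m: Trainee): void {
  if (m.training === "meditation" && m.defense < DEFENSE_CAP) {
    m.defense = Math.min(DEFENSE_CAP, m.defense + MEDITATION_STEP);
  }
}
```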
  • the second image is displayed instead of the first image when the operation of tapping the wipe UI 71 (the operation of selecting the predetermined area 70) is accepted.
  • The second image may include information representing the status of each monster and video corresponding to the live video of each monster, and may further enable operations for each monster. Additionally or alternatively, it may be set so that, when an operation of tapping the wipe UI 71 is accepted, the player character 11 automatically moves within the game space to a position near the monster 21 appearing on the tapped wipe UI 71.
  • FIG. 11 shows a first example of a game image that can transition from the game image of FIG.
  • a game image 61 in FIG. 11 is an example of a game image displayed on the display unit 1072 of the user terminal 100 when an operation of tapping the UI component 57 in FIG. 9 is received.
  • Game image 61 includes UI component 52 , UI component 53 , UI component 62 , UI component 63 , UI component 64 , UI component 65 and UI component 66 .
  • the UI component 52 has a scene title function, and the game image 61 indicates the scene after the UI component 57 having the "swap" function is selected.
  • the UI component 53 has a so-called "return” function, and when a tap operation is received, the game image 61 is switched to the previously displayed image (for example, the game image 60A).
  • UI component 62 contains information about the currently selected monster. In this example, UI component 62 displays an image of monster 21A and various information about monster 21A (the same information that was displayed in FIG. 9).
  • The UI component 63 has a "remove" function, and when a tap operation is received, it performs a process of removing the monster 21A from the breeding targets. For example, the user selects "remove" when the ability value of the monster 21 has reached the upper limit. If the monster 21A is removed from the breeding targets, an empty slot becomes available for a monster that can be placed on the farm.
  • the UI component 64 includes information regarding monsters possessed by the player character 11 .
  • Monsters with a description of "under formation” are monsters currently placed in the farm.
  • When the user taps one of the monsters listed in the UI component 64, the UI component 62 displays the status of the tapped monster instead of the monster 21A.
  • the UI component 65 has a "cancel” function. For example, when the user taps the UI component 65, the process performed immediately before on the game image 61 is canceled.
  • the UI component 66 has an "OK” function. For example, when the user taps the UI component 66, the monsters displayed within the UI component 62 are placed on the farm. In this way, the "swap" function makes it possible to switch the monsters arranged on the farm. In other words, the "swap” function enables the monsters 21 displayed in the live video 20 in the wipe UI 71 to be switched.
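  • A minimal sketch of the "swap" and "remove" operations on the farm's monster slots; the data shapes are assumptions:

```typescript
// Sketch of switching which monster occupies a farm slot ("swap") and of
// emptying a slot ("remove"). Identifiers are hypothetical.

interface FarmSlots {
  placed: (string | null)[]; // monster ids currently on the farm ("under formation")
}

// Swap the monster in `slotIndex` with `newMonsterId` from the possessed list.
// Returning the previous occupant lets the caller put it back into storage.
function swapMonster(farm: FarmSlots, slotIndex: number, newMonsterId: string): string | null {
  const previous = farm.placed[slotIndex];
  farm.placed[slotIndex] = newMonsterId;
  return previous;
}

// "Remove" simply empties the slot, freeing space for another monster.
function removeMonster(farm: FarmSlots, slotIndex: number): void {
  farm.placed[slotIndex] = null;
}
```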
  • FIG. 12 shows a second example of a game image that can transition from the game image of FIG.
  • a game image 67 in FIG. 12 is an example of an image displayed on the display unit 1072 of the user terminal 100 when an operation of tapping the UI component 58 in the game image 60A in FIG. 9 is accepted.
  • the game image 67 includes live images of the monster 21A and UI components 52, 53, 68, 81, 82, 83A, 83B, 83C, 83D and 83E.
  • the UI component 52 has the function of a scene title, and the game image 67 indicates the scene after the UI component 58 having the function of "giving food” is selected.
  • the UI component 53 has a so-called “return” function, and when a tap operation is received, the game image 67 is switched to the previously displayed image (for example, the game image 60A).
  • UI component 68 contains information about the currently selected monster. In this example, the UI component 68 includes parameter information of "natsuki” and "kibun” among the statuses of the monster 21A.
  • the UI component 68 may further include an upward pointing arrow mark 69 representing that "natsuki” and "kibun” are rising.
  • In the live image of the monster 21A, a musical note mark 22 and a heart mark 23 are superimposed as effects indicating that the "natsuki" and "kibun" parameters are increasing.
  • Such an effect display can also be reflected in the live video within the wipe UI 71 .
  • UI parts 83A, 83B, 83C, 83D and 83E function as operation buttons for the user to give food objects to the monster 21A.
  • When the user selects one of the UI parts 83A to 83E and drags and drops it near the monster 21A or onto a specific part of the monster 21A, a process of giving the monster 21A the food object indicated by that UI part is performed. Alternatively, the process of giving the monster 21A the food object indicated by a UI part may be performed simply by tapping one of the UI parts 83A to 83E.
  • A monster 21 given a food object has its "natsuki" or "kibun" parameter increased.
  • In FIG. 12, the UI component 83C, surrounded by a solid line, is selected.
  • The UI component 83C corresponds to a cake object.
  • When the cake object is given to the monster 21A, its "natsuki" and "kibun" parameters are increased.
  • Food objects are set so that different parameter increase rates can be obtained depending on the type or number of food objects.
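  • As an illustration, per-food parameter gains could be held in a lookup table keyed by food type and scaled by quantity; the food names and numbers below are invented for the example:

```typescript
// Sketch of per-food parameter gains; values are illustrative assumptions.

interface FoodEffect { natsuki: number; kibun: number; }

const FOOD_EFFECTS: Record<string, FoodEffect> = {
  apple: { natsuki: 1, kibun: 1 },
  meat:  { natsuki: 2, kibun: 1 },
  cake:  { natsuki: 3, kibun: 3 }, // e.g. the cake object of UI component 83C
};

interface Tameable { natsuki: number; kibun: number; }

// Giving more of a food, or a richer food, yields a larger increase.
function giveFood(monster: Tameable, food: keyof typeof FOOD_EFFECTS, count = 1): void {
  const e = FOOD_EFFECTS[food];
  monster.natsuki += e.natsuki * count;
  monster.kibun += e.kibun * count;
}
```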
  • UI parts 81 and 82 are examples of UI parts for changing the display range of the UI parts 83A to 83E. For example, when the user taps the UI component 81, the UI component 83E currently displayed at the right end disappears, the display positions of the UI components 83A to 83D shift to the right, and a UI component located to the left of the UI component 83A (not currently displayed) is newly displayed.
  • Conversely, when the user taps the UI component 82, the UI component 83A currently displayed at the left end disappears, the display positions of the UI components 83B to 83E shift to the left, and a UI component located to the right of the UI component 83E (not currently displayed) is newly displayed.
  • the change in the display range when the UI component 81 and the UI component 82 are tapped may be in other forms such as shifting in the opposite direction.
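  • The display-range change driven by UI parts 81 and 82 behaves like a sliding window over the list of food buttons; a minimal sketch (with assumed names) follows:

```typescript
// Sketch of a sliding window of visible food buttons over a longer list.

interface Carousel {
  items: string[];      // all food UI parts, e.g. 83A, 83B, ...
  start: number;        // index of the left-most visible item
  visibleCount: number; // e.g. five buttons shown at once
}

function visibleItems(c: Carousel): string[] {
  return c.items.slice(c.start, c.start + c.visibleCount);
}

// Tapping UI component 81 reveals the item to the left; tapping 82 reveals the
// one to the right. The window is clamped so it never leaves the list.
function scrollLeft(c: Carousel): void {
  c.start = Math.max(0, c.start - 1);
}

function scrollRight(c: Carousel): void {
  c.start = Math.min(c.items.length - c.visibleCount, c.start + 1);
}
```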
  • As described above, in a predetermined area of the first image of the game space including the player character (first object) 10 that moves according to the user's operation, a live image 20 of a monster (second object) 21 that can move without being operated by the user is displayed.
  • When multiple monsters 21 are arranged in the game space, multiple live images 20, one corresponding to each monster 21, are displayed. This allows the user to easily check the current states of the monsters 21 as a list while operating the player character 10.
  • the first image does not need to include an image of the first object (player character 11) itself, and may include an image reflecting the field of view of the first object.
  • the first object may be replaced by a virtual camera.
  • the user can operate the position, orientation, shooting range, focus, etc. of the virtual camera.
  • the first image does not explicitly display the virtual camera as the first object, and may be replaced with an image captured by the virtual camera.
  • the second image is displayed instead of the first image when the wipe UI 71 is tapped.
  • Alternatively, the first information representing the status of the monster 21 in the tapped wipe UI 71 may be superimposed on the first image.
  • The second object, that is, the object to be trained, is not limited to a monster character, and may be a living object or an inanimate object as long as it can be trained (that is, something in the game can be changed as a result of training).
  • FIGS. 6 to 12 are merely examples, and some configurations may be omitted, replaced, or other configurations added.
  • the UI components 51A, 51B, 51C, and 52 in FIG. 6 may be omitted or replaced with other configurations.
  • An effect display corresponding to the sick state may be displayed around the sick monster 21 .
  • a presentation display may be provided that allows the sleeping state and the meditation state to be distinguished from each other. By reflecting such an effect display in the live video, it is possible to more easily grasp the current state of the monster 21 in combination with the display of the transition information.
  • the input unit 1071 may be a controller with multiple buttons.
  • a user's operation on the input unit 1071 may be replaced with an operation via such a controller button, keyboard, or mouse.
  • a tap operation may be rephrased as a click operation.
  • the operation reception unit 116 can also receive a signal from the sensor 1004 as a user's operation.
  • the operation reception unit 116 can detect a user's facial expression or gesture via an image sensor or a gesture sensor and receive it as an operation.
  • The operation reception unit 116 can detect the user's voice via a sound sensor and accept it as an operation.
  • the configurations of the server 200 and the user terminal 100 may be replaced with other configurations that can implement the game program according to the embodiment.
  • the configuration of the server 200 and the configuration of the user terminal 100 may be omitted, distributed in multiple devices, and replaced with similar configurations.
  • Each functional unit of the server 200 and the user terminal 100 may be realized by using circuits.
  • a circuit may be a dedicated circuit that implements a specific function, or it may be a general-purpose circuit such as a processor.
  • a program that implements the above process may be provided by being stored in a computer-readable recording medium.
  • the program is stored in the recording medium as an installable format file or an executable format file.
  • Recording media include magnetic disks, optical disks (CD-ROM, CD-R, DVD, etc.), magneto-optical disks (MO, etc.), semiconductor memories, and the like. Any recording medium may be used as long as it can store the program and is readable by a computer.
  • the program that implements the above processing may be stored on a computer (server) connected to a network such as the Internet, and downloaded to the computer (client) via the network.
  • the present invention is not limited to the above-described embodiments, and various modifications can be made in the implementation stage without departing from the gist of the invention. Further, each embodiment may be implemented in combination as appropriate, in which case the combined effect can be obtained. Furthermore, various inventions are included in the above embodiments, and various inventions can be extracted by combinations selected from a plurality of disclosed constituent elements. For example, even if some constituent elements are deleted from all the constituent elements shown in the embodiments, if the problem can be solved and effects can be obtained, the configuration with the constituent elements deleted can be extracted as an invention.
  • game system, 100… user terminal, 110… control unit, 111… game progression unit, 112… object control unit, 113… display control unit, 114… virtual camera control unit, 115… farm management unit, 116… operation reception unit, 120… storage unit, 121… game program storage unit, 122… game information storage unit, 123… UI information storage unit, 200… server, 211… reception control unit, 212… transmission control unit, 213… game progression unit, 220… storage unit, 221… game program storage unit, 222… game information storage unit, 223… user information database, 1001… processor, 1002… memory, 1003… storage, 1004… sensor, 1005… communication interface, 1006… input/output interface, 1007… touch screen, 1008… bus, 1071… input unit, 1072… display unit, 2001… processor, 2002… memory, 2003… storage, 2004… communication interface, 2005… input/output interface, 2006… bus.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)
PCT/JP2022/008304 2021-03-04 2022-02-28 プログラム WO2022186142A1 (ja)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021034374A JP7442472B2 (ja) 2021-03-04 2021-03-04 プログラム
JP2021-034374 2021-03-04

Publications (1)

Publication Number Publication Date
WO2022186142A1 true WO2022186142A1 (ja) 2022-09-09

Family

ID=83153814

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/008304 WO2022186142A1 (ja) 2021-03-04 2022-02-28 プログラム

Country Status (2)

Country Link
JP (2) JP7442472B2
WO (1) WO2022186142A1


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4117343B2 (ja) * 2000-09-20 2008-07-16 株式会社光栄 記録媒体及びゲーム装置

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005152318A (ja) * 2003-11-26 2005-06-16 Namco Ltd プログラム、情報記憶媒体、ゲーム装置及びサーバ装置
JP2012252516A (ja) * 2011-06-02 2012-12-20 Nintendo Co Ltd ゲームシステム、ゲーム装置、ゲームプログラム、および画像生成方法
WO2014196135A1 (ja) * 2013-06-07 2014-12-11 株式会社スクウェア・エニックス・ホールディングス 画像生成装置、プログラム、端末、および画像生成システム
JP2018020001A (ja) * 2016-08-05 2018-02-08 株式会社セガゲームス 情報処理装置およびゲームプログラム
JP2020054478A (ja) * 2018-09-28 2020-04-09 株式会社ミクシィ ゲーム装置、ゲーム処理方法及びプログラム
JP2020195480A (ja) * 2019-05-31 2020-12-10 株式会社コーエーテクモゲームス 情報処理装置、情報処理方法及びプログラム

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ANONYMOUS: "Monster Hunter Portable 3rd", WEEKLY FAMITSU, JP, vol. 25, no. 44, 21 October 2010 (2010-10-21), JP, pages 25 - 31, XP009541199 *

Also Published As

Publication number Publication date
JP7442472B2 (ja) 2024-03-04
JP2024056972A (ja) 2024-04-23
JP2022134893A (ja) 2022-09-15

Similar Documents

Publication Publication Date Title
US12151164B2 (en) Game program, game method, and information terminal device
JP2023547720A (ja) 仮想オブジェクトの制御方法、装置、端末、及びコンピュータプログラム
US11628365B2 (en) Information processing system, storage medium, information processing apparatus and information processing method
JP2015116336A (ja) 複合現実感アリーナ
JP7701517B2 (ja) プログラムおよびシステム
JP2020110451A (ja) ゲームプログラム、方法、および情報処理装置
US20200324209A1 (en) Game system, game processing method, computer-readable non-transitory storage medium having stored therein game program, and game apparatus
JP6416365B1 (ja) ゲームプログラム、方法、および情報処理装置
JP2020044134A (ja) ゲームプログラム、方法、および情報処理装置
JP2022048285A (ja) プログラム、制御方法および情報処理装置
JP2005319191A (ja) ゲームシステム、プログラム、情報記憶媒体および画像生成方法
JP7672474B2 (ja) プログラム
WO2022186142A1 (ja) プログラム
JP7161977B2 (ja) プログラム、方法、および情報処理装置
JP7400140B1 (ja) ゲームプログラム
JP7656278B2 (ja) 制御プログラム、端末装置、及び端末装置の制御方法
JP2020116178A (ja) ゲームプログラム、方法、および情報処理装置
JP7467316B2 (ja) プログラム
JP2022108334A (ja) プログラム
JP2020110452A (ja) ゲームプログラム、方法、および情報処理装置
JP7731852B2 (ja) プログラム及び情報処理装置
JP7377790B2 (ja) ゲームプログラム、ゲーム方法、および情報端末装置
JP6502550B1 (ja) ゲームプログラム、方法、および情報処理装置
US11344802B1 (en) Game system, program and information processing method
JP2019098153A (ja) ゲームプログラム、方法、および情報処理装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22763198

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22763198

Country of ref document: EP

Kind code of ref document: A1