CN117899457A - Storage medium, information processing system, information processing apparatus, game processing method, and computer program


Info

Publication number
CN117899457A
Authority
CN
China
Prior art keywords: value, virtual space, dimensional, event, predetermined
Prior art date
Legal status
Pending
Application number
CN202311339694.7A
Other languages
Chinese (zh)
Inventor
堂田卓宏
森航
朝仓淳
Current Assignee
Nintendo Co Ltd
Original Assignee
Nintendo Co Ltd
Priority date
Filing date
Publication date
Application filed by Nintendo Co Ltd
Publication of CN117899457A

Landscapes

  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

Provided are a storage medium, an information processing system, an information processing apparatus, and a game processing method. In one example of the information processing system, when a predetermined event occurs based on game processing, the point corresponding to the occurred event, among a plurality of points set in the virtual space, is changed from a first state to a second state. The information processing system specifies an area including at least the points that have become the second state among the plurality of points and draws, for at least a part of the terrain objects, the portion not included in a target range including at least a part of the area in a prescribed color, or darker than the portion included in the target range. A map image representing the site information of the virtual space is displayed, the map image showing the site information of the portion corresponding to the area.

Description

Storage medium, information processing system, information processing apparatus, game processing method, and computer program
Technical Field
The present invention relates to a storage medium, an information processing system, an information processing apparatus, and a game processing method for drawing a virtual space.
Background
Conventionally, there are game devices that execute the following game: a dark region in the virtual space (for example, a region that is displayed dark) is explored using an item as a light source (see, for example, a game guide published May 11, 2017, p. 211). In the above game, in a dark area of the game field (for example, an area inside a cave), the player character holds an item such as a torch as a light source, so that the surroundings of the player character can be illuminated to ensure visibility.
Conventionally, however, no processing has been performed to link the region in which visibility is ensured in the virtual space with the region in which site information is shown in the map image of the virtual space.
Accordingly, an object of the present invention is to provide a storage medium, an information processing system, an information processing apparatus, and a game processing method capable of changing the area in which visibility is ensured in the virtual space in accordance with a change in the area in which site information is shown in the map image.
Disclosure of Invention
In order to solve the above-described problems, the present invention adopts the following configurations (1) to (7).
(1)
An example of the present invention is a storage medium storing a game program for causing a computer of an information processing apparatus to execute the following processing.
a process of executing game processing that controls a player character in a virtual space based on operation inputs;
a process of, when a predetermined event occurs based on the game processing, switching the point corresponding to the occurred event, among a plurality of points set in the virtual space, from a first state to a second state;
a process of specifying an area including at least the points that have become the second state among the plurality of points;
a process of performing drawing processing in which, for at least a part of the terrain objects in the virtual space, the portion of the at least a part of the terrain objects that is not included in a target range including at least a part of the area is drawn in a predetermined color, or is drawn darker than the portion that is included in the target range; and
a process of displaying a map image representing the site information of the virtual space, the map image showing the site information of the portion corresponding to the area.
According to the configuration of (1) above, the range in which visibility is ensured in the virtual space (i.e., the above-described target range) can be changed in accordance with a change in the area of the map image in which the site information is shown.
(2)
In the configuration of (1) above, the computer may perform the following processing: specifying, as the area, a region in which a total determination value, obtained by summing position by position the determination values based on the one or more points that have become the second state among the plurality of points, is equal to or greater than a predetermined value, wherein each determination value is a value that equals a reference value at the position corresponding to its point and decays according to the distance from that position.
According to the configuration of (2) above, the shape and size of the area can take various forms corresponding to the respective states of the plurality of points.
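As a concrete illustration of configuration (2), the following Python sketch computes a total determination value from decaying per-point values. It is a minimal sketch, not the claimed implementation: the linear decay and the dictionary layout of a point (`pos`, `ref`, `falloff`, `released`) are assumptions, since the patent leaves the exact decay function open.

```python
import math

def determination_value(pos, point_pos, reference_value, falloff_dist):
    # Equals the reference value at the point itself and decays with
    # distance; a linear decay clamped at 0 is assumed here.
    d = math.dist(pos, point_pos)
    return max(0.0, reference_value * (1.0 - d / falloff_dist))

def total_determination_value(pos, points):
    # Sum, for one position, the determination values of every point
    # that has become the second (released) state.
    return sum(determination_value(pos, p["pos"], p["ref"], p["falloff"])
               for p in points if p["released"])

def in_area(pos, points, threshold):
    # The area of configuration (2): positions whose total determination
    # value reaches the predetermined value.
    return total_determination_value(pos, points) >= threshold
```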
(3)
In the configuration of (1) or (2), the computer may perform the following processing: setting, as the target range, a range that satisfies both (a) a range in which a total determination value, obtained by summing position by position the determination values based on the one or more points in the second state among the plurality of points, is equal to or greater than a predetermined value, and (b) a range in which the two-dimensional distance from the two-dimensional position corresponding to a point is equal to or less than a threshold value, wherein each determination value equals a reference value at the two-dimensional position corresponding to its point and decays according to the distance from that two-dimensional position.
According to the configuration of (3) above, the range of ensuring visibility in the virtual space can be suppressed from becoming excessively large.
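Read together with configuration (2), the target range of (3) can be understood as the positions satisfying both condition (a) and the distance cap (b). A minimal sketch reusing the helpers above; treating (a) and (b) as a conjunction is our reading, suggested by the stated effect of keeping the range from growing too large:

```python
def in_target_range(pos, points, threshold, max_dist):
    released = [p for p in points if p["released"]]
    # (a) the summed determination values reach the predetermined value
    cond_a = total_determination_value(pos, points) >= threshold
    # (b) the two-dimensional distance to some released point is at most
    # max_dist, which caps the size of the target range
    cond_b = any(math.dist(pos, p["pos"]) <= max_dist for p in released)
    return cond_a and cond_b
```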
(4)
In the configuration of any one of (1) to (3), the computer may perform the following processing: generating two-dimensional range data that represents, for each two-dimensional coordinate of the virtual space corresponding to the coordinate components other than the height direction, a degree value indicating the degree to which that coordinate is drawn darker, or is drawn in the prescribed color, in the drawing processing. The computer may calculate the degree value at each coordinate of the two-dimensional range data based on a total determination value obtained by summing position by position the determination values based on the one or more points in the second state among the plurality of points, wherein each determination value equals a reference value at the two-dimensional position corresponding to its point and is attenuated according to the two-dimensional distance from that position to the coordinate. In the drawing processing, the computer may, for each pixel drawn into the frame buffer, write into the frame buffer a pixel value obtained by reducing the brightness of the pixel value calculated so as to reflect the light sources set in the virtual space, or a pixel value obtained by blending in a predetermined color, in either case according to the degree value at the two-dimensional coordinate corresponding to the pixel as indicated by the two-dimensional range data.
According to the configuration of (4) above, an image of the virtual space in which visibility changes gradually according to position can be generated, so an image representing a more natural-looking virtual space can be generated.
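A sketch of how configuration (4) could be realized, again with assumed names and an assumed mapping from total determination value to degree value; here a degree value of 1 means fully darkened and 0 means fully lit:

```python
def build_range_data(width, height, points, threshold):
    # Two-dimensional range data: one darkness degree value per (x, z)
    # coordinate of the virtual space, the height component dropped.
    degree = [[0.0] * width for _ in range(height)]
    for z in range(height):
        for x in range(width):
            total = total_determination_value((float(x), float(z)), points)
            # Assumed mapping: dark where the total is 0, lit at or above
            # the threshold, with a gradual transition in between.
            degree[z][x] = max(0.0, min(1.0, 1.0 - total / threshold))
    return degree

def shade_pixel(lit_rgb, degree_value, prescribed_rgb=(0, 0, 0)):
    # Per-pixel step of the drawing processing: blend the normally lit
    # pixel value toward the prescribed color according to the degree value.
    return tuple(round(c * (1.0 - degree_value) + p * degree_value)
                 for c, p in zip(lit_rgb, prescribed_rgb))
```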
(5)
In the configuration of any one of (1) to (4), the event may be an event generated by performing a predetermined operation input while the player character is located at an event occurrence position set in association with a point in the virtual space. In the virtual space, a predetermined object may be arranged at the position corresponding to each of the plurality of points. The computer may further perform the following processing: rendering the predetermined object so that it is displayed distinguishably from the at least a part of the terrain objects not included in the target range, regardless of whether or not the predetermined object is included in the target range.
According to the configuration of (5) above, the player can easily move the player character to a place outside the target range by targeting a predetermined object.
(6)
In the configuration of any one of (1) to (5), a light source that provides at least a predetermined brightness regardless of position in the virtual space may be set in the virtual space. The computer may perform the following processing in the drawing processing: rendering the portion of the at least a part of the terrain objects included in the target range so as to reflect the light source.
According to the configuration of (6) above, a certain brightness can be ensured for the target range, and the visibility of the target range can be ensured regardless of the shape of the topography in the virtual space or the like.
(7)
In the configuration of (6) above, the computer may perform the following processing: also setting a point light source in the virtual space in accordance with the occurrence of the predetermined event. The computer may perform the following processing in the drawing processing: drawing the portion of the at least a part of the terrain objects included in the target range so as to reflect the point light source.
According to the configuration of (7) above, the player can easily recognize that the predetermined event has occurred, and the virtual space becomes brighter as a result of the occurrence of the predetermined event.
Further, another example of the present invention may be an information processing apparatus or an information processing system that executes the processing in (1) to (7) above. In addition, another example of the present invention may be a game processing method for executing the processing in (1) to (7) above.
According to the storage medium, the information processing system, the information processing apparatus, and the game processing method described above, the area ensuring visibility in the virtual space can be changed according to the change in the area showing the site information in the map image.
These and other objects, features, aspects, and effects of the present invention will become more apparent from the following detailed description with reference to the accompanying drawings.
Drawings
Fig. 1 is a diagram showing an example of a state in which a left controller and a right controller are mounted on a main body device.
Fig. 2 is a diagram showing an example of a state in which the left controller and the right controller are detached from the main body device, respectively.
Fig. 3 is a six-sided view showing an example of the main body device.
Fig. 4 is a six-sided view showing an example of the left controller.
Fig. 5 is a six-sided view showing an example of the right controller.
Fig. 6 is a block diagram showing an example of the internal structure of the main body device.
Fig. 7 is a block diagram showing an example of the internal configuration of the main body device, the left controller, and the right controller.
Fig. 8 is a diagram showing an outline of a game example in the present embodiment.
Fig. 9 is a diagram showing the relationship between the field-corresponding plane and the determination value in the case where 1 reference point is released.
Fig. 10 is a view showing an example of a map image displayed when the circular area shown in fig. 9 is a release area.
Fig. 11 is a diagram showing the relationship between the field-corresponding plane and the determination value in the case where 2 reference points are released.
Fig. 12 is a view showing an example of a map image displayed when the area shown in fig. 11 is a release area.
Fig. 13 is a view showing an example of the field-corresponding plane on which a release area is set in a case where 2 reference points have been released and 1 reference point has not been released.
Fig. 14 is a diagram showing an example of a map image generation method according to the present embodiment.
Fig. 15 is a diagram showing an example of a game image including a field image showing a field including a player character.
Fig. 16 is a diagram showing an example of a game image in the case where a player character is located near a reference point.
Fig. 17 is a diagram showing an example of a game image of a field after a reference point is released.
Fig. 18 is a view of the field in the case where 1 reference point is released, as viewed from above.
Fig. 19 is a view of the field in the case where 2 reference points are released, as viewed from above.
Fig. 20 is a diagram showing an example of a game image of a field in which a light source item is arranged.
Fig. 21 is a diagram showing an example of a game image of a field in the case where a light source item is arranged within an irradiation range obtained by a release event.
Fig. 22 is a diagram showing an example of a method of generating a field image written into a frame buffer.
Fig. 23 is a diagram showing an example of a storage area storing various data used for information processing in the game system 1.
Fig. 24 is a flowchart showing an example of the flow of game processing executed by the game system 1.
Fig. 25 is a sub-flowchart showing an example of the detailed flow of the player related control process of step S8 shown in fig. 24.
Fig. 26 is a sub-flowchart showing an example of a detailed flow of the other object control process of step S9 shown in fig. 24.
Fig. 27 is a sub-flowchart showing an example of the detailed flow of the drawing process of step S10 shown in fig. 24.
Fig. 28 is a sub-flowchart showing an example of a detailed flow of the drawing process in another embodiment.
Detailed Description
[1. Structure of Game System ]
Next, a game system according to an example of the present embodiment will be described. An example of the game system 1 in the present embodiment includes a main body device (an information processing device, which functions as a game device main body in the present embodiment) 2, and a left controller 3 and a right controller 4. The left controller 3 and the right controller 4 are detachable from the main body device 2. That is, the game system 1 can be used as an integrated device in which the left controller 3 and the right controller 4 are mounted on the main body device 2. The game system 1 can also independently use the main body device 2, the left controller 3, and the right controller 4 (see fig. 2). Next, the hardware configuration of the game system 1 of the present embodiment will be described, and then control of the game system 1 of the present embodiment will be described.
Fig. 1 is a diagram showing an example of a state in which the left controller 3 and the right controller 4 are attached to the main body device 2. As shown in fig. 1, the left controller 3 and the right controller 4 are integrally attached to the main body device 2. The main body apparatus 2 is an apparatus that performs various processes (for example, game processes) in the game system 1. The main body device 2 includes a display 12. The left controller 3 and the right controller 4 are devices provided with an operation unit for user input.
Fig. 2 is a diagram showing an example of a state in which the left controller 3 and the right controller 4 are detached from the main body device 2. As shown in fig. 1 and 2, the left controller 3 and the right controller 4 are detachable from the main body device 2. In the following, the left controller 3 and the right controller 4 are collectively referred to as "controllers" in some cases.
Fig. 3 is a six-sided view showing an example of the main body device 2. As shown in fig. 3, the main body device 2 includes a substantially plate-shaped housing 11. In the present embodiment, the main surface of the housing 11 (in other words, the surface on the front side, i.e., the surface on which the display 12 is provided) has a substantially rectangular shape.
Further, the shape and size of the housing 11 are arbitrary. As an example, the housing 11 may be of a size that can be carried. The main body device 2 alone, or the integrated device in which the left controller 3 and the right controller 4 are mounted on the main body device 2, may be a portable device. The main body device 2 or the integrated device may be a hand-held device. The main body device 2 or the integrated device may also be a transportable device.
As shown in fig. 3, the main body device 2 includes a display 12 provided on a main surface of the housing 11. The display 12 is for displaying an image generated by the main body device 2. In the present embodiment, the display 12 is a Liquid Crystal Display (LCD). The display 12 may be any type of display device.
The main body device 2 further includes a touch panel 13 on the screen of the display 12. In the present embodiment, the touch panel 13 is a touch panel of a type (for example, a capacitance type) capable of performing multi-touch input. However, the touch panel 13 may be any type of touch panel, and may be a touch panel of a type (for example, a resistive film type) capable of single-touch input.
The main body device 2 includes a speaker (i.e., a speaker 88 shown in fig. 6) inside the housing 11. As shown in fig. 3, speaker holes 11a and 11b are formed in the main surface of the housing 11. The output sound of the speaker 88 is output from the speaker holes 11a and 11b, respectively.
The main body device 2 includes a left terminal 17 as a terminal for wired communication between the main body device 2 and the left controller 3, and a right terminal 21 for wired communication between the main body device 2 and the right controller 4.
As shown in fig. 3, the main body device 2 includes a slot 23. The slot 23 is provided on the upper side of the housing 11. The slot 23 has a shape into which a predetermined type of storage medium can be inserted. The predetermined type of storage medium is, for example, a storage medium (for example, a dedicated memory card) dedicated to the game system 1 and to information processing apparatuses of the same type as the game system 1. The predetermined type of storage medium is used to store, for example, data (for example, saved data of an application) used in the main body device 2 and/or a program (for example, a program of an application) executed in the main body device 2. The main body device 2 further includes a power button 28.
The main body device 2 includes a lower terminal 27. The lower terminal 27 is a terminal for communicating between the main body device 2 and the cradle. In the present embodiment, the lower terminal 27 is a USB connector (more specifically, a female-side connector). When the integrated device or the main body device 2 is mounted on a cradle alone, the game system 1 can display an image generated and output by the main body device 2 on a fixed monitor. In the present embodiment, the cradle has a function of charging the mounted integrated device or the main body device 2 alone. In addition, the cradle has the function of a hub device (specifically, a USB hub).
Fig. 4 is a six-sided view showing an example of the left controller 3. As shown in fig. 4, the left controller 3 includes a housing 31. In the present embodiment, the housing 31 has a vertically long shape, i.e., a shape long in the up-down direction (i.e., the y-axis direction shown in fig. 1 and 4). The left controller 3 can be held in the vertical direction even in a state separated from the main body device 2. The housing 31 is shaped and sized to be gripped by a single hand, particularly by the left hand, when held in the longitudinal direction. The left controller 3 can also be held laterally. In the case of holding the left controller 3 in the lateral direction, both hands may be used for holding.
The left controller 3 is provided with an analog stick 32. As shown in fig. 4, the analog stick 32 is disposed on the main surface of the housing 31. The analog stick 32 can be used as a direction input section capable of inputting a direction. The user can input a direction corresponding to the tilting direction (and input a magnitude corresponding to the tilting angle) by tilting the analog stick 32. The left controller 3 may include a cross key, a slide stick capable of slide input, or the like instead of the analog stick as the direction input unit. In the present embodiment, input by pressing the analog stick 32 can also be performed.
The left controller 3 is provided with various operation buttons. The left controller 3 includes 4 operation buttons 33 to 36 (specifically, a right direction button 33, a down direction button 34, an up direction button 35, and a left direction button 36) on the main surface of the housing 31. The left controller 3 also includes a video recording button 37 and a - (minus) button 47. The left controller 3 includes a first L button 38 and a ZL button 39 at the upper left of the side surface of the housing 31. The left controller 3 includes a second L button 43 and a second R button 44 on the side surface of the housing 31 on the side attached to the main body device 2. These operation buttons are used to give instructions corresponding to various programs (e.g., OS programs, application programs) executed by the main body device 2.
The left controller 3 further includes a terminal 42 for wired communication between the left controller 3 and the main body device 2.
Fig. 5 is a six-sided view showing an example of the right controller 4. As shown in fig. 5, the right controller 4 includes a housing 51. In the present embodiment, the housing 51 has a longitudinal shape, that is, a shape long in the vertical direction. The right controller 4 can be held in the vertical direction even in a state separated from the main body device 2. The housing 51 is shaped and sized to be held in one hand, particularly in the right hand, in the case of a longitudinal grip. The right controller 4 can be held laterally. In the case of holding the right controller 4 laterally, both hands may be used for holding.
The right controller 4 includes an analog joystick 52 as a direction input unit, similar to the left controller 3. In this embodiment, the analog stick 52 has the same structure as the analog stick 32 of the left controller 3. The right controller 4 may be provided with a cross key, a slide joystick capable of performing slide input, or the like instead of the analog joystick. The right controller 4 includes 4 operation buttons 53 to 56 (specifically, an a button 53, a B button 54, an X button 55, and a Y button 56) on the main surface of the housing 51, similarly to the left controller 3. The right controller 4 includes a + (plus) button 57 and a Home button 58. The right controller 4 further includes a first R button 60 and a ZR button 61 at the upper right side of the side surface of the housing 51. The right controller 4 includes a second L button 65 and a second R button 66, similarly to the left controller 3.
The right controller 4 further includes a terminal 64 for wired communication between the right controller 4 and the main body device 2.
Fig. 6 is a block diagram showing an example of the internal structure of the main body device 2. The main body device 2 includes the respective constituent elements 81 to 85, 87, 88, 91, 97, and 98 shown in fig. 6 in addition to the configuration shown in fig. 3. Several of these components 81 to 85, 87, 88, 91, 97, and 98 may be mounted on an electronic circuit board as electronic components and housed in the case 11.
The main body device 2 includes a processor 81. The processor 81 is an information processing unit that executes the various types of information processing executed by the main body device 2, and may be composed of, for example, only a CPU (Central Processing Unit), or of an SoC (System-on-a-Chip) incorporating a plurality of functions such as a CPU function and a GPU (Graphics Processing Unit) function. The processor 81 executes the various types of information processing by executing an information processing program (for example, a game program) stored in a storage unit (specifically, an internal storage medium such as the flash memory 84, or an external storage medium inserted into the slot 23).
As examples of internal storage media built into the main body device 2, the main body device 2 includes a flash memory 84 and a DRAM (Dynamic Random Access Memory) 85. The flash memory 84 and the DRAM 85 are connected to the processor 81. The flash memory 84 is a memory mainly used for storing various data (or programs) stored in the main body device 2. The DRAM 85 is a memory used for temporarily storing various data used in information processing.
The main body device 2 includes a slot interface (hereinafter abbreviated as "I/F") 91. The slot I/F 91 is connected to the processor 81. The slot I/F 91 is connected to the slot 23, and reads and writes data from and to a predetermined type of storage medium (for example, a dedicated memory card) inserted into the slot 23 in accordance with instructions from the processor 81.
The processor 81 performs the above-described information processing by appropriately reading or writing data from or to the flash memory 84, the DRAM 85, and the respective storage media.
The main body device 2 includes a network communication unit 82. The network communication unit 82 is connected to the processor 81. The network communication unit 82 communicates (specifically, by wireless communication) with external devices via a network. In the present embodiment, as a first communication method, the network communication unit 82 communicates with external devices by connecting to a wireless LAN in accordance with the Wi-Fi standard. As a second communication method, the network communication unit 82 performs wireless communication with other main body devices 2 of the same type by a predetermined communication method (for example, communication based on a proprietary protocol, or infrared communication). The wireless communication of the second communication method enables wireless communication with another main body device 2 placed in a closed local network area, realizing a function called "local communication" in which a plurality of main body devices 2 communicate directly with each other to transmit and receive data.
The main body device 2 includes a controller communication unit 83. The controller communication unit 83 is connected to the processor 81. The controller communication unit 83 performs wireless communication with the left controller 3 and/or the right controller 4. The communication system between the main body device 2 and the left and right controllers 3 and 4 is arbitrary, and in the present embodiment, the controller communication unit 83 performs communication conforming to the standard of Bluetooth (registered trademark) between the left and right controllers 3 and 4.
The processor 81 is connected to the left terminal 17, the right terminal 21, and the lower terminal 27. The processor 81 transmits data to the left controller 3 via the left terminal 17 and receives operation data from the left controller 3 via the left terminal 17 in the case of wired communication with the left controller 3. In addition, the processor 81 transmits data to the right controller 4 via the right terminal 21 and receives operation data from the right controller 4 via the right terminal 21 in the case of wired communication with the right controller 4. In addition, when communicating with the cradle, the processor 81 transmits data to the cradle via the lower terminal 27. As described above, in the present embodiment, the main body device 2 can perform both wired communication and wireless communication with the left controller 3 and the right controller 4, respectively. In addition, when the left controller 3 and the right controller 4 are mounted to the main body device 2 and the main body device 2 is mounted alone to the cradle, the main body device 2 can output data (for example, image data and audio data) to a stationary monitor or the like via the cradle.
Here, the main body device 2 can communicate with a plurality of left controllers 3 at the same time (in other words, in parallel). In addition, the main body device 2 can communicate with a plurality of right controllers 4 at the same time (in other words, in parallel). Thus, a plurality of users can input to the main body device 2 simultaneously, each using a pair of the left controller 3 and the right controller 4. As an example, a first user may input to the main body device 2 using a first pair of the left controller 3 and the right controller 4, while a second user inputs to the main body device 2 using a second pair of the left controller 3 and the right controller 4.
In addition, the display 12 is connected to the processor 81. The processor 81 displays an image generated (for example, by performing the above-described information processing) and/or an image acquired from the outside on the display 12.
The main body device 2 includes a codec circuit 87 and speakers (specifically, left and right speakers) 88. The codec circuit 87 is connected to the speaker 88 and the audio input/output terminal 25, and is connected to the processor 81. The codec circuit 87 is a circuit for controlling input and output of sound data to and from the speaker 88 and the sound input/output terminal 25.
The main body device 2 includes a power control unit 97 and a battery 98. The power control unit 97 is connected to the battery 98 and the processor 81. Although not shown, the power control unit 97 is connected to each unit of the main body device 2 (specifically, each unit that receives the power of the battery 98, the left terminal 17, and the right terminal 21). The power control unit 97 controls the supply of power from the battery 98 to the respective units based on instructions from the processor 81.
The battery 98 is connected to the lower terminal 27. When an external charging device (for example, a cradle) is connected to the lower terminal 27 and electric power is supplied to the main body device 2 via the lower terminal 27, the supplied electric power is charged into the battery 98.
Fig. 7 is a block diagram showing an example of the internal configuration of the main body device 2, the left controller 3, and the right controller 4. Further, details concerning the internal structure related to the main body device 2 are shown in fig. 6, and thus omitted in fig. 7.
The left controller 3 includes a communication control unit 101 that communicates with the main body device 2. As shown in fig. 7, the communication control unit 101 is connected to each component including the terminal 42. In the present embodiment, the communication control unit 101 can communicate with the main body device 2 by both wired communication via the terminal 42 and wireless communication not via the terminal 42. The communication control unit 101 controls a communication method performed by the left controller 3 on the main body device 2. That is, when the left controller 3 is mounted on the main body device 2, the communication control unit 101 communicates with the main body device 2 via the terminal 42. When the left controller 3 is separated from the main body device 2, the communication control unit 101 performs wireless communication with the main body device 2 (specifically, the controller communication unit 83). For example, wireless communication between the controller communication unit 83 and the communication control unit 101 is performed in compliance with a standard of Bluetooth (registered trademark).
The left controller 3 includes a memory 102 such as a flash memory. The communication control unit 101 is configured by, for example, a microcomputer (also referred to as a microprocessor), and executes various processes by executing firmware stored in the memory 102.
The left controller 3 includes buttons 103 (specifically, buttons 33 to 39, 43, 44, and 47). The left controller 3 includes an analog joystick (referred to as a "joystick" in fig. 7) 32. Each of the buttons 103 and the analog stick 32 repeatedly outputs information on the operation performed on itself to the communication control unit 101 at an appropriate timing.
The communication control unit 101 acquires information on input (specifically, information on operations or sensor detection results) from each input unit (specifically, each button 103 and the analog stick 32). The communication control unit 101 transmits operation data including the acquired information (or information obtained by performing predetermined processing on the acquired information) to the main body device 2. The operation data is transmitted repeatedly, once per predetermined time. The interval at which the information on input is transmitted to the main body device 2 may be the same or different for each input unit.
By transmitting the operation data to the main body apparatus 2, the main body apparatus 2 can recognize the input to the left controller 3. That is, the main body device 2 can determine the operation of each button 103 and the analog stick 32 based on the operation data.
The left controller 3 includes a power supply unit 108. In the present embodiment, the power supply unit 108 includes a battery and a power control circuit. Although not shown, the power control circuit is connected to the battery and to each part of the left controller 3 (specifically, each part that receives the supply of the power from the battery).
As shown in fig. 7, the right controller 4 includes a communication control unit 111 that communicates with the main body device 2. The right controller 4 further includes a memory 112 connected to the communication control unit 111. The communication control unit 111 is connected to each component including the terminal 64. The communication control section 111 and the memory 112 have the same functions as the communication control section 101 and the memory 102 of the left controller 3. Thus, the communication control unit 111 can communicate with the main body device 2 by both wired communication via the terminal 64 and wireless communication (specifically, communication conforming to the standard of Bluetooth (registered trademark)) not via the terminal 64, and can control the communication method performed by the right controller 4 on the main body device 2.
The right controller 4 includes the same input units as those of the left controller 3. Specifically, each button 113 and analog joystick 52 are provided. These input units have the same functions as those of the input units of the left controller 3, and operate in the same manner.
The right controller 4 includes a power supply unit 118. The power supply unit 118 has the same function as the power supply unit 108 of the left controller 3, and operates in the same manner.
[2. Overview of processing in gaming System ]
An outline of the processing performed in the game system 1 will be described with reference to fig. 8 to 22. In the present embodiment, the game system 1 executes a game in which a player character operable by the player (i.e., the user of the game system 1) moves through a game field (hereinafter simply referred to as the "field"), which is a three-dimensional virtual space. The game system 1 can display a map image representing a map of the field in addition to a field image representing the field in which the player character is placed. In the present embodiment, the display can be switched from the field image to the map image in response to an instruction from the player, and the map image can also be displayed together with at least part of the field image.
Fig. 8 is a diagram showing an outline of a game example in the present embodiment. The left column of fig. 8 shows the situation in the field, and the right column shows an example of the displayed map image. Here, in the present embodiment, a plurality of reference points (for example, the reference point 202 shown in fig. 8) are set in the field. A reference point is released in accordance with a predetermined operation input (for example, an operation input for causing the player character to perform an action of investigating the reference point) performed while the player character 201 is located at or near the reference point. That is, the player character 201 can release a reference point by reaching it and performing the predetermined operation (for example, an operation of investigating the reference point). Hereinafter, the game event in which a reference point is released is referred to as a "release event". A reference point may be, for example, a point to which the player character 201 can be moved instantaneously from another reference point (so-called fast travel), a point at which the player character 201 can be restored, or a point at which the equipment, skills, and items held by the player character 201 can be changed.
In the present embodiment, before a reference point is released, the field is in a dark state (state a shown in fig. 8) except for a few exceptions (for example, the player character 201 itself or its surroundings, and the marker object 203 described later). In fig. 8, to make the drawing easier to understand, the darkened area is indicated by diagonal lines, but the game system 1 displays the darkened area so that it is invisible or difficult for the player to see (see fig. 15 and the like described later). In the state a shown in fig. 8, the field is dark except for the surroundings of the player character 201 and the marker object 203 indicating the reference point 202, and it can therefore be said that the field is difficult to explore.
Before the reference point is released, the map image is displayed so as not to show site information (state a shown in fig. 8). Site information is information about the field, for example, information about the terrain of the field (specifically, the shape of the terrain, etc.), information about objects set in the field, information about items placed in the field, or information about characters present in the field. In the state a shown in fig. 8, the map image is displayed so as to show only the mark 204 indicating the position and orientation of the player character 201, and not to show any other site information. In this way, the map image before the reference point is released may be displayed so as not to show at least part of the site information, while another part of the site information (for example, the mark 204 shown in fig. 8) may be shown before the release.
On the other hand, when a reference point (in the example shown in fig. 8, the reference point 202) is released by the player character 201, the area around the reference point in the field ceases to be dark and becomes illuminated (state b shown in fig. 8). The illuminated range in the field is hereinafter referred to as the "irradiation range". As will be described in detail later, the game system 1 displays the irradiation range so that the player can see it (see fig. 17 and the like described later).
When the reference point is released, the surroundings of the reference point in the map image are displayed so as to show the location information (state b shown in fig. 8). In the example shown in fig. 8, a line indicating the shape of the terrain and a mark 205 indicating the reference point are shown in addition to the mark 204 relating to the player character 201 for the surroundings of the released reference point.
As described above, when a release event occurs, the area around the released reference point in the field is displayed visibly, and the site information of the area around the reference point is shown in the map image. This makes it easier for the player to explore the surroundings of the released reference point with the player character 201. In the present embodiment, one of the aims of play is to release the reference points in the field, and the player advances through the game while enlarging the area that is easy to explore by releasing reference points.
[2-1. Setting of released area of map ]
An example of a method for setting the area in which site information is shown in the map image as a result of reference points being released (referred to as the "release area") will be described with reference to fig. 9 to 14. Fig. 9 is a diagram showing the relationship between the field-corresponding plane and the determination value in the case where 1 reference point is released. Here, the field-corresponding plane is a two-dimensional plane corresponding to the three-dimensional field. The field-corresponding plane can be regarded as a plane obtained by projecting the three-dimensional field in the vertical direction, and a two-dimensional position on the field-corresponding plane is a position represented by two-dimensional coordinates in the horizontal directions of the field (i.e., two-dimensional coordinates obtained by removing the height-direction coordinate from the three-dimensional coordinates representing a position in the field). In fig. 9, the upper part shows the field-corresponding plane, and the lower part shows a graph representing the change of the determination value on the field-corresponding plane. Specifically, the graph shows the change of the determination value on a straight line AB (the straight line indicated by the chain line in fig. 9) passing through the released reference point 211.
The determination value is a value used to determine the release area on the field-corresponding plane. In the present embodiment, the game system 1 calculates the determination value for each position on the field-corresponding plane in order to determine the release area. The determination value is calculated for each position at a predetermined unit interval on the field-corresponding plane (each such position is referred to as a "calculation target position"). Specifically, the determination using the determination value is performed at the position corresponding to each pixel of the map image.
In the present embodiment, the determination value at each calculation target position is calculated based on the reference value set for the reference point. That is, the game system 1 sets a reference value for the reference point, and calculates the determination value at each calculation target position based on the reference value. In the present embodiment, the reference value at the reference point is set for each reference point, and the reference value may be different for each reference point. For example, the reference value at each reference point may be set so that the entire floor becomes the release area when all the reference points are released.
In the present embodiment, the determination value at a certain position calculated based on 1 reference point (i.e., the determination value calculated based on the reference value set for that reference point) is calculated based on the distance from the reference point; more specifically, it is calculated so as to decrease as the distance from the reference point becomes longer (see fig. 9). For example, in the example shown in fig. 9, the reference value A1 is set for the released reference point 211, and the determination value at each calculation target position on the straight line AB equals the reference value A1 at the reference point 211 and is attenuated according to the distance from the reference point 211 to the position. The determination value becomes 0 at positions at or beyond a certain distance. The specific method of calculating the determination value at each calculation target position based on the reference value is arbitrary.
In the present embodiment, when only the reference point 211 is released as shown in fig. 9, the game system 1 sets the release area in consideration of only the determination value based on the reference point 211. Specifically, the game system 1 sets, as the release area, the area on the field-corresponding plane consisting of the positions at which the determination value based on the reference point 211 is equal to or greater than a predetermined threshold (the threshold th in fig. 9). As described above, the determination value decays according to the distance from the reference point 211, so in the example shown in fig. 9 the circular area 212 centered on the reference point 211 becomes the release area.
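With the linear decay assumed in the earlier sketch, the boundary of this circular release area falls where the determination value equals the threshold, which gives a closed-form radius; the concrete numbers below are illustrative only:

```python
# A1 * (1 - r / falloff) = th  =>  r = falloff * (1 - th / A1)
A1, falloff, th = 1.0, 100.0, 0.25
r = falloff * (1 - th / A1)
print(r)  # 75.0 -> a circular release area of radius 75 around the point
```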
Fig. 10 is a view showing an example of the map image displayed when the circular area 212 shown in fig. 9 is the release area. In the above case, as shown in fig. 10, the map image is displayed with the site information drawn for those pixels of the map image that correspond to positions on the field-corresponding plane where the determination value is equal to or greater than the threshold. As shown in fig. 9 and 10, the range in which site information is drawn in the map image corresponds to the circular area 212. In fig. 10, lines indicating the shape of the terrain are displayed as the site information for the range corresponding to the area 212. In the present embodiment, the mark 205 indicating the reference point is displayed once the player character 201 has reached the reference point, regardless of whether or not the position of the reference point is within the release area. In the present embodiment, the map image may be displayed on the entire screen of the display 12 as shown in fig. 10, or may be displayed on a part of the screen of the display 12 so as to be superimposed on the field image representing the field, as shown in fig. 15 described later.
As described above, in the present embodiment, the map image is an image that represents the site information two-dimensionally. The determination value is a value that decays according to the two-dimensional distance from the two-dimensional position corresponding to the reference point (i.e., the distance on the field-corresponding plane). Because the release area can thus be set on a two-dimensional plane, an area with high affinity to the two-dimensional map can be set with a small processing load. In addition, the release area can be set according to the distance from the reference point (for example, such that the range within a certain distance from the reference point becomes the release area). In other embodiments, the game system 1 may calculate the determination value for each position in the three-dimensional field and set the release area in the three-dimensional field. In that case, the game system 1 determines the range corresponding to the release area in the two-dimensional map image based on the release area in the three-dimensional field, and generates a map image showing the site information within that range. In still other embodiments, the map may be three-dimensional; a release area may be set in the three-dimensional map, and a map image representing the three-dimensional map may be generated and displayed.
When a plurality of reference points are released, the game system 1 calculates the determination values based on the released reference points, and calculates the total value of the determination values (referred to as "total determination value") for each calculation target position. The release area is set based on the total determination value. When only 1 reference point is released (see fig. 9), it can be said that the determination value based on the reference value set for the reference point is the total determination value.
Fig. 11 is a diagram showing the relationship between the field-corresponding plane and the determination value in the case where 2 reference points are released. In fig. 11, as in fig. 9, the upper part shows the field-corresponding plane, and the lower part shows a graph representing the changes of the determination values and the total determination value on a straight line (in fig. 11, the straight line CD) passing through the released reference points. The straight line CD shown in fig. 11 passes through the 2 released reference points 211 and 213.
In the case shown in fig. 11, the game system 1 calculates the total determination value for each calculation target position described above. The total determination value is the sum of the determination values based on the released reference points 211 and 213. That is, the game system 1 calculates, for each calculation target position, the determination value based on the reference point 211 and the determination value based on the reference point 213. The determination value based on the reference point 211 is calculated as a value that equals the reference value A1 at the reference point 211 and decays according to the distance from the reference point 211 to the position. The determination value based on the reference point 213 is calculated as a value that equals the reference value A2 at the reference point 213 and decays according to the distance from the reference point 213 to the position. The game system 1 calculates the total determination value for each calculation target position by adding the determination value based on the reference point 211 and the determination value based on the reference point 213. In the lower graph of fig. 11, the mountain-shaped curve on the right shown by a solid line represents the change of the determination value based on the reference point 211, the mountain-shaped curve on the left shown by a solid line represents the change of the determination value based on the reference point 213, and the curve shown by the thick broken line represents the change of the total determination value.
The game system 1 sets, as the release area, the area on the field-corresponding plane consisting of the positions at which the total determination value is equal to or greater than the threshold th described above. In the example shown in fig. 11, the area 216 (the area indicated by oblique lines in fig. 11), obtained by adding the area 215 connecting the two circular areas to the circular area 212 centered on the reference point 211 and the circular area 214 centered on the reference point 213, is the release area. That is, the release area in the case where a plurality of reference points are released can be generated by a two-dimensional metaball method.
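The connecting area 215 arises because the two decaying values overlap. Using the helpers sketched earlier (with illustrative numbers only), a position halfway between two released reference points can clear the threshold even though neither point would carry it alone:

```python
points = [
    {"pos": (0.0, 0.0),   "ref": 1.0, "falloff": 100.0, "released": True},
    {"pos": (120.0, 0.0), "ref": 1.0, "falloff": 100.0, "released": True},
]
mid = (60.0, 0.0)
# Each point contributes 1.0 * (1 - 60/100) = 0.4 at the midpoint.
print(total_determination_value(mid, points))   # 0.8
print(in_area(mid, points, threshold=0.6))      # True: the regions fuse
# Alone, either point would give only 0.4 < 0.6 at this position.
```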
Fig. 12 is a view showing an example of the map image displayed when the area 216 shown in fig. 11 is the release area. As shown in fig. 12, the map image is displayed with the site information drawn for those pixels of the map image whose corresponding positions on the field-corresponding plane have a total determination value equal to or greater than the threshold. The range in which site information is drawn corresponds to the area 216. In fig. 12, lines representing the shape of the terrain are displayed as the site information. In the example shown in fig. 12, marks 205 and 221 indicating the 2 reference points that the player character 201 has reached are displayed in addition to the mark 204 indicating the position and orientation of the player character 201. As described above, in the present embodiment, the total determination value is used to set the release area, so when both the reference point 211 and the reference point 213 are released, an area that would not become the release area if only the reference point 211 or only the reference point 213 were released (i.e., the area 215 shown in fig. 11) can also be set as the release area.
As described above, in the present embodiment, the reference value (i.e., the maximum value of the determination value) set for each of the plurality of reference points can be set to a different magnitude per reference point. Accordingly, when a reference point is released, an area of the size set for that reference point can be made the release area. For example, in the example shown in fig. 11, the size and shape of the release area obtained when the reference points 211 and 213 are released can be changed by changing the magnitude of the reference value A1 set for the reference point 211 or the reference value A2 set for the reference point 213. For example, by changing the reference value A1 and/or the reference value A2 to a smaller value, the release area obtained when the reference points 211 and 213 are released can be made 2 circular areas that are not connected to each other. In other embodiments, the reference values set for the respective reference points may be set to the same value. Even when the reference values of the plurality of reference points are the same, the extent of the release area may be made different for each reference point by setting the calculation method of the determination value so that the degree of attenuation with distance differs for each reference point.
As described above, in the present embodiment, the game system 1 specifies, as the release area, the area formed by the positions at which the total of the determination values based on the one or more reference points in the released state among the plurality of reference points is equal to or greater than a predetermined value (i.e., the threshold th). Accordingly, the shape and size of the release area can take various forms according to the release states of the plurality of reference points.
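Since the determination is made at the position corresponding to each pixel of the map image, the release area can be realized as a per-pixel mask. A minimal sketch reusing the earlier helpers, assuming a simple linear mapping from map pixels to positions on the field-corresponding plane:

```python
def release_mask(map_w, map_h, points, threshold, units_per_pixel):
    # True where the map image draws site information, False where the
    # map stays blank; pixel (px, py) is mapped to a two-dimensional
    # position on the field-corresponding plane (assumed mapping).
    return [[total_determination_value(
                 (px * units_per_pixel, py * units_per_pixel),
                 points) >= threshold
             for px in range(map_w)]
            for py in range(map_h)]
```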
In the above embodiment, the game system 1 calculates the total determination value based on the reference values set for the released reference points, but it may calculate the total determination value based additionally on the reference values set for unreleased reference points. Next, an example of calculating the total determination value also based on the reference values set for unreleased reference points will be described with reference to fig. 13.
Fig. 13 is a view showing an example of the field-corresponding plane on which the release area is set in a case where 2 reference points have been released and 1 reference point has not been released. In fig. 13, 3 reference points 231 to 233 are arranged in the field. The reference points 231 and 232 have been released, and the reference point 233, located between the reference point 231 and the reference point 232, has not been released.
In the example shown in fig. 13, the reference value is set for each of the released reference points 231 and 232 in the same manner as in the above embodiment. Here, in the present modification, a reference value different from the reference value in the case where the reference point 233 is released is set for the reference point 233 that is not released. Hereinafter, the reference value set for the released reference point is referred to as a "first reference value", and the reference value set for the unreleased reference point is referred to as a "second reference value". That is, in fig. 13, when the reference point 233 is released, the first reference value is set in the same manner as the reference points 231 and 232.
In the example shown in fig. 13, as in the examples shown in fig. 9 and 11, the game system 1 calculates a determination value (referred to as a "first determination value") based on the first reference value for each calculation target position. In the example shown in fig. 13, a first determination value based on a first reference value set for the reference point 231 and a first determination value based on a first reference value set for the reference point 232 are calculated for each calculation target position. In the example shown in fig. 13, a determination value based on the second reference value (referred to as "second determination value") is calculated for each calculation target position. In the example shown in fig. 13, a second determination value based on a second reference value set for the reference point 233 is calculated for each calculation target position. The second determination value is a value that is a second reference value at the reference point, decreases as the distance from the reference point becomes longer, and becomes 0 at a position where the distance is a certain length or longer.
In fig. 13, the dotted circles centered on the reference points 231 and 232 are lines connecting the positions where the first determination value based on the respective first reference values is a predetermined value. The broken lines 234 and 235 are lines connecting the positions where the first determination value equals the threshold value. In fig. 13, the dot-dash circle centered on the reference point 233 is a line connecting the positions where the second determination value based on the second reference value is a predetermined value.
In the example shown in fig. 13, the game system 1 calculates the total determination value based on the second determination value in addition to the first determination value. Specifically, the total determination value at the calculation target position is obtained by subtracting the total of the second determination values at the calculation target position from the total of the first determination values at the calculation target position. In the example shown in fig. 13, the game system 1 calculates the total determination value by subtracting the second determination value based on the second reference value of the reference point 233 from the total value of the first determination value based on the first reference value of the reference point 231 and the first determination value based on the first reference value of the reference point 232. The calculation method described above regarding the total determination value is synonymous with calculating the total of the first determination value and the second determination value by setting the second reference value to a negative value (as a result, the second determination value is a negative value). The absolute value of the first reference value may be the same as or different from the absolute value of the second reference value. The calculation method for calculating the first determination value based on the first reference value may be the same as or different from the calculation method for calculating the second determination value based on the second reference value.
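Reusing the hypothetical determination_value helper from the sketch above, the modified total could be computed as follows (again a sketch under the same assumptions, not the embodiment's exact arithmetic):

```python
def total_determination_value(pos, released, unreleased):
    # Total of first determination values (released reference points)
    # minus the total of second determination values (unreleased ones);
    # equivalently, second reference values treated as negative values.
    first = sum(determination_value(pos, p, v, f)
                for (p, v, f) in released)
    second = sum(determination_value(pos, p, v, f)
                 for (p, v, f) in unreleased)
    return first - second
```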
In the example shown in fig. 13, since the second determination value is subtracted from the total of the first determination values, the total determination value at a position affected by the second determination value (i.e., a position where the second determination value is a positive value) is smaller than the total determination value at that position in the case where the second determination value is not considered (for example, in the case of the above-described embodiment). Thus, in the example shown in fig. 13, a part of the area that would become the release area if the second determination value were not taken into consideration is not set as the release area. In the example shown in fig. 13, the region 236 shown by oblique lines becomes the release area, whereas the region 237 (shown by solid lines), which would become part of the release area if the second determination value were not taken into consideration, does not. In this way, in the example shown in fig. 13, the absolute value of the second determination value is large at positions in the vicinity of the unreleased reference point, and such positions are therefore unlikely to be included in the release area.
In the example shown in fig. 13, the total determination value is obtained by subtracting the total of the one or more second determination values for the one or more reference points in the unreleased state from the total of the one or more first determination values for the one or more reference points in the released state among the plurality of reference points. Accordingly, an unreleased reference point and the positions in its vicinity can be made unlikely to be included in the release area.
Here, if the total determination value were calculated without reflecting the second determination value based on the unreleased reference point, then, for example in the case shown in fig. 13, even positions in the vicinity of the unreleased reference point 233 would become part of the release area. As a result, the map would be released up to positions in the vicinity of the unreleased reference point 233 (that is, the field information would be displayed in the map image), and the irradiation range might be set as described above. In that case, there is a concern that the player's motivation to release the unreleased reference point 233 becomes weaker, and that the gameplay of releasing reference points to expand the search range is impaired. In contrast, according to the above-described modification, since an unreleased reference point and its vicinity are unlikely to become part of the release area, the possibility of weakening the motivation for releasing the unreleased reference point can be reduced, and the gameplay can be improved.
In the present embodiment, the game system 1 generates a map mask as data representing the above-described release area. That is, the map mask is two-dimensional data representing an area in the field that becomes a release area. Further, the game system 1 generates a map image showing the field information of the release area using the map mask.
Fig. 14 is a diagram showing an example of a map image generation method according to the present embodiment. In the present embodiment, the game system 1 generates the map image to be displayed based on an original map image and the map mask. The original map image is the image that serves as the base for generating the map image to be displayed, and is a map image containing the field information. The original map image can be said to be the map image for the case where the entire field is a release area.
In the present embodiment, the map mask data is data representing a map mask value for each two-dimensional position. The map mask value indicates the degree to which the original map image is reflected in order to generate the map image. For example, the map mask value is a value having a maximum value of 1 and a minimum value of 0. In that case, a map image is generated as follows: if the map mask value is 1, the original map image is directly reflected in the map image, and if the map mask value is 0, the original map image is not reflected. In the present embodiment, the map mask value is a multi-valued value in the range of 0 to 1. As will be described in detail later, because the map mask value is multi-valued, the map image in the vicinity of the boundary of the release area can be displayed in a blurred manner. In other embodiments, the map mask value may be binary, taking only the values 0 and 1.
The map mask value is set for each of the calculation target positions based on the total determination value. Specifically, when the total determination value for a certain position is greater than a first value, the map mask value for that position is set to 1, and when the total determination value for the position is less than a second value, the map mask value for that position is set to 0. The second value is a value smaller than the first value and larger than the threshold value th. When the total determination value for a position is equal to or greater than the second value and equal to or less than the first value, the map mask value for that position is set to a value, greater than 0 and less than 1, whose size corresponds to the size of the total determination value. According to the above, the map mask value of a position within a certain range from the reference point is set to 1, the map mask value of a position outside that range is set to a value that becomes smaller according to the distance from the reference point, and the map mask value of a position where the total determination value is smaller than the threshold value th (i.e., a position outside the release area) is set to 0. In the map mask shown in fig. 14, a position where the map mask value is 1 is indicated in white, a position where the map mask value is 0 is indicated in black, and a position where the map mask value is an intermediate value (i.e., a value greater than 0 and less than 1) is indicated in a gray that becomes closer to white as the value becomes larger.
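A minimal sketch of this piecewise mapping (the linear ramp between the two values is an assumption; the embodiment only requires an intermediate value that grows with the total determination value):

```python
def map_mask_value(total, first_value, second_value):
    # 1 above first_value, 0 below second_value, ramp in between.
    if total > first_value:
        return 1.0
    if total < second_value:
        return 0.0
    return (total - second_value) / (first_value - second_value)
```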
As an example of a calculation method based on the total determination value, the map mask value Mp may be calculated by the following expression.
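The expression itself does not survive legibly in this text; the following LaTeX reconstruction is an assumption pieced together from the definitions in the next paragraph (in particular, the attenuation functions and the placement of K and thresh are guesses, not the original formula):

```latex
M_p = \operatorname{clamp}\!\Bigl(
        K\Bigl(\max\bigl(0,\ \min(\mathrm{Over},\, S_1(p)) - S_2(p)^{2}\bigr)
        - \mathrm{thresh}\Bigr),\ 0,\ 1\Bigr),
\qquad
S_1(p) = \sum_{i=1}^{n} O_i\, f\!\bigl(A_i,\, l(i,p)\bigr),
\quad
S_2(p) = \sum_{i=1}^{n} C_i\, f\!\bigl(B_i,\, l(i,p)\bigr)
```

Here f denotes an assumed attenuation, for example f(A, d) = max(0, 1 - A·d), producing the first and second determination values (each reference value being 1), and clamp bounds the result to the range 0 to 1.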
In the above expression, K, thresh, and Over are constants, and thresh is the threshold value th described above. Oi is a variable that is 1 if the i-th reference point (i is a natural number from 1 to n, where n represents the number of reference points) has been released, and 0 if the i-th reference point has not been released. Ci is a variable that is 0 if the i-th reference point has been released and 1 if it has not been released. The constant Ai is a constant indicating the degree to which the first determination value, which equals the first reference value at the i-th reference point, decays according to the distance. In the example of the above expression, the first reference value is 1. The constant Bi is a constant indicating the degree to which the second determination value, which equals the second reference value at the i-th reference point, decays according to the distance. In the example of the above expression, the second reference value is 1. The variable l(i, p) is the length from the i-th reference point to the position p (specifically, the calculation target position described above), measured on the field-corresponding plane. In the above expression, in order to avoid the situation where the total of the first determination values becomes so large that subtracting the total of the second determination values has no effect no matter how close the position is to an unreleased reference point, when the total of the first determination values is larger than the constant Over, the total is replaced with the constant Over. On the other hand, the value obtained by squaring the total of the second determination values is subtracted so that the influence of the total of the second determination values does not become excessive. If the result of the subtraction is negative, the value is set to 0.
In the above expression, each reference value is set to 1, but the first reference value and the second reference value may be set independently for each reference point as described above. For example, by using the expression in which "Oi" is replaced with "Oi×Ai" and the constant Ai is deleted, and "Ci" is replaced with "Ci×Bi" and the constant Bi is deleted, the map mask value Mp can be calculated in the case where the first reference value and the second reference value are set for each reference point. In that variant, the variable Ai is the first reference value at the i-th reference point, and the variable Bi is the second reference value at the i-th reference point. In other embodiments, an expression in which "Oi" is replaced with "Oi×Ai" and "Ci" is replaced with "Ci×Bi" while retaining the constants Ai and Bi of the above expression may also be used. When either one of the first reference value and the second reference value is set to a fixed value (=1), the map mask value Mp can be calculated by using an expression in which only the corresponding one of "Oi" and "Ci" is replaced. The calculation formula for the map mask value is not limited to the above expression. In other embodiments, any calculation formula may be used in which the determination value at a certain position is attenuated according to the distance from the reference position to that position (for example, a determination value inversely proportional to the square of the distance).
The game system 1 refers to the map mask and synthesizes the original map image with an image representing the unreleased state at a ratio corresponding to the map mask value of each pixel, thereby generating the map image. Specifically, the game system 1 generates the map image in the following manner: for a pixel with a map mask value of 1, the original map image is directly reflected; for a pixel with a map mask value of 0, the original map image is not reflected; and for a pixel whose map mask value is an intermediate value, the original map image is reflected at a ratio corresponding to the map mask value. The image representing the unreleased state may be a single color or may be formed with a predetermined pattern or the like. A grid or the like may further be synthesized onto the resulting map image to make coordinates easier to read. As a result, the vicinity of the boundary of the release area in the map image (specifically, the positions where the map mask value is an intermediate value) is displayed faintly (see fig. 14). In fig. 14, the portion of the map image that is displayed faintly is shown by a broken line.
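A per-pixel sketch of this synthesis (hypothetical names; RGB tuples and a simple linear blend are assumptions):

```python
def compose_map_pixel(original, unreleased, mask_value):
    # Blend the original map image with the image representing the
    # unreleased state at a ratio given by the map mask value.
    return tuple(mask_value * o + (1.0 - mask_value) * u
                 for o, u in zip(original, unreleased))

# mask 1.0 -> original map pixel; 0.0 -> unreleased color; else a blend.
print(compose_map_pixel((200, 180, 140), (20, 20, 20), 0.5))
```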
As described above, in the present embodiment, when a release event occurs, the game system 1 generates two-dimensional mask data (i.e., map mask) indicating the range of the field that becomes the release area. In addition, the game system 1 generates a map image showing the field information of the portion corresponding to the release area by applying mask data to the original map image containing the field information. Accordingly, a map image showing a portion of the release area can be easily generated. In other embodiments, the specific method of generating the map image is arbitrary, and is not limited to a method using mask data.
In the present embodiment, the mask data is data indicating, for each position within the field, a multi-valued value corresponding to the magnitude of the total determination value at that position. In addition, the game system 1 generates the map image by synthesizing, for each pixel, the original map image and the image representing the unreleased state at a ratio corresponding to the multi-valued value represented by the mask data. Accordingly, a map image in which the vicinity of the boundary of the release area is blurred can be generated. Thus, the released portion of the map can be made to look natural.
[2-2. Setting of irradiation range]
An example of a method of setting the irradiation range on the field will be described with reference to fig. 15 to 21. In the present embodiment, when a predetermined irradiation event occurs in the game, the range of the field corresponding to the irradiation event becomes an irradiation range. The release event is one kind of irradiation event. In the present embodiment, a character lighting event and a prop placement event may occur as irradiation events in addition to the release event described above. Besides the irradiation ranges set in accordance with the occurrence of an irradiation event, there may also be irradiation ranges set in advance on the field, or the like.
The character lighting event is an event in which the surroundings of the player character become an irradiation range. In this embodiment, the character lighting event is an event in which the player character equips clothing that emits light. The character lighting event may also be, for example, an event in which the player character holds a light-emitting prop or rides a light-emitting vehicle.
The prop placement event is an event in which a prop for which a light source is set (referred to as a "light source prop") is placed on a terrain object such as the ground in the field, and the surroundings of the light source prop become an irradiation range.
Fig. 15 is a diagram showing an example of a game image including a field image representing a field including the player character. Fig. 15 shows a situation in which a character lighting event has occurred and no other irradiation event has occurred. In the present embodiment, the game system 1 causes a character lighting event to occur in accordance with an instruction from the player (for example, in accordance with an operation input for causing the player character to wear equipment that emits light). As shown in fig. 15, when a character lighting event occurs, a range around the player character 201 (referred to as the "character influence range") is set as the irradiation range. The character influence range is, for example, a range within a predetermined distance from the position of the player character 201. In the situation shown in fig. 15, the range outside the character influence range is not set as an irradiation range and is therefore displayed dark (i.e., displayed in a manner invisible or difficult to visually recognize), except for the marker object 203. Objects that are visually displayed even when located outside the irradiation range, such as the player character 201 itself and the marker object 203 (the out-of-target objects described later), will be described later.
In the present embodiment, the game system 1 sets ambient light on the field, and visually displays the irradiation range obtained from the character influence range by drawing it so as to reflect the ambient light. The ambient light is a light source whose brightness is set to a predetermined value regardless of the position in the field. As will be described in detail later, in the present embodiment, the game system 1 displays the range outside the irradiation range so as not to reflect any light source (e.g., the ambient light or a point light source), thereby making that range invisible or difficult to visually recognize.
In the present embodiment, the game system 1 displays the map image 241 in a partial region of the screen of the display 12 (here, a region at the lower right of the screen) while displaying the field image representing the field. In the situation shown in fig. 15, since no release event has occurred, a map image 241 that contains no field information other than a mark indicating the position and orientation of the player character 201 is displayed. In other embodiments, the map image need not be displayed while the field image is displayed.
As described above, in the present embodiment, one example of the irradiation event is an event in which the surroundings of the player character become an irradiation range based on an operation input by the player (i.e., the character lighting event). In this case, the game system 1 sets the position of the player character as the position of the reference point, and sets the irradiation range so as to include the range in which the distance from the reference point is equal to or less than a threshold value. Accordingly, the surroundings of the player character can be kept visually displayed, which reduces the possibility of a situation in which the surroundings of the player character cannot be seen at all and exploration of the field becomes difficult. In other embodiments, the game system 1 may always set the surroundings of the player character as an irradiation range regardless of whether a character lighting event has occurred. In still other embodiments, the game system 1 need not generate the character lighting event as an irradiation event.
Although the character lighting event has been described above as an event related to the player character, the game system 1 may generate a character lighting event for a character other than the player character (for example, a friend character or an enemy character of the player character) in addition to (or instead of) the player character, and set an irradiation range for that other character. For example, the irradiation range may be set based on the position of the other character in accordance with a character lighting event in which the other character becomes self-luminous.
Fig. 16 is a diagram showing an example of a game image in the case where the player character is located near a reference point. The situation shown in fig. 16 is one in which the player character 201 has moved to the vicinity of the unreleased reference point 211 from the situation shown in fig. 15. In the present embodiment, since the marker object 203 is visually displayed even when located outside the irradiation range, the player can move the player character 201 toward the reference point 211 outside the irradiation range with the marker object 203 as a target.
As shown in fig. 16, when the player character 201 is located near the reference point 211 (specifically, within a predetermined distance from the reference point 211), the player character 201 can perform an operation of releasing the reference point 211. That is, in the above case, the game system 1 accepts an operation input for releasing the reference point 211, and releases the reference point 211 in accordance with that operation input by the player. In the present embodiment, the operation input is an input for executing a "survey" command (i.e., a command for causing the player character 201 to perform an operation of surveying the vicinity), specifically, an input of pressing the A button 53 of the right controller 4. In the above case, in order to notify the player that this operation input is possible, the game system 1 displays a command image 242 indicating that the command is executable (see fig. 16). Further, a limited range such as the area immediately below the marker object 203 may be set as an irradiation range set in advance (that is, set irrespective of whether an irradiation event occurs), so that the above-described operation is easy to perform after the player character reaches the reference point 211.
When the operation input for releasing the reference point 211 is performed, the game system 1 sets the range around the reference point 211 as an irradiation range. At this time, in the present embodiment, the game system 1 displays an animation of an event scene representing the release event. For example, as the event scene, an animation showing the surroundings of the reference point 211 gradually becoming brighter is displayed.
Fig. 17 is a diagram showing an example of a game image of the field after a reference point is released. As shown in fig. 17, when a release event occurs in which the reference point 211 is released, the range around the reference point 211 is set as the irradiation range, and that range is visually displayed. In fig. 17, the portion of the field around the reference point 211, which is a hill, is included in the irradiation range and is displayed visibly, while the portion on the other side of the hill is outside the irradiation range and remains displayed dark. In addition, in the situation shown in fig. 17, the reference point 211 has been released, whereby the map around the reference point 211 is released (i.e., a release area including the reference point 211 is set), and thus the map image 241 includes the field information within the release area.
The marker object 203, which is visible even outside the irradiation range, is of course also visually displayed within the irradiation range. Here, in the present embodiment, the game system 1 also sets a point light source at a predetermined position in the field, for example at the position of the marker object 203, in response to the occurrence of the release event. In the drawing process, as will be described in detail later, the game system 1 draws the portion of the terrain object included in the irradiation range so as to reflect the point light source as well. Accordingly, the surroundings of the marker object 203 are rendered so as to reflect both the ambient light and the point light source, and are thus displayed brighter than the portions of the irradiation range rendered so as to reflect only the ambient light. That is, in addition to ensuring visibility in a predetermined range, the brightness produced by the point light source can be expressed. In fig. 17, the bright portion of the irradiation range that particularly reflects the light of the point light source set at the position of the marker object 203 is indicated as a white region, and the portions of the irradiation range with little influence from the point light source are indicated as shaded regions. The point light source makes it easy for the player to recognize that the release event has occurred.
The irradiation range set in response to the occurrence of the release event as the irradiation event is set based on the reference point corresponding to the release event. A method of setting the irradiation range according to the occurrence of the release event will be described with reference to fig. 18 and 19.
Fig. 18 is a view of the field, as viewed from above, in a case where 1 reference point has been released. The situation shown in fig. 18 is the situation after the reference point 211 shown in fig. 9 has been released. In the present embodiment, the game system 1 sets the irradiation range based on the release area based on the released reference point (the area 212 in fig. 18) and the point influence range corresponding to that reference point (the point influence range 251 in fig. 18). The "point influence range corresponding to a reference point" refers to a range predetermined for each reference point. In the present embodiment, a range within a predetermined distance from the reference point is set as the point influence range. The predetermined distance is set for each reference point, and may be a different value for each reference point or the same value for all reference points. For example, the point influence ranges of the respective reference points may be set so that a part of the field remains outside every point influence range even when all the reference points are released, or so that the entire field is covered by the point influence ranges when all the reference points are released. In the former case, a portion of the field remains outside the irradiation range even when all of the reference points have been released.
In the present embodiment, the game system 1 sets, as the irradiation range, the range of the field that is within the point influence range and within the release area. In the example shown in fig. 18, the release area 212 is located inside the point influence range 251, and thus the same range as the release area 212 becomes the irradiation range. In fig. 18, the region outside the irradiation range is indicated by hatching. In the present embodiment, the point influence range is set for each reference point as described above, independently of the release area corresponding to it. Therefore, the point influence range may be set to be larger than the release area corresponding to it (that is, the release area is included in the point influence range), smaller than that release area (that is, the point influence range is included in the release area), or the same as that release area. The "release area corresponding to a reference point" refers to the release area set when only that reference point is released.
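A sketch of this intersection rule (hypothetical names; in_release_area is assumed to be a test such as the one sketched earlier):

```python
import math

def in_point_influence_range(pos, ref_point, influence_radius):
    # Within the predetermined distance from the released reference point.
    return math.dist(pos, ref_point) <= influence_radius

def in_irradiation_range(pos, released_points, in_release_area):
    # Irradiated when inside the release area AND inside the point
    # influence range of at least one released reference point.
    return (in_release_area(pos)
            and any(in_point_influence_range(pos, p, r)
                    for (p, r) in released_points))
```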
The irradiation range may be set by any method as long as it includes at least a part of the release area. For example, in other embodiments, the game system 1 may set the release area itself as the irradiation range, or may set as the irradiation range a range located inside at least one of the release area and the point influence range.
Fig. 19 is a view of the field, as viewed from above, in a case where 2 reference points have been released. The situation shown in fig. 19 is one in which the reference points 211 and 213 shown in fig. 11 have been released.
When the 2 reference points 211 and 213 are released as shown in fig. 19, the release area 216 indicated by the broken line in fig. 19 is set, as in fig. 11. As described above, the game system 1 sets, as the irradiation range, the range that is within the release area and within a point influence range corresponding to a released reference point. Thus, in the example shown in fig. 19, the range that is inside at least one of the point influence range 251 corresponding to the reference point 211 and the point influence range 252 corresponding to the reference point 213, and that is inside the release area 216, is the irradiation range. In fig. 19, the area outside the irradiation range is indicated by hatching.
In the example shown in fig. 19, the point influence range 251 is set to be larger than the release area formed when only the reference point 211 is released (i.e., the area 212 shown in fig. 18), and the point influence range 252 is set to be larger than the release area formed when only the reference point 213 is released. Therefore, when the 2 reference points 211 and 213 are released, ranges that would not become the irradiation range if only one of the reference points 211 and 213 were released also become part of the irradiation range. For example, in the example shown in fig. 19, since the point influence range 251 and the point influence range 252 are set so as to partially overlap each other, when the 2 reference points 211 and 213 are released, 1 continuous irradiation range spanning the reference point 211 and the reference point 213 is set. According to the above, by releasing the 2 reference points 211 and 213, the player can easily explore the part of the field between those two reference points.
On the other hand, for 2 reference points other than the reference points 211 and 213, the point influence range corresponding to each reference point may be set to be the same as or smaller than the release area formed when only that reference point is released. In that case, unlike the example shown in fig. 19, even if those 2 reference points are released, one irradiation range continuous across the 2 reference points is not set; instead, 2 discontinuous irradiation ranges are set.
As described above, in the present embodiment, the point influence range is set independently of the release area, and the irradiation range is set based on the release area and the point influence range, whereby the size and shape of the irradiation range can be set freely. For example, when 2 reference points are released, one irradiation range continuous across the 2 reference points may be set, or 2 discontinuous irradiation ranges may be set.
Further, as will be described in detail later, in the present embodiment, the game system 1 uses the above-described ambient light to visually display the irradiation range set by the release event.
As described above, in the present embodiment, one example of the irradiation event is an event generated by performing a predetermined operation input when the player character is located at an event occurrence position (i.e., a reference point) set in the field (i.e., the release event). In this case, the game system 1 sets the irradiation range so that it includes a predetermined range including the event occurrence position (specifically, a range based on the release area of the reference point or the point influence range). Accordingly, a game can be provided in which the irradiation range is widened by the player character reaching the event occurrence position. The shape of the irradiation range may be obtained based on the distance from the event occurrence position as described above, or, in other embodiments, may be a predetermined shape including the event occurrence position.
In this embodiment, an irradiation range is also set in accordance with the prop placement event. Fig. 20 is a diagram showing an example of a game image of the field where a light source prop is placed. The situation shown in fig. 20 is one in which the light source prop 261 is placed at a position outside the irradiation range in the field. The light source prop 261 is an object for which a light source (specifically, a point light source) is set at the position of the prop. In the present embodiment, the player character 201 can place a predetermined light source prop on the field. The player character 201 places a light source prop on the ground, for example, by setting it down at the player character 201's feet, by throwing it, or by shooting it with a bow and arrow. In this embodiment, the player character 201 can carry light source props as items and place one on the ground at a timing desired by the player. In other embodiments, for example, a holder, a candle, or another prop may be used as the light source prop.
When the light source prop 261 is placed on the ground in the field, the game system 1 sets the range around the light source prop (referred to as the "prop influence range") as an irradiation range. The prop influence range is, for example, a range within a predetermined distance from the position of the light source prop 261. In the example shown in fig. 20, by placing the light source prop 261 at a position outside the irradiation range in the field, the prop influence range based on that position becomes an irradiation range, and the field within that range is visually displayed. As described above, in the present embodiment, the player can expand the visible range in the field by placing light source props, in addition to releasing reference points. For example, when the player character 201 advances through a dark area (i.e., an area outside the irradiation range) toward an unreleased reference point, the player can advance through the area while securing a field of view by placing light source props in the area.
In the present embodiment, the game system 1 draws the game image so that the irradiation range set by the prop placement event (i.e., the prop influence range) reflects the point light source set at the position of the light source prop. That is, the irradiation range set by the prop placement event is drawn in consideration of the point light source in addition to the above-described ambient light. Details of the drawing processing of the game image will be described later.
Fig. 21 is a diagram showing an example of a game image of the field in the case where a light source prop is placed within an irradiation range obtained by a release event. In this case, the prop influence range of the light source prop 261 is drawn so as to reflect both the ambient light and the point light source, and is therefore displayed brighter than the range that is outside the prop influence range and within the irradiation range obtained by the release event. In fig. 21, the range within the irradiation range and outside the prop influence range is indicated as a hatched area, and the prop influence range is indicated as a white area. As described above, according to the present embodiment, the player can easily recognize that the light source prop 261 has been placed.
In addition, when the light source prop 261 is placed within an irradiation range obtained by a release event as shown in fig. 21, the game system 1 sets the prop influence range of the light source prop 261 as an irradiation range, just as in the case where the light source prop 261 is placed outside the irradiation range. However, if the entire prop influence range has already been set as the irradiation range, the irradiation range in the field does not change as a result.
As described above for the 3 irradiation events (i.e., the character lighting event, the release event, and the prop placement event), in the present embodiment, at least a light source whose brightness is set to a predetermined value irrespective of the position in the field (specifically, the ambient light) is set in the field. In the drawing process, the game system 1 draws at least a part of the terrain objects in the field (for example, the ground object shown in fig. 17) so that the portion of those terrain objects included in the irradiation range reflects this light source. Accordingly, since a certain brightness can be ensured for the irradiation range, the irradiation range can be displayed so as to be easily visible regardless of the shape of the terrain of the field (for example, without being displayed dark due to shadows of the terrain).
In the present embodiment, point light sources are set in addition to the above-described ambient light. That is, the game system 1 also sets a point light source in the field in response to the occurrence of a predetermined event (specifically, a release event or a prop placement event). In the drawing process, the game system 1 draws, for at least a part of the terrain objects in the field, the portion of those terrain objects included in the irradiation range so as to reflect the point light source. Accordingly, the player can easily recognize that the predetermined event has occurred and that the field has become brighter as a result.
The type of light source set in the field is arbitrary. In other embodiments, for example, a light source of a type other than a point light source may be set in the field together with the ambient light. Alternatively, only the ambient light may be set in the field without placing any point light source.
In the present embodiment, the predetermined event is an event in which a predetermined prop (specifically, a light source prop) is placed on the field. In this case, the game system 1 sets the position of the reference point based on the position where the prop is placed, and sets the irradiation range so as to include the range in which the distance from the reference point is equal to or less than the threshold value (i.e., the prop influence range). Accordingly, the player can easily set an irradiation range at a desired position by placing the prop.
The term "event in which a predetermined prop is placed on a ground" is not limited to an event that occurs when the predetermined prop is simply placed on the ground, but includes an event that occurs when the predetermined prop is placed on the ground under a predetermined condition. For example, the "event in which a predetermined prop is placed on a ground" may be an event on the condition that the prop placed on the ground is given a certain impact. The above condition may be that a predetermined impact is applied to a predetermined prop placed on a floor by falling when the predetermined prop falls on the floor, or that another object applies a predetermined impact to the predetermined prop placed on the floor.
The event for setting a point light source is not limited to the prop placement event, and may be another kind of event. For example, in another embodiment, the game system 1 may set a point light source at the position of the player character 201 in response to the occurrence of a character lighting event, and visually display the irradiation range by drawing the character influence range so as to reflect that point light source.
As described above for the 3 irradiation events (i.e., the character lighting event, the release event, and the prop placement event), in the present embodiment, the point that is the reference of the irradiation range is set in the virtual space according to the occurrence of a predetermined event (specifically, the irradiation event). Then, the irradiation range is set based on the distance from the point as the reference so as to include a range in which the distance is equal to or less than the threshold value. Accordingly, the position and the surrounding area corresponding to the event can be set as the irradiation range according to the occurrence of the event.
The "range where the distance is equal to or less than the threshold" refers to a character influence range in a character lighting event, a location influence range or a release area range in a release event, and a prop influence range in a prop placement event.
In the present embodiment, the "point as a reference of the irradiation range" is the position of the player character 201 in the character lighting event, the position of the reference point in the release event, and the position of the light source prop in the prop placement event. However, the "point as a reference of the irradiation range" need not be exactly these positions, and may be a position determined based on these positions. For example, the "point as a reference of the irradiation range" may be a position slightly deviated from the position of the player character 201, the position of the reference point, or the position of the light source prop.
In the present embodiment, the irradiation range set in accordance with the occurrence of an irradiation event may be controlled so as to expand gradually from the time of occurrence. That is, the game system 1 may set the reference point in accordance with the occurrence of the irradiation event and then increase the threshold value used for determining the irradiation range with the lapse of time, thereby expanding the irradiation range. The threshold value here is the distance threshold set for the character influence range in a character lighting event, the distance threshold set for the point influence range in a release event, and the distance threshold set for the prop influence range in a prop placement event. Accordingly, when an irradiation event occurs, a situation in which a bright area in the field gradually expands can be displayed. In the above, the irradiation range is controlled so as to stop expanding after a predetermined time has elapsed. Further, instead of gradually expanding the irradiation range for all irradiation events, the game system 1 may perform the control of gradually expanding the irradiation range only for predetermined events among the irradiation events (for example, the release event and the prop placement event).
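A minimal sketch of such time-based expansion (the linear growth and all names are assumptions):

```python
def irradiation_radius(elapsed_frames, final_radius, frames_to_expand):
    # Distance threshold that grows after the irradiation event occurs
    # and stops expanding once the predetermined time has elapsed.
    if elapsed_frames >= frames_to_expand:
        return final_radius
    return final_radius * (elapsed_frames / frames_to_expand)
```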
As described above, in the present embodiment, predetermined objects are visually displayed even when located outside the irradiation range. Hereinafter, such an object is referred to as an "out-of-target object". Specifically, in the present embodiment, the out-of-target objects are characters of predetermined types and self-luminous objects. The characters of predetermined types are, more specifically, player characters and enemy characters. A self-luminous object is an object that is set, in its drawing-related settings, to be displayed as emitting light by itself. For example, the above-described marker object 203 is a self-luminous object.
As will be described in detail later, when drawing the game image, even if an out-of-target object is located outside the irradiation range, the game system 1 does not draw it with the drawing setting that reflects no light source, but draws it based on the drawing setting set in advance for that out-of-target object. In the present embodiment, when a character of a predetermined type is located outside the irradiation range, the character is drawn so as to be visible while being shaded. Thus, the character of the predetermined type is displayed so as to be distinguishable from the other objects outside the irradiation range, which are displayed dark.
In addition, even when a self-luminous object is located outside the irradiation range, the self-luminous object is drawn based on a drawing setting such as light emission (emission) set for that object. As a result, like the marker object 203 shown in fig. 15, the self-luminous object is displayed so as to be distinguishable from the other objects outside the irradiation range, which are displayed dark.
As described above, in the present embodiment, the irradiation event is an event generated by performing a predetermined operation input when the player character is located at an event occurrence position set in association with a reference point in the field (i.e., the release event). As shown in figs. 15 and 16, self-luminous marker objects are placed at positions on the field corresponding to the plurality of reference points (for example, positions above the reference points). The game system 1 draws each marker object so that it is displayed distinguishably from the other objects not included in the irradiation range, regardless of whether the marker object itself is included in the irradiation range. Accordingly, the player can easily move the player character toward a reference point outside the irradiation range by using the marker object as a target.
[2-3. Image generation processing]
Next, an example of a method of generating a game image in which the portion of the field outside the irradiation range is displayed dark (i.e., displayed in a manner invisible or difficult to visually recognize) will be described. In the present embodiment, the game system 1 draws objects within the irradiation range so as to reflect the light sources set in the field, and, for objects outside the irradiation range (except for the above-described out-of-target objects), draws the corresponding pixels in black instead of reflecting the light sources. Accordingly, objects outside the irradiation range can be made invisible, and the player can be effectively given an incentive to release reference points in order to explore the field. Next, a specific example of the game image generation method will be described.
In the present embodiment, the game system 1 draws the game image by a method based on deferred rendering (also referred to as deferred shading or delayed shading). That is, the game system 1 performs the drawing processing in the first to third stages described below within 1 frame.
In the first stage, the game system 1 writes information used for drawing each object (including character objects and terrain objects) in the virtual space into a G buffer (geometry buffer). For example, for each pixel drawn, information on the normal of the polygon corresponding to the pixel, information on the color set for that polygon, and the like are written into the G buffer. In the present embodiment, in addition to these pieces of information, coordinates indicating the position on the field corresponding to the pixel, information indicating whether the pixel corresponds to an out-of-target object, and the like are stored in the G buffer. Also in the first stage, the game system 1 writes, for each pixel, depth information of the corresponding position on the field into the depth buffer.
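As a rough illustration, the per-pixel contents described above could be modeled as follows (a sketch only; the field names and the use of dataclasses are assumptions, and a real implementation would pack these into GPU render targets):

```python
from dataclasses import dataclass

@dataclass
class GBufferTexel:
    normal: tuple           # normal of the polygon visible at this pixel
    albedo: tuple           # color set for that polygon
    field_position: tuple   # field coordinates corresponding to the pixel
    out_of_target: bool     # whether the pixel shows an out-of-target object

@dataclass
class DepthTexel:
    depth: float            # depth of the field position at this pixel
```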
In the second stage, the game system 1 writes information on illumination into the illumination buffer based on the information written in the G buffer and the depth buffer and on the information of the light sources set in the field. For example, information indicating the brightness of the corresponding position on the field is written into the illumination buffer for each pixel drawn. In the present embodiment, the game system 1 performs the illumination-related calculation in the second stage, but in other embodiments, the game system 1 may perform the illumination-related calculation in the third stage, described later, in which drawing into the frame buffer is performed.
In the present embodiment, the game system 1 also generates the data of a dark mask in the second stage. The dark mask is data indicating, for each pixel drawn, whether the position on the field corresponding to the pixel is a position to be drawn dark (i.e., a position outside the irradiation range), and the degree of darkness. In the present embodiment, the dark mask represents, for each pixel, a dark mask value indicating the degree to which the pixel is drawn with a color representing darkness (black in the present embodiment, as described above). For example, the dark mask value is a value of 0 or more and 1 or less; the dark mask value of a pixel drawn with the color representing darkness is set to 1, and the dark mask value of a pixel in which the color representing darkness is not reflected is set to 0. In addition, when the dark mask value is an intermediate value (i.e., a value greater than 0 and less than 1), the value is larger the greater the degree to which the pixel reflects the color representing darkness. In the present embodiment, the dark mask value of a pixel corresponding to a position outside the irradiation range is set to 1, and the dark mask value of a pixel corresponding to a position within the irradiation range is set to a value smaller than 1. Thus, the dark mask can be said to be data representing the irradiation range in the field. As described in detail later, the dark mask is generated based on the irradiation ranges in the virtual space, the coordinate data representing the positions on the field stored in the G buffer, and the like. Pixels corresponding to out-of-target objects are set to a value that does not reflect darkness. In other embodiments, the dark mask value of a pixel corresponding to a position outside the irradiation range may be set to a predetermined value or more (the predetermined value being greater than 0 and less than 1), and the dark mask value of a pixel corresponding to a position within the irradiation range may be set to a value less than the predetermined value. In the present embodiment, the dark mask value is a multi-valued value in the range of 0 to 1, but in other embodiments, the dark mask value may be binary, taking only the values 0 and 1.
In the third stage, the game system 1 writes, into the frame buffer, the pixel values of a field image that reflects both the darkness and the influence of the light generated by the light sources, based on the dark mask and the information written in the respective buffers (i.e., the G buffer, the depth buffer, and the illumination buffer). That is, the game system 1 writes into the frame buffer pixel values obtained by overlaying black, in accordance with the dark mask, on the pixel values obtained by reflecting the light sources in the virtual space based on the information in the G buffer and the illumination buffer.
Fig. 22 is a diagram showing an example of a method of generating the field image written into the frame buffer. As shown in fig. 22, the pixel value of each pixel of the field image is calculated based on the color information stored in the G buffer, the brightness information stored in the illumination buffer, and the dark mask value of the dark mask. First, by reflecting the brightness information stored in the illumination buffer, a field image reflecting the light generated by the light sources can be obtained; that is, a field image in which the field is irradiated with the ambient light and with the light generated by the point light sources. Further, by using the dark mask, the range outside the irradiation range can be expressed as dark in the field image (see fig. 22). By the above, the game system 1 can obtain a field image in which the irradiation range is expressed as being irradiated with the ambient light and the light generated by the point light sources, and the range outside the irradiation range is expressed as darkness.
In the dark mask shown in fig. 22, a position where the dark mask value is 1 is represented in black, a position where the dark mask value is 0 is represented in white, and a position where the dark mask value is an intermediate value is represented in a gray that becomes darker as the value increases. Regarding the irradiation ranges based on the character lighting event and the prop placement event, the dark mask value is set as follows: the dark mask value of a pixel corresponding to a position within a prescribed distance from the reference point of the irradiation range is 0, the dark mask value of a pixel corresponding to a position farther from the reference point than the prescribed distance becomes gradually larger according to the distance from the reference point, and the dark mask value of a pixel corresponding to a position outside the irradiation range is 1. The reference point of an irradiation range is the position that serves as the reference of that irradiation range; specifically, the reference point in the case of an irradiation range based on a release event, the position of the player character in the case of an irradiation range based on a character lighting event, and the position of the light source prop in the case of an irradiation range based on a prop placement event.
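A sketch of this per-pixel rule (the linear ramp between the prescribed distance and the edge of the irradiation range is an assumption):

```python
def dark_mask_value(distance, inner_radius, outer_radius):
    # 0 within the prescribed distance from the range's reference point,
    # increasing with distance, and 1 outside the irradiation range.
    if distance <= inner_radius:
        return 0.0
    if distance >= outer_radius:
        return 1.0
    return (distance - inner_radius) / (outer_radius - inner_radius)
```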
For the irradiation range based on a release event, the game system 1 calculates two-dimensional range data for calculating the dark mask value, and generates the dark mask based on the two-dimensional range data and the horizontal-plane component of the coordinate data, stored in the G buffer, representing positions on the field. The two-dimensional range data is data representing, for each two-dimensional position on the above-described field-corresponding plane, a degree value used for calculating the dark mask value. The two-dimensional range data can be said to be data representing the irradiation range in the field. In the present embodiment, two-dimensional range data relating to positions on the two-dimensional plane is generated as the data representing the irradiation range, but in other embodiments, the data representing the irradiation range may be data indicating three-dimensional positions in the field.
The above-described degree value is, like the dark mask value, a value relating to the degree of darkness with which a position is drawn in the drawing process. The degree value varies, for example, in the following manner: it is maximum at the reference point of the irradiation range, gradually decreases according to the distance from the reference point, and is 0 outside the irradiation range. Thus, the degree value can be calculated based on a value that attenuates according to the distance from the reference point of the irradiation range. In the present embodiment, the irradiation range based on a release event is set based on the release area, which is set based on the total determination value, and on the point influence range, which is based on the distance from the reference point. Therefore, the degree value for the irradiation range based on a release event can be calculated based on the above-described total determination value and a value attenuated according to the distance from the reference point.
Next, the game system 1 calculates the dark mask value at each pixel based on the degree value at the position corresponding to that pixel. For example, the above-described degree value can be scaled to the range of 0 to 1, and the value obtained by subtracting the scaled value from 1 can be used as the dark mask value. By using dark mask values calculated in this manner, a dark mask reflecting the irradiation range based on the release event can be generated. In addition, when the release area and the point influence range are set to be the same, the map mask described above may be used as the two-dimensional range data.
As described above, in the present embodiment, in response to the occurrence of a release event, the game system 1 generates two-dimensional range data that represents the irradiation range in the field in a planar manner so that at least the range corresponding to the event occurrence position (i.e., the position of the reference point) in the field becomes the irradiation range, and generates the dark mask based on the two-dimensional range data.
In the present embodiment, in the second stage of the drawing process, the game system 1 also generates the dark mask so as to reflect the irradiation range based on the position of the player character and the irradiation range based on the position of the point light source set for a light source prop. Thus, a dark mask is generated that reflects every irradiation event (i.e., release events, character lighting events, and prop placement events).
The method of calculating the dark mask value is arbitrary and is not limited to the above method. For example, in other embodiments, the game system 1 may directly generate the dark mask in the drawing process instead of generating the two-dimensional range data. That is, in the second stage of the drawing process, the game system 1 may generate a dark mask reflecting the irradiation range based on the release event, based on the total determination value at the position on the field corresponding to each pixel and a value that attenuates according to the distance from the reference point.
As described above, in the present embodiment, in the drawing process, the game system 1 generates mask data (i.e., the data of the dark mask) indicating, for at least a part of the terrain object and for each pixel, whether or not the position in the terrain object corresponding to that pixel is included in the irradiation range. For pixels at which the mask data indicates that the position of the terrain object is included in the irradiation range, drawing to the frame buffer is performed so as to reflect the light source. For pixels at which the mask data indicates that the position of the terrain object is not included in the irradiation range, drawing to the frame buffer is performed with a predetermined color. Accordingly, by using the mask data, a field image can be generated in which the area outside the irradiation range is represented so as to be invisible or difficult to visually recognize. The predetermined color is, for example, black, but is not limited to black and may be gray or another color. The drawing is also not limited to a single color; an image having a predetermined pattern may be drawn.
In the present embodiment, the mask data is data indicating, for each pixel, the degree to which the predetermined color is drawn. In the drawing process, the game system 1 writes to the frame buffer pixel values obtained by synthesizing the predetermined color, according to the above-described degree, with the pixel values calculated so as to reflect the light source (that is, the pixel values based on the color information stored in the G buffer and the brightness information stored in the illumination buffer). Accordingly, the degree of darkness can be expressed in multiple levels. For example, as described above, by setting the degree value so that it is maximum at the reference point of the irradiation range, gradually becomes smaller according to the distance from the reference point, and becomes 0 outside the irradiation range, a field image in which the darkness gradually thickens at the boundary of the irradiation range can be generated (see fig. 22).
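The per-pixel synthesis described above amounts to a blend between the lit pixel value and the predetermined color, weighted by the mask value; the following sketch assumes RGB triples with channels in the 0-to-1 range.

```python
def composite(lit_rgb, mask_value, predetermined_rgb=(0.0, 0.0, 0.0)):
    # Blend the pixel value computed from the G buffer and illumination
    # buffer toward the predetermined color, weighted by the mask value.
    return tuple(lit * (1.0 - mask_value) + pre * mask_value
                 for lit, pre in zip(lit_rgb, predetermined_rgb))
```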
In the present embodiment, the game system 1 generates two-dimensional range data representing, for each two-dimensional coordinate of the field corresponding to the coordinate components other than the height direction, a degree value indicating the degree of darkness drawn in the drawing process. The game system 1 calculates the degree value based on the total determination value and on a value that equals a reference value at the two-dimensional position corresponding to the reference point and attenuates according to the two-dimensional distance from that position to the coordinate in question. In the drawing process, for each pixel drawn to the frame buffer, a pixel value is written that is obtained by synthesizing the predetermined color (here, black) with the pixel value calculated so as to reflect the light sources set in the field, according to the degree value at the two-dimensional coordinate corresponding to the pixel indicated by the two-dimensional range data (that is, according to the dark mask value based on the degree value). According to the above, the predetermined color can be reflected in the image representing the field in a graduated manner. Thus, for example, a field image can be generated that gradually darkens in the vicinity of the boundary of the irradiation range, and a more natural-looking field image can therefore be generated.
In the present embodiment, the game system 1 draws the above-described target external object not with the black indicating darkness but by a method set for each object. Specifically, in the first stage, the game system 1 writes mask release data concerning the target external object into the G buffer. The mask release data is data indicating the pixels corresponding to the position of the target external object; it can be said to indicate that application of the above-described dark mask is canceled for the pixels corresponding to the target external object. The game system 1 also writes into the G buffer data indicating the drawing method (for example, self-luminous drawing, or drawing with a predetermined shading) set for the target external object.
In the third stage of the drawing process, the game system 1 draws the pixels indicated by the above-described mask release data by the method set for the target external object, regardless of the dark mask values of the dark mask. Thus, even when the target external object is outside the irradiation range, it is not drawn as dark but is drawn by the set method. Alternatively, in the second stage, a value indicating that the pixel is not dark may be written into the dark mask for the pixels indicated by the mask release data.
As described above, in the present embodiment, in the first stage of the drawing process, the game system 1 performs writing into the G buffer and the depth buffer for each pixel of the target external object, and generates the above-described mask release data. Then, in the third stage of the drawing process, the game system 1 draws the pixels indicated by the mask release data in a manner that allows the target external object to be visually recognized even in a portion of the terrain object that is not included in the irradiation range (for example, so that the object appears self-luminous or is drawn with a predetermined shading). Accordingly, even when the target external object is outside the irradiation range, the game system 1 can display the target external object so that it is visually recognizable.
As described above, in the present embodiment, the game system 1 draws objects outside the irradiation range as dark by a drawing process based on so-called deferred rendering. That is, in the first stage, the game system 1 performs writing into the G buffer and the depth buffer for at least a part of the terrain object in the field. In the second stage, the game system 1 generates the dark mask data for each pixel based on the position corresponding to the pixel, the depth value stored in the depth buffer, and the irradiation range. In the third stage, the game system 1 performs drawing to the frame buffer based on at least the dark mask data and the data stored in the G buffer. According to the above, the game system 1 can apply the technique of deferred rendering to draw objects outside the irradiation range in a manner that makes them invisible or difficult to visually recognize.
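The three stages can be illustrated with a toy, self-contained sketch; the buffers here are plain Python lists, the terrain is flat, the lighting is uniform ambient, and the constants defining the irradiation range are assumed values, so this shows only the shape of the pipeline, not the embodiment's implementation.

```python
import math

W, H = 16, 8
REF, BRIGHT, RADIUS = (8.0, 4.0), 2.0, 6.0   # assumed irradiation-range constants

def mask_at(x, y):
    d = math.dist((x, y), REF)
    if d <= BRIGHT:
        return 0.0
    return min((d - BRIGHT) / (RADIUS - BRIGHT), 1.0)

# First stage: per-pixel geometry attributes (an albedo G buffer and a
# depth buffer; the terrain here is flat, so both are uniform).
gbuf = [[0.8] * W for _ in range(H)]
depth = [[1.0] * W for _ in range(H)]

# Second stage: illumination buffer (uniform ambient light) and a dark
# mask computed from the field position corresponding to each pixel.
illum = [[1.0] * W for _ in range(H)]
mask = [[mask_at(x, y) for x in range(W)] for y in range(H)]

# Third stage: composite G buffer, illumination, and dark mask per pixel.
frame = [[gbuf[y][x] * illum[y][x] * (1.0 - mask[y][x])
          for x in range(W)] for y in range(H)]
```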
Further, in other embodiments, the method for drawing objects outside the irradiation range as dark is arbitrary and is not limited to a drawing process based on deferred rendering. In other embodiments, the drawing process may be performed based on forward rendering (also referred to as forward shading). That is, in the drawing process, the game system 1 may determine, for each pixel of at least a part of the terrain object (for example, objects other than the target external object), whether or not the corresponding position is included in the irradiation range, perform drawing to the frame buffer so as to reflect the light source for pixels included in the irradiation range, and perform drawing to the frame buffer with a predetermined color (for example, black) for pixels not included in the irradiation range. According to the above, the game system 1 can draw objects outside the irradiation range, based on forward rendering, in a manner that makes them invisible or difficult to visually recognize.
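In the forward-rendering variant, the in-range decision is made per pixel at draw time rather than read from a precomputed mask; a minimal sketch, where in_irradiation_range is an assumed predicate supplied by the caller:

```python
def forward_draw_pixel(field_pos, in_irradiation_range, lit_value,
                       predetermined_value=0.0):
    # Forward rendering decides per pixel at draw time: pixels whose field
    # position is inside the irradiation range get the lit value; all other
    # pixels get the predetermined color (black, i.e. 0.0, by default).
    return lit_value if in_irradiation_range(field_pos) else predetermined_value
```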
[3. Specific example of processing in game system]
Next, specific examples of information processing in the game system 1 will be described with reference to fig. 23 to 27.
Fig. 23 is a diagram showing an example of storage areas storing various data used for information processing in the game system 1. Each storage area shown in fig. 23 is provided on a storage medium accessible to the main body apparatus 2 (for example, the flash memory 84, the DRAM 85, and/or a memory card mounted in the slot 23). As shown in fig. 23, a game program area storing the game program is provided in the storage medium. The game program is used to execute the game processing (specifically, the game processing shown in fig. 24) in the present embodiment. The storage medium is also provided with the G buffer, the depth buffer, the illumination buffer, and the frame buffer described above.
Further, a dark mask data area for storing the data of the dark mask is provided in the storage medium. The above-described mask release data is stored in the G buffer. The storage medium is also provided with a processing data area for storing various data used in the game processing. The processing data area stores, for example, the map mask data described above. The processing data area also stores object data (for example, data indicating the position and orientation of an object) concerning the various objects (for example, the player character and the light source props) that appear in the game.
Fig. 24 is a flowchart showing an example of the flow of the game processing executed by the game system 1. Execution of the game processing is started, for example, in response to the game being started by an instruction from the player during execution of the above-described game program. In the present embodiment, the game processing has the following processing modes: a field mode in which a field image representing the field is displayed, a map display mode in which the above-described map image is displayed on the entire display 12, and a menu display mode in which a menu image is displayed. The processing mode at the start of the game is arbitrary and is set to, for example, the field mode.
In the present embodiment, the processing of each step shown in fig. 24 is executed by the processor 81 of the main body apparatus 2 executing the game program stored in the game system 1. However, in other embodiments, part of the processing of the above steps may be executed by a processor other than the processor 81 (for example, a dedicated circuit or the like). In addition, in the case where the game system 1 can communicate with another information processing apparatus (e.g., a server), part of the processing of each step shown in fig. 24 may be executed in that other information processing apparatus. The processing of each step shown in fig. 24 is merely an example, and as long as the same result is obtained, the processing order of the steps may be changed, or other processing may be performed in addition to (or instead of) the processing of each step.
In addition, the processor 81 executes the processing of each step shown in fig. 24 using a memory (e.g., DRAM 85). That is, the processor 81 stores information (in other words, data) obtained by each processing step in the memory, and when the information is to be used in a subsequent processing step, reads the information from the memory and uses the information.
In step S1 shown in fig. 24, the processor 81 acquires the above-described operation data representing the player's instructions. That is, the processor 81 acquires the operation data received from each controller via the controller communication section 83 and/or the terminals 17 and 21. The process of step S2 is performed after step S1.
In step S2, the processor 81 determines whether or not an event scene, such as one for a release event, is being executed. As described above, in the present embodiment, playback of the animation of the event scene representing the release event is started in response to the occurrence of the release event (see step S26 described later). In step S2, the processor 81 determines whether or not the animation of the event scene is being played. In the case where the determination result of step S2 is affirmative, the process of step S3 is executed. On the other hand, when the determination result of step S2 is negative, the process of step S4 is executed.
In step S3, the processor 81 advances the event scene in execution. That is, the processor 81 causes the display 12 to display an image of the animation of the event scene. One frame of the image is displayed in each execution of step S3, and the processing of step S3 is repeatedly executed during the event scene, thereby playing the above-described animation. The drawing process during the event scene may be performed in the same manner as the process in the field mode in which the field image is displayed, or may be performed in a different manner when a different scene is to be displayed. The specific content of such different drawing processing is arbitrary, and details are omitted. In the present embodiment, the image generated by the game system 1 is displayed on the display 12, but the image may instead be displayed on another display device (for example, the above-described stationary monitor). After step S3, the process of step S12 described later is performed.
In step S4, the processor 81 determines whether or not the current mode is the map display mode in which the map image is displayed. As will be described in detail later, in the present embodiment, the map display mode is started in response to a map display instruction from the player in the field mode in which the field image is displayed (see step S22 described later). In the case where the determination result of step S4 is affirmative, the process of step S5 is executed. On the other hand, when the determination result of step S4 is negative, the process of step S6 is executed.
In step S5, the processor 81 causes the display 12 to display the map image. That is, the processor 81 generates a map image in accordance with the method described in "[2-1. Setting of a release area of a map]" above, and causes the display 12 to display the generated map image. In step S5 (i.e., in the map display mode), the field image is not displayed, and the map image is displayed in the entire area of the display 12 (refer to fig. 10 and 12). In the map display mode, the processor 81 receives an instruction to end the display of the map image, and when that instruction is given, the processing mode is changed to the field mode. In this case, the determination result in the next execution of step S4 is negative, and a field image is displayed in step S11 described later. After step S5, the process of step S12 described later is performed.
In step S6, the processor 81 determines whether or not the current mode is the menu display mode in which the menu image is displayed. As will be described in detail later, in the present embodiment, the menu display mode is started in response to a menu display instruction from the player in the field mode in which the field image is displayed (see step S22 described later). In the case where the determination result of step S6 is affirmative, the process of step S7 is executed. On the other hand, when the determination result of step S6 is negative, the process of step S8 is executed.
In step S7, the processor 81 causes the display 12 to display the menu image. In the present embodiment, the processor 81 receives, among the various operations in the menu display mode, at least an operation input instructing a change of the player character's equipment. That is, the player can change the player character's equipment in the menu image; for example, the player can equip the player character with the above-described light-emitting clothing. Although not shown in the flowchart of fig. 24, in the menu display mode, operation inputs for various instructions relating to the menu image (for example, an instruction to change the player character's equipment, an instruction to use a prop, and the like) are received, and the processor 81 appropriately changes the content of the menu image based on those operation inputs and displays it. In the menu display mode, the processor 81 receives an instruction to end the display of the menu image, and when that instruction is given, the processing mode is switched to the field mode. In this case, the determination result in the next execution of step S6 is negative, and a field image is displayed in step S11 described later. After step S7, the process of step S12 described later is performed.
In step S8, the processor 81 executes player-related control processing. In the player-related control process, various processes (e.g., control processes related to player characters) are performed based on operation inputs of the player. Details of the player related control process will be described with reference to a flowchart shown in fig. 25 described later. After step S8, the process of step S9 is performed.
In step S9, the processor 81 executes other object control processing. In the other object control process, other objects (for example, an enemy character, the light source props described above, and the like) other than the player character are controlled. The details of the other object control processing will be described with reference to a flowchart shown in fig. 26, which will be described later. After step S9, the process of step S10 is performed.
In step S10, the processor 81 executes the drawing process of the field image representing the field. In the drawing process of the field image, as described above, a field image representing the outside of the irradiation range as dark is generated. Details of the drawing process will be described with reference to the flowchart shown in fig. 27, which will be described later. After step S10, the process of step S11 is performed.
In step S11, the processor 81 causes the display 12 to display the field image generated in step S10. Further, as shown in fig. 15 and the like, in the field mode, the processor 81 may generate a map image in addition to the field image, and display the map image so as to be superimposed on the field image. After step S11, the process of step S12 is performed.
In step S12, the processor 81 determines whether to end the game. For example, when a predetermined operation input for ending the game is made by the player, the processor 81 determines that the game is ended. If the determination result in step S12 is negative, the process in step S1 is executed again. Thereafter, a series of processes of steps S1 to S12 is repeatedly executed until it is determined in step S12 that the game is ended. On the other hand, when the determination result of step S12 is affirmative, the processor 81 ends the game process shown in fig. 24.
Fig. 25 is a sub-flowchart showing an example of the detailed flow of the player-related control process of step S8 shown in fig. 24. In the player-related control process, first, in step S21, the processor 81 determines, based on the operation data acquired in step S1, whether or not an instruction to switch the processing mode has been given by the player. Specifically, the instruction to switch the processing mode is an instruction to the effect that the map image is to be displayed or an instruction to the effect that the menu image is to be displayed. In the case where the determination result of step S21 is affirmative, the process of step S22 is executed. On the other hand, when the determination result of step S21 is negative, the process of step S23 is executed.
In step S22, the processor 81 switches the processing mode according to the instruction given in step S21. That is, when an instruction to display the map image has been given, the processor 81 switches the processing mode to the map display mode. In this case, the determination result in the next execution of step S4 described above is affirmative, and the process of displaying the map image is executed in step S5. When an instruction to display the menu image has been given, the processor 81 switches the processing mode to the menu display mode. In this case, the determination result in the next execution of step S6 described above is affirmative, and the process of displaying the menu image is executed in step S7. After step S22, the processor 81 ends the player-related control process.
In step S23, the processor 81 determines whether or not the current time is within an operation reception period during which operation inputs for the player character are received. Here, in the present embodiment, the period during which the player character is performing a predetermined action (for example, an action controlled in step S30 described later) in response to an operation input is excluded from the operation reception period. In the case where the determination result of step S23 is affirmative, the process of step S24 is executed. On the other hand, when the determination result of step S23 is negative, the process of step S33 described later is executed.
In step S24, the processor 81 determines whether or not an operation input for releasing the reference point is made based on the operation data acquired in step S1 described above. That is, the processor 81 determines whether or not an input of a command for executing "survey" is made in a state where the player character is located in the vicinity of the reference point. In the case where the determination result of step S24 is affirmative, the process of step S25 is executed. On the other hand, when the determination result in step S24 is negative, the process of step S29 described later is executed.
In step S25, the processor 81 sets the reference point for which the operation input was made to the released state. For example, the processor 81 updates the data representing the state of the reference point stored in the memory so as to indicate the released state. The processor 81 also sets a point light source at the position of the marker object indicating the reference point. In this way, in the drawing processing described later, drawing is performed such that light is irradiated around the marker object. After step S25, the process of step S26 is performed.
In step S26, the processor 81 starts the event scene for the release event that has occurred. That is, the processor 81 starts playing an animation showing the surroundings of the released reference point becoming gradually brighter. After the processing of step S26, the determination result of step S2 is affirmative and execution of the event scene is continued until the playback of the animation ends. After step S26, the process of step S27 is performed.
In step S27, the processor 81 sets the above-described release area based on the reference point released in step S26. That is, the processor 81 generates a map mask representing the set release area according to the method described in "[2-1. Setting of a release area of a map]". Specifically, the data of the map mask is stored in the memory at the start of the game processing, and the processor 81 updates the data so as to show the set release area. By the processing of step S27, the area of the field including the released reference point is set as the release area. After step S27, the process of step S28 is performed.
In step S28, the processor 81 sets the above-described irradiation range based on the reference point released in step S26. That is, the processor 81 generates the two-dimensional range data indicating the set irradiation range according to the method described in "[2-2. Setting of irradiation range]". Specifically, the two-dimensional range data is stored in the memory at the start of the game processing, and the processor 81 updates the data so as to show the set irradiation range. By the processing of step S28, the area of the field including the released reference point is set as the irradiation range. After step S28, the processor 81 ends the player-related control process. The processing of steps S25, S27, and S28 need not be performed at this timing, and may instead be performed at a predetermined timing during the subsequent event scene.
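One plausible form of this update, assuming the two-dimensional range data is held as a per-coordinate grid with linear attenuation, and assuming that overlapping contributions combine by keeping the larger value; both assumptions are illustrative rather than taken from the embodiment.

```python
import math

def update_two_dimensional_range_data(range_data, ref2d, reference_value,
                                      falloff_per_unit):
    # Fold the newly released reference point's distance-attenuated
    # contribution into every two-dimensional coordinate of the grid,
    # keeping the larger (brighter) of the old and new values.
    for y, row in enumerate(range_data):
        for x in range(len(row)):
            contribution = max(0.0, reference_value
                               - falloff_per_unit * math.dist((x, y), ref2d))
            row[x] = max(row[x], contribution)
```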
In step S29, the processor 81 determines whether or not an operation input for an action instruction for the player character is made based on the operation data acquired in step S1 described above. The action instruction is an instruction for causing the player character to perform, for example, an attack action, a jump action, or the like. In the case where the determination result of step S29 is affirmative, the process of step S30 is executed. On the other hand, when the determination result in step S29 is negative, the process in step S31 described later is executed.
In step S30, the processor 81 causes the player character to start the action corresponding to the action instruction given in step S29. After the player character starts the action in step S30, the player character is controlled so as to carry out that action over a predetermined period by the processing of step S33, which will be described later. After step S30, the processor 81 ends the player-related control process.
In step S31, the processor 81 determines whether or not an operation input for a movement instruction for the player character is made based on the operation data acquired in step S1 described above. The movement instruction is an instruction for causing the player character to perform an action of moving on the field. In the case where the determination result of step S31 is affirmative, the process of step S32 is executed. On the other hand, when the determination result of step S31 is negative, the process of step S33 is executed.
In step S32, the processor 81 causes the player character to perform the action of moving on the field in accordance with the movement instruction given in step S31. After step S32, the processor 81 ends the player-related control process.
In step S33, the processor 81 controls the player character with respect to various actions, such as the progress of the action started in step S30 or the character's behavior when no input is made. In one execution of step S33, the processor 81 advances the player character's action by an amount corresponding to one frame time. By repeatedly executing the processing of step S33 over a plurality of frames, the player character performs a series of actions according to the action instruction. In the case where the player has not instructed any action to be performed (for example, when the action started in step S30 has ended), the processor 81 may, in step S33, cause the player character to perform no action, or may cause it to perform an action that makes its behavior look natural (for example, looking around or swaying its body). After step S33, the processor 81 ends the player-related control process.
Fig. 26 is a sub-flowchart showing an example of the detailed flow of the other object control process of step S9 shown in fig. 24. In the other object control processing, first, in step S41, the processor 81 determines whether processing has been completed for every object that is a control target other than the player character. That is, it is determined whether or not all such objects have already been specified in step S42 described later. If the determination result in step S41 is negative, the process in step S42 is executed. On the other hand, in the case where the determination result of step S41 is affirmative, the processor 81 ends the other object control processing.
In step S42, the processor 81 specifies, from among the objects that are control targets, one object to be the processing target of step S43 described later. In step S42, an object that has not yet been the processing target in the processing loop of steps S41 to S45 is specified. After step S42, the process of step S43 is performed.
In step S43, the processor 81 controls the action of the object specified in step S42. For example, when the object is an enemy character, the action of the enemy character is controlled according to an algorithm determined in the game program. For example, when the object is a light source prop, the movement of the light source prop is controlled in accordance with the actions of other characters such as the player character (for example, the light source prop moves in response to the player character's action of throwing it). After step S43, the process of step S44 is performed.
In step S44, the processor 81 determines, based on the processing result of step S43 described above, whether a prop configuration event has occurred. For example, when a light source prop thrown by the player character comes to rest on the ground in the field, the processor 81 determines that a prop configuration event has occurred. In the case where the determination result of step S44 is affirmative, the process of step S45 is executed. On the other hand, when the determination result of step S44 is negative, the process of step S41 is executed again.
In step S45, the processor 81 sets a point light source at the position of the light source prop that caused the prop configuration event. In this way, in the drawing processing described later, drawing is performed so that the surroundings of the light source prop are illuminated. After step S45, the process of step S41 is performed again. Thereafter, the series of processes of steps S41 to S45 is repeatedly executed until it is determined in step S41 that processing has been completed for all objects that are control targets.
Fig. 27 is a sub-flowchart showing an example of the detailed flow of the drawing process of step S10 shown in fig. 24. In the drawing processing, first, in step S51, the processor 81 determines whether or not the processing of the first stage described in "[2-3. Image generation processing]" above has been completed. That is, it is determined whether writing to the G buffer has been completed for every object that is a drawing target (for example, the objects within the field of view of the virtual camera). If the determination result in step S51 is affirmative, the process in step S56 described later is executed. On the other hand, when the determination result of step S51 is negative, the process of step S52 is executed.
In step S52, the processor 81 specifies, from among the objects that are drawing targets, one object to be the processing target of step S53 described later. In step S52, an object that has not yet been the processing target in the processing loop of steps S51 to S55 is specified. After step S52, the process of step S53 is performed.
In step S53, the processor 81 determines whether the object specified in step S52 is the above-described target external object. If the determination result in step S53 is negative, the process in step S54 is executed. On the other hand, when the determination result of step S53 is affirmative, the process of step S55 is executed.
In step S54, the processor 81 writes information about the object specified in step S52 to the G buffer and the depth buffer. That is, for the pixels corresponding to the polygons of the object, the processor 81 writes information such as the position, normal, and color of the polygon into the G buffer and writes depth information into the depth buffer. The process of step S54 may be the same as in conventional deferred rendering. After step S54, the process of step S51 is executed again.
On the other hand, in step S55, the processor 81 writes information about the object specified in step S52 to the G buffer and the depth buffer, and also writes information indicating that the object is a target external object to the G buffer. That is, the processor 81 writes the above-described mask release data concerning the target external object to the G buffer. After step S55, the process of step S51 is executed again.
In step S56, the processor 81 determines whether the processing of the second stage described in "[2-3. Image generation processing]" above has been completed. That is, it is determined whether the writing of values for every pixel in the illumination buffer and the dark mask has been completed. If the determination result in step S56 is affirmative, the process in step S60 described later is executed. On the other hand, when the determination result of step S56 is negative, the process of step S57 is executed.
In step S57, the processor 81 specifies 1 pixel, which is the processing target of step S58 described later, from among the pixels. In step S57, a pixel that has not yet been the processing target in the processing cycle of steps S56 to S59 is specified. After step S57, the process of step S58 is performed.
In step S58, the processor 81 performs writing to the illumination buffer for the pixel specified in step S57. That is, the processor 81 calculates information such as the brightness at the pixel based on the ambient light and the point light source set in step S45, and writes the calculated information to the illumination buffer. The process of step S58 may be the same as in conventional deferred rendering. After step S58, the process of step S59 is performed.
In step S59, the processor 81 generates the dark mask (i.e., sets the dark mask value) for the pixel specified in step S57. Specifically, the processor 81 calculates the dark mask value at the pixel in the manner described in "[2-3. Image generation processing]" above. The data of the dark mask is stored in the memory at the start of the game processing, and the processor 81 updates the data according to the newly set irradiation ranges. For example, in the case where an irradiation range based on a release event has been set by the processing of step S28 described above, the processor 81 updates the dark mask based on the above-described two-dimensional range data. In the case where the player character has been equipped with the light-emitting clothing via the menu display process of step S7 described above, the dark mask is updated so that the pixels corresponding to positions within the character influence range based on the position of the player character become part of the irradiation range. When a point light source has been set by the process of step S45, the dark mask is updated so that the pixels corresponding to positions within the prop influence range based on the position of the light source prop become part of the irradiation range. After step S59, the process of step S56 is executed again.
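A sketch of how the three irradiation sources could be folded into a single per-pixel mask value. The combination by minimum (the brightest contribution wins), the radial falloffs for the character and prop influence ranges, and the premise that the character term is present only while the light-emitting clothing is equipped are all assumptions made for illustration.

```python
import math

def combined_dark_mask_value(pos2d, release_degree, degree_max,
                             player_pos, character_range,
                             prop_positions, prop_range):
    # Mask from the release event: scale the degree value to 0..1, invert.
    release_mask = 1.0 - min(max(release_degree / degree_max, 0.0), 1.0)

    # Radial masks: 0 at the center of an influence range, 1 at its edge.
    def radial_mask(center, radius):
        return min(math.dist(pos2d, center) / radius, 1.0)

    candidates = [release_mask, radial_mask(player_pos, character_range)]
    candidates += [radial_mask(p, prop_range) for p in prop_positions]
    # The brightest contribution (smallest mask value) wins for the pixel.
    return min(candidates)
```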
In step S60, the processor 81 determines whether or not the processing of the third stage described in "[2-3. Image generation processing]" above has been completed. That is, it is determined whether the writing of values for every pixel in the frame buffer has been completed. In the case where the determination result of step S60 is affirmative, the processor 81 ends the drawing process shown in fig. 27. On the other hand, when the determination result of step S60 is negative, the process of step S61 is executed.
In step S61, the processor 81 specifies 1 pixel, which is the processing target of step S62 described later, from among the pixels. In step S61, a pixel that has not yet been the processing target in the processing cycle of steps S60 to S62 is specified. After step S61, the process of step S62 is performed.
In step S62, the processor 81 calculates the pixel value of the pixel specified in step S61 and writes it to the frame buffer. That is, the processor 81 calculates the pixel value at the pixel, using the method described in "[2-3. Image generation processing]" above, based on the dark mask and the information written into the respective buffers (i.e., the G buffer, the depth buffer, and the illumination buffer). Specifically, the processor 81 calculates a pixel value reflecting the influence of the light generated by the light sources based on the information of the G buffer, the depth buffer, and the illumination buffer, and then calculates a pixel value that also reflects darkness, based on the calculated pixel value and the dark mask value in the dark mask. Thereby, a pixel value reflecting both the darkness and the influence of the light generated by the light sources is written to the frame buffer. After step S62, the process of step S60 is performed again.
Further, as described above, the drawing process of step S10 may also be performed by a method based on forward rendering. Fig. 28 is a sub-flowchart showing an example of the detailed flow of a drawing process performed by a method based on forward rendering. The game system 1 may execute the process shown in fig. 28 instead of the process shown in fig. 27 as the drawing process of step S10.
In the drawing process shown in fig. 28, first, in step S71, the processor 81 determines whether drawing for each object (for example, an object within the field of view of the virtual camera) as a drawing target has been completed. In the case where the determination result of step S71 is affirmative, the processor 81 ends the drawing process shown in fig. 28. On the other hand, when the determination result of step S71 is negative, the process of step S72 is executed.
In step S72, the processor 81 specifies, from among the objects that are drawing targets, one object to be the processing target of the subsequent steps S73 to S81. In step S72, an object that has not yet been the processing target in the processing loop of steps S71 to S81 is specified. After step S72, the process of step S73 is performed.
In step S73, the processor 81 determines whether the object specified in step S72 is the above-described target external object. In the case where the determination result of step S73 is affirmative, the process of step S74 is executed. On the other hand, when the determination result of step S73 is negative, the process of step S75 is executed.
In step S74, the processor 81 draws the object specified in step S72 (i.e., each pixel corresponding to the object) based on the drawing settings set in advance for that object. Thus, when the object is a self-luminous object, drawing is performed so that the object itself appears to emit light, and when the object is a predetermined type of character, drawing is performed so that the object is given a predetermined shading. After step S74, the processing of step S71 described above is performed again.
In step S75, the processor 81 determines whether or not drawing of each polygon for the object specified in step S72 has been completed. In the case where the determination result of step S75 is affirmative, the drawing related to the object is completed, and therefore the processing of step S71 described above is executed again. On the other hand, when the determination result of step S75 is negative, the process of step S76 is executed.
In step S76, the processor 81 designates one of the polygons of the object designated in step S72. In step S76, a polygon that has not yet been the processing target in the processing cycle of steps S75 to S81 is specified. After step S76, the process of step S77 is performed.
In step S77, the processor 81 determines whether or not drawing for each pixel corresponding to the polygon specified in step S76 has been completed. In the case where the determination result of step S77 is affirmative, the drawing concerning the polygon is completed, and therefore the processing of step S75 described above is executed again. On the other hand, when the determination result of step S77 is negative, the process of step S78 is executed.
In step S78, the processor 81 designates one of the pixels corresponding to the polygon designated in step S76. In step S78, a pixel that has not yet been the processing target in the processing cycle of steps S77 to S81 is designated. After step S78, the process of step S79 is performed.
In step S79, the processor 81 determines whether the position corresponding to the pixel specified in step S78 (i.e., the position in the field) is within the irradiation range. In an embodiment in which drawing is performed by the drawing process shown in fig. 28, the processor 81 sets the irradiation range based on the release event in step S28 described above, sets the irradiation range based on the position of the player character when the player character is equipped with the light-emitting clothing in the menu display process of step S7 described above, and sets the irradiation range based on the position of the light source prop when a point light source is set in the process of step S45 described above. In the case where the determination result of step S79 is affirmative, the process of step S80 is executed. On the other hand, when the determination result of step S79 is negative, the process of step S81 is performed.
In step S80, the processor 81 draws the pixel specified in step S78 in such a manner as to reflect the light sources set in the field (i.e., the ambient light and/or the point light sources). Specifically, the processor 81 calculates the pixel value of the pixel based on information such as the normal of the polygon corresponding to the pixel, the color set for that polygon, and the light sources set in the field, and writes the pixel value to the frame buffer. Thus, pixels corresponding to positions within the irradiation range are drawn with the light sources taken into consideration. The process of step S80 may be the same as a drawing process based on conventional forward rendering. After step S80, the process of step S77 is performed again.
On the other hand, in step S81, the processor 81 draws the pixel specified in step S78 in black. Thus, pixels corresponding to positions outside the irradiation range are drawn in black. After step S81, the process of step S77 is performed again.
In the drawing process shown in fig. 28, as in the drawing process shown in fig. 27, drawing may be performed such that, within the irradiation range, the black gradually becomes denser as the boundary of the irradiation range is approached. For example, in step S80 described above, the processor 81 may calculate the above-described dark mask value for the pixel specified in step S78 and calculate the pixel value of the pixel by synthesizing black with the pixel value reflecting the influence of the light generated by the light sources, in a proportion corresponding to the dark mask value.
[4. Operational effects and modifications of the present embodiment]
The game program in the above-described embodiment causes a computer (e.g., the processor 81) of an information processing apparatus (e.g., the game apparatus 2) to execute the following processes:
Executing a game process of controlling a player character in a virtual space (in the above embodiment, the field) based on an operation input (step S32);
a process of, when a predetermined event (e.g., a release event) occurs based on the game process, causing a point corresponding to the occurred event among a plurality of points (e.g., reference points) set in the virtual space to transition from a first state (e.g., an unreleased state) to a second state (e.g., a released state) (step S25);
A process of specifying a region (in the above embodiment, the release area) in which a total determination value, obtained by summing for each position first determination values based on one or more points in the second state among the plurality of points, is equal to or greater than a predetermined value, the first determination value being a value that equals a first reference value at the position corresponding to a point and decays according to the distance from that position (step S27); and
A process of displaying a map image representing the site information of the virtual space in accordance with a map display instruction by the operation input (step S5), wherein the map image shows the site information of the portion corresponding to the region.
According to the above-described configuration, the released range of the map image (i.e., the range of the release area) can be changed according to the occurrence or non-occurrence of each of a plurality of events. Further, since the total determination value at each position in the virtual space varies in many ways depending on which of the plurality of points have been set to the second state, the release area can be varied in many ways according to the state of each point (that is, according to the occurrence of events at each point).
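The total determination value described above can be sketched as follows; the linear decay and the parameter names are assumptions, and released_points stands for the set of points currently in the second state.

```python
import math

def total_determination_value(pos2d, released_points,
                              first_reference_value, falloff_per_unit):
    # Each point in the second (released) state contributes a first
    # determination value: the first reference value at the point itself,
    # decaying with distance and clamped at zero; contributions are summed.
    return sum(max(0.0, first_reference_value
                   - falloff_per_unit * math.dist(pos2d, p))
               for p in released_points)

def in_release_area(pos2d, released_points, first_reference_value,
                    falloff_per_unit, predetermined_value):
    # A position belongs to the release area when the summed value meets
    # the predetermined threshold, so overlapping points widen the area.
    return total_determination_value(pos2d, released_points,
                                     first_reference_value,
                                     falloff_per_unit) >= predetermined_value
```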
In the above embodiment, the process for determining the release area is performed at the timing when the event occurs (see step S27 of fig. 25), but the timing at which this process is performed is not limited thereto. In other embodiments, the process for determining the release area may be performed each time a map image is generated, or may be performed at the timing at which a map image is next generated after an event occurs.
In the above embodiment, the predetermined event is an event that occurs when a predetermined operation input is performed while the player character is located at an event occurrence position set in correspondence with a point in the virtual space; specifically, it is the release event. Here, the "event that occurs by performing a predetermined operation input when the player character is located at the event occurrence position" is not limited to the release event and may be another event. For example, the predetermined event may be an event in which the player character arrives at the event occurrence position in the virtual space (in this example, the operation input for moving the player character to the event occurrence position corresponds to the predetermined operation input), or an event in which the player character uses a specific prop at the event occurrence position in the virtual space (in this example, the operation input for causing the player character to use the prop corresponds to the predetermined operation input). In other embodiments, the predetermined event is not limited to an event generated by performing a predetermined operation input when the player character is located at the event occurrence position, and may be another kind of event (for example, an event not conditioned on a predetermined operation input).
The game program in the above embodiment may also cause a computer (for example, the processor 81) of an information processing apparatus (for example, the game apparatus 2) to execute the following processes:
A process of setting a target range (e.g., the irradiation range) in the virtual space when a predetermined event (e.g., an illumination event) occurs based on the game process (step S28, step S59); and
In the drawing process of drawing the virtual space, a portion of at least a part of the terrain object included in the target range is drawn so as to reflect the light source set in the virtual space, and a portion of the at least a part of the terrain object not included in the target range is drawn with a predetermined color (step S62).
According to the above configuration, the region of low visibility and the region where visibility is ensured in the virtual space can be dynamically changed according to occurrence of an event. Thereby, a game in which the visible portion in the field is increased by causing an event to occur can be provided. In addition, according to the above configuration, the portion within the target range can be drawn so as to reflect the light source, thereby making it easy to visually recognize the portion, and the portion outside the target range can be drawn with a predetermined color, thereby making it possible to make the portion invisible or difficult to visually recognize. In this way, according to the above-described configuration, the visibility of the area in the game field can be easily adjusted.
The process of setting the target range may be a process of setting a range in the three-dimensional virtual space (for example, the process of setting the character influence range and the prop influence range in the virtual space), or a process of setting a range in a two-dimensional plane corresponding to the virtual space (for example, the process of generating the two-dimensional range data in the above-described field-corresponding plane). The target range is conceptually expressed as a range in the virtual space, but the data representing the target range is not limited to data relating to positions in the virtual space, and may be data relating to positions in a two-dimensional plane corresponding to the virtual space (for example, the two-dimensional range data) or data relating to positions in a pixel plane corresponding to the virtual space (for example, the data of the dark mask).
The above-described "at least a part of the terrain object" is intended to mean a method in which it is not necessary to change the rendering according to the target range for all the terrain objects. For example, a part of the terrain object may be set as the target external object described above.
In the above embodiment, the game system 1 draws in black the portion not included in the target range, but it may draw that portion in another color. Even when drawing is performed with another color, the portion can be made invisible or difficult to visually recognize, so the same effects as in the above-described embodiment can be obtained. For example, the game system 1 may draw in white or gray an area that, in the story setting of the game, is invisible or difficult to see because of fog. The "predetermined color" is a color set independently of the color set for the object corresponding to the pixel being drawn, and need not be a single color. A pattern may be formed by drawing the plurality of pixels corresponding to a portion not included in the target range in a plurality of predetermined colors.
In other embodiments, the game system 1 may draw objects not included in the target range so as to reduce their brightness. For example, the game system 1 may draw the objects of that portion with reduced brightness compared to the pixels as they would be drawn with the light sources set. Specifically, in the drawing process, for pixels corresponding to objects not included in the target range, the game system 1 may write to the frame buffer pixel values obtained by reducing the brightness of the pixel values reflecting the influence of the light generated by the light sources. The specific method of reducing the brightness is arbitrary: the original brightness (that is, the brightness when the influence of the light generated by the light sources is taken into consideration) may be reduced by a predetermined ratio, reduced by a predetermined amount, or reduced to a predetermined reference level or below. This configuration can also provide the same effects as the above-described embodiment.
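A minimal sketch of the brightness-reduction alternative, assuming RGB channels in the 0-to-1 range and an arbitrarily chosen predetermined ratio of 0.15:

```python
def reduce_brightness(lit_rgb, ratio=0.15):
    # Keep the pixel value computed with the light sources, but scale its
    # brightness down by a predetermined ratio outside the target range.
    return tuple(channel * ratio for channel in lit_rgb)
```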
The game program in the above embodiment may be a configuration in which a computer (for example, the processor 81) of an information processing apparatus (for example, the game apparatus 2) executes the following processing.
Executing a game process of controlling a player character in the virtual space based on the operation input (step S32);
a process of, when a predetermined event (e.g., a release event) occurs based on the game process, causing a point corresponding to the occurred event among a plurality of points (e.g., reference points) set in the virtual space to transition from a first state (e.g., an unreleased state) to a second state (e.g., a released state) (step S25);
A process of specifying an area (in the above embodiment, the release area) including at least a point, among the plurality of points, that is in the second state (step S27);
A rendering process (step S62) of rendering, for at least a part of the topographic object in the virtual space, a part of the at least a part of the topographic object that is not included in a target range (for example, an irradiation range) including at least a part of the region, with a predetermined color; and
A process of displaying a map image representing the site information of the virtual space in accordance with a map display instruction by the operation input (step S5), wherein the map image shows the site information of the portion corresponding to the release area.
According to the above-described configuration, the range in the virtual space in which visibility is ensured (i.e., the above-described target range) can be changed in accordance with a change in the area of the map image in which the field information is shown. That is, the virtual space can be displayed in a display manner that ensures visibility for an area whose field information is newly shown in the map image. Further, according to the above configuration, the occurrence of an event both enlarges the range in which visibility is ensured in the virtual space and enlarges the area showing the field information in the map image, so it is possible to provide a game that fully exhibits the gameplay of gradually enlarging the search range by causing events to occur.
In the above-described configuration, in other embodiments, the game system 1 may draw the portion not included in the target range so as to be darker than the portion included in the target range, instead of drawing it with a predetermined color. Specifically, in the drawing process, the game system 1 may write to the frame buffer pixel values obtained by reducing, by a predetermined method, the brightness of the pixel values reflecting the influence of the light generated by the light sources. The predetermined method may be, for example, a method of reducing the original brightness by a predetermined ratio (or by a predetermined value), or a method of changing the original brightness to a brightness equal to or lower than a predetermined reference.
In the above embodiment, the game system 1 sets as the target range a range that is (a) within the range formed by the positions at which the total determination value, obtained by summing determination values based on one or more points in the second state among the plurality of points, is equal to or greater than a predetermined value, and (b) within the range in which the two-dimensional distance from the two-dimensional position corresponding to a point is equal to or less than a threshold value (i.e., a range that is both within the release area and within the point influence range). Accordingly, the range in which visibility is ensured in the virtual space can be prevented from becoming excessively large, which reduces the possibility of losing the gameplay of expanding the search range by causing events to occur.
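Expressed as a predicate, the target range is the intersection of the two conditions (a) and (b); the following sketch restates both under the same assumed linear-decay parameters used in the earlier sketches.

```python
import math

def in_target_range(pos2d, released_points, first_reference_value,
                    falloff_per_unit, predetermined_value, threshold_2d):
    # (a) release area: the summed determination values meet the
    # predetermined value at this position.
    total = sum(max(0.0, first_reference_value
                    - falloff_per_unit * math.dist(pos2d, p))
                for p in released_points)
    # (b) point influence range: the two-dimensional distance to some
    # released point is within the threshold.
    near = any(math.dist(pos2d, p) <= threshold_2d for p in released_points)
    return total >= predetermined_value and near
```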
In the above-described embodiment, in the case where processing is performed using data (a term that here includes programs) in a certain information processing apparatus, part of the data necessary for the processing may be transmitted from another information processing apparatus. In this case, the first information processing apparatus may execute the above processing using the data received from the other information processing apparatus together with the data stored in the first information processing apparatus itself.
In other embodiments, the information processing system may not include a part of the configuration in the above-described embodiments, or may not perform a part of the processing performed in the above-described embodiments. For example, in order to achieve some of the specific effects in the above-described embodiments, the information processing system may have a configuration for achieving the effect and may perform the process for achieving the effect, and may not have another configuration or may not perform other processes.
The above-described embodiments are intended to change the region ensuring visibility in the virtual space according to a change in the region showing the field information in the map image, and can be used as, for example, a game system or a game program.

Claims (28)

1. A storage medium storing a game program,
The game program causes a computer of an information processing apparatus to perform:
executing a game process of controlling a player character in the virtual space based on the operation input;
when a predetermined event occurs based on the game process, changing a point corresponding to the event that has occurred, among a plurality of points set in the virtual space, from a first state to a second state;
Determining an area including at least a place of the plurality of places that becomes the second state;
Performing a rendering process in which, for at least a part of the terrain object in the virtual space, a portion of the at least a part of the terrain object that is not included in a target range including at least a part of the region is rendered with a predetermined color or is rendered darker than a portion of the at least a part of the terrain object that is included in the target range; and
And displaying a map image representing the site information of the virtual space, wherein the map image shows the site information of a portion corresponding to the region.
2. The storage medium of claim 1, wherein,
the game program causes the computer to perform: determining, as the region, a region in which a total determination value, obtained by adding up by location the determination values based on the one or more points among the plurality of points that have become the second state, is equal to or greater than a predetermined value, wherein each determination value equals a reference value at the location corresponding to the point and decays according to the distance from the location.
3. The storage medium according to claim 1 or 2, wherein,
the game program causes the computer to perform: setting, as the target range, a range that is both (a) within a range in which a total determination value, obtained by adding up by position the determination values based on the one or more points in the second state among the plurality of points, is equal to or greater than a predetermined value, and (b) within a range in which the two-dimensional distance from the two-dimensional position corresponding to a point is equal to or less than a threshold value, wherein each determination value equals a reference value at the two-dimensional position corresponding to the point and decays according to the distance from the two-dimensional position.
4. The storage medium according to any one of claims 1 to 3, wherein,
the game program causes the computer to perform:
generating two-dimensional range data that represents, for each two-dimensional coordinate of the virtual space corresponding to the coordinate components other than the height direction, a degree value indicating the degree of darkening, or the degree of drawing with the predetermined color, applied in the rendering process;
calculating, for each coordinate of the two-dimensional range data, the degree value based on a total determination value obtained by adding up, by position, the determination values based on the one or more points among the plurality of points that have become the second state, wherein each determination value equals a reference value at the two-dimensional position corresponding to the point and decays according to the two-dimensional distance from that two-dimensional position to the coordinate; and
in the rendering process, writing to the frame buffer, for each pixel drawn to the frame buffer, either a pixel value obtained by reducing, in accordance with the degree value indicated by the two-dimensional range data at the two-dimensional coordinates corresponding to the pixel, the brightness of the pixel value calculated so as to reflect the light source set in the virtual space, or a pixel value obtained by compositing the predetermined color in accordance with that degree value.
5. The storage medium according to any one of claims 1 to 4, wherein,
the event is an event that occurs when a predetermined operation input is performed while the player character is located at an event occurrence position set in association with the point in the virtual space,
in the virtual space, a predetermined object is arranged at each of the positions corresponding to the plurality of points, and
the game program further causes the computer to perform: rendering the predetermined object so that it is displayed distinguishably from the portion of the at least a part of the terrain object that is not included in the target range, regardless of whether or not the predetermined object is included in the target range.
6. The storage medium according to any one of claims 1 to 5, wherein,
in the virtual space, at least a light source having a predetermined brightness is set irrespective of position in the virtual space, and
the game program causes the computer to perform: in the rendering process, drawing the portion of the at least a part of the terrain object that is included in the target range so as to reflect the light source.
7. The storage medium of claim 6, wherein,
the game program causes the computer to perform:
setting a point light source in the virtual space in accordance with the occurrence of the predetermined event; and
in the rendering process, drawing the portion of the at least a part of the terrain object that is included in the target range so as to reflect the point light source.
8. An information processing system comprising at least one information processing apparatus having a processor, wherein
the processor of at least one of the at least one information processing apparatus performs the following processing:
executing a game process of controlling a player character in a virtual space based on an operation input;
when a predetermined event occurs based on the game process, changing, from a first state to a second state, a point that corresponds to the event that has occurred, among a plurality of points set in the virtual space;
determining a region including at least the one or more points among the plurality of points that have become the second state;
performing a rendering process in which, for at least a part of a terrain object in the virtual space, a portion of the at least a part of the terrain object that is not included in a target range including at least a part of the region is rendered with a predetermined color, or is rendered darker than a portion of the at least a part of the terrain object that is included in the target range; and
displaying a map image representing site information of the virtual space, wherein the map image shows the site information of a portion corresponding to the region.
9. The information handling system of claim 8, wherein,
the processor of the at least one information processing apparatus performs the following processing: determining, as the region, a region in which a total determination value, obtained by adding up by location the determination values based on the one or more points among the plurality of points that have become the second state, is equal to or greater than a predetermined value, wherein each determination value equals a reference value at the location corresponding to the point and decays according to the distance from the location.
10. The information processing system according to claim 8 or 9, wherein,
the processor of the at least one information processing apparatus performs the following processing: setting, as the target range, a range that is both (a) within a range in which a total determination value, obtained by adding up by position the determination values based on the one or more points in the second state among the plurality of points, is equal to or greater than a predetermined value, and (b) within a range in which the two-dimensional distance from the two-dimensional position corresponding to a point is equal to or less than a threshold value, wherein each determination value equals a reference value at the two-dimensional position corresponding to the point and decays according to the distance from the two-dimensional position.
11. The information processing system according to any one of claims 8 to 10, wherein,
the processor of the at least one information processing apparatus performs the following processing:
generating two-dimensional range data that represents, for each two-dimensional coordinate of the virtual space corresponding to the coordinate components other than the height direction, a degree value indicating the degree of darkening, or the degree of drawing with the predetermined color, applied in the rendering process;
calculating, for each coordinate of the two-dimensional range data, the degree value based on a total determination value obtained by adding up, by position, the determination values based on the one or more points among the plurality of points that have become the second state, wherein each determination value equals a reference value at the two-dimensional position corresponding to the point and decays according to the two-dimensional distance from that two-dimensional position to the coordinate; and
in the rendering process, writing to the frame buffer, for each pixel drawn to the frame buffer, either a pixel value obtained by reducing the brightness in accordance with the degree value indicated by the two-dimensional range data at the two-dimensional coordinates corresponding to the pixel, or a pixel value obtained by compositing the predetermined color in accordance with that degree value.
12. The information processing system according to any one of claims 8 to 11, wherein,
the event is an event that occurs when a predetermined operation input is performed while the player character is located at an event occurrence position set in association with the point in the virtual space,
in the virtual space, a predetermined object is arranged at each of the positions corresponding to the plurality of points, and
the processor of the at least one information processing apparatus further performs the following processing: rendering the predetermined object so that it is displayed distinguishably from the portion of the at least a part of the terrain object that is not included in the target range, regardless of whether or not the predetermined object is included in the target range.
13. The information processing system according to any one of claims 8 to 12, wherein,
in the virtual space, at least a light source having a predetermined brightness is set irrespective of position in the virtual space, and
the processor of the at least one information processing apparatus performs the following processing: in the rendering process, drawing the portion of the at least a part of the terrain object that is included in the target range so as to reflect the light source.
14. The information handling system of claim 13, wherein,
the processor of the at least one information processing apparatus performs the following processing:
setting a point light source in the virtual space in accordance with the occurrence of the predetermined event; and
in the rendering process, drawing the portion of the at least a part of the terrain object that is included in the target range so as to reflect the point light source.
15. An information processing apparatus comprising a processor, wherein
the processor performs the following processing:
executing a game process of controlling a player character in a virtual space based on an operation input;
when a predetermined event occurs based on the game process, changing, from a first state to a second state, a point that corresponds to the event that has occurred, among a plurality of points set in the virtual space;
determining a region including at least the one or more points among the plurality of points that have become the second state;
performing a rendering process in which, for at least a part of a terrain object in the virtual space, a portion of the at least a part of the terrain object that is not included in a target range including at least a part of the region is rendered with a predetermined color, or is rendered darker than a portion of the at least a part of the terrain object that is included in the target range; and
displaying a map image representing site information of the virtual space, wherein the map image shows the site information of a portion corresponding to the region.
16. The information processing apparatus according to claim 15, wherein,
the processor performs the following processing: determining, as the region, a region in which a total determination value, obtained by adding up by location the determination values based on the one or more points among the plurality of points that have become the second state, is equal to or greater than a predetermined value, wherein each determination value equals a reference value at the location corresponding to the point and decays according to the distance from the location.
17. The information processing apparatus according to claim 15 or 16, wherein,
the processor performs the following processing: setting, as the target range, a range that is both (a) within a range in which a total determination value, obtained by adding up by position the determination values based on the one or more points in the second state among the plurality of points, is equal to or greater than a predetermined value, and (b) within a range in which the two-dimensional distance from the two-dimensional position corresponding to a point is equal to or less than a threshold value, wherein each determination value equals a reference value at the two-dimensional position corresponding to the point and decays according to the distance from the two-dimensional position.
18. The information processing apparatus according to any one of claims 15 to 17, wherein,
the processor performs the following processing:
generating two-dimensional range data that represents, for each two-dimensional coordinate of the virtual space corresponding to the coordinate components other than the height direction, a degree value indicating the degree of darkening, or the degree of drawing with the predetermined color, applied in the rendering process;
calculating, for each coordinate of the two-dimensional range data, the degree value based on a total determination value obtained by adding up, by position, the determination values based on the one or more points among the plurality of points that have become the second state, wherein each determination value equals a reference value at the two-dimensional position corresponding to the point and decays according to the two-dimensional distance from that two-dimensional position to the coordinate; and
in the rendering process, writing to the frame buffer, for each pixel drawn to the frame buffer, either a pixel value obtained by reducing the brightness in accordance with the degree value indicated by the two-dimensional range data at the two-dimensional coordinates corresponding to the pixel, or a pixel value obtained by compositing the predetermined color in accordance with that degree value.
19. The information processing apparatus according to any one of claims 15 to 18, wherein,
the event is an event that occurs when a predetermined operation input is performed while the player character is located at an event occurrence position set in association with the point in the virtual space,
in the virtual space, a predetermined object is arranged at each of the positions corresponding to the plurality of points, and
the processor further performs the following processing: rendering the predetermined object so that it is displayed distinguishably from the portion of the at least a part of the terrain object that is not included in the target range, regardless of whether or not the predetermined object is included in the target range.
20. The information processing apparatus according to any one of claims 15 to 19, wherein,
in the virtual space, at least a light source having a predetermined brightness is set irrespective of position in the virtual space, and
the processor performs the following processing: in the rendering process, drawing the portion of the at least a part of the terrain object that is included in the target range so as to reflect the light source.
21. The information processing apparatus according to claim 20, wherein,
the processor performs the following processing:
setting a point light source in the virtual space in accordance with the occurrence of the predetermined event; and
in the rendering process, drawing the portion of the at least a part of the terrain object that is included in the target range so as to reflect the point light source.
22. A game processing method executed by an information processing system, wherein
the information processing system performs the following processing:
executing a game process of controlling a player character in a virtual space based on an operation input;
when a predetermined event occurs based on the game process, changing, from a first state to a second state, a point that corresponds to the event that has occurred, among a plurality of points set in the virtual space;
determining a region including at least the one or more points among the plurality of points that have become the second state;
performing a rendering process in which, for at least a part of a terrain object in the virtual space, a portion of the at least a part of the terrain object that is not included in a target range including at least a part of the region is rendered with a predetermined color, or is rendered darker than a portion of the at least a part of the terrain object that is included in the target range; and
displaying a map image representing site information of the virtual space, wherein the map image shows the site information of a portion corresponding to the region.
23. The game processing method according to claim 22, wherein,
the information processing system performs the following processing: determining, as the region, a region in which a total determination value, obtained by adding up by location the determination values based on the one or more points among the plurality of points that have become the second state, is equal to or greater than a predetermined value, wherein each determination value equals a reference value at the location corresponding to the point and decays according to the distance from the location.
24. The game processing method according to claim 22 or 23, wherein,
the information processing system performs the following processing: setting, as the target range, a range that is both (a) within a range in which a total determination value, obtained by adding up by position the determination values based on the one or more points in the second state among the plurality of points, is equal to or greater than a predetermined value, and (b) within a range in which the two-dimensional distance from the two-dimensional position corresponding to a point is equal to or less than a threshold value, wherein each determination value equals a reference value at the two-dimensional position corresponding to the point and decays according to the distance from the two-dimensional position.
25. The game processing method according to any one of claims 22 to 24, wherein,
the information processing system performs the following processing:
generating two-dimensional range data that represents, for each two-dimensional coordinate of the virtual space corresponding to the coordinate components other than the height direction, a degree value indicating the degree of darkening, or the degree of drawing with the predetermined color, applied in the rendering process;
calculating, for each coordinate of the two-dimensional range data, the degree value based on a total determination value obtained by adding up, by position, the determination values based on the one or more points among the plurality of points that have become the second state, wherein each determination value equals a reference value at the two-dimensional position corresponding to the point and decays according to the two-dimensional distance from that two-dimensional position to the coordinate; and
in the rendering process, writing to the frame buffer, for each pixel drawn to the frame buffer, either a pixel value obtained by reducing the brightness in accordance with the degree value indicated by the two-dimensional range data at the two-dimensional coordinates corresponding to the pixel, or a pixel value obtained by compositing the predetermined color in accordance with that degree value.
26. The game processing method according to any one of claims 22 to 25, wherein,
the event is an event that occurs when a predetermined operation input is performed while the player character is located at an event occurrence position set in association with the point in the virtual space,
in the virtual space, a predetermined object is arranged at each of the positions corresponding to the plurality of points, and
the information processing system further performs the following processing: rendering the predetermined object so that it is displayed distinguishably from the portion of the at least a part of the terrain object that is not included in the target range, regardless of whether or not the predetermined object is included in the target range.
27. The game processing method according to any one of claims 22 to 26, wherein,
in the virtual space, at least a light source having a predetermined brightness is set irrespective of position in the virtual space, and
the information processing system performs the following processing: in the rendering process, drawing the portion of the at least a part of the terrain object that is included in the target range so as to reflect the light source.
28. The game processing method according to claim 27, wherein,
the information processing system performs the following processing:
setting a point light source in the virtual space in accordance with the occurrence of the predetermined event; and
in the rendering process, drawing the portion of the at least a part of the terrain object that is included in the target range so as to reflect the point light source.
CN202311339694.7A 2022-10-19 2023-10-17 Storage medium, information processing system, information processing apparatus, game processing method, and computer program Pending CN117899457A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2022-167798 2022-10-18
JP2022-167796 2022-10-18
JP2022-167797 2022-10-18
JP2022167798A JP2023098606A (en) 2022-10-19 2022-10-19 Game program, information processing system, information processing device, and game processing method

Publications (1)

Publication Number Publication Date
CN117899457A 2024-04-19

Family

ID=87071948

Family Applications (3)

Application Number Title Priority Date Filing Date
CN202311339696.6A Pending CN117899458A (en) 2022-10-19 2023-10-17 Storage medium, information processing system, information processing apparatus, game processing method, and computer program
CN202311339694.7A Pending CN117899457A (en) 2022-10-19 2023-10-17 Storage medium, information processing system, information processing apparatus, game processing method, and computer program
CN202311339698.5A Pending CN117899459A (en) 2022-10-19 2023-10-17 Storage medium, information processing system, information processing apparatus, game processing method, and computer program

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202311339696.6A Pending CN117899458A (en) 2022-10-19 2023-10-17 Storage medium, information processing system, information processing apparatus, game processing method, and computer program

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202311339698.5A Pending CN117899459A (en) 2022-10-19 2023-10-17 Storage medium, information processing system, information processing apparatus, game processing method, and computer program

Country Status (2)

Country Link
JP (1) JP2023098606A (en)
CN (3) CN117899458A (en)

Also Published As

Publication number Publication date
CN117899459A (en) 2024-04-19
JP2023098606A (en) 2023-07-10
CN117899458A (en) 2024-04-19


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination