CN117839199A - Program and game device - Google Patents

Program and game device

Info

Publication number: CN117839199A
Application number: CN202311715984.7A
Authority: CN (China)
Language: Chinese (zh)
Inventors: 武田理志, 西村悠纪, 饭田一希
Current Assignee: Bandai Co Ltd
Original Assignee: Bandai Co Ltd
Application filed by: Bandai Co Ltd
Legal status: Pending
Priority date: 2023-01-31
Filing date: 2023-12-14
Publication date: 2024-04-09
Prior art keywords: game, display, sensing region, control unit, image

Abstract

The invention provides a program and a game device, in particular a game whose enjoyment is not affected by the player's age or body size. A program for causing a computer to execute a game causes the computer to function as: a display control unit that causes a game image of the game to be displayed on a display unit; and a sensing region control unit that sets, on the display unit, a sensing region that senses operations by the user and a non-sensing region that does not sense operations by the user. The display control unit causes an operation image to be displayed only in the sensing region.

Description

Program and game device
Technical Field
The present invention relates to a program and a game device.
Background
A large number of games are operated using the touch panel of a mobile terminal, and such games are widely popular (see, for example, Patent Document 1).
Prior art literature
Patent literature
Patent Document 1: Japanese Patent Application Laid-Open No. 2019-170802
Disclosure of Invention
Problems to be solved by the invention
Devices that execute a game through touch-panel operation are also used in arcade games having a large-screen touch panel, and for such large-screen touch panels it is desirable to provide a game whose enjoyment is not affected by the player's age or body size.
Accordingly, the present invention provides a program and a device that prevent the enjoyment of a game from being affected by age or body size.
Solution for solving the problem
One embodiment of the present invention is a program for causing a computer to execute a game, the program causing the computer to function as: a display control unit that causes a game image of the game to be displayed on a display unit; and a sensing region control unit that sets, on the display unit, a sensing region that senses an operation by the user and a non-sensing region that does not sense the operation by the user, wherein the display control unit causes an operation image to be displayed only in the sensing region.
One embodiment of the present invention is a game device including: a display control unit that causes a display unit to display a game image of a game; and a sensing region control unit that sets, on the display unit, a sensing region that senses an operation by a user and a non-sensing region that does not sense the operation by the user, wherein the display control unit causes an operation image to be displayed only in the sensing region.
Advantageous Effects of Invention
According to the present invention, a program and a device can be provided that allow operation with consideration for age and body size.
Drawings
Fig. 1 is an external view of a game device according to the present embodiment.
Fig. 2 is a block diagram of the game device according to the present embodiment.
Fig. 3 is a diagram showing an example of a card.
Fig. 4 shows an example of card information.
Fig. 5 shows an example of character information.
Fig. 6 is a diagram showing an example of a game image.
Fig. 7 is a diagram showing an example of a game image.
Fig. 8 is a diagram illustrating an example of a sensing region and a non-sensing region.
Fig. 9 is a flowchart for explaining the operation of the present embodiment.
Fig. 10 is a diagram showing an example of a game image.
Fig. 11 is a diagram showing an example of a game image.
Detailed Description
Structure of game device 100
The game device 100 is a device that starts executing a game upon payment of a fee. Fig. 1 illustrates the external appearance of the game device according to the present embodiment. More specifically, Fig. 1(a), Fig. 1(b), and Fig. 1(c) show a front perspective view, a side view, and a rear perspective view, respectively, of the game device 100.
As shown in the figure, the game device 100 is configured such that two display devices, a first display unit 131 and a second display unit 132, are mounted on a support 130 provided on the top plate of a main body 160. In the example of Fig. 1, the main body 160, the support 130, the first display unit 131, and the second display unit 132 are configured to be separable from each other. Further, instructions concerning the operation of the game device 100 and a holder 150 for placing articles held by the user are provided on the top plate of the main body 160.
In Fig. 1, the direction indicated by an arrow 170 is the vertical direction (specifically, vertically upward), and the attachment portion of the support 130 is shaped such that the lower end of the first display unit 131 is in contact with the upper end of the second display unit 132.
In Fig. 1, the direction indicated by an arrow 180 is the front direction of the game device 100, and the user is assumed to be positioned in the front direction; information presentation via the first display unit 131 and the second display unit 132 is performed while the game play experience is provided. In other words, the game screens displayed on the first display unit 131 and the second display unit 132 are configured to be viewed optimally from the front direction.
Functional structure of game device 100
Next, the functional configuration of the game device 100 according to the present embodiment will be described with reference to the block diagram of fig. 2.
The display unit 101 displays various information, such as game images and menu images, to the user. In the game device 100 of the present embodiment, the display unit 101 is constituted by the first display unit 131 and the second display unit 132 described above. The first display unit 131 and the second display unit 132 are display devices such as liquid crystal displays, and are attached to the support 130 as described above so as to be fixed above the main body 160, arranged one above the other in the vertical direction. In the game device 100 of the present embodiment, screen display related to the service is performed using the two display screens (display areas) of the first display unit 131 and the second display unit 132; the first display unit 131 is fixed to the support 130 so as to be tilted forward with respect to the front direction of the game device 100, and the second display unit 132 is fixed to the support 130 so as to be tilted backward with respect to the front direction, so that the user can appropriately view the screen display.
The storage unit 102 is a recording device capable of holding data, such as a nonvolatile memory or an HDD. The storage unit 102 stores the operation programs of the respective blocks included in the game device 100, as well as parameters required for the operation of each block, various graphics data used in the game executed by the game device 100, and the like. The storage unit 102 also includes memory for temporary data storage, such as a volatile memory, which is used not only as an expansion area for the operation programs of the blocks but also as a storage area for temporarily holding data output during the operation of each block.
The acquisition unit 103 is an interface for acquiring information from an article held by the user. Specifically, the acquisition unit 103 is configured to acquire the information attached to an article via an acquisition device (reader). The game device 100 of the present embodiment includes a reader so that it can acquire information from an article associated with a game element (character) that appears in the game provided by the game device 100. In the following description, the card reader 120 provided in the game device 100 is used as the acquisition unit 103 and a card 300 is used as the article, but other types of articles, such as game pieces or figures, may be used as long as information can be acquired from them.
As illustrated in Fig. 3, the card 300 is described as a card whose main surface is rectangular, but the shape of the card is not limited to this. Fig. 3(a) shows the front side of the card 300, and Fig. 3(b) shows the back side (the side opposite the front side). The "rectangular shape" described in the present embodiment means that the outer shape of the card 300 viewed in the direction orthogonal to the main surface is substantially rectangular, and includes cases where the card is not strictly rectangular because of chamfering applied to the corners and edges.
As shown in the figure, the card 300 has a normal (upright) orientation in which the rectangular main surface is portrait-oriented (the vertical sides are longer than the horizontal sides), and an image showing the appearance of the character associated with the card 300 is formed on the front surface of the card 300 so that it is viewed correctly when the card 300 is in the normal orientation. As shown in Fig. 3(a), an image of the character associated with the card 300 and the name of the character are printed on the front surface of the card 300. On the other hand, as shown in Fig. 3(b), a two-dimensional code encoding the card information 800 is attached to the lower part of the back surface when the card is in the normal orientation. In the example of Fig. 3, a QR code (Quick Response code: registered trademark) is used as an example of the two-dimensional code.
As described above, the card is associated with a character that appears in the game, and the card information 800 is attached to the card. As shown in Fig. 4, for example, the card information 800 may include a card ID 801 for identifying the card and a character ID 802 for uniquely identifying the character associated with the card. As illustrated in Fig. 5, the character information 810, which is managed per character, includes drawing information 812 and parameter information 813 associated with the character ID 802 that uniquely identifies the character. The drawing information 812 includes graphics data and the like used for generating images when the character, its associated items, and the like appear in the game, and the parameter information 813 describes the name and attributes of the character as well as the effects activated by using the character. The parameter information 813 may also include support information referred to in the game when the character appears as a support character, and progress control of the provided game may be performed using this information. The card information 800 and the character information 810 are stored in the storage unit 102.
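To make the relationship between the card information 800 and the character information 810 concrete, the following is a minimal data-model sketch in Python. The field names, types, and the in-memory character database are assumptions for illustration only; the patent specifies just that the card carries a card ID 801 and a character ID 802, and that drawing information 812 and parameter information 813 are managed per character ID.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class CardInfo:
    """Information encoded in the two-dimensional code on the back of a card (card information 800)."""
    card_id: str       # card ID 801: identifies the individual card
    character_id: str  # character ID 802: identifies the associated character

@dataclass
class CharacterInfo:
    """Per-character record held in the storage unit 102 (character information 810)."""
    character_id: str                # character ID 802 (lookup key)
    drawing_info: Dict[str, str]     # drawing information 812: graphics data used to render the character
    parameters: Dict[str, object]    # parameter information 813: name, attributes, effects, support info, etc.

# Hypothetical character database keyed by character ID, standing in for the storage unit 102.
CHARACTER_DB: Dict[str, CharacterInfo] = {}

def read_character_info(card: CardInfo) -> CharacterInfo:
    """Look up character information 810 from the character ID carried in the card information 800."""
    return CHARACTER_DB[card.character_id]
```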
The two-dimensional code attached to the card 300 is read by the acquisition unit 103, that is, the card reader 120. The card reader 120 is disposed on the top plate of the game device 100 of Fig. 1, at the holder 150. The card reader 120 receives the card 300 placed on it, reads the two-dimensional code, acquires the card information attached to the card 300, and outputs the card information to the game control unit 113. While controlled to a state in which information can be acquired from the card 300, the card reader 120 detects the card 300 placed on it and acquires the card information from the card 300 by applying predetermined image processing to the captured image.
Further, the card 300 is provided to the user by being discharged from a card dispenser provided in the game device 100 to a take-out port 141 on condition that payment is detected. The card 300 may also be provided to the user over the counter at a retail shop or through online sales via the Internet.
The operation input unit 104 is a user interface provided in the game device 100. In the game device 100 of the present embodiment, the operation input unit 104 is a touch panel formed integrally with the second display unit 132. However, this does not exclude the case where the first display unit 131 is also provided with a touch panel. The operation input unit 104 outputs sensor information in response to a touch operation performed on the display area of the second display unit 132. The sensor information is a signal that identifies the coordinates of the position touched by the user within the display area of the second display unit 132.
The sound output unit 105 is, for example, a speaker, and outputs various sounds.
The communication unit 106 is a communication interface with external devices provided in the game device 100. The communication unit 106 can be connected to an external device via a communication network such as the Internet or via a cable directly connecting the devices, whether wired or wireless, and can transmit and receive data. For example, the communication unit 106 converts information to be transmitted into data of a predetermined format and transmits the data to an external device such as a server via the network. When receiving information from an external device via the network, the communication unit 106 decodes the information and stores it in the storage unit 102. The game device 100 according to the present embodiment is configured to be able to receive, from an external device via the communication unit 106, program data in which a program for game-related processing is packaged.
The control unit 110 is, for example, a CPU. The control unit 110 includes the sensing region control unit 111, the display control unit 112, and the game control unit 113, and controls the operation of each component included in the game device 100. For example, the control unit 110 reads the operation program of each component recorded in the storage unit 102, expands it in memory, and executes it to control that component's operation.
When the communication unit 106 receives program data together with an update request for the program, the control unit 110 can update the program for game-related processing stored in the storage unit 102 using the received program data in accordance with the update request. The update process is not limited to program data received via the communication unit 106; for example, it may be executed automatically when a recording medium storing the program data is inserted into an optical drive (not shown), or executed after insertion in response to a start command from an administrator.
The sensing region control unit 111 sets, in the display area of the second display unit 132 (the operation input unit 104), a sensing region 501 and a non-sensing region 502: the sensing region 501 is a region in which a touch operation by the user is sensed, and the non-sensing region 502 is a region in which a touch operation by the user is not sensed even if it is performed. Specifically, the sensing region control unit 111 sets a two-dimensional coordinate system whose origin is a predetermined position in the display area of the second display unit 132, and specifies, in this coordinate system, a range of coordinates defining the sensing region 501 and a range of coordinates defining the non-sensing region 502. When the coordinates indicated by the sensor information from the operation input unit 104 fall within the range of coordinates defining the sensing region 501, the sensing region control unit 111 outputs those coordinates to the game control unit 113 as sensor information. When the coordinates indicated by the sensor information from the operation input unit 104 fall within the range of coordinates defining the non-sensing region 502, the sensing region control unit 111 does not output them to the game control unit 113. Alternatively, only the range of coordinates defining the sensing region 501 may be set, and the coordinates indicated by the sensor information may be output to the game control unit 113 only when they fall within that range.
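As a minimal sketch of this coordinate check, assuming a rectangular coordinate range and a simple callback standing in for the output to the game control unit 113 (neither of which is prescribed by the patent), the sensing region control unit can be modeled as a filter that forwards touch coordinates only when they fall within the range defining the sensing region 501:

```python
from dataclasses import dataclass
from typing import Callable, Optional, Tuple

Point = Tuple[int, int]  # (x, y) in the display-area coordinate system of the second display unit 132

@dataclass
class RectRegion:
    """Axis-aligned coordinate range defining the sensing region 501."""
    x_min: int
    y_min: int
    x_max: int
    y_max: int

    def contains(self, p: Point) -> bool:
        x, y = p
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

class SensingRegionControl:
    """Forwards sensor information only for touches inside the sensing region."""

    def __init__(self, sensing_region: RectRegion,
                 on_sensed: Callable[[Point], None]) -> None:
        self.sensing_region = sensing_region
        self.on_sensed = on_sensed  # stands in for output to the game control unit 113

    def handle_sensor_info(self, touch: Point) -> Optional[Point]:
        # Coordinates inside the sensing region are forwarded; anything else
        # (the non-sensing region 502) is silently dropped.
        if self.sensing_region.contains(touch):
            self.on_sensed(touch)
            return touch
        return None
```

With this structure, only the sensing region needs to be defined explicitly; any coordinates outside it behave as the non-sensing region 502, which corresponds to the simplified variant described at the end of the paragraph above.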
The size and shape of the sensing region 501 and the non-sensing region 502 may be changed according to the progress of the game and the content of the game, or may be kept fixed at all times, taking into consideration the physical characteristics of the age group targeted by the game. For example, in the case of a game featuring characters popular with users in a young age group, it is preferable to set the sensing region and the non-sensing region appropriately according to the average body size of that age group.
Here, several examples of setting the sensing region and the non-sensing region will be described with reference to Fig. 8. Fig. 8(a) shows an example in which a rectangular sensing region 501 is set in the center of the display area 504 of the second display unit 132 and its surroundings are set as the non-sensing region 502, that is, the inner periphery of the display area is the non-sensing region 502. Fig. 8(b) shows an example in which an elliptical sensing region 501 is set in the center of the display screen and the non-sensing region 502 is set around it. Fig. 8(c) shows an example in which the four corners of the rectangular screen are set as the non-sensing region 502 and the area inside them is the sensing region 501. As for the ratio of the sensing region 501 to the non-sensing region 502 in the display area 504, either region may be made larger; however, as shown in Fig. 8, when the sensing region 501 is larger, a larger area can accommodate the various operations of the game, so it is preferable to make the sensing region 501 larger. In addition, when the user is a child, the hand opposite the operating hand is often placed on the display area to support the body. By making the left and right sides of the display area 504 wide non-sensing regions 502, as in Fig. 8(a) and Fig. 8(b), even if a hand unrelated to the operation is placed on the display area 504, its contact is not sensed and therefore does not affect the game.
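The three layouts of Fig. 8 can be expressed as alternative hit tests over the same display area. The following sketch uses placeholder dimensions and margins that are not taken from the patent; it only illustrates how a centered rectangle, a centered ellipse, and a corner-excluded layout differ as predicates on a touch coordinate.

```python
from typing import Tuple

Point = Tuple[float, float]

# Assumed display-area size for illustration only (width x height in pixels).
W, H = 1920.0, 1080.0

def in_centered_rect(p: Point, margin_x: float = 300.0, margin_y: float = 100.0) -> bool:
    """Fig. 8(a): rectangular sensing region in the center; the inner periphery is non-sensing."""
    x, y = p
    return margin_x <= x <= W - margin_x and margin_y <= y <= H - margin_y

def in_centered_ellipse(p: Point) -> bool:
    """Fig. 8(b): elliptical sensing region centered on the display area."""
    x, y = p
    cx, cy = W / 2.0, H / 2.0
    rx, ry = W / 2.0 - 300.0, H / 2.0 - 100.0
    return ((x - cx) / rx) ** 2 + ((y - cy) / ry) ** 2 <= 1.0

def in_corner_excluded(p: Point, corner: float = 250.0) -> bool:
    """Fig. 8(c): the four corners are non-sensing; the rest of the display area senses touches."""
    x, y = p
    near_left_or_right = x < corner or x > W - corner
    near_top_or_bottom = y < corner or y > H - corner
    return not (near_left_or_right and near_top_or_bottom)
```

In all three layouts the sensing region occupies most of the display area, in line with the preference stated above for a larger sensing region, while the wide left and right margins of Fig. 8(a) and Fig. 8(b) absorb contact from an idle hand resting on the screen.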
The display control unit 112 includes a drawing device such as a GPU, for example, and causes the game result output from the game control unit 113 to be displayed as a game image on the first display unit 131 and the second display unit 132. Specifically, during execution of the game, the display control unit 112 performs drawing processing on the required drawing objects (graphics data) based on the game result output from the game control unit 113, and displays images such as game images and menu images on the first display unit 131 and the second display unit 132. As described above, the second display unit 132 is a touch panel and also serves as the operation input unit 104. Therefore, the display control unit 112 needs to display an operation image, that is, an image showing the area in which the user is to perform a touch operation. The shape of the operation image is not limited; it may indicate not only the region in which the touch operation is to be performed but also the type of touch operation, such as a tap, slide, or flick.
The display control unit 112 outputs, as the game image displayed on the second display unit 132, a game image in which the operation image is superimposed on a character image generated using the character information 810. Figs. 6 and 7 show examples of game images in which the operation image 503 is superimposed on the character image 500. Fig. 6 is a diagram showing an example of an operation image indicating the area in which a touch operation is to be performed within the game image. Fig. 7 is a diagram showing an example of an operation image indicating that the touch operation is a slide, the direction of the slide, and the area in which the slide is to be performed.
When causing the second display unit 132 to display the game image, the display control unit 112 generates the game image such that the operation image 503 is displayed within the sensing region 501 set by the sensing region control unit 111. As shown in Figs. 6 and 7, at a minimum the operation image 503 is displayed within the sensing region 501, but it is preferable to change the size, shape, and display position of the operation image according to the movement of the character or the like, because this increases the interest of the game. It is also preferable to change at least one of the size, shape, and display position of the operation image 503 according to the number of wins and the progress of the series of games, so that the difficulty of the game increases and the interest of the game is enhanced.
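One way to keep the operation image 503 entirely inside the sensing region 501 while still varying its size and position with game progress is to choose a size from the current round and then pick a random position whose bounds stay within the region. The rectangle type and the shrink-per-round rule below are assumptions for illustration; the patent does not prescribe a particular placement algorithm.

```python
import random
from dataclasses import dataclass

@dataclass
class RectRegion:
    x_min: int
    y_min: int
    x_max: int
    y_max: int

@dataclass
class OperationImage:
    x: int
    y: int
    width: int
    height: int

def place_operation_image(region: RectRegion, round_number: int) -> OperationImage:
    """Choose a size and position for the operation image 503 entirely inside the sensing region 501.

    The image shrinks slightly each round (an assumed rule, with a lower bound) so that later
    mini games become harder, as suggested by the passage above about changing size with progress.
    """
    base = 300
    size = max(120, base - 40 * round_number)
    x = random.randint(region.x_min, region.x_max - size)
    y = random.randint(region.y_min, region.y_max - size)
    return OperationImage(x=x, y=y, width=size, height=size)

# Example: a centered rectangular sensing region on an assumed 1920x1080 display area.
region = RectRegion(300, 100, 1620, 980)
print(place_operation_image(region, round_number=2))
```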
The display control unit 112 generates the game image so that text 505 prompting the user to perform the touch operation is superimposed near the operation image 503. The operation instruction text may be displayed in either the sensing region 501 or the non-sensing region 502, or may span both.
When payment is detected, the game control unit 113 reads the character information 810 based on the card information 800 output from the acquisition unit 103, and executes a series of games.
While the operation image 503 is displayed, the game control unit 113 judges the operation input made by the user on the display area of the operation image, based on the sensor information output from the operation input unit 104. The judgment of the operation input concerns at least one of the presence or absence of an operation, the direction (including the trajectory) of the operation, the timing of the operation, and the like. The game control unit 113 outputs an image signal indicating the result of this judgment. In the following description, for simplicity, the user's operation input on the display area of the operation image 503 is referred to as a mini game.
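The judgment of presence, direction, and timing can be sketched as follows. The notion of a timestamped touch trace and the concrete thresholds are assumptions for illustration; the patent only states that at least one of these aspects is judged.

```python
import math
from typing import List, Tuple

# A touch trace: (timestamp in seconds, x, y) samples reported while the finger is down.
Sample = Tuple[float, float, float]

def judge_swipe(trace: List[Sample],
                required_direction: Tuple[float, float] = (1.0, 0.0),
                min_distance: float = 150.0,
                max_duration: float = 1.0) -> bool:
    """Return True if the trace is a swipe of sufficient length, in roughly the required
    direction, completed within the allowed time (presence, direction, and timing)."""
    if len(trace) < 2:
        return False  # no operation: the presence check fails

    t0, x0, y0 = trace[0]
    t1, x1, y1 = trace[-1]
    dx, dy = x1 - x0, y1 - y0
    distance = math.hypot(dx, dy)
    if distance < min_distance or (t1 - t0) > max_duration:
        return False  # too short or too slow

    # Direction check: angle between the swipe vector and the required direction.
    rx, ry = required_direction
    cos_angle = (dx * rx + dy * ry) / (distance * math.hypot(rx, ry))
    return cos_angle > math.cos(math.radians(45))  # within 45 degrees of the required direction

# Example: a rightward swipe like the one suggested by the operation image of Fig. 7.
trace = [(0.00, 100, 500), (0.10, 220, 505), (0.20, 400, 510)]
print(judge_swipe(trace))  # True
```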
When the mini game is executed repeatedly according to the judgment result of the operation input, the game control unit 113, after outputting the image signal indicating the judgment result, checks whether the mini game has been executed a predetermined number of times. When the game control unit 113 confirms that the mini game execution process has been performed the predetermined number of times, the series of games ends.
Operation of game device 100
The operation of the game device 100 will be described with reference to Fig. 9. In the game device 100 according to the present embodiment, it is assumed that the sensing region and the non-sensing region are set in advance.
The acquisition unit 103 acquires card information (card ID 801 and character ID 802) from the two-dimensional code on the back surface of the card, and outputs the card information to the game control unit 113 (step S601).
The game control unit 113 reads the character information 810 (drawing information 812 and parameter information 813) based on the character ID 802 in the card information 800 output from the acquisition unit 103 (step S602), and starts the mini game execution process (step S603). The game control unit 113 outputs the game result to the display control unit 112 (step S604).
The display control unit 112 causes the first display unit 131 and the second display unit 132 to display the game image generated based on the game result (step S605). An example of the game image displayed on the second display unit 132 at this time is shown in Fig. 6 or Fig. 7.
When the operation input unit 104 senses a user operation on the display area, it outputs sensor information (step S606).
The sensing region control unit 111 determines whether the coordinates indicated by the sensor information from the operation input unit 104 fall within the range of coordinates defining the sensing region 501 (step S607). If they do, the sensing region control unit 111 outputs the sensor information to the game control unit 113 (step S607: Yes). If they do not, operation sensing is continued (step S606).
The game control unit 113 judges the operation input made by the user on the display area of the operation image 503 based on the sensor information (step S608). When the judgment result is a failure, meaning defeat (step S608: No), the judgment result shown in Fig. 10 is displayed on the second display unit 132 (step S612), and the game ends. On the other hand, when the judgment result is a success, meaning victory (step S608: Yes), the judgment result shown in Fig. 11 is displayed on the second display unit 132 (step S609).
The game control unit 113 checks whether the mini game execution process has been performed a predetermined number of times (step S610), and when it confirms that the mini game execution process has been performed the predetermined number of times, the game ends (step S611).
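The flow of steps S601 to S611 can be summarized as a single loop. The function and method names below are placeholders standing in for the acquisition unit 103, game control unit 113, display control unit 112, and sensing region control unit 111; the sketch mirrors only the order of the steps, not an actual implementation.

```python
def run_game(card_reader, game_control, display_control, sensing_control,
             rounds: int = 3) -> None:
    """Minimal sketch of the flow of Fig. 9 (steps S601 to S611), under assumed interfaces."""
    card_info = card_reader.read()                        # S601: acquire card information
    character = game_control.load_character(card_info)    # S602: read character information 810

    for _ in range(rounds):                               # repeat the mini game a predetermined number of times
        result = game_control.start_mini_game(character)  # S603/S604: start the mini game and output the result
        display_control.show(result)                      # S605: display the game image (Fig. 6 or Fig. 7)

        while True:
            touch = sensing_control.wait_for_touch()      # S606: sense an operation
            if sensing_control.in_sensing_region(touch):  # S607: is it inside the sensing region 501?
                break                                     # yes: forward it to the game control unit

        if game_control.judge(touch):                     # S608: judge the operation input
            display_control.show_victory()                # S609: display the "victory" result (Fig. 11)
        else:
            display_control.show_defeat()                 # S612: display the "defeat" result (Fig. 10)
            return                                        # the game ends on defeat

    # S610/S611: the predetermined number of mini games has been executed, so the game ends.
```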
According to the present invention, in a game using a touch panel, a sensing region and a non-sensing region are set in the display area and the operation image is displayed in the sensing region, so the game can be operated regardless of age or body size. In addition, by providing the sensing region near the center of the display area, even a child with a narrow field of view can easily recognize the operation image, which makes operation easier. Further, by providing the sensing region near the center of the display area, the user's finger can be prevented from touching the outer frame of the second display unit 132 during operation input.
In the above description, one operation image 503 is displayed, but a plurality of operation images 503 may be displayed. This makes the operation input more complex and increases the interest of the game.
In the above description, the boundary between the sensing region 501 and the non-sensing region 502 is not visible to the user, but the non-sensing region 502 may be made visually recognizable by displaying a color, a watermark, a decorative image, or the like that identifies it. In this way, the user can intuitively understand which region does not accept operation input, which makes operation easier.
In the above description, the two-dimensional code is printed visibly, but it may instead be printed with invisible ink, for example. This has the advantage of not impairing the appearance of the card. Also, instead of printing a two-dimensional code, an IC tag or RF tag may be embedded in the card, for example. Since this increases the amount of information that can be stored, the character information 810 described with reference to Fig. 5 can be written to the IC tag or RF tag.
In the above description, the operation input unit 104 outputs sensor information in response to a touch operation regardless of whether it occurs in the sensing region or the non-sensing region, and the sensing region control unit 111 determines from the sensor information which region the operation belongs to. Alternatively, the sensing region control unit 111 may set the sensing region or the non-sensing region on the operation input unit 104 (touch panel) itself, so that the operation input unit 104 outputs sensor information only in response to touch operations within the sensing region. In this case, the display control unit 112 causes the operation image to be displayed in the sensing region set on the operation input unit 104, and the game control unit 113 judges the touch operation using the sensor information output from the operation input unit 104. The same effect can be obtained with this configuration.
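The difference between the two configurations, filtering in the sensing region control unit 111 versus restricting the touch panel itself, can be illustrated with a small sketch of the second variant. The set_active_area method is a hypothetical panel API introduced only for this illustration; it is not an interface from any specific touch panel library.

```python
class TouchPanel:
    """Stand-in for the operation input unit 104. In the second configuration the active
    area is restricted so that touches outside it produce no sensor information at all."""

    def __init__(self, width: int, height: int) -> None:
        self.width, self.height = width, height
        self.active_area = (0, 0, width, height)  # whole display area by default

    def set_active_area(self, x_min: int, y_min: int, x_max: int, y_max: int) -> None:
        # Hypothetical call: the sensing region control unit 111 configures the panel directly.
        self.active_area = (x_min, y_min, x_max, y_max)

    def report_touch(self, x: int, y: int):
        x_min, y_min, x_max, y_max = self.active_area
        if x_min <= x <= x_max and y_min <= y <= y_max:
            return (x, y)   # sensor information is output only for the sensing region
        return None         # touches in the non-sensing region are never reported

# With this configuration the game control unit 113 can use the panel output as-is,
# because filtering has already happened at the panel level.
panel = TouchPanel(1920, 1080)
panel.set_active_area(300, 100, 1620, 980)
print(panel.report_touch(960, 540))   # (960, 540): inside the sensing region
print(panel.report_touch(10, 10))     # None: inside the non-sensing region
```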
In the above description, the game device 100 has been described as a configuration in which the control unit 110 reads the operation program of each component and expands it in memory for execution, but the game device 100 can also be implemented by hardware that realizes the operation of each component.
[Supplementary Note 1]
A program for causing a computer to execute a game, the program causing the computer to function as:
a display control unit that causes a display unit to display a game image of the game; and
a sensing region control unit that sets, on the display unit, a sensing region that senses an operation by a user and a non-sensing region that does not sense the operation by the user,
wherein the display control unit causes an operation image to be displayed only in the sensing region.
[Supplementary Note 2]
The program according to Supplementary Note 1, wherein
the display control unit causes a different game image to be displayed when an operation on the display area of the operation image is sensed than when an operation on a display area other than the operation image is sensed.
[Supplementary Note 3]
The program according to Supplementary Note 1 or 2, wherein
the program further causes the computer to function as a game processing unit that processes the game such that the progress of the game becomes favorable when an operation on the display area of the operation image is sensed.
[Supplementary Note 4]
The program according to any one of Supplementary Notes 1 to 3, wherein
the display control unit displays operation images of different shapes according to the progress of the game.
[Supplementary Note 5]
The program according to any one of Supplementary Notes 1 to 4, wherein
the display control unit causes a plurality of operation images to be displayed in the sensing region.
[Supplementary Note 6]
The program according to any one of Supplementary Notes 1 to 5, wherein
the display control unit changes the display position of the operation image according to the progress of the game.
[Supplementary Note 7]
The program according to any one of Supplementary Notes 1 to 6, wherein
the sensing region control unit sets the sensing region to be larger than the non-sensing region.
[Supplementary Note 8]
The program according to any one of Supplementary Notes 1 to 7, wherein
the sensing region control unit changes at least one of the shape and the size of the sensing region and the non-sensing region in accordance with at least one of the content of the game and the progress of the game.
[Supplementary Note 9]
The program according to any one of Supplementary Notes 1 to 8, wherein
the sensing region control unit sets at least a part of the vicinity of the inner periphery of the display area of the display unit as the non-sensing region.
[Supplementary Note 10]
The program according to any one of Supplementary Notes 1 to 9, wherein
the sensing region control unit sets the non-sensing region so as to surround the sensing region.
[Supplementary Note 11]
A game device comprising:
a display control unit that causes a display unit to display a game image of a game; and
a sensing region control unit that sets, on the display unit, a sensing region that senses an operation by a user and a non-sensing region that does not sense the operation by the user,
wherein the display control unit causes an operation image to be displayed only in the sensing region.
The present invention has been described above with reference to the preferred embodiments, but the present invention is not necessarily limited to the above embodiments, and can be implemented by various modifications and combinations within the scope of the technical idea.
Description of the reference numerals
100: game device; 101: display unit; 102: storage unit; 103: acquisition unit; 104: operation input unit; 105: sound output unit; 106: communication unit; 110: control unit; 111: sensing region control unit; 112: display control unit; 113: game control unit.

Claims (11)

1. A program for causing a computer to execute a game, the program causing the computer to function as:
a display control unit that causes a display unit to display a game image of the game; and
a sensing region control unit that sets, on the display unit, a sensing region that senses an operation by a user and a non-sensing region that does not sense the operation by the user,
wherein the display control unit causes an operation image to be displayed only in the sensing region.
2. The program according to claim 1, wherein
the display control unit causes a different game image to be displayed when an operation on the display area of the operation image is sensed than when an operation on a display area other than the operation image is sensed.
3. The program according to claim 1 or 2, wherein
the program further causes the computer to function as a game processing unit that processes the game such that the progress of the game becomes favorable when an operation on the display area of the operation image is sensed.
4. The program according to claim 3, wherein
the display control unit displays operation images of different shapes according to the progress of the game.
5. The program according to claim 4, wherein
the display control unit causes a plurality of operation images to be displayed in the sensing region.
6. The program according to claim 5, wherein
the display control unit changes the display position of the operation image according to the progress of the game.
7. The program according to claim 6, wherein
the sensing region control unit sets the sensing region to be larger than the non-sensing region.
8. The program according to claim 7, wherein
the sensing region control unit changes at least one of the shape and the size of the sensing region and the non-sensing region in accordance with at least one of the content of the game and the progress of the game.
9. The program according to claim 8, wherein
the sensing region control unit sets at least a part of the vicinity of the inner periphery of the display area of the display unit as the non-sensing region.
10. The program according to claim 9, wherein
the sensing region control unit sets the non-sensing region so as to surround the sensing region.
11. A game device comprising:
a display control unit that causes a display unit to display a game image of a game; and
a sensing region control unit that sets, on the display unit, a sensing region that senses an operation by a user and a non-sensing region that does not sense the operation by the user,
wherein the display control unit causes an operation image to be displayed only in the sensing region.
CN202311715984.7A (filed 2023-12-14, priority date 2023-01-31): Program and game device (CN117839199A, pending)

Applications Claiming Priority (2)

JP2023-013554, priority date 2023-01-31
JP2023013554

Publications (1)

CN117839199A, published 2024-04-09

Family

ID=90546906

Family Applications (1)

CN202311715984.7A (filed 2023-12-14, priority date 2023-01-31): Program and game device, pending

Country Status (1)

CN: CN117839199A


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination