CN112891920B - Game device - Google Patents


Info

Publication number: CN112891920B
Application number: CN202110218788.3A
Authority: CN (China)
Prior art keywords: game, display surface, game device, image, user
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN112891920A
Inventor: 泽田夏子
Current assignee: Bandai Co Ltd
Original assignee: Bandai Co Ltd
Application filed by Bandai Co Ltd
Publication of application CN112891920A
Publication of grant CN112891920B

Classifications

    All classifications fall under A (Human necessities), A63 (Sports; games; amusements), A63F (Card, board, or roulette games; indoor games using small moving playing bodies; video games; games not otherwise provided for).
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types (under A63F13/00, Video games, i.e. games using an electronically generated display having two or more dimensions; A63F13/20, Input arrangements for video game devices)
    • A63F13/25: Output arrangements for video game devices (under A63F13/00, Video games)
    • A63F2300/1031: Details of the interface with the game device using a wireless connection, e.g. Bluetooth, infrared connections (under A63F2300/10, Input arrangements for converting player-generated signals into game device control signals; A63F2300/1025, Details of the interface with the game device, e.g. USB version detection)
    • A63F2300/30: Output arrangements for receiving control signals generated by the game device (under A63F2300/00, Features of games using an electronically generated display having two or more dimensions)
    • A63F2300/8082: Virtual reality (under A63F2300/80, Features specially adapted for executing a specific type of game)


Abstract

The present invention relates to a game device that can heighten interest by placing a real object so that it overlaps an image on a display surface. A game device (1) includes: a housing (10); a game execution unit that executes a game; a display unit (20) that includes a display surface (23) on which game elements of the game are displayed; an operation unit (30) that the user can operate in relation to the game; and a detection unit (50) that detects movement of the operation unit. The game execution unit can execute the game so that the display of the game elements on the display surface changes based on the detection result of the movement of the operation unit obtained by the detection unit. The operation unit is configured so that, at least while the operation unit is being operated in connection with execution of the game, a first portion (31), which is at least a part of the operation unit, appears to overlap the image of the game elements on the display surface in a first direction in which the display surface is viewed from the user's viewpoint.

Description

Game device
Technical Field
The present invention relates to a technique for a game device, a toy, and the like.
Background
Various game devices and toys have been proposed that create interest by using techniques such as augmented reality (AR). For example, in one such technique, an image is displayed on the display surface of a display device disposed in front of or behind a real object (for example, a model) so that, from the user's viewpoint, the image overlaps the real object.
As an example of such related art, Japanese Patent Application Laid-Open No. 2006-218293 (Patent Document 1) can be cited. Patent Document 1 describes the following toy. Inside a room (corresponding to a model of a house), a transparent LCD screen is arranged so as to be visible from the front of the room, and interior decorative objects such as chairs and exterior decorative objects such as trees are arranged in front of and behind the screen. The screen displays characters (corresponding to images) such as a girl. The user can enjoy playing while observing the characters and the like on the screen.
Prior Art Literature
Patent Literature
Patent Document 1: Japanese Patent Application Laid-Open No. 2006-218293
Disclosure of Invention
Problems to be solved by the invention
As in the example of Patent Document 1, game devices and toys have been proposed in which a display surface is arranged in a structure such as a model and the image on the display surface is changed in association with objects arranged in front of and behind the display surface, thereby creating interest. In such conventional art, however, for example the toy of Patent Document 1, physical objects such as the interior and exterior decorative objects cannot be operated by the user, and the game is operated with predetermined switches and buttons. The interest that comes from playing the game through the user's own operation is therefore lacking.
The present invention provides a technique for realizing a game device or the like in which the image on a display surface changes in association with a real object, and which can further heighten interest.
Solution for solving the problem
Representative embodiments of the present invention have the following configuration. A game device according to one embodiment includes: a housing; a game execution unit that executes a game; a display unit including a display surface on which game elements of the game executed by the game execution unit are displayed; an operation unit that a user can operate in relation to the game; and a detection unit that detects movement of the operation unit. The game execution unit can execute the game so that the display of the game elements on the display surface changes based on the detection result of the movement of the operation unit obtained by the detection unit. The operation unit is configured so that, at least during an operation of the operation unit in connection with execution of the game, a first portion, which is at least a part of the operation unit, appears to overlap the image of the game elements on the display surface in a first direction in which the display surface is viewed from the user's viewpoint.
Advantageous Effects of Invention
According to the exemplary embodiments of the present invention, in a technology such as a game device, interest can be further heightened by arranging a real object so that it overlaps the image on the display surface.
Drawings
Fig. 1 is a perspective view of the entire appearance of the game device according to embodiment 1 of the present invention, as viewed from the front surface side.
Fig. 2 is a schematic diagram showing an outline of the structure of the game device according to embodiment 1 when viewed from the front surface.
Fig. 3 is a schematic diagram showing an outline of a structure of the game device according to embodiment 1 when viewed from a side surface.
Fig. 4 is a schematic diagram showing an outline of the structure of the game device according to embodiment 1 when viewed from the upper side.
Fig. 5 is a perspective view showing the game device according to embodiment 1 when viewed from the back side.
Fig. 6 is a diagram showing a display example (first example, second example) in the game device according to embodiment 1.
Fig. 7 is a diagram showing a display example (third example, fourth example) in the game device according to embodiment 1.
Fig. 8 is a schematic diagram showing an outline of a structure of the game device according to embodiment 2 of the present invention when viewed from a side surface.
Description of the reference numerals
1: A game device; 10: a housing; 11: a cover; 12: a container; 20: a display unit; 21: a display device; 22: a window; 23: a display surface; 30: an operation unit; 31: a first portion; 32: a second portion; 33: a third section; 35: a bearing part; 36: a connecting part; 40: a button operation unit; 50: a detection unit; 51: a first detection unit; 52: a second detection unit; 60: a shielding part; 80: a restriction portion; 90: a battery housing part; 100: a substrate.
Detailed Description
(Embodiment 1)
A game device according to embodiment 1 of the present invention will be described with reference to fig. 1 to 7. Fig. 1 is a perspective view of the entire appearance of the game device 1 according to embodiment 1 as seen from the front surface side. Fig. 2 is a schematic diagram showing an outline of the structure in the plane (X-Y plane) corresponding to a front view of the game device 1 as seen from the user's viewpoint. Fig. 3 is a schematic diagram showing an outline of the structure in a cross section (Y-Z plane) corresponding to a view of the side surface of the game device 1. Fig. 4 is a schematic diagram showing an outline of the structure of the game device 1 in a cross section (X-Z plane) as viewed from above. Fig. 5 is a perspective view of the game device 1 as viewed from the back side. The directions X, Y, Z shown in the figures are used in the description. The X direction is the left-right direction (the horizontal direction of the display surface 23), the Y direction is the up-down direction (the vertical direction of the display surface 23), and the Z direction is the front-back direction (the depth direction), which corresponds to the first direction in which the display surface 23 and the like of the game device 1 are viewed from the user's viewpoint.
As an outline, the game device 1 of fig. 1 and the following figures is a device or toy for playing a game that simulates a drink or food, for example a tapioca pearl drink, as its subject. The housing 10 of the game device 1 is a structure that imitates this food or drink. The game device 1 displays an image related to the game on the display surface 23 on the front surface side, and a child or other user can play the game by operating the operation unit 30, which imitates a straw, with their fingers.
The game device 1 includes: a housing 10; a game execution unit that executes a game; a display unit 20 including a display surface 23 on which images of game elements of the game executed by the game execution unit are displayed; an operation unit 30 that the user can operate in relation to the game; and a detection unit 50 that detects the state, such as movement, of the operation unit 30. The game execution unit can execute the game so that the display of the game elements on the display surface 23 changes based on the detection result of the movement of the operation unit 30 obtained by the detection unit 50. The operation unit 30 is configured so that, at least while the operation unit 30 is operated in connection with execution of the game, the first portion 31, which is at least a part of the operation unit 30, appears to overlap the image of the game elements on the display surface 23 in the first direction in which the display surface 23 is viewed from the user's viewpoint.
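As a non-authoritative illustration of the control flow described above, the following Python sketch models how a controller might poll the detection unit and update the game elements shown on the display surface. All names, the three-value position encoding, and the rule that a poke only counts at the center position are assumptions made for illustration; they are not taken from the patent.

```python
from dataclasses import dataclass
import random

# Hypothetical 3-value result of the first detection unit (X direction).
LEFT, CENTER, RIGHT = "XL", "XC", "XR"

@dataclass
class DetectionResult:
    """Snapshot of the operation unit as seen by the detection unit 50."""
    x_position: str   # LEFT, CENTER, or RIGHT (first detection unit 51)
    pressed: bool     # push switch state (second detection unit 52)

def update_game(state: dict, det: DetectionResult) -> dict:
    """Change the game state (and hence the displayed game elements)."""
    if det.x_position != CENTER:
        state["stir_count"] += 1          # first action AC1: stirring
    # Assumes a single push switch located at the center position.
    if det.pressed and det.x_position == CENTER:
        state["poke_count"] += 1          # second action AC2: poking
    return state

def game_loop(read_detection, render, frames: int) -> dict:
    """Minimal loop: read detection -> update game -> redraw display surface."""
    state = {"stir_count": 0, "poke_count": 0}
    for _ in range(frames):
        state = update_game(state, read_detection())
        render(state)                     # draw game elements on display 23
    return state

if __name__ == "__main__":
    fake = lambda: DetectionResult(random.choice([LEFT, CENTER, RIGHT]),
                                   random.random() < 0.2)
    print(game_loop(fake, lambda s: None, frames=100))
```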
As operations of the operation unit 30 by the user, a first operation corresponding to a first action AC1 of mainly swinging the second portion 32 and the like in the X direction, and a second operation corresponding to a second action AC2 of pressing downward in the Y direction toward a predetermined object, can be performed. These actions and operations correspond to inputs for game play. In the present embodiment, the first action AC1 imitates stirring with a straw, and the second action AC2 imitates poking with a straw.
[1-1: Overall Structure]
As shown in fig. 1, the game device 1 according to embodiment 1 is roughly divided into a housing 10, a display unit 20, an operation unit 30, a button operation unit 40, a detection unit 50, and a shielding unit 60. Although not shown in fig. 1, the game device 1 also includes a restricting portion 80 (fig. 3 to 5), a substrate 100 (fig. 3), a battery housing portion 90 (fig. 3), a speaker, and the like. The housing 10 has two parts, a lid 11 and a container 12, whose shapes imitate those of a food or drink container. The lid 11 is fixed to the upper surface of the container 12. In this example, in the X-Z plane corresponding to the horizontal plane, the housing 10 has the shape schematically shown in fig. 4, which is a cylinder with a part of its back surface side cut away flat along the longitudinal direction (Y direction). As shown in fig. 4, the housing 10 thus has a convex curved front surface side and a flat back surface side.
As shown in fig. 1 to 4, a display unit 20 is provided on the front surface side of the housing 10. The display unit 20 includes a display device 21 and a window 22. The window 22 is provided in a part of the front surface of the housing 10, in a region that is, for example, roughly rectangular. The window 22 is a simple opening, or a structure in which a transmissive member is fixed in the opening. The window 22 may also be provided with a door or the like. The display device 21 is disposed on the rear side of the window 22. The display device 21 has a transmissive display surface 23, a display driving circuit, and the like. The display driving circuit of the display device 21 displays an image on the display surface 23 based on control from the substrate 100 (fig. 3). The display surface 23 corresponds to the area in which an image can be displayed. The transmissive display surface 23 is a display surface having the property and function that a real object such as the first portion 31 arranged on its rear side can be seen through it in the first direction as viewed from the user's viewpoint. The display surface 23 is formed with a larger area in the X-Y plane than the window 22. Even when the direction in which the user views the display surface 23 is tilted from the Z direction by a certain angle, the image on the display surface 23 can still be visually recognized. In the present embodiment, a transmissive liquid crystal display device is used as the display device 21, but the present invention is not limited to this, and any display device may be used that has enough transmissivity for the first portion 31 of the operation unit 30 to be visually recognized through the display device 21 in the first direction as viewed from the user's viewpoint.
In the Z direction, a space (fig. 3, etc.) is provided on the rear side of the display surface 23, and the first portion 31 of the operation unit 30 is movable in the space. A shielding portion 60 is disposed at the rear side of the first portion 31 to partition the space inside the housing 10. The shielding portion 60 shields the object so that the object existing at the rear side of the shielding portion 60 is not observed when viewed from the first direction. The third portion 33 of the operation unit 30, the detection unit 50, the battery housing unit 90 of fig. 3, and the like are disposed on the rear side of the shielding unit 60.
The operation unit 30 includes a first portion 31, a second portion 32, a third portion 33, a bearing portion 35, a coupling portion 36, and the like. The cover 11 has the bearing portion 35 in a part of it, and the operation unit 30 is supported by the bearing portion 35, which serves as the fulcrum for movement of the operation unit 30. At the bearing portion 35, the second portion 32 and the first portion 31 are connected so that, in outline, they form a single rod-shaped object. The second portion 32 is configured to protrude to the outside of the housing 10, specifically from the upper side of the cover 11. The second portion 32 is the portion that the user can operate by touch, for example by holding it directly with the fingers. The first portion 31 is the portion disposed inside the housing 10 so as to overlap the display surface 23 in the first direction. The third portion 33 is the portion disposed on the rear side of the shielding portion 60 so as to overlap the first portion 31 and the shielding portion 60 in the Z direction. The third portion 33 is fixed to the first portion 31 via the coupling portion 36 and moves integrally in linkage with the movement of the first portion 31. The third portion 33 is shielded by the shielding portion 60 and the restricting portion 80 so that it is not observed from the user's viewpoint.
In embodiment 1, the first action AC1 and the second action AC2 can be performed independently as operations of the operation unit 30. As shown in fig. 1, the first action AC1 includes at least a left-right swinging motion in the X direction. The second action AC2 includes at least a motion of pressing down and returning up in the Y direction.
A button operation portion 40 is provided on a lower side of the window 22 on a front surface side of the housing 10, for example. The button operation unit 40 is a unit capable of inputting basic operations related to the game device 1 by a user. The button operation unit 40 has, for example, 3 buttons 41, 42, 43 as hardware buttons as shown in fig. 2. For example, the button 41 is set to be operable for selection, the button 42 is set to be operable for determination, and the button 43 is set to be operable for cancellation. The button operation unit 40 is connected to the substrate 100 (fig. 3), and the state of the operation of each button is grasped by the controller. Further, an operation button or the like may be provided on the other surface of the housing 10.
In fig. 1, only the arrangement position of the detection unit 50 is shown schematically; its details are shown in fig. 2 to 5. The detection unit 50 is the part that detects the state of the operation unit 30, and includes a first detection unit 51 and a second detection unit 52 shown in fig. 2 and elsewhere. The first detection unit 51 detects the movement and position of the third portion 33 in the X direction corresponding to the first action AC1. The second detection unit 52 detects the movement and position of the third portion 33 in the Y direction corresponding to the second action AC2. The controller of the substrate 100 (fig. 3) interprets the state detected by the detection unit 50 as an operation input from the operation unit 30.
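To make concrete how the controller of the substrate 100 might turn the states reported by the two detection units into operation inputs, here is a small, hypothetical Python sketch that derives discrete "stir" and "poke" events from successive readings. The event rules (position change means stirring, rising edge of the switch means poking) are illustrative assumptions, not taken from the patent.

```python
def detect_events(prev, curr):
    """Derive operation-input events from two successive detection readings.

    Each reading is a tuple (x_pos, pressed), where x_pos is one of
    "XL", "XC", "XR" (first detection unit 51) and pressed is the push
    switch state (second detection unit 52).
    """
    events = []
    prev_x, prev_pressed = prev
    curr_x, curr_pressed = curr
    # First action AC1: the tip crossed to a different X position.
    if curr_x != prev_x:
        events.append(("stir", curr_x))
    # Second action AC2: rising edge of the push switch (off -> on).
    if curr_pressed and not prev_pressed:
        events.append(("poke", curr_x))
    return events

# Example: swing left, return to center, then press in.
readings = [("XC", False), ("XL", False), ("XC", False), ("XC", True)]
for prev, curr in zip(readings, readings[1:]):
    print(detect_events(prev, curr))
```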
In embodiment 1, in the operation unit 30 imitating a straw, the first portion 31, the second portion 32, and the third portion 33 are not simply given one cylindrical or columnar shape; their shapes differ from one another. Specifically, the second portion 32 is formed in a columnar or cylindrical shape, that is, a shape whose cross section in the horizontal plane is substantially a perfect circle (fig. 4 and the like), so that the second portion 32 appears to be a three-dimensional shape similar to the real object (a straw in this embodiment). In the present embodiment the operation unit 30 imitates a straw, but unlike an actual straw the second portion 32 is formed as a column and its upper tip is closed. This improves the strength of the second portion 32. The first portion 31 only needs to look like a straw in the first direction (fig. 2), and need not be limited to a column or cylinder. In this example, it has the shape of a cylinder flattened in the Z direction, that is, a shape whose cross section in the horizontal plane is elliptical (fig. 1 and 3). That is, the thicknesses in the Z direction corresponding to the first direction differ between the first portion 31 and the second portion 32 (in the present embodiment, the thickness of the first portion 31 is smaller than that of the second portion 32). By making the shape of the first portion 31 as seen by the user in the Z direction corresponding to the first direction substantially the same as that of the second portion 32, the first portion 31 and the second portion 32 appear as the solid shape of a single integrated real object when viewed from the first direction. In addition, when the thickness of the first portion 31 in the Z direction is made smaller than that of the second portion 32, the thickness of the housing 10 of the game device 1 in the Z direction is kept small as a whole. Thus, when a child holds the housing 10 of the game device 1 to operate it, the child can operate it easily even with small hands. It is therefore easy for a child to operate the second portion 32 with one hand while holding the housing 10 with the other hand and watching the display surface 23 on the front surface side. The third portion 33 is shielded by the shielding portion 60 when viewed from the first direction, so that it is not observed; it is therefore allowed to have various shapes, and in this example it has a flat plate shape (fig. 1, 3) thinner in the Z direction than the second portion 32. Thus, the thickness of each of the first portion 31 and the third portion 33 in the Z direction is smaller than that of the second portion 32, which produces the same effect as making the first portion 31 thinner in the Z direction. The first portion 31 and the third portion 33 may instead have the same shape as the second portion 32.
[1-2: Front Surface Structure]
In fig. 2, the window 22, the display surface 23, the first portion 31, and the shielding portion 60 are arranged on the front surface of the game device 1 around the center of the container 12, indicated for example by the chain line. On the transmissive display surface 23, images of game elements (fig. 6 and the like, described later) are displayed under game-related control by the controller of the substrate 100 (fig. 3). In the first direction as viewed from the user's viewpoint, the first portion 31 appears to overlap the image behind it through the display surface 23. That is, the user can operate the device while visually experiencing the sensation of stirring the inside of the food or beverage container, which is simulated as an image on the display surface 23, with the straw simulated by the first portion 31.
In fig. 2, the states of the first portion 31 and the second portion 32 indicated by solid lines in the operation portion 30 show a basic state in which they are located at a central position XC in the X direction and stand upright in the Y direction. In the basic state, the second portion 32 is set from the upper prescribed position Y1 to the lower prescribed position Y2. The first portion 31 is provided from the upper predetermined position Y3 to the lower predetermined position Y4. As shown by the dashed line, a third portion 33 is hidden behind the first portion 31. In this example, the first portion 31 and the third portion 33 are longer than the second portion 32 in the Y direction. Therefore, in the first operation AC1, the range of operation AC1a of the distal end portion 31a of the first portion 31 and the range of operation of the distal end portion 33a of the third portion 33 are larger than the range of operation of the distal end portion of the second portion 32.
In embodiment 1, as shown in the drawing, the first portion 31 and the third portion 33 of the operation unit 30 are arranged apart from each other in the Z direction with the shielding portion 60 and the like between them, and on the lower side in the Y direction the third portion 33 is longer than the first portion 31. The lower distal end portion 33a of the third portion 33 is located at a position Y7 lower than the position Y4 of the lower distal end portion 31a of the first portion 31. In this way, in the first direction as viewed from the user's viewpoint, the distal end portion 31a of the first portion 31 is designed to appear at a suitable position Y4 within the window 22 and the display surface 23. Specifically, the distal end portion 31a is disposed in the lower half area of the display surface 23. This design makes it easy for the user to see the simulated action of, for example, poking a granular object, which is a game element described later, with the tip of the straw. The distal end portion 33a of the third portion 33, which cannot be seen because of the shielding portion 60, is designed to be easily detected by the push switch of the second detection unit 52 disposed close to the bottom surface of the housing 10. In line with this, the restriction of the movement of the first portion 31 is designed mainly as restriction of the movement of the third portion 33 by the restricting portion 80 (fig. 3 to 5). In the space 301 (fig. 3) containing the first portion 31, no unnecessary object other than the first portion 31 is disposed. This further enhances the sense of a simulated food or drink created by overlapping the image on the display surface 23 with the first portion 31 of the simulated straw in the first direction.
As a modification, the tip 31a of the first portion 31 may be disposed at another position, for example, a position near the lower edge of the window 22 or the display surface 23.
The states of the first portion 31 and the second portion 32 of the operation portion 30 indicated by two-dot chain lines show the state of the left and right positions in the X direction corresponding to the first action AC1. The first action AC1, indicated by the dashed arrow, schematically shows the range of motion of the second part 32. The movement AC1a indicated by the broken-line arrow schematically shows the movement range of the first movement AC1, particularly the distal end portion 31a of the first portion 31. In response to the first action AC1 performed by the user on the second portion 32, the tip end portion 31a of the first portion 31 swings in an arc line left and right in the X direction as in the action AC1 a. For example, if the user tilts the second portion 32 to the right, the distal end 31a of the first portion 31 moves to the left. The distal end portion 33a of the third portion 33 moves toward the left side position XL in conjunction with the movement of the distal end portion 31a of the first portion 31. Such a first operation AC1 is designed to be performed within a predetermined movement range, and the first operation AC1 is also limited by a limiting portion 80 and a shielding portion 60 described later.
The first detecting unit 51 detects the movement in the X direction and the state of the position of the distal end portion 33a of the third portion 33. In this example, the first detection unit 51 detects the position of the distal end portion 33a as a schematic position of 3 values of 3 kinds, that is, a position XL on the left side, a position XR on the right side, or a vicinity of a position XC on the center. The state in the case where the detection result of the first detection section 51 is neither the left position XL nor the right position XR corresponds to the state in the vicinity of the center position XC. The near-center state is precisely a state between the left position XL and the right position XR.
The second action AC2, indicated by a dashed arrow, schematically shows the range of motion when the user presses the second part 32 in. Correspondingly, the action AC2a in the second action AC2 schematically shows the movement range of the tip end portion 31a of the first portion 31. The second operation AC2 can be performed similarly even in a state in which the first portion 31 and the like are positioned in the left-right direction by the first operation AC 1. In this example, the second detection unit 52 for detecting the second operation AC2 is configured using, for example, a push switch. The push switch is disposed so as to be close to the bottom surface of the case 10. The push switch is switched from the off state to the on state in response to contact and pressing of the tip end portion 33a of the third portion 33 against the push switch. In the case where the position of the distal end portion 31a of the first portion 31 is arranged so as to overlap with the image simulating the inside of the container in the display surface 23 when the user views from the first direction, the distal end portion 33a of the third portion 33 needs to be arranged at a position Y7 lower than the position Y4 of the distal end portion 31a of the first portion 31, and the length of the third portion 33 in the Y direction is longer than the length of the first portion 31 in the Y direction, so that the distal end portion 33a of the third portion 33 can contact the upper surface of the push switch at the time of the second operation AC2.
[1-3: Side Surface Structure]
In fig. 3, in the structure when viewed from the side surface of the casing 10, the window 22, the display surface 23 of the display device 21, the first portion 31, the shielding portion 60, the third portion 33, the battery housing portion 90, and the like are arranged in this order from front to rear in a first direction as viewed from the user's point of view, for example, at a position YC near the center in the Y direction. In the Z direction, the window 22 is disposed at a position Z1, the display surface 23 is disposed at a position Z2, the first portion 31 is disposed at a position Z3, the shielding portion 60 is disposed at a position Z4, and the third portion 33 is disposed at a position Z5. The distance between the position Z2 and the position Z3 is a distance d1. In this example, the display surface 23 is disposed at the position Z2 on the rear side with respect to the position Z1 of the window 22 in the Z direction, but the present invention is not limited to this, and the display surface 23 may be disposed at the position Z1 of the window 22.
A substrate 100 is disposed near the bottom surface of the case 10. The substrate 100 is not limited to this, and may be disposed in the vicinity of the back surface of the case 10, the back of the shielding portion 60, or the like, as long as it is disposed in an empty region in the case 10. The substrate 100 may be constituted by a plurality of substrates. A battery housing portion 90 is provided near the back surface of the case 10. The battery housing portion 90 houses a battery so as to be removable in response to opening and closing of the lid portion on the back surface of the case 10. The battery supplies power to the substrate 100 and the like. The substrate 100 is also connected to the button operation unit 40, the display device 21, and the sensors, switches, and the like of the detection unit 50. The controller of the substrate 100 grasps the state of the operation unit 30 based on the detection results of the first detection unit 51 and the second detection unit 52.
The board 100 is an electronic circuit board for controlling the game device 1, and in particular corresponds to a computer or a controller for controlling a game or the like. The board 100 includes CPU, ROM, RAM, a nonvolatile memory, an input/output interface, a communication interface, a bus, and the like, and is configured as a controller by a CPU or the like. The controller constitutes a game execution unit that controls and executes a game based on program processing or the like. The game execution unit controls the display of images of game elements and the like constituting a game on the display surface 23 of the display device 21.
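As a sketch of how the game execution unit running on the board 100 might hand game-element images to the display driver of the transmissive display device 21, the following Python fragment builds a simple frame description and passes it to a driver stand-in. The frame format, the `DisplayDriver` interface, and the sprite names are assumptions made for illustration only, not part of the patent.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class GameElement:
    """One drawable game element (e.g., the liquid background or a pearl)."""
    sprite: str
    xy: Tuple[int, int]      # position on the display surface 23

@dataclass
class Frame:
    elements: List[GameElement] = field(default_factory=list)

class DisplayDriver:
    """Stand-in for the display driving circuit controlled by the board 100."""
    def show(self, frame: Frame) -> None:
        for e in frame.elements:
            print(f"draw {e.sprite} at {e.xy}")

def compose_frame(pearls: List[Tuple[int, int]]) -> Frame:
    """Game execution unit: build the frame of game elements to display."""
    frame = Frame([GameElement("liquid_background", (0, 0))])
    frame.elements += [GameElement("pearl", xy) for xy in pearls]
    return frame

DisplayDriver().show(compose_frame([(40, 120), (70, 150)]))
```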
Within the housing 10 there is a space 301 enclosed by the window 22 and the shielding portion 60, and a substantial part of the first portion 31 is contained in this space 301. The space 301 is the space that the user perceives as the part simulating the liquid and the like inside the food or beverage container. As shown in fig. 3, the shielding portion 60 includes at least a flat plate-shaped portion 60a lying in an X-Y plane, which is disposed between the first portion 31 and the third portion 33 and which shields the third portion 33 and the detection unit 50 in the first direction. As shown in the drawing, the shielding portion 60 may further include an upper surface portion and a lower surface portion lying in X-Z planes and left and right side surface portions lying in Y-Z planes, which extend forward from the portion 60a to the vicinity of the window 22. The shielding portion 60 can thereby form the space 301 containing a large part of the first portion 31. The display device 21 is fixed, for example, to the container 12 or to the shielding portion 60. In addition, a restricting portion 61 is provided in a part of the upper surface portion of the shielding portion 60, and the first portion 31 passes through the restricting portion 61 in the up-down direction. The movement of the first portion 31 is also restricted by this restricting portion 61.
One end of the connecting portion 36 is fixed to a portion of the upper side of the first portion 31 at a position below the cover 11 and between the bearing portion 35 and the restricting portion 61, and the connecting portion 36 extends rearward in the Z direction. A part of the upper side of the third portion 33 is fixed to the other end of the connecting portion 36, and the third portion 33 extends downward in the Y direction to a position Y7.
The restricting portion 80, the first detecting portion 51, and the second detecting portion 52 (fig. 4 and 5 described later) are disposed in the vicinity of the operation range of the distal end portion 33a of the third portion 33. The first detection unit 51, in particular, an infrared sensor is disposed at the position Y6. A push switch is disposed near the position Y8 as the second detecting portion 52 connected to the substrate 100.
In the second operation AC2 in the basic state, the distal end 31a of the first portion 31 is displaced downward from the position Y4 to the position Y5 as in the operation AC2 a. The distal end portion 33a of the third portion 33 is also displaced downward from the position Y7 to the position Y8 (or the position of the upper surface of the push switch of the second detection portion 52) in conjunction with the operation AC2 b. The distal end portion 33a is pressed in so as to contact and press the upper surface of the push switch of the second detection portion 52. The push switch detects the state at the time of the push as a state of at least two values of on/off. That is, when the second operation AC2 is performed in the basic state, the push switch is turned from the off state to the on state. In more detail, the second detection unit 52 may be constituted by a pressing sensor or the like capable of detecting the degree of pressing as a plurality of values. Alternatively, the second detection unit 52 may be simply constituted by a touch sensor, and may detect the presence or absence of contact with two values. Here, it is preferable that the second action AC2 is designed in advance so that the position Y5 overlaps with the image imitating the interior of the container on the display surface 23 when the user views the container from the first direction. By designing in advance in this way, even if the first portion 31 is displaced at the time of the second action AC2, the user can visually recognize the tip of the first portion 31.
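Because the paragraph above leaves open whether the second detection unit 52 is a two-value push switch, a multi-value pressing sensor, or a simple touch sensor, the sketch below shows one hypothetical way the controller could normalize either kind of reading into a single "poke strength" value. The threshold, value range, and function names are illustrative assumptions rather than anything specified in the patent.

```python
def poke_strength(reading, analog: bool) -> float:
    """Normalize the second detection unit's output to a value in [0.0, 1.0].

    - analog=False: reading is a bool from a push/touch switch (on/off).
    - analog=True:  reading is a raw degree-of-pressing value in [0.0, 1.0].
    """
    if analog:
        # Multi-value pressing sensor: pass the degree of pressing through.
        return max(0.0, min(1.0, float(reading)))
    # Two-value push switch: full strength when on, none when off.
    return 1.0 if reading else 0.0

def is_poke(reading, analog: bool, threshold: float = 0.5) -> bool:
    """A poke (second action AC2) is registered above a fixed threshold."""
    return poke_strength(reading, analog) >= threshold

print(is_poke(True, analog=False))   # push switch pressed -> True
print(is_poke(0.3, analog=True))     # light press on a pressing sensor -> False
```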
In embodiment 1, a push switch is used as the second detecting portion 52 in order to simulate the elastic touch feeling of the granular object in the food or beverage along with the second operation AC 2. The push switch gives a predetermined tactile effect such as elasticity to the user when the upper surface thereof is pushed at the time of the second operation AC 2. That is, at this time, the user can obtain the touch feeling as if the object having elasticity was poked with the tip of the straw.
The second portion 32 protruding from the upper side of the cover 11 may be made of a hard member which is not easily deformed, such as a synthetic resin, a metal, or the like, but in embodiment 1, is made of a member which is softer than the first portion 31, such as an elastic member of rubber or the like. In fig. 3, the state of the second portion 32 constituted by the soft member is indicated by a two-dot chain line as a second portion 32b. In this way, the second portion 32 is elastically deformed in accordance with the load of the first action AC1 by the finger of the user. In the case of the former structure in which the second portion 32 is constituted by a hard member, an effect of more directly transmitting the operation of the second portion 32 to the first portion 31 or the like can be obtained.
In the latter configuration in which the second portion 32 is made of a soft member as in embodiment 1, the following actions and effects are obtained. In the case where the child as the user moves the second portion 32 drastically, this action can be received by the elastic deformation of the second portion 32b as shown in the figure, and the cushioning can be performed. This can prevent damage to the components and the like caused by the intense movement of the operation unit 30. In embodiment 1, the first operation AC1 is configured to mainly detect the movement in the X direction (fig. 4, etc.). In this case, if the latter structure is adopted, the user can move the second portion 32 in the Z direction by elastic deformation to some extent, and the detection itself is mainly in the X direction, but the sense of operation in the Z direction can be simulated. That is, the operation of stirring the pipette in the horizontal plane (X-Z plane) can be simulated to some extent.
The restriction portion 80 restricts the movement of the first portion 31 by restricting the movement of the third portion 33. Specifically, the movement of the distal end portion 33a of the third portion 33 is limited to a predetermined range in the left-right direction and the front-rear direction by the limiting portion 80, and at the same time, the movement of the distal end portion 31a of the first portion 31 is also limited to a predetermined range. Due to this restriction, the movement of the distal end portion 31a of the first portion 31 is a movement that moves on a plane (X-Y plane) substantially parallel to the display surface 23, in other words, a movement that traverses the display surface 23, as in the operation AC1 a.
The shielding portion 60 may have a function of shielding at least a part of the third portion 33 and the detection portion 50 from being observed in the first direction, and a function of restricting the movement of the operation portion 30. Conversely, the restricting portion 80 may have not only the restricting function but also the shielding function. The shielding portion 60 and the restricting portion 80 may be integrally formed. The shielding portion 60 may have, as a limiting function, the following functions: the first portion 31 is prevented from contacting other parts such as the display device 21, or is capable of being buffered to prevent damage even when contacted.
[1-4: Upper Surface Structure]
In fig. 4, the upper surface of the housing 10 has a rectangular region at the rear in the Z direction and a semi-elliptical region at the front. At least the front surface side of the housing 10 is curved so as to simulate a food or beverage container. In embodiment 1, in consideration of keeping the thickness of the housing 10 in the Z direction small and of allowing the housing 10 to be placed stably on a horizontal surface such as a desk, the front surface side of the housing 10 is curved as shown in the figure and the back surface side is flat. The shape of the housing 10 is not limited to this: the front surface side may be flat and the back surface side curved, both sides may be curved, or both sides may be flat. As for the shape of the housing 10 in the Y direction, the bottom surface side and the upper surface side are shown with the same area in fig. 1 and 3, but the area of the upper surface side may be larger than that of the bottom surface side as shown in fig. 2.
A bearing 35 is provided in a hole through which the operation unit 30 passes at a predetermined position (position XC and position Z3) near the center of the upper surface of the cover 11 of the housing 10. At the central position XC in the X direction, the second portion 32, the first portion 31, and the third portion 33 are arranged in a basic state. A left sensor 51L is disposed at a left position XL in the X direction as an infrared sensor constituting the first detecting portion 51, and a right sensor 51R is disposed at a right position XR.
The first action AC1 includes at least an action like a left-right shake in the X direction. The distal end 31a of the first portion 31 has a range of the operation AC1a performed in conjunction with the first operation AC1 of the second portion 32, and the distal end 33a of the third portion 33 has a range of the operation AC1b performed in conjunction with the first operation AC1 of the second portion 32. Further, the third portion 33 is longer than the first portion 31 in the Y direction, and thus the range of the action AC1b in the X direction is slightly larger than that of the action AC1 a.
When it is desired that the first action AC1 more accurately simulate stirring with an actual straw, the motion may be made not only in one direction such as the X direction but also in the Z direction. In embodiment 1, such motion in the Z direction is allowed to some extent. In fig. 4, the first action AC1 can also include a motion AC12 of swinging back and forth in the Z direction, and a motion AC13 of turning freely along an arc in the horizontal plane. The mechanism including the bearing portion 35 is implemented in accordance with this specification of the first action AC1. As an implementation of the first action AC1, it is also possible to accept only movement in the X direction.
However, in the case where the movement in the Z direction is allowed with respect to the operation portion 30, it is desirable that the first portion 31 is not in contact with other portions including the display device 21 and the like in order to prevent damage or the like to the surrounding members of the operation portion 30. In embodiment 1, when the first operation AC1 including the X direction and the Z direction is performed with respect to the second portion 32, the movement of the first portion 31 is restricted by restricting the movement of the third portion 33 by the restricting portion 80. This can prevent the distal end 31a of the first portion 31 from contacting other parts of the display device 21.
In this example, the restricting portion 80 includes a flat plate-shaped portion 80a having an X-Y plane disposed on the rear side of the shielding portion 60, and a flat plate-shaped portion 80b having a Y-Z plane disposed at a position on the outer side of the range of the operation AC1b of the third portion 33 (see fig. 5 for details). When the distal end portion 33a of the third portion 33 moves in the X direction to a position outside the left position XL or the right position XR in response to the first action AC1, the movement is restricted within a predetermined range by the portion 80b of the restricting portion 80. In addition, in the case where the distal end portion 33a of the third portion 33 moves to the front side in the Z direction, the movement is restricted by the portion 80a of the restricting portion 80.
In this example, the first detection unit 51 uses the left and right sensors 51L and 51R as infrared sensors, and detects the position XL of the distal end portion 33a of the third portion 33 on the left side, the position XR on the right side, or a state therebetween. The two sensors 51L, 51R are arranged between the shielding portion 60 and the portion 80a of the restricting portion 80 in the Z direction, for example (fig. 5). For example, the left sensor 51L disposed at the left position XL emits infrared rays to the rear side in the Z direction. The infrared ray passes through the hole of the restricting portion 80 and enters the space surrounded by the restricting portion 80. When the distal end portion 33a is at the left position XL, the infrared ray is blocked by the distal end portion 33a and reflected, and returns to the sensor 51L. When the infrared ray is not blocked by the distal end portion 33a, the sensor 51L detects the off state, and when the infrared ray is blocked by the distal end portion 33a, the sensor 51L detects the on state. When the detection result of the left sensor 51L is in the on state in the first operation AC1, the controller of the substrate 100 can determine that the third portion 33 is in the left position XL. That is, the controller can determine that the tip end portion 31a of the first portion 31 is substantially positioned on the left side in the X direction. The same applies to the sensor 51R on the right side. When both of the detection results of the two sensors 51L and 51R are in the off state, the controller can determine that the tip end portion 31a of the first portion 31 is at least in the vicinity of the position XC, which is substantially at the center in the X direction, as the position between the left position XL and the right position XR in the X direction. The first detection unit 51 is not limited to such a configuration, and may be implemented using other switches and sensors. The first detection unit 51 may be configured to be capable of detecting the position in the X direction more precisely, or may be configured to be capable of detecting the position in the Z direction more precisely.
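The decision logic described above can be summarized in a few lines. The following Python sketch maps the on/off states of the left and right infrared sensors 51L and 51R to the three coarse positions of the tip of the third portion 33; the function and value names are illustrative, and the handling of the "both on" case is an added assumption since that case is not discussed in the patent.

```python
def x_position(left_on: bool, right_on: bool) -> str:
    """Map IR sensor states to a coarse X position of the tip 33a.

    left_on / right_on are True when the respective sensor's infrared
    beam is blocked and reflected back by the tip of the third portion.
    """
    if left_on and not right_on:
        return "XL"        # tip is at the left position
    if right_on and not left_on:
        return "XR"        # tip is at the right position
    # Both off: somewhere between XL and XR, treated as near the center XC.
    # (Both on should not occur with a single tip; treat it as center too.)
    return "XC"

assert x_position(True, False) == "XL"
assert x_position(False, True) == "XR"
assert x_position(False, False) == "XC"
```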
In this example, the push switch of the second detection unit 52 is designed so that it is arranged at the center position XC in the X direction and so that the area of its upper surface that receives pressing by the distal end portion 33a of the third portion 33 lies within a predetermined range. Therefore, when the distal end portion 33a of the third portion 33 is near the center position XC in the X direction as a result of the first action AC1, the push switch can be turned on by the second action AC2. On the other hand, when the distal end portion 33a of the third portion 33 is at a left or right position in the X direction as a result of the first action AC1, that is, outside that area, the push switch cannot be turned on even if the second action AC2 is performed. With this arrangement, the position of the first portion 31, which moves together with the third portion 33, at the time of the second action AC2 is limited to the vicinity of the center position XC in the X direction, so that the positional relationship between the first portion 31 and the image on the display surface 23 as viewed from the first direction at the time of the second action AC2 is a predetermined one. The second detection unit 52 is not limited to this configuration; for example, the area of the push switch may be enlarged in the X direction to widen the range over which the second action AC2 is accepted, so that the push switch can be turned on even when the second action AC2 is performed with the distal end portion 33a positioned at the left or right in the X direction. Alternatively, by providing a plurality of second detection units 52 arranged in the X direction, the second action can be detected when the distal end portion 33a is positioned at the left or right in the X direction (see the sketch below).
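To illustrate the two layouts just discussed, a single centered push switch versus several switches lined up in the X direction, the following hypothetical Python sketch shows how the controller could decide whether a second action AC2 is registered and at which X position. The dictionary-based switch layout and the cross-check against the detected X position are assumptions made purely for illustration.

```python
from typing import Dict, Optional

def registered_poke(x_pos: str, switch_states: Dict[str, bool]) -> Optional[str]:
    """Return the X position at which a poke is registered, or None.

    switch_states maps an X position ("XL", "XC", "XR") to the on/off
    state of a push switch located there. With a single centered switch,
    only "XC" is present, so pokes away from the center are ignored.
    """
    if switch_states.get(x_pos):
        return x_pos
    return None

# Single centered switch: when the tip is at the left, the switch cannot be
# pressed, so no poke is registered.
print(registered_poke("XL", {"XC": False}))                           # -> None
print(registered_poke("XC", {"XC": True}))                            # -> "XC"

# Variant with three switches lined up in X: a poke at the left is detected.
print(registered_poke("XL", {"XL": True, "XC": False, "XR": False}))  # -> "XL"
```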
In fig. 4 and the like, when the user moves the second portion 32 in the Z direction as in the motion AC12, the first portion 31 and the third portion 33 move to some extent in the Z direction in conjunction with the second portion 32, even though cushioning is provided by the elasticity described above (the second portion 32b in fig. 3). At this time, the movement of the first portion 31 is restricted by the restricting portion 80 and the shielding portion 60 so that the first portion 31 does not contact other parts such as the display device 21. Alternatively, a member may be provided that protects the first portion 31 from damage by cushioning the impact even if the first portion 31 does contact other parts. For example, a protector made of an elastic member or the like may be provided on each surface of the shielding portion 60 and on the back surface of the display device 21, or the distal end portion 31a of the first portion 31 itself may be formed of an elastic member or the like.
[1-5: Back Surface Structure]
In fig. 5, in the structure when viewed from the back side, the movement of the first portion 31 of the operation portion 30 is detected as the movement of the third portion 33 by the detection portion 50 (the first detection portion 51 and the second detection portion 52). The movement of the distal end portion 33a of the third portion 33 is regulated by a regulating portion 80 as shown in the drawing, and the movement AC1b in the X direction and the movement AC2b in the Y direction are allowed. The restricting portion 80 includes a portion 80a having a front surface side of an X-Y plane, a portion 80b having left and right side surfaces of a Y-Z plane, and a portion 80c having a bottom surface of the X-Z plane. The upper surface of the push switch is exposed at the center of the bottom surface portion 80c, so that the operation AC2b in the Y direction is allowed.
In this example, the distal end portion 33a of the third portion 33 has a convex curved shape on its lower side in the Y direction as shown in the drawing. This makes it easy for the distal end portion 33a to press against the upper surface of the push switch of the second detection unit 52. The bottom surface portion 80c of the restricting portion 80 has a curved shape corresponding to the range of the motion AC1b of the distal end portion 33a. The shape of the restricting portion 80 is not limited to this. The restricting portion 80 may also be provided, on its rear side, with a flat plate-like portion lying in an X-Y plane for restricting movement in the Z direction.
In addition, to cushion collisions between parts, a protector made of an elastic member or the like may also be provided on the restricting portion 80 and on the distal end portion 33a of the third portion 33. Even if the user moves the second portion 32 roughly, the distal end portion 33a is stopped when it touches one of the portions (80a, 80b, 80c) of the restricting portion 80. The movement of the distal end portion 31a of the first portion 31, which is linked to the third portion 33, is thereby also kept within a predetermined range. An elastic member or the like may also be used at the distal end portion 33a as a component for producing an elastic operation feel in the second action AC2.
As a modification, the first detection unit 51 may be configured with touch sensors, proximity sensors, or the like provided on the left and right side surface portions 80b of the restricting portion 80. For example, with a touch sensor, the state of the left position XL is detected when the distal end portion 33a contacts the left portion 80b of the restricting portion 80.
[1-6: Play Method]
Basic operations, play, and the like of the game device 1 are as follows. The user such as a child presses a button or the like of the button operation unit 40 while observing the display of the display surface 23, and performs basic operations such as selection and determination of a game or the like. The game execution unit of the game device 1 executes a game, and displays an image of a game element on the display surface 23. The user operates the operation unit 30 while viewing the image on the display surface 23 to perform the first operation AC1 and the second operation AC2 described above, thereby playing the game.
First, in the powered-on state, the game device 1 displays a menu screen, not shown, on the display surface 23. The user selects and decides on a game to play by viewing the menu screen and operating the buttons of the button operation unit 40. From the menu screen the device can shift to several modes, including the basic game described below, a user setting mode, a standby screen mode, and the like. Through the game execution unit, the game device 1 provides several games, including the basic game described below; the games are not limited to the following examples, and various games can be provided depending on the design of the program and the like. When the basic game is selected and decided on, the game device 1 starts displaying the images of the game elements of the basic game on the display surface 23 based on program processing.
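As a sketch of the menu flow just described, the following hypothetical Python fragment cycles through modes with the selection button 41, enters a mode with the decision button 42, and resets with the cancel button 43. The mode names come from the paragraph above, but the button-handling logic itself is an assumption and not a description of the actual firmware.

```python
MODES = ["basic game", "user setting mode", "standby screen mode"]

def menu(button_presses):
    """Walk the menu with buttons 41 (select), 42 (decide), 43 (cancel).

    Returns the mode chosen when the decision button is pressed, or None
    if the sequence ends without a decision.
    """
    index = 0
    for button in button_presses:
        if button == 41:                       # select: move the cursor
            index = (index + 1) % len(MODES)
        elif button == 42:                     # decide: enter this mode
            return MODES[index]
        elif button == 43:                     # cancel: back to the top item
            index = 0
    return None

# Press select twice, then decide -> "standby screen mode".
print(menu([41, 41, 42]))
```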
[1-7: Basic game]
Next, a configuration example of the basic game in the game device 1 will be described. The basic game uses a pearl drink as its subject, and its name is, for example, "search for a pearl". In this basic game, a user such as a child finds, acquires, and collects pearls, which are the granular objects constituting a pearl drink. The housing 10 of the game device 1 (in particular, the container 12 and the lid 11) is shaped to simulate a pearl drink, and the operation unit 30 is shaped to simulate a straw. The display surface 23 displays an image simulating the pearl drink (in particular, the liquid and granular objects in the container). When viewing in the first direction, the user sees the image of the inside of the simulated container on the display surface 23 overlapping the first portion 31, which simulates the straw.
In the basic game, the user operates the operation unit 30 while observing the image on the display surface 23, performing the first action AC1 as if stirring the pearl drink with the straw and the second action AC2 as if poking the granular objects. In this way, the user searches for and finds the various kinds of granular objects constituting the pearl drink, that is, pearls, including, for example, pearls that the user has not yet obtained. The user then captures and acquires a pearl by performing an action imitating a predetermined suction action with the operation unit 30 (in particular, the second action AC2). In the basic game, the game execution unit determines the user's operation of the operation unit 30 based on detection by the detection unit 50, and determines success or failure in acquiring a granular object based on predetermined conditions and the like.
The first action AC1 (the corresponding first operation) described above is reflected as an operation of making the liquid and granular objects inside the pearl drink flow by stirring with the straw. The second action AC2 (the corresponding second operation) is reflected as an action of poking or pressing an elastic granular object, that is, a pearl, with the straw. The second action AC2 is detected as an operation in which the third portion 33 presses the push switch of the second detection unit 52 with an elastic tactile sensation. The user thus obtains a tactile sensation as if the pearl itself had elasticity.
In addition, the basic game also simulates suction of the granular objects through the straw. In embodiment 1, in consideration of simplifying the mechanism and reducing cost, this suction action is realized as the same up-down action as the second action AC2. In the basic game, the function of the second action AC2 is switched from poking the granular object to the suction action in response to a switch in the game progress status on the time axis. As a modification, a third action imitating the suction action may be realized by a separate mechanism (for example, a dedicated suction button), distinct from the second action AC2 imitating the poking of the granular object.
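The phase-dependent handling of the second action AC2 described here can be summarized as a small state machine. The sketch below is an illustrative assumption rather than the patent's implementation; the class name and phase names are hypothetical and merely mirror the periods described in the text.

    # Hypothetical routing of one push-switch press (second action AC2) according
    # to the current game phase: poking a pearl during the search period, sucking
    # during the acquisition opportunity period.
    class SecondActionRouter:
        def __init__(self):
            self.phase = "SEARCH"          # later switched to "ACQUISITION_OPPORTUNITY"
            self.suction_amount = 0

        def on_push_switch(self) -> str:
            if self.phase == "SEARCH":
                return "poke"              # AC2 acts as poking/pressing a pearl
            self.suction_amount += 1       # AC2 acts as one unit of suction
            return "suck"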
[1-8: Display examples related to the basic game]
Figs. 6 and 7 show display examples on the display surface 23 of the display device 21 related to the basic game.
1. At the start of the basic game, the game execution unit of the game device 1 displays on the display surface 23 a game setting screen (not shown) that simulates ordering a pearl drink or the like. The game setting screen is used to set the content of the current basic game, for example, the difficulty level and the types of granular objects that appear; the details are not particularly limited, and the game setting screen may be omitted.
On the game setting screen, the game device 1 displays parameters for configuring the pearl drink (and thus the basic game) and accepts selection operations by the user. On this screen, the user can select, for example, the type of liquid forming the base of the beverage (e.g., tea or milk tea), the amount of ice, the sweetness, and other parameters. The game device 1 determines the internal structure of the pearl drink in the basic game according to the values of the selected parameters. The internal structure of the pearl drink includes a distribution structure of the plurality of pearls, or of at least one specific pearl, contained in the liquid. The distribution structure can be realized, for example, by a process of probabilistically drawing the pearls that appear based on program processing, by a process of reading predetermined pearls from a predetermined table, or the like. The difficulty level of play can also be adjusted by, for example, the number of pearls and the flow speed of the pearls.
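How the selected parameters might translate into the internal structure and difficulty could be sketched as follows. This is an assumption for illustration only; the parameter names, scaling rules, and numbers are hypothetical and not taken from the patent.

    # Hypothetical mapping from setting-screen parameters to pearl count and flow
    # speed, which together adjust the difficulty of the basic game.
    def build_drink_settings(liquid: str, ice: int, sweetness: int) -> dict:
        pearl_count = 8 + 2 * ice             # more ice -> more pearls on screen
        flow_speed = 1.0 + 0.2 * sweetness    # sweeter drink -> faster pearl flow
        return {"liquid": liquid, "pearl_count": pearl_count, "flow_speed": flow_speed}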
Data including images and information is prepared in advance for the plurality of kinds of pearls (granular objects) that are candidates the user can acquire as elements of the basic game. For example, pearls of various colors such as black, white, orange, and blue are prepared. Pearls imitating foods, such as a strawberry pearl, pearls imitating animals, such as a cat pearl, and pearls imitating characters from novels, stories, and the like are also prepared. A degree of rarity, such as ordinary or rare, that is, a difference in appearance probability, is set for each pearl. As for the image of each pearl, images with various expressions, such as a smiling expression, an angry expression, a crying expression, and no expression, are prepared as still images or animations.
In this example, the pearl distribution structure of the pearl drink at the start of the basic game is set so that one specific pearl, which becomes the acquisition candidate, is selected by lottery. The pearls other than this one specific pearl present in the pearl drink are predetermined pearls used for presentation and cannot be acquired. Needless to say, two or more pearls may be drawn and present at the same time.
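The lottery that selects the one specific pearl could, for example, be a rarity-weighted random draw, as in the following sketch. The catalog contents and weights are illustrative assumptions, not data from the patent.

    import random

    # Hypothetical pearl catalog with appearance probabilities reflecting rarity.
    PEARL_CATALOG = [
        ("black pearl",      0.50),   # ordinary
        ("strawberry pearl", 0.30),   # ordinary
        ("cat pearl",        0.15),   # rare
        ("story pearl",      0.05),   # very rare
    ]

    def draw_specific_pearl() -> str:
        # Draw one acquisition candidate according to the weights above.
        names = [name for name, _ in PEARL_CATALOG]
        weights = [weight for _, weight in PEARL_CATALOG]
        return random.choices(names, weights=weights, k=1)[0]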
2. After the setting of the basic game is determined, the game device 1 starts the basic game. Fig. 6 (A) shows a first display example, at the start of the basic game (in particular, the "search start period"). An image g0 representing the liquid in the pearl drink (e.g., plain tea), images g1 representing the plurality of pearls, and the like are displayed on the display surface 23 as viewed by the user through the window 22. The pearl images g1 are personified for children and therefore have expressions including eyes and mouths. Behind the image on the display surface 23, the first portion 31 imitating the straw can be seen through the display surface 23. The display surface 23 also appropriately displays an image g2 representing a game message (for example, a "start" message). In this example, at this point in time, the game device 1 has not yet presented or displayed the image of the selected specific pearl described above, but displays images g1 representing a plurality of pearls for presentation. In this example, the images g1 are set to a standard pearl of a fixed color with a smiling expression. By keeping the acquisition candidate pearl unknown in this way, the user's sense of expectation can be heightened.
3. After the basic game starts, the user moves the second portion 32, and the game execution unit changes the image on the display surface 23 in response to detection of the operation unit 30. Fig. 6 (B) shows a second display example at this time (in particular, the "search period"). The user performs the first action AC1 of stirring with the straw, for example, an action of repeatedly swinging it left and right in the X direction. The distal end portion 31a of the first portion 31 thereby swings repeatedly left and right in front of the display surface 23. The game device 1 detects, with the first detection unit 51, the state of the third portion 33 (fig. 2, etc.) corresponding to the first action AC1. Based on the detected state, the game device 1 changes the image g0 of the liquid in the pearl drink and the images g1 of the pearls on the display surface 23 as shown in fig. 6 (B). Specifically, the display surface 23 shows an animation or the like as if the liquid and the plurality of pearls were flowing in the container. The image g0 of the liquid is, for example, an image of waves rising on the surface or of bubbles generated in the liquid. At this time, each pearl may flow, for example, along an arc in the X direction (arrow in the drawing) or the Y direction. Pearls may also move out of view on the display surface 23 in response to the flow.
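The way the detected stirring drives the flow animation could be modeled with a simple flow level, as sketched below. This is an illustrative assumption; the state names, increments, and decay rate are hypothetical. A level above some threshold could then serve as the "predetermined level" that triggers the appearance of the specific pearl described next.

    # Hypothetical update of a flow level from the detected left/right state of the
    # third portion 33: each left/right reversal strengthens the flow, which then
    # decays while no stirring is detected.
    def update_flow_level(flow_level: float, x_state: str, prev_x_state: str) -> float:
        if x_state != prev_x_state and x_state in ("XL", "XR"):
            flow_level = min(1.0, flow_level + 0.2)   # stirring detected
        else:
            flow_level = max(0.0, flow_level - 0.02)  # flow gradually settles
        return flow_level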
4. At a predetermined opportunity, for example when the flow generated by the first action AC1 reaches a predetermined level or more, the game device 1 displays an image g3 representing the selected specific pearl on the display surface 23 as shown in fig. 6 (B). In this example, at this point in time, the specific pearl is presented as an unknown pearl whose true form is not yet displayed, and the image g3 is set accordingly. This can heighten the user's sense of expectation. The unknown specific pearl (image g3) is displayed as a candidate that can be acquired by the user, and flows in the liquid in the same manner as the other standard pearls (images g1). The image g3 of the unknown specific pearl may be, for example, a pearl wearing sunglasses, a pearl marked with "?", or the like. As a modification, an image showing the true form of the specific pearl may be displayed at this point in time.
At a predetermined opportunity, the game device 1 also displays on the display surface 23 an image g4 (for example, a dotted circle) indicating an area in which the specific pearl can be acquired (a "capture area" or "target frame"). In this example, this area is arranged at the center of the display surface 23 in the X direction and occupies roughly the lower half in the Y direction, corresponding to the distal end portion 31a of the first portion 31. The area indicates that the specific pearl (image g3) can be acquired/sucked by the second action AC2 while the pearl is inside it.
5. The user performs the second action AC2, as if pressing the straw downward, at the moment when the specific pearl (image g3) enters the area (image g4) of fig. 6 (B). This second action AC2 corresponds in particular to the first-stage suction. The game device 1 detects the second action AC2 with the push switch of the second detection unit 52. As described above, the second action AC2 cannot succeed unless it is performed near the center in the X direction. When the push switch is turned on by the second action AC2 while the specific pearl (image g3) is inside the area (image g4), the game device 1 determines that the first-stage suction for acquiring the specific pearl has succeeded. When the game device 1 determines that the first-stage suction has succeeded, it proceeds to the next display as follows. At this time, as an example of presentation, an animation or the like may show the specific pearl being sucked into the distal end portion of the straw.
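The first-stage success determination combines the position of the specific pearl with the state of the push switch. The following sketch is an assumption for illustration; the circular capture area and its parameters are hypothetical.

    # Hypothetical first-stage check: the second action AC2 succeeds only while the
    # specific pearl lies inside the capture area (image g4) and the push switch of
    # the second detection unit 52 is turned on.
    def first_stage_success(pearl_x: float, pearl_y: float,
                            area_x: float, area_y: float, area_radius: float,
                            push_switch_on: bool) -> bool:
        inside = (pearl_x - area_x) ** 2 + (pearl_y - area_y) ** 2 <= area_radius ** 2
        return inside and push_switch_on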
6. Fig. 7 (A) shows a third display example, the second-stage suction after the first-stage suction has succeeded (in particular, the "acquisition opportunity period"). At this time, the game device 1 switches the period of game progress on the display surface 23 from fig. 6 (B) to fig. 7 (A). During the period of fig. 7 (A), an enlarged image showing part of the pearl drink and the straw is displayed on the display surface 23. In this example, the display surface 23 shows an image g5 representing part of the liquid and other pearls, an image g6 representing part of the tubular straw, and an image g7 representing the specific pearl that has entered the straw. In this example, at this point in time, the true form of the specific pearl is still not displayed, and the image g7 of the specific pearl is set to a standard pearl (for example, a smiling pearl). The image g7 may instead be an image representing an unknown pearl, like the image g3, or may be an image representing the true form described later.
During this period, the user focuses on the specific pearl (image g7) and the straw in the image (image g6), and attempts the second-stage suction using the second action AC2. During this period, the straw in the image (image g6) is displayed so as to overlap the physical first portion 31. Since it is desirable that the user look at the straw in the image (image g6) during this period, it is preferable that the physical first portion 31 behind it be difficult for the user to see. Therefore, in this example, in order to make the first portion 31 less visible, the image g6 of the straw, the image g7 of the specific pearl, and the like are displayed conspicuously on the display surface 23, for example in a large size and specific colors.
7. Next, while observing the display surface 23 of fig. 7 (A), the user performs the second-stage suction using the second action AC2. In this phase (the "acquisition opportunity period"), the second action AC2 simulates the action of sucking the pearl up through the straw. One second action AC2 corresponds to the following operation: the detection state of the push switch of the second detection unit 52 changes from off to on and back after a short time, and corresponds to a predetermined amount of suction in the game. A second action AC2 by long press may also be provided. One second action AC2 by long press corresponds to keeping the push switch in the on state for a longer, continuous time, and corresponds to continuous suction in the game. The user appropriately performs the second action AC2 several times, or the second action AC2 by long press. In response to these second actions AC2, the game device 1 displays, with an animation or the like, the specific pearl (image g7) being sucked up gradually inside the long straw (image g6) on the display surface 23. The game device 1 displays part of the long straw on the display surface 23 while scrolling it, for example.
In addition, the game device 1 may display an image g8 representing a predetermined meter bar on the display surface 23 during this period. The meter bar is a game element that imitates the breath used for suction. The game device 1 changes the amount of the meter bar in response to the second actions AC2 related to suction. For example, the game device 1 increases the amount of the meter bar while the push switch is in the on state due to the second action AC2, and decreases it while the switch is in the off state. When the amount of the meter bar exceeds a certain amount, the game device 1 may enter a choked state and impose a predetermined loss (penalty).
When the specific pearl (image g7) has been sucked up by at least a predetermined amount through the second actions AC2, the game device 1 determines that the second-stage suction has succeeded, that is, that the specific pearl has been acquired, and shifts to the next display. If the user cannot perform the second actions AC2 related to suction smoothly and the rise of the specific pearl falls short of the predetermined amount, the game device 1 determines that the second-stage suction has failed. In this case, the game device 1 performs control so that the display returns, for example, to the pre-suction state of fig. 6 (B), and the user can retry.
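The interplay of the suction amount, the meter bar, and the success/failure determination could be organized as a per-frame update, as sketched below. The rates, the choking limit, and the success threshold are hypothetical values for illustration, not figures from the patent.

    # Hypothetical per-frame update during the acquisition opportunity period:
    # the on state of the push switch raises both the suction amount and the
    # breath meter; exceeding the meter limit chokes, reaching the suction
    # threshold acquires the pearl, otherwise the attempt continues.
    def update_suction(suction: float, meter: float, switch_on: bool):
        if switch_on:
            suction += 1.0
            meter += 0.5
        else:
            meter = max(0.0, meter - 0.3)
        if meter > 10.0:
            return suction, meter, "choked"     # predetermined loss (penalty)
        if suction >= 20.0:
            return suction, meter, "acquired"   # second-stage suction succeeded
        return suction, meter, "ongoing"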
As a presentation during the second-stage suction of fig. 7 (A), the game device 1 may cause another pearl in the liquid (image g5), as another game element, to cling near the straw (image g6) and obstruct the suction of the specific pearl (image g7) inside the straw. In this case, when the game device 1 receives the first action AC1 from the user, it displays the other pearl being shaken off.
8. Fig. 7 (B) shows a fourth display example, when the acquisition of the specific pearl described above has succeeded (the "pearl acquisition success period"). The game device 1 shifts the display from fig. 7 (A) to fig. 7 (B). Together with a predetermined presentation (for example, light emission and changes), the game device 1 displays the specific pearl in its true form on the display surface 23 using an animation or the like. The game device 1 displays an image g9 representing the specific pearl of the type determined by the earlier lottery (a strawberry pearl in this example) near the center of the display surface 23. Together with the image g9, the game device 1 displays an image g10 indicating the type, name, and the like of the specific pearl. At this point in time, the user recognizes the specific pearl and can obtain a sense of achievement.
9. The game device 1 registers the image and information of the pearl acquired by the user in a pearl list owned by the user. The game device 1 holds data such as the list of pearls owned by the user in its memory. Through the basic operations on the menu screen, the user can display his or her own pearl list on the display surface 23, select a desired pearl from the list, and view its detailed image and information. The user can also set a desired pearl selected from the pearl list ("my pearl: my tapioca") as the image displayed in the standby screen mode or the like. When no game is being played, or in the standby screen mode, the game device 1 displays an image including the my pearl together with a display such as a clock. In the standby screen mode, for example, the my pearl is shown flowing in the pearl drink by an animation or the like. In this case, too, the user can affect the my pearl by operating the operation unit 30. The user can thus enjoy interactions such as viewing and communicating with the my pearl.
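The registration of acquired pearls and the selection of the standby-screen pearl could be handled by a simple collection structure, as in the sketch below. The class and field names are assumptions for illustration only.

    # Hypothetical collection holding the user's acquired pearls ("pearl list") and
    # the pearl chosen for the standby screen mode ("my pearl").
    class PearlCollection:
        def __init__(self):
            self.owned = []        # list of {"name": ..., "image": ...} entries
            self.my_pearl = None   # pearl shown in the standby screen mode

        def register(self, name, image_id):
            # Add the image and information of an acquired pearl to the user's list.
            self.owned.append({"name": name, "image": image_id})

        def set_my_pearl(self, name):
            # Choose a pearl from the list to display in the standby screen mode.
            for pearl in self.owned:
                if pearl["name"] == name:
                    self.my_pearl = pearl
                    return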
The following modifications of the basic game described above are also possible. First, instead of controlling the acquisition of the pearl in two stages, the second stage (fig. 7 (A)) may be omitted so that the specific pearl can be acquired by success in the first stage alone (fig. 6 (B)). As another example of a game, a score or the like may be increased according to the number of granular objects acquired by the user, the time required for acquisition, or the like.
As a further modification, the game device 1 may directly display the pearl distribution structure determined at the start of the basic game as images of the individual pearls on the display surface 23. The user then selects a desired pearl to acquire from among the plurality of pearls on the display surface 23 and performs the operations for acquiring it. In this case, the content of the display surface 23 becomes more complex and the game difficulty becomes higher.
[1-9: Effects and the like]
As described above, according to the game device 1 of embodiment 1, interest and the like can be further improved, for example, by controlling the display so that the physical objects imitating a food or drink (the housing 10 and the operation unit 30) overlap the images of the game elements on the display surface 23. With the game device 1, the user can visually recognize the state in which the first portion 31, a physical part of the operation unit 30 imitating a straw, is superimposed on the images of the game elements on the display surface 23, can operate the second portion 32, and can change the images of the game elements in response to that operation. The interest of game play generated by user operation can therefore be improved. In addition, the game device 1, which is assumed to be intended for children, can be realized at low cost, for example without using expensive sensors. The detection unit 50 can be realized using inexpensive sensors and switches together with the structure of the operation unit 30.
[1-10: Modification examples]
The following modifications of embodiment 1 are also possible. The operation unit 30 may be configured without the second portion 32 held by the user's hand. For example, instead of the second portion 32 imitating a straw, an operation member that does not imitate a straw may be provided on the housing 10 for moving the first portion 31 and the like inside the housing 10. As such an operation member, for example, the button operation unit 40 may be configured to perform the operation that would otherwise be performed by moving the second portion 32. In addition, the first portion 31 may be arranged so as to extend in the X direction, and the game device 1 may be configured to simulate a physical object other than a food or drink.
The first detection unit 51 and the second detection unit 52 of the detection unit 50 may be attached to a part of the operation unit 30 or at an arbitrary position near the operation unit 30. For example, the detection unit 50 may be attached to the bearing portion 35 of fig. 1 and the like. The detection unit 50 may also be attached near the first portion 31 at a position that does not overlap the image on the display surface 23. In that case, the operation unit 30 need not be provided with the third portion 33 behind the first portion 31.
The game device 1 may also be configured so that a state in which the first portion 31 does not overlap the display surface 23 in the first direction can be established. For example, a mechanism for inserting and removing the first portion 31 into and from the housing 10 (for example, a mechanism for extending and retracting the first portion 31) may be provided, or a mechanism for attaching and detaching the operation unit 30 to and from the housing 10 may be provided. In that case, for example when no game is being played, only the image on the display surface 23 is seen in the first direction from the user's viewpoint.
(Embodiment 2)
As a modification of embodiment 1, fig. 8 is a schematic cross-sectional view of the game device 1 of embodiment 2 as seen from the side. In embodiment 2, the window 22, the first portion 31 of the operation unit 30, the display surface 23 of the display device 21, the shielding portion 60, the third portion 33, and the like are arranged in this order from front to rear in the Z direction. In the Z direction, the first portion 31 is disposed at a position Z21, and the display surface 23 is disposed at a position Z22 behind the first portion 31 and spaced from it by a distance d2. In this example, the distance d2 is shorter than the distance d1 of fig. 3, so the first portion 31 and the display surface 23 are disposed closer to each other. The first action AC1 may be defined as an action in the X direction. In this example, the bearing portion 35 and the coupling portion 36 are mounted in the lid 11. In embodiment 2 as well, the first portion 31 and the image can be visually recognized as being superimposed on each other from the user's viewpoint.
In embodiment 2, the first portion 31 may be made of a non-transmissive member, but may instead be made of a transmissive member in order to further enhance the feeling of overlap with the image. In that case, the user can see the image on the rear display surface 23 through the first portion 31. Alternatively, the first portion 31 may be formed using an actual straw. The display device 21 may have a non-transmissive display surface 23, and the shielding portion 60 may be omitted. When the display device 21 has a non-transmissive display surface 23, the game device 1 can be made less expensive than when a transmissive display surface is used.
As another embodiment, the game device 1 may be configured as follows: it has a non-transmissive display surface 23, and an image representing a straw is displayed on the display surface 23 in accordance with the state of the user's operation of the operation unit 30, in particular of the second portion 32. In this case, the first portion 31 need not be provided in the housing 10, and the second portion 32 can operate an object other than a first portion 31 disposed at a position overlapping the image on the display surface 23.
The present invention has been specifically described above based on the embodiments, but it is not limited to the above embodiments, and various modifications can be made without departing from its scope. The present invention can be configured by adding, deleting, or replacing constituent elements of the embodiments and by various combinations thereof. Some or all of the functions and the like described above may be implemented by hardware or by software program processing. The programs and data constituting those functions and the like may be stored in a computer-readable storage medium or in a device on a communication network. The objects related to the game and the simulation are not limited to foods and drinks. Other examples of foods and drinks include soup and hot pot, and other examples of the bar-shaped object include a spoon and chopsticks. Examples of subjects other than foods and drinks include fishing and insect catching.

Claims (10)

1. A game device is provided with:
A housing;
a game execution unit that executes a game;
A display unit including a display surface on which an image of a game element of the game executed by the game execution unit is displayed;
An operation unit configured to be operable by a user in relation to the game, the operation unit including a first portion disposed inside the housing and a third portion linked to the first portion;
a detection unit that detects movement of the operation unit; and
A shielding portion disposed in the housing, the shielding portion shielding such that at least a part of the third portion is not observed in a first direction in which the display surface is observed from a viewpoint of the user,
Wherein the game execution section is capable of executing the game based on a detection result of the movement of the operation section obtained by the detection section such that the display of the game element in the display surface is changed,
The first portion is configured to: at least in the operation of the operation section in connection with the execution of the game, in the first direction, the first portion appears to be in a state of overlapping with the image of the game element of the display surface,
The detecting portion detects movement of the third portion.
2. The game device according to claim 1, wherein,
The operating part further has a second portion disposed outside the housing and moved in linkage with the first portion by being touched by the user.
3. The game device according to claim 2, wherein,
The shielding portion shields such that at least a portion of the detection portion is not observed in the first direction.
4. The game device according to claim 1 or 2, wherein,
In the first direction, the display surface and the first portion are arranged in this order,
The display surface is a transmissive display surface.
5. The game device according to claim 1 or 2, wherein,
The detection unit includes: a first detection unit capable of detecting a first operation performed by the operation unit; and a second detection unit capable of detecting a second operation performed by the operation unit.
6. The game device according to claim 5, wherein,
The first detection unit detects, as the first operation, a first operation in which at least a part of the operation unit crosses the display surface when the user views the display surface in the first direction.
7. The game device according to claim 6, wherein,
The second detecting section detects a second action different from the first action as the second operation,
The second detection unit can impart a predetermined haptic effect during the second operation.
8. The game device according to claim 1 or 2, wherein,
The first portion of the operation portion is configured to move on a plane substantially parallel to the display surface in response to an operation of the operation portion.
9. The game device according to claim 1 or 2, wherein,
And a restriction portion for restricting movement of the operation portion,
The restriction portion is configured to restrict movement of the first portion by restricting movement of the third portion.
10. The game device according to claim 2, wherein,
At least a portion of the second portion is configured to be movable in three dimensions.
CN202110218788.3A 2020-03-04 2021-02-26 Game device Active CN112891920B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-037320 2020-03-04
JP2020037320A JP6868136B1 (en) 2020-03-04 2020-03-04 Game device

Publications (2)

Publication Number Publication Date
CN112891920A CN112891920A (en) 2021-06-04
CN112891920B true CN112891920B (en) 2024-05-14

Family

ID=75801811

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110218788.3A Active CN112891920B (en) 2020-03-04 2021-02-26 Game device

Country Status (2)

Country Link
JP (2) JP6868136B1 (en)
CN (1) CN112891920B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2309168Y (en) * 1997-08-12 1999-03-03 枫录电子科技有限公司 Electronic game computer in palm
CN101837192A (en) * 2009-03-19 2010-09-22 万代股份有限公司 Game device
CN102996989A (en) * 2011-09-13 2013-03-27 索尼电脑娱乐公司 Electric apparatus
CN104536199A (en) * 2014-12-15 2015-04-22 深圳市华星光电技术有限公司 Transparent liquid crystal display device
TW201539362A (en) * 2014-04-14 2015-10-16 Chi-Chao Lin Healthful information cup
JP2018175950A (en) * 2018-08-23 2018-11-15 株式会社サンセイアールアンドディ Game machine
JP2019037501A (en) * 2017-08-25 2019-03-14 株式会社三共 Game machine

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003340151A (en) * 1999-09-11 2003-12-02 Sony Computer Entertainment Inc Operation device and signal output adjusting method of this device
JP3828795B2 (en) * 2001-12-11 2006-10-04 信越ポリマー株式会社 Pointing device
JP2004305249A (en) * 2003-04-02 2004-11-04 Aruze Corp Game machine
WO2005028042A2 (en) * 2003-09-15 2005-03-31 Atlantic City Coin & Slot Service Company, Inc. Gaming machine with action unit container
US20060040581A1 (en) * 2004-08-20 2006-02-23 Davis Dennis L Rotating ring graphics
JP6413189B2 (en) * 2016-08-24 2018-10-31 株式会社コナミデジタルエンタテインメント Game device
JP6874985B2 (en) * 2017-06-05 2021-05-19 株式会社足立ライト工業所 App change system

Also Published As

Publication number Publication date
JP2021137583A (en) 2021-09-16
JP2021137307A (en) 2021-09-16
JP6868136B1 (en) 2021-05-12
CN112891920A (en) 2021-06-04

Similar Documents

Publication Publication Date Title
US9072973B2 (en) Interactive play station
JP4864713B2 (en) Stereoscopic two-dimensional image display device
US8212820B2 (en) Virtual suction tool
AU2004214467B2 (en) Hand-held interactive electronic device
JP5506129B2 (en) GAME PROGRAM, GAME DEVICE, GAME SYSTEM, AND GAME PROCESSING METHOD
JP4895352B2 (en) Object selection program, object selection device, object selection system, and object selection method
CN110109534A (en) System and method for the haptic effect based on pressure
JP2009028504A (en) Cooking game program
JP6164813B2 (en) PROGRAM, STORAGE MEDIUM, AND GAME DEVICE
CN109789335B (en) Game device, game article, and recording medium
CN107485852A (en) Game console
JP4006949B2 (en) Image processing system, image processing apparatus, and imaging apparatus
TW201017597A (en) Computer system and controlling method thereof
CN110384920A (en) The more people's table trip interaction systems of virtual reality, interactive approach and server
CN112891920B (en) Game device
TW581701B (en) Recording medium, method of using a computer and computer for executing role-playing games
US9533224B2 (en) Game program and game apparatus for correcting player's route within a game space
US7429214B2 (en) Electronic game with real feel interface
TW585801B (en) Recording medium, computer, method for executing processes thereon, and method of selecting and executing processes on the computer
US8678926B2 (en) Computer-readable storage medium, information processing apparatus, system, and information process method
Pratticò et al. Investigating tangible user interaction in mixed-reality robotic games
WO2018179879A1 (en) Game device and article for game
JP7036340B2 (en) Freebie acquisition device
TWI813343B (en) Optical recognition control interactive toy and method thereof
JP2004089489A (en) Portable electronic toy

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant