WO2009141855A1 - Input system, input method, computer program, and recording medium - Google Patents
Input system, input method, computer program, and recording medium
- Publication number
- WO2009141855A1 (PCT/JP2008/002686)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- cursor
- image
- subject
- video image
- correction
- Prior art date
Classifications
- G06F3/0418—Control or interface arrangements specially adapted for digitisers, for error correction or compensation, e.g. based on parallax, calibration or alignment
- G06F3/011—Input arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/0425—Digitisers characterised by opto-electronic transducing means, using a single imaging device such as a video camera for tracking the absolute position of one or more objects with respect to an imaged reference surface, e.g. a display or projection screen on which a computer-generated image is displayed or projected
- A63F13/213—Input arrangements for video game devices comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
- A63F13/25—Output arrangements for video game devices
- A63F13/422—Mapping input signals into game commands automatically for the purpose of assisting the player, e.g. automatic braking in a driving game
- A63F13/426—Mapping input signals into game commands involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
- A63F13/5372—Overlaid indicators for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
- A63F13/577—Simulating properties, behaviour or motion of objects in the game world using determination of contact between game characters or objects, e.g. to avoid collision between virtual racing cars
- A63F13/655—Generating or modifying game content automatically from real-world data, e.g. by importing photos of the player
- A63F2300/1087—Input arrangements comprising photodetecting means, e.g. a camera
- A63F2300/303—Output arrangements for displaying additional data, e.g. simulating a head-up display
- A63F2300/306—Output arrangements for displaying a marker associated to an object or location in the game field
- A63F2300/6045—Methods for mapping control signals received from the input arrangement into game commands
- A63F2300/6054—Generating game commands automatically to assist the player, e.g. automatic braking in a driving game
- A63F2300/643—Computing dynamical parameters of game objects by determining the impact between objects, e.g. collision detection
- A63F2300/69—Involving elements of the real world in the game world, e.g. measurement in live races, real video
- A63F2300/695—Imported photos, e.g. of the player
Definitions
- The present invention relates to an input system that performs input based on the image of a subject appearing in a captured image, and to related technology.
- A golf game system by the present applicant is disclosed in Patent Document 1.
- This golf game system includes a game machine and a golf club type input device.
- An imaging unit is housed inside the housing of the game machine.
- The imaging unit includes an image sensor and infrared light emitting diodes. The infrared light emitting diodes intermittently irradiate a predetermined range above the imaging unit with infrared light, so the image sensor intermittently photographs the reflector, provided on the golf club type input device, as it moves within that range. By processing the resulting strobe images of the reflector, the speed to be input to the game machine is calculated.
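The prior-art measurement described above can be sketched roughly as follows; the frames, the frame interval, and the pixel-to-metre scale are illustrative assumptions, not values disclosed in the patent:

```python
# Sketch: estimate the speed of a reflector from two successive strobe
# frames by tracking the centroid of the bright pixels.

def centroid(frame):
    """Centroid of bright pixels (value > 128) in a 2-D list of values."""
    pts = [(x, y) for y, row in enumerate(frame)
                  for x, v in enumerate(row) if v > 128]
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

def speed(frame_a, frame_b, dt_s, metres_per_pixel):
    """Speed from the centroid displacement between two strobe frames."""
    (xa, ya), (xb, yb) = centroid(frame_a), centroid(frame_b)
    dist_px = ((xb - xa) ** 2 + (yb - ya) ** 2) ** 0.5
    return dist_px * metres_per_pixel / dt_s

# Two 4x4 frames: the reflector blob moves 2 pixels to the right.
f0 = [[0] * 4 for _ in range(4)]; f0[1][0] = 255
f1 = [[0] * 4 for _ in range(4)]; f1[1][2] = 255
v = speed(f0, f1, dt_s=0.01, metres_per_pixel=0.005)  # 2 px -> 1.0 m/s
```

A real system would also have to match blobs across frames when several reflectors are visible; this sketch assumes a single reflector.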
- An input system according to a first aspect of the present invention includes: video image generation means for generating a video image; control means for controlling the video image; projection means for projecting the video image onto a screen arranged in real space; and imaging means for imaging a subject in real space that the player operates on the screen. The control means includes analysis means for obtaining the position of the subject based on a captured image obtained by the imaging means, and cursor control means for interlocking a cursor with the subject based on the position of the subject obtained by the analysis means. The cursor control means includes correction means for correcting the position of the cursor so that, on the screen in real space, the position of the subject in real space matches the position of the cursor in the projected video image.
- With this configuration, the player can make an input to the control means by moving the subject over the video image projected on the screen and pointing directly at a desired position in the video image with the subject. This is possible because, on the screen in real space, the position of the subject matches the position of the cursor in the projected video image, so the control means can recognize, through the cursor, the position in the video image at which the subject is placed.
- Here, "match" means both a complete match and a substantial match.
- An input system according to a second aspect of the present invention includes video image generation means for generating a video image and control means for controlling the video image. The control means includes: analysis means for obtaining the position of a subject, operated by the player on a screen arranged in real space, based on a captured image obtained by imaging means that images the subject; cursor control means for interlocking the cursor with the subject based on the position of the subject obtained by the analysis means; and correction means for correcting the position of the cursor so that, on the screen in real space, the position of the subject in real space coincides with the position of the cursor in the video image projected on the screen.
- This configuration has the same effect as the input system according to the first aspect.
- Preferably, the input system further includes: marker video generation means for generating a video image used to calculate a parameter for the correction, in which at least one predetermined marker is arranged at at least one predetermined position on the video image; corresponding position calculation means for associating the captured image obtained by the imaging means with the video image generated by the marker video generation means, and calculating a corresponding position, that is, the position on the video image corresponding to the position of the subject image in the captured image; and parameter calculation means for calculating the parameter used by the correction means, based on the predetermined position at which the predetermined marker is arranged and the corresponding position of the subject placed on the predetermined marker projected on the screen.
- With this configuration, the correction parameters can be obtained easily, simply by the player placing the subject on the markers projected on the screen.
- Preferably, the marker video generation means arranges a plurality of predetermined markers at a plurality of predetermined positions on the video image, or arranges the predetermined marker at different predetermined positions on the video image at different times.
- Preferably, the marker video generation means arranges four predetermined markers at the four corners of the video image, or arranges the predetermined markers at the four corners of the video image at different times.
- With this configuration, the correction parameters can be obtained with high accuracy using a relatively small number of markers.
- Preferably, the marker video generation means further arranges one predetermined marker at the center of the video image on which the four predetermined markers are arranged, or at the center of a different video image.
- With this configuration, the correction parameters can be obtained with still higher accuracy.
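Parameter calculation of this kind is commonly realized as a planar homography fitted to the marker correspondences; the patent does not disclose the exact computation, so the following is a hedged sketch. The function names, the use of exactly four corner correspondences, and the coordinate values are illustrative assumptions:

```python
import numpy as np

def solve_homography(src, dst):
    """Fit the eight homography coefficients h = (a..h) such that
    u = (a*x + b*y + c) / (g*x + h*y + 1) and
    v = (d*x + e*y + f) / (g*x + h*y + 1)
    for the four point pairs src[i] -> dst[i] (direct linear transform)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    return np.linalg.solve(np.array(A, float), np.array(b, float))

def apply_h(h, pt):
    """Map a camera-image point through the fitted homography."""
    x, y = pt
    w = h[6] * x + h[7] * y + 1.0
    return ((h[0] * x + h[1] * y + h[2]) / w,
            (h[3] * x + h[4] * y + h[5]) / w)

# Camera positions of the subject placed on the four corner markers,
# paired with the known marker positions in the video image (illustrative).
cam   = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
video = [(0.0, 0.0), (2.0, 0.0), (2.0, 2.0), (0.0, 2.0)]
H = solve_homography(cam, video)
u, v = apply_h(H, (0.5, 0.5))  # camera centre maps to video (1.0, 1.0)
```

A fifth, central marker, as described above, could serve to verify the fit or to refine it by least squares over five correspondences.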
- Preferably, the correction by the correction means includes keystone (trapezoidal) correction.
- With this configuration, even when the imaging means is arranged so that its optical axis is inclined with respect to the screen, images the subject on the screen, the motion of the subject is analyzed based on the captured image, and a cursor interlocked with the subject is generated, the movement of the subject operated by the player and the movement of the cursor match or substantially match. This is because the keystone correction removes the trapezoidal distortion as much as possible. As a result, the player can perform input with any sense of incongruity suppressed as much as possible.
- Preferably, the imaging means is arranged in front of the player and images the subject from a position overlooking it. When the subject moves from the back to the front as viewed from the imaging means, the cursor control means determines the position of the cursor so that the cursor moves upward; when the subject moves from the front to the back as viewed from the imaging means, it determines the cursor position so that the cursor moves downward.
- If, by contrast, the cursor were controlled in the bird's-eye case with the same algorithm as in the non-bird's-eye case, then when the subject moves from the back to the front as viewed from the imaging means, the cursor would move downward as seen by the player viewing the video image displayed on a vertically installed screen, and when the subject moves from the front to the back as viewed from the imaging means, the cursor would move upward. In this case, the moving direction of the subject operated by the player does not intuitively match the moving direction of the cursor on the screen, so input becomes stressful and cannot be performed smoothly.
- The reason for this result is that, in the bird's-eye case, the vertical component of the optical axis vector of the imaging means points vertically downward, so the up-down direction of the imaging means does not match the up-down direction of the player.
- Ordinarily, by contrast, the imaging means is installed so that its optical axis vector has no vertical component (that is, the imaging plane is parallel to a vertical plane) or so that the vertical component of the optical axis vector points vertically upward; in either case the up-down direction of the imaging means coincides with the up-down direction of the player, and players are accustomed to such use.
- Here, the direction from the end point of the vertical component of the optical axis vector of the imaging means toward its start point is defined as the lower side of the imaging means, and the direction from the start point toward the end point as the upper side of the imaging means.
- Likewise, the direction from the player's feet toward the head is defined as the player's upper side, and the direction from the head toward the feet as the player's lower side.
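The up-down convention above can be illustrated with a trivial remapping. The assumption that image rows of a bird's-eye camera increase from the far ("back") side to the near ("front") side is ours, for illustration only:

```python
def cursor_height(row, num_rows):
    """Bird's-eye camera: image row 0 is the far side ('back'), row
    num_rows - 1 is the near side ('front'). Return a cursor height in
    [0, 1] (0 = bottom, 1 = top of the projected image) so that moving
    the subject from back to front moves the cursor upward."""
    return row / (num_rows - 1)

back  = cursor_height(0, 480)    # subject at the back -> cursor at bottom
front = cursor_height(479, 480)  # subject at the front -> cursor at top
```

Using the raw image row directly as the cursor's downward screen coordinate would instead produce the reversed, counter-intuitive behaviour described in the preceding paragraphs.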
- Preferably, the cursor is displayed so as to be visible to the player.
- With this configuration, the player can confirm that the projected cursor coincides with the retroreflective sheet, and can recognize that the system is operating normally.
- Alternatively, the cursor may be virtual and not displayed.
- This is because the control means knows where the retroreflective sheet is placed on the projected video image.
- Consequently, the cursor may be hidden, or a transparent cursor may be displayed. Even if the cursor is not displayed, there is no significant problem for the player in playing.
- An input system according to a third aspect of the present invention includes: video image generation means for generating a video image including a cursor; control means for controlling the video image; and imaging means, arranged so that its optical axis is inclined with respect to the surface to be imaged, for imaging the subject on that surface. The control means includes: analysis means for obtaining the position of the subject based on a captured image obtained by the imaging means; trapezoid correction means for performing keystone correction on the position of the subject obtained by the analysis means; and cursor control means for interlocking the cursor with the subject based on the position of the subject after the keystone correction.
- With this configuration, even when the imaging means arranged so that its optical axis is inclined with respect to the imaged surface captures the subject on that surface, the movement of the subject is analyzed based on the captured image, and a cursor interlocked with the subject is generated, the movement of the subject operated by the player matches the movement of the cursor. This is because keystone correction is performed on the position of the subject, which determines the position of the cursor. As a result, the player can perform input with any sense of incongruity suppressed as much as possible.
- An input system according to a fourth aspect of the present invention includes video image generation means for generating a video image including a cursor, and control means for controlling the video image. The control means includes: analysis means for obtaining the position of the subject based on a captured image obtained by imaging means that is arranged so that its optical axis is inclined with respect to the surface to be photographed and that images the subject on that surface; trapezoid correction means for executing keystone correction on the position of the subject obtained by the analysis means; and cursor control means for interlocking the cursor with the subject based on the position of the subject after the keystone correction.
- This configuration has the same effect as the input system according to the third aspect.
- Preferably, the trapezoid correction means performs keystone correction according to the distance between the subject and the imaging means.
- The trapezoidal distortion of the subject image in the captured image increases as the distance between the subject and the imaging means increases; the present invention therefore makes it possible to execute an appropriate keystone correction according to that distance.
- Preferably, the trapezoid correction means includes horizontal correction means for correcting the horizontal coordinate of the cursor so that the distance between the subject and the imaging means and the horizontal movement distance of the cursor have a positive correlation.
- This configuration can correct horizontal trapezoidal distortion.
- Preferably, the trapezoid correction means includes vertical correction means for correcting the vertical coordinate of the cursor so that the distance between the subject and the imaging means and the vertical movement distance of the cursor have a positive correlation.
- This configuration can correct vertical trapezoidal distortion.
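A minimal sketch of such distance-dependent correction, assuming a linear scale factor (the `gain` constant is illustrative, not a value from the patent): scaling both coordinates by a factor that grows with the subject-camera distance gives the positive correlation described above.

```python
def keystone_correct(dx, dy, distance, gain=0.5):
    """Scale the horizontal and vertical image offsets (dx, dy) by a
    factor that increases with the subject-camera distance, so that equal
    subject movements far from the camera (where the image shrinks) still
    produce proportionally equal cursor movements. `gain` is an
    illustrative constant."""
    scale = 1.0 + gain * distance
    return dx * scale, dy * scale

near = keystone_correct(10.0, 10.0, distance=1.0)  # smaller cursor step
far  = keystone_correct(10.0, 10.0, distance=3.0)  # larger cursor step
```

A production system would derive the scale from the projection geometry (as in the homography calibration) rather than from a hand-tuned linear gain.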
- Preferably, the imaging means captures images from a position overlooking the subject.
- With this configuration, the player can move the subject on the floor to operate the cursor.
- For example, the player can move the subject while wearing it on his or her foot.
- Accordingly, the present invention can be applied to games or exercises that use the feet.
- Preferably, the input system further includes light emitting means for intermittently irradiating the subject with light, and the subject includes a retroreflective member that retroreflects the received light.
- The analysis means obtains the position of the subject based on a difference image between a captured image taken while the light from the light emitting means is irradiated and a captured image taken while it is not.
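The difference-image idea can be sketched as follows; the frames and the brightness threshold are hypothetical. Ambient light appears in both frames and cancels in the subtraction, leaving only the retroreflected light:

```python
def diff_position(lit, unlit, threshold=64):
    """Subtract the unlit frame from the lit frame and return the centroid
    of the remaining bright pixels (the retroreflector), or None if no
    pixel exceeds the threshold."""
    pts = []
    for y, (row_l, row_u) in enumerate(zip(lit, unlit)):
        for x, (l, u) in enumerate(zip(row_l, row_u)):
            if l - u > threshold:
                pts.append((x, y))
    if not pts:
        return None
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

# A bright lamp at (1, 2) appears in both frames (ambient light); the
# retroreflector at (3, 0) appears only while the IR LEDs are on.
unlit = [[0] * 4 for _ in range(4)]; unlit[2][1] = 200
lit   = [[0] * 4 for _ in range(4)]; lit[2][1] = 200; lit[0][3] = 255
pos = diff_position(lit, unlit)  # the lamp cancels; only (3.0, 0.0) remains
```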
- Preferably, the control means includes arrangement means for arranging a predetermined image on the video image, and determination means for determining whether or not the cursor touches or overlaps the predetermined image.
- With this configuration, the predetermined image can be used as an icon for issuing a command, or as one of various items in a video game.
- Preferably, the determination means determines whether or not the cursor overlaps the predetermined image continuously for at least a predetermined time.
- With this configuration, the input is confirmed only after the contact or overlap has continued for the predetermined time, rather than immediately upon contact, so that erroneous input can be prevented.
- Preferably, the arrangement means moves the predetermined image, and the determination means determines whether or not the cursor touches or overlaps the moving predetermined image so as to satisfy a predetermined condition.
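The overlap and dwell-time determination described above can be sketched like this; the rectangle format, the frame-based timing, and the helper names are illustrative assumptions:

```python
def overlaps(cursor, rect):
    """Axis-aligned test: does the cursor point fall inside the image's
    rectangle (x, y, width, height)?"""
    cx, cy = cursor
    x, y, w, h = rect
    return x <= cx <= x + w and y <= cy <= y + h

def confirmed(cursor_history, rect, required_frames):
    """Confirm the input only if the cursor overlapped the image for at
    least `required_frames` consecutive frames, preventing accidental
    triggering when the cursor merely brushes past the icon."""
    run = 0
    for c in cursor_history:
        run = run + 1 if overlaps(c, rect) else 0
        if run >= required_frames:
            return True
    return False

icon = (10, 10, 4, 4)
brushed = [(0, 0), (11, 11), (0, 0), (12, 12)]  # touches, but never dwells
held    = [(11, 11), (12, 12), (13, 13)]        # stays on the icon
a = confirmed(brushed, icon, required_frames=3)  # False
b = confirmed(held, icon, required_frames=3)     # True
```

For a moving predetermined image, the same test would be evaluated against the rectangle's position at each frame.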
- An input method according to a fifth aspect of the present invention includes a step of generating a video image and a step of controlling the video image. The controlling step includes an analysis step of obtaining the position of a subject, operated by the player on a screen arranged in real space, based on a captured image obtained by imaging means that images the subject in real space, and a cursor control step of interlocking a cursor with the subject based on the position of the subject obtained in the analysis step, the position of the cursor being corrected so that, on the screen in real space, the position of the subject matches the position of the cursor in the projected video image.
- This configuration has the same effect as the input system according to the first aspect.
- An input method according to a sixth aspect of the present invention includes a step of generating a video image including a cursor and a step of controlling the video image. The controlling step includes: an analysis step of obtaining the position of the subject based on a captured image obtained by imaging means that is arranged so that its optical axis is inclined with respect to the surface to be imaged and that images the subject on that surface; a trapezoid correction step of executing keystone correction on the position of the subject obtained in the analysis step; and a cursor control step of interlocking the cursor with the subject based on the position of the subject after the keystone correction.
- This configuration has the same effect as the input system according to the third aspect.
- A computer program according to a seventh aspect of the present invention causes a computer to execute the input method according to the fifth aspect.
- This configuration has the same effect as the input system according to the first aspect.
- A computer program according to an eighth aspect of the present invention causes a computer to execute the input method according to the sixth aspect.
- This configuration has the same effect as the input system according to the third aspect.
- A recording medium according to a ninth aspect of the present invention is a computer-readable recording medium on which the computer program according to the seventh aspect is recorded.
- This configuration has the same effect as the input system according to the first aspect.
- A recording medium according to a tenth aspect of the present invention is a computer-readable recording medium on which the computer program according to the eighth aspect is recorded.
- This configuration has the same effect as the input system according to the third aspect.
- Examples of the recording medium include a flexible disk, hard disk, magnetic tape, magneto-optical disk, CD (including CD-ROM and Video-CD), DVD (including DVD-Video, DVD-ROM, and DVD-RAM), ROM cartridge, battery-backed RAM memory cartridge, flash memory cartridge, nonvolatile RAM cartridge, and the like.
- FIG. 5 is a diagram explaining the relationship between the video image generated by the information processing apparatus 3 in FIG. 1, the image obtained by the camera unit 5, and the effective shooting area 31 in FIG. 4. It is followed by explanatory drawings of the necessity for calibration and an illustration of a calibration screen.
- FIG. 26 is an explanatory drawing of the trapezoidal (keystone) correction.
- FIG. 1 is a diagram showing an overall configuration of an entertainment system according to Embodiment 1 of the present invention.
- the entertainment system includes an entertainment apparatus 1, a screen 21, and retroreflective sheets (retroreflective members) 17L and 17R that retroreflect the received light.
- when it is not necessary to distinguish between the retroreflective sheets 17L and 17R, they are referred to as the retroreflective sheet 17.
- the player attaches the retroreflective sheet 17L with the rubber band 19 to the back of the left leg, and attaches the retroreflective sheet 17R with the rubber band 19 to the back of the right leg.
- the screen 21 (for example, white) is disposed on the floor surface (a horizontal plane) in front of the entertainment apparatus 1.
- the player 15 plays on the screen 21 by moving his / her feet with the retroreflective sheets 17L and 17R.
- the entertainment device 1 includes a housing 13 installed upright on the floor. In the vicinity of the middle stage of the housing 13, the support member 10 is attached substantially parallel to the vertical plane. A projector 11 is attached to the support member 10. The projector 11 projects the video image generated by the information processing device 3 on the screen 21. The player 15 moves his / her foot while watching the projected video image, and moves the retroreflective sheets 17L and 17R to desired positions.
- the support member 4 is attached to the upper stage of the housing 13 so as to protrude toward the player 15.
- the information processing device 3 is attached to the tip of the support member 4.
- the information processing apparatus 3 includes a camera unit 5.
- the camera unit 5 is attached to the information processing device 3 so as to overlook the screen 21 and the retroreflective sheets 17L and 17R, and images the retroreflective sheets 17L and 17R operated by the player 15.
- the camera unit 5 includes an infrared filter 9 that transmits only infrared light, and four infrared light emitting diodes 7 arranged around the infrared filter 9. On the back side of the infrared filter 9, an image sensor 27 described later is disposed.
- FIG. 2 is a schematic diagram of the entertainment system of FIG.
- the camera unit 5 is arranged to protrude toward the player 15 more than the projector 11 in a side view.
- the camera unit 5 is disposed above the screen 21 and sees the screen 21 and the retroreflective sheets 17L and 17R obliquely forward and downward.
- the projector 11 is arranged in the lower stage of the camera unit 5.
- FIG. 3 is a diagram showing an electrical configuration of the entertainment system of FIG.
- the information processing device 3 includes a processor 23, an external memory 25, an image sensor 27, an infrared light emitting diode 7, and a switch unit 22.
- the switch unit 22 includes a determination key, a cancel key, and a direction key.
- the image sensor 27 constitutes the camera unit 5 together with the infrared light emitting diode 7 and the infrared filter 9.
- the external memory 25 is connected to the processor 23.
- the external memory 25 is configured by, for example, a flash memory, a ROM, and / or a RAM.
- the external memory 25 includes a program area, an image data area, and an audio data area.
- the program area stores a control program that causes the processor 23 to execute various processes (processes in flowcharts described later).
- the image data area stores image data necessary for generating the video signal VD.
- the audio data area stores audio data for guides, sound effects, and the like.
- the processor 23 executes the control program in the program area, reads out the image data in the image data area and the audio data in the audio data area, performs necessary processing, and generates the video signal (video image) VD and the audio signal AU.
- the video signal VD and the audio signal AU are given to the projector 11.
- the processor 23 includes various functional blocks such as a CPU (Central Processing Unit), a graphics processor, a sound processor, and a DMA controller, as well as an A/D converter used when analog signals are captured, an input/output control circuit that receives input digital signals such as infrared signals and key operation signals and provides output digital signals to external devices, an internal memory, and the like.
- the CPU executes a control program stored in the external memory 25.
- the digital signal from the A / D converter and the digital signal from the input / output control circuit are given to the CPU, and the CPU executes necessary calculations according to these signals in accordance with the control program.
- the graphics processor performs the graphic processing necessary for the image data stored in the external memory 25 according to the calculation result of the CPU, and generates the video signal VD.
- the sound processor performs sound processing necessary for the sound data stored in the external memory 25 according to the calculation result of the CPU, and generates an audio signal AU representing a sound effect or the like.
- the internal memory is constituted by a RAM, for example, and is used as a working area, a counter area, a register area, a temporary data area, and / or a flag area.
- the image sensor 27 is, for example, a CMOS image sensor of 64 × 64 pixels.
- the image sensor 27 operates under the control of the processor 23. Specifically, it is as follows.
- the image sensor 27 drives the infrared light emitting diode 7 intermittently. Therefore, the infrared light emitting diode 7 emits infrared light intermittently.
- the retroreflective sheets 17L and 17R are intermittently irradiated with infrared light.
- the image sensor 27 captures the retroreflective sheets 17L and 17R when the infrared light is turned on and off.
- the image sensor 27 generates a difference image signal between the image signal when the infrared light is turned on and the image signal when the infrared light is turned off, and outputs the difference image signal to the processor 23.
- in the difference image signal, noise due to light other than the reflected light from the retroreflective sheets 17L and 17R is removed as much as possible, so that only the retroreflective sheets 17L and 17R can be detected with high accuracy. That is, only the retroreflective sheets 17L and 17R are reflected in the difference image.
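- the difference-image step above can be sketched as follows. This is a minimal illustration in Python (the actual subtraction is performed by the image sensor 27 itself), and the helper name and threshold value are assumptions, not taken from the embodiment.

```python
# Sketch of the difference-image step (hypothetical helper name and
# threshold; the real sensor performs this subtraction in hardware).
def difference_image(lit_frame, unlit_frame, threshold=16):
    """Subtract the infrared-off frame from the infrared-on frame.

    Pixels that stay bright only while the IR LEDs are lit belong to the
    retroreflective sheets; ambient light cancels out in the subtraction.
    """
    return [
        [max(0, lit - unlit) if (lit - unlit) >= threshold else 0
         for lit, unlit in zip(lit_row, unlit_row)]
        for lit_row, unlit_row in zip(lit_frame, unlit_frame)
    ]

# A 4x4 toy example: ambient light (value 50) everywhere, and one
# retroreflective pixel that jumps to 255 only while the LEDs are on.
lit   = [[50, 50, 50, 50],
         [50, 255, 50, 50],
         [50, 50, 50, 50],
         [50, 50, 50, 50]]
unlit = [[50, 50, 50, 50],
         [50, 60, 50, 50],
         [50, 50, 50, 50],
         [50, 50, 50, 50]]
diff = difference_image(lit, unlit)
```

Only the pixel that brightens while the infrared LEDs are lit survives the subtraction, which is why only the retroreflective sheets appear in the difference image.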
- the video signal VD generated by the processor 23 includes two cursors 67L and 67R (described later).
- the two cursors 67L and 67R correspond to the detected retroreflective sheets 17L and 17R, respectively.
- the processor 23 links the two cursors 67L and 67R to the retroreflective sheets 17L and 17R, respectively.
- the projector 11 outputs sound corresponding to the audio signal AU given from the processor 23 from a speaker (not shown). Further, the projector 11 projects a video image based on the video signal VD given from the processor 23 onto the screen 21.
- FIG. 4 is an explanatory diagram of the shooting range of the camera unit 5 of FIG.
- the Y # axis is set along the horizontal line
- the Z # axis is set along the vertical line
- the X # axis is perpendicular to them.
- a horizontal plane is formed by the X # axis and the Y # axis.
- the vertical direction is positive on the Z # axis
- the direction from the screen 21 toward the entertainment device 1 is positive on the Y # axis
- when facing the positive direction of the Y # axis, the right direction is positive on the X # axis.
- the vertex a1 of the effective photographing area 31 is set as the origin.
- the horizontal component Vh of the optical axis vector V of the image sensor 27 of the camera unit 5 faces the negative direction of the Y # axis, and the vertical component Vv faces the negative direction of the Z # axis. This is because the camera unit 5 is installed so as to overlook the screen 21 and the retroreflective sheets 17L and 17R.
- the optical axis vector V is a unit vector along the optical axis 30 of the image sensor 27.
- the retroreflective sheets 17L and 17R are examples of the subject of the camera unit 5.
- since the screen 21 is also shot by the camera unit 5 while the video image is projected onto it (although it does not appear in the difference image), it can also be called a shot surface.
- a dedicated screen 21 may be used, but the floor surface itself can be used as a screen as long as the floor surface is flat and the content of the projected video image can be easily recognized. In this case, the floor surface is the imaged surface.
- the effective range 12 for photographing by the image sensor 27 is a range of a predetermined angle with the optical axis 30 as the center.
- the image sensor 27 looks down at the screen 21 obliquely. Therefore, the effective area 31 for photographing by the image sensor 27 has a trapezoidal shape in plan view.
- Reference numerals a1, a2, a3, and a4 are attached to the four vertices of the imaging effective area 31, respectively.
- FIG. 5 is a diagram for explaining the correspondence between the video image (rectangle) generated by the information processing apparatus 3 in FIG. 1, the image (rectangle) obtained by the camera unit 5, and the effective shooting area 31 (trapezoid) in FIG. 4.
- the imaging effective area 31 corresponds to a predetermined rectangular area 35 (hereinafter referred to as the "effective area corresponding image") of the difference image (hereinafter referred to as the "camera image") 33 obtained by the image sensor 27.
- the vertices a1 to a4 of the effective shooting area 31 correspond to the vertices b1 to b4 of the effective area corresponding image 35, respectively.
- the retroreflective sheet 17 in the imaging effective area 31 is reflected in the effective area corresponding image 35.
- the effective area corresponding image 35 corresponds to the video image 37 generated by the processor 23.
- the vertices b1 to b4 of the effective area corresponding image 35 correspond to the vertices c1 to c4 of the video image 37, respectively. Therefore, in this embodiment, when the cursor 67 linked to the retroreflective sheet 17 is included in the video image, the cursor 67 is arranged at the position on the video image 37 corresponding to the position of the image of the retroreflective sheet 17 reflected in the effective area corresponding image 35. In the video image 37, the effective area corresponding image 35, and the imaging effective area 31, the upper side c1-c2, the upper side b1-b2, and the lower base a1-a2 indicated by the black triangles correspond to one another.
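- the correspondence just described (imaging effective area 31 → effective area corresponding image 35 → video image 37) amounts to a linear rescaling between two rectangles. The sketch below illustrates this; the sub-rectangle bounds and the 320 × 240 video resolution are illustrative assumptions (only the 64 × 64 sensor size appears in the embodiment).

```python
def camera_to_video(px, py, cam_rect, video_size):
    """Map a pixel in the effective-area corresponding image (a rectangle
    inside the 64x64 camera image) to the matching position on the video
    image.

    cam_rect = (left, top, width, height) of the effective-area image;
    video_size = (width, height) of the generated video image.
    All numeric values used here are illustrative assumptions.
    """
    left, top, w, h = cam_rect
    vw, vh = video_size
    return ((px - left) * vw / w, (py - top) * vh / h)

# Example: a sheet seen at camera pixel (36, 24) inside a 48x32
# effective area starting at (8, 16), mapped onto a 320x240 video image.
x, y = camera_to_video(36, 24, (8, 16, 48, 32), (320, 240))
```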
- it is necessary to adjust or correct the position of the cursor 67 so that the position of the retroreflective sheet (subject) 17 in the real space matches the position of the cursor 67 in the projected video image, that is, to perform calibration.
- the calibration includes keystone correction. This point will be specifically described below.
- FIGS. 6 to 8 are explanatory diagrams of the necessity of calibration.
- a rectangular video image 37 generated by processor 23 is projected onto screen 21 by projector 11.
- a video image projected on the screen 21 is referred to as a projected video image 38. It is assumed that the projection video image 38 has been keystone corrected by the projector 11.
- the generated video image 37 is projected as it is onto the screen 21 without being inverted. Accordingly, the vertices c1 to c4 of the video image 37 correspond to the vertices f1 to f4 of the projection video image 38, respectively.
- in the video image 37, the effective area corresponding image 35, the photographing effective area 31, and the projection video image 38, the upper side c1-c2, the upper side b1-b2, the lower base a1-a2, and the lower side f1-f2 indicated by the black triangles correspond to one another.
- the four corner images D1 to D4 of the video image 37 are projected as images d1 to d4 of the projection video image 38, respectively.
- the images D1 to D4 do not depend on the camera image 33. For this reason, the images d1 to d4 also do not depend on the camera image 33.
- the retroreflective sheets A1 to A4 are arranged on the images d1 to d4 forming the vertices of the rectangle, respectively.
- the images B1 to B4 of the retroreflective sheets A1 to A4 form the vertices of the trapezoid.
- the trapezoidal distortion occurs because the image sensor 27 images the screen 21 and the horizontally disposed retroreflective sheets A1 to A4 obliquely forward and downward. Note that the retroreflective sheets A1 to A4 correspond to the images B1 to B4, respectively.
- the images C1 to C4 are arranged on the video image 37 corresponding to the images B1 to B4 of the retroreflective sheets A1 to A4 reflected in the effective area corresponding image 35, respectively. Accordingly, the images C1 to C4 of the video image 37 are projected as images e1 to e4 of the projection video image 38, respectively.
- the images e1 to e4 of the projection video image 38 need to be projected onto the retroreflective sheets A1 to A4, respectively.
- the processor 23 recognizes the position of the retroreflective sheet 17 via the cursor 67 linked to the retroreflective sheet 17 and knows where the retroreflective sheet 17 is located on the projection video image.
- images e1, e2, e3, and e4 correspond to A4, A3, A2, and A1, respectively.
- the images C1 to C4 are arranged at positions on the video image 37 corresponding to the positions obtained by turning the positions of the images B1 to B4 of the effective area corresponding image 35 upside down (mirror-inverting them vertically).
- the video image 37 including the images C1 to C4 is turned upside down (mirrored up and down) and projected onto the screen 21 to obtain a projected video image 38.
- the images e1, e2, e3, and e4 are corrected so as to overlap the retroreflective sheets A1, A2, A3, and A4, that is, the images d4, d3, d2, and d1, respectively.
- the images e1 to e4 of the projection video image 38 are projected on the retroreflective sheets A1 to A4, respectively, and the projection video image 38 can be used as a user interface.
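- the two vertical mirrorings described above (drawing the cursor at the vertically inverted position, then projecting the video image upside down) cancel each other, which is what makes the projected cursor land on the sheet. A minimal sketch under an assumed image height:

```python
def mirror_y(y, height):
    """Mirror a y coordinate about the horizontal centerline of an image."""
    return height - 1 - y

# Step 1: the cursor is drawn at the vertically mirrored position of the
# detected sheet. Step 2: the whole frame is projected upside down. The
# two inversions cancel, so the projected cursor lands on the sheet.
# The image height of 240 is an illustrative assumption.
height = 240
detected_y = 30                          # sheet image near the top of the view
drawn_y = mirror_y(detected_y, height)   # cursor drawn near the bottom
projected_y = mirror_y(drawn_y, height)  # flipped again at projection time
```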
- FIG. 9A and FIG. 9B are examples of a calibration screen (a screen for calculating parameters (reference magnification and reference inclination) used for keystone correction).
- the processor 23 generates a video image (first step image) 41 for the first step of calibration.
- this video image 41 includes a marker 43 arranged at its center. Since the video image 41 is projected onto the screen 21 as shown in FIG. 8, an image corresponding to the video image 41 is projected as it is as the projected video image. Accordingly, following the guide of the projection video image corresponding to the guide of the video image 41, the player 15 puts the retroreflective sheet CN (not illustrated) on the marker m (not illustrated) on the projection video image corresponding to the marker 43. Then, the processor 23 obtains the xy coordinates (CX, CY) on the video image 41 of the retroreflective sheet CN placed on the marker m on the projection video image.
- the processor 23 generates a video image (second step image) 45 for the second step of calibration.
- the video image 45 includes markers D1 to D4 arranged at the four corners.
- the markers D1 to D4 correspond to the images D1 to D4 in FIG. Since the video image 45 is projected onto the screen 21 as shown in FIG. 8, an image corresponding to the video image 45 is projected as it is as the projected video image. Accordingly, following the guide of the projection video image corresponding to the guide of the video image 45, the player puts the retroreflective sheets LU, RU, RB and LB (not shown) on the markers d1 to d4 on the projection video image corresponding to the markers D1 to D4.
- the markers d1 to d4 correspond to the images d1 to d4 in FIG.
- the processor 23 obtains the xy coordinates (LUX, LUY), (RUX, RUY), (RBX, RBY) and (LBX, LBY) on the video image 45 of the retroreflective sheets LU, RU, RB and LB placed on the markers d1 to d4 on the projected video image.
- FIG. 10 is an explanatory diagram of a method for deriving a reference magnification used for trapezoidal correction.
- the center is the origin
- the horizontal axis is the x axis
- the vertical axis is the y axis.
- the right direction on the paper surface is positive on the x axis
- the upward direction on the paper surface is positive on the y axis.
- the xy coordinates on the video image of the retroreflective sheet CN arranged on the marker m described in FIG. 9A are (CX, CY).
- the xy coordinates on the video images of the retroreflective sheets LU, RU, RB and LB arranged on the markers d1 to d4 described in FIG. 9B are respectively (LUX, LUY) and (RUX, RUY). , (RBX, RBY) and (LBX, LBY).
- the retroreflective sheets LU, RU, RB, and LB are located in the fourth quadrant q4, the first quadrant q1, the second quadrant q2, and the third quadrant q3, respectively.
- the x-coordinate reference magnification PRUX and the y-coordinate reference magnification PRUY can be obtained by the following equations.
- PRUX = Rx / (RUX - CX) ... (1)
- PRUY = Ry / (RUY - CY) ... (2)
- the constant Rx is the x coordinate of the marker D2 on the video image
- the constant Ry is the y coordinate of the marker D2 on the video image.
- the x-coordinate reference magnification PRBX and the y-coordinate reference magnification PRBY can be obtained by the following equations.
- PRBX = Rx / (RBX - CX) ... (3)
- PRBY = Ry / (CY - RBY) ... (4)
- the reference magnification PLBX of the x coordinate and the reference magnification PLBY of the y coordinate can be obtained by the following equations.
- PLBX = Rx / (CX - LBX) ... (5)
- PLBY = Ry / (CY - LBY) ... (6)
- next, the reference magnifications of the x and y coordinates for the fourth quadrant q4 are obtained.
- the reference magnification PLUX of the x coordinate and the reference magnification PLUY of the y coordinate can be obtained by the following equations.
- PLUX = Rx / (CX - LUX) ... (7)
- PLUY = Ry / (LUY - CY) ... (8)
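- equations (1) to (8) can be collected into one routine as follows; the calibration coordinates used in the example call are illustrative values only.

```python
def reference_magnifications(cx, cy, lu, ru, rb, lb, rx, ry):
    """Compute the per-quadrant reference magnifications, Eqs. (1)-(8).

    (cx, cy): coordinates of the center sheet CN on the video image;
    lu, ru, rb, lb: (x, y) of the corner sheets LU, RU, RB, LB;
    rx, ry: x and y coordinates of the corner marker D2 on the video image.
    """
    lux, luy = lu
    rux, ruy = ru
    rbx, rby = rb
    lbx, lby = lb
    return {
        "PRUX": rx / (rux - cx), "PRUY": ry / (ruy - cy),  # Eqs. (1), (2): q1
        "PRBX": rx / (rbx - cx), "PRBY": ry / (cy - rby),  # Eqs. (3), (4): q2
        "PLBX": rx / (cx - lbx), "PLBY": ry / (cy - lby),  # Eqs. (5), (6): q3
        "PLUX": rx / (cx - lux), "PLUY": ry / (luy - cy),  # Eqs. (7), (8): q4
    }

# Illustrative calibration values: center sheet near the origin, corner
# sheets trapezoidally displaced, corner marker D2 at (100, 75).
mags = reference_magnifications(2, -1, (-85, 64), (80, 60), (90, -70),
                                (-95, -72), 100, 75)
```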
- when the retroreflective sheet 17 is located in the first quadrant q1, trapezoidal correction can be performed by multiplying its x coordinate on the video image by the reference magnification PRUX and its y coordinate by the reference magnification PRUY.
- similarly, when the retroreflective sheet 17 is located in the second quadrant q2, trapezoidal correction can be performed by multiplying the x coordinate on the video image by the reference magnification PRBX and the y coordinate by the reference magnification PRBY.
- the reference magnification of the x coordinate should essentially be the same regardless of which quadrant the retroreflective sheet 17 is located in.
- however, if trapezoidal correction is performed uniformly with the reference magnification of each quadrant, and the difference between the reference magnification PRUX of the x coordinate in the first quadrant q1 and the reference magnification PRBX of the x coordinate in the second quadrant q2 is large, that difference appears even in the vicinity where the first quadrant q1 and the second quadrant q2 touch each other, and the correction becomes discontinuous there.
- therefore, the reference magnification PRUX of the x coordinate in the first quadrant q1 is corrected based on the inclination of the reference magnification of the x coordinate with respect to the y axis and the y coordinate of the retroreflective sheet 17 located in the first quadrant q1.
- the reference magnification is corrected to CPRUX based on the inclination of the reference magnification of the x coordinate with respect to the y axis.
- the reference magnification of the y coordinate should likewise be essentially the same regardless of which quadrant the retroreflective sheet 17 is located in. However, if trapezoidal correction is performed uniformly with the reference magnification of each quadrant, and the difference between the reference magnification PRUY of the y coordinate in the first quadrant q1 and the reference magnification PLUY of the y coordinate in the fourth quadrant q4 is large, that difference appears even in the vicinity where the first quadrant q1 and the fourth quadrant q4 touch each other, and the correction becomes discontinuous there.
- therefore, the reference magnification PRUY of the y coordinate in the first quadrant q1 is corrected based on the inclination of the reference magnification of the y coordinate with respect to the x axis and the x coordinate of the retroreflective sheet 17 located in the first quadrant q1.
- the reference magnification is corrected to CPRUY based on the inclination of the reference magnification of the y coordinate with respect to the x axis.
- the reference slope SRUX for correcting the reference magnification PRUX (formula (1)) of the x coordinate of the first quadrant q1 is calculated by the following formula.
- the reference slope SRUY for correcting the reference magnification PRUY (formula (2)) of the y coordinate of the first quadrant q1 is calculated by the following formula.
- the reference inclination SRBX for correcting the reference magnification PRBX (expression (3)) of the x coordinate in the second quadrant q2 is calculated by the following expression.
- the reference slope SRBY for correcting the reference magnification PRBY (formula (4)) of the y coordinate in the second quadrant q2 is calculated by the following formula.
- the reference inclination SLBX for correcting the reference magnification PLBX (expression (5)) of the x coordinate in the third quadrant q3 is calculated by the following expression.
- the reference slope SLBY for correcting the reference magnification PLBY (formula (6)) of the y coordinate in the third quadrant q3 is calculated by the following formula.
- the reference slope SLUX for correcting the reference magnification PLUX (formula (7)) of the x coordinate in the fourth quadrant q4 is calculated by the following formula.
- the reference slope SLUY for correcting the reference magnification PLUY (formula (8)) of the y coordinate in the fourth quadrant q4 is calculated by the following formula.
- FIG. 14 is an explanatory diagram of a method of correcting the reference magnification PRUX of the x coordinate of the first quadrant q1 using the reference slope SRUX.
- the y coordinate of retroreflective sheet 17 located in first quadrant q1 is PY.
- the correction value CPRUX of the reference magnification PRUX of the x coordinate is calculated by the following equation.
- FIG. 15 is an explanatory diagram of a method of correcting the reference magnification PRUY of the y coordinate in the first quadrant q1 using the reference slope SRUY.
- the x coordinate of the retroreflective sheet 17 located in the first quadrant q1 is PX.
- the correction value CPRUY of the reference magnification PRUY of the y coordinate is calculated by the following equation.
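- the slope formulas themselves are carried in the drawings and are not reproduced in this text, so the following sketch assumes the simplest model consistent with the description: the reference magnification varies linearly with the perpendicular coordinate, and the corrected magnification is obtained by adding the slope term. The function names and sample numbers are assumptions, not the patent's equations.

```python
# Assumed linear correction model (the patent's exact slope equations are
# in drawings not reproduced here): the x-direction magnification varies
# linearly with y, interpolating between the value measured on the x axis
# and the value measured at the corner.
def corrected_magnification(base, slope, coord):
    """Linearly corrected reference magnification (assumed model)."""
    return base + slope * coord

def keystone_x(px, py, prux, srux):
    """Trapezoid-correct an x coordinate in the first quadrant q1:
    PX# = PX * CPRUX, with CPRUX = PRUX corrected by slope SRUX and PY."""
    return px * corrected_magnification(prux, srux, py)

# Illustrative values: base magnification 1.25, slope 0.002 per y unit,
# sheet detected at (40, 30) in the first quadrant.
px_corrected = keystone_x(40, 30, 1.25, 0.002)
```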
- the y coordinate of the retroreflective sheet 17 located in the second quadrant q2 is PY.
- the correction value CPRBX of the reference magnification PRBX of the x coordinate is calculated by the following equation.
- the value PX # after trapezoidal correction of the x coordinate PX of the retroreflective sheet 17 located in the second quadrant q2 is as follows.
- the x coordinate of the retroreflective sheet 17 located in the second quadrant q2 is PX.
- the correction value CPRBY of the reference magnification PRBY of the y coordinate is calculated by the following equation.
- the value PY # after the keystone correction of the y-coordinate PY of the retroreflective sheet 17 located in the second quadrant q2 is as follows.
- the y coordinate of the retroreflective sheet 17 located in the third quadrant q3 is PY.
- the correction value CPLBX of the reference magnification PLBX of the x coordinate is calculated by the following equation.
- the value PX # after trapezoidal correction of the x coordinate PX of the retroreflective sheet 17 located in the third quadrant q3 is as follows.
- the x coordinate of the retroreflective sheet 17 located in the third quadrant q3 is PX.
- the correction value CPLBY of the reference magnification PLBY of the y coordinate is calculated by the following equation.
- the value PY # after the keystone correction of the y coordinate PY of the retroreflective sheet 17 located in the third quadrant q3 is as follows.
- the y coordinate of the retroreflective sheet 17 located in the fourth quadrant q4 is PY.
- the correction value CPLUX of the x-coordinate reference magnification PLUX is calculated by the following equation.
- the value PX # after trapezoidal correction of the x coordinate PX of the retroreflective sheet 17 located in the fourth quadrant q4 is as follows.
- the x coordinate of the retroreflective sheet 17 located in the fourth quadrant q4 is PX.
- the correction value CPLUY of the reference magnification PLUY of the y coordinate is calculated by the following equation.
- the value PY # after the keystone correction of the y coordinate PY of the retroreflective sheet 17 located in the fourth quadrant q4 is as follows.
- FIG. 16 is a view showing an example of the mode selection screen 61 projected on the screen 21 of FIG.
- mode selection screen 61 includes icons 65 and 63 for mode selection, and cursors 67L and 67R.
- the cursor 67L is linked to the retroreflective sheet 17L, and the cursor 67R is linked to the retroreflective sheet 17R. This also applies to FIGS. 17 to 22 described later.
- the icon 63 is for entering the training mode
- the icon 65 is for entering the game mode.
- the positions of the cursors 67L and 67R coincide with or substantially coincide with the positions of the retroreflective sheets 17L and 17R, respectively. Therefore, the player 15 can move the corresponding cursor to the desired position on the projection video image by carrying the foot wearing the retroreflective sheet to the desired position on the projection video image. This also applies to FIGS. 17 to 22 described later.
- FIG. 17 is a view showing an example of the game selection screen 71 projected on the screen 21 of FIG.
- game selection screen 71 includes icons 73 and 75 for game selection, and cursors 67L and 67R.
- when the cursors 67L and 67R operated by the player 15 with the retroreflective sheets 17L and 17R overlap either of the icons 73 and 75, a countdown display from 3 seconds is started, and the input becomes valid after 3 seconds have passed.
- a game corresponding to the icon 73 or 75 on which both the cursors 67L and 67R overlap is started. That is, when both the cursors 67L and 67R overlap one icon for 3 seconds or more (which prevents erroneous input), the input to that icon becomes valid.
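- the 3-second dwell rule described above can be sketched as a per-frame counter; the 60 fps frame rate and the rectangular icon hit test are assumptions.

```python
def dwell_select(frames, icon_hit, fps=60, dwell_seconds=3):
    """Return the frame index at which an icon selection becomes valid.

    frames: per-frame (cursor_l, cursor_r) positions; icon_hit: predicate
    testing whether a cursor overlaps the icon. Input is accepted only
    after BOTH cursors have stayed on the icon for dwell_seconds without
    interruption, which prevents accidental selection.
    """
    needed = fps * dwell_seconds
    run = 0
    for i, (cl, cr) in enumerate(frames):
        run = run + 1 if icon_hit(cl) and icon_hit(cr) else 0
        if run >= needed:
            return i
    return None

# Example: both cursors sit on a hypothetical 20x20 icon from the start.
hit = lambda p: 0 <= p[0] <= 20 and 0 <= p[1] <= 20
frames = [((10, 10), (12, 12))] * 200
```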
- the icon 73 is for entering a mole hitting game
- the icon 75 is for entering a free kick game.
- FIG. 18 is an illustration of a mole hitting screen 81 projected on the screen 21 of FIG.
- mole hitting screen 81 includes four hole images 83, an elapsed time display unit 93, a score display unit 95, and cursors 67L and 67R.
- a mole image 91 appears from one of the four hole images 83 at random.
- the player 15 operates the retroreflective sheet 17L or 17R and tries to overlap the cursor 67L or 67R on the mole image 91 at the timing when the mole image 91 appears. If the cursor 67L or 67R can be overlaid on the mole image 91 with good timing, the score on the score display unit 95 is increased by one point.
- the elapsed time display section 93 displays the countdown result from 30 seconds, and the game ends when the result reaches 0 seconds.
- the player 15 can superimpose the corresponding cursor 67L or 67R on the mole image 91 by stepping on the mole image 91 with a foot wearing the retroreflective sheet 17L or 17R with good timing. This is because, on the screen 21, the position of the retroreflective sheet and the position of the cursor match or substantially match.
- although the hole images 83 are displayed in a single horizontal row here, they can also be displayed in a plurality of horizontal rows. The more rows there are, the higher the difficulty. Further, the number of hole images 83 can be arbitrary. Also, a plurality of mole images 91 can appear simultaneously from the plurality of hole images 83; the greater the number of mole images 91 that appear simultaneously, the higher the difficulty level. Moreover, the difficulty level can be adjusted by adjusting the appearance interval of the mole images 91.
- FIG. 19 is a view showing an example of the free kick screen 101 projected on the screen 21 of FIG.
- free kick screen 101 includes a ball image 103, an elapsed time display unit 93, a score display unit 95, and cursors 67L and 67R.
- the ball image 103 descends vertically and at a constant speed from the upper end to the lower end of the screen.
- the position on the upper end of the screen at which the ball image 103 appears is randomly determined. Since the ball images 103 appear and descend one after another, the player operates the retroreflective sheet 17L or 17R to move the cursor 67L or 67R to the descending ball image 103. In this case, when the cursor touches the ball image 103 at a speed equal to or higher than a certain value, the ball image 103 is hit back in the opposite direction, and the score on the score display unit 95 is increased by one point.
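- the hit rule described above (the ball is returned only when the cursor touches it at or above a certain speed) can be sketched as follows; the contact radius, speed threshold, and frame interval are illustrative assumptions.

```python
def cursor_speed(prev, cur, dt=1 / 60):
    """Cursor speed in pixels per second from two successive positions."""
    dx, dy = cur[0] - prev[0], cur[1] - prev[1]
    return (dx * dx + dy * dy) ** 0.5 / dt

def kicked(prev, cur, ball, radius=10, min_speed=300, dt=1 / 60):
    """The ball is hit back only if the cursor touches it while moving at
    or above a speed threshold (all numeric values are assumptions)."""
    dx, dy = cur[0] - ball[0], cur[1] - ball[1]
    touching = dx * dx + dy * dy <= radius * radius
    return touching and cursor_speed(prev, cur, dt) >= min_speed

# A fast kick (10 px in one frame = 600 px/s) versus a slow touch (60 px/s).
fast = kicked((0, 0), (8, 6), (10, 6))
slow = kicked((7, 6), (8, 6), (10, 6))
```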
- the elapsed time display section 93 displays the countdown result from 30 seconds, and the game ends when the result reaches 0 seconds.
- the player 15 can bring the corresponding cursor 67L or 67R into contact with the ball image 103 by performing an action of kicking the ball image 103 with a foot wearing the retroreflective sheet 17L or 17R in a timely manner. This is because, on the screen 21, the position of the retroreflective sheet and the position of the cursor match or substantially match.
- FIG. 20 is a view showing an example of the one-leg jump screen 111 projected on the screen 21 of FIG. On this one-leg jump screen 111, the player 15 jumps continuously on one leg, playing the first 15 seconds with the left leg and the second 15 seconds with the right leg.
- the single leg jump screen 111 includes a left leg score display unit 115, a right leg score display unit 119, an elapsed time display unit 117, a guide image 113, and cursors 67L and 67R.
- when the cursor 67L overlaps the guide image 113, the score on the left leg score display unit 115 is increased by one point, and the guide image 113 moves to another position.
- the player 15 jumps with the left leg so that the cursor 67L overlaps the guided image 113 after movement.
- the score of the left leg score display unit 115 increases by one point, and the guide image 113 moves to another position.
- the guide image 113 moves around the three vertices of a triangle counterclockwise.
- after the first 15 seconds elapse, a guide instructing the player to play with the right leg is displayed.
- when the cursor 67R overlaps the guide image 113, the score on the right leg score display unit 119 is increased by one point, and the guide image 113 moves to another position.
- the player 15 jumps with the right leg so that the cursor 67R overlaps the guide image 113 after the movement.
- the score on the right leg score display unit 119 increases by one point, and the guide image 113 moves to another position.
- the guide image 113 moves around the three vertices of a triangle clockwise.
- the elapsed time display unit 117 displays the countdown result from 30 seconds, and the game ends when the result reaches 0 seconds.
- when a play with the left leg is instructed, a guide image 113 representing the left foot sole is displayed, and when a play with the right leg is instructed, a guide image 113 representing the right foot sole is displayed.
- the player 15 can move the corresponding cursor 67L or 67R to the guide image 113 by stepping on the guide image 113 with the foot wearing the retroreflective sheet 17L or 17R. This is because, on the screen 21, the position of the retroreflective sheet and the position of the cursor match or substantially match.
- FIG. 21 is a view showing an example of the jumping screen 121 on both legs projected on the screen 21 in FIG.
- the both-leg jump screen 121 includes an elapsed time display unit 117, a score display unit 127, three lines 129 extending vertically, a guidance image 123, and cursors 67L and 67R.
- the screen is divided into four regions 135 by three lines 129.
- the player 15 jumps with both legs. Specifically, the player 15 tries to jump over the line 129 with both legs in accordance with the guidance image 123.
- the elapsed time display unit 117 displays the countdown result from 30 seconds, and the game ends when the result reaches 0 seconds.
- the player 15 can move the corresponding cursors 67L and 67R to the area 135 by jumping and moving to the area 135 where the guide image 123 is located with the feet wearing the retroreflective sheets 17L and 17R. This is because, on the screen 21, the position of the retroreflective sheet and the position of the cursor match or substantially match.
- FIG. 22 is a view showing an example of a single leg standing screen 151 projected on the screen 21 of FIG.
- the player 15 performs, for 30 seconds each, standing on the left leg with the eyes open, standing on the right leg with the eyes open, standing on the left leg with the eyes closed, and standing on the right leg with the eyes closed.
- the one leg standing screen 151 includes an elapsed time display unit 117, a sole image 155, an instruction column 154, and a cursor 67L or 67R.
- the instruction column 154 indicates, by an image representing characters and eyes, any one of standing on the left leg with the eyes open, standing on the right leg with the eyes open, standing on the left leg with the eyes closed, and standing on the right leg with the eyes closed.
- the instructions are given in the order of standing on the left leg with eyes open, standing on the right leg with eyes open, standing on the left leg with eyes closed, and standing on the right leg with eyes closed, for 30 seconds each.
- when the sole image 155 represents the left sole, standing on the left leg is instructed, and when the sole image 155 represents the right sole, standing on the right leg is instructed.
- the instruction column 154 instructs to stand on the right leg with the eyes open.
- the player 15 tries to stand on the right leg so that the cursor 67R overlaps the sole image 155.
- while the cursor 67R overlaps the sole image 155, the OK counter is counted up, and while the cursor 67R does not overlap the sole image 155, the NG counter is counted up.
- the player 15 can hold the corresponding cursor 67L or 67R in the sole image 155 by standing on the sole image 155 with the foot wearing the retroreflective sheet 17L or 17R and standing on one leg. This is because, on the screen 21, the position of the retroreflective sheet and the position of the cursor match or substantially match.
- FIG. 23 is a flowchart showing the flow of preprocessing (processing for obtaining parameters for keystone correction (reference magnification and reference inclination)) by the processor 23 of FIG.
- in step S1, the processor 23 generates the first step video 41 and provides it to the projector 11 (see FIG. 9A). Then, in step S41, the projector 11 mirrors the first step image 41 up and down, and projects it on the screen 21 in step S43.
- in step S3, the processor 23 executes a photographing process of the retroreflective sheet CN placed on the marker m (see the description of FIG. 9A).
- in step S5, the processor 23 calculates the xy coordinates (CX, CY) of the retroreflective sheet CN on the first step image 41.
- in step S7, the processor 23 determines whether or not the player 15 has pressed the enter key (switch unit 22). If pressed, the processor 23 proceeds to step S9; otherwise, it returns to step S1.
- in step S9, the processor 23 stores the calculated coordinates (CX, CY) in the external memory 25.
- in step S11, the processor 23 generates the second step image 45 (see FIG. 9B). Then, in step S45, the projector 11 mirrors the second step image 45 up and down, and projects it on the screen 21 in step S47.
- in step S13, the processor 23 executes a photographing process of the retroreflective sheets LU, RU, RB, and LB placed on the markers d1 to d4 (see the description of FIG. 9B).
- in step S15, the processor 23 calculates the xy coordinates (LUX, LUY), (RUX, RUY), (RBX, RBY), and (LBX, LBY) of the retroreflective sheets LU, RU, RB, and LB on the second step image 45.
- in step S17, the processor 23 determines whether or not the player 15 has pressed the enter key (switch unit 22). If pressed, the processor 23 proceeds to step S19; otherwise, it returns to step S11.
- in step S19, the processor 23 stores the calculated coordinates (LUX, LUY), (RUX, RUY), (RBX, RBY), and (LBX, LBY) in the external memory 25.
- in step S21, the processor 23 calculates the reference magnifications PRUX, PRUY, PLUX, PLUY, PRBX, PRBY, PLBX, and PLBY using the coordinates stored in steps S9 and S19 and equations (1) to (8).
- in step S23, the processor 23 stores the calculated reference magnifications in the external memory 25.
- in step S25, the processor 23 calculates the reference inclinations SRUX, SRUY, SLUX, SLUY, SRBX, SRBY, SLBX, and SLBY based on the coordinates stored in steps S9 and S19, the reference magnifications stored in step S23, and equations (9) to (16).
- in step S27, the processor 23 stores the calculated reference inclinations in the external memory 25.
- in step S29, the processor 23 generates a preprocessing completion video indicating to the player 15 that the preprocessing has been completed, and gives it to the projector 11. Then, in step S49, the projector 11 mirrors the preprocessing completion video up and down, and projects it on the screen 21 in step S51.
- FIG. 24 is a flowchart showing the flow of the photographing process in step S3 of FIG.
- processor 23 causes image sensor 27 to turn on infrared light emitting diode 7.
- the processor 23 causes the image sensor 27 to execute photographing when the infrared light is turned on.
- the processor 23 causes the image sensor 27 to turn off the infrared light emitting diode 7.
- the processor 23 causes the image sensor 27 to execute photographing when the infrared light is extinguished.
- step S69 the processor 23 causes the image sensor 27 to generate and output a difference image (camera image) between the image when the infrared light is turned on and the image when the infrared light is turned off.
- the image sensor 27 performs photographing when the infrared light is turned on and off, that is, flash photography.
- the infrared light emitting diode 7 functions as a stroboscope by the above control.
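The lit/unlit difference imaging described above can be sketched as follows; the array shapes, the threshold value, and the function name are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def strobe_difference(frame_lit, frame_unlit, threshold=64):
    """Subtract the unlit frame from the lit frame so that only light
    synchronized with the strobe (the retroreflected infrared light)
    survives, then threshold away residual noise."""
    diff = np.clip(frame_lit.astype(np.int16) - frame_unlit.astype(np.int16),
                   0, 255).astype(np.uint8)
    return np.where(diff >= threshold, diff, 0)

# ambient light appears in both frames and cancels out; the
# retroreflective sheet is bright only in the lit frame and survives
ambient = np.full((4, 4), 100, dtype=np.uint8)
lit = ambient.copy()
lit[1, 2] = 250          # retroreflective sheet under IR illumination
diff = strobe_difference(lit, ambient)
```

This is the mechanism by which only the retroreflective sheet remains in the camera image handed to the coordinate calculation.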
- the photographing process in step S13 of FIG. 23 is the same as the photographing process of FIG. 24.
- FIG. 25 is a flowchart showing the flow of the coordinate calculation process in step S5 of FIG.
- processor 23 extracts an image of retroreflective sheet CN from the camera image (difference image) received from image sensor 27.
- the processor 23 determines XY coordinates on the camera image of the retroreflective sheet CN based on the image of the retroreflective sheet CN.
- the processor 23 converts the XY coordinates on the camera image of the retroreflective sheet CN into xy coordinates in the screen coordinate system.
- the screen coordinate system is a coordinate system in which video images generated by the processor 23 are arranged.
- in step S87, the processor 23 mirrors the xy coordinates obtained in step S85 up and down to obtain the xy coordinates (CX, CY).
- the reason for performing this processing is as described in FIG. Incidentally, the XY coordinates obtained in step S83 may instead be mirror-inverted up and down, and the resulting coordinates given to step S85. In that case, the output of step S85 is the xy coordinates (CX, CY), and step S87 is unnecessary.
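Steps S85 and S87 can be sketched as below; the camera and screen resolutions are illustrative assumptions.

```python
def camera_to_screen(X, Y, cam_w=64, cam_h=64, scr_w=256, scr_h=192):
    """Step S85: scale camera-image XY coordinates into the screen
    coordinate system; step S87: mirror vertically, because the
    projector flips the video image up and down before projecting."""
    x = X * scr_w / cam_w
    y = Y * scr_h / cam_h
    return x, scr_h - 1 - y      # vertical mirror

CX, CY = camera_to_screen(32, 16)
```

The same conversion, with different input coordinates, serves both step S5 and step S15.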
- the coordinate calculation process in step S15 of FIG. 23 is the same as the coordinate calculation process of FIG. 25, except that the retroreflective sheet CN is replaced with the retroreflective sheets LU, RU, RB, and LB, and the xy coordinates (CX, CY) are replaced with the xy coordinates (LUX, LUY), (RUX, RUY), (RBX, RBY), and (LBX, LBY).
- FIG. 26 is a flowchart showing the flow of the overall processing performed by the processor 23 of FIG. 3, which is executed after completion of the preprocessing of FIG. 23.
- in step S101, the processor 23 executes a photographing process. This process is the same as the photographing process of FIG. 24.
- in step S103, the processor 23 calculates the xy coordinates (PXL, PYL) and (PXR, PYR) of the retroreflective sheets 17L and 17R on the video image. This process is the same as the coordinate calculation process of FIG. 25, except that the retroreflective sheet CN is replaced with the retroreflective sheets 17L and 17R, and the xy coordinates (CX, CY) are replaced with the xy coordinates (PXL, PYL) and (PXR, PYR).
- in step S105, the processor 23 performs keystone correction on the coordinates (PXL, PYL) and (PXR, PYR) obtained in step S103 based on equations (17) to (40), thereby obtaining the corrected coordinates (PX#L, PY#L) and (PX#R, PY#R).
- in step S107, the processor 23 sets the keystone-corrected coordinates (PX#L, PY#L) and (PX#R, PY#R) as the coordinates of the cursors 67L and 67R, respectively. Therefore, the coordinates of the cursors 67L and 67R and the keystone-corrected coordinates of the retroreflective sheets 17L and 17R on the video image are synonymous.
- step S109 the processor 23 executes a game process (for example, control of various screens in FIGS. 16 to 22).
- the processor 23 generates a video image (for example, various screens in FIGS. 16 to 22) according to the processing result in step S109, gives it to the projector 11, and returns to step S101.
- the projector 11 mirrors the video image received from the processor 23 up and down and projects it onto the screen 21.
- when it is not necessary to distinguish PXL and PXR, they are denoted PX; when it is not necessary to distinguish PYL and PYR, they are denoted PY; when it is not necessary to distinguish PX#L and PX#R, they are denoted PX#; and when it is not necessary to distinguish PY#L and PY#R, they are denoted PY#.
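The overall loop of FIG. 26 can be sketched as below; all of the helper callables are hypothetical stand-ins, and the S111 label for the rendering step is an assumption (the excerpt does not number that step).

```python
def overall_loop(capture, locate, keystone_correct, game_step, render, frames=3):
    """One-iteration-per-frame sketch of FIG. 26: photograph (S101),
    locate the sheets (S103), keystone-correct (S105), move the cursors
    (S107), run the game (S109), and render the video image."""
    cursors = {}
    for _ in range(frames):
        camera_image = capture()                       # S101
        raw = locate(camera_image)                     # S103: {'L': (x, y), 'R': (x, y)}
        corrected = {k: keystone_correct(p) for k, p in raw.items()}  # S105
        cursors = corrected                            # S107: cursor == corrected sheet position
        state = game_step(cursors)                     # S109
        render(state)                                  # S111 (hypothetical step number)
    return cursors

# stub run: the loop just threads the corrected coordinates through
final = overall_loop(
    capture=lambda: None,
    locate=lambda img: {'L': (10, 20), 'R': (30, 40)},
    keystone_correct=lambda p: (p[0] + 1, p[1] + 1),
    game_step=lambda cursors: cursors,
    render=lambda state: None,
)
```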
- FIG. 27 is a flowchart showing the trapezoid correction process in step S105 of FIG. 26.
- in step S121, the processor 23 calculates the individual magnifications from the xy coordinates (PX, PY) of the retroreflective sheet 17 stored in step S103 of FIG. 26, the xy coordinates (LUX, LUY), (RUX, RUY), (RBX, RBY), and (LBX, LBY) stored in step S19 of FIG. 23, the reference magnifications PRUX, PRUY, PLUX, PLUY, PRBX, PRBY, PLBX, and PLBY stored in step S23 of FIG. 23, and the reference inclinations stored in step S27 of FIG. 23.
- in step S123, the processor 23 calculates the keystone-corrected xy coordinates (PX#, PY#) based on the xy coordinates (PX, PY) of the retroreflective sheet 17 stored in step S103 of FIG. 26, the individual magnifications calculated in step S121, and equations (19), (22), (25), (28), (31), (34), (37), and (40).
- in step S125, the processor 23 determines whether or not the processing in steps S121 and S123 has been completed for both the left and right retroreflective sheets 17L and 17R; if not completed, the processor 23 returns to step S121, and if completed, the process returns.
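Equations (17) to (40) are not reproduced in this excerpt, so the sketch below only illustrates one plausible structure for steps S121 and S123: interpolate an individual magnification for the sheet's position from reference values at two edges, then scale the offset from the screen center by it. Every constant and name here is an assumption, not the patent's actual formula.

```python
def individual_magnification(py, mag_top, mag_bottom, y_top, y_bottom):
    """Hypothetical step S121: linearly interpolate between the reference
    magnification at the top edge and at the bottom edge, according to
    the y coordinate of the sheet."""
    t = (py - y_top) / (y_bottom - y_top)
    return mag_top + (mag_bottom - mag_top) * t

def keystone_correct(px, py, cx, cy,
                     mag_top=1.0, mag_bottom=2.0, y_top=0.0, y_bottom=192.0):
    """Hypothetical step S123: scale the offset from the screen center
    (cx, cy) by the individual magnification for this point."""
    m = individual_magnification(py, mag_top, mag_bottom, y_top, y_bottom)
    return cx + (px - cx) * m, cy + (py - cy) * m
```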
- FIG. 28 is a flowchart showing the flow of the first example of the game process in step S109 of FIG. 26. For example, the screen control of FIGS. 16 and 17 is executed by the processing of FIG. 28.
- in step S143, the processor 23 determines whether or not both cursors 67L and 67R overlap an icon (in the examples of FIGS. 16 and 17, the icons 63, 65, 73, 75, and 77). If it is determined that they overlap, the process proceeds to step S145; otherwise, the process proceeds to step S151. In step S145, the processor 23 counts up the timer and proceeds to step S147. In step S147, the processor 23 refers to the timer to determine whether or not a predetermined time (3 seconds in the examples of FIGS. 16 and 17) has elapsed since the cursors 67L and 67R overlapped the icon. If it has elapsed, the process proceeds to step S149; otherwise, the process returns.
- in step S149, the processor 23 sets another selection screen or a game start screen according to the icon on which the cursors 67L and 67R overlap, and returns. If "NO" is determined in step S143, then in step S151, the processor 23 resets the timer to 0 and returns.
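The dwell-selection logic of FIG. 28 can be sketched as follows; the frame rate and the frame-based timer are assumptions (the excerpt only specifies the 3-second dwell).

```python
FPS = 60                       # assumed frame rate
DWELL_FRAMES = 3 * FPS         # "predetermined time" of 3 seconds

def dwell_select(overlap_per_frame):
    """FIG. 28 sketch: count frames while both cursors overlap an icon
    (S145); reset the timer whenever they do not (S151); fire the
    selection once 3 seconds have elapsed (S147 -> S149)."""
    timer = 0
    for overlapping in overlap_per_frame:
        if overlapping:            # S143: both cursors on the icon?
            timer += 1             # S145
            if timer >= DWELL_FRAMES:
                return True        # S149: icon selected
        else:
            timer = 0              # S151
    return False

# a brief touch does not select; a sustained 3-second overlap does
brief = [True] * 30 + [False]
held = [True] * DWELL_FRAMES
```

The reset in S151 is what prevents a cursor that merely sweeps across an icon from triggering it.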
- FIG. 29 is a flowchart showing the flow of the second example of the game process in step S109 of FIG. 26. For example, the screen control of FIG. 18 is executed by the processing of FIG. 29.
- in step S161, the processor 23 determines whether or not it is the setting timing of the target (the mole image 91 in the example of FIG. 18). If it is the setting timing, the process proceeds to step S163; otherwise, the process proceeds to step S165.
- in step S163, the processor 23 sets a target animation (in the example of FIG. 18, an animation in which the mole image 91 appears from one of the four hole images 83).
- in step S165, the processor 23 determines whether one of the cursors 67L and 67R overlaps the target; if it overlaps, the process proceeds to step S167, otherwise to step S171.
- in step S167, the processor 23 executes a point addition process on the score display unit 95.
- in step S169, the processor 23 sets an effect (image and sound) indicating success.
- in step S171, the processor 23 determines whether or not the play time of the elapsed time display unit 93 has reached 0. If it is 0, the process proceeds to step S173; otherwise, the process returns. In step S173, the processor 23 ends the game, sets the selection screen, and returns.
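The per-frame target handling of FIG. 29 can be sketched roughly as below; the rectangle representation of the mole, the cursor list, and the idea that the mole disappears once hit are simplifying assumptions.

```python
def mole_game_frame(state, cursors, mole_rect, now_is_spawn_time):
    """One pass of FIG. 29 (sketch): spawn a mole at the setting timing
    (S161/S163), add a point when a cursor overlaps it (S165/S167)."""
    if now_is_spawn_time:                     # S161
        state['mole'] = mole_rect             # S163: animation setting, simplified to a rect
    mole = state.get('mole')
    if mole and any(point_in_rect(c, mole) for c in cursors):  # S165
        state['score'] += 1                   # S167
        state['mole'] = None                  # assumed: the hit mole retreats
    return state

def point_in_rect(p, rect):
    x, y = p
    x0, y0, x1, y1 = rect
    return x0 <= x <= x1 and y0 <= y <= y1

# a cursor (a foot-worn sheet's corrected position) lands on the mole
state = mole_game_frame({'score': 0}, cursors=[(5, 5)],
                        mole_rect=(0, 0, 10, 10), now_is_spawn_time=True)
```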
- FIG. 30 is a flowchart showing the flow of the third example of the game process in step S109 of FIG. 26. For example, the screen control of FIG. 19 is executed by the processing of FIG. 30.
- in step S241, the processor 23 determines whether or not it is the setting timing of the target animation (the ball image 103 in the example of FIG. 19). If it is the setting timing, the process proceeds to step S243; otherwise, the process proceeds to step S245.
- in step S243, the processor 23 sets a target animation (in the example of FIG. 19, an animation in which the ball image 103 appears from the upper edge of the screen and descends).
- in step S245, the processor 23 calculates the y components vcL and vcR of the speeds of the cursors 67L and 67R. In the figure, the y components vcL and vcR are collectively denoted vc.
- in step S247, the processor 23 determines whether one of the cursors 67L and 67R overlaps (including contacts) the target. If it overlaps, the process proceeds to step S249; otherwise, the process proceeds to step S255. In step S249, the processor 23 determines whether or not the y component of the speed of the cursor that touched the target exceeds the threshold value Thv; if it exceeds the threshold, the process proceeds to step S251, otherwise to step S255.
- in step S251, the processor 23 executes a point addition process on the score display unit 95.
- in step S253, the processor 23 sets an effect (image and sound) indicating success.
- in step S255, the processor 23 determines whether or not the play time of the elapsed time display unit 93 has reached 0; if it is 0, the process proceeds to step S257, otherwise the process returns. In step S257, the processor 23 ends the game, sets the selection screen, and returns.
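The velocity-gated hit test of steps S245 to S249 can be sketched as below; the threshold value, the per-frame velocity estimate, and the rectangle hit test are assumptions.

```python
THV = 4.0   # assumed threshold Thv on the cursor's y speed (pixels/frame)

def stomp_hit(prev_y, cur_y, cursor, ball_rect):
    """FIG. 30 sketch: the hit counts only if the cursor overlaps the
    ball (S247) AND the y component of its speed exceeds Thv (S249)."""
    vc = cur_y - prev_y                      # S245: y component of cursor speed
    overlapping = point_in_rect(cursor, ball_rect)
    return overlapping and vc > THV

def point_in_rect(p, rect):
    x, y = p
    x0, y0, x1, y1 = rect
    return x0 <= x <= x1 and y0 <= y <= y1
```

The extra S249 condition is what forces the player to actually swing the foot at the ball rather than rest on it.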
- FIG. 31 is a flowchart showing the flow of the fourth example of the game process in step S109 of FIG. 26. For example, the screen control of FIGS. 20 and 21 is executed by the processing of FIG. 31.
- in step S193, the processor 23 determines whether or not the cursor (the one of the cursors 67L and 67R corresponding to the indicated foot in the example of FIG. 20; both cursors 67L and 67R in the example of FIG. 21) overlaps the target (the guide image 113 in the example of FIG. 20; the region 135 where the guide image 123 is located in the example of FIG. 21). If it overlaps, the process proceeds to step S195; otherwise, the process proceeds to step S199.
- in step S195, the processor 23 executes a point addition process on the score display unit (in the example of FIG. 20, the one of the score display units 115 and 119 corresponding to the indicated foot; in the example of FIG. 21, the score display unit 127).
- in step S197, the processor 23 changes the setting (position) of the target (the guide image 113 in the example of FIG. 20; the guide image 123 in the example of FIG. 21).
- in step S199, the processor 23 determines whether or not one play time (15 seconds in the example of FIG. 20; 30 seconds in the example of FIG. 21) of the elapsed time display unit 117 has ended. If it has ended, the process proceeds to step S200; otherwise, the process returns.
- in step S200, the processor 23 determines whether or not all plays (the left and right legs in the example of FIG. 20; only one play in the example of FIG. 21) have ended. If they have ended, the process proceeds to step S201; otherwise, the process proceeds to step S203.
- in step S203, the processor 23 changes the setting of the target (the guide image 113 in the example of FIG. 20) and returns.
- in step S201, the processor 23 ends the game, sets the selection screen, and returns.
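The score-and-move logic of FIG. 31 can be sketched as below, using the triangle-vertex movement described for FIG. 20 (counterclockwise for the left leg, clockwise for the right). The vertex labels and the state layout are assumptions.

```python
VERTICES = ['top', 'lower_left', 'lower_right']   # assumed vertex labels

def guide_game_frame(state, cursor_on_target, clockwise):
    """FIG. 31 sketch: if the indicated cursor overlaps the guide image
    113 (S193), add one point (S195) and move the guide to the next
    triangle vertex (S197), stepping through the vertex list in the
    direction chosen for the current leg."""
    if cursor_on_target:
        state['score'] += 1                                   # S195
        i = VERTICES.index(state['guide'])
        state['guide'] = VERTICES[(i + (1 if clockwise else -1)) % 3]  # S197
    return state

# right-leg play: the player lands on the guide, which then moves clockwise
state = guide_game_frame({'score': 0, 'guide': 'top'},
                         cursor_on_target=True, clockwise=True)
```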
- FIG. 32 is a flowchart showing the flow of the fifth example of the game process in step S109 of FIG. 26. For example, the screen control of FIG. 22 is executed by the processing of FIG. 32.
- in step S211, the processor 23 determines whether any of the cursors 67L and 67R overlaps the target (the sole image 155 in the example of FIG. 22). If it overlaps, the process proceeds to step S213; otherwise, the process proceeds to step S215. In step S213, the processor 23 counts up an OK timer that counts the time during which one of the cursors 67L and 67R overlaps the target. On the other hand, in step S215, the processor 23 counts up an NG timer that counts the time during which neither of the cursors 67L and 67R overlaps the target.
- in step S217, the processor 23 determines whether or not one play time (30 seconds in the example of FIG. 22) of the elapsed time display unit 117 has ended; if it has ended, the process proceeds to step S219, otherwise the process returns.
- in step S219, the processor 23 determines whether or not all plays (in the example of FIG. 22, the left one-leg stand with eyes open, the right one-leg stand with eyes open, the left one-leg stand with eyes closed, and the right one-leg stand with eyes closed) have been completed. If completed, the process proceeds to step S223; otherwise, the process proceeds to step S221.
- in step S221, the processor 23 changes the setting of the target (the sole image 155 and the instruction column 154 in the example of FIG. 22), and returns.
- in step S223, the processor 23 ends the game, sets the selection screen, and returns.
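The OK/NG accounting of steps S211 to S215 reduces to a per-frame tally; a minimal sketch, with a boolean overlap flag per frame as an assumed input:

```python
def one_leg_stand(overlap_per_frame):
    """FIG. 32 sketch: per frame, count up the OK timer while a cursor
    overlaps the sole image 155 (S213) and the NG timer otherwise (S215)."""
    ok = ng = 0
    for overlapping in overlap_per_frame:   # S211
        if overlapping:
            ok += 1                          # S213
        else:
            ng += 1                          # S215
    return ok, ng

# the player wobbles off the sole image twice during the play
ok, ng = one_leg_stand([True, True, False, True, False])
```

The ratio of the two timers over the 30-second play gives a direct measure of how steadily the player kept the one-leg stand.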
- the position of the retroreflective sheet (subject) 17 in the real space and the position of the cursor 67 in the video image projected on the screen 21 are adjusted so as to match or substantially match. Therefore, the player 15 can perform input to the processor 23 by moving the retroreflective sheet 17 on the video image displayed on the screen 21 and directly pointing at a desired position in the video image with the retroreflective sheet 17. This is possible because the position of the retroreflective sheet 17 in the real space substantially coincides with the position of the cursor 67 in the projected video image, so that the processor 23 can recognize the position in the video image at which the retroreflective sheet 17 is placed.
- when the retroreflective sheet 17 moves from the back to the front as viewed from the image sensor 27, the position of the cursor 67 is determined so that the projected cursor 67 moves from the back to the front as viewed from the image sensor 27.
- when the retroreflective sheet 17 moves from the front to the back as viewed from the image sensor 27, the position of the cursor 67 is determined so that the projected cursor 67 moves from the front to the back.
- when the retroreflective sheet 17 moves from right to left as viewed from the image sensor 27, the position of the cursor 67 is determined so that the projected cursor 67 moves from right to left.
- when the retroreflective sheet 17 moves from left to right as viewed from the image sensor 27, the position of the cursor 67 is determined so that the projected cursor 67 moves from left to right.
- on the other hand, when the retroreflective sheet is imaged from a position in front of the player looking up at the retroreflective sheet (hereinafter, "the case of looking up"), usually, when the retroreflective sheet moves from the back to the front as viewed from the image sensor, the position of the cursor is determined so that the cursor moves upward as seen by the player viewing the video image displayed on the vertically installed screen, and when the retroreflective sheet moves from the front to the back as viewed from the image sensor, the position of the cursor is determined so that the cursor moves downward.
- by contrast, if, in the case of a bird's-eye view (looking down), the cursor were controlled with the same algorithm as in the case of looking up, then when the retroreflective sheet moves from the back to the front as seen from the image sensor, the position of the cursor would be determined so that the cursor moves downward as seen by the player viewing the video image displayed on the vertically installed screen, and when the retroreflective sheet moves from the front to the back as viewed from the image sensor, the position of the cursor would be determined so that the cursor moves upward.
- in that case, the moving direction of the retroreflective sheet operated by the player and the moving direction of the cursor on the screen do not sensuously match. For this reason, input becomes stressful, and smooth input cannot be performed.
- this is because, in general, the image sensor is installed either so that the optical axis vector V of the image sensor has no vertical component (that is, the imaging surface is parallel to a vertical plane) or so that the vertical component Vv of the optical axis vector V faces vertically upward, with the vertical direction of the image sensor coinciding with the vertical direction of the player, and users are accustomed to such use.
- the direction from the end point of the vertical component Vv of the optical axis vector V of the image sensor to the start point is the lower side of the image sensor, and the direction from the start point to the end point is the upper side of the image sensor (see FIG. 4).
- the direction from the player's foot toward the head is defined as the player's upper direction, and the direction from the head toward the foot is defined as the player's lower side.
- the keystone correction is performed on the position of the retroreflective sheet 17 obtained based on the camera image.
- the retroreflective sheet 17 on the surface to be imaged is imaged by the image sensor 27 arranged so that the optical axis is inclined with respect to the surface to be imaged.
- the movement of the retroreflective sheet 17 operated by the player and the movement of the cursor 67 match or substantially match.
- keystone correction is applied to the position of the retroreflective sheet 17 that determines the position of the cursor 67.
- the player can perform an input with a sense of incongruity suppressed as much as possible.
- the infrared light emitting diode 7 is intermittently driven, a difference image (camera image) between the lit state and the unlit state is generated, and the movement of the retroreflective sheet 17 is analyzed based on it.
- noise due to light other than the reflected light from the retroreflective sheet 17 can be removed as much as possible, and only the retroreflective sheet 17 can be detected with high accuracy.
- various objects are displayed on the projected video image; these can be used as icons for issuing commands, or as various items in a video game.
- the processor 23 determines whether or not the cursor 67 contacts or overlaps a moving predetermined image (for example, the ball image 103 in FIG. 19) so as to satisfy a predetermined condition (for example, step S249 in FIG. 30). For this reason, the player 15 must operate the retroreflective sheet 17 not merely so that the cursor 67 touches the predetermined image, but so that the predetermined condition is satisfied. Thereby, the game quality and the difficulty can be improved.
- the predetermined condition is that the cursor 67 exceeds a certain speed in the game of FIG. 30, but a condition according to the game specifications can be set.
- the camera unit 5 captures images from a position overlooking the retroreflective sheet 17. Therefore, the player 15 can operate the cursor 67 by moving the retroreflective sheet 17 on the screen 21 placed on the floor surface, or on the floor surface itself. In the above, the player 15 moves with the retroreflective sheets 17 attached to his or her feet. Therefore, the present invention can be applied to games and exercises that use the feet.
- the parameters for trapezoidal correction can be obtained easily, simply by having the player 15 place the retroreflective sheets CN, LU, RU, RB, and LB on the markers m and d1 to d4 projected on the screen 21.
- since the retroreflective sheets CN, LU, RU, RB, and LB are placed on the markers m and d1 to d4 arranged at a plurality of positions on the projected video image, the accuracy of the parameters for keystone correction can be further improved.
- Embodiment 2 describes another example of keystone correction.
- the video image generated by the processor 23 is projected on the screen 21.
- an example in which the video image generated by the processor 23 is displayed on a display device such as a television monitor having a vertical screen is given.
- FIG. 33 is a diagram showing an electrical configuration of the entertainment system according to the second embodiment of the present invention.
- this entertainment system includes information processing device 3, retroreflective sheets (retroreflective members) 17L and 17R for retroreflecting received light, and television monitor 200. Further, the information processing apparatus 3 includes the same camera unit 5 as in the first embodiment.
- a television monitor 200 is provided instead of the projector 11 and the screen 21 of FIG. Therefore, in the second embodiment, the video signal VD and the audio signal AU generated by the processor 23 are given to the television monitor 200.
- in the camera image 33, the origin is the upper left corner, the horizontal axis is the X axis with the positive direction pointing horizontally to the right, and the vertical axis is the Y axis with the positive direction pointing vertically downward.
- the player 15 attaches the retroreflective sheet 17L with the rubber band 19 to the back of the left leg, and attaches the retroreflective sheet 17R with the rubber band 19 to the back of the right leg.
- the information processing device 3 is installed in front of the player 15 (for example, about 0.7 meters away) at a predetermined height from the floor (for example, 0.4 meters), so that the camera unit 5 photographs the floor surface at a predetermined depression angle (for example, 30 degrees).
- the height may be adjustable.
- the television monitor 200 is disposed in front of the player 15 and above the information processing device 3, either behind the information processing device 3 (as viewed from the player 15) or directly above it. The camera unit 5 therefore looks diagonally forward and downward at the retroreflective sheets 17L and 17R.
- FIG. 34 (a) is a diagram for explaining the necessity of trapezoid correction of the X coordinate in the present embodiment.
- consider the case where the player 15 moves the retroreflective sheet 17 straight along the Y# axis (see FIG. 4) within the photographing effective area 31, as indicated by an arrow 226.
- since the camera unit 5 overlooks the retroreflective sheet 17, trapezoidal distortion occurs, and the image of the retroreflective sheet 17 moves so as to open outward on the effective area corresponding image 35 of the camera image 33, as indicated by an arrow 222 and, on the other side, by an arrow 220.
- the trapezoidal distortion increases as the distance from the camera unit 5 increases, and in the photographing effective region 31, the pixel density decreases as the distance from the camera unit 5 increases, and increases as the distance decreases. .
- FIG. 34B is an explanatory diagram of a first example of trapezoidal correction for the X coordinate (horizontal coordinate) Xp of the retroreflective sheet 17 on the effective area corresponding image 35 of the camera image 33.
- trapezoidal correction is performed on the X coordinate Xp with the side a1-a2 of the effective photographing region 31 as a reference, that is, the side a1-a2 is set to “1”.
- the correction coefficient (X correction coefficient) cx (Y) of the X coordinate Xp of the image of the retroreflective sheet 17 draws a curve 228 according to the Y coordinate of the image of the retroreflective sheet 17. That is, the X correction coefficient cx (Y) is a function of Y.
- the X correction coefficient cx (Y) takes a maximum value of 1 when the Y coordinate of the image is the same as the Y coordinate Y0 of the side b1-b2 (corresponding to the side a1-a2) of the effective area corresponding image 35, and takes a minimum value D1 (0 < D1 < 1) when the Y coordinate is the same as the Y coordinate Y1 of the side b4-b3 (corresponding to the side a4-a3) of the effective area corresponding image 35.
- a table (X table) in which the Y coordinate and the X correction coefficient cx (Y) are associated is prepared in the external memory 25 in advance.
- the processor 23 obtains the X coordinate Xf after the keystone correction by the following equation.
- the center coordinates of the effective area corresponding image 35 are assumed to be (Xc, Yc).
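Equation (41) itself is not reproduced in this excerpt. A plausible reading, consistent with cx (Y) being 1 on the reference side and smaller where the pixel density is lower, divides the horizontal offset from the center Xc by cx (Y). The linear stand-in for the X table and all constants below are assumptions.

```python
Y0, Y1 = 0.0, 100.0       # assumed rows of sides b1-b2 and b4-b3
D1 = 0.5                  # assumed minimum X correction coefficient (0 < D1 < 1)

def cx(y):
    """Linear stand-in for the X table: 1 at Y0, D1 at Y1 (the real
    coefficient follows the curve 228 and is looked up in a table)."""
    return 1.0 + (D1 - 1.0) * (y - Y0) / (Y1 - Y0)

def correct_x(xp, yp, xc=50.0):
    """Plausible reading of equation (41): divide the horizontal offset
    from the center Xc by cx (Y), expanding rows whose pixel density is
    lower so that a straight real-world path stays straight."""
    return xc + (xp - xc) / cx(yp)
```

On the reference row the coordinate is unchanged; on the far row the same offset is stretched back out.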
- FIG. 34C is an explanatory diagram of a second example of trapezoidal correction for the X coordinate (horizontal coordinate) Xp of the retroreflective sheet 17 on the effective area corresponding image 35 of the camera image 33.
- trapezoidal correction is performed on the X coordinate Xp with the side a4-a3 of the effective photographing region 31 as a reference, that is, the side a4-a3 is set to “1”.
- the correction coefficient (X correction coefficient) cx (Y) of the X coordinate Xp of the image of the retroreflective sheet 17 draws a curve 230 according to the Y coordinate of the image of the retroreflective sheet 17. That is, the X correction coefficient cx (Y) is a function of Y.
- the X correction coefficient cx (Y) takes a maximum value D2 (D2 > 1) when the Y coordinate of the image is the same as the Y coordinate Y0 of the side b1-b2 (corresponding to the side a1-a2) of the effective area corresponding image 35, and takes a minimum value of 1 when the Y coordinate is the same as the Y coordinate Y1 of the side b4-b3 (corresponding to the side a4-a3).
- a table (X table) in which the Y coordinate and the X correction coefficient cx (Y) are associated is prepared in the external memory 25 in advance.
- the processor 23 obtains the X coordinate Xf after the trapezoid correction by the equation (41).
- FIG. 35 is an explanatory diagram of trapezoidal correction for the Y coordinate (vertical coordinate) Yp of the retroreflective sheet 17 on the effective area corresponding image 35 of the camera image 33.
- as described above, the trapezoidal distortion increases with the distance from the camera unit 5, and in the effective shooting area 31 the pixel density decreases as the distance from the camera unit 5 increases and increases as the distance decreases. For this reason, even when the retroreflective sheet 17 is moved parallel to the Y# axis (see FIG. 4) by a fixed distance in the photographing effective region 31, the movement amount of the image of the retroreflective sheet 17 in the effective area corresponding image 35 decreases as the distance between the camera unit 5 and the retroreflective sheet 17 increases, and increases as the distance decreases.
- the correction coefficient (Y correction coefficient) cy (Y) of the Y coordinate Yp of the image of the retroreflective sheet 17 draws a curve 232 according to the Y coordinate of the image of the retroreflective sheet 17. That is, the Y correction coefficient cy (Y) is a function of Y.
- the Y correction coefficient cy (Y) takes a maximum value of 1 when the Y coordinate of the image is the same as the Y coordinate Y0 of the side b1-b2 (corresponding to the side a1-a2) of the effective area corresponding image 35, and takes a minimum value D3 (D3 > 0) when the Y coordinate is the same as the Y coordinate Y1 of the side b4-b3 (corresponding to the side a4-a3) of the effective area corresponding image 35.
- a table (Y table) in which the Y coordinate and the Y correction coefficient cy (Y) are associated is prepared in the external memory 25 in advance.
- the processor 23 obtains the Y coordinate Yf after the keystone correction by the equation (42).
- the Y coordinate Yp is trapezoidally corrected with the side a1-a2 of the effective imaging area 31 as the reference, that is, with the side a1-a2 set to "1".
- trapezoidal correction may instead be performed on the Y coordinate Yp with the side a4-a3 of the effective photographing area 31 as the reference, that is, with the side a4-a3 set to "1".
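The bullets above give only the endpoint values of the correction-coefficient curves 231 and 232. A minimal sketch of how the X and Y tables could be precomputed follows; linear interpolation between the stated endpoints, and cx(Y) reaching 1 at Y1, are assumptions (the patent's actual curves are not reproduced in this excerpt):

```python
def build_tables(y0, y1, d2, d3, n):
    """Precompute the X and Y correction-coefficient tables.

    Only the endpoints are given in the text: cx(Y) peaks at D2 (> 1) at
    Y0 (side b1-b2), and cy(Y) falls from 1 at Y0 to D3 (> 0) at Y1
    (side b4-b3).  The interpolation shape is an assumption of this sketch.
    """
    x_table, y_table = {}, {}
    for y in range(n):
        t = (y - y0) / (y1 - y0)            # 0 at side b1-b2, 1 at side b4-b3
        x_table[y] = d2 + t * (1.0 - d2)    # D2 at Y0, assumed to reach 1 at Y1
        y_table[y] = 1.0 + t * (d3 - 1.0)   # 1 at Y0, D3 at Y1
    return x_table, y_table
```

In practice the tables would be stored in the external memory 25 once, then looked up per frame with the Y coordinate as the index.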
- FIG. 36 is a flowchart showing the flow of the coordinate calculation process in step S103 of FIG. 26 in the second embodiment.
- the processor 23 extracts the image of the retroreflective sheet 17 from the camera image (difference image) received from the image sensor 27.
- the processor 23 determines the XY coordinates on the camera image of the retroreflective sheet 17 based on the image of the retroreflective sheet 17.
- FIG. 37 is a flowchart showing the trapezoid correction process in step S105 of FIG. 26 in the second embodiment.
- the processor 23 acquires the corresponding X correction coefficient cx from the X table using the Y coordinate of the image of the retroreflective sheet 17 as an index.
- the processor 23 calculates the corrected X coordinate Xf according to the equation (41).
- in step S325, the processor 23 acquires the corresponding Y correction coefficient cy from the Y table using the Y coordinate of the image of the retroreflective sheet 17 as an index.
- in step S327, the processor 23 calculates the corrected Y coordinate Yf by the equation (42).
- in step S329, the processor 23 converts the corrected X coordinate Xf and Y coordinate Yf into the screen coordinate system to obtain the xy coordinates.
- in step S331, the processor 23 mirrors the xy coordinates in the screen coordinate system vertically.
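The coefficient lookup, correction, screen-coordinate conversion, and vertical mirroring steps can be sketched as one pipeline. Equations (41) and (42) are not reproduced in this excerpt, so the multiplicative forms Xf = cx(Yp)·Xp and Yf = cy(Yp)·Yp are assumptions, as are the scaling and mirroring formulas:

```python
def correct_and_map(xp, yp, x_table, y_table, cam_w, cam_h, scr_w, scr_h):
    """Sketch of the trapezoid-correction flow of FIG. 37 (through step S331).

    The multiplicative correction and the linear camera-to-screen scaling
    are assumptions of this sketch, not the patent's exact equations.
    """
    cx = x_table[int(yp)]        # acquire X coefficient, indexed by the Y coordinate
    xf = xp * cx                 # corrected X coordinate (assumed form of eq. 41)
    cy = y_table[int(yp)]        # S325: acquire Y coefficient, same index
    yf = yp * cy                 # S327: corrected Y coordinate (assumed form of eq. 42)
    x = xf * scr_w / cam_w       # S329: convert to the screen coordinate system
    y = yf * scr_h / cam_h
    y = scr_h - 1 - y            # S331: mirror the y coordinate vertically
    return x, y
```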
- the position of the cursor 67 is determined so that the cursor 67 moves from the bottom to the top of the screen.
- the position of the cursor 67 is determined so that the cursor 67 moves from the top to the bottom of the screen.
- the position of the cursor is determined so that the cursor moves upward; when the retroreflective sheet moves from the front to the back as viewed from the image sensor, the position of the cursor is determined so that the cursor moves downward when the player views the video image displayed on the television monitor.
- if the cursor is controlled with the same algorithm as in the bird's-eye view case, then when the retroreflective sheet moves from the back to the front as viewed from the image sensor, the cursor position is determined so that the cursor moves upward as the player views the video image displayed on the television monitor.
- the moving direction of the retroreflective sheet operated by the player and the moving direction of the cursor on the television monitor then do not intuitively coincide. For this reason, input becomes stressful, and smooth input cannot be performed.
- this is because, when the optical axis vector V of the image sensor has no vertical component (that is, when the imaging surface is parallel to a vertical plane), or when the vertical component Vv of the optical axis vector V points vertically upward, the image sensor is installed so that the up-down direction of the image sensor coincides with the up-down direction of the player, and players are accustomed to such use.
- the direction from the end point of the vertical component Vv of the optical axis vector V of the image sensor to the start point is the lower side of the image sensor, and the direction from the start point to the end point is the upper side of the image sensor (see FIG. 4).
- the direction from the player's feet toward the head is defined as the player's upward direction, and the direction from the head toward the feet as the player's downward direction.
- the cursor 67 moves to the right of the screen when the retroreflective sheet 17 moves from the right to the left as viewed from the image sensor 27 without performing special processing.
- when the retroreflective sheet 17 moves from left to right as viewed from the image sensor 27, the position of the cursor 67 is determined so that the cursor 67 moves from left to right on the screen.
- in step S111, the processor 23 generates a video image (FIGS. 16 to 22) according to the processing result of step S109 and provides it to the television monitor 200; in response, the corresponding video is displayed on the television monitor 200.
- trapezoidal correction is performed on the position of the retroreflective sheet 17 obtained based on the camera image.
- the retroreflective sheet 17 on the surface to be imaged is imaged by the image sensor 27 arranged so that the optical axis is inclined with respect to the surface to be imaged.
- the movement of the retroreflective sheet 17 operated by the player and the movement of the cursor 67 match or substantially match. This is because keystone correction is applied to the position of the retroreflective sheet 17 that determines the position of the cursor 67.
- the player can perform input with any sense of incongruity suppressed as much as possible.
- trapezoidal correction is performed according to the distance between the retroreflective sheet 17 and the camera unit 5.
- the trapezoidal distortion of the image of the retroreflective sheet 17 reflected in the camera image increases as the distance between the retroreflective sheet 17 and the camera unit 5 increases. Therefore, appropriate trapezoidal correction according to the distance can be executed.
- the X coordinate (horizontal coordinate) of the cursor 67 is corrected so that the distance between the retroreflective sheet 17 and the camera unit 5 and the movement distance of the cursor 67 in the X-axis direction (horizontal direction) have a positive correlation. That is, the smaller the distance between the retroreflective sheet 17 and the camera unit 5, the smaller the moving distance of the cursor 67 in the X-axis direction, and the larger the distance, the larger the moving distance. In this way, the trapezoidal distortion in the X-axis direction is corrected.
- the Y coordinate (vertical coordinate) of the cursor 67 is corrected so that the distance between the retroreflective sheet 17 and the camera unit 5 and the movement distance of the cursor 67 in the Y-axis direction (vertical direction) have a positive correlation. That is, the smaller the distance between the retroreflective sheet 17 and the camera unit 5, the smaller the moving distance of the cursor 67 in the Y-axis direction, and the larger the distance, the larger the moving distance. In this way, the trapezoidal distortion in the Y-axis direction is corrected.
- the infrared light emitting diode 7 is driven intermittently to generate a difference image (camera image) between the lit and unlit states, and the movement of the retroreflective sheet 17 is analyzed based on it.
- noise due to light other than the reflected light from the retroreflective sheet 17 can be removed as much as possible, and only the retroreflective sheet 17 can be detected with high accuracy.
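The strobe-and-difference scheme described above can be illustrated with a short sketch; the threshold value and the centroid computation are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def detect_sheet(frame_on, frame_off, threshold=32):
    """Difference-image detection: subtract the frame captured with the
    infrared LED off from the frame captured with it on, so that (ideally)
    only the retroreflection from the sheet survives the subtraction.
    The threshold and centroid logic are assumptions of this sketch."""
    diff = frame_on.astype(np.int16) - frame_off.astype(np.int16)
    mask = diff > threshold            # keep only strong retroreflection
    if not mask.any():
        return None                    # sheet not visible in this frame pair
    ys, xs = np.nonzero(mask)
    return xs.mean(), ys.mean()        # centroid as the sheet position (Xp, Yp)
```

Ambient light appears in both frames and cancels in the subtraction, which is why the scheme rejects noise from light sources other than the sheet.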
- various objects (63, 65, 73, 75, 77, 91, 103, 113, 123, 155) are displayed on the video image, so these can be used as icons for issuing commands or as various items during video games.
- the processor 23 determines whether or not the cursor 67 contacts or overlaps a moving predetermined image (for example, the ball image 103 in FIG. 19) so as to satisfy a predetermined condition (for example, step S249 in FIG. 30). For this reason, the player 15 cannot simply operate the retroreflective sheet 17 so that the cursor 67 touches the predetermined image, but must operate it so as to satisfy the predetermined condition. Thereby, the game quality and difficulty can be improved.
- in the game of FIG. 30, the predetermined condition is that the cursor 67 exceeds a certain speed, but any condition suited to the game specifications can be set.
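A hedged sketch of such a condition check follows, taking "exceeds a certain speed" to mean the per-frame displacement of the cursor and assuming a circular target; the details of step S249 are not reproduced in this excerpt:

```python
def hit_moving_target(cursor_prev, cursor_now, target_center, target_radius, speed_min):
    """Check that the cursor overlaps a moving circular target *and* exceeds
    a minimum speed.  Per-frame displacement as the speed measure and the
    circular target shape are assumptions of this sketch."""
    dx = cursor_now[0] - cursor_prev[0]
    dy = cursor_now[1] - cursor_prev[1]
    speed = (dx * dx + dy * dy) ** 0.5            # displacement over one frame
    ox = cursor_now[0] - target_center[0]
    oy = cursor_now[1] - target_center[1]
    overlaps = (ox * ox + oy * oy) ** 0.5 <= target_radius
    return overlaps and speed > speed_min         # both conditions must hold
```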
- the camera unit 5 captures an image from a position where the retroreflective sheet 17 is looked down. Therefore, the player 15 can operate the cursor 67 by moving the retroreflective sheet 17 on the floor surface. In the above, the player 15 moves with the retroreflective sheet 17 attached to his / her foot. Therefore, the present invention can be applied to a game using feet or an exercise using feet.
- the present invention is not limited to the above-described embodiment, and can be implemented in various modes without departing from the gist thereof.
- the following modifications are possible.
- a self-light-emitting device such as an infrared light emitting diode can be mounted on the subject; in this case, the infrared light emitting diode 7 is unnecessary.
- a subject (for example, the back of a player's foot) can be imaged by an imaging device such as an image sensor or a CCD, and its motion can be detected by analyzing the image.
- the above-described strobe shooting (blinking of the infrared light emitting diode 7) and difference processing are merely examples and are not essential elements of the present invention. That is, the infrared light emitting diode 7 need not be blinked, and need not be provided at all. The irradiation light is not limited to infrared light. Further, the retroreflective sheet 17 is not an essential element of the present invention; it suffices to analyze a captured image and detect a specific part of the body (for example, the back of the foot).
- the imaging element is not limited to an image sensor; other imaging devices such as a CCD can be used.
- the first step calibration (see FIG. 9A) can be omitted.
- the calibration in the first step is performed in order to further improve the correction accuracy.
- in the second-step calibration, four markers were used. However, a larger number of markers may be used, and three or fewer markers can also be used. When two markers are used, it is preferable to use markers having different y coordinates (for example, D1 and D4, or D2 and D3) rather than markers having the same y coordinate (for example, D1 and D2, or D4 and D3), because keystone correction can then be performed simultaneously. When one marker is used, or when two markers having the same y coordinate are used, keystone correction must be performed separately.
- the process of correcting the position of the cursor 67 so that the position of the retroreflective sheet 17 in real space matches the position of the cursor 67 in the projected video image includes trapezoidal correction. In view of processing amount and accuracy, it is preferable to use the four markers as described above.
- the markers D1 to D4 are simultaneously displayed.
- the markers D1 to D4 can instead be displayed one at a time. That is, first the marker D1 is displayed and data based on it is acquired; then the marker D2 is displayed and data based on it is acquired; then the marker D3 is displayed and data based on it is acquired; and finally the marker D4 is displayed and data based on it is acquired.
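The marker procedure above collects one camera-coordinate/screen-coordinate pair per marker. A sketch of how correction parameters could then be fitted from those pairs follows; an affine least-squares fit is used as a stand-in for the patent's unspecified parameter calculation:

```python
import numpy as np

def calibrate(camera_pts, screen_pts):
    """Fit a camera-to-screen mapping from the marker correspondences
    (camera position of the sheet when placed on each projected marker,
    paired with the marker's known screen position).  The affine model
    is an assumption of this sketch."""
    a = np.array([[x, y, 1.0] for x, y in camera_pts])
    b = np.array(screen_pts, dtype=float)
    params, *_ = np.linalg.lstsq(a, b, rcond=None)  # 3x2 parameter matrix
    return params

def apply_calibration(params, pt):
    """Map a camera-space point into screen space with the fitted parameters."""
    x, y = pt
    sx, sy = np.array([x, y, 1.0]) @ params
    return float(sx), float(sy)
```

With the four corner markers D1 to D4, a perspective (homography) fit would also capture the trapezoidal component directly; the affine fit is kept here for brevity.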
- the cursor 67 is displayed so that the player 15 can visually recognize it.
- the player 15 can confirm that the projected cursor 67 matches the retroreflective sheet 17 and can recognize that the system is normal.
- the cursor 67 may be virtual, in which case it is not displayed. Even if the player 15 cannot see the cursor 67, as long as the processor 23 can recognize its position, the processor 23 knows where the retroreflective sheet 17 is placed on the projected video image. In this case, the cursor 67 may be hidden, or a transparent cursor 67 may be displayed. Even if the cursor 67 is not displayed, there is no significant problem for the player 15 in play.
- the same calibration process as in the first embodiment can be performed.
- a player wearing a retroreflective sheet on one leg is asked to stand in front of the camera unit 5.
- the retroreflective sheet at that time is photographed to obtain coordinates.
- the player 15 is then asked to move the retroreflective sheet to the front upper-left, front upper-right, rear lower-left, and rear lower-right positions, and at each of these positions the retroreflective sheet is photographed and its coordinates are obtained. Based on these coordinates, parameters for correction are calculated.
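From the coordinates recorded at the four positions, endpoint values for the correction-coefficient tables could be derived. The sketch below uses the near/far edge-width ratio for both D2 and D3, which is an assumption of this sketch, as is the mapping of recorded positions onto the near and far edges:

```python
def endpoint_coefficients(front_left, front_right, rear_left, rear_right):
    """Derive endpoint coefficient values from the four recorded camera
    coordinates.  The near (front) edge spans more camera pixels than the
    far (rear) edge, so far-side coordinates need amplification; using the
    width ratio for both coefficients is an assumption."""
    near_w = abs(front_right[0] - front_left[0])   # width of the near edge in pixels
    far_w = abs(rear_right[0] - rear_left[0])      # width of the far edge in pixels
    d2 = near_w / far_w                            # X amplification at the far side (> 1)
    d3 = far_w / near_w                            # Y attenuation at the near side (> 0)
    return d2, d3
```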
- the trapezoidal correction methods described above are examples; other known keystone correction methods can also be applied.
- the keystone correction is performed on both the X coordinate and the Y coordinate.
- trapezoidal correction may be executed for only one of the coordinates. According to experiments by the inventors, performing trapezoidal correction only on the Y coordinate still allows input that does not hinder play.
- the trapezoidal correction may be performed on the coordinates on the camera image, or may be performed on the coordinates after being converted into the screen coordinate system.
- the processing in step S87 in FIG. 25 and step S331 in FIG. 37 is executed after conversion into the screen coordinate system. However, these processes can also be performed before conversion to the screen coordinate system.
- depending on the specifications of the image sensor 27, the processing in step S87 in FIG. 25 and step S331 in FIG. 37 is unnecessary, because the image sensor 27 may output a camera image that is already mirrored vertically.
- the processor 23 arranges one marker 43 at the center on the video image 41 different from the video image 45 on which the four markers D1 to D4 are arranged.
- the markers D1 to D4 and the marker 43 can be arranged on the same video image.
Abstract
Description
PRUY=Ry/(RUY-CY) …(2)
PRBY=Ry/(CY-RBY) …(4)
PLBY=Ry/(CY-LBY) …(6)
PLUY=Ry/(LUY-CY) …(8)
CPRUX=PRUX-{(RUY-PY)*SRUX} …(17)
CPRUX=PRUX+{(RUY-PY)*SRUX} …(18)
CPRUY=PRUY-{(RUX-PX)*SRUY} …(20)
CPRUY=PRUY+{(RUX-PX)*SRUY} …(21)
CPRBX=PRBX-{(RBY-PY)*SRBX} …(23)
CPRBX=PRBX+{(RBY-PY)*SRBX} …(24)
CPRBY=PRBY-{(RBX-PX)*SRBY} …(26)
CPRBY=PRBY+{(RBX-PX)*SRBY} …(27)
CPLBX=PLBX-{(LBY-PY)*SLBX} …(29)
CPLBX=PLBX+{(LBY-PY)*SLBX} …(30)
CPLBY=PLBY-{(LBX-PX)*SLBY} …(32)
CPLBY=PLBY+{(LBX-PX)*SLBY} …(33)
CPLUX=PLUX-{(LUY-PY)*SLUX} …(35)
CPLUX=PLUX+{(LUY-PY)*SLUX} …(36)
CPLUY=PLUY-{(LUX-PX)*SLUY} …(38)
CPLUY=PLUY+{(LUX-PX)*SLUY} …(39)
Claims (24)
- An input system comprising: video image generation means for generating a video image; control means for controlling the video image; projection means for projecting the video image onto a screen arranged in real space; and imaging means for imaging a subject in real space that a player operates on the screen, wherein the control means includes analysis means for obtaining the position of the subject based on a captured image obtained by the imaging means, and cursor control means for making a cursor follow the subject based on the position of the subject obtained by the analysis means, and the cursor control means includes correction means for correcting the position of the cursor so that, on the screen in real space, the position of the subject in real space and the position of the cursor in the projected video image coincide.
- An input system comprising: video image generation means for generating a video image; and control means for controlling the video image, wherein the control means includes analysis means for obtaining the position of a subject in real space, which a player operates on a screen arranged in real space, based on a captured image obtained by imaging means for imaging the subject, and cursor control means for making a cursor follow the subject based on the position of the subject obtained by the analysis means, and the cursor control means includes correction means for correcting the position of the cursor so that, on the screen in real space, the position of the subject in real space and the position of the cursor in the video image projected onto the screen coincide.
- The input system according to claim 1 or 2, further comprising: marker video generation means for generating a video image for calculating parameters used in the correction and arranging at least one predetermined marker at at least one predetermined position on the video image; corresponding position calculation means for associating the captured image obtained by the imaging means with the video image generated by the marker video generation means and calculating a corresponding position, which is the position on the video image corresponding to the position of the image of the subject on the captured image; and parameter calculation means for calculating the parameters used by the correction means based on the predetermined position where the predetermined marker is arranged and the corresponding position obtained when the subject is placed on the predetermined marker projected onto the screen.
- The input system according to claim 3, wherein the marker video generation means arranges a plurality of the predetermined markers at a plurality of the predetermined positions on the video image, or arranges the predetermined marker at different predetermined positions on the video image at different times.
- The input system according to claim 4, wherein the marker video generation means arranges four of the predetermined markers at the four corners of the video image, or arranges the predetermined marker at the four corners of the video image at different times.
- The input system according to claim 5, wherein the marker video generation means arranges one of the predetermined markers at the center of the video image on which the four predetermined markers are arranged, or at the center of a different video image.
- The input system according to any one of claims 1 to 6, wherein the correction by the correction means includes trapezoidal correction.
- The input system according to any one of claims 1 to 7, wherein the imaging means is arranged in front of the player and images the subject from a position overlooking it, and the cursor control means determines the position of the cursor so that, when the subject moves from back to front as viewed from the imaging means, the projected cursor moves from back to front as viewed from the imaging means; when the subject moves from front to back as viewed from the imaging means, the projected cursor moves from front to back as viewed from the imaging means; when the subject moves from right to left as viewed from the imaging means, the projected cursor moves from right to left as viewed from the imaging means; and when the subject moves from left to right as viewed from the imaging means, the projected cursor moves from left to right as viewed from the imaging means.
- The input system according to any one of claims 1 to 8, wherein the cursor is displayed so that the player can see it.
- The input system according to any one of claims 1 to 8, wherein the cursor is virtual and is not displayed.
- An input system comprising: video image generation means for generating a video image including a cursor; control means for controlling the video image; and imaging means arranged so that its optical axis is oblique to a surface to be imaged, for imaging a subject on the surface to be imaged, wherein the control means includes analysis means for obtaining the position of the subject based on a captured image obtained by the imaging means, trapezoidal correction means for executing trapezoidal correction on the position of the subject obtained by the analysis means, and cursor control means for making the cursor follow the subject based on the position of the subject after the trapezoidal correction.
- An input system comprising: video image generation means for generating a video image including a cursor; and control means for controlling the video image, wherein the control means includes analysis means for obtaining the position of a subject on a surface to be imaged based on a captured image obtained by imaging means arranged so that its optical axis is oblique to the surface to be imaged, trapezoidal correction means for executing trapezoidal correction on the position of the subject obtained by the analysis means, and cursor control means for making the cursor follow the subject based on the position of the subject after the trapezoidal correction.
- The input system according to claim 11 or 12, wherein the trapezoidal correction means applies the trapezoidal correction according to the distance between the subject and the imaging means.
- The input system according to claim 13, wherein the trapezoidal correction means includes horizontal correction means for correcting the horizontal coordinate of the cursor so that the distance between the subject and the imaging means and the horizontal movement distance of the cursor have a positive correlation.
- The input system according to claim 13 or 14, wherein the trapezoidal correction means includes vertical correction means for correcting the vertical coordinate of the cursor so that the distance between the subject and the imaging means and the vertical movement distance of the cursor have a positive correlation.
- The input system according to any one of claims 11 to 15, wherein the imaging means images the subject from a position overlooking it.
- The input system according to any one of claims 1 to 16, further comprising light emitting means for intermittently irradiating the subject with light, wherein the subject includes a retroreflective member that retroreflects received light, and the analysis means obtains the position of the subject based on a difference image between a captured image taken while the light emitting means irradiates light and a captured image taken while it does not.
- The input system according to any one of claims 1 to 17, wherein the control means includes arrangement means for arranging a predetermined image on the video image, and determination means for determining whether or not the cursor contacts or overlaps the predetermined image.
- The input system according to claim 18, wherein the determination means determines whether or not the cursor has overlapped the predetermined image continuously for at least a predetermined time.
- The input system according to claim 18, wherein the arrangement means moves the predetermined image, and the determination means determines whether or not the cursor has contacted or overlapped the moving predetermined image so as to satisfy a predetermined condition.
- An input method comprising: a step of generating a video image; and a step of controlling the video image, wherein the controlling step includes an analysis step of obtaining the position of a subject in real space, which a player operates on a screen arranged in real space, based on a captured image obtained by imaging means for imaging the subject, and a cursor control step of making a cursor follow the subject based on the position of the subject obtained in the analysis step, and the cursor control step includes a correction step of correcting the position of the cursor so that, on the screen in real space, the position of the subject in real space and the position of the cursor in the video image projected onto the screen coincide.
- An input method comprising: a step of generating a video image including a cursor; and a step of controlling the video image, wherein the controlling step includes an analysis step of obtaining the position of a subject on a surface to be imaged based on a captured image obtained by imaging means arranged so that its optical axis is oblique to the surface to be imaged, a trapezoidal correction step of executing trapezoidal correction on the position of the subject obtained in the analysis step, and a cursor control step of making the cursor follow the subject based on the position of the subject after the trapezoidal correction.
- A computer program causing a computer to execute the input method of claim 21 or 22.
- A computer-readable recording medium on which the computer program of claim 23 is recorded.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/993,204 US20120044141A1 (en) | 2008-05-23 | 2008-09-26 | Input system, input method, computer program, and recording medium |
JP2010512854A JPWO2009141855A1 (ja) | 2008-05-23 | 2008-09-26 | Input system, input method, computer program, and recording medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-136108 | 2008-05-23 | ||
JP2008136108 | 2008-05-23 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009141855A1 true WO2009141855A1 (ja) | 2009-11-26 |
Family
ID=41339830
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2008/002686 WO2009141855A1 (ja) | 2008-05-23 | 2008-09-26 | 入力システム、入力方法、コンピュータプログラム、及び、記録媒体 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20120044141A1 (ja) |
JP (1) | JPWO2009141855A1 (ja) |
WO (1) | WO2009141855A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014041938A1 (ja) * | 2012-09-12 | 2014-03-20 | Citizen Holdings Co., Ltd. | Information input device |
JP2016005055A (ja) | 2014-06-16 | 2016-01-12 | Seiko Epson Corporation | Projector and projection method |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9602778B2 (en) * | 2012-12-12 | 2017-03-21 | Sensormatic Electronics, LLC | Security video system using customer regions for monitoring point of sale areas |
US9360888B2 (en) * | 2013-05-09 | 2016-06-07 | Stephen Howard | System and method for motion detection and interpretation |
US10891003B2 (en) | 2013-05-09 | 2021-01-12 | Omni Consumer Products, Llc | System, method, and apparatus for an interactive container |
CN107430325B (zh) | 2014-12-30 | 2020-12-29 | Omni Consumer Products, LLC | System and method for interactive projection |
JP6018266B1 (ja) * | 2015-07-17 | 2016-11-02 | Square Enix Co., Ltd. | Video game processing program and video game processing system |
JP2017208688A (ja) | 2016-05-18 | 2017-11-24 | Square Enix Co., Ltd. | Program, computer device, program execution method, and computer system |
JP7167503B2 (ja) * | 2018-06-27 | 2022-11-09 | Seiko Epson Corporation | Projector |
DE102020108984A1 (de) | 2020-04-01 | 2021-10-07 | Bayerische Motoren Werke Aktiengesellschaft | Means of locomotion, device and method for entertaining an occupant of a means of locomotion |
US20240295939A1 (en) * | 2023-03-02 | 2024-09-05 | Samsung Electronics Co., Ltd. | Projector and method for controlling the same |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000081950A (ja) * | 1998-07-03 | 2000-03-21 | Sony Corp | 画像処理装置、画像処理方法、及び提供媒体、並びにプレゼンテーションシステム |
JP2000112651A (ja) * | 1998-10-06 | 2000-04-21 | Olympus Optical Co Ltd | ポインティング機構 |
JP2003122505A (ja) * | 2001-10-12 | 2003-04-25 | Sony Corp | 情報処理装置、情報処理システム、及びプログラム |
WO2006078996A2 (en) * | 2005-01-21 | 2006-07-27 | Gesturetek, Inc. | Motion-based tracking |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5684514A (en) * | 1991-01-11 | 1997-11-04 | Advanced Interaction, Inc. | Apparatus and method for assembling content addressable video |
WO1992017875A1 (en) * | 1991-04-08 | 1992-10-15 | Hitachi, Ltd. | Method and apparatus for image or data processing, and monitoring method and apparatus using the same |
JPH10334275A (ja) * | 1997-05-29 | 1998-12-18 | Canon Inc | Virtual reality method and apparatus, and storage medium |
US6611253B1 (en) * | 2000-09-19 | 2003-08-26 | Harel Cohen | Virtual input environment |
US7774075B2 (en) * | 2002-11-06 | 2010-08-10 | Lin Julius J Y | Audio-visual three-dimensional input/output |
JP2004185258A (ja) * | 2002-12-03 | 2004-07-02 | Hitachi Ltd | Information processing device |
CN1816792A (zh) * | 2003-07-02 | 2006-08-09 | Shinsedai Co., Ltd. | Information processing device, information processing system, operation article, information processing method, information processing program, and game system |
US20070197290A1 (en) * | 2003-09-18 | 2007-08-23 | Ssd Company Limited | Music Game Device, Music Game System, Operation Object, Music Game Program, And Music Game Method |
US7554545B2 (en) * | 2003-11-04 | 2009-06-30 | Ssd Company Limited | Drawing apparatus operable to display a motion path of an operation article |
US20080031544A1 (en) * | 2004-09-09 | 2008-02-07 | Hiromu Ueshima | Tilt Detection Method and Entertainment System |
US8334837B2 (en) * | 2004-11-10 | 2012-12-18 | Nokia Corporation | Method for displaying approached interaction areas |
JP4180065B2 (ja) * | 2005-05-13 | 2008-11-12 | Square Enix Co., Ltd. | Image generation method, image generation device, and image generation program |
WO2006135087A1 (en) * | 2005-06-16 | 2006-12-21 | Ssd Company Limited | Input device, simulated experience method and entertainment system |
EP1970104A4 (en) * | 2005-12-12 | 2010-08-04 | Ssd Co Ltd | DRIVING METHOD, DRIVING DEVICE AND COORDINATION TRAINING METHOD |
US8009866B2 (en) * | 2008-04-26 | 2011-08-30 | Ssd Company Limited | Exercise support device, exercise support method and recording medium |
JP2010169668A (ja) * | 2008-12-18 | 2010-08-05 | Shinsedai Kk | Object detection device, interactive system using the same, object detection method, interactive system construction method using the same, computer program, and recording medium |
2008
- 2008-09-26 JP JP2010512854A patent/JPWO2009141855A1/ja active Pending
- 2008-09-26 WO PCT/JP2008/002686 patent/WO2009141855A1/ja active Application Filing
- 2008-09-26 US US12/993,204 patent/US20120044141A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
US20120044141A1 (en) | 2012-02-23 |
JPWO2009141855A1 (ja) | 2011-09-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2009141855A1 (ja) | Input system, input method, computer program, and recording medium | |
JP5081964B2 (ja) | Game device, game device control method, and program | |
JP5039808B2 (ja) | Game device, game device control method, and program | |
US8520901B2 (en) | Image generation system, image generation method, and information storage medium | |
JP5520656B2 (ja) | Program and image generation device | |
US8113953B2 (en) | Image-linked sound output method and device | |
JP2002233665A5 (ja) | | |
US8684837B2 (en) | Information processing program, information processing system, information processing apparatus, and information processing method | |
JP2001299975A (ja) | Bodily sensation device and bodily sensation system | |
JP5425940B2 (ja) | Game device, game device control method, and program | |
JP5183878B2 (ja) | Game program, game system | |
US20100215215A1 (en) | Object detecting apparatus, interactive system, object detecting method, interactive system realizing method, and recording medium | |
JP5237312B2 (ja) | Game program, game device, game control method | |
JP4808802B2 (ja) | Game program, game device, game control method | |
JP5656387B2 (ja) | Game device and game program for realizing the game device | |
JP2017189446A (ja) | Processing device and projection image generation method | |
JP4341066B2 (ja) | Information processing device and image generation program | |
JP6285980B2 (ja) | Processing device and projection image generation method | |
JP2011101764A5 (ja) | | |
KR102066806B1 (ko) | Exercise support system and exercise support method | |
JP2007267850A (ja) | Program, information storage medium, and image generation system | |
JP4932815B2 (ja) | Drawing processing program, drawing processing device, and drawing processing method | |
JP6462441B2 (ja) | Dance device | |
JP2001224731A (ja) | Game device and information storage medium | |
JP4705145B2 (ja) | Drawing processing program, drawing processing device, and drawing processing method | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 08808575 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010512854 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12993204 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 08808575 Country of ref document: EP Kind code of ref document: A1 |