WO2009093461A1 - Imaging Device, Online Game System, Operation Article, Input Method, Image Analysis Device, Image Analysis Method, and Recording Medium - Google Patents
Imaging Device, Online Game System, Operation Article, Input Method, Image Analysis Device, Image Analysis Method, and Recording Medium Download PDF Info
- Publication number
- WO2009093461A1 PCT/JP2009/000245 JP2009000245W
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- mcu
- image
- information
- variable
- imaging
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 97
- 238000003384 imaging method Methods 0.000 title claims description 74
- 238000010191 image analysis Methods 0.000 title claims description 7
- 238000003703 image analysis method Methods 0.000 title claims description 6
- 238000001514 detection method Methods 0.000 claims abstract description 67
- 230000033001 locomotion Effects 0.000 claims abstract description 35
- 239000013598 vector Substances 0.000 claims description 69
- 238000004364 calculation method Methods 0.000 claims description 8
- 230000005540 biological transmission Effects 0.000 claims description 6
- 230000001133 acceleration Effects 0.000 claims description 5
- 239000003550 marker Substances 0.000 claims description 5
- 238000004590 computer program Methods 0.000 claims description 4
- 230000007274 generation of a signal involved in cell-cell signaling Effects 0.000 claims description 2
- 230000008569 process Effects 0.000 abstract description 85
- 238000003491 array Methods 0.000 description 12
- 238000010586 diagram Methods 0.000 description 11
- 230000004044 response Effects 0.000 description 8
- 230000000694 effects Effects 0.000 description 6
- 238000004458 analytical method Methods 0.000 description 3
- 230000006870 function Effects 0.000 description 2
- 238000012538 light obscuration Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000001154 acute effect Effects 0.000 description 1
- 239000003086 colorant Substances 0.000 description 1
- 238000004891 communication Methods 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 238000010304 firing Methods 0.000 description 1
- 230000012447 hatching Effects 0.000 description 1
- 230000000877 morphologic effect Effects 0.000 description 1
Images
Classifications
-
- A63F13/12—
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/30—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/24—Constructional details thereof, e.g. game controllers with detachable joystick handles
- A63F13/245—Constructional details thereof, e.g. game controllers with detachable joystick handles specially adapted to a particular type of game, e.g. steering wheels
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/833—Hand-to-hand fighting, e.g. martial arts competition
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1062—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to a type of game, e.g. steering wheel
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1087—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8029—Fighting without shooting
Definitions
- the present invention relates to an imaging apparatus that is provided separately from a computer and used while connected to it, and to related techniques.
- Patent Document 1 discloses a communication battle type virtual reality tennis game system that uses a camera as an input device.
- in this system, a player is photographed with a camera, and the computer main body analyzes the obtained image to detect the player's swing as input. The computer main body then creates return-ball data according to the detected swing.
- an object of the present invention is to provide an imaging device that is easy for a programmer of the computer main body to use as an input device, and related technology.
- the imaging device is an imaging device provided separately from the computer, and includes imaging means for imaging an operation article operated by a user; detection means for analyzing the captured image given from the imaging means, detecting input by the operation article, and generating input information; and transmission means for transmitting the input information to the computer.
- the detection unit analyzes the captured image to calculate the state information of the operation article, and gives the state information to the transmission unit as the input information.
- the computer can execute processing based on the state information of the operation article.
- the state information of the operation article includes any of position information, speed information, movement direction information, movement distance information, velocity vector information, acceleration information, movement trajectory information, area information, inclination information, motion information, and form information, or a combination of two or more thereof.
- the form includes a shape, a pattern, a color, or a combination of two or more thereof.
- a form may also include numbers, symbols, and characters.
- the state information of the operation article is state information of one or more markers attached to the operation article.
- the state information of the plurality of markers includes the state information of each marker; information indicating the positional relationship of the plurality of markers (placement information) and number information; and information on the form that the plurality of markers form as a whole, together with position information, speed information, movement direction information, movement distance information, velocity vector information, acceleration information, movement trajectory information, area information, inclination information, and movement information of that form.
- the transmission unit transmits the state information as a command to the computer.
- the computer can execute processing in response to a command from the imaging device corresponding to the state information of the operation article.
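As an illustration of how state information might travel as a command, the sketch below packs one marker's position and area into a fixed-size frame. The command ID, field layout, and byte order are our assumptions for illustration only; the patent does not specify a wire format.

```python
import struct

# Hypothetical command frame: 1-byte command ID, 1-byte marker ID,
# two 32-bit float coordinates, and a 16-bit area, little-endian.
# This layout is illustrative; the patent does not define one.
CMD_STATE = 0x01
FRAME_FMT = "<BBffH"

def pack_state_command(marker_id: int, x: float, y: float, area: int) -> bytes:
    """Serialize one marker's state information into a command frame."""
    return struct.pack(FRAME_FMT, CMD_STATE, marker_id, x, y, area)

def unpack_state_command(frame: bytes):
    """Parse a command frame back into (command, marker_id, x, y, area)."""
    return struct.unpack(FRAME_FMT, frame)

frame = pack_state_command(marker_id=0, x=12.5, y=40.0, area=37)
cmd, marker, x, y, area = unpack_state_command(frame)
```

On the receiving side, the computer would dispatch on the command ID and run game processing from the decoded state, matching the division of labor the claim describes.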
- the imaging apparatus further includes a stroboscope that irradiates the operation article with light at a predetermined cycle. The imaging means photographs the operation article when the stroboscope is turned on and when it is turned off, acquiring a lit image and an unlit image, and difference signal generation means is provided for generating a difference signal between the lit image and the unlit image.
- the operation article includes retroreflection means for retroreflecting received light.
- the operation article can be detected with higher accuracy.
- an online game system includes a plurality of imaging devices, each connected to a corresponding terminal and provided separately from it. Each imaging device includes imaging means for imaging an operation article operated by a user, and detection means for analyzing the captured image given from the imaging means, detecting input by the operation article, and generating input information. The plurality of terminals are connected to each other via a network and exchange the input information with each other to advance the game.
- the operation article is a subject of the imaging apparatus that is held and given movement by a user. It includes a plurality of reflection means and switching means for switching at least one reflection means between an exposed state and a non-exposed state, while at least one other reflection means maintains the exposed state.
- since a reflection unit that always maintains the exposed state is provided, the presence or absence of input by the operation article and/or the form of the input can always be detected based on the captured image of that reflection unit.
- since reflecting means that switches between the exposed and non-exposed states is provided, different inputs can be given depending on whether that reflecting means is imaged, which diversifies the types of input.
- the operation article is a subject of the imaging apparatus that is held and given movement by a user, and includes first reflection means, second reflection means, and switching means for switching the states of the first reflection means and the second reflection means so that their exposed and non-exposed states are reversed with respect to each other.
- since the first reflecting means and the second reflecting means are each imaged, the presence or absence of input by the operation article and/or the form of the input can be detected based on their respective captured images. Further, the presence or absence of input and/or the form of the input can also be detected based on the switching of exposure and non-exposure between the first reflecting means and the second reflecting means.
- the reflecting means retroreflects the received light.
- the input method is an input method executed by an imaging device provided separately from a computer, and includes the steps of: imaging an operation article operated by a user; analyzing the captured image obtained in the imaging step, detecting input by the operation article, and generating input information; and transmitting the input information to the computer.
- a recording medium is a computer-readable recording medium that records a computer program that causes a computer mounted on an imaging apparatus to execute the input method according to the fifth aspect.
- an image analysis apparatus includes: an imaging unit that images one or more subjects; first candidate area determining means for determining, from an image obtained by the imaging unit, a primary candidate area that contains the image of the subject and consists of fewer pixels than the image; first state calculating means for scanning the primary candidate area to calculate the state information of the subject when the number of subjects is one or two; second candidate area determining means for determining, from the primary candidate area, a secondary candidate area that contains the subject image and consists of fewer pixels than the primary candidate area when the number of subjects is at least three; and second state calculating means for scanning the secondary candidate area to calculate the state information of the subject when the number of subjects is at least three.
- here, "contains" means that the subject image fits completely within the primary candidate area, and within the secondary candidate area, without protruding.
- the first candidate area determining means includes first array means for generating a first array that is an orthogonal projection of the pixel values of the image onto the horizontal axis, second array means for generating a second array that is an orthogonal projection of the pixel values of the image onto the vertical axis, and means for determining the primary candidate area based on the first array and the second array.
- likewise, the second candidate area determining means includes third array means for generating a third array that is an orthogonal projection of the pixel values of the primary candidate area onto the horizontal axis, and fourth array means for generating a fourth array that is an orthogonal projection of the pixel values of the primary candidate area onto the vertical axis.
- the image analysis apparatus further includes a stroboscope that irradiates the operation article with light at a predetermined cycle; the imaging unit photographs the operation article when the stroboscope is turned on and when it is turned off, and a difference signal between the two captured images is generated.
- the first candidate area determining means, the first state calculating means, the second candidate area determining means, and the second state calculating means execute processing based on the difference signal.
- an image analysis method is an image analysis method based on an image obtained by an imaging device that images one or more subjects, and includes: determining, from the image, a primary candidate area that contains the image of the subject and consists of fewer pixels than the image; if the number of subjects is one or two, scanning the primary candidate area to calculate the state information of the subject; if the number of subjects is at least three, determining, from the primary candidate area, a secondary candidate area that contains the subject image and consists of fewer pixels than the primary candidate area; and, if the number of subjects is at least three, scanning the secondary candidate area to calculate the state information of the subject.
- a recording medium is a computer-readable recording medium that records a computer program that causes a computer to execute the image analysis method according to the eighth aspect.
- recording media include, for example, flexible disks, hard disks, magnetic tapes, magneto-optical disks, CDs (including CD-ROMs and Video-CDs), DVDs (including DVD-Videos, DVD-ROMs, and DVD-RAMs), ROM cartridges, RAM memory cartridges with battery backup, flash memory cartridges, nonvolatile RAM cartridges, and the like.
- FIG. 1 is an external perspective view showing the overall configuration of a game system according to an embodiment of the present invention.
- A diagram showing the electrical configuration of the camera unit 1-N of FIG. 1.
- FIG. 3A is an external perspective view of the operation article 3A-N in FIG. 1.
- FIG. 3B is an external perspective view of another example of the operation article.
- FIG. 3C is an external perspective view of still another example of the operation article.
- FIG. 4 is an explanatory diagram of the detection process of the retroreflective sheet 4 based on the difference image DI output by the image sensor 21.
- FIG. 10 is an explanatory diagram of an additional detection process for the retroreflective sheet 4 when a player uses the operation article 3C-N.
- An explanatory diagram of the inclination detection process of the operation article 3A-N.
- Flowcharts (spanning several figures) showing the flow of the retroreflective sheet detection process in step S5 of FIG. 14.
- A flowchart showing the flow of the four-end-point detection process in step S349.
- A flowchart showing the flow of the trigger detection process (for the sword) in step S9.
- A flowchart showing the flow of the shield trigger detection process in step S443.
- A flowchart showing a part of the flow of the special trigger detection process in step S445 of FIG. 24.
- FIG. 25 is a flowchart showing another part of the flow of the special trigger detection process in step S445 of FIG. 24.
- FIG. 32 is a flowchart showing the flow of the charge trigger detection process in step S765 of FIG. 31.
- A flowchart showing the flow of the shield trigger detection process in step S769.
- 1-N (1-1 to 1-n) ... camera unit, 3-N (3-1 to 3-n), 3A-N (3A-1 to 3A-n), 3B-N (3B-1 to 3B-n), 3C-N (3C-1 to 3C-n) ... operation article, 5-N (5-1 to 5-n) ... terminal, 4, 4A to 4G ... retroreflective sheet, 11 ... infrared light-emitting diode, 21 ... image sensor, 23 ... MCU, 29 ... network, 31 ... host computer.
- FIG. 1 is an external perspective view showing an overall configuration of a game system according to an embodiment of the present invention.
- a sword-shaped operation article (hereinafter referred to as a "sword") 3A-N is operated by the player.
- the monitor 7 of the terminal 5-N includes a camera unit 1-N placed on its upper edge.
- N is an integer of 1 or more.
- the camera unit 1-N is connected to the terminal 5-N by a USB (Universal Serial Bus) cable 9.
- the camera unit 1-N includes an infrared filter 13 that transmits only infrared light and four infrared light emitting diodes (IREDs) 11 that emit infrared light disposed around the infrared filter 13.
- An image sensor 21 described later is disposed on the back side of the infrared filter 13.
- the sword 3A-N in FIG. 1 has retroreflective sheets 4A attached to both sides of the blade portion 33.
- semi-cylindrical members 37 are attached to both surfaces of the collar portion 35 of the sword 3A-N.
- a retroreflective sheet 4B is attached to the curved surface of the semi-cylindrical member 37.
- FIG. 3B shows a mace-type operation article (hereinafter referred to as a "mace") 3B-N.
- the mace 3B-N includes a stick 45 held by the player and a ball 47 fixed to one end of the stick 45.
- a retroreflective sheet 4C is attached to the entire surface of the ball 47.
- FIG. 3C shows a crossbow-type operation article (hereinafter referred to as a "crossbow") 3C-N.
- Circular retroreflective sheets 4E and 4F (not shown in the figure) are attached to the side surfaces of the left and right bow portions 39 of the crossbow 3C-N symmetrically.
- a circular retroreflective sheet 4D is attached between the left and right retroreflective sheets 4E and 4F, at the tip of the base 41.
- a lid 49 is attached to the tip of the base 41 so as to be freely opened and closed.
- when the lid 49 is closed, the retroreflective sheet 4D is covered by the lid 49 and is not exposed.
- when the lid 49 is open, as shown, the retroreflective sheet 4D is exposed.
- the retroreflective sheet 4G is attached to the bottom of the base 41.
- the retroreflective sheet 4G is attached so that its reflection surface forms an acute angle (as viewed from the trigger 51 side) with respect to the longitudinal direction of the base 41. Therefore, when the tip of the base 41 faces the camera unit 1, the retroreflective sheet 4G is not photographed; when the tip of the base 41 faces obliquely upward, the retroreflective sheet 4G is photographed.
- the retroreflective sheets 4A to 4G may be collectively referred to as the retroreflective sheet 4.
- the retroreflective sheet 4 can also be referred to as a marker 4.
- the sword 3A-N, the mace 3B-N, and the crossbow 3C-N may be collectively referred to as the operation article 3-N.
- the operation article 3-N can also be called a subject 3-N.
- the infrared light emitting diode 11 of the camera unit 1-N emits infrared light intermittently at a predetermined cycle.
- the infrared light emitting diode 11 functions as a stroboscope. Infrared light from the infrared light emitting diode 11 is retroreflected by the retroreflective sheet 4A or 4B of the sword 3A-N and enters the image sensor 21 provided on the back side of the infrared filter 13. In this way, the sword 3A-N is photographed intermittently. The same applies to the mace 3B-N and the crossbow 3C-N.
- the camera unit 1 obtains the difference between the image signal when the infrared light is turned on and the image signal when the infrared light is turned off, and detects the movement of the sword 3A-N based on the difference signal DI (difference image DI). Then, the detection result is transmitted to the terminal 5-N via the USB cable 9. Then, the terminal 5-N reflects the movement of the sword 3A-N in the online game process. The same applies to the mace 3B-N and the crossbow 3C-N.
- the camera unit 1 can remove the noise caused by light other than the reflected light from the retroreflective sheet 4 as much as possible by obtaining the differential signal DI, and can detect the retroreflective sheet 4 with high accuracy.
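The lit/unlit subtraction described above can be sketched as follows. This is a minimal illustration with NumPy; the frame shapes, the 8-bit pixel range, and the toy frames are our assumptions, not details from the patent.

```python
import numpy as np

def difference_image(lit: np.ndarray, unlit: np.ndarray) -> np.ndarray:
    """Subtract the strobe-off frame from the strobe-on frame.

    Ambient light appears in both frames and cancels; the retroreflected
    strobe light appears only in the lit frame and survives.
    """
    # Work in a signed type so the subtraction cannot wrap around,
    # then clip negatives (ambient fluctuation) back to zero.
    diff = lit.astype(np.int16) - unlit.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)

# Toy 4x4 frames: ambient level 10 everywhere, one "reflector" pixel at (1, 2).
unlit = np.full((4, 4), 10, dtype=np.uint8)
lit = unlit.copy()
lit[1, 2] = 200
di = difference_image(lit, unlit)
```

In the difference image `di`, only the reflector pixel remains nonzero, which is why thresholding the difference signal isolates the retroreflective sheet so reliably.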
- host computer 31 provides online games to terminals 5-1 to 5-n through network 29.
- Camera units 1-1 to 1-n are connected to the terminals 5-1 to 5-n, respectively.
- the camera units 1-1 to 1-n photograph the retroreflective sheets 4 of the operation articles 3-1 to 3-n, respectively.
- a terminal 5-N in FIG. 1 is a comprehensive representation of terminals 5-1 to 5-n.
- the camera unit 1-N in FIG. 1 is a comprehensive representation of the camera units 1-1 to 1-n.
- the operation article 3-N (3A-N, 3B-N, 3C-N) in FIGS. 3A to 3C is a comprehensive representation of the operation articles 3-1 to 3-n.
- the camera unit 1-N includes a USB controller 25, an MCU (Micro Controller Unit) 23, an image sensor 21, and an infrared light emitting diode (IRED) 11.
- the USB controller 25 communicates with the terminal 5-N through the USB cable 9 and the USB port 27 of the terminal 5-N, and transmits and receives data.
- the image sensor 21 executes a photographing process when the infrared light emitting diode 11 is turned on and off. Then, the image sensor 21 outputs a difference signal DI between the image signal when turned on and the image signal when turned off to the MCU 23. Further, the image sensor 21 turns on the infrared light emitting diode 11 intermittently.
- the resolution of the image sensor 21 is, for example, 64 pixels ⁇ 64 pixels.
- the MCU 23 detects the image of the retroreflective sheet 4 based on the difference signal DI given from the image sensor 21, and calculates the state information.
- the state information includes any of position information, speed information, movement direction information, movement distance information, velocity vector information, acceleration information, movement trajectory information, area information, inclination information, movement information, and form information of the single or plural retroreflective sheets 4, or a combination of two or more thereof.
- Forms include shapes, patterns or colors or combinations of two or more thereof.
- a form may also include numbers, symbols, and characters.
- the state information of the plurality of retroreflective sheets 4 includes the state information of each retroreflective sheet 4; information indicating the positional relationship between the plurality of retroreflective sheets 4 and number information; and information on the form formed by the plurality of retroreflective sheets 4 as a whole, together with position information, speed information, movement direction information, movement distance information, velocity vector information, acceleration information, movement trajectory information, area information, inclination information, and movement information of that form.
- a specific example will be described.
- FIG. 4 is an explanatory diagram of the detection process of the retroreflective sheet 4 based on the difference image (difference signal) DI output from the image sensor 21.
- the MCU 23 first performs a horizontal scan: it sets Y to 0, increments X across the row, then increments Y and scans the next row, comparing each pixel value with a predetermined threshold value Thl to find pixels exceeding Thl.
- in this scan, when the MCU 23 detects a pixel at or below the threshold value Thl in one column and then a pixel exceeding Thl in the next column, it stores the X coordinate of that pixel (X0, X2 in FIG. 4) in an internal memory (not shown); when it detects a pixel exceeding Thl in one column and then a pixel at or below Thl in the next column, it stores in the internal memory the X coordinate of the pixels in the column to the left of the column containing that pixel (X1, X3 in FIG. 4).
- the MCU 23 then performs a vertical scan: it sets X to 0, increments Y down the column, then increments X and scans the next column, again comparing each pixel value with the predetermined threshold value Thl.
- in this scan, when the MCU 23 detects a pixel at or below the threshold value Thl in one row and then a pixel exceeding Thl in the next row, it stores the Y coordinate of that pixel (Y0, Y2 in FIG. 4) in the internal memory; when it detects a pixel exceeding Thl in one row and then a pixel at or below Thl in the next row, it stores in the internal memory the Y coordinate of the pixels in the row above the row containing that pixel (Y1, Y3 in FIG. 4).
- the MCU 23 then compares the pixel values with the threshold value Thl for each of the candidate regions a0 to a3 defined by these X and Y coordinates, and determines that the images IM0 and IM1 exist in the candidate regions that include pixels exceeding the threshold value Thl.
- the MCU 23 determines that the images IM0 and IM1 exist in the candidate areas a0 and a3, respectively.
- the MCU 23 recognizes the number of images as the number of candidate regions that include pixels exceeding the threshold value Thl.
- the MCU 23 calculates (Equation 1) for each of the candidate areas a0 and a3 where it is determined that the images IM0 and IM1 exist, and obtains XY coordinates (Xr, Yr) of the images IM0 and IM1.
- in (Equation 1), Pj is the pixel value of a pixel in the candidate area where the retroreflective sheet 4 exists, Xj is the X coordinate of the pixel value Pj, Yj is the Y coordinate of the pixel value Pj, and the subscript j indexes the pixels in that candidate area.
- the MCU 23 performs the calculation treating pixel values Pj at or below the threshold value Th1 as zero. Alternatively, (Equation 1) may be calculated using only the pixel values Pj exceeding Th1, ignoring those at or below Th1.
- the MCU 23 calculates (Equation 1) and simultaneously counts the number of pixels exceeding the threshold value Th1 for each of the candidate areas a0 and a3 where the images IM0 and IM1 exist.
- the number of pixels exceeding the threshold value Th1 in the candidate area a0 corresponds to the area of the image IM0
- the number of pixels exceeding the threshold value Th1 in the candidate area a3 corresponds to the area of the image IM1.
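(Equation 1) is referenced but not reproduced in this excerpt; from the definitions of Pj, Xj, and Yj given above, it appears to be an intensity-weighted centroid, Xr = ΣPjXj / ΣPj and Yr = ΣPjYj / ΣPj. A minimal sketch under that assumption, which also counts the area as the number of pixels exceeding Th1 (the dict-based pixel container is illustrative only, not the MCU's actual data layout):

```python
# Sketch of the presumed (Equation 1): an intensity-weighted centroid over one
# candidate area, treating pixel values Pj at or below Th1 as zero.
# The image area is the count of pixels exceeding Th1.

def centroid_and_area(pixels, th1):
    """pixels: dict mapping (x, y) -> pixel value for one candidate area."""
    sum_p = sum_px = sum_py = 0
    area = 0
    for (x, y), p in pixels.items():
        if p <= th1:
            continue  # Pj <= Th1 contributes nothing
        sum_p += p
        sum_px += p * x
        sum_py += p * y
        area += 1
    if sum_p == 0:
        return None, 0
    return (sum_px / sum_p, sum_py / sum_p), area

# Example: a bright 2x2 blob plus one dim pixel that falls below Th1.
blob = {(10, 5): 100, (11, 5): 100, (10, 6): 100, (11, 6): 100, (12, 6): 10}
(xr, yr), ar = centroid_and_area(blob, th1=50)
```

With the example blob, the dim pixel is excluded, so the centroid falls at the center of the four bright pixels and the area is 4.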
- since the retroreflective sheets 4 retroreflect infrared light, the pixel regions exceeding the threshold value Th1, that is, the images IM0 and IM1, correspond to the retroreflective sheets 4. In FIG. 4, two retroreflective sheets 4 are reflected.
- the MCU 23 calculates the XY coordinates and areas of the images IM0 and IM1 of the retroreflective sheet 4.
- the MCU 23 substitutes “1” into the arrays H[X] and V[Y] corresponding to the XY coordinates of the pixels exceeding the threshold value Th1.
- the MCU 23 substitutes “0” into the arrays H[X] and V[Y] corresponding to the XY coordinates of the pixels at or below the threshold value Th1.
- the arrays H[X] and V[Y] in which “1” is stored are schematically illustrated by hatching.
- the element number X at the left end of each run of the array H[X] in which “1” is stored gives the X coordinates X0 and X2, and the element number X at the right end gives the X coordinates X1 and X3.
- likewise, the element number Y at the upper end of each run of the array V[Y] in which “1” is stored gives the Y coordinates Y0 and Y2, and the element number Y at the lower end gives the Y coordinates Y1 and Y3. In this way, the candidate areas a0 to a3 can be determined.
- the array H[X] can be said to store the orthogonal projection of the pixel values of the difference image onto the horizontal axis (X axis), and the array V[Y] the orthogonal projection onto the vertical axis (Y axis).
- when only one retroreflective sheet 4 is reflected in the difference image DI, the XY coordinates and area of its image are obtained in the same manner as in the case of two sheets.
- when the number of retroreflective sheets 4 reflected in the difference image DI can be three or more, that is, when the player uses the crossbow 3C-N, the following processing is added.
- FIGS. 5(a) and 5(b) are explanatory diagrams of the additional detection process of the retroreflective sheet 4 when the player uses the crossbow 3C-N.
- when a pixel exceeding the threshold value Th1 is detected in a certain column and only pixels at or below Th1 are detected in the next column, the X coordinates of the pixels in the column to the left of that column (x1, x3 in FIG. 5B) are stored in the internal memory.
- next, the MCU 23 resets X to X0, increments Y by 1, and scans again while incrementing X, comparing each pixel value with the predetermined threshold value Th1 to detect pixels exceeding Th1.
- when a pixel exceeding Th1 is detected in a certain row and only pixels at or below Th1 are detected in the next row, the Y coordinates of the pixels in the row immediately above are stored in the internal memory.
- the MCU 23 compares the pixel value with the threshold value Th1 for each of the candidate regions b0 and b1, and determines that the images IM0 and IM1 exist in the candidate regions containing pixels exceeding the threshold value Th1.
- the MCU 23 determines that the images IM0 and IM1 exist in the candidate areas b0 and b1, respectively.
- the MCU 23 recognizes the number of candidate areas containing pixels exceeding the threshold value Th1 as the number of images reflected in the candidate area a0 (see FIG. 5A).
- the MCU 23 calculates (Equation 1) for each candidate area b0 and b1 where it is determined that the images IM0 and IM1 exist, and obtains XY coordinates (Xr, Yr) of the images IM0 and IM1.
- the MCU 23 calculates (Equation 1) and simultaneously counts the number of pixels exceeding the threshold value Th1 for each of the candidate regions b0 and b1 where the images IM0 and IM1 exist.
- the number of pixels exceeding the threshold value Th1 in the candidate region b0 corresponds to the area of the image IM0
- the number of pixels exceeding the threshold value Th1 in the candidate region b1 corresponds to the area of the image IM1.
- since the retroreflective sheets 4 retroreflect infrared light, the pixel regions exceeding the threshold value Th1, that is, the images IM0 and IM1, correspond to the retroreflective sheets 4.
- the two retroreflective sheets 4 are reflected in the candidate area a0.
- the MCU 23 calculates the XY coordinates and areas of the images IM0 and IM1 of the retroreflective sheet 4.
- the MCU 23 scans the area b0 to obtain the maximum X coordinate mxX[0], the maximum Y coordinate mxY[0], the minimum X coordinate mnX[0], and the minimum Y coordinate mnY[0] of the image IM0. Similarly, the MCU 23 scans the area b1 to obtain the maximum X coordinate mxX[1], the maximum Y coordinate mxY[1], the minimum X coordinate mnX[1], and the minimum Y coordinate mnY[1] of the image IM1.
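The bounding-box scan described above can be sketched as follows; the dict-based pixel container and the function name are illustrative, not the MCU's actual data layout:

```python
# Sketch of the bounding-box scan over one secondary candidate area: the
# maximum and minimum X and Y coordinates of the pixels exceeding Th1
# (mxX, mxY, mnX, mnY in the text).

def bounding_box(pixels, th1):
    """pixels: dict mapping (x, y) -> pixel value for one candidate area."""
    bright = [(x, y) for (x, y), p in pixels.items() if p > th1]
    if not bright:
        return None
    xs = [x for x, _ in bright]
    ys = [y for _, y in bright]
    return max(xs), max(ys), min(xs), min(ys)  # mxX, mxY, mnX, mnY
```

For example, two bright pixels at (2, 3) and (5, 7) plus a dim pixel yield the box (5, 7, 2, 3).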
- the MCU 23 calculates the XY coordinates and the area of the image IM2 of the retroreflective sheet 4 by executing the above process performed in the candidate area a0 in FIG. 5A also in the candidate area a1.
- the MCU 23 substitutes “1” into the arrays HcX[X][0] and VcY[Y][0] corresponding to the XY coordinates of the pixels exceeding the threshold value Th1.
- the MCU 23 substitutes “0” into the arrays HcX[X][0] and VcY[Y][0] corresponding to the XY coordinates of the pixels at or below the threshold value Th1.
- the arrays HcX[X][0] and VcY[Y][0] in which “1” is stored are schematically shown by diagonal lines.
- the element number X at the left end of each run of the array HcX[X][0] in which “1” is stored gives the X coordinates x0 and x2, and the element number X at the right end gives the X coordinates x1 and x3.
- the element number Y at the upper end of the array VcY[Y][0] in which “1” is stored is the Y coordinate y0, and the element number Y at the lower end is the Y coordinate y1. In this way, the candidate areas b0 and b1 can be determined.
- the MCU 23 executes the above-described processing performed in the candidate area a0 in FIG. 5B also in the candidate area a1.
- the array HcX[X][0] can be said to store the orthogonal projection of the pixel values of the primary candidate area onto the horizontal axis (X axis), and the array VcY[Y][0] the orthogonal projection onto the vertical axis (Y axis).
- when the player uses the sword 3A-N, the two retroreflective sheets 4B are reflected in the difference image DI. However, when the player operates the sword 3A-N at a certain distance or more from the camera unit 1, the distance between the two retroreflective sheets 4B reflected in the difference image DI becomes smaller than one pixel, so the images of the two retroreflective sheets 4B appear in the difference image DI as a single image.
- the number of images of the retroreflective sheet 4 reflected in the difference image DI is always one (either the retroreflective sheet 4A or 4B).
- a higher resolution image sensor 21 can also be used.
- one retroreflective sheet can be attached to the sword tip.
- a lower resolution image sensor 21 can also be used.
- the MCU 23 executes determination processing in the order of shield trigger generation conditions, special trigger generation conditions, and swing trigger generation conditions.
- description will be made in the order of a shield trigger, a swing trigger, and a special trigger.
- the MCU 23 determines that the retroreflective sheet 4A having a large area has been photographed when the area of the image of the retroreflective sheet reflected in the difference image DI exceeds a predetermined threshold value Tha1.
- when the MCU 23 determines that the retroreflective sheet 4A has been photographed continuously in five difference images DI, it generates a shield trigger and executes the inclination detection process.
- FIG. 6 is an explanatory diagram of the inclination detection process of the swords 3A-N.
- the MCU 23 classifies the inclination of the sword 3A-N as one of horizontal B0, diagonal B1, and vertical B2 based on the ratio r.
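The ratio r and its cut-off values are not defined in this excerpt; a sketch assuming r is the width-to-height ratio of the image's bounding box, with illustrative cut-offs of 0.5 and 2.0:

```python
# Sketch of the inclination classification into horizontal B0, diagonal B1,
# and vertical B2. ASSUMPTIONS: r is taken to be the width/height ratio of
# the image's bounding box, and the 0.5 / 2.0 cut-offs are illustrative only.

def classify_inclination(width, height):
    r = width / height
    if r > 2.0:
        return "B0"  # much wider than tall: horizontal
    if r < 0.5:
        return "B2"  # much taller than wide: vertical
    return "B1"      # in between: diagonal
```

A wide, flat image thus classifies as B0, a tall narrow one as B2, and a roughly square one as B1.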
- the MCU 23 determines that the retroreflective sheet 4B has been photographed when the area of the image of the retroreflective sheet is equal to or less than the predetermined threshold value Tha1. The MCU 23 then executes the swing detection process.
- FIG. 7 is an explanatory diagram of the swing detection process of the swords 3A-N.
- MCU 23 determines whether images IM0 to IM4 of retroreflective sheet 4B are continuously detected in the five difference images DI.
- the MCU 23 classifies the directions of the velocity vectors V0 to V3, based on the XY coordinates (Xr, Yr) of the five images IM0 to IM4, into one of the eight directions A0 to A7 in FIG. 8.
- for example, a direction within the range of 22.5 degrees clockwise and 22.5 degrees counterclockwise of the direction A0 is classified as the direction A0.
- the MCU 23 compares the magnitude of each of the velocity vectors V0 to V3 with a predetermined threshold value Thv1. When the magnitudes of the velocity vectors V0 to V3 all exceed the threshold value Thv1, the MCU 23 determines that the player has swung the sword 3A-N and generates a swing trigger. In this case, the MCU 23 sets the common direction into which the velocity vectors V0 to V3 are classified as the swing direction of the sword 3A-N.
- when the MCU 23 determines that the player has swung the sword 3A-N, it obtains the swing position based on the XY coordinates (Xr, Yr) of the center image IM2 among the five images IM0 to IM4. In this case, as shown in FIGS. 9A to 9H, the MCU 23 classifies the swing position into one of seven positions for each of the swing directions A0 to A7.
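The eight-direction classification and the swing-trigger condition above can be sketched as follows; the mapping of sector indices to screen directions and the function names are assumptions:

```python
import math

# Sketch of the swing detection: the four velocity vectors between five
# consecutive image positions are classified into eight 45-degree sectors
# A0..A7 (each direction covers 22.5 degrees to either side), and a swing
# trigger fires when all four vectors share one sector and each magnitude
# exceeds Thv1.

def classify_direction(vx, vy):
    """Return sector index 0..7 (A0..A7), 45 degrees per sector."""
    angle = math.degrees(math.atan2(vy, vx)) % 360.0
    return int(((angle + 22.5) % 360.0) // 45.0)

def swing_trigger(points, thv1):
    """points: five (Xr, Yr) image coordinates from consecutive frames."""
    vectors = [(x1 - x0, y1 - y0)
               for (x0, y0), (x1, y1) in zip(points, points[1:])]
    dirs = {classify_direction(vx, vy) for vx, vy in vectors}
    fast = all(math.hypot(vx, vy) > thv1 for vx, vy in vectors)
    if len(dirs) == 1 and fast:
        return dirs.pop()  # the shared sector is the swing direction
    return None            # no swing trigger
```

A track moving steadily rightward by 10 pixels per frame yields sector 0 when Thv1 is below 10, and no trigger when Thv1 is above it.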
- the MCU 23 determines whether or not a shield trigger has been generated this time and the previous time.
- when the MCU 23 determines that the shield trigger was generated both times, it detects the presence or absence of a special operation by the sword 3A-N and, when one is detected, generates a special trigger.
- FIG. 10 is an explanatory diagram of special operations by the swords 3A-N.
- the MCU 23 first determines whether the retroreflective sheet 4A has moved vertically upward, and then whether the retroreflective sheet 4A or 4B has moved vertically downward, that is, whether a special operation has been performed. Specifically, this proceeds as follows.
- the MCU 23 determines whether the images IM0 to IM4 of the retroreflective sheet 4A are detected continuously in five difference images DI between the time the first flag is turned on and the time it is turned off. In this case, since these are images of the retroreflective sheet 4A, the area of each of the images IM0 to IM4 must exceed the threshold value Tha1.
- when the MCU 23 determines that the five images IM0 to IM4 have been detected continuously, the direction of each of the velocity vectors V0 to V3, based on the XY coordinates (Xr, Yr) of the five images IM0 to IM4, is classified into one of the eight directions A0 to A7 (see FIG. 8).
- when the directions of the velocity vectors V0 to V3 are all classified into the direction A1, the MCU 23 compares the magnitude of each of the velocity vectors V0 to V3 with a predetermined threshold value Thv2. The MCU 23 then turns on the second flag when the magnitudes of the velocity vectors V0 to V3 all exceed the threshold value Thv2.
- next, the MCU 23 determines whether the images IM0 to IM4 of the retroreflective sheet 4B are detected continuously in five difference images DI between the time the second flag is turned on and the time it is turned off.
- when the MCU 23 determines that the five images IM0 to IM4 of the retroreflective sheet 4B have been detected in succession, the direction of each of the velocity vectors V0 to V3, based on the XY coordinates (Xr, Yr) of the five images IM0 to IM4, is classified into one of the eight directions A0 to A7 (see FIG. 8).
- when the directions of the velocity vectors V0 to V3 are all classified into the direction A0, the MCU 23 compares the magnitude of each of the velocity vectors V0 to V3 with a predetermined threshold value Thv3. When the magnitudes of the velocity vectors V0 to V3 all exceed the threshold value Thv3, the MCU 23 determines that the player has performed a special operation and generates a special trigger. Note that Thv2 ≠ Thv3.
- the first flag is turned off after the first predetermined time has elapsed since the first flag was turned on. Further, the second flag is turned off after the second predetermined time has elapsed since the second flag was turned on.
- the MCU 23 transmits to the terminal 5-N the trigger information (shield trigger, special trigger, swing trigger, or standby state), the XY coordinates and area of the image of the retroreflective sheet 4, and, when a swing trigger is generated, information on the swing direction and position.
- the MCU 23 sets a standby state as trigger information when none of the generation conditions of the shield trigger, the special trigger, and the swing trigger is satisfied.
- the terminal 5-N executes a game process corresponding to these pieces of information. These pieces of information are transmitted from the terminal 5-N to the host computer 31.
- the host computer 31 executes game processing according to these pieces of information and / or transmits these pieces of information to other terminals 5-N.
- the other terminal 5-N executes game processing according to these pieces of information.
- FIGS. 11A and 11B are explanatory diagrams of a special trigger based on Mace 3B-N.
- the MCU 23 generates a special trigger when the mace 3B-N is operated so as to draw a circle in the clockwise direction (counterclockwise can also be used as the condition) and is then swung down vertically. Specifically, this proceeds as follows.
- when the images IM0 to IM2 of the retroreflective sheet 4C are detected continuously in three difference images DI and the directions of the two velocity vectors V0 and V1, based on the XY coordinates (Xr, Yr) of the three images IM0 to IM2, are both classified into the same direction A2, the MCU 23 turns on the first flag.
- under the same detection condition, when the two velocity vectors V0 and V1 are both classified into the same direction A7, the MCU 23 turns on the second flag.
- under the same detection condition, when the two velocity vectors V0 and V1 are both classified into the same direction A0, the MCU 23 turns on the third flag.
- under the same detection condition, when the two velocity vectors V0 and V1 are both classified into the same direction A5, the MCU 23 turns on the fourth flag.
- under the same detection condition, when the two velocity vectors V0 and V1 are both classified into the same direction A3, the MCU 23 turns on the fifth flag.
- under the same detection condition, when the two velocity vectors V0 and V1 are both classified into the same direction A6, the MCU 23 turns on the sixth flag.
- under the same detection condition, when the two velocity vectors V0 and V1 are both classified into the same direction A1, the MCU 23 turns on the seventh flag.
- under the same detection condition, when the two velocity vectors V0 and V1 are both classified into the same direction A4, the MCU 23 turns on the eighth flag.
- under the same detection condition, when the two velocity vectors V0 and V1 are both classified into the same direction A2 again, the MCU 23 turns on the ninth flag.
- the MCU 23 turns off all the first to eighth flags when the ninth flag is not turned on within the third predetermined time from the time when the first flag is turned on.
- when the ninth flag is turned on, the MCU 23 determines whether the circle drawn by the mace 3B-N is larger than a predetermined size. If it is, the MCU 23 turns on the tenth flag and turns off all of the first to ninth flags. Specifically, this proceeds as follows.
- when the images IM0 to IM2 of the retroreflective sheet 4C are detected continuously in three difference images DI, the MCU 23 classifies the directions of the two velocity vectors V0 and V1, based on the XY coordinates (Xr, Yr) of the three images IM0 to IM2, into one of the eight directions A0 to A7. When the directions of the two velocity vectors V0 and V1 are both classified into the same direction A0, the MCU 23 determines whether the magnitudes of the two velocity vectors V0 and V1 both exceed a predetermined threshold value Thv4.
- when the magnitudes of the two velocity vectors V0 and V1 both exceed the threshold value Thv4, the MCU 23 determines that the player has performed a special operation and generates a special trigger. If no special trigger is generated within the fourth predetermined period after the tenth flag is turned on, all of the first to tenth flags are turned off.
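The first through ninth flags above amount to stepping through a fixed sequence of directions as successive velocity-vector pairs are classified. A sketch of just that flag sequence (timeouts, the tenth flag, and the sector classification itself are omitted; the class and method names are illustrative):

```python
# Sketch of the mace circle gesture: the first through ninth flags step
# through the direction sequence A2, A7, A0, A5, A3, A6, A1, A4, A2 as
# consecutive velocity-vector pairs are classified. Non-matching directions
# are ignored here; the text only resets the flags on a timeout.

CIRCLE_SEQUENCE = [2, 7, 0, 5, 3, 6, 1, 4, 2]  # A2 .. A2, one full turn

class CircleDetector:
    def __init__(self):
        self.stage = 0  # number of flags currently turned on

    def feed(self, direction):
        """direction: shared sector of the latest pair of velocity vectors.
        Returns True once the ninth flag is on (the full circle was drawn)."""
        if (self.stage < len(CIRCLE_SEQUENCE)
                and direction == CIRCLE_SEQUENCE[self.stage]):
            self.stage += 1
        return self.stage == len(CIRCLE_SEQUENCE)
```

Feeding the nine directions of one clockwise turn reports completion only on the final step.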
- the MCU 23 transmits the trigger information (special trigger, standby state), the XY coordinates and the area of the image of the retroreflective sheet 4 to the terminal 5-N.
- the MCU 23 sets a standby state as trigger information when the special trigger generation condition is not satisfied.
- the terminal 5-N executes a game process corresponding to these pieces of information. These pieces of information are transmitted from the terminal 5-N to the host computer 31.
- the host computer 31 executes game processing according to these pieces of information and / or transmits these pieces of information to other terminals 5-N.
- the other terminal 5-N executes game processing according to these pieces of information.
- the MCU 23 first detects the number of retroreflective sheets 4 reflected in the difference image DI, and proceeds with processing according to the number.
- the method for detecting the number of retroreflective sheets 4 is as described above (see FIGS. 4 and 5).
- when the number of retroreflective sheets 4 is one, the MCU 23 determines whether the charge trigger generation condition is satisfied. When the number of retroreflective sheets 4 is two, the MCU 23 determines whether the shield trigger generation condition is satisfied and, if it is not, whether the switch trigger generation condition is satisfied. When the number of retroreflective sheets 4 is three, the MCU 23 determines whether the shooting trigger generation condition is satisfied. When two or more of the four triggers satisfy their generation conditions, the priority is, from highest, the charge trigger, the shield trigger, the switch trigger, and the shooting trigger.
- the charge trigger, shield trigger, switch trigger, and shooting trigger will be described in this order.
- FIG. 12A is an explanatory diagram of a charge trigger based on the crossbow 3C-N.
- the MCU 23 determines whether or not the area of the image is larger than the threshold value Tha2. If larger, the MCU 23 determines that the retroreflective sheet 4G has been photographed and generates a charge trigger.
- FIG. 12B is an explanatory diagram of a shield trigger based on the crossbow 3C-N.
- the MCU 23 determines that the retroreflective sheets 4E and 4F have been photographed, and uses their XY coordinates (Xr, Yr) to calculate the slope of the straight line connecting them.
- the MCU 23 generates a shield trigger when the inclination is larger than a certain value.
- the MCU 23 calculates the midpoint coordinates based on the XY coordinates (Xr, Yr).
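The slope and midpoint calculations for the two-sheet case can be sketched as follows; the shield-trigger cut-off value itself is not specified in the excerpt, and the function names are illustrative:

```python
# Sketch of the two-sheet crossbow geometry: the slope of the line joining
# the images of the retroreflective sheets 4E and 4F (the shield trigger
# fires when it exceeds some unspecified value), and their midpoint, which
# the later switch-trigger detection uses in place of the individual points.

def line_slope(p_e, p_f):
    (xe, ye), (xf, yf) = p_e, p_f
    if xf == xe:
        return float("inf")  # vertical line
    return (yf - ye) / (xf - xe)

def midpoint(p_e, p_f):
    (xe, ye), (xf, yf) = p_e, p_f
    return ((xe + xf) / 2.0, (ye + yf) / 2.0)
```

For the points (0, 0) and (2, 4), the slope is 2.0 and the midpoint is (1.0, 2.0).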
- when two retroreflective sheets 4 are reflected in the difference image DI and the shield trigger generation condition is not satisfied, the MCU 23 determines whether the switch trigger generation condition is satisfied, in the same manner as the detection of the swing trigger of the sword 3A-N. In this determination, however, the midpoint coordinates of the retroreflective sheets 4E and 4F are used instead of their individual XY coordinates (Xr, Yr). Specifically, this proceeds as follows.
- FIG. 12C is an explanatory diagram of a switch trigger based on the crossbow 3C-N.
- when the images of the retroreflective sheets 4E and 4F are detected continuously in five difference images DI, the MCU 23 determines, based on the corresponding five midpoint coordinates, whether the directions of the four velocity vectors are all classified into the same direction A1. When they are, the MCU 23 determines whether the magnitudes of the four velocity vectors all exceed a predetermined threshold value Thv5, and turns on a predetermined flag when they do.
- after the predetermined flag is turned on, when the images of the retroreflective sheets 4E and 4F are again detected continuously in five difference images DI, the MCU 23 determines, based on the five midpoint coordinates, whether the directions of the four velocity vectors are all classified into the same direction A0. When they are, the MCU 23 determines whether the magnitudes of the four velocity vectors all exceed a predetermined threshold value Thv6, and generates the switch trigger when they do.
- the MCU 23 turns off the predetermined flag when the switch trigger is not generated within the fifth predetermined period from the time when the predetermined flag is turned on.
- FIG. 12D is an explanatory diagram of a shooting trigger based on the crossbow 3C-N.
- when three retroreflective sheets 4 are reflected in the difference image DI, the MCU 23 determines that these are the retroreflective sheets 4D, 4E, and 4F. The MCU 23 then determines whether the retroreflective sheets 4 reflected in the previous difference image were only the retroreflective sheets 4E and 4F, and generates a shooting trigger only when just those two were reflected. The MCU 23 does not generate a shooting trigger when the previous difference image already contained the three retroreflective sheets 4D, 4E, and 4F. That is, a shooting trigger is generated when the number of retroreflective sheets 4 reflected in the difference image DI changes from two to three.
- before determining the shooting trigger generation condition, the MCU 23 also checks the differences between the areas ar0, ar1, and ar2 of the three retroreflective sheets 4.
- before determining the shooting trigger generation condition, the MCU 23 determines whether the retroreflective sheets 4E and 4F at both ends satisfy the shield trigger generation condition, and generates the shield trigger if they do.
- the MCU 23 determines the shooting trigger generation condition when the charge trigger and shield trigger generation conditions are not satisfied.
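The two-to-three transition rule for the shooting trigger can be sketched as a one-frame edge detector (the class and method names are illustrative):

```python
# Sketch of the shooting-trigger condition: a shooting trigger fires only on
# the frame where the number of retroreflective sheets reflected in the
# difference image changes from two to three; three sheets in consecutive
# frames do not retrigger.

class ShootingTrigger:
    def __init__(self):
        self.prev_count = 0  # sheet count in the previous difference image

    def update(self, count):
        fired = (self.prev_count == 2 and count == 3)
        self.prev_count = count
        return fired
```

A count sequence 2, 3, 3 therefore fires exactly once, on the 2 to 3 transition.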
- the MCU 23 transmits the trigger information (charge trigger, shield trigger, switch trigger, shooting trigger and standby state), the XY coordinates and the area of the image of the retroreflective sheet 4 to the terminal 5-N. Further, when the number of retroreflective sheets 4 reflected in the difference image DI is two, the MCU 23 transmits the midpoint coordinates to the terminal 5-N. Further, when the number of retroreflective sheets 4 reflected in the difference image DI is three, the MCU 23 transmits the midpoint coordinates of the retroreflective sheets 4 at both ends to the terminal 5-N.
- the MCU 23 sets a standby state as trigger information when none of the generation conditions of the charge trigger, the shield trigger, the switch trigger, and the shooting trigger is satisfied.
- the terminal 5-N executes a game process corresponding to these pieces of information. These pieces of information are transmitted from the terminal 5-N to the host computer 31.
- the host computer 31 executes game processing according to these pieces of information and / or transmits these pieces of information to other terminals 5-N.
- the other terminal 5-N executes game processing according to these pieces of information.
- since the input of the crossbow 3C-N is based on the captured images of the retroreflective sheets 4E and 4F, the presence or absence and the form of an input can always be detected. Further, since the retroreflective sheet 4D can be switched between an exposed state and a non-exposed state, different inputs can be given depending on whether the retroreflective sheet 4D is captured, which diversifies the types of input using the retroreflective sheets.
- FIG. 13 shows a modification of the crossbow 3C-N shown in FIG.
- the mounting positions of the shutter 50 and the retroreflective sheet 4G are different from those of the crossbow 3C-N in FIG.
- a shutter 50 is attached to the front end of the base 41 so as to be freely opened and closed. The retroreflective sheet 4D is attached to the distal end portion of the base 41, on the back side of the shutter 50.
- when the trigger 51 is not pulled, the shutter 50 is closed. Therefore, in this case, the retroreflective sheet 4D is hidden behind the shutter 50 and is not exposed.
- when the trigger 51 is pulled, the shutter 50 opens. Therefore, in this case, the retroreflective sheet 4D is exposed.
- the member 40 is attached to the distal end portion of the pedestal 41 so as to have an obtuse angle (as viewed from the trigger 51 side) with respect to the longitudinal direction of the pedestal 41.
- the retroreflective sheet 4G is attached to the back surface of the member 40, that is, the surface facing the trigger 51. Accordingly, when the tip of the base 41 faces the camera unit 1, the retroreflective sheet 4G is not photographed, and when the tip of the base 41 faces upward, the retroreflective sheet 4G is photographed.
- since the retroreflective sheet 4G is attached at an obtuse angle with respect to the longitudinal direction of the pedestal 41, the retroreflective sheet 4G is not photographed unless the tip of the base 41 is raised higher than is required with the crossbow 3C-N described above. Therefore, photographing of the retroreflective sheet 4G that is not intended by the player can be further prevented.
- FIG. 14 is a flowchart showing an overall flow of processing of the MCU 23 of FIG.
- in step S1, the MCU 23 executes an initialization process for variables and the like.
- in step S3, the MCU 23 controls the image sensor 21 to photograph the retroreflective sheet 4.
- in step S5, the MCU 23 executes the detection process of the retroreflective sheet 4 based on the difference image signal from the image sensor 21 and calculates the state information of the retroreflective sheet 4.
- the MCU 23 executes trigger detection processing based on the detection result in step S5.
- in step S11, the MCU 23 transmits the trigger (that is, a trigger flag described later) and the status information to the terminal 5-N.
- in step S21, the terminal 5-N receives the trigger and the status information.
- in step S23, the terminal 5-N executes a game process according to the received trigger and state information.
- in step S25, the terminal 5-N transmits the trigger and status information to the host computer 31 via the network 29.
- the host computer 31 executes a game process corresponding to the trigger and state information and / or transmits the trigger and state information to another terminal 5-N.
- the other terminal 5-N executes game processing according to these pieces of information. Such processing is performed between the plurality of terminals 5-N, and an online game is executed.
- the terminal 5-N can also execute the online game by transmitting the trigger and status information directly to the other terminal 5-N via the network.
- FIG. 15 is a flowchart showing the flow of the photographing process in step S3 of FIG.
- the MCU 23 first causes the image sensor 21 to turn on the infrared light emitting diode 11.
- in step S43, the MCU 23 causes the image sensor 21 to execute photographing while the infrared light is on.
- in step S45, the MCU 23 causes the image sensor 21 to turn off the infrared light emitting diode 11.
- in step S47, the MCU 23 causes the image sensor 21 to execute photographing while the infrared light is off.
- in step S49, the MCU 23 causes the image sensor 21 to generate and output the difference image between the image taken with the infrared light on and the image taken with it off.
- the image sensor 21 performs photographing when the infrared light is turned on and off, that is, flash photography.
- the infrared light emitting diode 11 functions as a stroboscope by the above control.
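The lit/unlit frame pair and the per-pixel subtraction can be sketched as follows; the list-of-lists frame representation and the clamp at zero are illustrative:

```python
# Sketch of the stroboscopic differencing in steps S43..S49: one frame taken
# with the infrared LED lit, one with it off, and a per-pixel difference
# clamped at zero. Ambient light appears in both frames and cancels, leaving
# mainly the retroreflected light of the sheets.

def difference_image(frame_on, frame_off):
    """frame_on / frame_off: equally sized 2-D lists of pixel values."""
    return [[max(a - b, 0) for a, b in zip(row_on, row_off)]
            for row_on, row_off in zip(frame_on, frame_off)]
```

A pixel that is darker in the lit frame clamps to 0; a retroreflective pixel keeps its brightness gain.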
- FIGS. 16 to 22 are flowcharts showing the flow of the retroreflective sheet detection process in step S5 of FIG.
- the MCU 23 substitutes 0 for the variables X and Y.
- the MCU 23 compares the pixel value P(X, Y) of the difference image with the threshold value Th1.
- the pixel value is, for example, a luminance value.
- the MCU 23 proceeds to step S77 when the pixel value P(X, Y) exceeds the threshold value Th1, and otherwise proceeds to step S79.
- in step S77, the MCU 23 substitutes 1 for each of the variables H[X] and V[Y].
- in step S79, the MCU 23 proceeds to step S83 if the value of the variable H[X] is 1, and otherwise proceeds to step S81.
- in step S81, the MCU 23 substitutes 0 for the variable H[X].
- in step S83, the MCU 23 proceeds to step S87 if the value of the variable V[Y] is 1, and otherwise proceeds to step S85.
- in step S85, the MCU 23 substitutes 0 for the variable V[Y].
- in step S87, the MCU 23 increments the value of the variable X by one.
- in step S89, the MCU 23 proceeds to step S91 if the value of the variable X is 64, and otherwise returns to step S73.
- in step S91, the MCU 23 substitutes 0 for the variable X.
- in step S93, the MCU 23 increments the value of the variable Y by one.
- in step S95, the MCU 23 proceeds to step S101 in FIG. 17 if the value of the variable Y is 64, and otherwise returns to step S73.
- the differential image is scanned, and values are set in the arrays H [X] and V [Y] that define the primary candidate regions (see FIGS. 4 and 5A).
- the primary candidate areas are, for example, areas a0 to a3 in FIG. 4 and areas a0 and a1 in FIG. 5A.
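The scan of steps S71 to S95 amounts to building two binary projection arrays, one per column and one per row; rectangles where both arrays hold 1 are the primary candidate regions. A minimal sketch (array sizes and the function name are assumptions):

```python
def project(diff, th):
    """Scan a difference image and build the projection arrays H and V.

    H[X] becomes 1 if any pixel in column X exceeds the threshold;
    V[Y] becomes 1 if any pixel in row Y does (cf. steps S71-S95).
    """
    h = [0] * len(diff[0])
    v = [0] * len(diff)
    for y, row in enumerate(diff):
        for x, p in enumerate(row):
            if p > th:
                h[x] = 1
                v[y] = 1
    return h, v
```

The projections reduce a 64 x 64 search to two 64-element scans, which suits a small MCU.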
- step S101 MCU 23 substitutes 0 for each of variables X, m, Hmx [] [] and Hmn [] [].
- step S103 the MCU 23 proceeds to step S105 when the value of the variable H [X] is 1, and proceeds to step S109 otherwise.
- step S105 the MCU 23 proceeds to step S115 if the value of the variable H [X-1] is 0, otherwise proceeds to step S117.
- step S115 the MCU 23 substitutes the value of the variable X for the variable Hmn [m] [0].
- step S109 the MCU 23 proceeds to step S111 if the value of the variable H [X-1] is 1, otherwise proceeds to step S117.
- step S111 the MCU 23 substitutes the value of the variable X for the variable Hmx [m] [0].
- step S113 the MCU 23 increments the value of the variable m by one.
- step S117 the MCU 23 increments the value of the variable X by one.
- step S119 the MCU 23 proceeds to step S121 when the value of the variable X is 64, otherwise returns to step S103.
- step S121 the MCU 23 substitutes a value obtained by subtracting 1 from the value of the variable m for the variable Hn.
- the processing from the above steps S101 to S121 obtains, for each run of "1"s in the array H [X], the element number X (X coordinate) at the left end of the run and the element number X (X coordinate) at the right end of the run.
- step S123 the MCU 23 substitutes 0 for each of the variables Y, n, Vmx [] [] and Vmn [] [].
- step S125 the MCU 23 proceeds to step S127 when the value of the variable V [Y] is 1, and proceeds to step S135 otherwise.
- step S127 the MCU 23 proceeds to step S129 if the value of the variable V [Y-1] is 0, otherwise proceeds to step S131.
- step S129 the MCU 23 substitutes the value of the variable Y for the variable Vmn [n] [0].
- step S135 the MCU 23 proceeds to step S137 if the value of the variable V [Y-1] is 1, otherwise proceeds to step S131.
- step S137 the MCU 23 substitutes the value of the variable Y for the variable Vmx [n] [0].
- step S139 the MCU 23 increments the value of the variable “n” by one.
- step S131 the MCU 23 increments the value of the variable Y by one.
- step S133 the MCU 23 proceeds to step S141 when the value of the variable Y is 64, and otherwise returns to step S125.
- step S141 the MCU 23 substitutes a value obtained by subtracting 1 from the value of the variable n for the variable Vn.
- the processing from steps S123 to S141 obtains, for each run of "1"s in the array V [Y], the element number Y (Y coordinate) at the upper end of the run and the element number Y (Y coordinate) at the lower end of the run.
- the difference image is scanned and the primary candidate area is determined (see FIGS. 4 and 5A).
- step S143 the MCU 23 substitutes 0 for the variable m.
- step S145 the MCU 23 substitutes the value of the variable Hmn [m] [0] for the variable Hm [m], and substitutes the value of the variable Hmx [m] [0] for the variable Hx [m].
- step S147 the MCU 23 proceeds to step S151 when the value of the variable m is the value Hn, otherwise proceeds to step S149.
- step S149 the MCU 23 increments the value of the variable m by one, and returns to step S145.
- step S151 the MCU 23 substitutes 0 for the variable n.
- step S153 the MCU 23 substitutes the value of the variable Vmn [n] [0] for the variable Vn [n], and substitutes the value of the variable Vmx [n] [0] for the variable Vx [n].
- step S155 the MCU 23 proceeds to step S171 in FIG. 18 if the value of the variable “n” is the value Vn, otherwise proceeds to step S157.
- step S157 the MCU 23 increments the value of the variable “n” by one.
- step S171 the MCU 23 proceeds to step S177 when three or more retroreflective sheets 4 are reflected in the difference image, that is, when the crossbow 3C-N is used as the operation article; otherwise, the process proceeds to step S173.
- step S173, the MCU 23 substitutes 0 for the variable J.
- step S175 the MCU 23 substitutes the value of the variable Hn for the variable M [0], substitutes the value of the variable Vn for the variable N [0], and proceeds to step S331 in FIG.
- step S331 the MCU 23 initializes the variables CA, A, B, C, minX, minY, maxX, maxY, s, mnX [], mnY [], mxX [], mxY [], Xr [], Yr [] and C []. Then, the MCU 23 repeats steps S333 to S389 in FIG. 22 while updating the variable j, repeats steps S335 to S387 in FIG. 22 while updating the variable n, and repeats steps S337 to S385 in FIG. 22 while updating the variable m.
- step S339 the MCU 23 substitutes the value of the variable Hmn [m] [j] for the variable X, and substitutes the value of the variable Vmn [n] [j] for the variable Y.
- step S341 the MCU 23 compares the pixel value P (X, Y) of the difference image with the threshold value Th1.
- step S343 the MCU 23 proceeds to step S345 when the pixel value P (X, Y) exceeds the threshold value Th1, and proceeds to step S351 otherwise.
- step S345 the MCU 23 increments the value of the counter CA that calculates the area of the retroreflective sheet image by one.
- step S347 the MCU 23 updates the values of the variables A, B, and C by the following formula.
- step S349 the MCU 23 detects four end points (maximum X coordinate, maximum Y coordinate, minimum X coordinate, minimum Y coordinate) of the image of the retroreflective sheet 4.
- step S351 the MCU 23 increments the value of the variable X by one.
- step S353 the MCU 23 proceeds to step S355 if the value of the variable X is equal to the value obtained by adding 1 to the value of the variable Hmx [m] [j], otherwise returns to step S341.
- step S355 the MCU 23 substitutes the value of the variable Hmn [m] [j] for the variable X.
- step S357 the MCU 23 increments the value of the variable Y by one.
- step S359 the MCU 23 proceeds to step S371 in FIG. 22 if the value of the variable Y is equal to the value obtained by adding 1 to the value of the variable Vmx [n] [j], otherwise returns to step S341.
- step S371 the MCU 23 proceeds to step S373 if the value of the counter CA exceeds 0, and otherwise proceeds to step S385. A counter CA exceeding 0 means that an image of the retroreflective sheet exists in the candidate area, so the process proceeds to step S373 in order to calculate the coordinates (Xr, Yr) and save the result. In step S373, the MCU 23 substitutes the value of the counter CA for the variable C [s]. In step S375, the MCU 23 substitutes the value of A * R / C for the variable Xr [s], and substitutes the value of B * R / C for the variable Yr [s].
- step S377 the MCU 23 substitutes the value of the variable minX for the variable mnX [s], substitutes the value of the variable minY for the variable mnY [s], substitutes the value of the variable maxX for the variable mxX [s], and substitutes the value of the variable maxY for the variable mxY [s].
- step S379 the MCU 23 substitutes the value of the counter s, which counts the number of photographed retroreflective sheets, into the variable SN.
- step S381 the MCU 23 increments the value of the counter s by one.
- step S383 the MCU 23 resets the variables CA, A, B, C, minX, minY, maxX, and maxY, and proceeds to step S385.
- next, the details of step S349 will be described.
- FIG. 23 is a flowchart showing the flow of the four end point detection process in step S349 of FIG.
- MCU 23 compares the value of variable minX with the value of variable X.
- the MCU 23 proceeds to step S405 when the value of the variable “minX” is larger than the value of the variable “X”, otherwise proceeds to step S407.
- the MCU 23 substitutes the value of the variable X for the variable minX.
- step S407 the MCU 23 compares the value of the variable maxX with the value of the variable X.
- step S409 the MCU 23 proceeds to step S411 when the value of the variable “maxX” is smaller than the value of the variable “X”, otherwise proceeds to step S413.
- step S411 the MCU 23 substitutes the value of the variable X for the variable maxX.
- step S413 the MCU 23 compares the value of the variable minY with the value of the variable Y.
- step S415 the MCU 23 proceeds to step S417 if the value of the variable “minY” is larger than the value of the variable “Y”, otherwise proceeds to step S419.
- step S417 the MCU 23 substitutes the value of the variable Y for the variable minY.
- step S419 the MCU 23 compares the value of the variable maxY with the value of the variable Y.
- step S421 the MCU 23 proceeds to step S423 if the value of the variable maxY is smaller than the value of the variable Y, and returns otherwise.
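One iteration of the four-end-point detection of FIG. 23 is a running min/max update; a sketch (tuple representation and function name are assumptions):

```python
def update_endpoints(box, x, y):
    """Fold a newly found bright pixel (x, y) into the running
    bounding box (minX, minY, maxX, maxY), as in steps S401-S423."""
    min_x, min_y, max_x, max_y = box
    if x < min_x: min_x = x   # cf. steps S401-S405
    if x > max_x: max_x = x   # cf. steps S407-S411
    if y < min_y: min_y = y   # cf. steps S413-S417
    if y > max_y: max_y = y   # cf. steps S419-S423
    return (min_x, min_y, max_x, max_y)
```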
- step S177 the MCU 23 substitutes 0 for each of the variables m, n, and k.
- step S179 the MCU 23 substitutes the value of the variable Hn [m] for the variable X and substitutes the value of the variable Vn [n] for the variable Y.
- step S181 the MCU 23 compares the pixel value P (X, Y) of the difference image with the threshold value Th1.
- step S183 the MCU 23 proceeds to step S185 when the pixel value P (X, Y) exceeds the threshold value Th1, and proceeds to step S187 otherwise.
- step S185 the MCU 23 substitutes 1 for each of the variables Hc [X] [k] and Vc [Y] [k].
- step S187 the MCU 23 proceeds to step S191 if the value of the variable Hc [X] [k] is 1, otherwise proceeds to step S189.
- step S189 the MCU 23 substitutes 0 for the variable Hc [X] [k].
- step S191 the MCU 23 proceeds to step S195 when the value of the variable Vc [Y] [k] is 1, and proceeds to step S193 otherwise.
- step S193 the MCU 23 substitutes 0 for the variable Vc [Y] [k].
- step S195 the MCU 23 increments the value of the variable X by one.
- step S197 the MCU 23 proceeds to step S199 when the value of the variable X is equal to the value obtained by adding 1 to the value of the variable Hx [m], otherwise the MCU 23 returns to step S181.
- step S199 the MCU 23 substitutes the value of the variable Hn [m] for the variable X.
- step S201 the MCU 23 increments the value of the variable Y by one.
- step S203 the MCU 23 proceeds to step S205 if the value of the variable Y is equal to the value obtained by adding 1 to the variable Vx [n], otherwise returns to step S181.
- step S205 the MCU 23 proceeds to step S209 if the value of the variable m is equal to the value Hn, otherwise proceeds to step S207.
- step S207 the MCU 23 increments each value of the variables m and k by 1, and returns to step S179.
- step S209 the MCU 23 proceeds to step S215 in FIG. 18 if the value of the variable n is equal to the value Vn, otherwise proceeds to step S211.
- the MCU 23 substitutes 0 for the variable m.
- step S213 the MCU 23 increments each of the values of the variables n and k by 1, and returns to step S179.
- step S215 the MCU 23 substitutes the value of the variable k for the variable K, and proceeds to step S231 in FIG.
- each primary candidate region is scanned, and values are set in the arrays Hc [X] [k] and Vc [Y] [k] that define the secondary candidate regions (see FIG. 5B).
- the secondary candidate regions are, for example, the regions b0 and b1 in FIG. 5B.
- step S231 MCU 23 substitutes 0 for variables p, m, k, Hmx [] [] and Hmn [] [], respectively.
- step S233 the MCU 23 substitutes the value of the variable Hn [m] for the variable X.
- step S235 the MCU 23 proceeds to step S237 when the value of the variable Hc [X] [k] is 1, and proceeds to step S243 otherwise.
- step S237 the MCU 23 proceeds to step S239 when the value of the variable Hc [X-1] [k] is 0, otherwise proceeds to step S241.
- step S239 the MCU 23 substitutes the value of the variable X for the variable Hmn [p] [k].
- step S243 the MCU 23 proceeds to step S245 when the value of the variable Hc [X-1] [k] is 1, and proceeds to step S241 otherwise.
- step S245 the MCU 23 substitutes the value of the variable X for the variable Hmx [p] [k].
- step S247 the MCU 23 increments the value of the variable “p” by one.
- step S241 the MCU 23 increments the value of the variable X by one.
- step S249 the MCU 23 proceeds to step S251 if the value of the variable X is equal to the value obtained by adding 1 to the value of Hx [m], otherwise returns to step S235.
- step S251 the MCU 23 substitutes a value obtained by subtracting 1 from the value of the variable p for the variable M [k].
- step S253 the MCU 23 substitutes 0 for the variable p.
- step S255 the MCU 23 proceeds to step S259 when the value of the variable m is equal to the value of the variable Hm, otherwise proceeds to step S257.
- step S257 the MCU 23 increments each of the values of the variables m and k by one and returns to step S233.
- step S259 the MCU 23 proceeds to step S281 in FIG. 20 if the value of the variable k is equal to the value of the variable K, otherwise proceeds to step S261.
- step S261 the MCU 23 substitutes 0 for the variable m.
- step S263 the MCU 23 increments the variable k by one and proceeds to step S233.
- this is a process for obtaining, with respect to the secondary candidate region, the element number X (X coordinate) at the left end of the array Hc [X] [k] in which "1" is stored and the element number X (X coordinate) at the right end of the array Hc [X] [k] in which "1" is stored.
- step S281 MCU 23 substitutes 0 for each of variables r, n, m, k, Vmx [] [] and Vmn [] [].
- step S283 the MCU 23 substitutes the value of the variable Vn [n] for the variable Y.
- step S285 the MCU 23 proceeds to step S287 when the value of the variable Vc [Y] [k] is 1, and proceeds to step S291 otherwise.
- step S287 the MCU 23 proceeds to step S289 when the value of the variable Vc [Y-1] [k] is 0, otherwise proceeds to step S297.
- step S289 the MCU 23 substitutes the value of the variable Y for the variable Vmn [r] [k].
- step S291 the MCU 23 proceeds to step S293 when the value of the variable Vc [Y-1] [k] is 1, and proceeds to step S297 otherwise.
- step S293 the MCU 23 substitutes the value of the variable Y for the variable Vmx [r] [k].
- step S295 the MCU 23 increments the value of the variable “r” by one.
- step S297 the MCU 23 increments the value of the variable Y by one.
- step S299 the MCU 23 proceeds to step S301 if the value of the variable Y is equal to the value obtained by adding 1 to the value of Vx [n], otherwise returns to step S285.
- step S301 the MCU 23 substitutes a value obtained by subtracting 1 from the value of the variable r for the variable N [k].
- step S303 the MCU 23 substitutes 0 for the variable r.
- step S305 the MCU 23 proceeds to step S309 if the value of the variable m is equal to the value of the variable Hm, otherwise proceeds to step S307.
- step S307 the MCU 23 increments the values of the variables m and k by one and returns to step S283.
- step S309 the MCU 23 proceeds to step S311 when the value of the variable k is equal to the value of the variable K, and proceeds to step S313 otherwise.
- step S313 the MCU 23 substitutes 0 for the variable m.
- step S315 the MCU 23 increments each of the variables k and n by one and proceeds to step S283.
- step S311 the MCU 23 substitutes the value of the variable K for the variable J, and proceeds to step S331 in FIG.
- the processing of FIG. 20 is, with respect to the secondary candidate region, a process for obtaining the element number Y (Y coordinate) at the upper end of the array Vc [Y] [k] in which "1" is stored and the element number Y (Y coordinate) at the lower end of the array Vc [Y] [k] in which "1" is stored.
- each primary candidate area is scanned and a secondary candidate area is determined (see FIG. 5B).
- FIG. 24 is a flowchart showing the trigger detection process (based on swords 3A-N) in step S9 of FIG.
- MCU 23 clears the trigger flag. Information indicating the type of the generated trigger is set in the trigger flag.
- the MCU 23 executes a shield trigger detection process.
- the MCU 23 executes a special trigger detection process.
- the MCU 23 executes a swing trigger detection process.
- FIG. 25 is a flowchart showing a flow of shield trigger detection processing in step S443 in FIG.
- the MCU 23 compares the area C [0] of the retroreflective sheet with the threshold value Tha1.
- when the area C [0] is larger than the threshold value Tha1, the MCU 23 determines that the retroreflective sheet 4A has been photographed and proceeds to step S465; otherwise, the MCU 23 proceeds to step S467.
- the MCU 23 increments the value of the variable Q0 by one.
- step S469 the MCU 23 proceeds to step S471 if the value of the variable Q0 is equal to 5, that is, if the retroreflective sheet 4A has been photographed five times in succession, and otherwise returns.
- step S467 the MCU 23 assigns 0 to each of the variables Q0, ΔX, ΔY, and r and returns.
- step S471 the MCU 23 sets the trigger flag to a value indicating a shield (a shield trigger is generated).
- step S473 the MCU 23 substitutes mxX [0] - mnX [0] (that is, the horizontal length of the candidate area) for the variable ΔX, and substitutes mxY [0] - mnY [0] (that is, the vertical length of the candidate area) for the variable ΔY.
- step S475 the MCU 23 obtains the ratio r by the following equation.
- step S477 the MCU 23 classifies and stores (stores) the inclination of the sword 3A-N as one of the inclinations B0 to B2 based on the ratio r.
- step S479 the MCU 23 assigns 0 to the variable Q0 and returns.
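The debouncing in steps S461 to S471 requires the sheet area to exceed the threshold in five consecutive frames before the shield trigger fires. A sketch of that condition (history-list interface and function name are assumptions; the tilt-ratio classification of step S477 is omitted because its formula is not reproduced in the text):

```python
def shield_trigger(areas, th_area, need=5):
    """Fire the shield trigger only after the sheet area C[0] exceeds
    th_area in `need` consecutive frames (cf. the counter Q0 of
    steps S461-S471).  `areas` holds the history, newest last."""
    recent = areas[-need:]
    return len(recent) == need and all(a > th_area for a in recent)
```

Requiring consecutive detections suppresses one-frame noise in the difference image.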
- FIGS. 26 and 27 are flowcharts showing the flow of the special trigger detection process in step S445 of FIG. 24.
- the MCU 23 proceeds to step S561 in FIG. 27 when the second flag is on, and proceeds to step S503 when it is off.
- the MCU 23 proceeds to step S511 if the first flag is on, and proceeds to step S505 if it is off.
- step S505 the MCU 23 proceeds to step S507 if the trigger flag was set to shield both the previous time and this time, and returns otherwise.
- step S507 the MCU 23 turns on the first flag.
- step S509 the MCU 23 starts the first timer and returns.
- step S511 the MCU 23 refers to the first timer and proceeds to step S541 when the first predetermined time has elapsed, otherwise proceeds to step S513.
- step S513 the MCU 23 compares the area C [0] of the retroreflective sheet 4 with the threshold value Tha1.
- step S515 the MCU 23 determines that the retroreflective sheet 4A has been detected when the area C [0] is larger than the threshold value Tha1, proceeds to step S517, otherwise proceeds to step S543.
- step S517 the MCU 23 increments the value of the variable Q1 by one.
- step S523 if the value of the variable Q1 is equal to 5, that is, if the retroreflective sheet 4A is detected five times in succession, the MCU 23 proceeds to step S525, and otherwise returns.
- step S525 the MCU 23 calculates velocity vectors V0 to V3 based on the XY coordinates (Xr, Yr) of the current and past four images IM0 to IM4 of the retroreflective sheet 4A.
- step S527 the MCU 23 classifies each of the velocity vectors V0 to V3 into any of the directions A0 to A7.
- step S529 the MCU 23 proceeds to step S531 when all the velocity vectors V0 to V3 are classified in the same direction A1, and returns otherwise.
- step S531 the MCU 23 compares the magnitudes of the velocity vectors V0 to V3 with the threshold value Thv2. In step S535, if all the velocity vectors V0 to V3 are larger than the threshold value Thv2, the process proceeds to step S537, and otherwise returns. In step S537, the MCU 23 turns on the second flag. In step S539, the MCU 23 starts the second timer and returns.
- step S541 the MCU 23 resets the first timer and the first flag.
- step S543 the MCU 23 assigns 0 to the variable Q1 and returns.
- step S561 MCU 23 refers to the second timer and proceeds to step S571 when the second predetermined time has elapsed, otherwise proceeds to step S563.
- step S571 the MCU 23 resets the first timer, the second timer, the first flag, and the second flag.
- step S573 the MCU 23 assigns 0 to each of the variables Q1 and Q2 and returns.
- step S563 the MCU 23 compares the area C [0] of the retroreflective sheet 4 with the threshold value Tha1. In step S565, the MCU 23 determines that the retroreflective sheet 4B has been detected when the area C [0] is equal to or less than the threshold value Tha1, and proceeds to step S567. Otherwise, the MCU 23 proceeds to step S573. In step S567, the MCU 23 increments the value of the variable Q2 by one.
- step S569 when the value of the variable Q2 is equal to 5, the MCU 23 determines that the retroreflective sheet 4B has been detected five times in succession, proceeds to step S575, and returns otherwise.
- step S575 the MCU 23 calculates velocity vectors V0 to V3 based on the XY coordinates (Xr, Yr) of the current and past four images IM0 to IM4 of the retroreflective sheet 4B.
- step S577 the MCU 23 classifies each of the velocity vectors V0 to V3 into any one of the directions A0 to A7.
- step S579 the MCU 23 proceeds to step S581 when all the velocity vectors V0 to V3 are classified in the same direction A0, and otherwise returns.
- step S581 the MCU 23 compares the magnitudes of the velocity vectors V0 to V3 with the threshold value Thv3. In step S583, if all the velocity vectors V0 to V3 are larger than the threshold value Thv3, the process proceeds to step S585, and otherwise, the process returns.
- step S585 the MCU 23 sets the trigger flag to “special” (occurrence of a special trigger).
- step S587 the MCU 23 resets the first timer, the second timer, the first flag, and the second flag.
- step S589 the MCU 23 substitutes 0 for each of the variables Q1 and Q2, and proceeds to step S11 in FIG.
- FIG. 28 is a flowchart showing the flow of the swing trigger detection process in step S447 of FIG.
- the MCU 23 compares the area C [0] of the retroreflective sheet with the threshold value Tha1.
- the MCU 23 determines that the retroreflective sheet 4B has been detected when the area C [0] is equal to or less than the threshold value Tha1, and proceeds to step S605. Otherwise, the MCU 23 proceeds to step S631.
- the MCU 23 substitutes 0 for the variable Q3 and proceeds to step S627.
- step S605 the MCU 23 increments the value of the variable Q3 by one.
- step S607 if the value of the variable Q3 is equal to 5, the MCU 23 determines that the retroreflective sheet 4B has been photographed five times continuously, and proceeds to step S609. Otherwise, the MCU 23 proceeds to step S627.
- step S609 the MCU 23 calculates velocity vectors V0 to V3 based on the XY coordinates (Xr, Yr) of the current and past four images IM0 to IM4 of the retroreflective sheet 4B.
- step S611 the MCU 23 classifies each of the velocity vectors V0 to V3 into any one of the directions A0 to A7.
- step S613 the MCU 23 proceeds to step S615 if all the velocity vectors V0 to V3 are classified in the same direction, and proceeds to step S627 otherwise.
- step S615 the MCU 23 registers (stores) the directions of the velocity vectors V0 to V3.
- step S617 the MCU 23 compares the magnitudes of the velocity vectors V0 to V3 with the threshold value Thv1. In step S619, if all the velocity vectors V0 to V3 are larger than the threshold value Thv1, the MCU 23 determines that the sword 3A-N has been swung and proceeds to step S621; otherwise, the process proceeds to step S627.
- step S621 the MCU 23 sets a trigger flag to swing (generation of a swing trigger).
- step S623 the MCU 23 obtains and registers (stores) the swing position based on the XY coordinates of the central image IM2 among the five images IM0 to IM4. In this case, as shown in FIGS. 9A to 9H, the MCU 23 classifies the swing position into one of seven positions for each swing direction A0 to A7. In step S625, the MCU 23 substitutes 0 for the variable Q3.
- step S627 the MCU 23 proceeds to step S629 if the trigger flag is not set to the shield, and returns if it is set to the shield.
- step S629 the MCU 23 sets standby to the trigger flag and returns.
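The swing test of steps S609 to S621 can be sketched as follows: four velocity vectors are derived from five consecutive sheet positions, each is classified into one of the eight directions A0 to A7, and the trigger fires when all four share one direction and exceed the speed threshold. The exact sector boundaries for A0 to A7 are not given in the text; equal 45-degree sectors are assumed here, and the function names are illustrative:

```python
import math

def direction(vx, vy, sectors=8):
    """Classify a velocity vector into one of 8 directions by angle
    (equal 45-degree sectors assumed, sector 0 along +X)."""
    ang = math.atan2(vy, vx) % (2 * math.pi)
    return int((ang + math.pi / sectors) // (2 * math.pi / sectors)) % sectors

def swing_trigger(points, th_speed):
    """From five consecutive positions, form four velocity vectors;
    fire when all share one direction and all exceed th_speed
    (cf. steps S609-S621)."""
    vs = [(x1 - x0, y1 - y0)
          for (x0, y0), (x1, y1) in zip(points, points[1:])]
    dirs = {direction(vx, vy) for vx, vy in vs}
    fast = all(math.hypot(vx, vy) > th_speed for vx, vy in vs)
    return len(dirs) == 1 and fast
```

The same direction-plus-magnitude test, with direction pinned to A0 or A1 and different thresholds, underlies the special trigger of FIGS. 26 and 27.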
- step S653 the MCU 23 proceeds to step S683 when the q-th flag is on, and proceeds to step S655 when it is off.
- step S655 the MCU 23 refers to the third timer and proceeds to step S657 when the third predetermined time has elapsed, otherwise proceeds to step S661.
- step S657 the MCU 23 resets the third timer.
- step S659 the MCU 23 turns off the first to eighth flags, substitutes 0 for the variable Q4, and proceeds to step S715 in FIG. 30.
- step S661 the MCU 23 proceeds to step S665 if the area C [0] is larger than 0, that is, if the retroreflective sheet 4C is detected, otherwise proceeds to step S663.
- step S663 the MCU 23 substitutes 0 for the variable Q4, and proceeds to step S715 in FIG. 30.
- step S665 the MCU 23 increments the variable Q4 by one.
- step S667 the MCU 23 proceeds to step S669 if the value of the variable Q4 is equal to 3, that is, if the retroreflective sheet 4C is detected three times in succession, and otherwise proceeds to step S715 in FIG. 30.
- step S669 the MCU 23 calculates velocity vectors V0 and V1 based on the XY coordinates (Xr, Yr) of the current and past two images IM0 to IM2 of the retroreflective sheet 4C.
- step S671 the MCU 23 classifies each of the velocity vectors V0 and V1 into any one of the directions A0 to A7.
- step S673 the MCU 23 proceeds to step S675 when all the velocity vectors V0 and V1 are classified in the same direction SD, and proceeds to step S715 in FIG. 30 otherwise.
- step S675 the MCU 23 turns on the qth flag.
- step S677 the MCU 23 substitutes 0 for the variable Q4.
- step S679 the MCU 23 proceeds to step S681 if the value of the variable q is 1, and otherwise proceeds to step S715 in FIG. 30.
- step S681 the MCU 23 starts the third timer and proceeds to step S715 in FIG. 30.
- step S701 the MCU 23 proceeds to step S717 if the ninth flag is on, and proceeds to step S703 if it is off.
- step S703 the MCU 23 calculates the difference ΔX between the maximum coordinate X1 and the minimum coordinate X0 among the X coordinates Xr of the nine images of the retroreflective sheet 4C for which the first to ninth flags were turned on, and the difference ΔY between the maximum coordinate Y1 and the minimum coordinate Y0 among the Y coordinates Yr of the nine images (see FIG. 11A).
- step S705 the MCU 23 substitutes ΔX + ΔY for the variable s.
- step S707 the MCU 23 proceeds to step S709 if the value of the variable “s” exceeds a certain value, otherwise proceeds to step S713.
- step S713 the MCU 23 turns off the first to ninth flags, substitutes 0 for each of the variables Q4 and Q5, and resets the third timer.
- step S709 the MCU 23 turns on the tenth flag.
- step S711 the fourth timer is started, and the process proceeds to step S715.
- step S715 the MCU 23 sets the trigger flag to standby and returns.
- step S717 the MCU 23 proceeds to step S719 if the tenth flag is on, and proceeds to step S739 if it is off.
- step S719 the MCU 23 compares the area C [0] with the threshold value Tha1.
- when the area C [0] is larger than 0, that is, when the retroreflective sheet 4C is photographed, the MCU 23 proceeds to step S721, and otherwise proceeds to step S742.
- step S742 the MCU 23 substitutes 0 for the variable Q5.
- step S721 the MCU 23 increments the value of the variable Q5 by one.
- step S723 the MCU 23 proceeds to step S725 if the value of the variable Q5 is equal to 3, that is, if the retroreflective sheet 4C is detected three times in succession, otherwise proceeds to step S715.
- step S725 the MCU 23 calculates velocity vectors V0 and V1 based on the XY coordinates (Xr, Yr) of the current and past two images IM0 to IM2 of the retroreflective sheet 4C.
- step S727 the MCU 23 classifies each of the velocity vectors V0 and V1 into any one of the directions A0 to A7.
- step S729 the MCU 23 proceeds to step S731 if all the velocity vectors V0 and V1 are classified in the same direction A0, otherwise proceeds to step S715.
- step S731 the MCU 23 compares the magnitudes of the velocity vectors V0 and V1 with the threshold value Thv4. In step S733, if the magnitudes of all velocity vectors V0 and V1 are larger than the threshold value Thv4, the process proceeds to step S735, and otherwise, the process proceeds to step S715. In step S735, the MCU 23 sets the trigger flag to special (occurrence of a special trigger). In step S737, the MCU 23 turns off the first to tenth flags, substitutes 0 for each of the variables Q4 and Q5, resets the third and fourth timers, and returns.
- step S739 the MCU 23 refers to the fourth timer and proceeds to step S741 when the fourth predetermined time has elapsed, otherwise proceeds to step S715.
- step S741 the MCU 23 turns off the first to ninth flags, substitutes 0 for each of the variables Q4 and Q5, resets the third and fourth timers, and proceeds to step S715.
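The spread measure of steps S703 to S707 sums the coordinate ranges of the recorded sheet positions; a large value of s = ΔX + ΔY means the sheet travelled widely across the frame. A sketch (list-of-points interface and function name are assumptions):

```python
def spread(points):
    """s = dX + dY, where dX and dY are the ranges of the X and Y
    coordinates over the recorded images (cf. steps S703-S705)."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))
```

Comparing s against a constant, as in step S707, cheaply distinguishes a wide sweep from a small jitter without computing distances.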
- FIG. 31 is a flowchart showing the trigger detection process (based on the crossbow 3C-N) in step S9 of FIG.
- when the value of the variable SN indicating the number of retroreflective sheets reflected in the difference image is 1, the MCU 23 determines that the retroreflective sheet 4G may have been photographed and proceeds to step S763; otherwise, the process proceeds to step S767.
- the MCU 23 substitutes 0 for each of variables Q6 and Q7 described later.
- the MCU 23 executes charge trigger detection processing and returns.
- step S767 if the value of the variable SN is 2, the MCU 23 determines that the retroreflective sheets 4E and 4F have been photographed, and proceeds to step S769. Otherwise, the MCU 23 proceeds to step S773.
- step S769 the MCU 23 executes a shield trigger detection process.
- step S771 the MCU 23 executes a switch trigger detection process and returns.
- step S773 the MCU 23 substitutes 0 for each of the variables Q6 and Q7.
- step S775 when the value of the variable SN is 3, the MCU 23 determines that the retroreflective sheets 4D, 4E, and 4F have been photographed, and proceeds to step S777. Otherwise, the MCU 23 proceeds to step S779.
- step S777 the MCU 23 executes shooting trigger detection processing and returns.
- step S779 the MCU 23 sets the trigger flag to standby and returns.
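The dispatch of FIG. 31 selects a trigger detector by the sheet count SN alone; a compact sketch (the string labels and function name are illustrative, not the firmware's actual representation):

```python
def crossbow_dispatch(sn):
    """Select the crossbow trigger detector from the number SN of
    sheets visible in the difference image (cf. FIG. 31):
    1 sheet -> charge, 2 -> shield/switch, 3 -> shooting,
    anything else -> standby."""
    return {1: "charge", 2: "shield_or_switch", 3: "shooting"}.get(sn, "standby")
```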
- FIG. 32 is a flowchart showing the flow of the charge trigger detection process in step S765 of FIG.
- step S801 the area C [0] of the retroreflective sheet is compared with the threshold value Tha2.
- step S803 if the area C [0] is larger than Tha2, the MCU 23 determines that the retroreflective sheet 4G has been photographed, and proceeds to step S805. Otherwise, the MCU 23 proceeds to step S807.
- step S805 the MCU 23 sets the trigger flag to “charge” (occurrence of a charge trigger) and returns.
- step S807 the MCU 23 sets the trigger flag to standby and returns.
- FIG. 33 is a flowchart showing the flow of the shield trigger detection process in step S769 of FIG.
- the MCU 23 determines that the retroreflective sheets 4E and 4F have been photographed, and calculates the slope T1 of the straight line connecting them based on their XY coordinates (Xr, Yr).
- the MCU 23 calculates and registers (stores) the midpoint of these XY coordinates (Xr, Yr).
- the MCU 23 proceeds to step S827 if the slope T1 is greater than a certain value, and returns otherwise.
- the MCU 23 sets the trigger flag to shield (generation of shield trigger), and proceeds to step S11 in FIG.
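The two-sheet shield test of FIG. 33 reduces to a slope and a midpoint from the two sheet centres. The concrete slope formula is not given in the text; the usual rise-over-run (absolute value, vertical lines treated as infinite slope) is assumed in this sketch, and the function name is illustrative:

```python
def two_sheet_shield(p_e, p_f, th_slope):
    """From the centres of sheets 4E and 4F, compute the slope T1 of
    the connecting line and their midpoint; the shield trigger fires
    when the slope exceeds a threshold (cf. steps in FIG. 33)."""
    (x0, y0), (x1, y1) = p_e, p_f
    mid = ((x0 + x1) / 2, (y0 + y1) / 2)
    slope = abs(y1 - y0) / abs(x1 - x0) if x1 != x0 else float("inf")
    return slope > th_slope, mid
```

The midpoint is stored regardless of the slope test, mirroring the registration in the second step of FIG. 33.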
- FIG. 34 is a flowchart showing the flow of the switch trigger detection process in step S771 of FIG.
- the MCU 23 determines that the retroreflective sheets 4E and 4F have been photographed, and calculates and registers (stores) the midpoint of their XY coordinates (Xr, Yr).
- the MCU 23 proceeds to step S873 if the predetermined flag is on, and proceeds to step S855 if it is off.
- step S855 the MCU 23 increments the value of the variable Q6 by one.
- step S857 the MCU 23 proceeds to step S859 if the value of the variable Q6 is 5, otherwise proceeds to step S871.
- step S859 the MCU 23 calculates four velocity vectors based on the midpoint obtained in step S851.
- step S861 the MCU 23 classifies each of the four velocity vectors into one of directions A0 to A7.
- step S863 the MCU 23 proceeds to step S865 when all the velocity vectors are classified in the direction A1, and proceeds to step S871 otherwise.
- step S865 the MCU 23 proceeds to step S867 if the magnitudes of all the velocity vectors are larger than the threshold value Thv5, otherwise proceeds to step S871.
- step S867 the MCU 23 turns on the predetermined flag.
- step S869 the MCU 23 starts the fifth timer and proceeds to step S871.
- step S871 the MCU 23 sets the trigger flag to standby and returns.
- step S873 the MCU 23 refers to the fifth timer and proceeds to step S891 when the fifth predetermined time has elapsed, otherwise proceeds to step S875.
- step S891 the MCU 23 substitutes 0 for each of the variables Q6 and Q7, turns off the predetermined flag, resets the fifth timer, and proceeds to step S871.
- step S875 the MCU 23 increments the value of the variable Q7 by one.
- step S877 the MCU 23 proceeds to step S879 if the value of the variable Q7 is 5, otherwise proceeds to step S871.
- step S879 the MCU 23 calculates four velocity vectors based on the midpoint obtained in step S851.
- step S881 the MCU 23 classifies each of the four velocity vectors into one of directions A0 to A7.
- step S883 the MCU 23 proceeds to step S885 when all the velocity vectors are classified in the direction A0, otherwise proceeds to step S871.
- step S885 the MCU 23 proceeds to step S887 if the magnitudes of all velocity vectors are larger than the threshold Thv6, otherwise proceeds to step S871.
- step S887 the MCU 23 sets the trigger flag to switch (generation of a switch trigger).
- step S889 the MCU 23 substitutes 0 for each of the variables Q6 and Q7, turns off the predetermined flag, resets the fifth timer, and returns.
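Steps S859–S865 (and likewise S879–S885) classify four successive velocity vectors, derived from the registered midpoints, into one of eight directions A0–A7 and require every magnitude to exceed a threshold. A sketch; the assignment of A0–A7 to 45° sectors (sector 0 centered on +X, counterclockwise) is an assumption, since the text only names the directions:

```python
import math

def classify_direction(vx, vy):
    """Map a velocity vector to one of eight 45-degree sectors 0..7
    (standing in for directions A0-A7); sector 0 is centered on +X.
    The sector layout is an assumption, not stated in the patent."""
    angle = math.atan2(vy, vx) % (2 * math.pi)
    return int(((angle + math.pi / 8) % (2 * math.pi)) // (math.pi / 4))

def all_in_direction(vectors, direction, thv):
    """True when every velocity vector is classified into the given
    direction (steps S861/S863) and every magnitude exceeds the
    threshold Thv (step S865)."""
    return all(
        classify_direction(vx, vy) == direction and math.hypot(vx, vy) > thv
        for vx, vy in vectors
    )
```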
- FIG. 35 is a flowchart showing the flow of the shooting trigger detection process in step S777 of FIG.
- the MCU 23 calculates and registers (stores) the midpoint coordinates of the retroreflective sheets 4E and 4F.
- the MCU 23 determines the area difference between each pair of the detected retroreflective sheets.
- step S915 the MCU 23 calculates an average value of the areas of the two retroreflective sheets having the smallest area difference.
- step S917 the MCU 23 calculates a difference between the average value calculated in step S915 and the area of the retroreflective sheet having the maximum area.
- step S919 if the difference calculated in step S917 is greater than a certain value, the MCU 23 determines that the retroreflective sheet having the largest area is the retroreflective sheet 4G and proceeds to step S921; otherwise, the process proceeds to step S923.
- step S921 the MCU 23 sets the trigger flag to charge and returns (generation of a charge trigger).
- step S923 the MCU 23 checks whether or not the retroreflective sheets 4E and 4F satisfy the shield trigger generation condition.
- step S925 the MCU 23 proceeds to step S927 if the shield trigger generation condition is satisfied, otherwise proceeds to step S929.
- step S927 the MCU 23 sets the trigger flag to shield and returns (generation of shield trigger).
- step S929 the MCU 23 proceeds to step S931 when the two retroreflective sheets detected last time are the retroreflective sheets 4E and 4F, and proceeds to step S933 otherwise.
- step S931 the MCU 23 sets the trigger flag to shooting and returns (generation of shooting trigger).
- step S933 the MCU 23 sets the trigger flag to standby and returns.
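Steps S915–S919 single out sheet 4G by area: the two sheets whose areas are closest are averaged, and if the largest area stands clear of that average, the largest sheet is judged to be 4G. A sketch with an illustrative separation threshold:

```python
def is_sheet_4g_present(areas, diff_threshold=30):
    """Shooting-trigger sub-check (steps S915-S919): average the two
    areas with the smallest mutual difference, then report whether the
    maximum area exceeds that average by more than a certain value,
    in which case the largest sheet is judged to be sheet 4G.
    diff_threshold is illustrative, not taken from the patent."""
    a = sorted(areas)               # three detected sheet areas
    # after sorting, the two closest areas are adjacent
    if a[1] - a[0] <= a[2] - a[1]:
        avg = (a[0] + a[1]) / 2.0   # step S915
    else:
        avg = (a[1] + a[2]) / 2.0
    return a[2] - avg > diff_threshold  # steps S917-S919
```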
- the camera unit 1-N gives the terminal 5-N the state information of the operation article 3-N, that is, of the retroreflective sheet 4, such as the XY coordinates (Xr, Yr) and area information. Accordingly, the terminal 5-N can execute processing based on the state information of the operation article 3-N. For example, the terminal 5-N displays a cursor at a position on the monitor 7 corresponding to the XY coordinates (Xr, Yr) of the operation article 3-N.
- the camera unit 1-N gives the state information of the operation article 3-N, that is, of the retroreflective sheet 4, to the terminal 5-N as a command. Accordingly, the terminal 5-N can execute processing based on the command corresponding to the state information of the operation article 3-N.
- commands from the camera unit 1 to the terminal 5 include a swing trigger (motion information), a shield trigger (area information for the sword 3A-N; arrangement information for the crossbow 3C-N), a special trigger (area information and motion information for the sword 3A-N; motion information for the mace 3B-N), a charge trigger (area information), a switch trigger (motion information), and a shooting trigger (number information).
- the terminal 5-N displays the sword locus on the monitor 7 in response to the swing trigger.
- the terminal 5-N displays a shield image on the monitor 7 in response to the shield trigger.
- the terminal 5-N displays the first predetermined effect on the monitor 7 in response to a special trigger by the sword 3A-N.
- the terminal 5-N displays the second predetermined effect on the monitor 7 in response to the special trigger by the mace 3B-N.
- the terminal 5-N charges the game character's energy in response to the charge trigger.
- the terminal 5-N switches the arrow firing mode (sequential fire, single fire) according to the switch trigger.
- the terminal 5-N fires an arrow on the monitor 7 in response to the shooting trigger.
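Taken together, the terminal-side responses above form a simple dispatch on the received command. A hypothetical sketch; the handler strings merely paraphrase the behaviors listed and are not names from the patent:

```python
# Hypothetical terminal-side dispatch for commands received from the
# camera unit 1-N; each action paraphrases a response listed above.
TRIGGER_ACTIONS = {
    "swing": "display sword locus",
    "shield": "display shield image",
    "special": "display predetermined effect",
    "charge": "charge character energy",
    "switch": "toggle arrow firing mode (sequential/single fire)",
    "shooting": "fire arrow",
}

def handle_command(trigger):
    """Return the action for a trigger command, or 'standby' for
    any unrecognized trigger."""
    return TRIGGER_ACTIONS.get(trigger, "standby")
```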
- the state information can be obtained.
- in the processing of FIGS. , the processing for determining secondary candidate regions can be omitted, and the processing load can be reduced.
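The candidate regions referred to here are determined, per the claims, by orthogonally projecting pixel values onto the horizontal and vertical axes. A sketch of the primary candidate-area determination for a binary image; bounding the region by the nonzero spans of the projections is an assumed reading of how the arrays are used:

```python
def primary_candidate_region(image):
    """Determine a primary candidate area by orthogonal projection:
    sum pixel values column-wise onto the horizontal axis (first
    array) and row-wise onto the vertical axis (second array), then
    take the bounding span of nonzero entries. `image` is a list of
    rows of 0/1 pixel values. Returns (x0, x1, y0, y1), inclusive,
    or None when no subject pixel is present."""
    col_sums = [sum(col) for col in zip(*image)]   # first array
    row_sums = [sum(row) for row in image]         # second array
    xs = [i for i, v in enumerate(col_sums) if v > 0]
    ys = [i for i, v in enumerate(row_sums) if v > 0]
    if not xs or not ys:
        return None
    return min(xs), max(xs), min(ys), max(ys)
```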
- the present invention can be widely used as a user interface.
- the present invention can be used for a video game or the like that receives human body movements.
- the present invention is not limited to the above-described embodiment, and can be implemented in various modes without departing from the gist thereof.
- the following modifications are possible.
- the camera unit 1 is applied to the online game, but it can also be used for an offline game, that is, a stand-alone type game.
- the online game is provided via the host computer 31.
- the lid 49 opens when the trigger 51 is pulled.
- the lid 49 can be opened when the trigger 51 is not pulled, and the lid 49 can be closed when the trigger 51 is pulled.
- switching means can be provided for switching the states of the first retroreflective sheet and the second retroreflective sheet such that the exposure and non-exposure states are reversed between the first retroreflective sheet and the second retroreflective sheet.
- in this case, the presence/absence of input by the operation article and/or the form of input can be detected based on the respective captured images. Further, the presence/absence of input by the operation article and/or the form of input can also be detected based on the switching of exposure and non-exposure between the first retroreflective sheet and the second retroreflective sheet.
- the retroreflective sheet 4 was detected by generating a differential image DI using a stroboscope (flashing of the infrared light emitting diode 11).
- however, the infrared light emitting diode 11 need not be blinked, and the infrared light emitting diode 11 need not even be provided.
- Irradiation light is not limited to infrared light.
- the retroreflective sheet 4 is not an essential element in the present invention, and it is sufficient that the operation article 3-N can be detected by analyzing the captured image.
- the imaging element is not limited to the image sensor described; other imaging elements such as a CCD can be used.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
Description
B←B+P(X,Y)*X
C←C+P(X,Y)
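The two update rules above accumulate, per pixel, the X-weighted pixel value into B and the plain pixel value into C; dividing B by C then yields the X-coordinate of the center of gravity. Reading B/C as the centroid is an inference from these fragments, not stated in the excerpt. A sketch:

```python
def weighted_center_x(image):
    """Apply the description's update rules over every pixel P(X, Y):
        B <- B + P(X, Y) * X
        C <- C + P(X, Y)
    and return B / C, the X center of gravity (None if C is 0).
    `image` is a list of rows of pixel values."""
    b = 0
    c = 0
    for y, row in enumerate(image):
        for x, p in enumerate(row):
            b += p * x   # B <- B + P(X, Y) * X
            c += p       # C <- C + P(X, Y)
    return b / c if c else None
```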
Claims (19)
- 1. An imaging device provided separately from a computer, comprising: imaging means for imaging an operation article operated by a user; detection means for analyzing a captured image given from the imaging means, detecting input by the operation article, and generating input information; and transmission means for transmitting the input information to the computer.
- 2. The imaging device according to claim 1, wherein the detection means analyzes the captured image to calculate state information of the operation article, and gives the state information to the transmission means as the input information.
- 3. The imaging device according to claim 2, wherein the state information of the operation article is any one of position information, speed information, movement direction information, movement distance information, velocity vector information, acceleration information, movement locus information, area information, inclination information, motion information, and form information, or a combination of two or more thereof.
- 4. The imaging device according to claim 2 or 3, wherein the state information of the operation article is state information of one or more markers attached to the operation article.
- 5. The imaging device according to any one of claims 2 to 4, wherein the transmission means transmits the state information to the computer as a command.
- 6. The imaging device according to any one of claims 1 to 5, further comprising a stroboscope that irradiates the operation article with light at a predetermined cycle, wherein the imaging means photographs the operation article both while the stroboscope is lit and while it is unlit to acquire a lit-state image and an unlit-state image, and includes differential signal generation means for generating a differential signal between the lit-state image and the unlit-state image.
- 7. The imaging device according to any one of claims 1 to 6, wherein the operation article includes retroreflection means for retroreflecting received light.
- 8. The imaging device according to claim 4, wherein the detection means includes: first candidate area determination means for determining, from the captured image obtained by the imaging means, a primary candidate area that contains the image of the marker and consists of fewer pixels than the captured image; first state calculation means for scanning the primary candidate area to calculate the state information of the marker when the number of markers is one or two; second candidate area determination means for determining, from the primary candidate area, a secondary candidate area that contains the image of the marker and consists of fewer pixels than the primary candidate area when the number of markers is at least three; and second state calculation means for scanning the secondary candidate area to calculate the state information of the marker when the number of markers is at least three.
- 9. An online game system comprising a plurality of imaging devices each connected to a corresponding terminal and provided separately from the corresponding terminal, wherein each imaging device includes: imaging means for imaging an operation article operated by a user; detection means for analyzing a captured image given from the imaging means, detecting input by the operation article, and generating input information; and transmission means for transmitting the input information to the terminal, and wherein a plurality of the terminals are connected to one another via a network and exchange the input information with one another to advance a game.
- 10. An operation article that is a subject of an imaging device and is held and moved by a user, comprising: a plurality of reflection means; and switching means for switching at least one of the reflection means between an exposed state and a non-exposed state, wherein at least one other of the reflection means maintains an exposed state.
- 11. An operation article that is a subject of an imaging device and is held and moved by a user, comprising: first reflection means; second reflection means; and switching means for switching the states of the first reflection means and the second reflection means such that the exposed and non-exposed states are reversed between the first reflection means and the second reflection means.
- 12. The operation article according to claim 10 or 11, wherein the reflection means retroreflects received light.
- 13. An input method executed by an imaging device provided separately from a computer, comprising the steps of: imaging an operation article operated by a user; analyzing a captured image given from the imaging step, detecting input by the operation article, and generating input information; and transmitting the input information to the computer.
- 14. A computer-readable recording medium recording a computer program that causes a computer mounted on an imaging device to execute the input method according to claim 13.
- 15. An image analysis device comprising: imaging means for imaging one or more subjects; first candidate area determination means for determining, from an image obtained by the imaging means, a primary candidate area that contains the image of the subject and consists of fewer pixels than the image; first state calculation means for scanning the primary candidate area to calculate state information of the subject when the number of subjects is one or two; second candidate area determination means for determining, from the primary candidate area, a secondary candidate area that contains the image of the subject and consists of fewer pixels than the primary candidate area when the number of subjects is at least three; and second state calculation means for scanning the secondary candidate area to calculate state information of the subject when the number of subjects is at least three.
- 16. The image analysis device according to claim 15, wherein the first candidate area determination means includes: first array means for generating a first array that is an orthogonal projection of the pixel values of the image onto a horizontal axis; second array means for generating a second array that is an orthogonal projection of the pixel values of the image onto a vertical axis; and means for determining the primary candidate area based on the first array and the second array, and wherein the second candidate area determination means includes: third array means for generating a third array that is an orthogonal projection of the pixel values of the primary candidate area onto the horizontal axis; fourth array means for generating a fourth array that is an orthogonal projection of the pixel values of the primary candidate area onto the vertical axis; and means for determining the secondary candidate area based on the third array and the fourth array.
- 17. The image analysis device according to claim 15 or 16, further comprising a stroboscope that irradiates the operation article with light at a predetermined cycle, wherein the imaging means photographs the operation article both while the stroboscope is lit and while it is unlit to acquire a lit-state image and an unlit-state image, and includes differential signal generation means for generating a differential signal between the lit-state image and the unlit-state image, and wherein the first candidate area determination means, the first state calculation means, the second candidate area determination means, and the second state calculation means execute processing based on the differential signal.
- 18. An image analysis method based on an image obtained by an imaging device that images one or more subjects, comprising the steps of: determining, from the image, a primary candidate area that contains the image of the subject and consists of fewer pixels than the image; scanning the primary candidate area to calculate state information of the subject when the number of subjects is one or two; determining, from the primary candidate area, a secondary candidate area that contains the image of the subject and consists of fewer pixels than the primary candidate area when the number of subjects is at least three; and scanning the secondary candidate area to calculate state information of the subject when the number of subjects is at least three.
- 19. A computer-readable recording medium recording a computer program that causes a computer to execute the image analysis method according to claim 18.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/863,764 US20110183751A1 (en) | 2008-01-22 | 2009-01-22 | Imaging device, online game system, operation object, input method, image analysis device, image analysis method, and recording medium |
CN2009801100466A CN102124423A (zh) | 2008-01-22 | 2009-01-22 | 摄像装置、在线游戏系统、操作物、输入方法、图像解析装置、图像解析方法以及记录媒体 |
JP2009550476A JPWO2009093461A1 (ja) | 2008-01-22 | 2009-01-22 | 撮像装置、オンラインゲームシステム、操作物、入力方法、画像解析装置、画像解析方法、及び記録媒体 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008011320 | 2008-01-22 | ||
JP2008-011320 | 2008-01-22 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009093461A1 true WO2009093461A1 (ja) | 2009-07-30 |
Family
ID=40900970
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/000245 WO2009093461A1 (ja) | 2008-01-22 | 2009-01-22 | 撮像装置、オンラインゲームシステム、操作物、入力方法、画像解析装置、画像解析方法、及び記録媒体 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20110183751A1 (ja) |
JP (1) | JPWO2009093461A1 (ja) |
CN (1) | CN102124423A (ja) |
WO (1) | WO2009093461A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012148844A2 (en) | 2011-04-25 | 2012-11-01 | Microsoft Corporation | Laser diode modes |
CN102855030A (zh) * | 2011-06-30 | 2013-01-02 | 精工爱普生株式会社 | 指示部件、光学式位置检测装置以及带输入功能的显示系统 |
JP2016508646A (ja) * | 2013-02-22 | 2016-03-22 | ユニバーサル シティ スタジオズ リミテッド ライアビリティ カンパニー | 受動的ワンドを追跡しそして検出されたワンド経路に基づき効果を作用させるシステム及び方法 |
WO2020209088A1 (ja) * | 2019-04-11 | 2020-10-15 | 株式会社ソニー・インタラクティブエンタテインメント | 複数のマーカを備えたデバイス |
US11262841B2 (en) | 2012-11-01 | 2022-03-01 | Eyecam Llc | Wireless wrist computing and control device and method for 3D imaging, mapping, networking and interfacing |
US11314399B2 (en) | 2017-10-21 | 2022-04-26 | Eyecam, Inc. | Adaptive graphic user interfacing system |
US11794095B2 (en) | 2019-04-24 | 2023-10-24 | Sony Interactive Entertainment Inc. | Information processing apparatus and device information derivation method |
US12017137B2 (en) | 2019-04-15 | 2024-06-25 | Sony Interactive Entertainment Inc. | Device including plurality of markers |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106984041B (zh) * | 2011-02-11 | 2021-07-06 | 漳州市舟锋电子科技有限公司 | 一种人机互动控制系统 |
CN103285585A (zh) * | 2012-09-24 | 2013-09-11 | 天津思博科科技发展有限公司 | 一种基于互联网架构的体感击剑互动装置 |
CN103902018B (zh) * | 2012-12-24 | 2018-08-10 | 联想(北京)有限公司 | 一种信息处理方法、装置及一种电子设备 |
US9616350B2 (en) | 2014-05-21 | 2017-04-11 | Universal City Studios Llc | Enhanced interactivity in an amusement park environment using passive tracking elements |
US10207193B2 (en) | 2014-05-21 | 2019-02-19 | Universal City Studios Llc | Optical tracking system for automation of amusement park elements |
US9600999B2 (en) | 2014-05-21 | 2017-03-21 | Universal City Studios Llc | Amusement park element tracking system |
US10061058B2 (en) | 2014-05-21 | 2018-08-28 | Universal City Studios Llc | Tracking system and method for use in surveying amusement park equipment |
US9429398B2 (en) | 2014-05-21 | 2016-08-30 | Universal City Studios Llc | Optical tracking for controlling pyrotechnic show elements |
US9433870B2 (en) | 2014-05-21 | 2016-09-06 | Universal City Studios Llc | Ride vehicle tracking and control system using passive tracking elements |
US10025990B2 (en) | 2014-05-21 | 2018-07-17 | Universal City Studios Llc | System and method for tracking vehicles in parking structures and intersections |
NL2014037B1 (en) | 2014-12-22 | 2016-10-12 | Meyn Food Proc Technology Bv | Processing line and method for inspecting a poultry carcass and/or a viscera package taken out from the poultry carcass. |
CN105678817B (zh) * | 2016-01-05 | 2017-05-31 | 北京度量科技有限公司 | 一种高速提取圆形图像中心点的方法 |
US10249090B2 (en) * | 2016-06-09 | 2019-04-02 | Microsoft Technology Licensing, Llc | Robust optical disambiguation and tracking of two or more hand-held controllers with passive optical and inertial tracking |
CN106730811B (zh) * | 2016-12-07 | 2018-09-07 | 腾讯科技(深圳)有限公司 | 一种vr场景下的道具运动处理方法和装置 |
CN109806588B (zh) * | 2019-01-31 | 2020-12-01 | 腾讯科技(深圳)有限公司 | 属性值的恢复方法和装置、存储介质、电子装置 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005003945A1 (ja) * | 2003-07-02 | 2005-01-13 | Ssd Company Limited | 情報処理装置、情報処理システム、操作物、情報処理方法、情報処理プログラム、及び、ゲームシステム |
JP2005032245A (ja) * | 2003-07-11 | 2005-02-03 | Hewlett-Packard Development Co Lp | 画像に基づいたビデオゲームの制御 |
JP2006277076A (ja) * | 2005-03-28 | 2006-10-12 | Fuji Electric Device Technology Co Ltd | 画像インターフェイス装置 |
2009
- 2009-01-22 WO PCT/JP2009/000245 patent/WO2009093461A1/ja active Application Filing
- 2009-01-22 JP JP2009550476A patent/JPWO2009093461A1/ja active Pending
- 2009-01-22 CN CN2009801100466A patent/CN102124423A/zh active Pending
- 2009-01-22 US US12/863,764 patent/US20110183751A1/en not_active Abandoned
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005003945A1 (ja) * | 2003-07-02 | 2005-01-13 | Ssd Company Limited | 情報処理装置、情報処理システム、操作物、情報処理方法、情報処理プログラム、及び、ゲームシステム |
JP2005032245A (ja) * | 2003-07-11 | 2005-02-03 | Hewlett-Packard Development Co Lp | 画像に基づいたビデオゲームの制御 |
JP2006277076A (ja) * | 2005-03-28 | 2006-10-12 | Fuji Electric Device Technology Co Ltd | 画像インターフェイス装置 |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20140024876A (ko) * | 2011-04-25 | 2014-03-03 | 마이크로소프트 코포레이션 | 레이저 다이오드 모드 |
EP2702464A2 (en) * | 2011-04-25 | 2014-03-05 | Microsoft Corporation | Laser diode modes |
JP2014513357A (ja) * | 2011-04-25 | 2014-05-29 | マイクロソフト コーポレーション | レーザー・ダイオード・モード |
EP2702464A4 (en) * | 2011-04-25 | 2014-09-17 | Microsoft Corp | LASERDIODENMODI |
WO2012148844A2 (en) | 2011-04-25 | 2012-11-01 | Microsoft Corporation | Laser diode modes |
KR102007445B1 (ko) | 2011-04-25 | 2019-08-05 | 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 | 레이저 다이오드 모드 |
CN102855030A (zh) * | 2011-06-30 | 2013-01-02 | 精工爱普生株式会社 | 指示部件、光学式位置检测装置以及带输入功能的显示系统 |
US11262841B2 (en) | 2012-11-01 | 2022-03-01 | Eyecam Llc | Wireless wrist computing and control device and method for 3D imaging, mapping, networking and interfacing |
JP2016508646A (ja) * | 2013-02-22 | 2016-03-22 | ユニバーサル シティ スタジオズ リミテッド ライアビリティ カンパニー | 受動的ワンドを追跡しそして検出されたワンド経路に基づき効果を作用させるシステム及び方法 |
US11314399B2 (en) | 2017-10-21 | 2022-04-26 | Eyecam, Inc. | Adaptive graphic user interfacing system |
JP2020173670A (ja) * | 2019-04-11 | 2020-10-22 | 株式会社ソニー・インタラクティブエンタテインメント | 複数のマーカを備えたデバイス |
WO2020209088A1 (ja) * | 2019-04-11 | 2020-10-15 | 株式会社ソニー・インタラクティブエンタテインメント | 複数のマーカを備えたデバイス |
JP7283958B2 (ja) | 2019-04-11 | 2023-05-30 | 株式会社ソニー・インタラクティブエンタテインメント | 複数のマーカを備えたデバイス |
US12017137B2 (en) | 2019-04-15 | 2024-06-25 | Sony Interactive Entertainment Inc. | Device including plurality of markers |
US11794095B2 (en) | 2019-04-24 | 2023-10-24 | Sony Interactive Entertainment Inc. | Information processing apparatus and device information derivation method |
Also Published As
Publication number | Publication date |
---|---|
JPWO2009093461A1 (ja) | 2011-05-26 |
US20110183751A1 (en) | 2011-07-28 |
CN102124423A (zh) | 2011-07-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2009093461A1 (ja) | 撮像装置、オンラインゲームシステム、操作物、入力方法、画像解析装置、画像解析方法、及び記録媒体 | |
EP2264583B1 (en) | Electronic device with coordinate detecting means. | |
JP7383135B2 (ja) | 仮想対象の制御方法及びその装置、機器、並びにコンピュータープログラム | |
CN112044074B (zh) | 对非玩家角色寻路的方法、装置、存储介质及计算机设备 | |
JP4115809B2 (ja) | ゲームシステム及びゲームプログラム | |
US7828660B2 (en) | Storage medium having game program stored thereon and game apparatus | |
JP5622447B2 (ja) | 情報処理プログラム、情報処理装置、情報処理システム及び情報処理方法 | |
JP5130504B2 (ja) | 情報処理装置、情報処理方法、プログラム及び記憶媒体 | |
CN109754471A (zh) | 增强现实中的图像处理方法及装置、存储介质、电子设备 | |
CN113449696B (zh) | 一种姿态估计方法、装置、计算机设备以及存储介质 | |
JP2006122519A (ja) | ゲームプログラム | |
US10867454B2 (en) | Information processing apparatus and method to control operations in virtual space | |
JPH0786936B2 (ja) | 画像処理装置 | |
JP2006314632A (ja) | ゲームプログラムおよびゲーム装置 | |
CN112717407A (zh) | 虚拟对象的控制方法、装置、终端及存储介质 | |
US12005360B2 (en) | Virtual object control method and apparatus, computer device, and storage medium | |
US20210117070A1 (en) | Computer-readable recording medium, computer apparatus, and method of controlling | |
JP2007296191A (ja) | ゲームプログラムおよびゲーム装置 | |
JP3138145U (ja) | 脳トレーニング装置 | |
JP6924799B2 (ja) | プログラム、画像処理方法及び画像処理システム | |
US20220297005A1 (en) | Storage medium, game system, game apparatus and game control method | |
JP2019000138A (ja) | ゲームプログラム、ゲーム装置およびサーバ装置 | |
JP2024522484A (ja) | 仮想シーンに基づくグラフィック表示方法、装置、機器及びコンピュータプログラム | |
JP2008059375A (ja) | 情報処理方法、情報処理装置 | |
JP3136549U (ja) | スロットマシーン |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980110046.6 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09704566 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2009550476 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12863764 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 09704566 Country of ref document: EP Kind code of ref document: A1 |