WO2016079901A1 - Optical Device - Google Patents

Optical device

Info

Publication number: WO2016079901A1
Authority: WIPO (PCT)
Prior art keywords: light, imaging, card, medium, optical device
Application number: PCT/JP2015/001876
Other languages: English (en)
Inventor: Kenji Yoshida
Original Assignee: Gridmark Inc.
Application filed by Gridmark Inc.
Priority to JP2017545010A (JP2017535902A)
Publication of WO2016079901A1


Classifications

    • G — PHYSICS
        • G06 — COMPUTING; CALCULATING OR COUNTING
            • G06F — ELECTRIC DIGITAL DATA PROCESSING
                • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
                    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
                        • G06F 3/03 — Arrangements for converting the position or the displacement of a member into a coded form
                            • G06F 3/041 — Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
                                • G06F 3/0412 — Digitisers structurally integrated in a display
                                • G06F 3/042 — Digitisers characterised by opto-electronic transducing means
                                    • G06F 3/0425 — Digitisers using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
            • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
                • G06V 10/00 — Arrangements for image or video recognition or understanding
                    • G06V 10/10 — Image acquisition
                        • G06V 10/12 — Details of acquisition arrangements; Constructional details thereof
                            • G06V 10/14 — Optical characteristics of the device performing the acquisition or on the illumination arrangements
                                • G06V 10/141 — Control of illumination
                                • G06V 10/143 — Sensing or illuminating at different wavelengths
                                • G06V 10/145 — Illumination specially adapted for pattern recognition, e.g. using gratings
                        • G06V 10/16 — Image acquisition using multiple overlapping images; Image stitching
                • G06V 30/00 — Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
                    • G06V 30/10 — Character recognition
                        • G06V 30/22 — Character recognition characterised by the type of writing
                            • G06V 30/224 — Character recognition of printed characters having additional code marks or containing code marks
        • G07 — CHECKING-DEVICES
            • G07F — COIN-FREED OR LIKE APPARATUS
                • G07F 17/00 — Coin-freed apparatus for hiring articles; Coin-freed facilities or services
                    • G07F 17/32 — Coin-freed apparatus for games, toys, sports, or amusements
                        • G07F 17/3202 — Hardware aspects of a gaming system, e.g. components, construction, architecture thereof
                            • G07F 17/3216 — Construction aspects of a gaming system, e.g. housing, seats, ergonomic aspects
                                • G07F 17/322 — Casino tables, e.g. tables having integrated screens, chip detection means
                        • G07F 17/3286 — Type of games
                            • G07F 17/3293 — Card games, e.g. poker, canasta, black jack

Definitions

  • the present invention relates to an optical device and a reading device.
  • a card game device for playing cards by placing them on a stage of the game machine is known (refer to PTL 1).
  • a game device is also proposed in which information concerning the game such as the number on a card that is placed by a player or the result of the game, or the like is displayed as an image on a stage by using a projector (refer to PTL 2).
  • the game machine described in PTL 2 has a problem in that it is difficult to install in a game center or the like, due to its large size, because a pole is provided on the game machine cabinet equipped with the stage for placing cards, and a display device such as the projector is installed on the pole.
  • the inventors of the present invention invented an information output device that enables information of a medium placed on a stage surface to be read reliably and efficiently, while also providing high dramatic effect and high security, and filed a patent application (refer to PTL 3).
  • the present invention has been made in view of the situations described above, and is intended to enable a board game having a stage surface of an arbitrary size and a table top display to be realized with ease.
  • the optical device in one aspect of the present invention comprises: a flat plate that transmits to its front surface at least some of the first light among the light emitted from its back surface side, and on whose front surface side a predetermined medium can be disposed; a light emitting part that emits the said light from the back surface side of the said flat plate; an imaging part that captures an image of the said medium through exposure to the said first light reflected by the said medium via the said flat plate; and a regulating part that regulates the direction of emission of the light emitted from the said light emitting part so that incidence onto the said imaging part of the second light reflected by the said flat plate, among the light emitted by the said light emitting part, will be prohibited.
  • the optical device can further comprise: a recognition part that recognizes the features of the said medium whose image has been captured by the said imaging part, or the information that has been attached to the said medium.
  • the said recognition part can recognize, as the features of the said medium, at least either of the shape and size of the said medium.
  • on the surface of the said medium, an image, character, figure, or code, or a combination of these is formed, and the said recognition part can recognize the said image, said character, said figure, said code, or the said combination of these.
  • the said medium has at least some parts that contain a portion where the reflectivity of the first light is different from that of the other portions, and the said recognition part can recognize the said portion where the reflectivity of the first light is different from that of the other portions.
  • on the said portion where the reflectivity of the first light is different from that of the other portions, the said image, said character, said figure, said code, or the said combination of these can be formed.
  • the said portion where the reflectivity of the first light is different from that of the other portions in the said medium is a dot that reflects the said first light; on the said surface of the said medium that does not reflect the first light, a dot pattern consisting of a plurality of the said dots is formed, and the said recognition part can recognize the said dot pattern.
  • the said portion where the reflectivity of the first light is different from that of the other portions in the said medium is a dot that does not reflect the said first light; on the said surface of the said medium that reflects the first light, a dot pattern consisting of a plurality of the said dots is formed, and the said recognition part can recognize the said dot pattern.
  • each imaging area whose image is captured by each of the said plurality of imaging parts can have an overlapping area with the imaging area whose image is captured by at least one of the said other imaging parts.
  • the said flat plate is irradiated by a plurality of the said light emitting parts, and the said regulating part further suppresses the difference in the amount of light on the said flat plate as compared with the case without the said regulating part, and at the same time can regulate the direction of emission of the light emitted from at least some of the said plurality of light emitting parts so that the degree of change in the amount of light is suppressed in areas where there is a difference in the amount of light.
  • the optical device can further comprise a controlling part which causes, during light emission of the said light emitting part that irradiates the imaging area of the said imaging part, exposure of the said imaging part.
  • the optical device can further comprise a controlling part which, for a predetermined time within the period of exposure by means of the said imaging part, executes control that makes the said light emitting part emit light that irradiates the imaging area of the said imaging part.
  • the optical device can further comprise a recognition part that recognizes the position of the said medium by means of a coordinate system of which datum is the said flat plate, by using at least one of the images including at least some parts of the said medium as a subject, among the images captured by the said plurality of imaging parts.
  • the said flat plate has the functions of a translucent screen, and the optical device can further comprise a projector which can project an image onto the said translucent screen by means of visible light.
  • the light emitted from the said light emitting part can be invisible light.
  • the light emitted from the said light emitting part is white visible light.
  • the optical device can further comprise a controlling part which, for a predetermined time, executes control that makes the said light emitting part emit light, and a recognition part that recognizes the said medium disposed on the said flat plate by using an image captured by the said imaging part.
  • the said controlling part can further execute control that synchronizes the timing of exposure of the said imaging part with the timing of irradiation by the said light emitting part, after clearing the buffer that holds image signals that have undergone photoelectric conversion in the said imaging part.
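The control sequence in the item above (clear the buffer of photoelectrically converted signals, then synchronize exposure with irradiation) can be sketched as a small simulation. This is a hypothetical Python sketch; the class, method, and event names are illustrative assumptions, not anything specified in the document:

```python
# Hypothetical sketch of the described control: the controlling part first
# clears the sensor's buffer of stale, photoelectrically converted signals,
# then drives light emission and sensor exposure over the same interval.

class SensorUnit:
    def __init__(self):
        self.buffer = ["stale frame"]  # residual signals held before sync
        self.events = []               # recorded control sequence

    def clear_buffer(self):
        self.buffer.clear()
        self.events.append("clear")

    def expose_while_emitting(self, duration_ms):
        # Emission and exposure start together, so the medium is imaged
        # only under the intended irradiation.
        self.events.append(("emit_on", 0))
        self.events.append(("expose", 0, duration_ms))
        self.events.append(("emit_off", duration_ms))
        self.buffer.append("frame under IR irradiation")

unit = SensorUnit()
unit.clear_buffer()             # discard signals accumulated before sync
unit.expose_while_emitting(10)  # expose for 10 ms while the LEDs are lit
```

The point of the ordering is that any charge accumulated before synchronization never contaminates the synchronized frame.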
  • the present invention enables a board game having a stage surface of an arbitrary size to be realized with ease.
  • Fig. 1 is a perspective view showing an example of the external configuration of the optical device pertaining to one embodiment of the present invention.
  • Fig. 2 is a cross-sectional view showing an outline of the internal structure of the optical device shown in Fig. 1.
  • Fig. 3 is a block diagram showing the constituent elements that process electrical signals in the optical device shown in Fig. 1.
  • Fig. 4 is an explanatory view showing the relationships among the light emitting part, imaging part, and regulating part in the optical device shown in Fig. 1.
  • Fig. 5 is an illustrative view showing an example of the entire analysis area for analyzing the positions and the like of cards, which consists of the analysis areas of 10 imaging parts in the optical device shown in Fig. 1, each imaging area being disposed so as to have an overlapping area.
  • Fig. 6 is an illustrative view showing the external configuration of the regulating part in the optical device shown in Fig. 1.
  • Fig. 7 is an explanatory view showing the significance of providing the regulating part shown in Fig. 6 partially with a light shielding part.
  • Fig. 8 shows specific examples of various types of cards used in the optical device shown in Fig. 1.
  • Fig. 9 shows enlarged views of the cross sections of cards on which a dot pattern is formed, which show specific examples of the combination of absorption and reflection of infrared rays.
  • Fig. 10 is an explanatory view showing an example of the dot pattern.
  • Fig. 11 is an enlarged view showing an example of information dots in the dot pattern.
  • Fig. 12 is an explanatory view showing arrangements of information dots.
  • Fig. 13 is an illustrative view of an example of information dots and the bit indication of the data defined in the dots, which is an illustrative view showing another embodiment.
  • Fig. 14 is an illustrative view of the examples of information dots and the bit indication of the data defined in the dots.
  • Fig. 15 is an illustrative view showing examples of deformation of the dot pattern.
  • Fig. 16 is an explanatory view that defines, in a dot pattern, the direction of the block by changing the way in which information dots are arranged.
  • Fig. 17 is an explanatory view that defines, in a dot pattern, the direction of the block by changing the way in which information dots are disposed.
  • Fig. 18 is an illustrative view showing specific examples of cards used in the optical device shown in Fig. 1.
  • Fig. 19 is a plan view of the state of a card being placed on the card arrangement panel as seen from top.
  • Fig. 20 is an illustrative view showing the image of cards placed on the card arrangement panel.
  • Fig. 21 is an explanatory view showing a method of judging whether there is a card or not.
  • Fig. 22 is an explanatory view showing a method of analyzing the code of a card.
  • Fig. 23 is an explanatory view showing a method of recognizing the position and angle of a card.
  • Fig. 24 is an explanatory view showing a method, when a player has moved a card, of calculating the angle of movement and the amount of movement.
  • Fig. 25 is another example of a dot pattern.
  • Fig. 26 is an explanatory view showing cases in which the locus of a card is used as a parameter, and (a) is a case where the card is moved in a circular manner, and (b) is a case where the card is moved as though a quadrangle is drawn.
  • Fig. 27 is a diagram (1) illustrating a method of recognizing a shape of a medium disposed on a stage surface.
  • Fig. 28 is a diagram (2) illustrating a method of recognizing a shape of a medium disposed on a stage surface.
  • Fig. 1 is a perspective view showing an example of the external configuration of an optical device 10 pertaining to one embodiment of the present invention.
  • the optical device 10 shown in Fig. 1 comprises a card arrangement panel 11 for placing cards C1, C2, and the like owned by a player (these are collectively called "card C" hereinafter) and a pedestal part 12 onto which the said card arrangement panel 11 is mounted.
  • the optical device 10 is a table for playing a predetermined card game using card C, and the card arrangement panel 11 functions as a stage surface. Note that the card arrangement panel 11 is placed on top of the pedestal part 12, as seen from the player.
  • Fig. 2 is a cross-sectional view showing an outline of the internal structure of the optical device 10 shown in Fig. 1.
  • the optical device 10 comprises, in the inside of the cabinet of its pedestal part 12, n (n is an arbitrary integer value of 2 or more) regulating parts 21-1 through 21-n, a short focus projector 22, n imaging parts 23-1 through 23-n, n unit controlling parts 24-1 through 24-n, and n light emitting parts 25-1 through 25-n.
  • the optical device 10 comprises, at the bottom side of the cabinet of its pedestal part 12, an entire panel controlling part 26 and a main controlling part 27.
  • Fig. 3 is a block diagram showing the constituent elements that process electrical signals in the optical device 10 shown in Fig. 1.
  • the pair of an imaging part 23-k (k is an arbitrary integer value between 1 and n) and a unit controlling part 24-k is dealt with as one unit.
  • a unit like this is called a sensor unit 30-k below.
  • These sensor units 30-1 through 30-n are disposed, for example, in the lower part of the cabinet of the pedestal part 12, as shown in Fig. 2.
  • Each of the sensor units 30-1 through 30-n is connected, for example, by means of a cable, and the cable is connected to the entire panel controlling part 26 via a hub unit or the like that is not shown in the figure.
  • the method of connection of the sensor units 30-1 through 30-n may be a chain connection in which the units are connected in one ring, or they may be connected in a plurality of rings. The method of connection may be freely determined, and a wireless connection may also be used. A plurality of sensors may be disposed on one circuit board, even though a block connection for each sensor is disabled thereby.
  • the entire panel controlling part 26 is connected to the main controlling part 27. To the main controlling part 27, a display part 31 and a speaker 32 are connected.
  • the display part 31 includes at least the short focus projector 22 in this embodiment, and it may also include any other display not shown in the figure.
  • each of the sensor units 30-1 through 30-n shall be called the “sensor unit 30" collectively.
  • the regulating parts 21-1 through 21-n shall also be called the “regulating part 21” collectively
  • the imaging parts 23-1 through 23-n shall also be called the "imaging part 23" collectively
  • the unit controlling parts 24-1 through 24-n shall also be called the “unit controlling part 24" collectively
  • the "light emitting parts 25-1 through 25-n” shall also be called the "light emitting part 25" collectively.
  • the imaging part 23 is configured such that it includes a lens and an imaging device of the CMOS or CCD type.
  • in the lower part of the sensor of the imaging part 23, the unit controlling part 24 is disposed.
  • the unit controlling part 24 is configured such that it includes a CPU and a frame buffer.
  • the frame buffer may be changed to a line buffer. In this case, binarization processing by means of a predetermined scan line method is to be done.
  • the data of images taken in by means of the sensor of the imaging part 23 are subjected to image processing by means of the CPU of the unit controlling part 24.
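The line-buffer variant mentioned above, in which binarization is done by a scan line method, can be sketched as follows. This is a minimal, hypothetical Python illustration with an assumed fixed threshold; the document does not specify the actual binarization criterion:

```python
# Sketch of per-scan-line binarization, as would be done when the frame
# buffer is replaced by a line buffer: each row of pixel intensities is
# thresholded as it arrives, so a full frame never needs to be held.

def binarize_scan_line(line, threshold=128):
    """Return 0/1 values: 1 where the pixel intensity meets the threshold."""
    return [1 if px >= threshold else 0 for px in line]

def binarize_image(rows, threshold=128):
    # Process the image row by row, keeping only one line in memory.
    for line in rows:
        yield binarize_scan_line(line, threshold)

image = [[10, 200, 130], [255, 0, 128]]   # illustrative intensities
binary = list(binarize_image(image))
# binary == [[0, 1, 1], [1, 0, 1]]
```

The memory saving is the design point: a line buffer holds one row instead of a whole frame, at the cost of only being able to run streaming (scan-line) algorithms.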
  • an IR filter is pasted onto the top surface of the lens.
  • this IR filter is a filter having optical characteristics that let the infrared light LO with an infrared wavelength (700 nm or more), which is emitted at least by the light emitting part 25, pass through it.
  • all the light emitting parts 25, including the light emitting part 25-1 shown in Fig. 4, emit infrared light LO.
  • the card arrangement panel 11 lets, of the infrared light LO that has been emitted from the back surface side, at least some of the first light LS pass through it onto the front surface.
  • card C includes an area that diffuses and reflects infrared rays (hereinafter called the "infrared ray reflection area").
  • the infrared light diffused and reflected from the infrared ray reflection area of card C is incident onto the sensor of the imaging part 23 via the lens and IR filter.
  • the sensor of the imaging part 23 captures an image containing the infrared ray reflection area of card C through exposure by means of the said infrared light.
  • card C is disposed, in the card arrangement panel 11, within the imaging area P2 of the imaging part 23-10.
  • the infrared ray reflection area of card C appears in the image that has been captured by means of the imaging part 23-10.
  • the infrared ray reflection area of card C appears in the image that has been captured by means of the said imaging part 23-1.
  • the infrared light that is incident onto the sensor of the imaging part 23-1 has been diffused and reflected in the infrared reflection area of card C after having been emitted from the light emitting part 25-10 (not shown in Fig. 4) that is disposed at the side of the imaging part 23-10.
  • the card arrangement panel 11 is a sheet-like medium of glass, acrylic, or any other material having a smooth back surface. For this reason, as shown in Fig. 4, the infrared light LO that has been emitted from the light emitting part 25-1 does not turn into the transmitted light LS in its entirety by passing through the surface of the card arrangement panel 11; part of the infrared light becomes the reflected light LR after being specularly reflected by the surface of the card arrangement panel 11.
  • when the reflected light LR from this light emitting part 25-1 happens to be incident on the sensor (the various groups of pixels that capture the image of the imaging area P1) of the imaging part 23-10 that is disposed at the opposite side of the said light emitting part 25-1, the reflected light undesirably appears in the image captured by means of the imaging part 23-10 as ambient light. Ambient light such as this may make the infrared reflection area of card C difficult to recognize.
  • therefore, the regulating part 21-1 is provided, which regulates the direction of emission of the infrared light LO emitted from the light emitting part 25-1 so that, among the infrared light LO that has been emitted from the light emitting part 25-1, incidence of the reflected light LR that has been specularly reflected by the card arrangement panel 11 onto the imaging part 23-10 will be prohibited.
  • incidence of the reflected light LR onto the sensor of the imaging part 23-10 that is separated from the light emitting part 25-1 is prohibited, and as a result of this, naturally, its incidence on the sensor of the imaging part 23-1 in the vicinity of the light emitting part 25-1 is also prohibited.
  • analysis (recognition) of the position, and the like of card C is actually performed in a predetermined area (hereinafter called the "analysis area") that is defined in the imaging area P1. Therefore, to put it more accurately, even if the reflected light LR has been incident on the sensor of the imaging part 23-10, if it is not incident on each group of pixels that capture the image of the analysis area, recognition of the infrared reflection area of card C is enabled.
  • the regulating part 21-1 should regulate the direction of emission of the infrared light LO that has been emitted from the light emitting part 25-1 so that incidence of the reflected light LR onto each group of pixels that captures the image of the analysis area, among the various groups of pixels that constitute the sensor of the imaging part 23-10, will be prohibited.
  • the imaging area and the analysis area are separate areas having independent concepts, but for the sake of convenience in giving explanations, it is assumed in this embodiment that the imaging area and the analysis area coincide with each other.
  • similarly, the regulating parts 21-2 through 21-10 are provided, which regulate the direction of emission of the light emitted from each of the light emitting parts 25-2 through 25-10 so that incidence of the reflected light LR that has been specularly reflected by the surface of the card arrangement panel 11 onto the sensor of the imaging part 23 will be prohibited.
  • These regulating parts 21-1 through 21-n are disposed, for example, in the lower part of the cabinet of the pedestal part 12, as shown in Fig. 2.
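The geometry the regulating part must block can be reasoned about with the classical mirror-image method: a ray from an emitter that reaches a distant sensor by specular reflection off the panel crosses the panel at exactly one point, so shielding emission aimed at that point suppresses the ambient light. The following 2D Python sketch is an illustration under assumed positions; none of the coordinates or the helper name come from the document:

```python
# 2D sketch of the specular-reflection geometry the regulating part blocks.
# The panel's reflecting surface is the line y = h; emitter and sensor sit
# below it. The reflection point is found by mirroring the sensor across
# the panel and intersecting the straight emitter-to-mirror line with y = h.

def specular_point_x(emitter, sensor, h):
    """x-coordinate on the panel (y = h) where a ray from `emitter`
    specularly reflects toward `sensor`."""
    ex, ey = emitter
    sx, sy = sensor
    sy_mirror = 2 * h - sy              # sensor mirrored above the panel
    t = (h - ey) / (sy_mirror - ey)     # parameter where the line meets y = h
    return ex + t * (sx - ex)

# Illustrative layout: emitter at x = 0, a far sensor at x = 100, both
# 20 units below the panel surface.
x_hit = specular_point_x((0.0, 0.0), (100.0, 0.0), h=20.0)
# By symmetry the reflection point is midway (x_hit == 50.0); the
# regulating part would shield emission aimed at that part of the panel.
```

This also shows why shielding the reflection toward a distant sensor automatically shields it toward a nearby one: the nearby sensor's reflection point lies even closer to the emitter, inside the same shielded cone.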
  • part of the remainder of card C is included in at least one of the other analysis areas of the imaging part 23 (imaging area P1 of the imaging part 23-1 in the case of the example shown in Fig. 4).
  • recognition of the entire card C is enabled by synthesizing a plurality of images captured by the imaging part 23.
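Since each imaging part's offset within the entire analysis area is known from the layout, synthesizing the captured images amounts to pasting each sub-image into a shared canvas at its offset. The sketch below is a hypothetical Python illustration with made-up pixel values and dimensions:

```python
# Sketch of synthesizing the sub-images of several imaging parts into one
# picture of the whole panel. Overlapping areas are simply written twice
# with the same content, since they image the same region of the panel.

def synthesize(sub_images, canvas_w, canvas_h):
    """sub_images: list of (x_offset, y_offset, rows), rows being a 2D list."""
    canvas = [[0] * canvas_w for _ in range(canvas_h)]
    for x0, y0, rows in sub_images:
        for dy, row in enumerate(rows):
            for dx, px in enumerate(row):
                canvas[y0 + dy][x0 + dx] = px
    return canvas

left  = (0, 0, [[1, 1, 1], [1, 1, 1]])   # one sub-analysis area
right = (2, 0, [[1, 2, 2], [1, 2, 2]])   # adjacent area, overlapping one column
whole = synthesize([left, right], canvas_w=5, canvas_h=2)
# whole == [[1, 1, 1, 2, 2], [1, 1, 1, 2, 2]]
```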
  • if analysis areas are not overlapped, when it happens that an important portion of card C (for example, one unit of the dot pattern to be described later, or the like) is placed at a boundary line of bordering areas, there is also the danger that an error will occur in the recognition of the said important portion.
  • each of the analysis areas whose image is captured by each of the imaging parts 23 has an overlapping area that overlaps with the analysis area whose image is captured by at least one of the other imaging parts 23.
  • an overlapping area is provided where the imaging area P1 as the analysis area of the imaging part 23-1 and the imaging area P2 as the analysis area of the imaging part 23-2 overlap each other.
  • Fig. 5 shows an example of the entire analysis area for analyzing the positions and the like of cards C, which consists of the analysis areas of the 10 imaging parts 23-1 through 23-10, each disposed so as to have an overlapping area.
  • namely, the entire analysis area consists of the analysis areas of the 10 imaging parts 23-1 through 23-10 (the imaging areas P1 through P10 indicated by the dotted lines in Fig. 5), each disposed so as to have an overlapping area.
  • the size of the entire analysis area becomes W in lateral direction x H in vertical direction.
  • the 10 imaging parts 23-1 through 23-10 are disposed in a form of 2 rows x 5 columns at equal intervals laterally and vertically.
  • the 10 imaging parts 23-1 through 23-10 are disposed at positions that are h1 apart from the long sides of the entire analysis area, such that the interval between two imaging parts 23 that are adjacent in the vertical direction (for example, the interval between the imaging part 23-1 and the imaging part 23-10) will be h2.
  • the 10 imaging parts 23-1 through 23-10 are disposed at positions that are w1 apart from the short sides of the entire analysis area, such that the interval between two imaging parts 23 that are adjacent in the lateral direction (for example, the interval between the imaging part 23-1 and the imaging part 23-2) will be w2.
  • the entire analysis area shown in Fig. 5 is formed.
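The 2-row x 5-column layout just described (each part w1 from the short sides and h1 from the long sides, neighbours spaced w2 laterally and h2 vertically) can be written out as a small Python sketch. The concrete numbers are illustrative assumptions; the document gives no dimensions:

```python
# Sketch of the 2-row x 5-column layout of the imaging parts 23-1 ... 23-10
# and the panel size it implies.

def imaging_positions(w1, w2, h1, h2, cols=5, rows=2):
    """Centers of the imaging parts plus the implied entire-area size."""
    xs = [w1 + j * w2 for j in range(cols)]
    ys = [h1 + i * h2 for i in range(rows)]
    W = 2 * w1 + (cols - 1) * w2   # lateral size of entire analysis area
    H = 2 * h1 + (rows - 1) * h2   # vertical size
    return [(x, y) for y in ys for x in xs], (W, H)

positions, (W, H) = imaging_positions(w1=10, w2=20, h1=15, h2=30)
# 10 positions; W == 2*10 + 4*20 == 100, H == 2*15 + 30 == 60
```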
  • the imaging area of one of the imaging part 23 becomes an analysis area where the position of card C, or the like is analyzed by means of the corresponding unit controlling part 24.
  • an analysis area such as this that is analyzed by means of the unit controlling part 24 is hereinafter called a "sub-analysis area" in order to distinguish it from the overall analysis area.
  • the size of the sub-analysis area (imaging area of the imaging part 23) becomes DW in lateral direction and Dh in vertical direction.
  • the overlapping areas include an area formed by the overlapping of two sub-analysis areas that are adjacent in the vertical direction (hereinafter called the "vertical overlapping area”) and an area formed by the overlapping of two sub-analysis areas that are adjacent in the lateral direction (hereinafter called the "lateral overlapping area”).
  • the size of the vertical overlapping area in the vertical direction is Dh0. Namely, the vertical overlapping area is formed at positions each of which is Dh1 apart from each of the two long sides of the analysis area.
  • the size of the lateral overlapping area in the lateral direction is Dm0.
  • the lateral overlapping area is formed at positions each of which is Dm1 apart from each of the two short sides of the analysis area. Note that as shown in Fig. 5, there also exists an area where the vertical overlapping area and lateral overlapping area such as these further overlap each other.
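The overlap dimensions relate the sub-analysis-area size (DW x Dh) to the entire analysis area: each lateral seam shares Dm0 and each vertical seam shares Dh0. A Python sketch of that arithmetic, with illustrative values (the document gives no dimensions):

```python
# Arithmetic relating sub-analysis-area size and overlap widths to the
# size of the entire analysis area for the 5 x 2 arrangement.

def entire_area_size(DW, Dh, Dm0, Dh0, cols=5, rows=2):
    W = cols * DW - (cols - 1) * Dm0   # adjacent columns share Dm0
    H = rows * Dh - (rows - 1) * Dh0   # adjacent rows share Dh0
    return W, H

W, H = entire_area_size(DW=24, Dh=40, Dm0=4, Dh0=8)
# W == 5*24 - 4*4 == 104, H == 2*40 - 8 == 72.
# The vertical overlapping band then lies Dh1 == Dh - Dh0 == 32 from
# each long side, consistent with the description above.
```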
  • a local coordinate system is defined in each of the sub-analysis areas (imaging areas) P1 through P10.
  • the coordinates, and the like of the said card C are calculated by the corresponding unit controlling part 24 by means of the local coordinate system.
  • the coordinate values of card C are calculated by means of each of the local coordinate systems.
  • the coordinate values of card C are converted into the "card analysis overall coordinate system" by the entire panel controlling part 26 to be described later.
  • a result of analysis is notified to the main controlling part 27 in the latter stage that is equivalent to the result of an analysis of the position of card C, and the like, carried out as though by means of one optical module; therefore, it becomes possible to recognize the position of card C on the entire card arrangement panel 11 (its coordinate values in the card analysis overall coordinate system) with ease.
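The conversion from a local coordinate system of one sub-analysis area into the card analysis overall coordinate system is an offset by that sub-area's origin, which follows from the grid layout and the overlap widths. A hypothetical Python sketch (function name, grid indices, and all numbers are illustrative assumptions):

```python
# Sketch of converting a card position measured in the local coordinate
# system of one sub-analysis area into the "card analysis overall
# coordinate system", as the entire panel controlling part 26 would do.

def to_overall(local_xy, col, row, DW, Dh, Dm0, Dh0):
    """Map (x, y) in the local system of the sub-area at grid position
    (col, row) to overall coordinates by adding that sub-area's origin."""
    x, y = local_xy
    origin_x = col * (DW - Dm0)   # adjacent sub-areas overlap by Dm0
    origin_y = row * (Dh - Dh0)   # and by Dh0 vertically
    return origin_x + x, origin_y + y

# A card detected at (5, 7) inside the second-column, first-row sub-area:
gx, gy = to_overall((5, 7), col=1, row=0, DW=24, Dh=40, Dm0=4, Dh0=8)
# gx == 1*(24-4) + 5 == 25, gy == 7
```

A card detected in an overlapping area by two unit controlling parts maps to the same overall coordinates from either side, which is what lets the main controlling part treat the results as if they came from one optical module.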
  • if the size of the card arrangement panel 11 is small (relative to the sensor unit 30), the position of card C, etc. can be analyzed by means of one sensor unit 30 only.
  • however, the size of the card arrangement panel 11 in this embodiment is large, and in order to analyze the position and the like of card C, which can move arbitrarily on the said card arrangement panel 11, one sensor unit 30 does not suffice.
  • by providing a plurality of sensor units 30, an analysis equivalent to the result of an analysis carried out as though by means of one optical module (sensor unit 30) is enabled.
  • it is desirable that the said card arrangement panel 11 be entirely irradiated with infrared light without unevenness (so that there is no difference in the amount of light emitted).
  • however, since the size of the card arrangement panel 11 in this embodiment is large, it is very difficult to irradiate the said card arrangement panel 11 entirely without unevenness (so that there is no difference in the amount of light emitted) by using one light emitting part 25 only.
  • therefore, a plurality of light emitting parts 25 are provided.
  • however, the provision of a plurality of light emitting parts 25 entails the problem of the unexpected appearance of reflected light on the card arrangement panel 11, as described above; therefore, the regulating part 21 is provided so that incidence of the light reflected by the card arrangement panel 11 onto the imaging part 23 will be prohibited.
  • The regulating part 21 in this embodiment has the shape shown in Fig. 6.
  • Fig. 6 is an illustrative view showing the external configuration of the regulating part 21 in this embodiment.
  • Fig. 6 (a) is an illustrative view showing the light shielding surface side of the regulating part 21.
  • Fig. 6 (b) is an illustrative view showing the side of the regulating part 21 to be mounted onto the cabinet.
  • Fig. 6 (c) is an illustrative view showing the cross-section of the regulating part 21.
  • The regulating part 21 is roughly divided into the totally light shielding part 21a and the partially light shielding part 21b.
  • The light emitting part 25 is disposed at the left end of Fig. 6 (a) and (c), at the side opposite to the light shielding surface shown in Fig. 6 (a) (the lower side in Fig. 6 (c)).
  • The infrared light emitted from the light emitting part 25 is totally shielded by the totally light shielding part 21a, whereas in the partially light shielding part 21b the light is not totally shielded and part of it is emitted externally as leakage of light.
  • This partially light shielding part 21b has the shape of a so-called saw blade, and the area of the light shielding surface becomes smaller toward the tip (the right end of Fig. 6 (a) and (c)). Namely, the amount of leakage of light gradually increases toward the tip.
  • Fig. 7 is an explanatory view showing the significance of the provision of the partially light shielding part 21b in the regulating part 21.
  • Fig. 7 (a) shows the distribution of the amount of emission in the card arrangement panel 11 in the case of not providing the regulating part 21 with the partially light shielding part 21b.
  • The chart at the top shows the positional change of the amount of emission on the card arrangement panel 11.
  • The figure at the bottom is a schematic diagram explaining how the light emitting parts 25-1 and 25-10, which form a pair, irradiate the card arrangement panel 11.
  • The infrared light L1 emitted from the light emitting part 25-1 is regulated by the regulating part 21-1 and emitted to the range between positions c and d on the card arrangement panel 11.
  • The amount of emission (amount of light) obtained from the infrared light L1 is therefore 0 within the range between positions a and c, and a constant R within the range between positions c and d.
  • The infrared light L10 emitted from the light emitting part 25-10 is regulated by the regulating part 21-10 and emitted to the range between positions a and b on the card arrangement panel 11.
  • The amount of emission (amount of light) obtained from the infrared light L10 is therefore the constant R within the range between positions a and b, and 0 within the range between positions b and d.
  • The amount of emission (amount of light) over the entire card arrangement panel 11 equals the sum of the amounts of emission obtained from the infrared light L1 and from the infrared light L10; therefore, as shown in the chart at the top of Fig. 7 (a), the amount of light is the constant R in the ranges between positions a and b and between positions c and d, whereas it is 0 in the range between positions b and c, thus generating a difference in the amount of light. Moreover, the change in the amount of light is also rapid.
  • Fig. 7 (b) shows the distribution of the amount of emission in the card arrangement panel 11 in the case of providing the regulating part 21 with the partially light shielding part 21b. Note that since the relationship between the chart at the top and the figure at the bottom in Fig. 7 (b) is the same as the relationship in Fig. 7 (a), its explanation is omitted here.
  • If a light reducing area such as the partially light shielding part 21b shown in Fig. 6 were not provided in the regulating part 21, the amount of light would become 2 x R in the range between positions b and c; thus the amount of light would not be even, and if card C were disposed in such an area, there would occur a problem that the analysis for recognition of card C is adversely affected.
  • The infrared light L1 emitted from the light emitting part 25-1 is regulated by the regulating part 21-1; however, in the partially light shielding part 21-1b the light is not totally shielded, leakage of light occurs, and the said leaked light is emitted to the range between positions b and c on the card arrangement panel 11. Moreover, the amount of leakage (namely the amount of emission) in the partially light shielding part 21-1b becomes greater toward the tip (coming closer to position c from position b), as described above using Fig. 6.
  • The amount of emission (amount of light) obtained from the infrared light L1 is therefore 0 in the range between positions a and b, gradually increases toward the constant R in the range between positions b and c, and is the constant R in the range between positions c and d.
  • The infrared light L10 emitted from the light emitting part 25-10 is regulated by the regulating part 21-10; however, in the partially light shielding part 21-10b the light is not totally shielded, leakage of light occurs, and the said leaked light is emitted to the range between positions b and c on the card arrangement panel 11. Moreover, the amount of leakage (namely the amount of emission) in the partially light shielding part 21-10b becomes greater toward the tip (coming closer to position b from position c), as described above using Fig. 6.
  • The amount of emission (amount of light) obtained from the infrared light L10 is therefore the constant R in the range between positions a and b, gradually decreases toward 0 in the range between positions b and c, and is 0 in the range between positions c and d.
  • The amount of emission (amount of light) over the entire card arrangement panel 11 equals the sum of the amounts of emission obtained from the infrared light L1 and from the infrared light L10; therefore, as shown in the chart at the top of Fig. 7 (b), the amount of light becomes the constant R over the entire range between positions a and d.
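The cancellation described above can be sketched numerically (the positions and the linear ramp shape are simplified assumptions): the leakage of L1 rises from 0 to R between positions b and c while L10 falls from R to 0 over the same range, so their sum stays at the constant R everywhere between a and d.

```python
# Simplified model of Fig. 7 (b): positions a=0, b=1, c=2, d=3 (arbitrary
# units), constant amount of light R. L1 ramps up across b..c via the
# leakage of the partially light shielding part; L10 ramps down across b..c.
R = 100.0
a, b, c, d = 0.0, 1.0, 2.0, 3.0

def light_L1(x):
    if x < b: return 0.0
    if x > c: return R
    return R * (x - b) / (c - b)   # leakage grows toward the tip

def light_L10(x):
    if x < b: return R
    if x > c: return 0.0
    return R * (c - x) / (c - b)   # leakage shrinks toward the tip

# The sum is the constant R at every position between a and d.
for x in [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]:
    assert abs(light_L1(x) + light_L10(x) - R) < 1e-9
print("uniform amount of light:", R)
```

Any pair of leakage profiles that are mirror images of each other across the overlap range would cancel in the same way; the saw-blade shape is one way to realize such a profile mechanically.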
  • For the regulating part 21 having the partially light shielding part 21b, it suffices to suppress the difference in the amount of light on the card arrangement panel 11 as compared with the case without the said regulating part 21 (a regulating part 21 consisting only of the totally light shielding part 21a, having no partially light shielding part 21b), and to regulate the direction of emission of the light emitted from at least part of the plurality of light emitting parts 25 so that the degree of change in the amount of light is suppressed in an area where there is a difference in the amount of light.
  • The shape, etc. of the partially light shielding part 21b is not particularly limited to the example in Fig. 6.
  • For example, the part may be made of milky-white acrylic whose plate thickness is varied so that the transmitted light changes gradually, or a material whose degree of transparency changes may be used.
  • The card arrangement panel 11 lets, as shown in Fig. 4, at least part of the first light LS among the infrared light emitted from the back surface side pass through to the top surface, and at the same time is a flat plate, as shown in Fig. 1, on whose top surface side card C can be disposed.
  • The light emitting part 25-1 emits infrared light from the back surface side of the card arrangement panel 11.
  • The imaging part 23-10 captures an image of card C by means of exposure using the first light LS reflected by card C (more accurately, by the infrared reflection area) via the card arrangement panel 11.
  • The unit controlling part 24-10 analyzes the position, and the like, of card C in the local coordinate system, using the imaging area P2 as a sub-analysis area.
  • Although Fig. 4 was used as the example and the explanation concerned the analysis processing of the imaging part 23-10 and the unit controlling part 24-10, namely the sensor unit 30-10, which are in charge of the imaging area P2 as the sub-analysis area, the said analysis processing is likewise executed in any sensor unit 30 that is in charge of a sub-analysis area where at least part of card C is disposed.
  • The main controlling part 27 executes control related to the entire board game by using the result of recognition of one or more cards C, and displays the result of such execution on the display part 31 as an image or outputs it as a sound through the speaker 32.
  • An arbitrary number of sensor units 30 can be applied; therefore there is no limitation on the size of the card arrangement panel 11, and for example a large size that has not been used before can also be adopted easily.
  • This embodiment comprises the regulating part 21-1, which regulates the direction of emission of the infrared light emitted from the light emitting part 25-1 such that, of the infrared light emitted from the light emitting part 25-1, incidence of the second light LS reflected by the card arrangement panel 11 onto the imaging part 23-1 is prevented.
  • This embodiment comprises the regulating parts 21-1 through 21-10 as well.
  • The regulating part 21 in this embodiment has the partially light shielding part 21b, which regulates the direction of emission of the infrared light emitted from at least part of the plurality of light emitting parts 25, such that the difference in the amount of light on the card arrangement panel 11 is suppressed as compared with the case without the said regulating part 21 (partially light shielding part 21b), and at the same time, in an area having a difference in the amount of light, the degree of change in the amount of light is suppressed (refer to Fig. 7).
  • Each of the imaging areas captured by the plurality of imaging parts 23 has an overlapping area that overlaps the imaging area captured by at least one other imaging part 23 (refer to Fig. 5).
  • The sensor of the imaging part 23 performs exposure by using, of the infrared light emitted from the light emitting part 25 to the imaging area of the card arrangement panel 11, the infrared light reflected by card C (accurately, by the infrared reflection area).
  • Control is therefore required to coordinate the light emission timing of the light emitting part 25 that irradiates the imaging area with the exposure timing of the imaging part 23 that captures an image of the said imaging area.
  • Such control is executed, in the example shown in Fig. 3 of this embodiment, by the entire panel controlling part 26.
  • As the sensor of the imaging part 23, various types of sensors can be used, such as the CCD type and the CMOS type.
  • Exposure control will also naturally vary with the type of the sensor (sensor characteristics).
  • The player moves card C with his or her hand on the card arrangement panel 11. For this reason, it is assumed that card C moves at high speed.
  • If the time for storing the infrared light reflected by the said card C at the sensor side is too long, the infrared light is stored at each position (each pixel) while the card is moving, and the so-called motion blur problem occurs.
  • Therefore, the time for storing the infrared light reflected by card C at the sensor side should be reduced.
  • To this end, the entire panel controlling part 26 should simply execute control for making the imaging part 23 open the shutter (perform exposure) during light emission of the light emitting part 25 that irradiates the imaging area.
  • Specifically, the entire panel controlling part 26 executes control that causes infrared light to be emitted from the light emitting part 25 for a predetermined time within the period of exposure (namely, while an image is being captured) by the imaging part 23.
  • The predetermined time suffices as long as it is shorter than the period of exposure and is not otherwise limited; however, the problem of motion blur can be remarkably reduced if the time is extremely short.
  • If, for example, the predetermined time is 1/1000 seconds, it can be said that generally no motion blur occurs. If the card is to be moved extremely quickly, the predetermined time needs to be shortened according to the speed of such movement.
  • The luminance (brightness) of each pixel constituting a captured image is determined according to the amount of light stored in that pixel at the sensor side.
  • The amount of light stored in each pixel at the sensor side is the integral of the amount of light per unit time over the storage time. Therefore, if the storage time is short, the amount of light stored in each pixel may be insufficient.
  • In that case, the amount of light per unit time should be increased so that the amount of light to be stored (the luminance of the captured image) does not become insufficient. For this reason, the entire panel controlling part 26 executes control such that each light emitting part 25 emits intense light instantaneously.
  • The imaging part 23 is exposed with infrared light, and in synchronization with this, each of the light emitting parts 25 emits infrared light.
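The trade-off described above can be sketched with illustrative numbers (the card speed and intensities below are assumptions, not from the specification): the stored amount per pixel is intensity multiplied by emission time, and the blur on the panel is card speed multiplied by emission time, so a short, intense flash keeps the stored light constant while shrinking the blur.

```python
# Illustrative numbers (assumptions): a card moving at 500 mm/s and a
# flash lasting 1/1000 s, as in the example in the text.
card_speed_mm_s = 500.0
flash_time_s = 1.0 / 1000.0

def stored_light(intensity_per_s, emission_time_s):
    """Amount of light stored in a pixel: intensity x emission time."""
    return intensity_per_s * emission_time_s

def motion_blur_mm(speed_mm_s, emission_time_s):
    """Distance the card travels while light is being stored."""
    return speed_mm_s * emission_time_s

print(motion_blur_mm(card_speed_mm_s, flash_time_s))  # 0.5 (mm of blur)

# Halving the emission time while doubling the intensity keeps the stored
# amount of light unchanged but halves the motion blur.
assert stored_light(2.0, flash_time_s / 2) == stored_light(1.0, flash_time_s)
assert motion_blur_mm(card_speed_mm_s, flash_time_s / 2) == 0.25
```

This is why the entire panel controlling part drives each light emitting part with an instantaneous, intense pulse rather than a long, dim one.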
  • The card arrangement panel 11 also has the function of a translucent screen.
  • The short focus projector 22 in Fig. 2 is a short-focus projector that can project an image onto the said translucent screen by means of visible light.
  • Next, card C, which can be placed on the card arrangement panel 11 as described above, will be explained in detail.
  • Fig. 8 shows specific examples of the various types of card C.
  • The side to be placed on the card arrangement panel 11, namely the surface of which an image is captured by the imaging part 23, shall be called the "back surface."
  • The surface opposite to the back surface, namely the surface that can be visually recognized by the player, shall be called the "top surface."
  • Fig. 8 (a) shows card C having a pattern showing the type of the playing card printed on its back surface.
  • Fig. 8 (b) shows card C having the English letter "A" printed on its back surface.
  • Fig. 8 (c) shows card C having a two-dimensional code printed on its back surface.
  • Fig. 8 (d) shows card C having a dot pattern printed on its back surface.
  • a "dot pattern” denotes an information code that has been encoded by an arrangement algorithm of a plurality of dots.
  • Well-known algorithms can be used, such as Grid Onput (registered trademark) of Gridmark Limited and the Anoto Pattern of Anoto.
  • The encoding algorithm for a dot pattern itself is common to cases where it is read by visible light and cases where it is read by infrared rays, and therefore it is not specifically limited.
  • A dot pattern may or may not be visually recognizable; even if it can be visually recognized, it suffices if the pattern is perceived as a simple design, and any dot pattern whatsoever can be adopted.
  • A dot pattern can encode a different information code depending on the position at which it is read, by defining coordinate values. Further details of dot patterns will be described later, with reference to Fig. 10 and thereafter.
  • The picture of Fig. 8 (a), the letter of Fig. 8 (b), the two-dimensional code of Fig. 8 (c), the dot pattern of Fig. 8 (d), and the like are, for example, printed using ink having infrared ray absorbing characteristics, and the other areas on the back surface of card C are infrared ray reflection areas.
  • The infrared light emitted from the light emitting part 25 passes through the areas of the card arrangement panel 11 other than those where card C is placed. Also, in the areas where the picture of Fig. 8 (a), the letter of Fig. 8 (b), the two-dimensional code of Fig. 8 (c), the dot pattern of Fig. 8 (d), and the like are formed (areas where the ink having infrared ray absorbing characteristics is printed), the infrared light is absorbed. Namely, the infrared light is reflected only in the other areas of the back surface of card C. Therefore, images such as those shown in Fig. 8 (however, the areas other than card C become dark in the images), namely images in which the picture of Fig. 8 (a), the letter of Fig. 8 (b), the two-dimensional code of Fig. 8 (c), the dot pattern of Fig. 8 (d), and the like appear dark, are captured.
  • The unit controlling part 24, the entire panel controlling part 26, or the main controlling part 27 in Fig. 3 can recognize the picture of Fig. 8 (a), the letter of Fig. 8 (b), the two-dimensional code of Fig. 8 (c), the dot pattern of Fig. 8 (d), and the like based on such captured images.
  • Note that Fig. 8 merely shows examples; on the back surface of card C, an arbitrary image, an arbitrary letter, an arbitrary figure, or an arbitrary code, or an arbitrary number of combinations of these of arbitrary types, can be formed. Namely, anything may be formed on card C as long as it can be recognized by the unit controlling part 24, the entire panel controlling part 26, or the main controlling part 27 in Fig. 3 based on the captured images.
  • In the explanation above, the areas of the picture of Fig. 8 (a), the letter of Fig. 8 (b), the two-dimensional code of Fig. 8 (c), the dot pattern of Fig. 8 (d), and the like were specified as absorbing infrared rays, and the other areas were specified as infrared ray reflection areas; however, the areas are not specifically restricted to these, and combinations of absorption and reflection of infrared rays on card C may be specified arbitrarily.
  • Fig. 9 shows enlargements of the cross section of card C on which the dot pattern of Fig. 8 (d) is formed, showing specific examples of the combination of absorption and reflection of infrared rays.
  • Fig. 9 (a) shows an example of the absorption of infrared light by medium S that forms the entire card C as well as the diffusion and reflection of infrared light by dot d.
  • Fig. 9 (b) shows an example of the absorption of infrared light by infrared absorption layer I that is formed on the back surface of medium S, with medium S itself having arbitrary characteristics, as well as the diffusion and reflection of infrared light by dot d.
  • Fig. 9 (c) shows the example described above, i.e. of the diffusion and reflection of infrared light by medium S and absorption of infrared light by dot d.
  • In any of these cases, the unit controlling part 24, the entire panel controlling part 26, or the main controlling part 27 in Fig. 3 can easily recognize the "portion having a reflectance of infrared light that is different from that of the other portions."
  • An image, letter, figure, or code, or any combination of these, can be adopted as the "portion having a reflectance of infrared light that is different from that of the other portions" on card C.
  • A specific example where the "portion having a reflectance of infrared light that is different from that of the other portions" on card C is a dot that reflects infrared light, with a dot pattern consisting of a plurality of such dots formed on a surface of card C that does not reflect infrared light, is the example of Fig. 9 (a) or Fig. 9 (b).
  • A specific example where the "portion having a reflectance of infrared light that is different from that of the other portions" on card C is a dot that does not reflect infrared light, with a dot pattern consisting of a plurality of such dots formed on a surface of card C that reflects infrared light, is the example of Fig. 9 (c).
  • Fig. 10 is an explanatory view showing GRID1, which is an example of the dot patterns of the present invention.
  • Fig. 11 is an enlargement showing an example of the information dot in a dot pattern and the bit representation of the data defined therein.
  • Fig. 12 (a) and (b) are explanatory views showing the information dots placed around the key dots at the center.
  • The method of information input and output using a dot pattern in the present invention consists of the generation of dot pattern 1, the recognition of such dot pattern 1, and a means for outputting information and a program from this dot pattern 1.
  • Dot pattern 1 is taken in as image data by a camera (the imaging part 23 in Fig. 2 in this embodiment); first, reference grid point dots 4 are picked out; then key dot 2 is picked out based on the non-existence of a reference grid point dot 4 at the position where it inherently should exist; then information dots 3 are picked out and digitized so that the information area is extracted and the information is quantified; and based on this quantified information, information and a program are output from dot pattern 1.
  • For example, information such as sounds, or a program, is output from this dot pattern 1 to an information output device, personal computer, PDA, cellular phone, or the like (the main controlling part 27 in Fig. 3 in this embodiment).
  • In the generation of dot pattern 1, tiny dots for making information such as sounds recognizable, i.e. key dot 2, information dot 3, and reference grid point dot 4, are arranged in accordance with a dot code generation algorithm.
  • As shown in Fig. 10, in a block of dot pattern 1 that represents information, 5 x 5 reference grid point dots 4 are disposed with key dot 2 as the datum, and information dot 3 is disposed around virtual grid point 5, which is at the center surrounded by four reference grid point dots 4.
  • By this information dot 3, arbitrary numerical information is defined. Note that the example in Fig. 10 shows the state in which four blocks of dot pattern 1 (within the thick-line frames) are lined up. However, it goes without saying that dot pattern 1 is not restricted to four blocks.
  • One piece of information or a program corresponding to one block can be output, or one piece of information or a program corresponding to a plurality of blocks can be output.
  • Key dot 2 is a dot that is disposed, as shown in Fig. 10, by moving the four reference grid point dots 4 located at the four corners of a block in a certain direction.
  • This key dot 2 is a representative point of one block of dot pattern 1 that represents information dots 3. For example, it is the point obtained by moving the reference grid point dots 4 located at the four corners of a block of dot pattern 1 upward for a distance of 0.1 mm.
  • When information dot 3 represents X, Y coordinate values, the position obtained by moving key dot 2 downward for a distance of 0.1 mm becomes the coordinate point.
  • However, this value is not restricted thereto and may vary with the size of the block of dot pattern 1.
  • Information dot 3 is a dot by which various types of information are recognized.
  • This information dot 3 is disposed around key dot 2, which serves as a representative point; at the same time, using the point at the center surrounded by four reference grid point dots 4 as virtual grid point 5, information dot 3 is disposed at the end point of a vector whose start point is virtual grid point 5.
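The vector placement of an information dot can be sketched as follows (the grid interval, dot distance, and the 45-degree direction step are assumptions chosen within the ranges given below; the function name is illustrative): a 3-bit value selects one of eight directions, and the dot is placed at a fixed distance from the virtual grid point along that direction.

```python
import math

# Sketch of the vector placement (parameter values are assumptions chosen
# within the ranges given in the text): grid interval 0.4 mm, information
# dot placed at 20% of the interval, 8 vector directions in 45-degree steps.
GRID_INTERVAL_MM = 0.4
DOT_DISTANCE_MM = 0.2 * GRID_INTERVAL_MM

def place_information_dot(virtual_grid_point, value_3bit):
    """Return the (x, y) position of an information dot encoding a
    3-bit value as one of eight vector directions."""
    angle = value_3bit * (2 * math.pi / 8)   # which of the 8 directions
    vx, vy = virtual_grid_point
    return (vx + DOT_DISTANCE_MM * math.cos(angle),
            vy + DOT_DISTANCE_MM * math.sin(angle))

x, y = place_information_dot((1.0, 1.0), 4)  # value 4 -> the 180-degree direction
```

Decoding reverses this: the reader measures the displacement of the detected dot from its virtual grid point and maps the vector direction (and, in the 4-bit variant below, the distance) back to a value.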
  • It is desirable that the dot diameter of key dot 2, information dot 3, or reference grid point dot 4 be around 0.03 - 0.05 mm, taking account of appearance, the precision of printing in relation to paper quality, the resolution of the camera, and optimum digitization.
  • It is desirable that the interval between reference grid point dots 4 be around 0.3 - 0.5 mm in the vertical and lateral directions.
  • It is desirable that the displacement of key dot 2 be around 20% of the grid interval.
  • It is desirable that the interval between information dot 3 and the virtual grid point 5 surrounded by the four reference grid point dots 4 be around 10 - 30% of the distance between adjacent virtual grid points 5. If the distance between information dot 3 and virtual grid point 5 is smaller than this interval, the grid where a dot is placed is easily identified and the defined dot code value can be deciphered; security performance is therefore lacking, and a grid-like design is generated. Conversely, if the distance between information dot 3 and virtual grid point 5 is greater than this interval, it becomes difficult to identify which adjacent virtual grid point 5 was used as the start point for giving the vector its directivity, and a design may be generated depending on the arrangement pattern of the information dots, making dot pattern 1 unsightly. This is the reason for specifying the interval as described above.
  • It is desirable that the vector direction (direction of rotation) of information dot 3 be determined evenly at every 30 - 90 degrees.
  • Fig. 13 is an example of information dot 3 and bit indication of the data defined therein, showing another embodiment.
  • By representing information dot 3 using two distances, long and short, from the virtual grid point 5 surrounded by reference grid point dots 4, and using eight vector directions, 4 bits can be represented. At this time, it is desirable that the longer distance be around 20 - 30% of the distance between adjacent virtual grid points 5 and the shorter one around 10 - 20% thereof. However, it is desirable that the interval between the longer and shorter information dots 3 be greater than the diameter of these dots.
  • It is desirable that the information dot 3 surrounded by four reference grid point dots 4 be a single dot, taking account of appearance.
  • However, if a large amount of information is to be held, information dot 3 can be represented by a plurality of dots, allotting one bit to each vector.
  • For example, with vectors in eight directions on concentric circles, the information dot 3 surrounded by four reference grid point dots 4 can represent 2^8 pieces of information, and 16 information dots in one block can represent 2^128 pieces.
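These capacities follow directly from the combinatorics described above: eight directions with two distances give 2^4 states per dot, one bit per vector over eight vectors gives 2^8 states per grid point, and 16 independent grid points give (2^8)^16 = 2^128 states per block. A quick check:

```python
# Information capacity of the dot pattern variants described in the text.
states_single_dot_8dir = 8                         # 3 bits: one dot, 8 directions
states_two_distances = 8 * 2                       # 4 bits: 8 directions x 2 distances
states_one_bit_per_vector = 2 ** 8                 # 8 vectors, 1 bit each
states_per_block = states_one_bit_per_vector ** 16 # 16 information dots per block

assert states_two_distances == 2 ** 4
assert states_per_block == 2 ** 128
print("states per block:", states_per_block)
```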
  • Fig. 14 is an illustrative view of the examples of information dots and the bit indication of the data defined in the dots, and (a) shows one having two dots disposed therein, (b) having four dots, and (c) having five dots.
  • Fig. 15 shows examples of deformation of the dot pattern, and (a) is a schematic diagram of a type having six information dots, (b) of a type having nine information dots, (c) of a type having 12 information dots, and (d) of a type having 36 information dots.
  • Dot pattern 1 shown in Fig. 10 and Fig. 12 is an example in which 16 (4 x 4) information dots 3 are disposed in one block.
  • However, the number of information dots 3 can be changed in various ways, without being restricted to 16 per block.
  • For example, the direction of a block can be defined based on the placement relationships of grid areas 34a, 34b, and 34c, in which the direction of placement of information dot 3 in the block has been changed (the information dot is disposed in the vertical or lateral direction from the center).
  • This enables information dot 3 to be disposed in all the grid areas of the block; therefore, no grid area is sacrificed in order to define the block direction.
  • Fig. 17 (b) shows the two blocks in Fig. 17 (a) being connected in each of the vertical and lateral directions.
  • In this example, the distance between grids is about 15 mm and the dot size is about 15% of the distance between grids. Therefore, it is desirable that the dot size be 2 mm to 2.5 mm, but the size is not limited to this. It is desirable that the distance between dots in a captured image be 14 pixels or more.
  • Fig. 18 (a) is an explanatory view of the method for obtaining the angle of card C.
  • Fig. 18 (b) is an illustrative view showing an example of a dot pattern to be printed on the back surface of the card. Note that the straight lines connecting the grid point dots are indicated for the sake of convenience in order to facilitate an understanding of the dot pattern; no such grid lines are printed on the actual dot pattern.
  • Let the angle formed by the direction of image capturing, i.e. the y-direction of the frame buffer, and the direction of the dot pattern be θ; this shall be regarded as the angle of the card.
  • Let the coordinates of the reference grid point dot P1 be (x1, y1) and the coordinates of P2 be (x2, y2), and let the distance between P1 and P2, i.e. the interval between the reference grid point dots, be l.
  • First, obtain θ based on the x coordinate, using sin θ = (x2 - x1) / l:
  • θ1 = sin⁻¹((x2 - x1) / l), or θ2 = 180° - sin⁻¹((x2 - x1) / l),
  • θ1 = 180° + sin⁻¹((x2 - x1) / l), or θ2 = 360° - sin⁻¹((x2 - x1) / l).
  • Next, obtain θ based on the y coordinate, using cos θ = (y2 - y1) / l:
  • θ1 = cos⁻¹((y2 - y1) / l), or θ2 = 360° - cos⁻¹((y2 - y1) / l),
  • θ1 = 180° - cos⁻¹((y2 - y1) / l), or θ2 = 180° + cos⁻¹((y2 - y1) / l).
  • Each calculation has two solutions (θ1, θ2); choose the θ that gives the same solution among the solutions given by the x and y coordinates, and let θ be that solution.
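A minimal sketch of the same computation (the function name is illustrative): rather than intersecting the two candidate solution sets by hand, the x and y components can be combined in a single atan2 call, which resolves the quadrant ambiguity directly.

```python
import math

def card_angle_deg(p1, p2):
    """Angle between the frame buffer's y-direction and the dot pattern
    direction, computed from two reference grid point dots P1 and P2."""
    dx = p2[0] - p1[0]
    dy = p2[1] - p1[1]
    # With sin(theta) = dx / l and cos(theta) = dy / l, atan2 combines
    # both components, which is equivalent to choosing the solution
    # common to the x-based and y-based candidate sets.
    return math.degrees(math.atan2(dx, dy)) % 360.0

print(card_angle_deg((0.0, 0.0), (0.0, 1.0)))  # 0.0 (pattern aligned with y)
```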
  • Fig. 19 is a plan view of the state of card C being placed on the card arrangement panel 11, as seen from the top.
  • The player can place the four cards CA through CD at any position on the card arrangement panel 11.
  • Fig. 20 shows the image of card C's placed on the card arrangement panel 11 as captured by the sensor unit 30 placed at the lower surface side of the card arrangement panel. Namely, the captured image in Fig. 20 shows how the four cards CA through CD are placed within the imaging area (sub-analysis area) of one sensor unit 30, i.e. one imaging part 23.
  • The sensor unit 30 is provided in the lower part of the cabinet of the pedestal part 12 (Fig. 1 and Fig. 2) and captures an image of the back surface of the card arrangement panel 11. Therefore, the image captured by the sensor unit 30 is the image of Fig. 20 turned upside down and with left and right reversed; the image of the dot code printed on the back surface of the card is also captured.
  • Fig. 21 shows a pixel matrix for judging whether card C is present.
  • In this pixel matrix, one cell consists of 16 x 16 pixels, and predetermined pixels (the hatched pixels in the figure) function as check pixels.
  • The brightness of the group of check pixels specified at each predetermined interval (here, every five pixels) is detected, and a medium is judged to have been placed on the said pixel matrix when the said brightness is equal to or above a predetermined threshold value.
  • The CPU provided in the sensor unit 30 divides the sub-analysis area (imaging area) into 18 vertical x 22 lateral cells, as shown in Fig. 22 to be described later. These cells are further divided into 16 vertical x 16 lateral pixels, and among these pixels, the brightness level of the hatched portions in Fig. 21 is measured.
  • The brightness level is represented in 256 stages from 0 to 255. By specifying an arbitrary threshold value, if the brightness level is equal to or greater than the threshold value, it is judged that a card, object, or hand is placed at that position. However, if the brightness level is 255, it is regarded as white noise, and therefore it is not judged that a card or the like is placed there.
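The check-pixel judgment can be sketched as follows (the threshold value 128 and the exact check-pixel layout are illustrative assumptions): sample the check pixels every five pixels, and report a medium when any sampled brightness reaches the threshold but is not the white-noise value 255.

```python
# Sketch of the medium-presence judgment for one 16 x 16 pixel cell.
# Check pixels are sampled every five pixels (as in Fig. 21); the
# threshold value 128 is an illustrative assumption.
THRESHOLD = 128
WHITE_NOISE = 255

def medium_placed(cell):
    """cell: 16 x 16 list of brightness levels (0-255)."""
    for y in range(0, 16, 5):
        for x in range(0, 16, 5):
            level = cell[y][x]
            if level == WHITE_NOISE:
                continue   # 255 is treated as white noise, not a medium
            if level >= THRESHOLD:
                return True
    return False

dark_cell = [[10] * 16 for _ in range(16)]
bright_cell = [[10] * 16 for _ in range(16)]
bright_cell[5][10] = 200   # infrared light reflected by a card
print(medium_placed(dark_cell), medium_placed(bright_cell))  # False True
```

Sampling only a sparse grid of check pixels keeps this presence test cheap; the full dot-reading pass described next runs only on cells that pass it.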
  • Fig. 22 is an explanatory view of the method of analyzing the code of card C.
  • Divide the sub-analysis area (imaging area) into 18 vertical x 22 lateral cells, and, using the sensor unit 30, scan the cells starting at the top left cell and proceeding rightward.
  • Since the portions of card C other than the dots use a pattern that reflects infrared rays, such portions yield a captured image that is brighter than that of the areas where no card C is placed.
  • Search for cells having a bright captured image; if a cell has a bright captured image, it is judged that card C is placed there. Then, by the method described above, judgment as to whether there is a dot is made. By searching the bright area and judging the presence of dots one by one, the dot code printed on the card surface is read.
  • Fig. 23 is an explanatory view of the method for recognizing the position and angle of card C placed on the card arrangement panel 11 (on the stage surface). Note that, as described above, the explanation here uses the local coordinate system of the one sub-analysis area (imaging area) for which one sensor unit 30 is responsible. In practice, however, processing may be done using the overall card-analysis coordinate system for the entire analysis area, where a plurality of sub-analysis areas are arranged including the overlapping areas.
  • The unit controlling part 24 of the said sensor unit 30 detects a code value corresponding to this dot pattern. Upon detection of the said code value, the unit controlling part 24 searches for the key dot, calculates the x, y coordinates of the card center, and uses these as the card position.
  • The card position is represented by x, y coordinates whose datum is at the bottom right of the arrangement panel (top left in the captured image). Namely, by detecting the card in this way, the card center is calculated and its x, y coordinates are obtained, giving the card position.
  • Let the card angle be the angle formed by the straight line connecting the card center and the key dot and the vertical direction of the card arrangement panel; this angle is used for specifying the card direction.
  • the card angle is calculated by the method described above.
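As an illustrative sketch (not part of the original disclosure), the angle computation described above can be written as follows. The convention that 0 degrees means the key dot lies straight along the panel's vertical direction, and that y increases upward, are assumptions for illustration; the actual coordinate conventions depend on the device.

```python
import math

def card_angle_deg(center, key_dot):
    """Angle in degrees between the vertical direction of the card
    arrangement panel and the line from the card center to the key dot.
    center, key_dot: (x, y) tuples in panel coordinates."""
    dx = key_dot[0] - center[0]
    dy = key_dot[1] - center[1]
    # atan2(dx, dy) measures the angle from the vertical axis
    # (0 deg = key dot directly above the center in this sketch)
    return math.degrees(math.atan2(dx, dy)) % 360.0
```

A key dot directly above the center gives 0 degrees, directly to the right gives 90 degrees, and so on, so one value specifies the full card orientation.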
  • Fig. 24 is an explanatory view for the method of calculation of the angle of movement and displacement when the player has moved card C on the card arrangement panel 11 (on the stage surface).
  • In this case, the direction and data area can be defined by using the reference dots, and there is no need to place a key dot.
  • (a) of the said figure is an illustrative view showing one dot pattern, and (b) is an enlarged view of the area indicated by the alternate long and short dash line in (a) of the said figure.
  • The difference between the directions of the key dot or datum dot before and after movement gives the angle of rotation resulting from the movement. The displacement and the time of movement can also be calculated.
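The before/after computation described above can be sketched as follows (illustrative only, not part of the disclosure). The record format — position, angle in degrees, and a timestamp per observation — is an assumption.

```python
import math

def movement(before, after):
    """before/after: dicts with 'x', 'y', 'angle' (degrees), 't' (seconds)
    for the card observed before and after the player moved it.
    Returns (rotation_deg, displacement, elapsed_seconds)."""
    # rotation = change in key-dot (or datum-dot) direction, wrapped to 0-360
    rotation = (after["angle"] - before["angle"]) % 360.0
    # displacement = straight-line distance between the two card centers
    displacement = math.hypot(after["x"] - before["x"],
                              after["y"] - before["y"])
    elapsed = after["t"] - before["t"]
    return rotation, displacement, elapsed
```

A game could then feed these three values back as parameters, e.g. scaling the card's power by how far and how fast it was moved.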
  • The power of the card holder may be made variable with the position of card C, and the game may proceed with the displacement and the time of movement used as parameters.
  • For example, a game can be considered in which, when the player moves card C, the displacement, angle of rotation, and time of movement of card C become parameters that change the power of card C. Namely, by moving card C, the strength of card C is changed.
  • Fig. 26 is an explanatory view showing cases in which the locus of card C is used as a parameter, and (a) is a case where card C is moved in a circular manner, and (b) is a case where card C is moved as though a quadrangle is drawn.
  • the locus of card C when the player has moved it can also be used as a parameter.
  • the shape of the locus can be used as a parameter.
  • parameters such as the attribute, etc. of card C may also be changed depending on whether the player moves the card in the form of a curve as shown in (a), or moves it in the form of a straight line.
  • The medium recognized on the stage surface may be, aside from the card described above, the tip of a finger of the operator or player him- or herself. When the fingertip touches the stage surface, the brightness of the touched portion on the stage surface changes, so it can be recognized that what has touched the surface is a fingertip. This is explained specifically in Fig. 27 and Fig. 28. As shown in Fig. 27 (a), the game is played by the player or operator placing a card or touching the stage surface with a fingertip. (b) of the said figure is a view of this state from below the stage surface. (c) of the said figure is an explanatory view of the method for recognizing the shape of a medium.
  • Since the image captured with infrared rays contains an image of the touched position, the touched area is judged by a predetermined method, and the coordinates of the center of the touched position can be obtained by a predetermined method such as obtaining the center of gravity of the touched area or the center of the figure.
  • An area whose brightness is equal to or above a threshold value is deemed to be a surface of contact with the stage surface.
  • The threshold value may be an absolute value, may be a dynamic threshold value that varies with the brightness of the surrounding areas, or the ratio to the brightness of the surrounding areas may be used as the threshold value.
  • One surface of contact shall be an area in which, among the pixels that constitute the imaging area, one or more pixels whose brightness has exceeded the threshold value lie contiguously in the vertical or lateral direction.
  • The area surrounded by the thick line in (c) of the said figure is the area of pixels whose brightness has exceeded the threshold value, and it shows the shape of the medium.
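As an illustrative sketch (not part of the original disclosure), the contact-surface extraction described above — contiguous above-threshold pixels forming one surface, with a center-of-gravity touch point — can be implemented as a 4-connected flood fill. The fixed absolute threshold here is one of the three threshold variants mentioned; the function and variable names are assumptions.

```python
def contact_surfaces(image, threshold):
    """image: 2D list of brightness values (rows of equal length).
    Returns a list of (pixel_list, centroid) pairs, one per area of
    vertically/laterally contiguous pixels above the threshold."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    surfaces = []
    for sy in range(h):
        for sx in range(w):
            if seen[sy][sx] or image[sy][sx] < threshold:
                continue
            # flood fill over 4-connected neighbours above the threshold
            stack, pixels = [(sy, sx)], []
            seen[sy][sx] = True
            while stack:
                y, x = stack.pop()
                pixels.append((y, x))
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < h and 0 <= nx < w and not seen[ny][nx]
                            and image[ny][nx] >= threshold):
                        seen[ny][nx] = True
                        stack.append((ny, nx))
            # centroid (center of gravity) of the contact surface
            cy = sum(p[0] for p in pixels) / len(pixels)
            cx = sum(p[1] for p in pixels) / len(pixels)
            surfaces.append((pixels, (cy, cx)))
    return surfaces
```

Each returned pixel list traces the shape of the medium (the thick-line area in Fig. 27 (c)), and the centroid serves as the touch coordinates.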
  • the present invention is not particularly restricted to this panel.
  • The present invention is applicable to any flat panel, including the card arrangement panel 11, that transmits to the front surface at least some of the first light among the light emitted from the back surface side and on whose front surface side a predetermined medium can be disposed.
  • the object of light emission from the light emitting part is not particularly restricted to infrared light, and it may also be visible light.
  • An optical device to which the present invention is applied comprises: a flat plate that transmits to the front surface at least some of the first light among the light emitted from the back surface side, and on whose front surface side a predetermined medium can be disposed; a light emitting part that emits the said light from the back surface side of the said flat plate; an imaging part that captures an image of the said medium by exposing it to the said first light reflected by the said medium via the said flat plate; and a regulating part that regulates the direction of emission of the light emitted from the said light emitting part so that the second light, reflected by the said flat plate among the light emitted by the said light emitting part, is prevented from entering the said imaging part.
  • In the embodiment described above, the position and the like of card C were recognized, but arbitrary information of a medium, including card C, can be recognized.
  • The features of a medium whose image has been captured by the imaging part, or information attached to the said medium, can be recognized.
  • As the features of a medium, for example, either the shape or the size of the medium can be recognized.
  • a medium can be made to be formed with an image, character, figure, or code, or a combination of these.
  • the image, character, figure, or code, or a combination of these can be recognized.
  • the medium can be made to have at least some parts that contain a portion where the reflectivity of the first light is different from that of the other portions.
  • the portion where the reflectivity of the first light is different from that of the other portions can be recognized.
  • an image, character, figure, or code, or a combination of these can be made to be formed as the portion where the reflectivity of the first light is different from that of the other portions.
  • The portion of the medium where the reflectivity of the first light differs from that of the other portions may be a dot that reflects the first light, and a dot pattern consisting of a plurality of such dots may be formed on a surface of the medium that does not reflect the first light. In this case, the dot pattern is recognized.
  • Conversely, the portion of the medium where the reflectivity of the first light differs from that of the other portions may be a dot that does not reflect the first light, and a dot pattern consisting of a plurality of such dots may be formed on a surface of the medium that reflects the first light. In this case as well, the dot pattern is recognized.
  • An optical device to which the present invention is applied may comprise a plurality of imaging parts, and, on the flat plate, each imaging area whose image is captured by one of the said plurality of imaging parts can have an overlapping area with the imaging area whose image is captured by at least one of the other imaging parts.
  • An optical device to which the present invention is applied may comprise a plurality of light emitting parts, with the said flat plate irradiated by the plurality of the said light emitting parts. The said regulating part further suppresses the difference in the amount of light on the said flat plate as compared with the case without the said regulating part, and at the same time can regulate the direction of emission of the light emitted from at least some of the said plurality of light emitting parts so that, in areas where there is a difference in the amount of light, the degree of change in the amount of light is suppressed.
  • An optical device to which the present invention is applied may further comprise a controlling part which causes exposure of the said imaging part during light emission of the said light emitting part that irradiates the imaging area of the said imaging part.
  • An optical device to which the present invention is applied may further comprise a controlling part which, for a predetermined time within the period of exposure by the imaging part, executes control that makes the light emitting part emit light that irradiates the imaging area of the said imaging part.
  • An optical device to which the present invention is applied may further comprise a recognition part that recognizes the position of a medium by means of a coordinate system whose datum is the flat plate, by using at least one image that includes at least some part of the medium as a subject, among the images captured by a plurality of imaging parts.
  • The flat plate may have the function of a translucent screen, and the device may further comprise a projector that can project an image onto the translucent screen by means of visible light.
  • As in the embodiment described above, the light emitted from the light emitting part may be specified to be invisible light such as infrared light.
  • the light emitted from the light emitting part may be white visible light.
  • The optical device to which the present invention is applied may preferably further comprise: a controlling part which, for a predetermined time (for example, 1/1000 seconds), executes control that makes the light emitting part emit light; and a recognition part that recognizes the medium disposed on the flat plate by using an image captured by the imaging part.
  • the controlling part may preferably further execute control that synchronizes the timing of exposure of the imaging part with the timing of irradiation by the light emitting part, after clearing the buffer that holds image signals that have undergone photoelectric conversion in the imaging part.
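The control sequence described above — clear the imaging buffer, then synchronize the exposure with the light emission — can be sketched as follows (illustrative only). The `Imager` and `Emitter` interfaces are entirely hypothetical stand-ins for the device's imaging part and light emitting part.

```python
import time

EMIT_SECONDS = 1 / 1000  # example emission period from the text

def synchronized_capture(imager, emitter, emit_seconds=EMIT_SECONDS):
    """Clear stale image signals, then expose while the emitter is on."""
    imager.clear_buffer()         # discard signals from earlier photoelectric conversion
    imager.begin_exposure()       # exposure starts before the light turns on
    emitter.on()
    time.sleep(emit_seconds)      # light is emitted only within the exposure window
    emitter.off()
    return imager.end_exposure()  # the captured frame
```

Clearing the buffer first ensures the captured frame reflects only the brief, synchronized flash, which keeps ambient light from accumulating in the image.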
  • the series of processing described above may be executed by hardware, or may be executed by software. If the series of processing is to be executed by software, programs constituting such software are installed in a computer, or the like from a network or recording medium.
  • the computer may be a computer incorporated in dedicated hardware. Also, the computer may be a computer that can execute various types of functions by installing various types of programs, for example, a versatile personal computer.
  • the recording medium containing such programs not only consists of removable media to be distributed separately from the device itself in order to provide the player with programs, but also of recording media, and the like to be provided to the player in a state of being incorporated into the device itself in advance.
  • the removable media may, for example, consist of magnetic disks (including floppy disks), Blu-ray Disc (registered trademark), optical disks, or optical magnetic disks, and the like.
  • The optical disks consist of, for example, CD-ROMs (Compact Disc Read-Only Memories), DVDs (Digital Versatile Discs), and the like.
  • the optical magnetic disks consist of MDs (Mini-Disks), and the like.
  • The recording media to be provided to the player in a state of being incorporated into the device itself in advance consist of, for example, ROMs in which programs are recorded, hard disks, and the like.
  • the steps that describe programs to be recorded in recording media include not only the processing to be executed in a temporal sequence according to the order, but also processing to be executed in parallel or individually although it may not necessarily be executed in a temporal sequence.
  • the term "system” shall mean an overall device consisting of a plurality of devices and a plurality of means, and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Artificial Intelligence (AREA)
  • Image Input (AREA)

Abstract

The object of the invention is to make it easy to conduct a board game in combination with a mobile terminal or the like. The card arrangement panel 11 transmits to the front surface at least some of the first light among the light emitted from the back surface side, and a predetermined medium such as a card C1, C2, or the like can be disposed on the said front surface side. The light emitting part 25-1 or the like emits the said light from the back surface side of the card arrangement panel 11. The imaging part 23-n or the like captures an image of the said medium by exposing it to the said first light reflected by the said medium via the card arrangement panel 11. The regulating part 21-1 or the like regulates the direction of emission of the light emitted from the said light emitting part 25-1 so that incidence, onto the imaging part 23-n or the like, of the second light reflected by the card arrangement panel 11 among the light emitted by the light emitting part 25-1 is prohibited.
PCT/JP2015/001876 2014-11-17 2015-03-31 Dispositif optique WO2016079901A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2017545010A JP2017535902A (ja) 2014-11-17 2015-03-31 光学装置及び読取装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201462080733P 2014-11-17 2014-11-17
US62/080,733 2014-11-17

Publications (1)

Publication Number Publication Date
WO2016079901A1 true WO2016079901A1 (fr) 2016-05-26

Family

ID=53761457

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/001876 WO2016079901A1 (fr) 2014-11-17 2015-03-31 Dispositif optique

Country Status (2)

Country Link
JP (1) JP2017535902A (fr)
WO (1) WO2016079901A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002102529A (ja) 2000-10-03 2002-04-09 Sega Corp ゲーム装置
JP2005046649A (ja) 2001-02-02 2005-02-24 Sega Corp カードゲーム装置
JP2008501490A (ja) 2006-08-02 2008-01-24 健治 吉田 情報出力装置、媒体および情報入出力装置
US20080233360A1 (en) * 2006-12-27 2008-09-25 Keiko Sekine Transparent sheet having a pattern for infrared reflection
US20110163163A1 (en) * 2004-06-01 2011-07-07 Lumidigm, Inc. Multispectral barcode imaging
EP2369454A2 (fr) * 2008-11-25 2011-09-28 YOSHIDA, Kenji Système d'entrée/sortie manuscrites, feuille d'entrée manuscrite, système d'entrée d'informations, et feuille d'aide à l'entrée d'informations
EP2613298A1 (fr) * 2012-01-06 2013-07-10 Mortalcine OY Système de présentation pour événement de carte en temps réel

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11195082A (ja) * 1997-10-27 1999-07-21 Denso Corp 光学情報読取装置
JP2010066822A (ja) * 2008-09-08 2010-03-25 Dainippon Screen Mfg Co Ltd 検査装置

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002102529A (ja) 2000-10-03 2002-04-09 Sega Corp ゲーム装置
JP2005046649A (ja) 2001-02-02 2005-02-24 Sega Corp カードゲーム装置
US20110163163A1 (en) * 2004-06-01 2011-07-07 Lumidigm, Inc. Multispectral barcode imaging
JP2008501490A (ja) 2006-08-02 2008-01-24 健治 吉田 情報出力装置、媒体および情報入出力装置
US20080233360A1 (en) * 2006-12-27 2008-09-25 Keiko Sekine Transparent sheet having a pattern for infrared reflection
EP2369454A2 (fr) * 2008-11-25 2011-09-28 YOSHIDA, Kenji Système d'entrée/sortie manuscrites, feuille d'entrée manuscrite, système d'entrée d'informations, et feuille d'aide à l'entrée d'informations
EP2613298A1 (fr) * 2012-01-06 2013-07-10 Mortalcine OY Système de présentation pour événement de carte en temps réel

Also Published As

Publication number Publication date
JP2017535902A (ja) 2017-11-30

Similar Documents

Publication Publication Date Title
US8248666B2 (en) Information input/output device including a stage surface on which a reflecting medium including a printed dot pattern is disposed
US9210404B2 (en) Calibration and registration of camera arrays using a single circular grid optical target
US9557811B1 (en) Determining relative motion as input
KR102535177B1 (ko) 지문 인식 장치와 이를 포함한 표시장치와 모바일 단말기
US8791901B2 (en) Object tracking with projected reference patterns
CN105229412B (zh) 用于主动立体的强度调制光图案
JP2008501490A (ja) 情報出力装置、媒体および情報入出力装置
JP5138119B2 (ja) 物体検出装置および情報取得装置
KR20160146735A (ko) 대화형 디스플레이 스크린들에 대한 압력, 회전 및 스타일러스 기능
JP4268659B2 (ja) 情報出力装置
US20090091553A1 (en) Detecting touch on a surface via a scanning laser
CN110489580B (zh) 图像处理方法、装置、显示屏组件以及电子设备
KR20220047638A (ko) 카메라용 광원
CN101263445A (zh) 信息输出装置、媒体及信息输入/输出装置
US11068684B2 (en) Fingerprint authentication sensor module and fingerprint authentication device
WO2016079901A1 (fr) Dispositif optique
JP6098386B2 (ja) プロジェクタ
JP2015087776A (ja) 情報入力補助シート、ドットコード情報処理システムおよびキャリブレーション方法
KR101385263B1 (ko) 가상 키보드를 위한 시스템 및 방법
JP5429446B2 (ja) ゲーム装置及びそのパターンマッチング方法
JP2012019851A (ja) ストリームドットを用いた情報出力装置、媒体および情報入出力装置
JP2016057707A (ja) コード画像表示システム、コード画像表示方法、及びコード画像表示装置
AU2012211456A1 (en) Information Output Device, Medium, and information Input/Output Device
JP2008224328A (ja) 位置演算システム、位置演算装置および位置演算方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15744367

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017545010

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15744367

Country of ref document: EP

Kind code of ref document: A1