WO2017038982A1 - Projection System - Google Patents
Projection System
- Publication number
- WO2017038982A1 (PCT/JP2016/075841)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- projection
- image
- processing
- display
- projected
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/74—Projection arrangements for image reproduction, e.g. using eidophor
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/22—Setup operations, e.g. calibration, key configuration or button assignment
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/14—Details
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
- H04N9/3147—Multi-projection systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B2206/00—Systems for exchange of information between different pieces of apparatus, e.g. for exchanging trimming information, for photo finishing
Definitions
- the present invention relates to a projection system and the like.
- Conventionally, systems are known that project a projection image onto a projection target by means of a projection device.
- For example, there are the techniques disclosed in Patent Documents 1 and 2.
- One aim of the present invention is to provide a projection system or the like that can solve these problems by projecting a projection image reflecting positional relationship information between objects and the like, thereby further improving interactivity.
- One aspect of the present invention relates to a projection system including a projection unit that projects a projection image, and a processing unit that acquires position information of at least one of a first object and a second object based on detection information from a sensor unit and performs generation processing of the projection image. When the processing unit determines, based on the acquired position information, that the first object and the second object have a given relationship, it performs processing to change the content of at least one of a first projection image projected onto the first object and a second projection image projected onto the second object.
- In this aspect, position information of at least one of the first and second objects is acquired based on detection information from the sensor unit.
- When the first and second objects are determined to have the given relationship, processing is performed to change the content of at least one of the first and second projection images projected onto them.
- In this way, the contents of the first and second projection images can be changed by determining the relationship between the objects from their position information. A projection system can therefore be realized that enhances interactivity by projecting projection images reflecting the positional relationship between objects.
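- The processing flow described above can be sketched in Python. This is an illustrative sketch, not the patented implementation: `SensorReading`, `RELATION_DISTANCE`, and the content identifiers are all assumed names, and the "given relationship" is modeled simply as a distance threshold.

```python
import math
from dataclasses import dataclass

@dataclass
class SensorReading:
    first_pos: tuple   # position of the first object (e.g. a point on the play field)
    second_pos: tuple  # position of the second object (e.g. the user's hand)

# Assumed threshold: the objects are treated as having the "given
# relationship" when they come within this distance of each other.
RELATION_DISTANCE = 0.3

def has_given_relationship(r: SensorReading) -> bool:
    return math.dist(r.first_pos, r.second_pos) <= RELATION_DISTANCE

def generate_projection_content(r: SensorReading):
    """Return content identifiers for the first and second projection images."""
    if has_given_relationship(r):
        # Change the content of both projection images.
        return "field_highlighted", "creature_on_hand"
    return "field_default", "hand_default"
```

In a real system the reading would come from the sensor unit each frame and the identifiers would select actual rendered images.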
- In one aspect of the present invention, the processing unit may obtain the positional relationship between the second object and a virtual surface set at a given position with respect to the first object, and determine from it whether the first object and the second object have the given relationship.
- In this case, what is evaluated is the positional relationship between the second object and the virtual surface set at a given position with respect to the first object, rather than the first object itself.
- In one aspect of the present invention, when the first object and the second object have the given relationship, the processing unit may perform at least one of: processing for causing a display object to appear, processing for causing a display object to disappear, and processing for changing the image of a display object, in at least one of the first projection image projected onto the first object and the second projection image projected onto the second object.
- In one aspect of the present invention, when it is determined that the first object and the second object have the given relationship, the processing unit may generate the second projection image such that a display object that was to be projected onto the first object is instead projected following the second object.
- In this case, the display object that was to be projected onto the first object is displayed following the second object. Thus, when the first and second objects enter the given relationship, a projection image can be generated that looks as if the display object appeared at a location corresponding to the second object.
- In one aspect of the present invention, the processing unit may perform display control of the display object based on the relationship between the display object projected onto the second object and the second object.
- In one aspect of the present invention, when the first object and the second object have the given relationship, the processing unit may perform arithmetic processing based on a processing rule, and perform display control such that a display object determined by the arithmetic processing to be projected onto the second object is projected onto the second object.
- In one aspect of the present invention, when the relationship between the first object and the second object changes from the given relationship, the processing unit may perform display control of the display object according to that change.
- In one aspect of the present invention, when the relationship between the first object and the second object changes, the processing unit may perform arithmetic processing based on processing rules and perform display control such that a display object determined by the arithmetic processing to be projected onto the second object is projected onto the second object.
- Likewise, when the relationship between the first object and the second object changes, the processing unit may perform arithmetic processing based on processing rules and perform display control such that a display object determined by the arithmetic processing not to be projected onto the second object is projected onto the first object.
- In one aspect of the present invention, when it is determined that the second object and a third object have a given relationship, the processing unit may perform processing for displaying the display object on the third object.
- In one aspect of the present invention, the processing unit may determine the relative positional relationship between the first object and the second object based on detection information from the sensor unit, and determine from it whether the first object and the second object have the given relationship.
- The relative positional relationship may be, for example, the height of the second object relative to the first object.
- In one aspect of the present invention, the processing unit may perform recognition processing of a marker set on the second object based on detection information from the sensor unit, acquire position information of the second object based on the result of the recognition processing, and determine from the acquired position information whether the first object and the second object have the given relationship.
- In one aspect of the present invention, the processing unit may obtain, based on the marker, a second projection area onto which the second projection image is projected, and perform generation processing of the second projection image for that area.
- In this way, the second projection area is determined using the marker, and a second projection image for the area is generated; this realizes, for example, processing for changing the content of the second projection image.
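- A minimal sketch of deriving the second projection area from a marker, assuming the marker's centre and size have already been obtained by a recognizer; the `scale` factor is an illustrative tuning parameter:

```python
def projection_area_from_marker(marker_center, marker_size, scale=3.0):
    """Return an axis-aligned (x0, y0, x1, y1) second projection area.

    The area is centred on the marker and `scale` times its size, so the
    second projection image covers the whole held object (e.g. a container),
    not just the marker itself.
    """
    cx, cy = marker_center
    half = marker_size * scale / 2.0
    return (cx - half, cy - half, cx + half, cy + half)
```

A real implementation would also use the marker's orientation to rotate the area with the held object.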
- In one aspect of the present invention, the second object may be a part of the user's body or an object held by the user.
- One aspect of the present invention relates to a projection system including a projection unit that projects a projection image onto a play field serving as a first object, and a processing unit that performs generation processing of the projection image. The processing unit generates the projection image so that an image of a water surface is displayed on a virtual surface set at a given position with respect to the play field and an image of a living creature is displayed, and the projection unit projects the projection image displaying the image of the water surface and the image of the creature onto the play field. Based on position information of a second object, the processing unit performs processing to change the content of at least one of a first projection image projected onto the play field, which is the first object, and a second projection image projected onto the second object.
- In this aspect, a projection image displaying an image of a living creature is projected onto the play field while an image of the water surface is displayed on a virtual surface set at a given position with respect to the play field.
- The content of at least one of the first projection image projected onto the play field and the second projection image projected onto the second object changes according to the position information of the second object.
- The water surface thus appears at a position corresponding to the virtual surface above the play field, so that, for example, creatures appear to be present near the water surface.
- Since the contents of the first and second projection images can be changed according to the position information of the second object, a projection system with further improved interactivity can be realized.
- In one aspect of the present invention, the processing unit may perform, on at least one of the first projection image projected onto the play field and the second projection image projected onto the second object, at least one of: processing for causing a display object to appear, processing for causing a display object to disappear, and processing for changing the image of a display object.
- In this way, the user can be made to feel as if a display object appeared, disappeared, or changed its image, improving the interactivity of the projection system.
- In one aspect of the present invention, the processing unit may perform recognition processing of a marker set on the second object, acquire position information of the second object based on the result of the recognition processing, and, based on the acquired position information, perform processing to change the content of at least one of the first projection image and the second projection image.
- In this way, the position information of the second object can be acquired stably and appropriately, and the content of at least one of the first and second projection images can be changed accordingly.
- In one aspect of the present invention, when it is determined, based on the position information of the second object, that the play field serving as the first object and the second object have a given relationship, the processing unit may perform processing to change the content of at least one of the first projection image and the second projection image.
- In this way, the content of at least one of the first and second projection images can be changed, improving the interactivity of the projection system.
- the processing unit may obtain the position information of the second object based on detection information of a sensor unit.
- the projection unit may project the projection image for displaying the image of the water surface and the image of the living thing onto the playfield by projection mapping.
- By using projection mapping, a projection image in which the influence of the shape of the play field is reduced can be projected onto the play field.
- the playfield may be a sandbox.
- the processing unit may generate the projection image on which the water surface and the living thing are animated.
- the projection unit may be installed above the playfield.
- the projection unit can be installed, for example, at an inconspicuous place above the play field, and the projection image can be projected onto the play field.
- FIGS. 7(A) and 7(B) are explanatory diagrams of a method of setting markers on an object and acquiring position information and the like.
- Explanatory diagram of a method of changing a display object according to a marker pattern.
- FIGS. 9(A) and 9(B) are explanatory diagrams of a method of projecting a projection image onto a container.
- Explanatory diagram of a method of acquiring position information and the like using a bait item.
- Explanatory diagram of presentation (effect) processing.
- Explanatory diagram of a modification of this embodiment.
- Explanatory diagram of correction processing of a projection image.
- Flowcharts of detailed processing examples of the embodiment (three figures).
- FIG. 1 shows an example of the overall configuration of a projection system of the present embodiment.
- the projection system of the present embodiment includes projection units 40 and 42 and a processing unit 90 (processing unit in a broad sense).
- the sensor unit 50 can be further included.
- the configuration of the projection system of the present embodiment is not limited to that shown in FIG. 1, and various modifications may be made such as omitting some of the components (each part) or adding other components.
- The play field 10 is a field on which the user (player) enjoys attractions and the like; in FIG. 1 it is a field covered with sand.
- As the play field 10, various fields can be assumed, such as a field of grass or flowers, a field of soil, a field for playing sports, or a field on which a course for a competition game is drawn.
- the projection units 40 and 42 project a projection image onto the play field 10 (first object in a broad sense) or the like, and can be realized by a so-called projector.
- the projection units 40 and 42 are installed above the playfield 10 (for example, a ceiling or the like), and project a projection image onto the playfield 10 below from above.
- the number of projectors may be one, or three or more.
- A so-called rear projection method may also be used, in which the floor surface serves as a screen and a projector (projection unit) is disposed under the floor. Alternatively, the play field may be configured with a display.
- the sensor unit 50 detects position information and the like of an object.
- the sensor unit 50 is installed above the playfield 10 (for example, a ceiling), and detects, for example, height information (height information in each area) of the playfield 10 which is an object as position information.
- the sensor unit 50 can be realized by, for example, a normal camera that captures an image, a depth sensor (distance measuring sensor), or the like.
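- As a hedged sketch of how per-area height information could be derived from an overhead depth sensor: the sensor reports the distance down to each surface point, and height above the (empty) field follows by subtraction. `SENSOR_TO_FLOOR` is an assumed calibration constant, not a value from the patent.

```python
# Assumed calibration: distance from the overhead sensor to the empty
# play-field surface, in metres.
SENSOR_TO_FLOOR = 2.0

def height_map(depth_grid):
    """Convert a 2D grid of depth readings (sensor-to-surface distance)
    into heights above the play-field floor, per area."""
    return [[SENSOR_TO_FLOOR - d for d in row] for row in depth_grid]
```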
- The bucket 60 is for stocking creatures such as caught fish, as described later, and a display unit 62 (for example, the display of a tablet PC) for showing the captured creatures is provided on its upper surface.
- the processing device 90 functions as a processing unit of the present embodiment, and performs various processes such as a process of generating a projection image.
- the processing device 90 can be realized by, for example, various information processing devices such as a desktop PC, a notebook PC, or a tablet PC.
- FIG. 2 shows a detailed configuration example of the projection system of the present embodiment.
- The processing device 90 of FIG. 1 is realized by the processing unit 100, the I/F unit 120, the storage unit 150, and the like of FIG. 2.
- the processing unit 100 performs various determination processes, an image generation process, and the like based on detection information and the like from the sensor unit 50.
- the processing unit 100 performs various processing with the storage unit 150 as a work area.
- The function of the processing unit 100 can be realized by hardware such as various processors (CPU, GPU, etc.) or an ASIC (gate array, etc.), or by a program.
- the I / F (interface) unit 120 performs interface processing with an external device.
- the I / F unit 120 performs interface processing with the projection units 40 and 42, the sensor unit 50, and the display unit 62.
- information of a projection image generated by the processing unit 100 is output to the projection units 40 and 42 via the I / F unit 120.
- Detection information from the sensor unit 50 is input to the processing unit 100 via the I / F unit 120.
- Information of an image to be displayed on the display unit 62 is output to the display unit 62 via the I / F unit 120.
- the storage unit 150 is a work area of the processing unit 100 or the like, and its function can be realized by a RAM, an SSD, an HDD, or the like.
- The storage unit 150 includes a display object information storage unit 152 that stores information (image information and the like) of display objects, a marker pattern storage unit 154 that stores marker pattern information, and a height information storage unit 156 that stores height information (position information) of objects.
- the processing unit 100 includes a position information acquisition unit 102, a marker recognition unit 104, a position relationship determination unit 106, a capture determination unit 108, a release determination unit 109, and an image generation processing unit 110.
- the image generation processing unit 110 also includes a distortion correction unit 112. Note that various modifications may be made such as omitting some of these components (each part) or adding other components.
- the processing unit 100 acquires position information of at least one of the first and second objects based on the detection information of the sensor unit 50.
- Specifically, the position information acquisition unit 102 acquires position information (for example, height information) of the object based on the detection information from the sensor unit 50.
- The processing unit 100 (image generation processing unit 110) generates a projection image, and the projection units 40 and 42 project the generated projection image.
- For example, a specific creature is placed at a deep position in the terrain, and a portion of the terrain raised high enough to be judged above the virtual water surface (virtual surface) is rendered as land, without water being displayed, and so on.
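- The terrain classification described above can be sketched as a simple per-cell decision. `WATER_LEVEL` and `DEEP_LEVEL` are assumed tuning constants, not values from the patent.

```python
WATER_LEVEL = 0.10  # assumed height of the virtual water surface above the floor
DEEP_LEVEL = 0.03   # assumed: cells below this height count as deep water

def classify_cell(height):
    """Classify one terrain cell by its height above the floor."""
    if height >= WATER_LEVEL:
        return "land"          # raised above the virtual water surface
    if height <= DEEP_LEVEL:
        return "deep_water"    # candidate location for specific creatures
    return "shallow_water"
```

A renderer would then draw land texture, shallow water, or deep water per cell, placing the specific creatures only in `deep_water` cells.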
- The distortion correction unit 112 may perform distortion correction processing on the projection image. For example, based on position information of the object and the like, distortion correction processing is performed to reduce distortion when the projection image is projected onto the object. However, since distortion correction also depends on the viewpoint position of the observer, it may be better not to perform it when the observer's viewpoint position is difficult to obtain or when there are a plurality of observers. Whether or not to perform distortion correction may be decided appropriately according to the content and the situation of the observers.
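- One common way to realize such distortion (keystone) correction, sketched here under the assumption of a planar projection surface, is to solve for the 3x3 homography mapping desired image corners to measured surface corners and pre-warp by it. This pure-NumPy four-point DLT is illustrative; a production system would fit more points by least squares.

```python
import numpy as np

def homography(src, dst):
    """src, dst: four (x, y) correspondences; returns 3x3 H with H[2,2] = 1."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # Two rows of the DLT system per point correspondence.
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def apply_h(H, pt):
    """Apply homography H to a 2D point and dehomogenize."""
    v = H @ np.array([pt[0], pt[1], 1.0])
    return v[0] / v[2], v[1] / v[2]
```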
- the processing unit 100 determines whether the first object and the second object have a given relationship.
- Specifically, the positional relationship determination unit 106 performs this determination processing. When it is determined that the first and second objects have a given relationship, processing is performed to change the content of at least one of the first projection image projected onto the first object and the second projection image projected onto the second object. For example, the content of one of the first and second projection images, or of both, is changed.
- the image generation processing unit 110 performs this image change processing. Then, the first and second projection images after the change processing are projected by the projection units 40 and 42 onto the first and second objects.
- the first object is, for example, the play field 10 of FIG.
- The second object is, for example, a part of the user's body or an object held by the user.
- The part of the user is, for example, the user's hand (palm).
- The held object is, for example, a container or the like that the user holds in a hand: an object that the user can grasp.
- The part of the user may also be the user's face, chest, stomach, waist, or foot.
- The held object may be an object other than a container, or an object held by a part other than the user's hand.
- The first object is not limited to the play field 10; it may be any projection target onto which a main image serving as a background or the like is projected.
- Likewise, the second object is not limited to a part of the user's body or a held object.
- Specifically, the processing unit 100 obtains the positional relationship between the second object and a virtual surface set at a given position (height) with respect to the first object, and determines from it whether the first object and the second object have a given relationship. The content of at least one of the first and second projection images projected onto the first and second objects is then changed.
- For example, a virtual surface corresponding to the projection surface is set at a position offset (upward) from the projection surface of the first object.
- The virtual surface is, for example, a surface set virtually in correspondence with the projection surface of the play field 10. It is then determined whether this virtual surface, rather than the first object (its projection surface) itself, and the second object have a given positional relationship. For example, it is determined whether the second object, which is a part of the user's body or a held object, has a given relationship with the virtual surface (for example, a virtual sea surface or virtual water surface); specifically, whether the second object is below the virtual surface. Accordingly, the content of the second projection image (for example, an image shown on a hand or a container) or of the first projection image (for example, a creature on the first object) is changed.
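- The virtual-surface test above reduces to a height comparison. A minimal sketch, assuming heights measured from the floor and an illustrative offset for the virtual water surface:

```python
# Assumed: the virtual water surface sits this far above the field surface.
VIRTUAL_SURFACE_OFFSET = 0.15

def below_virtual_surface(field_height, second_object_height):
    """True when the second object (hand/container) dips below the virtual
    surface set at a given position above the first object."""
    virtual_surface = field_height + VIRTUAL_SURFACE_OFFSET
    return second_object_height < virtual_surface
```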
- The processing unit 100 performs, on at least one of the first projection image projected onto the first object and the second projection image projected onto the second object, at least one of: processing for causing a display object to appear, processing for causing a display object to disappear, and processing for changing the image of the display object. For example, the processing unit 100 causes a display object such as a creature (described later) to appear in or disappear from the first or second projection image, or changes its image (display pattern, texture, color, effect, and the like).
- In this way, when it is determined that the first object and the second object have a given relationship, processing is realized that changes the content of at least one of the first projection image projected onto the first object and the second projection image projected onto the second object.
- information (image information, object information, attribute information, etc.) of the display object is stored in the display object information storage unit 152.
- When it is determined that the first object and the second object have a given relationship, generation processing of the second projection image is performed so that a display object that was to be projected onto the first object is projected onto the second object (projected following the second object).
- For example, a display object such as a sea creature is normally projected onto the play field 10, which is the first object.
- When the first object such as the play field 10 and the second object, such as a part of the user's body (a hand) or an object held by the user, come to have a given relationship, the projection image generation processing displays the display object such as the sea creature in consideration of the position and shape not only of the first object but also of the second object.
- When the processing unit 100 determines that the first object and the second object have a given relationship, it determines that the display object to be projected onto the first object has been captured by the second object. This determination processing is performed by the capture determination unit 108 (hit check unit). The processing unit 100 (image generation processing unit 110) then generates the second projection image so that the display object determined to have been captured is projected onto the second object. For example, when a display object such as a sea creature is determined to have been captured by a second object such as a hand or a container, the captured creature is projected onto the second object.
- the processing unit 100 generates the first projection image so that a display object determined not to be captured is projected onto the first object. For example, when a display object such as a sea creature is not captured by the second object, the display object that failed to be captured is projected onto the first object such as the play field 10.
- the processing unit 100 also performs display control of the display object based on the relationship between the display object projected onto the second object and the second object.
- when it is determined that the fish 14 is captured by the hand 20, which is the user's part, or the container 22, which is the grasped object, the fish 14 as the display object is displayed on the hand 20 or the container 22, which are second objects.
- when the hand 20 or the container 22 enters below the virtual sea surface 12 of FIG. 4 described later, and it is determined that the play field 10 as the first object and the hand 20 or the container 22 as the second object have a given relationship, processing for projecting the fish 14 onto the hand 20 or the container 22 is performed.
- the processing unit 100 performs display control for expressing a situation in which the fish 14 as a display object wriggles in the hand 20 or bumps against the edge of the container 22. For example, hit check processing is performed between the fish 14 and the hand 20 or the container 22, and display control is performed to control the movement of the fish 14 based on the result of the hit check processing. In this way, it is possible to give the user a sense of virtual reality, as if a real live fish 14 were moving on the hand 20 or swimming in the container 22.
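The hit check between the fish and the rim of the container, and the resulting movement control, could be sketched as follows. This is a minimal Python illustration modeling both as circles in the XY plane; all function names, radii, and values are illustrative assumptions, not part of the embodiment:

```python
import math

def hit_check_circle(fish_pos, fish_radius, rim_center, rim_radius):
    """Return True when the fish touches the container's rim from inside."""
    d = math.dist(fish_pos, rim_center)
    return d + fish_radius >= rim_radius

def step_fish(fish_pos, velocity, rim_center, rim_radius, fish_radius=0.5):
    """Move the fish one step; reverse its velocity when it hits the rim."""
    x, y = fish_pos[0] + velocity[0], fish_pos[1] + velocity[1]
    if hit_check_circle((x, y), fish_radius, rim_center, rim_radius):
        # Bounce back toward the interior of the container.
        velocity = (-velocity[0], -velocity[1])
        x, y = fish_pos[0] + velocity[0], fish_pos[1] + velocity[1]
    return (x, y), velocity
```

Calling `step_fish` once per frame produces the "fish bumps the edge and turns back" behavior described above.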
- the processing unit 100 performs arithmetic processing based on a processing rule, and performs display control of the display object so that a display object determined, as a result of the arithmetic processing, to be projected onto the second object is projected onto the second object.
- when it is determined that the play field 10 as the first object and the hand 20 or the container 22 as the second object have a given relationship (for example, when the hand 20 or the container 22 enters below the virtual sea surface 12), the arithmetic processing based on the processing rule is performed.
- fish within a predetermined range (within a predetermined radius) of the hand 20 or the container 22 (its center position) are searched for, and arithmetic processing (game processing) for calling the fish toward the hand 20 or the container 22 is performed.
- This arithmetic processing is processing based on a predetermined processing rule (algorithm), and, for example, search processing based on a predetermined algorithm (program), movement control processing, hit check processing and the like can be assumed.
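The search and movement control processing mentioned above could be sketched as follows. This is a hedged Python illustration; the radius, step size, and all names are assumptions rather than values from the embodiment:

```python
import math

def find_fish_in_range(fish_positions, center, radius):
    """Return the indices of fish inside the predetermined range."""
    return [i for i, p in enumerate(fish_positions)
            if math.dist(p, center) <= radius]

def call_fish(fish_positions, center, radius, step=1.0):
    """Move each in-range fish one step toward the hand/container center."""
    called = find_fish_in_range(fish_positions, center, radius)
    result = list(fish_positions)
    for i in called:
        x, y = result[i]
        d = math.dist((x, y), center)
        if d > 0:
            x += (center[0] - x) / d * min(step, d)
            y += (center[1] - y) / d * min(step, d)
        result[i] = (x, y)
    return result, called
```

Fish outside the radius are left untouched, corresponding to the case where fish are not called to the hand.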
- display control of the fish as the display object is performed so that a fish determined to be projected onto the hand 20 or the container 22 as the second object is actually projected onto the hand 20 or the container 22.
- display control such as moving a fish to the hand 20 or the container 22 is performed.
- various processes can be assumed as arithmetic processing based on the processing rule in this case.
- arithmetic processing is performed such as not calling fish to the hand 20 or reducing the number of approaching fish. In this way, display control of the display object can be performed based on the result of the arithmetic processing which is the game processing.
- the processing unit 100 controls the display of the display object according to a change in the relationship between the first object and the second object.
- the processing unit 100 performs display control of a display object such as a fish according to a change in this relationship (a change in which the hand is moved upward from below the virtual sea surface 12). For example, when there is a change in such a relationship, it is determined that the fish has been captured, and display control is performed to express the state in which the fish is captured by the hand 20. For example, display control is performed such that a fish is displayed (projected) on the hand 20.
- display control is performed such that the fish on the hand 20 bounces or shines.
- the display control of the display object is, for example, processing of moving the display object, changing the motion of the display object, or changing properties such as the color, brightness, or texture of the image of the display object.
- the processing unit 100 performs arithmetic processing based on the processing rule, and performs display control of the display object so that a display object determined, as a result of the arithmetic processing, to be projected onto the second object is projected onto the second object.
- display control is performed to express how a fish is captured by the user's hand 20.
- the processing unit 100 performs display control of the display object such that the display object determined not to be projected on the second object as a result of the arithmetic processing is projected on the first object.
- display control is performed to express a situation in which a fish that fails to be captured escapes toward the play field 10, which is the first object.
- display control such as movement control of the fish is performed so that the fish is projected onto the hand 20 or the container 22.
- display control such as movement control of the fish is performed so that the fish escapes from the hand 20 or the container 22 and is projected onto the playfield 10.
- the processing unit 100 performs a process of displaying the display object on the third object (a process of displaying it at the location of the third object).
- this is a process of displaying the display object on the display unit of the third object (for example, the display unit 62 of FIG. 1) or projecting the display object onto the third object.
- the release determination unit 109 performs this determination process. Then, a process of displaying the released display object on the third object (a process of displaying it at the location of the third object) is performed.
- suppose that a second object such as a hand or a container captures a display object such as a sea creature, and the second object and a third object such as the bucket 60 of FIG. 1 come into a given positional relationship. For example, it is assumed that the second object, such as the user's hand or a container, approaches the third object such as the bucket 60.
- the processing unit 100 determines that the captured creature or the like has been released. Then, the processing unit 100 (image generation processing unit 110) generates, as a display image of the display unit 62 of the bucket 60, an image on which the captured creature or the like is displayed. In this way, captured creatures and the like are released, and an image can be generated that looks as if it had moved to the bucket 60. In this case, a process of projecting a display object such as a captured creature on the third object such as the bucket 60 may be performed.
- the processing unit 100 determines the relative positional relationship between the first object and the second object based on the detection information of the sensor unit 50, and determines whether the first object and the second object have come into a given relationship. For example, the relative positional relationship in the height direction and the lateral direction is determined, and when it is determined that the given relationship holds, the content of at least one of the first and second projection images is changed.
- the relative positional relationship is a relationship of, for example, the height of the second object with respect to the first object. For example, based on the detection information of the sensor unit 50, the relative positional relationship between the first and second objects in the height direction is obtained. For example, it is determined whether the second object is above or below the first object or a virtual plane set for the first object. Then, based on the determination result, the content of at least one of the first and second projection images for the first and second objects is changed.
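The height-direction determination described above could be sketched as follows. This Python illustration compares the second object's height with a virtual plane (such as the virtual sea surface) set above the first object; the plane height and all names are assumptions:

```python
SEA_SURFACE_HEIGHT = 30.0  # assumed height of the virtual plane (cm)

def relation_to_virtual_plane(object_height, plane_height=SEA_SURFACE_HEIGHT):
    """Classify the second object as above or below the virtual plane."""
    return "below" if object_height < plane_height else "above"

def choose_projection_change(object_height):
    """Decide which projection image content to change, per the text above."""
    if relation_to_virtual_plane(object_height) == "below":
        # The hand is "in the water": project seawater onto it, attract fish.
        return {"second_image": "seawater", "first_image": "fish_approach"}
    # The hand is out of the water: nothing special is projected onto it.
    return {"second_image": "none", "first_image": "unchanged"}
```

The returned labels stand in for the actual image-generation steps of the embodiment.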
- the processing unit 100 also performs recognition processing of the marker set on the second object based on the detection information of the sensor unit 50. Based on the result of the recognition processing, position information of the second object is acquired, and based on the acquired position information, it is determined whether the first object and the second object have a given relationship. For example, a captured image is acquired by imaging the marker set on the second object with the sensor unit 50, and image recognition processing of the captured image is performed to acquire the position information of the second object.
- the marker recognition unit 104 performs these marker recognition processes.
- the marker is placed and set on the second object.
- a marker is attached to the part of the user, or an object to be a marker is gripped by the part of the user.
- a feature amount of the grasped object itself, such as its color or shape, may also serve as the marker.
- the marker is recognized by the sensor unit 50, and the position information of the second object is obtained based on the recognition result.
- a marker is image-recognized from a captured image, position information (height information, etc.) of the marker is obtained based on the result of the image recognition, and it is determined whether the first and second objects have a given relationship.
- the processing unit 100 obtains, based on the marker, a second projection area onto which the second projection image is projected. Then, a process of generating the second projection image to be projected onto the second projection area is performed. For example, based on the result of the marker recognition processing, the position (address) of the second projection area on the VRAM is determined, and the second projection image is generated in that area. Processing for changing the content of the second projection image is then performed, for example.
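Determining the second projection area from the marker could be sketched as follows. This Python illustration maps a recognized marker position to a clamped rectangle in the projection frame buffer; the resolutions and area size are assumptions, not values from the embodiment:

```python
FRAME_W, FRAME_H = 1920, 1080   # assumed projector frame-buffer resolution
AREA_W, AREA_H = 200, 200       # assumed size of the second projection area

def second_projection_area(marker_uv):
    """Map a normalized marker position (u, v in 0..1) to an (x, y, w, h)
    rectangle centered on the marker, clamped inside the frame buffer."""
    u, v = marker_uv
    x = int(u * FRAME_W) - AREA_W // 2
    y = int(v * FRAME_H) - AREA_H // 2
    x = max(0, min(x, FRAME_W - AREA_W))
    y = max(0, min(y, FRAME_H - AREA_H))
    return (x, y, AREA_W, AREA_H)
```

The second projection image would then be drawn only inside the returned rectangle.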
- the processing unit 100 also generates a projection image for displaying an image of the water surface on a virtual surface set at a given position with respect to the play field, which is the first object, and for displaying an image of a creature.
- the living being may be displayed below the virtual surface, may be displayed above the virtual surface, or may be displayed at the boundary of the virtual surface.
- the projection units 40 and 42 project, onto the play field, the projection image for displaying the image of the water surface and the image of the creature.
- the processing unit 100 performs processing of changing, based on the position information of the second object, the content of at least one of the first projection image projected onto the play field and the second projection image projected onto the second object. For example, processing of changing the content of one of the first and second projection images, or of both, is performed. The first and second projection images after the change processing are then projected by the projection units 40 and 42 onto the first and second objects.
- the processing unit 100 performs at least one of a process of making a display object appear or disappear in at least one of the first projection image projected onto the play field and the second projection image projected onto the second object, and a process of changing the image of the display object. In this way, the display object appears or disappears, or its image changes, in accordance with the position information of the second object (for example, the user's part or a grasped object).
- the processing unit 100 performs recognition processing of the marker set on the second object, and acquires position information of the second object based on the result of the recognition processing. Then, based on the acquired position information, processing is performed to change the content of at least one of the first projection image and the second projection image. In this way, the position information of the second object can be obtained using the marker set on the second object, and the contents of the first and second projection images can be changed accordingly.
- it is desirable that, when it is determined that the first and second objects have come into the given relationship, the processing unit 100 change the content of at least one of the first and second projection images. Further, it is desirable that the processing unit 100 acquire the position information of the second object based on the detection information of the sensor unit 50.
- the projection units 40 and 42 project, by projection mapping, the projection image for displaying the image of the water surface and the image of the creature onto the play field.
- a projection image on which distortion correction or the like has been performed is projected.
- the playfield is, for example, a sandbox as described later.
- the processing unit 100 also generates a projection image on which the water surface and the living being are displayed in animation. In this way, it is possible to display an image that looks like a creature moving in real time under water.
- the projection units 40 and 42 are installed, for example, above the play field. This makes it possible to project the projection image for displaying the water surface and the creatures onto the play field from above.
- a play field 10 as shown in FIG. 1 is installed at the facility of the attraction.
- the playfield 10 is a sandbox where children can play sand.
- an image in which seawater or sea creatures are displayed is projected onto the play field 10, which is a sandbox, by projection mapping using the projection units 40 and 42.
- when the hand that has captured the creature is moved to the location of the bucket 60 as shown in FIG. 3B, the captured creature is displayed on the display unit 62.
- a tablet PC is installed at the top of the bucket 60, and the captured creature is displayed on the display unit 62 of the tablet PC.
- the attraction realized by the method of this embodiment is not limited to the attraction of FIG. 1.
- the present invention can be applied to an attraction that expresses a field other than a sandbox or the sea, or an attraction that realizes a play different from capture of sea creatures.
- the method of the present embodiment can be applied not only to a large attraction as shown in FIG. 1 but also to, for example, a business game apparatus provided with a play field in the device.
- the parent is spared the trouble of traveling far to the sea and need not worry about the child's safety.
- the child can catch small sea creatures with their own hands, instead of giving up because the creatures cannot be caught.
- a play field 10, an indoor sandbox that can be visited easily, is prepared, and the environment of a real southern beach, such as the sound of waves and the cries of birds, is realistically reproduced. The sea level and waves of a shallow beach that ebb and flow like the real thing are reproduced by projection mapping onto the sand. For example, the situation where the tide comes in and the whole area becomes water surface, or the tide goes out and the sand flat appears, is reproduced at times. In addition, water splashes and ripples are generated interactively on the water surface where a child's foot touches it.
- projection mapping is used to animate sea water and captured creatures.
- the child can then move the captured creature into the bucket 60 for viewing.
- captured creatures can be transferred to a smartphone and brought back. That is, by displaying the captured creature on the display unit 62 of the bucket 60 or the display unit of the smartphone, the child is made to feel as if it really caught the creature.
- creatures that have become attached to the child can be called again when the child visits the attraction facility another time.
- the creatures that have become attached realize communication elements, such as swimming around the child or following behind the child.
- an image is projected onto the play field 10 which is a sandbox by projection mapping so that a child can catch sea creatures.
- an announcement is made such as "Parent and child, work together to catch them all within the time limit!" Then, if a glowing ball or the like imitating bait is thrown, the fish gather. The parent drives the fish by stepping into the shallows, and the child catches the fish. In addition, ripples are produced on the beach, and after a wave recedes, a large number of shells and fish are displayed. Children can dig in the sand using a rake or scoop and look for treasures hidden in the sand.
- position information of at least one of the first and second objects is acquired based on the detection information of the sensor unit 50. Then, based on the acquired position information, it is determined whether the first and second objects have a given relationship. If it is determined that the given relationship holds, the content of at least one of the first projection image projected onto the first object and the second projection image projected onto the second object is changed. For example, if the first projection plane corresponding to the first object and the second projection plane corresponding to the second object have a given relationship, the content of the first projection image on the first projection plane or the second projection image on the second projection plane is changed.
- projection images for expressing the virtual sea surface 12 of a virtual coast and the fish 14 and 15 are projected onto the play field 10. Then, when the user (such as a child) puts a hand 20 below the virtual sea surface 12 (virtual surface in a broad sense) expressed by the projection mapping, the fish 14 and 15 come close. In this case, for example, when the user puts a bait item with a marker on the hand 20 and places the hand 20 under the virtual sea surface 12, the fish 14 and 15 may come close to the bait item.
- color information may also be used as a criterion for the capture determination, for example by treating, within the predetermined range used for the determination, only the region showing the color of the hand as the effective range.
- after the user captures a fish, when the user's hand 20 approaches the location of the bucket 60 (for example, a place that can be recognized by an image marker or the like) and the bucket 60 (third object in a broad sense) and the hand 20 (second object) come into a given positional relationship, the determination of the movement of the fish to the bucket 60 is established. This determination can be realized, for example, by performing an intersection determination between a given range set at the position of the bucket 60 and a given range set at the position of the hand 20. Then, when it is determined that the fish has moved to the bucket 60, an image of the fish is displayed on the display section 62 (the display of the tablet PC) of the bucket 60 (bucket item). This allows the captured fish to appear as if they had been transferred to the bucket 60.
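The intersection determination for the move-to-bucket judgment could be sketched as follows. This Python illustration tests two circular ranges for overlap; the radii and names are assumptions, not values from the embodiment:

```python
import math

def ranges_intersect(center_a, radius_a, center_b, radius_b):
    """True when the two circular ranges overlap."""
    return math.dist(center_a, center_b) <= radius_a + radius_b

def fish_moved_to_bucket(hand_pos, bucket_pos,
                         hand_range=10.0, bucket_range=15.0):
    """The move-to-bucket determination is established on intersection."""
    return ranges_intersect(hand_pos, hand_range, bucket_pos, bucket_range)
```

When this returns true, the fish image would be handed off to the bucket's display unit.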
- the first object is the play field 10 and the second object is the hand of the user is mainly described as an example, but the present embodiment is not limited to this.
- the first object may be an object other than the play field 10, and the second object may be, for example, a part other than the user's hand, or an object grasped by the user (a container or the like).
- the sensor unit 50 in FIG. 4 includes a normal camera 52 (imaging unit) that captures a color image (RGB image), and a depth sensor 54 (ranging sensor) that detects depth information.
- the depth sensor 54 can adopt, for example, a TOF (Time Of Flight) method of obtaining depth information from the time when the projected infrared light is reflected from the object and returns.
- the depth sensor 54 can be realized, for example, by an infrared projector that emits a pulse-modulated infrared light, and an infrared camera that detects an infrared light reflected back from the object.
- a light coding method may be employed in which the projected infrared pattern is read and the depth information is obtained from the distortion of the pattern.
- the depth sensor 54 can be realized by an infrared projector that emits an infrared pattern and an infrared camera that reads the projected pattern.
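The TOF principle mentioned above, in which depth follows from the round-trip time of the projected infrared light, amounts to a simple calculation. The following Python sketch is illustrative only; the timing value is not from the document:

```python
C = 299_792_458.0  # speed of light (m/s)

def tof_depth(round_trip_seconds):
    """Depth = (speed of light x round-trip time) / 2, since the light
    travels to the object and back."""
    return C * round_trip_seconds / 2.0
```

For example, a round trip of about 20 nanoseconds corresponds to a depth of roughly 3 meters.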
- the sensor unit 50 (depth sensor 54) is used to detect height information of the play field 10 and the like. Specifically, as shown in FIG. 5, the height information h11, h12, h13, ... in each divided area (for example, an area of 1 cm x 1 cm) is acquired as a height information map (depth information map) based on the detection information (depth information) from the sensor unit 50. The acquired height information is stored as a height information map in the height information storage unit 156 of FIG. 2.
- a plane in a plan view seen from the sensor unit 50 is taken as an XY plane defined by the X axis and the Y axis, and an axis orthogonal to the XY plane is taken as the Z axis.
- the XY plane is a plane parallel to the first projection plane corresponding to the play field 10 (a plane as an average value although it is actually uneven).
- the Z axis is an axis along the direction in which the sensor unit 50 (depth sensor 54) faces.
- the height information in FIG. 5 is height information (depth information) in the Z-axis direction.
- it is height information in the Z-axis direction based on the position of the play field 10 (first projection plane, first object).
- the Z-axis direction is a direction (upward in the drawing) from the play field 10 toward the sensor unit 50 provided above the play field 10.
- the height information map of FIG. 5 can be obtained by performing processing for converting the depth information from the depth sensor 54 into height information in the Z-axis direction.
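The conversion into the height information map of FIG. 5 could be sketched as follows. This Python illustration subtracts each per-area depth (measured down from the overhead sensor) from an assumed sensor height to obtain a height above the play-field reference plane; the calibration value is an assumption:

```python
SENSOR_HEIGHT = 200.0  # assumed distance (cm) from depth sensor to field base

def depth_map_to_height_map(depth_map):
    """Convert per-area depth (sensor to surface) into per-area height
    above the play-field reference plane."""
    return [[SENSOR_HEIGHT - d for d in row] for row in depth_map]
```

Each cell of the result corresponds to one divided area (h11, h12, ...) of the map in FIG. 5.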
- when the hand 20 is positioned above the play field 10 as shown in FIG. 4, information of the hand 20 (the second object in a broad sense) is stored in the divided area corresponding to the position of the hand 20 in the height information map of FIG. 5. Therefore, by using the height information map of FIG. 5, not only the height information at each location of the play field 10 but also the height information of the hand 20 can be acquired.
- a projection image is generated and projected based on this height information (depth information). For example, a projection image on which seawater and sea creatures are displayed is generated and projected onto the play field 10 and the like.
- the same processing as generation processing of a normal three-dimensional image is performed.
- the object space processing for arranging and setting objects corresponding to the fishes 14 and 15 is performed.
- the virtual sea surface 12 is set at a given height from the projection surface of the play field 10, and the arrangement setting processing of the object space is performed so that the image of the sea surface is displayed on the virtual sea surface 12.
- an image seen from a given viewpoint in the object space is generated as a projection image.
- it is desirable to set the "given viewpoint" so as to reproduce as closely as possible the viewpoint of the user who is focusing on the area; however, since this is difficult when there are many users, the viewpoint may instead be set as a special viewpoint that renders with parallel projection from directly above.
- height information (height in the Z-axis direction) of the hand 20 can also be detected based on detection information (depth information) from the sensor unit 50 (depth sensor 54). That is, as described above, in the height information map of FIG. 5, the height information of the hand 20 is stored in the divided area corresponding to the position of the hand 20 (its position on the XY plane). In this case, the position of the hand 20 can be specified, for example, by detecting an area of the color of the hand 20 (a color closer to skin color than other areas) from a color image captured by the camera 52 of the sensor unit 50. Alternatively, the position may be specified by recognition processing of a marker set at the position of the hand 20, as described later.
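The skin-color-based localization of the hand could be sketched as follows. This Python illustration finds pixels close to a reference skin color and returns their centroid; the RGB reference, the threshold, and all names are assumptions, not values from the embodiment:

```python
SKIN_RGB = (224, 172, 105)   # assumed reference skin color
THRESHOLD = 60.0             # assumed color-distance threshold

def color_distance(a, b):
    """Euclidean distance between two RGB triples."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def find_hand_position(image):
    """Return the centroid (x, y) of skin-colored pixels, or None.
    `image` is a list of rows, each a list of (r, g, b) tuples."""
    hits = [(x, y)
            for y, row in enumerate(image)
            for x, rgb in enumerate(row)
            if color_distance(rgb, SKIN_RGB) < THRESHOLD]
    if not hits:
        return None
    n = len(hits)
    return (sum(x for x, _ in hits) / n, sum(y for _, y in hits) / n)
```

The returned (x, y) would index the divided area of the height map from which the hand's height is read.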
- it is determined whether the height of the hand 20 is lower than the height (in the Z-axis direction) of the virtual sea surface 12 (virtual surface). When the height of the hand 20 is lower than the virtual sea surface 12, it is determined that the hand 20 has entered the water, and an image is generated such that the fish 14 and 15 move toward the hand 20.
- when it is determined that the hand 20 has come out of the water, the fish capture determination is performed, that is, a determination as to whether or not a fish has been captured. Specifically, fish present in a predetermined range area (an area in the XY plane) centered on the position of the hand 20 (its position in the XY plane) at that time are determined to be captured. On the other hand, fish outside the predetermined range area are determined to have escaped without being captured.
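The capture determination described above could be sketched as follows. This Python illustration partitions the fish into captured and escaped groups by distance from the hand; the radius and names are assumptions, not values from the embodiment:

```python
import math

CAPTURE_RADIUS = 8.0  # assumed predetermined range (cm)

def capture_determination(hand_xy, fish_xy_list, radius=CAPTURE_RADIUS):
    """Partition fish into (captured, escaped) index lists, by whether
    each fish lies inside the range centered on the hand position."""
    captured, escaped = [], []
    for i, p in enumerate(fish_xy_list):
        (captured if math.dist(p, hand_xy) <= radius else escaped).append(i)
    return captured, escaped
```

The captured list drives the second projection image (fish on the palm), while the escaped list drives the fleeing animation on the play field.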
- in FIG. 6(A), it is determined that the fish 14 has been captured.
- an image of the fish 14 or seawater is projected on the palm of the hand 20. This makes it possible to give the user a virtual reality as if the fish 14 were actually captured with their own hands 20.
- B1 indicates the range of the hand 20 before lifting
- B2 indicates the range of the hand 20 after lifting
- C1 indicates the position and size of the fish 14 before lifting the hand 20
- C2 indicates the position and size of the fish 14 after lifting the hand 20.
- as shown by C1 and C2, as the hand 20 moves upward, the size of the fish 14 appears smaller.
- processing of enlarging or reducing the size of the fish 14 may be performed according to the height.
- C3 indicates the position and size of the fish 14 when the correction process (enlargement/reduction and position adjustment described later) is performed; the image (video) of the fish 14 is enlarged relative to C2.
- as shown in C1 and C2, the image (video) of the fish 14 appears to be shifted from the position of the hand 20 toward the side of the projection unit 40 (42).
- position adjustment processing may therefore be performed by calculation taking the height into account, so that the image of the fish 14 maintains its positional relationship with the hand 20, as shown in C3.
- based on position information of the second object such as the hand 20, for example its height information (that is, the positional relationship between the projection units 40 and 42 and the second object), at least one of adjustment processing of the display position and adjustment processing of the size of a display object such as the fish 14 on the second object is performed.
- this makes it possible to realize appropriate generation processing of the second projection image, in which a display object such as the fish 14, which was to be projected onto the first object, is projected so as to follow the second object such as the hand 20.
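The enlargement/reduction and position adjustment described above follow from simple projector geometry: when the hand rises toward the overhead projector, the same drawn region lands on a nearer surface, so it appears smaller and shifted. The following Python sketch corrects for this; the projector height is an assumed calibration value and all names are illustrative:

```python
PROJECTOR_HEIGHT = 250.0  # assumed height (cm) of the projection unit

def corrected_scale(hand_height, projector_height=PROJECTOR_HEIGHT):
    """Scale factor that keeps the fish's apparent size constant as the
    hand rises toward the projector."""
    return projector_height / (projector_height - hand_height)

def corrected_position(hand_xy, projector_xy, hand_height,
                       projector_height=PROJECTOR_HEIGHT):
    """Field-plane draw position such that the projector's ray through it
    lands on the hand at the given height (keeps fish aligned with hand)."""
    s = corrected_scale(hand_height, projector_height)
    return tuple(px + (hx - px) * s for hx, px in zip(hand_xy, projector_xy))
```

For a hand raised 50 cm, the fish is drawn 1.25x larger and offset away from the projector axis, matching the correction from C2 to C3.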
- the hand 20 comes out of the water at the location shown by A1, and it is determined that the fish 15 and 16 escaped without being captured. That is, when the hand 20 is taken out of the water, the fish 15 and 16 are determined not to have been captured because they were outside the region of the predetermined range centered on the position of the hand 20. In this case, a projection image in which the fish 15 and 16 that could not be captured swim outward, for example, from the location of A1 is generated and projected onto the play field 10. In this way, the user can visually recognize that the capture of the fish 15 and 16 has failed.
- an image in which ripples spread, for example, is generated.
- the position information of the play field 10 (first object) and the hand 20 (second object) is acquired. Specifically, as described with reference to FIGS. 4 and 5, the height information of the play field 10 (height information in each divided area) and the height information of the hand 20 are acquired as position information.
- if the height information of the play field 10 is stored in advance as table information in the storage unit 150, only the height information (position information in a broad sense) of the hand 20 may be acquired.
- the relative positional relationship is, for example, the relationship between the height of the hand 20 (second object) and the play field 10 (first object) as described in FIGS. 4 and 5.
- processing of changing the content of at least one of the first projection image for the play field 10 and the second projection image for the hand 20 is performed.
- a projection image reflecting the position information and the like of an object such as the play field 10 or the hand 20 can be projected onto the object.
- the relative positional relationship between objects is utilized so that the image can move between a plurality of objects. Therefore, when the positional relationship of objects such as the play field 10 and the hand 20 changes, the projection images projected onto these objects also change accordingly. The projection image reflecting the user's movement is thus projected onto the object, making it possible to realize a highly interactive projection system that conventional systems could not achieve. By applying the projection system of the present embodiment to an attraction or the like, it becomes possible to realize an attraction that remains interesting even when played over a long time.
- the positional relationship between the virtual sea surface 12 (virtual plane) set at a given position with respect to the play field 10 (first object) and the hand 20 (second object) is determined, to judge whether the play field 10 and the hand 20 have a given relationship. For example, if the height of the hand 20 is determined to be lower than the virtual sea surface 12, it is determined that the hand 20 has entered the water; a seawater image is projected onto the hand 20, and an image in which fish come close is generated.
- if it is determined that the height of the hand 20 has become higher than the virtual sea surface 12 after the hand 20 entered the water, it is determined that the hand 20 has come out of the water, and an image in which the captured fish 14 is projected onto the palm of the hand 20 is generated, or an image in which the fish 15 and 16 that escaped capture flee is generated.
- the determination process of the positional relationship with the hand 20, which is the second object, is performed using the virtual sea surface 12 set for the play field 10, instead of the play field 10 itself, which is the first object.
- the process of changing the contents of the first and second projection images is, for example, a process of making a display object appear or disappear in at least one of the first projection image and the second projection image, or a process of changing the image of the display object.
- in FIG. 6(B), in the first projection image of the play field 10, processing is performed to change the images of the fish 15 and 16, as display objects, into images fleeing from the location A1. Also in FIG. 4, when it is determined that the hand 20 has entered the water, processing is performed to change the images of the fish 14 and 15 into images approaching the hand 20.
- the image of the fish 14 as the display object may be changed so that the fish 14 shines.
- the image of the fish 14 may be changed so that an animation is displayed in which the fish 14 on the palm of the hand 20 appears to jump, for example. After jumping, the fish 14 disappears from the palm of the hand 20 and is displayed on the display unit 62 of the bucket 60.
- generation processing of the second projection image is performed so that the fish 14 to be projected onto the play field 10 (first object) is projected onto the hand 20 (second object), as shown in FIG. 6(A). That is, the display object of the fish 14, originally intended to be projected onto the play field 10, is also projected onto the hand 20. This makes it possible to represent an unprecedented projection image.
- the play field 10 and the hand 20 have a given relationship
- it is determined that the fish 14 to be projected onto the play field 10 has been captured by the hand 20.
- generation processing of the second projection image is performed so that the fish 14 determined to be captured is projected onto the hand 20. That is, when it is determined that the hand 20 has come above the virtual sea surface 12 after entering the water, fish 14 within a predetermined range centered on the hand 20 are determined to have been captured. Then, as shown in FIG. 6(A), a second projection image in which the captured fish 14 is projected onto the hand 20 is generated.
- the process of generating the first projection image is performed so that it is projected onto the play field 10.
- by viewing the first projection image on the play field 10, the user sees not only the captured fish 14 but also the swimming motion of the fish 15 and 16 that escaped capture, which further enhances the user's sense of virtual reality.
- a process is performed to display the display object of the fish 14 determined to be captured at the location of the bucket 60. For example, as shown in FIG. 6(A), when the user captures the fish 14 and then brings the hand 20 to the location of the bucket 60 of FIG. 1, it is determined that the captured fish 14 has been released into the bucket 60. Then, processing for displaying the captured fish 14 on the display unit 62 of the bucket 60 is performed. At this time, processing is also performed to cause the fish 14 projected on the hand 20 to disappear from the second projection image.
- a container 22 (a grasped object in a broad sense), which is a second object, is gripped by the hand 20 of the user.
- a marker 24 is set on the container 22, which is the second object.
- the container 22 imitates a hemispherical fruit, and a black marker 24 is set on its circular edge portion.
- the black circular marker 24 is imaged by the camera 52 of the sensor unit 50 of FIG. 4, and the recognition process of the marker 24 is performed based on the acquired image.
- image recognition processing is performed on the captured image from the camera 52 to extract an image of a black circle corresponding to the marker 24.
- the central position of the black circle is determined as the position of the container 22 which is the second object. That is, the position of the container 22 on the XY plane described in FIG. 4 is determined.
- height information (Z) corresponding to the obtained position (X, Y) of the container 22 is acquired from the height information map of FIG.
- height information corresponding to the position of the container 22 on the XY plane is obtained using the height information map derived from the depth information of the depth sensor 54 of the sensor unit 50, and is used as the height of the container 22.
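The lookup described above can be sketched as indexing a grid height map with the (X, Y) position obtained from marker recognition. This is a minimal sketch under assumptions: the grid layout, cell size, and function name are not from the source.

```python
def height_at(height_map, x, y, cell=10):
    """Look up the height Z for a world position (x, y) in a grid height map
    built from the depth sensor, with square cells of size `cell`."""
    row, col = int(y // cell), int(x // cell)
    return height_map[row][col]

# Toy 2x2 height map; the container sits over cell (1, 1).
height_map = [
    [5.0, 5.5],
    [4.0, 60.0],
]
z = height_at(height_map, x=15, y=12, cell=10)  # height used for the container 22
```

The same lookup serves for the hand or the bait item, since only the (X, Y) source differs.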
- it is difficult to stably and appropriately detect the position of the hand 20 itself based on, for example, its color.
- in FIG. 6(A), when the fish 14 is captured, projection onto the hand 20 is affected by the wrinkles and color of the hand 20, making it difficult to project the image of the fish 14 and the like clearly.
- the position of the container 22 is detected based on the result of the recognition process of the marker 24 set on the container 22. Therefore, compared with the method of detecting the position of the hand 20 based on its color or the like, there is the advantage that the position of the container 22, which is the second object, can be detected stably and appropriately. In addition, by appropriately setting the projection surface of the container 22, there is also the advantage that an image of captured fish, a seawater image, or the like can be projected onto the projection surface of the container 22 as a clear image.
- pattern recognition of the marker 24 is performed, and processing such as making different types of fish approach the user can be performed based on the result of the pattern recognition.
- the pattern of the markers 24 is the pattern on the left side of FIG. 7B
- the fish 15 corresponding to that pattern is made to approach the container 22.
- the pattern of the marker 24 is the pattern on the right side of FIG. 7B
- the fish 16 corresponding to the pattern is made to approach the container 22.
- marker pattern information (a table) is prepared in which a fish display-object ID is associated with each marker pattern.
- this marker pattern information is stored in the marker pattern storage unit 154. Which marker pattern of FIG. 8 has been detected is determined by image recognition processing on the captured image from the camera 52 of the sensor unit 50. When it is determined that the container 22 has entered the water, a fish corresponding to the detected marker pattern is made to appear, and an image of it approaching the container 22 is generated.
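The marker-pattern table can be sketched as a mapping from pattern IDs to fish display-object IDs, as in FIG. 8. The IDs below are assumptions for illustration only.

```python
# Assumed pattern IDs and fish IDs; the real table lives in the
# marker pattern storage unit 154.
MARKER_PATTERN_TABLE = {
    "pattern_left": "fish_15",   # left pattern of FIG. 7(B)
    "pattern_right": "fish_16",  # right pattern of FIG. 7(B)
}

def fish_for_marker(pattern_id):
    """Return the fish display-object ID attracted by this marker pattern,
    or None if the pattern is unknown."""
    return MARKER_PATTERN_TABLE.get(pattern_id)
```

On a water-entry determination, the returned fish ID would be spawned and animated toward the container.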
- the user can easily capture different types of fish according to the pattern of the marker 24 on the container 22 they hold. This makes it possible to realize an attraction that stays engaging even when played for a long time.
- various methods are conceivable for projecting the projection image (second projection image) onto the container 22 (gripped object).
- a projection image is projected by the projection unit 40 on the hemispherical inner surface of the container 22.
- in FIG. 9(B), a planar projection surface 21 is set at the upper part of the container 22, and a projection image is projected by the projection unit 40 onto this planar projection surface 21.
- in FIG. 9(A), in order to project a projection image with little distortion, distortion correction must reflect the hemispherical inner surface shape of the container 22, the position of the projector, and the viewpoint position of the user.
- the hemispherical inner surface shape of the container 22 is represented by a formula or the like, and distortion correction is performed using this formula or the like.
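As a sketch of one such analytic representation (the radius R and center c are illustrative symbols, not from the source), a hemispherical inner surface can be written as:

```latex
z(x, y) \;=\; c_z - \sqrt{R^{2} - (x - c_x)^{2} - (y - c_y)^{2}},
\qquad (x - c_x)^{2} + (y - c_y)^{2} \le R^{2}
```

Distortion correction would then pre-warp the source image by mapping each point on this surface through the projector position and the user's viewpoint position.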
- the method of using the marker is not limited to the method described in FIG. 7A, FIG. 7B or the like.
- a two-dimensional code invisible to the player may be disposed on the inner bottom of the container 22 by printing, coating, adhesion, or the like, and photographed with an infrared camera.
- Each bait item 26 is provided with, for example, an infrared LED marker.
- the light emission pattern of the infrared LED marker is image-recognized using the camera 52 of the sensor unit 50, whereby the position of the bait item 26 (hand 20) is specified. Then, an image in which fish approach the bait item 26 is generated. For example, an animation in which a fish bites the bait item 26 is displayed, and at this time the bait item 26 is vibrated. That is, the bait item 26 is vibrated by a vibration mechanism provided in it, and the vibration is transmitted to the hand 20 of the user.
- an effect may be produced in which the fish captured in the palm of the hand 20 bounces, and the vibration is transmitted to the hand 20 of the user.
- the vibration is transmitted to the user's hand 20, for example by vibrating the bait item 26. In this way, it is possible to give the user a sense of virtual reality, as if a real fish had been scooped up and captured.
- a plurality of bait items 26 are prepared, and depending on each bait item 26, the type of fish approaching is made different.
- the infrared LED markers of the bait items 26 emit light with different emission patterns. The type of light emission pattern is determined by image recognition, and when the user places the hand holding the bait item 26 below the virtual sea surface 12 (virtual water surface), fish corresponding to that type of emission pattern are made to approach the bait item 26. By doing this, different fish come to each user, increasing the enjoyment and variation of play.
- the reason for using infrared LED markers instead of visible-light LEDs for the bait items 26 is that the light of the projector is visible light, so LEDs placed within it are easier to identify in the infrared. If discrimination is possible, a visible-light LED may be used, a piece of paper on which a marker pattern is printed may be used, or a marker pattern may be printed directly on each bait item 26.
- an NFC (near-field wireless communication) chip may be incorporated. Then, the fish may approach the bait item 26 using the communication signal output from the NFC chip as a marker.
- a second projection area RG2 onto which the second projection image is projected may be determined based on the markers provided on the container 22 and the bait item 26, and processing may be performed to generate a second projection image IM2 to be projected onto the determined second projection area RG2.
- a first projection image to be projected onto a first object such as the play field 10 is drawn in the first projection area RG1.
- a second projection image projected on a second object is drawn in the second projection region RG2.
- the images on the VRAM are shared by the projection units 40 and 42 in FIG. 1 and projected onto the play field 10 and the container 22 or the hand 20.
- the location (address) of the second projection area RG2 on the VRAM is specified based on the recognition result of the marker 24, and the second projection image IM2 to be projected onto the second object, such as the container 22 or the hand 20, is drawn in the specified second projection area RG2. For example, when it is determined that the fish 14 has been captured as shown in FIG. 6(A), a second projection image IM2 displaying the captured fish 14 is generated and drawn in the second projection area RG2.
- a first projection image IM1, in which the fish 15 and 16 that failed to be caught flee from the location A1 where the hand 20 emerged, is generated and drawn in the first projection area RG1.
- when the container 22 or the hand 20 moves, the position of the second projection area RG2 is also changed accordingly. Then, when the container 22 or the hand 20 moves to the location of the bucket 60 and it is determined that the fish 14 has been released into the bucket 60, a second projection image IM2 in which the released fish 14 disappears is generated and drawn in the second projection area RG2.
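The region handling above can be sketched as blitting a region image into a shared framebuffer, with RG2's origin following the marker position. This is an illustrative toy, not the patent's VRAM layout; the function and variable names are assumptions.

```python
def blit(vram, image, origin):
    """Draw `image` (a 2D list of pixels) into `vram` at (row, col) `origin`."""
    r0, c0 = origin
    for r, row in enumerate(image):
        for c, px in enumerate(row):
            vram[r0 + r][c0 + c] = px
    return vram

# Toy 4x6 framebuffer shared by both projection areas.
vram = [[0] * 6 for _ in range(4)]
im2 = [[7, 7], [7, 7]]          # captured-fish image for region RG2
blit(vram, im2, origin=(1, 3))  # RG2 origin derived from the marker 24 position
```

When the marker moves, only `origin` changes; releasing the fish would amount to blitting a blank IM2 into the same region.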
- in the above, the play field 10 was a field, such as a sandbox, whose projection plane is close to horizontal.
- the playfield 10 may be such that its projection plane is orthogonal to (crosses with) the horizontal plane.
- the play field 10 simulates a waterfall, and the user captures the fish 14 by holding a hand net provided with a marker, for example.
- a projection unit 40 and a sensor unit 50 are provided on the side of the play field 10, and the projection unit 40 projects an image of a waterfall onto the play field 10.
- the sensor unit 50 detects height information and the like in the direction along the horizontal plane, to determine whether the hand net held by the user has entered below the virtual water surface, whether the fish 14 has been captured, and so on.
- rendering processing, such as splashing, is also performed on the water surface where the hand net enters.
- height information of the play field 10 is acquired as described in FIGS. 4 and 5 (step S1). Then a seawater image is projected onto the play field 10 based on the acquired height information (step S2). For example, the seawater image is projected so that puddles of seawater form in the depressed portions of the sandbox of the play field 10.
- the marker set on the hand or the container is image-recognized by the sensor unit 50, and the height information of the marker is acquired as the height information of the hand or the container (steps S3 and S4).
- the position of the marker on the XY plane is obtained by image recognition using the image captured by the camera 52 of the sensor unit 50, and the marker height information is acquired from the height information map of FIG. 5 based on the marker position.
- it is determined whether the height of the hand or the container has become lower than the height of the virtual sea surface (step S5). When it is lower, a seawater image is projected onto the hand or the container (step S6).
- FIG. 15 is a flowchart showing a detailed process example of the fish catching determination and the like.
- it is determined whether the hand or the container has been raised above the virtual sea surface (step S11).
- if so, fish present in an area within a predetermined range from the position of the hand or the container at that time are judged to be captured fish, and the others are judged to be escaped fish (step S12).
- the captured fish are displayed in the projection image for the hand or the container, and the escaped fish are displayed in the projection image for the play field 10 (step S13).
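The catch determination of step S12 can be sketched as partitioning the fish by distance from the hand at the moment it rises above the virtual sea surface. The radius, coordinates, and names below are assumptions for illustration.

```python
import math

def judge_catch(hand_pos, fish_positions, radius=30.0):
    """Partition fish into (captured, escaped) lists by their distance
    from the hand/container position on the XY plane."""
    captured, escaped = [], []
    for fid, (fx, fy) in fish_positions.items():
        d = math.hypot(fx - hand_pos[0], fy - hand_pos[1])
        (captured if d <= radius else escaped).append(fid)
    return captured, escaped
```

The captured list would feed the second projection image (hand/container), and the escaped list the first projection image (play field).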
- an image in which the captured fish 14 is displayed is generated as the second projection image IM2 for the second projection region RG2 of FIG. 11, and an image in which the escaped fish 15, 16, and 17 are displayed is generated as the first projection image IM1 for the first projection region RG1.
- FIG. 16 is a flow chart showing a detailed processing example such as fish release determination.
- the position of the hand or container that has captured a fish and the position of the bucket are detected by the sensor unit 50 (step S21). Then, it is determined whether the position of the hand or the container and the position of the bucket have a given positional relationship (step S22), for example, whether the position of the hand or the container overlaps the location of the bucket. When this positional relationship holds, it is determined that the captured fish has been released into the bucket, and an image of the fish is displayed on the display unit of the bucket (step S23).
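The release determination of steps S21 to S23 can be sketched as a point-in-rectangle test against the bucket's location. The rectangle bounds and names are assumptions for illustration.

```python
def released(hand_pos, bucket_rect):
    """True if the hand/container position (x, y) lies inside the bucket's
    rectangular region (x0, y0, x1, y1) on the XY plane."""
    x, y = hand_pos
    x0, y0, x1, y1 = bucket_rect
    return x0 <= x <= x1 and y0 <= y <= y1
```

A True result would trigger displaying the fish on the bucket's display unit 62 and erasing it from the second projection image.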
Abstract
Description
FIG. 1 shows an example of the overall configuration of the projection system of this embodiment. The projection system of this embodiment includes projection sections 40 and 42 and a processing device 90 (a processing section in a broad sense). It can further include a sensor section 50. The configuration of the projection system of this embodiment is not limited to that of FIG. 1, and various modifications are possible, such as omitting some of its components (sections) or adding other components.
2.1 Overview of the attraction
First, an overview of the attraction realized by the method of this embodiment will be described. In this embodiment, a play field 10 as shown in FIG. 1 is installed in an attraction facility. The play field 10 is a sandbox, in which children can enjoy playing with sand.
In order to realize the attraction and the like described above, in this embodiment, position information of at least one of the first and second objects is acquired based on detection information of the sensor section 50. Based on the acquired position information, it is determined whether the first and second objects have come into a given relationship. When it is determined that the given relationship holds, the content of at least one of the first projection image projected onto the first object and the second projection image projected onto the second object is changed. For example, when a first projection surface corresponding to the first object and a second projection surface corresponding to the second object come into a given relationship, the content of the first projection image projected onto the first projection surface, or of the second projection image projected onto the second projection surface, is changed.
The above describes the case where the method of this embodiment is realized by detecting the height information and the like of the second object, but this embodiment is not limited to this. For example, recognition processing of a marker set on the second object may be performed based on the detection information from the sensor section 50, position information of the second object may be acquired based on the result of the recognition processing, and it may be determined, based on the acquired position information, whether the first object and the second object have come into a given relationship.
Next, a detailed processing example of this embodiment will be described using the flowchart of FIG. 14.
20 hand, 21 projection surface, 22 container, 24 marker, 26 bait item,
RG1, RG2 first and second projection areas, IM1,
IM2 first and second projection images,
40, 42 projection sections, 50 sensor section, 52 camera, 54 depth sensor,
60 bucket, 62 display section, 90 processing device,
100 processing section, 102 position information acquisition section, 104 marker recognition section,
106 positional relationship determination section, 108 capture determination section, 109 release determination section,
110 image generation processing section, 112 distortion correction section, 120 I/F section,
150 storage section, 152 display object information storage section, 154 marker pattern storage section,
156 height information storage section
Claims (25)
1. A projection system comprising: a projection section that projects a projection image; and a processing section that acquires position information of at least one of a first object and a second object based on detection information of a sensor section and performs generation processing of the projection image, wherein, when it is determined based on the acquired position information that the first object and the second object have come into a given relationship, the processing section performs processing to change the content of at least one of a first projection image projected onto the first object and a second projection image projected onto the second object.
2. The projection system according to claim 1, wherein the processing section obtains a positional relationship between the second object and a virtual plane set at a given position with respect to the first object, and determines whether the first object and the second object have come into the given relationship.
3. The projection system according to claim 1 or 2, wherein, when it is determined that the first object and the second object have come into the given relationship, the processing section performs at least one of: processing to cause a display object to appear, processing to cause a display object to disappear, and processing to change the image of a display object, in at least one of the first projection image projected onto the first object and the second projection image projected onto the second object.
4. The projection system according to any one of claims 1 to 3, wherein, when it is determined that the first object and the second object have come into the given relationship, the processing section performs generation processing of the second projection image so that a display object to be projected onto the first object is projected onto the second object.
5. The projection system according to claim 4, wherein the processing section performs display control of the display object based on the relationship between the display object projected onto the second object and the second object.
6. The projection system according to claim 4 or 5, wherein, when the first object and the second object have come into the given relationship, the processing section performs arithmetic processing based on a processing rule, and performs display control of the display object so that a display object determined, as a result of the arithmetic processing, to be projected onto the second object is projected onto the second object.
7. The projection system according to any one of claims 4 to 6, wherein, when the relationship between the first object and the second object changes from the given relationship, the processing section performs display control of the display object according to the change in the relationship between the first object and the second object.
8. The projection system according to claim 7, wherein, when the relationship between the first object and the second object changes, the processing section performs arithmetic processing based on a processing rule, and performs display control of the display object so that a display object determined, as a result of the arithmetic processing, to be projected onto the second object is projected onto the second object.
9. The projection system according to claim 7 or 8, wherein, when the relationship between the first object and the second object changes, the processing section performs arithmetic processing based on a processing rule, and performs display control of the display object so that a display object determined, as a result of the arithmetic processing, not to be projected onto the second object is projected onto the first object.
10. The projection system according to any one of claims 4 to 9, wherein, when it is determined that the second object and a third object have come into a given relationship, the processing section performs processing for displaying the display object on the third object.
11. The projection system according to any one of claims 1 to 10, wherein the processing section obtains a relative positional relationship between the first object and the second object based on the detection information of the sensor section, and determines whether the first object and the second object have come into the given relationship.
12. The projection system according to claim 11, wherein the relative positional relationship is a relationship regarding the height of the second object with respect to the first object.
13. The projection system according to any one of claims 1 to 12, wherein the processing section performs recognition processing of a marker set on the second object based on the detection information of the sensor section, acquires the position information of the second object based on the result of the recognition processing, and determines, based on the acquired position information, whether the first object and the second object have come into the given relationship.
14. The projection system according to claim 13, wherein the processing section obtains, based on the marker, a second projection area onto which the second projection image is projected, and performs generation processing of the second projection image to be projected onto the second projection area.
15. The projection system according to any one of claims 1 to 14, wherein the second object is a part of a user's body or an object gripped by a user.
16. A projection system comprising: a projection section that projects a projection image onto a play field that is a first object; and a processing section that performs generation processing of the projection image, wherein the processing section generates the projection image for displaying an image of a water surface on a virtual plane set at a given position with respect to the play field and for displaying an image of a creature; the projection section projects the projection image for displaying the image of the water surface and the image of the creature onto the play field; and the processing section performs, based on position information of a second object, processing to change the content of at least one of a first projection image projected onto the play field that is the first object and a second projection image projected onto the second object.
17. The projection system according to claim 16, wherein the processing section performs at least one of: processing to cause a display object to appear, processing to cause a display object to disappear, and processing to change the image of a display object, in at least one of the first projection image projected onto the play field and the second projection image projected onto the second object.
18. The projection system according to claim 16 or 17, wherein the processing section performs recognition processing of a marker set on the second object, acquires the position information of the second object based on the result of the recognition processing, and performs processing to change the content of at least one of the first projection image and the second projection image based on the acquired position information.
19. The projection system according to any one of claims 16 to 18, wherein the second object is a part of a user's body or an object gripped by a user.
20. The projection system according to any one of claims 16 to 19, wherein, when it is determined based on the position information of the second object that the play field that is the first object and the second object have come into a given relationship, the processing section performs processing to change the content of at least one of the first projection image and the second projection image.
21. The projection system according to any one of claims 16 to 20, wherein the processing section acquires the position information of the second object based on detection information of a sensor section.
22. The projection system according to any one of claims 16 to 21, wherein the projection section projects the projection image for displaying the image of the water surface and the image of the creature onto the play field by projection mapping.
23. The projection system according to claim 22, wherein the play field is a sandbox.
24. The projection system according to any one of claims 16 to 23, wherein the processing section generates the projection image in which the water surface and the creature are displayed as animations.
25. The projection system according to any one of claims 16 to 24, wherein the projection section is installed above the play field.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201680050791.6A CN107925739B (zh) | 2015-09-02 | 2016-09-02 | 投影系统 |
GB1804171.5A GB2557787B (en) | 2015-09-02 | 2016-09-02 | Projection system |
US15/909,836 US20180191990A1 (en) | 2015-09-02 | 2018-03-01 | Projection system |
HK18106074.6A HK1247012A1 (zh) | 2015-09-02 | 2018-05-10 | 投影系統 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015172568A JP6615541B2 (ja) | 2015-09-02 | 2015-09-02 | 投影システム |
JP2015-172568 | 2015-09-02 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/909,836 Continuation US20180191990A1 (en) | 2015-09-02 | 2018-03-01 | Projection system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017038982A1 true WO2017038982A1 (ja) | 2017-03-09 |
Family
ID=58187764
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/075841 WO2017038982A1 (ja) | 2015-09-02 | 2016-09-02 | 投影システム |
Country Status (6)
Country | Link |
---|---|
US (1) | US20180191990A1 (ja) |
JP (1) | JP6615541B2 (ja) |
CN (1) | CN107925739B (ja) |
GB (1) | GB2557787B (ja) |
HK (1) | HK1247012A1 (ja) |
WO (1) | WO2017038982A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107277476A (zh) * | 2017-07-20 | 2017-10-20 | 苏州名雅科技有限责任公司 | 一种适合在旅游景点供儿童互动体验的多媒体设备 |
Families Citing this family (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3062142B1 (en) | 2015-02-26 | 2018-10-03 | Nokia Technologies OY | Apparatus for a near-eye display |
US20190116356A1 (en) * | 2016-04-15 | 2019-04-18 | Sony Corporation | Information processing apparatus, information processing method, and program |
CN109906423B (zh) * | 2016-11-02 | 2022-05-06 | 松下知识产权经营株式会社 | 姿势输入系统及姿势输入方法、电子设备 |
US10650552B2 (en) | 2016-12-29 | 2020-05-12 | Magic Leap, Inc. | Systems and methods for augmented reality |
EP3343267B1 (en) | 2016-12-30 | 2024-01-24 | Magic Leap, Inc. | Polychromatic light out-coupling apparatus, near-eye displays comprising the same, and method of out-coupling polychromatic light |
CN106943756A (zh) * | 2017-05-18 | 2017-07-14 | 电子科技大学中山学院 | 一种投影沙池游艺系统 |
US10578870B2 (en) | 2017-07-26 | 2020-03-03 | Magic Leap, Inc. | Exit pupil expander |
CN111448497B (zh) | 2017-12-10 | 2023-08-04 | 奇跃公司 | 光波导上的抗反射涂层 |
AU2018392482A1 (en) | 2017-12-20 | 2020-07-02 | Magic Leap, Inc. | Insert for augmented reality viewing device |
JP7054774B2 (ja) * | 2018-01-10 | 2022-04-15 | パナソニックIpマネジメント株式会社 | 投影制御システム及び投影制御方法 |
CN112136152A (zh) | 2018-03-15 | 2020-12-25 | 奇跃公司 | 由观看设备的部件变形导致的图像校正 |
JP2019186588A (ja) * | 2018-03-30 | 2019-10-24 | 株式会社プレースホルダ | コンテンツ表示システム |
US10921935B2 (en) | 2018-05-21 | 2021-02-16 | Compal Electronics, Inc. | Interactive projection system and interactive projection method |
US11885871B2 (en) | 2018-05-31 | 2024-01-30 | Magic Leap, Inc. | Radar head pose localization |
JP7369147B2 (ja) | 2018-06-05 | 2023-10-25 | マジック リープ, インコーポレイテッド | 視認システムのホモグラフィ変換行列ベースの温度較正 |
US11579441B2 (en) | 2018-07-02 | 2023-02-14 | Magic Leap, Inc. | Pixel intensity modulation using modifying gain values |
US11856479B2 (en) | 2018-07-03 | 2023-12-26 | Magic Leap, Inc. | Systems and methods for virtual and augmented reality along a route with markers |
US11510027B2 (en) | 2018-07-03 | 2022-11-22 | Magic Leap, Inc. | Systems and methods for virtual and augmented reality |
JP7147314B2 (ja) * | 2018-07-19 | 2022-10-05 | セイコーエプソン株式会社 | 表示システム、及び、反射体 |
WO2020023545A1 (en) | 2018-07-24 | 2020-01-30 | Magic Leap, Inc. | Temperature dependent calibration of movement detection devices |
WO2020023543A1 (en) | 2018-07-24 | 2020-01-30 | Magic Leap, Inc. | Viewing device with dust seal integration |
EP3831058A4 (en) | 2018-08-02 | 2022-04-20 | Magic Leap, Inc. | VIEWING SYSTEM WITH PUPILE DISTANCE COMPENSATION BASED ON HEAD MOVEMENT |
EP3830631A4 (en) | 2018-08-03 | 2021-10-27 | Magic Leap, Inc. | NON-FUSED POSE DRIFT CORRECTION OF A FUSED TOTEM IN A USER INTERACTION SYSTEM |
JP7472127B2 (ja) | 2018-11-16 | 2024-04-22 | マジック リープ, インコーポレイテッド | 画像鮮明度を維持するための画像サイズによってトリガされる明確化 |
US11425189B2 (en) | 2019-02-06 | 2022-08-23 | Magic Leap, Inc. | Target intent-based clock speed determination and adjustment to limit total heat generated by multiple processors |
EP3939030A4 (en) | 2019-03-12 | 2022-11-30 | Magic Leap, Inc. | REGISTRATION OF LOCAL CONTENT BETWEEN FIRST AND SECOND VIEWERS OF AUGMENTED REALITY |
WO2020223636A1 (en) | 2019-05-01 | 2020-11-05 | Magic Leap, Inc. | Content provisioning system and method |
WO2021021670A1 (en) * | 2019-07-26 | 2021-02-04 | Magic Leap, Inc. | Systems and methods for augmented reality |
US11109139B2 (en) * | 2019-07-29 | 2021-08-31 | Universal City Studios Llc | Systems and methods to shape a medium |
US11737832B2 (en) | 2019-11-15 | 2023-08-29 | Magic Leap, Inc. | Viewing system for use in a surgical environment |
WO2022181106A1 (ja) * | 2021-02-26 | 2022-09-01 | 富士フイルム株式会社 | 制御装置、制御方法、制御プログラム、及び投影装置 |
WO2023009765A2 (en) * | 2021-07-28 | 2023-02-02 | Fuller Mark W | System for projecting images into a body of water |
CN113744335B (zh) * | 2021-08-24 | 2024-01-16 | 北京体育大学 | 一种基于场地标记的运动引导方法、系统及存储介质 |
CN113676711B (zh) * | 2021-09-27 | 2022-01-18 | 北京天图万境科技有限公司 | 虚拟投影方法、装置以及可读存储介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009225432A (ja) * | 2008-02-22 | 2009-10-01 | Panasonic Electric Works Co Ltd | 光投影装置、照明装置 |
JP2011180712A (ja) * | 2010-02-26 | 2011-09-15 | Sanyo Electric Co Ltd | 投写型映像表示装置 |
JP2014010362A (ja) * | 2012-06-29 | 2014-01-20 | Sega Corp | 映像演出装置 |
JP2015079169A (ja) * | 2013-10-18 | 2015-04-23 | 増田 麻言 | 投影装置 |
JP2015106147A (ja) * | 2013-12-03 | 2015-06-08 | セイコーエプソン株式会社 | プロジェクター、画像投写システム、およびプロジェクターの制御方法 |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6554431B1 (en) * | 1999-06-10 | 2003-04-29 | Sony Corporation | Method and apparatus for image projection, and apparatus controlling image projection |
US8300042B2 (en) * | 2001-06-05 | 2012-10-30 | Microsoft Corporation | Interactive video display system using strobed light |
US7134080B2 (en) * | 2002-08-23 | 2006-11-07 | International Business Machines Corporation | Method and system for a user-following interface |
AU2003291320A1 (en) * | 2002-11-05 | 2004-06-07 | Disney Enterprises, Inc. | Video actuated interactive environment |
US7576727B2 (en) * | 2002-12-13 | 2009-08-18 | Matthew Bell | Interactive directed light/sound system |
US8155872B2 (en) * | 2007-01-30 | 2012-04-10 | International Business Machines Corporation | Method and apparatus for indoor navigation |
KR101595104B1 (ko) * | 2008-07-10 | 2016-02-17 | 리얼 뷰 이미징 리미티드 | 광시야각 디스플레이들 및 사용자 인터페이스들 |
US8845110B1 (en) * | 2010-12-23 | 2014-09-30 | Rawles Llc | Powered augmented reality projection accessory display device |
US9508194B1 (en) * | 2010-12-30 | 2016-11-29 | Amazon Technologies, Inc. | Utilizing content output devices in an augmented reality environment |
WO2012119215A1 (en) * | 2011-03-04 | 2012-09-13 | Eski Inc. | Devices and methods for providing a distributed manifestation in an environment |
US9118782B1 (en) * | 2011-09-19 | 2015-08-25 | Amazon Technologies, Inc. | Optical interference mitigation |
US8840250B1 (en) * | 2012-01-11 | 2014-09-23 | Rawles Llc | Projection screen qualification and selection |
US8887043B1 (en) * | 2012-01-17 | 2014-11-11 | Rawles Llc | Providing user feedback in projection environments |
US9262983B1 (en) * | 2012-06-18 | 2016-02-16 | Amazon Technologies, Inc. | Rear projection system with passive display screen |
US9195127B1 (en) * | 2012-06-18 | 2015-11-24 | Amazon Technologies, Inc. | Rear projection screen with infrared transparency |
US9124786B1 (en) * | 2012-06-22 | 2015-09-01 | Amazon Technologies, Inc. | Projecting content onto semi-persistent displays |
US8964292B1 (en) * | 2012-06-25 | 2015-02-24 | Rawles Llc | Passive anisotropic projection screen |
US9294746B1 (en) * | 2012-07-09 | 2016-03-22 | Amazon Technologies, Inc. | Rotation of a micro-mirror device in a projection and camera system |
US9282301B1 (en) * | 2012-07-25 | 2016-03-08 | Rawles Llc | System for image projection |
US9052579B1 (en) * | 2012-08-01 | 2015-06-09 | Rawles Llc | Remote control of projection and camera system |
US9726967B1 (en) * | 2012-08-31 | 2017-08-08 | Amazon Technologies, Inc. | Display media and extensions to display media |
US8933974B1 (en) * | 2012-09-25 | 2015-01-13 | Rawles Llc | Dynamic accommodation of display medium tilt |
US9281727B1 (en) * | 2012-11-01 | 2016-03-08 | Amazon Technologies, Inc. | User device-based control of system functionality |
US9204121B1 (en) * | 2012-11-26 | 2015-12-01 | Amazon Technologies, Inc. | Reflector-based depth mapping of a scene |
US8992050B1 (en) * | 2013-02-05 | 2015-03-31 | Rawles Llc | Directional projection display |
CN104460951A (zh) * | 2013-09-12 | 2015-03-25 | 天津智树电子科技有限公司 | 一种人机交互方法 |
CN104571484A (zh) * | 2013-10-28 | 2015-04-29 | 西安景行数创信息科技有限公司 | 一种虚拟钓鱼互动装置及其使用方法 |
US9508137B2 (en) * | 2014-05-02 | 2016-11-29 | Cisco Technology, Inc. | Automated patron guidance |
US20160109953A1 (en) * | 2014-10-17 | 2016-04-21 | Chetan Desh | Holographic Wristband |
US10122976B2 (en) * | 2014-12-25 | 2018-11-06 | Panasonic Intellectual Property Management Co., Ltd. | Projection device for controlling a position of an image projected on a projection surface |
-
2015
- 2015-09-02 JP JP2015172568A patent/JP6615541B2/ja active Active
-
2016
- 2016-09-02 CN CN201680050791.6A patent/CN107925739B/zh active Active
- 2016-09-02 WO PCT/JP2016/075841 patent/WO2017038982A1/ja active Application Filing
- 2016-09-02 GB GB1804171.5A patent/GB2557787B/en active Active
-
2018
- 2018-03-01 US US15/909,836 patent/US20180191990A1/en not_active Abandoned
- 2018-05-10 HK HK18106074.6A patent/HK1247012A1/zh unknown
Also Published As
Publication number | Publication date |
---|---|
CN107925739B (zh) | 2020-12-25 |
GB2557787A (en) | 2018-06-27 |
JP6615541B2 (ja) | 2019-12-04 |
HK1247012A1 (zh) | 2018-09-14 |
CN107925739A (zh) | 2018-04-17 |
JP2017050701A (ja) | 2017-03-09 |
US20180191990A1 (en) | 2018-07-05 |
GB201804171D0 (en) | 2018-05-02 |
GB2557787B (en) | 2021-02-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017038982A1 (ja) | 投影システム | |
US11779846B2 (en) | Method for creating a virtual object | |
Held et al. | 3D puppetry: a kinect-based interface for 3D animation. | |
CN102129343B (zh) | 运动捕捉系统中的受指导的表演 | |
CN111417443A (zh) | 交互式视频游戏系统 | |
CN110665230B (zh) | 虚拟世界中的虚拟角色控制方法、装置、设备及介质 | |
CN102129292B (zh) | 在运动捕捉系统中识别用户意图 | |
US20170189797A1 (en) | Interactive game apparatus and toy construction system | |
US11738270B2 (en) | Simulation system, processing method, and information storage medium | |
Jones et al. | Build your world and play in it: Interacting with surface particles on complex objects | |
JP5320332B2 (ja) | ゲーム装置、ゲーム装置の制御方法、及びプログラム | |
US20110305398A1 (en) | Image generation system, shape recognition method, and information storage medium | |
TW201143866A (en) | Tracking groups of users in motion capture system | |
TW201234261A (en) | Using a three-dimensional environment model in gameplay | |
CN108664231B (zh) | 2.5维虚拟环境的显示方法、装置、设备及存储介质 | |
US11083968B2 (en) | Method for creating a virtual object | |
US20180082618A1 (en) | Display control device, display system, and display control method | |
JP2011215968A (ja) | プログラム、情報記憶媒体及び物体認識システム | |
JP5425940B2 (ja) | ゲーム装置、ゲーム装置の制御方法、及びプログラム | |
Gajic et al. | Egocentric human segmentation for mixed reality | |
JP5499001B2 (ja) | ゲーム装置、ならびに、プログラム | |
JP2017064180A (ja) | 投影システム | |
JP5715583B2 (ja) | ゲーム装置及びプログラム | |
JP2007185482A (ja) | 運動支援方法及び運動器具 | |
Guerineau | Learning Gravity, Basic Physics, and Camera Controls: An Angry Birds-like Game, Part I |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16842011 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 201804171 Country of ref document: GB Kind code of ref document: A Free format text: PCT FILING DATE = 20160902 |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16842011 Country of ref document: EP Kind code of ref document: A1 |