US20200202824A1 - Cyber reality device including gaming based on a plurality of musical programs - Google Patents
- Publication number
- US20200202824A1 (application US16/806,734)
- Authority
- US
- United States
- Prior art keywords
- virtual
- trigger
- triggers
- cyber
- performer
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/0008—Associated control or indicating means
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/537—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/814—Musical performances, e.g. by evaluating the player's ability to follow a notation
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/0008—Associated control or indicating means
- G10H1/0025—Automatic or semi-automatic music composition, e.g. producing random music, applying rules from music theory or modifying a musical piece
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/101—Music Composition or musical creation; Tools or processes therefor
- G10H2210/105—Composing aid, e.g. for supporting creation, edition or modification of a piece of music
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/101—Music Composition or musical creation; Tools or processes therefor
- G10H2210/125—Medley, i.e. linking parts of different musical pieces in one single piece, e.g. sound collage, DJ mix
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/091—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
- G10H2220/096—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith using a touch screen
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/091—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
- G10H2220/101—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters
- G10H2220/106—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters using icons, e.g. selecting, moving or linking icons, on-screen symbols, screen regions or segments representing musical elements or parameters
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/091—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
- G10H2220/101—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters
- G10H2220/106—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters using icons, e.g. selecting, moving or linking icons, on-screen symbols, screen regions or segments representing musical elements or parameters
- G10H2220/111—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters using icons, e.g. selecting, moving or linking icons, on-screen symbols, screen regions or segments representing musical elements or parameters for graphical orchestra or soundstage control, e.g. on-screen selection or positioning of instruments in a virtual orchestra, using movable or selectable musical instrument icons
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/091—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
- G10H2220/101—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters
- G10H2220/126—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters for graphical editing of individual notes, parts or phrases represented as variable length segments on a 2D or 3D representation, e.g. graphical edition of musical collage, remix files or pianoroll representations of MIDI-like files
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/135—Musical aspects of games or videogames; Musical instrument-shaped game input interfaces
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/155—User input interfaces for electrophonic musical instruments
- G10H2220/201—User input interfaces for electrophonic musical instruments for movement interpretation, i.e. capturing and recognizing a gesture or a specific kind of movement, e.g. to control a musical instrument
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/155—User input interfaces for electrophonic musical instruments
- G10H2220/401—3D sensing, i.e. three-dimensional (x, y, z) position or movement sensing.
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/155—User input interfaces for electrophonic musical instruments
- G10H2220/405—Beam sensing or control, i.e. input interfaces involving substantially immaterial beams, radiation, or fields of any nature, used, e.g. as a switch as in a light barrier, or as a control device, e.g. using the theremin electric field sensing principle
- G10H2220/411—Light beams
- G10H2220/415—Infrared beams
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/121—Musical libraries, i.e. musical databases indexed by musical parameters, wavetables, indexing schemes using musical parameters, musical rule bases or knowledge bases, e.g. for automatic composing methods
- G10H2240/131—Library retrieval, i.e. searching a database or selecting a specific musical piece, segment, pattern, rule or parameter set
Definitions
- FIG. 3 illustrates a possible user's perspective of the frontal view during game play mode
- Two or more activated virtual triggers 100 create a composition, including but not limited to sympathetic music, whereby each virtual trigger 100 is associated with a music program stored in a media file 506, and the music programs are synchronized when they are played.
- Each music program may comprise a sub-part of a composition, such as a subset of a song where each subset corresponds to a particular instrument's portion of the composition.
- These music programs can consist of, for example, one or more MIDI files, samples such as .wav and .mp3 files, etc.
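By way of illustration, the trigger-to-program association above can be sketched as follows. This is a hypothetical sketch: the dictionary, file names, and function are illustrative assumptions, not part of the disclosure; the point is that programs started against a shared clock remain synchronized.

```python
# Hypothetical mapping: trigger id -> stored music program (media file 506).
# The ids and file names below are illustrative only.
MEDIA_FILES = {
    "guitar_104": "guitar_part.mid",
    "saxophone_108": "sax_part.wav",
}

def start_synchronized(active_triggers, clock_s):
    """Start every active trigger's music program at the same clock time,
    so the sub-parts of the composition line up when played together."""
    return [(MEDIA_FILES[t], clock_s)
            for t in active_triggers if t in MEDIA_FILES]
```

Starting all active programs against one clock value is one simple way the described synchronization could be realized.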
- The virtual triggers 100 are configured to manipulate the background cyber environment 102 when they are triggered, such as, but not limited to, modifying the color properties of specific elements in the display 505 or changing the imagery entirely.
- When an animated virtual trigger activator 300 has a collision 301 with a virtual trigger object 100, the associated virtual trigger 100 becomes active and enabled for a predetermined time period, and then the associated virtual trigger 100 returns to a disabled state.
- This illustration shows an embodiment where the trigger activator 300 changes color when it collides with the associated virtual trigger 100. In other embodiments, the trigger activator 300 could also be highlighted during a collision.
- While a virtual trigger 100 is active, it can change color or be highlighted in some way to indicate its active state.
- FIG. 4 illustrates virtual trigger activation with animated drops 300 as trigger activators.
- When the trigger activation object 300 collides at 301 with the virtual trigger 100, the virtual trigger 100 is activated for a predetermined time window, after which it will become inactive again. Virtual triggers 100 are highlighted while they are active.
- The predetermined time window can be selectively controlled by application engine 502, and may be, for instance, 0.5 seconds.
- Game scoring has two different methods depending on the current mode of play: Game Play mode 302 ( FIG. 4 ) and Free Style mode 602 ( FIG. 6 ).
- Application engine 502 is operable on an electronic processor 503 and receives one or more Gestures from the multiple virtual triggers 100 within the cyber reality environment 112 , such as shown in FIG. 1 , FIG. 2 , FIG. 3 , FIG. 4 , and FIG. 6 .
- The application offers two user-selectable modes of operation: game play 302 and free style 602, as previously described, each having its own playing methods and scoring algorithms.
- For a Tap gesture, the touch is held within the virtual trigger 100 for a substantially short period of time, such as with a threshold for the short period of time of 0.5 seconds or less.
- The application engine 502 can use the Tap gesture to trigger a one-shot, play a single note in a streamed sequence, or start and stop a loop.
- For a Tap-and-Hold gesture, the touch is held within the virtual trigger object 100 for a longer period of time, such as with a threshold for the longer period of time of 0.5 seconds or more. Additional thresholds may be used for the Tap-and-Hold gesture, with each threshold associated with a different action to be taken by the application engine 502.
- The application engine 502 can use a Tap-and-Hold gesture to Pulse (stream) notes.
- Many additional gestures can trigger the virtual triggers, such as the user using an object, such as a wand, to trigger the virtual triggers 100.
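The Tap versus Tap-and-Hold distinction above reduces to a duration threshold. A minimal sketch, assuming the 0.5-second threshold from the text; the function name and return labels are illustrative:

```python
TAP_HOLD_THRESHOLD_S = 0.5  # threshold separating Tap from Tap-and-Hold

def classify_gesture(hold_duration_s: float) -> str:
    """Classify a touch held inside a virtual trigger by its duration."""
    if hold_duration_s < TAP_HOLD_THRESHOLD_S:
        return "tap"            # e.g. trigger a one-shot or a single note
    return "tap-and-hold"       # e.g. pulse (stream) notes while held
```

Additional thresholds, as the text notes, could be added as further branches, each mapped to a different engine action.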
- Processor 503 is configured such that visual outputs from the application engine 502 are displayed within the cyber reality environment 112 on display 505 and output from sound engine 504 is played on speaker 512 .
- The combination of application engine 502 and sound engine 504 forms an application on the processor 503.
- The processor 503 is configured to selectively associate the music programs with each of the plurality of virtual triggers 100.
- The processor 503 is configured such that when one of the virtual triggers 100 is in a first state for a prolonged period of time, successive said audible musical sounds are generated, such that, for instance, the musical program associated with the virtual trigger 100 continues to play uninterrupted, along with any other music programs that are playing in response to the associated virtual trigger 100 being triggered.
- Display 505 displays the total cyber reality environment 112 , which includes Foreground and Background visualizations.
- Trigger-specific visual output from application engine 502 can be displayed to alter the display properties or attributes of any element within the cyber reality environment 112, such as the virtual triggers 100 in the Foreground or what the user sees in the Background behind the virtual triggers 100.
- FIG. 6 shows how a user could interact with the cyber environment 112 while in Free Style mode 602, where the virtual triggers 100 are icons that are arranged in front of the user as seen in display 505 of cyber reality headset display 601.
- The virtual triggers 100 are icons floating in front of a kaleidoscope background 102 which is dynamically controlled by the sounds being interactively played.
- The generated audio music from triggering the virtual triggers is saved as a separate media file 506, is available for playback on device 501, and is also transferable to another device for play, such as to a computer or to a portable electronic device, such as a smart phone.
- The saved generated audio music file 506 is shareable, such as wirelessly using Bluetooth or via a removable storage device, and is also uploadable to a device, network or the Cloud, such as by using WiFi, including via applications such as Facebook, Snapchat and Twitter.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Acoustics & Sound (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Optics & Photonics (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
- Electrophonic Musical Instruments (AREA)
Abstract
A system enabling the performance of sensory stimulating content including music and video using gaming in a cyber reality environment, such as using a virtual reality headset. This disclosure includes a system and method through which a performer can virtually trigger and control a presentation of pre-packaged sensory stimulating content including musical programs through gaming. A theme for the performer is that the pre-packaged sensory stimulating content is preferably chosen such that, even where the performer is a novice, the sensory stimulating data is presented in a pleasing and sympathetic manner and scoring is provided as a function of the performer's ability to provide a gesture in association with a displayed virtual trigger.
Description
- This application is a continuation of U.S. patent application Ser. No. 16/564,226 filed Sep. 9, 2019, which is a continuation of U.S. patent application Ser. No. 15/483,910 filed Apr. 10, 2017, which is a continuation-in-part (CIP) of co-pending U.S. patent application Ser. No. 15/402,012 filed Jan. 9, 2017 entitled Cyber Reality Musical Instrument and Device, which is a continuation of U.S. patent application Ser. No. 15/215,427 filed Jul. 20, 2016 entitled Cyber Reality Musical Instrument and Device, the teachings of which are included herein in their entirety.
- The present disclosure relates to the composition and performance of sound and video content in a cyber reality environment, including gaming.
- This disclosure relates to the performance of sensory stimulating content including music and video using gaming in a cyber reality environment, such as using a virtual reality headset. This disclosure includes a system and method through which a performer can virtually trigger and control a presentation of pre-packaged sensory stimulating content including musical programs through gaming. A theme for the performer is that the pre-packaged sensory stimulating content is preferably chosen such that, even where the performer is a novice, the sensory stimulating data is presented in a pleasing and sympathetic manner and scoring is provided as a function of the performer's ability to provide a gesture in association with a displayed virtual trigger.
- The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the principles of the disclosure.
- FIG. 1 illustrates virtual (foreground) triggers;
- FIG. 2 illustrates multiple virtual triggers on top of a music-linked interactive background environment;
- FIG. 3 illustrates a possible user's perspective of the frontal view during game play mode;
- FIG. 4 illustrates trigger activator objects and how they are used to activate virtual trigger objects;
- FIG. 5 illustrates a diagram of a system for multilayered media playback in Cyber Reality gaming; and
- FIG. 6 illustrates a possible embodiment for free style mode, where the virtual triggers are always enabled and active.
- Cyber reality is defined as the collection of virtual reality, technology assisted reality, and augmented reality that does not require the performer to physically touch the trigger in order to activate the trigger. Gestures of a user are detected in association with the virtual trigger to cause a triggering event.
- This disclosure enables anyone to interactively play musical instruments and notes while at the same time enjoying the competition of a traditional music rhythm timing game and/or a range of potential music games.
- This disclosure enables players to actually play the notes as the user is visually compelled to time their strikes with an army of moving trigger activation objects.
- A musical underscore may accompany the trigger activation objects to reinforce the musical timing, but the triggered notes do not exist in the music underscore. They are layered on top of the ongoing musical underscore and only sound if the player triggers them during the prescribed timing window (the "Trigger Zone"). The player can see the note "opportunity" approaching (in the form of a trigger activator object such as a large drop, a ball, etc.), and as the object collides with the "Trigger Zone" the virtual trigger becomes enabled and the player has a brief opportunity to trigger the note. If the player strikes within the "Trigger Zone", then the note sounds. If the player does not strike at the appropriate time, or strikes before or after the "Trigger Zone", then the note will not sound. Based on the player's skill, they can achieve different, increasingly difficult levels of play.
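The "Trigger Zone" rule amounts to a simple timing-window test. A minimal sketch, with illustrative names and an assumed window length:

```python
def note_sounds(strike_time_s: float, zone_open_s: float,
                zone_len_s: float = 0.5) -> bool:
    """A layered note sounds only if the player's strike lands inside the
    Trigger Zone window that opens when the activator object arrives."""
    return zone_open_s <= strike_time_s <= zone_open_s + zone_len_s
```

A strike before the zone opens or after it closes simply produces no note, which is what keeps the triggered layer musically in time with the underscore.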
- The player can also enter Freestyle Play at any time, which allows the player to trigger any of the instruments at will, creating new melodies and riffs without the boundary of the "Trigger Zone"—free to just jam and create their own music composition. The game includes an optional scoring algorithm for Free Play, allowing users to compete here as well.
- FIG. 1 illustrates four virtual triggers 100 in a Foreground cyber environment. The virtual triggers 100 are shown on top of an empty Background environment 102. The illustrated virtual triggers 100 are Guitar 104, DJ Deck 106, Saxophone 108, and Keyboard instrument 110, although these virtual triggers could take any other form, such as a laser beam or other item without an instrument icon. The virtual triggers 100 are spatial and "float" in front of the Background environment 102, collectively forming a cyber environment 112.
- The virtual triggers 100 appear in the cyber reality space in front of the user, who interacts with them by physically reaching out to where the virtual triggers 100 are perceived to be within a cyber reality display 505 (see FIG. 5), such as a virtual reality headset, and touching them in a prescribed way with hands, a physical object such as a conductor's baton, or a hand-held controller.
- There can be any number of virtual triggers 100, and the virtual triggers 100 can be placed anywhere in the cyber environment 112: directly in front, off to the side, on top of, or behind the user, requiring the user to look to the side, up, or back to see them.
- The virtual trigger 100 can be any Cyber Reality object or element that indicates when a user interacts with it. These interactive Cyber Reality objects/elements send standard gestures or notifications to an Application Engine 502, as shown in FIG. 5, when the user interacts with them in a prescribed way. The Application Engine 502 sends a Trigger-ON notification to a sound engine 504 when a gesture is received from a virtual trigger 100, and a corresponding Trigger-OFF is sent when the gesture ends.
- Interactive virtual triggers 100 are configured to manipulate the Foreground environment to provide visual feedback on display 505 when they are triggered, such as, but not limited to, highlighting or altering the trigger imagery in some way, or by introducing additional graphic objects into the Foreground. Such trigger-specific manipulations cause the cyber reality Foreground to be dynamically linked to music programs that are being interactively triggered. Each time the performer activates a virtual trigger 100, such as by gesturing, a corresponding signal is sent to an application engine 502, as will be described shortly, and causes the presentation of content associated with the activated virtual trigger 100. Two or more activated virtual triggers 100 create a composition, including but not limited to sympathetic music, whereby each virtual trigger 100 is associated with a music program stored in a media file 506, and the music programs are synchronized when they are played. Each music program may comprise a sub-part of a composition, such as a subset of a song where each subset corresponds to a particular instrument's portion of the composition. These music programs can consist of, for example, one or more MIDI files, samples such as .wav and .mp3 files, etc.
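The gesture-to-notification flow between the engines can be sketched as follows. The class and method names are illustrative assumptions; only the Trigger-ON/Trigger-OFF behavior comes from the text.

```python
class SoundEngine:
    """Minimal stand-in for sound engine 504: tracks which music
    programs are currently sounding."""
    def __init__(self):
        self.playing = set()

    def trigger_on(self, trigger_id):
        self.playing.add(trigger_id)      # start this trigger's music program

    def trigger_off(self, trigger_id):
        self.playing.discard(trigger_id)  # stop it when the gesture ends


class ApplicationEngine:
    """Minimal stand-in for application engine 502: forwards gesture
    start/end events as Trigger-ON / Trigger-OFF notifications."""
    def __init__(self, sound_engine):
        self.sound = sound_engine

    def gesture_received(self, trigger_id):
        self.sound.trigger_on(trigger_id)

    def gesture_ended(self, trigger_id):
        self.sound.trigger_off(trigger_id)
```

In this sketch a held gesture keeps a program in `playing` until the gesture ends, mirroring the prolonged-first-state behavior noted in the Definitions.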
- FIG. 2 illustrates a possible embodiment of a cyber environment 112 where the virtual triggers 100 are arranged to present a wall of music instrument icons in front of the user within the cyber environment 112, as viewed on display 505. The virtual triggers 100 appear in front of background environment 102, consisting of an interactive kaleidoscope that is being controlled by the virtual (music program) triggers 100.
- The virtual triggers 100 are configured to manipulate the background cyber environment 102 when they are triggered, such as, but not limited to, modifying the color properties of specific elements in the display 505 or changing the imagery entirely.
- On an individual basis, each virtual trigger 100, or the sound produced by the virtual trigger 100, controls or adjusts a unique color component for the display 505. The brightness of the color could optionally be linked to the volume level of the sound being produced by the virtual trigger 100.
- In addition, each virtual trigger 100 can increase or decrease the value of a property used to generate the kaleidoscopic design itself (number of petals, number of orbits, radial suction, and scale factor). The amount of adjustment can be linked to the volume level of the sound being interactively produced by the virtual triggers 100.
- The same concept can be applied to simulated multi-colored laser displays that draw geometric patterns in the cyber reality background, where the color attributes or geometric rendering properties are manipulated interactively by the virtual triggers 100 and/or the sounds that are interactively produced by the virtual triggers 100.
- Such trigger-specific manipulations cause the cyber reality background 102 to be dynamically linked to the music programs that are being interactively triggered.
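A rough sketch of the volume-linked background control described above. The mapping, names, and constants are illustrative assumptions; only the idea that brightness and kaleidoscope properties track trigger volume comes from the text.

```python
def kaleidoscope_params(volume: float, base_petals: int = 8) -> dict:
    """Map a trigger's volume level (0.0-1.0) to background properties:
    brightness tracks the volume directly, and the petal count grows
    with it."""
    brightness = max(0.0, min(1.0, volume))   # clamp to the valid range
    petals = base_petals + int(brightness * 8)
    return {"brightness": brightness, "petals": petals}
```

The same shape of mapping could drive the other kaleidoscope properties (orbits, radial suction, scale factor) or the geometry of a simulated laser display.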
- FIG. 3 shows an embodiment for a Game Play mode of the cyber environment 112, as viewed on display 505, where five virtual triggers 100 are normally disabled and require interaction from animated virtual trigger activators 300, which are shown in this example as droplets.
- When an animated virtual trigger activator 300 has a collision 301 with a virtual trigger object 100, the associated virtual trigger 100 becomes active and enabled for a predetermined time period, and then the associated virtual trigger 100 returns to a disabled state. This illustration shows an embodiment where the trigger activator 300 changes color when it collides with the associated virtual trigger 100. In other embodiments, the trigger activator 300 could also be highlighted during a collision.
- While a virtual trigger 100 is active, it can change color or be highlighted in some way to indicate its active state.
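The activate-for-a-window-then-disable behavior can be sketched as a small state holder. The class name and the explicit clock parameter are illustrative assumptions; the 0.5-second default window comes from the text.

```python
class VirtualTrigger:
    """A trigger that is enabled only for a fixed window after a collision."""

    def __init__(self, window_s: float = 0.5):
        self.window_s = window_s        # predetermined active time window
        self.activated_at = None        # clock time of the last collision

    def collide(self, now_s: float) -> None:
        """An activator object collided with this trigger: open the window."""
        self.activated_at = now_s

    def is_active(self, now_s: float) -> bool:
        """Active only within window_s seconds of the last collision."""
        return (self.activated_at is not None
                and 0.0 <= now_s - self.activated_at <= self.window_s)
```

Highlighting would then simply follow `is_active`, so the visual cue and the playable window never drift apart.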
FIG. 4 illustrates virtual trigger activation withanimated drops 300 as trigger activators. - When a signal is received from a virtual trigger activator 509 (see
FIG. 5 ), a teardrop shapedtrigger activation object 300 is instantiated directly above avirtual trigger 100 where it begins traveling down apath 404 towards thevirtual trigger object 100. - When the
trigger activation object 300 collides at 301 with thevirtual trigger 100, thevirtual trigger 100 is activated for a predetermined time window, after which, it will become inactive again.Virtual triggers 100 are highlighted while they are active. The predetermined time window can be selectively controlled byapplication engine 502, and may be, for instance, 0.5 seconds. - The
trigger activation object 300 can change color or be highlighted when it collides at 301 with thevirtual trigger object 100 during this predetermined time window. - If the user interacts with the
virtual trigger 100 while it is highlighted and active, anappropriate gesture input 508 will be sent to theapplication engine 502, and a sound tile assigned to thevirtual trigger 100 will be played and heard viaspeaker 512. If the user interacts with avirtual trigger 100 while it is inactive, nothing will be sent to theapplication engine 502 and no sound is produced atspeaker 512. - Game scoring has two different methods depending on the current mode of play: Game Play mode 302 (
FIG. 4) and Free Style mode 602 (FIG. 6). - Both modes start with a displayed
score 303 having a value of zero (0) when the application is launched and the score value is dynamically adjusted as described when the application is running. - In
Game Play mode 302, if the user interacts with the virtual trigger object 100 while it is highlighted and active, a sound is played on speaker 512, and a predetermined value is added to the current score 303, such as a value of 100. If the user does not interact with a virtual trigger 100 while it is active, or interacts with it while it is inactive, no sound is played on speaker 512, nothing is added to the displayed score 303, and, in some embodiments, a predetermined value may optionally be deducted from the score 303, such as 25 points. - In the
Free Style mode 602, the virtual triggers 100 are always active and will always produce sound whenever the user interacts with them. Scoring in Free Style mode has its own methods and is optional as a user selection. - If selected, Free Style scoring can be based on the musical timing of
virtual trigger 100 gestures in relation to the musical underscore as defined by its tempo, time signature, and its current metronome value as determined by the application engine 502. When a gesture 508 is sent to the application engine 502 that is musically “in time” with the musical underscore, a predetermined value can be added to the current score 303, such as 100. Conversely, when a gesture 508 is sent to the application engine 502 that is badly out of time with the musical underscore as determined by the application engine 502, a predetermined value can optionally be deducted from the score 303, such as 25. Alternatively, additional algorithms that involve other parameters can determine the scoring during free style play. - Multiple players can optionally save their game scores for competitive playing against each other.
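As a concrete illustration of the two scoring methods, the sketch below uses the example point values from the text (+100 per hit, an optional 25-point deduction) and, for Free Style mode, scores a gesture by its distance to the nearest metronome beat of the underscore. The function names and the in-time/out-of-time tolerance windows are assumptions for illustration, not part of the disclosure.

```python
def score_game_play(score, trigger_active, interacted,
                    hit_value=100, penalty=25, penalize_misses=False):
    """Game Play mode: points only for interacting with an active trigger;
    a miss may optionally deduct points (values are the text's examples)."""
    if trigger_active and interacted:
        return score + hit_value
    # A miss is either an untouched active trigger or a touched inactive one.
    missed = trigger_active != interacted
    if missed and penalize_misses:
        return score - penalty
    return score


def score_free_style(score, gesture_time_s, tempo_bpm,
                     in_time_s=0.1, out_of_time_s=0.25,
                     bonus=100, penalty=25):
    """Free Style mode: reward gestures that land near a metronome beat;
    the tolerance windows here are illustrative assumptions."""
    beat_s = 60.0 / tempo_bpm
    phase = gesture_time_s % beat_s
    offset = min(phase, beat_s - phase)   # distance to the nearest beat
    if offset <= in_time_s:
        return score + bonus              # musically "in time"
    if offset >= out_of_time_s:
        return score - penalty            # badly out of time (optional)
    return score


print(score_game_play(0, trigger_active=True, interacted=True))  # 100
print(score_free_style(0, gesture_time_s=1.02, tempo_bpm=120))   # 100
```

At 120 BPM a beat falls every 0.5 seconds, so a gesture at 1.02 s is 0.02 s from the beat at 1.0 s and earns the bonus.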
-
FIG. 5 illustrates a diagram of a system 500 for multilayered media playback in Cyber Reality gaming in accordance with embodiments of the present disclosure. - The system 500 can be implemented in an
electronic device 501, and embodied as one of a computer, a smart phone, a tablet, a touchscreen computer, a cyber reality headset 601 (FIG. 6), and the like having display 505. -
Application engine 502 is operable on an electronic processor 503 and receives one or more Gestures from the multiple virtual triggers 100 within the cyber reality environment 112, such as shown in FIG. 1, FIG. 2, FIG. 3, FIG. 4, and FIG. 6. -
Application engine 502 controls playback of media files 506 that are combined to form a multilayered media file based on one or more of the Gesture inputs 508, according to a definition file 511, via the sound engine 504. The media files 506 can be one or more MIDI files, samples such as .wav and .mp3 files, video files in a plurality of formats, and/or any other audio or video file format. - The application offers two user selectable modes of operation: game play 302 and
free style 602, as previously described, each having its own playing methods and scoring algorithms. - In
game play mode 302, the virtual trigger activator 509 receives virtual trigger activation cues 510 from the media files 506 that are associated with the content of the musical programs, and controls the active/inactive state of the virtual triggers 100 as shown in FIG. 4. When the virtual triggers 100 are under control of the virtual trigger activator 509, they will only pass gestures 508 to the application engine 502 while they are activated. - In
free style mode 602, the virtual trigger activator 509 is bypassed and all virtual triggers 100 are always active. -
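The gating difference between the two modes reduces to a single predicate, sketched below; the mode names and function name are illustrative assumptions.

```python
# Hypothetical mode labels; the disclosure numbers them 302 and 602.
GAME_PLAY, FREE_STYLE = "game_play", "free_style"


def forwards_gesture(mode, trigger_active):
    """In game play mode, the trigger activator gates gestures: they reach
    the application engine only while the trigger is activated. In free
    style mode, the activator is bypassed, so every gesture passes."""
    if mode == FREE_STYLE:
        return True
    return trigger_active


print(forwards_gesture(GAME_PLAY, trigger_active=False))   # False
print(forwards_gesture(FREE_STYLE, trigger_active=False))  # True
```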
Gesture inputs 508 include one or more standard gestures that indicate when and how an interactive virtual trigger 100 is being “touched” by the user within the cyber environment 112. Gesture inputs 508 used for triggering may include, but are not limited to, a Tap gesture and a Tap-and-Hold gesture. - With a Tap gesture, the touch is held within the
virtual trigger 100 for a substantially short period of time, such as with a threshold for the short period of time of 0.5 seconds or less. The application engine 502 can use the Tap gesture to trigger a one-shot, play a single note in a streamed sequence, or start and stop a loop. - With a Tap-and-Hold gesture, the touch is held within the virtual trigger object 100 for a longer period of time, such as with a threshold for the longer period of time of 0.5 seconds or more. Additional thresholds may be used for a Tap-and-Hold gesture, with each threshold associated with a different action to be taken by the
application engine 502. - The
Application engine 502 can use a Tap-and-Hold gesture to Pulse (stream) notes. - Many additional gestures can trigger the virtual triggers, such as the user using an object, such as a wand, to trigger the virtual triggers 100.
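The threshold scheme above can be sketched as a simple duration classifier. The 0.5 second boundary comes from the text; the handling of additional thresholds and the returned labels are illustrative assumptions.

```python
def classify_gesture(hold_s, tap_threshold_s=0.5, extra_thresholds_s=()):
    """Classify a touch by how long it is held inside a virtual trigger.

    Below the threshold it is a Tap (one-shot, single note in a streamed
    sequence, or loop start/stop); at or above it, a Tap-and-Hold (e.g.
    pulsed notes). extra_thresholds_s sketches the additional thresholds
    the engine may map to further actions.
    """
    if hold_s < tap_threshold_s:
        return "tap"
    # Each additional threshold crossed bumps the Tap-and-Hold level.
    level = 1 + sum(1 for t in sorted(extra_thresholds_s) if hold_s >= t)
    return f"tap_and_hold_{level}"


print(classify_gesture(0.2))                             # tap
print(classify_gesture(0.8))                             # tap_and_hold_1
print(classify_gesture(2.5, extra_thresholds_s=(2.0,)))  # tap_and_hold_2
```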
-
Processor 503 is configured such that visual outputs from the application engine 502 are displayed within the cyber reality environment 112 on display 505 and output from sound engine 504 is played on speaker 512. The combination of application engine 502 and sound engine 504 forms an application on the processor 503. The processor 503 is configured to selectively associate the music programs with each of the plurality of virtual triggers 100. The processor 503 is configured such that when one of the virtual triggers 100 is in a first state for a prolonged period of time, successive said audible musical sounds are generated, such that, for instance, the musical program associated with the virtual trigger 100 continues to play uninterrupted, along with any other music programs that are playing in response to the associated virtual trigger 100 being triggered. -
Display 505 displays the total cyber reality environment 112, which includes Foreground and Background visualizations. - When a
virtual trigger 100 is virtually touched or triggered by a user Gesture, trigger-specific visual output from application engine 502 can be displayed to simulate triggering a virtual trigger 100 within the cyber reality environment 112. - When a
virtual trigger 100 is triggered by a user Gesture, trigger-specific visual output from application engine 502 can be displayed to alter the display properties or attributes of any element within the cyber reality environment 112, such as the virtual triggers 100 in the Foreground or what the user sees in the Background behind the virtual triggers 100. -
FIG. 6 shows an embodiment for Free Style mode 602, where the virtual triggers 100 are always enabled and active. -
FIG. 6 shows how a user could interact with the cyber environment 112 while in Free Style mode 602, where the virtual triggers 100 are icons that are arranged in front of the user as seen in display 505 of cyber reality headset display 601. - The
virtual triggers 100 are not controlled by the virtual trigger activators 300 and are always active and enabled. Every time a user interacts with a virtual trigger 100, the appropriate gesture 508 is sent to the application engine 502 and the assigned media file 506 is played. - In this illustration, the
virtual triggers 100 are icons that are floating in front of a kaleidoscope background 102, which is dynamically controlled by the sounds being interactively played. - In both the Game Play mode and the Free Style mode, the generated audio music from triggering the virtual triggers is saved as a
separate media file 506, and is available for playback on device 501, and is also transferable to another device for play, such as to a computer or a portable electronic device, such as a smart phone. The saved generated audio music file 506 is shareable, such as wirelessly using Bluetooth or via a removable storage device, and is also uploadable to a device, network, or the Cloud, such as over WiFi using applications including Facebook, Snapchat, and Twitter. - The appended claims set forth novel and inventive aspects of the subject matter described above, but the claims may also encompass additional subject matter not specifically recited in detail. For example, certain features, elements, or aspects may be omitted from the claims if not necessary to distinguish the novel and inventive features from what is already known to a person having ordinary skill in the art. Features, elements, and aspects described herein may also be combined or replaced by alternative features serving the same, equivalent, or similar purpose without departing from the scope of the invention defined by the appended claims.
Claims (1)
1. A device, comprising:
an electronic processor configured to generate a first signal as a function of virtual triggers selected by a user;
the electronic processor configured to generate a second signal as a function of a plurality of music programs, wherein each said music program comprises sound elements comprising a subset of a musical composition, and the music programs are correlated to each other;
the electronic processor configured to generate a score as a function of the user selecting the virtual triggers; and
a cyber reality headset configured to display the virtual triggers and the score.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/806,734 US20200202824A1 (en) | 2016-07-20 | 2020-03-02 | Cyber reality device including gaming based on a plurality of musical programs |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/215,427 US9542919B1 (en) | 2016-07-20 | 2016-07-20 | Cyber reality musical instrument and device |
US15/402,012 US9646588B1 (en) | 2016-07-20 | 2017-01-09 | Cyber reality musical instrument and device |
US15/483,910 US10418008B2 (en) | 2016-07-20 | 2017-04-10 | Cyber reality device including gaming based on a plurality of musical programs |
US16/564,226 US10593311B2 (en) | 2016-07-20 | 2019-09-09 | Cyber reality device including gaming based on a plurality of musical programs |
US16/806,734 US20200202824A1 (en) | 2016-07-20 | 2020-03-02 | Cyber reality device including gaming based on a plurality of musical programs |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/564,226 Continuation US10593311B2 (en) | 2016-07-20 | 2019-09-09 | Cyber reality device including gaming based on a plurality of musical programs |
Publications (1)
Publication Number | Publication Date |
---|---|
US20200202824A1 true US20200202824A1 (en) | 2020-06-25 |
Family
ID=60988114
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/483,910 Expired - Fee Related US10418008B2 (en) | 2016-07-20 | 2017-04-10 | Cyber reality device including gaming based on a plurality of musical programs |
US16/564,226 Expired - Fee Related US10593311B2 (en) | 2016-07-20 | 2019-09-09 | Cyber reality device including gaming based on a plurality of musical programs |
US16/806,734 Abandoned US20200202824A1 (en) | 2016-07-20 | 2020-03-02 | Cyber reality device including gaming based on a plurality of musical programs |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/483,910 Expired - Fee Related US10418008B2 (en) | 2016-07-20 | 2017-04-10 | Cyber reality device including gaming based on a plurality of musical programs |
US16/564,226 Expired - Fee Related US10593311B2 (en) | 2016-07-20 | 2019-09-09 | Cyber reality device including gaming based on a plurality of musical programs |
Country Status (3)
Country | Link |
---|---|
US (3) | US10418008B2 (en) |
EP (1) | EP3488322A1 (en) |
WO (1) | WO2018017613A1 (en) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11137601B2 (en) * | 2014-03-26 | 2021-10-05 | Mark D. Wieczorek | System and method for distanced interactive experiences |
US10418008B2 (en) * | 2016-07-20 | 2019-09-17 | Beamz Ip, Llc | Cyber reality device including gaming based on a plurality of musical programs |
US10782779B1 (en) * | 2018-09-27 | 2020-09-22 | Apple Inc. | Feedback coordination for a virtual interaction |
EP3844863A4 (en) | 2018-10-13 | 2021-11-03 | Planar Motor Incorporated | Systems and methods for identifying a magnetic mover |
CN112581922A (en) * | 2019-09-30 | 2021-03-30 | 圣诞先生公司 | System for non-contact musical instrument |
US11893898B2 (en) | 2020-12-02 | 2024-02-06 | Joytunes Ltd. | Method and apparatus for an adaptive and interactive teaching of playing a musical instrument |
US11972693B2 (en) | 2020-12-02 | 2024-04-30 | Joytunes Ltd. | Method, device, system and apparatus for creating and/or selecting exercises for learning playing a music instrument |
US11670188B2 (en) | 2020-12-02 | 2023-06-06 | Joytunes Ltd. | Method and apparatus for an adaptive and interactive teaching of playing a musical instrument |
US11900825B2 (en) | 2020-12-02 | 2024-02-13 | Joytunes Ltd. | Method and apparatus for an adaptive and interactive teaching of playing a musical instrument |
US20240203058A1 (en) * | 2021-04-20 | 2024-06-20 | Quill & Quaver Associates Pty. Ltd. | System and method for performance in a virtual reality environment |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6960715B2 (en) * | 2001-08-16 | 2005-11-01 | Humanbeams, Inc. | Music instrument system and methods |
US8431811B2 (en) * | 2001-08-16 | 2013-04-30 | Beamz Interactive, Inc. | Multi-media device enabling a user to play audio content in association with displayed video |
US20070245881A1 (en) * | 2006-04-04 | 2007-10-25 | Eran Egozy | Method and apparatus for providing a simulated band experience including online interaction |
US8079907B2 (en) * | 2006-11-15 | 2011-12-20 | Harmonix Music Systems, Inc. | Method and apparatus for facilitating group musical interaction over a network |
US8907193B2 (en) * | 2007-02-20 | 2014-12-09 | Ubisoft Entertainment | Instrument game system and method |
US20080200224A1 (en) * | 2007-02-20 | 2008-08-21 | Gametank Inc. | Instrument Game System and Method |
EP2173444A2 (en) * | 2007-06-14 | 2010-04-14 | Harmonix Music Systems, Inc. | Systems and methods for simulating a rock band experience |
US8678896B2 (en) * | 2007-06-14 | 2014-03-25 | Harmonix Music Systems, Inc. | Systems and methods for asynchronous band interaction in a rhythm action game |
US20090310027A1 (en) * | 2008-06-16 | 2009-12-17 | James Fleming | Systems and methods for separate audio and video lag calibration in a video game |
US8663013B2 (en) * | 2008-07-08 | 2014-03-04 | Harmonix Music Systems, Inc. | Systems and methods for simulating a rock band experience |
US7910818B2 (en) * | 2008-12-03 | 2011-03-22 | Disney Enterprises, Inc. | System and method for providing an edutainment interface for musical instruments |
US20100184497A1 (en) * | 2009-01-21 | 2010-07-22 | Bruce Cichowlas | Interactive musical instrument game |
US8445767B2 (en) * | 2009-04-11 | 2013-05-21 | Thomas E. Brow | Method and system for interactive musical game |
US8629342B2 (en) * | 2009-07-02 | 2014-01-14 | The Way Of H, Inc. | Music instruction system |
US20110077080A1 (en) * | 2009-09-30 | 2011-03-31 | Syed Ashraf Meer | 3D Gaming client for interactive musical video games inventor(s) |
US20110086704A1 (en) * | 2009-10-14 | 2011-04-14 | Jack Daniel Davis | Music game system and method of providing same |
KR101114750B1 (en) * | 2010-01-29 | 2012-03-05 | 주식회사 팬택 | User Interface Using Hologram |
US8874243B2 (en) * | 2010-03-16 | 2014-10-28 | Harmonix Music Systems, Inc. | Simulating musical instruments |
US8895830B1 (en) * | 2012-10-08 | 2014-11-25 | Google Inc. | Interactive game based on user generated music content |
US20150046808A1 (en) * | 2013-08-08 | 2015-02-12 | Beamz Interactive, Inc. | Apparatus and method for multilayered music playback |
KR20160113592A (en) * | 2014-01-27 | 2016-09-30 | 엘지전자 주식회사 | Wearable glass-type device and method of controlling the device |
EP4218975A3 (en) * | 2015-05-19 | 2023-08-30 | Harmonix Music Systems, Inc. | Improvised guitar simulation |
US11110355B2 (en) * | 2015-06-19 | 2021-09-07 | Activision Publishing, Inc. | Videogame peripheral security system and method |
US20170185141A1 (en) * | 2015-12-29 | 2017-06-29 | Microsoft Technology Licensing, Llc | Hand tracking for interaction feedback |
US10418008B2 (en) * | 2016-07-20 | 2019-09-17 | Beamz Ip, Llc | Cyber reality device including gaming based on a plurality of musical programs |
US9542919B1 (en) * | 2016-07-20 | 2017-01-10 | Beamz Interactive, Inc. | Cyber reality musical instrument and device |
US20180126268A1 (en) * | 2016-11-09 | 2018-05-10 | Zynga Inc. | Interactions between one or more mobile devices and a vr/ar headset |
-
2017
- 2017-04-10 US US15/483,910 patent/US10418008B2/en not_active Expired - Fee Related
- 2017-07-18 WO PCT/US2017/042671 patent/WO2018017613A1/en unknown
- 2017-07-18 EP EP17746575.4A patent/EP3488322A1/en not_active Withdrawn
-
2019
- 2019-09-09 US US16/564,226 patent/US10593311B2/en not_active Expired - Fee Related
-
2020
- 2020-03-02 US US16/806,734 patent/US20200202824A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
WO2018017613A1 (en) | 2018-01-25 |
EP3488322A1 (en) | 2019-05-29 |
US20180025710A1 (en) | 2018-01-25 |
US10418008B2 (en) | 2019-09-17 |
US10593311B2 (en) | 2020-03-17 |
US20200005742A1 (en) | 2020-01-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10593311B2 (en) | Cyber reality device including gaming based on a plurality of musical programs | |
US9542919B1 (en) | Cyber reality musical instrument and device | |
US7435169B2 (en) | Music playing apparatus, storage medium storing a music playing control program and music playing control method | |
US8858330B2 (en) | Music video game with virtual drums | |
US8749495B2 (en) | Multiple actuation handheld device | |
Blaine et al. | Contexts of collaborative musical experiences | |
JP5890968B2 (en) | Game device, game program | |
US8696456B2 (en) | Music-based video game with user physical performance | |
Dahl et al. | Sound Bounce: Physical Metaphors in Designing Mobile Music Performance. | |
EP1529280A1 (en) | Digital musical instrument system | |
TW200951764A (en) | Gesture-related feedback in electronic entertainment system | |
US9799314B2 (en) | Dynamic improvisational fill feature | |
Kim et al. | TapBeats: accessible and mobile casual gaming | |
Michalakos | Designing musical games for electroacoustic improvisation | |
US9751019B2 (en) | Input methods and devices for music-based video games | |
KR100895259B1 (en) | Total sound control music game method and apparatus and computer-readable recording medium for recording program therefor | |
Papworth | iSpooks: an audio focused game design | |
KR20000072250A (en) | Dancing machine for hand | |
JP6590782B2 (en) | Game device, program | |
KR101547933B1 (en) | Video game control method and apparatus | |
Ariza | The Dual-Analog Gamepad as a Practical Platform for Live Electronics Instrument and Interface Design. | |
Zamorano | SimpleTones: a collaborative sound controller system for non-musicians | |
JP2002166047A (en) | Game system, program and information-recording medium | |
Bott et al. | One man band: a 3D gestural interface for collaborative music creation | |
Loyer | Stories as instruments |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
AS | Assignment |
Owner name: TOPDOWN LICENSING LLC, TEXAS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BEAMZ IP, LLC;REEL/FRAME:053029/0492 Effective date: 20200528 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |