NO345656B1 - Game Story System for Mobile Apps - Google Patents


Info

Publication number
NO345656B1
NO345656B1
Authority
NO
Norway
Prior art keywords
game
scene
story
app
game story
Prior art date
Application number
NO20190524A
Other languages
Norwegian (no)
Other versions
NO20190524A1 (en)
Inventor
Ole-Ivar Holthe
Original Assignee
Holthe Ole Ivar
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Holthe Ole Ivar filed Critical Holthe Ole Ivar
Priority to NO20190524A priority Critical patent/NO345656B1/en
Publication of NO20190524A1 publication Critical patent/NO20190524A1/en
Publication of NO345656B1 publication Critical patent/NO345656B1/en

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F2300/55 Details of game data or player data management
    • A63F2300/5526 Game data structure
    • A63F2300/554 Game data structure by saving game or status data
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/63 Methods for processing data by generating or executing the game program for controlling the execution of the game in time
    • A63F2300/636 Methods for processing data by generating or executing the game program for controlling the execution of the game in time involving process of starting or resuming a game

Description

FIELD OF THE INVENTION
The present invention relates generally to user interfaces for mobile devices, and more particularly to a game story system for mobile apps.
BACKGROUND OF THE INVENTION
The mobile, computer and video game industry offers many different approaches to finding, playing, and sharing games and content. Mobile, PC and video game console platforms typically have a main store that users use to find games. The PC platform has the largest variety of stores. One problem with this is that as the number of platforms increases, it becomes increasingly difficult for users to visit these stores periodically to find games to play. Another problem is that the number of games in all stores is growing at such a rapid pace that it is difficult for store managers to predict which games will be popular with respect to specific user needs, which in turn makes it difficult for users to find games.
Mobile game stores are dominated by casual and "hyper-casual" games. Finding a good high-end game title is difficult because such titles tend to "drown" in a crowd of casual games. Another problem is that it is difficult for game publishers to advertise a game that is not a casual game.
PC game stores usually have a large variety of games, from indie game developers to large game studios. Finding a good game is difficult because users have to search and browse a large collection of games.
Video game consoles have the highest proportion of high-end game titles compared to the other platforms. There is a large variety of games on consoles as well. Finding the right game is difficult if you are not looking for a high-end game title.
There are many game review web sites and applications, but they typically take a traditional text-based or video-based game review approach. There are also web sites and applications where people can stream and share gaming experiences, but they are not very efficient with respect to quickly finding a good game to play.
There are many inventions within the technical area of the invention, such as for example:
o Interactive spectating interface for live videos (US patent application number US 2018/0359295 A1) that provides an interface for spectators to view live videos of users that are playing games, enabling spectators to react with comments, etc.
o Dynamic story driven gameworld creation (US patent application number US 2015/0165310 A1) that provides an approach for users (e.g. children) to easily create a story driven gameworld.
o Story-driven game creation and publication system (US patent application number US 2018/0036639 A1) that is another approach to generating a game.
o Apparatus and method for providing a game (South Korea patent application number KR 20170052407 A) that provides a way to show game stories.
o Method and system for saving a snapshot of game play, which is used to begin later execution of the game play by any user as executed on a game cloud system (US patent application number US 2017/0354888 A1).
Systems and solutions for creating and generating games could make it easier for people to produce games, videos, stories, etc., which could lead to a significant growth in the amount of games and content being produced. Systems and solutions for generating snapshots and video from games could make it easier to make summaries of the experiences in games.
Therefore, what is clearly needed is a game story system for mobile apps that can solve the problems mentioned above.
SUMMARY OF THE INVENTION
The following presents a simplified summary of the invention in order to provide a basic understanding of some aspects of the invention. This summary is not an extensive overview of the invention. It is intended to neither identify key or critical elements of the invention nor delineate the scope of the invention. Its sole purpose is to present some concepts of the invention in a simplified form as a prelude to the more detailed description that is presented later.
In an embodiment of the invention a computer implemented method for showing a game story on the main display of a smartphone, tablet or similar computing device is provided, comprising showing a game story opening scene, receiving a goto scene trigger condition, and in response, proceeding to show game story main scenes in an iterative manner, in response to a goto scene trigger condition from each scene, until a game story final scene is shown or an ending game story trigger condition is received.
In another aspect of the invention a smartphone, tablet or similar computing device is provided, comprising a display that can show a user interface, and an app process configured to execute on the device and further configured to show a game story opening scene, receive a goto scene trigger condition, and in response, proceed to show game story main scenes in an iterative manner, in response to a goto scene trigger condition from each scene, until a game story final scene is shown or an ending game story trigger condition is received.
To the accomplishment of the foregoing and related ends, certain illustrative aspects of the invention are described herein in connection with the following description and the annexed drawings. These aspects are indicative, however, of but a few of the various ways in which the principles of the invention may be employed and the present invention is intended to include all such aspects and their equivalents. Other advantages and novel features of the invention will become apparent from the following detailed description of the invention when considered in conjunction with the drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of the present invention will now be described with reference to the following drawings, where:
FIG.1 is an example game browsing user interface,
FIG.2 is an example game story user interface,
FIG.3 is a block diagram illustrating an example game story scene object model that is suitable for practicing the playback of a game story scene of the present invention,
FIG.4 is an example game story scene user interface that shows a full-screen video presentation with elements,
FIG.5 is an example game story user interface that shows a slide-up details view,
FIG.6 is an example conceptual user interface for the home screen on a mobile device,
FIG.7 is a flow diagram illustrating a methodology for determining see-through areas on the home screen of the device in accordance with an embodiment of the present invention,
FIG.8 is an example single slide pad controller user interface for a remotely playable game that is shown in a game story scene,
FIG.9 is an example triggers, bumpers and d-pads controller user interface for a remotely playable game that is shown in a game story scene, and
FIG.10 is an example perspective layout triggers, bumpers and d-pads controller user interface for a remotely playable game that is shown in a game story scene.
DETAILED DESCRIPTION
The inventor provides a new and unique game story system for mobile apps. The game story system comprises a smartphone, tablet or other similar computing device, that has a display that shows a user interface for a game story, game browsing, slide-up detail, Live Wallpapers, etc. The present invention is described in enabling detail in the following examples that could represent more than one embodiment of the present invention.
FIG.1 illustrates a two-column game browsing user interface. The user interface is displayed on a display 101 of a smartphone, tablet or other similar computing device 100. A status bar 102 may be shown by the operating system of the device. A navigation bar may be shown below the line 125. The navigation bar usually comprises a background 114 and buttons 126, 127 and 128. The navigation bar is rendered by the operating system of the device. FIG.1 shows the user interface on an Android operating system. The status bar and navigation bar vary in appearance and functionality on different versions of the operating system. iOS and certain Android phones do not have a navigation bar, but have button(s) on the device instead. Other operating systems may have other elements like these. The elements in FIG.1 are included to illustrate an example device and the user interface. The display aspect ratio will typically also vary on different devices. The smartphone, tablet or other similar computing device may have many displays. The main display is the display that a user perceives to be the main user interface of the device.
FIG.1 shows an example main user interface of an application, or app. The app could have a top area 103 with a logo. The top area is typically the area above the menu 104. The top area can also be smaller or larger, and could simply show a title for the page, images, effects, menu, other elements, fields, etc. The top area in FIG.1 shows "Geelix" as the logo to illustrate such an appearance. The menu 104 in FIG.1 shows a menu with entries the user can tap to view games for specific game platforms. In this example menu, "All" means all game platforms, "Mobile" means mobile games, "PC" means PC games, "PS5" means PlayStation 5 games, etc. There is a menu-button 105 on the right side of the menu 104. The item browse list 106 through 112 changes when the user presses or taps a menu entry. The menu shown in FIG.1 shows game platforms, but it could also show any other list of relevant entries, such as a search field, popular, featured, new, etc. The menu will typically vary depending on the user's choice. The item browse list shown in FIG.1 illustrates six game items 107 to 112 in the current view. The user can scroll this list with a finger to reveal more game items below 111 and 112. This list could be implemented with a ListView or RecyclerView on Android or similar on other operating systems. The bottom navigation menu 113 above the line 125, which includes the buttons and labels 115 through 124, is used to navigate between the main user interfaces, or main sections, in the app. The bottom navigation menu typically works the same way as the bottom navigation menu in most apps. The line 125 is for illustration purposes and is not visible in the app. The menu 104 typically comprises many entries, often more than what can be shown on the display at the same time, so the user can scroll the menu entries horizontally. There can be an alpha color gradient image next to the menu-button 105 to make it appear like the menu entries fade out as they go under the menu-button.
The bottom navigation menu 113 entries 115 to 124 can also be scrollable horizontally if there are many entries. It can also be expandable vertically to reveal more entries or functionality. The menu 104, top area 103, and bottom navigation menu 113 take up space on the display. Some or all of these areas can be hidden when the user scrolls down the list. Some or all of these areas could reappear when the user scrolls up in the list. There can be different animations for all of them when hiding and showing. There could be a Floating Action Button (FAB) in the user interface with actions. The menu can have a different color, text style, elements, etc. for selected items. The menu 104 and the item browse list 106 through 112 could be an Android tab layout with horizontally swipeable (or side-swipe) views, menu items, etc. The menu and item browse list show game platforms and games in this figure, but could show other types of menu entries and item types depending on content, app, etc. It is important to note that the main user interface in FIG.1 represents one way to provide a main user interface. A user interface that uses a Navigation Drawer instead of a bottom navigation menu is another important main user interface.
FIG.2 shows an example user interface for a game story. This user interface can be shown when a user taps or clicks a game item in the item browse list 106 (see FIG.1). The area for displaying the game story 200 will typically fit fully inside the whole display area 101, by showing transparently through the status bar 102 and the navigation bar 114. If the status bar 102 is not visible or transparent, then the top of the area for displaying the game story 200 will typically more or less be at the top of the display. If the status bar 102 is visible and mostly not transparent, then the top of the area for displaying the game story 200 will typically more or less be at the bottom of the status bar 102. If the navigation bar 114 is not visible or transparent, then the bottom of the area for displaying the game story 200 will typically more or less be at the bottom of the display. If the navigation bar 114 is visible and not transparent, then the bottom of the area for displaying the game story 200 will typically more or less be at the top of the navigation bar 114. The person or software that creates a game story scene could be enabled to decide how to show the scene, status bar and navigation bar. The game story user interface can vary a lot, depending on device, app, content, etc., without departing from the intended scope of the invention.
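The bar-dependent bounds of the area for displaying the game story 200 amount to a small computation. A minimal sketch, assuming pixel coordinates with the origin at the top of the display; the function and parameter names are illustrative, not part of the described system:

```python
def story_area_bounds(display_height, status_bar_height, nav_bar_height,
                      status_bar_visible, status_bar_transparent,
                      nav_bar_visible, nav_bar_transparent):
    """Top and bottom y-coordinates of the game story area.

    The scene draws edge-to-edge (through transparent system bars) unless a
    bar is both visible and opaque, in which case the area is inset by it.
    """
    top = 0
    if status_bar_visible and not status_bar_transparent:
        top = status_bar_height          # start below an opaque status bar
    bottom = display_height
    if nav_bar_visible and not nav_bar_transparent:
        bottom = display_height - nav_bar_height  # end above an opaque nav bar
    return top, bottom
```

With both bars transparent the story area spans the full display; with both opaque it is inset on each side.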
FIG.3 is a block diagram illustrating an example game story scene object model that is suitable for practicing the playback of a game story scene in the area for displaying the game story 200. The game story scene object model comprises a programmatic object called Scene 300 in this example. This object can for example be a Java, Swift, Kotlin or other language object. The scene object can hold references to image element objects 301, video element objects 302, text element objects 303, audio objects 304, and other objects 305. The scene typically also holds references to scripts or data structures for representing interaction event actions. The scene object model is used to render and play an animated and interactive scene in the area for displaying the game story 200. The scene is typically rendered using Android layout objects, such as ConstraintLayout, RelativeLayout, etc., or OpenGL, Vulkan or other graphics application-programming interfaces. The scene can be rendered as a ConstraintLayout, and can have similar properties, methods and events as a ConstraintLayout. An image element can be similar to an ImageView, and a video element can be similar to a SurfaceView with an ExoPlayer streaming video content to it. A text element is similar to a TextView or WebView that can show text or HTML content. Audio objects are used to play and manage audio and music output and can typically be implemented by using several ExoPlayer objects, etc. Other objects can be used to represent other visual or non-visual objects in the scene. An interactive and animated 3D visualization element, or 3D element, could be an other object. The 3D element could be able to load a 3D scene or model (e.g. Unity, Unreal, WebGL, FBX, etc.) and show it. Different types of playable and remotely playable game elements could also be other objects. There could be an AR element as an other object, that could be able to show interactive and animated AR scenes and models, with for example ARCore, ARKit, etc.
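The scene object model above could be sketched, in a language-agnostic way, roughly as follows. The class and field names are illustrative assumptions rather than the patent's actual object model; on Android the elements would wrap views such as ImageView or SurfaceView:

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Element:
    kind: str                      # "image", "video", "text", "audio", "3d", ...
    source: str                    # URL or asset reference
    layout: Dict[str, int] = field(default_factory=dict)  # layout hints


@dataclass
class Scene:
    scene_id: str
    elements: List[Element] = field(default_factory=list)
    iea_script: str = ""           # interaction event action script or structure

    def elements_of_kind(self, kind: str) -> List[Element]:
        """All elements of one kind, e.g. every video element in the scene."""
        return [e for e in self.elements if e.kind == kind]
```

A renderer would walk `elements`, create the corresponding platform views, and hand interaction events to the IEA script or structure.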
There can be a FAB, dialog, slide in view, menu, button, fragment, view, layout, view pager, card view, popup menu, recycler view, search view, edit text, check box, radio button, spinner, scroll view, switch, ratings bar, list, grid, tabs, password, web view, toggle button, surface, etc. elements as other objects, that could be populated with subordinate user interface elements that appear as the user interacts with the element(s). Elements and objects in the scene can be animated by typical animation objects in for example Android, such as property animations, animated drawables, physics-based motion, transitions, etc. Any file format, such as JSON, XML, etc., could be used to represent a game story or scene. The game story object model and file format can vary a lot, depending on device, app, content, etc., without departing from the intended scope of the invention. One or more scripts or data structures that are capable of representing, defining or otherwise describing interaction event actions in a story, a scene, a set of scenes, an element, a set of elements, other objects, groups, formats, etc., is called an interaction event action (IEA) script or structure (or IEA script or structure). There can be IEA script or structure integration(s) with elements, objects, etc. for interaction events, handling, initialization, periodic callbacks, accessing assets, gyroscope, database, etc. The IEA scripts or structures can vary a lot depending on app, content, operating system, etc.
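As noted, any file format such as JSON could represent a story or scene. A hypothetical JSON encoding of a single scene with IEA entries is shown below; every field name is invented for illustration and is not a format defined by the patent:

```python
import json

# Hypothetical JSON representation of one game story scene with
# interaction event action (IEA) entries.
SCENE_JSON = """
{
  "id": "opening",
  "elements": [
    {"kind": "video", "source": "https://example.com/trailer.mp4",
     "layout": {"fit": "full-screen"}},
    {"kind": "text", "source": "inline:Welcome!", "layout": {"top": 48}}
  ],
  "iea": [
    {"on": "tap", "action": "gotoScene", "target": "main-1"},
    {"on": "swipeDown", "action": "endStory"}
  ]
}
"""

scene = json.loads(SCENE_JSON)  # parsed scene description
```

A player would instantiate the listed elements and register the `iea` entries as interaction event handlers for the scene.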
A game story comprises one or more game story scenes 300. The game story can for example begin by playing the first scene in the game story. The scene could then show the visible elements in their specified layout position. Video and audio elements could be instructed to start playing specific URLs to external, local or in-app content, by the script or data structure for representing interaction event actions. The IEA script or structure can respond to user interactions or other events, such as touches, tap, swipe, scroll, gyroscope, acceleration of device, periodic callbacks, etc., on the scene, any element, or object in the scene. Even though it is called "the script or data structure for representing interaction event actions", it is not exclusively for that purpose. It can comprise many other things as well. It typically comprises initialization code or functionality for game stories, scenes, elements, etc., interaction, loops, periodic callbacks, device events, sensor events, events from games and apps, etc. The IEA script or structure can trigger playing a specific scene, play a video element, play an audio object, show or hide elements, play an animation, pin content, edit or remove a pin, open a specific page or section, open a deep link in a game or app, open a specific web page, open the game details page, close the game story, play a different game story, present or unlock a promo code to the user, show an image from the web, show a YouTube or Twitch video, show links to other game stories, show a link to Live Wallpaper(s), show a link to AR-model(s), set a scene as a Live Wallpaper, play a game, etc. The application programming interface (API) available for the IEA scripts or structures to call can vary a lot, depending on device, app, content, etc., without departing from the intended scope of the invention. The IEA scripts or structures described above represent one possible embodiment.
There could be many embodiments of IEA scripts or structures in apps. In a minimalistic embodiment, game stories, scenes, interaction events, etc. could be implemented (or "hard-coded") directly in the app. In an IEA script or structure control embodiment, a set of stories, a story, a set of scenes, a scene, etc., could be defined, described, etc. by one or more IEA scripts or structures. The IEA script or structure manages, starts, loads, shows, connects events for, etc., a story, scene, etc. The IEA script or structure can have an initialize, create or similar function that is called to show the story, scene, element(s), etc. A story, scene, set of stories, scenes, etc., could basically be an IEA script or structure, in the IEA script or structure control embodiment, that programmatically creates elements, objects, connects events, handles interactions, etc. A pinning trigger condition typically occurs when a user presses a pin button, or when a pin function is called in an IEA script or structure. A pinning trigger condition typically triggers showing a pin to collection page that shows a representation of the item being pinned, with a button or similar for storing it in a collection or other list.
Typical general user interactions for a game story scene could comprise tapping the scene to go to the next scene in the game story, swiping down to end the game story and navigate to the previous place in the app, and swiping up to reveal more information. There is usually a video or image element, or set of elements, in a scene that covers parts of the background. Typical general user interactions for a game story scene could pass through or be on these elements as well, so that for example swiping up on the background video will reveal more information. Swiping left could also be a typical user interaction to go to the next scene in the game story. Other elements in the scene may not support the typical general user interactions, so that if for example an image at a specific place in the scene is tapped, it could trigger an interaction event action in the IEA script or structure instead of the typical general user interactions. Long-pressing could be a typical user interaction to show a menu with actions for the story, scene, content, game, etc. There could be a count down time period for each scene, which could depend on user interaction, etc., to determine if the story should transition to the next scene. Swiping or scrolling up can show a details page, show a Live Wallpaper (LWP) preview, show a 3D model in an AR camera, open a web page, app page, app, deep link, start an Instant App, game, remote game, video, etc. A details page trigger condition could be a swipe up, scroll up, etc. in a game story scene. It is important to understand that the IEA script or structure could handle and implement custom implementations of all user interactions in a scene, and that a typical general user interaction could actually be implemented by the IEA script or structure. A typical general user interaction could be modified, changed, removed, etc. in an IEA script or structure. A swipe or scroll up could for example subscribe a user to a story, the creator of a story, etc.
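The interplay between typical general user interactions and per-scene IEA customization can be modeled as a simple lookup in which the IEA script or structure takes precedence. A minimal sketch; the gesture and action names are illustrative assumptions:

```python
# Default general interactions for a game story scene, overridable per
# scene by the IEA script or structure (names are illustrative).
DEFAULT_ACTIONS = {
    "tap": "gotoNextScene",
    "swipeLeft": "gotoNextScene",
    "swipeDown": "endStory",
    "swipeUp": "showDetailsPage",
    "longPress": "showActionsMenu",
}


def resolve_action(gesture, scene_overrides):
    """IEA overrides win over the typical general user interactions."""
    return scene_overrides.get(gesture, DEFAULT_ACTIONS.get(gesture))
```

A scene whose IEA structure maps `swipeUp` to subscribing would thereby replace the default details-page behavior for that scene only.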
FIG.4 illustrates a full-screen video presentation with elements. The full-screen video presentation is essentially a video element 400, zero or more audio objects, etc. The video typically starts playing when the scene starts and can loop indefinitely. The edges of the video can be slightly inside or outside the display 101. The aspect ratio of the video is typically not exactly the same as the display 101, so the element will typically be fit inside the display to show it full-screen. The video content in this example is roughly made to fit portrait aspect ratios of most smartphones, most tablets, etc. Video presentations can vary a lot, depending on app, device, content, etc.
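Fitting a video of one aspect ratio inside a display of another reduces to a single uniform scale factor. A minimal sketch of the fit-inside computation described above, in pixels; rounding behavior is an illustrative choice:

```python
def fit_inside(video_w, video_h, display_w, display_h):
    """Scale a video uniformly so it fits fully inside the display.

    The smaller of the two axis ratios is used so neither dimension
    overflows the display; the aspect ratio of the video is preserved.
    """
    scale = min(display_w / video_w, display_h / video_h)
    return round(video_w * scale), round(video_h * scale)
```

For a 720x1280 video on a 1080x1920 display this yields 1080x1920 (an exact fit); on a taller 1080x2340 display the same video still scales to 1080x1920, leaving bands above and below.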
FIG.4 shows four elements in addition to the video element and audio objects. A close button 401 is typically shown only on the end-scene of a game story. A user can tap the button to finish the game story and navigate to the previous place in the app. The Pin-button 402 is typically also only shown on the end-scene. A user can tap the button to pin the story to one of the user's collections of game stories. A More-button 403 is typically shown on any game story scene. The text or image representation of this button will vary. Users may be able to tap the button to slide up the details page. There will often be one or more store-link buttons 404, typically on the end-scene of a game story. Users can tap it to open an external store page of the game in an external app or web browser, such as e.g. Google Play Store or similar. The layouts, text, images, etc. of the buttons 401 through 404 are for example purposes only. It is the game story designer person or software that decides the layout and other details for the scenes in the game story. The Pin-button can be called bookmark, favorite, save, etc. There can be a button, text, image or video element, or combination of these, that the user can tap to go to a Live Wallpaper preview page or Live Wallpaper group page, for a specific Live Wallpaper or group of Live Wallpapers, to set the scene as a Live Wallpaper, etc. The button, text, image or video element, or combination, is called a Live Wallpaper button, or LWP button. The Live Wallpaper preview page could be the Live Wallpaper preview page of the Android operating system. A goto Live Wallpaper trigger condition typically occurs when a user taps a Live Wallpaper button, thumb, etc., or when an IEA script or structure navigates to a Live Wallpaper preview page. There could be a button, text or other representation in a scene to show that the game is on sale. There could be a button in the scene that puts the game in the user's games watch list when the user taps it.
FIG.5 is an example end-scene for a game story. The end-scene will naturally vary a lot. The design, text, appearance, placing, etc. of the elements in the figure are for example purposes only and will typically be very different.
An important game story scene is a scene that comprises a playable game. A playable game could be shown and played using for example a WebView on Android that has JavaScript, etc. enabled. The playable game could be loaded from an external site or a set of local files, data structures, etc. A playable game is typically implemented in HTML, JavaScript, WebGL, or similar. A playable game could have been made in Unity, Unreal, etc. and built as a WebGL, Android, iOS or other platform game. The scene could use a different approach to playing the game than using a WebView. The app could comprise a Unity Engine (.so files, etc.) that can load and play the game, or an Unreal Engine, etc. An engine could also be implemented in JavaScript and transmitted with a scene. A playable game could be a full game, a shorter demo version of a game, a level in a game, etc. A playable game is usually shown and played full-screen. It can be possible to show one or more playable games in various places of a scene. There could be an integration between the scene, or an IEA script or structure, and the playable game. For example, a Unity WebGL game can call functions in the JavaScript of the WebView, and the JavaScript of the WebView can call the code of the game, using a ".jslib" plugin, SendMessage, etc. The playable game script integration makes it possible to invoke many useful game story functionalities from the game, such as pinning a story (e.g. pinStory()), navigating to the next scene (e.g. endScene()), navigating to a different scene (e.g. playScene(String id)), previewing a LWP (e.g. showLiveWallpaper(String id)), showing an AR camera for a model (e.g. showARModel(String id)), ending a story (e.g. endStory()), accessing credits/tokens/etc. (e.g. getCredits(), addCredits(String id) and consumeCredits(String id)), playing a different playable game (e.g. play(String url)), showing an ad (e.g. showAd(String adType)), showing a web page (e.g. showWebPage(String url)), opening an app deep link (e.g. openDeepLink(String url)), rating the game (e.g. getRating() and addRating(int value)), transferring game instant state (e.g. getState(), setState(String data), loadState() and saveState(String data)), sharing leader boards (e.g. getLeaderboard() and setLeaderboard(String data)), transferring/sharing progress (e.g. getGameProgress() and setGameProgress(String data)), getting/unlocking achievements (e.g. getAchievements() and setAchievement(String data)), getting the user profile (e.g. getUserProfile()), playing/communicating with other users (e.g. getUserLists(), getUserList(String id), inviteUsers(String data), sendMessage(String id), receiveMessages() and shareCamera()), managing/sharing gameplay recordings (e.g. getRecordings(), setRecordingTarget(String url), shareRecording(String id), startRecording() and stopRecording()), etc. There can also be game specific functionality available, such as credits, score, data, props, achievements, user lists, device lists, etc., shared by all games, by groups of games, for a single game, etc. There is a large variety of functionality that could be added to this list. One or more typical general user interactions for a game story scene could be implemented by the game, the scene, or combinations, which could also be determined in a script integration. A playable game could also be launched as an Instant App on Android, or by a similar approach on other operating systems. The Instant App could be launched directly from a scene, IEA script or structure, etc. Data could be passed to the Instant App to specify where to start, what to show, how to return, etc. The app or Instant App could support app links, deep links, etc. to show a game story scene. The app or Instant App can support app indexing, etc. Game story scenes with a playable game could vary a lot, depending on app, device, game, etc.
Playable games, implementation, game engines, integration, script integration, game specific functionality, user interactions, etc. could vary a lot without departing from the intended scope of the present invention.
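The playable game script integration can be pictured as a bridge object exposing functions like those named above to the game. The sketch below is a plain stand-in that merely records the requested game story actions so the flow can be followed; a real implementation would route the calls across the WebView JavaScript bridge (e.g. a ".jslib" plugin and SendMessage for a Unity WebGL build):

```python
class GameStoryBridge:
    """Illustrative stand-in for the playable-game script integration.

    Each method mirrors one of the game story functions a playable game
    could invoke; here the calls are simply recorded for inspection.
    """

    def __init__(self):
        self.actions = []          # log of (function, args...) tuples

    def pinStory(self):
        self.actions.append(("pinStory",))

    def endScene(self):
        self.actions.append(("endScene",))

    def playScene(self, scene_id):
        self.actions.append(("playScene", scene_id))

    def endStory(self):
        self.actions.append(("endStory",))
```

A game finishing a level might call `playScene("main-2")` to drive the story forward, exactly as a tap on the scene would.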
An important game story scene is a scene comprising a remotely playable game. The app connects to a server over a network. A game starts, or is executing, on the server. A compressed data stream comprising video, audio and other data is transmitted from the server to the app, called the audiovisual stream. A data stream comprising user input data is transmitted from the app to the server, called the user input stream. This is experienced by the user as being able to play the game, or part of the game, over the network. There is typically a video element, implemented with for example a SurfaceView or similar, in the game story scene that can show the audiovisual video as it is decoded, etc. An audiovisual stream decoder could for example use VP9, VP8, H265, or other video codecs, to decode the video stream, and AAC, or other audio codecs, to decode the audio stream. Audio can be rendered using ExoPlayer, native Android audio output functionalities, etc. The user will typically perform specific user interaction actions, such as tapping, pressing, swiping, scrolling, etc., on elements in the scene, the scene itself, the video element, etc. A specific set or subset of the user interaction actions are processed, put in an aggregated structure and sent to the server in the user input stream. The user input stream could comprise other data as well, such as gyroscope data, accelerometer data, round-trip latency measurements data, app or game specific data, location data, etc. The user input stream could comprise a compressed stream, or images, from one or more cameras, microphones, etc. in the device as well. The device could be connected to a Gamepad by Bluetooth, Universal Serial Bus (USB), network, etc., and the user input stream could comprise the Gamepad data as well. The user input stream could comprise multiple Gamepads. The device could be connected to a Television, Chromecast, or similar, to show the game story scene as well. 
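One frame of the user input stream aggregates touch, sensor and Gamepad data into a single structure before transmission to the server. A sketch under the assumption of a JSON framing; a real client would likely use a compact binary format, and all field names here are illustrative:

```python
import json


def pack_user_input(frame_id, touches, gyro=None, gamepad=None, latency_ms=None):
    """Aggregate one frame of user input for the user input stream.

    Optional fields (gyroscope, Gamepad, round-trip latency) are only
    included when present, keeping each frame small.
    """
    frame = {"frame": frame_id, "touches": touches}
    if gyro is not None:
        frame["gyro"] = gyro
    if gamepad is not None:
        frame["gamepad"] = gamepad
    if latency_ms is not None:
        frame["latency_ms"] = latency_ms
    return json.dumps(frame).encode("utf-8")  # bytes ready for the stream
```

The server decodes each frame and replays the aggregated events against the running game instance.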
A remotely playable game is usually shown and played full-screen. Certain scene elements are usually placed on top of the video element, such as for example a menu button, close button, etc. A remotely playable game could be a full game, a short demo, a level, a specifically made remote play experience in a game, etc. A remotely playable game could also be an interactive multimedia experience, a sequence of video clips, a live video stream, etc. There could be multiple remotely playable games in a scene and/or story. There could be API, element or data structure support for enabling remotely playable games in game story scenes, such as a remote game element, classes to connect to servers, check game availability, etc. The script etc. could support querying the user input device capabilities of a game, sending user input data, checking network latency, managing a remote game play session through a dedicated class, etc. The playable game script integrations described for playable games also apply to remotely playable games. Game story scenes with a remotely playable game could vary a lot.
A game story opening scene is a game story scene that is usually shown first in a game story. There can be many opening scenes in a game story. A game story main scene is a game story scene that usually provides the main parts of a game story. There could be many main scenes in a game story. A game story final scene is a game story scene that is usually shown at the end of a game story. There can be many final scenes in a game story. A game story ornate main scene or game story ornate final scene has a button for showing a details page 403 and a Pin-button 402, see FIG.4, in addition to other elements, objects, etc. A goto scene trigger condition usually occurs when a user performs a user interaction to go to a scene. This is usually a tap or side-swipe, as in typical general user interactions. The condition could be triggered by an IEA script or structure by calling a function to do so, for example as a result of the user tapping a button to go to a new scene, a method call that occurs periodically that decides that it is time to show a scene, etc. The app may have functionality to automatically show a new scene after a time period, etc. An ending game story trigger condition usually occurs when a user taps a close button, such as for example 401 in FIG.4, or a user for example swipes down, as in typical general user interactions. An IEA script or structure could trigger the condition. A game story is usually presented by first showing the opening scene. The opening scene is shown until a goto scene trigger condition occurs, and then a main scene is usually shown. The main scene is shown until a goto scene trigger condition occurs, and then a main scene or final scene is shown. The app can show a sequence of many main scenes in response to receiving goto scene trigger conditions, which is described as an "iterative manner" of showing main scenes. A game story can end by showing a final scene, and/or by receiving an ending game story trigger condition.
A final scene is closed by receiving an ending game story trigger condition. An example sequence could be the following: an opening scene shows with an onboarding video clip; the user taps the video clip, triggering a goto scene; a main scene shows with an introduction video clip of the first level; the user taps the video clip, triggering a goto scene; a main scene shows with a playable game; the user plays the game for a while and presses a button to proceed, triggering a goto scene; a main scene shows with an introduction video clip for the last level; the user taps the video clip, triggering a goto scene; a main scene shows with a remotely playable game that the user plays for a while and then swipes left, triggering a goto scene; a final scene is shown with a Pin-button and a button for showing a details page; the user swipes down in the final scene, triggering an ending game story trigger condition; the game story closes, and the app navigates back to the user interface that started the game story. There could be ads and other user interface elements in the sequence as well. IEA scripts and structures could override typical general user interactions. Playable games, remotely playable games, and Instant Apps often override typical general user interactions so that swiping in the game can be an in-game event, and will not trigger a goto scene condition, etc. An ending game story trigger condition can typically be triggered in any scene in the sequence unless it is overridden by an IEA script/structure, etc. It could be possible to turn off or enable typical general user interactions with a property or properties in scenes, game stories, etc. There could be other sequence types of scenes (than opening, main, and final) as well, there can be other user interfaces slotted in the sequence (for example interstitial ads and rewarded video ads), there can be other trigger conditions, etc.
It could be possible to have loops in the sequence, by for example going to a first main scene again at a "game over" condition. It could be possible to go to a scene in a different game story as well, which would enable library game stories (i.e., game stories that are used as a library of scenes for use by other game stories). The user could close or terminate the app while a game story scene shows. The app could allow a user to restart, resume, pause, go to a specific scene or state, pin a specific scene or state, etc. in a game story. The user could view a details page at zero or more scenes in the sequence. The sequence for showing scenes, game stories, etc. could vary a lot without departing from the intended scope of the present invention.
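The sequencing described above — opening scene first, main scenes in an iterative manner on goto scene trigger conditions, loops allowed, ending on a final scene or an ending trigger — can be sketched as a small state machine. The scene names, the trigger enum and the method names below are assumptions for illustration:

```java
import java.util.*;

// Hypothetical sketch of the scene-sequencing logic: opening scene first, then main
// scenes in an iterative manner on goto triggers, until an ending trigger is received.
public class StorySequencer {
    public enum Kind { OPENING, MAIN, FINAL }
    public enum Trigger { GOTO_SCENE, ENDING }

    private final Map<String, Kind> scenes; // scene id -> scene kind
    private String current;                 // currently shown scene; null once the story has ended
    private boolean ended = false;

    public StorySequencer(Map<String, Kind> scenes, String openingScene) {
        this.scenes = scenes;
        // A game story is usually presented by first showing the opening scene.
        this.current = openingScene;
    }

    public String current() { return current; }
    public Kind currentKind() { return current == null ? null : scenes.get(current); }
    public boolean ended() { return ended; }

    // Handle a trigger condition; gotoTarget names the next scene for GOTO_SCENE triggers.
    public void onTrigger(Trigger t, String gotoTarget) {
        if (ended) return;
        if (t == Trigger.ENDING) { ended = true; current = null; return; }
        // Loops back to earlier scenes are allowed, e.g. at a "game over" condition.
        current = gotoTarget;
    }
}
```

An IEA script could drive this by calling onTrigger in response to taps, swipes, timers, etc.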
If an app developer wants to include an example simple game story system in an app, then a JSON-file format can be used to represent a game story. The JSON-file could contain a list of story objects that represent the scenes. A story object can contain a list of scene objects that represent the elements, objects, script, properties, etc. of the scene. The details page can be set first in the scene with an object with name (e.g. detailsPage), label (e.g. More Info) and source (e.g. detailspage.html). Next, an audio object can be set in the scene with an object with name (e.g. audio) and source (e.g. myaudio.m4a). Next, a scene background can be set with an object with name (e.g. background), aspect (e.g. 0.5) and source (e.g. myvideo.webm). Next, one or more image elements can be placed inside the scene with objects with name (e.g. Image0, Image1, etc.), x (e.g. 0.0 to 1.0), y, width (e.g. 0.0 to 1.0), height, aspect, source (e.g. myimage.png) and click (e.g. http://link.com). The element positions and sizes could be relative to the container. The aspect ratio could be used instead of either the width or height. The app could load this JSON-file and begin at the first scene in the file. An Android app could for example have an activity and/or fragment show ImageViews, SurfaceViews, etc. as described by the JSON-file. If a user clicks an element, the app can invoke the interaction action in the click-field of the file. If a URL-value is in the field, then the URL can be opened in an external app or web browser. If for example a PlayScene(value) is in the field, then the specified scene can be played. If EndStory is in the field, then the story can end. If ShowDetails is in the field, then the app can show the details page. If PinStory is in the field, then story pinning can be performed. If Play(value) is in the field, then the specified video/audio/animation can be played. The ExoPlayer library on Android makes it easy to implement video and audio.
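One possible layout of such a JSON-file, following the field names given above, could look like this (the exact key names and nesting are up to the app; the file names and values are illustrative):

```json
{
  "story": [
    {
      "scene": [
        { "name": "detailsPage", "label": "More Info", "source": "detailspage.html" },
        { "name": "audio", "source": "myaudio.m4a" },
        { "name": "background", "aspect": 0.5, "source": "myvideo.webm" },
        { "name": "Image0", "x": 0.1, "y": 0.1, "width": 0.3, "aspect": 1.0,
          "source": "myimage.png", "click": "http://link.com" },
        { "name": "Image1", "x": 0.5, "y": 0.6, "width": 0.4, "height": 0.2,
          "source": "playbutton.png", "click": "PlayScene(main1)" }
      ]
    },
    {
      "scene": [
        { "name": "background", "source": "level1.webm" },
        { "name": "Image0", "x": 0.4, "y": 0.8, "width": 0.2, "aspect": 1.0,
          "source": "close.png", "click": "EndStory" }
      ]
    }
  ]
}
```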
Animations in the JSON-file could be specified as an initial offset on an element with name offsetX and offsetY, to-values toX and toY, and a duration-value in milliseconds. The app could use a ValueAnimator on Android to play the animation when a scene starts/shows. This simple game story system could provide a minimum viable product implementation for an app. Advanced implementations could consider using a JavaScript engine, such as V8, Rhino, JavaScriptCore, etc., to implement interaction events and interaction event actions. This can be done by writing interfaces for the game story API, in for example Java, and adding the interfaces to the JavaScript engine, so that they can be called from game story scripts, written in JavaScript. Other scripting or coding languages can also be used, such as Lua, Java, Kotlin, etc. A virtual machine could also be included in the app to run Java, etc. The JavaScript interfaces could include functions to get elements in a scene (e.g. findElementById(String id)), set an event handler (e.g. setOnClickListener(OnClickListener listener)) on an element, play a scene (e.g. playScene(String id)), end a story (e.g. endStory()), pin a story (e.g. pinStory()), play a video/audio/animation/etc. (e.g. play(String id)), position an element (e.g. setX(float v) or setY(float v)), resize an element (e.g. setWidth(float v) and setHeight(float v)), etc. There can be an interface and/or object for each element type, for example Scene, Image, Video, Text, Audio, etc. In an elaborate embodiment, the objects/interfaces may be similar to Android view, widget and support classes. A Scene could for example be similar to a ConstraintLayout or other ViewGroup classes, etc. An app does not need to implement all the functionality in those classes, but for example what is useful to the game story implementation of the app, such as layout, managing child views, focus, event handling, input handling, clipping, drawing, animation, transition, scrolling, etc.
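The JSON-driven animation described above can be reduced to a small interpolation: an element starts at (offsetX, offsetY) and animates to (toX, toY) over the duration. On Android this is what a ValueAnimator would compute; the sketch below shows the math directly with a linear interpolator (class and method names are assumptions):

```java
// Hypothetical sketch of the JSON-driven animation: an element starts at (offsetX, offsetY)
// and moves to (toX, toY) over `durationMs` milliseconds, using linear interpolation.
public class SceneAnimation {
    public final float offsetX, offsetY, toX, toY;
    public final long durationMs;

    public SceneAnimation(float offsetX, float offsetY, float toX, float toY, long durationMs) {
        this.offsetX = offsetX; this.offsetY = offsetY;
        this.toX = toX; this.toY = toY; this.durationMs = durationMs;
    }

    // Position at `elapsedMs` since the scene started showing; clamped at the end value.
    public float[] positionAt(long elapsedMs) {
        float t = Math.min(1f, Math.max(0f, elapsedMs / (float) durationMs));
        return new float[] { offsetX + (toX - offsetX) * t, offsetY + (toY - offsetY) * t };
    }
}
```

A real implementation would drive this from a frame callback or a ValueAnimator update listener and could plug in non-linear interpolators.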
An Image could for example be similar to an ImageView. Text may be similar to a TextView or WebView that can show text or HTML content. Video can be similar to a SurfaceView with an ExoPlayer streaming video content to it. An Audio object could use ExoPlayer to play and manage audio and music output. The objects/interfaces can comprise game story specific functionality. Scene can have play(), pause(), playScene(String id), endStory(), findElementById(String id), pinStory(), etc. Video and Audio can have play(), pause(), stop(), getState(), getVolume(), setVolume(), play(String url), etc. All elements could have setX(float v), setY(float v), setWidth(float v), setHeight(float v), etc. Event handlers may be set with for example setOnClickListener(OnClickListener listener), etc. Formats, structures, systems, code, etc. mentioned here are for example purposes. They will vary a lot, depending on app, platform, content, etc. The present invention comprises any scene-model or structure as illustrated in FIG.3 for providing a game story scene. The function names, parameters, return values, interfaces, objects, models, etc. described here can vary a lot, depending on app, platform, content, etc.
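A minimal, framework-free Java sketch of such a script-facing object model could look as follows. The names mirror the examples above (findElementById, setOnClickListener, setX, etc.); a real Android app would back these objects with views, ExoPlayer instances, etc., and expose them to the JavaScript engine:

```java
import java.util.*;

// Hypothetical, framework-free sketch of the script-facing scene object model.
public class SceneModel {
    public interface OnClickListener { void onClick(Element e); }

    public static class Element {
        public final String id;
        private float x, y, width, height;   // positions/sizes relative to the container
        private OnClickListener listener;
        public Element(String id) { this.id = id; }
        public void setX(float v) { x = v; }
        public void setY(float v) { y = v; }
        public void setWidth(float v) { width = v; }
        public void setHeight(float v) { height = v; }
        public float getX() { return x; }
        public float getY() { return y; }
        public void setOnClickListener(OnClickListener l) { listener = l; }
        // Called by the app when the user taps this element.
        public void click() { if (listener != null) listener.onClick(this); }
    }

    private final Map<String, Element> elements = new LinkedHashMap<>();
    public Element addElement(String id) { Element e = new Element(id); elements.put(id, e); return e; }
    public Element findElementById(String id) { return elements.get(id); }
}
```

A game story script, via the JavaScript engine bindings, would then look up elements by id and attach event handlers.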
FIG.5 shows a game story slide-up details page 500. The page is conceptually placed outside of view below the navigation bar 114. When the user swipes up in a game story scene or presses the More-button 403, the details page 500 will slide up until its top aligns with the top of the display 101, or the bottom of the status bar 102 if the status bar is not transparent. The content, or a reference to the content, of the details page is stored in the game story scene 300. Scenes may or may not have a details page. The details page typically comprises Android layout objects, such as ConstraintLayout, RelativeLayout, etc., and/or a WebView or similar. A WebView can have JavaScript and any other relevant options enabled. A WebView can have objects injected in the JavaScript context to enable JavaScript to pin a game story, preview a Live Wallpaper, open a Live Wallpaper group, start a scene, start a different game story, open the AR camera, open a specific model in the AR camera, etc. The arrow 501 is for illustrating the slide up and down motion, and is not visible on a details page 500. Swiping down on the details page can slide the details page down until its top aligns with the bottom of the navigation bar 114. The duration of time for sliding the details page up or down can be configured in the app. There are many sliding up panel/layout implementations available for Android. The invention can comprise any slide-up or slide-in details page implementation. The page can slide up from the bottom, stop to cover parts of the story, slide in from top, left, right, etc. The details page could comprise a URL-field that shows the URL of the details page. The URL-field can be editable by the user to navigate the details page to a different URL. The details page could comprise lists, side-swipe lists, side-swipe tab lists, stacked cards, etc.
A details page could for example be implemented on Android with a ConstraintLayout in the game story scene layout that is placed outside the view below the navigation bar. The ConstraintLayout could comprise a WebView and other user interface elements. When a user swipes up or down, the ConstraintLayout could slide in or out by using an animation. This is just an example embodiment. A details page can vary a lot without departing from the intended scope of the invention.
FIG.6 provides a conceptual illustration of the home screen on a mobile device 100. The operating system shows this screen when the user starts and unlocks the device. The home screen usually has a lot of icons for the various apps and games installed on the device that are not shown here for simplicity. A LWP typically shows on the full screen 101 of the device. There are certain typical areas for LWPs. The top area 600 is typically a place for logos, offers, notifications, etc. The top-right area 601 is often a nice place to have a logo. The left 603 and right 602 areas could be used to show offers, information, notifications, buttons, links to games, links to game stores, etc. The main area 604 could show subtle logos, offers, etc., embedded in the LWP content.
FIG.7 illustrates a method to determine see-through areas on the home screen, as well as screens the user can swipe to on the home screen, on a device 100. The method is usually implemented as a periodic or continuous loop in an LWP. The LWP renders itself on the home screen, captures an image of the home screen and then studies/analyses the captured image in comparison to the rendered image. An example way to implement a technique to detect see-through areas is to render a test image, e.g. a black image, capture the screen and then use a set of preferred, grid, etc. points and check outwards around the points until non-test image pixels are found. Another example way is to compare against the LWP image (i.e. not using a test image). There are many ways to get rectangles from the LWP image and the captured image, such as flood fill, edge detectors, etc. There can be many ways to detect see-through areas in an image. The present invention comprises any technique for detecting see-through areas in the captured image. It is helpful to know if there are any large enough areas, or frequently large areas, on the home screen that could be used to show an advertisement, logo, notification, information, etc.
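The test-image technique above can be sketched as follows: starting from a seed point, a rectangle is expanded outwards while every pixel on the new edge still shows the test color, i.e. is see-through. Pixel grids here are plain int arrays; a real LWP would read from a screen capture, and the method and class names are assumptions:

```java
// Hypothetical sketch of the test-image technique: expand a rectangle outwards from a
// seed point while every pixel on the new edge still shows the rendered test color.
public class SeeThroughDetector {
    // Returns {left, top, right, bottom} (inclusive) of the expanded see-through rectangle,
    // or null if the seed pixel itself is covered by an icon, widget, etc.
    public static int[] expand(int[][] captured, int testColor, int seedX, int seedY) {
        if (captured[seedY][seedX] != testColor) return null;
        int left = seedX, right = seedX, top = seedY, bottom = seedY;
        boolean grew = true;
        while (grew) {
            grew = false;
            if (left > 0 && colClear(captured, testColor, left - 1, top, bottom)) { left--; grew = true; }
            if (right < captured[0].length - 1 && colClear(captured, testColor, right + 1, top, bottom)) { right++; grew = true; }
            if (top > 0 && rowClear(captured, testColor, top - 1, left, right)) { top--; grew = true; }
            if (bottom < captured.length - 1 && rowClear(captured, testColor, bottom + 1, left, right)) { bottom++; grew = true; }
        }
        return new int[] { left, top, right, bottom };
    }

    private static boolean rowClear(int[][] img, int test, int y, int x0, int x1) {
        for (int x = x0; x <= x1; x++) if (img[y][x] != test) return false;
        return true;
    }

    private static boolean colClear(int[][] img, int test, int x, int y0, int y1) {
        for (int y = y0; y <= y1; y++) if (img[y][x] != test) return false;
        return true;
    }
}
```

Running this from a set of grid points and keeping the largest results yields candidate areas for a logo, offer, notification, etc.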
FIG.8 illustrates a controller user interface for a remotely playable game that is shown in a game story scene. The user interface is called a single slide pad controller. The controller user interface shows on top of a remotely playable game. The user can interact with the controller user interface to interact with the game. The controller user interface is typically implemented in the app, but could also be implemented in the game on a server. The controller user interface comprises a slide pad 825 with a thumb indicator 827. A user can press down on the area of the slide pad 824 and move/slide the finger around to interact with the game similarly to a thumb stick or d-pad in a game. The outer ring 826 could trigger some special interaction if passed when moving/sliding the finger. The top buttons 811 to 815 in top area 810, the right buttons 817 to 819 in right area 816, and the left buttons 821 to 823 in left area 820, could trigger an interaction, e.g. as an action button, trigger button, bumper button, d-pad button, other stick movements, option button, menu button, select button, home button, etc. There could be more or fewer buttons, buttons could be placed differently, look different, etc. FIG.9 shows a triggers, bumpers and d-pads controller user interface for a remotely playable game. The left bumper button 916 and the left trigger button 915 in the left front-button area 914 typically trigger the corresponding left side bumpers/triggers in a game. The right bumper button 913 and the right trigger button 912 in the right front-button area 911 typically trigger the corresponding right side bumpers/triggers in a game. An action-pad is shown in the action-pad area 917 with a triangle 918, "x" 919, circle 920, and square 921 button that correspond to a typical action pad in a game. The d-pad 922 to 927 is also shown. The layouts in FIG.8 to 9 can vary a lot.
The pad areas 824, 922 and 917 can have any combination of side pads, d-pads, action-pads, other buttons and pads. The pad area 824 is centered, but it could be placed on the left or right side of the display instead. The pad areas could be placed higher up in the display and could have a set of buttons etc. beneath. An action pad could have other buttons than triangle, x, circle, and square. Buttons "y", "a", "b", and "x" are also common for action pads, etc. Controller user interfaces for remotely playable games can vary a lot, depending on games, systems, app, device, content, etc.
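The slide pad described above can be reduced to a small piece of geometry: a touch position inside the pad area maps to a normalized stick vector, and passing the outer ring can trigger a special interaction. The coordinates, radii and ring-crossing rule below are assumptions for illustration:

```java
// Hypothetical sketch of the slide pad math: map a touch position to a normalized
// stick vector, and report whether the finger has passed the outer ring.
public class SlidePad {
    public final float centerX, centerY, padRadius, outerRingRadius;

    public SlidePad(float cx, float cy, float padRadius, float outerRingRadius) {
        centerX = cx; centerY = cy;
        this.padRadius = padRadius; this.outerRingRadius = outerRingRadius;
    }

    // Stick vector in [-1, 1] on each axis, clamped to the pad radius like a thumb stick.
    public float[] stickVector(float touchX, float touchY) {
        float dx = (touchX - centerX) / padRadius;
        float dy = (touchY - centerY) / padRadius;
        float len = (float) Math.sqrt(dx * dx + dy * dy);
        if (len > 1f) { dx /= len; dy /= len; } // clamp to the edge of the pad
        return new float[] { dx, dy };
    }

    // True when the finger has moved past the outer ring (element 826 in FIG.8),
    // which could trigger some special interaction.
    public boolean passedOuterRing(float touchX, float touchY) {
        float dx = touchX - centerX, dy = touchY - centerY;
        return Math.sqrt(dx * dx + dy * dy) > outerRingRadius;
    }
}
```

The resulting stick vector would be placed in the aggregated structure of the user input stream.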
FIG.10 illustrates a controller user interface for a remotely playable game called a perspective layout triggers, bumpers and d-pads controller. There is a d-pad 1000 to 1005, an action pad 1021 to 1025, a left front-button set 1006 to 1008 and a right front-button set 1018 to 1020. There could be game control or menu buttons 1014 to 1016 in the bottom-center area 1013. There could also be a menu in the top left or right side 1010 of the display. The pad areas 1000 and 1021 could have any combination of side pads, d-pads, action-pads, other buttons and pads. The pad areas can be placed higher up in the display and could have a set of buttons etc. beneath. There can be a close, back, or similar button 1012 in the user interface.
Remotely playable games, playable games and Instant Apps could be configured or made to show and/or start at a specific time, place, level, or a specific state of the game, so the user is able to, for example, view an image, video, etc., in a game story, tap it, and then instantly be able to play the game from that specific time, place, level, state, etc. This gives the user an efficient way to rapidly try a game. The user can for example open a game story, view a quick intro-video in the first scene, view a list of three different editions of the game, click the first one, instantly play a playable game for a few minutes in the next scene, then view another video in the following scene, play a remotely playable game for several minutes in the next scene, etc. It is also possible to provide full gaming experiences in this way.
Playable games can be implemented with HTML5, WebGL, Unity, Unreal, other engines, SDKs, custom code, etc. Playable games could be provided by services, such as Facebook, Soft Games, etc. Playable games could be downloaded and played in the app. The app typically has to integrate a WebView, engine, or similar that shows a playable game. The service provider typically provides, in advance or on demand, a URL to the playable game. The WebView, engine, etc. loads the URL and starts the game. The WebView, engine, etc. is typically shown in a fragment or activity in full-screen. Playable game service providers could also provide SDKs that can be integrated into the app, using for example Gradle, for loading, playing, monetizing, sharing, managing games, etc.
The user interfaces described in the present invention can have advertisements in them, provided by for example AdMob, MoPub, Facebook, Unity Ads, etc. The ads could be banner ads, interstitials, medium rectangles, native ads, native slotted ads, etc. The ads can appear anywhere in the user interfaces, mixed with lists and items, etc. Ad networks, such as AdMob, MoPub, etc., have guides and recommendations on how to do this, and it is relatively common to have these kinds of ads in apps and games. There is, however, a growing number of playable game and playable game ad providers/networks. These can be integrated and controlled by ad mediation networks, with e.g. MoPub custom events or similar. Another approach is header bidding, where these providers/networks can bid on a game/ad placement simultaneously. The bidding usually occurs on the server side, but can occur on the client side too. The app sends information to these providers/networks, either directly or through a server, or set of servers, about the ad placement, user interests, etc., and receives a winning game/ad. Ads can be added without departing from the intended scope of the invention. There can also be ad related in-app messages at different places in the app, and notifications.
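A toy sketch of the winner selection step in header bidding follows: every provider/network bids on the placement simultaneously and the highest bid wins. Real auctions add timeouts, price floors, second-price rules, etc., and usually run on the server side; the class and field names are assumptions:

```java
import java.util.*;

// Toy sketch of winner selection in a header-bidding auction: the highest bid wins.
public class HeaderBidding {
    public static class Bid {
        public final String network; // bidding provider/network
        public final double cpm;     // bid price (cost per mille)
        public Bid(String network, double cpm) { this.network = network; this.cpm = cpm; }
    }

    // Returns the winning bid, or null when no network bid on the placement.
    public static Bid selectWinner(List<Bid> bids) {
        Bid best = null;
        for (Bid b : bids) if (best == null || b.cpm > best.cpm) best = b;
        return best;
    }
}
```

The winning game/ad would then be loaded into the placement in the user interface.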
The app could have payment solutions, such as in-app purchases, subscriptions, credit card payments, credits, tokens, virtual currencies, etc. The user could be enabled to buy, unlock, or otherwise get access to games, game stories, game story scenes, playable games, remotely playable games, game items, movies, Live Wallpapers, AR content, etc., called content. Credits, tokens, virtual currencies, etc. could be made available to the user when the user buys them, periodically as a part of a subscription, as a reward, etc. Credits, tokens, virtual currencies, etc. could be used to get access to content, features, etc.
The app could be configured to get the location of the user. The location could be obtained from the user's device, network, global positioning system (GPS), etc. There may be code in the app that retrieves information about the installed apps on the device, device model, hardware, amount of random access memory, operating system, languages, central processing unit, graphics processing unit, etc. The app could also comprise user interfaces for asking the user for age, gender, interests, etc. Location, installed apps, device and user information, etc. can be used for targeting and segmentation on the device and/or a server. Targeting and segmentation could be used to serve advertisements, content, etc. that is of a higher degree of interest to the user than other advertisements, content, etc. Developing the code to get location, installed apps, device and user information, etc. is relatively easy and straightforward on Android, iOS, etc. There could be many approaches for targeting, segmentation, advertising, information gathering, user interests, content serving, etc. that can be used without departing from the intended scope of the present invention. There can be logging of the content, etc. a user has interacted with to determine interests, etc. There are many ways to do targeting and segmentation, and it could vary a lot without departing from the intended scope of the invention.
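One very simple segmentation heuristic is to score content by how many of its tags overlap the user's interests and serve the best match first. This is a hypothetical sketch only; real targeting combines many more signals (location, device, installed apps, etc.) and usually runs server-side:

```java
import java.util.*;

// Hypothetical sketch of a simple interest-overlap heuristic for content/ad serving.
public class InterestTargeting {
    // Number of tags the content shares with the user's interests.
    public static int score(Set<String> userInterests, Set<String> contentTags) {
        int score = 0;
        for (String tag : contentTags) if (userInterests.contains(tag)) score++;
        return score;
    }

    // Index of the candidate whose tags best match the user's interests, or -1 if empty.
    public static int pickBest(Set<String> userInterests, List<Set<String>> candidates) {
        int bestIdx = -1, bestScore = -1;
        for (int i = 0; i < candidates.size(); i++) {
            int s = score(userInterests, candidates.get(i));
            if (s > bestScore) { bestScore = s; bestIdx = i; }
        }
        return bestIdx;
    }
}
```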
The app could implement the Google Awareness API or similar. The API provides a way to get the location, detect proximity of specific places, detect if the user is moving in various ways, get weather information, etc. The app could also implement the Google Nearby Connections API, Nearby Messages API, Fast Pair, etc. or similar. These APIs provide a way to discover nearby devices and apps on devices, connect to them, send data, etc., and could also be used to send and/or receive an instant state of a game. A user could have a mobile phone with the app where he has a game or game story running. The user could also use a television with for example an Android TV device that is running an app. The mobile app and/or TV app could detect the other app. The user can choose to transfer the instant state of the game from the mobile app to the TV app. Similarly, the user may also be able to transfer the instant state of a game from the TV app to the mobile app. The app could alternatively send the fine location to a server to determine proximity of apps. It is possible to implement different approaches for detecting and communicating with nearby devices without departing from the intended scope of the invention. The TV app could be the Android TV operating system, a component, etc. The instant state of the game could be sent/received to/from a server instead of directly between devices or apps on devices. The mobile app could continue to show the game that has been transferred to the TV app, by means of streaming an audiovisual stream from the TV app to the mobile app, and the mobile app could show a controller user interface that can be used to control the TV app and/or the game executing on the TV.
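An instant-state payload for such a handover could be a small self-describing record. The fields and encoding below are assumptions for illustration; a real transfer over the Nearby Connections API or a server would also carry game-specific state blobs:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Hypothetical sketch of an instant-state payload for handing a running game over to a
// nearby device (e.g. phone -> TV app). Fields and encoding are assumptions.
public class InstantState {
    public final String gameId;   // which game is running
    public final String sceneId;  // which scene/level/state to resume at
    public final long positionMs; // playback or game-time position

    public InstantState(String gameId, String sceneId, long positionMs) {
        this.gameId = gameId; this.sceneId = sceneId; this.positionMs = positionMs;
    }

    // Length-prefixed strings followed by the position, as one byte array.
    public byte[] encode() {
        byte[] g = gameId.getBytes(StandardCharsets.UTF_8);
        byte[] s = sceneId.getBytes(StandardCharsets.UTF_8);
        ByteBuffer buf = ByteBuffer.allocate(4 + g.length + 4 + s.length + 8);
        buf.putInt(g.length).put(g).putInt(s.length).put(s).putLong(positionMs);
        return buf.array();
    }

    public static InstantState decode(byte[] data) {
        ByteBuffer buf = ByteBuffer.wrap(data);
        byte[] g = new byte[buf.getInt()]; buf.get(g);
        byte[] s = new byte[buf.getInt()]; buf.get(s);
        return new InstantState(new String(g, StandardCharsets.UTF_8),
                new String(s, StandardCharsets.UTF_8), buf.getLong());
    }
}
```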
The app could implement support for multiplayer game stories, playable games, remotely playable games, Instant Apps, etc., called multiplayer games. Multiplayer games and other games could access and/or share friends lists, nearby device or app lists, etc. by communicating with the app or each other with indexedDB, JavaScript, a server, shared preferences, a database, sending/receiving messages to/from the app, shared APIs, interprocess communication, by using an identifier, etc., similarly to sharing game story state. Multiplayer games and other games can also use Google Play Games Services or similar, and the Google Nearby Connections API, Nearby Messages API, Fast Pair, etc. or similar. A multiplayer game could work similarly to how many multiplayer games work, by using a server, etc., to communicate data between the game instances for the multiplayer session. Remotely playable games could have other means of communicating on the back-end. A multiplayer game could implement any means of communication without departing from the intended scope of the invention. There could be a permission user interface in the app that shows when a multiplayer game or other game requests access to a friends list, a list of nearby devices or apps, etc. The permission user interface could comprise a message to the user, one or more buttons, etc. The app could show a multiplayer request notification or user interface that could show when someone invites the user to a multiplayer session. The multiplayer request notification or user interface could show a message to the user, one or more buttons, etc. The multiplayer session typically starts if the user accepts the invitation. The multiplayer session identifier, etc., could be shared similarly to sharing game story state. The implementation for multiplayer games could vary a lot.
Words like tap, click, swipe, drag, scroll, etc., have been used for interactions in the app. The words used do not necessarily mean that they are bound by the definition they may have on a specific operating system, like Android or iOS, but rather have a broad and inclusive meaning, such that for example tap or click can mean roughly the same. A "button" in the present invention does not mean that it must be an Android or iOS button, but is typically an image, text, video, etc. or combinations of such objects and elements. An "image representation" in the present invention does not mean that it is an Android or iOS image, but rather it is typically an image, text, video, SurfaceView, ViewHolder, etc. or combinations of such objects and elements. An "image" in the present invention is not necessarily an Android or iOS image, but can also comprise animated images, interactive effects, gyroscope-driven effects, etc. Similarly for text, video, etc., a broader perspective of such objects or elements should be taken into consideration. Providing an entertaining, fun and esthetically pleasing user experience is typically of importance in an app like this. Offering the right balance of user interfaces, ads, appearance, etc. can be important. The words section and page can also be used interchangeably. All aspect ratios used for image representations in the figures are for illustration purposes. Actual aspect ratios depend on content, app, device, etc. Radius, shape, etc., of rounded corners, margins, etc., for image representations, etc., are also for illustration purposes and depend on content, app, device, etc. There may be multiple users of the app on a device. The word friend could mean any user that is in any way referenced by a user. A page could be an activity, fragment, view, etc. Android is the main reference platform for concepts in the present invention. Similar concepts exist on other platforms, such as iOS, Windows, OS X, etc.
The app is usually running in its own process on the operating system. The app process is the process that an app is running in. An app can start or have more processes, which in the present invention are in, or a part of, the app process. An app could for example run a Live Wallpaper inside its own process, which for the present invention is considered to be in the app process. The present invention has been described with a focus on smartphones to make it easier to understand. Those skilled in the art will appreciate that the present invention can also be implemented for tablets, foldable phones, PCs, TV, VR, AR, web, etc.
While the present invention has been described with reference to an embodiment thereof, those skilled in the art will appreciate that various changes in form and detail may be made without departing from the intended scope of the invention as defined in the appended claims. The particulars described above are intended merely to be illustrative and the scope of the invention is defined by the appended claims. For example, the present invention may be practiced with a game story system for mobile apps that differs from the system described above.
Alternative systems and methods may include only a subset of the above-described parts or include additional parts that differ from those described above. Moreover, user interface examples and the organization of the layout described above are not intended to limit the scope of the present invention.

Claims (10)

1. A computer implemented method for showing a game story on the main display of a smartphone, tablet or similar computing device, comprising:
showing a game story opening scene;
proceeding to show game story main scenes in an iterative manner, in response to a goto scene trigger condition from each scene, until a game story final scene is shown or an ending game story trigger condition is received;
receiving a pinning trigger condition while showing one of the game story scenes, and in response, showing a pin to collection page.
2. The method of claim 1, further comprising pinning the story, scene or state, of the game story, in the pin to collection page.
3. The method of claim 1, further comprising one or more scripts or data structures that are capable of representing, defining or otherwise describing interaction event actions in a story, a scene, a set of scenes, an element, a set of elements, other objects, groups, or formats.
4. The method of claim 1, further comprising receiving a details page trigger condition in a game story scene; and in response, showing a game story slide-up details page.
5. The method of claim 1, wherein a game story scene comprises a playable game, a remotely playable game, or an Instant App.
6. The method of claim 5, wherein a game story scene comprises a controller user interface for a remotely playable game.
7. A smartphone, tablet or similar computing device that is executing an app process on the device, the app process is showing a game story on the main display of the device, and the app process comprises one or more scripts and structures configured to:
show a game story opening scene;
proceed to show game story main scenes in an iterative manner, in response to a goto scene trigger condition from each scene, until a game story final scene is shown or an ending game story trigger condition is received;
wherein one or more of the game story scenes comprise a playable game, a remotely playable game, or an Instant App.
8. The device of claim 7, wherein a game story scene comprises a controller user interface for a remotely playable game.
9. The device of claim 7, wherein a game story scene comprises a multiplayer game.
10. The device of claim 7, further comprising sending or receiving a game instant state to or from a nearby device.
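The scene-progression behaviour recited in claims 1 and 7 can be sketched as a small state machine: an opening scene is shown first, main scenes are shown iteratively in response to goto scene triggers, a pinning trigger interleaves a pin to collection page, and iteration stops when the final scene is shown or an ending trigger arrives. The class name, scene identifiers, and trigger strings below are illustrative assumptions, not part of the claimed system:

```python
# Hypothetical sketch of the claimed scene-progression loop. Scene names,
# trigger values, and the GameStory class are illustrative assumptions.

OPENING, FINAL = "opening", "final"

class GameStory:
    def __init__(self, next_scene):
        # next_scene maps a scene id to the scene shown on a goto trigger
        self.next_scene = next_scene
        self.shown = []

    def run(self, triggers):
        """Show the opening scene, then iterate main scenes until the
        final scene is shown or an ending trigger is received."""
        current = OPENING
        self.shown.append(current)
        for trigger in triggers:
            if trigger == "end":       # ending game story trigger condition
                break
            if trigger == "pin":       # pinning trigger condition
                self.shown.append("pin_to_collection_page")
                continue
            if trigger == "goto":      # goto scene trigger condition
                current = self.next_scene.get(current, FINAL)
                self.shown.append(current)
                if current == FINAL:
                    break
        return self.shown

story = GameStory({"opening": "scene1", "scene1": "scene2", "scene2": "final"})
print(story.run(["goto", "pin", "goto", "goto"]))
# -> ['opening', 'scene1', 'pin_to_collection_page', 'scene2', 'final']
```

In this sketch a scene's content (a playable game, a remotely playable game, or an Instant App, per claims 5 and 7) would be attached to each scene id; only the progression logic is modelled here.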

Priority Applications (1)

Application Number Priority Date Filing Date Title
NO20190524A NO345656B1 (en) 2019-04-21 2019-04-21 Game Story System for Mobile Apps

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
NO20190524A NO345656B1 (en) 2019-04-21 2019-04-21 Game Story System for Mobile Apps

Publications (2)

Publication Number Publication Date
NO20190524A1 NO20190524A1 (en) 2020-10-22
NO345656B1 true NO345656B1 (en) 2021-05-31

Family

ID=73451194

Family Applications (1)

Application Number Title Priority Date Filing Date
NO20190524A NO345656B1 (en) 2019-04-21 2019-04-21 Game Story System for Mobile Apps

Country Status (1)

Country Link
NO (1) NO345656B1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150165310A1 (en) * 2013-12-17 2015-06-18 Microsoft Corporation Dynamic story driven gameworld creation
KR20170052407A (en) * 2015-11-04 2017-05-12 넥슨지티 주식회사 Apparatus for providing game and method thereof
US20170354888A1 (en) * 2016-06-13 2017-12-14 Sony Interactive Entertainment America Llc Method and system for saving a snapshot of game play and used to begin later execution of the game play by any user as executed on a game cloud system
US20180036639A1 (en) * 2016-08-05 2018-02-08 MetaArcade, Inc. Story-driven game creation and publication system
US20180359295A1 (en) * 2017-06-12 2018-12-13 Facebook, Inc. Interactive Spectating Interface for Live Videos

Also Published As

Publication number Publication date
NO20190524A1 (en) 2020-10-22

Similar Documents

Publication Publication Date Title
US11287946B2 (en) Interactive menu elements in a virtual three-dimensional space
US9092061B2 (en) Augmented reality system
US8060824B2 (en) User interface for a multimedia service
US9478059B2 (en) Animated audiovisual experiences driven by scripts
EP2689343B1 (en) Remotely emulating computing devices
US9652046B2 (en) Augmented reality system
US20150026573A1 (en) Media Editing and Playing System and Method Thereof
US20090215512A1 (en) Systems and methods for a gaming platform
BR102013033136B1 (en) METHOD FOR GENERATING A LIMITED PLAYABLE VERSION OF A VIDEO GAME; AND METHOD TO PROVIDE REMOTE CONTROL OF A USER'S GAME
CN113298602A (en) Commodity object information interaction method and device and electronic equipment
WO2018140089A1 (en) System and method for interactive units within virtual reality environments
Seidelin HTML5 games: creating fun with HTML5, CSS3 and WebGL
US9764238B2 (en) Apparatus and method for servicing user participation-type game by using real-time flash-mob
NO345656B1 (en) Game Story System for Mobile Apps
Rettig Professional HTML5 mobile game development
Odom HoloLens Beginner's Guide
JP7212993B2 (en) Device, method, and program for gifting by fans
CN117899460A (en) Interaction method and device in game and electronic equipment
Sumpter Make a 2D arcade game in a weekend: with unity
CN117768667A (en) Picture configuration method, device, equipment, medium and program product
Green et al. Beginning Android Games
CN115686310A (en) Interaction method, device, equipment, computer readable storage medium and product
CN116737028A (en) Short video playing method and device and electronic equipment
JP2016131702A (en) Simulation device and simulation program of game machine
Magazine The Mobile Book

Legal Events

Date Code Title Description
CHAD Change of the owner's name or address (Section 44 of the Norwegian Patents Act; the Patent Regulations)

Owner name: OLE-IVAR HOLTHE, NO