WO2016127210A1 - System and method of implementing remotely controlling sensor-based applications and games which are run on a non-sensor device

System and method of implementing remotely controlling sensor-based applications and games which are run on a non-sensor device

Info

Publication number
WO2016127210A1
Authority
WO
WIPO (PCT)
Prior art keywords
events
effects
screen
games
application
Prior art date
Application number
PCT/AU2016/050076
Other languages
French (fr)
Inventor
Simon Wang
Thomas Liu
Original Assignee
Hubi Technology Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2015900485A external-priority patent/AU2015900485A0/en
Application filed by Hubi Technology Pty Ltd filed Critical Hubi Technology Pty Ltd
Priority to CN201680007046.3A priority Critical patent/CN107206282A/en
Publication of WO2016127210A1 publication Critical patent/WO2016127210A1/en

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45Controlling the progress of the video game


Abstract

The invention is a method of implementing a remote software control, which enables applications and games that were developed based on the sensors of a smart device to be used on a device that does not have the corresponding sensors. The method involves a client application installed on a controlling device, a server application installed on the controlled device, and seven essential processes including Capture Events, Map Positions, Send Events, Inject Events, Capture Effects, Send Effects and Perform Effects. A system implemented based on the method comprises client-server architecture, the two applications, and the key components in the two applications; some of the key components implement the functions of the seven processes.

Description

Title
[0001] System and method of implementing remotely controlling sensor-based applications and games which are run on a non-sensor device.
Technical Field
[0002] This invention relates to a remote control. In particular, this invention relates to a software remote control on a smart device such as a smart phone, a tablet and a smart watch.
Background Art
[0003] Smart phones and tablets are extremely popular nowadays in many countries all over the world. There are millions of applications and games available in the App stores of the two main platforms, Google's Android and Apple's iOS. These applications and games are all developed based on the device's touch screen, and some special applications and many of the games also utilize certain sensors of the device, such as the motion sensor, proximity sensor, gyroscope and so on.
[0004] Along with the increasing popularity of digital streaming content, Smart TVs, TV boxes and TV dongles began to enter people's living rooms, mainly enabling users to watch TV programs and movies over the Internet. Advanced ones can also be used to listen to music, play games, browse websites, shop online and so on. There are many platforms for these products, such as Android TV, Apple TV, Fire TV, Chromecast, WebOS and so on. These devices are normally controlled by a TV remote.
[0005] In recent years, Android game consoles have appeared, providing inexpensive gaming solutions compared with professional game consoles like the PlayStation, Xbox, Wii and so on. These Android game consoles normally come with a special game controller which supports Android, and with games which are developed particularly for the device and the controller.
[0006] Mini computers are another type of popular device entering people's living rooms. These devices are very small in size; they use TV screens as the monitor and mice and keyboards as the main controls. These devices can basically do whatever a TV box, a Smart TV or a game console can do. The platforms for these devices are mainly Windows, Android and Mac.
[0007] No matter what devices are used in the living room, a remote controller is an indispensable part. Commonly used controllers include TV remotes, smart remotes with a touch pad, mice and keyboards, mini keyboards, and mobile Apps on a smart phone. Another indispensable part is the resources available on the device. The resources are a key factor determining whether the device can become popular. They include applications, games and content. The applications and games are distributed in the form of Apps on the mobile platforms, and the content consists of TV programs, movies, books, music and so on, which are platform independent and delivered through specific applications. Therefore, applications and games, or Apps, actually represent all the resources.
[0008] As mentioned at the beginning, there are millions of Apps on the two popular mobile platforms, Android and iOS. At the moment, Apple has not integrated its iOS resources into its TV box, but Google has moved forward: its Android TV platform and Nexus Player product come with the attractive feature of utilizing all the Android platform resources. At the same time, a variety of Android devices such as TV boxes, TV dongles and Mini PCs are available in the market aiming to utilize all the Android resources as well. But the reality is that all the applications and games were originally developed entirely around the touch screen and certain sensors, while the target devices do not have the touch screen and the sensors, so the vast majority of them cannot be used at all, especially games. A mouse may be used instead of a touch screen, but it is very inconvenient for controlling an application or a game compared with touch operations; an Android game controller may be expected to control the games, but it is unable to control the vast majority of the games at the moment for the same reason.
[0009] In late 2014, Android Lollipop 5 was officially released by Google, providing support for all screen sizes, including the TV screen. At the same time, Google started to encourage developers to develop applications and games which support the big screen and multiple controls. But it may take a very long time to wait for developers to develop new applications and games, or to convert the current applications or games to support these features, judging by the current adoption rate of Android 5. Moreover, the applications and games developed for the big screen are unable to use the sensors which a smart phone or tablet has. Therefore, the situation would remain the same.
[0010] Imagine, if all the App resources on the mobile platforms could be perfectly used on a big screen without changing anything, or with just a little effort to change the image resolution, how easy it would be for developers to move their resources to a new platform, and how easy it would be for users to enjoy the free resources which they already have in their App store accounts. Apple's AirPlay technology was one of the solutions to achieve this purpose, but it actually just streams the display of the mobile device to the big screen; Google's Chromecast technology was another solution, but it can only cast multimedia content and it also requires the casting applications to implement specific APIs.
[0011] To sum up the problem: there is a huge number of App resources on the mobile platforms, but the vast majority of them cannot be used on the big screen, because they were originally developed based on the touch screen and certain sensors.
[0012] The touch screen is actually one of the sensors on a smart device, but it is distinguished from the other sensors in this document because it is an essential control component for every mobile App.
Summary of Invention
[0013] The invention is a method of implementing a remote software control which enables applications and games that were developed based on the sensors of a smart device to be used on a device that does not have the corresponding sensors. For example, a tablet game with touch control and motion control can be installed and run on a TV box or a Mini PC, but cannot be played with a game controller or other types of controllers because the game was developed for the tablet, which has a touch screen and a motion sensor; the invention enables the player to use the tablet as a controller and to control the game in exactly the same way as when playing the game on the tablet. The sensors herein include the motion sensor, proximity sensor, light sensor, moisture sensor, position sensor and any other sensors available on the smart device. The smart device herein includes, but is not limited to, a smartphone, a tablet and a smart watch.
[0014] The invented method can be simply understood in the following way:
1. Cut a smart phone or tablet into two pieces: the screen with the sensors, and the processor;
2. The screen with the sensors becomes the independent controller;
3. And the processor becomes a TV box connecting with a big screen;
4. The original control events are sent to the processor through a communication medium;
5. The display effects are displayed on both screens.
[0015] The invented method involves two software applications which are installed on a controlling device and the controlled device respectively. The application installed on the controlling device is a client application and the application installed on the controlled device is a server application. The method comprises seven processes, including Capture Events, Map Positions, Send Events, Inject Events, Capture Effects, Send Effects and Perform Effects.
[0016] A system implemented based on the invented method is in client-server architecture, wherein a client application is installed on a controlling device and a server application is installed on the controlled device. The key system components in the client application comprise GUI, Device Finder, Event Catcher, Event Mapper, Event Sender, Effect Receiver and Effect Performer; the key system components in the server application comprise Finding Responder, Event Receiver, Event Injector, Effect Catcher, and Effect Sender; the communication between the client application and the server application applies UDP, TCP and Bluetooth technologies.
Drawings
[0017] Figure 1 shows the processes involved in the invention.
[0018] Figure 2 shows the processes involved in the invention in the context of the controlling device and the controlled device.
[0019] Figure 3 shows the key components of a practical system which is implemented based on the invention.
[0020] Figure 4 shows a demonstration of the Map Positions process.
[0021] Figure 5 shows an example of the event format for the Send Events process.
[0022] Figure 6 shows an example of the effect format for the Send Effects process.
[0023] Figure 7 shows one way to inject events in the server application.
[0024] Figure 8 shows one way to capture effects in the server application.
Invention Embodiments
[0025] The invented method involves a client software application which is installed on a controlling device and a server software application which is installed on the controlled device. Multiple controlling devices can control the controlled device at the same time. The controlling device is the device which the applications and games were developed for, and the controlled device is the device which the applications and games are run on. The invented method comprises seven processes, as shown in Figure 1, including Capture Events, Map Positions, Send Events, Inject Events, Capture Effects, Send Effects and Perform Effects. Figure 2 shows the processes in the context of both devices.
[0026] The first process is Capture Events. This is a process hosted by the client application on the controlling device. The events are captured from the application GUI as shown in Figure 2, and they include all the events supported by the local operating system. There are two types of events: the events which need a mapping process and the events which do not. The former are screen position-dependent; for example, a Touch Event comes with coordinate parameters representing the touch position on the screen, and in this case the position on the controlling device screen must be mapped to the corresponding position on the controlled device screen, otherwise it will not work properly. The latter are events which do not need position parameters, such as a Key Event and a Sensor Event. In Figure 1, 101 denotes the former type of event and 103 the latter.
[0027] The second process is Map Positions. This process maps the screen positions for position-dependent events such as Touch Events, from the position on the controlling device screen to the position on the controlled device screen. This is an essential process because the controlled device normally has a bigger screen. To achieve the correct mapping, a ratio mapping method can be used, as described below.
[0028] As shown in Figure 4, a touch point (x1, y1) on the controlling device screen needs to be mapped to the touch point (x2, y2) on the controlled device screen. The coordinate centre (0, 0) is at the left bottom corner of the screen for both devices; w1 and h1 are the width and height of the controlling device screen, and w2 and h2 are the width and height of the controlled device screen; w1, h1, w2, h2, x1 and y1 are known; x2 and y2 are the expected values. To work out x2 and y2, the following formulas can be used:
x2 = (x1 / w1) * w2
y2 = (y1 / h1) * h2
[0029] The Map Positions process can be either in the client application or in the server application, although by default it is in the client application. The w2 and h2 in the mapping method can be a virtual size. For example, if the actual screen resolution of the controlled device is 1920 x 1080, w2 x h2 can be registered as 1920 x 1080, 2000 x 2000 or any other figures. The local operating system will handle the mapping between the registered size and the actual size. Note that the registered size will affect the precision of the touch events: theoretically, the bigger the registered size, the higher the precision, unless the virtual size exceeds the actual size.
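As a minimal illustration (not part of the original disclosure), the ratio mapping above can be sketched in C as follows; the function and type names are assumptions made for this example:

```c
/* Ratio mapping of a touch point from the controlling screen (w1 x h1)
 * to the controlled screen (w2 x h2), which may be a registered virtual
 * size rather than the physical resolution. */
typedef struct {
    int x;
    int y;
} point_t;

point_t map_position(point_t p1, int w1, int h1, int w2, int h2)
{
    point_t p2;
    p2.x = (int)((double)p1.x / (double)w1 * (double)w2);
    p2.y = (int)((double)p1.y / (double)h1 * (double)h2);
    return p2;
}
```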
[0030] The third process is Send Events. The captured original events or the mapped events are sent directly to the controlled device in a certain format. The format must be defined in an efficient way. With the currently most common communication technologies, such as Wi-Fi and Bluetooth, the events need to be sent as bytes, and the events are normally represented by integer and float values. For example, a Touch Event can be represented by an integer Action value and two integer coordinate values X and Y; a motion sensor event can be represented by three float coordinate values X, Y and Z.
[0031] Figure 5 shows a format to send the events. The format comprises four parts: the first part is the first byte representing the event type; the second part is the following four bytes representing the first integer argument (Arg1) or the first float value (Val1); the third part is the four bytes after the second part representing the second integer argument (Arg2) or the second float value (Val2); the fourth part is the last four bytes representing the third integer argument (Arg3) or the third float value (Val3). The type is an integer stored in one byte, which represents the different event types including Touch Event, Mouse Event, Key Event, Sensor Event and so on. The following three parts are used to store the parameters for different types of events. They can be integer values or float values depending on the type of event. For example, a Touch Event comes with an integer Action value (representing a Down or Up event), an integer coordinate X value and an integer coordinate Y value; in this case, the first part stores the Action value called Arg1, the second part stores the X value called Arg2, and the third part stores the Y value called Arg3. For a motion sensor event, the three parts are used to store three float values representing the movement in the three dimensions, which are called Val1, Val2 and Val3 respectively.
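The 13-byte layout of Figure 5 (one type byte plus three four-byte arguments) can be packed as in the following C sketch; the enum values and function names are illustrative assumptions, and both ends must agree on the byte order:

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Event type codes - illustrative values; the patent does not fix the numbering. */
enum { EVT_TOUCH = 1, EVT_MOUSE = 2, EVT_KEY = 3, EVT_SENSOR = 4 };

/* Pack a Touch Event: type byte, Arg1 = action (Down/Up), Arg2 = X, Arg3 = Y. */
size_t pack_touch_event(uint8_t out[13], int32_t action, int32_t x, int32_t y)
{
    out[0] = EVT_TOUCH;
    memcpy(out + 1, &action, 4);  /* Arg1 */
    memcpy(out + 5, &x, 4);       /* Arg2 */
    memcpy(out + 9, &y, 4);       /* Arg3 */
    return 13;
}

/* Pack a motion sensor event: type byte followed by three float values. */
size_t pack_sensor_event(uint8_t out[13], float vx, float vy, float vz)
{
    out[0] = EVT_SENSOR;
    memcpy(out + 1, &vx, 4);      /* Val1 */
    memcpy(out + 5, &vy, 4);      /* Val2 */
    memcpy(out + 9, &vz, 4);      /* Val3 */
    return 13;
}
```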
[0032] The fourth process is Inject Events. The events received by the server application must be injected into the local operating system by creating virtual input devices. The created input devices must be recognized by applications and games without any difference from the local real input devices. That means that, to applications and games on the controlled device, the events appear to come from the local touch screen and sensors.
[0033] Different types of events may need different types of input devices to be created, and different operating systems may have different mechanisms for handling input devices. It is necessary to refer to the operating system's documentation to find the exact way to create virtual input devices for all the events. For a Linux kernel based operating system like Android, as shown in Figure 7, the "uinput" and "input" interfaces can be used to simulate the input devices; in this case, Java JNI can be used to call C code to create the device and directly inject the events into the system.
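A minimal C sketch of the uinput approach mentioned above is given below; it creates a single-pointer virtual touch device and injects one touch-down event. The device name, axis ranges and error handling are simplified assumptions, and a production Android implementation would normally use the multitouch (ABS_MT_*) protocol and require appropriate privileges:

```c
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/input.h>
#include <linux/uinput.h>

/* Create one virtual touch device for a given controller index. */
int create_virtual_touch_device(int index, int max_x, int max_y)
{
    int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);
    if (fd < 0) return -1;

    /* Declare the events this virtual device can emit. */
    ioctl(fd, UI_SET_EVBIT, EV_KEY);
    ioctl(fd, UI_SET_KEYBIT, BTN_TOUCH);
    ioctl(fd, UI_SET_EVBIT, EV_ABS);
    ioctl(fd, UI_SET_ABSBIT, ABS_X);
    ioctl(fd, UI_SET_ABSBIT, ABS_Y);

    struct uinput_user_dev uidev;
    memset(&uidev, 0, sizeof(uidev));
    snprintf(uidev.name, UINPUT_MAX_NAME_SIZE, "virtual-touch-%d", index);
    uidev.id.bustype = BUS_VIRTUAL;
    uidev.absmin[ABS_X] = 0; uidev.absmax[ABS_X] = max_x;
    uidev.absmin[ABS_Y] = 0; uidev.absmax[ABS_Y] = max_y;

    write(fd, &uidev, sizeof(uidev));
    ioctl(fd, UI_DEV_CREATE);          /* the device now appears to applications */
    return fd;
}

static void emit(int fd, int type, int code, int value)
{
    struct input_event ev;
    memset(&ev, 0, sizeof(ev));
    ev.type = type; ev.code = code; ev.value = value;
    write(fd, &ev, sizeof(ev));
}

/* Inject a mapped touch-down event received from the client application. */
void inject_touch_down(int fd, int x, int y)
{
    emit(fd, EV_ABS, ABS_X, x);
    emit(fd, EV_ABS, ABS_Y, y);
    emit(fd, EV_KEY, BTN_TOUCH, 1);
    emit(fd, EV_SYN, SYN_REPORT, 0);   /* flush the event frame */
}
```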
[0034] To inject events, one concern is that, when multiple controlling devices are connected to the controlled device, recognizing the event source is important for some Apps, especially games. This can be done by creating a virtual input device for each type of event and for each of the controlling devices. For example, when a controlling device is connected to the controlled device, a virtual touch device can be created to handle touch events, a virtual motion sensor device can be created to handle motion events, and a virtual key event device can be created to handle all key events; when another controlling device is connected to the controlled device, another group of virtual devices is created. In this way, the applications and games are able to support multiple separate controllers because they can recognize every created virtual device by its device ID, which is managed by the system device manager. In some implementations it may not be necessary to distinguish the event source; in this case, shared virtual input devices can be created for all the controlling devices. It is left to the actual implementation to decide whether to create separate, shared or hybrid virtual devices.
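One way to organize the per-controller devices described above is to keep a small group of file descriptors for each connected controller, as in this sketch; create_virtual_motion_device and create_virtual_key_device are hypothetical helpers assumed to follow the same uinput pattern as the touch device shown earlier:

```c
/* One group of virtual input devices per connected controlling device. */
typedef struct {
    int touch_fd;   /* virtual touch device          */
    int motion_fd;  /* virtual motion sensor device  */
    int key_fd;     /* virtual key event device      */
} controller_group_t;

/* Called when controller number `index` connects. The motion and key
 * helpers are hypothetical; they are assumed to mirror the touch sketch. */
controller_group_t create_controller_group(int index, int screen_w, int screen_h)
{
    controller_group_t g;
    g.touch_fd  = create_virtual_touch_device(index, screen_w - 1, screen_h - 1);
    g.motion_fd = create_virtual_motion_device(index);   /* hypothetical */
    g.key_fd    = create_virtual_key_device(index);      /* hypothetical */
    return g;
}
```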
[0035] The fifth process is Capture Effects. The effects herein are captured by the server application on the controlled device and include, but are not limited to, Display Effects, Sound Effects and Vibration Effects. The Display Effects are the display on the controlled device screen; screenshots or a stream of the display video are sent back to the controlling device and displayed on its screen in real time, so that the user is able to know where to perform touch operations. The Sound Effects and Vibration Effects are mainly used by games to provide instant operation feedback to the controller; for example, when playing a car driving game and the car collides with other racing cars, a collision sound effect or a vibration effect can be sent to the client controller so that the user can hear or feel it. These effects are captured from games which support these features. This is particularly useful for multi-player games, which can send specific sound and vibration effects to an individual player. According to the demands of games or special applications, more effects may be added in the same way.
[0036] The dashed link between the Inject Events process and the Capture Effects process, as shown in Figure 1, means that the latter does not necessarily follow the former; in other words, they may or may not be dependent processes. For example, capturing display effects is not related to Inject Events, while capturing sound and vibration effects may be triggered by certain events.
[0037] Figure 8 shows how effects are captured on an Android device. Capturing Display Effects is different from capturing other effects. For Display Effects, the Linux kernel has a device called "framebuffer" or "fb" which contains the screen display data in RGB format. From the memory map of the device, screenshots can be fetched directly through JNI, or a video stream can be created using the device as the source. Sound Effects and Vibration Effects are captured from the applications or games, but this requires the applications and games to implement the corresponding APIs.
[0038] The sixth process is Send Effects. The effects are sent back to the client application in a certain format. As with Send Events, based on the currently most common technologies, the effects can be sent as bytes. Figure 6 shows an example format which includes a header of nine bytes and data of a variable number of bytes. In the header, the first byte is used to store an integer type which includes Display, Sound and Vibration; the remaining eight bytes are divided into two groups, and the two groups of bytes are used for different purposes according to the effect type. For a Display Effect, the first group stores the sub-type of the Display Effect, with 0 and 1 representing images and video respectively, and the second group stores the length of the data. For the other two types, the first group stores the effect sub-type and the second group the time length; for a Sound Effect, as an example, the first group indicates which sound effect is going to be performed, and the second group indicates how long the sound effect will be performed. The Data is used for Display Effects or any other effects which require data; it stores the compressed image data or the URI of the video stream.
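The nine-byte header of Figure 6 can be packed as in the short C sketch below; the numeric type codes are illustrative assumptions:

```c
#include <stdint.h>
#include <string.h>

/* Effect type codes per Figure 6 - the numeric values are illustrative. */
enum { FX_DISPLAY = 1, FX_SOUND = 2, FX_VIBRATION = 3 };

/* Pack the nine-byte effect header:
 * byte 0    : effect type
 * bytes 1-4 : group 1 (sub-type, e.g. 0 = image, 1 = video, or sound id)
 * bytes 5-8 : group 2 (data length, or time length for sound/vibration) */
void pack_effect_header(uint8_t out[9], uint8_t type,
                        uint32_t group1, uint32_t group2)
{
    out[0] = type;
    memcpy(out + 1, &group1, 4);
    memcpy(out + 5, &group2, 4);
}

/* Example: a compressed screenshot of jpeg_len bytes would be announced as
 * pack_effect_header(hdr, FX_DISPLAY, 0, jpeg_len); followed by the data. */
```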
[0039] The seventh process is Perform Effects. When the client application receives the effects, it performs each effect according to its type. For a Display Effect, the received image data will be set to the display component if images are sent back, and the video stream will be played on the display component in real time if video is used. For a Sound Effect, pre-set sounds may be played according to the received specification. For a Vibration Effect, the local vibration APIs may be called to perform the effect according to the received specification. Other types of effects can be defined in the actual implementations. The performed effects are normally reflected on the application GUI.
[0040] A system implemented based on the invented method is in client-server architecture, wherein a client application is installed on one or more controlling devices and a server application is installed on the controlled device. As shown in Figure 3, the system comprises seven key components in the client application and five key components in the server application.
[0041] In the client application, the key components comprise:
GUI, used to set up the application and from which the events are captured;
Device Finder, used to find a controlled device in the local network;
Event Catcher, used to capture events from the GUI;
Event Mapper, used to map the position on the controlling device screen to the position on the controlled device screen for position-dependent events;
Event Sender, used to send the events to the server application;
Effect Receiver, used to receive the effects from the server application;
Effect Performer, used to perform the received effects on the GUI.
[0042] In the server application, the key components comprise:
Finding Responder, used to respond to finding broadcasts from the client application;
Event Receiver, used to receive the events from the client application;
Event Injector, used to inject the events into the operating system by simulating real input devices;
Effect Catcher, used to capture the effects from the operating system or the kernel, and from the Apps as well;
Effect Sender, used to send the captured effects to the client application.
[0043] The system uses UDP broadcasts in the local network or Bluetooth discovery to find the controlled device. After the device has been found, the user can choose to connect to the device by Wi-Fi or Bluetooth if both are supported; only one of them can be used at a time. After the connection has been established, the system enters the processes described above.
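A minimal C sketch of the UDP broadcast discovery step is shown below; the port number and probe message are illustrative assumptions, and a real Finding Responder would reply with the server's address and capabilities:

```c
#include <arpa/inet.h>
#include <netinet/in.h>
#include <string.h>
#include <sys/socket.h>
#include <sys/time.h>
#include <unistd.h>

/* Broadcast a discovery probe on the local network and wait briefly for a
 * reply from a Finding Responder. Returns 0 and fills *found on success. */
int find_controlled_device(struct sockaddr_in *found)
{
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) return -1;

    int yes = 1;
    setsockopt(sock, SOL_SOCKET, SO_BROADCAST, &yes, sizeof(yes));

    struct timeval tv = { .tv_sec = 2, .tv_usec = 0 };   /* 2 s reply timeout */
    setsockopt(sock, SOL_SOCKET, SO_RCVTIMEO, &tv, sizeof(tv));

    struct sockaddr_in bcast;
    memset(&bcast, 0, sizeof(bcast));
    bcast.sin_family = AF_INET;
    bcast.sin_port = htons(50123);                       /* illustrative port */
    bcast.sin_addr.s_addr = htonl(INADDR_BROADCAST);

    const char *probe = "FIND_CONTROLLED_DEVICE";        /* illustrative message */
    sendto(sock, probe, strlen(probe), 0, (struct sockaddr *)&bcast, sizeof(bcast));

    char reply[64];
    socklen_t len = sizeof(*found);
    ssize_t n = recvfrom(sock, reply, sizeof(reply), 0, (struct sockaddr *)found, &len);

    close(sock);
    return n > 0 ? 0 : -1;
}
```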

Claims

Claims
1. A method for remotely controlling applications or games, comprising:
a client application installed on a controlling device;
a server application installed on the controlled device;
Capture Events process, wherein the client application captures screen-position-dependent events and screen-position-independent events from the application GUI;
Map Positions process, wherein the client application or the server application maps the screen positions for the screen-position-dependent events from the position on the controlling device screen to the position on the controlled device screen;
Send Events process, wherein the client application sends the originally captured events or the mapped events to the server application;
Inject Events process, wherein the server application injects the received events or the mapped events into the local operating system;
Capture Effects process, wherein the server application captures the Effects from the local operating system or applications and games;
Send Effects process, wherein the server application sends the captured Effects to the client application;
Perform Effects process, wherein the client application performs the received Effects and reflects the results onto the GUI.
2. The Capture Events process of claim 1, wherein the events include all the events supported by the local operating system.
3. The Inject Events process of claim 1, wherein the server application injects the events either by the kernel's interface of the operating system or by the APIs (application programming interface) provided by the operating system, with creating the corresponding virtual input devices which may or may not be recognized by applications and games.
4. The Capture Effects process of claim 1, wherein the Effects comprise Display Effects, Sound Effects, Vibration Effects and any other effects as the controlling results.
5. A system which is implemented based on the method defined in claim 1, comprising: client-server architecture, wherein a client application is installed on a controlling device and a server application is installed on the controlled device;
the processes defined in claim 1.
PCT/AU2016/050076 2015-02-13 2016-02-09 System and method of implementing remotely controlling sensor-based applications and games which are run on a non-sensor device WO2016127210A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201680007046.3A CN107206282A (en) 2015-02-13 2016-02-09 Remote control runs the system and implementation of sensor-based application program and game on without sensor device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2015900485A AU2015900485A0 (en) 2015-02-13 System and method of implementing remotely controlling sensor-based applications and games which are run on a non-sensor device
AU2015900485 2015-02-13

Publications (1)

Publication Number Publication Date
WO2016127210A1 true WO2016127210A1 (en) 2016-08-18

Family

ID=53054315

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2016/050076 WO2016127210A1 (en) 2015-02-13 2016-02-09 System and method of implementing remotely controlling sensor-based applications and games which are run on a non-sensor device

Country Status (3)

Country Link
CN (1) CN107206282A (en)
AU (1) AU2015100438B4 (en)
WO (1) WO2016127210A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11573874B2 (en) 2021-01-05 2023-02-07 The Mitre Corporation Systems and methods for automated injection of effects in cyber-physical systems and their simulations

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100251292A1 (en) * 2009-03-27 2010-09-30 Sudharshan Srinivasan Smartphone for interactive television
GB2471883A (en) * 2009-07-16 2011-01-19 Nec Corp Controlling a software application in a thin client session using a mobile device
US20140349757A1 (en) * 2011-06-03 2014-11-27 Nintendo Co., Ltd. Computer-Readable Storage Medium, Information Processing Apparatus, Information Processing System and Information Processing Method
US20140373082A1 (en) * 2012-02-03 2014-12-18 Sharp Kabushiki Kaisha Output system, control method of output system, control program, and recording medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103092495B (en) * 2011-11-01 2017-07-28 茂杰国际股份有限公司 The synchronously operating system and method for contactor control device
CN103079019A (en) * 2012-12-21 2013-05-01 康佳集团股份有限公司 Control method and system for controlling intelligent terminal through mobile equipment
US8858335B2 (en) * 2013-01-18 2014-10-14 Microsoft Corporation Reconfigurable clip-on modules for mobile computing devices
CN103908778A (en) * 2014-04-04 2014-07-09 深圳市同洲电子股份有限公司 Game control method and related terminals

Also Published As

Publication number Publication date
CN107206282A (en) 2017-09-26
AU2015100438B4 (en) 2016-04-28
AU2015100438A4 (en) 2015-05-14

Similar Documents

Publication Publication Date Title
US11752429B2 (en) Multi-user demo streaming service for cloud gaming
US11097188B2 (en) System, method, and graphical user interface for controlling an application executing on a server
US20230233933A1 (en) Video Game Overlay
CN107029429B (en) System, method, and readable medium for implementing time-shifting tutoring for cloud gaming systems
US9990029B2 (en) Interface object and motion controller for augmented reality
US9937423B2 (en) Voice overlay
US9707485B2 (en) Systems and methods for cloud processing and overlaying of content on streaming video frames of remotely processed applications
US8435121B1 (en) Providing remote access to games designed for a single-machine experience
US20150304712A1 (en) Method, apparatus, and system for transferring digital media content playback
JP2015506198A (en) Content system with secondary touch controller
US20140121010A1 (en) Method and system for video gaming using game-specific input adaptation
JP5947876B2 (en) Information processing system, information processing method, information processing program, computer-readable recording medium recording the information processing program, and information processing apparatus
US9948691B2 (en) Reducing input processing latency for remotely executed applications
US9497238B1 (en) Application control translation
US11936928B2 (en) Method, system and device for sharing contents
AU2015100438B4 (en) System and method of implementing remotely controlling sensor-based applications and games which are run on a non-sensor device
JP2013109560A (en) Information processing system, information processing terminal, information processing method, information processing program, and computer-readable recording medium storing information processing program
US9954718B1 (en) Remote execution of applications over a dispersed network
Rivas et al. Interactive techniques for entertainment applications using mobile devices

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16748480

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16748480

Country of ref document: EP

Kind code of ref document: A1