GB2618596A - Auxiliary sensing system and method

Auxiliary sensing system and method

Info

Publication number
GB2618596A
Authority
GB
United Kingdom
Prior art keywords
sensors
portable
data
portable processing
real
Prior art date
Legal status
Pending
Application number
GB2206947.0A
Other versions
GB202206947D0 (en)
Inventor
Serdar Kocdemir Sahin
William Walker Andrew
Current Assignee
Sony Interactive Entertainment Inc
Original Assignee
Sony Interactive Entertainment Inc
Priority date
Filing date
Publication date
Application filed by Sony Interactive Entertainment Inc filed Critical Sony Interactive Entertainment Inc
Priority to GB2206947.0A priority Critical patent/GB2618596A/en
Publication of GB202206947D0 publication Critical patent/GB202206947D0/en
Publication of GB2618596A publication Critical patent/GB2618596A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/131 Protocols for games, networked simulations or virtual reality

Abstract

A system for generating data for use by one or more portable processing devices 110 located in a real-world environment, the system comprising one or more sensors 130 configured to capture information about the real-world environment and/or one or more elements in the real-world environment, the sensors being remote to the one or more portable processing devices 110, and a sensing device 120 configured to generate data in dependence upon the information captured by the sensors, and to transmit this data to one or more of the portable processing devices. The sensors may include cameras, microphones, and heat, movement, orientation, position, or pressure sensors. The sensing device processes data from the sensors and transmits data such as tracking information to the portable device(s). The sensors and sensing device may be embodied in a single device, which may be a drone. The portable processing devices may be a head-mounted display, phone, or portable games console.

Description

AUXILIARY SENSING SYSTEM AND METHOD
BACKGROUND OF THE INVENTION
Field of the invention
This disclosure relates to an auxiliary sensing system and method.
Description of the Prior Art
The "background" description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventors, to the extent it is described in this background section, as well as aspects of the description which may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present invention.
Portable electronic devices have been in use for many years now, but particularly in recent years there has been an increase in the capabilities of such devices. This has led to an increasing adoption of such devices, as they are able to provide worthwhile experiences that in many cases are comparable to those offered by equivalent non-portable devices, as well as often being able to perform multiple functions. The increasing use of such devices has also been encouraged by the development of high-speed mobile networks, and the increasing prevalence of wireless networks offering a user internet access when away from home.
One example of this is mobile phones, which have experienced significant increases in their processing power and the quality of their displays; this allows a number of different applications, including computer games, to be executed by the device. These technological advances have allowed users to obtain portable processing devices at increasingly affordable prices for a given amount of processing power. Similarly, portable games consoles have seen comparable increases in their processing capabilities over the years. Another example is that of head-mountable display devices (HMDs); while more static devices have been available for some time, it is only more recently that truly portable solutions have become popular.
However, despite the improvements to portable devices mentioned above, there are still a number of limitations on the operation of such devices. Firstly, the primary function of these devices is portability; this means that the weight of such devices is a primary consideration, as a device that is too heavy may be too impractical to be considered portable. Similarly, a small form factor may be desirable to improve user operability when on the move. This limits the number and/or specifications of hardware elements in the device, including batteries, sensors, and processing units.
This means that limitations are placed on experiences that are able to be provided to a user. For instance, a limitation on battery capacity may reduce the length of a user experience while also reducing the power of processing units in an effort to balance the quality and duration of the user experience. One solution that has been proposed for such problems is that of a docking station which can offer power to a device while it is in use, and in some cases can offer additional processing power to the device as well as an interface to additional peripherals and/or display units.
It is in the context of the above discussion that the present disclosure arises.
SUMMARY OF THE INVENTION
This disclosure is defined by claim 1.
Further respective aspects and features of the disclosure are defined in the appended claims.
It is to be understood that both the foregoing general description of the invention and the following detailed description are exemplary, but are not restrictive, of the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
A more complete appreciation of the disclosure and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
Figure 1 schematically illustrates a system comprising a number of portable devices and an auxiliary sensing arrangement;
Figure 2 schematically illustrates an embodiment using a fixed sensing device;
Figure 3 schematically illustrates an embodiment using a non-fixed sensing device;
Figure 4 schematically illustrates a system for generating data for use by one or more portable processing devices located in a real-world environment; and
Figure 5 schematically illustrates a method for generating data for use by one or more portable processing devices located in a real-world environment.
DESCRIPTION OF THE EMBODIMENTS
Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views, embodiments of the present disclosure are described.
Figure 1 schematically illustrates a system comprising a number of portable devices and an auxiliary sensing arrangement (including one or more sensors and optional elements for processing the results of detections by those sensors) which are in communication. An auxiliary sensing arrangement is defined here as meaning any device which is operable to obtain information about an environment or one or more elements within that environment on behalf of a portable device; this may include further processing of sensor data and the like, in some cases. Examples of such arrangements are discussed in more detail below.
The arrangement and number of devices shown in Figure 1 should be regarded as being purely exemplary rather than limiting in any regard. For instance, the number of portable devices 100 may be increased or decreased, as may the number of sensors 130, and each of the portable devices 100 may be able to communicate directly with the sensing device 120 without the use of a centralised network 110. Similarly, the number of sensing devices 120 should not be considered as being limited to one; it is considered that the portable devices 100 and sensing devices may be provided in a many-to-one, one-to-many, and/or many-to-many correspondence as appropriate. This may be determined on a per-device basis, rather than a per-system basis, meaning that in a particular arrangement some portable devices 100 may be associated with many sensing devices 120 while some are associated with only one.
Such a correspondence may also be dynamic throughout the use of a particular portable device 100, based upon a user, device, or application preference or requirement for example.
The portable devices 100 may each be associated with a single user, or may be associated with a respective one of a plurality of users; in other words, there may be any correspondence between users and devices as appropriate for a given implementation. For instance, a user may have an HMD and a portable games console as an example of a user being associated with multiple devices. Similarly, a number of users may be sharing a single portable games console. The portable devices may include any device that is operable to perform a processing function and able to be operated by a user without placing constraints on the user's location (in other words, the device is both transportable and able to be used during transport).
Examples of suitable portable devices 100 include mobile phones, portable games consoles, tablet computers, and head-mountable display devices (including both see-through and full-immersion devices).
Any other device that is able to be used in a handheld configuration or able to be mounted upon the user's body may be considered appropriate. In some embodiments, the portable devices 100 could also be considered to include vehicles (such as user-operated or driverless cars) or robots; while these are not handheld devices, they may be interacted with by a user during travel. For example, a robot may walk about an environment when performing a function instructed by a user, thereby meeting the requirements of a portable device.
The network 110 is shown as an example of a communication protocol able to be used by the portable devices 100 and/or sensing device 120. This may be an internet connection that enables communication between any of these devices, which may be managed by a server, or an ad-hoc network between the devices. This connection may be implemented using any known technology; examples include Wi-Fi and Bluetooth ® connections, although any other communication protocol may be considered in dependence upon the intended implementation.
The sensing device 120 may be a further portable device, or may be a fixed (non-portable) device as appropriate for a given implementation. Examples of each of these are discussed below. In some embodiments, the sensing device 120 may act as an interface device between the sensors 130 and the portable devices 100; in other words, the sensing device 120 may receive sensor readings and pass these to the portable devices 100 without further processing. In other embodiments, processing may be applied to some or all of these readings in dependence upon the type of sensor or the requirements of one or more of the portable devices 100. Examples of such processing include changing the format of the sensor data, performing a smoothing or other interpretation of the data, and the generation of further data based upon the sensor data (such as generating a map of the environment). In some embodiments the processing may instead be offloaded to a server or other remote processing device if a low-latency network connection is available; this can reduce the processing burden on the portable device 100 and sensing device 120.
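As a minimal sketch of this interface role (not the claimed implementation), the sensing device could receive readings and either forward them unchanged or apply a simple smoothing step first; the UDP transport, port number, and JSON payload layout below are illustrative assumptions.

```python
import json
import socket

FORWARD_PORT = 9000  # assumed port on which portable devices listen

def smooth(previous, value, alpha=0.3):
    """Exponential smoothing as one example of 'interpretation' of the data."""
    return value if previous is None else alpha * value + (1 - alpha) * previous

def relay(readings, device_addresses, process=False):
    """Forward each reading to every portable device, optionally smoothed."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    smoothed = None
    for reading in readings:  # e.g. {"sensor": "temp_1", "value": 21.5}
        if process:
            smoothed = smooth(smoothed, reading["value"])
            reading = {**reading, "value": smoothed}
        payload = json.dumps(reading).encode("utf-8")
        for address in device_addresses:
            sock.sendto(payload, (address, FORWARD_PORT))
```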
The sensors 130 may include any number of sensors (in some cases, only one sensor may be provided; the fact that two are shown here should not be regarded as being limiting) in any suitable configuration. In some embodiments one or more sensors 130 may be integrated with the sensing device 120 in a single physical unit; alternatively, or in addition, sensors 130 may be provided separately to the sensing device 120 such that they are embodied in different physical units. This may allow the sensors to be operated and/or moved independently of the sensing device 120.
The sensors 130 and one or more sensing devices 120 may be configured in any suitable manner -any number of sensors 130 may be associated with a single sensing device 120, while each sensor 130 may be associated with any number of sensing devices 120. This means that each sensing device 120 may receive data from any number of sensors 130, and each of the sensors 130 may transmit data to any number of sensing devices 120. The data may be transmitted wirelessly or through a wired connection as appropriate for a given implementation.
Figure 2 schematically illustrates an exemplary embodiment in which a fixed (that is, stationary) sensing device is used in an environment. Such an arrangement may be considered advantageous in at least that the sensing device (and one or more sensors) are able to be powered without relying on the batteries of a portable device. It is also considered that, without requiring the sensing device or sensors to be portable, higher quality (which can often mean bulkier or more demanding) sensors can be provided, which can enable the capture of higher quality data (such as capturing higher-resolution images of an environment).
The environment 200 may be any environment, indoor or outdoor and with any size as appropriate, in which one or more users may be expected to use portable devices. Examples may include a hall in which an event is hosted, a shopping centre, a sports stadium, a music venue, a park, an office, or a restaurant; however, any space in which users of portable devices are present may be considered suitable.
The environment 200 comprises a sensing device 210, for simplicity shown in the corner of the environment 200; this is an example of the sensing device 120 discussed with reference to Figure 1. However, the location should not be regarded as being limited in this way, as the sensing device 210 may be located anywhere within an environment, which may include being suspended above the environment or even located partially underground to reduce the amount of space required. This sensing device 210 is operable to communicate with any one or more of the portable devices 220 (associated with users, not shown) to relay sensor information and/or data derived from sensor information. This sensor information may be captured by any sensors in the environment; Figure 2 illustrates exemplary sensors 230, 240, and 250, but any others may be used as an alternative or additional source of data about the environment.
The exemplary sensor 230 is a camera that is operable to capture images of the environment 200; these may be captured using visible light, for example, or infra-red or other types of images. These images may be used to generate a map of the environment (particularly if multiple cameras are provided); alternatively, or in addition, one or more portable devices 220 and/or users may be tracked within the environment 200 using the captured images. In some cases, it may be possible to identify users within the environment 200 using some form of facial or gait recognition (for example).
The exemplary sensor 240 is a microphone or a microphone array operable to capture audio in the environment 200. This captured audio can be used for tracking objects or people in the environment (for example, based upon identifying audio output by an object), for instance, and/or for modelling the environment itself. In some embodiments the captured audio may be used to indicate the occurrence of an event, or to gauge user sentiment in the environment or the like. Any suitable use of the audio data may be considered where appropriate for a given implementation, as the use of that data is separate from the discussion of its capture.
The exemplary sensor 250 is that of one or more pressure pads located in the environment 200. These can be used to infer user location within the environment, and may be used to determine the level of occupancy of the environment. Information from the pressure pads may be used in conjunction with one or more other sensor readings to generate a refined location of a particular user, for instance; an example of this is an approximate location, generated based upon a wireless signal strength or audio of a portable device, which can be used to identify a particular pressure pad activation.
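A hedged sketch of this refinement is given below: an approximate position (for example derived from wireless signal strength) is matched to the nearest currently-activated pressure pad, whose known coordinates give the refined location. The pad identifiers and coordinates are illustrative assumptions.

```python
import math

def refine_location(approx_xy, active_pads):
    """active_pads maps pad id -> (x, y) for pads currently pressed."""
    if not active_pads:
        return approx_xy  # no activation to pin the estimate to
    _, pad_xy = min(active_pads.items(),
                    key=lambda item: math.dist(approx_xy, item[1]))
    return pad_xy

# Example: the wireless estimate (3.2, 7.9) is resolved to the pad at (3.0, 8.0).
refined = refine_location((3.2, 7.9), {"pad_12": (3.0, 8.0), "pad_13": (6.5, 1.0)})
```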
Other (non-limiting) examples of sensors that could be utilised in an embodiment with a fixed sensing device 210 include wireless signal scanners/receivers, temperature gauges, depth sensors, and humidity sensors. Any sensors that are operable to capture information about an environment or elements/activity within that environment may be considered where corresponding information is desired.
It should be noted that in some cases multiple sensors may be embodied in a single unit in the environment; for instance, a device may comprise a camera and a microphone to capture both images and audio of the environment. In other words, it is not necessary that each of the sensors is provided separately within the environment.
Further to this, it is considered that it may be advantageous for some sensors within the environment to be able to move within the environment rather than being fixed in place. For instance, a camera may be mounted on a rail that enables it to be moved along a fixed path so as to capture images from a number of different viewpoints. The movement within the environment may be performed in an automated or otherwise scheduled manner, or may be controlled by the sensing device in dependence upon one or more properties of the environment or the data captured by the sensor. For instance, the movement may be performed in response to the movement of objects or users in the environment, or the identification of a blind spot (in the case of a camera) or otherwise poor captured data. Alternatively, or in addition, one or more sensors may be associated with a mobile unit such as a drone which can be moved freely within an environment (that is, without the constraints of a rail or predetermined track).
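One way such data-driven movement could be scheduled is sketched below, under assumed names: candidate positions along the rail are scored by how many tracked targets each can currently see, and the camera is directed to the best-scoring one. The visibility test is assumed to be supplied by the caller.

```python
def best_rail_position(rail_positions, targets, visible):
    """visible(position, target) -> bool; return the position seeing most targets."""
    return max(rail_positions,
               key=lambda position: sum(visible(position, t) for t in targets))
```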
Figure 3 schematically illustrates an embodiment in which the sensing device is not fixed at a particular location in the environment. One advantage of such an arrangement is that of portability; being able to use a sensing unit wherever the user/portable device is located means that the same experience can be provided regardless of location. It is also considered that improved data may be obtained about the environment, as the sensing device (and associated sensors) may be moved freely within the environment to take advantage of multiple viewpoints.
The environment 300 is shown comprising a pair of sensing devices 310; these are examples of the sensing device 120 discussed with reference to Figure 1. These sensing devices 310 may both be provided in some embodiments, while in others they may be considered to be alternatives to one another. Of course, other types of sensing device may be considered appropriate; the form is not limited to the examples shown here.
The first of these sensing devices 310 is embodied by a drone comprising one or more sensors operable to capture information about the environment 300 and/or elements (such as the user) within the environment 300. Here, a drone may be considered to be representative of any device that is operable to fly about an environment. Such a form may be advantageous in that the higher altitude (relative to the portable device 320) may enable the capture of improved data; for instance, some user tracking processes may be enhanced with the use of bird's-eye view images. A further advantage may be that of the sensing device 310 being able to be located away from the user so as to avoid accidental collisions or the like; this may be a particular concern if the portable device 320 is an HMD.
The second of these sensing devices 310 is a buggy-style device; in other words, a ground-based device which can move about the environment 300 on wheels (for example). Such a form may be advantageous in that the weight-based restrictions of a flying device are significantly mitigated; this can enable the provision of a greater number (and/or higher quality) of sensors, as well as larger batteries and/or more powerful processing elements. This can result in an increase in the amount of data that is able to be collected by the sensing device 310, as well as an increase in the amount of processing that can be performed without introducing a significant latency.
In addition to the sensing devices 310, an optional additional sensor device 330 is shown in the example of Figure 3; while only one is shown, it is considered that multiple such devices may be provided in some implementations of the present disclosure. This device 330 is considered to be subordinate to the sensing devices 310 and is provided so as to enable more sensor data to be captured; this may mean the use of different sensors between the devices and/or the use of the same sensors but with an additional viewpoint. The sensor device 330 may be able to omit a number of features of the sensing devices 310 (such as at least some of the processing capabilities) as it is envisaged that the sensor data obtained by the sensor device 330 is output to the sensing device 310 with only minimal (or indeed no) processing being performed. However, in some embodiments the devices 310 and 330 may each be operable to communicate with the portable device 320 independently.
The locations of each of these devices may be determined in any suitable manner. A first example is that of the devices having a predefined location relative to the user or portable device; this can cause the devices to move in response to motion of the user or portable device as appropriate. A second example is that of a device having a predetermined path; this may be defined relative to the user or the environment. A third example is that of controlling the devices in response to information about the collected sensor data; in such an example, the devices may be operable to change their location so as to remedy any deficiencies in the captured sensor data (such as to avoid occlusions) or to otherwise change the data that is collected.
These locations may be dependent upon user preferences, the type of device, the type of portable device, the identification of particular features to be tracked or otherwise monitored, and/or an application being executed by the portable device.
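A minimal sketch of the first example above is given here: a mobile sensing device (such as a drone) holding a predefined offset from the tracked user so that it follows the user's motion. The offset and controller gain are illustrative assumptions rather than disclosed values.

```python
def follow_user(device_pos, user_pos, offset=(0.0, -2.0, 3.0), gain=0.5):
    """Move the device a fraction of the way toward user_pos + offset."""
    goal = tuple(u + o for u, o in zip(user_pos, offset))
    return tuple(d + gain * (g - d) for d, g in zip(device_pos, goal))
```

Calling this once per control tick moves the device smoothly toward its target position; the second and third examples would replace the goal computation with a point on a predetermined path or a position chosen to remedy sensing deficiencies.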
While presented as separate examples here, the arrangements shown in Figures 2 and 3 may be provided in a combined manner so as to obtain advantages of each arrangement. For instance, a fixed sensing device (such as that described with reference to element 210 of Figure 2) may be provided which performs processing, while mobile sensing devices (such as those described with reference to element 310 of Figure 3) may be provided to perform sensing and, optionally, initial processing. For instance, a sensing device 310 may be operable to perform pre-processing of captured images that a sensing device 210 uses to generate a map of the environment. This can reduce the processing burden on the mobile sensing devices without requiring sensing to be done by fixed arrangements; this can allow for increased performance of the sensing devices 310 through increased battery life or reduced weight (due to less need for processing), for example.
In any of the examples described in this disclosure, it is considered that sensing devices are operable to communicate information to the portable devices. The nature and content of this information may be tailored freely to the requirements of a particular embodiment; a number of examples are discussed below, each of which may be considered in any combination.
A first example is that of transmitting the sensor data itself, either without processing applied (such as a stream of images captured by a camera, or audio as captured by a microphone) or with only minimal processing applied. Examples of such processing include applying smoothing to the data, filtering of the data, edge detection in images, or the like. This processing may include any form of pre-processing of the data so as to render it more readily usable by another application. In some embodiments, this processing may include removing identifying information from sensor readings, such as applying a voice modifier to audio or digitally altering a captured image to hide faces or other sensitive information (such as names or number plates on cars).
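The identity-removal step could look like the sketch below, assuming OpenCV and its bundled Haar cascade face detector; any detector could be substituted, and this is not presented as the disclosed implementation.

```python
import cv2

# Assumed detector: OpenCV's bundled frontal-face Haar cascade.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def anonymise_frame(frame):
    """Blur any detected faces in a captured image before it is transmitted."""
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(grey, 1.1, 5):
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(
            frame[y:y + h, x:x + w], (51, 51), 0)
    return frame
```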
A second example is that of transmitting data that is derived from the sensor information; this may include maps of an environment, for example, generated from one or more of captured images, depth sensor readings, and captured audio. Other information may include position and/or orientation information for one or more objects, people, and/or devices within an environment, identification information for one or more users in the environment (based upon facial recognition, for instance), and/or motion data for a person, such as motion capture for providing inputs to a device.
A third example is that of transmitting information about events within the environment on a conditional basis. This may be the result of a monitoring process that is based upon information obtained by the sensors. Examples of such information include warnings about collisions between users and/or devices, or notifications about events such as a new user or device entering the environment. Warnings or information may be generated based upon information from any sensors, and not just based upon determined motion; for instance, if the temperature in an environment gets too high then information may be transmitted encouraging users to seek shade or stay hydrated. Such information may improve user awareness of the environment and/or user safety, for example.
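An illustrative sketch of such conditional monitoring follows: each check inspects the latest sensor-derived state and yields a notification only when its condition is met. The thresholds and field names are assumptions for illustration.

```python
import math

def monitor(state, min_separation_m=1.0, max_temp_c=35.0):
    """Return warnings derived from the current sensor-derived state."""
    warnings = []
    positions = state.get("user_positions", {})  # {user_id: (x, y)}
    users = list(positions)
    for i, a in enumerate(users):
        for b in users[i + 1:]:
            if math.dist(positions[a], positions[b]) < min_separation_m:
                warnings.append(f"collision warning: {a} is close to {b}")
    if state.get("temperature_c", 0.0) > max_temp_c:
        warnings.append("high temperature: seek shade and stay hydrated")
    return warnings
```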
By performing processing at the sensing device, the processing burden upon a portable device can be reduced. This effect may be amplified in an environment in which multiple portable devices make use of the same data; by performing this processing once at a centralised location (the sensing device), the need for each portable device to perform the processing individually can be avoided. Of course, it is not considered essential that the processing is performed by the sensing device, and in some cases it may be performed at the portable device or an additional device (such as a remote server or another processing device within an environment); this can reduce the complexity of the sensing device, which may be particularly useful for non-fixed sensing devices such as those discussed with reference to Figure 3.
Information may be transmitted selectively, rather than all of the information obtained/generated by the sensing device being passed on to each portable device. One example of this is providing data in response to a request from a specific portable device, the request indicating the information to be transmitted.
Alternatively, the information that is transmitted may be determined in dependence upon the portable device and/or an application being executed by that device. For instance, motion data for a user may only be transmitted to the corresponding portable device in the case that an application being executed would use this as an input to control its execution. Similarly, information about the location of a portable device may be omitted if it is determined that the portable device has sufficient capabilities to determine its own position. Information may also be selectively transmitted on the basis of information about the data already stored on a device; this can prevent map data being repeatedly transmitted, for instance.
Information may also be transmitted in dependence upon the location of a device and/or user. This may be in the context of a collision warning, for instance, or the transmission of any other data that may have a spatial consideration. For example, audio may be transmitted to a portable device that is located over a threshold distance from the microphone capturing that audio, as it may be considered that the user of the device would not be able to clearly hear audio captured by that microphone due to the distance.
Similarly, the information that is transmitted may instead (or also) be determined in dependence upon the identity of the user of the portable device and/or their preferences. For instance, this can be based upon the permissions of a user; the average user may not have access to image information in the environment, to protect the privacy of other users, and instead may only be permitted to access non-sensitive data such as collision warnings, map information, or their own motion data. Users may also opt not to receive data so as to reduce the amount of data being transmitted; this can preserve the battery life of the portable device, for example.
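The several selective-transmission conditions above could be combined into a single policy, as in the sketch below; the recipient record, permission names, and data topics are illustrative assumptions rather than part of the claimed system.

```python
from dataclasses import dataclass, field

@dataclass
class Recipient:
    permissions: set = field(default_factory=set)  # e.g. {"images"}
    app_inputs: set = field(default_factory=set)   # topics the running app consumes
    opted_out: set = field(default_factory=set)    # user chose not to receive these
    can_self_localise: bool = False

def should_transmit(topic: str, recipient: Recipient) -> bool:
    if topic in recipient.opted_out:
        return False                                  # user preference
    if topic == "environment_images":
        return "images" in recipient.permissions      # privacy-gated data
    if topic == "motion_input":
        return topic in recipient.app_inputs          # only if the app uses it
    if topic == "device_location":
        return not recipient.can_self_localise        # redundant otherwise
    return True                                       # e.g. collision warnings
```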
Figure 4 schematically illustrates a system for generating data for use by one or more portable processing devices, located in a real-world environment, that may be associated with the system. The system comprises one or more sensors 400, a sensing device 410, and may include one or more portable processing devices 420. In some embodiments, it is considered that more than one sensing device 410 may be included in the system.
In some embodiments, one or more sensors 400 and the sensing device 410 are embodied in a single device that is configured to move about the real-world environment; this single device may be a drone, for instance, or any other mobile device such as a buggy or robot as discussed above. The single device may be configured to move with a predetermined position or positions relative to a portable processing device and/or a user of such a device; this may include maintaining a fixed relative location, for instance, or the performing of a predetermined motion pattern centred about a device or user location.
Alternatively, or in addition, the sensing device 410 and at least one of the one or more sensors 400 are provided with a fixed location within the real-world environment. In such an embodiment, others of the one or more sensors 400 may be configured to be moved about the real-world environment.
The one or more sensors 400 are configured to capture information about the real-world environment and/or one or more elements in the real-world environment, the sensors 400 being remote to the one or more portable processing devices 420. These sensors may comprise one or more sensors selected from the list of: cameras, microphones, heat sensors, positioning sensors, movement sensors, orientation sensors, and wireless signal receivers. These may be embodied in any number of different units; each sensor may be included in a different hardware unit, and a single hardware unit may include a number of sensors. These sensors may be configured to transmit their output to a corresponding sensing device 410 in real-time in a number of embodiments.
The one or more sensing devices 410 are configured to generate data in dependence upon the information captured by the sensors 400, and to transmit this data to one or more of the portable processing devices 420. As noted above, sensing devices 410 and sensors 400 may be provided in an integrated fashion such that they are present in the same hardware unit. The sensing devices 410 may be operable to transmit data to one or more portable processing devices 420 in real time as the data is generated; for instance, to provide live mapping and/or tracking information relating to an environment. The transmitted data may comprise the whole of the generated data, or information representing differences between previously transmitted and newly generated data; for instance, a position delta for a tracked element, a temperature change (rather than a new temperature reading), or additional map information (for a previously unmapped portion of the environment, or to update an earlier mapping) rather than an entire map.
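A hedged sketch of this difference-based transmission is shown below: only the entries that changed since the last transmission are sent, rather than the whole of the generated data. The field names are illustrative assumptions.

```python
def delta_update(previous: dict, current: dict) -> dict:
    """Return only the entries of current that differ from previous."""
    return {key: value for key, value in current.items()
            if previous.get(key) != value}

last_sent = {"position": (1.0, 2.0), "temperature_c": 21.5}
latest = {"position": (1.2, 2.0), "temperature_c": 21.5}
payload = delta_update(last_sent, latest)  # {"position": (1.2, 2.0)}
```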
In some embodiments, a sensing device 410 is configured to transmit data to portable processing devices in dependence upon the identity of the user of a respective portable processing device; this can be implemented by a user being associated with a particular device, for instance, or an identification of a user profile that is loaded on the device. Similarly, a sensing device 410 may be configured to transmit data to portable processing devices in dependence upon the type of portable processing device and/or an application being executed by that portable processing device.
In some embodiments, sensing devices 410 may be configured to perform one or more of smoothing, de-noising, modification, feature detection, and/or identification processes on the information captured by the sensors 400. This data may be output to the portable processing devices 420 instead of (or in addition to) raw sensor data.
In some embodiments, the generated data (that is, the data generated by a sensing device 410) comprises one or more of a map of the environment, position tracking information for one or more users and/or one or more of the portable processing devices 420, gesture information for a user, and/or notifications for a user. These may be used as inputs for an application that is being executed by a portable processing device 420 that receives the generated data, or to otherwise generate an output to be presented to a user of such a device.
The system may include one or more portable processing devices 420. The portable processing devices 420 may include one or more mobile phones, portable games consoles, and head-mountable display devices in any combination. These may each be associated with one or more sensing devices 410, and in some cases multiple ones of the portable processing devices 420 may be associated with the same sensing device (or devices) 410.
The arrangement of Figure 4 is an example of a processor (for example, a GPU and/or CPU located in a games console or any other computing device) that is operable to generate data for use by one or more portable processing devices located in a real-world environment, and in particular is operable to: capture, using one or more sensors, information about the real-world environment and/or one or more elements in the real-world environment, the sensors being remote to the one or more portable processing devices; and generate data in dependence upon the information captured by the sensors, and to transmit this data to one or more of the portable processing devices.
Figure 5 schematically illustrates a method for generating data for use by one or more portable processing devices located in a real-world environment. This method may be executed in accordance with any one or more of the embodiments described above, such as those discussed with reference to Figures 2 and 3.
A step 500 comprises capturing information, using one or more sensors, about the real-world environment and/or one or more elements in the real-world environment, the sensors being remote to the one or more portable processing devices.
A step 510 comprises generating data in dependence upon the information captured by the sensors. A step 520 comprises transmitting this data to one or more of the portable processing devices.
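A minimal end-to-end sketch of this method (steps 500 to 520) follows, assuming simple callable stand-ins for the sensors, the data-generation step, and the transmission link; all of these names are illustrative rather than part of the disclosure.

```python
def run_once(sensors, generate, transmit, devices):
    readings = [sensor.capture() for sensor in sensors]  # step 500: capture
    data = generate(readings)                            # step 510: generate
    for device in devices:                               # step 520: transmit
        transmit(device, data)
```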
The techniques described above may be implemented in hardware, software or combinations of the two. In the case that a software-controlled data processing apparatus is employed to implement one or more features of the embodiments, it will be appreciated that such software, and a storage or transmission medium such as a non-transitory machine-readable storage medium by which such software is provided, are also considered as embodiments of the disclosure.
Thus, the foregoing discussion discloses and describes merely exemplary embodiments of the present invention. As will be understood by those skilled in the art, the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Accordingly, the disclosure of the present invention is intended to be illustrative, but not limiting of the scope of the invention, as well as other claims. The disclosure, including any readily discernible variants of the teachings herein, defines, in part, the scope of the foregoing claim terminology such that no inventive subject matter is dedicated to the public.

Claims (15)

CLAIMS
  1. A system for generating data for use by one or more portable processing devices located in a real-world environment, the system comprising: one or more sensors configured to capture information about the real-world environment and/or one or more elements in the real-world environment, the sensors being remote to the one or more portable processing devices; and a sensing device configured to generate data in dependence upon the information captured by the sensors, and to transmit this data to one or more of the portable processing devices.
  2. A system according to claim 1, wherein one or more sensors and the sensing device are embodied in a single device that is configured to move about the real-world environment.
  3. A system according to claim 2, wherein the single device is a drone.
  4. A system according to claim 2, wherein the single device is configured to move with a predetermined position or positions relative to a portable processing device and/or a user of such a device.
  5. A system according to any preceding claim, wherein the sensing device and at least one of the one or more sensors are provided with a fixed location within the real-world environment.
  6. A system according to claim 5, wherein others of the one or more sensors are configured to be moved about the real-world environment.
  7. A system according to any preceding claim, wherein the one or more sensors comprise one or more sensors selected from the list of: cameras, microphones, heat sensors, positioning sensors, movement sensors, orientation sensors, and wireless signal receivers.
  8. A system according to any preceding claim, wherein the sensing device is configured to transmit data to portable processing devices in dependence upon the identity of the user of a respective portable processing device.
  9. A system according to any preceding claim, wherein the sensing device is configured to transmit data to portable processing devices in dependence upon the type of portable processing device and/or an application being executed by that portable processing device.
  10. A system according to any preceding claim, wherein the sensing device is configured to perform one or more of smoothing, de-noising, modification, feature detection, and/or identification processes on the information captured by the sensors.
  11. A system according to any preceding claim, wherein the generated data comprises one or more of a map of the environment, position tracking information for one or more users and/or one or more of the portable processing devices, gesture information for a user, and/or notifications for a user.
  12. A system according to any preceding claim comprising one or more portable processing devices, wherein the portable processing devices include one or more mobile phones, portable games consoles, and head-mountable display devices.
  13. A method for generating data for use by one or more portable processing devices located in a real-world environment, the method comprising: capturing information, using one or more sensors, about the real-world environment and/or one or more elements in the real-world environment, the sensors being remote to the one or more portable processing devices; generating data in dependence upon the information captured by the sensors; and transmitting this data to one or more of the portable processing devices.
  14. Computer software which, when executed by a computer, causes the computer to carry out the method of claim 13.
  15. A non-transitory machine-readable storage medium which stores computer software according to claim 14.
GB2206947.0A 2022-05-12 2022-05-12 Auxiliary sensing system and method Pending GB2618596A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2206947.0A GB2618596A (en) 2022-05-12 2022-05-12 Auxiliary sensing system and method


Publications (2)

Publication Number Publication Date
GB202206947D0 GB202206947D0 (en) 2022-06-29
GB2618596A true GB2618596A (en) 2023-11-15

Family

ID=82156219

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2206947.0A Pending GB2618596A (en) 2022-05-12 2022-05-12 Auxiliary sensing system and method

Country Status (1)

Country Link
GB (1) GB2618596A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016135447A1 (en) * 2015-02-25 2016-09-01 Bae Systems Plc Information system and method
WO2017059215A1 (en) * 2015-10-01 2017-04-06 Mc10, Inc. Method and system for interacting with a virtual environment
US20170309088A1 (en) * 2016-04-20 2017-10-26 Gopro, Inc. Data Logging in Aerial Platform
US10417497B1 (en) * 2018-11-09 2019-09-17 Qwake Technologies Cognitive load reducing platform for first responders
US20190378476A1 (en) * 2019-08-08 2019-12-12 Lg Electronics Inc. Multimedia device and method for controlling the same
EP3629309A2 (en) * 2018-09-28 2020-04-01 Quoc Luong Drone real-time interactive communications system
US20210392188A1 (en) * 2018-10-17 2021-12-16 Thk Co., Ltd. Information processing method, and program
US20220004328A1 (en) * 2020-07-01 2022-01-06 Facebook Technologies, Llc Hierarchical power management of memory for artificial reality systems
US20220100261A1 (en) * 2020-09-28 2022-03-31 International Business Machines Corporation Contextual spectator inclusion in a virtual reality experience


Also Published As

Publication number Publication date
GB202206947D0 (en) 2022-06-29

Similar Documents

Publication Publication Date Title
EP3293723A1 (en) Method, storage medium, and electronic device for displaying images
KR102582863B1 (en) Electronic device and method for recognizing user gestures based on user intention
KR102481486B1 (en) Method and apparatus for providing audio
US20180232955A1 (en) Electronic device and method for transmitting and receiving image data in electronic device
KR20170097519A (en) Voice processing method and device
US11472038B2 (en) Multi-device robot control
JP2015532028A (en) Headset computer with hands-free emergency response
CN111354434A (en) Electronic device and method for providing information
EP3633497A1 (en) Information processing apparatus, information processing method, and program
KR20180098079A (en) Vision-based object recognition device and method for controlling thereof
CN112451968A (en) Game sound control method, mobile terminal and computer-readable storage medium
WO2021013043A1 (en) Interactive method and apparatus in virtual reality scene
CN112956209A (en) Acoustic zoom
US20140055339A1 (en) Adaptive visual output based on motion compensation of a mobile device
KR102580837B1 (en) Electronic device and method for controlling external electronic device based on use pattern information corresponding to user
CN108063869B (en) Safety early warning method and mobile terminal
CN107913519B (en) Rendering method of 2D game and mobile terminal
CN108307031B (en) Screen processing method, device and storage medium
GB2618596A (en) Auxiliary sensing system and method
CN112947474A (en) Method and device for adjusting transverse control parameters of automatic driving vehicle
US11363189B2 (en) Apparatus and method for recognizing voice and face on basis of change in camera driving method
CN108846817B (en) Image processing method and device and mobile terminal
KR20210085696A (en) Method for determining movement of electronic device and electronic device using same
JP6171046B1 (en) Electronic device, control method, and control program
KR102251076B1 (en) Method to estimate blueprint using indoor image