GB2605390A - Apparatus, method and computer program for use in telepresence


Info

Publication number
GB2605390A
GB2605390A (application GB2104502.6A / GB202104502A)
Authority
GB
United Kingdom
Prior art keywords
pointing
sensing means
sensing
relative
mounting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2104502.6A
Other versions
GB202104502D0 (en)
Inventor
Rugg Gordon
Louise Martin Amy
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
A G Telehelmets LLP
Original Assignee
A G Telehelmets LLP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by A G Telehelmets LLP filed Critical A G Telehelmets LLP
Priority to GB2104502.6A
Publication of GB202104502D0
Publication of GB2605390A
Legal status: Pending

Classifications

    • H04N 7/147: Systems for two-way working between two video terminals, e.g. videophone; communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/0346: Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F 3/0386: Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry, for light pen
    • H04N 23/631: Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N 23/695: Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H04N 7/15: Conference systems
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/183: Closed-circuit television [CCTV] systems for receiving images from a single remote source
    • G06F 2203/0383: Remote input, i.e. interface arrangements in which the signals generated by a pointing device are transmitted to a PC at a remote location, e.g. to a PC in a LAN

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

Apparatus 107 comprising pointing means, such as a laser pointer 101', and first 102' and second 103' sensing means, wherein movement of the pointing means and at least one of the sensing means is remotely controllable. Preferably the sensing means comprises at least one of: a sensor, an image capture device, an audio capture device, and a temperature measurement device. Preferably the pointing means is mounted on a first mounting means 104' and the moveable sensor is mounted on a second mounting means, and the first and second mounting means are independently controllable such that the movement of the pointing means and the moveable sensor are independently adjustable. The apparatus further comprises communication means 105 to send the first and second sensor data to a remote device and to receive information for controlling movement of the pointing means and the moveable sensing means. The pointer may be a virtual dynamic crosshair (FIG. 4B).

Description

APPARATUS, METHOD AND COMPUTER PROGRAM FOR USE IN TELEPRESENCE
TECHNOLOGICAL FIELD
Examples of the present disclosure relate to an apparatus, method and computer program for use in telepresence.
BACKGROUND
Conventional telepresence devices (not least for example telecommunication devices such as video conferencing equipment for enabling real time audio/visual communication between remote users, as well as telemetry devices for enabling transmission of sensor data to a remote device/user) are not always optimal. It can be difficult for a remote user to indicate to a local user objects or regions of interest in the local user's local environment. In some circumstances, it may be desirable to provide an improved telepresence device that can enhance the ability of a remote user to interact and communicate with a local user and the local environment of the local user and provide improved telepresence functionality.
The listing or discussion of any prior-published document or any background in this specification should not necessarily be taken as an acknowledgement that the document or background is part of the state of the art or is common general knowledge. One or more aspects/examples of the present disclosure may or may not address one or more of the background issues.
BRIEF SUMMARY
The scope of protection sought for various embodiments of the present disclosure is set out by the independent claims.
Any examples/embodiments and features described in this specification that do not fall under the scope of the independent claims are to be interpreted as examples useful for understanding various embodiments of the present disclosure.
According to at least some examples of the disclosure there is provided an apparatus comprising: pointing means; and first and second sensing means; wherein movement of the pointing means and at least one of the first and second sensing means is remotely controllable.
According to various, but not necessarily all, examples of the disclosure there is provided a method comprising receiving, from a remote device, information for controlling movement of: a pointing means of an apparatus, and at least one of a first and a second sensing means of the apparatus; and moving the pointing means and the at least one of the first and second sensing means based, at least in part, on the information.
According to various, but not necessarily all, examples of the disclosure there are provided computer program instructions for causing an apparatus to perform: receiving, from a remote device, information for controlling movement of: a pointing means of an apparatus, and at least one of a first and a second sensing means of the apparatus; and moving the pointing means and the at least one of the first and second sensing means based, at least in part, on the information.
According to various, but not necessarily all, examples of the disclosure there is provided an apparatus comprising: at least one processor; and at least one memory including computer program instructions; the at least one memory and the computer program instructions configured to, with the at least one processor, cause the apparatus at least to perform: receiving, from a remote device, information for controlling movement of: a pointing means of an apparatus, and at least one of a first and a second sensing means of the apparatus; and moving the pointing means and the at least one of the first and second sensing means based, at least in part, on the information.
According to various, but not necessarily all, examples of the disclosure there is provided a non-transitory computer readable medium encoded with instructions that, when performed by at least one processor, cause at least the following to be performed: receiving, from a remote device, information for controlling movement of a pointing means of an apparatus, and at least one of a first and a second sensing means of the apparatus; and moving the pointing means and the at least one of the first and second sensing means based, at least in part, on the information.
According to various, but not necessarily all, examples of the disclosure there is provided a module, device and/or system comprising the above-mentioned apparatus.
The following portion of this 'Brief Summary' section describes various features that can be features of any of the examples described in the foregoing portion of the 'Brief Summary' section.
In some but not necessarily all examples, the pointing means is configured for indicating an object or position in an environment, and wherein the pointing means is movably mounted to the apparatus such that the pointing means is movable relative to the apparatus.
In some but not necessarily all examples, the first and second sensing means are configured for sensing the environment, and at least one of the first and second sensing means is movably mounted to the apparatus such that the at least one of the first and second sensing means is movable relative to the apparatus.
In some but not necessarily all examples, the apparatus further comprises communication means for receiving, from a remote device, information for controlling movement of the pointing means and the at least one of the first and second sensing means relative to the apparatus.
In some but not necessarily all examples, the apparatus further comprises means for moving the pointing means and the at least one of the first and second sensing means relative to the apparatus based, at least in part, on the information.
In some but not necessarily all examples, the pointing means comprises at least one selected from the group of: a pointer device, an illumination device, a laser pointer, and means for generating a visual indication for indicating an object, a location and/or a position in a captured image of a real scene.
In some but not necessarily all examples, the first and/or second sensing means comprises at least one selected from the group of: a sensor, an image capture device, an audio capture device, and a temperature measurement device. In some but not necessarily all examples, the first and/or second sensing means comprises a directional sensor.
In some but not necessarily all examples, the first sensing means is configured to generate first sensor data.
In some but not necessarily all examples, the second sensing means is configured to generate second sensor data. In some but not necessarily all examples, the apparatus comprises a communication means configured to send the first and second sensor data to the remote device.
In some but not necessarily all examples, the pointing means and the at least one of the first and second sensing means are rotatably mounted to the apparatus such that the pointing means and the at least one of the first and second sensing means are rotatable relative to the apparatus so as to adjust: a direction and/or orientation of the pointing means relative to the apparatus, and a direction and/or orientation of the at least one of the first and second sensing means relative to the apparatus.
In some but not necessarily all examples, the apparatus further comprises at least one mounting means for moveably mounting the pointing means and at least one of the first and second sensing means relative to the apparatus, wherein the at least one mounting means is controllable by a remote device so as to move the pointing means and the at least one of the first and second sensing means relative to the apparatus.
In some but not necessarily all examples, the apparatus further comprises a first mounting means for mounting the pointing means and the at least one of the first and second sensing means to the apparatus.
In some but not necessarily all examples, the apparatus further comprises: a first mounting means for mounting the pointing means to the apparatus; and at least a second mounting means for mounting the at least one of the first and second sensing means to the apparatus.
In some but not necessarily all examples, the first mounting means and the at least second mounting means are independently controllable such that movement of the pointing means relative to the apparatus and movement of the at least one of the first and second sensing means relative to the apparatus are independently adjustable.
In some but not necessarily all examples, the apparatus further comprises one or more further sensing means, wherein the one or more further sensing means are attached to the apparatus such that movement of the one or more further sensing means relative to the apparatus is prevented.
In some but not necessarily all examples, there is provided a device or module comprising the apparatus as set out above.
In some but not necessarily all examples, there is provided a device comprising the apparatus as set out above, wherein the device is at least one selected from the group of: a portable device, a wearable device, a handheld device, a wireless communications device, a telepresence device, a telecommunications device, and a user equipment device. In some but not necessarily all examples, there is provided a head mountable device comprising the apparatus as set out above.
In some but not necessarily all examples, there is provided a remotely controllable vehicle comprising the apparatus as set out above.
While the above examples of the disclosure and optional features are described separately, it is to be understood that their provision in all possible combinations and permutations is contained within the disclosure. Also, it is to be understood that various examples of the disclosure can comprise any or all of the features described in respect of other examples of the disclosure, and vice versa.
According to various, but not necessarily all, examples of the disclosure there are provided examples as claimed in the appended claims.
BRIEF DESCRIPTION OF THE DRAWINGS
For a better understanding of various examples of the present disclosure that are useful for understanding the detailed description and certain examples of the present disclosure, reference will now be made by way of example only to the accompanying drawings in which:
FIG. 1 shows a schematic example of an apparatus according to an example of the present disclosure;
FIG. 2A shows a schematic example of another apparatus according to an example of the present disclosure;
FIG. 2B shows a schematic example of the apparatus of FIG. 2A in an adjusted configuration;
FIG. 3 shows a schematic example of another apparatus according to the present disclosure;
FIG. 4A shows an example of use of an apparatus according to the present disclosure;
FIG. 4B shows an example of use of an apparatus according to the present disclosure;
FIG. 5 shows an example of a method according to the present disclosure;
FIG. 6 shows an example of another apparatus according to the present disclosure; and
FIG. 7 shows a schematic example of a computer program according to the present disclosure.
The figures are not necessarily to scale. Certain features and views of the figures can be shown schematically or exaggerated in scale in the interest of clarity and conciseness. For example, the dimensions of some elements in the figures can be exaggerated relative to other elements to aid explication. Similar reference numerals are used in the figures to designate similar features. For clarity, all reference numerals are not necessarily displayed in all figures.
DETAILED DESCRIPTION
FIG. 1 schematically illustrates an example apparatus 100 comprising: pointing means 101; and first 102 and second 103 sensing means; wherein movement of the pointing means and at least one of the first and second sensing means is remotely controllable.
In FIG. 1, movement of the pointing means and the at least one of the first and second sensing means is schematically illustrated via the dashed pointing means 101' and first and second sensing means 102' and 103'. In the example shown, the remotely controllable movement comprises an adjustment of the direction/orientation of the pointing means and the at least one of the first and second sensing means, i.e. with respect to the apparatus, such as with respect to a main body or housing of the apparatus.
FIG. 2A schematically illustrates a further example apparatus 200 according to the present disclosure. The apparatus 200 comprises pointing means 101 for indicating an object, location or position in a local environment. As used herein, a "pointing means" can be any suitable device or arrangement for pointing out/indicating (e.g., to one or more local users physically present in the local environment or one or more remote users) an object or a position in the local environment, such as a scene of a space. In some examples, the pointing means points out/indicates the object, location or position directly in a real scene of a real space, i.e., such that the pointing out/indication occurs directly in the physical world. In some examples, the pointing means additionally or alternatively points out/indicates the object, location or position indirectly via a captured image and/or video of the real scene (such as discussed below with respect to FIG. 4B).
The pointing means can, for example, be a pointer device, an illumination device, a laser pointer or an equivalent structure. The pointing means 101 is movably mounted to the apparatus 200 such that the pointing means 101 is movable relative to the apparatus 200.
The apparatus further comprises first and second sensing means 102, 103 for sensing the local environment. As used herein, the term "sensing means" refers to any suitable device for sensing a stimulus in the environment. It can, for example, be a sensor, an image capture device (not least such as a charge coupled device (CCD) sensor array), an audio capture device (not least such as a microphone) or a temperature measurement device (not least such as an infrared thermometer). The sensing means may be directional so as to provide a directional sensor that is configured to capture sensor measurements in a particular capture direction/area. In this regard, the sensitivity/responsivity of the sensor can be optimised/biased in a capture direction. Such a directional sensor may be, not least for example, a unidirectional microphone.
The first and second sensing means 102, 103 are movably mounted to the apparatus such that the first and second sensing means 102, 103 are movable relative to the apparatus 200.
In the example of FIG. 2A, each of the pointing means as well as both the first and second sensing means are mounted to the apparatus via mounting means 104, which movably mounts the pointing means as well as the first and second sensing means to the apparatus such that the pointing means and the first and second sensing means can move relative to the apparatus. This allows the position, pose, orientation and/or direction of the pointing means and the first and second sensing means to be altered/adjusted relative to the apparatus 200 and/or a main body or housing 107 of the apparatus 200. For example, the mounting means 104 (and the pointing means 101 as well as the first and second sensing means 102, 103 mounted thereon) can rotate about an axis 108 aligned along a z direction (coming out of the page) perpendicular to the x and y directions shown in FIG. 2A.
The apparatus 200 further comprises communication means 105 configured to receive, from a remote device (and a remote user thereof, not shown), information for controlling movement of the pointing means and the first and second sensing means relative to the apparatus. As used herein, the term "communication means" may be any suitable means or device for receiving information from a remote device. It can, for example, be a receiver or a transceiver or the like, along with means such as controlling circuitry for controlling the same. The communication means may comprise a wireless communication interface, or a wired communication interface, for communicating with the remote device via a communication network. Such a communication network may comprise a wireless communication network, not least for example a wireless cellular communication network such as a 3G, 4G or 5G cellular communication network or the like. The communication network may also comprise a wireless local area network (WLAN).
The apparatus further comprises means 106 for moving the pointing means and the first and second sensing means relative to the apparatus based, at least in part, on the received information. As used herein, the means for moving 106 may be any device or mechanism for moving the pointing means and the at least one of the first and second sensing means relative to the apparatus. Such means may comprise a driving mechanism, actuator, motor or the like configured to move the pointing means and the at least one of the first and second sensing means relative to the apparatus. Such means for moving may be controlled based, at least in part, on the information received from a remote device for controlling movement of the pointing means and the at least one of the first and second sensing means. In the example illustrated, the means for moving 106 is a separate component/device from the mounting means 104. However, in other examples (e.g., as in FIG. 3), the means for moving may be comprised in or integrated with the mounting means.
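By way of a concrete, non-limiting illustration of the control flow just described (receiving movement information from a remote device and driving the means for moving accordingly), the following Python sketch shows one possible shape of such a handler. The PanTiltMount class, the message fields and the angle limits are all assumptions made for illustration; they are not part of the disclosure.

```python
# Illustrative sketch only: applying remotely received movement information to
# a hypothetical pan/tilt mount. The class name, message fields and angle
# limits are assumptions for illustration, not part of the disclosure.
import json


class PanTiltMount:
    """Stand-in for the mounting means 104 driven by the means for moving 106."""

    PAN_LIMITS = (-90.0, 90.0)   # degrees; assumed mechanical limits
    TILT_LIMITS = (-45.0, 45.0)

    def __init__(self):
        self.pan = 0.0   # current pan angle relative to the apparatus housing
        self.tilt = 0.0  # current tilt angle relative to the apparatus housing

    def move_to(self, pan: float, tilt: float) -> None:
        # Clamp commanded angles to the assumed mechanical limits; a real
        # implementation would command the actuator/motor hardware here.
        self.pan = max(self.PAN_LIMITS[0], min(self.PAN_LIMITS[1], pan))
        self.tilt = max(self.TILT_LIMITS[0], min(self.TILT_LIMITS[1], tilt))


def handle_control_message(mount: PanTiltMount, raw: bytes) -> None:
    """Apply one movement-control message received from the remote device."""
    msg = json.loads(raw)
    # Relative adjustment: the remote user nudges the current orientation.
    mount.move_to(mount.pan + msg.get("pan_delta", 0.0),
                  mount.tilt + msg.get("tilt_delta", 0.0))


mount = PanTiltMount()
handle_control_message(mount, b'{"pan_delta": 10.0, "tilt_delta": -5.0}')
print(mount.pan, mount.tilt)  # 10.0 -5.0
```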
FIG. 2A schematically illustrates the apparatus as a block diagram for effecting the functionality and performing the methods described in the present disclosure. The component blocks of FIG. 2A are functional and the functions described can be performed by the above described means under the control of a controller 11.
In the example shown, the pointing means and the first and second sensors are rotatably mounted to the apparatus such that they are rotatable relative to the apparatus so as to adjust: a direction and/or orientation of the laser pointer relative to the apparatus, and a direction and/or orientation of the first and second sensors relative to the apparatus.
In some examples, not all of the sensing means are movably mounted relative to the apparatus. In some examples, the pointing means and at least one of the first and second sensing means are configured to move relative to the apparatus.
Examples of the present disclosure may enable the remote control of the movement of the pointing means and at least one of the first and second sensing means relative to the apparatus such that the position, orientation and/or direction of the pointing means and the at least one of the first and second sensing means may be altered/adjusted relative to the apparatus 200. In this regard, the direction of the pointing means and the at least one of the first and second sensing means may be adjusted relative to a main body or housing 107 of the apparatus 200.
As used hereinafter, the term "pointing means" shall be referred to as a laser pointer, and the first and second sensing means shall be referred to as first and second sensors. It is to be appreciated that the term "laser pointer" can be used interchangeably with the term "pointing means", and that likewise the term "sensor" can be used interchangeably with the term "sensing means".
The mounting means 104, such as an adjustable mount mechanism, is provided for movably mounting the laser pointer and the first and second sensors relative to the apparatus. The mounting means and movement thereof are controllable by a remote device so as to move the laser pointer and the first and second sensors relative to the apparatus. In some examples, such as those shown in FIGS. 1 and 2, a single common mounting means is provided for the laser pointer and the first and second sensors.
However, in other examples, the pointing means and first and second sensors may be provided with their own mounting means thereby enabling movement of the laser pointer as well as the first and second sensors relative to the apparatus to be independently adjustable. In such a manner, plural mounts may be provided to enable independent movement of the laser pointer and one or more sensors, rather than having a common/single mount for common movement of the laser pointer and one or more sensors.
The mount may be any means or mechanism via which the laser pointer and sensors can be secured to the apparatus whilst enabling the laser pointer and sensors to move relative to the apparatus so as to adjust their position (e.g., orientation/direction) relative to the apparatus, not least so as to enable one or more of: pan, tilt, yaw, pitch and roll movement.
In use, the laser pointer is configured to generate a beam 101a for indicating an object or a position in a local environment, i.e., proximal to/in the vicinity of the apparatus, such as within range of the beam of the laser pointer. The first and second sensors are configured to sense the environment; the first sensor may be configured to generate first sensor data whilst the second sensor may be configured to generate second sensor data. The communication means may be further configured to transmit the first and second sensor data to a remote device. In such a manner, the apparatus may, in effect, function as a telemetry device to sense, detect, measure and/or capture sensor data at a local environment and transmit such sensor data to a remote device.
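As a rough sketch of how the first and second sensor data might be packaged for transmission to the remote device, consider the Python framing below. The use of JSON, the field names and the length-prefixed layout are assumptions for illustration only; the disclosure does not prescribe any particular telemetry format.

```python
# Illustrative sketch of one possible telemetry framing: a small JSON header
# followed by the raw first and second sensor payloads. All field names are
# assumptions; the disclosure does not define a wire format.
import json
import time


def build_telemetry_message(frame_id: int, image_bytes: bytes,
                            audio_bytes: bytes) -> bytes:
    header = {
        "timestamp": time.time(),
        "frame_id": frame_id,
        "image_len": len(image_bytes),  # first sensor data (e.g. camera frame)
        "audio_len": len(audio_bytes),  # second sensor data (e.g. audio chunk)
    }
    header_bytes = json.dumps(header).encode()
    # 4-byte big-endian header length, then the header, then the two payloads.
    return (len(header_bytes).to_bytes(4, "big")
            + header_bytes + image_bytes + audio_bytes)


message = build_telemetry_message(1, b"<jpeg bytes>", b"<pcm bytes>")
print(len(message))
```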
In some examples, the first sensor data may comprise image data such as images, video or audio/video that are captured and transmitted substantially in real time such that a live stream may be received at the remote device for presenting to a remote user.
The captured/livestreamed visual data of the local environment may further capture an image of the laser beam of the laser pointer. In other words, the captured image may include an image of the reflection of the laser beam from an object/position in the local environment that the laser beam is incident to. The laser beam may thereby indicate (to each of a local user seeing the laser beam/reflection for him/herself, as well as the remote user who can see the laser beam/reflection via the received captured image) the direction/position that the laser pointer is currently aiming at. The remote user is able to remotely control the orientation of the laser pointer and at least one of the first and second sensing means so as to adjust their relative position and orientation with respect to the apparatus and aim the laser pointer and sensors at another object or another position.
By remotely controlling the aiming of the pointing device, a remote user is able to indicate to a local user a particular object or position of interest in the local environment. Likewise, by remotely controlling the position and orientation of the first and/or second sensors, a remote user is able to reorientate/aim the sensors at an object or position in the local environment of interest to the remote user, so that the remote user may receive first and second sensor data from the desired direction.
FIG. 2B shows the apparatus 200 of FIG. 2A in an adjusted configuration, wherein the mounting means 104' has been moved under the control of the remote user (not shown) such that the direction of the laser beam 101a' of the laser pointer is reorientated with respect to the apparatus/housing 107. Likewise, the direction of the first and second sensors 102, 103 is also changed relative to the apparatus/housing 107 under control of the remote user. In the example shown in FIGS. 1 and 2, the first and second sensors as well as the laser pointer are movably mounted with respect to the apparatus. However, it is to be appreciated that in other examples of the disclosure, the laser pointer and just one of the first and second sensors may be movable with respect to the apparatus.
In some examples, additional sensors may be provided. In some examples, one or more of such further sensors may be attached to the apparatus such that movement of such sensors relative to the apparatus is prevented, i.e., their position relative to the apparatus is fixed. In some examples, one or more further sensors may be provided which are attached to the apparatus such that movement of such further sensors relative to the apparatus is adjustable and, moreover, remotely controllable by the remote user.
The apparatus 200 may be provided in a device or a module. In some examples, the apparatus may be provided in: a portable device, a wearable device, a handheld device, a wireless communications device, a telepresence device, a telecommunications device and a user equipment device.
In some examples, the apparatus is provided as a head mountable device, not least for example such as a helmet. In some examples, the apparatus may be provided in a remotely operable vehicle, not least for example an unmanned aerial vehicle (UAV) or an unmanned ground vehicle (UGV).
Examples of the apparatus of FIG. 2A may provide utility as a telepresence device, for example a remote inspection device for use in machinery and/or site inspection. The apparatus may be provided as a portable device carriable/wearable by a user in a local environment, referred to hereinafter as a "local user", wherein the apparatus is remotely controlled by a "remote user". For instance, a local site inspecting engineer may wear the apparatus (e.g., in the form of a head mountable device such as a helmet as per FIG. 3). The apparatus may be configured to provide a video stream and telemetry/sensor data to a remote site inspecting engineer supervisor. The remote user can control the direction of the laser pointer and sensors to aim them at a target object/position in the local environment of interest to the remote user.
FIG. 3 shows a schematic diagram of an apparatus 300 embodied in the form of a head mountable device, in this case a helmet 307. The figure schematically illustrates an overhead/plan view of the helmet and the various components thereof.
The helmet comprises a sensor array comprising a first sensor 302, e.g., a camera, and a second sensor 303, e.g., a directional microphone. The sensor array is attached to a front portion of the helmet 307 and is able to pan and tilt via a pan/tilt mounting/platform 304. A laser pointer 301 is also attached to the helmet via the pan/tilt mounting. The laser pointer is operable to emit a pointing/indicating beam 301a. In this example, a means 306 for moving the mounting mechanism 304, e.g., an appropriate actuating device/mechanism remotely controllable by a remote user, is integrated with the pan/tilt mounting mechanism 304. The pan/tilt mounting mechanism is remotely controllable via a remote user of a remote device so as to control at least a pan and/or tilt movement of the sensors and the pointer.
In this example, the pair of sensors comprises a camera and a directional microphone. However, it is to be appreciated that other combinations of sensors may be provided, not least for example: two or more cameras (not least for example one with a wide-angle lens/wide field of view and another with a telephoto lens/narrower field of view; and/or two or more cameras in differing parts of the electromagnetic spectrum, not least for example infrared and visible parts of the spectrum), and two or more microphones configured for differing parts of the acoustic spectrum (not least for example an infrasound microphone and an ultrasound microphone).
In this example, the apparatus is configured in the form of a helmet so as to be head mounted. However, in other examples, the apparatus may be configured as a portable device or module, a device that can be placed on a desk (such as for use in a conference or meeting room), or a module mounted on a remotely controlled vehicle.
The apparatus 300 further comprises additional sensors that are mounted on the apparatus but separately of the pan/tilt mounting mechanism 304, 306, i.e., such that they are held in a fixed position relative to the apparatus. The apparatus may comprise additional modules and mounting points for further components, such as a mount 311 on a right-hand side of the helmet for lights and other components and a mounting point 312 for mounting lights and other components on a left-hand side of the helmet. At a rear side of the helmet, computing hardware 305 (e.g., including a controller and communication means) as well as power units, e.g., batteries, may be provided.
In order to provide optimal weight balance, the computing hardware and power unit may be located at the back of the helmet so as to counterbalance the sensor array and mounting mechanism located at the front of the helmet. Likewise, the mounting point/platform 311 on the right-hand side of the helmet and the components thereon may be configured so as to counterbalance the mounting point/platform 312 and its mounted components on the left-hand side of the helmet. The mounting points/platforms may be configured to hold: lights (such as flash lights or torches), speakers and/or other components on the helmet.
It should be noted that the components, elements and devices of FIG. 3 are not mandatory and thus some can be omitted in certain examples of the present disclosure, not least for example the mounting platforms 311, 312 and any additional sensor devices thereon.
FIG. 4A shows an example use of an apparatus according to the present disclosure, such as apparatus 100, 200 or 300 described above. In particular, FIG. 4A shows an example of an image 402 captured by a camera of the apparatus that is transmitted to a remote device and presented to a remote user. The remote user may be presented with a substantially live video feed/real time video stream of footage captured by the camera, namely footage of a local environment of the apparatus, in this particular example, an interior of a room. Furthermore, the captured image of the room includes a captured image of the laser beam 401a of the laser pointer of the apparatus. The remote user is thereby able to identify where the laser pointer is currently pointing/aiming and is able to control movement of the laser pointer, as well as the field of view of the captured image, so as to aim the laser pointer at a desired target object or position of interest in the local environment. In the example shown, the remote user may control movement of the laser pointer and the camera via graphical user interface buttons 403 for a panning or tilting movement. However, it is appreciated that such movement commands may be effected via any appropriate input means, not least for example touchscreen or voice control.
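On the remote-device side, each press of a pan/tilt button 403 might simply be translated into a small movement-control message of the kind sketched earlier and sent to the apparatus. The per-press step size, the message fields and the transport in the sketch below are, again, illustrative assumptions rather than features of the disclosure:

```python
# Illustrative remote-side counterpart to the earlier control sketch: GUI
# button presses (cf. buttons 403) become small movement-control messages.
# The step size, message fields and transport are assumptions.
import json
import socket

STEP_DEG = 2.0  # assumed orientation change per button press, in degrees

BUTTONS = {
    "left":  {"pan_delta": -STEP_DEG},
    "right": {"pan_delta": +STEP_DEG},
    "up":    {"tilt_delta": +STEP_DEG},
    "down":  {"tilt_delta": -STEP_DEG},
}


def on_button_press(sock: socket.socket, button: str) -> None:
    """Send one newline-delimited control message for the pressed button."""
    sock.sendall(json.dumps(BUTTONS[button]).encode() + b"\n")


# Demonstration over a local socket pair standing in for the network link:
remote_end, apparatus_end = socket.socketpair()
on_button_press(remote_end, "left")
print(apparatus_end.recv(1024))  # b'{"pan_delta": -2.0}\n'
```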
Advantageously, since the remote user is able to remotely control the direction of the laser pointer to a desired object/position of interest in the local environment, the laser beam 401a thereby indicates to any local users that are physically present in the local environment the remote user's target object or position of interest. Moreover, if additional remote users were also to receive a display of the captured sensor data including the image of the laser beam 401a, this would also indicate to such additional remote users (who may be remote from the first remote user) an object or position of interest in the local environment. Furthermore, with the laser pointer and sensor array aimed at a new target object/position, where directional sensors are provided and aligned with the orientation of the laser pointer's beam direction, directional sensor data focused on the desired target object/target position may be captured and transmitted to the remote user (and any additional remote users), such that directional sensor data focused on an object/position of interest may be received.
The display presented to a remote user, e.g., as shown in FIG. 4A, may be based on images of an environment local to an apparatus, such as apparatus 300 of FIG. 3, that are captured by a sensor array of the apparatus. Advantageously, examples of the present disclosure may provide an apparatus comprising a pointing device such as a laser pointer as well as a video camera, wherein the apparatus is configured to provide a substantially live/real time image/video feed to a remote user and wherein the laser pointer is able to be remotely controlled by the remote user. In such a manner the apparatus provides telepresence functionality, enabling the remote user to feel as if they were virtually present in the local environment by visually perceiving (and, in some examples, audibly perceiving) the real world local visual (and audible) environment. Furthermore, the remote user is able to remotely steer/aim the laser pointer, via which the remote user can control/determine the direction/target/aiming of the laser beam 401a and the received livestream video, so as to remotely indicate an object or a position in the field of view of the camera. The remote user may thereby indicate, to local users who are physically present or to remote users who are also receiving the video feed, a particular target object or position of interest. In such a manner, examples of the claimed invention may provide not only telepresence and telecommunication via additional sensors and input and output devices (such as one or more microphones and one or more speakers) but may also provide telecommand of the orientation of the laser pointer and the sensors so as to steer and aim the same by remotely controlling their movement relative to the apparatus.
It is to be appreciated that movement of the local user (i.e., the wearer of the helmet), such as head movement, could give rise to movement of the entirety of the apparatus worn by the local user. In some examples, movement of the tilt and pan mechanism may additionally be controlled based, at least in part, on movement of the local user detected by a movement detector, so as to accommodate and nullify the effect of user movement. By taking into account movement of the local user, the position of the tilt and pan mechanism may be controlled such that the orientation of the tilt and pan mechanism, and hence the orientation of the laser pointer and sensors, relative to the real world may remain fixed, i.e., independent of movement of the apparatus itself and the local user. In effect, such control of the orientation of the tilt and pan mechanism enables its orientation effectively to be de-coupled from movement of the local user.
For example, once a remote user has aimed/targeted the laser pointer on a particular object or position in the real-world local environment, the tilt and pan mechanism may be configured to control the orientation of the laser pointer such that it remains targeted/pointed at the object/position in the local real-world environment irrespective of movement of the apparatus, e.g., due to movement of the user such as the user's head movement. In such a manner, the apparatus can maintain the laser beam fixed on the remote user's desired target object/position and may thereby maintain a visible, identifiable indication of the desired object/position in the local environment (e.g., a laser spot), such indication being perceived by the local user as well as the remote user (and any other local users and/or any other remote users receiving the real time video stream footage from the apparatus' camera).
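The de-coupling described above reduces, in the simplest yaw/pitch-only case, to subtracting the wearer's measured head rotation from the desired world-frame bearing. The following sketch assumes a movement detector (e.g. a helmet-mounted IMU) that reports head yaw and pitch, and ignores roll and translation for simplicity; it is an illustration of the idea rather than the disclosed implementation:

```python
# Simplified yaw/pitch-only sketch of the de-coupling described above. It
# assumes a movement detector (e.g. a helmet-mounted IMU) reporting the
# wearer's head orientation, and ignores roll and translation.
def mount_angles_for_fixed_target(target_yaw_world: float,
                                  target_pitch_world: float,
                                  head_yaw: float,
                                  head_pitch: float) -> tuple:
    """Return pan/tilt angles, relative to the helmet, that keep the laser
    pointer aimed at a fixed world-frame bearing despite head movement."""
    # Mount angle = desired world-frame angle minus the helmet's own rotation,
    # so that the two compose back to the fixed world-frame bearing.
    return (target_yaw_world - head_yaw,
            target_pitch_world - head_pitch)


# If the wearer turns their head 15 degrees to the right, the mount pans
# 15 degrees to the left to keep the beam on target:
print(mount_angles_for_fixed_target(30.0, 0.0, head_yaw=15.0, head_pitch=0.0))
# -> (15.0, 0.0)
```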
In the above-described examples, the pointing means comprises a laser pointer, remotely controllable/aimable by a remote user, for directly pointing out/indicating an object, location or position in the real scene. In such a manner, the object/location/position may be directly physically indicated in the real world, i.e. by virtue of the laser pointer reflecting from the object/location/position of interest. However, in other examples, the pointing means may additionally or alternatively be implemented in a differing manner so as to indirectly point out/indicate an object, location or position in the real scene, namely by pointing out an object/location/position in a captured image of the real world scene and communicating the same to other users.
In this regard, an object, location or position in the real world can be indirectly pointed out/indicated to other users via captured image(s)/video(s) of the real scene.
For instance, in some examples, the pointing means comprises means configured to enable a remote user to indicate/point out (i.e. to another user, such as a local user or other remote users) an object, location or a position in the real scene by virtue of indicating an object, location or a position in a captured image or video of the real scene, whereupon the captured image or video, including the indication, is presented to the local user(s) and/or other remote user(s) to indicate the object/location/position of interest.
Such alternative implementations of the pointing means may be useful, for instance, where the laser pointer may not be able to clearly identify an object/location/position of interest in the real scene e.g. due to the object/location/position of interest being distant in the local environment and/or the laser beam not being sufficiently reflected from the object/location/position of interest so as to be readily perceived by the local user(s) (i.e. directly seen in the real scene) and/or remote user(s) (i.e. seen in the captured image/video of the real scene presented to the remote user(s)).
FIG. 4B shows an example wherein images of one or more visual indicators such as fiducial markers, in this case cross hairs 401a' and reference/grid lines 401a'', are generated and disposed on top of the captured image/video 402 of a real scene.
In some examples, this is done by superimposing static cross hairs and/or grid lines over the captured image/video when displaying the captured image/video to the one or more remote users (who could be viewing the same via an internet browser showing a live/substantially real-time video feed, captured by the apparatus 300, of the real scene). The position of the cross hairs/grid lines relative to the background real world scene could be moved/steered by a remote user directing the local user/helmet wearer to look left/right/up/down, and/or by the remote user controlling movement of the camera with respect to the helmet (e.g. via pan/tilt movement control icons 403').
In some examples, instead of static cross hairs and/or grid lines, remotely controllable dynamic cross hairs and/or grid lines could be provided. In such a manner, the relative position of the (e.g. foreground) cross hairs/grid lines relative to a (e.g. background) captured image of the scene could be moved/steered by the remote user (e.g. via control icons similar to 403). In effect, the visual image of the cross hairs may be remotely steerable (in the virtual domain) in a manner similar to the laser pointer being remotely steerable (in the physical domain). An object, location or position of interest could be indicated by steering the cross hairs over the object/location/position of interest. Additionally, or alternatively, an object/location/position of interest could be indicated with reference to a grid reference, e.g., of the grid square in which the object/location/position of interest lies.
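As an illustration of the dynamic cross hairs and grid lines just described, the sketch below overlays a remotely steered crosshair and a reference grid on each captured frame. The use of OpenCV, the colours and the geometry are implementation choices assumed here for the sketch, not features of the disclosure:

```python
# Illustrative sketch of the remotely steerable "virtual crosshair": a
# crosshair and reference grid drawn over each captured frame at a position
# (cx, cy) that the remote user moves. OpenCV is an assumed implementation
# choice, not prescribed by the disclosure.
import cv2
import numpy as np


def overlay_crosshair(frame: np.ndarray, cx: int, cy: int,
                      grid_step: int = 80) -> np.ndarray:
    out = frame.copy()
    h, w = out.shape[:2]
    # Background reference grid (cf. grid lines 401a'').
    for x in range(0, w, grid_step):
        cv2.line(out, (x, 0), (x, h), (128, 128, 128), 1)
    for y in range(0, h, grid_step):
        cv2.line(out, (0, y), (w, y), (128, 128, 128), 1)
    # Foreground crosshair (cf. cross hairs 401a') at the steered position.
    cv2.line(out, (cx - 20, cy), (cx + 20, cy), (0, 0, 255), 2)
    cv2.line(out, (cx, cy - 20), (cx, cy + 20), (0, 0, 255), 2)
    return out


frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a captured frame
annotated = overlay_crosshair(frame, cx=320, cy=240)
```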
If the local user/helmet wearer needed to know which object/location/position (e.g., landmarks) in a real-world scene were of interest to the remote user, the helmet wearer could be presented with the captured image including the cross hairs. For example, the local user could log in to the telepresence session via a portable communications and display device, e.g. a mobile phone, and view the captured image and cross hairs/grid. In some examples, the helmet may include a display device, such as a near eye display, which could be a see-through display for enabling the display of augmented reality content (i.e., display of the cross hairs and/or gridlines with the real-world scene viewable through the see-through display).
However, in some envisaged use scenarios, the local user/helmet wearer need not know which particular object/location/position in the real world scene is of interest to the remote user. For example, instead, the remote user may only need to convey an object/location/position of interest to other remote users. For instance, if the apparatus were to be used in the film industry for location scouting, a remote user may merely need to communicate with other remote users (e.g. special effects people communicating with photographers about the suitability of a location). There would be no need for the helmet wearer to know which objects and locations were being indicated and discussed by the remote users.
Instead of providing the fiducial markers, e.g., cross hairs and/or gridlines, in the image domain via software, i.e., superimposing a computer-generated image of the fiducial markers on to the captured image, a similar effect could be achieved via physical hardware. In this regard, the pointing means of the apparatus may additionally comprise one or more fiducial markers. Such fiducial markers may be provided in front of an image capture device/camera of the apparatus, namely in the field of view/focal plane of the same, such that images/video captured by the camera additionally capture an image of the fiducial marker. For example, a transparent physical device may be placed and attached over the camera in front of its field of view and focal plane, e.g., as a camera filter over a camera lens, wherein the device has the fiducial markers (e.g. cross hairs and/or a grid) thereon. In such a manner, the image captured by the camera of the real scene, through the transparent device with fiducial markers thereon, would include, in addition to the captured image of the real scene in the background, a captured image of the fiducial markers in the foreground.
In some examples, the pointing means comprises means for generating a visual indication for indicating an object, a location and/or a position in a captured image of a real scene.
In some examples, the pointing means comprises one or more of: means for capturing an image of a real scene; means for enabling a remote user to selectively indicate an object, a location or a position in a captured image/video of the real scene; means for generating a visual indication for indicating an object, a location and/or a position in a captured image of a real scene; means for enabling a remote user to control movement, with respect to a real scene and/or a captured image of a real scene, of a position of a visual indication for indicating an object, a location and/or a position in a captured image of a real scene; means for generating a visual indication for selectively indicating an object, a location or a position in the captured image/video of the real scene, wherein the visual indication is superposed on the captured image/video of the real scene; means for enabling one or more local users to view the real scene or a captured image/video of the real scene and an indication of the selectively indicated object, location or position; and means for enabling one or more remote users to view the captured image/video of the real scene and an indication of the selectively indicated object, location or position.
In some examples, the means for enabling a remote user to selectively indicate an object, a location or a position in the captured image/video of the real scene comprises means for generating a visual indicator for indicating an object, a location or a position of interest on the captured image, and superimposing the visual indicator on the captured image/video. In some examples, the position of the visual indicator with respect to real world scene is controllable by the remote user.
FIG. 5 schematically illustrates an example method according to an example of the present disclosure.
In block 501, information is received for controlling movement of a pointing means and at least one of first and second sensing means of an apparatus (not least such as an apparatus as described above with respect to FIGS. 1, 2 and 3).
In block 502, movement of the pointing means and the at least one sensing means, relative to the apparatus, is controlled based, at least in part, on the received information. Such relative movement may be, not least for example: a tilt/pitch movement and/or a pan/yaw movement. Other movements could also be effected, such as a roll movement relative to the apparatus (for which the laser pointer may be configured to provide a non-point like laser beam, such as a cross hair indicator, so that a roll type rotational movement of the beam would be perceivable by the local and/or remote users) and a translational movement relative to the apparatus.
The component blocks of FIG. 5 are functional and the functions described can be performed by a single physical entity, such as described with reference to FIGS. 1, 2A and 2B, 3 and 6. The blocks illustrated in FIG. 5 can represent actions in a method and/or sections of instructions/code in a computer program such as is shown in FIG. 7.
It will be understood that each block and combinations of blocks, can be implemented by various means, such as hardware, firmware, and/or software including one or more computer program instructions. For example, one or more of the procedures described above can be embodied by computer program instructions. In this regard, the computer program instructions which embody the procedures described above can be stored by a memory storage device and performed by a processor.
As will be appreciated, any such computer program instructions can be loaded onto a computer or other programmable apparatus (i.e., hardware) to produce a machine, such that the instructions when performed on the programmable apparatus create means for implementing the functions specified in the blocks. These computer program instructions can also be stored in a computer-readable medium that can direct a programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the blocks. The computer program instructions can also be loaded onto a programmable apparatus to cause a series of operational actions to be performed on the programmable apparatus to produce a computer-implemented process such that the instructions which are performed on the programmable apparatus provide actions for implementing the functions specified in the blocks.
Various, but not necessarily all, examples of the present disclosure can take the form of an apparatus, a method, or a computer program. Accordingly, various, but not necessarily all, examples can be implemented in hardware, software or a combination of hardware and software.
Various, but not necessarily all, examples of the present disclosure are described using flowchart illustrations and schematic block diagrams. It will be understood that each block (of the flowchart illustrations and block diagrams), and combinations of blocks, can be implemented by computer program instructions of a computer program. These program instructions can be provided to one or more processor(s), processing circuitry or controller(s) such that the instructions which execute on the same create means for implementing the functions specified in the block or blocks, i.e., such that the method can be computer implemented. The computer program instructions can be executed by the processor(s) to cause a series of operational steps/actions to be performed by the processor(s) to produce a computer implemented process such that the instructions which execute on the processor(s) provide steps for implementing the functions specified in the block or blocks.
Accordingly, the blocks support: combinations of means for performing the specified functions; combinations of actions for performing the specified functions; and computer program instructions/algorithm for performing the specified functions. It will also be understood that each block, and combinations of blocks, can be implemented by special purpose hardware-based systems which perform the specified functions or actions, or combinations of special purpose hardware and computer program instructions.
Various, but not necessarily all, examples of the present disclosure provide both a method and corresponding apparatus comprising various modules, means or circuitry that provide the functionality for performing/applying the actions of the method. The modules, means or circuitry can be implemented as hardware, or can be implemented as software or firmware to be performed by a computer processor. In the case of firmware or software, examples of the present disclosure can be provided as a computer program product including a computer readable storage structure embodying computer program instructions (i.e., the software or firmware) thereon for performing by the computer processor.
FIG. 6 schematically illustrates a block diagram of an apparatus 10 for performing the methods and providing the functionality described in the present disclosure, not least the method illustrated in FIG. 5. The component blocks of FIG. 6 are functional and the functions described can be performed by a single physical entity.
The apparatus 10 comprises a controller 11, which could be provided within a device such as a wearable device, head-mountable device, user equipment device or a remotely controllable vehicle.
The controller 11 can be embodied by a computing device. In some, but not necessarily all examples, the apparatus can be embodied as a chip, chip set or module, i.e., for use in any of the foregoing. As used here 'module' refers to a unit or apparatus that excludes certain parts/components that would be added by an end manufacturer or a user.
Implementation of the controller 11 can be as controller circuitry. The controller 11 can be implemented in hardware alone, can have certain aspects in software (including firmware) alone, or can be a combination of hardware and software (including firmware).
The controller 11 can be implemented using instructions that enable hardware functionality, for example, by using executable instructions of a computer program 14 in a general-purpose or special-purpose processor 12, which instructions can be stored on a computer readable storage medium 13 (for example a memory or disk) to be executed by such a processor 12.
The processor 12 is configured to read from and write to the memory 13. The processor 12 can also comprise an output interface via which data and/or commands are output by the processor 12 and an input interface via which data and/or commands are input to the processor 12. The apparatus can be coupled to or comprise one or more other components 15 (not least for example: a laser pointer, sensors, a radio transceiver, a light source, input/output user interface elements and/or other modules/devices/components for inputting and outputting data/commands).
The memory 13 stores a computer program 14 comprising computer program instructions (computer program code) that controls the operation of the apparatus 10 when loaded into the processor 12. The computer program instructions of the computer program 14 provide the logic and routines that enable the apparatus to perform the methods, processes and procedures described in the present disclosure and also illustrated in FIG. 5. By reading the memory 13, the processor 12 is able to load and execute the computer program 14.
Although the memory 13 is illustrated as a single component/circuitry, it can be implemented as one or more separate components/circuitry, some or all of which can be integrated/removable and/or can provide permanent/semi-permanent/dynamic/cached storage.
Although the processor 12 is illustrated as a single component/circuitry, it can be implemented as one or more separate components/circuitry, some or all of which can be integrated/removable. The processor 12 can be a single core or multi-core processor.
The apparatus can include one or more components for effecting the methods, processes and procedures described in the present disclosure and illustrated in FIG. 5. It is contemplated that the functions of these components can be combined in one or more components or performed by other components of equivalent functionality. The description of a function should additionally be considered to also disclose any means suitable for performing that function. Where a structural feature has been described, it can be replaced by means for performing one or more of the functions of the structural feature whether that function or those functions are explicitly or implicitly described.
Although examples of the apparatus have been described above in terms of comprising various components, it should be understood that the components can be embodied as or otherwise controlled by a corresponding controller or circuitry such as one or more processing elements or processors of the apparatus. In this regard, each of the components described above can be one or more of any device, means or circuitry embodied in hardware, software or a combination of hardware and software that is configured to perform the corresponding functions of the respective components as described above.
The apparatus can, for example, be a portable device, a wearable device, a handheld device, a wireless communications device, a telepresence device, a telecommunications device, a head mountable device, a user equipment device, a client device, and a remotely controlled vehicle. The apparatus can be embodied by a computing device, not least such as those mentioned above. In some examples, the computing device can additionally provide one or more audio/video/data communication functions (for example telepresence, telecommunication, video-communication, and/or data transmission such as sensor data) and image capture functions. In some examples, the apparatus can be embodied as a chip, chip set or module, i.e., for use in any of the foregoing devices.
In some examples, the apparatus comprises: at least one processor 12; and at least one memory 13 including computer program code, the at least one memory 13 and the computer program code configured to, with the at least one processor 12, cause the apparatus at least to perform: receiving, from a remote device, information for controlling movement of: a pointing means of an apparatus, and at least one of a first and a second sensing means of the apparatus; and moving the pointing means and the at least one of the first and second sensing means based, at least in part, on the information.
The apparatus can be a telecommunication device for enabling telecommunication between a local user and a remote user. The apparatus can be a telepresence device for enabling a remote user to feel as if they were present (e.g., by capturing images and other sensor data of the local environment and then transmitting and presenting the same to the remote user/a device of the remote user).
The above-described examples find application as enabling components of telecommunication systems; telemetry systems; remote visual inspection systems; mediated, virtual and/or augmented reality; and related software and services.
FIG. 7 illustrates a computer program 14. The computer program can arrive at the apparatus 10 via any suitable delivery mechanism 20. The delivery mechanism 20 can be, for example, a machine readable medium, a computer-readable medium, a non-transitory computer-readable storage medium, a computer program product, a memory device, a solid-state memory, a record medium such as a USB stick, Compact Disc Read-Only Memory (CD-ROM) or a Digital Versatile Disc (DVD) or an article of manufacture that comprises or tangibly embodies the computer program 14. The delivery mechanism can be a signal configured to reliably transfer the computer program. The apparatus 10 can receive, propagate or transmit the computer program as a computer data signal.
In certain examples of the present disclosure, there is provided computer program instructions for causing an apparatus (e.g., apparatus 10 of FIG. 6 or apparatuses 100, 200 and 300 of FIG.s 1-3) to perform at least the following or for causing performing at least the following: receiving, from a remote device, information for controlling movement of a pointing means of an apparatus, and at least one of a first and a second sensing means of the apparatus; and moving the pointing means and the at least one of the first and second sensing means based, at least in part, on the information.
References to 'computer program', 'computer-readable storage medium', 'computer program product', 'tangibly embodied computer program' etc. or a 'controller', 'computer', 'processor' etc. should be understood to encompass not only computers having different architectures such as single/multi-processor architectures and sequential (Von Neumann)/parallel architectures but also specialized circuits such as field-programmable gate arrays (FPGA), application-specific integrated circuits (ASIC), signal processing devices and other devices. References to computer program, instructions, code etc. should be understood to encompass software for a programmable processor or firmware such as, for example, the programmable content of a hardware device, whether instructions for a processor or configuration settings for a fixed-function device, gate array or programmable logic device etc. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.
Features described in the preceding description can be used in combinations other than the combinations explicitly described.
Although functions have been described with reference to certain features, those functions can be performable by other features whether described or not. Although features have been described with reference to certain examples, those features can also be present in other examples whether described or not. Accordingly, features described in relation to one example/aspect of the disclosure can include any or all of the features described in relation to another example/aspect of the disclosure, and vice versa, to the extent that they are not mutually inconsistent.
Although various examples of the present disclosure have been described in the preceding paragraphs, it should be appreciated that modifications to the examples given can be made without departing from the scope of the invention as set out in the claims. For example, whilst FIGs. 1 to 3 show a single common mount for common movement of the laser pointer and sensors, in other examples plural mounts may be provided to enable independent movement of each of the laser pointer and one or more sensors.
The term 'comprise' is used in this document with an inclusive not an exclusive meaning. That is, any reference to X comprising Y indicates that X can comprise only one Y or can comprise more than one Y. If it is intended to use 'comprise' with an exclusive meaning then it will be made clear in the context by referring to "comprising only one..." or by using "consisting".
In this description, the wording 'connect', 'couple' and 'communication' and their derivatives mean operationally connected/coupled/in communication. It should be appreciated that any number or combination of intervening components can exist (including no intervening components), i.e., so as to provide direct or indirect connection/coupling/communication. Any such intervening components can include hardware and/or software components.
As used herein, the term "determine/determining" (and grammatical variants thereof) can include, not least: calculating, computing, processing, deriving, measuring, investigating, identifying, looking up (for example, looking up in a table, a database or another data structure), ascertaining and the like. Also, "determining" can include receiving (for example, receiving information), accessing (for example, accessing data in a memory), obtaining and the like. Also, "determine/determining" can include resolving, selecting, choosing, establishing, and the like.
References to a parameter can be replaced by references to "data indicative of", "data defining" or "data representative of" the relevant parameter, even if not explicitly stated.
In this description, reference has been made to various examples. The description of features or functions in relation to an example indicates that those features or functions are present in that example. The use of the term 'example' or 'for example', 'can' or 'may' in the text denotes, whether explicitly stated or not, that such features or functions are present in at least the described example, whether described as an example or not, and that they can be, but are not necessarily, present in some or all other examples. Thus 'example', 'for example', 'can' or 'may' refers to a particular instance in a class of examples. A property of the instance can be a property of only that instance or a property of the class or a property of a sub-class of the class that includes some but not all of the instances in the class.
In this description, references to "a/an/the" [feature, element, component, means...] are to be interpreted as "at least one" [feature, element, component, means...] unless explicitly stated otherwise. That is, any reference to X comprising a/the Y indicates that X can comprise only one Y or can comprise more than one Y unless the context clearly indicates the contrary. If it is intended to use 'a' or 'the' with an exclusive meaning then it will be made clear in the context. In some circumstances the use of 'at least one' or 'one or more' can be used to emphasise an inclusive meaning, but the absence of these terms should not be taken to imply any exclusive meaning.
The presence of a feature (or combination of features) in a claim is a reference to that feature (or combination of features) itself and also to features that achieve substantially the same technical effect (equivalent features). The equivalent features include, for example, features that are variants and achieve substantially the same result in substantially the same way. The equivalent features include, for example, features that perform substantially the same function, in substantially the same way to achieve substantially the same result.
In this description, reference has been made to various examples using adjectives or adjectival phrases to describe characteristics of the examples. Such a description of a characteristic in relation to an example indicates that the characteristic is present in some examples exactly as described and is present in other examples substantially as described.
Whilst endeavouring in the foregoing specification to draw attention to those features of examples of the present disclosure believed to be of particular importance it should be understood that the applicant claims protection in respect of any patentable feature or combination of features hereinbefore referred to and/or shown in the drawings whether or not particular emphasis has been placed thereon.
The examples of the present disclosure and the accompanying claims can be suitably combined in any manner apparent to one of ordinary skill in the art.
Each and every claim is incorporated as further disclosure into the specification and the claims are embodiment(s) of the present disclosure. Further, while the claims herein are provided as comprising specific dependencies, it is contemplated that any claims can depend from any other claims and that to the extent that any alternative embodiments can result from combining, integrating, and/or omitting features of the various claims and/or changing dependencies of claims, any such alternative embodiments and their equivalents are also within the scope of the disclosure.

Claims (18)

CLAIMS
We claim:
1. An apparatus comprising: pointing means; and first and second sensing means; wherein movement of the pointing means and at least one of the first and second sensing means is remotely controllable.
2. The apparatus of claim 1, wherein: the pointing means is configured for indicating an object or position in an environment, and wherein the pointing means is movably mounted to the apparatus such that the pointing means is movable relative to the apparatus; the first and second sensing means are configured for sensing the environment, and wherein at least one of the first and second sensing means is movably mounted to the apparatus such that the at least one of the first and second sensing means is movable relative to the apparatus; the apparatus further comprises communication means for receiving, from a remote device, information for controlling movement of the pointing means and the at least one of the first and second sensing means relative to the apparatus; and the apparatus further comprises means for moving the pointing means and the at least one of the first and second sensing means relative to the apparatus based, at least in part, on the information.
3. The apparatus of any preceding claim, wherein the pointing means comprises at least one selected from the group of: a pointer device, an illumination device, a laser pointer, and means for generating a visual indication for indicating an object, a location and/or a position in a captured image of a real scene.
4. The apparatus of any preceding claim, wherein the first and/or second sensing means comprises at least one selected from the group of: a sensor, an image capture device, an audio capture device, and a temperature measurement device.
5. The apparatus of any preceding claim, wherein the first and/or second sensing means comprises a directional sensor.
6. The apparatus of any preceding claim, wherein: the first sensing means is configured to generate first sensor data; the second sensing means is configured to generate second sensor data; and the apparatus comprises a communication means configured to send the first and second sensor data to the remote device.
7. The apparatus of any preceding claim, wherein the pointing means and the at least one of the first and second sensing means are rotatably mounted to the apparatus such that the pointing means and the at least one of the first and second sensing means are rotatable relative to the apparatus so as to adjust: a direction and/or orientation of the pointing means relative to the apparatus, and a direction and/or orientation of the at least one of the first and second sensing means relative to the apparatus.
8. The apparatus of any preceding claim, further comprising at least one mounting means for moveably mounting the pointing means and at least one of the first and second sensing means relative to the apparatus, wherein the at least one mounting means is controllable by a remote device so as to move the pointing means and the at least one of the first and second sensing means relative to the apparatus.
9. The apparatus of any preceding claim, further comprising a first mounting means for mounting the pointing means and the at least one of the first and second sensing means to the apparatus.
10. The apparatus of any preceding claim, further comprising: a first mounting means for mounting the pointing means to the apparatus; and at least a second mounting means for mounting the at least one of the first and second sensing means to the apparatus.
11. The apparatus of claim 10, wherein the first mounting means and the at least second mounting means are independently controllable such that movement of the pointing means relative to the apparatus and movement of the at least one of the first and second sensing means relative to the apparatus are independently adjustable.
12. The apparatus of any preceding claim, further comprising one or more further sensing means, wherein the one or more further sensing means are attached to the apparatus such that movement of the one or more further sensing means relative to the apparatus is prevented.
13. A device or module comprising the apparatus of any preceding claim.
14. The device of claim 13, wherein the device is at least one selected from the group of: a portable device, a wearable device, a handheld device, a wireless communications device, a telepresence device, a telecommunications device, and a user equipment device.
15. A head mountable device comprising the apparatus of any of claims 1 to 14.
16. A remotely controllable vehicle comprising the apparatus of any of claims 1 to 14.
17. A method comprising causing, at least in part, actions that result in: receiving, from a remote device, information for controlling movement of: a pointing means of an apparatus, and at least one of a first and a second sensing means of the apparatus; and moving the pointing means and the at least one of the first and second sensing means based, at least in part, on the information.
18. Computer program instructions for causing an apparatus to perform the method as claimed in claim 17.
GB2104502.6A 2021-03-30 2021-03-30 Apparatus, method and computer program for use in telepresence Pending GB2605390A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2104502.6A GB2605390A (en) 2021-03-30 2021-03-30 Apparatus, method and computer program for use in telepresence

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2104502.6A GB2605390A (en) 2021-03-30 2021-03-30 Apparatus, method and computer program for use in telepresence

Publications (2)

Publication Number Publication Date
GB202104502D0 GB202104502D0 (en) 2021-05-12
GB2605390A true GB2605390A (en) 2022-10-05

Family

ID=75783490

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2104502.6A Pending GB2605390A (en) 2021-03-30 2021-03-30 Apparatus, method and computer program for use in telepresence

Country Status (1)

Country Link
GB (1) GB2605390A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5673082A (en) * 1995-04-10 1997-09-30 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Light-directed ranging system implementing single camera system for telerobotics applications
US5719622A (en) * 1996-02-23 1998-02-17 The Regents Of The University Of Michigan Visual control selection of remote mechanisms
US20050104547A1 (en) * 2003-11-18 2005-05-19 Yulun Wang Robot with a manipulator arm
US20100268383A1 (en) * 2009-04-17 2010-10-21 Yulun Wang Tele-presence robot system with software modularity, projector and laser pointer
US20140009561A1 (en) * 2010-11-12 2014-01-09 Crosswing Inc. Customizable robotic system
US20140297180A1 (en) * 2011-11-15 2014-10-02 John Minjae Cho Mobilized Sensor System
US20150077502A1 (en) * 2012-05-22 2015-03-19 Intouch Technologies, Inc. Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US20170225334A1 (en) * 2016-02-09 2017-08-10 Cobalt Robotics Inc. Mobile Robot Security Enforcement

Also Published As

Publication number Publication date
GB202104502D0 (en) 2021-05-12

Similar Documents

Publication Publication Date Title
JP7170736B2 (en) interactive augmentation or virtual reality device
US9858643B2 (en) Image generating device, image generating method, and program
US9881422B2 (en) Virtual reality system and method for controlling operation modes of virtual reality system
US10382699B2 (en) Imaging system and method of producing images for display apparatus
US20240037880A1 (en) Artificial Reality System with Varifocal Display of Artificial Reality Content
JP6252849B2 (en) Imaging apparatus and method
US20070247457A1 (en) Device and Method for Presenting an Image of the Surrounding World
US10890771B2 (en) Display system with video see-through
US9939890B2 (en) Image generation apparatus and image generation method of generating a wide viewing angle image
US20180188802A1 (en) Operation input apparatus and operation input method
KR101690646B1 (en) Camera driving device and method for see-through displaying
US20170357476A1 (en) Location based audio filtering
GB2565140A (en) Virtual reality video processing
CN112655202B (en) Reduced bandwidth stereoscopic distortion correction for fisheye lenses of head-mounted displays
CN107844190A (en) Image presentation method and device based on Virtual Reality equipment
CN112188152A (en) Remote operation support system
EP3465631B1 (en) Capturing and rendering information involving a virtual environment
KR20140037730A (en) Wearable system for providing information
GB2605390A (en) Apparatus, method and computer program for use in telepresence
CN110809148A (en) Sea area search system and three-dimensional environment immersive experience VR intelligent glasses
WO2017163649A1 (en) Image processing device
US10701313B2 (en) Video communication device and method for video communication
US11619814B1 (en) Apparatus, system, and method for improving digital head-mounted displays
CN211123484U (en) Novel mixed reality head display
Kumar et al. Head Tracking: A Comprehensive Review