GB2549264A - Apparatus, methods, computer programs, and non-transitory computer readable storage mediums for enabling remote control of one or more devices - Google Patents

Info

Publication number
GB2549264A
GB2549264A (application GB1605835.6A)
Authority
GB
United Kingdom
Prior art keywords
volume
display
operator
user input
input data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1605835.6A
Other versions
GB2549264B (en)
Inventor
Reddish Stephen
Freeman Christopher
Lewis Michael
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Rolls Royce Power Engineering PLC
Original Assignee
Rolls Royce Power Engineering PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Rolls Royce Power Engineering PLC filed Critical Rolls Royce Power Engineering PLC
Priority to GB1605835.6A (GB2549264B)
Priority to US15/455,721 (US10606340B2)
Priority to CN201710220981.4A (CN107422686B)
Publication of GB2549264A
Application granted
Publication of GB2549264B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005Input arrangements through a video camera
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/04Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/05Programmable logic controllers, e.g. simulating logic interconnections of signals according to ladder diagrams or function charts
    • G05B19/054Input/output
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/414Structure of the control system, e.g. common controller or multiprocessor systems, interface to servo, programmable interface controller
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/10Plc systems
    • G05B2219/11Plc I-O input output
    • G05B2219/1103Special, intelligent I-O processor, also plc can only access via processor
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40153Teleassistance, operator assists, controls autonomous robot
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40195Tele-operation, computer assisted manual operation

Abstract

The invention can be used to allow an operator 44, such as a technician, in a remote location to get assistance from a colleague 48, such as an engineer. The invention allows the colleague to get a view of the particular job, such as repairing a component, and possibly to operate a robot. The apparatus and method do this by: receiving sensed position data 13 of an operator 44 within a first volume 49; controlling a display 42 located in a second volume 51, different to the first, to display a representation of at least a portion of the operator within a representation of at least a portion of the first volume, using the received position data; receiving user input data from a user located in the second volume; and controlling a first device 32, 26, 28, 30 located at the first volume, using the received input data, to perform a first action. The first device may be a communication device and the first action communicating information to the operator, or a projector and the action displaying information. The device may be a robot and the action controlling the robot. The display may be a virtual reality (VR) headset.

Description

TITLE
Apparatus, methods, computer programs, and non-transitory computer readable storage mediums for enabling remote control of one or more devices
TECHNOLOGICAL FIELD
The present disclosure concerns apparatus, methods, computer programs, and non-transitory computer readable storage mediums for enabling remote control of one or more devices.
BACKGROUND
In various industries, operators located within a facility (for example, a factory, a manufacturing facility, or an electrical energy generation plant) may require guidance and/or information from a colleague to assist them with the performance of a task. For example, a technician carrying out a repair on an aerospace component in a repair facility may require guidance from an engineer on how the repair should be performed. At present, the engineer may visit the repair facility to assist the technician. However, this may be relatively costly where the engineer is required to travel between a plurality of facilities, some of which may be located in different cities or countries. The engineer may alternatively deliver instructions and/or information orally via a telephone. However, the instructions delivered by the engineer may be challenging for the technician to follow and/or inaccurate due to the engineer having no visibility of the repair being performed.
BRIEF SUMMARY
According to various examples there is provided apparatus for enabling remote control of one or more devices, the apparatus comprising: a controller configured to: receive position data for a sensed position of an operator within a first volume; control a display located in a second volume, different to the first volume, to display a representation of at least a portion of the operator within a representation of at least a portion of the first volume, using the received position data; receive user input data from a user input device located in the second volume; and control a first device located at the first volume, using the received user input data, to perform a first action.
The first device may comprise a communication device. The first action may include communicating information to the operator.
The first device may comprise a projector or a display. The first action may include displaying the information to the operator.
The first device may comprise an electroacoustic transducer. The first action may include providing acoustic waves to the operator to convey the information to the operator.
The first action may include physical movement of part, or all, of the first device. For example, the first device may include, or be coupled to, one or more actuators (such as one or more servomotors) for moving part, or all, of the first device.
The first device may comprise a laser. The first action may include orienting and activating the laser to emit a laser beam towards a target.
The first device may comprise a robot. The first action may include operating the robot.
The controller may be configured to control a second device at the first volume, using the received user input data, to perform a second action, different to the first action.
The display may comprise a virtual reality (VR) headset.
The controller may be configured to receive environment data associated with the first volume.
The controller may be configured to: receive further user input data requesting information concerning a process to be performed by the operator; retrieve process information using the further user input data; and control the display to display the retrieved process information.
The controller may be configured to: receive additional user input data requesting a computer aided design (CAD) model of an article located within the first volume; retrieve the requested computer aided design (CAD) model using the additional user input data; and control the display to display at least a part of the retrieved computer aided design (CAD) model.
The controller may be configured to store the received position data and the received user input data in a memory.
The first volume may be at least a part of a manufacturing facility or at least a part of an electrical energy generation plant.
The operator may be a human or a robot.
According to various examples there is provided a method of enabling remote control of one or more devices, the method comprising: receiving position data for a sensed position of an operator within a first volume; controlling a display located in a second volume, different to the first volume, to display a representation of at least a portion of the operator within a representation of at least a portion of the first volume, using the received position data; receiving user input data from a user input device located in the second volume; and controlling a first device located at the first volume, using the received user input data, to perform a first action.
The first device may comprise a communication device. Controlling the first device may include controlling communication of information to the operator.
The first device may comprise a projector or a display. Controlling the first device may include controlling display of the information to the operator.
The first device may comprise an electroacoustic transducer. Controlling the first device may include controlling provision of acoustic waves to the operator to convey the information to the operator.
The first action may include physical movement of part, or all, of the first device.
The first device may comprise a laser. The first action may include orienting and activating the laser to emit a laser beam towards a target.
The first device may comprise a robot. The first action may include operating the robot.
The method may further comprise controlling a second device at the first volume, using the received user input data, to perform a second action, different to the first action.
The display may comprise a virtual reality (VR) headset.
The method may further comprise receiving environment data associated with the first volume.
The method may further comprise: receiving further user input data requesting information concerning a process to be performed by the operator; retrieving process information using the further user input data; and controlling the display to display the retrieved process information.
The method may further comprise: receiving additional user input data requesting a computer aided design (CAD) model of an article located within the first volume; retrieving the requested computer aided design (CAD) model using the additional user input data; and controlling the display to display at least a part of the retrieved computer aided design (CAD) model.
The method may further comprise controlling storage of the received position data and the received user input data in a memory.
The first volume may be at least a part of a manufacturing facility or at least a part of an electrical energy generation plant.
The operator may be a human or a robot.
According to various examples there is provided a computer program that, when read by a computer, causes performance of the method as described in any of the preceding paragraphs.
According to various examples there is provided a non-transitory computer readable storage medium comprising: computer readable instructions that, when read by a computer, cause performance of the method as described in any of the preceding paragraphs.
The skilled person will appreciate that except where mutually exclusive, a feature described in relation to any one of the above aspects may be applied mutatis mutandis to any other aspect. Furthermore except where mutually exclusive any feature described herein may be applied to any aspect and/or combined with any other feature described herein.
BRIEF DESCRIPTION
Embodiments will now be described by way of example only, with reference to the Figures, in which:
Fig. 1 illustrates a schematic diagram of apparatus for enabling remote control of one or more devices according to various examples;
Fig. 2 illustrates a flow diagram of a method of enabling remote control of one or more devices according to various examples;
Fig. 3 illustrates an example output from the display within the second volume illustrated in Fig. 1;
Fig. 4 illustrates a flow diagram of another method according to various examples; and
Fig. 5 illustrates a flow diagram of a further method according to various examples.
DETAILED DESCRIPTION
In the following description, the terms ‘connected’ and ‘coupled’ mean operationally connected and coupled. It should be appreciated that there may be any number of intervening components between the mentioned features, including no intervening components.
Fig. 1 illustrates a schematic diagram of apparatus 10 according to various examples. The apparatus includes a controller 12, a sensor arrangement 13 (including a first sensor 14, a second sensor 16, and a third sensor 18), a fourth sensor 20, a fifth sensor 22, a sixth sensor 24, a display 26, a projector 28, a laser apparatus 30, a robot 32, a seventh sensor 34, an eighth sensor 36, a ninth sensor 38, a user input device 40, and a display 42. Fig. 1 also illustrates an operator 44, an article 46, and a user 48.
In some examples, the apparatus 10 may be a module. As used herein, the wording ‘module’ refers to a device or apparatus where one or more features are included at a later time and, possibly, by another manufacturer or by an end user. For example, where the apparatus 10 is a module, the apparatus 10 may only include the controller 12, and the remaining features may be added by another manufacturer, or by an end user.
The sensor arrangement 13, the fourth sensor 20, the fifth sensor 22, the sixth sensor 24, the display 26, the projector 28, the laser apparatus 30, the robot 32, the operator 44 and the article 46 are located within or at a first volume 49. By way of an example, the first volume 49 may be at least a part of a manufacturing facility, at least a part of an electrical energy generation plant, or at least a part of a factory.
The seventh sensor 34, the eighth sensor 36, the ninth sensor 38, the user input device 40, the display 42, and the user 48 are located within, or at, a second volume 51. The second volume 51 is different to the first volume 49 and may be located in a different part of a building to the first volume 49; may be located in a different building to the first volume 49; may be located in a different part of a country to the first volume 49; or may be located in a different country to the first volume 49.
The controller 12 may be located in the first volume 49, or may be located in the second volume 51, or may be located in a different volume to the first volume 49 and the second volume 51 (that is, the controller 12 may be located in a third volume). For example, the controller 12 may be located in a different building, or may be located in a different part of a country, or may be located in a different country to the first volume 49 and/or the second volume 51. The controller 12 may be located at a single location, or may be distributed across a plurality of locations.
The controller 12, the sensor arrangement 13, the fourth sensor 20, the fifth sensor 22, the sixth sensor 24, the display 26, the projector 28, the laser apparatus 30, the robot 32, the seventh sensor 34, the eighth sensor 36, the ninth sensor 38, the user input device 40, and the display 42 may be coupled to one another via a wireless link and may consequently comprise transceiver circuitry and one or more antennas. Additionally or alternatively, the controller 12, the sensor arrangement 13, the fourth sensor 20, the fifth sensor 22, the sixth sensor 24, the display 26, the projector 28, the laser apparatus 30, the robot 32, the seventh sensor 34, the eighth sensor 36, the ninth sensor 38, the user input device 40, and the display 42 may be coupled to one another via a wired link and may consequently comprise interface circuitry (such as a Universal Serial Bus (USB) socket). It should be appreciated that the controller 12, the sensor arrangement 13, the fourth sensor 20, the fifth sensor 22, the sixth sensor 24, the display 26, the projector 28, the laser apparatus 30, the robot 32, the seventh sensor 34, the eighth sensor 36, the ninth sensor 38, the user input device 40, and the display 42 may be coupled to one another via any combination of wired and wireless links.
The controller 12 may comprise any suitable circuitry to cause performance of the methods described herein and as illustrated in Figs. 2, 4, and 5. The controller 12 may comprise: control circuitry; and/or processor circuitry; and/or at least one application specific integrated circuit (ASIC); and/or at least one field programmable gate array (FPGA); and/or single or multi-processor architectures; and/or sequential/parallel architectures; and/or at least one programmable logic controller (PLC); and/or at least one microprocessor; and/or at least one microcontroller; and/or a central processing unit (CPU); and/or a graphics processing unit (GPU), to perform the methods.
In various examples, the controller 12 may comprise at least one processor 50 and at least one memory 52. The memory 52 stores a computer program 54 comprising computer readable instructions that, when read by the processor 50, cause performance of the methods described herein, and as illustrated in Figs. 2, 4, and 5. The computer program 54 may be software or firmware, or may be a combination of software and firmware.
The processor 50 may include at least one microprocessor and may comprise a single core processor, may comprise multiple processor cores (such as a dual core processor or a quad core processor), or may comprise a plurality of processors (at least one of which may comprise multiple processor cores).
The memory 52 may be any suitable non-transitory computer readable storage medium, data storage device or devices, and may comprise a hard disk and/or solid state memory (such as flash memory). The memory 52 may be permanent non-removable memory, or may be removable memory (such as a universal serial bus (USB) flash drive or a secure digital card). The memory may include: local memory employed during actual execution of the computer program; bulk storage; and cache memories which provide temporary storage of at least some computer readable or computer usable program code to reduce the number of times code may be retrieved from bulk storage during execution of the code.
The computer program 54 may be stored on a non-transitory computer readable storage medium 56. The computer program 54 may be transferred from the non-transitory computer readable storage medium 56 to the memory 52. The non-transitory computer readable storage medium 56 may be, for example, a USB flash drive, a secure digital (SD) card, an optical disc (such as a compact disc (CD), a digital versatile disc (DVD) or a Blu-ray disc). In some examples, the computer program 54 may be transferred to the memory 52 via a signal 58 (which may be a wireless signal or a wired signal).
Input/output devices may be coupled to the controller 12 either directly or through intervening input/output controllers. Various communication adaptors may also be coupled to the controller 12 to enable the apparatus 10 to become coupled to other apparatus or remote printers or storage devices through intervening private or public networks. Modems and network adaptors are non-limiting examples of such communication adaptors.
The sensor arrangement 13 is configured to sense at least the position of at least the operator 44 within the first volume 49. The first sensor 14, the second sensor 16 and the third sensor 18 may comprise any suitable sensor, or sensors, for sensing the position of the operator 44. For example, the first sensor 14, the second sensor 16, and the third sensor 18 may be provided by any combination of three dimensional (3D) laser scanners, infrared sensors, and digital cameras (comprising charge coupled device (CCD) sensors, and/or complementary metal oxide semiconductor (CMOS) sensors). In some examples, the first sensor 14, the second sensor 16, and the third sensor 18 may be Microsoft Kinect v2 sensors. The first sensor 14, the second sensor 16, and the third sensor 18 may be positioned at various different locations within and/or around the first volume 49 to sense the position of at least the operator 44. The controller 12 is configured to receive position data from the sensor arrangement 13.
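The reception of position data from several sensors, as described above, can be sketched as follows. This is a minimal illustration, not the patent's implementation: the `PositionSample` type and the averaging in `fuse_positions` are assumptions, and a real system would weight readings by sensor confidence.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class PositionSample:
    """A single sensed position of the operator, reported by one sensor."""
    sensor_id: str
    x: float
    y: float
    z: float

def fuse_positions(samples):
    """Combine readings from several sensors (e.g. the first, second, and
    third sensors 14, 16, 18) into one estimated operator position by
    averaging each axis."""
    if not samples:
        raise ValueError("no position samples received")
    return (mean(s.x for s in samples),
            mean(s.y for s in samples),
            mean(s.z for s in samples))

# Example: three sensors report slightly different positions of the operator.
samples = [
    PositionSample("sensor_14", 1.0, 0.0, 2.0),
    PositionSample("sensor_16", 1.2, 0.1, 2.1),
    PositionSample("sensor_18", 1.1, -0.1, 1.9),
]
fused = fuse_positions(samples)
```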
It should be appreciated that the sensor arrangement 13 may include any number of sensors and the sensor arrangement 13 is not limited to only comprising the first sensor 14, the second sensor 16 and the third sensor 18. For example, the sensor arrangement 13 may only include a single sensor that may be fixed, or moveable within or around the first volume 49. In other examples, the sensor arrangement 13 may comprise a relatively large number of sensors (ten or more sensors for example) that are positioned around the perimeter of the first volume 49.
The fourth sensor 20, the fifth sensor 22 and the sixth sensor 24 are configured to sense one or more environment parameters associated with the first volume 49. For example, the fourth sensor 20, the fifth sensor 22 and the sixth sensor 24 may be any combination of digital thermometers, radiation sensors (such as Geiger counters for example), gas detectors (for example, a carbon dioxide detector or a carbon monoxide detector), and humidity sensors. The controller 12 is configured to receive environment data from the fourth sensor 20, the fifth sensor 22 and the sixth sensor 24.
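One plausible use of the environment data received by the controller 12 is checking each parameter against an acceptable range. The parameter names and limits below are hypothetical, chosen only to illustrate the idea.

```python
# Hypothetical environment parameters and limits; illustrative only.
ENV_LIMITS = {
    "temperature_c": (5.0, 40.0),
    "co2_ppm": (0.0, 1000.0),
    "humidity_pct": (20.0, 80.0),
}

def check_environment(readings):
    """Return the names of any environment parameters outside their limits."""
    out_of_range = []
    for name, value in readings.items():
        low, high = ENV_LIMITS[name]
        if not (low <= value <= high):
            out_of_range.append(name)
    return out_of_range

# Example readings from the fourth, fifth, and sixth sensors 20, 22, 24.
alerts = check_environment(
    {"temperature_c": 22.0, "co2_ppm": 1500.0, "humidity_pct": 55.0}
)
```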
It should be appreciated that the apparatus 10 may include more or less environment sensors than the fourth sensor 20, the fifth sensor 22 and the sixth sensor 24. In some examples, the apparatus 10 may not include any environment sensors.
The display 26 may be any suitable device for displaying information to the operator 44. For example, the display 26 may comprise a liquid crystal display (LCD), a light emitting diode (LED) display (such as an active matrix organic light emitting diode (AMOLED) display), a thin film transistor (TFT) display, or a plasma display. The display 26 may be static within the first volume 49 (for example, the display 26 may be mounted on a wall or on a desk) or may be moveable within the first volume 49. For example, the display 26 may be integrated within a headset or spectacles worn by the operator 44 and may be configured to provide an augmented reality experience for the operator 44. The controller 12 is configured to control the display 26 to display information.
The projector 28 is configured to display information to the operator 44 via a two dimensional (2D) image or via a holographic image. The controller 12 is configured to control the operation of the projector 28 to display information.
The laser apparatus 30 may comprise a laser and one or more actuators for changing the orientation and/or position of the laser. The controller 12 is configured to control the operation of the laser apparatus 30. For example, the controller 12 may activate and deactivate the laser and may re-orient and reposition the laser.
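The orient-and-activate behaviour of the laser apparatus 30 can be sketched as a small controller class. The class and method names are assumptions for illustration, not taken from the patent.

```python
import math

class LaserApparatus:
    """Minimal sketch of the laser apparatus 30: actuators set a pan/tilt
    orientation, and the laser can then be activated towards a target."""
    def __init__(self):
        self.active = False
        self.pan_deg = 0.0
        self.tilt_deg = 0.0

    def point_at(self, dx, dy, dz):
        """Orient the laser towards a target at offset (dx, dy, dz) metres."""
        self.pan_deg = math.degrees(math.atan2(dx, dz))
        self.tilt_deg = math.degrees(math.atan2(dy, math.hypot(dx, dz)))

    def activate(self):
        self.active = True

laser = LaserApparatus()
laser.point_at(1.0, 0.0, 1.0)   # target 45 degrees to the right, level
laser.activate()
```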
The robot 32 may be configured to perform various actions. For example, the robot 32 may comprise one or more machine tools for machining the article 46, and/or one or more digital cameras for obtaining images of the article 46, and/or one or more pointing devices for pointing to a part of the article 46. The controller 12 is configured to control the operation of the robot 32.
The operator 44 may be a human or may be a robot. For example, the operator 44 may be a person involved in the repair, manufacture, inspection, or assembly of the article 46. By way of another example, the operator 44 may be an autonomous robot or a semi-autonomous robot involved in the repair, manufacture, inspection or assembly of the article 46. In some examples, there may be a plurality of operators 44 within the first volume 49.
The apparatus 10 may additionally include a first electroacoustic transducer 60 located within the first volume 49, and a second electroacoustic transducer 62 located within the second volume 51. The first and second electroacoustic transducers 60, 62 may be any combination of: ‘landline’ telephones (that is, telephones connected to a public switched telephone network); mobile cellular telephones; and voice over internet protocol (VoIP) devices. The operator 44 and the user 48 may use the first and second electroacoustic transducers 60, 62 respectively to communicate with one another. In various examples, the first and second electroacoustic transducers 60, 62 may communicate via the controller 12, and the controller 12 may store a recording of the conversation in the memory 52.
The seventh sensor 34, the eighth sensor 36, and the ninth sensor 38 may comprise any suitable sensor or sensors for obtaining images and/or position data of the user 48. For example, the sensors 34, 36, 38 may include any combination of digital cameras, infrared sensors, lasers, and virtual reality peripheral network (VRPN) devices. In some examples, the sensors 34, 36, 38 may include a plurality of Kinect v2 sensors. The controller 12 is configured to receive the image data and/or the position data from the sensors 34, 36, 38.
The user input device 40 may comprise any suitable device or devices for enabling the user 48 to provide an input to the controller 12. For example, the user input device may comprise one or more of a keyboard, a keypad, a touchpad, a touchscreen display, and a computer mouse. The controller 12 is configured to receive signals from the user input device 40.
The display 42 may be any suitable device for displaying information to the user 48. For example, the display 42 may include one or more of: a liquid crystal display (LCD), a light emitting diode (LED) display (such as an active matrix organic light emitting diode (AMOLED) display), a thin film transistor (TFT) display, or a plasma display. The display 42 may be static within the second volume 51 (for example, the display 42 may be mounted on a wall or on a desk) or may be moveable within the second volume 51 (for example, the display 42 may be integrated within a headset or spectacles worn by the user 48). In some examples, the display 42 may be integrated within a virtual reality headset that may be worn by the user 48. The controller 12 is configured to control the display 42 to display information.
The user 48 is a human located within the second volume 51 and may be an expert on the activities being performed within the first volume 49 by the operator 44. For example, where the first volume 49 is at least a part of an aerospace facility, the user 48 may be an aerospace engineer. In some examples, there may be a plurality of users 48 within the second volume 51, each using a user input device 40 and a display 42.
The operation of the apparatus 10 is described in the following paragraphs with reference to Figs. 2 to 5.
At block 64, the method includes receiving position data for a sensed position of the operator 44 within the first volume 49. For example, the controller 12 may receive position data of the operator 44 and the article 46 within the first volume 49 from the sensor arrangement 13. The controller 12 may also receive environment data from one or more of the fourth sensor 20, the fifth sensor 22 and the sixth sensor 24 at block 64.
At block 66, the method includes controlling the display 42 located in the second volume 51 to display a representation of at least a portion of the operator 44 within a representation of at least a portion of the first volume 49 using the position data received at block 64. For example, the controller 12 may read data 68 stored in the memory 52 that defines a three dimensional model of the first volume 49. The controller 12 may then use the data received at block 64 to determine the position of the operator 44 and the article 46 within the three dimensional model of the first volume 49. The controller 12 may then control the display 42 to display a representation of the operator 44 and a representation of the article 46 within a representation of the first volume 49.
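The placement step described above — mapping sensed positions into the stored three-dimensional model of the first volume 49 (data 68) — can be sketched as follows, assuming for simplicity that the two coordinate frames differ only by a translation and a uniform scale. The `Scene` class is a hypothetical stand-in for the representation shown on the display 42.

```python
def to_model_coordinates(sensed, origin, scale=1.0):
    """Map a position sensed in the first volume into the coordinate
    frame of the stored three-dimensional model (data 68)."""
    return tuple(scale * (s - o) for s, o in zip(sensed, origin))

class Scene:
    """Minimal stand-in for the representation of the first volume:
    a dictionary of named entities and their model positions."""
    def __init__(self):
        self.entities = {}

    def place(self, name, model_position):
        self.entities[name] = model_position

# Place the operator 44 and the article 46 into the model.
scene = Scene()
scene.place("operator_44",
            to_model_coordinates((3.0, 1.0, 2.0), origin=(1.0, 1.0, 0.0)))
scene.place("article_46",
            to_model_coordinates((2.0, 1.0, 1.0), origin=(1.0, 1.0, 0.0)))
```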
In some examples, the controller 12 may be configured to determine the position of a plurality of body parts of the operator 44 from the position data received at block 64. For example, where the sensor arrangement 13 includes a plurality of Kinect v2 sensors, the controller 12 may determine the positions and orientations of the various body parts of the operator 44 and then control the display 42 so that the body parts are accurately positioned and oriented in the representation of the first volume 49. The user 48 may view the display 42 to view at least the representation of the operator 44 within the representation of the first volume 49.
Fig. 3 illustrates an example output from the display 42 within the second volume 51. The output includes a representation 49’ of the first volume 49, a representation 44’ of the operator 44 (which may also be referred to as an avatar), a representation 46’ of the article 46 (a fan blade in this example), and a representation 70’ of a machine tool (a linishing wheel in this example). The position and orientation of the representations 44’, 46’, 70’ within the representation 49’ of the first volume 49 accurately correspond to the position and orientation of the operator 44, the article 46 and the machine tool 70.
It should be appreciated that blocks 64 and 66 may be performed at a frequency so that at least the representation of the operator 44 is perceived as being updated continuously in the output of the display 42 by the user 48. For example, blocks 64 and 66 may be performed at a frequency of sixty hertz or higher so that the user 48 perceives the representation 44’ of the operator 44 to move fluidly.
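A fixed-rate loop that repeats blocks 64 and 66 may be sketched as follows. The callback names and the timing scheme are illustrative assumptions, not the apparatus's actual control software:

```python
import time

def run_update_loop(poll_sensors, render, rate_hz=60.0, duration_s=0.1):
    """Repeatedly receive position data (block 64) and update the display
    (block 66) at a fixed rate, so the representation of the operator is
    perceived as moving fluidly. Returns the number of frames produced."""
    period = 1.0 / rate_hz
    frames = 0
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        start = time.monotonic()
        render(poll_sensors())  # block 64, then block 66
        frames += 1
        # Sleep out the remainder of the frame period, if any.
        elapsed = time.monotonic() - start
        if elapsed < period:
            time.sleep(period - elapsed)
    return frames

# Illustrative run with stub callbacks for a twentieth of a second.
frames = run_update_loop(lambda: None, lambda data: None, rate_hz=60.0, duration_s=0.05)
```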
At block 72, the method includes receiving user input data from the user input device 40 located in the second volume 51. For example, the controller 12 may receive the user input data from the user input device 40.
The user 48 may operate the user input device 40 in order to change the view displayed in the display 42. For example, the user 48 may control the user input device 40 to zoom the view in or out, move the view laterally, or rotate the view displayed in the display 42. The controller 12 may receive the user input data and then control the display 42 to display an updated view of the representations of the first volume 49 and the operator 44 in accordance with the input to the user input device 40.
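The zoom, lateral move, and rotate operations amount to updating a camera state for the view in the display 42. A minimal sketch follows; the command names, the zoom floor, and the single yaw axis are illustrative assumptions:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class ViewState:
    """Camera state for the view shown on the display in the second volume."""
    zoom: float = 1.0      # magnification factor
    pan_x: float = 0.0     # lateral offset
    pan_y: float = 0.0
    yaw_deg: float = 0.0   # rotation about the vertical axis

def apply_view_command(view, command, amount):
    """Return an updated view in response to user input data (block 72)."""
    if command == "zoom":
        return replace(view, zoom=max(0.1, view.zoom * amount))
    if command == "pan":
        dx, dy = amount
        return replace(view, pan_x=view.pan_x + dx, pan_y=view.pan_y + dy)
    if command == "rotate":
        return replace(view, yaw_deg=(view.yaw_deg + amount) % 360.0)
    raise ValueError(f"unknown view command: {command}")

v = ViewState()
v = apply_view_command(v, "zoom", 2.0)
v = apply_view_command(v, "rotate", 90.0)
```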
At block 74, the method includes controlling a first device located at the first volume 49, using the user input data received at block 72, to perform a first action. For example, the controller 12 may control a first device located at the first volume 49 to present information to the operator 44 and/or to instruct the operator 44, and/or to manipulate one or more articles within the first volume 49.
For example, the controller 12 may use the received user input data to control the display 26 to display information to the operator 44. Where the display 26 is a head mounted display, the controller 12 may control the display 26 to overlay one or more images in the field of vision of the operator 44 so that the operator 44 experiences augmented reality. For example, the controller 12 may control the display 26 to overlay an arrow image in the field of vision of the operator 44, indicating a location where the operator 44 should perform machining. By way of a further example, the controller 12 may control the display 26 to display pre-recorded images of an operator performing a task to educate the operator 44 on how to perform that task.
By way of another example, the controller 12 may use the received user input data to control the projector 28 to project images to display information to the operator 44.
By way of a further example, the controller 12 may use the received user input data to control the operation of the laser apparatus 30. For example, the controller 12 may control an actuator (such as one or more servomotors) of the laser apparatus 30 to orient the laser in a direction specified by the user input data, and then activate the laser to emit a laser beam towards a target. In various examples, the emitted laser beam may be targeted towards a location on the article 46 that the operator 44 is to inspect and/or machine.
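Orienting the laser apparatus 30 towards a target point reduces to computing pan and tilt angles for its actuators. A sketch follows, assuming a pan/tilt mount, a y-up coordinate frame, and positions expressed in a common frame (all illustrative assumptions):

```python
import math

def aim_laser(laser_pos, target_pos):
    """Compute pan (azimuth) and tilt (elevation) angles, in degrees,
    that orient a pan/tilt-mounted laser from its mounting position
    towards a target point selected via the user input data.
    Coordinates are (x, y, z) with y as the vertical axis."""
    dx = target_pos[0] - laser_pos[0]
    dy = target_pos[1] - laser_pos[1]
    dz = target_pos[2] - laser_pos[2]
    pan = math.degrees(math.atan2(dz, dx))
    tilt = math.degrees(math.atan2(dy, math.hypot(dx, dz)))
    return pan, tilt

# Aim from a mount 2 m above the floor at a point level with the mount.
pan, tilt = aim_laser((0.0, 2.0, 0.0), (1.0, 2.0, 1.0))
```

The controller 12 would then command the servomotors to these angles before activating the laser.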
By way of another example, the controller 12 may use the received user input data to control the operation of the robot 32. For example, the controller 12 may control the robot 32 to perform machining on the article 46, and/or obtain images of the article 46, and/or control one or more pointing devices of the robot 32 for pointing to a part of the article 46 to direct the actions of the operator 44.
At block 76, the method may include controlling a second device at the first volume 49, using the received user input data, to perform a second action, different to the first action. For example, the controller 12 may control any one of: the display 26, the projector 28, the laser apparatus 30, and the robot 32 as described in the preceding paragraphs.
It should be appreciated that at block 76, the method may include controlling a plurality of devices at the first volume 49, using the received user input data, to perform a variety of actions. For example, the controller 12 may control the display 26, the projector 28, the laser apparatus 30, and the robot 32 using the received user input data at blocks 74 and 76.
At block 78, the method includes storing the received position data and the received user input data in a memory. For example, the controller 12 may store the position data received at block 64 and the user input data received at block 72 in the memory 52.
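Such storage may be sketched as appending timestamped records to a log, so that a session can later be replayed for quality review. The record schema and line-oriented JSON format are illustrative assumptions:

```python
import io
import json
import time

def record_session(stream, position_data, user_input_data):
    """Append one timestamped record of received position data (block 64)
    and user input data (block 72), one JSON object per line."""
    record = {
        "timestamp": time.time(),
        "position": position_data,
        "user_input": user_input_data,
    }
    stream.write(json.dumps(record) + "\n")

# Illustrative use with an in-memory stream in place of the memory 52.
buf = io.StringIO()
record_session(buf, {"operator": [1.0, 0.5, 2.0]}, {"command": "zoom"})
```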
In some examples, the seventh sensor 34, the eighth sensor 36, and the ninth sensor 38 may provide user input data to the controller 12 in addition to, or instead of, the user input device 40 (and consequently, the sensors 34, 36, 38 may be referred to as a user input device). Where the seventh sensor 34, the eighth sensor 36 and the ninth sensor 38 include Kinect v2 sensors or other virtual reality peripheral network (VRPN) devices, the user 48 may perform various gestures to provide user input data to the controller 12 and thereby control devices within or at the first volume 49. For example, the user 48 may perform gestures to control the orientation of the laser apparatus 30 and thus select the location where the laser beam is targeted.
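Translating recognised gestures into device commands can be as simple as a lookup table. The gesture names and command strings below are purely illustrative; they are not taken from the described apparatus:

```python
# Illustrative mapping from recognised gestures (e.g. from skeletal-tracking
# sensors) to commands for devices within or at the first volume.
GESTURE_COMMANDS = {
    "point": "orient_laser",
    "raise_hand": "pause_robot",
    "swipe_left": "previous_instruction",
}

def dispatch_gesture(gesture):
    """Translate a recognised gesture into a device command, returning
    None for gestures that have no mapping."""
    return GESTURE_COMMANDS.get(gesture)

command = dispatch_gesture("point")
```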
The method may then return to block 72 or may end.
Fig. 4 illustrates a flow diagram of another method according to various examples.
At block 80, the method includes receiving further user input data requesting information concerning a process to be performed by the operator 44. For example, the controller 12 may receive further user input data from the user input device 40 (and/or the sensors 34, 36, 38) requesting information concerning a process to be performed by the operator 44. By way of an example, the user 48 may receive a telephone call from the operator 44 asking for assistance with a process. The user 48 may then operate the user input device 40 to request information from an enterprise resource planning (ERP) tool.
In some examples, block 80 may additionally or alternatively include receiving further user input data requesting information from a centrally stored database. For example, the user 48 may operate the user input device 40 to request finite element analysis (FEA) results or product life cycle management (PLM) data.
At block 82, the method includes retrieving information using the further user input data received at block 80. For example, the controller 12 may retrieve information 84 (such as process information) stored in the memory 52 or in remote memory (cloud storage for example).
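The local-then-remote retrieval may be sketched as follows. The key names and the fallback interface are illustrative assumptions:

```python
def retrieve_information(key, local_store, remote_fetch=None):
    """Look up requested process information in local memory first,
    falling back to a remote store (e.g. cloud storage) if one is
    provided (illustrative of block 82)."""
    if key in local_store:
        return local_store[key]
    if remote_fetch is not None:
        return remote_fetch(key)
    raise KeyError(f"no information stored for {key!r}")

# Illustrative local store standing in for the memory 52.
local = {"linishing-procedure": "steps for linishing a fan blade"}
info = retrieve_information("linishing-procedure", local)
```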
At block 86, the method includes controlling the display 42 to display the retrieved information. For example, the controller 12 may control the display 42 to display the retrieved process information to the user 48. The user 48 may subsequently provide instructions to the operator 44 (for example, via the display 26) using the retrieved process information to assist the operator 44 with a task (as described in the preceding paragraphs with reference to blocks 72, 74, 76).
Fig. 5 illustrates a flow diagram of a further method according to various examples.
At block 88, the method includes receiving additional user input data requesting a computer aided design (CAD) model of an article located within the first volume 49. For example, the controller 12 may receive additional user input data from the user input device 40 (and/or the sensors 34, 36, 38) requesting a CAD model of the article 46 located within the first volume 49. By way of an example, the user 48 may wish to learn the internal structure of the article 46 in the first volume 49 and may operate the user input device 40 to request a CAD file of the article 46 from the controller 12.
At block 90, the method includes retrieving the requested computer aided design (CAD) model using the additional user input data. For example, the controller 12 may retrieve a computer aided design model 92 of the article 46 from the memory 52 or from remote memory.
At block 94, the method includes controlling the display 42 to display at least a part of the retrieved computer aided design (CAD) model. For example, the controller 12 may control the display 42 to display the CAD model 92 retrieved at block 90. The user 48 may subsequently provide instructions to the operator 44 (for example, via the display 26) using the retrieved CAD model (for example, to inform the operator 44 of the internal structure of the article 46).
The apparatus 10 may provide several advantages. First, the apparatus 10 may enable the user 48 to remotely view the actions of the operator 44 within the first volume 49 in real time and then remotely provide information and/or instructions to the operator 44 via one or more devices within the first volume 49. Second, the apparatus 10 may enable the user 48 to remotely manipulate physical objects in the first volume 49 using the robot 32. This may be particularly advantageous where the first volume 49 is an inhospitable environment (such as a nuclear hot cell). In such examples, the robot 32 may be the operator 44. Third, the apparatus 10 may enable the user 48 to view a wealth of information (such as processes, data, CAD models and so on) and then share that information with the operator 44 in a condensed format. Fourth, the apparatus 10 may store data (such as the position data of the operator 44 and/or the user 48) in the memory 52 or in remote memory to record best practice and enable a review of the quality of the actions taken.
It will be understood that the invention is not limited to the above-described embodiments, and various modifications and improvements can be made without departing from the concepts described herein. For example, the different embodiments may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment containing both hardware and software elements. By way of a further example, the controller 12 may be distributed and include a sensor server for collating the data from at least the sensor arrangement 13, and a virtual environment server for generating the output of the display 42 and for receiving data from the user input device 40, the seventh sensor 34, the eighth sensor 36, and the ninth sensor 38.
Except where mutually exclusive, any of the features may be employed separately or in combination with any other features and the disclosure extends to and includes all combinations and sub-combinations of one or more features described herein.

Claims (34)

1. Apparatus for enabling remote control of one or more devices, the apparatus comprising: a controller configured to: receive position data for a sensed position of an operator within a first volume; control a display located in a second volume, different to the first volume, to display a representation of at least a portion of the operator within a representation of at least a portion of the first volume, using the received position data; receive user input data from a user input device located in the second volume; and control a first device located at the first volume, using the received user input data, to perform a first action.
2. Apparatus as claimed in claim 1, wherein the first device comprises a communication device, and the first action includes communicating information to the operator.
3. Apparatus as claimed in claim 1 or 2, wherein the first device comprises a projector or a display, and the first action includes displaying the information to the operator.
4. Apparatus as claimed in claim 1 or 2, wherein the first device comprises an electroacoustic transducer, and the first action includes providing acoustic waves to the operator to convey the information to the operator.
5. Apparatus as claimed in claim 1, wherein the first action includes physical movement of part, or all, of the first device.
6. Apparatus as claimed in claim 1 or 5, wherein the first device comprises a laser, and the first action includes orienting and activating the laser to emit a laser beam towards a target.
7. Apparatus as claimed in claim 1 or 5, wherein the first device comprises a robot, and the first action includes operating the robot.
8. Apparatus as claimed in any of the preceding claims, wherein the controller is configured to control a second device at the first volume, using the received user input data, to perform a second action, different to the first action.
9. Apparatus as claimed in any of the preceding claims, wherein the display comprises a virtual reality (VR) headset.
10. Apparatus as claimed in any of the preceding claims, wherein the controller is configured to receive environment data associated with the first volume.
11. Apparatus as claimed in any of the preceding claims, wherein the controller is configured to: receive further user input data requesting information concerning a process to be performed by the operator; retrieve process information using the further user input data; and control the display to display the retrieved process information.
12. Apparatus as claimed in any of the preceding claims, wherein the controller is configured to: receive additional user input data requesting a computer aided design (CAD) model of an article located within the first volume; retrieve the requested computer aided design (CAD) model using the additional user input data; and control the display to display at least a part of the retrieved computer aided design (CAD) model.
13. Apparatus as claimed in any of the preceding claims, wherein the controller is configured to store the received position data and the received user input data in a memory.
14. Apparatus as claimed in any of the preceding claims, wherein the first volume is at least a part of a manufacturing facility or at least a part of an electrical energy generation plant.
15. Apparatus as claimed in any of the preceding claims, wherein the operator is a human or a robot.
16. Apparatus substantially as described herein with reference to and as illustrated in the accompanying figures.
17. A method of enabling remote control of one or more devices, the method comprising: receiving position data for a sensed position of an operator within a first volume; controlling a display located in a second volume, different to the first volume, to display a representation of at least a portion of the operator within a representation of at least a portion of the first volume, using the received position data; receiving user input data from a user input device located in the second volume; and controlling a first device located at the first volume, using the received user input data, to perform a first action.
18. A method as claimed in claim 17, wherein the first device comprises a communication device, and controlling the first device includes controlling communication of information to the operator.
19. A method as claimed in claim 17 or 18, wherein the first device comprises a projector or a display, and controlling the first device includes controlling display of the information to the operator.
20. A method as claimed in claim 17 or 18, wherein the first device comprises an electroacoustic transducer, and controlling the first device includes controlling provision of acoustic waves to the operator to convey the information to the operator.
21. A method as claimed in claim 17, wherein the first action includes physical movement of part, or all, of the first device.
22. A method as claimed in claim 17 or 21, wherein the first device comprises a laser, and the first action includes orienting and activating the laser to emit a laser beam towards a target.
23. A method as claimed in claim 17 or 21, wherein the first device comprises a robot, and the first action includes operating the robot.
24. A method as claimed in any of claims 17 to 23, further comprising controlling a second device at the first volume, using the received user input data, to perform a second action, different to the first action.
25. A method as claimed in any of claims 17 to 24, wherein the display comprises a virtual reality (VR) headset.
26. A method as claimed in any of claims 17 to 25, further comprising receiving environment data associated with the first volume.
27. A method as claimed in any of claims 17 to 26, further comprising: receiving further user input data requesting information concerning a process to be performed by the operator; retrieving process information using the further user input data; and controlling the display to display the retrieved process information.
28. A method as claimed in any of claims 17 to 27, further comprising: receiving additional user input data requesting a computer aided design (CAD) model of an article located within the first volume; retrieving the requested computer aided design (CAD) model using the additional user input data; and controlling the display to display at least a part of the retrieved computer aided design (CAD) model.
29. A method as claimed in any of claims 17 to 28, further comprising controlling storage of the received position data and the received user input data in a memory.
30. A method as claimed in any of claims 17 to 29, wherein the first volume is at least a part of a manufacturing facility or at least a part of an electrical energy generation plant.
31. A method as claimed in any of claims 17 to 30, wherein the operator is a human or a robot.
32. A method substantially as described herein with reference to and as illustrated in the accompanying figures.
33. A computer program that, when read by a computer, causes performance of the method as claimed in any of claims 17 to 32.
34. A non-transitory computer readable storage medium comprising: computer readable instructions that, when read by a computer, cause performance of the method as claimed in any of claims 17 to 32.
GB1605835.6A 2016-04-06 2016-04-06 Apparatus, methods, computer programs, and non-transitory computer readable storage mediums for enabling remote control of one or more devices Active GB2549264B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
GB1605835.6A GB2549264B (en) 2016-04-06 2016-04-06 Apparatus, methods, computer programs, and non-transitory computer readable storage mediums for enabling remote control of one or more devices
US15/455,721 US10606340B2 (en) 2016-04-06 2017-03-10 Apparatus, methods, computer programs, and non-transitory computer readable storage mediums for enabling remote control of one or more devices
CN201710220981.4A CN107422686B (en) 2016-04-06 2017-04-06 Apparatus for enabling remote control of one or more devices


Publications (2)

Publication Number Publication Date
GB2549264A true GB2549264A (en) 2017-10-18
GB2549264B GB2549264B (en) 2020-09-23

Family

ID=59895510

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1605835.6A Active GB2549264B (en) 2016-04-06 2016-04-06 Apparatus, methods, computer programs, and non-transitory computer readable storage mediums for enabling remote control of one or more devices

Country Status (3)

Country Link
US (1) US10606340B2 (en)
CN (1) CN107422686B (en)
GB (1) GB2549264B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109689992B (en) * 2016-09-14 2021-09-03 埃雷兹·哈拉米 Monitoring system and method for controlling a construction surface treatment process
CN108881161B (en) * 2018-05-11 2022-02-25 深圳增强现实技术有限公司 Remote work guidance method and system based on AR semantic tags
US11938907B2 (en) 2020-10-29 2024-03-26 Oliver Crispin Robotics Limited Systems and methods of servicing equipment
US11915531B2 (en) 2020-10-29 2024-02-27 General Electric Company Systems and methods of servicing equipment
US11874653B2 (en) 2020-10-29 2024-01-16 Oliver Crispin Robotics Limited Systems and methods of servicing equipment
US11935290B2 (en) * 2020-10-29 2024-03-19 Oliver Crispin Robotics Limited Systems and methods of servicing equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030179292A1 (en) * 2000-03-16 2003-09-25 Herve Provost Home-based remote medical assistance
US20040189675A1 (en) * 2002-12-30 2004-09-30 John Pretlove Augmented reality system and method
JP2008107871A (en) * 2006-10-23 2008-05-08 Yoji Shimizu Remote operator instruction system
WO2009036782A1 (en) * 2007-09-18 2009-03-26 Vrmedia S.R.L. Information processing apparatus and method for remote technical assistance
US20130231779A1 (en) * 2012-03-01 2013-09-05 Irobot Corporation Mobile Inspection Robot
WO2013176758A1 (en) * 2012-05-22 2013-11-28 Intouch Technologies, Inc. Clinical workflows utilizing autonomous and semi-autonomous telemedicine devices
EP2818948A1 (en) * 2013-06-27 2014-12-31 ABB Technology Ltd Method and data presenting device for assisting a remote user to provide instructions
WO2014208169A1 (en) * 2013-06-26 2014-12-31 ソニー株式会社 Information processing device, control method, program, and recording medium

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000055714A1 (en) * 1999-03-15 2000-09-21 Varian Semiconductor Equipment Associates, Inc. Remote assist system
US20020010734A1 (en) * 2000-02-03 2002-01-24 Ebersole John Franklin Internetworked augmented reality system and method
WO2007066166A1 (en) * 2005-12-08 2007-06-14 Abb Research Ltd Method and system for processing and displaying maintenance or control instructions
WO2014076919A1 (en) * 2012-11-15 2014-05-22 パナソニック株式会社 Information-providing method and information-providing device
CN104076949B (en) * 2013-03-29 2017-05-24 华为技术有限公司 Laser pointer beam synchronization method and related equipment and system
CN104506621B (en) * 2014-12-24 2018-07-31 北京佳讯飞鸿电气股份有限公司 A method of remote guide is carried out using video labeling
CN104660995B (en) * 2015-02-11 2018-07-31 尼森科技(湖北)有限公司 A kind of disaster relief rescue visible system
US9626850B2 (en) * 2015-09-02 2017-04-18 Vivint, Inc. Home automation communication system


Also Published As

Publication number Publication date
GB2549264B (en) 2020-09-23
US10606340B2 (en) 2020-03-31
CN107422686A (en) 2017-12-01
US20170293275A1 (en) 2017-10-12
CN107422686B (en) 2022-05-03

Similar Documents

Publication Publication Date Title
US10606340B2 (en) Apparatus, methods, computer programs, and non-transitory computer readable storage mediums for enabling remote control of one or more devices
US10685489B2 (en) System and method for authoring and sharing content in augmented reality
US11277655B2 (en) Recording remote expert sessions
US20200388080A1 (en) Displaying content in an augmented reality system
JP6895539B2 (en) Project planning and adaptation based on feasibility analysis
US9836483B1 (en) Using a mobile device for coarse shape matching against cloud-based 3D model database
US8902254B1 (en) Portable augmented reality
US9008839B1 (en) Systems and methods for allocating tasks to a plurality of robotic devices
JP6255706B2 (en) Display control apparatus, display control method, display control program, and information providing system
JP2020017264A (en) Bidirectional real-time 3d interactive operations of real-time 3d virtual objects within real-time 3d virtual world representing real world
US20180046363A1 (en) Digital Content View Control
US20170092000A1 (en) Method and system for positioning a virtual object in a virtual simulation environment
US20160227868A1 (en) Removable face shield for augmented reality device
US10105847B1 (en) Detecting and responding to geometric changes to robots
KR102625014B1 (en) Aiding maneuvering of obscured objects
CN103310378A (en) Method and apparatus for monitoring operation of system asset
EP3631712A1 (en) Remote collaboration based on multi-modal communications and 3d model visualization in a shared virtual workspace
JP6538760B2 (en) Mixed reality simulation apparatus and mixed reality simulation program
US10783710B2 (en) Configuration of navigational controls in geometric environment
US20210272269A1 (en) Control device, control method, and program
US10563979B2 (en) Methods, apparatus, computer programs and non-transitory computer readable storage mediums for controlling a robot within a volume
Berglund et al. Virtual reality and 3D imaging to support collaborative decision making for adaptation of long-life assets
EP4114620A1 (en) Task-oriented 3d reconstruction for autonomous robotic operations
US10155273B1 (en) Interactive object fabrication
US20220043455A1 (en) Preparing robotic operating environments for execution of robotic control plans