SE1300676A1 - Procedure and data presentation arrangements to assist a remote user to provide instructions to a local user
- Publication number
- SE1300676A1
- Authority
- SE
- Sweden
- Prior art keywords
- projector
- presentation
- camera
- data
- remote user
- Prior art date
Classifications
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention relates to a method, a data presentation arrangement and a computer program product for assisting a remote user to give instructions to a local user at a location in an industrial site where a process control system operates. The data presentation arrangement comprises a control unit for controlling a data presentation device (32) comprising a projector. The control unit receives projector control commands from the remote user, where the projector control commands include a command to project a presentation item (PI2, PI3) into a presentation area (PA) of the projector, controls the projector using the projector control commands to project the presentation item (PI2, PI3) into the presentation area (PA), detects that the local user is in the projector's field of view, and changes the projection based on the detection to avoid the local user blocking the projection.
Description
downtimes in the process control system as the expert might have to travel long distances to get to the site.
Some efforts have been made for improving on the situation. US 2011/0310122 does for instance describe a remote instruction system provided in relation to circuit boards. In this system an image is captured of an object, and an annotation image and an attention image are projected onto such an object.
Similar systems are disclosed in JP 2009-194697 and JP 2003-209832.
However, there is still room for improvement within the field.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will in the following be described with reference being made to the accompanying drawings, where
Fig. 1 schematically shows an industrial plant with a process control system operating an industrial process together with a data presenting device,
Fig. 2 schematically shows a block schematic of units inside a housing of the data presenting device,
Fig. 3 shows a perspective view of the data presenting device in the form of the housing on a tripod,
Fig. 4a and 4b show perspective views of the data presenting device indicating various possibilities of movement of the housing,
Fig. 5 schematically shows the data presenting device communicating with a computer of a remote user via the Internet,
Fig. 6 schematically shows the remote user with his computer on which video of a location in the process control system is shown,
Fig. 7 schematically shows the use of a camera of the data presenting device for capturing a video of part of a process control object at the location,
Fig. 8 schematically shows the presentation of the video together with a three-dimensional view of the location and the data presenting device,
Fig. 9 schematically shows the location with the data presenting device providing a presentation area in which a first and a second presentation item are provided,
Fig. 10a and 10b schematically show the location with the data presenting device and presentation items when the presentation area is being moved,
Fig. 11 schematically shows the location with the data presenting device providing a projecting area in which the second and a third presentation item are provided,
Fig. 12 schematically shows the location with the data presenting device and the second and third presentation items when the local user at least partially blocks the presentation area,
Fig. 13 schematically shows a first way to move the presentation area in case of local user blocking,
Fig. 14 schematically shows a second way to move the presentation area in case of local user blocking, and
Fig. 15 schematically shows a data carrier with computer program code, in the form of a CD-ROM disc, for implementing a control unit of the data presenting device.
DETAILED DESCRIPTION OF THE INVENTION
This invention presents a way for a remote user to gather relevant data and provide instructions and directions for local engineers at a location of an industrial plant where a process control system operates.
Fig. 1 schematically shows an industrial plant where a process control system 10 is provided. The process control system 10 is a computerized process control system for controlling an industrial process. The process can be any type of industrial process, such as electrical power generation, transmission and distribution processes as well as water purification and distribution processes, oil and gas production and distribution processes, petrochemical, chemical, pharmaceutical and food processes, and pulp and paper production processes. These are just some examples of processes where the system can be applied. There exist countless other industrial processes. The processes may also be other types of industrial processes such as the manufacturing of goods. A process may be monitored through one or more process monitoring computers, which communicate with a server handling monitoring and control of the process.
In fig. 1 the process control system 10 therefore includes a number of process monitoring computers 12 and 14. These computers may here also be considered to form operator terminals and are connected to a first data bus B1. There is also a communication handling device 16 connected to this first data bus B1, which communication handling device 16 is connected to at least one wireless network WN. The communication handling device 16 is also connected to a public data communication network, which here is the internet IN.
The communication handling device 16 therefore has a wireless interface 17 for communicating with the wireless network WN, a computer network interface 18, for instance in the form of an Ethernet interface, for communicating with the Internet, and a local interface 19, perhaps also in the form of an Ethernet interface, for communicating with other entities in the process control system 10. The communication handling device 16 also has a communication handling unit 20 connected to all three interfaces 17, 18 and 19. The wireless interface 17 is thereby a first communication interface for communicating with wireless devices, the computer network interface 18 a second communication interface for communicating with a device of a remote user, and the local interface a third communication interface for communicating with devices of the process control system 10. The communication handling unit 20 has the function of providing a gateway between the process control system and other networks, such as the Internet or the wireless network WN, if this is external to the process control system 10. However, in some variations of the invention it also has another function, and that is the function of controlling a communication session.
To the wireless network WN there is connected a data presenting device 32. The wireless network WN may be a local network, such as a wireless local area network (WLAN). It may also be a Bluetooth network, i.e. a network with a number of interconnected Bluetooth nodes. It may also be a mobile communication network.
There is furthermore a second data bus B2, and between the first and second data busses B1 and B2 there are connected a server 23 providing control and protection of the process and a database 22 where data relating to control and protection of the process is stored. Such data relating to control may here comprise process data such as measurements and control commands, while data relating to protection may comprise alarm and event data as well as data on which alarms and events can be generated, such as measurements made in the process. It may also provide face plates of process control objects, which face plates may comprise process control data from the database 22 regarding the process control object. There is furthermore an optional object data server 21 connected between the two buses B1 and B2. The object data server 21 comprises data about the process control objects, such as instructions, manuals and blueprints.
To the second data bus B2 there is furthermore connected a number of further devices 24, 26, 28 and 30. These further devices 24, 26, 28 and 30 are field devices, which are devices that are interfaces to the process being controlled. A field device is typically an interface via which measurements of the process are being made and to which control commands are given.
Because of this the field devices are furthermore process control objects. In one variation of the invention a first field device is a first process control object 24, a second field device is a second process control object 26 and a third field device is a third process control object 28.
Fig. 2 shows a block schematic of a number of units that are provided in the data presenting device 32. The data presenting device 32 is provided with a housing 49. In the housing 49 there is provided a bus 33, and to this bus 33 there is connected an optional short range communication unit 46 or proximity sensor, a video projector 48, a camera 34, a recording controller 36, a program memory 39, a processor 40 as well as a radio communication circuit 42. It may also comprise at least one further sensor, for instance a temperature sensor, accelerometer, ambient light sensor and gyroscope (not shown). The radio communication circuit 42 is furthermore connected to an antenna 44, where the radio communication circuit 42 and antenna 44 are provided for communication with the wireless network WN. The radio communication circuit 42 and antenna 44 together form one type of communication interface for communicating with the process control system as well as with other entities. It may for this reason be a WiFi or WLAN interface. It may also be a mobile communication interface. It should also be realized that there may be two communication interfaces in the data presenting device, one mobile communication interface and one WiFi interface. The recording controller 36 is in turn connected to a microphone 35.
The recording controller 36 and microphone 35 together form a recording unit that may be used for recording sound in a location of the process control system.
Although it is not shown, the data presenting device 32 may also comprise sound emitting units such as speakers and earphones. It is also possible that a microphone and earphones are combined into a headset connected to the data presenting device 32. The short range communication unit 46 may also be regarded as a type of sensor, an object sensor or proximity sensor, for sensing a process control object to be serviced. This sensor may be implemented through Near Field Communication (NFC) technique.
In the program memory 39 there is provided software code which when being run by the processor 40 forms a control element 38. The control element 38 is more particularly configured to perform a number of functions under the control of a remote user.
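The behaviour ascribed to the control element above and in the abstract — projecting items on command, detecting the local user in the projector's field of view, and changing the projection to avoid blocking — can be outlined in code. The following is an illustrative sketch only, not part of the disclosed implementation: all class and method names, the planar coordinate model and the detection radius are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class PresentationArea:
    """Centre of the projector's presentation area (PA), in assumed plant coordinates."""
    x: float
    y: float

class ControlElement:
    """Hypothetical sketch of the control element (38): it projects presentation
    items on commands from the remote user and moves the presentation area
    when the local user would block the projection."""

    def __init__(self, area: PresentationArea):
        self.area = area
        self.items = []  # projected presentation items (PI2, PI3, ...)

    def handle_command(self, item: str) -> None:
        # Projector control command from the remote user: project an item in PA.
        self.items.append(item)

    def local_user_detected(self, user_x: float, user_y: float,
                            radius: float = 1.0) -> bool:
        # Assumed detection rule: the user blocks the projection if closer
        # than `radius` to the centre of the presentation area.
        dx, dy = user_x - self.area.x, user_y - self.area.y
        return dx * dx + dy * dy <= radius * radius

    def avoid_blocking(self, user_x: float, user_y: float) -> None:
        # Change the projection based on the detection: shift the
        # presentation area away from the local user.
        if self.local_user_detected(user_x, user_y):
            self.area = PresentationArea(self.area.x + 2.0, self.area.y)
```

The shift distance and direction are arbitrary here; the description later presents two different ways of moving the presentation area when the local user blocks it (fig. 13 and 14).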
The control element 38 is in one variation of the invention a control unit of a data presenting arrangement. In this case that data presenting arrangement comprises the data presenting device 32. In another variation of the invention the communication handling unit 20 forms a control unit. In this other variation the data presenting arrangement comprises the communication handling device 16 of fig. 1, possibly together with the data presenting device 32.
In the following the control unit will be described when being implemented by the control element. It should however be realized that the same functions described as being performed by the control element could be performed by the communication handling unit.
The data presenting device 32 may be moved within the premises of the industrial plant. It may thus be moved from one location to another location. It may also be placed so that it will be able to capture video images and present digital presentations via the projector 48.
For this reason the housing 49 may be placed on a tripod 50, which is schematically shown in fig. 3. It may also be provided with means for movement, such as wheels and a motor controlling the movement. The camera 34 has a field of view, i.e. an area in which it detects its environment. This field of view may be changed in different ways. It may be increased through zooming out commands and it may be decreased through zooming in commands. The field of view may also be shifted or moved using various types of pan commands. In order to obtain panning, the orientation of the camera may be changed. In a similar manner the projector has a presentation area or projection area, i.e. an area within which it is able to visually present information. The presentation area may be centred on a line of sight of the projector and may have any suitable shape, such as circular, rectangular or quadratic. Also this presentation area may be moved through changing the orientation of the projector. The camera 34 may change its orientation in three dimensions. Also the projector 48 may change its orientation in three dimensions. They may furthermore be independently changeable. In one variation the orientations may be changed jointly through the whole housing 49 being able to change orientation. Fig. 4a and 4b schematically show movement achieving such reorientation. It can be seen that the housing 49 may be rotated in a horizontal plane 360 degrees around a vertical axis of rotation of the tripod 50. It can also be seen that the housing 49 may be tilted vertically upwards or downwards. In order to obtain such movement the data presenting device 32 may be provided with at least one motor. As was mentioned above, it may also be provided with more than one motor, for instance one for providing vertical movement and another for providing horizontal movement, in addition to the motor for moving the optional wheels (not shown). In order to obtain separate movements of the camera and projector, there may also be two such pairs of motors provided, where one pair is provided for the camera 34 and the other for the projector 48.
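The reorientation described above — full horizontal rotation around the tripod's vertical axis and bounded vertical tilt — can be sketched as a simple pan/tilt controller. This is illustrative only; the class name and the tilt limits are assumptions, as the disclosure does not specify mechanical angle ranges.

```python
class OrientationController:
    """Hypothetical pan/tilt sketch for the housing (49) on the tripod (50),
    as indicated in fig. 4a and 4b: 360-degree rotation in the horizontal
    plane and clamped vertical tilt."""

    TILT_MIN, TILT_MAX = -45.0, 45.0  # assumed tilt range in degrees

    def __init__(self):
        self.pan = 0.0   # horizontal angle around the vertical axis, degrees
        self.tilt = 0.0  # vertical tilt angle, degrees

    def rotate(self, degrees: float) -> float:
        # Horizontal rotation wraps around: the housing may turn a full 360.
        self.pan = (self.pan + degrees) % 360.0
        return self.pan

    def tilt_by(self, degrees: float) -> float:
        # Vertical tilt is clamped to the assumed mechanical range of the motor.
        self.tilt = max(self.TILT_MIN, min(self.TILT_MAX, self.tilt + degrees))
        return self.tilt
```

With separate motor pairs for the camera 34 and the projector 48, one such controller per unit would allow the independent orientations the description mentions.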
These separate movements may furthermore be provided while the camera and projector are still provided inside the same housing.
As was indicated above, the projector 48 may change orientation independently of the camera. The camera 34 and projector 48 may thus point in different directions.
As can be seen in fig. 1 the data presenting device 32 may access the Internet IN via the wireless network WN and the communication handling device 16. This allows the data presenting device 32 to be operated remotely, i.e. from some other site than the plant. The data presenting device 32 may thereby be operated by a remote user, for instance via a computer 51 of a remote user 52 via a communication session set up under the control of the communication handling device 16. This situation is schematically shown in fig. 5. Here it can be seen that a computer 51 of the remote user 52 may communicate with the data presenting device 32 via the communication handling device 16.
The communication handling device 16 may also be accessed from outside of the process control system via the Internet IN. This allows a local user of the data presenting device 32 to engage in a communication session with at least one other user, which other user may be a remote user. It is then possible that video captured by the data presenting device 32 is being communicated to other participants of the communication session and especially to the remote user. The communication session will then be set up between the data presenting device 32 and the computer 51 of the remote user via the communication handling device 16, thereby making the local user of the data presenting device 32 and the remote user into participants of the communication session.
The communication of the session thus passes through the communication handling device 16, which allows the communication handling device 16 to control the communication session. In order to provide the communication session, the communication handling unit 20 of the communication handling device 16 may set up a first communication channel to the local user and at least one second communication channel to the remote user.
Thereby the remote user 52 may be able to obtain video images captured by the camera 34 of the data presenting device 32, which video images are then presented to the remote user 52 via the display of his or her computer 51. This is shown in fig. 6. Fig. 5 schematically indicates the transmission of a video stream VS, a three-dimensional model 3DM and camera data CD from the data presenting device 32 to the computer 51 of the remote user 52 via the two communication channels of the session. Information from the sensors may also be sent wirelessly via the communication handling device to the remote user.
Information provided by the remote user back to the plant may also be sent via the communication handling device. More information about the transmission will be given shortly.
Now some variations of the invention will be described in more detail.
In industry, for instance in an industrial plant where a process is being run by a process control system, it is very important to keep production running at all times. Even a minor halt in production will result in wasted resources, reduced production output, substandard product quality or wasted energy, and will also cost large amounts of money. Because of lost production, waste and other problems caused by unplanned stops or slowdowns, maintenance is seen as very important in order to keep the production up and running.
Maintenance can be very expensive as it is sometimes necessary to bring in external experts to help with advanced operations the regular personnel might not have the expertise to handle by themselves.
Collaboration over a telephone line between a local worker and a remote expert is often not good enough. The expert may need to see what happens on site and may need to be able to instruct the personnel on site without the risk of any misinterpretation. Sending pictures back and forth is also a slow way of sharing information, so this is not really good either.
- It can thus take a long time to get hold of the correct expert and fly this expert to the site.
- In the case of an unexpected breakdown requiring help from an external expert in order to continue production, this can lead to long downtimes as the expert might have to travel long distances to get to the site.
- Flying in an external expert can be very expensive, not only because of the costs associated with the expert (travel, accommodation etc.), but also because a halt to production while the plant personnel are waiting for the expert to arrive can be very costly for the owner.
The above-mentioned situation is solved through the use of the data presenting device.
In operation, i.e. when there is some kind of problem at a location in the plant, the data presenting device is brought out to this location of the industrial site and placed at a position where assistance is needed. The device may for instance be placed in the centre of a room. The data presenting device may be placed at this location by a local user in order to be used for solving a problem at the location, for instance the fact that one or more of the machines or process control objects may be faulty or that the process behaves strangely at the location.
As the device is brought to the location a number of activities may thus be performed.
In a first variation the remote user is provided with contextual data in relation to a video stream. This first variation will now be described with reference to fig. 7 and 8, where fig. 7 schematically shows the use of a camera of the data presenting device for capturing a video of a part of a process control object at a location of the industrial site, and fig. 8 schematically shows the presentation of the video together with a three-dimensional view of the location and the data presenting device.
According to the first variation, the control element 38 first makes the data presenting device 32 scan the area at the location. This may be done through the control element 38 controlling a motor to rotate the housing 49 around a vertical rotational axis, combined with controlling a motor to tilt the housing 49 up and down with different tilt angles, as shown in fig. 4a and 4b. In this way a three-dimensional space around the position of the data presenting device is captured in different video images using the camera 34.
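The scan pattern described above can be sketched as follows. This is a minimal illustration only; the step counts, angle ranges and the idea of enumerating (pan, tilt) capture poses are assumptions, not taken from the patent, and the real device would drive its motors and camera through hardware interfaces.

```python
def scan_location(rotate_steps=12, tilt_angles=(-30, 0, 30)):
    """Yield (pan, tilt) capture poses covering a full rotation of the
    housing at several tilt angles, so the camera sweeps the 3D space
    around the device's position. Angles are in degrees."""
    poses = []
    for step in range(rotate_steps):
        pan = step * (360 / rotate_steps)   # rotation around the vertical axis
        for tilt in tilt_angles:            # tilting the housing up and down
            poses.append((pan, tilt))
    return poses

poses = scan_location()
# 12 pan positions x 3 tilt angles = 36 capture poses
```

At each returned pose the camera would capture one video image, which together cover the space around the device.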
After the area has been scanned, the control element 38 analyses the captured images and investigates whether it recognises them with regard to a pre-existing three-dimensional model of the location and the objects at this location, i.e. of the process control objects and possible other objects present at the location. If it recognises the video images, and there therefore is a pre-existing model, then this model is fetched. The pre-existing three-dimensional model may be provided in the data presenting device 32. As an alternative it may be obtained or fetched from a server, such as server 21. If there is a pre-existing model, then data about a previous position of the data presenting device at the location, as well as camera orientations and video streams recorded when the data presenting device was placed at this previous position, may be stored together with the model. This previous position data and the associated historic video streams may also be fetched. If any three-dimensional model has been made of the location, it is thus fetched. However, if there was no pre-existing model, a new three-dimensional model 3DM of the location and the various objects in it is created by the control element 38. A model may for instance be created using augmented reality functionality. If the data presenting device comprises an infrared sensor it is also possible to use infrared technology, such as Microsoft Kinect. A 3D map of natural features at the location can be built using a variety of feature extraction methods, such as corner or edge detection, both with 2D RGB data and 3D RGBD (Red, Green, Blue, Depth) data. Using this sparse map it is also possible to determine the location of the data presenting device 32 with camera 34, as well as the orientation or pose of the camera 34. It is thus possible to determine in which direction the camera 34 is pointing.
The orientation may be calculated based on registration algorithms. These algorithms can be used to locate the features of a current frame or video image in the map of the real world, and based on this the orientation of the camera 34 may be determined.
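The registration step above can be illustrated with a deliberately simplified sketch: given feature points from the stored map and the matching feature points found in the current frame, a Kabsch-style alignment recovers the rotation between the two point sets. A real system would register 3D features and recover a full pose; the planar two-dimensional case below, and the function name, are illustrative assumptions only.

```python
import math

def estimate_rotation(map_pts, frame_pts):
    """Recover the camera's in-plane rotation (degrees) by registering
    matched 2-D feature points of the current frame against the stored
    map. Translation is removed by centring both point sets first."""
    n = len(map_pts)
    # centre both point sets on their centroids
    mcx = sum(p[0] for p in map_pts) / n
    mcy = sum(p[1] for p in map_pts) / n
    fcx = sum(p[0] for p in frame_pts) / n
    fcy = sum(p[1] for p in frame_pts) / n
    # accumulate cross and dot products of the centred correspondences
    cross = dot = 0.0
    for (mx, my), (fx, fy) in zip(map_pts, frame_pts):
        ax, ay = mx - mcx, my - mcy
        bx, by = fx - fcx, fy - fcy
        cross += ax * by - ay * bx
        dot += ax * bx + ay * by
    # optimal rotation angle for 2-D point-set registration
    return math.degrees(math.atan2(cross, dot))

theta = estimate_rotation([(1, 0), (0, 1), (-1, 0)], [(0, 1), (-1, 0), (0, -1)])
# theta is close to 90.0 for a quarter-turn of the camera
```

The same principle, extended to three dimensions and combined with depth data, lets the control element determine in which direction the camera is pointing.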
The process control objects, i.e. the real world objects, may be provided with object identifiers, such as NFC tags or bar codes. If these are read it is possible to obtain information about what types of objects they are. The type may be identified through the camera 34 detecting a visual object identifier, like a bar code. As an alternative the short-range communication unit may be set to read a tag with the object identifier. Such a code may be used to fetch data associated with the object, for instance from a database in the process control system. In order to simplify the fetching of such data, the control element 38 may store an association of the object identifiers to the objects in the model 3DM of the location. As an alternative or in addition, it is also possible to use a gyro and/or accelerometer for determining the orientation.
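The identifier lookup and the identifier-to-model association might be sketched as below. All names here (the tag values, the database contents, the function names) are hypothetical placeholders; in the arrangement described, the database would live in the process control system and the association in the model 3DM.

```python
# Hypothetical stand-in for a database in the process control system,
# keyed by object identifiers read from bar codes or NFC tags.
OBJECT_DATABASE = {
    "TAG-0026": {"type": "tank",  "name": "second process control object"},
    "TAG-0028": {"type": "valve", "name": "third process control object"},
}

# Association of object identifiers to objects in the 3D model of the
# location, as kept by the control element.
MODEL_ASSOCIATIONS = {}

def register_object(object_id, model_position):
    """Store the identifier-to-model association for later fetching."""
    MODEL_ASSOCIATIONS[object_id] = model_position

def fetch_object_data(object_id):
    """Fetch the data associated with an object, or None if unknown."""
    return OBJECT_DATABASE.get(object_id)

register_object("TAG-0026", (2.5, 0.0, 1.2))
info = fetch_object_data("TAG-0026")
```

Once an identifier has been read, the stored association lets the system both fetch the object's data and locate it in the model without rescanning.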
The above-mentioned steps may have been performed before a communication session is started with the remote user 52. Such a session may be performed under the control of the communication handling unit of the communication handling device, using a TCP connection set up using WiFi and the Internet. As an alternative the steps are performed after a communication session is started. In both cases the control element 38 investigates if a communication session is in place or on-going, which in this case at least involves a voice communication session between the remote user and a local user via sound generating equipment and sound recording equipment of the data presenting device 32 and the computer 51. It also involves transmission of a live video stream VS, which may be a one-way video stream from the data presenting device 32 in the process control system to the computer 51 of other participants of the session, and particularly the remote user 52. In some instances it may involve a two-way video conference, i.e. where video is also provided by the computer 51 of the remote user 52 and conveyed to the data presenting device 32. Video images captured by the camera 34 may thus be transferred to the remote user 52. Data of the remote user 52 may also be projected at the location under the control of the remote user.
If no session is in place, the control element 38 waits for one to be started either by the local user or the remote user 52.
If however one is on-going, the control element 38 controls the camera 34 to record a video stream. It also determines the camera orientation, for instance based on the line of sight of a viewfinder of the camera 34. The orientation may be provided as a solid angle related to the position of the data presenting device and a reference angle.
In the communication session, the model 3DM may be transmitted from the data presenting device to the remote user 52. The three-dimensional model 3DM may more particularly be transmitted together with camera data CD in the video stream VS, where the camera data may comprise the position of the camera, i.e. of the data presenting device, as well as the camera orientation. It is furthermore possible that the control element 38 modifies the model of the location so that the data presenting device and its orientation are a part of the model. The camera data may thus be provided as a part of the model.
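One way of bundling the camera data CD with the video stream VS is to attach per-frame metadata alongside the video payload. The JSON structure and all field names below are assumptions for illustration; the patent does not specify a wire format.

```python
import json

def make_stream_packet(frame_id, position, orientation, model_version):
    """Build the per-frame metadata sent alongside the video stream:
    camera data CD (device position and camera orientation) plus a
    version tag so the receiver can match it to the right model 3DM.
    Field names are illustrative, not taken from the patent."""
    return json.dumps({
        "frame_id": frame_id,
        "camera": {
            "position": position,        # device position in the model
            "orientation": orientation,  # e.g. (pan, tilt) in degrees
        },
        "model_version": model_version,
    })

packet = make_stream_packet(417, (2.5, 0.0, 1.2), (45.0, -10.0), 3)
```

The receiving computer can then render the overview image by placing the camera at the transmitted position and orientation within its copy of the model.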
The remote user then receives the video stream together with the model 3DM and camera data CD. The remote user may then both see the captured video and obtain a three-dimensional view of the location using the model 3DM. In this way it is possible for the remote user to see where in the site he is looking.
An example of this is shown in fig. 7 and 8. Fig. 7 shows how the data presenting device 32 is at the same location as the first, second and third process control objects 24, 26 and 28 and how it captures video images of a part of the second process control object 26. It thus records the video for the remote user 52. Fig. 8 shows a video image VI in the video stream VS as it would look when displayed on the computer 51 of the remote user 52. This video image may comprise a lot of useful information. However, it may lack context. This context is provided through also transmitting the model 3DM and camera data CD with the video stream VS. Fig. 8 shows the screen that the remote user 52 is able to see on the display of his or her computer 51. The view contains the live video stream from the camera, of which the image VI is presented. Furthermore, contextual information is provided through an overview image OI of the location, which overview image OI is obtained through visualizing the model 3DM of the location with the data presenting device DPD and its orientation. It is here possible that the remote user computer 51 is able to place a representation of the data presenting device, with the orientation of the image, into the model. Alternatively this has already been done by the control element 38 of the data presenting device 32. In the latter case a modified model which shows the data presenting device and its orientation is provided.
The control element 38 then investigates if the communication session has ended. If it has not, video continues to be recorded and the camera orientation continues to be determined and transferred together with the model to the remote user. However, if the communication session has ended, operation is also ended.
It can in this way be seen that according to this first variation, it is possible to track the current position and orientation of the camera 34 in a video conferencing situation while also building a map of the environment, so that the remote user 52 can have a better situational awareness of the location. As can be seen in fig. 8, the remote user 52 sees the current camera view VI, but can also use the small picture OI in the right corner to get an excellent overview of the surroundings.
The remote user may here also be able to navigate in the constructed 3D view and is therefore not limited to observing the current frame of the video transmission, but is free to “explore” the known 3D model built from the video frames.
A video conference call, where the goal is to share one user's environment, will not be limited to simply streaming video data but may also include data regarding the position as well as the current pose or orientation of the camera, where the orientation may be set as the orientation of a line of sight of a viewfinder of the camera.
If a previous model existed, it is furthermore possible for the remote user to fetch video streams previously recorded at the location, together with the positions of the data presenting device and the camera orientations when these historic video streams were recorded.
A system that includes a projector and a camera provides an efficient solution to support the collaboration between a field worker and a remote expert. The data presenting device allows local users on site and the remote user to communicate verbally. It also allows the remote user to get an overview of the environment through the 3D camera. As a 3D camera is used, the expert will be able to see a 3D model of the environment in case he needs additional spatial information about the environment. The expert can also scroll, pan and zoom the camera on site to get a superior overview of the situation from the remote location. The equipment can also include a mobile platform for the expert to actually move the device in the field.
Some advantages of the first variation can be better realized from the following scenario:

1. A local maintenance engineer is doing some maintenance on the factory floor when he identifies a potential issue; he calls the remote user and starts a video call to get advice on the situation.
2. He uses the camera on the data presenting device and makes it scan the location to show the current situation of a process control object, in this example the second process control object 26, as well as the surrounding environment.
3. Different frames of the video are processed to form a map of the environment. In the forming of the map the position of the data presenting device is also determined. Using this map and a current video image, the current orientation or pose of the camera is calculated.
4. The map 3DM of the environment with camera data is then sent over the network during the call along with the video stream.
5. This additional information helps the remote user to orientate himself in the world given the dynamic nature of the orientation of the camera, as he can see the map of the environment and the video stream simultaneously. The remote user gets a much better situational awareness than a normal video conference system would give him.
6. The two users may then manage to solve the situation thanks to the efficiency of the video collaboration system that enables the remote user to get a very clear understanding of the local environment in the industrial site.
The first variation has a number of further advantages.
In the first variation the camera and map data, i.e. the camera data and three-dimensional model, are transferred together with a video stream. This increases the situational awareness over a regular video stream, which leads to a less confusing situation and higher location awareness.
• The streamed data is used to create a complete picture of the location. The remote user can use this 3D model to navigate the point of view independently of the physical camera position; this will give the remote user a great situational awareness.
• Another advantage is that the number of unnecessary questions is reduced; questions such as “Where are you now?”, “What part am I looking at now?” and other deictic questions that engineers collaborating remotely are forced to ask today are avoided.
• The communication will also become more accurate. Communication errors relating to location will be less common.
• The collaboration between the two users will also become more efficient. The time taken for a video collaboration task to be completed will most probably be reduced.
• It is furthermore possible that safety is increased. As the remote user has a better awareness of the situation at hand, he can observe whether the local user is performing the correct actions.
A second variation will now be described with reference being made to fig. 9, 10a and 10b, where fig. 9 schematically shows the location with the data presenting device 32 providing a projecting area PA in which a first and a second presentation item PI1 and PI2 are projected, while fig. 10a and 10b schematically show the location with the data presenting device 32 and the first and second presentation items PI1 and PI2 when the projecting area PA is being moved.
When at the location, the data presenting device 32 is with advantage used for obtaining data from the location for provision to the remote user 52 and for receiving instructions from the remote user 52 to the local user at the location. This may be done via a two-way voice or video communication.
When a communication session is on-going, the control element 38 therefore fetches sensor measurements from sensors, such as the temperature sensor and the ambient light sensor, and transfers these sensor measurements to the computer 51 of the remote user 52. The camera 34 also captures and transfers video VS to the remote user 52.
The remote user 52 may now want to obtain some more data about the process control objects that he sees in the video stream VS. He may for instance desire to obtain data of the temperature in a tank or the voltage of a transformer. In order to do this he may select an object in the video, or in the previously obtained model of the location. He may for instance detect an object identifier in the video and send the object identifier to the data presenting device. He may also select an object in the model and the selection may be transferred to the control element 38. The control element 38 may then fetch data about the object from a database 20. It may for instance fetch a face plate with current data of the process control object.
The control element 38 may therefore receive a process control object selection from the remote user 52, and based on this selection it fetches process control object data from the process control system such as from the database 22, and transfers the process control object data to the computer 51 of the remote user 52. A remote user 52 may thus select an object in the model of the location and when the object is selected he can obtain additional data such as faceplates with information of the operation.
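The selection-to-faceplate flow just described could be sketched as below. `ProcessObjectDatabase`, `handle_selection` and the record layout are hypothetical names invented for illustration; the real control element and database interfaces are not specified here.

```python
# Hedged sketch of the control element's handling of an object selection:
# fetch current data for the selected process control object and forward
# it to the remote user's computer.
class ProcessObjectDatabase:
    """Stand-in for the process control system database."""
    def __init__(self, records):
        self._records = records  # {object_id: {measurement: value, ...}}

    def faceplate(self, object_id):
        """Return current data for the object, or None if unknown."""
        record = self._records.get(object_id)
        if record is None:
            return None
        return {"object": object_id, **record}

def handle_selection(db, object_id, send):
    """On a remote-user selection, fetch object data and forward it."""
    data = db.faceplate(object_id)
    if data is not None:
        send(data)  # e.g. transfer to the remote user's computer
    return data

sent = []
db = ProcessObjectDatabase({"tank-1": {"temperature_c": 71.5}})
handle_selection(db, "tank-1", sent.append)
```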
After a process control object is selected, or if no process control object is selected, the control element 38 may receive a presentation item from the remote user. The remote user 52 may more particularly provide presentation items to be projected by the projector. A presentation item may be a digital presentation item and may be a digital still image such as an image of an arrow or a circle, a presentation such as a slide show, or a string of text with an instruction. It may also be a drawing made by the remote user 52. A presentation item may thus be a remote user generated presentation item comprising instructions and visual indicators. In the present example there are a first and a second presentation item PI1 and PI2 that are to be presented to the local user via the projector 48. If such a presentation item is received, it is also possible that a selection of the position of the presentation item is received. The remote user may select a position for the presentation item in the 3D model 3DM of the location.
This position selection may also be transferred to the control element 38. The control element 38 then associates the presentation item with the selected position. The position of a presentation item may be set using a solid angle and a radius related to the position of the data presenting device and to a reference angle. A presentation item may thereby be assigned to a space in the three-dimensional model of the location. It is also possible to assign more than one presentation item in this way, with each presentation item being assigned a corresponding position. In the example in fig. 9, 10a and 10b, the first presentation item PI1 is an annotation in the form of a circle around a button on a process control object and the second presentation item PI2 is a string of text “turn this button”, provided at different positions.
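The mapping from an angle-and-radius description to a point in the three-dimensional model could look like the following minimal sketch. The pan/tilt angle convention (pan measured from the reference angle in the horizontal plane, tilt from the horizontal) is an assumption; the text only states that a direction and a radius relative to the device are used.

```python
import math

def item_position(device_pos, pan_deg, tilt_deg, radius):
    """Convert direction angles and a distance from the data presenting
    device into a Cartesian point in the three-dimensional model.

    pan_deg: angle from the reference direction in the horizontal plane.
    tilt_deg: elevation angle from the horizontal.
    Both conventions are hypothetical, chosen only for illustration.
    """
    pan = math.radians(pan_deg)
    tilt = math.radians(tilt_deg)
    dx = radius * math.cos(tilt) * math.cos(pan)
    dy = radius * math.cos(tilt) * math.sin(pan)
    dz = radius * math.sin(tilt)
    x0, y0, z0 = device_pos
    return (x0 + dx, y0 + dy, z0 + dz)

# An item 2 m straight ahead of a device at the model origin:
pos = item_position((0.0, 0.0, 0.0), pan_deg=0.0, tilt_deg=0.0, radius=2.0)
```

Each presentation item can then be stored as an (item, position) pair in the model, which is what lets several items be assigned to different spaces in the location.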
Thereafter the control element 38 awaits possible camera control commands from the remote user 52. The camera control commands may comprise field of view control commands, such as zooming commands that change the size of the field of view but retain the same line of sight, or orientation control commands that change the line of sight. Orientation control commands typically comprise panning commands. The remote user 52 may thus change the orientation of the camera 34 through rotating or tilting it. He may also zoom in and out. If commands are received, these commands are then used by the control element 38. If the commands are field of view commands, these are used for controlling the field of view of the camera. Zooming commands are forwarded to the camera 34, which then zooms in or out depending on the type of control command. If tilting or rotation is required, the control element 38 controls a corresponding motor to obtain the required movement.
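The command routing described above can be sketched as a small dispatcher. The command names and the `Camera`/`Motor` interfaces are illustrative assumptions; the key point is that zoom commands go to the camera while orientation commands go to a motor.

```python
# Hedged sketch of the control element's camera-command dispatch.
class Camera:
    def __init__(self):
        self.zoom_level = 1.0

    def zoom(self, factor):
        self.zoom_level *= factor   # field-of-view change, same line of sight

class Motor:
    def __init__(self):
        self.pan_deg = 0.0
        self.tilt_deg = 0.0

    def move(self, pan_deg=0.0, tilt_deg=0.0):
        self.pan_deg += pan_deg     # line-of-sight (orientation) change
        self.tilt_deg += tilt_deg

def handle_camera_command(camera, motor, command):
    """Route a remote-user command to the camera or its motor."""
    kind = command["type"]
    if kind == "zoom":
        camera.zoom(command["factor"])
    elif kind == "pan":
        motor.move(pan_deg=command["degrees"])
    elif kind == "tilt":
        motor.move(tilt_deg=command["degrees"])
    else:
        raise ValueError(f"unknown camera command: {kind}")

camera, motor = Camera(), Motor()
handle_camera_command(camera, motor, {"type": "zoom", "factor": 2.0})
handle_camera_command(camera, motor, {"type": "pan", "degrees": 15.0})
```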
Thereafter the control element 38 may receive a projector control command from the remote user 52. The projector control command may comprise a command to project one or more presentation items. In some instances such a command may also be a command to project a presentation item at a specific desired position. If a projector control command is received, the projector 48 is controlled by the control element 38 according to the command, which involves, if the control command is a command to project a presentation item, in this example the first presentation item PI1 and the second presentation item PI2, controlling the projector 48 to project the first and second presentation items PI1 and PI2 in the presentation area PA of the projector. If the command is to project at a specific position, the projector is controlled to project the corresponding presentation item at this position. A command may also comprise a command to change the orientation of the presentation area. In this case the projector may be moved, using the same or another motor than the one used for the camera 34, and controlled to project the presentation item so that it appears at the desired position. The remote user may thus control the data presenting device to project the presentation item at a selected space or position in the location. This may involve projecting the presentation item to a real world position corresponding to the associated position in the three-dimensional model. If a real world object at the location would be in front of the presentation item according to the presentation item position, then parts of the presentation item that would be blocked by the real world object are refrained from being presented.
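The occlusion behaviour, suppressing the parts of a presentation item that a real-world object would block, can be sketched with a per-pixel depth comparison against the 3D model. The list-based "depth map" and the masking convention are assumptions made for this illustration.

```python
# Hedged occlusion sketch: a pixel of the projected item is suppressed
# when the 3D model shows a real-world surface closer to the projector
# than the item's assigned position.
def visible_mask(item_depth, scene_depths):
    """Return True per pixel where the item is NOT blocked.

    scene_depths: distance from the projector to the nearest real surface
    for each pixel of the item's footprint (taken from the 3D model 3DM).
    item_depth: distance from the projector to the item's position.
    """
    return [d >= item_depth for d in scene_depths]

def apply_mask(pixels, mask):
    """Blank out (None) the parts a real object would block."""
    return [p if keep else None for p, keep in zip(pixels, mask)]

# A 4-pixel row of an item 2.0 m away; a pipe at 1.5 m covers pixel 2:
mask = visible_mask(2.0, [3.0, 3.0, 1.5, 2.5])
row = apply_mask(["a", "b", "c", "d"], mask)
```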
If the projector is reoriented so that the presentation area PA is moved, the presentation items PI1 and PI2 may be set to stay at the user selected positions.
Furthermore, the projection of the presentation items is made independently of the presentation of the video. As the presentation items are associated with the model, this also means that it is possible to retain the presentation items for a later session at the location. The control element 38 may therefore control the projector 48 separately from, or independently of, the control of the camera 34. If for instance the camera 34 stream is zooming in on a detail so that for instance the first presentation item PI1 is outside of the field of view of the camera 34, then the first presentation item PI1 will still be presented. The controlling of the presentation in the presentation area is thus performed independently of the controlling of the field of view of the camera. As is evident from the zooming example given above, this means that the position of a presentation item in the presentation area PA of the projector 48 may be outside of the field of view of the camera. This also means that the presentation area PA may differ from the field of view of the camera 34.
When the camera control commands are commands controlling the orientation of the camera and the projector control commands are commands controlling the orientation of the projector it can likewise be seen that the control of the orientation of the projector is performed independently of the control of the orientation of the camera, which thus means that the control of orientation of the camera does not influence the control of the orientation of the projector.
As can be seen in fig. 10a and 10b, the projecting area PA of the projector 48 may be movable. If there are several presentation items that may fit in the presentation area when located at a current position, these may be presented singly or simultaneously based on the commands of the remote user. If for instance several presentation items are provided, where some are outside of the current location of the presentation area PA, the projector 48 may be reoriented so that one or more of these are projected. After assignment, the remote user may simply select a presentation item for being presented and the control element 38 will control one or more motors for reorienting the projector so that the presentation area covers the selected presentation item.
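Reorienting the projector so that the presentation area covers a selected item amounts to computing the pan/tilt the motors should be driven to. A minimal sketch, assuming the same hypothetical angle conventions as for item positions (pan in the horizontal plane, tilt from the horizontal):

```python
import math

def aim_at(device_pos, item_pos):
    """Return (pan_deg, tilt_deg) that point the projector at item_pos.

    The angle conventions match no particular hardware; they are
    assumptions for illustration only.
    """
    dx = item_pos[0] - device_pos[0]
    dy = item_pos[1] - device_pos[1]
    dz = item_pos[2] - device_pos[2]
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt

# Item 2 m ahead and 2 m to the side of the device, at the same height:
pan, tilt = aim_at((0.0, 0.0, 0.0), (2.0, 2.0, 0.0))
```

The control element would feed these angles to the motor(s) so that the presentation area is centred on the selected presentation item.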
Thereafter the capturing of video is continued, as well as the waiting for various commands from the remote user. This type of operation continues as long as the session is on-going.
The remote user 52 may also send commands controlling the projector 48, the camera 34 as well as various sensors, such as the temperature sensor.
Through the data presenting device 32 it is possible for the remote user to obtain knowledge of the operation of process control objects at the location as well as to obtain other information such as temperature at the location. In order to observe the location the remote user 52 may also rotate the camera and obtain visual data of the location. Through the voice connection the remote user may also communicate with a local user and receive audible comments on possible problems at the location.
The remote user may then determine appropriate actions, such as which process control objects and which parts of these are to be actuated and when. The remote user may for instance provide a number of presentation items, such as arrows and explaining text, and assign these to different positions in the virtual model. The remote user may also provide a timing instruction, providing a sequence in which presentation items are to be presented. The commands and presentation items may then be sent to the data presenting device 32, which presents them via the projector 48 in an order decided by the remote user 52. If the presentation items are provided in the presentation area at a current position, then these may be presented simultaneously.
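The timing instruction, a remote-user-decided sequence of presentation items, can be sketched as a small ordered queue. The `PresentationQueue` name and its interface are invented for illustration.

```python
# Hedged sketch: presentation items played back in the sequence the
# remote user assigned to them via a timing instruction.
class PresentationQueue:
    def __init__(self):
        self._items = []  # (order, item) pairs from the remote user

    def add(self, order, item):
        """Assign a sequence number to a presentation item."""
        self._items.append((order, item))

    def playback(self):
        """Yield items in the remote user's chosen sequence."""
        for _, item in sorted(self._items):
            yield item

queue = PresentationQueue()
queue.add(2, "text: turn this button")
queue.add(1, "circle around button")
sequence = list(queue.playback())
```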
When a new presentation item needs to be presented that is outside the current field of view of the projector 48, i.e. outside the presentation area when in its current position, the projector 48 may be moved or reoriented so that the presentation area covers the position of the new presentation item. This movement of the projector 48 may be made independently of the camera 34. In this way it is possible for the remote user 52 to present information at one place, for instance instructions about actuating a certain process control object, while at the same time monitoring another object at another place not covered by the projector 48.
The second variation thus provides a way to allow a remote user to remotely guide personnel on site via a live video stream. The data presenting device will also allow the local user and the remote user to communicate verbally. It will also allow the remote user to get an overview of the environment through the camera. The remote user can also scroll, pan and zoom the camera on site to get a superior overview of the situation from the remote location. As a 3D camera is used, the remote user will be able to see a 3D model of the environment in case he needs additional spatial information about the location.
It is also possible for the remote user to add presentation items or information, such as annotations and drawings, to the physical world by using the projector to project information onto the real world, i.e., the remote user can visually share information and annotations with the local user at the location.
All the sensors together with the camera and sound recording equipment will enable a remotely connected user to see, hear and feel the situation at the plant. The projector and sound generating equipment may in turn be used to communicate information back from the remote user to the personnel on site. The projector is used for the remote user to visually communicate information back to the plant personnel.
By allowing the remote user to take control of the data presenting device, the remote user can browse the surroundings using the camera, by rotating, tilting and zooming. Once the remote user has information that he/she wants to share with the local users on site he can “draw” this information on to the presentation area using the projector. The remote user can use text, images, or simply draw objects on the remote screen.
The drawings will then be projected on site using the projector. As the camera records a 3D model of the environment, the notes can also be left behind objects. All visual information provided by the remote user may be augmented reality information, meaning that any annotations or drawings that the remote user adds are saved and connected with the point where they were added by using the constructed 3D model of the environment. This means that if the remote user rotates the camera after an annotation has been added the annotation will stay in the same spot.
As can be seen in fig. 10a, the remote user has added a presentation item PI1. As the remote user rotates the data presenting device 32, as can be seen in fig. 10b, for instance in order to get a better overview, the first presentation item PI1 is still projected correctly even though the position of the presentation area PA has been changed. It can thereby be seen that the real world position in which the presentation item is projected is retained even if the presentation area is moved.
Imagine the following scenario:
1. A Norwegian gas company unexpectedly experiences severe trouble at one of their offshore platforms. The problem is rare and technically complex; the operators on site need support from an expert in order to restore production.
2. Flying in an expert will take at least 48 hours, as all experts are located far away.
3. The operator contacts a support company where an expert in this technical subject is available to instantly help the offshore platform with the problem.
4. The operators have discussions with the expert, and the expert instructs the operators on site to bring the data presenting device to a specific part of the process so that the remote expert can have a look at the problem.
5. The remote expert observes the situation using the camera, sound recording equipment and sensors.
6. Based on the information from the offshore platform, the remote expert can now instruct the operators on site to perform certain operations to correct the problem.
7. The remote expert utilizes both voice and the possibility to visually share information with the users offshore. The possibility for the remote expert to use both voice and to visually share information is extremely effective, as it is possible for the remote expert to instantly “point out” where operators on site should perform actions.
Through the second variation, remote users are offered the possibility to instantly give support to any place in the world. No longer are they required to go to a site every time their assistance is needed; instead, in many cases they can solve problems from their office.
Remote users are offered a level of situational awareness that cannot be achieved with video streams alone, by building a 3D model of the world. Rapid response by a remote expert saves time. This means that less production quality or production quantity is jeopardised while waiting for an expert to travel to the site.

Local users, such as maintenance engineers, are offered an unobtrusive and natural way of viewing augmented reality information which, in a lot of situations, is superior to viewing AR information on a pair of head-mounted glasses or via a handheld screen.
- The remote user is able to add notes to the environment that are projected onto the actual surface of the equipment for the local maintenance engineer to view.
- There is a possibility to add notes to a 3D model of the world and to display those notes on the spot by using projectors.
- Notes added to the 3D model stick to their place even if the camera covers another position.
The annotations and notes added to the environment and/or the 3D model of the world may also be recorded and saved as part of the maintenance history for the industrial plant. They may also be later retrieved, if the data presenting device is brought back to a known location.
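Such location-keyed retrieval could be sketched as a simple store that saves each note with its world-position anchor and returns the notes near a given position. A minimal sketch, assuming hypothetical `Annotation` and `MaintenanceHistory` names (not part of the original disclosure):

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    text: str
    world_pos: tuple  # anchor point in the site's 3D model

class MaintenanceHistory:
    """Hypothetical store for annotations made during remote sessions.

    Notes are saved with their world position so they can be retrieved
    later when the data presenting device returns to a known location.
    """
    def __init__(self):
        self._notes = []

    def save(self, note):
        self._notes.append(note)

    def near(self, pos, radius=1.0):
        """Return all notes anchored within `radius` of `pos`."""
        return [n for n in self._notes
                if sum((a - b) ** 2
                       for a, b in zip(n.world_pos, pos)) <= radius ** 2]

history = MaintenanceHistory()
history.save(Annotation("check this meter first", (4.0, 1.0, 0.5)))
history.save(Annotation("valve serviced 2013-10", (20.0, 3.0, 0.0)))
# Device returns near the meter: only the nearby note is retrieved.
nearby = history.near((4.2, 1.1, 0.5))
```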
It should also be realized that the two variations described above may be combined. The activities in the two variations may thus be carried out in the same communication session. In this case the knowledge that the remote user gets of the location in the first variation may be used to control the data presenting device, and especially in the use of presentation items.

Another situation that is possible to handle is shown in fig. 11, which depicts the presentation area PA being projected by the data presenting device. In this presentation area there is the second presentation item PI2 as well as a third presentation item PI3, which is likewise a string of text, “check this meter first”. These items PI2 and PI3 may be provided in the previously described way.
The remote expert here remotely guides the local user on site via a live stream. By using the data presenting device 32 it is thus possible for the remote user to add information, such as annotations and drawings, to the physical world on site by using the projector to project information onto the real world. Therefore, the expert user can visually share information and annotations with the personnel on site.
However, as can be seen in fig. 12, the local user 54 may move in front of the projection, so that at least a part of the presentation area PA is blocked. The local user 54 may do this in order to act on instructions given in the presentation items. However, then it is possible that the instructions are not visible anymore.
The projection area and content location may in this case be adjusted based on the location of the field worker and the process equipment. The control element will then select the best place to project information to ensure that the field worker is able to view the instructions.

As the data presenting device 32 has built a 3D model of the environment in real time, the control element can track where the local user is at the moment. It can thereby determine that the local user is in the field of view of the projector. Therefore, it can modify the projection area PA so that the information is projected around or beside the field worker, as seen in fig. 12.
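The field-of-view test itself can be sketched as a simple angular check against the projector's tracked pose. A minimal, hypothetical 2-D Python sketch (function name and 2-D simplification are assumptions; a real system would test against the full 3-D frustum of the projector):

```python
import math

def in_projector_fov(user_pos, projector_pos, projector_dir, half_angle):
    """Return True if the tracked user lies inside the projector's
    (horizontal) field of view.

    The device tracks the local user in its 3D model; this check
    decides whether the projection must be modified.
    (Hypothetical 2-D sketch; all angles in radians.)
    """
    vx = user_pos[0] - projector_pos[0]
    vy = user_pos[1] - projector_pos[1]
    user_angle = math.atan2(vy, vx)
    # Smallest signed angular difference to the projector axis.
    diff = (user_angle - projector_dir + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= half_angle

# User directly in front of a projector with a 60-degree field of view.
blocked = in_projector_fov((2.0, 0.0), (0.0, 0.0), 0.0, math.radians(30))
# User standing off to the side, outside the field of view.
clear = in_projector_fov((0.0, 2.0), (0.0, 0.0), 0.0, math.radians(30))
```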
There are several technical solutions available which are able to recognize a human in the camera view, for example Microsoft Kinect.
There are several ways in which the control element 38 can modify or change the projection based on the detection. The data presenting device 32 can for instance change the projection area so that it is projected beside the local user.
One way of changing the projection area is by rotating or tilting the projector so that the projection is displayed on a free area beside the local user, see fig. 13. In this way the control element thus controls the projector to move the projection area beside the local user.

If the data presenting device 32 is attached to a mobile platform, for instance the previously described wheels and motor, the control element may control the mobile platform to move the data presenting device 32 to another location, i.e. to change the position of the data presenting device, in order to have a free projection area, see fig. 14.

Another way is for the control element 38 to modify the projection itself. For example, if the local user stands in the way where a line of text should be projected, the text may be wrapped into several lines beside the local user, see fig. 12. The change of the projection may thus be a change of one or more presentation items, in position and/or shape.
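The text-wrapping modification can be sketched with the standard library: given the horizontal span of the presentation area blocked by the user, the instruction text is re-flowed into the wider free strip beside him. A minimal sketch, assuming a hypothetical `wrap_beside_user` helper and a uniform character width:

```python
import textwrap

def wrap_beside_user(text, area_width, blocked_from, blocked_to, char_width=1):
    """Re-flow an instruction line so it fits in the free part of the
    presentation area beside the local user.

    `blocked_from`..`blocked_to` is the horizontal span (same units as
    `area_width`) occupied by the user; the text is wrapped to the
    width of the wider free strip. (Hypothetical sketch.)
    """
    left_free = blocked_from
    right_free = area_width - blocked_to
    strip = max(left_free, right_free)
    cols = max(1, int(strip / char_width))
    return textwrap.wrap(text, width=cols)

# User blocks columns 10..25 of a 30-unit-wide presentation area,
# leaving a 10-unit free strip on the left for the wrapped text.
lines = wrap_beside_user("check this meter first", area_width=30,
                         blocked_from=10, blocked_to=25)
```

A fuller implementation would also reposition the wrapped block within the free strip and re-project it via the world-anchoring transform.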
Imagine the following scenario:

1. Plant maintenance engineer Adam 54 is performing a complex procedure in the process. He has contacted remote expert assistance to guide him through the complex procedure.
2. Adam 54 places the data presenting device 32 next to the area where he knows he will be working.
3. The expert 52 can now “draw” onto the surface at Adam's 54 location and give Adam guidance on what he is supposed to do.
4. When Adam 54 sets up the projector, no user is identified in the projection area PA. Therefore the full projection is used.
5. Next, Adam 54 moves so that he is in front of the projector. Adam 54 is detected by the camera to be inside the projection area PA, and the projection is modified so that it is not projected on top of Adam.
6. Adam is able to see the instructions PI2 and PI3 from the expert 52 and complete the complex procedure.
According to this variation the following features are provided:

- Identification of a person within the boundaries of the projection area and modification of the projection based on this information.
- Identification of a person within the boundaries of the projection area and rotation/tilting of the projector based on this information.
- Identification of a person within the boundaries of the projection area and moving of the data presenting device based on this information.
Several benefits can be listed for this variation:

- Improved quality of remote guidance and a more flexible solution.
- Improved safety: as instructions are displayed to the field worker in an accurate and unobtrusive way, the field worker can utilize the guidance in an efficient way.
- Cost savings: industries that need help from an expert when they experience problems will be able to save large amounts of money if they can get hold of an expert who can help them correct the problem faster.
- Time saving: no longer do industries have to wait for an expert to travel to the site. Instead, instant support will be available.

The control unit may, as was mentioned above, be provided in the form of a processor together with memory including computer program code for performing its functions. This computer program code may also be provided on one or more data carriers which perform the functionality of the control unit when the program code thereon is being loaded into the memory and run by the processor. One such data carrier 56 with computer program code 58, in the form of a CD ROM disc, is schematically shown in fig. 15.
The invention can be varied in many more ways than the ones already mentioned. It should therefore be realized that the present invention is only to be limited by the following claims.
Claims (25)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE1300676A SE1300676A1 (en) | 2013-10-29 | 2013-10-29 | Procedure and data presentation arrangements to assist a remote user to provide instructions to a local user |
Publications (1)
Publication Number | Publication Date |
---|---|
SE1300676A1 true SE1300676A1 (en) | 2013-10-29 |
Family
ID=49554544
Legal Events
Date | Code | Title | Description |
---|---|---|---|
NAV | Patent application has lapsed |