WO2009128781A1 - A method and a device for remote visualization - Google Patents


Info

Publication number
WO2009128781A1
Authority
WO
WIPO (PCT)
Application number
PCT/SE2009/050394
Other languages
French (fr)
Inventor
Sten Lundgren
Original Assignee
Lundgren & Nordstrand Ab
Application filed by Lundgren & Nordstrand Ab filed Critical Lundgren & Nordstrand Ab
Publication of WO2009128781A1 publication Critical patent/WO2009128781A1/en


Classifications

    • G — PHYSICS
    • G02 — OPTICS
    • G02B — OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 — Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 — Head-up displays
    • G02B27/017 — Head mounted
    • G02B27/0101 — Head-up displays characterised by optical features
    • G02B2027/0138 — comprising image capture systems, e.g. camera
    • G02B2027/014 — comprising information/image processing systems

Definitions

  • FIG. 1 shows schematically a basic embodiment of a device in accordance with the invention mounted on a wearer
  • Fig. 2 is a basic block diagram showing one embodiment of a device in accordance with the invention
  • Fig. 3 shows a view available on a display of the device in Fig. 1
  • Fig. 4 shows a view available on a display of a device at a remote location
  • Fig. 5 schematically shows division of an image into segments, and
  • Fig. 6 schematically illustrates movement of the view.
  • Image processing and internet communication are important elements when implementing the invention. By optimizing the information that is actually transferred, substantial improvements are possible. Image information is continuously monitored and only information relating to sections of changed image information is transmitted. The image information can also be evaluated with regard to importance, so as to give priority to the most relevant or important information.
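The change-driven transmission described in this bullet can be sketched as follows. This is an illustrative sketch only, not the patented implementation: frames are modelled as 2-D lists of pixel intensities, and the block size and change threshold are assumed values.

```python
# Sketch of change-driven transmission: only blocks of the frame that
# differ from the previous frame beyond a threshold are retransmitted.

def changed_blocks(prev, curr, block=4, threshold=0):
    """Return (row, col) indices of blocks whose pixels have changed."""
    rows = len(curr) // block
    cols = len(curr[0]) // block
    changed = []
    for br in range(rows):
        for bc in range(cols):
            # accumulate the absolute pixel difference over the block
            diff = 0
            for r in range(br * block, (br + 1) * block):
                for c in range(bc * block, (bc + 1) * block):
                    diff += abs(curr[r][c] - prev[r][c])
            if diff > threshold:
                changed.append((br, bc))
    return changed
```

Unchanged blocks generate no traffic; only the listed blocks need to be encoded and sent to the remote unit.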
  • Fig. 1 shows a mobile device 10 in accordance with one embodiment of the invention.
  • the device 10 is formed as goggles and worn by a user 12.
  • the user will only be able to see through the goggles, that is, through a first display unit built into the goggles.
  • Image information is picked up by a camera module 14 and transmitted to the first display unit.
  • the camera module 14 comprises one digital camera. In other embodiments two or more cameras are included so as to pick up and display more complex views.
  • the device 10 also comprises a control unit, cf. Fig. 2, interconnecting and controlling the camera module 14, the first display unit and headphones 16.
  • the control unit can be connected to the internet and will operate as a web server that other units may connect to through a conventional web browser.
  • the device in accordance with the invention also comprises a pyrometer or a radiation thermometer 18 and a light unit 20.
  • both the pyrometer 18 and the light unit 20 are mounted externally on the goggles to be directed in the line of sight.
  • the device in accordance with the invention comprises the mobile device 10 and at least one remote unit 22.
  • the mobile device 10 and the at least one remote unit 22 are connected through a general communication system 24, such as the internet.
  • information is exchanged through a web server 25.
  • the connection between the mobile device 10 and the remote unit is wireless in the embodiment shown in Fig. 2. In other embodiments a wire connection is used.
  • the web server 25 comprises a computer 27 and a communications module 29.
  • the mobile device 10 comprises a first control unit 24 that is operatively connected to other units and modules of the mobile device 10.
  • the first control unit 24 includes a conventional personal computer.
  • the first control unit 24 is connected to a first communication unit 26, such as a modem or similar device. Communication between the mobile device 10 and the remote unit 22 is handled by the first communication unit 26, which can be designed for transmission through cable, radio signals, light or sound. Image information, other data as well as sound from parties involved are transferred through the communication unit 26.
  • the communication unit 26 comprises means for communicating through a TCP/IP protocol or a similar protocol.
  • the mobile device 10 further comprises a first display unit 28, the camera module 14, the light unit 20 and the radiation thermometer 18.
  • the first display unit 28 includes a virtual retinal display (VRD), also known as a retinal scan display (RSD).
  • VRD virtual retinal display
  • RSD retinal scan display
  • the RSD draws a raster display directly onto the retina of the eye.
  • Image information is picked up by the camera module 14, completed and modified by the first control unit 24 and transmitted to the first display unit 28.
  • a stroboscope vision module 30, which can be implemented partly or fully as software running in the first control unit 24, recognizes repeating views in one or many segments. The scanning frequency can then be adjusted to the frequency at which the views appear. When viewing for instance a rotating wheel, products on a conveyor belt or something vibrating in the field of vision, it will be possible to study the component since it appears as if it did not move or moved only slowly. Slower processes can be shown more clearly by not changing the picture until the process repeats again.
  • the first control unit also calculates turns/sec, number of products/min and so on, and displays corresponding information at the first display unit 28.
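The stroboscope vision module might be approximated by searching for the lag at which the frame sequence best repeats; the display can then be refreshed once per period, and a quantity such as turns per second follows directly. A minimal sketch, assuming frames are given as flat lists of pixel values (the function names are illustrative, not from the patent):

```python
def estimate_period(frames, max_lag):
    """Find the lag (in frames) at which the sequence best repeats
    by minimizing the mean frame-to-frame difference at that lag."""
    def frame_diff(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))

    best_lag, best_score = None, None
    n = len(frames)
    for lag in range(1, max_lag + 1):
        score = sum(frame_diff(frames[i], frames[i - lag])
                    for i in range(lag, n)) / (n - lag)
        if best_score is None or score < best_score:
            best_lag, best_score = lag, score
    return best_lag

def turns_per_second(period_frames, fps):
    """One full repetition per period: rotation rate in turns/sec."""
    return fps / period_frames
```

With the period known, the first display unit can be updated once per repetition so that the rotating or vibrating component appears stationary.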
  • a search color is specified.
  • the first control unit 24 and the camera module 14 locate all occurrences of the specified color, and all objects or image segments having that color are highlighted.
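The colour search might be sketched as follows. The per-channel tolerance parameter is an assumption: the text only states that all occurrences of the specified colour are located and highlighted.

```python
def find_color(image, target, tol=0):
    """Return (x, y) coordinates of pixels whose RGB value is within
    `tol` of the search colour on every channel."""
    hits = []
    for y, row in enumerate(image):
        for x, px in enumerate(row):
            if all(abs(px[i] - target[i]) <= tol for i in range(3)):
                hits.append((x, y))
    return hits
```

The returned coordinates would then drive the highlighting overlay on the first display unit.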
  • objects moving in the captured image are located and followed.
  • the first control unit 24 calculates the orbit or line of movement and displays a corresponding line in the first display unit 28.
  • the first control unit 24 is capable of displaying the corresponding line continuously or intermittently. At any time the displaying of the line can be interrupted.
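Calculating and displaying the line of movement of a followed object could, in the simplest case, be a least-squares line fit through the tracked positions. The patent does not specify the fitting method; this is a sketch under that assumption.

```python
def fit_line(points):
    """Least-squares fit y = a*x + b through tracked (x, y) positions."""
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxx = sum(p[0] * p[0] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    denom = n * sxx - sx * sx          # zero only for a vertical track
    a = (n * sxy - sx * sy) / denom
    b = (sy - a * sx) / n
    return a, b
```

The fitted line can then be drawn, continuously or intermittently, in the first display unit 28; circular orbits would need a richer model than this straight-line fit.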
  • Flow charts, maps, constructional drawings, component listing can be downloaded to the mobile device from the remote unit 22 and displayed through an information layer module 36.
  • the objects that are downloaded are displayed in a transparent layer at the first display unit 28, so as not to cover elements of the image picked up by the camera module 14. The positioning of the transparent layer and of the objects displayed therein, as well as a correct scale factor, are derived from detectable reference dots in the real scene.
  • the mobile device 10 comprises means for acting as a server.
  • the server substantially presents the view that is displayed on the first display unit 28.
  • Specific remote users are informed about the server internet address and where appropriate user id and password. By accessing the server remote users will be able to see substantially the same view as the user of the mobile device 10.
  • the web page instead is provided in a web server 25 that is connected to the internet.
  • the web server comprises a computer 27 and a communication module 29.
  • the web server will receive image information from the camera module 14 and make the image information and also other information originating from the mobile device 10 available to other users.
  • the web server is implemented as a section of the mobile device 10, for instance as a part of the first control unit 24.
  • the first communication unit 26 comprises a first encryption function.
  • each of the encryption functions also may include a tunneling function for security reasons.
  • a remote user will be able to move a mouse pointer or cursor generated and visible at a remote display, and the position of the mouse pointer will be transmitted to the mobile device 10 and displayed also at the first display unit. As a result the wearer of the goggles as well as all remote users will be able to see the mouse pointer in the displayed image. If a plurality of remote users are active, each remote user may have a mouse pointer that differs in colour or other appearance. In one embodiment the mouse pointer or cursor generated at the remote display is locked at a background position. As a result the position of the cursor in relation to the background will not change, even if the mobile device is turned.
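Locking a cursor to a background position amounts to compensating its screen coordinates for the measured camera motion. A minimal sketch, assuming the image motion between frames is available as a pixel offset (dx, dy); the class name is illustrative:

```python
class LockedCursor:
    """Cursor anchored to a background point: when the camera view
    shifts, the on-screen position is compensated so the cursor keeps
    pointing at the same scene point."""

    def __init__(self, x, y):
        self.x, self.y = x, y          # screen coordinates in pixels

    def on_camera_motion(self, dx, dy):
        # If the view pans right by dx, the scene content moves left
        # on screen, so the locked cursor must move left with it.
        self.x -= dx
        self.y -= dy
```

The same compensation applies to the adviser's arrow of the later embodiments: its movements are expressed relative to the background rather than in absolute screen coordinates.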
  • a locally generated cursor is locked at a central position of the displayed view, cf. Fig. 3.
  • the locally generated cursor can be generated in a cursor generation module 37.
  • the locally generated cursor is formed as a transparent rectangle.
  • Other image functions are processed in an image processing unit 40.
  • Each mobile device 10 can be given a specific identity that can be stored locally, for instance in a memory circuit 38. The specific identity, or an associated identification reference, is required for remote users to establish contact with the mobile device 10.
  • the remote unit 22 comprises a remote or second control unit 42 which in one embodiment is a conventional computer, such as a personal computer.
  • the remote control unit 42 is connected to a second display unit 44 which can be a conventional monitor or display unit.
  • Communication between the remote unit 22 and the mobile device 10 is provided by a second communication unit 46.
  • the first communication unit 26 and the second communication unit 46 communicate over a suitable communication medium 48 such as the internet.
  • data can be generated or retrieved for transfer to and for being displayed at the first display unit 28.
  • Fig. 3 shows the view as seen by the wearer of the goggles or mobile device 10
  • Fig. 4 shows the view as seen by the remote users.
  • Image information as captured by the camera module is shown within a double line rectangle 47.
  • the view of Fig. 3 is displayed at the first display unit 28. Most elements that are visible are objects in the area in front of the wearer. Switches 50, a control panel 52 and a machine contour line 53 are such elements.
  • the view as seen on the first display unit 28 also comprises objects and elements that are generated either by the first control unit 24 or by the second control unit 42.
  • a pointer 54 shaped as an arrow is generated by the second control unit 42 on the basis of actions taken by a remote user or advisor.
  • the pointer 54 can be locked either to a background element, such as a specific switch as shown in Fig. 3 or at a specific position of the view. In the first case the pointer will remain at the switch also when the image is changed as a result of a movement of the camera module 14, as long as the switch is visible.
  • a cursor 56 shaped as a rectangle is generated by the first control unit 24.
  • the wearer of the mobile device can use the cursor 56 as a pointing device, as an alternative to pointing with a finger that will be visible at the first display unit 28 as well as at the second display unit 44.
  • the central section of the image will not be covered.
  • a distinct point of the image can be referred to by moving the goggle to a position where a corner of the rectangle is located at the distinct point.
  • the control panel 52 of the machine in view also comprises a machine display 58.
  • the wearer of the goggle needs to know how to deal with a question mark 59 shown at the machine display.
  • the wearer positions the goggle to place the cursor rectangle 56 over the question mark 59.
  • An advisor at the remote location, who can observe the same view, cf. Fig. 4, is directed to the relevant position by the rectangle 56.
  • the advisor also can observe that one switch is switched to an incorrect position and enables the pointer 54. By positioning the pointer 54 at the switch in question the attention of the wearer is directed to said switch.
  • the radiation thermometer 18 is designed to measure the temperature along a sight line corresponding to the position of the cursor rectangle 56 at a reasonable distance from the goggle.
  • a measured temperature value is displayed on the first display unit 28 and the second display unit 44. In the shown embodiment the temperature value appears on a scale 60 as well as in a small window 61.
  • the camera module 14 is provided with a zoom function.
  • the zoom function can be controlled by the wearer of the goggles as well as by the advisor. The wearer will adjust the zoom level locally, and the advisor by clicking at a zoom field 62 in the view. Zoom information can be given by the advisor and transferred to the mobile device, where the first control unit will adjust the zoom level of the camera module accordingly.
  • Name information or other identification of advisors can be displayed at name windows 64. Preferably the name of each advisor or participant is given a separate colour so as to allow identification during the process.
  • the advisor also has the possibility to fetch a marker such as an arrow 66 with his mouse pointer and then position it anywhere in the viewed image.
  • the marker arrow can be locked to the background.
  • the advisor also can choose to save snapshots or video sequences by clicking on corresponding recording buttons 65 on the second display unit 44. Recorded information, which may include also voice conversations and other data, is saved. The information can be saved locally, for instance in the memory circuit 38, and later on be transferred to a web server or to the remote unit 22.
  • An information field 68 is available in the view of the first display unit and of the second display unit.
  • Data and other information can be typed in by the advisor or other participant, generated by other means or collected from databases and other data sources, and will be made available at the display units.
  • Data from databases, maps and plans also can be displayed overlaid on captured image information at the display units.
  • the field of vision is divided into segments of appropriate appearance and shape, such as squares, rectangles or as shown in Fig. 5. By dividing the field of vision into segments the transmission of information can be streamlined. Every segment is numbered according to its importance. If the image changes in only one of the segments, the information in this segment only will be transmitted. If many segments of the image change, the changes in the most important segments will be given the highest priority. In Fig. 5 a basic geometrical figure, a pentagon 70, is observed.
  • the segments are numbered (A1-A16). Each segment covers approximately 2.5% of a circular centre area of the field of vision. The most important information is in the centre of the image, that is segments A1, A2, A3 and A4. Secondly come segments A5, A6, A7, A8 and so on. If information in the image changes in only one of the segments, that information will be transmitted, such information being only about 2.5% of the full image. If information in many segments changes, segments with the lower numbers will be given higher priority. In the shown embodiment this means the segments that are located near the centre of the image. The areas outside the outer segments are given low priority and are updated only when there is time left over. If segments of higher priority need to be updated continuously (for instance if the user is looking at a rotating fan), the lower priority segments will still be updated, but with a lower frequency, such as one tenth of the frequency of the segments of higher priority.
  • If the user moves a finger into the field of vision to point at something, this would probably be in the centre of the image. If for example a finger is moved from the bottom right corner towards the centre, segment A15 will be changed first, then again segment A15 and segment A11, then segments A15, A11 and A7, and finally segments A3, A7, A11 and A15. Together these segments represent 10% of the whole field of vision, which means that the transmission of relevant information can be accomplished ten times faster than if no streamlining is made.
  • In Fig. 6 the wearer of the goggles has changed his view by moving the line of sight slightly downward and to the left. As a result the pentagon 70 has moved up and a small distance to the right.
  • the division of the field of vision into segments means that only a hatched section 71 of the view has to be updated. In the shown embodiment only about 10% of the image information has to be transferred.
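The priority scheme of Figs. 5 and 6 — central segments updated every frame, peripheral segments at a fraction of that rate — can be sketched as a simple scheduler. The segment names and the 1:10 rate follow the example in the description; the function itself is illustrative:

```python
def segments_to_send(changed, frame_no, high_priority, low_rate=10):
    """Select which changed segments to transmit this frame: segments
    in the high-priority (central) set go every frame, the rest only
    every `low_rate`-th frame."""
    send = [s for s in changed if s in high_priority]
    if frame_no % low_rate == 0:
        send += [s for s in changed if s not in high_priority]
    return send
```

A continuously changing centre (the rotating fan of the example) thus never starves the periphery completely; outer segments still refresh, only ten times more slowly.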
  • in a two-camera embodiment the adviser may get the picture from only one of the cameras. When doing service or troubleshooting it could be of value to be able to perform distance measurements. This will be possible with 3-D goggles.
  • the adviser points at an item in the picture he wants to measure from and marks it.
  • the point will be transmitted to the goggles, where the computer in the goggles finds the same item in the picture from the other camera.
  • the difference between the positions of the item in the two pictures is used to localize the item relative to the goggles.
  • the adviser then points at the other item he wants to measure to, and the position of that item is calculated. Thereafter the distance between the two items can be calculated.
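The distance measurement with 3-D goggles is classical stereo triangulation: the horizontal disparity between the two camera images gives the depth, from which the 3-D positions of the two marked items, and the distance between them, follow. A sketch assuming rectified pinhole cameras with known focal length (in pixels) and baseline; the parameter names are illustrative, not from the patent:

```python
import math

def locate(focal_px, baseline, xl, xr, y):
    """Triangulate a point from its horizontal image coordinates in
    the left (xl) and right (xr) camera of a rectified stereo pair."""
    disparity = xl - xr                 # larger for nearer points
    z = focal_px * baseline / disparity # depth along the optical axis
    x = xl * z / focal_px               # back-project to world units
    yw = y * z / focal_px
    return (x, yw, z)

def distance(p, q):
    """Euclidean distance between two 3-D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
```

With a 0.1 m baseline and a 1000-pixel focal length, a 10-pixel disparity places an item 10 m away; two such localized items give the sought distance directly.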
  • the wearer of the goggle notes a goggle ID and receives a code at the display.
  • the code can be a random generated code.
  • the wearer gets in contact with an adviser for assistance and forwards the ID and the code.
  • the adviser opens a predetermined web page and inputs code and goggle ID.
  • the ID guides the adviser to the right camera.
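The pairing procedure described in the last bullets — a goggle ID plus a randomly generated code that the wearer forwards to the adviser — might be sketched as follows. The class and method names are assumptions, not from the patent:

```python
import secrets

class SessionRegistry:
    """Pairing sketch: the goggles register under their ID with a
    one-time random code; the adviser connects with ID + code."""

    def __init__(self):
        self._sessions = {}

    def register(self, goggle_id):
        code = secrets.token_hex(3)      # random code shown on the display
        self._sessions[goggle_id] = code
        return code

    def connect(self, goggle_id, code):
        # the ID selects the right camera; the code authorizes access
        return self._sessions.get(goggle_id) == code
```

An adviser entering the right ID with a wrong or stale code is refused, which matches the role of the randomly generated code in the described procedure.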

Abstract

A method for remote visualization, including the steps of capturing image information of a view in a camera module (14) of a mobile electronic device (10) and displaying captured image information at a first display means (28), said displaying including projecting the image information onto the retina of a human user wearing the mobile electronic device (10). The method further includes separately making image information captured by the mobile electronic unit available at a second display means (44) at a remote location, generating data and indicator information at said remote location, and displaying generated data and indicator information at said second display means (44). Generated data and indicator information are transferred to the mobile device (10) and at least a part of the transmitted data and indicator information are displayed at said first display means (28) together with captured image information.

Description

A METHOD AND A DEVICE FOR REMOTE VISUALIZATION
TECHNICAL FIELD
The invention relates to a method and a device for remote visualization. Service and maintenance sometimes have to be carried out at remote locations. Long distances, cultural barriers, confusion of language, shortage of competence, etc., make service and maintenance work very costly.
PRIOR ART
In many situations journeys have been replaced by broadband communications. Global communication possibilities have been made possible through the vast development of communication technologies. A standard for two-way broadband real time communication has been established, although only a few companies are able to deliver quality systems that are suitable for use in service and maintenance applications.
Prior art systems fail to provide appropriate working conditions for service personnel. They also show drawbacks in producing usable feedback. A document that describes the use of a video camera or equivalent equipment for other purposes is US6803887. The equipment disclosed in this document includes a VRD (virtual retinal display), a video camera and a module for determining the position of the eye. The document is very brief and does not state clearly how the equipment is constructed. The document implies that the information captured by the camera might be partially visible to the carrier of the device. The captured information, or some part of it, is transferred wirelessly to some kind of server where the information is processed. On the basis of the transferred information, further information, new and/or modified, is returned for display on the VRD unit.
SUMMARY OF THE INVENTION
An object of the present invention is to provide a method for remote visualization that will allow service personnel at remote locations to be guided more accurately by skilled persons than in prior art systems. The method in accordance with the invention also allows information to be exchanged between the remote location comprising a first display means and a central unit comprising a second display means. A mobile device is provided at the remote location. Image information is generated and picked up locally by the mobile device and is displayed to a user wearing the mobile device. Preferably the image information is projected directly onto a retina of the user by a first display means. The image information also is transmitted to and displayed at a second display means located at a position different from the location of the mobile device. The mobile device is worn by the user to pick up image information in the line of sight of the user. The mobile device comprises or is connected to a computer.
Distance seeing or remote visualizing is made possible with video goggles equipped with one or two zoom and autofocus cameras that are connected to a computer carried by the wearer of the goggles. The wearer will see the camera picture(s) displayed in the goggles and will thereby experience it as if he was looking without the goggles; for example, the picture can be projected onto the retina of the carrier of the goggles by a so-called virtual retinal display (VRD).
The computer connects to a webpage, where anyone granted access can login and he or she will be able to see the same picture from any location in the world, with internet access. When connected to the carrier of the goggles, the person will be able to point in the picture with his/her mouse so that the cursor will appear on the screen of the goggles. With the cursor the connected person(s) will be able to guide the carrier of the goggles through any task he might need help performing.
Data and indicator information is generated at the position of the second display means and transmitted to the mobile device, where it is displayed along with the locally generated image information at the first display means. Such data and indicator information can include information from databases, technical information and digital maps and plans. In various embodiments the mobile device comprises a remote temperature transducer. Preferably, the remote temperature transducer is directed in the line of sight of the user and a point of aim is displayed to the user as well as on the second display means. In a practical embodiment distance seeing or remote visualizing is made possible with video goggles equipped with one or two zoom and autofocus cameras that are connected to a computer carried by the same person. The person will observe the images picked up by the camera(s) displayed in the goggles and will thereby experience it as if he was looking without the goggles.
The computer can be connected to a web site at a remote location, where anyone granted access can login and will be able to observe the same image or picture from any location in the world having access to the internet. When connected to the wearer of the goggles, the person will be able to point in the image with a pointing means such as a mouse and the cursor will appear on the screen of the goggles. A person connected to the web site will be able to assist the wearer of the goggles in performing tasks at the remote location of the wearer. In various embodiments a web server is operating in the computer and the web server is connected to and made available to the internet.
If the wearer of the goggles wishes to point something out directly in the picture, this is simply done by pointing with a finger or with a rectangle that is visible in the centre of the image. The wearer can communicate with connected parties through sound and picture as if they were standing right next to each other. If the carrier of the goggles is occupied and needs to adjust, for example, the camera zoom, this is simply done by asking the connected person(s) to make the adjustments of the camera. The person(s) connected are also able to show the wearer of the goggles pictures, blueprints and other documents. There is also a possibility to point, zoom and more in any shared document, and via the sound link explain and guide the wearer of the goggles. Also the wearer of the goggles will have the possibility to point in the shared documents with his own cursor so that he can ask questions or explain more easily. Indicator information, such as the position of a cursor or a pointing symbol, is transmitted to the goggles and displayed to the wearer. The discussion can be saved in a database that will be accessible to all participants later on.
If the wearer of the goggles turns his head, the image will move a small amount between transmissions. Since a major part of the image remains unchanged, only information indicating how much, and in which direction, the movement has occurred is transmitted; this information is sufficient. Thereafter the new part of the scene that appears in the field of vision is transmitted in its entirety. The arrow with which the adviser is pointing can, as an option, be locked to the background, so that if the wearer of the goggles moves his head slightly the arrow still points at the same point. The adviser's movement of the arrow will then also be made in reference to the background and not in absolute coordinates. The arrow changes depending on the angle of movement, which makes it possible to point at things in any desired direction.
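The background-locking of the arrow described above can be sketched as a simple coordinate transform (an illustrative Python sketch; the function and parameter names are assumptions, not part of the disclosure):

```python
def background_locked_position(arrow_bg, view_offset):
    """Convert a background-fixed arrow position to screen coordinates.

    arrow_bg    -- (x, y) position of the arrow in background (scene)
                   coordinates, recorded when the adviser placed it
    view_offset -- (dx, dy) by which the view has shifted since then,
                   estimated from the transmitted motion information
    """
    # Subtracting the camera motion keeps the arrow over the same
    # scene point even though the goggle wearer has turned his head.
    return (arrow_bg[0] - view_offset[0], arrow_bg[1] - view_offset[1])
```

With this convention, a head turn that shifts the view 10 pixels right and 5 pixels up moves the drawn arrow the opposite way, so it keeps pointing at the same scene feature.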
A device in accordance with the invention can be used in applications where corresponding means have not previously been exploited. The method and the device have properties that will significantly improve education, counseling and inspection in real-time situations and at a distance. Energy and travel time can be saved, and the environmental consequences will be positive. A further advantage is the possibility to fulfill future documentation requirements in different applications. Possible areas of use are:
1) Maintenance and service on unusual machines and equipment.
2) Telemedicine.
3) Emergency services and the military, where fast counseling or guidance is needed.
4) Damage control.
5) Products for the games and amusement market.
BRIEF DESCRIPTION OF THE DRAWINGS In order that the manner in which the above recited and other advantages and objects of the invention are obtained will be readily understood, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings.
Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered to be limiting of its scope, the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which: Fig. 1 shows schematically a basic embodiment of a device in accordance with the invention mounted on a wearer, Fig. 2 is a basic block diagram showing one embodiment of a device in accordance with the invention,
Fig. 3 shows a view available on a display of the device in Fig. 1, Fig. 4 shows a view available on a display of a device at a remote location,
Fig. 5 schematically shows division of an image into segments and Fig. 6 schematically illustrates movement of the view.
DETAILED DESCRIPTION
Image processing and internet communication are important elements when implementing the invention. By optimizing the information that is actually transferred, substantial improvements are possible. Image information is continuously monitored, and only information relating to sections of changed image information is transmitted. The image information can also be evaluated with regard to importance, so as to give priority to the most relevant or important information.
Fig. 1 shows a mobile device 10 in accordance with one embodiment of the invention. The device 10 is formed as goggles and worn by a user 12. In a preferred embodiment the user will only be able to see through the goggles, that is, through a first display unit built into the goggles. Image information is picked up by a camera module 14 and transmitted to the first display unit. In the shown embodiment the camera module 14 comprises one digital camera. In other embodiments two or more cameras are included, so as to pick up and display more complex views.
The device 10 also comprises a control unit, cf. Fig. 2, interconnecting and controlling the camera module 14, the first display and headphones 16. The control unit can be connected to the internet and will operate as a web server that other units may connect to through a conventional web browser. In the embodiment shown in Fig. 1 the device in accordance with the invention also comprises a pyrometer, or radiation thermometer, 18 and a light unit 20. Preferably, both the pyrometer 18 and the light unit 20 are mounted externally on the goggles so as to be directed in the line of sight.
In the embodiment shown in Fig. 2 the device in accordance with the invention comprises the mobile device 10 and at least one remote unit 22. The mobile device 10 and the at least one remote unit 22 are connected through a general communication system 24, such as the internet. In the shown embodiment information is exchanged through a web server 25. The connection between the mobile device 10 and the remote unit is wireless in the embodiment shown in Fig. 2; in other embodiments a wire connection is used. The web server 25 comprises a computer 27 and a communications module 29.
The mobile device 10 comprises a first control unit 24 that is operatively connected to other units and modules of the mobile device 10. In one embodiment the first control unit 24 includes a conventional personal computer. The first control unit 24 is connected to a first communication unit 26, such as a modem or similar device. Communication between the mobile device 10 and the remote unit 22 is handled by the first communication unit 26, which can be designed for transmission through cable, radio signals, light or sound. Image information, other data as well as sound from the parties involved are transferred through the communication unit 26. In various embodiments the communication unit 26 comprises means for communicating through a TCP/IP protocol or a similar protocol. The mobile device 10 further comprises a first display unit 28, the camera module 14, the light unit 20 and the radiation thermometer 18. In a preferred embodiment the first display unit 28 includes a virtual retinal display (VRD), also known as a retinal scan display (RSD). The RSD draws a raster display directly onto the retina of the eye. Image information is picked up by the camera module 14, completed and modified by the first control unit 24, and transmitted to the first display unit 28.
A stroboscope vision module 30, which can be implemented partly or fully as software running in the first control unit 24, recognizes repeating views in one or many segments. The scanning frequency can then be adjusted to the frequency at which the views appear. When viewing, for instance, a rotating wheel, products on a conveyor belt or something vibrating in the field of vision, it will be possible to study the component, since it appears as if it did not move or moves slowly. Slower processes can be shown more clearly by not changing the picture until the process repeats again.
By increasing or decreasing the picture frequency marginally, it will be possible to make it appear, for example, that a wheel is turning slowly, or to show a fast mounting process in ultra-rapid motion. The first control unit also calculates turns per second, the number of products per minute and so on, and displays corresponding information at the first display unit 28.
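The stroboscope effect above follows from aliasing: sampling a periodic process at its own frequency freezes it, and a small offset makes it appear to move slowly. A minimal Python sketch of the relationship (the function name and units are assumptions for illustration):

```python
def apparent_rate(true_freq_hz, capture_freq_hz):
    """Apparent rotation rate (revolutions/s) of a periodic process
    sampled at capture_freq_hz.

    Sampling at exactly the true frequency makes the process appear
    frozen (rate 0); a slightly lower capture frequency makes it
    appear to turn slowly forward, as described for the module.
    """
    # Revolutions completed between two consecutive captured frames
    revs_per_frame = true_freq_hz / capture_freq_hz
    # Only the fractional drift per frame is visible on the display
    drift = revs_per_frame - round(revs_per_frame)
    return drift * capture_freq_hz
```

For a wheel turning at 25 rev/s, capturing at 25 frames/s freezes it, while capturing at 24 frames/s shows it turning at about 1 rev/s.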
In a color search module 32 a search color is specified. The first control unit 24 and the camera module 14 locate all occurrences of the specified color, and all objects or image segments having that color are highlighted. In a movement visualization module 34, objects moving in the captured image are located and followed. The first control unit 24 calculates the orbit or line of movement and displays a corresponding line in the first display unit 28. The first control unit 24 is capable of displaying the corresponding line continuously or intermittently, and the displaying of the line can be interrupted at any time. Flow charts, maps, constructional drawings and component listings can be downloaded to the mobile device from the remote unit 22 and displayed through an information layer module 36. In a preferred embodiment the downloaded objects are displayed in a transparent layer at the first display unit 28, so as not to cover elements of the image picked up by the camera module 14. The positioning of the transparent layer and the objects displayed therein, and a correct scale factor, are derived from detectable reference dots in the real scene.
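The color search described above amounts to a per-pixel tolerance match against the specified search color. An illustrative Python sketch (the tolerance threshold and representation of the image as nested RGB tuples are assumptions; a real implementation would operate on camera frames):

```python
def color_mask(image, target, tol=30):
    """Return a boolean mask marking pixels whose RGB value lies
    within `tol` of `target` in every channel.

    image  -- nested list of (r, g, b) tuples, row by row
    target -- (r, g, b) search color specified in the module
    The True cells of the mask identify the segments to highlight.
    """
    return [[all(abs(c - t) <= tol for c, t in zip(px, target))
             for px in row] for row in image]
```

Applied to a frame, the mask can be used to brighten or outline every matching object on the first display unit.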
In one embodiment the mobile device 10 comprises means for acting as a server. The server essentially presents the view that is displayed on the first display unit 28. Specific remote users are informed about the server internet address and, where appropriate, a user id and password. By accessing the server, remote users will be able to see substantially the same view as the user of the mobile device 10.
In the shown embodiment the web page is instead provided in a web server 25 that is connected to the internet. The web server comprises a computer 27 and a communication module 29. The web server will receive image information from the camera module 14 and make the image information, and also other information originating from the mobile device 10, available to other users. In various embodiments the web server is implemented as a section of the mobile device 10, for instance as a part of the first control unit 24.
The first communication unit 26 comprises a first encryption function 31, and the second communication unit 46 comprises a second encryption function 33. In a similar way the communication module 29 of the web server comprises a third encryption function 35. Each of the encryption functions may also include a tunneling function for security reasons. A remote user will be able to move a mouse pointer or cursor generated and visible at a remote display, and the position of the mouse pointer will be transmitted to the mobile device 10 and displayed also at the first display unit. As a result, the wearer of the goggles as well as all remote users will be able to see the mouse pointer in the displayed image. If a plurality of remote users are active, each remote user may have a mouse pointer that differs in color or other appearance. In one embodiment the mouse pointer or cursor generated at the remote display is locked at a background position. As a result, the position of the cursor in relation to the background will not change, even if the mobile device is turned.
A locally generated cursor is locked at a central position of the displayed view, cf. Fig. 3. The locally generated cursor can be generated in a cursor generation module 37. In one embodiment the locally generated cursor is formed as a transparent rectangle. Other image functions are processed in an image processing unit 40. Each mobile device 10 can be given a specific identity that can be stored locally, for instance in a memory circuit 38. The specific identity, or an associated identification reference, is required for remote users to establish contact with the mobile device 10.
The remote unit 22 comprises a remote or second control unit 42, which in one embodiment is a conventional computer, such as a personal computer. The remote control unit 42 is connected to a second display unit 44, which can be a conventional monitor or display unit. Communication between the remote unit 22 and the mobile device 10 is provided by a second communication unit 46. The first communication unit 26 and the second communication unit 46 communicate over a suitable communication medium 48, such as the internet. In the control unit 42 data can be generated or retrieved for transfer to, and display at, the first display unit 28.
Fig. 3 shows the view as seen by the wearer of the goggles or mobile device 10, and Fig. 4 shows the view as seen by the remote users. Image information as captured by the camera module is shown within a double-line rectangle 47. All elements that are common to both views are given the same reference numerals and are described only once. The view of Fig. 3 is displayed at the first display unit 28. Most visible elements are objects in the area in front of the wearer; switches 50, a control panel 52 and a machine contour line 53 are such elements. The view as seen on the first display unit 28 also comprises objects and elements that are generated either by the first control unit 24 or by the second control unit 42. A pointer 54 shaped as an arrow is generated by the second control unit 42 on the basis of actions taken by a remote user or advisor. The pointer 54 can be locked either to a background element, such as a specific switch as shown in Fig. 3, or at a specific position of the view. In the first case the pointer will remain at the switch, as long as the switch is visible, also when the image changes as a result of a movement of the camera module 14.
A cursor 56 shaped as a rectangle is generated by the first control unit 24 and is normally fixed in the centre, or at another specific position, of the image as displayed. The wearer of the mobile device can use the cursor 56 as a pointing device, as an alternative to pointing with a finger that will be visible at the first display unit 28 as well as at the second display unit 44. By using a transparent rectangle the central section of the image will not be covered. A distinct point of the image can be referred to by moving the goggles to a position where a corner of the rectangle is located at the distinct point.
The control panel 52 of the machine in view also comprises a machine display 58. The wearer of the goggles needs to know how to deal with a question mark 59 shown at the machine display. The wearer positions the goggles to place the cursor rectangle 56 over the question mark 59. An advisor at the remote location, who can observe the same view, cf. Fig. 4, is directed to the relevant position by the rectangle 56. The advisor can also observe that one switch is set to an incorrect position and enables the pointer 54. By positioning the pointer 54 at the switch in question, the attention of the wearer is directed to said switch.
The radiation thermometer 18 is designed to measure the temperature along a sight line corresponding to the position of the cursor rectangle 56, at a reasonable distance from the goggles. A measured temperature value is displayed on the first display unit 28 and the second display unit 44. In the shown embodiment the temperature value appears on a scale 60 as well as in a small window 61.
In various embodiments the camera module 14 is provided with a zoom function. The zoom function can be controlled by the wearer of the goggles as well as by the advisor. The wearer adjusts the zoom level locally, and the advisor by clicking at a zoom field 62 in the view. Zoom information can be given by the advisor and transferred to the mobile device, where the first control unit will adjust the zoom level of the camera module accordingly. Name information or other identification of advisors can be displayed in name windows 64. Preferably, the name of each advisor or participant is given a separate color, so as to allow identification during the process.
The advisor also has the possibility to fetch a marker, such as an arrow 66, with his mouse pointer and then position it anywhere in the viewed image. The marker arrow can be locked to the background. The advisor can also choose to save snapshots or video sequences by clicking on corresponding recording buttons 65 on the second display unit 44. Recorded information, which may also include voice conversations and other data, is saved. The information can be saved locally, for instance in the memory circuit 38, and later be transferred to a web server or to the remote unit 22.
An information field 68 is available in the view of the first display unit and of the second display unit. Data and other information can be typed in by the advisor or another participant, generated by other means or collected from databases and other data sources, and will be made available at the display units. Data from databases, maps and plans can also be displayed overlaid on captured image information at the display units. The field of vision is divided into segments of appropriate appearance and shape, such as squares, rectangles or the shape shown in Fig. 5. By dividing the field of vision into segments, the transmission of information can be streamlined. Every segment is numbered according to its importance. If the image changes in only one of the segments, the information in this segment only will be transmitted. If many segments of the image change, the changes in the most important segments will be given the highest priority. In Fig. 5 a basic geometrical figure, a pentagon 70, is observed.
The segments are numbered A1-A16. Each segment covers approximately 2.5% of a circular centre area of the field of vision. The most important information is in the centre of the image, that is, segments A1, A2, A3 and A4; secondly come segments A5, A6, A7, A8 and so on. If information in the image changes in only one of the segments, that information will be transmitted, such information being only about 2.5% of the full image. If information in many segments changes, segments with the lower numbers will be given higher priority; in the shown embodiment this means the segments located near the centre of the image. The areas outside the outer segments are given low priority and are updated only when there is time left over. If segments of higher priority need to be updated continuously, for instance if the user is looking at a rotating fan, the lower-priority segments will still be updated, but at a lower frequency, such as one tenth of the frequency of the higher-priority segments.
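The prioritization above can be sketched as a small scheduling function (an illustrative Python sketch; the choice of A1-A4 as the high-priority set and the one-in-ten update rate follow the description, while the function and parameter names are assumptions):

```python
HIGH_PRIORITY = frozenset({"A1", "A2", "A3", "A4"})  # centre segments

def schedule_updates(changed, frame_no, low_rate=10):
    """Order the changed segments for transmission.

    Centre segments (A1..A4) are always transmitted first, ordered by
    segment number; peripheral segments are transmitted only on every
    `low_rate`-th frame, mirroring the one-tenth update frequency
    described for low-priority segments.
    """
    def seg_index(name):
        return int(name[1:])          # "A12" -> 12

    high = sorted((s for s in changed if s in HIGH_PRIORITY), key=seg_index)
    low = sorted((s for s in changed if s not in HIGH_PRIORITY), key=seg_index)
    if frame_no % low_rate != 0:
        low = []                      # skip low-priority segments this frame
    return high + low
```

On most frames only the changed centre segments are sent; every tenth frame the changed peripheral segments are appended after them.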
If the user moves a finger into the field of vision to point at something, this will probably be in the centre of the image. If, for example, a finger is moved from the bottom right corner towards the centre, segment A15 will change first, then segments A15 and A11, then segments A15, A11 and A7, and finally segments A3, A7, A11 and A15. Together these segments represent 10% of the whole field of vision, which means that the transmission of relevant information can be accomplished ten times faster than if no streamlining is made.
In Fig. 6 the wearer of the goggles has changed his view by moving the line of sight slightly downward and to the left. As a result, the pentagon 70 has moved up and a small distance to the right. The division of the field of vision into segments means that only a hatched section 71 of the view has to be updated; in the shown embodiment only about 10% of the image information has to be transferred. By mounting two cameras on the goggles a 3-D image can be produced. The adviser may get the picture from only one of the cameras. When doing service or troubleshooting it can be of value to be able to perform distance measurements, and this will be possible with 3-D goggles. The adviser points at an item in the picture that he wants to measure from and marks it. The point will be transmitted to the goggles, where the computer in the goggles finds the same item in the other camera picture. The difference between the positions of the item in the two pictures is used to localize the item relative to the goggles. The adviser then points at the other item he wants to measure to, and the position of that item is calculated. Thereafter the distance between the two items can be calculated.
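The localization step above is standard stereo triangulation: the positional difference (disparity) of the item in the two camera pictures gives its depth as Z = f·B/d for focal length f and camera baseline B. A minimal Python sketch under that pinhole-stereo assumption (the patent itself does not specify the camera model; all names are illustrative):

```python
def stereo_depth(x_left, x_right, focal_px, baseline_m):
    """Depth (metres) of a point from its horizontal pixel position in
    the left and right camera images: Z = focal * baseline / disparity."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    return focal_px * baseline_m / disparity

def distance_between(p1, p2):
    """Euclidean distance between two localized 3-D points (x, y, z)."""
    return sum((a - b) ** 2 for a, b in zip(p1, p2)) ** 0.5
```

Once both marked items have been localized in goggle-relative coordinates, `distance_between` yields the measurement the adviser asked for.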
In use, the wearer of the goggles notes a goggle ID and receives a code at the display. The code can be a randomly generated code. The wearer gets in contact with an adviser for assistance and forwards the ID and the code. The adviser opens a predetermined web page and inputs the code and the goggle ID. The ID guides the adviser to the right camera. By using a random code it is possible to prevent an adviser from getting into contact with the wearer of the goggles without being invited.
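A one-time code of this kind could be generated as follows (an illustrative Python sketch; the code length and digit-only alphabet are assumptions, as the disclosure only requires a random code):

```python
import secrets

def make_access_code(n_digits=6):
    """Randomly generated one-time code shown on the goggle display.

    The adviser must enter this code together with the goggle ID on the
    web page, so a remote party cannot connect without being invited.
    """
    # secrets gives cryptographically strong randomness, unlike random
    return "".join(secrets.choice("0123456789") for _ in range(n_digits))
```

A fresh code per session limits access to the adviser the wearer actually contacted.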

Claims

1. A method for remote visualization, including the steps of capturing image information of a view in a camera module (14) of a mobile electronic device (10) and displaying captured image information at a first display means (28), said displaying including projecting the image information onto the retina of a human user wearing the mobile electronic device (10), c h a r a c t e r i s e d by separately making image information captured by the mobile electronic unit available at a second display means (44) at a remote location, generating data and indicator information at said remote location, displaying generated data and indicator information at said second display means (44), and transferring generated data and indicator information to the mobile device (10) and displaying at least a part of the transmitted data and indicator information at said first display means (28) together with captured image information.
2. A method as claimed in claim 1, also including the step of displaying said at least part of the transmitted data and indicator information by overlaying it on said captured image information.
3. A method as claimed in claim 1, also including the steps of generating at the remote location a pointer symbol, determining the location of the pointer symbol in the view, transmitting the location of the pointer symbol to the mobile device (10), generating a corresponding pointer symbol in the mobile device and displaying the corresponding pointer symbol at the first display means (28) at a position corresponding to the location of the pointer symbol in the view.
4. A method as claimed in claim 1, also including the steps of generating at the remote location a pointer symbol, determining the location of the pointer symbol in relation to background elements of the view, transmitting the location of the pointer symbol to the mobile device (10), generating a corresponding pointer symbol in the mobile device and displaying the corresponding pointer symbol at the first display means (28) at a position that is fixed in relation to said background elements of the view.
5. A method as claimed in claim 1, also including the step of activating from the remote location a zoom function of the camera module (14).
6. A method as claimed in claim 1, also including the step of recording sequences of images that are displayed at said first display means (28).
7. A method as claimed in claim 1, also including the step of providing the image information captured by the mobile electronic unit at a web server.
8. A method as claimed in claim 1, also including the step of adjusting a frequency of the capturing of image information in the camera module (14) in dependence on an actual frequency or speed of a moving object in the view.
9. An apparatus including a mobile electronic device (10) with a camera module (14) and first display means (28) for displaying image information captured by said camera module (14), said displaying including projecting the image information onto the retina of a human user wearing the mobile electronic device (10), c h a r a c t e r i s e d by a web server (25) receiving image information from the camera module (14) and making the image information available to a second display means (44) provided in at least one remote unit (22), first communication means (26) provided in the mobile device (10), second communication means (46) provided in the remote unit (22), and data generating means (42) providing information to be displayed at the first display means (28) and the second display means (44).
10. An apparatus as claimed in claim 9, wherein the camera module (14) is connected through a first communications means (26) to a web server (25) for supplying image information, and the second display means (44) is connected to the web server for retrieving the image information.
11. An apparatus as claimed in claim 9, wherein the mobile device (10) comprises an information layer module (36) and an image processing unit (40).
12. An apparatus as claimed in claim 9, wherein the mobile device (10) comprises a radiation thermometer (18) mounted so as to be directed in the same direction as the camera module (14).
PCT/SE2009/050394 2008-04-17 2009-04-16 A method and a device for remote visualization WO2009128781A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE0800888-0 2008-04-17
SE0800888 2008-04-17

Publications (1)

Publication Number Publication Date
WO2009128781A1 2009-10-22

Family

ID=40856515

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014076236A1 (en) 2012-11-15 2014-05-22 Steen Svendstorp Iversen Method of providing a digitally represented visual instruction from a specialist to a user in need of said visual instruction, and a system therefor
EP2765502A1 (en) 2013-02-08 2014-08-13 ShowMe Telepresence ApS Method of providing a digitally represented visual instruction from a specialist to a user in need of said visual instruction, and a system therefore
EP2945341A1 (en) 2014-05-15 2015-11-18 ShowMe Telepresence ApS Method of providing a digitally represented visual instruction from a specialist to a user in need of said visual instruction, and a system therefor
WO2015173344A1 (en) 2014-05-15 2015-11-19 Showme Telepresence Aps Method of providing a digitally represented visual instruction from a specialist to a user in need of said visual instruction, and a system therefor
WO2016014871A1 (en) * 2014-07-25 2016-01-28 Microsoft Technology Licensing, Llc Multi-user gaze projection using head mounted display devices
US20160026242A1 (en) 2014-07-25 2016-01-28 Aaron Burns Gaze-based object placement within a virtual reality environment
US9645397B2 (en) 2014-07-25 2017-05-09 Microsoft Technology Licensing, Llc Use of surface reconstruction data to identify real world floor
EP3214586A1 (en) * 2016-03-04 2017-09-06 Thales Deutschland GmbH Method for maintenance support and maintenance support system
US9858720B2 (en) 2014-07-25 2018-01-02 Microsoft Technology Licensing, Llc Three-dimensional mixed-reality viewport
US9865089B2 (en) 2014-07-25 2018-01-09 Microsoft Technology Licensing, Llc Virtual reality environment with real world objects
US9904055B2 (en) 2014-07-25 2018-02-27 Microsoft Technology Licensing, Llc Smart placement of virtual objects to stay in the field of view of a head mounted display
US10311638B2 (en) 2014-07-25 2019-06-04 Microsoft Technology Licensing, Llc Anti-trip when immersed in a virtual reality environment
US10451875B2 (en) 2014-07-25 2019-10-22 Microsoft Technology Licensing, Llc Smart transparency for virtual objects
WO2019220043A1 (en) 2018-05-14 2019-11-21 Diotasoft Method and system for teletransmitting to a processing site a video stream captured on a remote intervention site

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020101568A1 (en) * 2001-01-30 2002-08-01 Eberl Heinrich A. Interactive data view and command system
US6611242B1 (en) * 1999-02-12 2003-08-26 Sanyo Electric Co., Ltd. Information transmission system to transmit work instruction information
US20030200058A1 (en) * 2002-04-17 2003-10-23 Kenji Ogawa Maintenance system
US6847336B1 (en) * 1996-10-02 2005-01-25 Jerome H. Lemelson Selectively controllable heads-up display system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 09732859
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 09732859
Country of ref document: EP
Kind code of ref document: A1