WO2016206874A1 - Interaction system - Google Patents

Interaction system

Info

Publication number
WO2016206874A1
WO2016206874A1
Authority
WO
WIPO (PCT)
Prior art keywords
operator
expert
operating environment
interaction system
unit
Prior art date
Application number
PCT/EP2016/061276
Other languages
German (de)
English (en)
Inventor
Rebecca Johnson
Asa Macwilliams
Original Assignee
Siemens Aktiengesellschaft
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Aktiengesellschaft filed Critical Siemens Aktiengesellschaft
Publication of WO2016206874A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/48Details of cameras or camera bodies; Accessories therefor adapted for combination with other photographic or optical apparatus
    • G03B17/54Details of cameras or camera bodies; Accessories therefor adapted for combination with other photographic or optical apparatus with projector
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • G06Q10/101Collaborative creation, e.g. joint development of products or services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/20Administration of product repair or maintenance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/02Details
    • H04L12/16Arrangements for providing special services to substations
    • H04L12/18Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • H04L12/1822Conducting the conference, e.g. admission, detection, selection or grouping of participants, correlating users to one or more conference sessions, prioritising transmission
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N7/185Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/698Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture

Definitions

  • the invention relates to an interaction system for operator assistance of an operator by an expert.
  • the previously proposed means for assisting operators have the significant disadvantage that it is left to the operator being supported to interpret the instructions received from the expert and to locate the points to be processed in the operating environment. It is therefore an object of the present invention to provide an improved interaction system for assisting an operator.
  • the interaction system according to the invention for operator assistance of an operator by an expert comprises one or more operator support devices on the part of the operator as well as a remote expert device on the part of the expert.
  • the operator support device and the remote expert device are at least indirectly coupled via a bidirectional data connection.
  • the operator assistance device on the side of the operator comprises one or more operating environment detection units for detecting an operating environment of the operator and one or more projection units for optical projection of one or more operating instructions on the operating environment.
  • the remote expert device comprises an output unit for outputting the operating environment detected by the operating environment detection unit on the operator's side to the expert and an operator guidance unit for detecting a gesture of the expert assigned to a context of the operating environment and for generating the operating information on the basis of the gesture.
  • the operating instruction is projected on the operator's side in its operating environment.
  • the invention is based on the idea of a remotely acting finger-pointer operated by the expert.
  • the finger pointing, or operating instruction, of an expert is carried out by means of an optical projection onto the operating environment. This operating instruction is generated intuitively on the expert's side by means of a gesture control. The operating instruction prompted by the gesture is processed and transmitted via the bidirectional data link to the operator support device. There, the operating instruction is projected by the projection unit into the operator's operating environment.
  • the interaction system ensures an intuitive instruction to the operator as to which exact point in the operating environment is to be operated. This is particularly evident in the question of which fuse to operate in a cabinet with a multitude of fuses. While previous means of assistance rely on verbal instructions, such as "row 40, twelfth fuse from the left", optical projection of an operating instruction, such as marking the fuse to be operated by means of a light point, allows a more natural instruction of the operator.
  • the interaction system according to the invention also facilitates the expert's finger pointing.
  • the gesture causing the operating instruction is initiated by a fixation with the eyes of the expert.
  • the operator can follow the gaze of the expert, which is projected into the operating environment as a luminous point or as a marking or illumination of a field of view.
  • an expert is not necessarily more qualified than the operator.
  • the interaction system according to the invention is used, for example, in the field of automation technology, in production and machine tools, in process automation, in diagnosis or service support systems as well as for complex components, devices and systems such as vehicles and industrial machines and plants.
  • the operator guidance unit is adapted to detect the gesture from a position or movement of body parts of the expert, or to detect a gesture as a combination of a position and a movement of a body part of the expert, such as gestures that combine positioning with a swiping movement (swipe gestures).
  • the gesture control also includes meta-commands which are not directly linked to a finger gesture, for example gestures which deactivate or activate the optical projection of the operation advice, select certain patterns or colors of the projection, control playback of stored instruction videos, etc.
  • Body parts for defining the gesture are in particular: one or more fingers of the expert, one or both hands of the expert, the head of the expert and one or both eyes of the expert. Depending on the body part, suitable techniques are used to detect the gesture by the user guidance unit.
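To illustrate, a gesture combining a position and a movement could be classified from a tracked sequence of hand positions along the following lines. This is a minimal sketch; the function name, coordinate convention, and displacement threshold are assumptions for illustration, not part of the described system:

```python
def classify_swipe(positions, min_dist=0.15):
    """Classify a tracked gesture from hand positions over time.

    positions: list of (x, y) hand positions in metres, oldest first.
    A small overall displacement counts as a stationary positioning
    gesture ("point"); otherwise the dominant axis gives the swipe.
    """
    dx = positions[-1][0] - positions[0][0]
    dy = positions[-1][1] - positions[0][1]
    if abs(dx) < min_dist and abs(dy) < min_dist:
        return "point"                    # positioning gesture
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"
```

A real guidance unit would additionally smooth the trajectory and take the gesture duration into account; the dominant-axis heuristic above only captures the basic positioning/swiping distinction.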
  • the operator guidance unit comprises a gaze-tracking or eye-tracking device.
  • by means of a gaze tracking device for detecting an eye position and generating location information about the eye position, a gesture triggered by the eyes of the expert is detected.
  • the location information is transferred to the user guidance unit, which calculates the spatial coordinates of the operating information to be projected in the spatial context of the operating environment.
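The calculation of spatial coordinates from the gaze location information can be sketched as a back-projection, assuming a calibrated pinhole camera model and a depth map aligned with the displayed video frame. All names and parameter values here are illustrative assumptions, not taken from the patent:

```python
def gaze_to_environment_point(u, v, depth_map, fx, fy, cx, cy):
    """Back-project the fixated pixel (u, v) into 3D camera coordinates.

    depth_map: per-pixel depth in metres, indexed [row][column];
    fx, fy: focal lengths in pixels; cx, cy: principal point.
    The returned point is the target for the laser projection.
    """
    z = depth_map[v][u]          # depth at the fixated pixel
    x = (u - cx) * z / fx        # pinhole back-projection
    y = (v - cy) * z / fy
    return (x, y, z)

# Example: fixation at the image centre of a 640x480 frame, 2 m away.
depth = [[2.0] * 640 for _ in range(480)]
point = gaze_to_environment_point(320, 240, depth,
                                  fx=500.0, fy=500.0, cx=320.0, cy=240.0)
```

The guidance unit would then transform this camera-space point into the coordinate frame of the projection unit before steering the laser.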
  • the gaze tracking device can be integrated, for example, in a data goggle or in a head mounted display.
  • a gesture triggered by a hand and / or at least one finger of the expert is detected.
  • the data glove translates movements of the hand and fingers into electrical signals. Furthermore, this allows a determination of position and orientation of the hand.
  • alternatively, the position and movement of the hand and fingers may be captured by camera- and/or laser-based gesture detection, which eliminates the need to wear a data glove.
  • a combination of gestures is advantageous; for example, it can be provided that a gesture causing a first operating instruction is triggered by a fixation of an object in the operating environment with the eyes of the expert, and a gesture causing a second operating instruction is triggered by a finger pointing at an object in the operating environment.
  • the first and second operating instructions are preferably simultaneously projected into the operating environment.
  • a visual differentiation of the two operating instructions can be provided, for example by a different coloration of the operating instruction triggered by the eye fixation and the operating instruction triggered by the finger pointer.
  • the operator can follow the gaze of the expert, which is projected into the operating environment as a luminous point or else as a marking or illumination of a field of view.
  • different coloration of an operating instruction can also advantageously be used for other signaling purposes; for example, reserved colors may indicate a higher importance or priority. Further, certain colors may indicate the status of the data connection between the operator support device and the remote expert unit, or the time elapsed between detection of the expert's gesture and the transmission and generation of the operating instruction on the operator's side, thereby indicating possible data-link delays. Further, certain colors may be associated with a respective hand of the expert, for example a red laser for the right hand and a green laser for the left hand. To generate different colors, several independently controllable lasers with different beam colors can be provided on the operator's side. A color can alternatively or additionally be adjusted by mixing different laser beams or by frequency tuning of individual lasers.
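A simple mapping of data-link delay and expert hand to a projection color might look as follows. The thresholds and color assignments are illustrative assumptions, not values from the description:

```python
# Assumed per-hand laser colors, as suggested in the description.
HAND_COLOR = {"right": "red", "left": "green"}

def link_status_color(latency_ms):
    """Encode the age of an operating instruction in its projection color.

    Illustrative thresholds: a fresh instruction is shown green, a
    noticeably delayed one yellow, and a stale one red, warning the
    operator of possible data-link delays.
    """
    if latency_ms <= 150:
        return "green"
    if latency_ms <= 500:
        return "yellow"
    return "red"
```

In practice both schemes would not be mixed on the same projection; the system would reserve one signaling dimension (hand identity or link status) per deployment.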
  • the projection unit includes, for example, a laser in the visible wavelength range, a micromirror array projector, a liquid crystal projector and/or an LED projector. Combinations of the mentioned projection units are also advantageous, for example by projecting a circuit diagram of a system to be serviced onto a flat surface of the operating environment, while the expert can highlight individual areas with a laser marking point. Furthermore, the projection units allow a projection of moving pictures or video sequences, for example to represent a sequence of required hand movements for a complex operating step.
  • advantageously, the projection unit comprises a pattern projector for the definition of arbitrary patterns. For example, a fixation of an area of the operating environment with the eyes of the expert is displayed on the operator's side with a circular operating instruction, and a finger gesture with a punctiform operating instruction.
  • the pattern projector is set up for generating a pattern that is statically perceived by the human eye with an outline marked by a trajectory of a movement gesture.
  • a border of an area drawn by the expert with his finger is displayed on the operator's side as an outline of the bordered area or as a filled area.
  • for this purpose, the laser beam is guided along the contour lines, or within the bordered area, so rapidly that the contour or area is perceived as static by the human eye, owing to the eye's inertia with respect to rapid image changes.
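The required drawing speed can be estimated from the projector's point rate and the refresh rate above which the eye perceives the pattern as static. The figures below (a 30 kpps galvanometer scanner and a 50 Hz refresh, near the flicker-fusion threshold) are assumed for illustration:

```python
def max_contour_points(point_rate_hz=30000, refresh_hz=50):
    """Maximum number of trajectory points per pattern refresh.

    A scanner drawing point_rate_hz points per second can repaint a
    closed contour refresh_hz times per second only if the contour has
    at most this many points; above ~50 Hz the human eye perceives the
    repeatedly drawn contour as a static outline.
    """
    return point_rate_hz // refresh_hz

# An assumed 30 kpps scanner can hold a 600-point contour static at 50 Hz.
```

Longer or more detailed contours would therefore have to be resampled down to this budget before being handed to the laser guide device.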
  • Further advantageous embodiments of the invention relate to the operator assistance device on the part of the operator.
  • optical detection devices include in particular optical cameras and/or depth data scanners.
  • as depth data scanners, combinations of infrared sensors or cameras with a light or laser projection of structures such as grid lines are used.
  • laser depth data scanners are used.
  • an annular or spherical-segment-shaped arrangement is also advantageous for the acoustic detection devices or microphones, in order to allow the expert a spatial and directional hearing of the ambient noise.
  • Other detection devices may include thermal sensors, radiation sensors, gas sensors, as well as physical and chemical sensors of all kinds.
  • the output unit on the expert's side is advantageously embodied as data glasses or a head-mounted display, which on the one hand allows spatial vision through a stereoscopic representation of the operating environment and on the other hand also allows integration of the gaze tracking device.
  • smart glasses allow integration of inertial sensors for detecting head movement as well as gesture detection scanners for detecting gestures of the fingers and / or hands of the expert.
  • the interaction control system is, for example, a server which controls a preparation and transmission of the video data of the operating environment.
  • the server further performs the definition and calculation of patterns and location coordinates of the respective operating instructions, in order to ensure a position-correct projection onto the operating environment.
  • alternatively, the interaction control system is formed by a portable device, associated via a wireless data connection with the remote expert unit on the one hand and the operator support device on the other, with software running on it.
  • FIG. 1a shows a schematic representation of an operator assistance of an operator in an operating environment;
  • FIG. 1b shows a schematic representation of an operator guidance by an expert
  • FIG. 2 shows a schematic representation of an operator assistance device according to an exemplary embodiment.
  • FIG. 1a shows an operator assistance of an operator OPR in a service environment or "service site".
  • the operator OPR works on screw connections of a flange of a pipe connection, which is shown only schematically and in sections. In the situation shown in FIG. 1a, the operator has to loosen a hexagon bolt of the flange connection with a wrench.
  • An operator support device OSD assists the operator OPR in his operating task.
  • the operator assistance device OSD is set up by means of a stand in a not necessarily immediate vicinity to the operator OPR.
  • the operator support device OSD is preferably placed in a position that would be occupied by a personally present expert for operator assistance of the operator OPR.
  • the operator support device OSD comprises a plurality of operating environment detection units for detecting an operating environment of the operator.
  • several optical detection units CAM arranged annularly around the circumference of the operator support device OSD, preferably high-resolution video cameras with wide-angle lenses, serve as the operating environment detection unit. With this arrangement as complete as possible optical detection of the operating environment is achieved.
  • the operator support device OSD further comprises at least one projection unit for the optical projection of an operating instruction on the operating environment.
  • a laser projection unit is used as the projection unit, by which a laser beam LSB is projected onto the operating environment.
  • a point or area marked by the laser beam shows the operator OPR which screw is to be loosened.
  • FIG. 1b shows an operator guidance, taking place at the same time at a location remote from the operating environment, by an expert RME or "remote expert" by means of a remote expert unit.
  • the remote expert unit is at least indirectly coupled to the operator support device OSD via a bidirectional data link and comprises an output unit and a guidance unit.
  • a direct coupling is suitable for simple solutions, in which the respective arithmetic units (not shown) in the interaction system, i.e. in the operator support device OSD and/or in the remote expert unit, control the preparation and transmission of the video data of the operating environment and carry out the definition and calculation of patterns and location coordinates of the respective operating instructions, in order to ensure a position-correct projection onto the operating environment.
  • these tasks can alternatively be taken over by a portable device, associated via a wireless data connection with the remote expert unit and the operator support device OSD, with software or an "app" running on it.
  • in this way, an advantageous outsourcing of at least part of the computing capacity to a computing unit that is already routinely carried along is achieved.
  • alternatively, a more powerful interaction control system in the form of a server (not shown) may be provided via a corresponding coupling.
  • the output unit, in the present exemplary embodiment comprising at least data glasses HMD or a "head-mounted display", serves for an essentially time-synchronized output of the detected operating environment to the expert RME.
  • the operating environment detected by the operator support device OSD is output via the bidirectional data connection to the data glasses HMD of the expert RME.
  • FIG. 1b schematically shows an optical representation of a virtual operating environment VRV which is output in the data goggles HMD of the expert.
  • the expert RME captures the operating environment in a spatial manner via separately prepared optical data for his left and right eyes, which are output stereoscopically from the data glasses HMD.
  • the stereoscopic processing advantageously includes a perspective correction for optical differences in the imaging behavior of the operating environment detection units and the data glasses HMD, or the output unit in general.
  • other components of the output unit include a stereo-acoustic output of the acoustic signals detected in the operating environment, for example via speakers or headphones (not shown), using separately prepared acoustic data for the left and right ear of the expert RME.
  • the stereoacoustic output changes to output with a changed stereoacoustic preferential direction according to the changed turning or tilting angle of the head of the expert RME.
  • the expert can turn his head in this way to better perceive incoming noises from a certain direction.
  • the visual representation of the virtual operating environment VRV changes so that it is displayed with a changed image detail and a changed perspective in accordance with the angle of vision changed by rotation or inclination of the head of the expert RME.
  • for this purpose, signals of at least one detection unit (at least two in the case of a stereo-optical or stereo-acoustic representation) are selected from the plurality of detection units which surround the operator support device in an annular manner, for example along a full circle.
  • in this way, the expert RME can turn in all possible directions and receives stereoscopically prepared views or soundscapes according to the direction he has taken.
  • an image or audio processing is used, which performs an interpolation on the basis of the signals supplied by participating detection units.
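Selecting the detection units that contribute to such an interpolated view can be sketched as picking the two ring cameras bracketing the current viewing direction, with linear blend weights between their images. The geometry and names are illustrative assumptions:

```python
def select_ring_cameras(yaw_deg, n_cameras):
    """Pick the two adjacent ring cameras for a given viewing direction.

    Cameras are assumed evenly spaced on a full circle, camera 0 at 0°.
    Returns (index_a, index_b, weight_a, weight_b), where the weights
    sum to 1 and can be used to blend the two images for interpolation.
    """
    spacing = 360.0 / n_cameras
    idx = int(yaw_deg // spacing) % n_cameras
    nxt = (idx + 1) % n_cameras          # neighbour, wrapping the ring
    w = (yaw_deg % spacing) / spacing    # 0 → fully idx, 1 → fully nxt
    return idx, nxt, 1.0 - w, w
```

The same bracketing logic would apply to the ring of microphones for the stereo-acoustic preferential direction, with one pair per ear.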
  • an image processing process, which is preferably carried out by the operator support device OSD, is a 3D or 2.5D process.
  • the expert RME thus acts with the remote expert unit essentially as if he were in close proximity to the operator OPR. He sees and hears what is going on in the operating environment, communicates with the operator OPR and gives him hints in the form of operating instructions about the location and type of the operation to be performed.
  • the guidance unit comprises at least one data glove GLV, and further means (not shown) for processing the raw data supplied by the data glove GLV, with the aim of generating operating instructions from these raw data.
  • the raw data of the data glove GLV are generated by converting movements of the hand and fingers into electrical signals.
  • fiber optic structures run along the fingers and thumb of the expert RME.
  • a light-emitting diode at one end sends light along the fiber optic structures to a phototransistor at the other end, which converts the light into an electrical signal.
  • the position and orientation of the hand are determined by means of a sensor provided on the back of the hand of the data glove GLV, with which, for example, the strength and orientation of three mutually perpendicular, artificially generated magnetic fields in space are measured.
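Converting the phototransistor signal of such a fibre-optic bend sensor into a flexion value could be done with a simple calibrated linear map, since light loss grows with finger flexion. The calibration values below are illustrative assumptions:

```python
def finger_flexion(signal, straight=1.0, bent=0.35):
    """Map a normalized phototransistor signal to a flexion in [0, 1].

    Fibre-optic bend sensing: light loss along the fibre grows as the
    finger bends, so the signal falls from `straight` (fully extended)
    towards `bent` (fully flexed). Values are clamped to the calibrated
    range; the calibration constants here are assumed, not measured.
    """
    s = min(max(signal, bent), straight)
    return (straight - s) / (straight - bent)
```

A real glove would calibrate `straight` and `bent` per finger and per user before translating the signals into gesture raw data.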
  • alternatively or additionally, gestures are detected by devices integrated in the data glasses HMD, for example a gaze tracking device, inertial sensors for detecting a head movement, and optical devices for detecting gestures of the fingers and/or hands of the expert RME.
  • detection of gestures of the fingers and/or hands of the expert RME may optionally be assisted by a glove in a signal color that facilitates detection.
  • This glove GLV need not necessarily have the facilities of the data glove for translating hand and finger movements into electrical signals.
  • the guidance unit finally assigns the gestures to the context of the operating environment, with the result that a correctly produced operating instruction is projected into the operator's operating environment at the correct position.
  • when the expert RME points at virtual objects in the virtual operating environment VRV, an operating instruction is projected onto the real objects corresponding to those virtual objects.
  • a virtual representation of the hand of the expert RME is displayed in the virtual operating environment VRV.
  • FIG. 2 shows a schematic representation of an operator assistance device OSD according to an exemplary embodiment.
  • the operator support device OSD has a cylindrical base body, which is connected via a tripod coupling TRP with a tripod.
  • a suspension device HOK alternatively allows a suspended attachment of the operator support device OSD.
  • a plurality of optical detection units CAM and acoustic detection units MIC are arranged in an annular manner, advantageously along a full circle. Adjacent detection units are preferably spaced such that their detection ranges overlap and/or that the distance between the detection units corresponds approximately to the distance between the corresponding human sensory organs - eye or ear.
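The number of detection units needed so that adjacent units sit roughly one human interocular distance apart follows directly from the ring circumference. A sketch with an assumed spacing of about 6.5 cm:

```python
import math

def units_on_ring(radius_m, spacing_m=0.065):
    """Number of detection units on a ring of the given radius so that
    adjacent units are roughly one human interocular distance
    (~6.5 cm, an assumed value) apart; at least two units are needed
    for a stereoscopic pair.
    """
    circumference = 2 * math.pi * radius_m
    return max(2, round(circumference / spacing_m))
```

For a head-sized device (radius around 10 cm) this yields on the order of ten units; the miniature and giant formats discussed below would scale the spacing rather than the count.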
  • at least two acoustic output units LSP and at least one projection/display unit DSP are arranged along the lateral surface, or else only in a partial surface oriented in a preferred direction.
  • the projection/display unit DSP serves to output optical information, which is either displayed on a display of the projection/display unit DSP or projected onto the operating environment by the projection/display unit DSP.
  • an embodiment of the projection / display unit DSP as a micromirror array projector, liquid crystal projector, light-emitting diode projector or a combination of the aforementioned projectors is suitable for projection onto the operating environment.
  • the operator support device OSD further comprises two laser guide devices STA for steering a respective laser beam LSB, which serves to project a respective operating instruction.
  • in the following, two embodiments of the invention will be explained in which the outer dimensions of the operator support device OSD are considered. While an operator support device OSD usable for usual technical purposes is to be dimensioned approximately to the size of a human head, so that the recorded virtual operating environment VRV presented to the expert corresponds approximately to the real operating environment, dimensions deviating upwards and downwards also offer technical advantages.
  • An operator support device OSD in a miniature format according to a first embodiment allows placement inside a machine in which access by an operator OPR is not possible.
  • the operator OPR works from outside on the operating environment detected by the operator support device OSD with the aid of corresponding miniature tools, supported in the placement and application of the miniature tools by operating instructions projected according to the invention.
  • through the miniature size of the operator support device OSD and the corresponding operating environment detection unit, the instructing expert RME has a more detailed view of the operating environment and can also place an operating instruction on miniature objects in a simpler way.
  • An operator support device OSD in a giant format allows a placement in a position in which a complete area, such as an airport, a construction site, a factory, etc. can be detected.
  • in such an embodiment, the operating instructions of the expert RME are used, for example, to show a group of workers the way to a maintenance position.
  • a first possible application of the interaction system according to the invention provides that the roles of the operator OPR and the expert RME coincide.
  • a person, referred to in the following as the user, changes his workstation between the operating environment and the workstation where the remote expert unit is located.
  • a workstation at the remote expert unit first allows the user to inspect the operating environment via the output unit and to set one or more operating instructions for points to be processed subsequently. The user fixes these operating instructions and then moves to the operating environment to carry out the work there.
  • Such an upstream remote inspection of an operating environment prior to the actual execution of the work is advantageous, for example, in cases in which the personal stay in the operating environment is to be reduced to a minimum, for example due to health-threatening environmental conditions or due to environmental conditions requiring a posture of the user, which complicate longer inspections prior to the actual work in the operating environment.
  • an application with a change of workstation also lends itself to cases where an operator support device OSD in a miniature or giant format, as explained above, is used and an advance inspection of forthcoming work in a more detailed or overview view is of advantage.
  • a coincidence of the roles of the operator OPR and the expert RME is also present in an application in which an operator support device OSD in a giant format allows an overview of a complete area from a bird's-eye view, and an operator searching for a path through the area uses a mobile output unit of the remote expert unit, in particular data glasses, to obtain a bird's-eye representation of the area for his path finding.
  • a further application of the interaction system provides that the roles of the operator and the expert are taken by several persons. For example, in certain situations an interaction system assigned to an operator or a group of operators is advantageously associated with a plurality of remote expert units, each with different experts. Alternating or simultaneous guidance by different experts is advantageously carried out by optical projection of operating instructions onto the operating environment in a different color for each expert.
  • The interaction system according to the invention, for supporting an operator through an expert, ensures a remote finger-pointing by the expert.
  • The finger-pointing, or operating instruction, of an expert is effected by means of an optical projection onto the operating environment of the operator.
  • This operating instruction is generated intuitively on the expert's side by gesture control against the background of an operating environment that is captured on the operator's side, transmitted to the expert and displayed there.
  • The operating instruction prompted by a gesture of the expert is processed and transmitted to the operator support device of the operator.
  • There, the operating instruction is projected into the operator's operating environment by the projection unit on the operator's side.
  • The interaction system according to the invention thus gives the operator an intuitive instruction as to exactly which point in the operating environment is to be operated.
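The round trip described in the bullets above (capture the environment, transmit it to the expert, turn a pointing gesture into an instruction, transmit it back, project it) can be sketched as a minimal message flow. All class and method names below are illustrative assumptions; the patent defines no API. The per-expert color assignment corresponds to the multi-expert embodiment described above.

```python
from dataclasses import dataclass

# Illustrative sketch of the interaction loop; OperatorSupportDevice and
# RemoteExpertUnit are hypothetical names, not defined in the patent.

@dataclass
class Instruction:
    x: float        # pointed-at position in the operating environment (normalized)
    y: float
    color: str      # per-expert color for alternating or simultaneous guidance

class RemoteExpertUnit:
    """Expert side: displays the captured environment and turns a
    pointing gesture into an operating instruction."""
    # distinct colors keep instructions from several experts distinguishable
    _palette = ["red", "green", "blue", "yellow"]
    _next_color = 0

    def __init__(self, expert_name):
        self.expert_name = expert_name
        self.color = RemoteExpertUnit._palette[
            RemoteExpertUnit._next_color % len(RemoteExpertUnit._palette)]
        RemoteExpertUnit._next_color += 1
        self.displayed_frame = None

    def receive_environment(self, frame):
        # the operator's captured environment is shown to the expert
        self.displayed_frame = frame

    def gesture_at(self, x, y):
        # a pointing gesture in front of the displayed background
        # becomes an operating instruction
        return Instruction(x, y, self.color)

class OperatorSupportDevice:
    """Operator side: captures the environment and projects the
    expert's instruction back onto it."""
    def __init__(self):
        self.projected = []  # instructions currently projected

    def capture_environment(self):
        return "camera-frame"  # stand-in for an image of the environment

    def project(self, instr):
        # the projection unit marks the exact point to be operated
        self.projected.append(instr)

# One round trip with two simultaneous experts
osd = OperatorSupportDevice()
expert_a = RemoteExpertUnit("A")
expert_b = RemoteExpertUnit("B")

frame = osd.capture_environment()
for expert in (expert_a, expert_b):
    expert.receive_environment(frame)

osd.project(expert_a.gesture_at(0.4, 0.7))
osd.project(expert_b.gesture_at(0.6, 0.2))

print(len(osd.projected), {i.color for i in osd.projected})  # two instructions, two colors
```

The sketch deliberately keeps the expert's gesture as a bare 2D point; in the patent the gesture is recognized in front of the displayed operating environment, and the projection registers the instruction onto the real scene.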

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Resources & Organizations (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Strategic Management (AREA)
  • Human Computer Interaction (AREA)
  • Tourism & Hospitality (AREA)
  • Quality & Reliability (AREA)
  • General Business, Economics & Management (AREA)
  • Operations Research (AREA)
  • Marketing (AREA)
  • Economics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention relates to an interaction system for providing an operator with the assistance of an expert, which ensures remote visual assistance by the expert. The visual assistance, or operating instruction, of an expert is provided by means of an optical projection onto the operator's operating environment. This operating instruction is generated intuitively by the expert by means of gesture control against the background of an operating environment captured on the operator's side, then transmitted to the expert and displayed. The operating instruction produced by a gesture of the expert is processed and transmitted to the operator's support device. There, the operating instruction is projected into the operator's operating environment by the projection unit. In contrast to previously known operator-support devices, the interaction system according to the invention provides the operator with an intuitive instruction indicating exactly which point in the operating environment is to be operated.
PCT/EP2016/061276 2015-06-23 2016-05-19 Système d'interaction WO2016206874A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102015211515.1A DE102015211515A1 (de) 2015-06-23 2015-06-23 Interaktionssystem
DE102015211515.1 2015-06-23

Publications (1)

Publication Number Publication Date
WO2016206874A1 true WO2016206874A1 (fr) 2016-12-29

Family

ID=56068892

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/061276 WO2016206874A1 (fr) 2015-06-23 2016-05-19 Système d'interaction

Country Status (2)

Country Link
DE (1) DE102015211515A1 (fr)
WO (1) WO2016206874A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020016488A1 (fr) 2018-07-18 2020-01-23 Holomake Système d'asservissement mécanique motorisé d'un plan holographique pour le guidage manuel de précision
WO2022037758A1 (fr) 2020-08-18 2022-02-24 Siemens Aktiengesellschaft Collaboration à distance utilisant une réalité augmentée et virtuelle

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102016006768A1 (de) * 2016-06-02 2017-12-07 Audi Ag Verfahren zum Betreiben eines Anzeigesystems und Anzeigesystem
JP2019067050A (ja) * 2017-09-29 2019-04-25 株式会社日立ビルシステム 作業支援システム
DE102017217834A1 (de) * 2017-10-06 2019-04-11 Robert Bosch Gmbh Verfahren für einen Montagearbeitsplatz, Montagearbeitsplatz, Computerprogramm und computerlesbares Medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100238194A1 (en) * 2009-03-20 2010-09-23 Roach Jr Peter Methods And Apparatuses For Using A Mobile Device To Provide Remote Assistance
US20130083063A1 (en) * 2011-09-30 2013-04-04 Kevin A. Geisner Service Provision Using Personal Audio/Visual System
EP2765502A1 (fr) * 2013-02-08 2014-08-13 ShowMe Telepresence ApS Procédé permettant de fournir une instruction visuelle représentée numériquement provenant d'un spécialiste pour un utilisateur nécessitant ladite instruction visuelle et système associé
US20140361988A1 (en) * 2011-09-19 2014-12-11 Eyesight Mobile Technologies Ltd. Touch Free Interface for Augmented Reality Systems

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SE0203908D0 (sv) * 2002-12-30 2002-12-30 Abb Research Ltd An augmented reality system and method
JP2008078690A (ja) * 2006-09-19 2008-04-03 Fuji Xerox Co Ltd 画像処理システム

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100238194A1 (en) * 2009-03-20 2010-09-23 Roach Jr Peter Methods And Apparatuses For Using A Mobile Device To Provide Remote Assistance
US20140361988A1 (en) * 2011-09-19 2014-12-11 Eyesight Mobile Technologies Ltd. Touch Free Interface for Augmented Reality Systems
US20130083063A1 (en) * 2011-09-30 2013-04-04 Kevin A. Geisner Service Provision Using Personal Audio/Visual System
EP2765502A1 (fr) * 2013-02-08 2014-08-13 ShowMe Telepresence ApS Procédé permettant de fournir une instruction visuelle représentée numériquement provenant d'un spécialiste pour un utilisateur nécessitant ladite instruction visuelle et système associé

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ANONYMOUS: "Videoprojektor", 26 May 2015 (2015-05-26), Wikipedia, XP055297486, Retrieved from the Internet <URL:https://de.wikipedia.org/w/index.php?title=Videoprojektor&oldid=142487783> [retrieved on 20160824] *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020016488A1 (fr) 2018-07-18 2020-01-23 Holomake Système d'asservissement mécanique motorisé d'un plan holographique pour le guidage manuel de précision
FR3084173A1 (fr) 2018-07-18 2020-01-24 Holomake Systeme d'asservissement mecanique motorise d'un plan holographique pour le guidage manuel de precision
WO2022037758A1 (fr) 2020-08-18 2022-02-24 Siemens Aktiengesellschaft Collaboration à distance utilisant une réalité augmentée et virtuelle

Also Published As

Publication number Publication date
DE102015211515A1 (de) 2016-12-29

Similar Documents

Publication Publication Date Title
WO2016206874A1 (fr) Système d'interaction
EP2048557B1 (fr) Capteur optoélectronique et dispositif mobile ainsi que son procédé de configuration
DE102018109463C5 (de) Verfahren zur Benutzung einer mehrgliedrigen aktuierten Kinematik, vorzugsweise eines Roboters, besonders vorzugsweise eines Knickarmroboters, durch einen Benutzer mittels einer mobilen Anzeigevorrichtung
EP3458939B1 (fr) Système et procédé d'interaction
EP2737278B1 (fr) Dispositif de mesure commandable sans contact et son procédé de commande
EP2161219A1 (fr) Procédé et dispositif de soutien visuel de procédés de commissionnement
EP2380709A2 (fr) Dispositif de sécurité 3D et procédé de sécurisation et de commande d'au moins une machine
DE3609469A1 (de) Anzeigesteuervorrichtung
WO2017055054A1 (fr) Procédé et dispositif de génération de données de commande pour commander une installation d'ascenseur par surveillance thermique d'une zone de commande
DE102020211408A1 (de) Schweißinformations-Bereitstellungsvorrichtung
DE102016010284A1 (de) Robotersystem, das einen Sichtsensor verwendet
EP3546136A1 (fr) Système de réalité augmentée
EP3012712A1 (fr) Motif virtuel dans un environnement reel
WO2019110263A1 (fr) Dispositif sécurisé de type lunettes et procédé
DE102016221861B4 (de) Einrichtung und Verfahren zur Einwirkung auf Gegenstände
WO2019174672A2 (fr) Procédé d'utilisation de lunettes intelligentes, procédé d'aide à un opérateur, procédé de préparation de marchandises, lunettes intelligentes, dispositif d'actionnement de fonctions, système formé de lunettes intelligentes et d'un système informatique communiquant avec celles-ci, entrepôt et chariot de préparation de commandes
DE102005003443A1 (de) Einheit und Verfahren zur internen Blickführung in Funduskameras
WO2020233883A1 (fr) Système de réalité augmentée
DE102017216134B4 (de) Verfahren zum Identifizieren eines Roboters, tragbares Roboterbedienhandgerät und Roboterarbeitsplatz
WO2019228780A1 (fr) Concept pour la commande d'un écran d'un appareil mobile à réalité augmentée
DE102013021280A1 (de) Objekt Vergrösserungs-Gerät
EP3649539A1 (fr) Appareil de sortie visuelle doté d'une caméra et procédé de présentation
DE102017221305A1 (de) Verfahren zum Betreiben eines kollaborativen Roboters
EP0656613B1 (fr) Générateur de mouvement pour simuler des mouvements dans un environnement virtuel
DE102014015025B4 (de) Visieranlage und Verfahren zum Betreiben einer Visieranlage

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16724396

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16724396

Country of ref document: EP

Kind code of ref document: A1