EP3465389A1 - Verfahren zur interaktion eines bedieners mit einem technischen objekt - Google Patents
Verfahren zur Interaktion eines Bedieners mit einem technischen Objekt (Method for interaction of an operator with a technical object)
- Publication number
- EP3465389A1 (application EP17735087.3A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- gesture
- operator
- selection
- interaction
- identified
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/681—Wristwatch-type devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1694—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
Definitions
- the invention relates to a method for the interaction of an operator with a technical object.
- Such a method is used, for example, in the field of building automation and industrial automation technology, in production or machine tools, in diagnostic or service support systems and in the operation and maintenance of complex components, devices and systems, especially industrial or medical equipment.
- control by voice input is known, which, for example, permits a spoken selection of a room, followed by a setting of the room temperature.
- While voice control already offers the advantage of hands-free interaction with technical objects, its operation is not very intuitive, since the voice commands have to be learned.
- the object of the present invention is to provide an interaction system with an intuitive and open-handed selection of a technical object as well as an interaction with the technical object.
- the object is achieved by a method having the features of patent claim 1.
- The method according to the invention for the interaction of an operator with a technical object provides that a gesture detection unit of an interaction system detects a plurality of local parameters associated with at least one arm of the operator.
- The plurality of local parameters is evaluated by a control unit of the interaction system as a respective gesture, wherein a selection of and an interaction with the selected object is controlled by a sequence of gestures.
- The steps mentioned below describe the following sequence:
- Step d): If the detection of the choice gesture in step b) results in the assignment not of a plurality but of only one object identified by retrievable location coordinates, step d) is skipped.
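The gesture-controlled sequence described above - activation, choice, fine selection, confirmation, interaction, release - can be sketched as a small state machine. This is a hypothetical illustration, not the patent's implementation; all class, state, and gesture names are invented. It also shows the skipping of the fine-selection step when the choice already yields exactly one object:

```python
# Illustrative state machine for the gesture sequence; names are invented.
class InteractionSystem:
    def __init__(self):
        self.state = "idle"
        self.candidates = []   # objects identified by the choice gesture
        self.selected = None   # single object after fine selection

    def on_gesture(self, gesture, payload=None):
        if self.state == "idle" and gesture == "initiation":
            self.state = "ready"
        elif self.state == "ready" and gesture == "choice":
            self.candidates = payload  # objects matching the first target coordinates
            if len(self.candidates) == 1:
                # only one identified object: the selection step is skipped
                self.selected = self.candidates[0]
                self.state = "awaiting_confirmation"
            else:
                self.state = "selecting"
        elif self.state == "selecting" and gesture == "selection":
            self.selected = payload
            self.state = "awaiting_confirmation"
        elif self.state == "awaiting_confirmation" and gesture == "confirmation":
            self.state = "interaction"
        elif self.state == "interaction" and gesture == "release":
            self.state = "idle"
            released, self.selected = self.selected, None
            return released
        return self.selected
```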
- the invention provides an activation of the system, an off ⁇ optionally, selection, control and release of an object by forming gestures by means of a movement of the arms of the operator.
- a gesture detection is preferably carried out with provided in the Un ⁇ terarm Scheme of the operator inertial sensors or with an optical detection of the arm gestures.
- Choosing is to be understood as a rough selection of a first set of identified objects by means of a choice gesture.
- The term »first set of identified objects« does not, of course, exclude the possibility that the set identified by means of the choice gesture corresponds to exactly one identified object.
- Selecting is to be understood as a fine selection within the first set by means of a selection gesture, with the aim of selecting exactly one identified object from the preceding rough selection.
- A particular advantage of the invention arises from the possibility of assigning to the individual gestures - the choice gesture, the selection gesture, the confirmation gesture, etc. - an intuitive and well-known sequence of movements, for example one known to an operator from handling a lasso.
- The operation of the interaction system according to the invention is contact-free, so that an operator in an industrial or medical environment does not have to operate input devices with a contaminating effect.
- A further advantage of the invention is that the operator does not have to look at an output device such as a screen during operation. Instead, he can keep the technical objects to be operated directly in his field of vision. In fact, the operation can even be performed without looking at all, provided the operator is sufficiently familiar with the environment, the technical objects to be operated and their locations.
- The additional provision of haptic feedback according to one embodiment further enhances this effect of viewing-free operation.
- Compared with voice control or with a selection of an object using a portable terminal, the interaction method of the invention provides a faster and more accurate selection of technical objects.
- The method according to the invention is also less susceptible to unwanted operating errors.
- a selection of distinct gestures according to the exemplary embodiments mentioned below leads to a further reduction in the susceptibility to errors.
- The two-step procedure, that is, the combination of a choice gesture and a selection gesture, permits a considerably more accurate selection and interaction according to the method of the invention. Compared to a screen-based method, the interaction method of the invention also allows a more intuitive interaction with technical objects.
- A special advantage also lies in the feedback means according to the invention for confirming a selection of objects to the operator.
- This measure according to the invention makes it possible for an operator to visually identify an object without necessarily having to touch it in the course of the interaction according to the invention - for example by the usual actuation of actuating units - or to approach it - for example to scan a barcode assigned to the object.
- Further embodiments of the invention are the subject of the dependent claims.
- Fig. 1 is a schematic structural representation of an operator in a rear view while forming a first gesture
- Fig. 2 shows a schematic structural representation of the operator in the rear view while forming a second gesture
- Fig. 3 shows a schematic structural representation of the operator in the rear view while forming a third gesture
- Fig. 4 shows a schematic structural representation of the operator in the rear view while forming a fourth gesture
- Fig. 5 shows a schematic structural representation of the operator in a front view.
- Fig. 5 shows an operator in a front view when executing a gesture.
- the operator is assigned a gesture detection unit, which is worn in the embodiment on the right wrist.
- In other embodiments, the gesture detection unit is worn on the left wrist or on both wrists.
- the gesture detection unit comprises a plurality of inertial sensors for detecting a plurality of local parameters associated with the posture of the operator, in particular local parameters which are formed by a movement, rotation and / or position of the arms of the operator.
- The plurality of local parameters is evaluated as a respective gesture by a control unit (not shown) of the interaction system.
- The gesture control is intuitively effected with a movement and/or rotation of the body, in particular of one or both forearms. An input device, which is often unsuitable in an industrial environment, is therefore not required.
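The evaluation of such local parameters as a gesture can be illustrated with a minimal sketch. The heuristic below is an assumption for illustration only (not the patent's algorithm): a lifted forearm is inferred from a wrist-worn accelerometer when the gravity component along the assumed forearm axis indicates a tilt above horizontal.

```python
import math

def detect_arm_lift(accel_samples, threshold_deg=45.0):
    """Infer a lifted forearm from wrist accelerometer samples.

    Each sample is an (x, y, z) gravity reading in the sensor frame; the
    x axis is assumed to point along the forearm. The arm counts as lifted
    when the forearm axis tilts more than `threshold_deg` above horizontal.
    Illustrative heuristic only.
    """
    x, y, z = accel_samples[-1]               # most recent reading
    g = math.sqrt(x * x + y * y + z * z) or 1.0
    # elevation angle of the forearm axis relative to the horizontal plane
    elevation = math.degrees(math.asin(max(-1.0, min(1.0, x / g))))
    return elevation > threshold_deg
```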
- According to one embodiment, a commercially available smartwatch serves as the gesture detection unit.
- A particular advantage of this embodiment of the gesture detection unit is that commercially available smartwatches equipped with inertial sensors can be used.
- When such a wearable gesture detection unit is used, many of its functional units can be reused for purposes according to the invention or its further developments: for example, inertial sensors for gesture detection, a locating unit based, for instance, on Bluetooth beacons, as well as haptic feedback means, such as an unbalance motor for outputting vibration feedback to the operator's wrist.
- alternative gesture detection units provide an optical detection of gestures, for example using one or more optical detection devices, which detect a posture of the operator three-dimensionally, for example using time of flight or structured-light topometry.
- The methods mentioned also offer the advantage of hands-free operation, but require the use of optical detection devices in the working environment of the operator.
- a plurality of local parameters associated with at least one arm of the operator are detected by the gesture detection unit and evaluated by the control unit as a respective gesture, wherein a selection and interaction with the selected object is controlled by a sequence of gestures.
- Fig. 1 shows a schematic structural representation of an operator in a rear view while forming an initiation gesture.
- the initiation gesture involves lifting an arm of the operator.
- the initiation gesture comprises a rotation of a palm towards a body center of the operator.
- The initiation gesture is detected by the interaction system, which is thereby enabled for the subsequent interaction between the operator and the technical object.
- Haptic feedback is issued to the operator to signal the readiness of the interaction system.
- Fig. 2 shows the operator continuing the initiation gesture, which comprises a circular movement around a main axis of the operator's forearm.
- the palm continues to be directed towards the center of the body of the operator.
- A preferred embodiment of the choice gesture provides a swirling movement of the wrist, i.e. a circular movement of, for example, the right wrist along an imaginary circular line, with the operator's hand raised.
- This choice gesture corresponds to an imagined operation of a virtual lasso before it is thrown. Optionally, haptic feedback is given to the operator by the wrist-worn gesture detection unit to signal the readiness of the interaction system for a subsequent choice of an object.
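Such a swirling wrist movement could, for instance, be recognized from a tracked wrist trajectory by checking whether it winds around its own centroid. The following is a crude, assumed stand-in for the lasso-style gesture detector, not the patent's method; all names are illustrative:

```python
import math

def is_circular_swirl(points, min_turns=1.0):
    """Return True if a 2-D wrist trajectory winds at least `min_turns`
    times around its centroid (illustrative circular-gesture detector)."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    total = 0.0
    prev = math.atan2(points[0][1] - cy, points[0][0] - cx)
    for x, y in points[1:]:
        ang = math.atan2(y - cy, x - cx)
        d = ang - prev
        # unwrap to (-pi, pi] so full revolutions accumulate correctly
        while d > math.pi:
            d -= 2 * math.pi
        while d <= -math.pi:
            d += 2 * math.pi
        total += d
        prev = ang
    return abs(total) >= min_turns * 2 * math.pi
```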
- Fig. 3 shows the operator during a choice gesture.
- The choice gesture comprises, for example, a throwing motion in the direction of the technical object that is to be chosen and identified by the interaction system.
- This choice gesture corresponds to an imagined throw of the virtual lasso in the direction of the object to be chosen.
- Alternatively, a virtual throwing motion with a previously raised hand triggers the choice; this gesture corresponds approximately to the operation of a virtual harpoon.
- The motion sequence substantially corresponds to that of the lasso movement described above, with the exception that the initial circular movement is not executed.
- This gesture is therefore less distinctive than the lasso movement and may lead to an increased number of unwanted activations of initiation or choice.
- The choice gesture generally leads to a plurality of identified objects which, refined by a selection gesture, finally leads to a single identified object, namely the object which is actually to be controlled by the operator.
- This object to be controlled is, among the three illumination sources shown in Fig. 3, the illumination source located on the far left.
- First target coordinates are determined from the plurality of local parameters associated with the choice gesture.
- The plurality of local parameters assigned to the choice gesture comprises in particular the position of the operator and the direction of the throwing motion.
- These first target coordinates are assigned to one or more objects identified by retrievable location coordinates, in the example the three illumination sources.
- The location coordinates of these three illumination sources are held ready or retrievable, for example, in an arbitrarily configurable data source that is accessed by the interaction system. If the first target coordinates can be uniquely assigned to one identified object, the subsequent selection gesture is superfluous.
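The assignment of first target coordinates to identified objects could be sketched as follows. The patent leaves the assignment open; this sketch assumes a simple angular tolerance around the throw direction, and all names and parameters are illustrative:

```python
import math

def choose_objects(origin, direction_deg, objects, tolerance_deg=15.0):
    """Assign first target coordinates to identified objects.

    `origin` is the operator's (x, y) position, `direction_deg` the heading
    of the throwing motion, and `objects` maps object ids to retrievable
    (x, y) location coordinates. Returns all objects whose bearing from the
    operator lies within `tolerance_deg` of the throw direction.
    """
    chosen = []
    for name, (x, y) in objects.items():
        bearing = math.degrees(math.atan2(y - origin[1], x - origin[0]))
        # smallest signed angle between bearing and throw direction
        diff = (bearing - direction_deg + 180.0) % 360.0 - 180.0
        if abs(diff) <= tolerance_deg:
            chosen.append(name)
    return chosen
```

A wide tolerance naturally yields a plurality of objects, to be refined by the subsequent selection gesture.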
- In the example, the feedback means are inherent in the respective illumination sources, which briefly flash to confirm the choice.
- Embodiments may also include separate feedback means, for instance when no suitable inherent feedback means are contained in the technical objects. This applies, for example, to a chosen pump in a production environment that comprises no acoustic or optical signal transmitters. As feedback, a warning or indicator for the chosen pump would appear here on a display panel of the industrial plant, so that in the embodiment above this display panel is to be understood as a feedback means assigned to the identified objects.
- Fig. 4 shows the operator during a selection gesture by means of which, in the case of several identified objects, the operator refines the selection.
- For this purpose, the operator performs a rotational movement of his arm - for example a pronation or supination - or a translational movement - for example a movement of an arm in a horizontal or vertical direction - or an internal or external rotation of the shoulder.
- The selection gesture causes an activation of the feedback means associated with the identified objects.
- A rotational movement up to a left stop position here brings, for example, the leftmost illumination source into the selection.
- Such a rotational movement, for example a pronation or supination of the forearm, is substantially easier to detect with the inertial sensors provided in the forearm region of the operator, or with an optical detection of an arm rotation gesture, than a finger gesture.
- the selection gesture is detected by determining second target coordinates from the plurality of local parameters assigned to the selection gesture and assigning the second target coordinates to the object identified by retrievable location coordinates.
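The refinement by forearm rotation can be sketched as a mapping of the rotation angle onto the previously identified candidates. The patent does not fix the exact scaling; the even division of the rotation range below is an assumption, and all names are illustrative:

```python
def refine_selection(candidates, rotation_deg, min_deg=-60.0, max_deg=60.0):
    """Map a forearm rotation (pronation/supination) angle onto one of the
    previously identified candidates: the rotation range is divided evenly,
    so a rotation to the left stop selects the leftmost object."""
    span = max_deg - min_deg
    clamped = max(min_deg, min(max_deg, rotation_deg))
    index = int((clamped - min_deg) / span * len(candidates))
    return candidates[min(index, len(candidates) - 1)]
```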
- In this case, the selection step is advantageously skipped.
- An exemplary confirmation gesture comprises a forward-directed arm that is retracted toward the operator and/or pulled up. In continuation of the intuitive semiotics of a lasso operation, this gesture would correspond to a tightening of the lasso.
- the interaction system acquires this confirmation gesture, optionally returns a feedback message to the operator and assigns the identified object to a subsequent interaction mode.
- In the interaction mode, the assigned object is controlled by means of a plurality of interaction gestures, which can be carried out in a manner known per se.
- For example, a rotational movement leads to the closing of a valve, a lifting movement to an opening of shading devices, an upward movement to an increase in the light intensity of a room, a rotational movement to a change in light color, etc.
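Such a binding of interaction gestures to object commands amounts to a simple dispatch table. The gesture names, object types, and commands below are invented for illustration; they are not specified by the patent:

```python
# Illustrative binding of (object type, gesture) pairs to commands.
INTERACTION_GESTURES = {
    ("valve", "rotate_cw"): "close",
    ("shading", "lift"): "open",
    ("light", "move_up"): "increase_intensity",
    ("light", "rotate_cw"): "change_color",
}

def interact(object_type, gesture):
    """Return the command triggered by `gesture` on an object of
    `object_type`, or None if the gesture is not bound."""
    return INTERACTION_GESTURES.get((object_type, gesture))
```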
- Completion of the interaction is triggered by the operator through a release gesture.
- This release gesture results in the release of the object associated with the interaction mode by the interaction system.
- The release is advantageously signaled to the operator by feedback.
- An exemplary release gesture comprises a forward-directed arm with a rotary motion of the hand, for example counterclockwise. In continuation of the intuitive semiotics of a lasso operation, this gesture would correspond to a rotary movement of a loosely lying lasso by which the end of the lasso is lifted off an object.
- The inventive control of technical objects can be used in particular for switching light sources on and off, or for opening and closing blinds or other shading devices.
- the method according to the invention also finds application for the selection of a particular one of a plurality of display means, for example in appropriately equipped command-and-control centers or in medical operation rooms.
- the method according to the invention finds use in the selection and activation of pumps, valves or the like.
- use in the production and logistics environment is possible, in which a specific package or production part is selected in order to obtain more detailed information about this or to assign a specific production step to it.
- the method according to the invention particularly meets the need for a hands-free interaction, which is freed in particular from the hitherto customary need to operate an actuating or handling unit for an interaction.
- a hands-free operation is particularly advantageous in an environment which is either contaminated or subjected to increased purity requirements, or if the working environment necessitates the wearing of gloves.
- An advantageous embodiment of the invention enables a multiple selection of identified objects, which are to be transferred into the interaction mode.
- This configuration enables, for example, a multiple selection of lamps.
- A multiple selection is made possible by an alternative confirmation gesture, which adds further identified objects to one or more objects previously identified with a choice and/or selection gesture.
- The alternative confirmation gesture results in the association of a further identified object, without a change to an interaction mode taking place as a result of the alternative confirmation gesture.
- An exemplary alternative confirmation gesture comprises a forward-directed arm that is pushed forward, away from the operator, and/or pulled up. This alternative confirmation gesture is carried out after each choice of an object to be added, until the plurality of identified objects is complete from the operator's point of view. With the last identified object, the operator then performs the usual confirmation gesture - an arm retracted toward the operator - and thereby confirms the selection of the extended plurality of identified objects.
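The multiple-selection embodiment can be sketched as an accumulator that collects objects on each alternative confirmation gesture and finalizes on the usual confirmation gesture. This is a hypothetical illustration; all names are invented:

```python
class MultiSelection:
    """Accumulate identified objects via the alternative confirmation
    gesture and finalize with the usual confirmation gesture."""

    def __init__(self):
        self.selected = []
        self.finalized = False

    def add(self, obj):
        # alternative confirmation gesture: add the object without
        # entering the interaction mode
        if not self.finalized and obj not in self.selected:
            self.selected.append(obj)

    def confirm(self, obj):
        # usual confirmation gesture on the last object: finalize the
        # extended plurality and hand it over to the interaction mode
        self.add(obj)
        self.finalized = True
        return list(self.selected)
```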
- In this respect, an analogy to the known drag-and-drop mouse operation is provided.
- Optionally, a further gesture - in particular a throwing motion - is available, with which further objects can be chosen and selected.
- The alternative confirmation gesture is performed after the addition of the last object to be added, indicating that the plurality of identified objects is complete from the operator's point of view.
- The exemplary alternative confirmation gesture again comprises an arm that is pushed forward, away from the operator, and/or pulled up.
- An application example of such a multiple selection encompasses a surveillance space with a plurality of display devices.
- the operator would like to transfer the contents of three smaller display devices to a large area display for a better overview.
- In a gesture sequence, he would apply the lasso choice gesture to choose the three display contents and then use a harpoon throw to select the large-area display.
- the control system of the interaction system then adjusts the display content accordingly.
- According to a further embodiment, the plurality of objects identified with the choice gesture is restricted to a specific object type.
- A restriction to illumination means, for example, makes sense in certain cases. This restriction can be transmitted to the interaction system before, during or even after the acquisition of the choice gesture, for example by means of a voice command »select only illumination means«.
- A combination with a visualization means, for example a virtual-reality head-mounted display, is conceivable. This allows one
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102016212234.7A DE102016212234A1 (de) | 2016-07-05 | 2016-07-05 | Verfahren zur Interaktion eines Bedieners mit einem technischen Objekt |
PCT/EP2017/066238 WO2018007247A1 (de) | 2016-07-05 | 2017-06-30 | Verfahren zur interaktion eines bedieners mit einem technischen objekt |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3465389A1 true EP3465389A1 (de) | 2019-04-10 |
Family
ID=59276741
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17735087.3A Withdrawn EP3465389A1 (de) | 2016-07-05 | 2017-06-30 | Verfahren zur interaktion eines bedieners mit einem technischen objekt |
Country Status (5)
Country | Link |
---|---|
US (1) | US20190163285A1 (de) |
EP (1) | EP3465389A1 (de) |
CN (1) | CN109416588A (de) |
DE (1) | DE102016212234A1 (de) |
WO (1) | WO2018007247A1 (de) |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8073198B2 (en) * | 2007-10-26 | 2011-12-06 | Samsung Electronics Co., Ltd. | System and method for selection of an object of interest during physical browsing by finger framing |
US8614663B2 (en) * | 2010-03-15 | 2013-12-24 | Empire Technology Development, Llc | Selective motor control classification |
KR20140068855A (ko) * | 2011-06-23 | 2014-06-09 | 오블롱 인더스트리즈, 인크 | 공간 입력 장치를 위한 적응적 추적 시스템 |
US9164589B2 (en) * | 2011-11-01 | 2015-10-20 | Intel Corporation | Dynamic gesture based short-range human-machine interaction |
US9301372B2 (en) * | 2011-11-11 | 2016-03-29 | Osram Sylvania Inc. | Light control method and lighting device using the same |
DE202012005255U1 (de) * | 2012-05-29 | 2012-06-26 | Youse Gmbh | Bedienvorrichtung mit einer Gestenüberwachungseinheit |
US9977492B2 (en) * | 2012-12-06 | 2018-05-22 | Microsoft Technology Licensing, Llc | Mixed reality presentation |
US20140240215A1 (en) * | 2013-02-26 | 2014-08-28 | Corel Corporation | System and method for controlling a user interface utility using a vision system |
WO2015062751A1 (de) * | 2013-10-28 | 2015-05-07 | Johnson Controls Gmbh | Verfahren zum betreiben einer vorrichtung zur berührungslosen erfassung von gegenständen und/oder personen und von diesen ausgeführten gesten und/oder bedienvorgängen in einem fahrzeuginnenraum |
EP3100249B1 (de) * | 2014-01-30 | 2022-12-21 | Signify Holding B.V. | Gestensteuerung |
US9535495B2 (en) * | 2014-09-26 | 2017-01-03 | International Business Machines Corporation | Interacting with a display positioning system |
-
2016
- 2016-07-05 DE DE102016212234.7A patent/DE102016212234A1/de not_active Withdrawn
-
2017
- 2017-06-30 EP EP17735087.3A patent/EP3465389A1/de not_active Withdrawn
- 2017-06-30 WO PCT/EP2017/066238 patent/WO2018007247A1/de unknown
- 2017-06-30 CN CN201780041840.4A patent/CN109416588A/zh active Pending
- 2017-06-30 US US16/315,246 patent/US20190163285A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
WO2018007247A1 (de) | 2018-01-11 |
CN109416588A (zh) | 2019-03-01 |
DE102016212234A1 (de) | 2018-01-11 |
US20190163285A1 (en) | 2019-05-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE102018109463C5 (de) | Verfahren zur Benutzung einer mehrgliedrigen aktuierten Kinematik, vorzugsweise eines Roboters, besonders vorzugsweise eines Knickarmroboters, durch einen Benutzer mittels einer mobilen Anzeigevorrichtung | |
DE102013019869B4 (de) | Roboterarm mit Eingabemodul | |
DE112018002565B4 (de) | System und Verfahren zum direkten Anlernen eines Roboters | |
DE102016113060A1 (de) | Verfahren zum Steuern eines Objekts | |
EP3458939B1 (de) | Interaktionssystem und -verfahren | |
DE102017202439B4 (de) | Eingabeeinrichtung, Verfahren zur Bereitstellung von Bewegungsbefehlen an einen Aktor und Aktorsystem | |
DE102015011830C5 (de) | Robotersystem zum Einstellen eines Bewegungsüberwachungsbereichs eines Roboters | |
EP3098034B1 (de) | Auswahl eines gerätes oder eines objektes mit hilfe einer kamera | |
EP3518055B1 (de) | Überwachungs- und bediensystem für einen fertigungsarbeitsplatz und verfahren zum fertigen eines produkts oder teilprodukts | |
DE102010039540C5 (de) | Handbediengerät zum manuellen Bewegen eines Roboterarms | |
EP3374135B1 (de) | Verfahren zum vereinfachten ändern von applikationsprogrammen zur steuerung einer industrieanlage | |
EP3990231B1 (de) | System zum vornehmen einer eingabe an einem robotermanipulator | |
DE102016118486A1 (de) | Beleuchtungssteuerungsvorrichtung, beleuchtungssystem und verfahren zur steuerung einer beleuchtungsvorrichtung | |
EP3465389A1 (de) | Verfahren zur interaktion eines bedieners mit einem technischen objekt | |
DE102014106680B4 (de) | Schalterbetätigungseinrichtung, mobiles Gerät und Verfahren zum Betätigen eines Schalters durch eine nicht-taktile "Push"-Geste | |
DE102019128583B4 (de) | Optimierungsmodi für Steuerprogramme eines Robotermanipulators | |
DE102016214391B4 (de) | Verfahren zum Steuern eines Betriebes eines Koordinatenmessgerätes und Koordinatenmessgerät mit Steuerung | |
DE102017118982A1 (de) | Roboter und Verfahren zum Betrieb eines Roboters | |
WO2018007102A1 (de) | Verfahren zur interaktion eines bedieners mit einem modell eines technischen systems | |
DE102019207594A1 (de) | Vorrichtung zum Steuern und/oder Konfigurieren einer Anlage | |
EP3120672B1 (de) | System zum ansteuern von verbrauchern einer haushaltsleittechnik mittels muskelimpulsen wenigstens eines benutzers und entsprechendes verfahren | |
EP4189500A1 (de) | Verfahren zum erweitern der bedienfunktionalität eines feldgeräts der automatisierungstechnik von einem bediengerät auf zumindest ein weiteres bediengerät | |
DE102016204137A1 (de) | Programmierbares Manipulatorsystem mit einer Funktionsschaltervorrichtung | |
DE102015211521A1 (de) | Verfahren zum Betreiben einer Eingabevorrichtung, Eingabevorrichtung | |
EP1510893A1 (de) | Verfahren und Vorrichtung zum Festlegen der Bewegungsbahn eines Handling-systems |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20190102 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: A61B 5/11 20060101ALI20191210BHEP Ipc: G06F 1/16 20060101ALI20191210BHEP Ipc: G06F 3/01 20060101AFI20191210BHEP Ipc: A61B 5/00 20060101ALI20191210BHEP |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTG | Intention to grant announced |
Effective date: 20200116 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20200603 |