EP0859977A2 - Interaction area for data representation - Google Patents
Interaction area for data representation
- Publication number
- EP0859977A2 EP97938724A
- Authority
- EP
- European Patent Office
- Prior art keywords
- perception
- space
- detector
- area
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/002—Specific input/output arrangements not covered by G06F3/01 - G06F3/16
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
Definitions
- The invention relates to an interaction space for displaying data.
- An interaction of this type takes place, for example, between a person and an electronic data processing system when data have been entered, processed data have been received, and new data have to be entered.
- An aid for this interaction is, for example, an input mask that appears on the screen and forms the perception area for the user, so that the available data can be recognized and data still to be entered can be identified.
- The actions for entering the data are carried out using the keyboard, the mouse, or the screen itself (touch screen); in the first two cases the action space (keyboard, mouse) is separated from the perception space (screen), while in the third case action space and perception space coincide.
- The perception space is located in an ergonomically optimal location.
- Physical objects, for example models of machines or furnishings, can now be placed directly in the perception space;
- the computer determines the position of the objects via a grid of horizontal infrared beams which are interrupted by the objects.
- Data are thus entered into the computer in a way that is optimal for humans: perception space and action space coincide, and the actions take place in an ergonomically and occupationally preferred manner without special, machine-related operations, which supports the intellectual performance of the person.
- The disadvantage of this type of interaction is that an infrared coordinate grid can only be created with considerable technical effort, and the detection of the objects' positions (shadowing, identification, orientation) is so poor that the potential advantages of such an interaction space cannot be realized.
- The perception space should therefore also be an action space, in that physical objects belonging to the problem domain can be worked with in it, and this work automatically leads to computer-compatible input into the data processing system.
- Because the object according to the invention emits signals to the detector, either from a dedicated transmitter or by reflection or transmission at its surface, the detector receives stronger or different signals from the object than from the perception area and the ambient conditions, e.g. the light distribution in the work area in which the users are located and in which the perception space is set up (stray light, spotlights, etc.).
- The contrast between the object and the background (perception area) is thus greatly increased compared with known methods.
- The computer's image processing unit is thereby relieved of the task of identifying the object's contour, known per se, against the rest of the perception area, a task that the prior art must solve because there the object and the surroundings in the perception area are subject to the same conditions.
- For the image processing, the object can no longer be camouflaged within the perception area.
- The detector can therefore be designed to receive only, or essentially only, the signals of the object, letting the image processing determine directly the coordinates of the object or the desired object-related information; the image processing is thus entirely or almost entirely relieved of searching for and identifying the object among the structures of the perception area.
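The contrast-based localization described above can be sketched in a few lines. The following Python fragment is a hypothetical illustration, not part of the patent disclosure; the function name `locate_object` and the threshold value 200 are assumptions chosen for the example. Because the retroreflective object appears far brighter than the projected image, a fixed brightness threshold isolates it without any analysis of the image content, and the centroid of the bright pixels yields the object's coordinates directly:

```python
import numpy as np

def locate_object(frame, threshold=200):
    """Return the (x, y) centroid of the bright retroreflective object.

    frame: 2-D array of grey values (0-255) from the camera.
    The retroreflector appears far brighter than the projected image,
    so a fixed threshold separates it without analysing the image content.
    """
    ys, xs = np.nonzero(frame >= threshold)
    if xs.size == 0:
        return None  # no object present in the perception area
    return float(xs.mean()), float(ys.mean())

# A dark 'perception area' with one bright block:
frame = np.zeros((120, 160), dtype=np.uint8)
frame[40:50, 70:80] = 255  # retroreflective block
print(locate_object(frame))  # → (74.5, 44.5)
```

The projected image 5.1 only needs to stay below the threshold; its content never enters the computation, which is exactly the relief of the image processing that the text describes.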
- The coordinates or the object-related information can then be processed directly into corresponding input signals for the data processing unit used, which in turn carries out the corresponding change to the perception area as well as any further processing steps that are necessary.
- The perception space for the human is projected in one channel, while in the other channel essentially only the signals of the object located in the perception space are received and/or processed.
- The CPU of the processing unit then merges the object-related information (from the image processing) with the data relating to the perception space (from the workstation), generates the new perception space to be displayed, and carries out the further processing steps, for example CAD-related ones.
- The type of object and any of its properties can be coded by spatial and/or temporal variation of the object's signal, transmitted to the detector and received by it. This provides a simple and reliable way to differentiate between several objects in the perception area: each object transmits a code assigned to it, either at least once for identification, several times (for example each time the object is manipulated by a human), or continuously. The effect of manipulating an object on the perception area can then be different for each individual object.
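A temporal code of the kind just described could be decoded from per-frame brightness samples at the object's position. The sketch below is an assumption-laden illustration (the function name `decode_blink_code`, the bit length of three frames, and the threshold are all invented for the example): each code bit is held for several camera frames, and majority voting inside each bit window tolerates single noisy frames.

```python
def decode_blink_code(samples, frames_per_bit=3, threshold=128):
    """Decode a temporal on/off identification code transmitted by an object.

    samples: per-frame brightness readings at the object's position.
    Each code bit is held for `frames_per_bit` camera frames; majority
    voting inside each bit window tolerates single noisy frames.
    """
    bits = []
    for i in range(0, len(samples) - frames_per_bit + 1, frames_per_bit):
        window = samples[i:i + frames_per_bit]
        on = sum(1 for s in window if s >= threshold)
        bits.append(1 if on > frames_per_bit // 2 else 0)
    return bits

# Object ID 0b101 transmitted over 9 frames (one noisy frame in bit 1):
samples = [255, 255, 250, 0, 90, 10, 255, 0, 255]
print(decode_blink_code(samples))  # → [1, 0, 1]
```

The decoded bit pattern would then serve as the code assigned to the object, distinguishing it from the other objects in the perception area.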
- The objects are preferably designed so that they can also be easily distinguished by humans, for example by means of different shapes or colors or by applied patterns. Accordingly, an object consists of a carrier part, suitable for manipulation by humans, and a reflecting or transmitting part that emits the signals for the detector.
- The various objects can act, for example through appropriate design of the carrier part, like different tools from a toolbox that are easily distinguishable by humans. This analogy to working with different tools is intuitively understandable and thus supports the object of the present invention: to provide an efficient, intuitive, easy-to-learn interaction space for the execution of sophisticated, computer-aided problem solving.
- Image processing is not affected by this, since it does not have to find the object by its spatial or colored design within the spatial and/or colored perception space, but can instead focus on the special signals of the object itself, which, according to the invention, stand in contrast to the signals from the rest of the surroundings.
- The transmitted spatial and/or temporal code can itself be variable and controlled by suitable sensors on the object.
- A device can be used either for an object with its own transmitter or for an object with a retroreflective surface; it can operate exclusively in the visible light range or, for example, in the near infrared, and can therefore be built from the simplest standard components (for example a commercially available video camera as the detector), making it accordingly inexpensive.
- A simple technology, i.e. one that is well-tried, commercially available, maintainable, reliable and comparatively inexpensive, is just as important for the general usability of a system as the technical solution itself.
- The optical signal transmission is preferably direct.
- The signals can, however, also be deflected via one or more mirrors, both from the projector to the displayed perception area and from the objects to the detector, which can enable a more compact system design.
- A deflection via mirrors can be particularly advantageous if the perception space is to be projected downward from a projector onto a preferably horizontal work surface.
- The maximum possible distance between the projector and the work surface (without mirror deflection) is limited by the room height. This can have the adverse effect that either the projected image on the table surface is too small or the projector's light strikes the edges of the image at an angle deviating strongly from the perpendicular to the work surface.
- Such a mirror preferably has an area of at least 100 cm².
- A mirror deflection can lengthen the optical path between the projector and the work surface and thus reduce the unfavorable effects described.
- A mirror deflection can also enable a lighter design, since instead of a heavy projector only a light mirror has to be mounted above the work surface.
- The device of the present invention can thus be made lighter overall and therefore easier to transport. This makes it possible not only to use the present invention as a permanently installed system, but also to use the same system at several locations.
- FIG. 1 shows, purely schematically, the structure of a device according to the invention.
- The halogen lamp 1, serving as the transmitter, throws light onto the table 4 and onto one or several objects designed as blocks 3, which can be pushed around on the table by a user.
- The block 3 is coated with a retroreflective sheeting 3.1, which preferably reflects the light back in the direction of the light source.
- The block therefore appears much brighter than its surroundings, for example the table surface underneath, to a viewer located in the vicinity of the light source. For an observer at a distance from the light source, no such effect occurs.
- The detector, designed as a CCD camera 2 arranged close to the transmitter, therefore always sees a bright object on a dark background, even if an LCD projector 5 additionally projects an image 5.1 as the perception space onto the table surface.
- The block 3 is recognized by its brightness, i.e. by its contrast to the image 5.1, without any further analysis of the image 5.1.
- The CCD camera 2 delivers a video image of the blocks and the table surface, which is transmitted via the video cable 7 to the frame grabber 9 of a workstation (8-16, 20).
- The analog video image, controlled by the CPU 11, is digitized in the frame grabber 9 and is then available in the main memory 12 in digital form for automatic evaluation, likewise by the CPU 11.
- This evaluation then combines the information which led to the image 5.1 (memory 16 and memory 13) with the new evaluation results (from memory 12) into a new state of the perception area (memory 13).
- The CPU 11 receives its instructions from the program memory 10. Data are exchanged between the various memory areas and the CPU via the bus 8.
- The digital video image in main memory 12 is automatically analyzed using image processing algorithms.
- The shape, position and orientation of the blocks are analyzed (be it by a specific, orientation-defining design of the surface of the blocks to be detected, by the signals of several transmitters arranged in the blocks, or by signals emitted by the transmitters according to a specific code), and the evaluation results are saved in main memory 13. These results are then used to control the position, orientation and properties of graphic objects which are present in the main memory 16 as images or CAD models.
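One conventional way to obtain position and orientation from a segmented block, sketched here as an illustration only (the patent does not prescribe a particular algorithm; the function name `block_pose` is invented), is via second-order image moments: the major axis of the pixel distribution gives the block's orientation in the table plane.

```python
import math
import numpy as np

def block_pose(mask):
    """Position and orientation of a detected block from its binary mask.

    Uses second-order image moments: the major axis of the pixel
    distribution gives the block's orientation in the table plane.
    """
    ys, xs = np.nonzero(mask)
    cx, cy = xs.mean(), ys.mean()
    mu20 = ((xs - cx) ** 2).mean()          # variance along x
    mu02 = ((ys - cy) ** 2).mean()          # variance along y
    mu11 = ((xs - cx) * (ys - cy)).mean()   # covariance
    angle = 0.5 * math.atan2(2 * mu11, mu20 - mu02)
    return (cx, cy), math.degrees(angle)

# An axis-aligned elongated block:
mask = np.zeros((50, 50), dtype=bool)
mask[20:24, 10:40] = True  # wide, flat rectangle
centre, angle = block_pose(mask)  # orientation ≈ 0° for this block
```

Note that a rotationally symmetric block would leave the orientation undefined, which is why the text proposes an orientation-defining surface design or several transmitters per block.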
- The graphic objects are combined in the graphics module 15 into a digital image and converted into a high-resolution analog video image.
- The video image is transmitted to the LCD projector 5 via the video cable 17.
- The LCD projector 5 projects the image onto the table surface 4.
- The graphic objects are shown in top view (for example a machine, a layout or a building plan viewed from above) and coupled to the positions of the blocks, so that the object projected by the LCD projector 5 moves on the table surface 4 together with the block 3.
- The perception area thus changes continuously with every movement performed.
- The stored CAD models allow the consequences of every movement to be determined continuously and displayed in the perception area or at another point. In this way, an undercut safety distance can be indicated in the perception space, or a cost calculation can be output elsewhere.
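The safety-distance check mentioned above amounts to a continuous pairwise distance test over the tracked object positions. The following sketch is a hypothetical illustration (the function name `safety_violations`, the object names, and the coordinates are invented for the example):

```python
def safety_violations(positions, min_distance):
    """Check projected machine models for undercut safety distances.

    positions: dict mapping object id -> (x, y) table coordinates in metres.
    Returns the pairs whose mutual distance falls below `min_distance`,
    so the perception area can highlight them after every manipulation.
    """
    ids = sorted(positions)
    violations = []
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            dx = positions[a][0] - positions[b][0]
            dy = positions[a][1] - positions[b][1]
            if (dx * dx + dy * dy) ** 0.5 < min_distance:
                violations.append((a, b))
    return violations

machines = {"press": (0.0, 0.0), "robot": (0.3, 0.4), "lathe": (3.0, 0.0)}
print(safety_violations(machines, min_distance=1.0))  # → [('press', 'robot')]
```

Re-running such a check after each block manipulation is what allows the consequence (the violated distance) to be displayed in the perception area immediately.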
- All the known possibilities of CAD are open.
- The projector 6 projects a side view of the graphic objects (for example a machine viewed from the side) onto the screen 19.
- A digital image of the side view is compiled in a separate memory area of the graphics module 15, converted into an analog video image, and transmitted via the video cable 18 to the projector 6.
- A further detector 2 can also be provided for multidimensional detection of the arrangement of the blocks.
- The camera 2 can also be used as a data source for the construction of the perception space. It is thus possible to spread a plan out on the table surface, so that the camera records the corresponding structure as the basic data to be entered first and stores it in the memory 14 via the image processing. A separate generation of the perception area is then no longer necessary (although it can certainly also be generated by previous CAD operations). If basic data are read in by the detector in this way, a high detector resolution is a prerequisite whenever a fine structure is to form the perception area.
- A swiveling detector, or a camera with a swiveling lens, enables the perception area to be read in by sequential scanning; i.e. the detector can focus on individual sections one after another, which, viewed across the entire perception space, corresponds to a much higher detector resolution.
- Several and/or differently designed objects 3 can be used in the perception area, for example objects whose spatial orientation is recognizable as well as objects which are symmetrical in this regard. It is also possible, among other things, to mix passive (reflecting) and active (transmitting) objects 3 in order to support or simplify the corresponding CAD operations.
- A simple video camera is, however, inexpensive and, according to the invention, provides the full functionality of the interaction space described here.
- A purely two-channel system can thus be implemented: representation of the perception space in the visible wavelength range via one channel with the aid of the projector or projectors 5, and detection of only the signals of the objects 3 via the appropriately designed camera 2. It is then only necessary to reference the position of a block 3 relative to the image 5.1 in the CPU 11 at the beginning of the interaction, which can be achieved with simple technical means: for example, each object 3 can be brought to a standard position in the perception area, whereupon the first signal received from it via the detector 2 is used by the CPU 11 for position adjustment. Thereafter, the blocks 3 are referenced to the image 5.1 in the CPU 11 with each new manipulation, which enables two-channel operation. The system can of course be re-adjusted at any time, either automatically or by the user.
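The standard-position referencing step can be sketched as a translation calibration. This is a simplified illustration under the assumption that camera and projector coordinates differ only by an offset (a real installation might additionally need scale or perspective correction); the class name `BlockCalibration` and the coordinate values are invented:

```python
class BlockCalibration:
    """Reference camera coordinates of a block to the projected image 5.1.

    At start-up each block is placed at a known standard position in the
    perception area; the first detection there yields the offset between
    camera and projector coordinates, used for all later manipulations.
    """

    def __init__(self, standard_position):
        self.standard = standard_position  # standard spot in image coordinates
        self.offset = None

    def register_first_detection(self, camera_xy):
        # First signal received from the block at the standard position.
        self.offset = (self.standard[0] - camera_xy[0],
                       self.standard[1] - camera_xy[1])

    def to_image(self, camera_xy):
        # Map any later detection into image-5.1 coordinates.
        return (camera_xy[0] + self.offset[0],
                camera_xy[1] + self.offset[1])

cal = BlockCalibration(standard_position=(100, 100))
cal.register_first_detection((96, 103))   # first sighting at the standard spot
print(cal.to_image((150, 60)))  # → (154, 57)
```

After this one-time adjustment, every subsequent detection of the block can be referenced to the image 5.1 without the camera ever analysing the projected content, which is what makes the two-channel operation possible.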
- The interaction space according to the invention allows the manipulations carried out in the perception space to be processed in real time. According to the invention, it is not excluded to provide, in addition to the actions in the perception space, further data inputs into the processing system (via the input unit 22 and the input line 23); in the case of planning tasks, for example, the question could arise as to how far boundary conditions held in the computer and changed in a perception space are relevant.
- An interaction space occupied by one discussion partner can be located at location A, while the same facility is also set up at location B, where one or more further participants are present. It is then conceivable to project the other participant onto the work surface 19 and to make his comments audible via a speech connection. An intervention by the other participant in the interaction space changes it, and this change can be transmitted through the communication lines provided between the two interaction spaces, so that both interaction spaces change at the same time. Communication at a distance is significantly improved, because information is exchanged not only verbally but also via the changed perception areas, which significantly reduces misunderstandings.
- The perception area can also initially be generated by projecting an image onto the work surface and then reading it in via a suitable camera 2, so that the data of the perception space to be used are read, completely or as a supplement, into the relevant memory of the workstation and are then available for further processing.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
- Image Processing (AREA)
Abstract
Description
Claims
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CH2229/96 | 1996-09-12 | ||
CH222996 | 1996-09-12 | ||
CH225596 | 1996-09-13 | ||
CH2255/96 | 1996-09-13 | ||
PCT/CH1997/000336 WO1998013745A2 (en) | 1996-09-12 | 1997-09-11 | Interaction area for data representation |
Publications (1)
Publication Number | Publication Date |
---|---|
EP0859977A2 true EP0859977A2 (en) | 1998-08-26 |
Family
ID=25689852
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP97938724A Withdrawn EP0859977A2 (en) | 1996-09-12 | 1997-09-11 | Interaction area for data representation |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP0859977A2 (en) |
AU (1) | AU4109097A (en) |
DE (1) | DE19781024D2 (en) |
WO (1) | WO1998013745A2 (en) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE69939858D1 (en) * | 1998-08-31 | 2008-12-18 | Sony Corp | image processing |
DE19917660A1 (en) * | 1999-04-19 | 2000-11-02 | Deutsch Zentr Luft & Raumfahrt | Method and input device for controlling the position of an object to be graphically represented in a virtual reality |
WO2002035909A2 (en) * | 2000-11-03 | 2002-05-10 | Siemens Corporate Research, Inc. | Video-supported planning and design with physical marker objects sign |
KR20030016404A (en) * | 2001-05-14 | 2003-02-26 | 코닌클리케 필립스 일렉트로닉스 엔.브이. | Device for interacting with real-time streams of content |
DE10203992A1 (en) * | 2002-01-31 | 2003-08-14 | Deutsch Zentr Luft & Raumfahrt | input device |
NL1020440C2 (en) * | 2002-04-19 | 2003-10-21 | Univ Eindhoven Tech | Data input method for electronic desktop, provides visual illustration determined by position and type of input device used |
AT501442B1 (en) * | 2005-02-22 | 2007-07-15 | Thomas Dipl Ing Kienzl | COMPUTER-BASED INTERFACE SYSTEM |
JP4991154B2 (en) * | 2005-06-03 | 2012-08-01 | 株式会社リコー | Image display device, image display method, and command input method |
US7873479B2 (en) | 2005-12-01 | 2011-01-18 | Prometheus Laboratories Inc. | Methods of diagnosing inflammatory bowel disease |
DE102009007477A1 (en) * | 2009-01-30 | 2010-08-05 | Siemens Aktiengesellschaft | Model construction of a production facility with scale models of manufacturing facilities and method for entering a spatial structure of manufacturing facilities in a computer-aided planning program |
EP2748675B1 (en) * | 2011-07-29 | 2018-05-23 | Hewlett-Packard Development Company, L.P. | Projection capture system, programming and method |
DE102012201202A1 (en) * | 2012-01-27 | 2013-08-01 | Siemens Aktiengesellschaft | Model structure for production site, has flat base with multiple models of production equipments, where surface section is marked on base by mechanically readable carrier of an information characteristic within surface section |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4561017A (en) * | 1983-08-19 | 1985-12-24 | Richard Greene | Graphic input apparatus |
CA2014853C (en) * | 1989-07-19 | 1995-05-16 | Lanny Starkes Smoot | Light-pen system for projected images |
EP0622722B1 (en) * | 1993-04-30 | 2002-07-17 | Xerox Corporation | Interactive copying system |
-
1997
- 1997-09-11 EP EP97938724A patent/EP0859977A2/en not_active Withdrawn
- 1997-09-11 DE DE19781024T patent/DE19781024D2/en not_active Ceased
- 1997-09-11 AU AU41090/97A patent/AU4109097A/en not_active Abandoned
- 1997-09-11 WO PCT/CH1997/000336 patent/WO1998013745A2/en not_active Application Discontinuation
Non-Patent Citations (1)
Title |
---|
See references of WO9813745A3 * |
Also Published As
Publication number | Publication date |
---|---|
WO1998013745A2 (en) | 1998-04-02 |
DE19781024D2 (en) | 1999-08-05 |
WO1998013745A3 (en) | 1998-08-20 |
AU4109097A (en) | 1998-04-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE60028894T2 (en) | Presentation system with an interactive presentation | |
DE102014013677B4 (en) | Method for optically scanning and measuring an environment with a handheld scanner and subdivided display | |
DE69530395T2 (en) | INTERACTIVE PROJECTED VIDEO DISPLAY SYSTEM | |
DE69819181T2 (en) | POSITION DETECTION SYSTEM IN VIRTUAL STUDIO | |
DE102013110580B4 (en) | Method for optically scanning and measuring a scene and laser scanner designed to carry out the method | |
DE60133386T2 (en) | DEVICE AND METHOD FOR DISPLAYING A TARGET BY IMAGE PROCESSING WITHOUT THREE DIMENSIONAL MODELING | |
DE60127644T2 (en) | Teaching device for a robot | |
DE69832119T2 (en) | Method and apparatus for the visual detection of people for active public interfaces | |
DE69804591T2 (en) | POSITION DETECTION SYSTEM FOR VIRTUAL STUDIO | |
EP1047014A2 (en) | Method and input device for controlling the position of a graphically displayed object in a virtual reality space | |
DE102016113060A1 (en) | Method for controlling an object | |
DE112017005059T5 (en) | SYSTEM AND METHOD FOR PROJECTING GRAPHIC OBJECTS | |
DE19825302A1 (en) | System for setting up a three-dimensional waste mat, which enables a simplified setting of spatial relationships between real and virtual scene elements | |
EP0859977A2 (en) | Interaction area for data representation | |
DE10245226A1 (en) | presentation system | |
DE102004061841B4 (en) | Markerless tracking system for augmented reality applications | |
DE102019133753A1 (en) | TOOLS FOR AUGMENTED REALITY IN LIGHT DESIGN | |
EP2549431B1 (en) | Image fusion for surveillance of a hazardous area | |
DE102019108807A1 (en) | Digital status report | |
DE20117645U1 (en) | operating device | |
DE102018118422A1 (en) | METHOD AND SYSTEM FOR PRESENTING DATA FROM A VIDEO CAMERA | |
DE102011119082A1 (en) | Device arrangement for providing interactive screen of picture screen, has pointer which scans gestures in close correlation with screen, and control unit is configured to interpret scanned gestures related to data as user input | |
EP1347672B1 (en) | System and method of iluminating an object | |
DE102019133757A1 (en) | THREE-DIMENSIONAL RECONSTRUCTION, AUTOMATIC, FIRST LIGHTING BODIES AND APPLICABILITIES | |
DE102021131060B3 (en) | System and method with a system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 19980613 |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AT CH DE FR GB LI NL |
|
R17P | Request for examination filed (corrected) |
Effective date: 19980610 |
|
PUAK | Availability of information related to the publication of the international search report |
Free format text: ORIGINAL CODE: 0009015 |
|
AK | Designated contracting states |
Kind code of ref document: A3 Designated state(s): AT CH DE FR GB LI NL |
|
17Q | First examination report despatched |
Effective date: 20000828 |
|
GRAG | Despatch of communication of intention to grant |
Free format text: ORIGINAL CODE: EPIDOS AGRA |
|
GRAG | Despatch of communication of intention to grant |
Free format text: ORIGINAL CODE: EPIDOS AGRA |
|
GRAH | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOS IGRA |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20020806 |