EP1015959A1 - Raumsteuerung - Google Patents

Raumsteuerung (room control)

Info

Publication number
EP1015959A1
Authority
EP
European Patent Office
Prior art keywords
room
control device
trigger
sensor area
sensor
Prior art date
Legal status
Withdrawn
Application number
EP97953648A
Other languages
German (de)
English (en)
French (fr)
Inventor
Detlef Günther
Andreas Bohn
Current Assignee
Twosuns Media Development GmbH
Original Assignee
Twosuns Media Development GmbH
Priority date
Filing date
Publication date
Priority claimed from DE1996153682 external-priority patent/DE19653682C2/de
Application filed by Twosuns Media Development GmbH filed Critical Twosuns Media Development GmbH
Publication of EP1015959A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry

Definitions

  • the invention relates to a control device, a room with a control device, and a method for controlling functions of a room.
  • Controls for rooms are known in which certain functions (e.g. visual or acoustic signals) are influenced by a trigger (e.g. a person in the room).
  • room monitoring systems are known from security technology, in which motion detectors respond to movements of a person and trigger an alarm.
  • the system has light-emitting sources (e.g. LEDs) that are reflected by surfaces in the room (e.g. a person). The reflections are detected by sensors, from which a control unit calculates the exact spatial position of a reflecting surface.
  • such systems can also regulate the room condition (e.g. the room temperature).
  • a disadvantage of such room control systems is that the triggering of functions in the room (e.g. alarm) is non-specific, since any type of movement already triggers the alarm function of the room.
  • Such known controls are not suitable for rooms in which, for example, light signals, heaters or acoustic signals are to act on a person in a complex manner.
  • the present invention has for its object to provide a control device for a room, a room with a control device and a method for room control with which functions of the room can be controlled in a particularly differentiated manner via devices of the room and the functionality of the room is increased.
  • the control device of a room has sensor means with which the position of at least one trigger (e.g. a person in the room) can be detected. If the position of the trigger lies within a certain, predetermined 1-, 2- or 3-dimensional sub-area of the room, the sensor area, then the dwell time of the trigger in the sensor area is measured.
  • the flexibility of the room control is considerably increased by using the position and the dwell time of the trigger in the sensor area.
  • the trigger thus provides two parameters (position and dwell time in the sensor area) for controlling devices in the room.
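The two control parameters (position and dwell time in the sensor area) can be pictured in a short program. The following Python sketch is a minimal illustration, not the patented implementation: the class and parameter names are assumptions, and a circular 2-D area stands in for the patent's general 1-, 2- or 3-dimensional sensor areas.

```python
import math

class SensorArea:
    """A circular 2-D sensor area that accumulates the dwell time of a
    trigger (e.g. a person) whose sampled position lies inside it."""

    def __init__(self, center, radius):
        self.center = center
        self.radius = radius
        self.dwell_time = 0.0   # seconds the trigger has spent inside

    def contains(self, position):
        # Euclidean distance from the area's center
        return math.dist(position, self.center) <= self.radius

    def update(self, position, dt):
        """Accumulate dwell time for each sampling interval dt during
        which the trigger's position lies inside the area."""
        if self.contains(position):
            self.dwell_time += dt

area = SensorArea(center=(0.0, 0.0), radius=2.0)
for pos in [(0.5, 0.5), (1.0, 0.0), (5.0, 5.0)]:   # sampled every 0.1 s
    area.update(pos, dt=0.1)
```

Only the first two sampled positions fall inside the area, so 0.2 seconds of dwell time are accumulated.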
  • the control device according to the invention can thus be used to influence the functions of a room.
  • a computer system with means with which functions of the room, such as light, heating or sound, can be controlled is used as the control device.
  • a room is understood to mean any defined part of the three-dimensional environment, so that functions of closed rooms as well as functions of defined outdoor areas can be controlled by a control device according to the invention.
  • a certain sub-area of the room is understood as the sensor area, in which the control device registers the presence of a trigger and then measures the dwell time of the trigger in this part of the room.
  • the room control device can be implemented, for example, in the form of a processor or a computer program.
  • the functional units described below can be implemented either as software or hardware.
  • An advantageous embodiment of the control device according to the invention has means for detecting the speed and / or the acceleration of the at least one trigger. This information can then be used to influence at least one device in the room. These additional kinematic parameters of the trigger increase the functionality of the room.
  • a control device according to the invention for a room can, for example, react differently to a person's fast or slow movement.
  • the control device has means for detecting a trajectory of the at least one trigger.
  • at least one device of the room can be influenced.
  • the person's trajectories in the room are characteristic of certain situations when using the room or for certain users. This information can be used to better adapt the functions of a room to a user.
  • the control device according to the invention has means with which the kinematic (dynamic) and / or temporal behavior of the trigger is quantified.
  • the kinematic behavior of the trigger is generally understood to mean the space-time behavior of the trigger in the room, which includes in particular the dwell time, the position, the speed and the acceleration of the trigger.
  • Quantification is understood to mean that the kinematic behavior of the trigger is recorded by parameters or functions that describe, for example, the dwell time or the shape of the trajectory. These parameters and functions form the input values for functional relationships that directly link the kinematic behavior of the trigger with a function of a device in the room. These functional relationships can be permanently stored in a database or can be changed over time. By quantifying the kinematic behavior of the trigger, the functions of the room can be influenced in a very differentiated and specific manner.
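Such stored functional relationships can be sketched as mappings from a quantified kinematic parameter to a device setting. In the hedged Python sketch below, the area names, the dwell-time inputs and the 0-to-1 intensity scale are all illustrative assumptions, not values from the patent.

```python
# Hypothetical stored functional relationships: each sensor-area name maps
# a dwell time (in seconds) to a device setting (intensity between 0 and 1).
relationships = {
    "entrance": lambda t: min(t / 10.0, 1.0),   # linear ramp over 10 s
    "exhibit":  lambda t: 1.0 - 2.0 ** (-t),    # saturating response
}

def device_setting(area_name, dwell_time):
    """Quantified behavior in: dwell time; device function out: intensity."""
    f = relationships[area_name]
    return f(dwell_time)
```

Because the relationships live in an ordinary dictionary, they can be replaced at runtime, mirroring the patent's point that the relationships may change over time.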
  • the quantification of the kinematic and / or temporal behavior is used with particular advantage in combination with a random generator, so that novel and differentiated effects can be achieved, in particular in rooms with multimedia applications, games or artistically designed rooms.
  • the control device advantageously has means for changing the position, shape and / or the function of at least one sensor area in the room in a predeterminable or randomly controlled manner.
  • the sensor areas can thus be adapted to changing situations, which increases the flexibility of the control device and the functionality of the facilities in the room.
  • a database is used to store the kinematic behavior of at least one trigger. It is also advantageous to record the spatial, temporal and / or functional changes in at least one sensor area in a database. In this way, for example, certain movements or movement patterns of the trigger can be stored and used in a particularly advantageous manner for influencing functions of devices in the room and / or a sensor area.
  • in one embodiment of the control device, there is a continuous transition (fading) between at least two different functions of devices in the room.
  • At least one object stored in the database for influencing the function of at least one device in the room has an attribute which describes a property of the object.
  • these objects can be information about a sensor area, images, texts, sounds or pieces of music.
  • the attribute can e.g. describe the type of object (e.g. text) or the content of the object (e.g. poem).
  • At least one object stored in the database and / or an attribute of the object has a modifier.
  • This modifier is a measure by means of which the control device can compare different objects or attributes with one another. This also makes it possible to recompose objects that can be assigned to a specific sensor area.
  • a modifier can be stored in a predeterminable manner in a database or can be changed by the control device in the course of time.
  • the control device advantageously has means with which at least one device of the room can be controlled by the kinematic behavior of the trigger in connection with attributes and / or modifiers of at least one object. This makes it possible for the kinematic behavior of the trigger and the properties of the objects to influence the function of the at least one device in the room, which enables very flexible control of the functions of the room.
  • the control device also advantageously has means with which objects, in particular media, can be automatically stored in the database, sorted according to their type. This can considerably accelerate the acquisition of objects (e.g. texts or images) that are to be used as functions in a room. For example, the control device can automatically assign certain attributes to the objects.
  • the object is information about a sensor area, an image, a text, a sound, a piece of music, a video, 3D information or a group of objects.
  • this information can be used in a uniform manner to influence facilities in the room.
  • at least one sensor area of the control device according to the invention is invisible in space. For example, a certain area of the room can be scanned by sensors without people in the room being able to perceive it.
  • the sensor area can be arranged at any point in the room, for example, also floating freely in the air, which is particularly useful for rooms monitored for security reasons.
  • a particularly advantageous embodiment of the control device according to the invention has, as at least one controlled device in the room, a device for generating 2D or 3D representations, a video projector, a slide projector, a sound system, a device for producing tactile stimuli, a lighting system, an air-conditioning system or a system for producing smells. With such devices, the conditions in the room can be controlled in a comprehensive manner.
  • a control device is used to control at least one device of the room.
  • a space according to the invention can be used, for example, in the context of a multimedia exhibition.
  • with sensor means that are customary today (for example, ultrasonic sensors for determining the position of people in a room), a very precise position determination of a trigger in the room is possible.
  • At least one function of a device of the room, in particular an optical display and / or a sound system can be activated via the at least one trigger.
  • the control device uses the detected positions and dwell times of the trigger to determine whether, where and for how long the trigger is in a sensor area.
  • functions of at least one device in the room are influenced.
  • the functions of at least one device of the room can be controlled in a particularly flexible manner by dividing the room into sensor areas and the associated dwell-time recording.
  • the position of at least one trigger in the room is first detected by sensor means.
  • Functions of the at least one device of the room, e.g. audio-visual signals, can be activated.
  • the information detected by the sensor means is then transmitted to a control device.
  • the control device determines whether the at least one trigger is located in a 1, 2 or 3-dimensional area (sensor area) of the room. If the position lies within the sensor area, the control device then determines the dwell time and the position of the at least one trigger within the sensor area. Depending on the dwell time and the position of the at least one trigger in the sensor area, the control device finally influences at least one function of the at least one device of the room.
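The method steps above (detect position, check the sensor area, determine dwell time, influence a device function) can be condensed into one control cycle. The sketch below is an assumption-laden illustration: circular 2-D areas, dictionary bookkeeping and callback "actions" stand in for the patent's sensor means and device functions.

```python
import math

def in_area(area, position):
    """True if the 2-D position lies inside a circular sensor area."""
    return math.dist(position, area["center"]) <= area["radius"]

def control_step(position, areas, dt, actions):
    """One cycle of the described method: locate the trigger, update the
    dwell time of every sensor area it occupies, and influence the
    associated device function via a per-area callback."""
    effects = []
    for name, area in areas.items():
        if in_area(area, position):
            area["dwell"] = area.get("dwell", 0.0) + dt
            effects.append(actions[name](area["dwell"], position))
    return effects

areas = {"stage": {"center": (0.0, 0.0), "radius": 1.5}}
actions = {"stage": lambda dwell, pos: f"lights at {min(dwell, 1.0):.1f}"}
effects = control_step((0.5, 0.5), areas, dt=0.5, actions=actions)
```

Running the cycle repeatedly with freshly sensed positions would reproduce the continuous monitoring the method describes.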
  • a person in the room serves as a trigger.
  • Fig. 1 - a schematic representation of a room, the functions of which can be influenced by a sensor area for a person;
  • Fig. 2 - a schematic representation of the projection of a sensor area of a room;
  • Fig. 3 - a schematic representation of a functional connection between the position of a person in a room and a function of the room (interaction graph);
  • Fig. 4 - a schematic representation of a functional relationship between the temporal behavior of a trigger and a function of the room;
  • Fig. 6 - a schematic illustration of the chronological sequence of positioning a person in the room;
  • Fig. 7 - a representation of a room that is completely filled with cuboid sensor areas.
  • Fig. 1 shows a room 1 which has an air conditioning system and a loudspeaker system as devices with certain functions 10.
  • a room 1 can be, for example, a living room, a sports hall or a room in a public building.
  • the functions 10 of the facilities of room 1 are influenced by a computer system as control device 4, which is arranged in room 1.
  • the control device 4 can also be accommodated in a control center from which several rooms 1 are monitored and controlled.
  • the room 1 has a sensor area 2.
  • a sensor area 2 is understood to mean a certain part of the room 1, in which it is registered when a position 8 of the trigger 3 lies in the interior of the area.
  • the control device 4 also detects and stores the dwell time of the trigger 3 in the sensor area 2.
  • the control device 4 has timer functions.
  • the shape of a sensor area 2 is not rigid, but can be adapted to the requirements in room 1 in terms of position, shape and / or function. It is also possible for the entire room 1 to be filled with sensor areas 2, so that the dwell time of the trigger 3 is measured at every point in the room, with different functions 10 of devices in room 1 being triggered depending on the sensor area.
  • control device 4 determines the manner in which the dwell times are processed (e.g. weighting, addition of the dwell times).
  • sensor areas 2 can be invisible, i.e. they are not perceptible, for example, in an exhibition room 1.
  • sensor areas 2 can be made visible in room 1 by light signals.
  • a plurality of sensor areas 2 can also be arranged in a room 1. In particular, these can also be arranged "floating" in room 1. There are no restrictions with regard to the shape of a sensor area 2. Rather, the position, shape and / or function of the sensor areas 2 can be changed by the control device 4 in a predeterminable and / or randomly controlled manner.
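A randomly controlled change of position and shape might look like the following sketch. The shift and scale bounds are invented parameters, a circular area again stands in for the general case, and the generator is seeded only to make the example reproducible.

```python
import random

def reshape_area(area, rng, max_shift=0.5, max_scale=0.2):
    """Randomly shift the center and rescale the radius of a circular
    sensor area, mimicking a randomly controlled change of position
    and shape (bounds are illustrative assumptions)."""
    cx, cy = area["center"]
    area["center"] = (cx + rng.uniform(-max_shift, max_shift),
                      cy + rng.uniform(-max_shift, max_shift))
    area["radius"] *= 1.0 + rng.uniform(-max_scale, max_scale)
    return area

rng = random.Random(42)          # seeded so the change is reproducible
area = reshape_area({"center": (0.0, 0.0), "radius": 2.0}, rng)
```

A predeterminable change would simply replace the random draws with values looked up from a schedule.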
  • the trigger 3 is a person 3 who is moving in the room 1. For simplification, only one person 3 and only one two-dimensional position 8 of person 3 is shown. In principle, several people 3 can also serve as triggers 3, whose trajectories in space 1 are recorded three-dimensionally by the control device 4.
  • the kinematic behavior of the person 3 is determined by sensors 5, which in particular monitor whether a person 3 enters a sensor area 2.
  • sensors 5 on the walls and floor of the room 1 shown in FIG. 1 only serve to illustrate the general concept.
  • sensors 5 can, for example, also be arranged hanging from a ceiling in room 1. All means are considered as sensors 5 with which the position, the speed, the acceleration and / or the trajectory of the person 3 or another trigger 3 can be detected. Infrared, ultrasound or light sensors are particularly suitable for this purpose.
  • the sensors 5 and the loudspeaker and lighting system (and thus also the functions 10 of the devices) of the room 1 are connected to the control device 4 according to the invention in a manner not shown here.
  • the connection can be made, for example, via cables laid in the wall or via wireless data transmission.
  • the control device 4 has a database in which the kinematic behavior of one or more people 3 is stored, the control device 4 using this information to influence the functions 10 of the room 1.
  • the operation of the sensor areas 2 and the control device 4 is described below with reference to a room with multimedia devices (for example audio-visual devices).
  • the control device 4 now specifically influences functions of the multimedia devices of the room 1.
  • the control device interprets the dwell time and the position 8 of the trigger 3 (e.g. a person) in sensor area 2 as the person's interest and quantifies this interest as a so-called energy value. In this way, the perception of a user of the room 1 can be described by a measure.
  • the energy value is stored in a database and thus serves as a memory for the interest of a person in room 1, e.g. in a certain multimedia projection.
  • the control device 4 ensures that the energy value is changed after some time, so that forgetting or a waning interest is simulated.
  • the control device maintains an "energy budget" with which it can always be determined in which sensor areas 2 which energy was consumed.
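The energy budget with simulated forgetting could be sketched as follows. The exponential decay model and all names are assumptions, since the description does not fix how the energy value is reduced over time.

```python
class EnergyBudget:
    """Per-sensor-area 'energy values' that grow with dwell time and
    decay over time to simulate forgetting or waning interest."""

    def __init__(self, decay_per_second=0.1):
        self.energy = {}            # sensor-area name -> energy value
        self.decay = decay_per_second

    def accumulate(self, area_name, dwell_dt):
        """Add dwell time spent in an area to its energy value."""
        self.energy[area_name] = self.energy.get(area_name, 0.0) + dwell_dt

    def tick(self, dt):
        """Exponentially decay all stored energies ('forgetting')."""
        factor = (1.0 - self.decay) ** dt
        for name in self.energy:
            self.energy[name] *= factor

budget = EnergyBudget(decay_per_second=0.5)
budget.accumulate("temple-image", 4.0)
budget.tick(dt=1.0)   # after one second, the stored energy halves
```

The dictionary doubles as the "energy budget": it records, per sensor area, how much energy was consumed and what remains after forgetting.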
  • the state of a trigger is detected at all times by the current position, the current speed and the dwell time at its current position 8.
  • the current state of a trigger is described by seven values.
  • the control device 4 determines the further behavior of the room 1 (see also FIGS. 3 to 5). After a certain time (reaching a threshold value for the energy value), cross-references to related subject areas of a multimedia projection are displayed, for example, or a piece of music that fits the context of room 1 is played. It is possible for the newly projected images or recorded music to overlap, thus creating a continuous transition between the scenes (fading).
  • the control device 4 can control the behavior of the room 1 not only in a deterministic dependence on the kinematic behavior of the trigger. Rather, multimedia content can also be selected and presented via a random generator. Through a combination of deterministic and random selection of content, certain associations of the user (expressed by movements of the user in room 1) of room 1 can be taken into account.
  • random control can, for example, create images and atmospheres in an artistic multimedia room that are not repeatable and that challenge the creativity of a user.
  • randomly controlled images and texts can be used in spatial games, which always unfold new aspects.
  • a sensor area 2 can be an image of an exhibition, for example. If the trigger 3 remains in front of this image for a longer period of time, this is interpreted as increased interest by the control device 4 and a text for this image is displayed.
  • the control device 4 of the room 1 can detect and use the kinematic or dynamic behavior of the trigger in another way.
  • the controller 4 of the room 1 not only registers the position 8 of the trigger 3, but also measures the speed, the acceleration and the trajectory of the trigger 3 in room 1.
  • By detecting the trajectory of the trigger 3, the control device 4 recognizes the order in which the trigger 3 was in certain sensor areas 2.
  • the control device 4 triggers different functions 10 of the facilities of the room 1 depending on the sequence that has been run through.
  • the control device 4 of the space 1 can also carry out numerical differentiations at certain points on the trajectory, by means of which the speeds and the accelerations at the points of the trajectory are calculated.
  • the kinematic behavior of the trigger 3 is thus completely recorded. These measurements of the kinematic behavior of the trigger 3 are also quantified as energy values.
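The numerical differentiation of a sampled trajectory can be illustrated with standard finite differences: central differences for the speed and the second difference for the acceleration. This is a generic numerical sketch in one dimension, not the patent's specific procedure.

```python
def derivatives(trajectory, dt):
    """Finite-difference speeds and accelerations along a uniformly
    sampled 1-D trajectory (interior points only)."""
    v = [(trajectory[i + 1] - trajectory[i - 1]) / (2 * dt)
         for i in range(1, len(trajectory) - 1)]
    a = [(trajectory[i + 1] - 2 * trajectory[i] + trajectory[i - 1]) / dt ** 2
         for i in range(1, len(trajectory) - 1)]
    return v, a

# uniform motion x = 2t: constant speed 2, zero acceleration
xs = [2 * t * 0.1 for t in range(5)]
v, a = derivatives(xs, dt=0.1)
```

For a 3-D trajectory the same differences would be applied per coordinate.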
  • if a trigger 3 moves quickly through a sensor area 2, the control device 4 evaluates this as a small release of energy, i.e. the interest of the user is rated as low and little information is projected onto the walls. On the other hand, if a trigger 3 moves slowly in a room, more energy is consumed. The interest is rated higher, which leads to a different behavior of room 1, e.g. the playing of a video.
  • the kinematic behavior of the trigger 3 depends crucially on the person.
  • the kinematic behavior of a user of room 1 is stored in a database.
  • the control device 4 can thus adapt functions 10 of the room 1 to a specific user (for example by means of an expert system or a neural network). It is also possible that it recognizes from the kinematic behavior of the trigger 3 that a particular behavior of a user is not efficient, and it adjusts a function 10 to a device in the room 1 or indicates the inefficiency to the user.
  • the dwell time and the trajectory curves of people 3 are recorded in the corresponding sensor areas 2.
  • the control device 4 now influences the air conditioning system in the corresponding sensor areas 2 as a function of the kinematic behavior of the people 3. If a person 3 is in a sensor area 2 for a long time, the air conditioning system optimizes the conditions for this area accordingly.
  • the kinematic behavior of the people 3 in the room 1 is also used by the control device 4 to control a music system which plays sound into the room 1. When a person stays in certain areas of a room, one or more pieces of music can be played in a predetermined or random manner by the control device 4. In effect, a composition machine is made available.
  • a sensor area 2 will not necessarily play the same pieces of music when revisited as it did the first time. Rather, there is the possibility of playing thematically related pieces of music to the person 3.
  • the control device 4 according to the invention for the room 1 can also be used in a completely new way for sports purposes.
  • sensor areas 2 can be arranged floating in space 1, with sensors 5 monitoring, for example, the kinematic behavior of a ball as trigger 3.
  • Sensor areas 2 can also be changed in the course of a game. Depending on established rules, the control device 4 determines a score based on the kinematic behavior of the ball.
  • a control device 4 according to the invention with sensor areas 2 also offers a wide range of possibilities in the entertainment industry or art.
  • the type and volume of the music could be adapted to the movements of people 3.
  • lighting effects and noises can be adapted to the kinematic behavior of people in room 1. A viewer would become part of the artwork.
  • FIG. 2 shows the projection of a circular sensor area 2 of a room 1 with a radius 7. If a trigger 3, as shown in FIG. 2, is located within the sensor area 2, the kinematic behavior of the trigger 3 and its dwell time in the sensor area 2 are detected by the control device 4 of the room 1 according to the invention.
  • the position 8 of the trigger 3 is represented in a polar coordinate system, with the center as the reference point 6 of the sensor area 2.
  • the position 8 of the trigger 3 is determined from the distance of the trigger 3 from the reference point 6 and an angle, not shown here, to a reference line.
  • the position 8 of the trigger 3 is represented in an absolute coordinate system in space 1, i.e. the coordinates are counted from the corner of room 1.
  • the control device 4 additionally evaluates the angular coordinate and the dwell time at different points in the sensor area 2 and determines at least one function 10 of a device of the room 1 therefrom.
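The polar representation relative to the reference point 6 reduces to a distance and an angle to a reference line. A minimal sketch follows, in which the x-axis serves as the assumed reference line:

```python
import math

def polar_position(position, reference_point):
    """Return (distance, angle) of the trigger's 2-D position relative
    to the sensor area's reference point; the angle is measured against
    the positive x-axis as the reference line."""
    dx = position[0] - reference_point[0]
    dy = position[1] - reference_point[1]
    return math.hypot(dx, dy), math.atan2(dy, dx)

r, angle = polar_position((3.0, 4.0), (0.0, 0.0))
```

Checking `r` against the sensor area's radius 7 is then enough to decide whether the trigger lies inside the circular area.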
  • the relationship between the kinematic behavior of the trigger 3 and a function 10 of a device of the room 1 is shown in FIGS. 3 and 4.
  • the functional relationship 9 is part of the control device 4 according to the invention.
  • the functional relationships 9 between a function 10 of the space 1 and the position 8 of a trigger 3 can be both linear and non-linear.
  • the control device 4 of the room 1 then weights the various information about the kinematic behavior of the trigger 3 and assigns a certain function 10 to a device of the room.
  • a random generator is additionally used to determine the function 10.
  • the following input variables are typically used by the control device: sensor information, camera information, laser pointer information, scanning of body functions (heartbeat, perspiration, temperature, etc.).
  • the input variables are linked by the control device to the functions 10 of the room 1 via interaction graphs.
  • the output variables are typically: visual 2D and 3D representations, video information, slide projections, sound, tactile information about active sensors in data gloves, light, temperature, humidity, odors.
  • FIG. 4 shows a further functional relationship 9 'between a position 8 of the trigger 3 and a function 10'.
  • the function 10 ' consists in the opacity of an image in a multimedia projection in a room 1.
  • the start time 11 is defined by a specific action (e.g. exceeding a specific dwell time of the trigger 3 in a sensor area 2). From this point in time, the opacity of an image is determined by the functional relationship 9 ', i.e. the opacity increases and decreases again after a while. If the trigger 3 is removed from the sensor area 2 at any time 13, the opacity 10 'of the image assigned at this time 13 remains.
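The temporal interaction graph 9 ' and the freezing of the opacity when the trigger leaves can be mimicked as below. The raised-cosine pulse and the 10-second window are invented stand-ins for the unspecified shape of the graph.

```python
import math

def opacity(t_since_start):
    """Assumed interaction graph: opacity rises from 0, peaks, and falls
    back to 0 over a 10-second window (raised-cosine pulse)."""
    if t_since_start < 0.0 or t_since_start > 10.0:
        return 0.0
    return 0.5 * (1.0 - math.cos(2 * math.pi * t_since_start / 10.0))

class ImageFade:
    """Tracks the image opacity; if the trigger leaves the sensor area,
    the opacity assigned at that moment remains frozen."""

    def __init__(self):
        self.frozen = None

    def value(self, t):
        return self.frozen if self.frozen is not None else opacity(t)

    def trigger_left(self, t):
        self.frozen = opacity(t)

fade = ImageFade()
mid = fade.value(5.0)        # peak of the pulse while the trigger stays
fade.trigger_left(2.5)       # trigger leaves; current opacity is kept
```

After `trigger_left`, every later query returns the frozen opacity, matching the described behavior at time 13.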
  • Both the spatial (see FIG. 3) and the temporal evaluation of interaction graphs (see FIG. 4) can be used in combination.
  • Several functions 10 can be influenced in dependence on one another or independently of one another.
  • FIG. 5 shows an example of how a control device 4 according to the invention influences functions 10 ′′, 10 ′′ ′′ of a multimedia system in room 1 via interaction graphs 9 ′′, 9 ′′ ′′.
  • a database is essential for the function of the control device 4 according to the invention, in which all signals measured by the control device 4 and output by the control device 4 are stored.
  • the database holds objects 14, such as images, texts, music, sounds, videos, programs, control commands for external devices, which are made accessible to the user of the room 1.
  • information about sensor areas 2 is also treated as objects 14.
  • Media are stored in the database as objects 14 of various types.
  • the objects 14 are combined in a container 15 in terms of program technology, the objects 14 stored in the container 15 belonging together thematically (i.e. images, texts and music on one subject).
  • a container 15 is also an object 14 from a program point of view.
  • An object 14 can be a member of different containers 15.
  • the selection of an object 14 or a specific number of objects 14 takes place as a function of the position 8 ′′, 8 ′′ ′′ of the trigger 3 via the interaction graphs 9 ′′, 9 ′′ ′′.
  • a measure is determined from the positions 8 ′′, 8 ′′ ′′ and / or another kinematic parameter of the trigger 3 via the interaction graphs 9 ′′, 9 ′′ ′′ that are valid at the respective points and / or at the respective time.
  • the control device determines which object 14 or which group of objects 14 from the suitable container 15 is displayed or played.
  • Each object 14 has attributes 16 that describe the properties of the object 14. Using these attributes 16, the control device determines, among other things, which objects 14 are displayed.
  • the image of a Greek temple is stored, which has the attributes 16 "building", "Greece", "religion" and "antiquity".
  • the control device 4 displays the image of the temple. If the control device 4 has determined, for example, that a user is requesting information about Greece, it determines, depending on the kinematic behavior of the trigger 3 in sensor areas 2, whether, in addition to travel information about Greece, the image of the temple is also displayed. If a user informs himself about the ancient world in room 1, the image of the temple can again be displayed depending on the kinematic behavior of the trigger 3.
  • the attributes 16 thus establish cross-connections between different objects 14 stored in a database. Since all information is stored in the database as objects 14 in terms of program technology, diverse interactions between the information and the kinematic behavior of the trigger 3 can be established.
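The attribute cross-connections can be illustrated with a small object store: any object sharing at least one attribute with the current interest becomes selectable. All object names and tags below are hypothetical, chosen to echo the temple example.

```python
# Hypothetical object store: each object 14 carries attributes 16 (tags)
# that create cross-connections between otherwise unrelated objects.
objects = [
    {"name": "temple.jpg", "type": "image",
     "attributes": {"building", "Greece", "religion", "antiquity"}},
    {"name": "travel.txt", "type": "text",
     "attributes": {"Greece", "travel"}},
    {"name": "forum.jpg",  "type": "image",
     "attributes": {"building", "Rome", "antiquity"}},
]

def select(interest):
    """Return every object sharing at least one attribute with the
    current interest, mimicking the attribute cross-connections."""
    return [o["name"] for o in objects if o["attributes"] & interest]

greek = select({"Greece"})
```

A user whose kinematic behavior signals interest in Greece would be offered the temple image and the travel text; interest in antiquity would instead surface the temple and the forum.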
  • the control device 4 does not specify a rigid information hierarchy, where, e.g. under the generic term Greece, only the sub-terms "travel information" and "pictures" can be called up. Rather, the range of information on the display is determined dynamically by the control device as a function of the kinematic behavior of the trigger 3. Simply by the trigger 3 lingering at a certain point in a sensor area 2, the focus (see FIG. 6), different information can be displayed or played back gradually; the control device 4 interprets the stay in the sensor area 2 as increased interest and controls the projections onto the walls of the room 1 on the basis of the respective energy values.
  • each object 14 has a modifier 17 which assigns a measure (for example in the range 1 to 100) to the object 14.
  • the modifier 17 can be used, for example, to determine the transparency with which an image is displayed.
  • with a modifier value of 100, the control device displays the image with full opacity and the background of the display is completely covered. With a value of 10, the image is only translucent on the screen, so that elements behind the image shine through it.
  • a modifier 17 can also be used, for example, to influence the volume of a noise, the frequency with which images are displayed or music is played, the selection of an image from a container or the sensitivity of the energy output or the energy consumption.
  • Both the attributes 16 and the modifiers 17 can be changed in a predeterminable manner by the control device. Likewise, it is possible for attributes 16 or modifiers 17 to be changed by the kinematic behavior of the trigger 3 and thus to be influenced directly by the behavior of the user.
  • An audio system is mentioned here as an example, which controls the playing of pieces of music in a room 1 as a function of the movement of a trigger 3. If the trigger 3 interacts with different sensor areas 2 one after the other in time, a re-visited sensor area 2 will not necessarily play the same pieces of music as it did the first time. Rather, it is possible to play thematically related pieces of music. Under certain circumstances, the interaction between the trigger 3 and the control device 4 has signaled that the interest of a user has changed. After evaluating the information about the energy, the attributes 16 and the modifiers 17, the container content is therefore recompiled and the pieces of music then contained are played.
  • a user of the room 1 also controls the display by the movements of the trigger 3, but at the same time the control device 4 controls the user based on the kinematic behavior of the trigger 3, for example through an exhibition.
  • the navigation of the user thus takes place in the constant interplay between the user and the control device 4 of the room 1 according to the invention, the kinematic behavior of the trigger 3 being the link.
  • the control device 4 controls the cooperation of the database and the evaluation of the kinematic behavior of the trigger 3 so that new information is always displayed. This creates a knowledge browser with completely new properties, namely the creation and viewing of data rooms and the possibility of interacting with a trigger 3.
  • FIG. 6 schematically shows the influence of a function 10 of a device in a room 1 by the dwell time of a trigger (not shown here) at a position 8.
  • FIG. 6 shows the chronological sequence, symbolized by a time axis 18, when the trigger 3 remains in position 8.
  • at first, the control device 4 of the room 1 does not react to the presence of the trigger 3 in a sensor area 2 (not shown here). No functions 10 of the devices of the room 1 are therefore carried out.
  • the direction of the time axis 18 and the orientation of a so-called focal funnel thus indicate the “direction of interest”, that is, the focus of the user of the room 1.
  • the increasing interest is therefore shown in FIG. 6 by an expanding focal funnel 19; more and more objects 14 are being detected.
  • a shift of the position 8 into another sensor area 2 therefore corresponds to a changed orientation of the focal funnel 19.
  • the cuboid sensor areas 2 have essentially the same sizes.
  • the sensor areas 2 can also be of different sizes within the room 1, so that in one part of the room 1 the control device 4 carries out a finer scanning of the kinematic behavior of the person 3.
  • by completely filling the room 1 with sensor areas 2, the control device 4 detects the kinematic behavior of one or more people in detail and controls functions 10 of the room 1 accordingly.
  • the temperatures of the people can be detected in particular with infrared sensors, so that the air conditioning of the room 1 can be controlled. With other sensors, the control device 4 can detect e.g. the heartbeat or skin moisture and use these to influence the functions of the room 1.
  • the control device 4 can also be used to simulate the kinematic behavior of people in the room 1, so that the effect of the behavior of the people on the functions 10 of a device in the room 1 can already be determined during spatial planning.
  • the embodiment of the invention is not limited to the preferred exemplary embodiments specified above. Rather, a number of variants are conceivable which make use of the control device according to the invention for a room even in the case of fundamentally different types.
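The mapping from a modifier 17 to display transparency described above can be sketched as follows. This is a minimal illustration, not part of the patent disclosure; the function name and the linear mapping are assumptions.

```python
# Illustrative sketch (not from the patent): map a modifier 17 value in
# the range 1..100 to a display alpha value, so that 100 covers the
# background completely and 10 leaves the image translucent.

def opacity_from_modifier(modifier: int) -> float:
    """Map a modifier value (1..100) to an alpha value (0.0..1.0)."""
    if not 1 <= modifier <= 100:
        raise ValueError("modifier must lie in the range 1..100")
    return modifier / 100.0
```

On this reading, a modifier of 100 yields an alpha of 1.0 (full opacity) and a modifier of 10 yields 0.1, matching the translucent-image example given above.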
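The dwell-time behavior described for FIG. 6, where lingering in the focus accumulates energy and the widening focal funnel 19 detects more and more objects 14, can be sketched as follows. All names and thresholds here are hypothetical; the patent does not specify an implementation.

```python
# Illustrative sketch (assumptions, not the patented implementation):
# dwelling in a sensor area accumulates an energy value; a rising
# energy value widens the "focal funnel", i.e. more and more objects
# cross their display threshold and are detected.

from dataclasses import dataclass


@dataclass
class SensorArea:
    energy: float = 0.0

    def dwell(self, seconds: float, rate: float = 1.0) -> None:
        # lingering is interpreted by the control device as increased interest
        self.energy += rate * seconds


def objects_in_focus(energy: float, thresholds: dict) -> list:
    """Return all objects whose display threshold the energy has reached."""
    return sorted(name for name, t in thresholds.items() if energy >= t)
```

As the trigger remains at position 8, each call to `dwell` raises the energy, and `objects_in_focus` returns a growing list, mirroring the expanding focal funnel; moving to another sensor area would correspond to resetting the funnel's orientation.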
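The container recompilation described for the audio system, where a re-visited sensor area plays thematically related pieces after the user's interest has shifted, can be sketched as follows. The data layout and function name are invented for illustration only.

```python
# Illustrative sketch (hypothetical data and names): recompile the
# content of a music container once the evaluation of energy,
# attributes 16 and modifiers 17 signals a changed interest, so that
# thematically related pieces are selected instead of repeating the
# original selection.

def recompile_container(library: list, themes: set) -> list:
    """Select the titles of all pieces whose theme matches the current interest."""
    return [piece["title"] for piece in library if piece["theme"] in themes]
```

A sensor area would call such a selection each time it is activated, so that the pieces played track the themes currently inferred from the trigger's kinematic behavior.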

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Toys (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)
EP97953648A 1996-12-13 1997-12-15 Raumsteuerung Withdrawn EP1015959A1 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE1996153682 DE19653682C2 (de) 1996-12-13 1996-12-13 Steuervorrichtung und -verfahren für mindestens eine Einrichtung eines Raumes, und Raum mit Steuervorrichtung
DE19653682 1996-12-13
PCT/DE1997/002969 WO1998026345A1 (de) 1996-12-13 1997-12-15 Raumsteuerung

Publications (1)

Publication Number Publication Date
EP1015959A1 true EP1015959A1 (de) 2000-07-05

Family

ID=7815787

Family Applications (2)

Application Number Title Priority Date Filing Date
EP97953649A Withdrawn EP1015960A1 (de) 1996-12-13 1997-12-15 Computersteuerung
EP97953648A Withdrawn EP1015959A1 (de) 1996-12-13 1997-12-15 Raumsteuerung

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP97953649A Withdrawn EP1015960A1 (de) 1996-12-13 1997-12-15 Computersteuerung

Country Status (5)

Country Link
EP (2) EP1015960A1 (ja)
JP (2) JP2000512467A (ja)
CA (2) CA2274702A1 (ja)
DE (1) DE19654944A1 (ja)
WO (2) WO1998026345A1 (ja)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6393407B1 (en) * 1997-09-11 2002-05-21 Enliven, Inc. Tracking user micro-interactions with web page advertising
DE10125309C1 (de) * 2001-05-21 2002-12-12 Humatic Gmbh Verfahren und Anordnung zum Steuern von audiovisuellen medialen Inhalten
KR100575906B1 (ko) 2002-10-25 2006-05-02 미츠비시 후소 트럭 앤드 버스 코포레이션 핸드 패턴 스위치 장치
JP2005242694A (ja) 2004-02-26 2005-09-08 Mitsubishi Fuso Truck & Bus Corp ハンドパターンスイッチ装置
DE102007057799A1 (de) * 2007-11-30 2009-06-10 Tvinfo Internet Gmbh Grafische Benutzerschnittstelle
DE102011102038A1 (de) * 2011-05-19 2012-11-22 Rwe Effizienz Gmbh Heimautomatisierungssteuerungssystem sowie Verfahren zum Steuern einer Einrichtung eines Heimautomatisierungssteuerungssystems

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4109145A (en) * 1974-05-20 1978-08-22 Honeywell Inc. Apparatus being controlled by movement of the eye
GB2183889B (en) * 1985-10-07 1989-09-13 Hagai Sigalov Optical control means
US4896291A (en) * 1988-05-20 1990-01-23 International Business Machines Corporation Valuator menu for use as a graphical user interface tool
CA2012796C (en) * 1989-06-16 1996-05-14 Bradley James Beitel Trigger field display selection
US5107746A (en) * 1990-02-26 1992-04-28 Will Bauer Synthesizer for sounds in response to three dimensional displacement of a body
JP3138058B2 (ja) * 1992-05-25 2001-02-26 東芝キヤリア株式会社 換気扇の制御装置
US5196838A (en) * 1990-12-28 1993-03-23 Apple Computer, Inc. Intelligent scrolling
US5534917A (en) * 1991-05-09 1996-07-09 Very Vivid, Inc. Video image based control system
JPH05108258A (ja) * 1991-10-14 1993-04-30 Nintendo Co Ltd 座標データ発生装置
US5326028A (en) * 1992-08-24 1994-07-05 Sanyo Electric Co., Ltd. System for detecting indoor conditions and air conditioner incorporating same
US5448693A (en) * 1992-12-29 1995-09-05 International Business Machines Corporation Method and system for visually displaying information on user interaction with an object within a data processing system
DE4406668C2 (de) * 1993-04-27 1996-09-12 Hewlett Packard Co Verfahren und Vorrichtung zum Betreiben eines berührungsempfindlichen Anzeigegeräts
US5452240A (en) * 1993-11-23 1995-09-19 Roca Productions, Inc. Electronically simulated rotary-type cardfile
JP3546337B2 (ja) * 1993-12-21 2004-07-28 ゼロックス コーポレイション 計算システム用ユーザ・インタフェース装置及びグラフィック・キーボード使用方法
US5704836A (en) * 1995-03-23 1998-01-06 Perception Systems, Inc. Motion-based command generation technology

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO9826345A1 *

Also Published As

Publication number Publication date
CA2274786A1 (en) 1998-06-18
CA2274702A1 (en) 1998-06-18
JP2000512467A (ja) 2000-09-19
DE19654944A1 (de) 1998-06-25
EP1015960A1 (de) 2000-07-05
WO1998026345A1 (de) 1998-06-18
JP2000512415A (ja) 2000-09-19
WO1998026346A1 (de) 1998-06-18

Similar Documents

Publication Publication Date Title
DE69832119T2 (de) Verfahren und Apparat zur visuellen Erfassung von Menschen für aktive öffentliche Schnittstellen
Jacucci et al. Bodily explorations in space: Social experience of a multimodal art installation
DE19653682C2 (de) Steuervorrichtung und -verfahren für mindestens eine Einrichtung eines Raumes, und Raum mit Steuervorrichtung
DE102014213414A1 (de) Fahrzeug zur visuellen Kommunikation mit einem anderen Verkehrsteilnehmer
WO1998026345A1 (de) Raumsteuerung
DE69837165T2 (de) Verfahren und gerät für automatische animation von dreidimensionalen grafischen szenen für verbesserte 3-d visualisierung
AT519289B1 (de) Sicherheitsvorrichtung zur Einbruchsprävention
Distler et al. Velocity constancy in a virtual reality environment
DE102016209671A1 (de) Vorrichtung zum Designen eines Musters für einen tragbaren Gegenstand
Prager Making sense of the modernist muse: Creative cognition and play at the Bauhaus
Kluss et al. Exploring the Role of Narrative Contextualization in Film Interpretation
DE102012006694A1 (de) Beleuchtungssystem
Koay et al. A user study on visualization of agent migration between two companion robots
McGinley et al. Olfactory design elements in theater: The practical considerations
Josa et al. The action constraints of an object increase distance estimation in extrapersonal space
JP7189434B2 (ja) 空間制御システム
Bouko Dramaturgy and the immersive theatre experience
AT520234B1 (de) Vorrichtung zur interaktiven Präsentation von visuellen Inhalten
Tang Step into the Void: A Study of Spatial Perception in Virtual Reality
DE10019984A1 (de) VAR 3D "3dimensionale virtuelle Akustik Berechnung", ist ein Verfahren zur Simulation von Akustik auf einem informationsverarbeitendem System
Ritter The intersection of art and interactivity
DE102012021893A1 (de) Verfahren zur Aufnahme und Wiedergabe einer Abfolge von Ereignissen
Deby urbanflows (immersed in worlds): Choreographing body-environment metabolisms
Haslem Sip My Ocean: Immersion, senses and colour
Ray et al. Agents of Spatial Influence: Designing incidental interactions with arrangements and gestures

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 19990702

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FI FR GB IE IT LI LU MC NL PT

17Q First examination report despatched

Effective date: 20000817

GRAG Despatch of communication of intention to grant

Free format text: ORIGINAL CODE: EPIDOS AGRA

GRAG Despatch of communication of intention to grant

Free format text: ORIGINAL CODE: EPIDOS AGRA

GRAH Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOS IGRA

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20020425