WO2007085303A1 - Information overlay system (Système de recouvrement d'informations) - Google Patents
- Publication number
- WO2007085303A1 (PCT/EP2006/050537)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- data
- primary data
- information
- unit
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
Definitions
- the present invention relates to the presentation of information to a human user.
- a device for processing data comprising a detection unit adapted for detecting primary data indicative of an environment of a user, a recognition unit adapted for analyzing the detected primary data so as to recognize secondary data assigned to the primary data, and a presentation unit adapted for presenting the secondary data to the user so that the assignment of the secondary data to the primary data is perceivable for the user.
- perceivable thereby may have the meaning that technical features, such as oral commands generated by a loudspeaker or visual elements like arrows generated on a display unit that point from one information item to another, are implemented by the presentation unit to present the link between the information items in a manner that a human being may take note of such a correlation.
- a method of processing data comprising detecting primary data indicative of an environment of a user, analyzing the detected primary data so as to recognize secondary data assigned to the primary data, and presenting the secondary data to the user so that the assignment of the secondary data to the primary data is perceivable for the user.
- a computer-readable medium in which a computer program of processing data is stored, which computer program, when being executed by a processor, is adapted to control or carry out the above-mentioned method.
- a program element of processing data is provided, which program element, when being executed by a processor, is adapted to control or carry out the above-mentioned method.
- Embodiments of the invention can be partly or entirely embodied or supported by one or more suitable software programs, which can be stored on or otherwise provided by any kind of data carrier, and which might be executed in or by any suitable data processing unit.
- Software programs or routines can be preferably applied in the context of querying a database and in the context of automatic pattern recognition.
- the data processing scheme according to an embodiment of the invention can be performed or assisted by a computer program, i.e. by software, or by using one or more special electronic optimization circuits, i.e. in hardware, or in hybrid form, i.e. by means of software components and hardware components.
- an environment of a user may be automatically detected or may be specified by the user.
- the result of this detection or specification, which may be performed in a visual, acoustic, tactile, or olfactory manner, or using any other sensory detection, may be analyzed in a way to recognize characteristics or properties of the environment.
- This may include an identification of one or more objects which is or are presently located in the environment. For instance, such an object may be visually detected or analyzed using image recognition methods so that a shape, a color, a noise, a smell, a surface structure, an electric or a magnetic property or any other property of the object may be used as a basis for deriving additional information about this object.
- acoustic information is detected and primary data is derived from this audio detection, for instance using a voice recognition system to identify a person being the origin of such speech.
- olfactory information may be derived, for instance by a gas sensor detecting a kind of perfume to identify a person, a smell typical for a particular animal, a smell typical for a particular event like a fire, a taste typical for a particular food or drink, or the like.
- This "abstract" primary information may be automatically derived from the detected data.
- Such a recognition may use known methods of automatic signal processing or artificial intelligence systems like neural networks being capable of deriving abstract information from sensed information.
- the abstract information, for instance the identity of a person, a detected event, the identification of a part of an electric device as a particular electronic member or the like, may then be used to search in a database storing a plurality of items of secondary data which are assigned to corresponding items of primary information. For instance, habits of the person, a curriculum vitae, or the like may be stored in the database so that the identification of the abstract primary information "identity of the person" may be used to find such secondary information related to the primary information.
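The lookup described above can be sketched as a simple keyed mapping from recognized primary information to stored secondary information. The key scheme and the example records below are hypothetical placeholders for illustration, not part of the patent:

```python
# Hypothetical secondary-data store keyed by (kind, identity) of the
# abstract primary information derived from the detection step.
SECONDARY_DB = {
    ("person", "Mr. Miller"): "Mr. Miller is a teacher",
    ("device", "microwave"): "In order to stop the microwave, press the red button",
    ("component", "R1"): "Resistor R1 limits the maximum current in the circuit",
}

def lookup_secondary(kind, identity):
    """Map recognized primary information to its assigned secondary data,
    or None when the database holds no matching record."""
    return SECONDARY_DB.get((kind, identity))
```

A real system would query a database server rather than an in-memory dictionary, but the assignment of secondary to primary items works the same way.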
- information concerning the device may be provided in the database as a secondary information, like an instruction manual for assisting a user to operate the device, a functional description, tourist information or the like.
- This secondary information may then be automatically presented to the user in combination with the primary information so that a human user can intuitively understand the correlation between the primary data and the secondary data.
- the system may output "Mr. Miller is a teacher" or may output "in order to stop the microwave, press the red button".
- the presentation may include the information "pins 32 and 47 of the electric circuit are used for providing test result signals". For instance, such a presentation may be overlaid visually or may be overlaid acoustically so as to be perceivable for the human user.
- a correlation between the primary data and the secondary data may be made clear to the human user.
- the detection unit may identify any one or any combination of human and non-human senses.
- a "sense" may be denoted as any human physiological perception that responds to a specific kind of physical energy and corresponds to a defined region or group of regions within the brain where the signals are received and interpreted.
- Data related to human senses which may be detected by embodiments of the invention are particularly seeing, hearing, tasting, smelling, tactile sensing, thermoception, nociception, equilibrioception and proprioception.
- Seeing or vision describes the ability to detect light.
- Hearing or audition is the sense of sound perception.
- Taste or gustation relates to the human tongue having receptors to detect tastes like sweet, salty, sour and bitter.
- Smell or olfaction relates to olfactory reception neurons.
- Tactition is the sense of pressure perception, generally in the skin.
- Thermoception is the sense of heat and cold, also by the skin.
- Nociception is the perception of pain.
- Equilibrioception is the perception of balance.
- Proprioception is the perception of body awareness.
- One or a plurality of these human senses in any combination may be used as input information for the system from which the system may derive information and may take a "reasonable" decision which secondary information might be appropriate to be overlaid to the primary information. For each of the discussed senses, technical solutions are available to detect the respective perception.
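One minimal way to combine several sensed channels into a single "reasonable" decision, as described above, is to accumulate per-hypothesis confidence scores across channels; the channel names and scoring scheme here are illustrative assumptions only:

```python
def fuse_senses(readings):
    """Combine (channel, hypothesis, confidence) readings from several
    sensory channels and return the hypothesis with the most evidence."""
    totals = {}
    for channel, hypothesis, confidence in readings:
        totals[hypothesis] = totals.get(hypothesis, 0.0) + confidence
    return max(totals, key=totals.get)

# Vision and smell both suggest "fire"; hearing alone suggests "alarm".
readings = [("vision", "fire", 0.6), ("smell", "fire", 0.8), ("hearing", "alarm", 0.5)]
decision = fuse_senses(readings)
```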
- a "sense" may also denote any non-human perception of a sensor device, for instance electricity, magnetism, radioactivity, computer input or the like.
- an overlay information system in which detected information may be overlaid with secondary information in the field of view of a human being.
- the information perceived by a human being from her or his environment may be combined with information which may help the user to better understand the environment.
- a human operator or technician may have to analyze a complex device (for instance an electric circuit comprising a plurality of interconnected electric members).
- exemplary embodiments of the invention may assist the human operator in repairing the device by providing secondary information concerning the electric circuit or concerning components thereof.
- the human operator may analyze the function of a damaged resistor and may indicate to the system that additional information concerning this particular resistor is desired.
- This specification can be performed by simply looking in the direction of this resistor or by bringing a probe in contact with this resistor. This may trigger the system to provide the operator with further information concerning the function and the characteristics of this resistor, like "the resistor should have a value of 120 Ohm and has the function to limit the maximum current in the circuit to 1.5 Ampere".
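A hint like the one quoted above could be generated from stored design data roughly as follows. The record layout and the assumed supply voltage are hypothetical (note that 180 V across 120 Ohm gives the quoted 1.5 A):

```python
def component_hint(name, nominal_ohms, supply_volts):
    """Format a repair hint from design-data fields; the simple V/R
    current derivation is an illustrative assumption."""
    max_amps = supply_volts / nominal_ohms
    return (f"the resistor {name} should have a value of {nominal_ohms:g} Ohm "
            f"and limits the maximum current in the circuit to {max_amps:g} Ampere")

hint = component_hint("R1", 120, 180)
```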
- a user may carry a head-mounted display with a camera connected to the head-mounted display.
- a camera detects this looking direction and the corresponding image of this part of the environment.
- the image of the circuit may be displayed on a display unit of the head-mounted display. Analyzing the captured image may allow the system to derive the information that the user looks in this direction and wishes to be provided with background information about the object detected in accordance with this viewing direction.
- an automatic pattern recognition system may detect that the user monitors a particular electric member of the electric circuit, for instance a particular resistor.
- This information may be used to search in a linked database for information with respect to this resistor. When such information is found, secondary information about this resistor may be indicated by an arrow and/or a speech balloon displayed simultaneously with the image of the resistor using the head-mounted display.
- a user may simply have to look in a particular direction, and the system may recognize the region the user monitors and may analyze one or more components of this region. It is also possible that the user may ask a specific question like "I need more information with regard to resistor 102" or may select one or more items of available information from a menu provided by the system.
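Recognizing which component the user monitors from a viewing direction can be reduced, in the simplest case, to a point-in-rectangle test over a stored component layout; the coordinates and component names below are invented for illustration:

```python
def component_at_gaze(gaze, layout):
    """Return the name of the component whose bounding box
    (x0, y0, x1, y1) contains the gaze point, or None."""
    x, y = gaze
    for name, (x0, y0, x1, y1) in layout.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

# Hypothetical board layout in image coordinates.
layout = {"R102": (10, 10, 20, 20), "C5": (30, 10, 40, 20)}
monitored = component_at_gaze((15, 15), layout)
```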
- a communication system may be provided which combines automatically recognized information concerning a user-specific environment with an archive in which items of design information are stored. Then, the result of the automatic detection and recognition may be mapped to such design data included in the database.
- a detection in the environment may be performed which imitates human perception. This may include analyzing a viewing direction of a human user, for instance by a pupil observation detector which tracks the motion of the human pupil so as to derive information about a region of interest in which the person is looking.
- detection information as some kind of "fingerprint" of events in the environment of the human user may be set in relation to design data stored in a database, after having automatically determined a structure of the monitored region. This information may be mapped to the stored design data to derive secondary information from the primary information. Then, the secondary information may be superposed to the primary information so as to be perceivable simultaneously for the human user.
- the pattern recognition or automatic derivation of abstract information from the detected information may be assisted by implementing neural networks or expert rules.
- a self-adaptive system may be provided which may learn about user habits and may improve the user-specific correlation between perceived data and desired additional information, the longer a user has used the system.
- information of any degree of complexity may be mirrored onto the field of vision of a human user.
- Exemplary fields of application of embodiments of the invention are assistance in repair of devices, particularly of electronic devices, maintenance and placing into operation of technical devices, for instance at home, governmental applications like manhunt or military applications.
- Embodiments of the invention can be implemented in combination with any electrical circuit.
- Design information may be overlaid onto a physical object.
- virtual information may be presented in combination with real information so as to provide a better understanding for a human being of an object in her or his sensual field.
- This may include the identification of an object in a field of vision of a human user (including for example distance and/or angular information). Then, the pattern recognition may be performed as a basis for defining which design data shall be retrieved from a database. This design data may then be superposed on the physical object.
- the overlaid information may be displayed using a monitor or a head- mounted display.
- the detection may be performed with any camera, for instance with a CCD camera.
- the detection may also include the recognition of a spatial orientation and identification of the object. It is also possible to provide a user-induced physical interaction between the object and a probe, which may provide the system with information about the region of interest.
- At least a part of a system according to an exemplary embodiment of the invention may be delivered in the form of a software product, so that this software may be used in combination with a device for presenting information related to a bought device to the user.
- For example, when a consumer buys a microwave, a CD may be provided to the consumer as some kind of smart manual. Such a CD may include a collection of items of secondary information which might be interesting for the consumer when operating the microwave. Such a CD may then be used with a hardware system according to an exemplary embodiment of the invention.
- an overlay information system is provided. To repair a complex machine, a lot of information may be required, which may conventionally make such a repair time consuming. According to an exemplary embodiment, such needed information may be displayed directly, linked to the device under repair, as it is debugged by a human operator. The information displayed may include references, interconnections, diagrams, etc. of an object under investigation, for instance of an electric circuit. For this purpose, repair information may be overlaid by a head-mounted display. Before overlaying such information, the object may be photographed by a camera, the object may be automatically analyzed by an automatic analysis system, and the identified section of interest may be mapped to the design data of the object by a pattern recognition algorithm. Then, this information may be displayed on the device.
- the information is selected or specified by a probe (like a pen-like element having an electrically conductive tip) and OCR software recognizing the probe. This may be advantageous, since the shape, the colour and markers of the probe may be prestored in the system, so that the pattern recognition may be significantly simplified. It is also possible that the probe is attached to a scope, for instance an oscilloscope, and the output can be displayed by the head-mounted display.
- the system may point out various information for a service technician checking an electrical PCB. It is also possible that a human consumer may be provided with information about a product using such a system. The system may then help the human consumer to understand a recently purchased product. For instance, when making himself familiar with a new microwave, the system may assist a user in understanding how to operate the microwave. For instance, information like "open the microwave door by gently pressing the open button" may be displayed.
- a further exemplary field of application of embodiments of the invention is an information system for the police.
- the system may be used by a police officer to identify persons or objects like car license plates. For instance, when a police car with a policeman carrying a head-mounted display having a system according to an embodiment of the invention follows a car, number plate information of this car may be automatically detected by the system and may be compared with database information. If the car corresponding to the number plate has been reported to be stolen, the system may automatically display to the policeman information like "Alert - this car is stolen! Driver might be dangerous!".
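The number-plate check described above amounts to a membership lookup of the recognized plate against a watch list; the plate strings below are invented for illustration:

```python
# Hypothetical watch list of reported-stolen number plates.
STOLEN_PLATES = {"B-XY 1234"}

def plate_alert(plate):
    """Compare a recognized number plate against the database and return
    the warning text to display, or None if the plate is not listed."""
    if plate in STOLEN_PLATES:
        return "Alert - this car is stolen! Driver might be dangerous!"
    return None
```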
- the detection unit may be adapted for detecting the primary data being indicative of at least a part of a field of vision of the user.
- a viewing direction of the user may be automatically detected using position, velocity, or angular sensors, or the like.
- a motion of the head of the user may be automatically detected.
- a motion of the eye or parts of the eye can be detected automatically.
- the function of the detection unit is not restricted to visual information. Additionally or alternatively, any other information detectable from the environment and being usable for characterizing the environment may be evaluated.
- the detection unit may comprise a camera for detecting an image in the environment of the user. Such a camera may be a CCD camera or any other kind of camera.
- the recognition unit may be adapted for recognizing the secondary data assigned to the primary data by pattern recognition. Pattern recognition may be denoted as a field within the area of machine learning. It may also be denoted as the act of analyzing raw data and taking an action based on the category of the data. As such, pattern recognition may comprise a collection of methods for supervised learning. Pattern recognition may aim to classify data patterns based on an a priori knowledge or on statistical information extracted from the patterns. The patterns to be classified may be groups of measurements or observations, defining points in an appropriate multidimensional space.
- a pattern recognition system may comprise a sensor that acquires the observations to be classified or described, a feature extraction mechanism computing numeric or symbolic information from the observations, and a classification or description scheme that does the actual job of classifying or describing observations, relying on the extracted features.
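The three-stage pipeline described above (sensor, feature extraction, classification) might be sketched, under a toy nearest-centroid classification scheme, as follows; the feature choice and class centroids are illustrative assumptions:

```python
import math

def extract_features(observation):
    """Toy feature extraction: reduce a raw measurement vector to a point
    (mean, spread) in a two-dimensional feature space."""
    mean = sum(observation) / len(observation)
    spread = max(observation) - min(observation)
    return (mean, spread)

def classify(features, centroids):
    """Nearest-centroid classification: assign the label whose stored
    centroid lies closest to the extracted feature point."""
    return min(centroids, key=lambda label: math.dist(features, centroids[label]))

# Hypothetical centroids learned from prior observations.
centroids = {"resistor": (0.0, 1.0), "capacitor": (5.0, 1.0)}
label = classify(extract_features([-0.5, 0.5]), centroids)
```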
- Exemplary fields of pattern recognition are automatic speech recognition, classification of text into several categories, the automatic recognition of handwritten postal codes on postal envelopes, or the automatic recognition of images of human faces.
- the recognition unit may be adapted for mapping the secondary data to the primary data.
- mapping may particularly denote an instruction for an electronic data processing conversion system for converting primary data into secondary data. This may include selecting and assigning information, and creating meaningful links between items of information.
- the presentation unit may be adapted for presenting the secondary data to the user by at least one of the group consisting of a visual presentation, an audible presentation, and an audiovisual presentation. More generally, the presentation unit may present the secondary data in any manner so as to be sensable by a human user.
- the presentation unit may comprise a display unit.
- a display unit may display the image in such a manner that it is perceivable by a human user.
- a display unit may comprise an LCD device, a plasma display, or the like.
- the presentation unit may further comprise or may be part of a head- mounted display (HMD).
- a head-mounted display may be denoted as a display system built to be worn like goggles, giving the illusion of a floating monitor in front of a user's face.
- Such a head-mounted display, which may also be denoted as a head-up display, may present any electronically generated visual information.
- the presentation unit may further be adapted for overlaying the secondary data with the primary data to the user.
- an impression or illusion may be given to the user that the physical primary data is perceived simultaneously with the virtual secondary data.
- the presentation unit may further be adapted for visually and/or acoustically overlaying the secondary data with the primary data to the user.
- image and/or sound information may be presented to the user in a multimedia-like system.
- the presentation unit may be adapted for overlaying information about a physical object as the secondary data with the physical object as the primary data to the user. Therefore, the retrieved information concerning the identified object may be presented in accordance with the perceived object as additional information.
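One trivial way to make the assignment perceivable is to render the retrieved information as a textual "speech balloon" attached to the identified object, as in Fig. 1; the formatting below is purely illustrative:

```python
def overlay(primary_label, secondary_text):
    """Render the assignment of secondary data to a primary object as a
    labelled 'speech balloon' string for display."""
    return f'[{primary_label}] --> "{secondary_text}"'

balloon = overlay("R1", "100 Ohm")
```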
- the device may comprise a database unit for storing the secondary data and may be accessible by the recognition unit.
- a database may be a storage device in which information about different objects may be stored.
- information may be an "encyclopedia" which may be adjusted to a special field of interest.
- the database may comprise information about the device to be repaired, for instance about electrical engineering knowledge.
- the device may further comprise a user interface unit adapted to enable a user to provide the device with instructions concerning the primary data to be analyzed.
- the user interface unit may be adapted to enable a user to provide the device with instructions via at least one of the group consisting of speech, looking at a particular position, entering instructions via a menu, and entering instructions via a button.
- Such a user interface may be a graphical user interface (GUI).
- Such a graphical user interface may include a display device (like a cathode ray tube, a liquid crystal display, a plasma display device or the like) for displaying information to a human operator, like data related to the environment.
- the graphical user interface may comprise an input device allowing the user to input data (like data specifying the desired information) or to provide the system with control commands.
- Such an input device may include a keypad, a joystick, a trackball, or may even be a microphone of a voice recognition system.
- the GUI may allow a human user to communicate in a bi-directional manner with the system.
- the secondary data may be any data related to a region of interest in accordance with a user selection.
- the secondary data may be at least one of the group consisting of information assisting the user in servicing or repairing an apparatus related to the primary data, information concerning an electronic product related to the primary data, security related information, and police related information.
- the device may furthermore comprise a probe unit adapted to physically interact with an object in the environment of the user.
- Using a probe unit, for instance some kind of pencil, it is easily possible for the user to specify a portion of even a miniature device about which additional information is desired. This may significantly simplify the pattern recognition, since the shape of the probe may be prestored for easy identification.
- the probe unit may be provided with a marker (for instance an optical marker) allowing for an easy identification of the position and pointing direction of the probe unit on a captured image.
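Locating a prestored optical marker in a captured image can be reduced, in the simplest case, to scanning a pixel grid for the marker's value; real marker detection would be far more robust, and the grid and marker value here are made up:

```python
def find_marker(image, marker_value=9):
    """Scan a 2-D pixel grid for the prestored marker value and return its
    (row, col) position, or None if the marker is not visible."""
    for r, row in enumerate(image):
        for c, value in enumerate(row):
            if value == marker_value:
                return (r, c)
    return None

# Tiny hypothetical image: the probe marker sits at row 1, column 1.
image = [[0, 0, 0],
         [0, 9, 0]]
position = find_marker(image)
```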
- the probe unit may comprise or may be coupled with an oscilloscope.
- When handling an electronic device, measurements with the oscilloscope may be performed; the probe unit may electrically couple the oscilloscope with an electric circuit and may further serve to identify a portion of the electric circuit concerning which further information is desired.
- FIG. 1 shows a device for processing data according to an exemplary embodiment of the invention.
- Fig. 2 shows a part of a device for processing data according to an exemplary embodiment of the invention.
- FIG. 3 shows a device for processing data according to an exemplary embodiment of the invention.
- Fig. 4 shows an information overlay function of a device for processing data according to an exemplary embodiment of the invention.
- Fig. 5 shows an information overlay function of a device for processing data according to an exemplary embodiment of the invention.
- Fig. 6 shows a block diagram of a device for processing data according to an exemplary embodiment of the invention.
- Fig. 7 shows a block diagram of a device for processing data according to an exemplary embodiment of the invention.
- Referring to FIG. 1, a device 100 for processing data according to an exemplary embodiment of the invention will be explained.
- the device 100 comprises a head-mounted display 110 on which a camera 101 is mounted for detecting image data with respect to an electric circuit 102 as an environment of a user carrying the head-mounted display 110, in an operation state in which the user carrying the head-mounted display 110 looks in the direction of the electric circuit 102.
- the camera 101 may detect an image representing this region of interest and may provide the corresponding image data to the interface computer 103.
- a user interface 106 is provided laterally at the head-mounted display 110 and allows a user using the head-mounted display 110 to specify information which she or he desires.
- the user interface 106 may include a microphone via which a user may indicate further information, for instance saying "I need further information concerning the resistor R1".
- the user interface 106 may be a touch screen presenting menus from which a user may select a desired function.
- the image data captured by the camera 101 and/or the user-defined input via the user interface 106 may be supplied to the interface computer 103 which may be a laptop.
- the function of the interface computer 103 may also be integrated into the HMD 110.
- a probe 112 may be connected via an optional adapter unit 111 to the interface computer 103.
- a user may indicate a region of interest in her or his field of view, for instance by taking the probe 112 in her or his hand and by pointing onto the resistor R1 107 with regard to which the user needs additional information.
- information concerning a region of interest may be supplied, for instance via the adapter unit 111, from the probe 112 to the interface computer 103.
- the interface computer 103 may receive information concerning the environment 102 from the detection units 101, 106 and 112.
- a microprocessor (CPU) of the interface computer 103 may then process the received sensor information to derive abstract information from the detected issues. For instance, the interface computer 103 may recognize that the user wishes further information concerning the resistor R1 107 of the circuit 102.
- This information may be supplied from the interface computer 103 to an optional central server computer 113.
- the central server computer 113 may be connected to a database 105 in which a huge amount of items of design data is stored.
- the server 113 and/or the interface computer 103 may search in the database 105 to find additional data in the database 105 which further specifies the resistor R1 107 which has been identified by the interface computer 103 as an element of interest for the user.
- this information is supplied from the central server computer 113 to the interface computer 103.
- the interface computer 103 may recondition this data for output by a presentation unit 104 of the device 100.
- the presentation unit 104 may have an optical section and an acoustical section.
- the acoustical section of the presentation unit 104 may output acoustic waves indicating details concerning the function of the resistor R1 107, which is symbolized in Fig. 1 as a speech balloon 114.
- further information may be visually displayed on a display of the head-mounted display 110, for instance the value of the resistor R1 which is indicated to be 100 Ohm on the head-mounted display 110.
- a central processing unit (CPU) 103 of the system may receive various detection information concerning an environment of a user operating the system.
- An optical detection block 201 may detect an image of the environment.
- a 3D position sensor block 202 may derive 3D position and/or velocity and/or acceleration information of the object under investigation.
- a voice or sound recognition sensor block 203 may capture sound waves from the environment so as to derive information about the environment.
- An olfactory sensor block 204 may be a gas sensor to detect smells in the environment.
- the results from the sensors 201 to 204 are provided to the CPU 103 which may derive more abstract information concerning the environment. Via a bi-directional communication path 205, the CPU 103 may send a request and may receive a response in the form of secondary information about the environment based on the results of the detection of the blocks 201 to 204.
- Such secondary information which is received via the communication path 205 may be displayed by a display/image projection output block 206, may be emitted in the form of acoustic waves via a loudspeaker unit 207 and/or may be output as artificial gas compositions using an olfactory perception generator 208.
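The dispatch between the sensor blocks 201 to 204 and the output blocks 206 to 208 might be organized as in the following sketch; the data structures and modality names are invented for illustration and are not specified in the text.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    modality: str    # e.g. "visual", "position", "audio", "olfactory"
    payload: object

def fuse(readings):
    """Group raw readings by modality, mimicking the aggregation the
    CPU 103 performs before sending a request over path 205."""
    fused = {}
    for reading in readings:
        fused.setdefault(reading.modality, []).append(reading.payload)
    return fused

# Invented mapping from modality to the output block that renders it.
OUTPUT_BLOCKS = {
    "visual": "display/image projection (block 206)",
    "audio": "loudspeaker unit (block 207)",
    "olfactory": "olfactory perception generator (block 208)",
}

def route(modality):
    """Pick the output block for a piece of secondary information."""
    return OUTPUT_BLOCKS.get(modality, "no matching output block")

readings = [
    SensorReading("visual", "frame-0421"),
    SensorReading("position", (1.0, 0.2, -0.5)),
    SensorReading("visual", "frame-0422"),
]
print(fuse(readings)["visual"])
print(route("audio"))
```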
- a three-dimensional object 301 is detected using an image capture device 302 and a three-dimensional orientation detection device 303 for deriving image and spatial orientation information concerning the three-dimensional object 301.
- the devices 302 and 303 are mounted on a head-mounted display 110.
- the data captured by the camera 302 is provided to an image recognition unit 304.
- the data detected by the three-dimensional position sensor 303 may be provided to a three-dimensional model generator unit 305.
- the units 304 and 305 further derive abstract information concerning position and orientation of the three-dimensional object 301 and provide the result to the CPU 103.
- the CPU 103 may query a database 105 to derive secondary information concerning the three-dimensional object 301 which may then be presented by presentation elements 306 mounted on the head-mounted display 110 to the user.
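One plausible way for the units 304 and 305 to condense image and orientation data into a single database query is sketched below. It assumes, purely for illustration, that stored views are indexed by a coarse viewing sector; the labels, scores, and sector scheme are invented and not stated in the text.

```python
def recognized_label(image_scores):
    """Stand-in for image recognition unit 304: return the label with
    the highest classification score."""
    return max(image_scores, key=image_scores.get)

def viewing_sector(orientation_deg):
    """Stand-in for the 3D model generator unit 305: reduce a raw yaw
    reading to one of four canonical viewing sectors."""
    sector = int(((orientation_deg % 360) + 45) // 90) % 4
    return ["front", "right", "back", "left"][sector]

def build_query(image_scores, orientation_deg):
    """Combine both results into a key for querying database 105."""
    return (recognized_label(image_scores), viewing_sector(orientation_deg))

print(build_query({"gearbox": 0.91, "pump": 0.07}, 95.0))
```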
- Fig. 4 shows an image of an electric circuit 102 which is to be repaired by a human operator.
- the human user wears a head-mounted display 110 on which an image is displayed which relates to an image captured by a camera 101, so that the information displayed on the head-mounted display 110 corresponds to what the user would see when looking in the same direction without the head-mounted display 110.
- the user's eyes are directed at a particular resistor, denoted in Fig. 4 as resistor R103, about which the user desires more information.
- the corresponding section of the electric circuit 102 is photographed by the camera 101.
- the camera 101 supplies the captured image data not only to the head-mounted display 110 for display but also to a pattern recognition unit 103.
- the captured section is mapped to design data stored in a database 105 by the pattern recognition software.
- the repair information for the resistor R103 stored in the database 105 may then be overlaid on the head-mounted display 110, so that the corresponding information shown in block 401 is presented to the human operator in registration with the image of the electric circuit 102.
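The final overlay step could anchor the block 401 text near the recognized component in the displayed frame. The following sketch uses an invented bounding-box anchoring rule and pixel values; the text does not describe how the overlay is positioned.

```python
def anchor_overlay(bbox, frame_width, label_width=120):
    """Place the overlay label to the right of the component's bounding
    box if it fits inside the frame, otherwise to its left.
    bbox = (x, y, w, h) in pixels; all values are illustrative."""
    x, y, w, h = bbox
    if x + w + label_width <= frame_width:
        return (x + w, y)                    # right of the component
    return (max(x - label_width, 0), y)      # left of the component

print(anchor_overlay((400, 200, 40, 20), 640))  # fits on the right
print(anchor_overlay((560, 200, 40, 20), 640))  # pushed to the left
```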
- the data processing system 500 differs from the data processing system 400 in that an OCR system 501 is used as the pattern recognition system interconnected between the camera 101 and the design system 105.
- a probe 503 is provided via which a human operator may indicate a particular portion of interest in the electric circuit 102 (for instance the resistor R103). Additional information may then be displayed via a scope 502 as information 401 overlaid for the human user. In other words, the probe 503 is attached to the scope 502 so that the output can be displayed by the head-mounted display 110.
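Since the text names OCR as the pattern recognition system, a fuzzy match between the recognized designator and the design data could tolerate typical character confusions such as '0' read as 'O'. The sketch below uses Python's difflib as a stand-in for whatever matching the OCR system 501 actually performs; the repair entries are invented.

```python
import difflib

# Invented repair database keyed by reference designator.
REPAIR_DB = {
    "R103": "Replace with 1 kOhm, 0.25 W metal film resistor.",
    "R104": "Check solder joint before replacing.",
    "C7": "Electrolytic capacitor; observe polarity.",
}

def match_designator(recognized_text):
    """Return the closest known designator, or '' if none is close
    enough; tolerates OCR confusions such as '0' read as 'O'."""
    hits = difflib.get_close_matches(
        recognized_text.strip().upper(), REPAIR_DB, n=1, cutoff=0.6)
    return hits[0] if hits else ""

def overlay_text(recognized_text):
    """Compose the overlay (block 401) for the matched component."""
    designator = match_designator(recognized_text)
    if not designator:
        return "No repair data found."
    return f"{designator}: {REPAIR_DB[designator]}"

print(overlay_text("R1O3"))  # OCR read '0' as the letter 'O'
```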
- Fig. 6 shows a block diagram 600 of a device for processing data according to an exemplary embodiment of the invention.
- a camera 101 and a probe 112 may be used to define a region of interest of an object 102.
- the data captured by the probe 112 may be supplied to the scope 111 or 502.
- the output of the scope 111 may be provided to the interface computer 103, and the output of the camera 101 may further be supplied to the interface computer 103.
- the interface computer 103 is in bi-directional communication with the server 105 for 3D design, to derive data concerning the object 102.
- the information taken from the 3D server 105 is provided to the head-mounted display 110 so that the display of the object 102 together with overlaid information may be displayed to a user.
- the block diagram 700 of the data processing device shown in Fig. 7 differs from the block diagram 600 in that the camera 101 is replaced by a position sensor 701 which receives information from a reference point transmitter 702.
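The patent does not specify how the position sensor 701 derives position from the reference point transmitter 702; the sketch below assumes, purely for illustration, a one-dimensional ultrasonic time-of-flight measurement.

```python
SPEED_OF_SOUND = 343.0  # m/s; assumes an ultrasonic transmitter

def distance_from_reference(time_of_flight_s):
    """Range to the reference point transmitter 702, from the time a
    pulse takes to reach the position sensor 701."""
    return SPEED_OF_SOUND * time_of_flight_s

def position_1d(reference_position_m, time_of_flight_s):
    """One-dimensional position estimate along the line to the
    transmitter; a real system would fuse several transmitters to
    obtain a full 3D position."""
    return reference_position_m + distance_from_reference(time_of_flight_s)

print(round(position_1d(0.0, 0.01), 2))  # 10 ms of flight
```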
Abstract
A device (100) for processing data is disclosed, the device (100) comprising a detection unit (101) adapted to detect primary data indicative of an environment (102) of a user, a recognition unit (103) adapted to analyse the detected primary data so as to recognize secondary data assigned to the primary data, and a presentation unit (104) adapted to present the secondary data to the user in such a manner that the assignment of the secondary data to the primary data is perceivable by the user.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/EP2006/050537 WO2007085303A1 (fr) | 2006-01-30 | 2006-01-30 | Information overlay system |
TW096103186A TWI326847B (en) | 2006-01-30 | 2007-01-29 | Information overlay system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2007085303A1 true WO2007085303A1 (fr) | 2007-08-02 |
Family
ID=37074732
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2006/050537 WO2007085303A1 (fr) | Information overlay system | 2006-01-30 | 2006-01-30 |
Country Status (2)
Country | Link |
---|---|
TW (1) | TWI326847B (fr) |
WO (1) | WO2007085303A1 (fr) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2728846A1 (fr) * | 2012-11-06 | 2014-05-07 | Konica Minolta, Inc. | Guidance information display device |
US9760168B2 | 2012-11-06 | 2017-09-12 | Konica Minolta, Inc. | Guidance information display device |
JP2014235704A (ja) * | 2013-06-05 | 2014-12-15 | 富士機械製造株式会社 | Board production support system |
EP2652940A4 (fr) * | 2010-12-16 | 2015-05-20 | Microsoft Technology Licensing Llc | Comprehension- and intent-based content for augmented reality displays |
US9213405B2 | 2010-12-16 | 2015-12-15 | Microsoft Technology Licensing, Llc | Comprehension and intent-based content for augmented reality displays |
US9323325B2 | 2011-08-30 | 2016-04-26 | Microsoft Technology Licensing, Llc | Enhancing an object of interest in a see-through, mixed reality display device |
US10019962B2 | 2011-08-17 | 2018-07-10 | Microsoft Technology Licensing, Llc | Context adaptive user interface for augmented reality display |
US10223832B2 | 2011-08-17 | 2019-03-05 | Microsoft Technology Licensing, Llc | Providing location occupancy analysis via a mixed reality device |
US10839214B2 | 2018-03-13 | 2020-11-17 | International Business Machines Corporation | Automated intent to action mapping in augmented reality environments |
US11127210B2 | 2011-08-24 | 2021-09-21 | Microsoft Technology Licensing, Llc | Touch and social cues as inputs into a computer |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2002080091A1 (fr) * | 2001-03-28 | 2002-10-10 | Citysync Ltd. | Automatic number plate recognition system |
- 2006-01-30: WO application PCT/EP2006/050537 (WO2007085303A1) filed, active, Application Filing
- 2007-01-29: TW application TW096103186 (TWI326847B) filed, active
Also Published As
Publication number | Publication date |
---|---|
TWI326847B (en) | 2010-07-01 |
TW200809220A (en) | 2008-02-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107656613B (zh) | Eye-tracking-based human-computer interaction system and working method thereof | |
WO2007085303A1 (fr) | Information overlay system | |
CN112034977B (zh) | Method for MR smart glasses content interaction, information input, and application recommendation | |
Grauman et al. | Communication via eye blinks and eyebrow raises: Video-based human-computer interfaces | |
CN110167421B (zh) | System for holistically measuring clinical parameters of visual function | |
US7680295B2 (en) | Hand-gesture based interface apparatus | |
US6625299B1 (en) | Augmented reality technology | |
CN110647237A (zh) | Gesture-based content sharing in artificial reality environments | |
US20110260965A1 (en) | Apparatus and method of user interface for manipulating multimedia contents in vehicle | |
US6539100B1 (en) | Method and apparatus for associating pupils with subjects | |
CN108229345A (zh) | Driver detection system | |
CN109171638A (zh) | Vision testing method, terminal, head-mounted display device, and vision testing system | |
WO2012108998A1 (fr) | Contactless human-machine interface | |
JP5225870B2 (ja) | Emotion analysis device | |
Zhao et al. | Research on human-computer interaction intention recognition based on EEG and eye movement | |
KR20190066428A (ko) | Apparatus and method for generating and quantitatively adjusting a machine-learning-based cybersickness prediction model for virtual reality content | |
Brock et al. | Kin'touch: understanding how visually impaired people explore tactile maps | |
Mansoor et al. | A machine learning approach for non-invasive fall detection using Kinect | |
CN117292601A (zh) | Virtual reality sign language education system | |
EP4124073A1 (fr) | Augmented reality device performing audio recognition and control method therefor | |
CN112669578B (zh) | Sound-source-based method and system for alerting to objects of interest in the peripheral vision region | |
US20170316482A1 (en) | Person and machine matching device, matching system, person and machine matching method, and person and machine matching program | |
JP6167675B2 (ja) | Person-machine matching device, matching system, person-machine matching method, and person-machine matching program | |
CN109508089B (zh) | Gaze control system and method based on hierarchical random forest | |
CN113469055A (zh) | Human posture recognition system and method for evaluating human posture recognition | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 06704275; Country of ref document: EP; Kind code of ref document: A1 |