WO2005015829A1 - Method and system for applying sensor information by replacement of a set of sensors. - Google Patents


Info

Publication number
WO2005015829A1
WO2005015829A1 · PCT/SE2004/001167
Authority
WO
WIPO (PCT)
Prior art keywords
sensors
sensor
interface
application
platform
Prior art date
Application number
PCT/SE2004/001167
Other languages
French (fr)
Inventor
Stuart Mendelsohn
Original Assignee
Stuart Mendelsohn
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Stuart Mendelsohn filed Critical Stuart Mendelsohn
Publication of WO2005015829A1 publication Critical patent/WO2005015829A1/en

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B19/41845 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by system universality, reconfigurability, modularity
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/31 From computer integrated manufacturing till monitoring
    • G05B2219/31113 General, vendor independent display and control interface for sensor actuator
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/31 From computer integrated manufacturing till monitoring
    • G05B2219/31125 Signal, sensor adapted interfaces built into field device
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/33 Director till display
    • G05B2219/33125 System configuration, reconfiguration, customization, automatic
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Definitions

  • The XML standard also makes it possible to generate DTDs (document type definitions) as rules that can be used to define a specific set of sensors and/or capabilities, and to define the structure and content of events recorded via any sensor combination and associated data. For example, if a specific set of sensors is required for a task, a DTD could be used to parse suitable sensor platforms even if the platforms are of different manufacture. Additionally, simulated or imaginary events may be described in this way, so that the requirements for events or sensors may be defined before they are detected or created. Combinations of sensors and sensor readings can be filtered in a standard way by storing data in, for example, a database. A standardized database could store preferences in a system-independent way, allowing preferences to be transferred or loaded onto one or more systems, or replicated.
  • Using a photographic metaphor, such as the filter placed in front of a camera lens, an easy-to-use system could be created to manage the loading and management of filter data for sensor arrays. Extending the photographic theme would allow sensor arrays to be visualized as seen through a camera lens; changing the focal length would narrow the number of sensors 'in view', for example. Visualizing sensors in rows would allow depth of field to be used to select adjacent sensors in a row, for example.
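The DTD idea above, declaring which sensor set a task requires and checking candidate platforms against it, can be sketched without any XML machinery. This is a minimal illustration; the data layout and all names are assumptions, not part of the patent's specification:

```python
# A "requirement document" for a task, analogous to a DTD: it declares which
# sensor classes a platform must provide to be suitable for the task.
REQUIRED = {"camera", "microphone"}

# Candidate platforms of different manufacture, keyed by an assumed ID.
platforms = {
    "device50": {"camera", "microphone", "x-ray"},
    "device52": {"smell", "camera"},
}

def suitable_platforms(required, platforms):
    """Return the IDs of platforms whose sensor classes cover the
    required set, as the DTD-based parsing described above would."""
    return sorted(pid for pid, classes in platforms.items()
                  if required <= classes)

assert suitable_platforms(REQUIRED, platforms) == ["device50"]
```

Relaxing the requirement widens the match: requiring only `{"camera"}` would accept both platforms, which mirrors how a looser DTD admits more sensor combinations.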

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Arrangements For Transmission Of Measured Signals (AREA)

Abstract

The method is for applying sensor information. A sensor unit (12) has a set (15). The sensor unit is in communication with a sensor application-programming interface (40) that is in communication with an application unit (42). A plurality of sensors (16a, 18a) is associated with the set (15) according to a universal mapping standard, wherein the sensors (16a, 18a) are based on a first platform. The interface recognizes and classifies the sensors (16a, 18a). The sensors (16a, 18a) are replaced with sensors (16b, 18b) that are based on a second platform that is different from the first platform. The interface (40) recognizes and classifies the sensors (16b, 18b) without requiring a replacement of the application unit (42).

Description

METHOD AND SYSTEM FOR SENSOR DEVICES
Technical field The invention relates to a method and system for sensor devices. More particularly, the invention relates to a method and system for the standard classification and management of sensors connected to any system.
Background of Invention In today's robotics and complex systems there is an increased need for effective communication and interchangeability between various devices/sensors, even if the devices are manufactured by different suppliers and based on different protocols/program platforms. No common standard/API currently exists for sensors; such a standard would allow for greater application interchange, and the larger market for sensors and platforms would reduce costs through economies of scale. Currently available robotics and other such artificial intelligence devices do not allow common communication and concepts to evolve. Each robot project is like a species that may become extinct after only one generation: if a platform cannot easily support the evolution of applications, there is no inheritance, and the only evolution is the slow progress made by the engineers who create each new generation of robots. Each robot project varies in its design of motion, sensors and programming and cannot evolve or communicate with other machines if they are incompatible, and without a standard for sensors, most are. Due to the wide variety of different and incompatible protocols and technologies available, sensors often cannot communicate sensor information to the application devices unless some pre-customization has taken place. They often require complicated adjustments in both software and hardware that make communication impossible, cumbersome or expensive. Most systems that include sensors from different sensor manufacturers do not allow for easy interchangeability. Standard sensor API and preference/capability management There is a need for a standardized foundation to enable primitive communication and interchangeability between systems, devices and their sensors.
To allow better cooperation between systems and to allow comparison of system capabilities, there is also a need for classifying the capabilities and preferences/specialties of systems, devices and their sensors. Summary of Invention The method and system of the present invention provide a solution to the above-outlined problems. The system of the present invention has a common interface bus/specification that may be used to define sensor use and specification between different devices. The interface permits the interchange and upgrade of replacement sensors without requiring any modification of the application devices. The sensor devices may be replaced by enhanced versions or different sensor versions that provide the same or additional sensor functions. More particularly, the method of the present invention may be used for applying sensor information. A sensor unit has a set of sensors associated therewith. The sensor unit is in communication with a sensor application-programming interface that is in communication with an application unit. A plurality of sensors is associated with the set according to a universal mapping standard, wherein the sensors are based on a first platform. The interface recognizes and classifies the sensors. The sensors are replaced with a second set of sensors that are based on a second platform that is different from the first platform. The interface recognizes and classifies the second set of sensors without requiring a replacement of the application unit. A sensor is any device that can produce data in response to some event it is monitoring. With a sensor API presented to applications, physical sensors and associated data can be consistently mapped and classified.
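The replacement flow summarized above, registering a first sensor set under a universal mapping and then swapping in a second set from a different platform without touching the application, can be sketched in Python. All class and method names here are illustrative assumptions, not part of the patent's specification:

```python
# Sketch of the replacement flow: the application only ever sees the API's
# standardized sensor classes, never the underlying platform.

class Sensor:
    def __init__(self, sensor_class, platform, read_fn):
        self.sensor_class = sensor_class  # standardized class, e.g. "camera"
        self.platform = platform          # vendor/platform identifier
        self._read_fn = read_fn

    def read(self):
        return self._read_fn()


class SensorAPI:
    """Stand-in for interface 40: recognizes and classifies attached sensors."""

    def __init__(self):
        self._by_class = {}

    def attach_set(self, sensors):
        # Replacing the whole set re-maps the classes; applications untouched.
        self._by_class = {s.sensor_class: s for s in sensors}

    def classes(self):
        return sorted(self._by_class)

    def read(self, sensor_class):
        return self._by_class[sensor_class].read()


# First platform.
api = SensorAPI()
api.attach_set([Sensor("camera", "platform-A", lambda: "frame-A"),
                Sensor("microphone", "platform-A", lambda: "audio-A")])
app_view_before = api.classes()

# Swap in a second set on a different platform; the same calls still work.
api.attach_set([Sensor("camera", "platform-B", lambda: "frame-B"),
                Sensor("microphone", "platform-B", lambda: "audio-B")])
assert api.classes() == app_view_before  # application sees identical classes
print(api.read("camera"))                # → frame-B
```

The point of the sketch is that the application's view (`classes()` and `read()`) is identical before and after replacement, which is exactly what the claimed method requires of the interface.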
Thus equivalent sensors from different manufacturers, or different sensor arrays, can be consistently processed by an application system because the API translates sensors and arrays into one common standardised layer presented to the application system or systems accessing any standard sensor API. Sensors, their associated data, and the applications that access sensor data via the API can all be distributed and need not be in the same physical location. For example, a team of mobile robots could communicate wirelessly and appear as one virtual array of sensors, like a pack of dogs following a scent presented to them by their handler. Sensors can be chemical, biological or electronic in nature; once defined, the sensor capabilities must remain consistently mapped in the API. If a new sensor is developed, it is either mapped as an extension to an existing sensor class or, if a suitable class does not exist, a new class must be created. Once a common API exists for applications and sensors to access each other, a common way of describing sensor preferences can be defined, as well as a common format for the storage of sensor preferences. This would allow applications to "choose" sensor(s) suitable for certain tasks or programs. Complex preference sets could form the basis for primitive personalities or traits of systems equipped with sensors. Teams of robots could combine their sensors if necessary. If duplicate sensors exist in a group, they could be used as replacements in the event of a failure, or a sensor could be selected as having the highest specification among the equivalent sensors present in the "team". For example, if a team of robots had an individual robot with a digital compass and another with a GPS navigation system, an application could choose the GPS system if location accuracy was required, but if only a heading was required the compass could be used. A sensor can be accessed by any number of applications, and these can be distributed rather than residing on the same platform.
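The GPS-versus-compass choice above is a capability query against the standardized classification. A minimal sketch follows; the record layout and field names are assumptions made for illustration:

```python
# Illustrative capability records for a robot team.
team_sensors = [
    {"id": "robot1.compass", "provides": {"heading"}},
    {"id": "robot2.gps", "provides": {"heading", "position"}},
]

def choose_sensor(sensors, needed):
    """Pick the first sensor whose capability set covers what the task
    needs, as the preference-driven selection described above would."""
    for s in sensors:
        if needed <= s["provides"]:
            return s["id"]
    return None

# Location accuracy required: only the GPS qualifies.
assert choose_sensor(team_sensors, {"position"}) == "robot2.gps"
# Only a heading required: the compass (listed first) suffices.
assert choose_sensor(team_sensors, {"heading"}) == "robot1.compass"
```

A fuller implementation would rank candidates by specification rather than taking the first match, which is how the "highest specification in the team" selection above would be realized.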
With a common API, an application can be upgraded more easily, and different applications can access the same platform and use the same platform preferences/capability set. Thus a platform's capability set could be copied or inherited, so that a sensor combination could be matched to suitable applications and to subsequent versions of those applications. Standardizing sensor capabilities and preferences under a common standard would allow applications to:
• filter capability associated with the sensor;
• identify sensor type, class and subclass;
• read sensor service life;
• read when the sensor was accessed;
• identify what type of application(s) accessed the sensor and the ID of the accessing applications;
• allocate tasks to platforms depending on sensor capability.
If sensor arrays can be visualized as rows, the lens analogy can be used to define, by depth of field, which sensors in a row are being accessed, processed or selected at a given time or selection range. The rows can be of the same sensor type or of various types, for example different sensors accessed at the same time, or sensors grouped by physical location. Thus the sensors focussed on in a row would vary with the depth of field in the visualization; this could correspond to a value range, for example. If sensors are reading any data, the brightness of each sensor could be used to indicate the values being read, the highest value corresponding to the highest luminosity of the sensor visualized in this way. A change in color could also be used to indicate sensor values; for example, in an array of sensors of the same type, sensor luminosity could be proportional to the sensor signal amplitude. Pseudo sensors can be any combination of:
• physical sensors (either connected virtually or in the same array or device) to be processed as if they were one actual sensor. These combinations could also allow for easier visualization and processing of larger sensor arrays/combinations. For example, humidity, light, temperature and soil pH values could be processed as one pseudo sensor to monitor plants. • A sensor or sensors combined with non-sensor data/devices, to be processed by applications in the same way as other sensors. Such contextual information could include data from other applications, or even news or the time of day. For example, if an array includes cameras working in daylight, external data such as the times of eclipses could be combined with sensor data, affecting sensor priorities and interpretation.
• A pseudo sensor could also be a software simulation of sensor data for test, experimental or other purposes. Values and combinations not occurring normally could be simulated in this manner; the equivalent of a machine hallucination could also be simulated in this way. When sensors are displayed in an application they can be visualized, for example, as having:
• differing brightness to indicate the relative amplitude of monitored signals.
• flashing to optionally indicate periodicity of readings
• different colors to indicate relative frequency of waveforms
• different shapes to indicate the class of sensor. For each sensor platform, the standard allows applications to discover the available sensors, their capabilities and how they have been or might be used on each given platform. If the database holds historical data on the use of sensors, this could be used to express preferences for each platform in a standard way, so that applications could know the capabilities of sensors and how they have been used for each platform or group of sensor platforms/arrays. For example, a sensor may have registered certain readings at regular time intervals, so an application could use this data to anticipate events. If an event is anticipated, modulating sensor readings or the selective use of amplification could be used to 'heighten the awareness' of a platform, or to create a bias in selected sensor readings to emphasize the events being monitored. When dealing with complex or large sensor arrays, the standard sensor API will allow applications to prioritise and/or reduce the number of sensors read if processing is time critical or too much data is present for a given application to process effectively. Additionally, applications can pass tasks to other applications more suited to processing specific sensors and associated data.
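Two of the ideas above, a pseudo sensor aggregating several physical readings (the plant-monitoring example) and the luminosity visualization in which the highest reading gets the highest brightness, can be sketched together. All names and the 0-255 brightness scale are illustrative assumptions:

```python
# A pseudo sensor aggregates several read functions so that an application
# can treat the combination as one sensor.
def make_pseudo_sensor(components):
    def read():
        return {name: fn() for name, fn in components.items()}
    return read

plant_monitor = make_pseudo_sensor({
    "humidity_pct": lambda: 41.0,
    "light_lux": lambda: 1200.0,
    "temperature_c": lambda: 21.5,
    "soil_ph": lambda: 6.4,
})

def luminosity(values):
    """Map each reading to a 0-255 brightness, the highest value
    corresponding to the highest luminosity, as described above."""
    peak = max(values.values())
    return {k: round(255 * v / peak) for k, v in values.items()}

reading = plant_monitor()
bright = luminosity(reading)
assert bright["light_lux"] == 255            # largest value is brightest
assert all(0 <= b <= 255 for b in bright.values())
```

A color mapping would work the same way, normalizing each value into a hue range instead of a brightness range.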
Brief Description of the Drawing Fig. 1 is a schematic flow diagram of the method of the present invention; and Fig. 2 is a schematic flow diagram of two devices with sensor API units.
Detailed Description With reference to Fig. 1, the system 10 of the present invention has a sensor unit 12 in communication with a sensor API 40. As outlined in detail below, one important feature of the present invention is that there is a universal standard or mapping for all devices so that all sensor devices are compatible and interchangeable. Another important feature is that the universal mapping or standard also enables a unique way of managing preferences, and that the system has a hardware/software-independent platform for sensors. The method of the present invention provides a hardware- and software-independent way to map, classify and structure external events that are recorded, detected and associated via sensors. The sensors connected to the sensor unit 12 may be any type of device, such as a camera, photocell, biochemical sensor or a receiver/microphone operating at any wavelength/frequency. The sensors are not limited to the human sensory spectrum; they may detect energy and other such events whether or not these can be detected with human senses. The sensors may be physical sensors, or pseudo-sensors simulated in software, or combinations of sensors and associated events or data such as news, the time of day or the date. For example, the sensor unit 12 may have a sensor set 15 including sensor units 16, 18, 20, 22, 24, 26, 28, 30. The set 15 may include sensor functionalities such as microphone, X-ray, receiver, camera, infrared camera, gamma ray, vapor, vibration, magnetic or any other sensor functionality, as desired. Any sensor 16-30 that is attached to the sensor unit 12 can be recognized and classified, and what the attached sensor can do is known regardless of the manufacturer, due to the standardized software/hardware. It should be understood that it is not necessary for the sensors to be physically plugged into the set 15. The sensors may also communicate with the set 15 via wireless communication or any other type of communication.
The universal communication between the sensors and the set 15 enables the system 10 to prioritize how the sensors, such as the sensor 16, are used and to associate the sensor 16 with other devices. The system also enables a group of sensors, such as the sensors 16, 18, 20, to act together as one sensor unit. The sensor unit 12 communicates with a sensor application-programming interface (API) 40. By providing the interface above the physical sensors, the applications may operate across different platforms, since the interface hides the underlying hardware and software differences of the sensors. As discussed below, the platform sensor preferences can be stored in a database that can be accessed by other compatible platforms and by any application using the interface 40. These preferences allow for immediate processing associated with the interface layer, independent of the applications that access the interface 40. The system 10 can autonomously build up a world-view or perception based on data stored in the database for a particular combination of sensors, together with the preferences and data associated with these sensor combinations. These preferences or views stored in the database can be shared, processed and modified by compatible systems and applications. As indicated above, the interface 40 may be used to classify the sensors that are plugged into the set 15. The interface 40 may be used to transfer data to program sensors irrespective of the sensor hardware used, but all sensor hardware must comply with a classification that defines the function and capabilities of the sensors connected to the sensor unit that is in operative engagement with the interface 40. By separating the interface 40 from the sensor unit 12, the interface 40 does not need to deal directly with the sensor hardware that is plugged into the set 15. The intelligence of the interface 40 may effectively translate the sensor information.
The interface 40 may then communicate with an application unit 42. The interface 40 may give different preferences to the sensors connected to the set 15 to save processing power or to devolve the system 10. Because the interface 40 provides the required translation of the sensor information received by the sensor unit 12, the application unit 42 does not need to be directly compatible with the hardware of the sensor itself. In this way, the unit 42 may cooperate with a wider range of sensors and the application unit 42 is hardware/software-independent of the sensor unit 12.

The interface 40 may cooperate with a sensor preferences filter database 44 that is associated with a computer 46 so that the filter database 44 may be used to filter, such as by selecting or excluding, information from the interface 40 before the sensor information is displayed on a display 48 of the computer 46. The settings of each sensor 16-30 connected to the sensor unit 12 and the selection of the sensors 16-30 may be saved in the filter database 44. When the same application is carried out again, the settings and selection of the sensors may be retrieved from the database 44 so there is no need to reset or re-select the sensors each time an application is performed. Filters may be used for other wavelength spectra such as sound, radio and so on. If the interface 40 also contains standardized filter information, devices could learn from other devices and transmit to them how the device filters information from their sensors and information related to the processing of the sensor input.

As best shown in Fig. 2, the same units 12, 40, 42 may be used for two different devices 50, 52. In other words, the sensor mapping and functionality are the same for each device 50, 52 but the device 52 does not include sensors for the slots 20b, 30b. In the illustrated example, both devices 50, 52 run identical applications 54a, 54b, respectively, although the sensor sets are different.
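The save/restore behavior of the filter database 44 can be sketched as follows. The application name, slot identifiers and settings fields are hypothetical, and JSON merely stands in for whichever system-independent storage format a real platform would use:

```python
import json

class FilterDatabase:
    """Models the filter database (44): per-application sensor selection
    and settings are saved once, so the next run of the same application
    restores them instead of re-selecting and re-setting the sensors."""

    def __init__(self):
        self._store = {}  # application name -> serialized preferences

    def save(self, application, selection, settings):
        # A text format such as JSON keeps the stored preferences
        # system-independent, so compatible platforms could share them.
        self._store[application] = json.dumps(
            {"selection": selection, "settings": settings})

    def load(self, application):
        entry = json.loads(self._store[application])
        return entry["selection"], entry["settings"]

db = FilterDatabase()
# Slot identifiers are strings here because JSON object keys are strings.
db.save("inspection", selection=["16", "18"],
        settings={"16": {"gain": 2}, "18": {"gain": 1}})
# A later run of the same application: no manual re-selection needed.
selection, settings = db.load("inspection")
```

Serializing the preferences, rather than keeping them as in-memory objects, is what allows them to be transferred to or replicated on other compatible systems, as the description notes.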
This means the application 54b does not require the sensors 20b, 30b and only one common sensor or a plurality of common sensors 16, 18, 22, 24, 26, 28 are used. The devices may also run different applications 56, 58, as required. The device 50 may be equipped with a high-quality camera while the device 52 is equipped with an electronic smell sensor, a medium-quality camera and a hearing device. The devices 50, 52 may cooperate thanks to the universal mapping and act as one device with three sensor features. The grouping could also include pseudo-sensors for simulated events. Also, if the high-quality camera of the device 50 malfunctions, the medium-quality camera of the device 52 may be used as a backup and provide redundancy. When several sensors of a sensor type are plugged into the set 15, the system 10 may select or prioritize the sensor devices, such as the high-quality camera of the device 50, that may have the best or most suitable properties for a particular application.

By using a suitable standard protocol such as XML or any other suitable protocol/program, all data may be parsed. A standard such as XML allows not only for the mapping of sensors connected to the system 10 but also for the combination of sensors used to record external events or pseudo-events as explained above. Non-sensor information may be combined with sensor data to allow for a more complex construction of events or concepts. For example, data from an information system or the Internet could be related to events recorded via any combination of sensors. In this way, a system could autonomously evolve its own conceptual framework and experience independent of human input, in a way that can be parsed and communicated via the standard mapping system of the present invention to other systems or to human readers of events. News or calendar events could be used for activation.
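The backup and prioritization behavior described above (the medium-quality camera standing in for a failed high-quality one) could be sketched as a simple selection rule. The `(slot, quality, healthy)` records are hypothetical; the description does not prescribe how suitability is scored:

```python
def select_sensor(candidates):
    """Pick the most suitable healthy sensor of a given type.

    `candidates` is a list of (slot, quality, healthy) tuples; higher
    quality wins, and lower-quality units act as backups when the
    preferred sensor has failed. Returns the chosen slot, or None.
    """
    healthy = [c for c in candidates if c[2]]
    if not healthy:
        return None
    return max(healthy, key=lambda c: c[1])[0]

# Device 50's high-quality camera (slot 16) has malfunctioned, so the
# medium-quality camera of device 52 (slot 22) provides redundancy.
cameras = [(16, 10, False),
           (22, 5, True)]
```

With both cameras healthy, the same rule would prioritize the high-quality unit in slot 16, matching the selection behavior described for the set 15.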
The XML standard also makes it possible to generate DTDs (document type definitions), as rules that can be used to define a specific set of sensors and/or capabilities and to define the structure and content of events recorded via any sensor combination and associated data. For example, if a specific set of sensors is required for a task, a DTD could be used to parse suitable sensor platforms even if the platforms are of different manufacture. Additionally, simulated or imaginary events may be described in this way, so that the requirements for events or sensors may be defined before they are detected or created.

Combinations of sensors and sensor readings can be filtered in a standard way by storing data in, for example, a database. A standardized database could store preferences in a system-independent way, allowing preferences to be transferred or loaded onto one or more systems or replicated. For example, this could be to select sensors facing a specific direction or only certain frequencies or spectra. If the visualization of the sensors used a photographic metaphor, such as the filter placed in front of a camera lens, an easy-to-use system could be created to manage the loading and management of filter data for sensor arrays. Extending the photographic theme would allow the visualization of sensor arrays to be seen through a camera lens; changing the focal length would allow narrowing the number of sensors 'in view', for example. Visualizing sensors in rows would allow for depth of field to be used to select adjacent sensors in a row, for example.

While the present invention has been described in accordance with preferred compositions and embodiments, it is to be understood that certain substitutions and alterations may be made thereto without departing from the spirit and scope of the following claims.
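As a rough illustration of the DTD idea above (checking whether a platform offers the sensor set a task requires), the following sketch parses a hypothetical XML platform mapping with Python's standard library. The element and attribute names are invented, and a real implementation would validate against an actual DTD rather than inspect attributes by hand:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML mapping of a sensor platform; tag and attribute
# names are illustrative only.
PLATFORM_XML = """
<platform>
  <sensor kind="camera" quality="high"/>
  <sensor kind="microphone"/>
</platform>
"""

def platform_satisfies(xml_text, required_kinds):
    """Check whether a platform's XML mapping offers every sensor kind a
    task requires -- the matching role the description assigns to DTDs,
    independent of who manufactured the platform."""
    root = ET.fromstring(xml_text)
    available = {s.get("kind") for s in root.findall("sensor")}
    return required_kinds <= available

# A task needing a camera and a microphone matches this platform; a
# task needing an X-ray sensor does not.
ok = platform_satisfies(PLATFORM_XML, {"camera", "microphone"})
missing = platform_satisfies(PLATFORM_XML, {"camera", "x-ray"})
```

Because the mapping is plain XML, the same requirement check works across platforms from different manufacturers, which is the interoperability point the description makes.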

Claims

1. A method of applying sensor information, comprising: providing a sensor unit (12) having a set (15), the sensor unit being in communication with a sensor application-programming interface (40), the interface being in communication with an application unit (42); associating a plurality of sensors (16a, 18a) with the set (15) according to a universal mapping standard, the sensors (16a, 18a) being based on a first platform; the interface recognizing and classifying the sensors (16a, 18a); replacing the sensors (16a, 18a) with sensors (16b, 18b), the sensors (16b, 18b) being based on a second platform that is different from the first platform, the sensors (16b, 18b) being associated with the set (15) according to the universal mapping standard; and the interface (40) recognizing and classifying the sensors (16b, 18b) without requiring a replacement of the application unit (42).
2. The method according to claim 1 wherein the method further comprises the interface (40) selecting the sensors to be used for a first application and the interface setting the selected sensors for the first application.
3. The method according to claim 1 wherein the method further comprises the interface (40) saving the selection and settings of the sensors for the first application in a filter database (44).
4. The method according to claim 1 wherein the method further comprises performing a second application that is identical to the first application, the interface (40) retrieving the selection and settings of sensors from the filter database (44).
5. The method according to claim 4 wherein the method further comprises connecting the filter database (44) to a computer system (48) for displaying information from the selected sensors.
6. The method according to claim 1 wherein the method further comprises the interface (40) converting information from the sensor unit (12) to requirements of the application unit (42) and the requirements of the application unit (42) being independent of the sensors connected to the sensor unit (12).
PCT/SE2004/001167 2003-08-10 2004-08-05 Method and system for applying sensor information by replacement of a set of sensors. WO2005015829A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US48120303P 2003-08-10 2003-08-10
US60/481,203 2003-08-10

Publications (1)

Publication Number Publication Date
WO2005015829A1 true WO2005015829A1 (en) 2005-02-17

Family

ID=34135055

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2004/001167 WO2005015829A1 (en) 2003-08-10 2004-08-05 Method and system for applying sensor information by replacement of a set of sensors.

Country Status (1)

Country Link
WO (1) WO2005015829A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008538481A (en) * 2005-04-18 2008-10-23 センサーマティック・エレクトロニクス・コーポレーション Channel selection method for improved wireless communication
WO2014143576A1 (en) 2013-03-15 2014-09-18 The Iams Company A composition comprising mannoheptulose for use in the treatment or prevention of overweight and obesity
US9922512B2 (en) 2013-10-17 2018-03-20 Utc Fire And Security Americas Corporation, Inc. Security panel with virtual sensors

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6053031A (en) * 1997-05-30 2000-04-25 Dragerwerk Ag Detection system with interchangeable sensors
US6512968B1 (en) * 1997-05-16 2003-01-28 Snap-On Technologies, Inc. Computerized automotive service system


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
DATABASE INSPEC [online] Database accession no. 8134661 *
ZHANG K. ET AL.: "General software architecture for multi-sensor INF fusion system", IEEE ONLINE PUBLICATIONS, vol. 5, pages 4640 - 4644 *


Similar Documents

Publication Publication Date Title
EP2446364B1 (en) Method and system for ontology-driven querying and programming of sensors
CN1737790B (en) Device and method of at least one part of automation configuration for industrial system
US6076952A (en) Fieldbus network configuration utility with improved parameter control
JP2022061042A (en) Distributed industry performance surveillance and analysis
US5971581A (en) Fieldbus network configuration utility with improved scheduling and looping
JP2022084791A (en) Distributed industrial performance monitoring and analysis platform
CN105792751B (en) System and method for correcting the programming based on permanent ROM
CN104635686B (en) Targeted resource allocation
CN1774679B (en) Process control system and method for configuring a process control system
CN103238309B (en) Operation scheduler for building automation system
US7117040B2 (en) Tool attachable to controller
EP3069488B1 (en) Communicator with profiles
CN108507608A (en) Sensor management module, sensor management system, Method of Sensor Management and computer-readable non-volatile recording medium
CN1755564A (en) Enabling object oriented capabilities in automation systems
US10685155B2 (en) Method and system for designing a distributed heterogeneous computing and control system
CN101398686A (en) Adaptive industrial systems via embedded historian data
WO2007143406A1 (en) Method for integrating wireless field devices in a process control system with a wired protocol
JP2018106687A (en) Apparatus and method for dynamic device description language menus
CN101201600A (en) Self configuration of embedded historians
CN101008849A (en) Device and method for alarm information processing
CA2653907A1 (en) A system and a method for managing sample test results and respective sample result context information
JP2019096301A (en) Smart function block for integration of PLC into control system and method therefor
CN111095195A (en) Controller, control method, and control program
CN111095194A (en) Control system, controller and control method
US8666518B2 (en) Monitoring and control of electronic devices

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase