WO2013153890A1 - Device Management Apparatus and Device Search Method - Google Patents
Device Management Apparatus and Device Search Method
- Publication number
- WO2013153890A1 (PCT/JP2013/056400)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- area
- sensor
- search
- user
- information
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/953—Querying, e.g. by the use of web search engines
- G06F16/9537—Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/951—Indexing; Web crawling techniques
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04Q—SELECTING
- H04Q9/00—Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04Q—SELECTING
- H04Q2209/00—Arrangements in telecontrol or telemetry systems
- H04Q2209/40—Arrangements in telecontrol or telemetry systems using a wireless architecture
Definitions
- the present invention relates to a technique for making it possible to easily search for a device that matches a purpose from among a large number of devices installed in various places.
- a sensor network is a technology that enables the collection, management, and seamless use of sensing data by installing sensor devices having detection and communication functions in various places and networking them (Patent Documents 1 and 2). If a sensor network is realized, it becomes easy to quickly grasp the situation of any place from anywhere, so a wide range of applications is expected: industrial fields such as manufacturing sites and logistics, social systems such as transportation and various infrastructures, and fields related to daily life such as medical care and education.
- a sensor network has the advantage that the detection performance of the system as a whole (resolution, types of detectable information, etc.) increases as the number of sensors increases. On the other hand, if there are too many options, it may become difficult to find the sensor best suited to achieving a given purpose, reducing user convenience. It is therefore desirable to provide a search mechanism on the system side that accepts a user request (for example, an area to be sensed or the desired information) and extracts sensors matching the request.
- the request of a user of the sensor network is normally specified in terms of the area to be sensed (for example, "I want to know how crowded platform 0 of Kyoto Station is"), whereas the system side manages sensors in its database by their locations.
- however, the installation location of a sensor and the target area that the sensor senses do not necessarily match. In other words, even if a surveillance camera is installed at platform 0 of Kyoto Station, the camera may not be shooting platform 0 itself but instead arriving and departing trains, the facing platform, or the ticket gates.
- consequently, in conventional systems the users themselves had to check parameters such as the installation direction and range of each sensor to determine whether information on the desired area could be sensed.
- the present invention has been made in view of the above circumstances, and its object is to provide a technique for accurately and easily searching for a device that meets a user's request from among a large number of devices in various places.
- in the present invention, identification information (referred to as an "area ID") defined so as to uniquely identify an area (a point, line, surface, or space) is used to manage the target area of each device and to search for devices in terms of whether a device's target area (not its installation location) matches the user's desired area.
- the target area of a device is an area that has a causal relationship with the operation of the device: either the output value of the device is determined according to the state of the target area (sensing), or the state of the target area is determined by the output of the device (actuation).
- the present invention is a device management apparatus that manages a plurality of devices existing at different locations, comprising: a storage unit in which information on each device is registered; an acquisition unit that acquires a device search request from a user; a matching processing unit that extracts devices matching the search condition by matching the search condition included in the device search request against the information on each device registered in the storage unit; and a search result presentation unit that presents the extraction result of the matching processing unit to the user. An area ID that can uniquely identify the target area of a device is registered in the storage unit as information about that device.
- the search condition includes an area condition for specifying the area on which the user desires a device to act.
- the matching processing unit determines the devices to be extracted by comparing the target area specified by each device's area ID with the area condition included in the device search request.
- the target areas of all devices can be managed in a unified manner.
- since this area ID is defined so that the target area can be uniquely identified, unlike conventional systems it represents not the device installation position but the area that is actually affected by the device. Therefore, it is possible to search with high accuracy for a device that covers the area desired by the user.
- the action that the device exerts on the target area may be detecting the state of the target area or changing the state of the target area.
- the matching processing unit preferably calculates the degree of overlap between the target area specified by the area ID and the area specified by the area condition, and determines the devices to be extracted based on the magnitude of that overlap. For example, by extracting devices whose degree of overlap exceeds a predetermined value, or by extracting a predetermined number of devices in descending order of overlap, devices that are likely to match the user's wishes can be presented.
- preferably, position information representing the installation position of each device is registered in the storage unit as information about the device, and the matching processing unit narrows down candidate devices based on installation position and then executes the overlap-based extraction on the narrowed-down candidates.
- since the matching processing load increases as the network size (number of devices) grows, first narrowing down by installation position, which requires little computation, reduces the matching load and shortens the search time.
- for a device having a plurality of target areas, a plurality of area IDs respectively corresponding to those target areas are registered in the storage unit.
- movable or scanning devices, cameras capable of PTZ (pan/tilt/zoom) control, and the like correspond to this.
- a control parameter used for changing to the corresponding target area is registered in the storage unit in association with each of the plurality of area IDs.
- when the matching processing unit extracts a device that requires control to change its target area, it is preferable that the search result presentation unit present, along with the extraction result, information about the control parameters for that device.
- in this way, the user can directly or indirectly control the device so that it senses the desired area or outputs information to the desired area, improving convenience.
- the present invention can also be understood as a device management apparatus having at least one of the above means, or as a device network system including a device management apparatus and a plurality of devices.
- the present invention can also be understood as a device search method including at least one of the above processes, a program for causing a computer to execute each step of the device search method, or a storage medium storing the program.
- according to the present invention, it is possible to accurately and easily search for a device that meets a user's request from among a large number of devices in various places.
- FIG. 1 is a diagram showing the configuration of a sensor network system, and FIG. 2 is a flowchart showing the flow of sensor registration processing.
- FIG. 3 is a diagram schematically showing the sensing target area of a sensor.
- FIG. 4 is a diagram schematically showing an example of the data structure of an area ID.
- FIG. 6 is a flowchart showing the flow of sensor search processing, and FIG. 7 is a flowchart showing the flow of the overlap-degree calculation processing.
- the sensor network system is composed of a sensor network 1 which is a kind of device network, and a sensor management device 2 as a device management device.
- the sensor network 1 is a network configured by a large number of sensors 10 present at various places.
- the network configuration, communication method, and the like can be arbitrarily designed and are not particularly limited.
- Each sensor 10 can communicate with the sensor management apparatus 2 via a wide area network such as the Internet.
- the sensor management device 2 is a server device that manages information about each sensor 10 constituting the sensor network 1 and the information collected from the sensors, and that provides various services (device search being one of them) for users who want to use the sensors 10. A user can access the services provided by the sensor management device 2 from the user terminal 3 through a wide area network such as the Internet.
- the sensors 10 that make up the sensor network 1 need not all be owned by the operator of the sensor management device 2; many parties such as individuals and corporations other than the operator may own, operate, and manage sensors (hereinafter, a party that owns and manages a sensor is referred to as a "sensor provider"). The sensor management device 2 therefore has, as functions for sensor providers, a function for registering a new sensor in the system and a function for changing information on a sensor. Although not described in detail in the present embodiment, it is also preferable for the sensor management device 2 to have functions for mediating between sensor users and providers (reconciling usage and supply conditions, charging users, and paying providers).
- each sensor 10 is a device for detecting (acquiring) the state of its sensing target area; any kind of sensed information, detection method, and detection means may be used. Examples include image sensors, temperature sensors, humidity sensors, illuminance sensors, force sensors, sound sensors, RFID sensors, infrared sensors, attitude sensors, rainfall sensors, radioactivity sensors, and gas sensors. When one piece of information is obtained by combining a plurality of sensors, those sensors can be treated virtually as one sensor.
- the sensor management device 2 has functions such as a search request acquisition unit 20, an area ID determination unit 21, a storage unit (database) 22, a matching processing unit 23, a search result creation unit 24, and a sensor registration unit 25.
- the sensor management device 2 can be configured by a computer including a CPU, a main storage device (memory), an auxiliary storage device (HDD, SSD, etc.), a communication device, an input device, a display device, and the like in hardware.
- Each functional block shown in FIG. 1 is implemented by loading a computer program stored in the auxiliary storage device into the main storage device and executing the program by the CPU.
- the sensor management device 2 may be configured by a single computer or may be configured by a plurality of cooperating computers.
- the user terminal 3 for example, a personal computer, a mobile phone, a smartphone, a slate type terminal, or the like can be used.
- a user searches for a sensor via the Internet.
- alternatively, the user may operate the sensor management apparatus itself, or part or all of the functions of the sensor management apparatus may be implemented on the user terminal 3 side.
- the sensor provider terminal 4 for example, a personal computer, a mobile phone, a smartphone, a slate type terminal, or the like can be used.
- FIG. 2 is a flowchart showing a flow of sensor registration processing executed by the sensor management apparatus 2
- FIG. 3 is a diagram schematically showing a sensing target area of the sensor
- FIG. 4 shows an example of the data structure of an area ID.
- a surveillance camera having a PTZ (pan / tilt / zoom) function is taken as an example of the sensor.
- the provider of the sensor can access the sensor registration service of the sensor management apparatus 2 via the network using the terminal 4.
- the sensor registration unit 25 presents a screen (not shown) for inputting information necessary for sensor registration, and prompts the provider to input information (step S20).
- information necessary for sensor registration includes information about the provider (for example, the provider's name), information about the sensor (for example, sensor type, capability, installation position, target area, network address, etc.), and provision conditions (for example, permitted usage purposes, available times, usage fees, etc.). Manual input by the provider can be omitted for any sensor information that can be acquired automatically from the sensor.
- for example, if the sensor has a GPS (Global Positioning System) receiver, the coordinates of the installation position can be acquired from the sensor, and if various information such as the installation position, installation direction (angle), and capability is stored in the sensor's built-in memory, such information may be read from the sensor.
- the area ID determination unit 21 determines the area ID of the sensor based on the information on the target area acquired in step S20 (step S21).
- the area ID is identification information for uniquely specifying the sensing target area of the sensor.
- the target area is represented by a point, a line, a surface, or a space (that is, an area of 0 to 3 dimensions). The number of dimensions of the target area is appropriately set according to the type of sensor.
- the sensing target area can be defined by a pentahedron having five points A0 to A4 as vertices.
- if the camera installation position (A0), installation height (h), camera direction (angle), and angle of view are obtained in step S20, the coordinates A1 to A4 can be calculated geometrically from the intersection of the camera's field of view with the installation surface.
- the target area may be defined by a quadrangular pyramid having the camera installation position (A0) as the apex and the imageable distance as the height.
- alternatively, the provider may specify the coordinates of the vertices A0 to A4 directly, measure the shooting range with a distance sensor provided in the camera, or calculate the shooting range by analyzing images obtained by the camera.
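As a rough illustration of the geometric calculation mentioned above (this is not code from the patent; the function name and parameters are illustrative, and a simple pinhole camera over a flat, level ground plane is assumed), the following sketch projects the four corner rays of a downward-tilted camera's field of view onto the ground to obtain vertices A1 to A4:

```python
import math

def frustum_on_ground(a0, height, yaw_deg, pitch_deg, hfov_deg, vfov_deg):
    """Project the camera's four view-frustum corner rays onto the ground
    plane, returning the four ground vertices (A1 to A4) of the target area.

    a0:        (x, y) camera position on the ground plane (point A0)
    height:    installation height h above the ground
    yaw_deg:   camera direction, measured from the +x axis
    pitch_deg: downward tilt of the optical axis below horizontal
    hfov_deg, vfov_deg: horizontal / vertical angle of view
    """
    yaw = math.radians(yaw_deg)
    corners = []
    for dv in (-0.5, 0.5):          # far edge of the view, then near edge
        for dh in (-0.5, 0.5):      # left edge, then right edge
            pitch = math.radians(pitch_deg + dv * vfov_deg)
            if pitch <= 0:
                raise ValueError("corner ray never reaches the ground")
            dist = height / math.tan(pitch)   # horizontal reach of the ray
            ang = yaw + math.radians(dh * hfov_deg)
            corners.append((a0[0] + dist * math.cos(ang),
                            a0[1] + dist * math.sin(ang)))
    return corners
```

Together with A0, the four returned points define the pentahedron described above; a real implementation would also have to cap the far edge at the camera's imageable distance.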
- FIG. 4 is an example of an area ID determined by the area ID determination unit 21.
- when the target area is defined by a polyhedron, the area ID includes a data string representing the coordinates of the vertices of the polyhedron.
- the data has a structure in which the three-dimensional coordinate values (x, y, z) of the vertices A0, A1, A2, A3, and A4 are arranged in sequence.
- the coordinate values of four points in the area ID data are indispensable, and the data after the fifth point are optional.
- the value of each point may be an absolute coordinate or a relative coordinate.
- each point may be expressed not by xyz coordinates but by latitude / longitude / height, or by a standard GPS format.
- when the target area is three-dimensional (a space), at least four coordinate values are indispensable in the area ID; when the target area is two-dimensional (a surface), at least three points are needed; and when it is one-dimensional (a line), at least two points are needed.
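The vertex-list structure described above can be sketched as follows. This is a hypothetical encoding for illustration only: the class name and the comma-separated string format are assumptions, not the patent's actual data format.

```python
from dataclasses import dataclass

@dataclass
class AreaID:
    """An area ID as a sequence of vertex coordinates (x, y, z).

    Dimensionality is implied by the vertex count: at least 2 points for a
    line, 3 for a surface, 4 for a space (polyhedron). Coordinates may be
    absolute or relative, as noted in the description above.
    """
    vertices: list  # [(x, y, z), ...]

    def encode(self) -> str:
        # flatten the vertices into a comma-separated data string
        return ",".join(f"{c:g}" for v in self.vertices for c in v)

    @classmethod
    def decode(cls, s: str) -> "AreaID":
        nums = [float(t) for t in s.split(",")]
        return cls([tuple(nums[i:i + 3]) for i in range(0, len(nums), 3)])
```

A latitude/longitude/height or GPS-style representation, as the text allows, would simply change the interpretation of each coordinate triple.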
- the sensor registration unit 25 registers the various information received in step S20 and the area ID in the storage unit 22 (step S22). This completes the sensor registration process.
- for a sensor whose target area can be changed, an area ID may be obtained for each target area that the sensor can take, and the plurality of area IDs may be registered in the storage unit 22.
- the control parameters of the sensor used for changing the target area may be registered in the storage unit 22 together with the area ID.
- for example, for a PTZ camera, parameters such as pan, tilt, and zoom and the area ID corresponding to each parameter set are registered together.
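A hypothetical registration record for such a PTZ camera might look like the following; the field names and layout are illustrative assumptions, not the patent's actual schema.

```python
# Hypothetical registration record for a PTZ camera with two selectable
# target areas. Each entry pairs an area ID (here, a vertex-coordinate
# string) with the control parameters needed to point the camera at it.
sensor_record = {
    "sensor_id": "cam-042",
    "position": (135.7581, 34.9855, 12.0),  # installation position
    "areas": [
        {"area_id": "0,0,0,8,0,0,8,6,0,0,6,0",
         "control": {"pan": -30, "tilt": 10, "zoom": 2}},
        {"area_id": "0,0,0,3,0,0,3,2,0,0,2,0",
         "control": {"pan": 45, "tilt": 5, "zoom": 1}},
    ],
}
```

During matching, each entry in `areas` would be evaluated separately (loop R2 in the search flow described later).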
- the sensor search service is a service for facilitating the user (user) to find an optimal sensor for achieving the purpose from among a large number of sensors 10 constituting the sensor network 1.
- the search service will be described with reference to FIG.
- when the user accesses the sensor search service using the user terminal 3, a search condition input screen such as that shown in FIG. 5(a) is displayed. On this screen, the search condition (the purpose of the search) can be entered as free text, for example, "I want to know the traffic situation around 6 p.m. within about 100 m of Kyoto Station."
- FIG. 5(b) is another example of the search condition input screen. On this screen, the area, time, and content are entered in separate boxes.
- when the search button is pressed, a search request is transmitted from the user terminal 3 to the sensor management device 2, and search processing (matching processing) is executed in the sensor management device 2. Details of the search processing will be described later.
- FIG. 5(c) is an example of a search result screen returned from the sensor management device 2.
- a traffic jam sensor, an optical beacon, an image sensor (camera), an ETC entry record, and the like are assumed as means for grasping the traffic jam situation on the road.
- in the sensor search service, all sensors that may satisfy the search condition input by the user are found, and a list is displayed in descending order of the degree of match with the search condition.
- the display order of the list may instead be sorted according to conditions specified by the user, such as sensor type or ascending usage fee.
- the sensor list also displays information such as each sensor's specifications, provision conditions, usage fee, and provider as necessary. The user can select a sensor to use while examining the list presented as the search result and comparing the candidates.
- as described above, the installation position (or current position) of a sensor 10 does not necessarily match the target area it senses. In the sensor search service of this system, therefore, sensors are searched for by comparing the area condition input by the user with each sensor's target area. This makes it possible to accurately extract sensors that match the user's purpose and wishes. In addition, since means of achieving the purpose that the user had not considered are also listed (recommended), increased usage opportunities can be expected.
- FIG. 6 shows a flow of sensor search processing by the sensor management device 2.
- the search request acquisition unit 20 acquires a search request from the user terminal 3 (step S60). Then, the area ID determination unit 21 converts the area condition (information for specifying an area that the user desires to sense) included in the search request into an area ID (step S61).
- the definition and data structure of the area ID are the same as described above.
- the area condition is converted into the area ID in order to unify the handling of the area information in the matching process described later. Note that step S61 may be omitted when there is no need for unification. Also, step S61 can be omitted when the area ID itself is designated as the area condition by the user.
- the matching processing unit 23 determines a sensor search range based on the area condition designated by the user (step S62). For example, the desired area specified as the area condition and its surrounding L [m] may be set as the search range. The value of L, which is a margin, may be set as appropriate according to the size of the sensor network, the type of sensor (detectable distance), and the like.
- the matching processing unit 23 selects only the sensors installed in the search range determined in step S62 as search candidates from all the sensors registered in the storage unit 22. As a result, the number of targets for calculating / evaluating the degree of overlap in the subsequent stage can be greatly reduced, and the processing load can be reduced. If the number of sensors constituting the sensor network is not so large, step S62 may be omitted and all sensors may be search candidates.
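The installation-position pre-filter of steps S62 and S63 can be sketched as follows; this is a minimal illustration under assumed data layouts (the function name and the `position` field are not from the patent).

```python
import math

def narrow_candidates(sensors, desired_center, desired_radius, margin):
    """Cheap pre-filter corresponding to steps S62-S63: keep only sensors
    whose installation position lies within the user's desired area plus a
    margin L, before running the costly overlap computation."""
    limit = desired_radius + margin
    return [s for s in sensors
            if math.dist(s["position"], desired_center) <= limit]
```

As the text notes, the margin L should be set according to the network size and the sensors' detectable distances, and the whole pre-filter can be skipped for small networks.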
- the matching processing unit 23 performs the following processing for each of the sensors selected as candidates in step S63.
- the matching processing unit 23 reads the area ID of the sensor from the storage unit 22 (step S64), and, based on the sensor's area ID and the area ID obtained in step S61, calculates the degree of overlap between the sensing target area of the sensor and the sensing area desired by the user (step S65).
- FIG. 7 shows an example of the calculation process of the overlapping degree in step S65.
- the area designated by the user (the sensing desired area) is called “area 1”, and the sensing target area of each sensor is called “area 2”.
- the matching processing unit 23 sets a reference points, laid out at equal intervals so as to exhaustively cover area 1 (step S70).
- the number a of reference points may be set as appropriate according to the type (resolution) of the sensor, the size of the area, and the like.
- the matching processing unit 23 determines whether each reference point is also included in area 2, and counts the number b of points included in both areas 1 and 2 (steps S71 to S73).
- the matching processing unit 23 then calculates the ratio of reference points included in both areas 1 and 2, that is, the value b / a, as the degree of overlap of the sensor's sensing target area with the desired sensing area (step S74).
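The reference-point sampling of steps S70 to S74 can be sketched as follows, assuming for simplicity two-dimensional areas given as point-membership predicates (an illustration, not the patent's implementation; the grid layout and parameter names are assumptions).

```python
def overlap_degree(in_area1, in_area2, bbox, n=20):
    """Steps S70-S74: lay an n x n grid of reference points over area 1's
    bounding box, count the a points inside area 1 and, of those, the b
    points also inside area 2; the degree of overlap is b / a."""
    (x0, y0), (x1, y1) = bbox
    a = b = 0
    for i in range(n):
        for j in range(n):
            p = (x0 + (i + 0.5) * (x1 - x0) / n,
                 y0 + (j + 0.5) * (y1 - y0) / n)
            if in_area1(p):          # reference point belongs to area 1
                a += 1
                if in_area2(p):      # ... and is also covered by area 2
                    b += 1
    return b / a if a else 0.0
```

Per the embodiment, a sensor whose returned value is at least the threshold (0.5 in this example) would then be flagged for the search results (step S67).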
- the matching processing unit 23 determines whether or not the degree of overlap is equal to or greater than a predetermined threshold, and if it is equal to or greater than the threshold, sets a flag on the sensor (step S67).
- although the threshold value can be set arbitrarily, in the present embodiment it is set to 0.5 as an example (that is, sensors covering 50% or more of the desired sensing area are extracted).
- steps S64 to S67 are executed for each of the sensors selected as candidates in step S63 (loop R1). Further, for a sensor having a plurality of area IDs (a sensor whose target area can be changed), the overlap-degree calculation and evaluation are executed for each area ID (loop R2).
- finally, the search result creation unit 24 reads the information of the flagged sensors from the storage unit 22, creates a sensor list, and transmits it to the user terminal 3 (step S68).
- information on each extracted sensor is described in the sensor list.
- for a sensor that requires control to change its target area, the control parameters for that purpose are also described in the sensor list.
- since the "area ID" is registered as one piece of information about each sensor, the sensing target areas of all sensors can be managed in a unified manner.
- since this area ID is defined so that the target area can be uniquely identified, unlike conventional systems it can represent the area that is actually the target of state detection, not the sensor installation position. Therefore, it is possible to search with high accuracy for a sensor that covers the area desired by the user.
- since sensors are extracted by evaluating the degree of overlap between the area desired by the user and the sensing target area of each sensor, sensors that are highly likely to match the user's wishes can be presented as search results.
- the overall processing load can be reduced and the search time shortened by first roughly narrowing down candidates based on installation position, which requires little computation.
- for a sensor that can be controlled so as to sense the desired area, information on the sensor's control parameters is also presented to the user, so the user can directly or indirectly control the sensor to sense the desired area; the convenience of the system is thereby improved.
- the above-described embodiment shows one specific example of the present invention, and is not intended to limit the scope of the present invention to these specific examples.
- in the above embodiment, an example in which the present invention is applied to a sensor network system was described.
- however, the present invention can also be applied to a device network system including devices other than sensors (for example, actuators, controllers, robots, lighting, digital signage, displays, etc.). These devices pose the same problems as sensors in that they act on a predetermined target area and that the target area does not necessarily match the installation position of the device.
- in the above embodiment, the overlap is evaluated based on the ratio of reference points included in both areas 1 and 2, but the overlap between the two areas may be evaluated using another algorithm. Alternatively, instead of overlap, the inclusion relationship (whether the sensing target area of a sensor includes the user's desired area) may be evaluated.
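The inclusion-relationship variant mentioned here could be sketched as follows (illustrative only; the function name and predicate style are assumptions):

```python
def includes_desired_area(in_sensor_area, reference_points):
    """Inclusion-relationship variant: true only if every reference point
    of the user's desired area lies inside the sensor's target area."""
    return all(in_sensor_area(p) for p in reference_points)
```

This is simply the boundary case of the graded overlap evaluation, succeeding only when the degree of overlap would be 1.0.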
- 1: Sensor network, 2: Sensor management device, 3: User terminal, 4: Sensor provider's terminal, 10: Sensor, 20: Search request acquisition unit, 21: Area ID determination unit, 22: Storage unit, 23: Matching processing unit, 24: Search result creation unit, 25: Sensor registration unit
Abstract
Description
With reference to FIG. 1, a configuration example of a sensor network system will be described as one embodiment of the device network system according to the present invention.
An example of sensor registration processing will be described with reference to FIGS. 2 to 4. FIG. 2 is a flowchart showing the flow of sensor registration processing executed by the sensor management device 2, FIG. 3 is a diagram schematically showing the sensing target area of a sensor, and FIG. 4 is a diagram schematically showing an example of the data structure of an area ID. Here, a surveillance camera having a PTZ (pan/tilt/zoom) function is taken as an example of a sensor.
Next, the sensor search service provided by the sensor management device 2 will be described. The sensor search service is a service for making it easier for a user to find, from among the many sensors 10 constituting the sensor network 1, the sensor best suited to achieving his or her purpose. First, a concrete usage example of the search service is described with reference to FIG. 5.
FIG. 6 shows the flow of sensor search processing by the sensor management device 2.
According to the configuration of the present embodiment described above, registering an "area ID" as one piece of information about each sensor makes it possible to manage the sensing target areas of all sensors in a unified manner. Moreover, since this area ID is defined so as to uniquely identify the target area, unlike conventional systems it can represent the area that is actually subject to state detection rather than the sensor installation position. It is therefore possible to search with high accuracy for a sensor that covers the area the user desires.
2: Sensor management device
3: User terminal
4: Sensor provider's terminal
10: Sensor
20: Search request acquisition unit
21: Area ID determination unit
22: Storage unit
23: Matching processing unit
24: Search result creation unit
25: Sensor registration unit
Claims (8)
- A device management apparatus for managing a plurality of devices existing at different locations, comprising: a storage unit in which information on each device is registered; an acquisition unit that acquires a device search request from a user; a matching processing unit that extracts devices matching a search condition by matching the search condition included in the device search request against the information on each device registered in the storage unit; and a search result presentation unit that presents the extraction result of the matching processing unit to the user, wherein an area ID capable of uniquely identifying the target area of a device is registered in the storage unit as information about that device, the device search request includes, as the search condition, an area condition for specifying the area on which the user desires a device to act, and the matching processing unit determines the devices to be extracted by comparing the target area specified by each device's area ID with the area condition included in the device search request.
- The device management apparatus according to claim 1, wherein the matching processing unit calculates the degree of overlap between the target area specified by an area ID and the area specified by the area condition, and determines the devices to be extracted based on the magnitude of the degree of overlap.
- The device management apparatus according to claim 2, wherein position information representing the installation position of a device is registered in the storage unit as information about the device, and the matching processing unit narrows down candidate devices based on installation position and then executes the extraction based on the magnitude of the degree of overlap on the narrowed-down candidates.
- The device management apparatus according to any one of claims 1 to 3, wherein, for a device having a plurality of target areas, a plurality of area IDs respectively corresponding to the plurality of target areas are registered in the storage unit.
- The device management apparatus according to claim 4, wherein, for a device whose target area the user can control, a control parameter used for changing to the corresponding target area is registered in the storage unit in association with each of the plurality of area IDs.
- The device management apparatus according to claim 5, wherein, when the matching processing unit extracts a device that requires control to change its target area, the search result presentation unit presents to the user, along with the extraction result, information about the control parameters for that device.
- A device search method for searching, from among a plurality of devices existing at different locations, for a device that satisfies requested conditions, wherein a computer comprising a storage unit in which information on each device is registered executes: a step of acquiring a device search request from a user; a step of extracting devices matching a search condition by matching the search condition included in the device search request against the information on each device registered in the storage unit; and a step of presenting the extraction result of the matching to the user, wherein an area ID capable of uniquely identifying the target area of a device is registered in the storage unit as information about that device, the device search request includes, as the search condition, an area condition for specifying the area on which the user desires a device to act, and in the matching, the devices to be extracted are determined by comparing the target area specified by each device's area ID with the area condition included in the device search request.
- A program for causing a computer to execute each step of the device search method according to claim 7.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014510081A JP5716866B2 (ja) | 2012-04-12 | 2013-03-08 | Device management apparatus and device search method |
US14/391,783 US9898539B2 (en) | 2012-04-12 | 2013-03-08 | Device management apparatus and device search method |
EP13775148.3A EP2838034A4 (en) | 2012-04-12 | 2013-03-08 | DEVICE FOR MANAGING DEVICES AND DEVICE SEARCH METHODS |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-091022 | 2012-04-12 | ||
JP2012091022 | 2012-04-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013153890A1 (ja) | 2013-10-17 |
Family
ID=49327464
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/056400 WO2013153890A1 (ja) | 2012-04-12 | 2013-03-08 | Device management apparatus and device search method |
Country Status (4)
Country | Link |
---|---|
US (1) | US9898539B2 (ja) |
EP (1) | EP2838034A4 (ja) |
JP (1) | JP5716866B2 (ja) |
WO (1) | WO2013153890A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015128954A1 * | 2014-02-26 | 2015-09-03 | Omron Corporation | Device information providing system and device information providing method |
JP2016038703A (ja) * | 2014-08-07 | 2016-03-22 | NTT Communications Corporation | Sensor control device, sensor control method, and sensor control program |
JP2017016306A (ja) * | 2015-06-30 | 2017-01-19 | Omron Corporation | Data flow control device and data flow control method |
WO2020116610A1 (ja) * | 2018-12-06 | 2020-06-11 | NTT Communications Corporation | Data search device, data search method and program therefor, and edge server and program therefor |
WO2020116611A1 (ja) * | 2018-12-06 | 2020-06-11 | NTT Communications Corporation | Data search device, data search method and program therefor, and edge server and program therefor |
US12019911B2 (en) | 2018-12-06 | 2024-06-25 | Ntt Communications Corporation | Storage management apparatus, method and program |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016115155A (ja) * | 2014-12-15 | 2016-06-23 | Ricoh Co., Ltd. | Device management apparatus, device management system, response instruction method, and program |
JP6406341B2 (ja) * | 2016-12-15 | 2018-10-17 | Omron Corporation | Sensing data distribution system, apparatus, and program therefor |
JP6390692B2 (ja) * | 2016-12-15 | 2018-09-19 | Omron Corporation | Data distribution system, instruction device, data distribution device, sensor management device, data distribution method, and program |
CN108874753B (zh) * | 2018-06-13 | 2022-05-10 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method, apparatus, and computer device for searching replies to a topic post |
US20200272138A1 (en) * | 2019-02-22 | 2020-08-27 | Rockwell Automation Technologies, Inc. | Selection of industrial sensors on objects and their features |
US20220164231A1 (en) * | 2019-07-29 | 2022-05-26 | Hewlett-Packard Development Company, L.P. | Determine specific devices from follow-up questions |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007300571A (ja) | 2006-05-08 | 2007-11-15 | Hitachi Ltd | Sensor network system and sensor network location specifying program |
JP2007300572A (ja) | 2006-05-08 | 2007-11-15 | Hitachi Ltd | Sensor network system and sensor network location specifying method |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5724040A (en) * | 1995-06-23 | 1998-03-03 | Northrop Grumman Corporation | Aircraft wake vortex hazard warning apparatus |
US20020164962A1 (en) * | 2000-07-18 | 2002-11-07 | Mankins Matt W. D. | Apparatuses, methods, and computer programs for displaying information on mobile units, with reporting by, and control of, such units |
US7685224B2 (en) * | 2001-01-11 | 2010-03-23 | Truelocal Inc. | Method for providing an attribute bounded network of computers |
CN1417755A (zh) * | 2002-11-18 | 2003-05-14 | Feng Lumin | Intelligent transportation system architecture with complete functions and a simple structure |
CN1934598B (zh) * | 2004-12-24 | 2011-07-27 | Matsushita Electric Industrial Co., Ltd. | Sensing device, search device, and relay device |
US7912628B2 (en) * | 2006-03-03 | 2011-03-22 | Inrix, Inc. | Determining road traffic conditions using data from multiple data sources |
JP4654163B2 (ja) * | 2006-07-14 | 2011-03-16 | Hitachi Automotive Systems, Ltd. | Vehicle surrounding environment recognition apparatus and system |
JP4891742B2 (ja) * | 2006-11-27 | 2012-03-07 | Hitachi, Ltd. | Information processing system and information processing method |
US20080295171A1 (en) * | 2007-05-23 | 2008-11-27 | Honeywell International Inc. | Intrusion Detection System For Wireless Networks |
US7502619B1 (en) * | 2008-01-22 | 2009-03-10 | Katz Daniel A | Location determination of low power wireless devices over a wide area |
EP2131292A1 (en) | 2008-06-06 | 2009-12-09 | NTT DoCoMo, Inc. | Method and apparatus for searching a plurality of realtime sensors |
JP5347742B2 (ja) * | 2009-06-12 | 2013-11-20 | Brother Industries, Ltd. | Visitor management system and control program for visitor management system |
US8644273B2 (en) * | 2009-07-01 | 2014-02-04 | Apple Inc. | Methods and apparatus for optimization of femtocell network management |
US20120246214A1 (en) * | 2009-11-02 | 2012-09-27 | Hitachi, Ltd. | Method for supporting service setting |
GB201012178D0 (en) * | 2010-07-20 | 2010-09-01 | Telemactics Technology Llp | Indicia identifying system |
CN103026395A (zh) * | 2010-11-15 | 2013-04-03 | Image Sensing Systems, Inc. | Hybrid traffic sensor system and associated method |
2013
- 2013-03-08 JP JP2014510081A patent/JP5716866B2/ja active Active
- 2013-03-08 EP EP13775148.3A patent/EP2838034A4/en not_active Ceased
- 2013-03-08 US US14/391,783 patent/US9898539B2/en active Active
- 2013-03-08 WO PCT/JP2013/056400 patent/WO2013153890A1/ja active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007300571A (ja) | 2006-05-08 | 2007-11-15 | Hitachi Ltd | Sensor network system and sensor network location specifying program |
JP2007300572A (ja) | 2006-05-08 | 2007-11-15 | Hitachi Ltd | Sensor network system and sensor network location specifying method |
Non-Patent Citations (4)
Title |
---|
See also references of EP2838034A4 * |
TOMOHIRO NAGATA ET AL.: "A Study on Network Virtualization for Ubiquitous Environment", IEICE TECHNICAL REPORT, vol. 107, 28 February 2008 (2008-02-28), pages 129 - 134, XP008174797 * |
YOICHI YAMADA ET AL.: "Ubiquitous Sensor Network ni Okeru Eizo Kanshi System", OKI TECHNICAL REVIEW, vol. 72, no. 4, 1 October 2005 (2005-10-01), pages 44 - 47, XP008174799 * |
YUKI FUJISAKI ET AL.: "A Cooperative Storage System with Wireless Ad-hoc Networking for Wireless Sensor Networks", IPSJ SIG NOTES, vol. 2008, no. 18, 27 February 2008 (2008-02-27), pages 149 - 156, XP008174803 * |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015128954A1 (ja) * | 2014-02-26 | 2015-09-03 | Omron Corporation | Device information providing system and device information providing method |
JP5822050B1 (ja) * | 2014-02-26 | 2015-11-24 | Omron Corporation | Device information providing system and device information providing method |
EP2940601A4 (en) * | 2014-02-26 | 2016-10-12 | Omron Tateisi Electronics Co | SYSTEM AND METHOD FOR PROVIDING DEVICE INFORMATION |
US9679032B2 (en) | 2014-02-26 | 2017-06-13 | Omron Corporation | Device information providing system and device information providing method |
JP2016038703A (ja) * | 2014-08-07 | 2016-03-22 | NTT Communications Corporation | Sensor control apparatus, sensor control method, and sensor control program |
JP2017016306A (ja) * | 2015-06-30 | 2017-01-19 | Omron Corporation | Data flow control apparatus and data flow control method |
WO2020116610A1 (ja) * | 2018-12-06 | 2020-06-11 | NTT Communications Corporation | Data search apparatus, data search method and program therefor, and edge server and program therefor |
JP2020091707A (ja) * | 2018-12-06 | 2020-06-11 | NTT Communications Corporation | Data search apparatus, data search method and program therefor, and edge server and program therefor |
WO2020116611A1 (ja) * | 2018-12-06 | 2020-06-11 | NTT Communications Corporation | Data search apparatus, data search method and program therefor, and edge server and program therefor |
JP2020091705A (ja) * | 2018-12-06 | 2020-06-11 | NTT Communications Corporation | Data search apparatus, data search method and program therefor, and edge server and program therefor |
JP7150585B2 (ja) | Data search apparatus, data search method and program therefor, and edge server and program therefor |
JP7150584B2 (ja) | Edge server and program therefor |
US11695832B2 (en) | 2018-12-06 | 2023-07-04 | Ntt Communications Corporation | Data search apparatus, and data search method and program thereof, and edge server and program thereof |
US11886520B2 (en) | 2018-12-06 | 2024-01-30 | Ntt Communications Corporation | Data search apparatus, and data search method and program thereof, and edge server and program thereof |
US12019911B2 (en) | 2018-12-06 | 2024-06-25 | Ntt Communications Corporation | Storage management apparatus, method and program |
Also Published As
Publication number | Publication date |
---|---|
US9898539B2 (en) | 2018-02-20 |
JP5716866B2 (ja) | 2015-05-13 |
EP2838034A1 (en) | 2015-02-18 |
US20150142848A1 (en) | 2015-05-21 |
EP2838034A4 (en) | 2015-12-09 |
JPWO2013153890A1 (ja) | 2015-12-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5716866B2 (ja) | Device management apparatus and device search method | |
JP6208654B2 (ja) | Method and system for pushing point-of-interest information | |
US10896347B2 (en) | Dataflow control apparatus and dataflow control method for metadata matching and device extraction | |
US10244359B2 (en) | Venue data framework | |
JP5534007B2 (ja) | Feature point detection system, feature point detection method, and program | |
US20130272569A1 (en) | Target identification system target identification server and target identification terminal | |
EP2902913A1 (en) | Device management apparatus and device search method | |
WO2015128954A1 (ja) | Device information providing system and device information providing method | |
CN107251049A (zh) | Detecting the location of a mobile device based on semantic indications | |
CN107193820B (zh) | Location information acquisition method, apparatus, and device | |
JP2013156912A (ja) | Equipment maintenance support system and equipment maintenance server | |
EP1976324B1 (en) | Search system, management server, mobile communication device, search method, and program | |
Shangguan et al. | Towards accurate object localization with smartphones | |
Wang et al. | iNavigation: an image based indoor navigation system | |
US10360266B2 (en) | Data-flow control device and data-flow control method | |
KR20200037168A (ko) | Method and system for detecting changes in a region of interest | |
Shahid et al. | Indoor positioning:“an image-based crowdsource machine learning approach” | |
US10264420B2 (en) | System and method for providing a descriptive location of a user device | |
KR20190029412A (ko) | Method for providing offline store information over a network and management server used therefor | |
KR102514000B1 (ko) | Use of image sensors to query the real world for georeference information | |
US20230370570A1 (en) | Data transmission device, data transmission method, information processing device, information processing method, and program | |
JP2021021288A (ja) | Road damage information management system | |
KR102112715B1 (ko) | Positioning method and system for a mobile computing device | |
EP3432593B1 (en) | Data-flow control device and data-flow control method | |
Mizan et al. | Indoor Positioning and Navigation Using Bluetooth Low Energy and Cloud Service in Healthcare Perspective |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13775148; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2014510081; Country of ref document: JP; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 14391783; Country of ref document: US |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | WWE | Wipo information: entry into national phase | Ref document number: 2013775148; Country of ref document: EP |