US10952041B2 - Control device and method for processing a service using a plurality of sensor devices in a distributed manner while suppressing a communication load - Google Patents


Info

Publication number
US10952041B2
US16/477,740 (US201716477740A), granted as US10952041B2
Authority
US
United States
Prior art keywords
information
sensing
edge server
sensor device
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US16/477,740
Other languages
English (en)
Other versions
US20190364399A1 (en)
Inventor
Sho FURUICHI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Assigned to SONY CORPORATION reassignment SONY CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: FURUICHI, Sho
Publication of US20190364399A1 publication Critical patent/US20190364399A1/en
Application granted granted Critical
Publication of US10952041B2 publication Critical patent/US10952041B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04Q: SELECTING
    • H04Q 9/00: Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30: Services specially adapted for particular environments, situations or purposes
    • H04W 4/38: Services specially adapted for particular environments, situations or purposes for collecting sensor information
    • H04W 36/00: Hand-off or reselection arrangements
    • H04W 36/16: Performing reselection for specific purposes
    • H04W 8/00: Network data management
    • H04W 8/18: Processing of user or subscriber data, e.g. subscribed services, user preferences or user profiles; Transfer of user or subscriber data
    • H04W 92/00: Interfaces specially adapted for wireless communication networks
    • H04W 92/16: Interfaces between hierarchically similar devices
    • H04Q 2209/00: Arrangements in telecontrol or telemetry systems
    • H04Q 2209/20: Arrangements in telecontrol or telemetry systems using a distributed architecture
    • H04Q 2209/40: Arrangements in telecontrol or telemetry systems using a wireless architecture
    • H04W 4/20: Services signaling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel
    • H04W 4/21: Services signaling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel for social networking applications
    • H04W 4/70: Services for machine-to-machine communication [M2M] or machine type communication [MTC]
    • H04W 92/24: Interfaces between hierarchically similar devices between backbone network devices

Definitions

  • the present disclosure relates to a control device and a method.
  • IoT internet of things
  • 3GPP 3rd generation partnership project
  • MTC machine type communication
  • NB-IoT narrow band IoT
  • Patent Document 1 discloses a monitoring camera system that integrates sensing information obtained by a plurality of sensor devices into one server on a network and processes, for example, the sensing information.
  • The system of Patent Document 1 performs cooperative operation of a plurality of monitoring cameras on the basis of control by one server. Therefore, the processing load and the communication load on the server may become excessive as the sensable range is expanded and the number of monitoring cameras increases, and as the accuracy of the monitoring cameras improves and the amount of data to be transmitted increases.
  • the present disclosure proposes a mechanism capable of processing a service using a plurality of sensor devices in a distributed manner while suppressing a communication load and providing the service.
  • control device arranged on a local network side with respect to a gateway between an internet and a local network, the control device including a control unit configured to transmit or receive information regarding one or more sensor devices wirelessly connected to a first cell associated with the control device to or from another control device associated with a second cell.
  • a method executed by a control device arranged on a local network side with respect to a gateway between an internet and a local network, the method including transmitting or receiving information regarding one or more sensor devices wirelessly connected to a first cell associated with the control device to or from another control device associated with a second cell.
  • the control device is arranged on the local network side. Therefore, a load regarding communication between the control device and the sensor device can be suppressed, as compared with a case where the control device is arranged on the internet side. Furthermore, according to the present disclosure, the information regarding the sensor devices is transmitted or received between the control devices.
  • information processing based on the information regarding the sensor devices can be performed by the control devices in a distributed manner.
  • a mechanism capable of processing a service using a plurality of sensor devices in a distributed manner while suppressing a communication load and providing the service is provided.
  • the above-described effect is not necessarily limited, and any of effects described in the present specification or other effects that can be grasped from the present specification may be exerted in addition to or in place of the above-described effect.
  • FIG. 1 is a diagram for describing an outline of a system according to an embodiment of the present disclosure.
  • FIG. 2 is a diagram for describing an outline of a system according to a comparative example.
  • FIG. 3 is a diagram illustrating an example of a schematic configuration of a system according to the present embodiment.
  • FIG. 4 is a block diagram illustrating an example of a configuration of a global server according to the present embodiment.
  • FIG. 5 is a block diagram illustrating an example of a configuration of an edge server according to the present embodiment.
  • FIG. 6 is a block diagram illustrating an example of a configuration of a sensor device according to the present embodiment.
  • FIG. 7 is a block diagram illustrating an example of a configuration of a user device according to the present embodiment.
  • FIG. 8 is a diagram for describing a logical functional configuration of a system according to the present embodiment.
  • FIG. 9 is a diagram illustrating an example of a configuration of a reference point A of the system according to the present embodiment.
  • FIG. 10 is a diagram illustrating an example of a configuration of a reference point B of the system according to the present embodiment.
  • FIG. 11 is a diagram illustrating an example of a configuration of a reference point C of the system according to the present embodiment.
  • FIG. 12 is a diagram illustrating an example of a configuration of a reference point D of the system according to the present embodiment.
  • FIG. 13 is a sequence diagram illustrating an example of a flow of service providing processing executed in the system according to the present embodiment.
  • FIG. 14 is a diagram for describing an outline of a first scenario.
  • FIG. 15 is a sequence diagram illustrating an example of a flow of service providing processing in the first scenario executed in the system according to the present embodiment.
  • FIG. 16 is a sequence diagram illustrating an example of a flow of the service providing processing in the first scenario executed in the system according to the present embodiment.
  • FIG. 17 is a diagram for describing an outline of a second scenario.
  • FIG. 18 is a sequence diagram illustrating an example of a flow of service providing processing in the second scenario executed in the system according to the present embodiment.
  • FIG. 19 is a sequence diagram illustrating an example of a flow of the service providing processing in the second scenario executed in the system according to the present embodiment.
  • FIG. 20 is a diagram for describing an outline of a third scenario.
  • FIG. 21 is a sequence diagram illustrating an example of a flow of service providing processing in the third scenario executed in the system according to the present embodiment.
  • FIG. 22 is a sequence diagram illustrating an example of a flow of inquiry processing to the global server 10 executed in the system according to the present embodiment.
  • FIG. 23 is a diagram for describing another implementation example of a reference model according to the present embodiment.
  • FIG. 24 is a diagram for describing another implementation example of a reference model according to the present embodiment.
  • FIG. 25 is a diagram for describing a first application example provided by the system according to the present embodiment.
  • FIG. 26 is a diagram for describing a second application example provided by the system according to the present embodiment.
  • FIG. 27 is a sequence diagram illustrating an example of a flow of service providing processing in the second application example provided by the system according to the present embodiment.
  • FIG. 28 is a block diagram illustrating an example of a hardware configuration of an information processing apparatus according to the present embodiment.
  • elements having substantially the same functional configuration may be distinguished by appending different alphabets to the same reference numeral.
  • a plurality of elements having substantially the same functional configuration is distinguished as the edge servers 20 A and 20 B as necessary.
  • the edge servers 20 A and 20 B are also simply referred to as edge servers 20 .
  • FIG. 1 is a diagram for describing an outline of a system according to an embodiment of the present disclosure.
  • a system 1 according to the present embodiment includes a global server 10 , a plurality of edge servers 20 ( 20 A and 20 B), a plurality of sensor devices 30 ( 30 A to 30 D) connected to the edge server 20 , and a user device 40 .
  • the global server 10 is a device that mediates exchange of information between the plurality of edge servers 20 .
  • the global server 10 may be installed on the Internet 500 .
  • the edge server 20 is a control device arranged on a side of a local network with respect to a gateway between the Internet 500 and the local network.
  • the local network may be, for example, a core network in cellular communication, and the gateway may be a packet data network-gateway (P-GW) in long term evolution (LTE).
  • P-GW packet data network-gateway
  • LTE long term evolution
  • the local network may be, for example, a local area network (LAN), and the gateway may be a gateway of a LAN and a wide area network (WAN).
  • One or more access points 200 are associated with the edge server 20 .
  • one or more cells are associated with the edge server 20 .
  • the edge server 20 manages one or more sensor devices 30 wirelessly connected to the associated cell.
  • the edge server 20 causes the sensor device 30 under control to perform sensing and collects and processes sensing information from the sensor device 30 under control to provide a service to a user 400 .
  • Sensable ranges of one or more sensor devices 30 under control are collectively referred to as a management range of the edge server 20 .
  • different edge servers 20 are associated with different access points 200 .
  • the edge server 20 may be included in or provided side by side with the access point 200 , for example.
  • Furthermore, the access point 200 may include a main body (also referred to as a base station device) that controls wireless communication and one or more remote radio heads (RRHs) arranged at a location different from the main body.
  • the edge server 20 may be included in or provided side by side with the main body.
  • the access point 200 operates a cell and provides a wireless service to one or more terminal devices located inside the cell.
  • an access point 200 A provides a wireless service to each of sensor devices 30 A and 30 B located inside a cell 201 A
  • an access point 200 B provides a wireless service to each of sensor devices 30 C and 30 D located inside a cell 201 B.
  • the cells can be operated according to any wireless communication scheme such as LTE, 5G new radio (NR), or a wireless LAN.
  • the sensor device 30 performs sensing to acquire sensing information.
  • the sensor device 30 transmits the obtained sensing information to the edge server 20 that manages the sensor device 30 .
  • the sensor device 30 may perform sensing regarding a specific sensing target 300 on the basis of control by the edge server 20 .
  • An area or space that can be sensed by the sensor device 30 is also referred to as a sensable range 31 ( 31 A to 31 D).
  • the sensable ranges 31 may have overlapping portions. Furthermore, the shape of the sensable range 31 may be different depending on a type of sensor.
  • the user device 40 is a terminal device operated by the user 400 .
  • the user device 40 performs processing for receiving provision of a service desired by the user 400 .
  • the user device 40 requests the edge server 20 to provide a desired service and receives the service.
  • the user device 40 may be connected to the edge server 20 via the Internet 500 or may be connected to the edge server 20 associated with the access point 200 via the access point 200 .
  • the system 1 provides the user 400 with a sensing service regarding the sensing target 300 .
  • the system 1 continuously captures the sensing target 300 with the sensor device 30 around the sensing target 300 and delivers a captured video to the user device 40 in real time.
  • the system 1 typically provides a subscription service.
  • the subscription service is a type of service in which the user pays for a provided service.
  • the user 400 signs a service contract in advance with a service provider and receives provision of a service that requires processing by the edge server 20 and sensing and the like by the sensor device 30 , and pays for the provided service.
  • the edge server 20 that collects the sensing information and provides a service is arranged in the local network. Since the distance between the sensor device 30 and the edge server 20 is short, the system 1 can provide the service without a delay.
  • the sensing target 300 does not necessarily stay in the sensable range 31 of one sensor device 30 ; it may move beyond the sensable range 31 and further beyond the management range of the edge server 20 . Therefore, it is desirable to provide a mechanism for enabling continuous provision of a service without a delay even if such movement occurs.
  • a configuration illustrated in FIG. 2 can also be considered as a system for providing a sensing service regarding the sensing target 300 .
  • FIG. 2 is a diagram for describing an outline of a system according to a comparative example.
  • the system according to the comparative example does not include the edge server 20 illustrated in FIG. 1 . Therefore, in the system according to the comparative example, a service provision server 600 on the Internet 500 collectively collects the sensing information sensed by the sensor devices 30 and provides a service.
  • congestion may occur in the communication line when a large number of sensor devices 30 simultaneously provide the sensing information to the service provision server 600 .
  • the information is provided to the user device 40 in a delayed manner.
  • information to be provided to the user 400 can also dynamically change.
  • in a case where the information is provided to the user device 40 in a delayed manner due to the congestion of the communication line, the information provided to the user device 40 may no longer be valuable for the user 400 .
  • the sensable range of the sensor device 30 and the coverage (in other words, the cell) of the access point 200 to which the sensor device 30 is wirelessly connected are restrictive. Due to such spatial limitations, the following situations can occur as obstacles to the continuous provision of the information provision service based on the sensing information.
  • the user device moves beyond the sensable range of the sensor device.
  • the user device moves beyond the coverage of the access point.
  • the user device requests information based on sensing information obtained outside the sensable range of the sensor device and the coverage of the access point.
  • the sensor device moves beyond the coverage of the access point.
  • the sensing target moves beyond the sensable range of the sensor device.
  • the sensing target moves beyond the coverage of the access point.
  • the present disclosure overcomes the above-described obstacles and provides a mechanism capable of continuously providing a consumer with an information provision service based on sensing information without a delay.
  • FIG. 3 is a diagram illustrating an example of a schematic configuration of the system 1 according to the present embodiment.
  • the system 1 according to the present embodiment includes the global server 10 , the edge servers 20 ( 20 A and 20 B), the sensor device 30 , and the user device 40 . These devices are also referred to as entities.
  • Contact points between the entities are also referred to as reference points.
  • the contact point between the edge server 20 and the user device 40 is also referred to as a reference point A.
  • the contact point between the edge server 20 and the sensor device 30 is also referred to as a reference point B.
  • the contact point between the edge servers 20 is also referred to as a reference point C.
  • the contact point between the global server 10 and the edge server 20 is also referred to as a reference point D.
  • Each entity is defined by a functional role and the reference point.
  • a device configuration of each entity will be described.
  • the global server 10 is an entity that manages and controls one or more edge servers 20 .
  • an example of a configuration of the global server 10 will be described with reference to FIG. 4 .
  • FIG. 4 is a block diagram illustrating an example of a configuration of the global server 10 according to the present embodiment.
  • the global server 10 includes a network communication unit 110 , a storage unit 120 , and a control unit 130 .
  • the network communication unit 110 transmits and receives information.
  • the network communication unit 110 transmits information to other nodes and receives information from other nodes.
  • the other nodes include the edge server 20 .
  • the storage unit 120 temporarily or permanently stores programs and various data for the operation of the global server 10 .
  • the control unit 130 provides various functions of the global server 10 .
  • the control unit 130 performs processing of providing the edge server 20 with information regarding other edge servers 20 present in its periphery, and of providing information on another edge server 20 in response to a request from the edge server 20 .
  • the edge server 20 is an entity that collects the sensing information from the sensor device 30 and provides a service to the user device 40 .
  • an example of a configuration of the edge server 20 will be described with reference to FIG. 5 .
  • FIG. 5 is a block diagram illustrating an example of a configuration of the edge server 20 according to the present embodiment.
  • the edge server 20 includes a network communication unit 210 , a storage unit 220 , and a control unit 230 .
  • the network communication unit 210 transmits and receives information.
  • the network communication unit 210 transmits information to other nodes and receives information from other nodes.
  • the other nodes include the global server 10 , another edge server 20 , the sensor device 30 , and the user device 40 .
  • the storage unit 220 temporarily or permanently stores programs and various data for the operation of the edge server 20 .
  • the control unit 230 provides various functions of the edge server 20 .
  • the control unit 230 acquires the sensing information from the sensor device 30 , processes, for example, the sensing information to generate service information, and provides the user device 40 with the generated service information.
  • the control unit 230 transmits or receives information regarding one or more sensor devices 30 under control of the edge server 20 to or from another edge server 20 to perform processing in cooperation with that edge server 20 .
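  • As a rough, non-authoritative sketch of the role of the control unit 230 described above, the following Python code models an edge server that accumulates sensing information, turns it into service information, and hands a sensing target over to a peer edge server; all class and method names are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class SensingReport:
    sensor_id: str
    target_id: str
    payload: bytes              # e.g. an encoded image frame


@dataclass
class EdgeServerControlUnit:
    server_id: str
    reports: Dict[str, List[SensingReport]] = field(default_factory=dict)

    def on_sensing_information(self, report: SensingReport) -> None:
        """Accumulate sensing information received from a managed sensor device."""
        self.reports.setdefault(report.target_id, []).append(report)

    def generate_service_information(self, target_id: str) -> bytes:
        """Process accumulated sensing information into service information.

        Here the latest payload is forwarded as-is; a real system might
        transcode, annotate, or aggregate it instead.
        """
        return self.reports[target_id][-1].payload

    def handover_to_peer(self, peer: "EdgeServerControlUnit", target_id: str) -> None:
        """Send information regarding a sensing target to another edge server
        (reference point C) so processing continues in a distributed manner."""
        for report in self.reports.pop(target_id, []):
            peer.on_sensing_information(report)
```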
  • the sensor device 30 is an entity that performs sensing and transmits the obtained sensing information to the edge server 20 .
  • an example of a configuration of the sensor device 30 will be described with reference to FIG. 6 .
  • FIG. 6 is a block diagram illustrating an example of a configuration of the sensor device 30 according to the present embodiment.
  • the sensor device 30 includes an antenna unit 310 , a wireless communication unit 320 , a sensor unit 330 , a storage unit 340 , and a control unit 350 .
  • the antenna unit 310 radiates a signal output from the wireless communication unit 320 into a space as a radio wave. Furthermore, the antenna unit 310 converts a radio wave in the space into a signal and outputs the signal to the wireless communication unit 320 .
  • the wireless communication unit 320 transmits and receives a signal. For example, the wireless communication unit 320 receives a signal from the access point and transmits a signal to the access point.
  • the sensor unit 330 performs sensing to acquire the sensing information.
  • the sensing is processing of acquiring information from an environment.
  • the sensor unit 330 can include various sensors.
  • the sensor unit 330 may include an electric field sensor and measure an electric field strength in a specific frequency band of an installation position of the sensor device 30 .
  • the sensor unit 330 may include a camera and capture an image (a still image or a moving image) with respect to an environment around the installation position of the sensor device 30 or a specific target.
  • the sensor unit 330 may include an environmental sensor such as a thermometer or a barometer and measure environmental information such as air temperature or atmospheric pressure.
  • the sensor unit 330 can include any other sensors.
  • the storage unit 340 temporarily or permanently stores programs and various data for the operation of the sensor device 30 .
  • the control unit 350 provides various functions of the sensor device 30 .
  • the control unit 350 sets parameters of the sensing on the basis of the control by the edge server 20 , acquires the sensing information, and reports the sensing information to the edge server 20 .
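  • The following minimal sketch illustrates the sensor-side control flow just described: apply sensing parameters received from the edge server, acquire sensing information, and report it back. The callback and method names are assumptions for illustration only.

```python
from typing import Callable, Dict


class SensorDeviceControlUnit:
    def __init__(self,
                 acquire: Callable[[Dict], bytes],
                 report: Callable[[bytes], None]) -> None:
        self.acquire = acquire      # wraps the sensor unit (camera, thermometer, ...)
        self.report = report        # wraps the wireless link to the edge server
        self.params: Dict = {}

    def on_parameter_setting_request(self, params: Dict) -> None:
        """Set sensing parameters (area, target, interval, ...) requested by the edge server."""
        self.params.update(params)

    def run_once(self) -> None:
        """Acquire sensing information with the current parameters and report it."""
        sensing_information = self.acquire(self.params)
        self.report(sensing_information)
```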
  • the user device 40 is an entity that receives provision of a service from the system 1 .
  • an example of a configuration of the user device 40 will be described with reference to FIG. 7 .
  • FIG. 7 is a block diagram illustrating an example of a configuration of the user device 40 according to the present embodiment.
  • the user device 40 includes an antenna unit 410 , a wireless communication unit 420 , an input unit 430 , an output unit 440 , a storage unit 450 , and a control unit 460 .
  • the antenna unit 410 radiates a signal output from the wireless communication unit 420 into a space as a radio wave. Furthermore, the antenna unit 410 converts a radio wave in the space into a signal and outputs the signal to the wireless communication unit 420 .
  • the wireless communication unit 420 transmits and receives a signal. For example, the wireless communication unit 420 receives a signal from the access point and transmits a signal to the access point.
  • the input unit 430 and the output unit 440 are interfaces with the user.
  • the input unit 430 receives an input from the user and obtains input information indicating the user input.
  • the input unit 430 may include a physical key such as a keyboard or may include a sound input device for sound recognition.
  • the output unit 440 outputs information to the user.
  • the output unit 440 may include a display device, a sound output device, a bone conduction output device, a vibration device, or the like.
  • the input unit 430 and the output unit 440 may perform user interaction via different devices using a wireless interface.
  • the storage unit 450 temporarily or permanently stores programs and various data for the operation of the user device 40 .
  • the control unit 460 provides various functions of the user device 40 .
  • the control unit 460 transmits information to the edge server 20 and performs output based on information received from the edge server 20 in order to receive provision of a service desired by the user.
  • the user device 40 may be, for example, a mobile phone, a smartphone, a tablet, a personal computer (PC), a wearable device, or the like.
  • the wearable device may be a wristband type, a wrist watch type, a headset type, a ring type, or the like.
  • FIG. 8 is a diagram for describing a logical functional configuration of the system 1 according to the present embodiment.
  • the system 1 is configured by six logic function entities (a database function 50 , a decision making function 60 , a sensing function 70 , a control function 80 , an interface function 90 , and a communication function 100 ), and five service access points (SAPs) (database (DB)-SAP 51 , decision making (DM)-SAP 61 , sensing (S)-SAP 71 , control (C)-SAP 81 , communication (Com)-SAP 101 ).
  • the database function 50 is a software or hardware module that stores the sensing information, a processing result of the sensing information, and information (auxiliary information) necessary for information generation.
  • the database function 50 is typically provided by the global server 10 .
  • the service access point of the database function 50 is DB-SAP 51 .
  • the DB-SAP 51 is used by the interface function 90 to access a service provided by the database function 50 , such as storage of the sensing information and the auxiliary information.
  • the decision making function 60 is a software or hardware module that processes the sensing information and generates information, for example.
  • the decision making function 60 is typically provided by the edge server 20 .
  • the service access point of the decision making function 60 is the DM-SAP 61 .
  • the DM-SAP 61 is used by the interface function 90 to access a service provided by the decision making function 60 , such as processing of the sensing information and the information generation.
  • the sensing function 70 is a software or hardware module that acquires the sensing information.
  • the sensing function 70 is typically provided by the sensor device 30 .
  • the service access point of the sensing function 70 is S-SAP 71 .
  • the S-SAP 71 is used by the interface function 90 to access a service provided by the sensing function 70 , such as acquisition of the sensing information.
  • the sensor device 30 using the S-SAP 71 acquires a command for setting parameters related to sensing from the edge server 20 and implements control of the sensor parameters.
  • the control function 80 is a software or hardware module that carries out control for transmitting the auxiliary information to the global server 10 or the edge server 20 and for receiving notification of a determination result from the edge server 20 .
  • the control function 80 is typically provided by the user device 40 .
  • the service access point of the control function 80 is C-SAP 81 .
  • the C-SAP 81 is mainly used by an application to access information of the global server 10 or the edge server 20 . For example, the user device 40 using the C-SAP 81 becomes a consumer of the information of the global server 10 or the edge server 20 or plays a role of application control.
  • the communication function 100 is a software or hardware module for providing a communication protocol stack required for an interface between logical function entities and other communication services.
  • the service access point of the communication function 100 is Com-SAP 101 .
  • the Com-SAP 101 exchanges and provides the sensing information, and exchanges the auxiliary information or other related information between the communication function 100 and the interface function 90 .
  • Com-SAP 101 has a role of abstracting a communication mechanism for use of the communication function 100 by the reference point, by defining a set of generic primitives and mapping these primitives to a transfer protocol.
  • the communication mechanism applied on implementation may be, for example, for the PHY/MAC layer, global system for mobile communications (GSM) (registered trademark), universal mobile telecommunications system (UMTS), long term evolution (LTE), new radio (NR) being studied for 5G or a later cellular system technology, a wireless local area network (LAN) standard formulated in the IEEE 802.11 working group (WG) (IEEE 802.11a, b, g, n, ac, ad, af, or ah) or a standard to be formulated in the future (IEEE 802.11ax, ay, or the like), or a standard of the IEEE 802.16 WG or the IEEE 802.15 WG.
  • GSM global system for mobile communications
  • UMTS universal mobile telecommunications system
  • LTE long term evolution
  • NR new radio
  • LAN wireless local area network
  • the interface function 90 is abstraction of integrity between functional blocks for realizing a reference access point between the global server 10 and the edge server 20 , between the edge servers 20 , between the edge server 20 and the sensor device 30 , or between the edge server 20 and the user device 40 .
  • all of the SAPs ( 51 , 61 , 71 , 81 , and 101 ) are the service access points.
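  • As one possible (assumed) way to model the logical architecture above, the service access points could be expressed as abstract interfaces that the interface function 90 uses to reach each logical function entity; the method names below are illustrative, not defined by the patent.

```python
from abc import ABC, abstractmethod


class DatabaseSAP(ABC):            # DB-SAP 51, typically on the global server
    @abstractmethod
    def store(self, key: str, value: bytes) -> None: ...


class DecisionMakingSAP(ABC):      # DM-SAP 61, typically on the edge server
    @abstractmethod
    def process(self, sensing_information: bytes) -> bytes: ...


class SensingSAP(ABC):             # S-SAP 71, typically on the sensor device
    @abstractmethod
    def set_parameters(self, params: dict) -> None: ...

    @abstractmethod
    def sense(self) -> bytes: ...


class ControlSAP(ABC):             # C-SAP 81, typically on the user device
    @abstractmethod
    def notify(self, service_information: bytes) -> None: ...


class CommunicationSAP(ABC):       # Com-SAP 101, maps generic primitives to a transfer protocol
    @abstractmethod
    def send(self, destination: str, primitive: str, payload: bytes) -> None: ...
```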
  • FIG. 9 is a diagram illustrating an example of a configuration of the reference point A of the system 1 according to the present embodiment. As illustrated in FIG. 9 , the decision making function 60 and the interface function 90 are mapped to the edge server 20 , and the control function 80 and the interface function 90 are mapped to the user device 40 . Then, the communication function 100 is mapped to the reference point A between the edge server 20 and the user device 40 .
  • the following information is transmitted from the user device 40 to the edge server 20 , for example.
  • the user device registration information is information registered in advance for the user device 40 , and includes, for example, a telephone number, an e-mail address, and the like of the user device 40 .
  • the service subscription information is information for receiving provision of a subscription service, and includes, for example, contract information indicating a contract between the user and the service provider.
  • the user device position information is information indicating a current position (latitude, longitude, altitude, and the like) of the user device 40 .
  • the user authentication information is authentication information for use of the subscription service, and includes identification information, a password, and the like of the user.
  • the service provision request is a message for requesting start of the provision of the subscription service.
  • the service provision request can include information indicating service information to be generated.
  • the service provision request can include information for identifying the sensing target.
  • the sensing request is a message for requesting execution of sensing.
  • the following information is transmitted from the edge server 20 to the user device 40 , for example.
  • the service information is information generated by processing the sensing information or the like in the edge server 20 in response to the service provision request, and includes, for example, a video of the sensing target and the like.
  • the sensing information includes the sensing target (for example, a person, an animal, a thing), a sensing result (for example, an electric field intensity, an image, a moving image, an air temperature), and the like.
  • the sensing information may be raw data or may be a result of an analysis of the raw data, for example.
  • the edge server access information is information for the user device 40 to be connected to the edge server 20 , and includes, for example, an address of the edge server 20 , and the like.
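  • The reference point A messages listed above might be represented, purely for illustration, as the following plain data structures; the field names are assumptions chosen to mirror the description.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ServiceProvisionRequest:        # user device -> edge server
    user_id: str
    target_description: str           # information for identifying the sensing target
    requested_information: str        # e.g. "live_video"


@dataclass
class ServiceInformation:             # edge server -> user device
    target_id: str
    payload: bytes                    # e.g. a video of the sensing target


@dataclass
class EdgeServerAccessInformation:    # edge server -> user device
    edge_server_address: str
    port: Optional[int] = None
```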
  • FIG. 10 is a diagram illustrating an example of a configuration of the reference point B of the system 1 according to the present embodiment. As illustrated in FIG. 10 , the decision making function 60 and the interface function 90 are mapped to the edge server 20 , and the sensing function 70 and the interface function 90 are mapped to the sensor device 30 . Then, the communication function 100 is mapped to the reference point B between the edge server 20 and the sensor device 30 .
  • the following information is transmitted from the sensor device 30 to the edge server 20 , for example.
  • the sensor registration information is information registered in advance for the sensor device 30 , and includes, for example, information indicating a charge for use of the sensor device 30 and the like.
  • the sensor position information is information indicating a current position (latitude, longitude, altitude, and the like) of the sensor device 30 .
  • the sensor operation parameter information is information indicating a parameter used for sensing currently performed by the sensor device 30 .
  • the sensor operation parameter information includes, for example, a sensing area (for example, a direction), the sensing target (for example, a person, an animal, or a thing), a sensing parameter (for example, a radio wave, an image, a moving image, temperature, or the like), and the like.
  • the following information is transmitted from the edge server 20 to the sensor device 30 , for example.
  • the sensing operation parameter setting request is a message for specifying a sensing operation parameter to be set.
  • the information regarding the sensing target includes, for example, identification information, characteristic information of an image (for example, a face image), a voiceprint, or the like, mobility, attribute information, and the like, of the sensing target.
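  • A hedged sketch of the reference point B exchange described above: the edge server sends a sensing operation parameter setting request together with information regarding the sensing target, and the sensor device applies it. Field and function names are illustrative assumptions.

```python
from dataclasses import dataclass


@dataclass
class SensingOperationParameterSettingRequest:   # edge server -> sensor device
    sensing_area: str        # e.g. a direction to point the camera in
    sensing_target: str      # e.g. "person"
    sensing_parameter: str   # e.g. "moving_image"


@dataclass
class SensingTargetInformation:                  # edge server -> sensor device
    target_id: str
    face_image: bytes        # characteristic information of an image


def apply_setting_request(sensor,
                          request: SensingOperationParameterSettingRequest,
                          target: SensingTargetInformation) -> None:
    """Translate the request into concrete sensor parameters (hypothetical sensor API)."""
    sensor.on_parameter_setting_request({
        "area": request.sensing_area,
        "target": target.target_id,
        "mode": request.sensing_parameter,
    })
```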
  • FIG. 11 is a diagram illustrating an example of a configuration of the reference point C of the system 1 according to the present embodiment. As illustrated in FIG. 11 , the decision making function 60 and the interface function 90 are mapped to the edge server 20 A, and similarly, the decision making function 60 and the interface function 90 are mapped to the edge server 20 B. Then, the communication function 100 is mapped to the reference point C between the edge server 20 A and the edge server 20 B.
  • information regarding one or more sensor devices 30 under control of the edge server 20 is transmitted or received to or from another edge server 20 .
  • the following information is transmitted from the edge server 20 A to the edge server 20 B, or from the edge server 20 B to the edge server 20 A, for example.
  • the information necessary for determination regarding service provision includes, for example, information regarding the sensor device 30 under control of another edge server 20 , and the like.
  • the sensor information includes, for example, information indicating a type, performance, and the like of the sensor included in the sensor device 30 .
  • FIG. 12 is a diagram illustrating an example of a configuration of the reference point D of the system 1 according to the present embodiment.
  • the database function 50 and the interface function 90 are mapped to the global server 10
  • the decision making function 60 and the interface function 90 are mapped to the edge server 20 .
  • the communication function 100 is mapped to the reference point D between the global server 10 and the edge server 20 .
  • the following information is transmitted from the edge server 20 to the global server 10 , for example.
  • the user device search request is a message requesting a search for the user device 40 , and includes, for example, identification information of the user device 40 and the like.
  • the sensor search request is a message for requesting a search for the sensor device 30 , and includes, for example, position information, information for specifying the sensable range, information regarding the sensing target, and the like.
  • the edge server search request is a message for requesting a search for the edge server 20 , and includes information for specifying the management range and the like.
  • the following information is transmitted from the global server 10 to the edge server 20 , for example.
  • the user device search result is a response to the user device search request, and includes, for example, position information of the searched user device 40 and the like.
  • the sensor search result is a response to the sensor search request, and includes, for example, identification information and position information of the searched sensor device 30 , information indicating the edge server 20 of the management source, and the like.
  • the edge server search result is a response to the edge server search request, and includes, for example, identification information of the searched edge server 20 , information indicating the management range, and the like.
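  • For illustration, the reference point D sensor search exchange between an edge server and the global server could be modeled with data structures such as the following; the fields are assumptions mirroring the items listed above.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class SensorSearchRequest:            # edge server -> global server
    position: Tuple[float, float]     # latitude, longitude
    sensable_range_m: float           # information for specifying the sensable range
    target_id: str                    # information regarding the sensing target


@dataclass
class SensorSearchResult:             # global server -> edge server
    sensor_ids: List[str]
    sensor_positions: List[Tuple[float, float]]
    managing_edge_server_ids: List[str]
```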
  • the user device 40 first performs subscription in order to receive provision of a service from the system 1 .
  • the system 1 performs preliminary preparation for providing the service to the user device 40 , such as selection of an appropriate sensor device 30 and parameter setting. Thereafter, the system 1 starts provision of the service to the user device 40 .
  • a series of processing will be described below.
  • FIG. 13 is a sequence diagram illustrating an example of a flow of service providing processing executed in the system 1 according to the present embodiment. As illustrated in FIG. 13 , the edge server 20 , the sensor device 30 , and the user device 40 are involved in the present sequence.
  • the user device 40 transmits a subscription request to the edge server 20 (step S 102 ).
  • the subscription request includes information for preliminary preparation, such as the user device registration information, the user authentication information, and the service subscription information, for example.
  • a transmission trigger of the subscription request can be considered in various ways.
  • the subscription request may be transmitted in a case where a button arranged on a GUI of application software or a web page for a sensing service, which is displayed on the user device 40 , is pressed.
  • alternatively, an application for creation of a user account in the application software or on the web page for the sensing service, made from the user device 40 , may be treated as the subscription request.
  • the edge server 20 transmits a subscription response to the user device 40 (step S 104 ).
  • the edge server 20 performs sensor selection on the basis of the subscription request received from the user device 40 (step S 106 ). For example, the edge server 20 selects the sensor device 30 capable of properly sensing the sensing target requested in the subscription request.
  • the edge server 20 transmits a sensing request to the selected sensor device 30 (step S 108 ).
  • the sensor device 30 transmits a response to the edge server 20 (step S 110 ).
  • the sensor device 30 starts sensing based on the sensing request (step S 112 ) and appropriately transmits the sensing information to the edge server 20 (step S 114 ).
  • the sensor device 30 performs presetting such as directing a lens to the sensing target, setting a capture mode, and then starts capture. Then, the sensor device 30 transmits the captured image to the edge server 20 at predetermined intervals or performs streaming delivery in real time.
  • the edge server 20 performs information processing based on the sensing information acquired from the sensor device 30 (step S 116 ) and provides a sensing service to the user device 40 (step S 118 ).
  • the edge server 20 selects a sensor device 30 (corresponding to a first sensor device) capable of sensing the sensing target on the basis of a request from the user device 40 , generates the service information on the basis of the sensing information of the selected sensor device 30 , and provides the service information to the user device 40 .
  • the request may include the subscription request.
  • the edge server 20 performs user authentication in response to the subscription request, and selects the sensor device 30 according to the contract.
  • the request may include the service provision request.
  • the edge server 20 identifies the sensing target in response to the service provision request, and generates the service information regarding the sensing target.
  • the edge server 20 may process the sensing information for generation of the service information or may adopt the sensing information as it is without processing.
  • the service provision request may be transmitted along with the subscription request or may be transmitted at any time after transmission of the subscription request.
  • the edge server 20 may perform push notification regarding the sensing service, and the notification content may be presented to the user via a user interface provided by the user device 40 .
  • the user device 40 can notify the user by a method such as displaying as an icon on the screen, vibrating, or playing a sound.
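  • The FIG. 13 flow can be summarized, under assumed helper methods that are not part of the patent, by the following compact sketch.

```python
def provide_sensing_service(edge_server, sensor_devices, subscription_request):
    """Illustrative end-to-end flow; the methods on edge_server and the sensor
    objects are hypothetical stand-ins for the entities described above."""
    # Steps S102-S104: receive the subscription request and respond.
    edge_server.authenticate(subscription_request)

    # Step S106: select a sensor device able to sense the requested target.
    sensor = edge_server.select_sensor(sensor_devices, subscription_request.target)

    # Steps S108-S112: request sensing; the sensor applies parameters and starts.
    sensor.on_parameter_setting_request({"target": subscription_request.target})

    # Steps S114-S118: collect sensing information, process it, and provide
    # the resulting service information to the user device.
    sensing_information = sensor.sense()
    return edge_server.process_sensing_information(sensing_information)
```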
  • a first scenario is a scenario in which the sensing target moves within the management range of the edge server 20 and beyond the sensable range of the sensor device 30 .
  • FIG. 14 is a diagram for describing an outline of a first scenario.
  • a configuration involved in the description of the present scenario is extracted from an overall configuration illustrated in FIG. 1 .
  • the sensing target 300 moves from the sensable range 31 A of the sensor device 30 A under control of the edge server 20 A to the sensable range 31 B of the sensor device 30 B under control of the edge server 20 A.
  • the user device 40 is connected to the edge server 20 A.
  • the user device 40 can receive provision of a sensing service from the edge server 20 A.
  • the edge server 20 A generates the service information on the basis of an image of the sensing target 300 and provides the service information to the user device 40 .
  • the sensing target 300 moves beyond the sensable range (for example, a capturable range) of a specific sensor device 30 .
  • a second sensor device 30 whose sensable range includes the moved sensing target has the same managing edge server 20 as the first sensor device 30 whose sensable range included the sensing target before the movement. Therefore, in the case where the sensing target moves from the sensable range of the first sensor device 30 to the sensable range of the second sensor device 30 , the edge server 20 transmits the information regarding the sensing target to the second sensor device 30 . With the transmission, the edge server 20 can hand over the sensing of the sensing target from the first sensor device 30 , which has difficulty in continuing the sensing, to the second sensor device 30 .
  • FIG. 15 is a sequence diagram illustrating an example of a flow of the service providing processing in the first scenario executed in the system 1 according to the present embodiment.
  • the sensor device 30 A, the sensor device 30 B, and the edge server 20 A are involved in the present sequence.
  • characteristic processing in the present scenario is extracted, and processing having a weak relationship with the present scenario, such as transmission of the subscription request, is omitted.
  • the subsequent sequences are described in a similar manner.
  • the sensor device 30 A transmits the sensing information to the edge server 20 A (step S 202 ).
  • the sensor device 30 A uploads a captured image of the sensing target 300 to the edge server 20 A.
  • the edge server 20 A makes a determination of change in charge on the basis of the sensing information received from the sensor device 30 A (step S 204 ). For example, the edge server 20 A estimates information indicating the sensing target 300 , a characteristic amount, position information, a moving speed, and the like on the basis of the sensing information. In the estimation, an analysis technology such as image analysis or pattern recognition may be used, or a learning technology such as machine learning using past sensing information may be used. Furthermore, the service subscription information from the user device 40 or information acquired from a position identification device (such as a global positioning system (GPS) or a global navigation satellite system (GNSS)) may be used for the estimation.
  • GPS global positioning system
  • GNSS global navigation satellite system
  • the edge server 20 A determines whether or not the sensor device 30 A continues the sensing of the sensing target 300 on the basis of the estimated information and information regarding the sensable range 31 A of the sensor device 30 A. In a case where it is determined that the sensing by the sensor device 30 A should be continued, the sensing by the sensor device 30 A is continued as it is.
  • otherwise, the edge server 20 A performs sensor device search processing for searching for a changed sensor device 30 in charge (step S 206 ). For example, in a case where the sensing target is predicted to be out of the sensable range 31 A of the sensor device 30 A after a predetermined time, or the like, the edge server 20 A determines that the sensing by the sensor device 30 A should not be continued. Then, the edge server 20 A searches for the changed sensor device 30 in charge on the basis of the information indicating the sensing target 300 , the characteristic amount, the position information, and the moving speed, and the position information of other sensor devices 30 .
  • the edge server 20 A identifies adjacent other sensor devices 30 on the basis of the position information of the sensor device 30 A.
  • the edge server 20 A estimates, on the basis of the moving speed and the moving direction of the sensing target 300 , which of the adjacent other sensor devices 30 has the sensable range to which the sensing target 300 will move, thereby determining the changed sensor device 30 in charge.
  • the edge server 20 A determines the sensor device 30 B as the changed sensor device 30 in charge.
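  • One way (among many) to realize the change-in-charge decision above is to predict the target's position after a short horizon from its moving speed and direction and pick the adjacent sensor device whose sensable range will contain it; the circular range model and the horizon value below are assumptions for illustration.

```python
import math
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class SensorCoverage:
    sensor_id: str
    center: Tuple[float, float]   # center of the sensable range
    radius: float                 # sensable radius (same length unit as positions)


def predict_changed_sensor_in_charge(position: Tuple[float, float],
                                     speed: float,
                                     heading_rad: float,
                                     candidates: List[SensorCoverage],
                                     horizon_s: float = 5.0) -> Optional[str]:
    """Return the adjacent sensor predicted to cover the target after horizon_s seconds."""
    px = position[0] + speed * horizon_s * math.cos(heading_rad)
    py = position[1] + speed * horizon_s * math.sin(heading_rad)
    for c in candidates:
        if math.hypot(px - c.center[0], py - c.center[1]) <= c.radius:
            return c.sensor_id
    return None
```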
  • the edge server 20 A transmits the information regarding the sensing target to the sensor device 30 B (step S 208 ).
  • the sensor device 30 B transmits an ACK (acknowledgement response) to the edge server 20 A (step S 210 ) and starts sensing of the sensing target 300 (step S 212 ).
  • the edge server 20 A may notify the sensor device 30 A of a release command to release the parameter setting (step S 214 ).
  • FIG. 16 is a sequence diagram illustrating an example of a flow of the service providing processing in the first scenario executed in the system 1 according to the present embodiment. As illustrated in FIG. 16 , the sensor device 30 A, the sensor device 30 B, and the edge server 20 A are involved in the present sequence.
  • the sensor device 30 A appropriately transmits the sensing information to the edge server 20 A (step S 302 ).
  • the sensor device 30 A uploads the captured image of the sensing target 300 to the edge server 20 A at predetermined intervals.
  • the sensing target 300 has moved from the sensable range 31 A of the sensor device 30 A to the sensable range 31 B of the sensor device 30 B.
  • the sensor device 30 B detects the sensing target 300 as an unknown object (step S 304 ) and transmits a detection report to the edge server 20 A (step S 306 ).
  • the detection report can include, for example, position information at which the unknown object is detected, an image of the unknown object, and the like.
  • the edge server 20 A collates the information included in the received detection report with the sensing information held by the edge server 20 A (for example, the sensing information received from the sensor device 30 A and accumulated) and identifies that the unknown object is the sensing target 300 (step S 308 ).
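  • The collation step above could, for example, match a feature vector extracted from the detection report against features accumulated from earlier sensing information; the cosine-similarity matcher and threshold below are assumptions, not the patent's method.

```python
from typing import Dict, List, Optional


def cosine_similarity(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0


def identify_unknown_object(report_features: List[float],
                            known_targets: Dict[str, List[float]],
                            threshold: float = 0.9) -> Optional[str]:
    """Return the target id whose stored features best match the detection report, if any."""
    best_id, best_score = None, threshold
    for target_id, features in known_targets.items():
        score = cosine_similarity(report_features, features)
        if score >= best_score:
            best_id, best_score = target_id, score
    return best_id
```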
  • the edge server 20 A transmits the information regarding the sensing target to the sensor device 30 B (step S 310 ).
  • the sensor device 30 B transmits an ACK to the edge server 20 A (step S 312 ) and starts sensing of the sensing target 300 (step S 314 ).
  • the edge server 20 A may notify the sensor device 30 A of a release command to release the parameter setting (step S 316 ).
  • the sensing information can be continuously acquired without a delay even if the sensing target moves within the management range of the same edge server 20 .
  • the edge server 20 can continuously provide a service to the user.
  • a second scenario is a scenario in which the sensing target moves beyond the management range of the edge server 20 .
  • FIG. 17 is a diagram for describing an outline of the second scenario.
  • a configuration involved in the description of the present scenario is extracted from the overall configuration illustrated in FIG. 1 .
  • the sensing target 300 moves from the sensable range 31 B of the sensor device 30 B under control of the edge server 20 A to the sensable range 31 C of the sensor device 30 C under control of the edge server 20 B.
  • the user device 40 is connected to the edge server 20 A via the access point 200 A.
  • in the second scenario, the sensing target moves beyond the management range of the edge server 20 . Therefore, the second sensor device 30 whose sensable range includes the moved sensing target has a different managing edge server 20 from the first sensor device 30 whose sensable range included the sensing target before the movement. Accordingly, in a case where the second sensor device 30 is connected to a second cell, the edge server 20 transmits the information regarding the sensing target to the other edge server 20 associated with the second sensor device 30 . Typically, the information regarding the sensing target transmitted to the other edge server 20 is transferred to the second sensor device 30 .
  • a certain edge server 20 can thereby hand over the sensing of the sensing target from the first sensor device 30 under its control, which has difficulty in continuing the sensing, to the second sensor device 30 under control of a different edge server 20 .
  • in a case where the edge server 20 specifies that the sensing target has moved to the sensable range of the second sensor device 30 , the edge server 20 transmits the information regarding the sensing target to the other edge server 20 associated with the second sensor device 30 .
  • the edge server 20 transmits the information regarding the sensing target only to the other edge server 20 that has been determined to be the moving destination of the sensing target. Thereby, unnecessary spreading of a face image and the like of the sensing target can be prevented.
  • the edge server 20 transmits a request for searching for the second sensor device 30 to the other edge server 20 . With the transmission, the edge server 20 can cause the other edge server 20 to search for the second sensor device 30 appropriate as a handover destination.
  • the edge server 20 receives a request for searching for the first sensor device 30 from the other edge server 20 .
  • With this request, the other edge server 20 requests a search for the sensor device 30 in charge, that is, the sensor device that has until then been responsible for sensing the unknown object now sensed by the second sensor device 30 under control of that other edge server 20 .
  • the edge server 20 can recognize that the sensing target sensed by the first sensor device 30 has moved to the sensable range of the second sensor device 30 . Then, the edge server 20 can request the second sensor device 30 to take over the sensing.
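  • As an illustrative sketch of the reference point C exchange in the second scenario, the source edge server asks a neighboring edge server to search for a changed sensor device in charge and, if one is found, forwards the information regarding the sensing target; the peer-object methods used here are assumptions.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class PeerSensorSearchRequest:        # edge server -> adjacent edge server
    source_sensor_position: Tuple[float, float]
    sensable_range_m: float
    target_id: str


def hand_over_between_edge_servers(source_edge, peer_edge,
                                   request: PeerSensorSearchRequest) -> Optional[str]:
    """Ask the peer edge server to search for a new sensor device in charge;
    on success, transfer the sensing-target information (and, optionally, the
    sensing information accumulated so far) so the peer can continue sensing."""
    new_sensor_id = peer_edge.search_changed_sensor_in_charge(request)
    if new_sensor_id is not None:
        peer_edge.receive_sensing_target(source_edge.export_sensing_target(request.target_id))
    return new_sensor_id
```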
  • FIG. 18 is a sequence diagram illustrating an example of a flow of service providing processing in the second scenario executed in the system 1 according to the present embodiment. As illustrated in FIG. 18 , the sensor device 30 B, the sensor device 30 C, the edge server 20 A, and the edge server 20 B are involved in the present sequence.
  • the sensor device 30 B transmits the sensing information to the edge server 20 A (step S 402 ).
  • the sensor device 30 B uploads a captured image of the sensing target 300 to the edge server 20 A.
  • the edge server 20 A makes a determination of change in charge on the basis of the sensing information received from the sensor device 30 B (step S 404 ). In a case where it is determined that the sensing by the sensor device 30 B should be continued, the sensing by the sensor device 30 B is continued as it is.
  • the edge server 20 A performs the sensor device search processing for searching for the changed sensor device 30 in charge (step S 406 ).
  • the edge server 20 A transmits a sensor search request to another edge server 20 (for example, the edge server 20 B) having an adjacent management range (step S 408 ).
  • the sensor search request can include, for example, the position information of the sensor device 30 B in charge, information indicating the sensable range, and the identification information of the sensing target, and the like.
  • the edge server 20 B that has received the sensor search request performs the sensor device search processing for searching for the changed sensor device 30 in charge (step S 410 ). For example, the edge server 20 B identifies the sensor devices 30 under its control that are adjacent to the sensor device 30 B on the basis of the position information of the sensor device 30 B. Next, the edge server 20 B estimates, on the basis of the moving speed and the moving direction of the sensing target 300 , which of those sensor devices 30 has the sensable range to which the sensing target 300 will move, thereby determining the changed sensor device 30 in charge. Here, it is assumed that the edge server 20 B determines the sensor device 30 C as the changed sensor device 30 in charge.
  • the edge server 20 B that has succeeded in the search transmits a response to the sensor search request to the edge server 20 A (step S 412 ).
  • the edge server 20 A transmits the information regarding the sensing target to the edge server 20 B that is the transmission source of the response to the sensor search request (step S 414 ).
  • the edge server 20 A may transmit, to the edge server 20 B, the sensing information acquired so far and data processed by the information processing, and further, information regarding the user device 40 with which the sensing target 300 is associated.
  • the edge server 20 B transmits the received information regarding the sensing target to the sensor device 30 C determined as the changed sensor device 30 in charge (step S 416 ).
  • the sensor device 30 C transmits an ACK to the edge server 20 B (step S 418 ) and starts sensing of the sensing target 300 (step S 420 ).
  • the edge server 20 A may notify the sensor device 30 B of a release command to release the parameter setting (step S 422 ).
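  • As an illustrative sketch only (not part of the disclosed implementation), the flow of FIG. 18 can be approximated in Python as follows. All names (EdgeServer, SensorSearchRequest, and so on) are hypothetical, and the estimation of the moving destination is reduced to a crude one-step position prediction.

```python
from dataclasses import dataclass, field


@dataclass
class SensorSearchRequest:
    # Contents suggested for step S 408: position and sensable range of the
    # sensor device currently in charge, plus identification of the target.
    requester_id: str
    sensor_position: tuple      # (x, y) of the sensor device 30B, in local metres
    sensable_range_m: float
    target_id: str
    target_velocity: tuple      # estimated movement of the sensing target 300 per step


@dataclass
class EdgeServer:
    server_id: str
    neighbors: list = field(default_factory=list)    # edge servers with adjacent management ranges
    sensors: dict = field(default_factory=dict)      # sensor_id -> (position, sensable range in metres)
    target_info: dict = field(default_factory=dict)  # target_id -> accumulated sensing information

    def handle_sensing_report(self, sensor_id, target_id, position, velocity):
        """Steps S 402 to S 408: decide whether the sensor in charge must change."""
        sensor_pos, range_m = self.sensors[sensor_id]
        if _distance(position, sensor_pos) <= range_m:
            return                                    # target still sensable; continue as is
        request = SensorSearchRequest(self.server_id, sensor_pos, range_m, target_id, velocity)
        for neighbor in self.neighbors:               # sensor search request, step S 408
            found = neighbor.handle_sensor_search(request)
            if found is not None:
                info = self.target_info.pop(target_id, {})
                neighbor.receive_target_info(found, target_id, info)   # step S 414
                break

    def handle_sensor_search(self, request):
        """Step S 410: pick the sensor whose sensable range the target is moving into."""
        px, py = request.sensor_position
        vx, vy = request.target_velocity
        predicted = (px + vx, py + vy)                # crude one-step prediction of the target
        for sensor_id, (pos, range_m) in self.sensors.items():
            if _distance(predicted, pos) <= range_m:
                return sensor_id                      # response to the request, step S 412
        return None

    def receive_target_info(self, sensor_id, target_id, info):
        """Steps S 414 to S 420: hand the target information to the new sensor in charge."""
        self.target_info[target_id] = info
        print(f"{self.server_id}: sensor {sensor_id} starts sensing target {target_id}")


def _distance(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
```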
  • FIG. 19 is a sequence diagram illustrating an example of a flow of the service providing processing in the second scenario executed in the system 1 according to the present embodiment. As illustrated in FIG. 19 , the sensor device 30 B, the sensor device 30 C, the edge server 20 A, and the edge server 20 B are involved in the present sequence.
  • the sensor device 30 B appropriately transmits the sensing information to the edge server 20 A (step S 502 ).
  • the sensor device 30 B uploads the captured image of the sensing target 300 to the edge server 20 A at predetermined intervals.
  • the sensing target 300 has moved from the sensable range 31 B of the sensor device 30 B to the sensable range 31 C of the sensor device 30 C.
  • the sensor device 30 C detects the sensing target 300 as an unknown object (step S 504 ) and transmits a detection report to the edge server 20 B (step S 506 ).
  • the edge server 20 B collates the information included in the received detection report with the sensing information held by the edge server 20 B (step S 508 ). In a case where the edge server 20 B cannot specify the unknown object, the edge server 20 B transmits the sensor search request for searching for the sensor device 30 in charge to another edge server 20 (for example, the edge server 20 A) having an adjacent management range (step S 510 ).
  • the edge server 20 A that has received the sensor search request performs the sensor device search processing for searching for the sensor device 30 in charge (step S 512 ).
  • the edge server 20 A specifies that the unknown object is the sensing target 300 of the sensor device 30 B on the basis of the received sensor search request.
  • the edge server 20 A that has succeeded in the specification transmits the information regarding the sensing target to the edge server 20 B (step S 514 ).
  • the edge server 20 A may transmit, to the edge server 20 B, the sensing information acquired so far and data processed by the information processing, and further, information regarding the user device 40 with which the sensing target 300 is associated.
  • the edge server 20 B transmits the received information regarding the sensing target to the sensor device 30 C (step S 516 ).
  • the sensor device 30 C transmits an ACK to the edge server 20 B (step S 518 ) and starts sensing of the sensing target 300 (step S 520 ).
  • the edge server 20 A may notify the sensor device 30 B of a release command to release the parameter setting (step S 522 ).
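  • The reverse-direction flow of FIG. 19, in which the unknown object is resolved by collation and a sensor search request, might be sketched as follows. Again, all names and data structures are assumptions made only for illustration.

```python
from dataclasses import dataclass, field


@dataclass
class DetectionReport:
    # Contents assumed for step S 506: who detected the unknown object and a
    # feature (for example a face-image signature) that can be collated.
    sensor_id: str
    feature: str


@dataclass
class ReactiveEdgeServer:
    server_id: str
    neighbors: list = field(default_factory=list)       # edge servers with adjacent management ranges
    known_targets: dict = field(default_factory=dict)   # feature -> information regarding the target

    def handle_detection_report(self, report):
        """Steps S 506 to S 520: collate locally, otherwise ask adjacent edge servers."""
        info = self.known_targets.get(report.feature)
        if info is None:                                 # collation failed (step S 508)
            for neighbor in self.neighbors:              # sensor search request, steps S 510 to S 514
                info = neighbor.handle_sensor_search(report.feature)
                if info is not None:
                    break
        if info is not None:
            self.known_targets[report.feature] = info
            print(f"{self.server_id}: {report.sensor_id} takes over sensing of "
                  f"{info.get('target_id', 'the target')}")   # steps S 516 to S 520

    def handle_sensor_search(self, feature):
        """Side of the edge server previously in charge: release and return the target info."""
        return self.known_targets.pop(feature, None)


server_a = ReactiveEdgeServer("20A", known_targets={"feat-300": {"target_id": "300"}})
server_b = ReactiveEdgeServer("20B", neighbors=[server_a])
server_b.handle_detection_report(DetectionReport(sensor_id="30C", feature="feat-300"))
```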
  • the sensing information can be continuously acquired without a delay even if the sensing target moves between the management ranges of the different edge servers 20 .
  • the edge server 20 can continuously provide a service to the user.
  • a third scenario is a scenario in which the user device 40 moves beyond the cell associated with the edge server 20 .
  • FIG. 20 is a diagram for describing an outline of a third scenario.
  • a configuration involved in the description of the present scenario is extracted from the overall configuration illustrated in FIG. 1 .
  • the sensing target 300 is located in the sensable range 31 A of the sensor device 30 A, and the user device 40 moves from the cell 201 A associated with the edge server 20 A to the cell 201 B associated with the edge server 20 B.
  • the edge server 20 transmits the information regarding the user device 40 to the another edge server 20 .
  • the edge server 20 transmits user device information of the moving user device 40 to the another edge server 20 associated with the second cell at a moving destination.
  • the edge server 20 can cause the another edge server 20 to recognize movement of the user device 40 and can hand over processing for service provision to the another edge server 20 .
  • the edge server 20 transmits information regarding a service of the moving user device 40 (for example, the service subscription information, and the like), thereby causing the another edge server 20 to cause the user device 40 to continue an equivalent service.
  • the edge server 20 transmits the information regarding the another edge server 20 to the user device 40 .
  • the edge server 20 transmits access information of the another edge server 20 associated with the second cell at the moving destination to the user device 40 .
  • the user device 40 can access the another edge server 20 after the movement to seamlessly change a provider of the service and can continuously receive provision of the service.
  • FIG. 21 is a sequence diagram illustrating an example of a flow of service providing processing in the third scenario executed in the system 1 according to the present embodiment. As illustrated in FIG. 21 , the user device 40 , the edge server 20 A, and the edge server 20 B are involved in the present sequence.
  • the user device 40 periodically or irregularly transmits the position information to the edge server 20 A (step S 602 ).
  • the edge server 20 A detects movement of the user device 40 on the basis of the received position information (step S 604 ).
  • the edge server 20 A searches for another edge server 20 associated with a cell at the moving destination (step S 606 ).
  • the edge server 20 B is searched for as the another edge server 20 associated with the cell at the moving destination.
  • the edge server 20 A transmits the information regarding the user device 40 and the information regarding the service to the edge server 20 B (step S 608 ). For example, in a case where the edge server 20 A has been providing a streaming service such as moving image delivery to the user device 40 , the edge server 20 A transmits position information and registration information of the user device 40 and streaming data to the edge server 20 B. Thereby, the edge server 20 B can prepare for the provision of the streaming service to the user device 40 thereafter.
  • the edge server 20 B transmits an ACK to the edge server 20 A (step S 610 ).
  • the edge server 20 A transmits the edge server access information regarding the edge server 20 B at the moving destination to the user device 40 (step S 612 ).
  • the user device 40 transmits an ACK to the edge server 20 A (step S 614 ).
  • the user device 40 accesses the edge server 20 B using the received edge server access information (step S 616 ).
  • the edge server 20 B starts provision of the service to the user device 40 (step S 618 ).
  • the edge server 20 can continuously provide the service to the user without a delay.
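  • A minimal sketch of the FIG. 21 handover is given below, assuming hypothetical names and a directory of edge servers already known to the serving edge server; it is not the disclosed implementation.

```python
from dataclasses import dataclass, field


@dataclass
class UserSession:
    user_id: str
    subscription: dict
    streaming_position: int = 0   # e.g. offset of a moving-image delivery already played out


@dataclass
class ServingEdgeServer:
    server_id: str
    cell_id: str
    access_info: str                               # e.g. a URL at which this edge server is reached
    directory: dict = field(default_factory=dict)  # cell_id -> edge server (a prior discovery result)
    sessions: dict = field(default_factory=dict)   # user_id -> UserSession

    def handle_position_report(self, user_id, reported_cell_id):
        """Steps S 602 to S 612: detect movement and hand the session over."""
        if reported_cell_id == self.cell_id:
            return None                            # the user device is still in this cell
        destination = self.directory.get(reported_cell_id)   # step S 606
        if destination is None:
            return None                            # would fall back to an inquiry to the global server
        session = self.sessions.pop(user_id)
        destination.accept_session(session)        # step S 608: user and service information
        return destination.access_info             # step S 612: returned to the user device

    def accept_session(self, session):
        """Step S 618: resume the service, e.g. streaming, from the handed-over state."""
        self.sessions[session.user_id] = session
        print(f"{self.server_id}: resuming streaming at offset {session.streaming_position}")


edge_b = ServingEdgeServer("20B", "cell-201B", "https://edge-20b.example/api")
edge_a = ServingEdgeServer("20A", "cell-201A", "https://edge-20a.example/api",
                           directory={"cell-201B": edge_b},
                           sessions={"user-400": UserSession("user-400", {"plan": "basic"}, 1024)})
print(edge_a.handle_position_report("user-400", "cell-201B"))   # prints the access information
```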
  • In a case where the edge server 20 does not hold information regarding other edge servers 20 , the edge server 20 may inquire of the global server 10 about the information. A flow of processing in that case will be described with reference to FIG. 22 .
  • FIG. 22 is a sequence diagram illustrating an example of a flow of inquiry processing to the global server 10 executed in the system 1 according to the present embodiment. As illustrated in FIG. 22 , the edge server 20 and the global server 10 are involved in the present sequence.
  • the edge server 20 transmits a discovery request to the global server 10 (step S 702 ).
  • the global server 10 performs discovery processing to obtain information regarding other edge servers 20 on the basis of the received discovery request (step S 704 ).
  • the global server 10 transmits a discovery response including a result of the discovery processing to the edge server 20 (step S 706 ).
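  • The inquiry of FIG. 22 amounts to a lookup at the global server 10. A minimal sketch, assuming the global server keeps a registry of cells and edge server access information, could look as follows.

```python
from dataclasses import dataclass, field


@dataclass
class GlobalServer:
    # cell_id -> access information of the edge server managing that cell.
    registry: dict = field(default_factory=dict)

    def handle_discovery_request(self, cell_id):
        """Steps S 702 to S 706: look up edge servers matching the requested condition."""
        return {"cell_id": cell_id, "edge_server": self.registry.get(cell_id)}


# The edge server queries the global server when it does not already know the
# edge server associated with the cell at the moving destination.
global_server = GlobalServer(registry={"cell-201B": "https://edge-20b.example/api"})
response = global_server.handle_discovery_request("cell-201B")   # discovery request/response
print(response["edge_server"])
```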
  • A reference model of the functional configuration of the system 1 illustrated in FIG. 8 can be implemented in various ways. Hereinafter, other implementation examples of the reference model will be described with reference to FIGS. 23 and 24.
  • FIG. 23 is a diagram for describing another implementation example of a reference model according to the present embodiment.
  • the implementation example illustrated in FIG. 23 is a configuration in which a master service server 11 of central control type controls slave service servers 21 ( 21 A to 21 D) for processing the sensing information in a distributed manner.
  • the decision making function 60 and the interface function 90 are mapped to each slave service server 21 .
  • the decision making function 60 , the interface function 90 corresponding to the decision making function 60 , the database function 50 , and the interface function 90 corresponding to the database function 50 are mapped to the master service server 11 .
  • the communication function 100 is mapped to the reference points C and D between the master service server 11 and each slave service server 21 .
  • the slave service servers 21 can be connected to one or more sensor devices 30 via the reference point B and may be connected to one or more user devices 40 via the reference point A.
  • the master service server 11 delegates information processing for providing a sensing service to the slave service servers 21 . Then, the master service server 11 bundles and processes processing results by the slave service servers 21 to provide the service. Such distributed processing realizes distribution of computing load.
  • the master service server 11 manages mobility of the user device 40 and the sensing target. As a result, overhead of information exchange between the slave service servers 21 can be reduced and high efficiency can be realized.
  • the configuration capable of processing the sensing information in a distributed manner can realize calculation load distribution of servers and efficient information exchange between servers.
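  • The delegation-and-bundling behavior of the master service server 11 can be illustrated with ordinary fan-out/fan-in code. The sketch below is a deliberate simplification in which the slave service servers are stand-ins (local worker functions) and the information processing itself is a placeholder.

```python
from concurrent.futures import ThreadPoolExecutor


def slave_process(sensing_chunk):
    """Work delegated to one slave service server 21 (here just a local stand-in)."""
    return [record["value"] * 2 for record in sensing_chunk]   # placeholder information processing


def master_process(sensing_info, num_slaves=4):
    """The master service server 11 splits the load, delegates it, and bundles the results."""
    chunks = [sensing_info[i::num_slaves] for i in range(num_slaves)]
    with ThreadPoolExecutor(max_workers=num_slaves) as pool:
        partial_results = pool.map(slave_process, chunks)
    return [value for partial in partial_results for value in partial]   # bundled service result


sensing_info = [{"sensor": f"30{name}", "value": value} for name, value in zip("ABCD", (1, 2, 3, 4))]
print(master_process(sensing_info))
```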
  • FIG. 24 is a diagram for describing another implementation example of a reference model according to the present embodiment.
  • the implementation example illustrated in FIG. 24 is a configuration in which a big data processing server 12 performs big data processing for the sensing information acquired from the plurality of sensor devices 30 ( 30 A to 30 D).
  • the decision making function 60 , the interface function 90 corresponding to the decision making function 60 , a repository function 52 , and the interface function 90 corresponding to the repository function are mapped to the big data processing server 12 .
  • the repository function 52 is a software or hardware module that stores information for the big data processing.
  • a service access point of the repository function 52 is Repository (R)-SAP 53 .
  • the decision making function 60 of the big data processing server 12 applies the big data processing to the sensing information stored in the repository function 52 so that the sensing information can be provided as a service to the user.
  • the big data processing server 12 is connected to one or more sensor devices 30 via the reference point C. Furthermore, the user device 40 can access the big data processing server 12 via the reference point A and receive provision of the service.
  • the big data processing server 12 can efficiently and appropriately provide a service using the big data processing to the user, similarly to the first scenario and the second scenario. Furthermore, in a case where the user device 40 moves beyond the cell, the big data processing server 12 can efficiently and appropriately provide a service using the big data processing to the user, similarly to the third scenario.
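  • As an illustration of the mapping in FIG. 24, the repository function 52 and the decision making function 60 might be sketched as follows; the aggregation (a per-sensor mean) is only a placeholder for the big data processing, and the class names are assumptions.

```python
from collections import defaultdict
from statistics import mean


class RepositoryFunction:
    """Stores sensing information for the big data processing (repository function 52)."""

    def __init__(self):
        self._records = defaultdict(list)   # sensor_id -> stored measurements

    def store(self, sensor_id, measurement):
        self._records[sensor_id].append(measurement)

    def all_records(self):
        return dict(self._records)


class DecisionMakingFunction:
    """Applies aggregate processing to the stored sensing information (decision making function 60)."""

    def __init__(self, repository):
        self._repository = repository

    def summarize(self):
        # A per-sensor mean stands in for the big data processing.
        return {sensor_id: mean(values)
                for sensor_id, values in self._repository.all_records().items()}


repo = RepositoryFunction()
for sensor_id, value in (("30A", 21.0), ("30A", 23.0), ("30B", 19.5)):
    repo.store(sensor_id, value)
print(DecisionMakingFunction(repo).summarize())   # provided to the user device as a service
```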
  • FIG. 25 is a diagram for describing a first application example provided by the system 1 according to the present embodiment.
  • the first application example relates to a marathon.
  • the user 400 is a runner of the marathon, and the user device 40 is a device of the runner, such as a wearable device.
  • the sensing target 300 is another runner preceding the user 400 .
  • the sensor devices 30 A and 30 B are environment sensing cameras installed along a marathon course.
  • the user 400 receives provision of a sensing service regarding the sensing target 300 and receives provision of a service instructing optimal pace distribution according to a course situation or the like.
  • provision of the services by the present application will be described in detail.
  • the user 400 performs subscription to receive provision of the services provided by the present application. Specifically, the user 400 notifies the edge server 20 of the subscription request via the user device 40 before starting. A procedure of this notification may be started by the sound recognition, an input with a physical key, or the like, as a trigger.
  • the subscription request may include, for example, the position information of the user device 40 , information regarding physical conditions of the user 400 detected by the user device 40 (for example, a body temperature, a pulse rate, and the like).
  • the subscription request can include image information including information that can identify the appearance of the user 400 , such as a face image or a full-body image of the user 400 . This image information may be used as auxiliary information for identification of the user 400 by image recognition for continuously tracking the user 400 across the edge server 20 .
  • While running, the user 400 notifies the edge server 20 A of the service provision request via the user device 40 .
  • a procedure of this notification may be started by the sound recognition, an input with a physical key, or the like, as a trigger.
  • As the service provision request, a case is assumed in which the user 400 seeks provision of information for optimal running based on current information (for example, the speed) of the sensing target 300 that the user 400 wants to use as a pacemaker and current information of the user 400 .
  • the service provision request can include characteristic information (for example, a runner name, an event registration number, and the like) of the sensing target 300 , or information regarding a separation distance from the user 400 (in other words, information indicating that the user 400 wants to use a runner x [m] ahead as a pacemaker).
  • the edge server 20 A acquires the sensing information from the sensor device 30 A and the sensor device 30 B via the edge server 20 B on the basis of the information, and processes the acquired sensing information and the information obtained from the user 400 , thereby providing the sensing service.
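  • One conceivable (hypothetical) form of the information processing in this application is a pace recommendation computed from the pacemaker's sensed speed, the separation distance, and the user's own speed, sketched below; none of the names or thresholds come from the embodiment.

```python
def pace_guidance(target_speed_mps, separation_m, user_speed_mps, horizon_s=60.0):
    """Return advice for staying with the chosen pacemaker.

    target_speed_mps: current speed of the sensing target 300 (the pacemaker), from the cameras
    separation_m:     current gap between the user 400 and the pacemaker
    user_speed_mps:   current speed of the user 400, from the wearable user device 40
    horizon_s:        time within which the user would like to close the gap
    """
    required = target_speed_mps + separation_m / horizon_s
    delta = required - user_speed_mps
    if abs(delta) < 0.05:
        return "Hold your current pace."
    direction = "Speed up" if delta > 0 else "Ease off"
    return f"{direction} by about {abs(delta):.2f} m/s to stay with your pacemaker."


# The edge server 20A could push text like this to the user device 40.
print(pace_guidance(target_speed_mps=3.4, separation_m=25.0, user_speed_mps=3.2))
```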
  • the first application example has been described above. Note that, in the present application, both the user and the sensing target are runners and involve geographical and spatial movement. Therefore, by application of the above-described present technology, the present application can continuously provide a service to the user without a delay.
  • FIG. 26 is a diagram for describing a second application example provided by the system 1 according to the present embodiment.
  • the second application example relates to watching.
  • the user 400 is a guardian or a caregiver
  • the user device 40 is a device such as a smartphone of the guardian or the caregiver.
  • the sensing target 300 is a person to be watched by the guardian or the caregiver, such as an elderly person or a child.
  • As the sensor device 30 , for example, a heat detection sensor (especially for detecting a body temperature of an elderly person), a smartphone (for detecting a position by GPS), a monitoring camera (for acquiring a video and an image), and the like can be considered.
  • the sensor device 30 A is a monitoring camera
  • the sensor device 30 B is a smartphone used by the sensing target 300 .
  • the present application is intended to protect children from crimes such as abduction or to protect elderly people with dementia having pyromania, for example, and delivers a video or the like in real time.
  • the guardian or the caregiver can watch the person to be watched from a remote place regardless of where the person to be watched is.
  • provision of a service by the present application will be described in detail.
  • the user 400 performs subscription to receive provision of the service provided by the present application. Specifically, the user 400 notifies the edge server 20 of the subscription request via the user device 40 . A procedure of this notification may be started by the sound recognition, an input with a physical key, or the like, as a trigger.
  • the subscription request can include, for example, a phone number, an email address, the position information, and the like of the user device 40 .
  • the subscription request can include information regarding health conditions (for example, a body temperature, a pulse rate, and the like) of the sensing target 300 and the like.
  • the subscription request can include image information including information that can identify the appearance of the sensing target 300 , such as a face image or a full-body image of the sensing target 300 .
  • This image information may be used as auxiliary information for identification of the sensing target 300 by image recognition for continuously tracking the sensing target 300 across the edge server 20 .
  • the subscription request may be given in notification via an application of a smartphone or a PC or a GUI such as a web browser.
  • information regarding a smartphone 30 B owned by the sensing target 300 may be registered in the subscription.
  • a phone number and owner information of the smartphone 30 B can be registered.
  • After performing the subscription, the user 400 notifies the edge server 20 of the service provision request via the user device 40 .
  • a procedure of this notification may be started by the sound recognition, an input with a physical key, or the like, as a trigger.
  • this notification may also be performed via an application of a smartphone or a PC or a GUI such as a web browser, similarly to the subscription request.
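  • The kinds of information listed above for the subscription request could be carried in a payload such as the following sketch; the field names and sample values are assumptions, not part of the embodiment.

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class WatchingSubscriptionRequest:
    """Illustrative payload for the subscription request in the watching application."""

    user_phone_number: str                      # contact information of the user device 40
    user_email: str
    user_position: tuple                        # e.g. (latitude, longitude)
    target_health: dict = field(default_factory=dict)    # e.g. {"body_temp_c": 36.5, "pulse_bpm": 72}
    target_face_image: Optional[bytes] = None   # auxiliary information for image recognition
    target_phone_number: Optional[str] = None   # the smartphone 30 B owned by the sensing target


request = WatchingSubscriptionRequest(
    user_phone_number="+81-00-0000-0000",
    user_email="guardian@example.com",
    user_position=(35.0, 139.0),
    target_health={"body_temp_c": 36.5, "pulse_bpm": 72},
)
print(request.user_email)
```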
  • FIG. 27 is a sequence diagram illustrating an example of a flow of service providing processing in the second application example provided by the system 1 according to the present embodiment. As illustrated in FIG. 27 , the sensor device 30 A and the edge server 20 are involved in the present sequence.
  • the edge server 20 performs the sensor device search processing for searching for the sensor device 30 capable of sensing the sensing target 300 (step S 802 ). For example, the edge server 20 searches for the sensing target 300 on the basis of subscription information, information included in the service provision request, and the sensing information provided by the sensor device 30 .
  • the sensor device 30 A captures the sensing target 300 .
  • the edge server 20 transmits the information regarding the sensing target to the sensor device 30 A (step S 804 ). At that time, the edge server 20 may also provide notification of information regarding the analysis result.
  • the sensor device 30 A starts sensing (step S 806 ).
  • the sensor device 30 A may perform setting for sensing on the basis of the information received from the edge server 20 , for example.
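  • The sensor device search processing of step S 802 can be thought of as matching the target's registered appearance against features reported by candidate sensor devices. A minimal sketch, assuming feature vectors and cosine similarity (neither of which is specified by the embodiment), is shown below.

```python
def find_sensor_in_charge(target_feature, sensor_reports, threshold=0.8):
    """Step S 802 sketch: pick the sensor device whose latest report best matches the target.

    target_feature: feature vector derived from the registered appearance of the sensing target
    sensor_reports: sensor_id -> feature vector extracted from that sensor's latest captured image
    """
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
        return dot / norm if norm else 0.0

    best_id, best_score = None, threshold
    for sensor_id, observed_feature in sensor_reports.items():
        score = cosine(target_feature, observed_feature)
        if score >= best_score:
            best_id, best_score = sensor_id, score
    return best_id   # the edge server then transmits the target information (step S 804)


print(find_sensor_in_charge((1.0, 0.2, 0.1),
                            {"30A": (0.9, 0.25, 0.05), "30B": (0.1, 0.9, 0.4)}))
```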
  • a third application example relates to tracking offenders.
  • the user is a store clerk at a convenience store or a police officer
  • the user device 40 is a device of the store or a police
  • the sensor device 30 is a monitoring camera
  • the sensing target is a customer who has left the store, for example.
  • the clerk enters characteristics of the offender, store information, and the like on the PC provided in the store to notify the edge server 20 of the information. At this time, notification to the police may be automatically performed.
  • the edge server 20 handles the information given in notification as the service provision request.
  • the edge server 20 simultaneously transmits the sensing request to the sensor devices 30 around the store to acquire the sensing information. Then, the edge server 20 searches for the offender on the basis of the acquired sensing information and the information included in the service provision request, and transmits the sensing request to the sensor device 30 capable of capturing the offender.
  • the system 1 continuously tracks the offender without a delay while switching, along with the movement of the offender, the edge server 20 serving as the processing entity and the sensor device 30 that performs the sensing. Thereby, the system 1 can contribute to arrest of the offender by the police.
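  • The simultaneous transmission of the sensing request to the sensor devices 30 around the store might be sketched as a radius-based selection over a registry of sensor positions; the following is illustrative only, and every name and threshold is an assumption.

```python
from dataclasses import dataclass


@dataclass
class SensingRequest:
    target_description: dict   # characteristics of the offender entered by the store clerk
    origin_position: tuple     # position of the store, in metres in a local coordinate system


def broadcast_sensing_request(request, sensor_registry, radius_m=500.0):
    """Select the sensor devices around the store and return their identifiers.

    sensor_registry: sensor_id -> position in the same local coordinate system.
    """
    ox, oy = request.origin_position
    return [sensor_id for sensor_id, (x, y) in sensor_registry.items()
            if ((x - ox) ** 2 + (y - oy) ** 2) ** 0.5 <= radius_m]


registry = {"cam-1": (10.0, 20.0), "cam-2": (400.0, 50.0), "cam-3": (900.0, 900.0)}
request = SensingRequest({"clothing": "black jacket"}, origin_position=(0.0, 0.0))
print(broadcast_sensing_request(request, registry))   # the request is sent to these sensors
```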
  • the third application example has been described above. Note that, in the present application, both the user and the sensing target involve geographical and spatial movement. Therefore, by application of the above-described present technology, the present application can continuously provide a service to the user without a delay.
  • FIG. 28 is a block diagram illustrating an example of a hardware configuration of an information processing apparatus according to the present embodiment.
  • an information processing apparatus 900 illustrated in FIG. 28 can realize, for example, the global server 10 , the edge server 20 , the sensor device 30 , or the user device 40 illustrated in FIGS. 4, 5, 6, and 7 .
  • Information processing by the global server 10 , the edge server 20 , the sensor device 30 , or the user device 40 according to the present embodiment is realized by cooperation of software with hardware described below.
  • the information processing apparatus 900 includes a central processing unit (CPU) 901 , a read only memory (ROM) 902 , a random access memory (RAM) 903 , and a host bus 904 a . Furthermore, the information processing apparatus 900 includes a bridge 904 , an external bus 904 b , an interface 905 , an input device 906 , an output device 907 , a storage device 908 , a drive 909 , a connection port 911 , and a communication device 913 .
  • the information processing apparatus 900 may include a processing circuit such as an electric circuit, a DSP, or an ASIC, in place of or in addition to the CPU 901 .
  • the CPU 901 functions as an arithmetic processing device and a control device, and controls the overall operation in the information processing apparatus 900 according to various programs. Furthermore, the CPU 901 may be a microprocessor.
  • the ROM 902 stores programs, operation parameters, and the like used by the CPU 901 .
  • the RAM 903 temporarily stores programs used in the execution of the CPU 901 , parameters that appropriately change in the execution, and the like.
  • the CPU 901 can form, for example, the control unit 130 illustrated in FIG. 4 , the control unit 230 illustrated in FIG. 5 , the control unit 350 illustrated in FIG. 6 , or the control unit 460 illustrated in FIG. 7 .
  • the CPU 901 , the ROM 902 , and the RAM 903 are mutually connected by the host bus 904 a including a CPU bus and the like.
  • the host bus 904 a is connected to the external bus 904 b such as a peripheral component interconnect/interface (PCI) bus via the bridge 904 .
  • the host bus 904 a , the bridge 904 , and the external bus 904 b are not necessarily separately configured, and these functions may be implemented on one bus.
  • the input device 906 is realized by, for example, a device to which information is input by the user, such as a mouse, a keyboard, a touch panel, a button, a microphone, a switch, and a lever. Furthermore, the input device 906 may be, for example, a remote control device using infrared rays or other radio waves or an externally connected device such as a mobile phone or a PDA corresponding to the operation of the information processing apparatus 900 . Moreover, the input device 906 may include, for example, an input control circuit that generates an input signal on the basis of the information input by the user using the above input means and outputs the input signal to the CPU 901 , and the like. The user of the information processing apparatus 900 can input various data and give an instruction of processing operations to the information processing apparatus 900 by operating the input device 906 .
  • the input device 906 can be formed by a device that detects information regarding the user.
  • the input device 906 can include various sensors such as an image sensor (for example, a camera), a depth sensor (for example, a stereo camera), an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, a sound sensor, a distance sensor, and a force sensor.
  • the input device 906 may acquire an attitude, a moving speed, or the like of the information processing apparatus 900 , information regarding a state of the information processing apparatus 900 itself, or information regarding an environment around the information processing apparatus 900 such as brightness, noise, or the like around the information processing apparatus 900 .
  • the input device 906 may include a global navigation satellite system (GNSS) module that receives the GNSS signal from the GNSS satellite (for example, a global positioning system (GPS) signal from a GPS satellite) and measures position information including the latitude, longitude, and altitude of the apparatus. Furthermore, regarding the position information, the input device 906 may detect the position by transmission and reception with Wi-Fi (registered trademark), a mobile phone, a PHS, a smart phone, or the like, or near field communication, or the like.
  • the input device 906 can form, for example, the sensor unit 330 illustrated in FIG. 6 or the input unit 430 illustrated in FIG. 7 .
  • the output device 907 is configured by a device that can visually or audibly notify the user of acquired information. Examples of such devices include display devices such as a CRT display device, a liquid crystal display device, a plasma display device, an EL display device, a laser projector, an LED projector, and a lamp, sound output devices such as a speaker and a headphone, a printer device, and the like.
  • the output device 907 outputs, for example, results obtained by various types of processing performed by the information processing apparatus 900 .
  • the display device visually displays the results obtained by the various types of processing performed by the information processing apparatus 900 in various formats such as text, images, tables, and graphs.
  • the sound output device converts an audio signal including reproduced sound data, voice data, or the like into an analog signal and aurally outputs the analog signal.
  • the output device 907 can form, for example, the output unit 440 illustrated in FIG. 7 .
  • the storage device 908 is a device for data storage formed as an example of a storage unit of the information processing apparatus 900 .
  • the storage device 908 is realized by, for example, a magnetic storage unit device such as an HDD, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • the storage device 908 may include a storage medium, a recording device that records data in the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded in the storage medium, and the like.
  • the storage device 908 stores programs and various data executed by the CPU 901 , and various data acquired from the outside, and the like.
  • the storage device 908 can form, for example, the storage unit 120 illustrated in FIG. 4 , the storage unit 220 illustrated in FIG. 5 , the storage unit 340 illustrated in FIG. 6 , or the storage unit 450 illustrated in FIG. 7 .
  • the drive 909 is a reader/writer for storage medium, and is built in or externally attached to the information processing apparatus 900 .
  • the drive 909 reads out information recorded in a removable storage medium such as a mounted magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs the information to the RAM 903 . Furthermore, the drive 909 can also write information to the removable storage medium.
  • connection port 911 is an interface connected to an external device, and is a connection port to an external device capable of data transmission with a universal serial bus (USB) or the like, for example.
  • the communication device 913 is, for example, a communication interface configured by a communication device or the like for being connected to a network 920 .
  • the communication device 913 is, for example, a communication card for wired or wireless local area network (LAN), long term evolution (LTE), Bluetooth (registered trademark), wireless USB (WUSB), or the like.
  • the communication device 913 may be a router for optical communication, a router for an asymmetric digital subscriber line (ADSL), a modem for various communications, or the like.
  • the communication device 913 can transmit and receive signals and the like to and from the Internet and other communication devices in accordance with a predetermined protocol such as TCP/IP, for example.
  • the communication device 913 can form, for example, the network communication unit 110 illustrated in FIG. 4 , the network communication unit 210 illustrated in FIG. 5 , the antenna unit 310 and the wireless communication unit 320 illustrated in FIG. 6 , or the antenna unit 410 and the wireless communication unit 420 illustrated in FIG. 7 .
  • the network 920 is a wired or wireless transmission path of information transmitted from a device connected to the network 920 .
  • the network 920 may include a public network such as the Internet, a telephone network, and a satellite network, various local area networks including Ethernet (registered trademark), a wide area network (WAN), and the like.
  • the network 920 may include a leased line network such as an internet protocol-virtual private network (IP-VPN).
  • a computer program for realizing the functions of the information processing apparatus 900 according to the above-described present embodiment can be prepared and mounted on a PC or the like.
  • a computer-readable recording medium in which such a computer program is stored can be provided.
  • the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
  • the above computer program may be delivered via, for example, a network without using a recording medium.
  • the edge server 20 is arranged on a local network side with respect to a gateway between the internet and a local network, and transmits or receives the information regarding one or more sensor devices 30 wirelessly connected to the first cell associated with the edge server 20 to or from another edge server 20 associated with the second cell. Since the edge server 20 is arranged on the local network side, the communication distance from the sensor device 30 becomes short, and the load regarding the communication can be suppressed. Furthermore, since the information regarding the sensor device 30 is exchanged between the edge servers 20 , the information processing based on the information regarding the sensor device 30 can be performed by the plurality of edge servers 20 in a distributed manner. In this manner, the edge server 20 can efficiently and appropriately provide a service to the user in cooperation with other edge servers 20 while accommodating a large number of sensor devices 30 .
  • a fixedly installed device such as the surveillance camera has been described as an example of the sensor device 30 , but the present technology is not limited to such an example.
  • the sensor device 30 may have mobility. In that case, processing based on movement of the sensor device 30 can be performed similarly to the above-described third scenario.
  • processing described using the sequence diagrams in the present specification may not necessarily be executed in the illustrated order. Some processing steps may be executed in parallel. Furthermore, additional processing steps may be employed and some processing steps may be omitted.
  • control device arranged on a local network side with respect to a gateway between an internet and a local network, the control device including:
  • control unit configured to transmit or receive information regarding one or more sensor devices wirelessly connected to a first cell associated with the control device to or from another control device associated with a second cell.
  • control device in which the control unit selects a first sensor device capable of sensing a sensing target on the basis of a request from a terminal device, generates service information on the basis of sensing information of the selected first sensor device, and provides the service information to the terminal device.
  • control device in which the request includes information giving an instruction of the service information to be generated.
  • control device according to any one of (2) to (4), in which the request includes information for identifying the sensing target.
  • control device in which the control unit transmits the information regarding the sensing target to the another control device in a case where the second sensor device is connected to the second cell.
  • control device in which the control unit transmits the information regarding the sensing target to the another control device after the control unit specifies that the sensing target has moved to the sensable range of the second sensor device.
  • control device in which, in a case where the control unit recognizes movement of the sensing target on the basis of the sensing information of the first sensor device, the control unit transmits a request for searching for the second sensor device to the another control device.
  • control device in which, in a case where the control unit recognizes movement of the sensing target on the basis of sensing information of the second sensor device, the control unit receives a request for searching for the first sensor device from the another control device.
  • control device in which the control unit transmits information regarding the terminal device to the another control device in a case where the terminal device moves from the first cell to the second cell.
  • control device in which the control unit transmits information regarding the another control device to the terminal device in a case where the terminal device moves from the first cell to the second cell.
  • control device in which the control unit receives position information of the terminal device from the terminal device and transmits sensing information to the terminal device.
  • control device in which the control unit receives position information or the sensing information of the sensor device from the sensor device.
  • control device in which the control unit transmits or receives position information of the terminal device, or position information or the sensing information of the sensor device to or from the another control device.
  • the control device according to any one of (2) to (15), in which the terminal device is a device of a runner, and the sensing target is another runner preceding the runner.
  • control device in which the terminal device is a device of a guardian or a caregiver, and the sensing target is a person to be watched by the guardian or the caregiver.
  • control device in which the terminal device is a device of a store or a police, and the sensing target is a customer who has left the store.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Telephonic Communication Services (AREA)
  • Mobile Radio Communication Systems (AREA)
  • Information Transfer Between Computers (AREA)
US16/477,740 2017-02-21 2017-11-30 Control device and method for processing a service using a plurality of sensor devices in a distributed manner while suppressing a communication load Active US10952041B2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2017-030039 2017-02-21
JPJP2017-030039 2017-02-21
JP2017030039A JP2018137575A (ja) 2017-02-21 2017-02-21 制御装置及び方法
PCT/JP2017/042941 WO2018154901A1 (ja) 2017-02-21 2017-11-30 制御装置及び方法

Publications (2)

Publication Number Publication Date
US20190364399A1 US20190364399A1 (en) 2019-11-28
US10952041B2 true US10952041B2 (en) 2021-03-16

Family

ID=63253226

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/477,740 Active US10952041B2 (en) 2017-02-21 2017-11-30 Control device and method for processing a service using a plurality of sensor devices in a distributed manner while suppressing a communication load

Country Status (6)

Country Link
US (1) US10952041B2 (zh)
EP (1) EP3588998A4 (zh)
JP (1) JP2018137575A (zh)
KR (1) KR102391111B1 (zh)
CN (1) CN110313190B (zh)
WO (1) WO2018154901A1 (zh)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11681040B2 (en) * 2018-08-21 2023-06-20 Siren Marine, Inc. Marine machine type communication device
US20220095105A1 (en) * 2019-02-22 2022-03-24 Mitsubishi Electric Corporation Mobile body communication system
JP2022063883A (ja) 2019-03-01 2022-04-25 ソニーグループ株式会社 情報処理装置、情報処理方法、およびプログラム
EP3879796B1 (en) * 2020-03-13 2024-02-21 Apple Inc. Selection of edge application server
US20210396524A1 (en) * 2020-06-17 2021-12-23 Astra Navigation, Inc. Generating a Geomagnetic Map
WO2022026522A1 (en) * 2020-07-28 2022-02-03 Intel Corporation Ai-based cellular network management and orchestration
US11615039B2 (en) 2020-07-31 2023-03-28 Siren Marine, Inc. Data transmission system
EP4047963B1 (en) * 2021-02-22 2024-04-10 Nokia Technologies Oy Managing network sensing capabilities in a wireless network
WO2023058125A1 (ja) * 2021-10-05 2023-04-13 株式会社Nttドコモ ネットワークノード及び通信方法
CN116847395A (zh) * 2022-03-23 2023-10-03 中国移动通信有限公司研究院 一种支持感知的通信方法及装置、通信设备

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005165952A (ja) 2003-12-05 2005-06-23 Nippon Telegr & Teleph Corp <Ntt> パーソナルコンテンツデリバリ方法、エッジノード及びパーソナルコンテンツデリバリシステム装置並びに処理プログラム
JP2008085832A (ja) 2006-09-28 2008-04-10 Sony Corp 監視カメラ、監視カメラの制御方法および監視カメラシステム
US20090128298A1 (en) * 2007-11-15 2009-05-21 Ryu Jehyok Method and system for locating sensor node in sensor network using transmit power control
JP2014522164A (ja) 2011-07-29 2014-08-28 インテレクチュアル ベンチャーズ ホールディング 81 エルエルシー 通信端末及び方法
US20140365517A1 (en) * 2013-06-06 2014-12-11 International Business Machines Corporation QA Based on Context Aware, Real-Time Information from Mobile Devices
US20160366008A1 (en) * 2014-02-28 2016-12-15 Huawei Technologies Co., Ltd. Data retransmission method and apparatus
US20170086011A1 (en) * 2015-09-22 2017-03-23 Veniam, Inc. Systems and methods for shipping management in a network of moving things
US20180284735A1 (en) * 2016-05-09 2018-10-04 StrongForce IoT Portfolio 2016, LLC Methods and systems for industrial internet of things data collection in a network sensitive upstream oil and gas environment
US20180297732A1 (en) * 2017-04-14 2018-10-18 Omron Corporation Packaging machine, control apparatus for packaging machine, control method, and program
US20200348662A1 (en) * 2016-05-09 2020-11-05 Strong Force Iot Portfolio 2016, Llc Platform for facilitating development of intelligence in an industrial internet of things system

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060143439A1 (en) * 2004-12-06 2006-06-29 Xpaseo Method and system for sensor data management
US8923202B2 (en) * 2012-07-23 2014-12-30 Adidas Ag Communication network for an athletic activity monitoring system
US9467274B2 (en) * 2013-07-25 2016-10-11 Verizon Patent And Licensing Inc. Processing communications via a sensor network

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005165952A (ja) 2003-12-05 2005-06-23 Nippon Telegr & Teleph Corp <Ntt> パーソナルコンテンツデリバリ方法、エッジノード及びパーソナルコンテンツデリバリシステム装置並びに処理プログラム
JP2008085832A (ja) 2006-09-28 2008-04-10 Sony Corp 監視カメラ、監視カメラの制御方法および監視カメラシステム
US20090128298A1 (en) * 2007-11-15 2009-05-21 Ryu Jehyok Method and system for locating sensor node in sensor network using transmit power control
JP2014522164A (ja) 2011-07-29 2014-08-28 インテレクチュアル ベンチャーズ ホールディング 81 エルエルシー 通信端末及び方法
US20140365517A1 (en) * 2013-06-06 2014-12-11 International Business Machines Corporation QA Based on Context Aware, Real-Time Information from Mobile Devices
JP2016528583A (ja) 2013-06-06 2016-09-15 インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation 質問応答のための方法、システム、およびコンピュータ読取可能な記憶媒体
US20160366008A1 (en) * 2014-02-28 2016-12-15 Huawei Technologies Co., Ltd. Data retransmission method and apparatus
US20170086011A1 (en) * 2015-09-22 2017-03-23 Veniam, Inc. Systems and methods for shipping management in a network of moving things
US20180284735A1 (en) * 2016-05-09 2018-10-04 StrongForce IoT Portfolio 2016, LLC Methods and systems for industrial internet of things data collection in a network sensitive upstream oil and gas environment
US20200348662A1 (en) * 2016-05-09 2020-11-05 Strong Force Iot Portfolio 2016, Llc Platform for facilitating development of intelligence in an industrial internet of things system
US20180297732A1 (en) * 2017-04-14 2018-10-18 Omron Corporation Packaging machine, control apparatus for packaging machine, control method, and program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Frantti et al., Requirements of Secure WSN-MCN Edge Router, The International Conference on Information Networking 2013, Jan. 28-30, 2013, pp. 210-215, IEEE, Bangkok, Thailand.

Also Published As

Publication number Publication date
CN110313190A (zh) 2019-10-08
JP2018137575A (ja) 2018-08-30
EP3588998A1 (en) 2020-01-01
WO2018154901A1 (ja) 2018-08-30
CN110313190B (zh) 2022-06-03
KR20190117477A (ko) 2019-10-16
EP3588998A4 (en) 2020-02-26
US20190364399A1 (en) 2019-11-28
KR102391111B1 (ko) 2022-04-26

Similar Documents

Publication Publication Date Title
US10952041B2 (en) Control device and method for processing a service using a plurality of sensor devices in a distributed manner while suppressing a communication load
EP3391673B1 (en) Systems and methods for emergency data communication
EP3391674B1 (en) Systems and methods for emergency data communication
US11593951B2 (en) Multi-device object tracking and localization
AU2018422609B2 (en) System, device, and method for an electronic digital assistant recognizing and responding to an audio inquiry by gathering information distributed amongst users in real-time and providing a calculated result
WO2013182101A1 (zh) 目标人的确定方法、装置及移动终端
KR101973934B1 (ko) 증강현실 서비스 제공 방법, 이를 이용하는 사용자 단말 장치 및 액세스 포인트
US9706380B1 (en) Providing emergency notification and tracking data from a mobile device
JP6245254B2 (ja) 位置推定装置、位置推定方法、対象端末、通信方法、通信端末、記録媒体および位置推定システム
KR20130134585A (ko) 휴대 단말의 센싱 정보 공유 장치 및 방법
JP6016383B2 (ja) 通信装置、通信装置の制御方法、プログラム
WO2016088611A1 (ja) 情報処理装置、情報処理方法及びコンピュータプログラム
CN110047246A (zh) 告警方法、设备及计算机可读存储介质
CN113449273A (zh) 解锁方法、移动终端及存储介质
KR20180035052A (ko) IoT 식별값을 이용한 보안 시스템 및 방법
CN115174740B (zh) 推荐方法、电子设备及存储介质
KR101422216B1 (ko) 위험 감지 알림 방법
KR101983252B1 (ko) 주변기기 정보 수집 방법 및 시스템
JP2017531878A (ja) モバイル機器とそのユーザ識別の同時決定
JP2016103231A (ja) 追跡システム及び追跡プログラム
KR20180035056A (ko) 무선 디바이스의 웨이킹 업을 이용한 cctv 보안 시스템 및 방법
KR20180035046A (ko) IoT 식별값을 이용한 보안 시스템 및 방법

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:FURUICHI, SHO;REEL/FRAME:049826/0490

Effective date: 20190708

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STCF Information on status: patent grant

Free format text: PATENTED CASE