WO2016170548A2 - System and method for remote asset management - Google Patents

System and method for remote asset management

Info

Publication number
WO2016170548A2
WO2016170548A2 PCT/IN2016/050117
Authority
WO
WIPO (PCT)
Prior art keywords
unit
remote
cameras
sensors
processing unit
Prior art date
Application number
PCT/IN2016/050117
Other languages
French (fr)
Other versions
WO2016170548A3 (en)
Inventor
Ramasubramaniam BARATHY
Original Assignee
Barathy Ramasubramaniam
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Barathy Ramasubramaniam filed Critical Barathy Ramasubramaniam
Publication of WO2016170548A2 publication Critical patent/WO2016170548A2/en
Publication of WO2016170548A3 publication Critical patent/WO2016170548A3/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04QSELECTING
    • H04Q9/00Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom

Definitions

  • the embodiments herein generally relate to a system and method for remote asset management, and more particularly to a system and method for managing valuable assets and a remote master based on cameras, assisted sensors, and distributed machine learning.
  • One typical approach that addresses the above problem is a remote monitoring system that includes cameras at a remote unit for the purpose of observing an area or any valuable assets in the remote unit, and a remote master that receives data from the cameras.
  • the different components of the system are unable to coordinate with each other or learn from each other.
  • the system often has human operators attending to it to deal with any unusual situations or malfunctions; each operator knows only what is happening with the subset of components that he is in charge of, and does not benefit from the data in other components.
  • This lack of communication and shared newly learned features between components of the system creates inefficiency and redundancy that result in errors.
  • there is no continuous monitoring and feedback loop, and the configuration and operation are not continuously improved according to resource constraints. Accordingly, there remains a need for a remote asset management system for managing assets remotely with improved communication among components of the system.
  • an embodiment herein provides a remote asset management system for managing valuable assets remotely and for improving communication among components in the remote asset management system.
  • the remote asset management system includes one or more cameras, one or more sensors, a processing unit, a supporting unit, and a remote master.
  • the one or more cameras are coupled with a remote unit to monitor the remote unit.
  • the remote unit is one or more assets which are to be monitored.
  • the one or more sensors are coupled with the remote unit to sense abnormalities or changes in the remote unit.
  • the processing unit interacts with and receives (i) camera information and data captured by the one or more cameras, and (ii) sensor information and data captured by the one or more sensors.
  • the processing unit includes a machine learning module that (a) processes (i) the camera information, and data captured by the one or more cameras, and (ii) the sensor information and data captured by the one or more sensors, and (b) performs video analytics.
  • the supporting unit stores one or more rules for operating the one or more cameras and the one or more sensors remotely.
  • the remote master that interacts with (i) the supporting unit to obtain the one or more rules for operating the one or more cameras, and the one or more sensors, and (ii) the processing unit to obtain information associated with the one or more cameras, and the one or more sensors.
  • the remote master (i) processes the information received from the supporting unit, and the processing unit, and (ii) provides feedback to the one or more cameras and the one or more sensors for monitoring the remote unit.
  • the processing unit controls functionalities of (a) the one or more cameras, and (b) the one or more sensors, assisted by the remote master.
  • the remote master gives an input to the processing unit that is carried out with (a) image processing techniques, (b) machine learning algorithms, (c) stored results, and/or (d) self-healing inputs.
  • the image processing techniques, and the machine learning algorithms are applied across the processing unit, the one or more cameras, and the one or more sensors to obtain an analytics output.
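The feedback loop described above, in which the processing unit aggregates camera and sensor data and the remote master returns feedback, can be sketched roughly as follows. All class names, method names, and the toy rule are illustrative assumptions, not taken from the specification:

```python
class ProcessingUnit:
    """Aggregates camera and sensor data and prepares analytics input."""

    def __init__(self):
        self.camera_data = []
        self.sensor_data = []

    def ingest(self, camera_frame, sensor_reading):
        self.camera_data.append(camera_frame)
        self.sensor_data.append(sensor_reading)

    def analytics(self):
        # Stand-in for the video analytics / machine learning output.
        return {"frames": len(self.camera_data),
                "readings": len(self.sensor_data)}


class RemoteMaster:
    """Processes analytics and returns feedback for cameras and sensors."""

    def feedback(self, analytics):
        # Toy rule: raise the frame rate when sensor activity outpaces frames.
        high_activity = analytics["readings"] > analytics["frames"]
        return {"camera": "raise_frame_rate" if high_activity else "hold",
                "sensor": "keep_sampling"}


unit = ProcessingUnit()
unit.ingest("frame-1", "reading-1")
unit.ingest("frame-2", "reading-2")
master = RemoteMaster()
print(master.feedback(unit.analytics()))
```

The real system would distribute this loop across the cameras, sensors, and processing unit rather than running it in one process.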
  • each of the one or more cameras includes one or more command receivers, a processor communication unit, a sensor communication unit, a remote communication unit, a command converter, an internal rule logic, a communication optimizer, a power optimizer, a camera neck, a camera mounter, a physical cushioner, and an alarm raiser.
  • the one or more command receivers that receive commands from (i) the remote master, (ii) the processing unit, (iii) a local unit, and (iv) the one or more sensors.
  • the processor communication unit that communicates output or data to the processing unit.
  • the sensor communication unit that communicates outputs or data to the one or more sensors.
  • the remote communication unit that communicates outputs or data to the remote master and the remote unit.
  • the command converter that converts commands from (i) the remote master, (ii) the processing unit, (iii) the one or more sensors, and/or (iv) the local unit.
  • the internal rule logic that verifies the commands that are converted by the command converter, in order to reject any wrong commands that come in.
  • the communication optimizer that optimizes communication options for communicating with (i) the remote master, (ii) the remote unit, (iii) the one or more sensors, and/or (iv) the processing unit.
  • the power optimizer that optimizes (i) a battery unit, and (ii) an electric power unit.
  • the camera neck adapted to tilt and adjust a view of the one or more cameras.
  • the camera mounter adapted to place the one or more cameras flexibly on different surfaces.
  • the physical cushioner that protects the one or more cameras when they fall unexpectedly while being mounted.
  • the alarm raiser adapted to raise an alarm when people try to tamper with or harm the remote unit/asset while the remote unit/asset is in operation.
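The camera's command path, where the command converter normalizes commands from any source and the internal rule logic rejects wrong commands, could look like the following sketch. The command vocabulary and function names are assumptions made for illustration:

```python
# Commands the internal rule logic is willing to accept (illustrative set).
ALLOWED_COMMANDS = {"tilt", "pan", "set_resolution", "raise_alarm"}


def convert_command(raw):
    """Command converter: normalize a raw command string into (name, argument)."""
    parts = raw.strip().lower().split("=", 1)
    name = parts[0]
    arg = parts[1] if len(parts) > 1 else None
    return name, arg


def verify_command(name, arg):
    """Internal rule logic: accept only known commands with sane arguments."""
    if name not in ALLOWED_COMMANDS:
        return False
    if name == "set_resolution" and arg not in {"low", "high"}:
        return False
    return True


for raw in ("tilt=30", "set_resolution=high", "format_disk"):
    name, arg = convert_command(raw)
    print(raw, "accepted" if verify_command(name, arg) else "rejected")
```

Separating conversion from verification mirrors the split between the command converter and the internal rule logic in the camera.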
  • each of the one or more sensors includes a configurator, a configuration verifier, a configuration optimizer, a camera optimizer, a processing unit optimizer, a machine learning optimizer, a situation optimizer, an assisting sensor optimizer, a remote master optimizer, a remote close unit optimizer, and a remote far unit optimizer.
  • the configurator that configures the one or more rules for operating the one or more sensors.
  • the configuration verifier that verifies the one or more rules for operating the one or more sensors.
  • the configuration optimizer that optimizes the configurator and updates the one or more rules dynamically.
  • the camera optimizer that controls and optimizes functionalities of the one or more cameras based on sensor information captured by the one or more sensors.
  • the processing unit optimizer that provides captured sensor information to the processing unit, and optimizes the processing unit.
  • the machine learning optimizer that optimizes the machine learning of the one or more cameras and the processing unit.
  • the situation optimizer and a context optimizer that select the sensor type and optimize it for capturing sensor information based on the situation and context of the remote unit.
  • the assisting sensor optimizer that optimizes functionalities of the one or more assisting sensors.
  • the remote master optimizer that optimizes functionalities of the remote master.
  • the remote close unit optimizer that captures data from the remote unit that is located close to the one or more sensors.
  • the remote far unit optimizer that receives data from the remote unit that is located far from the remote master.
  • the processing unit includes a remote master input/output (I/O) unit, a remote unit I/O unit, a support unit I/O unit, a camera unit real time I/O unit, a camera unit batch I/O unit, a camera unit command I/O unit, a sensor real time I/O unit, a sensor batch I/O unit, a video decompressor, an I/O optimizer and add I/O unit, an admin console, a VCA video engine, a VCA stream engine and a VCA image engine, a sensors data analytics engine, a reporting unit, an audit trail unit, an archiving unit, and a commander unit.
  • the remote master input/output (I/O) unit that interacts with the remote master, and provides sensor and camera information to the remote master, and in turn receives instructions from the remote master.
  • the remote unit I/O unit that receives input from the remote unit, and provides output to the remote unit.
  • the support unit I/O unit that interacts with the supporting unit for operating the one or more cameras and the one or more sensors.
  • the camera unit real time I/O unit that provides input to the one or more cameras and receives output from the one or more cameras in real time.
  • the camera unit batch I/O unit that provides input to the one or more cameras and receives output from the one or more cameras in a batch.
  • the camera unit command I/O unit that provides commands for operating the one or more cameras, and receives information from the one or more cameras.
  • the sensor real time I/O unit that provides input to the one or more sensors and receives output from the one or more sensors in real time.
  • the sensor batch I/O unit that provides input to the one or more sensors and receives output from the one or more sensors in a batch.
  • the video decompressor that decompresses videos captured by the one or more cameras.
  • the I/O optimizer and add I/O unit that optimizes interaction between the processing unit and other components of the remote asset management system.
  • the I/O optimizer and add I/O unit adds new input and output to the processing unit when any update is received from the one or more sensors and the one or more cameras.
  • the admin console that provides user interfaces to monitor, track, and control functionalities of the processing unit.
  • the VCA video engine that receives video dataset from the one or more cameras.
  • the VCA video engine processes the video and provides information to the remote master to control the one or more cameras.
  • the VCA stream engine and a VCA image engine that receive images and photos from the one or more cameras and/or the one or more sensors.
  • the VCA stream engine and the VCA image engine process the images and photos and provide information to the remote master to control the one or more cameras, and/or the one or more sensors.
  • the sensors data analytics engine that (i) receives data captured by the one or more sensors, (ii) processes the received data, and (iii) analyzes the data for meaningful information.
  • the reporting unit that generates a report that includes data associated with the one or more cameras and the one or more sensors based on a request from a user.
  • the audit trail unit that provides documentary evidence of the sequence of activities that have affected at any time a specific operation of (i) the one or more cameras, (ii) the one or more sensors, (iii) a procedure, or (iv) an event.
  • the audit trail unit includes (a) a set of records, (b) a destination, and (c) a source of records.
  • the archiving unit that archives data associated with the one or more cameras, the one or more sensors, and/or the remote master.
  • the commander unit that provides commands for operating the one or more cameras and/or the one or more sensors to monitor the remote unit.
  • the rules are dynamically updated and are configured remotely by the remote master through the processing unit.
  • the remote master processes a request from a user about the remote unit or to remotely mount the one or more cameras in the right place or on the right surface.
  • the remote asset management system includes a database that stores (i) the camera information that includes videos and photos captured by the one or more cameras, (ii) the sensor information captured by the one or more sensors, (iii) commands, and (iv) operations data for operating the one or more cameras, the one or more sensors, and/or the remote master.
  • the commands control at least one of: (a) one or more software units that include (i) a software machine learning module, (ii) an image and video processor, and/or (iii) a sensor input processor; and (b) one or more hardware units that include (i) actuators, (ii) a lens, (iii) an image sensor, (iv) a hardware machine learning unit, and/or (v) a camera body.
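Routing a verified command to the software or hardware unit it controls can be sketched as a small dispatch table. The unit names follow the lists above; the command names and handler behavior are illustrative assumptions:

```python
# Hypothetical mapping from command name to the controlled unit.
HANDLERS = {
    "retrain": "software_machine_learning_module",
    "denoise": "image_and_video_processor",
    "filter_input": "sensor_input_processor",
    "focus": "lens",
    "capture": "image_sensor",
    "pan": "actuators",
}


def dispatch(command):
    """Route a command to the controlling unit, or report it as unknown."""
    unit = HANDLERS.get(command)
    return f"{command} -> {unit}" if unit else f"{command} -> unknown"


print(dispatch("focus"))   # routed to the lens
print(dispatch("reboot"))  # not in the table, reported as unknown
```

A table-driven dispatch like this keeps the mapping between commands and units in one place, which makes it easy to update remotely.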
  • a remote asset management system for managing valuable assets remotely and for improving communication among components in the remote asset management system.
  • the remote asset management system includes a database, one or more cameras, one or more sensors, a processing unit, a supporting unit, and a remote master.
  • the database that stores (i) camera information that includes videos and photos captured by one or more cameras, (ii) sensor information captured by one or more sensors, (iii) commands, and (iv) operations data for operating the one or more cameras and the one or more sensors.
  • the one or more cameras are coupled with a remote unit to monitor the remote unit.
  • the remote unit is one or more assets which are to be monitored.
  • the one or more sensors are coupled with the remote unit to sense abnormalities or changes in the remote unit.
  • the one or more sensors includes a software machine learning module and a hardware machine learning unit that optimizes machine learning of the one or more cameras and a processing unit.
  • the processing unit interacts with and receives (i) camera information and data captured by the one or more cameras, and (ii) sensor information and data captured by the one or more sensors.
  • the processing unit includes a machine learning module that (a) processes (i) the camera information and data captured by the one or more cameras, and (ii) the sensor information and data captured by the one or more sensors, and (b) performs video analytics.
  • the supporting unit adapted to store one or more rules for operating the one or more cameras and the one or more sensors remotely. The rules are dynamically updated and are configured remotely by the remote master through the processing unit.
  • the remote master that interacts with (i) the supporting unit to obtain the one or more rules for operating the one or more cameras, and the one or more sensors, and (ii) the processing unit to obtain information associated with the one or more cameras, and the one or more sensors.
  • the remote master (i) processes the information received from the supporting unit, and the processing unit, and (ii) provides feedback to the one or more cameras and the one or more sensors for monitoring the remote unit.
  • the processing unit controls functionalities and operations of (a) the one or more cameras, and (b) the one or more sensors, assisted by the remote master.
  • the remote master gives an input to the processing unit that is carried out with (a) image processing techniques, (b) machine learning algorithms, (c) stored results, and/or (d) self-healing inputs.
  • the image processing techniques, and the machine learning algorithms are applied across the processing unit, the one or more cameras, and the one or more sensors to obtain an analytics output.
  • the remote master processes a request from a user about the remote unit or to remotely mount the one or more cameras in the right place or on the right surface.
  • a method for managing valuable assets remotely and for improving communication among components in a remote asset management system includes the following steps: (i) monitoring a remote unit by employing one or more cameras around the remote unit, (ii) sensing abnormalities or changes in the remote unit by employing one or more sensors around the remote unit, (iii) receiving, using a processing unit, (a) camera information and data captured by the one or more cameras, and (b) sensor information and data captured by the one or more sensors, (iv) storing, using a supporting unit, one or more rules for operating the one or more cameras and the one or more sensors remotely, (v) interacting, using a remote master, with (a) the supporting unit to obtain the one or more rules for operating the one or more cameras, and the one or more sensors, and (b) the processing unit to obtain information associated with the one or more cameras, and the one or more sensors, and (vi) processing, using the remote master, the information received from the supporting unit, and the processing unit, and providing feedback to the one or more cameras and the one or more sensors for monitoring the remote unit.
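The method steps above can be sketched as one pass of a monitoring loop. Every function here is an illustrative stand-in, not part of the claimed method:

```python
def monitor_once(camera_read, sensor_read, rules, process, give_feedback):
    """One iteration: capture, process, consult rules, and feed back."""
    camera_data = camera_read()                    # step (i): monitor via cameras
    sensor_data = sensor_read()                    # step (ii): sense changes
    analytics = process(camera_data, sensor_data)  # step (iii): processing unit
    feedback = give_feedback(analytics, rules)     # steps (iv)-(vi): rules + master
    return feedback


# Toy run with stand-in capture and rule functions.
fb = monitor_once(
    camera_read=lambda: ["frame"],
    sensor_read=lambda: [21.5],
    rules={"max_temp": 60},
    process=lambda c, s: {"frames": len(c), "temp": max(s)},
    give_feedback=lambda a, r: "alert" if a["temp"] > r["max_temp"] else "ok",
)
print(fb)  # -> ok
```

In the described system each of these callables would live on a different component (cameras, sensors, processing unit, remote master) rather than in one process.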
  • the remote unit is one or more assets which are to be monitored.
  • the method further includes the step of controlling functionalities and operations of (a) the one or more cameras, and (b) the one or more sensors, assisted by the remote master.
  • the method further includes the step of processing a request from a user to remotely mount the one or more cameras in the right place or on the right surface.
  • the processing unit includes a machine learning module that (a) processes (i) the camera information and data captured by the one or more cameras, and (ii) the sensor information and data captured by the one or more sensors, and (b) performs video analytics.
  • FIG. 1 is a system view illustrating a remote asset management system for managing valuable assets remotely and for improving communication among components of the system in accordance with an embodiment herein;
  • FIG. 2 is an exploded view of a camera in a remote unit of the system of FIG. 1 in accordance with an embodiment herein.
  • FIG. 3 is an exploded view of a sensor of one or more sensors of the remote asset management system of FIG. 1 according to an embodiment herein.
  • FIG. 4 is an exploded view of the processing unit of the remote management system of FIG. 1 according to an embodiment herein.
  • FIG. 5 illustrates a schematic diagram of a computer architecture according to an embodiment herein.
  • Referring now to the drawings, and more particularly to FIGS. 1 through 5, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
  • FIG. 1 is a system view illustrating a remote asset management system 100 for managing valuable assets remotely and for improving communication among components of the system 100 in accordance with an embodiment herein.
  • the remote asset management system 100 includes a remote master 102, a supporting unit 104, a remote unit 106 to be monitored, a remote unit 108 that is located far from the remote master 102, one or more cameras 110, one or more sensors 112, a processing unit 114, and other external inputs 116.
  • the remote master 102 is, but not limited to, a computer, a desktop, a laptop, or a mobile phone.
  • the remote master 102 interacts with the supporting unit 104, and obtains one or more rules for operating the one or more cameras 110 and the one or more sensors 112.
  • the remote master 102 further interacts with the processing unit 114 to obtain information associated with the one or more cameras 110 and the one or more sensors 112.
  • the remote master 102 processes the information, self-learns, and provides feedback to the one or more cameras 110 and the one or more sensors 112 for better operation.
  • the remote master 102 processes a request from a user about the remote unit 106 or to remotely mount the one or more cameras 110 in the right place or on the right surface.
  • the remote master 102 forms a wireless network with the one or more cameras 110.
  • the supporting unit 104 stores rules (e.g., configuration data) for operating the one or more cameras 110 and the one or more sensors 112 remotely.
  • the rules are dynamically updated and are configured remotely by the remote master 102 through the processing unit 114.
  • the remote unit 106 is a person or a physical location in which assets are to be monitored.
  • the remote unit 106 is, but not restricted to, a hospital having medical devices to be monitored, a mining site, a manufacturing unit having equipment, a retail shop having sale products, a kids' care center, a communication tower, tele-operations, floors, etc.
  • the one or more cameras 110 are located inside or in close proximity to the remote unit 106.
  • the cameras 110 are mountable. A position and a direction of each of the cameras 110 can be changed remotely.
  • the cameras 110 are adapted to be moved easily from one place to another and fixed in the remote unit 106.
  • the cameras 110 are non-mountable.
  • the cameras are portable.
  • the cameras 110 have both a software machine learning module and a hardware machine learning unit for improving their functionality.
  • the one or more sensors 112 are located inside or in close proximity to the remote unit 106.
  • the type of sensors that are used in the remote unit 106 varies depending on the purpose or context. For example, when the remote unit 106 is a manufacturing unit having equipment, a temperature sensor may be used to track temperatures of the equipment. When the remote unit 106 is a retail shop having products, a motion sensor may be used to monitor people around the remote unit 106.
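The context-driven sensor selection in the example above can be captured as a simple lookup. The manufacturing and retail mappings follow the text; the function name and the fallback value are assumptions:

```python
# Context-to-sensor mapping taken from the examples in the description.
SENSOR_BY_CONTEXT = {
    "manufacturing_unit": "temperature_sensor",  # track equipment temperatures
    "retail_shop": "motion_sensor",              # monitor people around the unit
}


def select_sensor(context):
    """Pick a sensor type for a remote unit's context, with an assumed default."""
    return SENSOR_BY_CONTEXT.get(context, "camera_only")


print(select_sensor("manufacturing_unit"))  # -> temperature_sensor
print(select_sensor("retail_shop"))         # -> motion_sensor
```

In the described system this selection is performed by the situation optimizer and context optimizer rather than a static table.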
  • the sensors have both a software machine learning module and a hardware machine learning unit for improving their functionality. Data captured by the one or more cameras 110 and the one or more sensors 112 are provided to the processing unit 114 for further processing.
  • the processing unit 114 interacts and receives camera information (e.g., available power, video resolution, camera angle or direction, computation power, communication power, frame rate, etc.), data captured (e.g., multimedia content such as video and audio) by the one or more camera units 110, sensor information, and data captured by the one or more sensors 112.
  • the processing unit 114 processes the camera information, data captured by the one or more cameras 110 and the one or more sensors 112, and sensor information, and performs video analytics assisted by a machine learning module. Further, the processing unit 114 automatically performs calculations with processing techniques such as, but not limited to, image processing and machine learning, and prepares inputs for the remote master 102.
  • the inputs are provided to the remote master 102 through a wired or a wireless network.
  • With the inputs from the processing unit 114, the remote master 102 improves itself, self-configures, self-learns, and provides input to the one or more cameras 110 and the one or more sensors 112.
  • the processing unit 114 controls the functionalities and operations of the one or more cameras 110 and the one or more sensors 112, and the associated distributed machine learning.
  • the processing unit 114 is located in close proximity to the remote master 102. In another embodiment, the processing unit 114 is located in close proximity to the remote unit 106. In yet another embodiment, the processing unit 114 is located centrally in between the remote master 102 and/or the remote unit 106. The processing unit 114 receives other external inputs 116 such as past information.
  • Instructions that are given by the remote master 102 to the processing unit 114 are carried out with image processing techniques, machine learning algorithms, previous self-healing inputs, and other inputs that are given to the processing unit 114.
  • Machine learning and image processing are split and distributed across the processing unit 114, the one or more cameras 110, and the one or more sensors 112, and all these work together to give meaningful analytics (e.g., video analytics and sensor analytics combined with distributed machine learning) to the remote master 102.
  • Operations (e.g., capturing videos at a higher or a lower resolution, passing frame rates, etc.) of the one or more cameras 110 are controlled based on the one or more sensors 112, distributed machine learning, the processing unit 114, and/or the remote master 102.
  • FIG. 2 is an exploded view of a camera in the remote unit 106 of the remote asset management system 100 of FIG. 1 in accordance with an embodiment herein.
  • Each of the cameras 110 includes a command receiver 202 for receiving commands from the remote master 102, a command receiver 204 for receiving commands from the processing unit 114, a command receiver 206 for receiving commands from the one or more sensors 112, a command receiver 208 for receiving commands from a local unit, a processor communication unit 210 for communicating outputs or data to the processing unit 114, a sensor communication unit 212 for communicating outputs or data to the one or more sensors 112, and a remote communication unit 214 for communicating outputs or data to the remote master 102 and the remote unit 106.
  • Each of the cameras 110 further includes a command converter 216 for converting commands from the remote master 102, the processing unit 114, the one or more sensors 112, and/or the local unit.
  • the commands control a software machine learning module 218, an image and video processor 220, and a sensor input processor 222.
  • the commands also control hardware units including actuators 224, a lens 226, an image sensor 228, a hardware machine learning unit 230, and a camera body.
  • Each of the cameras 110 further includes an internal rule logic 232, a communication optimizer 234, a power optimizer 242, a camera neck 250, a camera head 252, a camera mounter 254, a physical cushioner 256, and an alarm raiser 258.
  • the internal rule logic 232 verifies commands that are converted by the command converter 216 to avoid any wrong commands that come in.
  • the communication optimizer 234 optimizes the communication options (3G 236, Wi-Fi 238, or fixed 240) for communicating with the remote master 102, the remote unit 106, the one or more sensors 112, and/or the processing unit 114.
  • the power optimizer 242 optimizes a battery unit 246 and an electric power unit 248.
  • the camera neck 250 helps in tilting and adjusting the view of the cameras 110.
  • the camera head 252 is movable.
  • the camera mounter 254 helps to place the cameras 110 flexibly on different surfaces.
  • the physical cushioner 256 protects the cameras 110 when they fall unexpectedly while being mounted, and the alarm raiser 258 raises an alarm if people try to tamper with or physically harm the whole unit in operation.
  • FIG. 3 is an exploded view of a sensor of the one or more sensors 112 of the remote asset management system 100 of FIG. 1 according to an embodiment herein.
  • the one or more sensors 112 includes a templating engine 302, a rule engine 304, a real-time trigger engine 306, a configurator 308, a configuration verifier 310, a configuration optimizer 312, a camera optimizer 314, a processing unit optimizer 316, a machine learning optimizer 318, a situation optimizer 320, a context optimizer 322, an assisting sensor optimizer 324, a remote master optimizer 326, a remote close unit optimizer 328, a remote far unit optimizer 330, a predeployer 332, an auditing engine 334, and a reconciliation engine 336.
  • the configurator 308 enables configuring one or more rules for operating the one or more sensors 112.
  • configuring the one or more rules via the configurator 308 is performed locally or remotely using the remote master 102 and/or the supporting unit 104.
  • the configuration verifier 310 verifies the one or more rules for operating the sensors 112, and the configuration optimizer 312 optimizes the configurator 308 and updates the one or more rules dynamically.
  • the camera optimizer 314 controls and optimizes functionalities of the one or more cameras 110 based on sensor information captured by the one or more sensors 112. For example, the one or more sensors 112 control the one or more cameras 110 to obtain high quality videos only upon reaching a certain temperature. Below such temperature, the one or more cameras 110 may be operated to obtain low quality videos.
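The temperature example above reduces to a single threshold rule. A minimal sketch follows; the threshold value and names are assumptions for illustration:

```python
TEMP_THRESHOLD_C = 60.0  # illustrative threshold, not from the specification


def video_quality(temperature_c):
    """Return the video quality the cameras should use for a sensor reading."""
    return "high" if temperature_c >= TEMP_THRESHOLD_C else "low"


print(video_quality(72.5))  # above threshold -> high
print(video_quality(25.0))  # below threshold -> low
```

In the described system this rule would live in the camera optimizer 314 and be updated dynamically by the configuration optimizer 312.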
  • the processing unit optimizer 316 provides sensor information that is captured to the processing unit 114, and optimizes the processing unit 114.
  • the one or more sensors 112 includes both a software machine learning module and a hardware machine learning unit 318 that optimizes the machine learning (hardware and/or software) of the one or more cameras 110 and the processing unit 114.
  • the situation optimizer 320 and the context optimizer 322 select the sensor type and optimize it for capturing sensor information based on the situation and the context.
  • One or more primary sensors of the sensors 112 optimize functionalities of one or more assisting sensors based on the assisting sensor optimizer 324.
  • the remote master optimizer 326 optimizes functionalities of the remote master 102.
  • the remote close unit optimizer 328 captures data from the remote unit 106 that is close to the sensors 112.
  • the remote far unit optimizer 330 receives data from the remote unit 108 that is located far from the remote master 102.
  • FIG. 4 is an exploded view of the processing unit 114 of the remote management system 100 of FIG. 1 according to an embodiment herein.
  • the processing unit 114 includes a remote master input/output (I/O) unit 402, a remote unit I/O unit 404, a support unit I/O unit 406, a camera unit real time I/O unit 408, a camera unit batch I/O unit 410, a camera unit command I/O unit 412, a sensor real time I/O unit 414, a sensor batch I/O unit 416, a video decompressor 418, an I/O optimizer and add I/O unit 420, an admin console 422, a VCA video engine 424, a VCA stream engine 426, a VCA image engine 428, a video dataset unit 430, a stat engine 432, a custom device video analysis unit 434, a sensors data analytics engine 436, a map reduce unit 438, an ETL staging, data transformer and mobile converter unit 440, a reporting unit 442, an audit trail unit 444, an archiving unit 446, a machine learning master unit 448, a machine learning software module 450, a machine learning hardware agent manager unit 452, a machine learning software agent manager 454, and a database 456.
  • the remote master I/O unit 402 interacts with the remote master 102, provides sensor and camera information to the remote master 102, and in turn receives instructions from the remote master 102.
  • the remote unit I/O unit 404 receives input from the remote unit 106, and provides output to the remote unit 106.
  • the supporting unit I/O unit 406 interacts with the supporting unit 104 for operating the one or more cameras 110 and the one or more sensors 112.
  • the camera unit real time I/O unit 408 provides input to the one or more cameras 110 and receives output from the one or more cameras 110 in real time.
  • the camera unit batch I/O unit 410 provides input to the one or more cameras 110 and receives output from the one or more cameras 110 in a batch.
  • the camera unit command I/O unit 412 provides commands for operating the one or more cameras 110, and receives information from the cameras 110.
  • the sensors real-time I/O unit 414 provides input to the one or more sensors 112 and receives output from the one or more sensors 112 in real time.
  • the sensors batch I/O unit 416 provides input to the one or more sensors 112 and receives output from the one or more sensors 112 in a batch.
  • the video decompressor 418 decompresses videos captured by the one or more cameras 110 and/or one or more sensors 112.
  • the I/O optimizer and add I/O unit 420 optimizes interaction between the processing unit 114 and other components of the system 100, and adds new input and output to the processing unit when any update is available.
  • the admin console 422 provides user interfaces for monitoring, tracking, and controlling functionalities of the processing unit 114.
  • the VCA video engine 424, the VCA stream engine 426, and the VCA image engine 428 receive multimedia (e.g., videos, photos, and images) from the one or more cameras 110 and/or the one or more sensors 112, process the multimedia, and provide information to the remote master 102 for controlling the cameras 110 and the sensors 112.
  • the VCA video engine 424 receives multimedia as a video dataset from the cameras 110 and/or the sensors 112.
  • the custom device video analysis unit 434 analyses videos received from the cameras 110 and the sensors 112.
  • the sensor data analytics engine 436 receives data captured by the sensors 112, processes it, and analyzes it for meaningful information.
  • the reporting unit 442 generates a report including data associated with the cameras 110 and the sensors 112 based on a request from a user.
  • the audit trail unit 444 maintains a set of records, together with the destination and source of each record, that provide documentary evidence of the sequence of activities that have affected, at any time, a specific operation, procedure, or event of the cameras 110 and the sensors 112.
  • the archiving unit 446 archives data associated with the cameras 110, the sensors 112, and/or the remote master 102.
  • the processing unit 114 includes the machine learning master unit 448, which is managed by the machine learning hardware agent manager unit 452. Similarly, the machine learning software module 450 is managed by the machine learning software agent manager 454.
  • the database 456 stores camera information (e.g., videos captured by the cameras 110), sensor information, commands and operations data for operating the cameras 110, the sensors 112, and/or the remote master 102.
  • the commander unit 458 provides commands for operating the cameras 110 and/or the sensors 112.
  • the remote asset management system 100 utilizes radio technology, cellular networks, and Wi-Fi to control components of the system 100 and to communicate data among them. Further, the system 100 improves the functionality of the remote unit 106, reduces cost at the remote unit 106, reduces the workload of the remote master 102, saves escalation time, takes minimal time to install the cameras 110, and effectively monitors the remote unit 106.
  • the techniques provided by the embodiments herein may be implemented on an integrated circuit chip (not shown).
  • the chip design is created in a graphical computer programming language, and stored in a computer storage medium (such as a disk, tape, physical hard drive, or virtual hard drive such as in a storage access network). If the designer does not fabricate chips or the photolithographic masks used to fabricate chips, the designer transmits the resulting design by physical means (e.g., by providing a copy of the storage medium storing the design) or electronically (e.g., through the Internet) to such entities, directly or indirectly.
  • the stored design is then converted into the appropriate format (e.g., GDSII) for the fabrication of photolithographic masks, which typically include multiple copies of the chip design in question that are to be formed on a wafer.
  • the photolithographic masks are utilized to define areas of the wafer (and/or the layers thereon) to be etched or otherwise processed.
  • the resulting integrated circuit chips can be distributed by the fabricator in raw wafer form (that is, as a single wafer that has multiple unpackaged chips), as a bare die, or in a packaged form.
  • the chip is mounted in a single chip package (such as a plastic carrier, with leads that are affixed to a motherboard or other higher level carrier) or in a multichip package (such as a ceramic carrier that has either or both surface interconnections or buried interconnections).
  • the chip is then integrated with other chips, discrete circuit elements, and/or other signal processing devices as part of either (a) an intermediate product, such as a motherboard, or (b) an end product.
  • the end product can be any product that includes integrated circuit chips, ranging from toys and other low- end applications to advanced computer products having a display, a keyboard or other input device, and a central processor.
  • the embodiments herein can take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment including both hardware and software elements.
  • the embodiments that are implemented in software include but are not limited to, firmware, resident software, microcode, etc.
  • the embodiments herein can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or any instruction execution system.
  • a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium.
  • Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk.
  • Current examples of optical disks include compact disk - read only memory (CD-ROM), compact disk - read/write (CD-R/W) and DVD.
  • a data processing system suitable for storing and/or executing program code will include at least one processor coupled directly or indirectly to memory elements through a system bus.
  • the memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.
  • I/O devices can be coupled to the system either directly or through intervening I/O controllers.
  • Network adapters may also be coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks. Modems, cable modem and Ethernet cards are just a few of the currently available types of network adapters.
  • A representative hardware environment for practicing the embodiments herein is depicted in FIG. 5.
  • the system comprises at least one processor or central processing unit (CPU) 10.
  • the CPUs 10 are interconnected via system bus 12 to various devices such as a random access memory (RAM) 14, read-only memory (ROM) 16, and an input/output (I/O) adapter 18.
  • the I/O adapter 18 can connect to peripheral devices, such as disk units 11 and tape drives 13, or other program storage devices that are readable by the system.
  • the system can read the inventive instructions on the program storage devices and follow these instructions to execute the methodology of the embodiments herein.
  • the system further includes a user interface adapter 19 that connects a keyboard 15, mouse 17, speaker 24, microphone 22, and/or other user interface devices such as a touch screen device (not shown) to the bus 12 to gather user input.
  • a communication adapter 20 connects the bus 12 to a data processing network 25, and a display adapter 21 connects the bus 12 to a display device 23 which may be embodied as an output device such as a monitor, printer, or transmitter, for example.

Abstract

A method for managing valuable assets remotely and for improving communication among components in a remote asset management system is provided. The method includes: (i) monitoring a remote unit, (ii) sensing abnormalities or changes in the remote unit, (iii) receiving (a) camera information and data captured by the cameras, and (b) sensor information and data captured by the sensors, (iv) storing rules for operating the cameras and the sensors remotely, (v) interacting with (a) the supporting unit to obtain the rules for operating the cameras and the sensors, and (b) the processing unit to obtain information associated with the cameras and the sensors, and (vi) processing the information received from the supporting unit and the processing unit, and providing feedback to the cameras and the sensors for monitoring the remote unit.

Description

SYSTEM AND METHOD FOR REMOTE ASSET MANAGEMENT
BACKGROUND
Technical Field
[0001] The embodiments herein generally relate to a system and method for remote asset management, and more particularly to a system and method for managing valuable assets and a remote master based on cameras, assisted sensors, and distributed machine learning.
Description of the Related Art
[0002] Managing and locating vital assets remotely is a complicated, expensive, and time-consuming endeavor. Typically, it is done by humans who either travel to remote units to train personnel manually or do so remotely through calls or conferencing. Many problems occur in this process, as human errors are possible, knowingly or unknowingly. The human operators at a remote site may also not be well trained. A remote master is not able to receive feedback from, and give feedback to, the remote unit regularly. These shortcomings may lead to large inefficiencies and asset breakdowns.
[0003] One typical approach that addresses the above problem is a remote monitoring system that includes cameras at a remote unit for observing an area or valuable assets in the remote unit, and a remote master that receives data from the cameras. Although a huge amount of data is collected by the remote monitoring system, the different components of the system are unable to coordinate with or learn from each other. Although such systems often have human operators attending to them to deal with unusual situations or malfunctions, each operator only knows what is happening with the subset of components that he or she is in charge of, and does not benefit from the data in other components. This lack of communication and of sharing newly learned features between components creates inefficiency and redundancy that result in errors. Further, existing systems lack a continuous monitoring and feedback loop, and their configuration and operation are not constantly improved according to resource constraints. Accordingly, there remains a need for a remote asset management system that manages assets remotely with improved communication among components of the system.
SUMMARY
[0004] In view of the foregoing, an embodiment herein provides a remote asset management system for managing valuable assets remotely and for improving communication among components in the remote asset management system. The remote asset management system includes one or more cameras, one or more sensors, a processing unit, a supporting unit, and a remote master. The one or more cameras are coupled with a remote unit to monitor the remote unit. The remote unit is one or more assets which are to be monitored. The one or more sensors are coupled with the remote unit to sense abnormalities or changes in the remote unit. The processing unit interacts with and receives (i) camera information and data captured by the one or more cameras, and (ii) sensor information and data captured by the one or more sensors. The processing unit includes a machine learning module that (a) processes (i) the camera information and data captured by the one or more cameras, and (ii) the sensor information and data captured by the one or more sensors, and (b) performs video analytics. The supporting unit stores one or more rules for operating the one or more cameras and the one or more sensors remotely. The remote master interacts with (i) the supporting unit to obtain the one or more rules for operating the one or more cameras and the one or more sensors, and (ii) the processing unit to obtain information associated with the one or more cameras and the one or more sensors. The remote master (i) processes the information received from the supporting unit and the processing unit, and (ii) provides feedback to the one or more cameras and the one or more sensors for monitoring the remote unit. The processing unit controls the functionalities of (a) the one or more cameras and (b) the one or more sensors, assisted by the remote master. 
The remote master gives an input to the processing unit that is carried out with (a) image processing techniques, (b) machine learning algorithms, (c) stored results, and/or (d) self-healing inputs. The image processing techniques and the machine learning algorithms are applied across the processing unit, the one or more cameras, and the one or more sensors to obtain an analytics output. [0005] In one embodiment, the one or more cameras include one or more command receivers, a processor communication unit, a sensor communication unit, a remote communication unit, a command converter, an internal rule logic, a communication optimizer, a power optimizer, a camera neck, a camera mounter, a physical cushioner, and an alarm raiser. The one or more command receivers receive commands from (i) the remote master, (ii) the processing unit, (iii) a local unit, and (iv) the one or more sensors. The processor communication unit communicates output or data to the processing unit. The sensor communication unit communicates outputs or data to the one or more sensors. The remote communication unit communicates outputs or data to the remote master and the remote unit. The command converter converts commands from (i) the remote master, (ii) the processing unit, (iii) the one or more sensors, and/or (iv) the local unit. The internal rule logic verifies the commands. The commands are converted by the command converter in order to avoid any wrong commands that come in. The communication optimizer optimizes communication options for communicating with (i) the remote master, (ii) the remote unit, (iii) the one or more sensors, and/or (iv) the processing unit. The power optimizer optimizes (i) a battery unit and (ii) an electric power unit. The camera neck is adapted to tilt and adjust a view of the one or more cameras. The camera mounter is adapted to place the one or more cameras flexibly on different surfaces. The physical cushioner protects the one or more cameras if they fall unexpectedly while being mounted. The alarm raiser is adapted to raise an alarm when people try to tamper with or harm the remote unit/assets while the remote unit/assets are in operation.
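The command flow described above, in which commands arrive from several sources, are normalized by the command converter, and are checked by the internal rule logic before being acted upon, can be sketched as follows. This is a minimal Python illustration; the source names, command names, and validation rules are assumptions for the example, not taken from the specification.

```python
class CameraCommandReceiver:
    """Illustrative sketch: commands from the remote master, processing
    unit, local unit, or sensors are converted to a canonical form and
    verified against internal rule logic before being accepted."""

    KNOWN_SOURCES = {"remote_master", "processing_unit", "local_unit", "sensor"}
    ALLOWED_COMMANDS = {"tilt", "pan", "zoom", "record", "stop"}

    def __init__(self):
        self.accepted = []   # commands that passed the internal rule logic
        self.rejected = []   # wrong commands filtered out

    def convert(self, raw):
        # Command converter: normalize case and whitespace.
        return raw.strip().lower()

    def receive(self, source, raw_command):
        command = self.convert(raw_command)
        # Internal rule logic: verify both the source and the command.
        if source in self.KNOWN_SOURCES and command in self.ALLOWED_COMMANDS:
            self.accepted.append((source, command))
            return True
        self.rejected.append((source, command))
        return False

camera = CameraCommandReceiver()
camera.receive("remote_master", "  TILT ")   # accepted after conversion
camera.receive("sensor", "self-destruct")    # wrong command, filtered out
```

The separation of conversion from verification mirrors the text: malformed but legitimate commands are repaired by the converter, while genuinely wrong commands are stopped by the rule logic.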
[0006] In another embodiment, the one or more sensors include a configurator, a configuration verifier, a configuration optimizer, a camera optimizer, a processing unit optimizer, a machine learning optimizer, a situation optimizer, an assisting sensor optimizer, a remote master optimizer, a remote close unit optimizer, and a remote far unit optimizer. The configurator configures the one or more rules for operating the one or more sensors. The configuration verifier verifies the one or more rules for operating the one or more sensors. The configuration optimizer optimizes the configurator and updates the one or more rules dynamically. The camera optimizer controls and optimizes functionalities of the one or more cameras based on sensor information captured by the one or more sensors. The processing unit optimizer provides the captured sensor information to the processing unit and optimizes the processing unit. The machine learning optimizer optimizes the machine learning of the one or more cameras and the processing unit. The situation optimizer and a context optimizer select the sensor type and optimize it for capturing sensor information based on the situation and context of the remote unit. The assisting sensor optimizer optimizes functionalities of the one or more assisting sensors. The remote master optimizer optimizes functionalities of the remote master. The remote close unit optimizer captures data from the remote unit that is located close to the one or more sensors. The remote far unit optimizer receives data from the remote unit that is located far from the remote master.
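The configurator and configuration verifier described above can be sketched as a pair of functions: one fills in a complete rule set, the other rejects invalid rules before they reach a sensor. The field names, defaults, and validity checks below are illustrative assumptions, not taken from the specification.

```python
def configure_rules(raw_rules):
    """Sketch of a configurator: merge user-supplied rules over defaults
    so every sensor receives a complete configuration."""
    defaults = {"sample_interval_s": 60, "threshold": 50.0, "unit": "C"}
    rules = dict(defaults)
    rules.update(raw_rules)
    return rules

def verify_rules(rules):
    """Sketch of a configuration verifier: reject obviously invalid rules
    before they are pushed to a sensor."""
    if rules["sample_interval_s"] <= 0:
        return False
    if rules["unit"] not in {"C", "F"}:
        return False
    return True

rules = configure_rules({"threshold": 75.0})
ok = verify_rules(rules)
```

Separating configuration from verification lets the configuration optimizer update rules dynamically while the verifier guards against a bad update taking a sensor offline.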
[0007] In yet another embodiment, the processing unit includes a remote master input/output (I/O) unit, a remote unit I/O unit, a support unit I/O unit, a camera unit real time I/O unit, a camera unit batch I/O unit, a camera unit command I/O unit, a sensor real time I/O unit, a sensor batch I/O unit, a video decompressor, an I/O optimizer and add I/O unit, an admin console, a VCA video engine, a VCA stream engine and a VCA image engine, a sensors data analytics engine, a reporting unit, an audit trail unit, an archiving unit, and a commander unit. The remote master input/output (I/O) unit interacts with the remote master, provides sensor and camera information to the remote master, and in turn receives instructions from the remote master. The remote unit I/O unit receives input from the remote unit and provides output to the remote unit. The support unit I/O unit interacts with the supporting unit for operating the one or more cameras and the one or more sensors. The camera unit real time I/O unit provides input to the one or more cameras and receives output from the one or more cameras in real time. The camera unit batch I/O unit provides input to the one or more cameras and receives output from the one or more cameras in batches. The camera unit command I/O unit provides commands for operating the one or more cameras and receives information from the one or more cameras. The sensor real time I/O unit provides input to the one or more sensors and receives output from the one or more sensors in real time. The sensor batch I/O unit provides input to the one or more sensors and receives output from the one or more sensors in batches. The video decompressor decompresses videos captured by the one or more cameras. The I/O optimizer and add I/O unit optimizes interaction between the processing unit and other components of the remote asset management system. 
The I/O optimizer and add I/O unit adds new input and output to the processing unit when any update is received from the one or more sensors and the one or more cameras. The admin console provides user interfaces to monitor, track, and control functionalities of the processing unit. The VCA video engine receives a video dataset from the one or more cameras, processes the video, and provides information to the remote master to control the one or more cameras. The VCA stream engine and a VCA image engine receive images and photos from the one or more cameras and/or the one or more sensors, process them, and provide information to the remote master to control the one or more cameras and/or the one or more sensors. The sensors data analytics engine (i) receives data captured by the one or more sensors, (ii) processes the received data, and (iii) analyzes the data for meaningful information. The reporting unit generates a report that includes data associated with the one or more cameras and the one or more sensors based on a request from a user. The audit trail unit provides documentary evidence of the sequence of activities that have affected, at any time, a specific operation of (i) the one or more cameras, (ii) the one or more sensors, (iii) a procedure, or (iv) an event. The audit trail unit includes (a) a set of records, (b) the destination, and (c) the source of the records. The archiving unit archives data associated with the one or more cameras, the one or more sensors, and/or the remote master. The commander unit provides commands for operating the one or more cameras and/or the one or more sensors to monitor the remote unit.
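The kind of video content analysis (VCA) performed by the engines above can be illustrated with frame differencing, one of the simplest VCA techniques. This is a sketch of the general technique, not of the patent's actual engines; frames here are flat lists of grayscale pixel intensities, and the thresholds are illustrative.

```python
def detect_motion(prev_frame, frame, pixel_threshold=10, area_fraction=0.1):
    """Report motion when more than area_fraction of the pixels change
    by more than pixel_threshold between consecutive frames."""
    changed = sum(1 for a, b in zip(prev_frame, frame)
                  if abs(a - b) > pixel_threshold)
    return changed / len(frame) > area_fraction

still = [40] * 100                 # static background
moved = [40] * 75 + [200] * 25     # an object enters a quarter of the view
```

A result like `detect_motion(still, moved)` being true is the kind of compact information a VCA engine can forward to the remote master, rather than the raw video itself.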
[0008] In yet another embodiment, the rules are dynamically updated and are configured remotely by the remote master through the processing unit. In yet another embodiment, the remote master processes a request from a user about the remote unit or to remotely mount the one or more cameras in the right place or on the right surface. In yet another embodiment, the remote asset management system includes a database that stores (i) the camera information that includes videos and photos captured by the one or more cameras, (ii) the sensor information captured by the one or more sensors, (iii) commands, and (iv) operations data for operating the one or more cameras, the one or more sensors, and/or the remote master. In yet another embodiment, the commands control at least one of: (a) one or more software units that include (i) a software machine learning module, (ii) an image and video processor, and/or (iii) a sensor input processor; and (b) one or more hardware units that include (i) actuators, (ii) a lens, (iii) an image sensor, (iv) a hardware machine learning unit, and/or (v) a camera body.
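A database that holds camera information, sensor information, commands, and operations data, as described above, can be sketched as a single table keyed by category and device. SQLite is used here purely for illustration; the specification does not name a storage technology, and the categories and payload fields are assumptions.

```python
import json
import sqlite3

# One table covering all four categories of stored data.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE records (
    category  TEXT,   -- 'camera_info', 'sensor_info', 'command', 'operation'
    device_id TEXT,
    payload   TEXT    -- JSON-encoded data
)""")

def store(category, device_id, payload):
    conn.execute("INSERT INTO records VALUES (?, ?, ?)",
                 (category, device_id, json.dumps(payload)))

def fetch(category, device_id):
    rows = conn.execute(
        "SELECT payload FROM records WHERE category = ? AND device_id = ?",
        (category, device_id)).fetchall()
    return [json.loads(r[0]) for r in rows]

store("camera_info", "camera-110", {"video": "clip-001.mp4", "fps": 15})
store("command", "sensor-112", {"op": "set-threshold", "value": 50.0})
cam_rows = fetch("camera_info", "camera-110")
```

Keeping commands and operations data in the same store as the captured information is what lets the remote master replay or audit past operations of the cameras and sensors.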
[0009] In another aspect, a remote asset management system for managing valuable assets remotely and for improving communication among components in the remote asset management system is provided. The remote asset management system includes a database, one or more cameras, one or more sensors, a processing unit, a supporting unit, and a remote master. The database stores (i) camera information that includes videos and photos captured by the one or more cameras, (ii) sensor information captured by the one or more sensors, (iii) commands, and (iv) operations data for operating the one or more cameras and the one or more sensors. The one or more cameras are coupled with a remote unit to monitor the remote unit. The remote unit is one or more assets which are to be monitored. The one or more sensors are coupled with the remote unit to sense abnormalities or changes in the remote unit. The one or more sensors include a software machine learning module and a hardware machine learning unit that optimize the machine learning of the one or more cameras and the processing unit. The processing unit interacts with and receives (i) camera information and data captured by the one or more cameras, and (ii) sensor information and data captured by the one or more sensors. The processing unit includes a machine learning module that (a) processes (i) the camera information and data captured by the one or more cameras, and (ii) the sensor information and data captured by the one or more sensors, and (b) performs video analytics. The supporting unit is adapted to store one or more rules for operating the one or more cameras and the one or more sensors remotely. The rules are dynamically updated and are configured remotely by the remote master through the processing unit. 
The remote master interacts with (i) the supporting unit to obtain the one or more rules for operating the one or more cameras and the one or more sensors, and (ii) the processing unit to obtain information associated with the one or more cameras and the one or more sensors. The remote master (i) processes the information received from the supporting unit and the processing unit, and (ii) provides feedback to the one or more cameras and the one or more sensors for monitoring the remote unit. The processing unit controls the functionalities and operations of (a) the one or more cameras and (b) the one or more sensors, assisted by the remote master. The remote master gives an input to the processing unit that is carried out with (a) image processing techniques, (b) machine learning algorithms, (c) stored results, and/or (d) self-healing inputs. The image processing techniques and the machine learning algorithms are applied across the processing unit, the one or more cameras, and the one or more sensors to obtain an analytics output.
[0010] In one embodiment, the remote master processes a request from a user about the remote unit or to remotely mount the one or more cameras in the right place or on the right surface.
[0011] In yet another aspect, a method for managing valuable assets remotely and for improving communication among components in a remote asset management system is provided. The method includes the following steps: (i) monitoring a remote unit by employing one or more cameras around the remote unit, (ii) sensing abnormalities or changes in the remote unit by employing one or more sensors around the remote unit, (iii) receiving, using a processing unit, (a) camera information and data captured by the one or more cameras, and (b) sensor information and data captured by the one or more sensors, (iv) storing, using a supporting unit, one or more rules for operating the one or more cameras and the one or more sensors remotely, (v) interacting, using a remote master, with (a) the supporting unit to obtain the one or more rules for operating the one or more cameras and the one or more sensors, and (b) the processing unit to obtain information associated with the one or more cameras and the one or more sensors, and (vi) processing, using the remote master, the information received from the supporting unit and the processing unit, and providing feedback to the one or more cameras and the one or more sensors for monitoring the remote unit.
[0012] In one embodiment, the remote unit is one or more assets which are to be monitored. In another embodiment, the method further includes the step of controlling the functionalities and operations of (a) the one or more cameras and (b) the one or more sensors, assisted by the remote master. In yet another embodiment, the method further includes the step of processing a request from a user to remotely mount the one or more cameras in the right place or on the right surface. In yet another embodiment, the processing unit includes a machine learning module that (a) processes (i) the camera information and data captured by the one or more cameras, and (ii) the sensor information and data captured by the one or more sensors, and (b) performs video analytics.
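One pass of the claimed monitor-sense-process-feedback loop can be sketched as a single function: sensed data is checked against the stored rules, and feedback commands are produced for the sensors and cameras. The rule name, threshold, and command strings below are illustrative assumptions, not taken from the specification.

```python
def manage_remote_asset(sensor_readings, rules):
    """One pass of the monitoring loop: compare sensed readings against
    the stored rules and return feedback commands for the devices."""
    feedback = []
    limit = rules.get("max_temperature", 50.0)   # (iv) a stored rule
    for reading in sensor_readings:              # (ii)/(iii) sensed data
        if reading > limit:
            # (vi) feedback to the sensors and cameras on an abnormality
            feedback.append(("sensor", "raise-alarm"))
            feedback.append(("camera", "focus-on-asset"))
    if not feedback:
        # nothing abnormal: keep the cameras on their normal sweep
        feedback.append(("camera", "continue-sweep"))
    return feedback

feedback = manage_remote_asset([42.0, 61.5], {"max_temperature": 50.0})
```

In the full system this loop would run continuously, with the remote master adjusting the rules between passes based on what the processing unit reports.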
[0013] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] The embodiments herein will be better understood from the following detailed description with reference to the drawings, in which:
[0015] FIG. 1 is a system view illustrating a remote asset management system for managing valuable assets remotely and for improving communication among components of the system in accordance with an embodiment herein; and
[0016] FIG. 2 is an exploded view of a camera in a remote unit of the system of FIG. 1 in accordance with an embodiment herein.
[0017] FIG. 3 is an exploded view of a sensor of one or more sensors of the remote asset management system of FIG. 1 according to an embodiment herein.
[0018] FIG. 4 is an exploded view of the processing unit of the remote asset management system of FIG. 1 according to an embodiment herein.
[0019] FIG. 5 illustrates a schematic diagram of a computer architecture according to an embodiment herein.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
[0020] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0021] As mentioned, there remains a need for a system and method for remote asset management. The embodiments herein achieve this by providing a remote asset management system that manages valuable assets and a remote master based on cameras, assisted sensors, and distributed machine learning. Referring now to the drawings, and more particularly to FIGS. 1 through 5, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
[0022] FIG. 1 is a system view illustrating a remote asset management system 100 for managing valuable assets remotely and for improving communication among components of the system 100 in accordance with an embodiment herein. The remote asset management system 100 includes a remote master 102, a supporting unit 104, a remote unit 106 to be monitored, a remote unit 108 that is located far from the remote master 102, one or more cameras 110, one or more sensors 112, a processing unit 114, and other external inputs 116. In one embodiment, the remote master 102 is, but not limited to, a computer, a desktop, a laptop, or a mobile phone.
[0023] The remote master 102 interacts with the supporting unit 104 and obtains one or more rules for operating the one or more cameras 110 and the one or more sensors 112. The remote master 102 further interacts with the processing unit 114 to obtain information associated with the one or more cameras 110 and the one or more sensors 112. The remote master 102 processes the information, self-learns, and provides feedback to the one or more cameras 110 and the one or more sensors 112 for better operation. The remote master 102 processes a request from a user about the remote unit 106 or to remotely mount the one or more cameras 110 in the right place or on the right surface. In one embodiment, the remote master 102 forms a wireless network with the one or more cameras 110.
[0024] The supporting unit 104 stores rules (e.g., configuration data) for operating the one or more cameras 110 and the one or more sensors 112 remotely. In one embodiment, the rules are dynamically updated and are configured remotely by the remote master 102 through the processing unit 114.
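Dynamic rule updates of the kind described in [0024] can be sketched with a versioned rule store: the remote master bumps a version number on each update, and devices poll cheaply by sending the version they last saw. The class and method names are assumptions for illustration, not from the specification.

```python
class RuleStore:
    """Sketch of the supporting unit's rule store: rules are versioned so
    cameras and sensors can detect that the remote master has pushed an
    update and re-fetch their configuration."""

    def __init__(self):
        self.version = 0
        self.rules = {}

    def update_rule(self, device_id, rule):
        # Called (via the processing unit) when the remote master
        # reconfigures a device remotely.
        self.rules[device_id] = rule
        self.version += 1

    def rules_for(self, device_id, known_version):
        """Return (changed, rules); a device passes the version it last
        saw, so unchanged configurations cost nothing to confirm."""
        if known_version == self.version:
            return False, None
        return True, self.rules.get(device_id)

store = RuleStore()
store.update_rule("camera-110", {"frame_rate": 15, "motion_only": True})
changed, rule = store.rules_for("camera-110", known_version=0)
```

Version-based polling keeps the rule traffic small, which matters when the remote unit is connected over a constrained cellular or radio link.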
[0025] The remote unit 106 is a person or a physical location in which assets are to be monitored. For example, the remote unit 106 is, but not restricted to, a hospital having medical devices to be monitored, a mining site, a manufacturing unit having equipment, a retail shop having sale products, a kids care center, a communication tower, tele-operations, floors, etc. The one or more cameras 110 are located inside or in close proximity to the remote unit 106. In one embodiment, the cameras 110 are mountable. A position and a direction of each of the cameras 110 can be changed remotely. The cameras 110 are adapted to be easily moved from one place to another and fixed in the remote unit 106. In another embodiment, the cameras 110 are non-mountable. In yet another embodiment, the cameras are portable. The cameras 110 have both a software machine learning module and a hardware machine learning unit for improving their functionality.
[0026] Similarly, the one or more sensors 112 are located inside or in close proximity to the remote unit 106. The type of sensors used in the remote unit 106 varies depending on the purpose or context. For example, when the remote unit 106 is a manufacturing unit having equipment, a temperature sensor may be used to track temperatures of the equipment. When the remote unit 106 is a retail shop having products, a motion sensor may be used to monitor people around the remote unit 106. The sensors 112 have both a software machine learning module and a hardware machine learning unit for improving their functionality. Data captured by the one or more cameras 110 and the one or more sensors 112 are provided to the processing unit 114 for further processing.
[0027] The processing unit 114 interacts with and receives camera information (e.g., available power, video resolution, camera angle or direction, computation power, communication power, frame rate, etc.), data captured (e.g., multimedia content such as video and audio) by the one or more cameras 110, sensor information, and data captured by the one or more sensors 112. The processing unit 114 processes the camera information, the data captured by the one or more cameras 110 and the one or more sensors 112, and the sensor information, and performs video analytics assisted by a machine learning module. Further, the processing unit 114 automatically performs calculations with processing techniques such as, but not limited to, image processing and machine learning, and prepares inputs for the remote master 102. The inputs are provided to the remote master 102 through a wired or a wireless network. With the inputs from the processing unit 114, the remote master 102 improves itself, self-configures, self-learns, and provides input to the one or more cameras 110 and the one or more sensors 112. The processing unit 114 controls the functionalities and operations of the one or more cameras 110 and the one or more sensors 112, and the associated distributed machine learning.
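The processing flow above (gather camera metadata and sensor readings, run an analytics step, prepare inputs for the remote master) can be sketched as a single function. The function name, the averaging step standing in for real video/sensor analytics, and the `threshold` field are all illustrative assumptions, not the specification's actual algorithms.

```python
# Hedged sketch of the processing unit's flow: combine camera
# information and sensor readings into a summary ("inputs") that the
# remote master can act on. A simple average stands in for the
# machine-learning-assisted analytics described in the text.

def analyze(camera_info, sensor_readings):
    """Produce inputs for the remote master from one camera's metadata
    and its associated sensor readings."""
    avg = sum(sensor_readings) / len(sensor_readings) if sensor_readings else None
    return {
        "camera": camera_info["id"],
        "frame_rate": camera_info["frame_rate"],
        "sensor_avg": avg,
        # Flag readings exceeding an assumed per-camera threshold.
        "alert": avg is not None and avg > camera_info.get("threshold", 50),
    }


inputs_for_master = analyze(
    {"id": "cam-7", "frame_rate": 24, "threshold": 40},
    [35.0, 42.5, 48.0],
)
print(inputs_for_master["alert"])  # True (average 41.83 exceeds 40)
```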
[0028] In one embodiment, the processing unit 114 is located in close proximity to the remote master 102. In another embodiment, the processing unit 114 is located in close proximity to the remote unit 106. In yet another embodiment, the processing unit 114 is located centrally between the remote master 102 and the remote unit 106. The processing unit 114 receives other external inputs 116 such as past information.
[0029] Instructions that are given by the remote master 102 to the processing unit 114 are carried out with image processing techniques, machine learning algorithms, previous and self-healing inputs, and other inputs that are given to the processing unit 114. Machine learning and image processing are split and distributed across the processing unit 114, the one or more cameras 110, and the one or more sensors 112, and all of these work together to give meaningful analytics (e.g., video analytics and sensor analytics combined with distributed machine learning) to the remote master 102. Operations (e.g., capturing videos at a higher or a lower resolution, passing frame rates, etc.) of the one or more cameras 110 are controlled based on the one or more sensors 112, the distributed machine learning, the processing unit 114, and/or the remote master 102.
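The sensor-driven control of camera operations described above can be sketched as a small rule-dispatch loop. The rule representation (condition/operation pairs), the event fields, and the default operation are hypothetical stand-ins for the distributed machine learning decision logic.

```python
# Illustrative sketch: pick a camera operation (resolution, frame rate)
# from a sensor event using simple learned rules, standing in for the
# distributed decision logic split across sensors, cameras, and the
# processing unit.

def decide_camera_operation(sensor_event, learned_rules):
    """Return the operation of the first rule whose condition matches,
    or a low-power idle default when no rule fires."""
    for condition, operation in learned_rules:
        if condition(sensor_event):
            return operation
    return {"resolution": "low", "frame_rate": 5}  # idle default (assumed)


rules = [
    (lambda e: e["type"] == "motion",
     {"resolution": "high", "frame_rate": 30}),
    (lambda e: e["type"] == "temperature" and e["value"] > 80,
     {"resolution": "high", "frame_rate": 15}),
]

op = decide_camera_operation({"type": "motion"}, rules)
print(op["frame_rate"])  # 30
```

In the system described, such rules would themselves be updated by the remote master and the machine learning modules rather than hard-coded.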
[0030] FIG. 2 is an exploded view of a camera in the remote unit 106 of the remote asset management system 100 of FIG. 1 in accordance with an embodiment herein. Each of the cameras 110 includes a command receiver 202 for receiving commands from the remote master 102, a command receiver 204 for receiving commands from the processing unit 114, a command receiver 206 for receiving commands from the one or more sensors 112, a command receiver 208 for receiving commands from a local unit, a processor communication unit 210 for communicating outputs or data to the processing unit 114, a sensor communication unit 212 for communicating outputs or data to the one or more sensors 112, and a remote communication unit 214 for communicating outputs or data to the remote master 102 and the remote unit 106.
[0031] Each of the cameras 110 further includes a command converter 216 for converting commands from the remote master 102, the processing unit 114, the one or more sensors 112, and/or the local unit. The commands control a software machine learning module 218, an image and video processor 220, and a sensor input processor 222. The commands also control hardware units including actuators 224, a lens 226, an image sensor 228, a hardware machine learning unit 230, and a camera body. Each of the cameras 110 further includes an internal rule logic 232, a communication optimizer 234, a power optimizer 242, a camera neck 250, a camera head 252, a camera mounter 254, a physical cushioner 256, and an alarm raiser 258. The internal rule logic 232 verifies commands that are converted by the command converter 216 to reject any erroneous incoming commands.
[0032] The communication optimizer 234 optimizes the communication options (3G 236, Wi-Fi 238, or fixed 240) for communicating with the remote master 102, the remote unit 106, the one or more sensors 112, and/or the processing unit 114. The power optimizer 242 optimizes a battery unit 246 and an electric power unit 248.
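The selection among the listed communication options (3G, Wi-Fi, or fixed) can be sketched as a preference-ordered choice. The preference order shown (fixed first, then Wi-Fi, then 3G) is an assumption for illustration; the specification does not define how the optimizer 234 ranks links.

```python
# Minimal sketch of a communication optimizer choosing among the
# available links in an assumed order of preference (e.g., lowest cost
# or highest reliability first).

PREFERENCE = ["fixed", "wifi", "3g"]  # assumed ranking, not from the spec


def choose_link(available):
    """Return the most preferred available link, or None if no link
    is currently available."""
    for link in PREFERENCE:
        if link in available:
            return link
    return None


print(choose_link({"wifi", "3g"}))  # wifi
print(choose_link({"3g"}))          # 3g
```

A fuller optimizer would also weigh the power optimizer's 242 battery state, since link choice and power draw interact.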
[0033] The camera neck 250 helps in tilting and adjusting the view of the cameras 110. The camera head 252 is movable. The camera mounter 254 helps to flexibly place the cameras 110 on different surfaces. The physical cushioner 256 protects the cameras 110 if they fall unexpectedly while being mounted, and the alarm raiser 258 raises an alarm if people try to tamper with or physically harm the whole unit while it is in operation.
[0034] FIG. 3 is an exploded view of a sensor of the one or more sensors 112 of the remote asset management system 100 of FIG. 1 according to an embodiment herein. The one or more sensors 112 include a templating engine 302, a rule engine 304, a real-time trigger engine 306, a configurator 308, a configuration verifier 310, a configuration optimizer 312, a camera optimizer 314, a processing unit optimizer 316, a machine learning optimizer 318, a situation optimizer 320, a context optimizer 322, an assisting sensor optimizer 324, a remote master optimizer 326, a remote close unit optimizer 328, a remote far unit optimizer 330, a predeployer 332, an auditing engine 334, and a reconciliation engine 336.
[0035] The configurator 308 enables configuring one or more rules for operating the one or more sensors 112. The configuration of the one or more rules is performed locally, or remotely using the remote master 102 and/or the supporting unit 104. The configuration verifier 310 verifies the one or more rules for operating the sensors 112, and the configuration optimizer 312 optimizes the configurator 308 and updates the one or more rules dynamically.
[0036] The camera optimizer 314 controls and optimizes functionalities of the one or more cameras 110 based on sensor information captured by the one or more sensors 112. For example, the one or more sensors 112 control the one or more cameras 110 to obtain high quality videos only upon reaching a certain temperature. Below that temperature, the one or more cameras 110 may be operated to obtain low quality videos. The processing unit optimizer 316 provides the captured sensor information to the processing unit 114, and optimizes the processing unit 114. The one or more sensors 112 include both a software machine learning module and a hardware machine learning unit 318 that optimize the machine learning (hardware and/or software) of the one or more cameras 110 and the processing unit 114. The situation optimizer 320 and the context optimizer 322 select the sensor type and optimize it for capturing sensor information based on the situation and the context. One or more primary sensors of the sensors 112 optimize functionalities of one or more assisting sensors based on the assisting sensor optimizer 324. The remote master optimizer 326 optimizes functionalities of the remote master 102, the remote close unit optimizer 328 captures data from the remote unit 106 that is close to the sensors 112, and the remote far unit optimizer 330 receives data from the remote unit 108 that is located far from the remote master 102.
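The temperature example above can be sketched as a single threshold rule the camera optimizer might apply. The 80-degree threshold and the "high"/"low" quality labels are illustrative assumptions; the specification only states that a certain temperature triggers high quality capture.

```python
# Sketch of the temperature-driven quality rule: at or above an assumed
# threshold the camera captures high quality video, otherwise low.

HIGH_QUALITY_TEMP = 80.0  # assumed threshold, degrees


def video_quality(temperature):
    """Select capture quality from the latest temperature reading."""
    return "high" if temperature >= HIGH_QUALITY_TEMP else "low"


print(video_quality(85.0))  # high
print(video_quality(60.0))  # low
```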
[0037] FIG. 4 is an exploded view of the processing unit 114 of the remote asset management system 100 of FIG. 1 according to an embodiment herein. The processing unit 114 includes a remote master input/output (I/O) unit 402, a remote unit I/O unit 404, a support unit I/O unit 406, a camera unit real time I/O unit 408, a camera unit batch I/O unit 410, a camera unit command I/O unit 412, a sensor real time I/O unit 414, a sensor batch I/O unit 416, a video decompressor 418, an I/O optimizer and add I/O unit 420, an admin console 422, a VCA video engine 424, a VCA stream engine 426, a VCA image engine 428, a video dataset unit 430, a stat engine 432, a custom device video analysis unit 434, a sensors data analytics engine 436, a map reduce unit 438, an ETL staging, data transformer, and mobile converter unit 440, a reporting unit 442, an audit trail unit 444, an archiving unit 446, a machine learning master unit 448, a machine learning software module 450, a machine learning hardware agent manager unit 452, a machine learning software agent manager 454, a database 456, a commander unit 458, a templating engine 460, a rule engine 462, and a real time trigger engine 464.
[0038] The remote master I/O unit 402 interacts with the remote master 102, provides sensor and camera information to the remote master 102, and in turn receives instructions from the remote master 102. The remote unit I/O unit 404 receives input from the remote unit 106, and provides output to the remote unit 106. The supporting unit I/O unit 406 interacts with the supporting unit 104 for operating the one or more cameras 110 and the one or more sensors 112.
[0039] The camera unit real time I/O unit 408 provides input to the one or more cameras 110 and receives output from the one or more cameras 110 in real time. The camera unit batch I/O unit 410 provides input to the one or more cameras 110 and receives output from the one or more cameras 110 in a batch. The camera unit command I/O unit 412 provides commands for operating the one or more cameras 110, and receives information from the cameras 110.
[0040] The sensor real-time I/O unit 414 provides input to the one or more sensors 112 and receives output from the one or more sensors 112 in real time. The sensor batch I/O unit 416 provides input to the one or more sensors 112 and receives output from the one or more sensors 112 in a batch. The video decompressor 418 decompresses videos captured by the one or more cameras 110 and/or the one or more sensors 112. The I/O optimizer and add I/O unit 420 optimizes interaction between the processing unit 114 and other components of the system 100, and adds new input and output to the processing unit 114 when any update is available. The admin console 422 provides user interfaces for monitoring, tracking, and controlling functionalities of the processing unit 114.
[0041] The VCA video engine 424, the VCA stream engine 426, and the VCA image engine 428 receive multimedia (e.g., videos, photos, images, etc.) from the one or more cameras 110 and/or the one or more sensors 112, process the multimedia, and provide information to the remote master 102 for controlling the cameras 110 and the sensors 112. In one embodiment, the VCA video engine 424 receives multimedia as a video dataset from the cameras 110 and/or the sensors 112. In one embodiment, the custom device video analysis unit 434 analyses videos received from the cameras 110 and the sensors 112. The sensor data analytics engine 436 receives data captured by the sensors 112, processes the data, and analyzes it for meaningful information.
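The routing of incoming multimedia to the appropriate VCA engine by media type can be sketched as a dispatch table. The engine stubs, their string outputs, and the dispatch keys are illustrative assumptions; the real engines would perform video content analysis rather than return labels.

```python
# Hedged sketch: dispatch incoming multimedia to the engine matching
# its type (video, stream, or image), as described above. Engine bodies
# are stubs standing in for real video content analysis.

def vca_video_engine(item):
    return f"video-analyzed:{item}"


def vca_stream_engine(item):
    return f"stream-analyzed:{item}"


def vca_image_engine(item):
    return f"image-analyzed:{item}"


DISPATCH = {
    "video": vca_video_engine,
    "stream": vca_stream_engine,
    "image": vca_image_engine,
}


def process_multimedia(media_type, item):
    """Route one multimedia item to the engine for its type."""
    engine = DISPATCH.get(media_type)
    if engine is None:
        raise ValueError(f"unsupported media type: {media_type}")
    return engine(item)


print(process_multimedia("image", "frame-001"))  # image-analyzed:frame-001
```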
[0042] The reporting unit 442 generates a report including data associated with the cameras 110 and the sensors 112 based on a request from a user. The audit trail unit 444 includes a set of records, with their destinations and sources, that provide documentary evidence of the sequence of activities that have at any time affected a specific operation, procedure, or event of the cameras 110 and the sensors 112. The archiving unit 446 archives data associated with the cameras 110, the sensors 112, and/or the remote master 102.
[0043] The machine learning master unit 448 of the processing unit 114 is managed by the machine learning hardware agent manager unit 452. Similarly, the machine learning software module 450 is managed by the machine learning software agent manager 454. The database 456 stores camera information (e.g., videos captured by the cameras 110), sensor information, and commands and operations data for operating the cameras 110, the sensors 112, and/or the remote master 102. The commander unit 458 provides commands for operating the cameras 110 and/or the sensors 112.
[0044] The remote asset management system 100 utilizes radio technology, cellular networks, and Wi-Fi to control and communicate data among components of the system 100. Further, the system 100 improves the functionalities of the remote unit 106, reduces cost at the remote unit 106, reduces the workload of the remote master 102, saves escalation time, takes minimal time for installing the cameras 110, and effectively monitors the remote unit 106.
[0045] The techniques provided by the embodiments herein may be implemented on an integrated circuit chip (not shown). The chip design is created in a graphical computer programming language, and stored in a computer storage medium (such as a disk, tape, physical hard drive, or virtual hard drive such as in a storage access network). If the designer does not fabricate chips or the photolithographic masks used to fabricate chips, the designer transmits the resulting design by physical means (e.g., by providing a copy of the storage medium storing the design) or electronically (e.g., through the Internet) to such entities, directly or indirectly. The stored design is then converted into the appropriate format (e.g., GDSII) for the fabrication of photolithographic masks, which typically include multiple copies of the chip design in question that are to be formed on a wafer. The photolithographic masks are utilized to define areas of the wafer (and/or the layers thereon) to be etched or otherwise processed.
[0046] The resulting integrated circuit chips can be distributed by the fabricator in raw wafer form (that is, as a single wafer that has multiple unpackaged chips), as a bare die, or in a packaged form. In the latter case the chip is mounted in a single chip package (such as a plastic carrier, with leads that are affixed to a motherboard or other higher level carrier) or in a multichip package (such as a ceramic carrier that has either or both surface interconnections or buried interconnections). In any case the chip is then integrated with other chips, discrete circuit elements, and/or other signal processing devices as part of either (a) an intermediate product, such as a motherboard, or (b) an end product. The end product can be any product that includes integrated circuit chips, ranging from toys and other low-end applications to advanced computer products having a display, a keyboard or other input device, and a central processor. The embodiments herein can take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment including both hardware and software elements. The embodiments that are implemented in software include but are not limited to, firmware, resident software, microcode, etc.
[0047] Furthermore, the embodiments herein can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or any instruction execution system. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
[0048] The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk - read only memory (CD-ROM), compact disk - read/write (CD-R/W) and DVD.
[0049] A data processing system suitable for storing and/or executing program code will include at least one processor coupled directly or indirectly to memory elements through a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.
[0050] Input/output (I/O) devices (including but not limited to keyboards, displays, pointing devices, etc.) can be coupled to the system either directly or through intervening I/O controllers. Network adapters may also be coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks. Modems, cable modem and Ethernet cards are just a few of the currently available types of network adapters.
[0051] A representative hardware environment for practicing the embodiments herein is depicted in FIG. 5. This schematic drawing illustrates a hardware configuration of an information handling/computer system in accordance with the embodiments herein. The system comprises at least one processor or central processing unit (CPU) 10. The CPUs 10 are interconnected via system bus 12 to various devices such as a random access memory (RAM) 14, read-only memory (ROM) 16, and an input/output (I/O) adapter 18. The I/O adapter 18 can connect to peripheral devices, such as disk units 11 and tape drives 13, or other program storage devices that are readable by the system. The system can read the inventive instructions on the program storage devices and follow these instructions to execute the methodology of the embodiments herein. The system further includes a user interface adapter 19 that connects a keyboard 15, mouse 17, speaker 24, microphone 22, and/or other user interface devices such as a touch screen device (not shown) to the bus 12 to gather user input. Additionally, a communication adapter 20 connects the bus 12 to a data processing network 25, and a display adapter 21 connects the bus 12 to a display device 23 which may be embodied as an output device such as a monitor, printer, or transmitter, for example.
[0052] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the invention.

Claims

CLAIMS I/We Claim:
1. A remote asset management system for managing valuable assets remotely and for improving communication among components in said remote asset management system, wherein said remote asset management system comprises:
a plurality of cameras coupled with a remote unit to monitor said remote unit, wherein said remote unit is a plurality of assets which are to be monitored;
a plurality of sensors coupled with said remote unit to sense abnormalities or changes in said remote unit;
a processing unit that interacts and receives (i) camera information and data captured by said plurality of cameras, and (ii) sensor information and data captured by said plurality of sensors, wherein said processing unit comprises:
(i) a machine learning module that (a) processes (i) said camera information, and data captured by said plurality of cameras, and (ii) said sensor information and data captured by said plurality of sensors, and (b) performs video analytics;
a supporting unit that stores a plurality of rules for operating said plurality of cameras and said plurality of sensors remotely; and
a remote master that interacts with (i) said supporting unit to obtain said plurality of rules for operating said plurality of cameras, and said plurality of sensors, and (ii) said processing unit to obtain information associated with said plurality of cameras, and said plurality of sensors,
wherein said remote master (i) processes said information received from said supporting unit, and said processing unit, and (ii) provides feedback to said plurality of cameras and said plurality of sensors for monitoring said remote unit, wherein said processing unit controls (a) said plurality of cameras, and (b) said plurality of sensors functionalities assisted by said remote master, wherein said remote master gives an input to said processing unit that is carried out with (a) image processing techniques, (b) machine learning algorithms, (c) stored results, and/or (d) self-healing inputs, wherein said image processing techniques, and said machine learning algorithms are applied across said processing unit, said plurality of cameras, and said plurality of sensors to obtain an analytics output.
2. The remote asset management system as claimed in claim 1, wherein said plurality of cameras comprises:
a plurality of command receivers that receives commands from (i) said remote master, (ii) said processing unit, (iii) a local unit, and (iv) said plurality of sensors; a processor communication unit that communicates output or data to said processing unit;
a sensor communication unit that communicates outputs or data to said plurality of sensors;
a remote communication unit that communicates outputs or data to said remote master and said remote unit;
a command converter that converts commands from (i) said remote master, (ii) said processing unit, (iii) said plurality of sensors, and/or (iv) said local unit; an internal rule logic that verifies said commands, wherein said commands are converted by said command converter in order to avoid any wrong commands that come in;
a communication optimizer that optimizes communication options for communicating with (i) said remote master, (ii) said remote unit, (iii) said plurality of sensors, and/or (iv) said processing unit;
a power optimizer that optimizes (i) a battery unit, and (ii) an electric power unit;
a camera neck adapted to tilt and adjust a view of said plurality of cameras; a camera mounter adapted to place said plurality of cameras flexibly on different surfaces; a physical cushioner that protects said plurality of cameras when said plurality of cameras fall unexpectedly while mounting; and
an alarm raiser adapted to raise an alarm when people try to tamper with or harm said remote unit/asset when said remote unit/asset is in operation.
3. The remote asset management system as claimed in claim 1, wherein said plurality of sensors comprises:
a configurator that configures said plurality of rules for operating said plurality of sensors;
a configuration verifier that verifies said plurality of rules for operating said plurality of sensors;
a configuration optimizer that optimizes said configurator and updates said plurality of rules dynamically;
a camera optimizer that controls and optimizes functionalities of said plurality of cameras based on sensor information captured by said plurality of sensors;
a processing unit optimizer that provides sensor information captured to said processing unit, and optimizes said processing unit;
a machine learning optimizer that optimizes said machine learning of said plurality of cameras and said processing unit;
a situation optimizer and a context optimizer that selects said plurality of sensors type and optimizes said plurality of sensors type for capturing sensor information based on situation and context of said remote unit;
an assisting sensor optimizer that optimizes functionalities of said plurality of assisting sensors;
a remote master optimizer that optimizes functionalities of said remote master;
a remote close unit optimizer that captures data from said remote unit that is located close to said plurality of sensors; and a remote far unit optimizer that receives data from said remote unit that is located far from said remote master.
4. The remote asset management system as claimed in claim 1, wherein said processing unit comprises:
a remote master input output (I/O) unit that interacts with said remote master, and provides sensor and camera information to said remote master, and in turn receives instructions from said remote master;
a remote unit I/O unit that receives input from said remote unit, and provides output to said remote unit;
a support unit I/O unit that interacts with said supporting unit for operating said plurality of cameras and said plurality of sensors;
a camera unit real time I/O unit that provides input to said plurality of cameras and receives output from said plurality of cameras in a real-time;
a camera unit batch I/O unit that provides input to said plurality of cameras and receives output from said plurality of cameras in a batch;
a camera unit command I/O unit that provides commands for operating said plurality of cameras, and receives information from said plurality of cameras;
a sensor real time I/O unit that provides input to said plurality of sensors and receives output from said plurality of sensors in a real-time;
a sensor batch I/O unit that provides input to said plurality of sensors and receives output from said plurality of sensors in a batch;
a video decompressor that decompresses videos captured by said plurality of cameras;
an I/O optimizer and add I/O unit that optimizes interaction between said processing unit and other components of said remote asset management system, wherein said I/O optimizer and add I/O unit adds new input and output to said processing unit when any update is received from said plurality of sensors and said plurality of cameras; an admin console that provides user interfaces to monitor, track, and control functionalities of said processing unit;
a VCA video engine that receives video dataset from said plurality of cameras, wherein said VCA video engine processes said video and provides information to said remote master to control said plurality of cameras;
a VCA stream engine and a VCA image engine that receives images and photos from said plurality of cameras and/or said plurality of sensors, wherein said VCA stream engine and said VCA image engine processes said images and photos and provides information to said remote master to control said plurality of cameras, and/or said plurality of sensors;
a sensors data analytics engine that (i) receives a data that captured by said plurality of sensors, (ii) processes said received data, and (iii) analyzes said data for meaningful information;
a reporting unit that generates a report comprising data associated with said plurality of cameras and said plurality of sensors based on a request from a user; an audit trail unit that provides documentary evidence of a sequence of activities that have affected at any time a specific operation of (i) said plurality of cameras, (ii) said plurality of sensors, (iii) a procedure, or (iv) an event, wherein said audit trail unit comprises (a) a set of records, (b) a destination, and (c) a source of records;
an archiving unit that archives data associated with said plurality of cameras, said plurality of sensors, and/or said remote master; and
a commander unit that provides commands for operating said plurality of cameras and/or said plurality of sensors to monitor said remote unit.
5. The remote asset management system as claimed in claim 1, wherein said rules are dynamically updated and are configured remotely by said remote master through said processing unit.
6. The remote asset management system as claimed in claim 1, wherein said remote master processes a request from a user about said remote unit or to remotely mount said plurality of cameras in the right place or surface.
7. The remote asset management system as claimed in claim 1, wherein said remote asset management system comprises a database that stores (i) said camera information comprising videos and photos captured by said plurality of cameras, (ii) said sensor information captured by said plurality of sensors, (iii) commands, and (iv) operations data for operating said plurality of cameras, said plurality of sensors, and/or said remote master.
8. The remote asset management system as claimed in claim 2, wherein said commands control at least one of:
(a) a plurality of software units that comprises (i) a software machine learning module, (ii) an image and video processor, and/or (iii) a sensor input processors; and
(b) a plurality of hardware units that comprises (i) actuators, (ii) a lens, (iii) an image sensor, (iv) a hardware machine learning unit, and/or (v) a camera body.
9. A remote asset management system for managing valuable assets remotely and for improving communication among components in said remote asset management system, wherein said remote asset management system comprises:
a database that stores (i) camera information comprising videos and photos captured by a plurality of cameras, (ii) sensor information captured by a plurality of sensors, (iii) commands, and (iv) operations data for operating said plurality of cameras, said plurality of sensors;
a plurality of cameras coupled with a remote unit to monitor said remote unit, wherein said remote unit is a plurality of assets which are to be monitored;
a plurality of sensors coupled with said remote unit to sense abnormalities or changes in said remote unit, wherein said plurality of sensors comprises: a software machine learning module and a hardware machine learning unit that optimizes machine learning of said plurality of cameras and a processing unit;
said processing unit that interacts and receives (i) camera information and data captured by said plurality of cameras, and (ii) sensor information, and data captured by said plurality of sensors, wherein said processing unit comprises:
a machine learning module that (a) processes (i) said camera information and data captured by said plurality of cameras, and (ii) said sensor information and data captured by said plurality of sensors, and (b) performs video analytics;
a supporting unit adapted to store a plurality of rules for operating said plurality of cameras and said plurality of sensors remotely, wherein said rules are dynamically updated and are configured remotely by said remote master through said processing unit; and
a remote master that interacts with (i) said supporting unit to obtain said plurality of rules for operating said plurality of cameras, and said plurality of sensors, and (ii) said processing unit to obtain information associated with said plurality of cameras, and said plurality of sensors,
wherein said remote master (i) processes said information received from said supporting unit, and said processing unit, and (ii) provides feedback to said plurality of cameras and said plurality of sensors for monitoring said remote unit, wherein said processing unit controls (a) said plurality of cameras, and (b) said plurality of sensors functionalities and operations assisted by said remote master,
wherein said remote master gives an input to said processing unit that is carried out with (a) image processing techniques, (b) machine learning algorithms, (c) stored results, and/or (d) self-healing inputs, wherein said image processing techniques and said machine learning algorithms are applied across said processing unit, said plurality of cameras, and said plurality of sensors to obtain an analytics output.
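The architecture recited in claim 9 can be illustrated with a minimal, hypothetical sketch; the application defines no API, so every class and method name below is illustrative only. Cameras and sensors report to a processing unit, a supporting unit holds remotely configurable rules, and a remote master processes the collected information and returns feedback.

```python
from dataclasses import dataclass, field

# Hypothetical names throughout; the application does not define an API.

@dataclass
class Camera:
    cam_id: str
    def capture(self):
        # Stand-in for a real frame grab.
        return {"cam": self.cam_id, "frame": b""}

@dataclass
class Sensor:
    sensor_id: str
    value: float = 0.0
    def read(self):
        return {"sensor": self.sensor_id, "value": self.value}

@dataclass
class SupportingUnit:
    # Rules are dynamically updated and remotely configured (per the claim).
    rules: dict = field(default_factory=lambda: {"threshold": 1.0})
    def update_rule(self, name, value):
        self.rules[name] = value

class ProcessingUnit:
    def __init__(self, cameras, sensors, support):
        self.cameras, self.sensors, self.support = cameras, sensors, support
    def collect(self):
        # Gather camera information and sensor information.
        return ([c.capture() for c in self.cameras],
                [s.read() for s in self.sensors])

class RemoteMaster:
    def feedback(self, camera_info, sensor_info, rules):
        # Toy "analytics": flag sensor readings above the rule threshold.
        thr = rules["threshold"]
        alerts = [d for d in sensor_info if d["value"] > thr]
        return {"alerts": alerts, "frames": len(camera_info)}
```

A usage pass would collect data through the processing unit, hand it to the remote master together with the supporting unit's rules, and reconfigure a rule remotely via `update_rule`.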
10. The remote asset management system as claimed in claim 9, wherein said remote master processes a request from a user about said remote unit or to remotely mount said plurality of cameras in the right place or on the right surface.
11. A method for managing valuable assets remotely and for improving communication among components in a remote asset management system, said method comprising: monitoring a remote unit by employing a plurality of cameras around said remote unit, wherein said remote unit is a plurality of assets which are to be monitored;
sensing abnormalities or changes in said remote unit by employing a plurality of sensors around said remote unit;
receiving, using a processing unit, (i) camera information and data captured by said plurality of cameras, and (ii) sensor information and data captured by said plurality of sensors;
storing, using a supporting unit, a plurality of rules for operating said plurality of cameras and said plurality of sensors remotely; and
interacting, using a remote master, with (i) said supporting unit to obtain said plurality of rules for operating said plurality of cameras, and said plurality of sensors, and (ii) said processing unit to obtain information associated with said plurality of cameras, and said plurality of sensors; and
processing, using said remote master, said information received from said supporting unit and said processing unit; and providing feedback to said plurality of cameras and said plurality of sensors for monitoring said remote unit.
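The method steps of claim 11 amount to a capture-sense-process-feedback cycle. A minimal sketch follows, with hypothetical function names (the application specifies no implementation); cameras and sensors are modeled as plain callables, and the analysis step stands in for the remote master's processing.

```python
def monitor_cycle(cameras, sensors, rules, analyze):
    """One pass of the claimed method: monitor, sense, then process and feed back."""
    camera_info = [capture() for capture in cameras]    # monitoring step
    sensor_info = [read() for read in sensors]          # sensing step
    return analyze(camera_info, sensor_info, rules)     # processing/feedback step

def simple_analyze(camera_info, sensor_info, rules):
    # Toy stand-in for the remote master: return readings above a rule threshold.
    thr = rules.get("threshold", 1.0)
    return [s for s in sensor_info if s > thr]
```

With one camera callable and two sensor callables, a cycle returns only the readings exceeding the configured threshold.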
12. The method as claimed in claim 11, further comprising the step of:
controlling, assisted by said remote master, the functionalities and operations of (a) said plurality of cameras and (b) said plurality of sensors.
13. The method as claimed in claim 11, further comprising the step of: processing a request from a user to remotely mount said plurality of cameras in the right place or on the right surface.
14. The method as claimed in claim 11, wherein said processing unit comprises a machine learning module that (a) processes (i) said camera information and data captured by said plurality of cameras, and (ii) said sensor information and data captured by said plurality of sensors, and (b) performs video analytics.
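The video analytics of claim 14 can be as simple as frame differencing. The toy functions below are illustrative only (not from the application) and treat a frame as a flat list of pixel intensities, flagging motion when the mean absolute difference between consecutive frames exceeds a threshold.

```python
def frame_diff_score(prev_frame, curr_frame):
    """Mean absolute pixel difference between two equal-length frames."""
    if len(prev_frame) != len(curr_frame):
        raise ValueError("frames must have the same number of pixels")
    return sum(abs(a - b) for a, b in zip(prev_frame, curr_frame)) / len(prev_frame)

def motion_alert(prev_frame, curr_frame, threshold=10.0):
    """Return True when the difference score crosses the alert threshold."""
    return frame_diff_score(prev_frame, curr_frame) > threshold
```

In practice the per-pixel loop would be replaced by vectorized operations (e.g. an image library's absolute-difference routine), but the decision rule is the same.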
PCT/IN2016/050117 2015-04-21 2016-04-21 System and method for remote asset management WO2016170548A2 (en)

Applications Claiming Priority (2)

IN2045/CHE/2015: priority date 2015-04-21
IN2045CH2015: priority date 2015-04-21

Publications (2)

Publication Number Publication Date
WO2016170548A2 true WO2016170548A2 (en) 2016-10-27
WO2016170548A3 WO2016170548A3 (en) 2017-11-16

Family

ID=57143766


Country Status (1)

Country Link
WO (1) WO2016170548A2 (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100442170B1 (en) * 2001-10-05 2004-07-30 (주)아이디스 Remote Control and Management System
US20070226616A1 (en) * 2004-06-01 2007-09-27 L-3 Communications Corporation Method and System For Wide Area Security Monitoring, Sensor Management and Situational Awareness



Legal Events

NENP: Non-entry into the national phase; ref country code: DE
122: EP: PCT application non-entry in European phase; ref document number: 16782742; country of ref document: EP; kind code of ref document: A2