US9270940B1 - Remote object sensing in video - Google Patents
Remote object sensing in video
- Publication number
- US9270940B1 (application US14/502,110)
- Authority
- US
- United States
- Prior art keywords
- input
- data
- sensory
- user
- sensory properties
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/22—Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
- G06V10/235—Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition based on user input or interaction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
- H04N7/147—Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
- H04N7/142—Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
Definitions
- the present application relates generally to remote object sensing and, more particularly, to techniques for replicating the sense of touch remotely.
- content available for media streaming continually expands in the virtual environment. Users can now easily send or receive videos and watch movies and shows on a user device. Communication via remote means also enables non-traditional forms of education such as distance learning. Students can now watch live or pre-recorded lectures as well as explore different subject areas using the vast media content available on the Internet. For example, a child can watch videos of lions in Africa without having to travel to Africa or even to a local zoo; a student in London can participate in a live lecture that is taking place in a New York City classroom.
- Embodiments of the invention provide techniques for recreating the sense of touch remotely using data received at a device.
- a method comprises steps of obtaining an input comprising audio and visual data for display on a first device, receiving data associated with one or more sensory properties of one or more objects in the input, and reconstructing the one or more sensory properties at the first device based on the data received.
- a method comprises obtaining an input comprising visual and audio data for display on a device, detecting user selection of one or more objects in the visual input, identifying the one or more selected objects and surroundings of the one or more objects, obtaining data associated with one or more sensory properties of the identified one or more objects and the surroundings, and reconstructing the one or more sensory properties associated with the identified one or more objects and the surroundings based on the data received.
- a method comprises obtaining an input comprising visual and audio data for display on a first device, detecting sensory input from a user of the first device, capturing data associated with the sensory input from the user of the first device, and transmitting the captured data to a second device for reconstruction of one or more sensory properties associated with the user of the first device.
- an apparatus comprises a memory and a processor operatively coupled to the memory.
- the processor is configured to obtain an input comprising audio and visual data for display on a first device, receive data associated with one or more sensory properties of one or more objects in the input, and reconstruct the one or more sensory properties at the first device based on the data received.
- FIG. 1 depicts an overview of a remote sensing methodology according to an embodiment of the invention.
- FIG. 2 depicts a remote sensory module for the remote sensing methodology of FIG. 1 according to an embodiment of the invention.
- FIG. 3 depicts a coordinated plane used in the remote sensory module of FIG. 2, according to an embodiment of the invention.
- FIG. 4 depicts a controller used in the remote sensory module of FIG. 2, according to an embodiment of the invention.
- FIG. 5 depicts an additional embodiment of the remote sensory module of the remote sensing methodology of FIG. 1 , according to an embodiment of the invention.
- FIG. 6 depicts a sensory estimator used in the remote sensory module of FIG. 5 , according to an embodiment of the invention.
- FIG. 7 depicts a further embodiment of the remote sensory module of the remote sensing methodology of FIG. 1 .
- FIG. 8A depicts an example of a front portion of a device with a gridded interface for implementing the remote sensing methodology according to an embodiment of the invention.
- FIG. 8B depicts a back portion of the device of FIG. 8A .
- FIG. 9 depicts an exemplary application of the remote sensing methodology according to embodiments of the invention.
- FIG. 10 depicts a computer system in accordance with which one or more components/steps of techniques of the invention may be implemented according to an embodiment of the invention.
- Illustrative embodiments of the invention provide for replicating a sense of touch by augmenting existing audio and/or visual content with a sensory dimension in the form of temperature, texture and movement.
- embodiments of the invention augment traditional videotelephony and video recordings by conveying the sense of warmth or pressure from a hug or a hand shake, the sensation of snow, the feeling of a patient's pulse and body temperature, etc.
- embodiments of the invention replicate a sense of touch remotely by recreating roughness and/or temperature sensation. It is believed that a combination of the three remote patterns (sense of texture, temperature and movement) would allow for a more accurate representation of the sense of touch.
- FIG. 1 shows an overview of the remote sensing methodology according to illustrative embodiments of the invention.
- Remote sensing methodology 100 starts at block 102, in which data is obtained as input for display on a user interface such as a screen on a device.
- the device may be a handheld device or any suitable computing or electronic device.
- Non-limiting examples of input can include audio and visual input such as a video, a still image, an audio clip or a live feed.
- a remote sensory module performs operations on the data received as input. Details of the remote sensory module 104 are further described in the context of FIGS. 2 through 7 .
- output from the remote sensory module is reproduced for the user at the user interface.
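- As a minimal illustration of this three-block flow, the sketch below strings the blocks together in Python. The `remote_sensing` function and the object interfaces (`show`, `process`, `reproduce`) are hypothetical names chosen for illustration, not part of the patent.

```python
# Illustrative sketch of the FIG. 1 methodology; the object interfaces
# are assumptions, not an implementation described in the patent.
def remote_sensing(display_input, sensory_module, user_interface):
    user_interface.show(display_input)              # block 102: display the input
    output = sensory_module.process(display_input)  # block 104: remote sensory module
    user_interface.reproduce(output)                # block 106: reproduce output for the user
```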
- FIG. 2 shows an illustrative embodiment of the remote sensory module 104 of FIG. 1 .
- the visual and/or audio input can be pre-recorded content or live streaming content.
- the input may be a recording of a patient for the past twenty-four hours or any given duration, or a live stream of the patient in his room.
- the input may be a live video conference between two or more users.
- the input may be captured by a visual or audio recording component on a user's device (e.g., the camera on a mobile device).
- data relating to one or more sensory properties for one or more objects in the visual and/or audio input is received.
- a sensory property refers to a temperature, a texture or movement associated with the object.
- an object in the visual and/or audio input refers to a person, an animal, an inanimate object (e.g., a fireplace, a car) or the background (e.g., snow, wind, book shelves).
- Sensory data associated with the one or more objects of the input may be received from one or more sensors on the user's device (e.g., a camera on a mobile phone, a tablet, a computer).
- Sensory data may also be received from a second device, such as data captured by one or more sensors on the second device.
- sensors may refer to infrared sensors, movement detectors, tactile sensors, etc.
- the data associated with one or more sensory properties of the one or more objects in the input are processed by the coordinated plane and sent to the controller at block 208 .
- the coordinated plane will be described in further detail in the context of FIG. 3 below.
- the controller coordinates and controls reconstruction and recreation of the one or more sensory properties associated with the one or more objects in the input. The reconstructed sensory properties can then be experienced by the user at a user device. Details of controller 208 will be further described in the context of FIG. 4 below.
- Coordinated plane 300 includes block 302 , at which data is received.
- the data received corresponds to data associated with the one or more sensory properties of the one or more objects in the input previously described in the context of FIG. 2 .
- the data received may be in the form of temperature data, electrical signals, vibration duration and frequency, etc.
- the sensory data is organized and/or interpreted.
- the data is translated into metrics such that the one or more sensory properties associated with the one or more objects in the input may be replicated.
- the translated metrics are then sent to the controller at block 308 for reconstruction.
- coordinated plane 300 may include a calibration algorithm, such that calibration can be performed depending on the input. For example, if the input is that of a desert scene, the algorithm would automatically calibrate the coordinated plane 300 to a hot climate module.
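- One possible realization of the coordinated plane is sketched below, with the calibration step applied when the input is recognized as a desert scene. The metric names and the scaling factor are illustrative assumptions; the patent does not prescribe a data format.

```python
# Sketch of coordinated plane 300: receive raw sensory data (block 302),
# organize/interpret it (block 304), translate it into reproduction
# metrics (block 306) and forward the metrics to the controller (block 308).
def coordinated_plane(readings, scene=None):
    """readings: iterable of (kind, value) pairs, e.g. ("temperature", 37.0)."""
    translate = {"temperature": "heat_level",    # degrees -> heating metric
                 "texture":     "vibration_hz",  # roughness -> vibration frequency
                 "movement":    "pressure"}      # motion -> force/pressure metric
    metrics = [(translate[kind], value) for kind, value in readings]
    if scene == "desert":  # calibration algorithm: hot-climate profile
        metrics = [(m, v * 1.1 if m == "heat_level" else v) for m, v in metrics]
    return metrics         # sent to the controller at block 308
```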
- FIG. 4 shows an illustrative embodiment of the controller 208 of FIG. 2 .
- Controller 400 coordinates and controls reconstruction of the one or more sensory properties associated with the one or more objects in the input.
- Controller 400 includes a temperature simulator 402 , a texture simulator 404 and a movement simulator 406 .
- Temperature simulator 402 may comprise elements capable of heating and cooling, so that variations in temperature can be produced by means such as electromagnetic induction, laser and air flow.
- Texture simulator 404 may comprise elements capable of producing vibrations at various frequencies or materials that are capable of morphing (such as phase change materials) giving rise to a feeling of roughness.
- Movement simulator 406 may comprise elements capable of generating force and pressure to emulate motion, or micro-accelerometers and micro-sensors.
- Controller 400 may be implemented as software capable of controlling and activating temperature simulator 402 , texture simulator 404 and movement simulator 406 to reconstruct the sensory properties associated with the one or more objects in the input.
- the temperature simulator 402 , texture simulator 404 and movement simulator 406 may be located on the enclosure of a user device, such as the back of the device, the screen of the device or a case suitable for enclosing a back and/or front of the device.
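- One way controller 400 might dispatch translated metrics to the three simulators is sketched below. The simulator method names are assumptions, since the patent describes the simulators only at the hardware level.

```python
# Sketch of controller 400 activating temperature simulator 402, texture
# simulator 404 and movement simulator 406 from translated metrics.
class Controller:
    def __init__(self, temperature_sim, texture_sim, movement_sim):
        # map each metric name to the simulator action that reproduces it
        self.dispatch = {
            "heat_level":   temperature_sim.set_level,
            "vibration_hz": texture_sim.vibrate,
            "pressure":     movement_sim.apply_pressure,
        }

    def reconstruct(self, metrics):
        for name, value in metrics:
            self.dispatch[name](value)  # activate the matching simulator
```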
- a doctor is interested in monitoring a patient for a specific duration of time.
- a first device obtains visual and/or audio input relating to the patient for display on the first device or on a second device.
- Sensors on the first device such as an infrared thermometer, a movement sensor, and a tactile sensor, can be used to monitor and record the patient's vitals or other parameters of interest.
- Data from the sensors can be captured and stored in a database for analysis at a later time, or it can be analyzed simultaneously at the same device, or sent to a second device for use.
- Sensory data to be captured and recorded can be specified by the doctor prior to monitoring the patient. Then at block 104 of FIG. 1 and block 204 of FIG. 2, the sensory data relating to the patient is received for processing. At block 206 of FIG. 2, also corresponding to the coordinated plane 300 of FIG. 3, the data is processed and sent to a controller at block 208. At block 208 and corresponding controller 400 of FIG. 4, the controller reconstructs and replicates the one or more sensory properties from the data received by activating and controlling the temperature simulator 402, texture simulator 404 and/or movement simulator 406. Corresponding to block 106, the reconstructed sensory properties can be experienced by the doctor at a user device.
- the patient's temperature and heart rate would be reconstructed at the device the doctor is using, and the doctor would feel the body heat and the pulse of the patient at a given point in time.
- a child is at a zoo and points the camera of a mobile device at a lion (corresponding to obtaining visual and/or auditory input at blocks 102 and 202).
- Infrared sensors on the mobile device can establish the lion's temperature, especially relative to the background; tactile sensors on the mobile device can capture texture so as to communicate what the lion feels like; movement detectors on the mobile device can capture the lion's periodic movement (e.g., as in pulsing).
- This data is received at blocks 104 and 204 .
- the coordinated plane of block 206 and FIG. 3 interprets and translates the captured sensory data.
- the translated sensory data is then sent to the controller of block 208 (correspondingly controller 400 of FIG. 4 ) for reconstruction.
- the reconstructed sensory properties are then experienced as the output at block 106. Therefore, as the child looks at the lion being recorded through the camera and displayed on the mobile device, the child can also feel the lion's heat, texture and movement through a sensory replication interface on the mobile device, giving a very realistic sense of the lion.
- the interface will be described in further detail in the context of FIGS. 8A and 8B below.
- FIG. 5 shows an alternative embodiment of the remote sensory module 104 of FIG. 1 .
- visual and/or audio input is obtained for display on a user device.
- the visual and/or audio input can be pre-recorded content or live streaming content.
- the input may be a pre-recorded show or live streaming content from the camera of a device.
- a user may select a specific object and/or sensory property for reconstruction through the user interface of the user device.
- the user may optionally choose to segment and partition the input by duration (temporal variation) or by object/region (spatial variation).
- a sensory estimator generates or obtains data based on the user selection.
- the data is then sent to block 510 , where the coordinated plane (as disclosed in FIG. 3 ) interprets the data and translates it into metrics for reconstruction by a controller (as disclosed in FIG. 4 ) at block 512 .
- FIG. 6 shows an illustrative embodiment of the sensory estimator 508 of FIG. 5 .
- Sensory estimator 600 includes temperature estimator 604 , texture estimator 606 and movement estimator 608 .
- Sensory estimator 600 interacts with database 602 to obtain sensory data for the one or more selected objects or regions.
- Database 602 was created at an earlier point in time with objects indexed with sensory information, much like a library, with numbers and vibrations translated into a lexicon or dictionary of temperatures, textures and movements that match the physical experience of the associated object.
- Database 602 continually updates and expands as new information is acquired.
- Information in database 602 may come from the data captured by one or more devices with access to the database 602 , entered by a user (e.g., the temperature of boiling water is approximately 100° C.), as well as acquired from online resources such as the World Wide Web.
- Sensory estimator 600 interacts with database 602 to retrieve data related to one or more sensory properties of an object when desired or needed. For example, when a user is interested in an average temperature of a specific object or region in the input, sensory estimator 600 can obtain temperature information of the object or region for a given duration and calculate an average through temperature estimator 604 . As another example, a user may stream a pre-recorded show in which no sensory data is transmitted and associated with the input.
- sensory estimator 600 interacts with database 602 to identify the object or region of interest (e.g., via an image recognition module within the database or by querying the user for identification of the object). Once the object or region of interest is identified, database 602 sends sensory information, such as temperature, texture and movement, associated with the identified object/region to the respective estimators of sensory estimator 600 . If the object is not found in database 602 , a search outside the database may be performed to identify the object. If sensory information for the object is not found in database 602 , then database 602 may identify related and similar objects (e.g., a fire pit is similar to a fireplace) and send sensory data associated with the related objects to sensory estimator 600 .
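- The lookup with its two fallbacks (search outside the database, then similar objects) could look like the following sketch; the `get` and `similar_objects` methods are assumed interfaces, not an API described in the patent.

```python
# Sketch of the sensory estimator's database fallback: exact match first,
# then a related object such as a fire pit in place of a fireplace.
def lookup_sensory_data(db, obj):
    record = db.get(obj)                     # exact match in database 602
    if record is not None:
        return record
    for similar in db.similar_objects(obj):  # e.g. "fireplace" -> "fire pit"
        record = db.get(similar)
        if record is not None:
            return record                    # sensory data of a related object
    return None  # caller may search outside the database or query the user
```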
- the doctor can select the parameters of interest to reconstruct. This selection is detected at block 504 .
- the doctor may choose to monitor how a patient's temperature evolves over time, and thus select segmentation of the input by duration, corresponding to block 506.
- the doctor may choose to partition the input by object such that in an entire room of patients, the doctor can select a specific patient or a specific section of the patient room to monitor.
- the sensory estimator may obtain information associated with the one or more sensory properties of the one or more objects in the input.
- when the doctor wants to monitor how the patient's temperature evolves over time (e.g., the temperature change per hour for the past three hours, to detect a change in fever in response to a medication)
- sensory estimator 600 can obtain temperature information for the patient over the given duration from database 602 .
- Temperature estimator 604 can then calculate an average temperature for the patient for each hour of the duration of interest.
- Sensory estimator 600 then sends this average temperature data to coordinated plane 510 .
- Coordinated plane 510 interprets and translates the temperature data as described in the context of FIG. 3 above and sends the metrics to controller 512 .
- Controller 512 reconstructs the average temperature for each of the three hours as output for the doctor to experience at his device. For example, if the average temperature for the three hours were 101.6° F., 100.2° F. and 98.8° F., controller 512 may control the heating element(s) to reproduce the heat feel of these temperatures for the doctor to experience. The doctor can choose to experience the average temperature for each of the three hours for a length of time (e.g., five seconds per average temperature) consecutively or independently. Additionally, the controller 512 can also use the tactile and/or movement simulators to supplement the temperature feel of the patient with the pulse rate of the patient for the same duration of interest.
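- The hourly averaging performed by temperature estimator 604 reduces to a simple grouped mean; the sketch below assumes timestamped readings keyed by hour, which is one plausible data layout.

```python
# Illustrative hourly averaging for temperature estimator 604.
from collections import defaultdict

def hourly_averages(readings):
    """readings: iterable of (hour, temperature_f) pairs."""
    buckets = defaultdict(list)
    for hour, temp in readings:
        buckets[hour].append(temp)
    return {hour: sum(temps) / len(temps)
            for hour, temps in sorted(buckets.items())}

# hourly_averages([(1, 101.6), (2, 100.2), (3, 98.8)])
# -> {1: 101.6, 2: 100.2, 3: 98.8}; controller 512 could then reproduce
#    each value for, e.g., five seconds.
```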
- FIG. 7 shows an additional embodiment of the remote sensory module 104 of FIG. 1 .
- visual and/or audio input is obtained for display on a device.
- the visual and/or audio input is preferably live streaming content, for example, a video chat between two or more people.
- a determination is made whether sensory input is detected. For example, while a mother and her child are chatting remotely via a video chat application such as SKYPE or FACETIME, the mother may want to give the child a reassuring gesture such as a pat or hand squeeze. The mother can make contact with the sensors on her device as if she were actually squeezing the child's hand or patting the child on the hand.
- if sensory input is not detected, the module returns to block 702. If sensory input is detected at block 708, the module proceeds to block 710, in which sensory data associated with the gesture is captured (e.g., the temperature and texture of the hand, the movement and pressure associated with the gesture). The captured data associated with the various sensory properties of the gesture is then transmitted to a device of an intended recipient at block 712 for reconstruction at the recipient's device; here, it is transmitted to the child's device for reconstruction. Reconstruction can be performed at the recipient's device using a remote sensory module according to embodiments of the invention, for example, as described in the context of FIGS. 2 to 6 above.
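- The detect-capture-transmit loop of blocks 708-712 might be structured as in the sketch below; the sensor and transport objects are placeholders, since the patent does not name a concrete API.

```python
# Sketch of blocks 702-712: poll the sender's sensors for a touch gesture,
# capture its sensory properties, and transmit them for reconstruction.
def sensory_chat_loop(sensors, transport, recipient):
    while True:
        gesture = sensors.poll()           # block 708: sensory input detected?
        if gesture is None:
            continue                       # no input: return to block 702
        data = {                           # block 710: capture sensory data
            "temperature": gesture.temperature,
            "texture":     gesture.texture,
            "movement":    gesture.movement,
            "pressure":    gesture.pressure,
        }
        transport.send(recipient, data)    # block 712: send for reconstruction
```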
- FIGS. 8A and 8B show an illustrative embodiment of a device capable of implementing the remote sensing methodology 100 as shown in FIG. 1 .
- the device may be any handheld or portable device (e.g., smartphone, tablet, thermometer) or any other suitable device (e.g., personal computer, dashboard of an automobile).
- Sensors for capturing sensory data and simulators for reproducing one or more sensory properties may be placed anywhere along the enclosure of the device, for example, on the screen and/or back of a tablet or smart phone.
- the sensors and simulators may be part of a gridded interface spanning a desired portion of the enclosure of the device.
- FIG. 8A shows the front of a device 802 , with screen 810 .
- a gridded interface 808 spans at least a portion of the screen 810 of device 802 .
- the gridded interface 808 may be embedded into the screen 810 of device 802 or embedded into an enclosure, such as a case or cover for device 802 that covers the screen 810 .
- the gridded interface 808 may comprise a miniscule grid with a network of micro pipes for delivery of heating and cooling (e.g., using hot air and cold air stored within reservoirs in the device or provided as part of the gridded interface), illustrated as dashed lines 804 - 1 . . . 804 -M.
- the controller can create an appropriate mixture of hot air and cold air to provide the appropriate temperature sensation.
- the number of micro pipes, M, may vary accordingly, for example, depending on the size of the device or the area of desired coverage on the device.
- the network of micro pipes may span the entire surface of a device.
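- The hot/cold mixing described above amounts to a lever rule: assuming reservoirs at fixed temperatures T_hot and T_cold (the values below are illustrative assumptions), the hot-air fraction f needed for a target temperature T is f = (T - T_cold) / (T_hot - T_cold).

```python
# Worked example of metering the hot and cold air reservoirs; the
# reservoir temperatures are assumptions chosen for illustration.
def hot_air_fraction(target_c, t_hot_c=50.0, t_cold_c=5.0):
    f = (target_c - t_cold_c) / (t_hot_c - t_cold_c)
    return min(max(f, 0.0), 1.0)  # clamp to a physically realizable mixture

# hot_air_fraction(27.5) -> 0.5, i.e. equal parts hot and cold air
```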
- the gridded interface 808 may also comprise an array of tactile and movement sensors and simulators, illustrated as ovals 806-1 . . . 806-N (e.g., micro motors for generating vibrations at various frequencies to simulate different textures).
- the number, size and arrangement of the tactile and/or movement sensors and simulators may vary accordingly, for example, depending on the size of the device.
- the gridded interface 808 may be activated electronically and locally by the controller module.
- the gridded interface 808 may be active and continuously capture, transmit and/or receive sensory data, or the gridded interface 808 may be in sleep mode such that sensory data is only captured, transmitted and/or received when it is re-activated.
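- Driving only the grid cells under the user's touch requires mapping touch coordinates to grid indices; a minimal sketch follows, with grid dimensions chosen arbitrarily for illustration.

```python
# Sketch: map a touch point to the grid cell of interface 808 whose micro
# pipes (804) and simulators (806) should be activated by the controller.
def cell_for_touch(x, y, screen_w, screen_h, cols, rows):
    col = min(int(x / screen_w * cols), cols - 1)
    row = min(int(y / screen_h * rows), rows - 1)
    return row, col

# cell_for_touch(540, 960, 1080, 1920, cols=8, rows=16) -> (8, 4)
```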
- FIG. 8B shows the gridded interface 808 implemented on at least a portion of the back of device 802 . It is to be noted that the gridded interface described herein may be implemented in a front portion of device 802 , a back portion of device 802 or in the front and back portions of device 802 .
- the gridded interface 808 may be implemented apart from a device as a stand-alone apparatus through which a device may be plugged in when sensory replication is desired or needed.
- FIG. 9 shows an exemplary application of embodiments of the invention.
- a visual and/or audio input is displayed on screen 902 .
- Screen 902 may be a screen on any suitable device, such as a tablet or a mobile phone.
- displayed on screen 902 is an outdoor scene.
- Sensory properties such as temperature, texture and/or movement associated with objects in the scene, such as the sun 904 , snow 906 and fire 908 , may be experienced by a user of the device.
- the sensors on the device may capture sensory data associated with the objects in the scene. For example, the temperature and texture of the snow 906 , the temperature and movement of the fire 908 , along with the temperature of the sun 904 may be captured by the temperature sensor, tactile sensor and motion sensor on the device. These sensory data can then be sent to coordinated plane 910 (details of which are disclosed in the context of FIG. 3 above), which interprets and translates the received sensory data into metrics and sends the translated metrics to controller 912 (details of which are disclosed in the context of FIG. 4 above).
- Controller 912 then activates and coordinates the temperature, texture and movement simulators on the device to reconstruct the received sensory data into one or more sensory properties associated with the sun 904, snow 906 and fire 908.
- the sensory properties can be experienced by the user on the device through the gridded interface 808 as described in FIGS. 8A and 8B .
- a user may also choose to experience only one sensory property associated with the objects in the scene. For example, a user may choose to experience only the temperature of the objects in the scene.
- infrared sensors on the device can be used to sense the temperature of the sun 904 , snow 906 and fire 908 .
- an exemplary output produced by the controller 912 may be a relative temperature map of the scene, such that some portions of the image are much warmer than their surroundings.
- the sun 904 would be perceived as warm, though cooler than the hot fire 908
- the snow 906 would be cold relative to the sun 904 and fire 908 .
- a user would be able to touch the screen or back of the device, depending on where the gridded interface is located, to feel the temperature distribution of the scene displayed on screen 902 .
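- One plausible construction of such a relative temperature map is min-max normalization of the sensed region temperatures; the region names and values below are illustrative, following FIG. 9.

```python
# Sketch of a relative temperature map: 0.0 = coldest region, 1.0 = hottest.
def relative_temperature_map(region_temps):
    lo, hi = min(region_temps.values()), max(region_temps.values())
    span = (hi - lo) or 1.0  # avoid dividing by zero for a uniform scene
    return {region: (t - lo) / span for region, t in region_temps.items()}

# relative_temperature_map({"fire": 600.0, "sun": 35.0, "snow": -2.0})
# -> fire ~1.0 (hottest), sun ~0.06 (warm), snow 0.0 (coldest)
```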
- the remote sensing methodology 100 would replicate the sense of touch using a remote sensory module 500 as described in FIG. 5 .
- Sensory data associated with the one or more objects and the surroundings in the scene displayed on screen 902 would be obtained from a database such as database 602 of FIG. 6 .
- the user may choose to segment the input by duration or object or region. If the user selects an object for which the database may not have sensory information, such as the cloud in the scene, a sensory estimator, such as sensory estimator 600 of FIG. 6 , may estimate the texture, temperature and/or movement of the cloud.
- screen 902 may be a window through which the scene is being viewed.
- the window 902 may contain a gridded interface 808 , as described in FIGS. 8A and 8B , embedded into the window pane.
- a user may, for example, perceive the temperature feel of the scene outside by touching the various regions on the window pane, which is reproduced by the simulators of the gridded interface. As such, a user may “feel” the temperature outside without having to leave the house or check the weather reports.
- user selection is an option, such as user selection of the object of interest or user selection of segmentation by duration
- the user selection may be performed using conventional methods such as employing a stylus, a mouse or a finger to highlight the object or choice in the user interface.
- Embodiments of the present invention may be a system, a method, and/or a computer program product.
- the computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.
- the computer system of FIG. 10 may be used to implement the various components/steps shown and described above in the context of FIGS. 1-9.
- the computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device.
- the computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
- a non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing.
- a computer readable storage medium is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
- Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network.
- the network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers.
- a network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
- Computer readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
- the computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
- the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
- electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.
- These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
- the computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
- each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
- the functions noted in the block may occur out of the order noted in the figures.
- two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
- One or more embodiments can make use of software running on a general-purpose computer or workstation.
- a computer system/server 1012 which is operational with numerous other general purpose or special purpose computing system environments or configurations.
- Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with computer system/server 1012 include, but are not limited to, personal computer systems, server computer systems, thin clients, thick clients, handheld or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputer systems, mainframe computer systems, and distributed cloud computing environments that include any of the above systems or devices, and the like.
- Computer system/server 1012 may be described in the general context of computer system executable instructions, such as program modules, being executed by a computer system.
- program modules may include routines, programs, objects, components, logic, data structures, and so on that perform particular tasks or implement particular abstract data types.
- Computer system/server 1012 may be practiced in distributed cloud computing environments where tasks are performed by remote processing devices that are linked through a communications network.
- program modules may be located in both local and remote computer system storage media including memory storage devices.
- computer system/server 1012 in computing node 1010 is shown in the form of a general-purpose computing device.
- the components of computer system/server 1012 may include, but are not limited to, one or more processors or processing units 1016 , a system memory 1028 , and a bus 1018 that couples various system components including system memory 1028 to processor 1016 .
- the bus 1018 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures.
- bus architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnects (PCI) bus.
- the computer system/server 1012 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer system/server 1012 , and it includes both volatile and non-volatile media, removable and non-removable media.
- the system memory 1028 can include computer system readable media in the form of volatile memory, such as random access memory (RAM) 1030 and/or cache memory 1032 .
- the computer system/server 1012 may further include other removable/non-removable, volatile/nonvolatile computer system storage media.
- storage system 1034 can be provided for reading from and writing to a non-removable, non-volatile magnetic media (not shown and typically called a “hard drive”).
- a magnetic disk drive for reading from and writing to a removable, non-volatile magnetic disk (e.g., a “floppy disk”)
- an optical disk drive for reading from or writing to a removable, non-volatile optical disk such as a CD-ROM, DVD-ROM or other optical media
- each can be connected to the bus 1018 by one or more data media interfaces.
- the memory 1028 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
- a program/utility 1040 having a set (at least one) of program modules 1042 , may be stored in memory 1028 by way of example, and not limitation, as well as an operating system, one or more application programs, other program modules, and program data. Each of the operating system, one or more application programs, other program modules, and program data or some combination thereof, may include an implementation of a networking environment.
- Program modules 1042 generally carry out the functions and/or methodologies of embodiments of the invention as described herein.
- Computer system/server 1012 may also communicate with one or more external devices 1014 such as a keyboard, a pointing device, a display 1024 , etc., one or more devices that enable a user to interact with computer system/server 1012 , and/or any devices (e.g., network card, modem, etc.) that enable computer system/server 1012 to communicate with one or more other computing devices. Such communication can occur via Input/Output (I/O) interfaces 1022 . Still yet, computer system/server 1012 can communicate with one or more networks such as a local area network (LAN), a general wide area network (WAN), and/or a public network (e.g., the Internet) via network adapter 1020 .
- network adapter 1020 communicates with the other components of computer system/server 1012 via bus 1018 .
- It should be understood that although not shown, other hardware and/or software components could be used in conjunction with computer system/server 1012. Examples include, but are not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data archival storage systems.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/502,110 US9270940B1 (en) | 2014-09-30 | 2014-09-30 | Remote object sensing in video |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/502,110 US9270940B1 (en) | 2014-09-30 | 2014-09-30 | Remote object sensing in video |
Publications (1)
Publication Number | Publication Date |
---|---|
US9270940B1 true US9270940B1 (en) | 2016-02-23 |
Family
ID=55314822
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/502,110 Active US9270940B1 (en) | 2014-09-30 | 2014-09-30 | Remote object sensing in video |
Country Status (1)
Country | Link |
---|---|
US (1) | US9270940B1 (en) |
Patent Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2357706A (en) | 1942-06-29 | 1944-09-05 | Rca Corp | Heating and cooling system |
US5109277A (en) * | 1990-06-20 | 1992-04-28 | Quadtek, Inc. | System for generating temperature images with corresponding absolute temperature values |
US5292195A (en) | 1992-09-09 | 1994-03-08 | Martin Marietta Corporation | Thermographic evaluation technique |
US5615953A (en) | 1994-07-25 | 1997-04-01 | The Babcock & Wilcox Company | Boiler bank surface temperature profiler |
US6067371A (en) | 1995-11-28 | 2000-05-23 | Dornier Medical Systems, Inc. | Method and system for non-invasive temperature mapping of tissue |
US6679830B2 (en) | 2001-02-06 | 2004-01-20 | Hill-Rom Services, Inc. | Infant incubator with non-contact sensing and monitoring |
US6552656B2 (en) | 2001-04-12 | 2003-04-22 | Horizon Navigation, Inc. | Method and apparatus for generating notification of changed conditions behind a vehicle |
US20030210259A1 (en) * | 2001-11-14 | 2003-11-13 | Liu Alan V. | Multi-tactile display haptic interface device |
US7079995B1 (en) * | 2003-01-10 | 2006-07-18 | Nina Buttafoco | Tactile simulator for use in conjunction with a video display |
US20090105605A1 (en) | 2003-04-22 | 2009-04-23 | Marcio Marc Abreu | Apparatus and method for measuring biologic parameters |
US7369156B1 (en) | 2005-05-12 | 2008-05-06 | Raytek Corporation | Noncontact temperature measurement device having compressed video image transfer |
US20130038792A1 (en) * | 2008-10-10 | 2013-02-14 | Internet Services, Llc | System and method for synchronization of haptic data and media data |
US20100265326A1 (en) * | 2009-04-20 | 2010-10-21 | Kujala Kevin A | Sensory enhancement method and system for visual media |
US20130339431A1 (en) * | 2012-06-13 | 2013-12-19 | Cisco Technology, Inc. | Replay of Content in Web Conferencing Environments |
Non-Patent Citations (7)
Title |
---|
Andrew G. Hargroder et al., "Infrared Imaging of Burn Wounds to Determine Burn Depth," Part of the SPIE Conference on Infrared Technology and Applications XXV (SPIE), Apr. 1999, pp. 103-108, vol. 3698. |
G.A. Tsyba et al., "A Video Pyrometer," Instruments and Experimental Techniques, Jul. 2003, pp. 480-483, vol. 46, No. 4. |
Hoffman Specialty, "Basic Steam Heating Systems," ITT Industries, HS-901(A), 1999, 12 pages. |
Tactile Sensing and Teletaction Research, "Teletaction System," http://robotics.eecs.berkeley.edu/~ronf/tactile.html, Jul. 1998, 5 pages. |
The Engineer, "Process Heating with Affordable Cascade Control," http://www.theengineer.co.uk/in-depth/process-heating-with-affordable-cascade-contro1/285295.article, Oct. 2000, 2 pages. |
W. Hofmeister et al., "Investigating Solidification with the Laser-Engineered Net Shaping (LENS) Process," JOM, Jul. 1999, 6 pages, vol. 51, No. 7. |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10402068B1 (en) | 2016-06-16 | 2019-09-03 | Amazon Technologies, Inc. | Film strip interface for interactive content |
US10417356B1 (en) * | 2016-06-16 | 2019-09-17 | Amazon Technologies, Inc. | Physics modeling for interactive content |
US10332639B2 (en) * | 2017-05-02 | 2019-06-25 | James Paul Smurro | Cognitive collaboration with neurosynaptic imaging networks, augmented medical intelligence and cybernetic workflow streams |
US10623696B1 (en) * | 2018-08-06 | 2020-04-14 | Paula Muller | Communication system for use with protected persons |
US10848711B2 (en) | 2018-08-06 | 2020-11-24 | SociAvi Company | Communication system for use with protected persons |
WO2021035362A1 (en) * | 2019-08-30 | 2021-03-04 | Vrx Ventures Ltd. | Systems and methods for mapping motion-related parameters of remote moving objects |
US11740600B2 (en) | 2019-08-30 | 2023-08-29 | Vrx Ventures Ltd. | Computerized method, processing structure and server for controlling a target-motion device based on source-motion object |
US11388371B1 (en) * | 2021-01-22 | 2022-07-12 | Toyota Research Institute, Inc. | Systems and methods for telepresence rooms |
US20220295015A1 (en) * | 2021-01-22 | 2022-09-15 | Toyota Research Institute, Inc. | Systems and Methods for Telepresence Rooms |
US11671563B2 (en) * | 2021-01-22 | 2023-06-06 | Toyota Research Institute, Inc. | Systems and methods for telepresence rooms |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9270940B1 (en) | Remote object sensing in video | |
US11523103B2 (en) | Providing a three-dimensional preview of a three-dimensional reality video | |
US9570113B2 (en) | Automatic generation of video and directional audio from spherical content | |
KR102246355B1 (en) | Hybrid visual communication | |
US10681341B2 (en) | Using a sphere to reorient a location of a user in a three-dimensional virtual reality video | |
EP3125076A1 (en) | Crowd-based haptics | |
US20180160194A1 (en) | Methods, systems, and media for enhancing two-dimensional video content items with spherical video content | |
JP2023501553A (en) | Information reproduction method, apparatus, computer-readable storage medium and electronic equipment | |
JP2020144872A (en) | Reaction type video generation method and generation program | |
US20180232194A1 (en) | Guided Collaborative Viewing of Navigable Image Content | |
US11032535B2 (en) | Generating a three-dimensional preview of a three-dimensional video | |
US20210326594A1 (en) | Computer-generated supplemental content for video | |
US11698707B2 (en) | Methods and systems for provisioning a collaborative virtual experience of a building | |
KR20210028198A (en) | Avatar animation | |
JP7385733B2 (en) | Position synchronization between virtual and physical cameras | |
US20230298282A1 (en) | Extended reality recorder | |
US10582190B2 (en) | Virtual training system | |
CA3119609A1 (en) | Augmented reality (ar) imprinting methods and systems | |
Weinzierl et al. | On the epistemic potential of virtual realities for the historical sciences. A methodological framework | |
US11790653B2 (en) | Computer-generated reality recorder | |
WO2015200914A1 (en) | Techniques for simulating kinesthetic interactions | |
US20190339771A1 (en) | Method, System and Apparatus For Brainwave and View Based Recommendations and Story Telling | |
US20220319108A1 (en) | Methods and systems for provisioning a virtual experience of a building based on user profile data | |
NL2014682B1 (en) | Method of simulating conversation between a person and an object, a related computer program, computer system and memory means. | |
Kim | In the flesh |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: INTERNATIONAL BUSINESS MACHINES CORPORATION, NEW YORK Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ARAVKIN, ALEKSANDR Y.;BASU, ANIRBAN;KANEVSKY, DIMITRI;AND OTHERS;SIGNING DATES FROM 20140930 TO 20141001;REEL/FRAME:033883/0103 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
CC | Certificate of correction | ||
AS | Assignment |
Owner name: ECOBEE INC., CANADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTERNATIONAL BUSINESS MACHINES CORPORATION;REEL/FRAME:048763/0075 Effective date: 20190110 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
AS | Assignment |
Owner name: STRUCTURED ALPHA LP, CANADA Free format text: SECURITY INTEREST;ASSIGNOR:INC., ECOBEE;REEL/FRAME:052678/0864 Effective date: 20200504 |
|
AS | Assignment |
Owner name: AST TRUST COMPANY (CANADA), CANADA Free format text: SECURITY INTEREST;ASSIGNOR:ECOBEE INC.;REEL/FRAME:052704/0656 Effective date: 20200505 |
|
AS | Assignment |
Owner name: ECOBEE, INC., CANADA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:AMERICAN STOCK TRANSFER & TRUST COMPANY, LLC D/B/A AST TRUST COMPANY (CANADA);REEL/FRAME:058568/0001 Effective date: 20211201 Owner name: ECOBEE, INC., CANADA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:STRUCTURED ALPHA LP, BY ITS GENERAL PARTNER THOMVEST ASSET MANAGEMENT LTD.;REEL/FRAME:058521/0001 Effective date: 20211129 |
|
AS | Assignment |
Owner name: 1339416 B.C. LTD., CANADA Free format text: MERGER AND CHANGE OF NAME;ASSIGNORS:1339416 B.C. LTD.;ECOBEE TECHNOLOGIES INC.;REEL/FRAME:061001/0949 Effective date: 20211231 Owner name: GENERAC POWER SYSTEMS, INC., WISCONSIN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GENERAC HOLDINGS INC.;REEL/FRAME:058917/0161 Effective date: 20220101 Owner name: GENERAC HOLDINGS INC., WISCONSIN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ECOBEE TECHNOLOGIES ULC;REEL/FRAME:058917/0069 Effective date: 20220101 Owner name: ECOBEE TECHNOLOGIES ULC, CANADA Free format text: CHANGE OF NAME;ASSIGNOR:1339416 B.C. LTD.;REEL/FRAME:058907/0363 Effective date: 20211231 Owner name: ECOBEE TECHNOLOGIES INC., CANADA Free format text: CHANGE OF NAME;ASSIGNOR:ECOBEE INC.;REEL/FRAME:058905/0211 Effective date: 20211222 |
|
AS | Assignment |
Owner name: ECOBEE INC., CANADA Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE RECEIVING PARTY NAME FROM ECOBEE, INC. TO ECOBEE INC. PREVIOUSLY RECORDED AT REEL: 058521 FRAME: 0001. ASSIGNOR(S) HEREBY CONFIRMS THE RELEASE OF SECURITY INTEREST;ASSIGNOR:STRUCTURED ALPHA LP, BY ITS GENERAL PARTNER THOMVEST ASSET MANAGEMENT LTD.;REEL/FRAME:059205/0822 Effective date: 20211129 |
|
AS | Assignment |
Owner name: ECOBEE INC., CANADA Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE RECEIVING PARTY FROM ECOBEE, INC. TO ECOBEE INC. PREVIOUSLY RECORDED AT REEL: 058568 FRAME: 0001. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:AMERICAN STOCK TRANSFER & TRUST COMPANY, LLC D/B/A AST TRUST COMPANY (CANADA);REEL/FRAME:058965/0106 Effective date: 20211201 |
|
AS | Assignment |
Owner name: GENERAC POWER SYSTEMS, INC., WISCONSIN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GENERAC HOLDINGS INC.;REEL/FRAME:059713/0799 Effective date: 20220101 Owner name: GENERAC HOLDINGS INC., WISCONSIN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ECOBEE TECHNOLOGIES ULC;REEL/FRAME:059713/0780 Effective date: 20220101 |
|
AS | Assignment |
Owner name: ECOBEE TECHNOLOGIES INC., CANADA Free format text: CONTINUED - CHANGE OF JURISDICTION;ASSIGNOR:ECOBEE INC.;REEL/FRAME:059805/0101 Effective date: 20211222 Owner name: 1339416 B.C. LTD., CANADA Free format text: AMALGAMATION;ASSIGNORS:1339416 B.C. LTD.;ECOBEE TECHNOLOGIES INC.;REEL/FRAME:059825/0888 Effective date: 20211231 Owner name: ECOBEE TECHNOLOGIES ULC, CANADA Free format text: CHANGE OF NAME;ASSIGNOR:1339416 B.C. LTD.;REEL/FRAME:059825/0668 Effective date: 20211231 |
|
AS | Assignment |
Owner name: 1339416 B.C. LTD., CANADA Free format text: AMALGAMATION;ASSIGNORS:1339416 B.C. LTD.;ECOBEE TECHNOLOGIES INC.;REEL/FRAME:060907/0090 Effective date: 20211231 |
|
AS | Assignment |
Owner name: JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT, NEW YORK Free format text: SECURITY INTEREST;ASSIGNOR:GENERAC POWER SYSTEMS, INC.;REEL/FRAME:061476/0745 Effective date: 20220629 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |