US20110248877A1 - System and method providing remote user selection of a device
- Publication number: US20110248877A1 (application Ser. No. 13/042,198)
- Authority: United States
- Prior art keywords: user, image, module, present, presenting
- Prior art date
- Legal status: Abandoned (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
- H04L67/125—Protocols specially adapted for proprietary or special-purpose networking environments involving control of end-device applications over a network
- H04L41/00—Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
- H04L41/12—Discovery or management of network topologies
- H04L41/22—Arrangements for maintenance, administration or management of data switching networks comprising specially adapted graphical user interfaces [GUI]
- H04L67/50—Network services
- H04L67/75—Indicating network or usage conditions on the user display
Definitions
- FIG. 1 shows a flow diagram of a non-limiting exemplary method for providing remote user selection of a device, in accordance with various aspects of the present invention.
- FIG. 2 shows a flow diagram of a non-limiting exemplary method for providing remote user selection of a device, in accordance with various aspects of the present invention.
- FIG. 3 shows a block diagram of a non-limiting exemplary database that may be utilized for device determination, in accordance with various aspects of the present invention.
- FIG. 4 shows a block diagram of a non-limiting exemplary system for providing remote user selection of a device, in accordance with various aspects of the present invention.
- FIG. 5 shows a block diagram of a non-limiting exemplary system for providing remote user selection of a device, in accordance with various aspects of the present invention.
- FIG. 6 shows a block diagram of a non-limiting exemplary system for providing remote user selection of a device, in accordance with various aspects of the present invention.
- FIG. 7 shows a block diagram of a non-limiting exemplary device for providing remote user selection of a device, in accordance with various aspects of the present invention.
- FIG. 8 shows a block diagram of a non-limiting exemplary device for providing remote user selection of a device, in accordance with various aspects of the present invention.
- modules, components or circuits may generally comprise hardware and/or a combination of hardware and software (e.g., including firmware).
- modules may also, for example, comprise a computer readable medium (e.g., a non-transitory medium) comprising instructions (e.g., software instructions) that, when executed by a processor, cause the processor to perform various functional aspects of the present invention. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of particular hardware and/or software implementations of a module, component or circuit unless explicitly claimed as such.
- various aspects of the present invention may be implemented by one or more processors (e.g., a microprocessor, digital signal processor, baseband processor, microcontroller, etc.) executing software instructions (e.g., stored in volatile and/or non-volatile memory).
- aspects of the present invention may be implemented by an application-specific integrated circuit (“ASIC”) and/or other hardware components.
- a communication network is generally the communication infrastructure through which a communication device (e.g., a portable communication device, a personal computer, a network controller, a user-selectable device with network communication capability, a consumer electronic device with network communication capability, etc.) may communicate with other systems.
- a communication network may comprise a cable and/or satellite television communication network, a cellular communication network, a wireless metropolitan area network (WMAN), a wireless local area network (WLAN), a wireless personal area network (WPAN), any home or premises communication network, etc.
- a particular communication network may, for example, generally have a corresponding communication protocol according to which a communication device may communicate with other devices in and/or via the communication network. Unless so claimed, the scope of various aspects of the present invention should not be limited by characteristics of a particular type of communication network.
- a camera may capture an image (e.g., a real-time image) of an environment (e.g., a home environment, office environment, etc.) comprising devices that may be monitored and/or controlled by a user.
- the image may, for example, be a real-time image, a periodically updated image, an image acquired during system configuration, etc.
- a graphical map image may also (and/or alternatively), for example, be similarly utilized.
- the system may then present the image to the user (e.g., on a display device, a touch input device, etc.).
- the user may then identify a device in the image with which the user desires to interact (e.g., in a monitoring and/or controlling capacity).
- the user may touch the device in the image on a touch screen, identify the device with a movable cursor, indicate the image with a light pen, etc.
- the system may then determine the user-identified device.
- Such determination may, for example, comprise analyzing a database of devices in a particular area.
- Such database may, for example, be maintained manually and/or automatically without user interaction.
- the system may, for example, establish a communication link (directly or indirectly) with the user-identified device.
- the system may then determine user interface options to present to the user.
- Such user interface options may, for example, be related to monitoring and/or controlling the user-identified device.
- the system may present the determined user interface options to the user.
- the system may then interact with the user (e.g., presenting user interface options to the user and receiving user input corresponding to such user interface options).
- the system may then form signals recognized by the user-identified device, where such signals correspond to the user input.
- Such signals may, for example, comprise data structures, command identifiers, etc., corresponding to a communication protocol understood by the user-identified device.
- the system may then communicate the formed signal to the user-identified device (e.g., utilizing a communication protocol understood by the user-identified device over a communication network to which the user-identified device is communicatively coupled).
- the system may then, depending on the nature of the user input, continue to interact with the user-identified device, for example transmitting additional commands/information to the user-identified device and/or receiving information from the user-identified device.
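The end-to-end flow sketched above (present an image, resolve the user's selection to a device, form a signal recognized by that device, and communicate it) can be illustrated in code. The following Python sketch is purely illustrative: the class and function names, the bounding-box selection areas, and the dictionary-based signal format are assumptions for demonstration, not structures described herein.

```python
# Illustrative sketch only; all names and data structures are assumptions.

class DeviceRegistry:
    """Maps user-selected image coordinates to known devices."""

    def __init__(self):
        # device id -> selection area (x_min, y_min, x_max, y_max) in image coordinates
        self._areas = {}

    def register(self, device_id, bbox):
        self._areas[device_id] = bbox

    def resolve(self, x, y):
        """Return the device whose selection area contains the point, if any."""
        for device_id, (x0, y0, x1, y1) in self._areas.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return device_id
        return None


def form_command(device_id, user_input):
    """Form a signal (here, a simple dict) corresponding to the user input."""
    return {"target": device_id, "command": user_input}


registry = DeviceRegistry()
registry.register("tv", (100, 50, 300, 200))
registry.register("thermostat", (400, 80, 440, 140))

selected = registry.resolve(210, 120)   # user touches the TV in the image
signal = form_command(selected, "power_on")
```

In a real system the resulting signal would then be serialized according to the communication protocol understood by the selected device and transmitted over the network to which that device is coupled.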
- FIG. 1 shows a flow diagram of a non-limiting exemplary method 100 for providing remote user selection of a device, in accordance with various aspects of the present invention.
- the method 100 may, for example, be implemented in a network controller device, gateway, router, access point, etc.
- Such network controller device may, for example, be implemented in a home device (e.g., a hub of a home computing network, a home-based desktop computer, a home gateway and/or router, etc.).
- the method 100 may also, for example, be implemented in a portable computing device.
- a portable computing device may, for example, be a laptop or notebook computer, handheld computing device, cellular phone, personal digital assistant, any personal electronic device, etc.
- a first portion may be performed in a communication network infrastructure device, for example a network controller, and a second portion may be performed in a personal electronic device or terminal.
- the method 100 may begin executing at step 105 .
- the method 100 may begin executing in response to any of a variety of causes and/or conditions, non-limiting examples of which will now be provided.
- the method 100 may begin executing in response to a user command to begin.
- a user may input a command to begin any of a large variety of tasks that include remote user identification of a device.
- the method 100 may begin executing in response to detection of a change in the device makeup of a particular environment.
- the method 100 may begin executing in response to detecting the presence of an unknown device within a particular geographic area (e.g., a premises, office, room, suite of rooms, etc.).
- the method 100 may begin executing in response to a determination that a device being tracked has moved or has potentially moved within a particular environment.
- the method 100 may begin executing in response to failure to establish a communication link with a device in a particular area (e.g., when such device is predicted to be in such area).
- the exemplary method 100 may begin executing periodically (e.g., at a constant or substantially constant period, for example hourly, daily, etc.), surveying one or more particular areas.
- the exemplary method 100 may, for example, begin executing in response to a timer and/or in response to a predefined time schedule. Additionally for example, the method 100 may begin executing in response to a system power-up or reset.
- the exemplary method 100 may begin executing at step 105 . Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular initiating cause or condition unless explicitly claimed.
- the exemplary method 100 may, for example at step 120 , comprise providing an image (e.g., one or more images) of a device environment to a user.
- image may comprise characteristics of any of a variety of types of images.
- the image may correspond to a camera (or photographic) image of a particular area (e.g., a room, office, set of rooms, etc.).
- step 120 may comprise providing a map image (e.g., a 2-dimensional and/or 3-dimensional map image) to the user.
- Step 120 may, for example, comprise presenting the image on a display (e.g., a touch screen) of a user device (e.g., a personal electronic device, a personal computer, a personal digital assistant, a smart phone, etc.).
- Step 120 may comprise presenting the image on a display of any of a variety of other devices (e.g., a network controller with a screen, an access point with a screen, a gateway with a screen, etc.).
- Step 120 may also, for example, comprise communicating the image (e.g., camera image and/or map image) information to a device (e.g., a user device) if such information does not originate at such device.
- step 120 may comprise utilizing a previously established communication link or may also comprise establishing a communication link configured for the communication of video and/or graphics information.
- step 120 may be performed in communication network infrastructure apparatus (e.g., a network controller) and comprise transmitting the image information to a personal computer terminal.
- Step 120 may comprise presenting graphics features in the presented image, where such graphics features indicate selectable devices in the image (e.g., devices for which communication has occurred in the past, devices for which communication information exists in a database, devices that are presently communicatively coupled to one or more known communication networks, etc.). Such graphical features may also indicate status of the device (e.g., currently on-line, currently off-line, sleeping, monitorable device, controllable device, device for which the user is or is not authorized to interact, new undefined device, failing device, device for which communication should be possible but is presently not possible, etc.).
- Step 120 may also comprise providing an output image or graphics feature indicating where a user is presently indicating.
- step 120 may comprise outputting a graphical feature (e.g., a cursor, etc.) indicating where the user is presently pointing, touching, looking, etc.
- step 120 may comprise outputting a graphical indication of an area being selected by a user (e.g., as such area is being defined by the user).
- the image presented at step 120 may comprise a non-real-time stored image and/or a real-time image acquired as such image is needed (e.g., during present execution of the exemplary method 100 , during present execution of step 120 , etc.).
- step 120 may comprise retrieving the image from a database (e.g., a local database and/or a remote networked database).
- a database may, for example and without limitation, have been formed and/or maintained in accordance with procedures or systems outlined in U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD FOR MANAGING A NETWORK OF USER-SELECTABLE DEVICES”, Attorney Docket No.
- step 120 may comprise presenting an image to the user that was obtained at some point in the past (e.g., at a regular periodic time point in the past, during system calibration defining user-selectable devices, during a most recent network maintenance and/or definition operation, etc.).
- step 120 may comprise presenting a camera image of a family room to a user, where such image was taken prior to the present execution of the exemplary method 100 .
- such previously obtained image may be utilized for a substantial duration of time before updating. For example, such an image need only be updated when a user-selectable device is added to and/or removed from and/or moved within the pictured or mapped environment.
- step 120 may comprise retrieving the image utilizing a camera.
- a user may utilize a user interface provided at step 120 to specify a particular room, in response to which step 120 may comprise utilizing a camera in the room to obtain a current image of the room.
- selection may be multi-tiered, for example, a user may first specify a building or system, then a room of the specified building or system, then a camera of the specified room, etc.
- Step 120 may similarly comprise providing a user interface by which the user may select a particular camera from which an image is desired.
- step 120 may present the user with a list of cameras from which to select (e.g., a kitchen camera, family room camera 1, family room camera 2, patio camera, pool camera, rec room camera 1, rec room camera 2, office camera 1, office camera 2, front door camera, etc.). The user may then select the desired camera.
- a list of cameras may, for example, be textual, graphical, image-based (e.g., showing a thumbnail image from each camera), etc.
- step 120 may also comprise providing for user control of a selected camera.
- step 120 may comprise providing a user interface to the user by which the user may control movement of the camera, operation of the camera lens (e.g., panning, zooming and/or focusing), operation of camera lighting, filtering, resolution, etc.
- step 120 may comprise providing the user with the ability to zoom in on a desired device to assist the user in selecting a device and/or defining an area and/or volume occupied by such device. Step 120 may thus provide for additional control of an image being presented to the user for use by the user in selecting a device, inputting information identifying location of the device, orientation of the device, spatial characteristics of the device, etc.
- step 120 may comprise providing an image (e.g., one or more images) of a device environment to a user.
- the scope of various aspects of the present invention should not be limited by characteristics of any particular image (e.g., a camera image, map image, etc.) and/or any particular manner of presenting such an image unless explicitly claimed.
- the exemplary method 100 may, for example at step 130 , comprise receiving an image selection input from a user.
- Step 130 may comprise receiving such an image selection input in any of a variety of manners, non-limiting examples of which will now be presented.
- step 130 may comprise receiving a touch screen input from a user. Also for example, step 130 may comprise receiving a cursor movement and selection input from a user (e.g., utilizing a mouse, touch pad, track ball, etc.). Further for example, step 130 may comprise receiving a light pen input from a user, a camera input from a user, etc.
- Step 130 may comprise receiving a user indication of a specific point in the image as input (e.g., a touch point, a cursor click point, etc.). Also for example, step 130 may comprise receiving a user input indicative of a selected area (e.g., an area enclosed by a figure drawn by the user) or volume.
- step 130 may comprise receiving information descriptive of a square, rectangle or other polygon drawn by the user.
- step 130 may comprise receiving information descriptive of a circle or oval drawn by the user.
- step 130 may comprise providing the user with a graphical toolset, by which the user may draw a polygon around the device, circle the device, paint the device, draw an ellipse around the device, define the device region with a plurality of figures, etc.
- graphical toolset may also comprise tools that automatically provide borders on a selected object (e.g., by performing pixel analysis to search for device edges).
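Determining whether a point (for example, a device's known image location) falls within a figure drawn by the user can be done with the standard ray-casting point-in-polygon test. The sketch below is a hypothetical illustration; no particular geometric test is prescribed herein, and the example polygon coordinates are invented.

```python
# Illustrative sketch of the standard ray-casting point-in-polygon test.

def point_in_polygon(x, y, polygon):
    """Count crossings of a horizontal ray cast from (x, y).

    `polygon` is a list of (x, y) vertices; an odd crossing count means
    the point lies inside the user-drawn figure.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        x0, y0 = polygon[i]
        x1, y1 = polygon[(i + 1) % n]
        # Does this edge straddle the ray's y coordinate?
        if (y0 > y) != (y1 > y):
            # x coordinate at which the edge crosses the ray
            x_cross = x0 + (y - y0) * (x1 - x0) / (y1 - y0)
            if x < x_cross:
                inside = not inside
    return inside


# A user roughly circles a device with a quadrilateral (invented coordinates):
figure = [(10, 10), (200, 20), (190, 150), (20, 140)]
```

A device whose stored image location passes this test for the user-drawn figure would be treated as the selected device.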
- step 130 may comprise receiving a user input that toggles between and/or traverses a plurality of selectable devices in the image presented to the user.
- step 130 may comprise receiving a user inputting a “next” command that causes traversal of the selectable devices (e.g., one at a time) until the user reaches the desired device.
- step 130 may comprise outputting a graphical indication (e.g., a highlighted area or polygon) indicating the presently selected device.
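Such traversal of selectable devices in response to a repeated "next" command might be realized as a simple cyclic iterator; the class below is an illustrative sketch with invented names.

```python
# Illustrative sketch only; names and behavior are assumptions.

class DeviceCycler:
    """Cycle through the selectable devices in an image, one per 'next' command."""

    def __init__(self, device_ids):
        self._ids = list(device_ids)
        self._index = -1  # nothing highlighted yet

    def next(self):
        """Advance the highlight to the next selectable device, wrapping around."""
        self._index = (self._index + 1) % len(self._ids)
        return self._ids[self._index]


cycler = DeviceCycler(["tv", "stereo", "thermostat"])
```

Each returned identifier would drive the graphical indication (e.g., a highlighted area) of the presently selected device.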
- step 130 may comprise receiving an image selection input from a user. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of such input and/or any particular manner of receiving such input unless explicitly claimed.
- the exemplary method 100 may, for example at step 140 , comprise identifying a device in the image that has been selected by the user.
- Step 140 may comprise identifying the selected device in any of a variety of manners.
- step 140 may comprise correlating the user device selection information received at step 130 with known device selection areas in the presented image.
- U.S. patent application Ser. No. ______ filed concurrently herewith, titled “SYSTEM AND METHOD FOR MANAGING A NETWORK OF USER-SELECTABLE DEVICES”, Attorney Docket No. 23010US02 and/or U.S. patent application Ser. No.
- a database may be formed that includes information defining device selection areas within an image (e.g., a pictorial or map image).
- step 140 may comprise comparing a user-selected location (or area or volume) in the image presented at step 120 , as received at step 130 , with such database information to determine whether the user-selected location (or area or volume) corresponds to a defined region of the image corresponding to a particular device.
- step 140 may comprise determining that the location (X, Y) has been previously defined to correspond to the location of Device A.
- Various exemplary manners of forming such a database were discussed in the above-mentioned incorporated patent applications.
- step 140 may comprise processing the user device selection information received at step 130 to determine a location in the image at which to begin performing image matching analysis.
- pixel characteristics of a previously defined image of a device may be processed and stored in a form (e.g., as pixel histograms, etc.) that may later be processed to identify such device appearing in an image.
- respective pixel histograms may be formed that correspond to respective devices (e.g., respective pixel histograms for a television, a stereo, a light switch, a receiver, a thermostat, etc.).
- Step 140 may, in such a scenario, comprise determining the previously defined histogram that best matches the area surrounding a user-selected point in the image and/or that best matches the characteristics of an area defined by a user input received at step 130 .
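A minimal version of such histogram matching might bin pixel intensities for the region surrounding the user-selected point and compare the result against stored per-device histograms, for example by histogram intersection. The binning scheme, the intersection metric, and all names below are illustrative assumptions.

```python
# Illustrative sketch of per-device pixel-histogram matching.

def histogram(pixels, bins=4, max_value=256):
    """Build a simple intensity histogram from an iterable of pixel values."""
    counts = [0] * bins
    width = max_value // bins
    for p in pixels:
        counts[min(p // width, bins - 1)] += 1
    return counts


def intersection(h1, h2):
    """Histogram intersection: a higher value means more similar."""
    return sum(min(a, b) for a, b in zip(h1, h2))


def best_match(region_pixels, stored):
    """Return the device whose stored histogram best matches the region."""
    h = histogram(region_pixels)
    return max(stored, key=lambda device: intersection(h, stored[device]))


# Stored histograms previously computed for each device (invented values):
stored = {
    "television": histogram([10, 20, 30, 40, 200, 210]),
    "lamp":       histogram([220, 230, 240, 250, 255, 245]),
}

# The region around the user's touch point is mostly bright pixels:
match = best_match([225, 235, 252, 248, 240, 230], stored)
```

A production system would likely use finer binning, color channels, and a more robust distance metric, but the matching principle is the same.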
- step 140 may comprise performing character recognition processing.
- Such processing may, for example, comprise analyzing an area of the image near a point specified by the user to determine whether any recognizable characters (e.g., alphabetical characters, numerical characters, fanciful logos, etc.) are present in the area of the image.
- the system implementing the method 100 may know that there is only one device in the family room that has been produced by a particular manufacturer. The presence of a logo associated with such manufacturer in the image relatively near a user's designated point in the image may be utilized by step 140 as an indication that such device has been selected by the user.
- step 140 may comprise analyzing location information to identify a particular device that has been selected by a user. For example, the camera's location and field of view may be processed in conjunction with a user-identified location in the image to draw a line extending out from the camera through the camera image plane. In such a scenario, step 140 may comprise determining that the first device encountered along such line extending outward from the camera lens (in other words, the first device whose known location lies along such line) corresponds to the user-selected device.
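This line-of-sight determination might be sketched, in simplified two-dimensional form, as projecting a ray from the camera position through the user-selected image point and returning the nearest device whose known location lies approximately on that ray. All geometry, tolerances, and coordinates below are invented for illustration.

```python
# Illustrative 2-D sketch of the "first device along the line of sight" idea.
import math

def first_device_along_ray(camera, direction, devices, tolerance=0.5):
    """Return the nearest device whose known location lies near the ray.

    `camera` is the lens position, `direction` a unit vector derived from
    the user-selected point in the image plane, and `devices` maps device
    ids to known (x, y) locations.
    """
    best_id, best_t = None, math.inf
    for device_id, (px, py) in devices.items():
        dx, dy = px - camera[0], py - camera[1]
        # Distance along the ray to the foot of the perpendicular
        t = dx * direction[0] + dy * direction[1]
        if t <= 0:
            continue  # device is behind the camera
        # Perpendicular distance from the device location to the ray
        off = math.hypot(dx - t * direction[0], dy - t * direction[1])
        if off <= tolerance and t < best_t:
            best_id, best_t = device_id, t
    return best_id


devices = {"tv": (5.0, 0.1), "stereo": (9.0, -0.2), "lamp": (4.0, 3.0)}
# Ray pointing along +x from a camera at the origin:
selected = first_device_along_ray((0.0, 0.0), (1.0, 0.0), devices)
```

A full implementation would work in three dimensions and derive the ray direction from the camera's calibrated field of view, but the nearest-along-ray selection logic carries over directly.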
- step 140 may comprise correlating the user-identified image location and field of view with a generally analogous field of received RF energy. For example, the user-identified image location may be relatively closer to an identified RF hot-spot in such image field of view than to other RF hot-spots. In such a scenario, step 140 may comprise determining that the user has identified the device corresponding to the closest RF hot-spot in the image field of view. Similarly, infrared processing may identify devices emitting heat energy and determine the closest heat-emitting device in the field of view relative to the user-identified location in the image.
- step 140 may also comprise establishing a communication link with a determined device (or candidate device) to determine additional information about such device that may be utilized in determining whether the user has identified such device. For example, such communication link(s) may be established to communicate current device location information, to communicate test messages, to determine whether the user has access to a particular device, etc.
- step 140 may comprise identifying a device in the image that has been selected by the user. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of performing such identification unless explicitly claimed.
- the exemplary method 100 may, for example at step 195 , comprise performing continued processing. Various non-limiting examples of such processing will be provided below.
- step 195 may comprise returning execution flow of the method 100 to any previous step.
- step 195 may comprise communicating information indicating the device identified by the system to the user. Such operation may, for example, provide the user the opportunity to determine whether the correct device was identified. Such communicating may, for example, comprise outputting a graphical feature on the image presented at step 120 to identify the device identified at step 140.
- step 195 may comprise communicating information of the identified device to any of a variety of functional modules that may utilize such information.
- step 195 may comprise determining a desired user action to take regarding the user-identified device and performing such action.
- step 195 may, for example, comprise interfacing with one or more databases (e.g., local and/or remote databases) to acquire information regarding such user-selected device.
- Such information may, for example, be presented to the user, stored in a database record, utilized to upgrade and/or reconfigure the identified device, utilized to establish a communication link with such device, and/or utilized to configure secure communication with such device. Such information may, for example, comprise purchasing information, manufacturer information, technical assistance information, cost information, operation instructions, remote operation instructions, personal information associated with such device, images related to such device, history of such device, etc.
- step 195 may comprise presenting to a user, via a user interface, the options available to the user concerning interaction with such identified device. Further, as will be discussed in more detail below, step 195 may comprise establishing a communication network with the identified device and/or establishing a remote control environment between the user and the identified device.
- step 195 may comprise interfacing with the user to acquire an image of the selected device (or object) to communicate to a destination.
- Such continued processing may also, for example, comprise passing control of a selected device to another entity (e.g., to a service person, etc.).
- Additional examples of such continued processing may, for example, comprise any or all of the functionality discussed in co-pending applications U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD FOR MANAGING A NETWORK OF USER-SELECTABLE DEVICES”, Attorney Docket No. 23010US02; U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD FOR AUTOMATICALLY MANAGING A NETWORK OF USER-SELECTABLE DEVICES”, Attorney Docket No. 23011US02; and U.S. patent application Ser. No.
- step 195 may comprise performing continued processing. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular type of continued processing unless explicitly claimed.
- FIG. 2 shows a flow diagram of a non-limiting exemplary method 200 for providing remote user selection of a device, in accordance with various aspects of the present invention.
- the exemplary method 200 may, for example and without limitation, share any or all aspects with the exemplary method 100 discussed above.
- the exemplary method 200 may, for example at step 205 , share any or all aspects with step 105 of the exemplary method 100 illustrated in FIG. 1 and discussed previously.
- the exemplary method 200 may, for example at step 220 , comprise presenting (or providing) an image (e.g., one or more images) of a device environment to a user.
- Step 220 may, for example, share any or all characteristics with step 120 discussed above.
- step 220 may, at sub-step 222 , comprise establishing a user interface (local or remote).
- a user interface may, for example, be utilized by a user to request and/or identify an image (e.g., a camera image, graphical image, map image, etc.) to be presented to the user.
- sub-step 222 may comprise establishing communication links between a user device and other devices depending on where various aspects are implemented.
- FIGS. 4-6 which will be discussed later, provide non-limiting illustrations of a variety of system configurations, including communication links between various system components and devices.
- a user device may operate the entire user interface (U/I) (e.g., executing all or most U/I software instructions and utilizing user interface modules to interact with the user).
- the user device may effectively operate as a U/I terminal for another device (e.g., a gateway, network controller, access point, etc.) that is executing all or most of the U/I software instructions, for example acting primarily as a conduit of user interface information, which is mostly processed by another device.
- Step 220 may, for example at sub-step 224 , comprise receiving a user input requesting an image (e.g., a camera image or map).
- sub-step 224 may comprise receiving such a user input via the user interface established at sub-step 222 .
- Sub-step 224 may, for example, comprise receiving a user request for a view of a particular area (e.g., a camera image, a 2-D and/or 3-D graphical map image of an area, etc.).
- Sub-step 224 may, for example, comprise receiving a request for an image from a particular camera or a map of a particular area (e.g., a room).
- sub-step 224 may comprise providing a graphical user interface to a user that provides the user the capability to select a particular image, room to be imaged, camera from which to acquire an image, etc.
- Step 220 may, for example at sub-step 226 , comprise acquiring an image (e.g., a camera image or map image) (e.g., the image requested at step 224 ).
- Sub-step 226 may comprise acquiring such image in any of a variety of manners.
- sub-step 226 may comprise identifying the desired image (e.g., as expressed by the user at sub-step 224 ).
- Sub-step 226 may, for example, comprise acquiring a non-real-time image stored previously (e.g., in a database, for example a local database and/or remote database).
- sub-step 226 may comprise interfacing with a camera (e.g., a fixed camera, user-controllable camera, etc.) to acquire a real-time image from such camera (e.g., acquiring the real-time image when the need for such an image is realized).
- sub-step 226 may comprise providing user control over a controllable camera (e.g., panning, zooming, resolution, filtering, etc.).
- Sub-step 226 may, for example, comprise utilizing the communication link established at sub-step 222 and/or may also comprise establishing a communication link specifically configured for the communication of video and/or graphics information.
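- The stored-versus-real-time acquisition described for sub-step 226 could be sketched as below; the cache dictionary and the live-fetch callback are assumptions for illustration only:

```python
def acquire_image(request_key, stored_images, fetch_live):
    """Return a previously stored (non-real-time) image if one exists;
    otherwise acquire a real-time image (e.g., by interfacing with a
    camera) and retain it for later requests."""
    if request_key in stored_images:       # stored image available
        return stored_images[request_key]
    image = fetch_live(request_key)        # interface with the camera on demand
    stored_images[request_key] = image     # store for subsequent requests
    return image

db = {"kitchen-map": b"MAP-BYTES"}
live = acquire_image("cam-1", db, lambda key: b"LIVE-" + key.encode())
cached = acquire_image("kitchen-map", db, lambda key: b"UNUSED")
```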
- Step 220 may, for example at sub-step 228 , comprise presenting the image (e.g., camera image, map image, etc.) acquired at sub-step 226 to the user (e.g., via the user interface established at sub-step 222 ).
- Sub-step 228 may comprise presenting the image in any of a variety of manners.
- sub-step 228 may comprise presenting the image on a display (e.g., a touch screen) of a user device (e.g., a personal computing device, smart phone, etc.), many examples of which have been presented herein.
- Sub-step 228 may, for example, comprise presenting the image on a display of a communication network infrastructure device (e.g., a network controller with a screen, access point with a screen, gateway with a screen, etc.).
- Sub-step 228 may, for example, comprise communicating information describing the image (e.g., a digital image data file) to a user device (e.g., if such information is not already at such device).
- sub-step 228 may comprise utilizing a communication link established at sub-step 222 and/or may also comprise establishing and utilizing a communication link specifically configured for the communication of video and/or graphics information.
- sub-step 228 may comprise presenting graphics features indicating selectable devices in the image (e.g., devices for which communication has occurred in the past, devices for which communication information exists in the database, devices that are presently communicatively coupled to one or more known communication networks, etc.). Such graphical features may also indicate status of the device (e.g., currently on-line, currently off-line, sleeping, monitorable device, controllable device, device for which the user is authorized to interact with or not, new non-defined device, failing device, device for which communication should be possible but which is presently not, etc.).
- sub-step 228 may comprise presenting an on-image indication of where a user is presently indicating.
- sub-step 228 may comprise outputting a graphical feature (e.g., a cursor, etc.) indicating where the user is presently pointing, touching, looking, etc.
- sub-step 228 may comprise outputting a graphical indication of an area being selected by a user (e.g., as such area is being defined by the user).
- step 220 may comprise providing an image (e.g., one or more camera images, one or more map images, etc.) of a device environment to a user.
- the scope of various aspects of the present invention should not be limited by characteristics of any particular type of image and/or any particular manner of providing such an image to a user unless explicitly claimed.
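- The overlay of selectable-device markers and status indications described for sub-step 228 could be sketched as follows; the record fields (`region`, `known`, `status`) are hypothetical names, not part of the disclosure:

```python
def build_overlays(device_records):
    """Produce one graphical marker per device in the image, carrying the
    device's on-image region, whether it is selectable, and its status
    (e.g., on-line, off-line, sleeping)."""
    overlays = []
    for rec in device_records:
        overlays.append({
            "device_id": rec["id"],
            "region": rec["region"],          # (x, y, width, height) on the image
            "selectable": rec.get("known", False),
            "status": rec.get("status", "unknown"),
        })
    return overlays

records = [
    {"id": "thermostat", "region": (10, 20, 30, 30), "known": True, "status": "on-line"},
    {"id": "dvr", "region": (60, 20, 40, 25), "known": True, "status": "sleeping"},
]
markers = build_overlays(records)
```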
- the exemplary method 200 may, for example at steps 230 - 295 , share any or all aspects with steps 130 - 195 of the exemplary method 100 illustrated in FIG. 1 and discussed previously.
- FIG. 3 shows a block diagram 300 of a non-limiting exemplary database for automatic network maintenance and/or characterization, and/or for utilization for device identification in accordance with various aspects of the present invention.
- identification of a selected device may be performed via interaction with a database.
- as mentioned in the discussions of other method steps, the acquisition of information related to a selected device may comprise interfacing with one or more local and/or remote databases.
- FIG. 3 provides a non-limiting illustration of such a database.
- other method steps may comprise the formation and/or maintenance of a database where devices of an image may be related to respective device records that include any of a variety of different types of device information (many examples of which were provided above).
- the database may, for example, comprise a memory 361 .
- a memory 361 may, for example, be a non-volatile memory (or other non-transitory data retention device) such as a magnetic disk hard drive, optical disk drive, CD, DVD, laser disc, Blu-ray disc, diskette, flash drive, EPROM, EEPROM, flash memory, etc.
- the memory 361 may, in the illustrated example, include information of an image 320 (or plurality of images), where areas or features of the image may be related to records, which may also be stored in the memory 361 .
- the image 320 may comprise a still photograph or other camera image.
- the image 320 may comprise an image of a thermostat that is linked to a thermostat record 331 .
- the image 320 may also comprise an image of a DVR that is logically linked to a DVR record 332 , an image of a stereo that is logically linked to a stereo record 333 , an image of a television that is logically linked to a television record 334 , an image of a set top box (“STB”) and/or TV receiver that is logically linked to an STB/Receiver record 335 , an image of a power switch that is logically linked to a power switch record 336 and an image of a camera that is logically linked to a camera record 337 .
- the devices of the image 320 may be defined by respective areas (e.g., boxes, outlines, etc.) input by a user and/or automatically defined (e.g., utilizing pixel analysis).
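- The linkage between image areas and device records (e.g., records 331 - 337 ) implies a point-in-region lookup when the user indicates a location on the image; a minimal sketch, assuming each defined area is a rectangular bounding box:

```python
def identify_device(x, y, linked_regions):
    """Map an on-image selection point to the device whose defined area
    (a bounding box) contains that point; return None on a miss."""
    for device_id, (left, top, width, height) in linked_regions.items():
        if left <= x < left + width and top <= y < top + height:
            return device_id
    return None

regions = {
    "thermostat": (0, 0, 50, 50),    # e.g., linked to record 331
    "television": (100, 0, 80, 60),  # e.g., linked to record 334
}
hit = identify_device(120, 30, regions)
miss = identify_device(500, 500, regions)
```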
- various aspects of the present invention may comprise providing a database interface 390 by which various processes (e.g., a process implementing steps 140 and/or 240 ) may access and utilize the information stored in the memory 361 .
- interface 390 may be implemented in a variety of manners.
- such interface 390 may be implemented by a processor 360 or other hardware device executing instructions, which may, for example, be stored in the memory 361 or another memory (e.g., either collocated with the memory 361 or geographically distinct from the memory 361 ).
- the database interface 390 may comprise modules stored in a variety of locations (e.g., at a central controller, server, gateway, access point, and/or user computing device).
- the system 300 illustrated in FIG. 3 also comprises one or more processors, communication interface modules, or any other hardware components 360 that may interact with the memory 361 via the database interface 390 .
- processors may operate to interface with the database 361 while implementing the exemplary methods 100 , 200 illustrated in FIGS. 1-2 and discussed previously.
- FIG. 4 shows a block diagram of a non-limiting exemplary system 400 for providing remote user selection of a device, in accordance with various aspects of the present invention.
- the exemplary system 400 (or components thereof) may operate to perform any or all of the functional characteristics discussed herein (e.g., with regard to the exemplary methods 100 and 200 , with regard to the exemplary database 300 , etc.).
- the exemplary system 400 may, for example, comprise a personal computer system 480 , which is communicatively coupled to a variety of devices via one or more communication networks 405 (e.g., the Internet, a telecommunication network, cable television network, satellite communication network, local area network, personal area network, metropolitan area network, wide area network, campus network, home network, etc.).
- Such network(s) may, for example, be wired, wireless RF, tethered optical, non-tethered optical, etc.
- the personal computer system 480 may comprise one or more processors 460 and a database 461 .
- the personal computer system 480 may, for example, operate to perform any or all of the method steps discussed previously.
- the database 461 may comprise any or all database characteristics discussed herein.
- the personal computer system 480 may, for example, operate to provide for user selection (e.g., remote user selection) of an electronic device in a network that comprises any or all of the non-limiting exemplary devices (e.g., the thermostat 431 , DVR 432 , stereo 433 , television 434 , STB/Rcvr 435 , power switch 436 and camera 437 ).
- the processor 460 may operate to provide for user selection of any or all of such devices in a same communication network and/or in separate communication networks.
- the personal computer system 480 may, for example, operate to interact with the user to implement any of the above-mentioned functionality.
- the personal computer system 480 may operate to obtain image information (or map information) from a database 461 and/or camera 437 to present an image (e.g., a camera image, a map image, etc.) to a user thereof.
- the personal computer system 480 may then operate to receive image selection input from a user of such system, identify the user-selected device, and perform continued processing regarding such user-selected device.
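- The flow described for system 400 (present an image, receive a selection, identify the device, then continue processing) might be orchestrated as in the following sketch; every function name here is hypothetical:

```python
def selection_flow(get_image, present, read_selection, identify, process):
    """End-to-end sketch: acquire and present an image, take the user's
    on-image selection, resolve it to a device, then hand off to
    continued processing (e.g., monitoring and/or control)."""
    image = get_image()
    present(image)
    point = read_selection()
    device = identify(point)
    return process(device)

result = selection_flow(
    get_image=lambda: "room-image",
    present=lambda img: None,                 # e.g., draw on the system display
    read_selection=lambda: (120, 30),         # e.g., a touch-screen coordinate
    identify=lambda pt: "television" if pt == (120, 30) else None,
    process=lambda dev: "controlling " + str(dev),
)
```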
- FIG. 5 shows a block diagram of a non-limiting exemplary system 500 for providing remote user selection of a device, in accordance with various aspects of the present invention.
- the exemplary system 500 (or components thereof) may operate to perform any or all of the functional characteristics discussed herein (e.g., with regard to the exemplary methods 100 and 200 illustrated in FIGS. 1 and 2 , with regard to the exemplary database illustrated in FIG. 3 , with regard to the exemplary system 400 illustrated in FIG. 4 , etc.).
- the exemplary system 500 may, for example, comprise a personal computer system 580 , which is communicatively coupled to a gateway, network controller and/or access point 585 via one or more communication networks 505 (e.g., the Internet, a telecommunication network, cable television network, satellite communication network, local area network, personal area network, metropolitan area network, wide area network, campus network, home network, etc.).
- Such network(s) may, for example, be wired, wireless RF, tethered optical, non-tethered optical, etc.
- the gateway, network controller and/or access point 585 is communicatively coupled to a variety of devices via one or more local communication networks 506 (e.g., local area network(s), personal area network(s), home network(s), office network(s), etc.).
- local communication network(s) may, for example, be wired, wireless RF, tethered optical, non-tethered optical, etc.
- the personal computer system 580 may comprise one or more processors 560 and a database 561 .
- the personal computer system 580 may, for example, operate to perform any or all of the method steps discussed previously.
- the database 561 may comprise any or all database characteristics discussed herein.
- the personal computer system 580 may, for example and without limitation, operate to provide for user selection of an electronic device in a network that comprises any or all of the previously exemplified thermostat 531 , DVR 532 , stereo 533 , television 534 , STB/Rcvr 535 , power switch 536 and camera 537 .
- the processor 560 may operate to provide for user selection of any or all of such devices in a same communication network and/or in separate communication networks.
- the personal computer system 580 may, for example, operate to interact with the user to implement any of the above-mentioned functionality.
- the personal computer system 580 may (e.g., utilizing a communication module) operate to establish a communication link with any of the devices illustrated (e.g., including the camera 537 ) via the communication network(s) 505 ; local gateway, network controller and/or access point 585 ; and communication network(s) 506 ; and provide the capability to view and/or select a device.
- Such operation may, for example, comprise interfacing with the camera 537 to obtain a photographic image of a device environment and/or obtaining image or map information from the database 561 .
- the personal computer system 580 may, for example, operate to receive image selection input from a user of such system, identify the user-selected device, and perform continued processing regarding such user-selected device.
- FIG. 6 shows a block diagram of a non-limiting exemplary system 600 for providing remote user selection of a device, in accordance with various aspects of the present invention.
- the exemplary system 600 (or components thereof) may operate to perform any or all of the functional characteristics discussed herein (e.g., with regard to the exemplary methods 100 and 200 illustrated in FIGS. 1 and 2 , with regard to the exemplary database illustrated in FIG. 3 , with regard to the exemplary systems 400 and 500 illustrated in FIGS. 4 and 5 , etc.).
- the exemplary system 600 may, for example, comprise a personal computer system 680 , which is communicatively coupled to a gateway, network controller and/or access point 685 via one or more communication networks 605 (e.g., the Internet, a telecommunication network, cable television network, satellite communication network, local area network, personal area network, metropolitan area network, wide area network, campus network, home network, etc.).
- Such network(s) may, for example, be wired, wireless RF, tethered optical, non-tethered optical, etc.
- the gateway, network controller and/or access point 685 is communicatively coupled to a variety of devices via one or more local communication networks 606 (e.g., local area network(s), personal area network(s), home network(s), office network(s), etc.).
- local communication network(s) may, for example, be wired, wireless RF, tethered optical, non-tethered optical, etc.
- the gateway, network controller and/or access point 685 may comprise one or more processors 660 and a database 661 .
- the gateway, network controller and/or access point 685 may, for example, operate to perform any or all of the method steps discussed previously.
- the database 661 may comprise any or all database characteristics discussed herein.
- the gateway, network controller and/or access point 685 may, for example and without limitation, operate to provide for user selection of an electronic device in a network that comprises any or all of the previously exemplified thermostat 631 , DVR 632 , stereo 633 , television 634 , STB/Rcvr 635 , power switch 636 and camera 637 .
- the processor 660 may operate to provide for user selection of any or all of such devices in a same communication network and/or in separate communication networks.
- the personal computer system 680 and/or local gateway(s), network controller(s) and/or access point(s) 685 may, for example, operate to perform any of the above-mentioned functionality.
- the personal computer system 680 may operate to receive an indication from the user that the user desires to select a device from a particular environment (e.g., a particular room, premises, office, etc.).
- the personal computer system 680 may then, for example, operate to establish a communication link with the local gateway(s), network controller(s) and/or access point(s) 685 .
- functional aspects discussed previously may be performed by the local gateway, network controller and/or AP 685 instead of by the personal computer system 680 .
- the personal computer system 680 and local gateway, network controller or AP 685 may distribute performance of any of the previously discussed method steps among the various system entities.
- the local gateway, network controller and/or access point 685 may operate to provide a communication link between the personal computer system 680 and the camera, by which the personal computer system 680 may obtain an image from the camera and present such image to the user.
- a communication link may also comprise characteristics providing for user control of the camera 637 .
- the personal computer system 680 may operate to establish a communication link with the local gateway, network controller and/or access point 685 over which such desired image may be communicated from the database 661 to the personal computer system 680 .
- the processor 660 may operate to perform any or all of the device-identifying functionality discussed previously and/or any or all of the continued processing functionality discussed previously.
- the gateway, network controller and/or access point 685 may (e.g., utilizing one or more communication modules) operate to establish a communication link with a user-selected device via the communication network(s) 606 , and establish a communication pathway between the personal computer system 680 and the user-selected device. The gateway, network controller and/or access point 685 may then operate to provide the capability to monitor and/or control the selected device.
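- The communication pathway that the gateway 685 establishes between the personal computer system 680 and a user-selected device can be sketched as a simple relay; the message format and handler callables below are assumptions for illustration:

```python
class GatewayRelay:
    """Sketch of a gateway forwarding monitor/control traffic between a
    user system and a selected local device, assuming both are reachable."""
    def __init__(self, local_devices):
        self.local_devices = local_devices   # device_id -> handler callable

    def forward(self, device_id, command):
        handler = self.local_devices.get(device_id)
        if handler is None:
            return {"ok": False, "error": "device unreachable"}
        return {"ok": True, "reply": handler(command)}

relay = GatewayRelay({"power_switch": lambda cmd: "ON" if cmd == "on" else "OFF"})
reply = relay.forward("power_switch", "on")
bad = relay.forward("toaster", "on")
```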
- the exemplary system 600 locates most of the previously discussed functionality in the gateway, network controller and/or access point 685 (e.g., as opposed to the personal computer system 680 ).
- Such an implementation may, for example, remove system complexity from the personal computer 680 , which may for example have limited energy, memory and/or processing capabilities, and place such complexity in a central location, which may for example have relatively increased energy, memory and/or processing capabilities.
- Such an arrangement also provides for a plurality of personal computer systems to perform the disclosed operation with a minimum of additional complexity.
- FIG. 7 shows a block diagram of a non-limiting exemplary device (or system) 700 for providing remote user selection of a device, in accordance with various aspects of the present invention.
- the exemplary device 700 (or various components thereof) may, for example, operate to perform any or all functionality discussed previously with regard to FIGS. 1-6 .
- the exemplary device 700 may share any or all characteristics with the database system 300 and/or personal computer systems 480 , 580 and 680 discussed previously.
- the exemplary device 700 may share any or all characteristics with the local gateways, remote controllers and/or access points 585 , 685 discussed previously.
- the exemplary device 700 may share any or all characteristics with the exemplary device (e.g., terminal devices) 331 - 337 , 431 - 437 , 531 - 537 and 631 - 637 discussed previously.
- the exemplary device 700 may, for example, comprise a first communication interface module 710 .
- the first communication interface module 710 may, for example, operate to communicate over any of a variety of communication media and utilizing any of a variety of communication protocols.
- although the first communication interface module 710 is illustrated coupled to a wireless RF antenna via a wireless port 712 , the wireless medium is merely illustrative and non-limiting.
- the first communication interface module 710 may, for example, operate to communicate with one or more communication networks (e.g., cable television networks, satellite television networks, telecommunication networks, the Internet, local area networks, personal area networks, metropolitan area networks, etc.) via which data is communicated.
- the exemplary device 700 comprises a second communication interface module 720 .
- the second communication interface module 720 may, for example, operate to communicate over any of a variety of communication media and utilizing any of a variety of communication protocols.
- the second communication interface module 720 may communicate via a wireless RF communication port 722 and antenna, or may communicate via a non-tethered optical communication port 724 (e.g., utilizing laser diodes, photodiodes, etc.).
- the second communication interface module 720 may communicate via a tethered optical communication port 726 (e.g., utilizing a fiber optic cable), or may communicate via a wired communication port 728 (e.g., utilizing coaxial cable, twisted pair, HDMI cable, Ethernet cable, any of a variety of wired component and/or composite video connections, etc.).
- the second communication interface module 720 may, for example, operate to communicate with one or more communication networks (e.g., cable television networks, satellite television networks, telecommunication networks, the Internet, local area networks, personal area networks, metropolitan area networks, etc.) via which data is communicated.
- the second communication module 720 may operate to communicate with local devices.
- the exemplary device 700 may also comprise additional communication interface modules, which are not illustrated. Such additional communication interface modules may, for example, share any or all aspects with the first 710 and second 720 communication interface modules discussed above.
- the exemplary device 700 may also comprise a communication module 730 .
- the communication module 730 may, for example, operate to control and/or coordinate operation of the first communication interface module 710 and the second communication interface module 720 (and/or additional communication interface modules as needed).
- the communication module 730 may, for example, provide a convenient communication interface by which other components of the device 700 may utilize the first 710 and second 720 communication interface modules. Additionally, for example, in an exemplary scenario where a plurality of communication interface modules are sharing a medium and/or network, the communication module 730 may coordinate communications to reduce collisions and/or other interference between the communication interface modules 710 , 720 .
- the communication module 730 may, for example, operate to utilize one or more of the communication interface modules 710 , 720 to perform any or all of the communication functionality discussed herein (e.g., with regard to any of the steps illustrated in FIGS. 1 and 2 and discussed previously).
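- The coordination described for the communication module 730 could be sketched as a dispatcher selecting an interface module per destination network; the routing table and interface names below are assumptions, not part of the disclosure:

```python
class CommunicationModule:
    """Sketch of module 730: route each outgoing message through the
    appropriate communication interface module (e.g., 710 or 720)."""
    def __init__(self, interfaces, routes):
        self.interfaces = interfaces   # interface name -> send callable
        self.routes = routes           # network name -> interface name

    def send(self, network, payload):
        iface = self.routes.get(network, "first")  # assumed default interface
        return self.interfaces[iface](payload)

sent = []
comm = CommunicationModule(
    interfaces={"first": lambda p: sent.append(("wan", p)) or "wan",
                "second": lambda p: sent.append(("lan", p)) or "lan"},
    routes={"internet": "first", "home": "second"},
)
used = comm.send("home", "status?")
```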
- the exemplary device 700 may additionally comprise one or more user interface modules 740 .
- the user interface module(s) 740 may generally operate to provide user interface functionality to a user of the device 700 .
- the user interface module(s) 740 may operate to perform any or all of the exemplary user interface functionality discussed herein (e.g., with regard to any of FIGS. 1-6 ).
- the user interface module(s) 740 may operate to provide for user control of any or all standard device commands.
- the user interface module(s) 740 may, for example, operate and/or respond to user commands utilizing user interface features disposed on the device 700 (e.g., buttons, touch screen, microphone, etc.) and may also utilize the communication module 730 (and/or first 710 and second 720 communication interface modules) to communicate with the device 700 and/or any device that is communicatively coupled thereto (e.g., to control any monitoring and/or controlling functionality discussed herein).
- the user interface module 740 may also operate to interface with and/or control operation of any of a variety of sensors that may be utilized to ascertain an on-screen pointing location. As discussed elsewhere herein, the user may specify location and/or other characteristics of a device.
- the exemplary device 700 may comprise a display 750 .
- the user may utilize a display to view and/or interact with maps and/or images of network environments (e.g., for device location input and/or definition, for device selection, etc.). Such interaction may, for example, utilize a display 750 of the device 700 , which may also, for example, be utilized for user input (e.g., as a touch screen, utilizing cursor control, etc.).
- the exemplary device 700 may comprise one or more processors 760 .
- the processor(s) 760 may, for example, comprise a general purpose processor, digital signal processor, application-specific processor, microcontroller, microprocessor, etc.
- the processor(s) 760 may, for example, share any or all characteristics with processors discussed elsewhere herein (e.g., processors 360 , 460 , 560 and 660 ).
- the processor 760 may operate in accordance with software (or firmware) instructions.
- any or all functionality discussed herein may be performed by a processor executing instructions.
- though various modules are illustrated as separate blocks or modules in FIG. 7 , any or all of such modules may be implemented by the processor(s) 760 executing software instructions.
- the processor(s) 760 may operate to perform any or all of the steps discussed previously with regard to FIGS. 1 and 2 .
- the exemplary device 700 may comprise one or more memories 761 . As discussed above, various aspects may be performed by one or more processors executing instructions. Such instructions may, for example, be stored in the one or more memories 761 .
- Such memory 761 may, for example, comprise characteristics of any of a variety of types of memory.
- such memory 761 may comprise one or more memory chips (e.g., ROM, RAM, EPROM, EEPROM, flash memory, one-time-programmable OTP memory, etc.), hard drive memory, CD memory, DVD memory, etc.
- the memory 761 may share any or all characteristics with any of the databases discussed herein (e.g., databases 361 , 461 , 561 and 661 ).
- the exemplary device 700 may also comprise one or more image (e.g., pictorial image, map image, etc.) presentation modules 771 .
- image presentation modules 771 may, for example and without limitation, operate to perform any or all of the functionality discussed herein with regard to presenting an image to a user (e.g., at steps 120 and 220 ).
- the exemplary device 700 may also comprise one or more user device selection modules 772 .
- Such module(s) 772 may, for example and without limitation, operate to perform any or all of the functionality discussed herein with regard to receiving user device selection input (e.g., at steps 130 and 230 ).
- the exemplary device 700 may also comprise one or more device identification modules 774 .
- Such module(s) 774 may, for example and without limitation, operate to perform any or all of the functionality discussed herein with regard to identifying a user-selected device (e.g., at steps 140 and 240 ).
- the exemplary device 700 may also comprise one or more database interface modules 790 .
- Such module(s) 790 may, for example and without limitation, operate to perform any or all of the database interface functionality discussed herein.
- FIG. 8 shows a block diagram of a non-limiting exemplary device 800 for providing remote user selection of a device, in accordance with various aspects of the present invention.
- FIG. 7 provided a diagram illustrating an exemplary device (or system) 700 in accordance with various aspects of the present invention.
- FIG. 8 provides another diagram illustrating an exemplary device (or system) 800 in accordance with various aspects of the present invention.
- the exemplary device 800 may share any or all aspects with any of the devices (e.g., portable computer devices, access points, gateways, network controllers, terminal devices, etc.) discussed herein (e.g., with regard to FIGS. 1-7 ).
- the exemplary device 800 (or various modules thereof) may operate to perform any or all functionality discussed herein.
- the components of the exemplary device 800 may be co-located in a single housing.
- the device 800 comprises a processor 860 .
- a processor 860 may, for example, share any or all characteristics with the processor 760 discussed with regard to FIG. 7 .
- the device 800 comprises a memory 861 .
- Such memory 861 may, for example, share any or all characteristics with the memory 761 discussed with regard to FIG. 7 .
- the exemplary device (or system) 800 may comprise any of a variety of user interface module(s) 840 .
- Such user interface module(s) 840 may, for example, share any or all characteristics with the user interface module(s) 740 discussed previously with regard to FIG. 7 .
- the user interface module(s) 840 may comprise: a display device, a camera (for still or moving picture acquisition), a speaker, an earphone (e.g., wired or wireless), a microphone, a video screen (e.g., a touch screen display), a vibrating mechanism, a keypad, a remote control interface, and/or any of a variety of other user interface devices (e.g., a mouse, a trackball, a touch pad, touch screen, light pen, game controlling device, etc.).
- the exemplary device 800 may also, for example, comprise any of a variety of communication modules ( 805 , 806 , and 830 ). Such communication module(s) may, for example, share any or all characteristics with the communication interface module(s) 710 , 720 and the communication module 730 discussed previously with regard to FIG. 7 .
- the communication interface module(s) 830 may comprise: a Bluetooth interface module; an IEEE 802.11, 802.15, 802.16 and/or 802.20 module; any of a variety of cellular telecommunication interface modules (e.g., GSM/GPRS/EDGE, CDMA/CDMA2000/1x-EV-DO, WCDMA/HSDPA/HSUPA, TDMA/PDC, WiMAX, etc.); any of a variety of position-related communication interface modules (e.g., GPS, A-GPS, etc.); any of a variety of wired/tethered communication interface modules (e.g., USB, FireWire, RS-232, HDMI, component and/or composite video, Ethernet, wire line and/or cable modem, etc.); any of a variety of communication interface modules related to communicating with external memory devices; etc.
- the exemplary device 800 is also illustrated as comprising various wired 806 and/or wireless 805 front-end modules that may, for example, be included in the communication interface modules and/or
- the exemplary device (or system) 800 may also comprise any of a variety of signal processing module(s) 865 .
- Such signal processing module(s) 865 may, for example, be utilized to assist in processing various types of information discussed previously (e.g., with regard to sensor processing, position or location determination, orientation determination, video processing, image processing, audio processing, general user interface information data processing, etc.).
- the signal processing module(s) 890 may comprise: video/graphics processing modules; audio processing modules (e.g., MP3, AAC, MIDI, QCELP, AMR, CMX, etc.); and/or tactile processing modules (e.g., keypad I/O, touch screen processing, motor control, etc.).
Abstract
Description
- This patent application is related to and claims priority from provisional patent application Ser. No. 61/323,223 filed Apr. 12, 2010, and titled “SYSTEMS AND METHODS FOR PROVIDING IMAGE-BASED REMOTE CONTROL,” the contents of which are hereby incorporated herein by reference in their entirety. This patent application is also related to U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD FOR MANAGING A NETWORK OF USER-SELECTABLE DEVICES”, Attorney Docket No. 23010US02; U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD FOR AUTOMATICALLY MANAGING A NETWORK OF USER-SELECTABLE DEVICES”, Attorney Docket No. 23011US02; and U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD IN A NETWORK CONTROLLER FOR REMOTELY MONITORING AND/OR CONTROLLING DEVICES”, Attorney Docket No. 23013US02. The contents of each of the above-mentioned applications are hereby incorporated herein by reference in their entirety.
- [Not Applicable]
- [Not Applicable]
- [Not Applicable]
- Present communication networks are incapable of providing for and/or conveniently providing for user-selection of communication network devices (e.g., for remote monitoring and/or control). Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of such systems with the present invention as set forth in the remainder of the present application with reference to the drawings.
- Various aspects of the present invention provide a system and method providing remote user selection of a device, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims. These and other advantages, aspects and novel features of the present invention, as well as details of illustrative aspects thereof, will be more fully understood from the following description and drawings.
- FIG. 1 shows a flow diagram of a non-limiting exemplary method for providing remote user selection of a device, in accordance with various aspects of the present invention.
- FIG. 2 shows a flow diagram of a non-limiting exemplary method for providing remote user selection of a device, in accordance with various aspects of the present invention.
- FIG. 3 shows a block diagram of a non-limiting exemplary database that may be utilized for device determination, in accordance with various aspects of the present invention.
- FIG. 4 shows a block diagram of a non-limiting exemplary system for providing remote user selection of a device, in accordance with various aspects of the present invention.
- FIG. 5 shows a block diagram of a non-limiting exemplary system for providing remote user selection of a device, in accordance with various aspects of the present invention.
- FIG. 6 shows a block diagram of a non-limiting exemplary system for providing remote user selection of a device, in accordance with various aspects of the present invention.
- FIG. 7 shows a block diagram of a non-limiting exemplary device for providing remote user selection of a device, in accordance with various aspects of the present invention.
- FIG. 8 shows a block diagram of a non-limiting exemplary device for providing remote user selection of a device, in accordance with various aspects of the present invention.
- The following discussion will refer to various modules, components or circuits. Such modules, components or circuits may generally comprise hardware and/or a combination of hardware and software (e.g., including firmware). Such modules may also, for example, comprise a computer readable medium (e.g., a non-transitory medium) comprising instructions (e.g., software instructions) that, when executed by a processor, cause the processor to perform various functional aspects of the present invention. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of particular hardware and/or software implementations of a module, component or circuit unless explicitly claimed as such. For example and without limitation, various aspects of the present invention may be implemented by one or more processors (e.g., a microprocessor, digital signal processor, baseband processor, microcontroller, etc.) executing software instructions (e.g., stored in volatile and/or non-volatile memory). Also for example, various aspects of the present invention may be implemented by an application-specific integrated circuit (“ASIC”) and/or other hardware components.
- Additionally, the following discussion will refer to various system modules (e.g., communication modules, processor modules, memory or database modules, user interface modules, etc.). It should be noted that the following discussion of such various modules is segmented into such modules for the sake of illustrative clarity. However, in actual implementation, the boundaries between various modules may be blurred. For example, any or all of the functional modules discussed herein may share various hardware and/or software components. For example, any or all of the functional modules discussed herein may be implemented wholly or in-part by a shared processor executing software instructions. Additionally, various software sub-modules that may be executed by one or more processors may be shared between various software modules. Accordingly, the scope of various aspects of the present invention should not be limited by arbitrary boundaries between various hardware and/or software components, unless explicitly claimed.
- The following discussion may also refer to communication networks and various aspects thereof. For the following discussion, a communication network is generally the communication infrastructure through which a communication device (e.g., a portable communication device, a personal computer, a network controller, a user-selectable device with network communication capability, a consumer electronic device with network communication capability, etc.) may communicate with other systems. For example and without limitation, a communication network may comprise a cable and/or satellite television communication network, a cellular communication network, a wireless metropolitan area network (WMAN), a wireless local area network (WLAN), a wireless personal area network (WPAN), any home or premises communication network, etc. A particular communication network may, for example, generally have a corresponding communication protocol according to which a communication device may communicate with other devices in and/or via the communication network. Unless so claimed, the scope of various aspects of the present invention should not be limited by characteristics of a particular type of communication network.
- In a non-limiting exemplary scenario that comprises various aspects of the present invention, a camera may capture an image (e.g., a real-time image) of an environment (e.g., a home environment, office environment, etc.) comprising devices that may be monitored and/or controlled by a user. Such image may, for example, be a real-time image, a periodically updated image, an image acquired during system configuration, etc. A graphical map image may also (and/or alternatively), for example, be similarly utilized. The system may then present the image to the user (e.g., on a display device, a touch input device, etc.). The user may then identify a device in the image with which the user desires to interact (e.g., in a monitoring and/or controlling capacity). For example, the user may touch the device in the image on a touch screen, identify the device with a movable cursor, indicate the image with a light pen, etc. The system may then determine the user-identified device. Such determination may, for example, comprise analyzing a database of devices in a particular area. Such database may, for example, be maintained manually and/or automatically without user interaction.
- The system may, for example, establish a communication link (directly or indirectly) with the user-identified device. The system may then determine user interface options to present to the user. Such user interface options may, for example, be related to monitoring and/or controlling the user-identified device. The system may present the determined user interface options to the user. The system may then interact with the user (e.g., presenting user interface options to the user and receiving user input corresponding to such user interface options). The system may then form signals recognized by the user-identified device, where such signals correspond to the user input. Such signals may, for example, comprise data structures, command identifiers, etc., corresponding to a communication protocol understood by the user-identified device. The system may then communicate the formed signal to the user-identified device (e.g., utilizing a communication protocol understood by the user-identified device over a communication network to which the user-identified device is communicatively coupled). The system may then, depending on the nature of the user input, continue to interact with the user-identified device, for example transmitting additional commands/information to the user-identified device and/or receiving information from the user-identified device.
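The signal-forming step in this scenario can be sketched as follows. This is a minimal illustration assuming a JSON-style command structure; the field names and device identifiers are invented examples, as the application does not specify a particular protocol:

```python
import json

# Hedged sketch of "form signals recognized by the user-identified device":
# a command identifier plus parameters packed into a structure the target
# device's protocol could parse. All names here are hypothetical.
def form_signal(device_id, command, params):
    """Pack a user command into a device-recognizable data structure."""
    return json.dumps({
        "target": device_id,   # the user-identified device
        "cmd": command,        # command identifier, e.g. "set_temperature"
        "params": params,      # command-specific parameters
    })

signal = form_signal("thermostat-1", "set_temperature", {"celsius": 21})
```

The resulting string would then be communicated to the device over whatever network it is coupled to.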
- Turning first to
FIG. 1, which shows a flow diagram of an exemplary method 100 of providing remote user selection of a device, in accordance with various aspects of the present invention. The method 100 may, for example, be implemented in a network controller device, gateway, router, access point, etc. Such network controller device may, for example, be implemented in a home device (e.g., a hub of a home computing network, a home-based desktop computer, a home gateway and/or router, etc.). The method 100 may also, for example, be implemented in a portable computing device. Such a portable computing device may, for example, be a laptop or notebook computer, handheld computing device, cellular phone, personal digital assistant, any personal electronic device, etc. Also note that various aspects of the exemplary method 100 (and all methods herein) may also be performed by a distributed system (e.g., a first portion may be performed in a communication network infrastructure device, for example a network controller, and a second portion may be performed in a personal electronic device or terminal). - The
method 100 may begin executing at step 105. The method 100 may begin executing in response to any of a variety of causes and/or conditions, non-limiting examples of which will now be provided. - For example, the
method 100 may begin executing in response to a user command to begin. For example, a user may input a command to begin any of a large variety of tasks that include remote user identification of a device. Also for example, the method 100 may begin executing in response to detection of a change in the device makeup of a particular environment. For example, the method 100 may begin executing in response to detecting the presence of an unknown device within a particular geographic area (e.g., a premises, office, room, suite of rooms, etc.). Further for example, the method 100 may begin executing in response to a determination that a device being tracked has moved or has potentially moved within a particular environment. Additionally, for example, the method 100 may begin executing in response to failure to establish a communication link with a device in a particular area (e.g., when such device is predicted to be in such area). - Additionally for example, the
exemplary method 100 may begin executing periodically (e.g., at a constant or substantially constant period, for example hourly, daily, etc.), surveying one or more particular areas. The exemplary method 100 may, for example, begin executing in response to a timer and/or in response to a predefined time schedule. Additionally for example, the method 100 may begin executing in response to a system power-up or reset. - In general, the
exemplary method 100 may begin executing at step 105. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular initiating cause or condition unless explicitly claimed. - The
exemplary method 100 may, for example at step 120, comprise providing an image (e.g., one or more images) of a device environment to a user. Such image may comprise characteristics of any of a variety of types of images. For example and without limitation, the image may correspond to a camera (or photographic) image of a particular area (e.g., a room, office, set of rooms, etc.). Alternatively for example, step 120 may comprise providing a map image (e.g., a 2-dimensional and/or 3-dimensional map image) to the user. - Step 120 may, for example, comprise presenting the image on a display (e.g., a touch screen) of a user device (e.g., a personal electronic device, a personal computer, a personal digital assistant, a smart phone, etc.). Step 120 may comprise presenting the image on a display of any of a variety of other devices (e.g., a network controller with a screen, an access point with a screen, a gateway with a screen, etc.).
- Step 120 may also, for example, comprise communicating the image (e.g., camera image and/or map image) information to a device (e.g., a user device) if such information does not originate at such device. For example, step 120 may comprise utilizing a previously established communication link or may also comprise establishing a communication link configured for the communication of video and/or graphics information. In a non-limiting exemplary scenario, step 120 may be performed in communication network infrastructure apparatus (e.g., a network controller) and comprise transmitting the image information to a personal computer terminal.
- Step 120 may comprise presenting graphics features in the presented image, where such graphics features indicate selectable devices in the image (e.g., devices for which communication has occurred in the past, devices for which communication information exists in a database, devices that are presently communicatively coupled to one or more known communication networks, etc.). Such graphical features may also indicate status of the device (e.g., currently on-line, currently off-line, sleeping, monitorable device, controllable device, device for which the user is authorized to interact with or not, new undefined device, failing device, device for which communication should be possible but which is presently not possible, etc.).
- Step 120 may also comprise providing an output image or graphics feature indicating where a user is presently indicating. For example, step 120 may comprise outputting a graphical feature (e.g., a cursor, etc.) indicating where the user is presently pointing, touching, looking, etc. Also for example, step 120 may comprise outputting a graphical indication of an area being selected by a user (e.g., as such area is being defined by the user).
- The image presented at
step 120 may comprise a non-real-time stored image and/or a real-time image acquired as such image is needed (e.g., during present execution of the exemplary method 100, during present execution of step 120, etc.). In a stored-image scenario, step 120 may comprise retrieving the image from a database (e.g., a local database and/or a remote networked database). Such a database may, for example and without limitation, have been formed and/or maintained in accordance with procedures or systems outlined in U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD FOR MANAGING A NETWORK OF USER-SELECTABLE DEVICES”, Attorney Docket No. 23010US02 and/or U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD FOR AUTOMATICALLY MANAGING A NETWORK OF USER-SELECTABLE DEVICES”, Attorney Docket No. 23011US02, each of which is hereby incorporated herein by reference in its entirety. - For example, step 120 may comprise presenting an image to the user that was obtained at some point in the past (e.g., at a regular periodic time point in the past, during system calibration defining user-selectable devices, during a most recent network maintenance and/or definition operation, etc.). In a non-limiting exemplary scenario, step 120 may comprise presenting a camera image of a family room to a user, where such image was taken prior to the present execution of the
exemplary method 100. In environments that have a relatively static device configuration, such previously obtained image may be utilized for a substantial duration of time before updating. For example, such an image need only be updated when a user-selectable device is added to and/or removed from and/or moved within the pictured or mapped environment. - In a real-time scenario, step 120 may comprise retrieving the image utilizing a camera. For example, in an exemplary scenario, a user may utilize a user interface provided at
step 120 to specify a particular room, in response to which step 120 may comprise utilizing a camera in the room to obtain a current image of the room. Note that such selection may be multi-tiered, for example, a user may first specify a building or system, then a room of the specified building or system, then a camera of the specified room, etc. - A previous example mentioned a user indicating a room, in response to which
step 120 comprised presenting an image of such room to the user. Step 120 may similarly comprise providing a user interface by which the user may select a particular camera from which an image is desired. In an exemplary scenario, step 120 may present the user with a list of cameras from which to select (e.g., a kitchen camera, family room camera 1, family room camera 2, patio camera, pool camera, rec room camera 1, rec room camera 2, office camera 1, office camera 2, front door camera, etc.). The user may then select the desired camera. Such a list of cameras may, for example, be textual, graphical, image-based (e.g., showing a thumbnail image from each camera), etc. - In an implementation including utilization of camera images,
step 120 may also comprise providing for user control of a selected camera. For example, step 120 may comprise providing a user interface to the user by which the user may control movement of the camera, operation of the camera lens (e.g., panning, zooming and/or focusing), operation of camera lighting, filtering, resolution, etc. In a non-limiting exemplary scenario, step 120 may comprise providing the user with the ability to zoom in on a desired device to assist the user in selecting a device and/or defining an area and/or volume occupied by such device. Step 120 may thus provide for additional control of an image being presented to the user for use by the user in selecting a device, inputting information identifying location of the device, orientation of the device, spatial characteristics of the device, etc. - In general,
step 120 may comprise providing an image (e.g., one or more images) of a device environment to a user. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular image (e.g., a camera image, map image, etc.) and/or any particular manner of presenting such an image unless explicitly claimed. - The
exemplary method 100 may, for example at step 130, comprise receiving an image selection input from a user. Step 130 may comprise receiving such an image selection input in any of a variety of manners, non-limiting examples of which will now be presented. - For example, step 130 may comprise receiving a touch screen input from a user. Also for example, step 130 may comprise receiving a cursor movement and selection input from a user (e.g., utilizing a mouse, touch pad, track ball, etc.). Further for example, step 130 may comprise receiving a light pen input from a user, a camera input from a user, etc.
- Step 130 may comprise receiving a user indication of a specific point in the image as input (e.g., a touch point, a cursor click point, etc.). Also for example, step 130 may comprise receiving a user input indicative of a selected area (e.g., an area enclosed by a figure drawn by the user) or volume.
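A received point input of this kind is ultimately correlated with known device selection areas, as described for step 140 below. A minimal sketch, assuming rectangular areas and hypothetical device names (the application allows arbitrarily shaped areas or volumes):

```python
# Illustrative lookup of a selected point against stored selection areas.
# The rectangles and device names are invented examples.
SELECTION_AREAS = {
    "television": (100, 50, 300, 200),  # (x_min, y_min, x_max, y_max)
    "thermostat": (400, 80, 440, 140),
}

def device_at_point(x, y, areas=SELECTION_AREAS):
    """Return the device whose defined selection area contains (x, y)."""
    for device, (x0, y0, x1, y1) in areas.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return device
    return None  # the point does not fall within any known device area
```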
- For example, step 130 may comprise receiving information descriptive of a square, rectangle or other polygon drawn by the user. Similarly for example, step 130 may comprise receiving information descriptive of a circle or oval drawn by the user. For example, step 130 may comprise providing the user with a graphical toolset, by which the user may draw a polygon around the device, circle the device, paint the device, draw an ellipse around the device, define the device region with a plurality of figures, etc. Such graphical toolset may also comprise tools that automatically provide borders on a selected object (e.g., by performing pixel analysis to search for device edges).
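For a device region defined by a user-drawn polygon rather than a simple rectangle, a standard even-odd (ray-crossing) containment test can decide whether a selected point falls within the drawn figure. This is a generic sketch, not code from the application:

```python
def point_in_polygon(point, polygon):
    """Even-odd rule: count edge crossings of a ray cast to the right."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # this edge straddles the horizontal ray
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside
```

The same test works for any closed figure the user draws, approximated as a list of vertices.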
- Additionally for example, step 130 may comprise receiving a user input that toggles between and/or traverses a plurality of selectable devices in the image presented to the user. For example, step 130 may comprise receiving a user inputting a “next” command that causes traversal of the selectable devices (e.g., one at a time) until the user reaches the desired device. In such a scenario, during a traversal of devices,
step 130 may comprise outputting a graphical indication (e.g., a highlighted area or polygon) indicating the presently selected device. - In general,
step 130 may comprise receiving an image selection input from a user. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of such input and/or any particular manner of receiving such input unless explicitly claimed. - The
exemplary method 100 may, for example at step 140, comprise identifying a device in the image that has been selected by the user. Step 140 may comprise identifying the selected device in any of a variety of manners. For example, step 140 may comprise correlating the user device selection information received at step 130 with known device selection areas in the presented image. For example, as discussed in U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD FOR MANAGING A NETWORK OF USER-SELECTABLE DEVICES”, Attorney Docket No. 23010US02 and/or U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD FOR AUTOMATICALLY MANAGING A NETWORK OF USER-SELECTABLE DEVICES”, Attorney Docket No. 23011US02, each of which is hereby incorporated herein by reference in its entirety, a database may be formed that includes information defining device selection areas within an image (e.g., a pictorial or map image). In such a scenario, step 140 may comprise comparing a user-selected location (or area or volume) in the image presented at step 120, as received at step 130, with such database information to determine whether the user-selected location (or area or volume) corresponds to a defined region of the image corresponding to a particular device. For example, if a user selects location (X, Y) in the image, step 140 may comprise determining that the location (X, Y) has been previously defined to correspond to the location of Device A. Various exemplary manners of forming such a database were discussed in the above-mentioned incorporated patent applications. - Also for example, step 140 may comprise processing the user device selection information received at
step 130 to determine a location in the image at which to begin performing image matching analysis. For example, as explained in the above-mentioned incorporated patent applications, pixel characteristics of a previously defined image of a device may be processed and stored in a fashion (e.g., as pixel histograms, etc.) that may be processed to identify such device appearing in an image. For example, respective pixel histograms may be formed that correspond to respective devices (e.g., respective pixel histograms for a television, a stereo, a light switch, a receiver, a thermostat, etc.). Step 140 may, in such a scenario, comprise determining the previously defined histogram that best matches the area surrounding a user-selected point in the image and/or that best matches the characteristics of an area defined by a user input received at step 130.
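The histogram-matching idea can be sketched as follows, assuming grayscale pixel values and a simple L1 distance between normalized histograms; the application does not prescribe a particular bin count or distance measure:

```python
def histogram(pixels, bins=8):
    """Coarse normalized histogram of grayscale pixel values (0-255)."""
    h = [0] * bins
    for p in pixels:
        h[min(p * bins // 256, bins - 1)] += 1
    return [count / len(pixels) for count in h]

def best_matching_device(region_pixels, device_histograms):
    """Pick the stored device histogram closest (L1 distance) to the
    histogram of the user-selected image region."""
    target = histogram(region_pixels)
    def distance(name):
        return sum(abs(a - b) for a, b in zip(target, device_histograms[name]))
    return min(device_histograms, key=distance)
```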
method 100 may know that there is only one device in the family room that has been produced by a particular manufacturer. The presence of a logo associated with such manufacturer in the image relatively near a user's designated point in the image may be utilized bystep 140 as an indication that such device has been selected by the user. - Additionally for example, step 140 may comprise analyzing location information to identify a particular device that has been selected by a user. For example, the camera's location and field of view may be processed in conjunction with a user-identified location in the image to draw a line extending out from the camera through the camera image plane. In such a scenario, step 140 may comprise determining that the first device encountered along such line (in other words, the first device, the known location of which lies along such line) extending outward from the camera lens corresponds to the user-selected device.
- Still further for example, step 140 may comprise correlating the user-identified image location and field of view with a generally analogous field of received RF energy. For example, user-identified image location may be relatively closer to an identified RF hot-spot in such image field of view than other RF hot-spots. In such a scenario, step 140 may comprise determining that the user has identified the device corresponding to the closest RF hot-spot in the image field of view. Similarly, infrared processing may identify devices emitting heat energy and determining the closest heat-emitting device in the field of view relative to the user-identified location in the image.
- Note that
step 140 may also comprise establishing a communication link with a determined device (or candidate device) to determine additional information about such device that may be utilized in determining whether the user has identified such device. For example, such communication link(s) may be established to communicate current device location information, to communicate test messages, to determine whether the user has access to a particular device, etc. - In general,
step 140 may comprise identifying a device in the image that has been selected by the user. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of performing such identification unless explicitly claimed. - The
exemplary method 100 may, for example at step 195, comprise performing continued processing. Various non-limiting examples of such processing will be provided below. For example, step 195 may comprise returning execution flow of the method 100 to any previous step. - Also for example, step 195 may comprise communicating information indicating the device identified by the system to the user. Such operation may, for example, provide the user the opportunity to determine whether the correct device was identified. Such information may, for example, comprise outputting a graphical feature on the image presented at
step 120 to identify the device identified at step 140. - Also for example, step 195 may comprise communicating information of the identified device to any of a variety of functional modules that may utilize such information. For example, step 195 may comprise determining a desired user action to take regarding the user-identified device and performing such action.
- Additionally, step 195 may, for example, comprise interfacing with one or more databases (e.g., local and/or remote databases) to acquire information regarding such user-selected device. Such information may, for example, be presented to the user, stored in a database record, utilized to upgrade and/or reconfigure the identified device, utilized to establish a communication link with such device, or utilized to configure secure communication with such device. Such information may, for example, comprise purchasing information, manufacturer information, technical assistance information, cost information, operation instructions, remote operation instructions, personal information associated with such device, images related to such device, history of such device, etc.
- Further for example, step 195 may comprise presenting to a user, via a user interface, the options available to the user concerning interaction with such identified device. Further, as will be discussed in more detail below,
step 195 may comprise establishing a communication network with the identified device and/or establishing a remote control environment between the user and the identified device. - Also for example, step 195 may comprise interfacing with the user to acquire an image of the selected device (or object) to communicate to a destination. Such continued processing may also, for example, comprise passing control of a selected device to another entity (e.g., to a service person, etc.).
- Additional examples of such continued processing may, for example, comprise any or all of the functionality discussed in co-pending applications U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD FOR MANAGING A NETWORK OF USER-SELECTABLE DEVICES”, Attorney Docket No. 23010US02; U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD FOR AUTOMATICALLY MANAGING A NETWORK OF USER-SELECTABLE DEVICES”, Attorney Docket No. 23011US02; and U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD IN A NETWORK CONTROLLER FOR REMOTELY MONITORING AND/OR CONTROLLING DEVICES”, Attorney Docket No. 23013US02. Each of such applications is hereby incorporated herein by reference in its entirety.
- In general,
step 195 may comprise performing continued processing. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular type of continued processing unless explicitly claimed. -
FIG. 2 shows a flow diagram of a non-limiting exemplary method 200 for providing remote user selection of a device, in accordance with various aspects of the present invention. The exemplary method 200 may, for example and without limitation, share any or all aspects with the exemplary method 100 discussed above. - The
exemplary method 200 may, for example at step 205, share any or all aspects with step 105 of the exemplary method 100 illustrated in FIG. 1 and discussed previously. - The
exemplary method 200 may, for example at step 220, comprise presenting (or providing) an image (e.g., one or more images) of a device environment to a user. Step 220 may, for example, share any or all characteristics with step 120 discussed above. - For example, step 220 may, at
sub-step 222, comprise establishing a user interface (local or remote). Such a user interface may, for example, be utilized by a user to request and/or identify an image (e.g., a camera image, graphical image, map image, etc.) to be presented to the user. For example, sub-step 222 may comprise establishing communication links between a user device and other devices depending on where various aspects are implemented. FIGS. 4-6, which will be discussed later, provide non-limiting illustrations of a variety of system configurations, including communication links between various system components and devices. For example, a user device may operate the entire user interface (U/I) (e.g., executing all or most U/I software instructions and utilizing user interface modules to interact with the user). Also for example, the user device may effectively operate as a U/I terminal for another device (e.g., a gateway, network controller, access point, etc.) that is executing all or most of the U/I software instructions, for example acting primarily as a conduit of user interface information, which is mostly processed by another device. - Step 220 may, for example at
sub-step 224, comprise receiving a user input requesting an image (e.g., a camera image or map). For example, sub-step 224 may comprise receiving such a user input via the user interface established at sub-step 222. Sub-step 224 may, for example, comprise receiving a user request for a view of a particular area (e.g., a camera image, a 2-D and/or 3-D graphical map image of an area, etc.). Sub-step 224 may, for example, comprise receiving a request for an image from a particular camera or a map of a particular area (e.g., a room). Such a request may also, for example, designate whether the user desires a formerly acquired image or a real-time image. There may also be a default image to present (e.g., use a real-time image, if available, unless the user indicates otherwise). In an exemplary scenario, sub-step 224 may comprise providing a graphical user interface to a user that provides the user the capability to select a particular image, room to be imaged, camera from which to acquire an image, etc. - Step 220 may, for example at
sub-step 226, comprise acquiring an image (e.g., a camera image or map image) (e.g., the image requested at step 224). Sub-step 226 may comprise acquiring such image in any of a variety of manners. For example and without limitation, sub-step 226 may comprise identifying the desired image (e.g., as expressed by the user at sub-step 224). Sub-step 226 may, for example, comprise acquiring a non-real-time image stored previously (e.g., in a database, for example a local database and/or remote database). For example, sub-step 226 may comprise interfacing with a camera (e.g., a fixed camera, user-controllable camera, etc.) to acquire a real-time image from such camera (e.g., acquiring the real-time image when the need for such an image is realized). In such an exemplary scenario, sub-step 226 may comprise providing user control over a controllable camera (e.g., panning, zooming, resolution, filtering, etc.). Sub-step 226 may, for example, comprise utilizing the communication link established at sub-step 222 and/or may also comprise establishing a communication link specifically configured for the communication of video and/or graphics information. - Step 220 may, for example at
sub-step 228, comprise presenting the image (e.g., camera image, map image, etc.) acquired at sub-step 226 to the user (e.g., via the user interface established at sub-step 222). Sub-step 228 may comprise presenting the image in any of a variety of manners. - For example, sub-step 228 may comprise presenting the image on a display (e.g., a touch screen) of a user device (e.g., a personal computing device, smart phone, etc.), many examples of which have been presented herein. Sub-step 228 may, for example, comprise presenting the image on a display of a communication network infrastructure device (e.g., a network controller with a screen, access point with a screen, gateway with a screen, etc.).
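For illustration only (this sketch is not part of the patent disclosure), the acquisition behavior described for sub-step 226 can be modeled as a simple preference for a real-time camera capture with a fallback to a previously stored image. All class, method, and data names below are assumptions.

```python
# Illustrative sketch of sub-step 226: prefer a real-time camera
# capture; fall back to a previously stored (non-real-time) image.
class DeviceEnvironmentImager:
    def __init__(self, cameras, image_store):
        self.cameras = cameras          # camera_id -> zero-arg capture callable
        self.image_store = image_store  # area_id -> previously stored image data

    def acquire(self, area_id, camera_id=None, prefer_realtime=True):
        """Return image data for the requested area or camera."""
        if prefer_realtime and camera_id in self.cameras:
            return self.cameras[camera_id]()  # real-time capture
        return self.image_store.get(area_id)  # non-real-time fallback

imager = DeviceEnvironmentImager(
    cameras={"cam1": lambda: b"live-frame"},
    image_store={"living_room": b"stored-frame"},
)
```

In this sketch, requesting an image with a known camera identifier yields a live capture, while omitting it (or naming an unavailable camera) yields the stored default, mirroring the "use a real-time image, if available" behavior described at sub-step 224.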
- Sub-step 228 may, for example, comprise communicating information describing the image (e.g., a digital image data file) to a user device (e.g., if such information is not already at such device). For example, step 228 may comprise utilizing a communication link established at
sub-step 222 and/or may also comprise establishing and utilizing a communication link specifically configured for the communication of video and/or graphics information. - Further for example, sub-step 228 may comprise presenting graphics features indicating selectable devices in the image (e.g., devices for which communication has occurred in the past, devices for which communication information exists in the database, devices that are presently communicatively coupled to one or more known communication networks, etc.). Such graphical features may also indicate status of the device (e.g., currently on-line, currently off-line, sleeping, monitorable device, controllable device, device for which the user is authorized to interact with or not, new non-defined device, failing device, device for which communication should be possible but which is presently not, etc.).
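The overlay idea above (graphical features marking selectable devices and their status) might be sketched as follows; the field names, region convention, and status-to-color mapping are assumptions for illustration, not part of the disclosure.

```python
# Hedged sketch of sub-step 228's overlays: mark each selectable
# device in the presented image with its region and a status color.
STATUS_COLOR = {"online": "green", "offline": "gray", "failing": "red"}

def build_overlays(device_records):
    """device_records: iterable of dicts with 'name', 'region', 'status'."""
    return [
        {
            "region": rec["region"],  # (x, y, width, height) in image pixels
            "label": rec["name"],
            "color": STATUS_COLOR.get(rec["status"], "yellow"),  # unknown status
        }
        for rec in device_records
    ]

overlays = build_overlays([
    {"name": "DVR", "region": (100, 60, 50, 20), "status": "online"},
    {"name": "stereo", "region": (10, 60, 30, 20), "status": "sleeping"},
])
```

A renderer would then draw each overlay rectangle and label on top of the camera or map image presented to the user.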
- Still further for example, sub-step 228 may comprise presenting an on-image indication of where a user is presently indicating. For example, sub-step 228 may comprise outputting a graphical feature (e.g., a cursor, etc.) indicating where the user is presently pointing, touching, looking, etc. Also for example, sub-step 228 may comprise outputting a graphical indication of an area being selected by a user (e.g., as such area is being defined by the user).
- In general,
step 220 may comprise providing an image (e.g., one or more camera images, one or more map images, etc.) of a device environment to a user. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular type of image and/or any particular manner of providing such an image to a user unless explicitly claimed. - The
exemplary method 200 may, for example at steps 230-295, share any or all aspects with steps 130-195 of the exemplary method 100 illustrated in FIG. 1 and discussed previously. -
FIG. 3 shows a block diagram 300 of a non-limiting exemplary database for automatic network maintenance and/or characterization, and/or for utilization for device identification in accordance with various aspects of the present invention. - The previous discussion (e.g., the discussion of
steps 140 and 240) mentioned that identification of a selected device may be performed via interaction with a database. Additionally, the discussions of other method steps mentioned that the acquisition of information related to a selected device (e.g., device identity information, device location information, additional information related to a device, etc.) may comprise interfacing with one or more local and/or remote databases. FIG. 3 provides a non-limiting illustration of such a database. Additionally, other method steps may comprise the formation and/or maintenance of a database where devices of an image may be related to respective device records that include any of a variety of different types of device information (many examples of which were provided above). - The database may, for example, comprise a
memory 361. Such a memory 361 may, for example, be a non-volatile memory (or other non-transitory data retention device) such as a magnetic disk hard drive, optical disk drive, CD, DVD, laser disc, Blu-ray disc, diskette, flash drive, EPROM, EEPROM, flash memory, etc. The memory 361 may, in the illustrated example, include information of an image 320 (or plurality of images), where areas or features of the image may be related to records, which may also be stored in the memory 361. For example, the image 320 may comprise a still photograph or other camera image. - The
image 320 may comprise an image of a thermostat that is linked to a thermostat record 331. The image 320 may also comprise an image of a DVR that is logically linked to a DVR record 332, an image of a stereo that is logically linked to a stereo record 333, an image of a television that is logically linked to a television record 334, an image of a set top box ("STB") and/or TV receiver that is logically linked to an STB/Receiver record 335, an image of a power switch that is logically linked to a power switch record 336 and an image of a camera that is logically linked to a camera record 337. As discussed previously, the devices of the image 320 (or environment) may be defined by respective areas (e.g., boxes, outlines, etc.) input by a user and/or automatically defined (e.g., utilizing pixel analysis). - Also as discussed previously in the discussion of the methods of
FIGS. 1 and 2, various aspects of the present invention may comprise providing a database interface 390 by which various processes (e.g., a process implementing steps 140 and/or 240) may access and utilize the information stored in the memory 361. As discussed previously, such interface 390 may be implemented in a variety of manners. For example, such interface 390 may be implemented by a processor 360 or other hardware device executing instructions, which may, for example, be stored in the memory 361 or another memory (e.g., either collocated with the memory 361 or geographically distinct from the memory 361). The database interface 390 may comprise modules stored in a variety of locations (e.g., at a central controller, server, gateway, access point, and/or user computing device). - The
system 300 illustrated in FIG. 3 also comprises one or more processors, communication interface modules, or any other hardware components 360 that may interact with the memory 361 via the database interface 390. For example, one of such processors may operate to interface with the database 361 while implementing the exemplary methods 100 and 200 illustrated in FIGS. 1-2 and discussed previously. -
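The logical linkage of FIG. 3 can be sketched as a minimal in-memory mapping from imaged devices to their records; the dictionary layout and field names are assumptions for illustration only, not part of the disclosure.

```python
# Minimal sketch of the FIG. 3 arrangement: devices appearing in
# image 320 are logically linked to records (331-337) held in
# memory 361, here modeled as a plain dictionary keyed by device.
image_320_links = {
    "thermostat": {"record": 331, "region": (40, 10, 24, 36)},
    "dvr":        {"record": 332, "region": (100, 60, 50, 20)},
    "camera":     {"record": 337, "region": (200, 5, 16, 16)},
}

def linked_record(links, device_key):
    """Follow the logical link from an imaged device to its record number."""
    return links[device_key]["record"]
```

A database interface such as the interface 390 would expose lookups like this (plus record creation and maintenance) to the processes implementing steps 140 and/or 240.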
FIG. 4 shows a block diagram of a non-limiting exemplary system 400 for providing remote user selection of a device, in accordance with various aspects of the present invention. The exemplary system 400 (or components thereof) may operate to perform any or all of the functional characteristics discussed herein (e.g., with regard to the exemplary methods 100 and 200, the exemplary database 300, etc.). - The
exemplary system 400 may, for example, comprise a personal computer system 480, which is communicatively coupled to a variety of devices via one or more communication networks 405 (e.g., the Internet, a telecommunication network, cable television network, satellite communication network, local area network, personal area network, metropolitan area network, wide area network, campus network, home network, etc.). Such network(s) may, for example, be wired, wireless RF, tethered optical, non-tethered optical, etc. - As a non-limiting example, the personal computer system 480 (e.g., a laptop or notebook computer, desktop computer, handheld computer, cellular telephone with computing capability, smart phone, personal digital assistant, etc.) may comprise one or
more processors 460 and a database 461. The personal computer system 480 may, for example, operate to perform any or all of the method steps discussed previously. For example, the database 461 may comprise any or all database characteristics discussed herein. - The
personal computer system 480 may, for example, operate to provide for user selection (e.g., remote user selection) of an electronic device in a network that comprises any or all of the non-limiting exemplary devices (e.g., the thermostat 431, DVR 432, stereo 433, television 434, STB/Rcvr 435, power switch 436 and camera 437). As a non-limiting example, the processor 460 may operate to provide for user selection of any or all of such devices in a same communication network and/or in separate communication networks. - For example, the
personal computer system 480 may, for example, operate to interact with the user to implement any of the above-mentioned functionality. For example, the personal computer system 480 may operate to obtain image information (or map information) from a database 461 and/or camera 437 to present an image (e.g., a camera image, a map image, etc.) to a user thereof. The personal computer system 480 may then operate to receive image selection input from a user of such system, identify the user-selected device, and perform continued processing regarding such user-selected device. -
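The selection-and-identification step just described can be sketched as a simple hit test: map the user's click coordinates on the presented image to the device whose linked region contains the point. The region table and coordinate convention are assumptions for illustration, not part of the disclosure.

```python
# Assumed sketch of the selection flow in system 400: regions are
# (x, y, width, height) rectangles in presented-image pixels.
regions = {
    "television": (0, 0, 120, 80),
    "power_switch": (150, 30, 20, 20),
}

def identify_selected_device(regions, x, y):
    """Return the device whose region contains (x, y), or None."""
    for name, (rx, ry, rw, rh) in regions.items():
        if rx <= x < rx + rw and ry <= y < ry + rh:
            return name
    return None  # selection fell outside every known device region
```

Continued processing (monitoring, control, etc.) would then be directed at whichever device this lookup returns.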
FIG. 5 shows a block diagram of a non-limiting exemplary system 500 for providing remote user selection of a device, in accordance with various aspects of the present invention. The exemplary system 500 (or components thereof) may operate to perform any or all of the functional characteristics discussed herein (e.g., with regard to the exemplary methods illustrated in FIGS. 1 and 2, with regard to the exemplary database illustrated in FIG. 3, with regard to the exemplary system 400 illustrated in FIG. 4, etc.). - The
exemplary system 500 may, for example, comprise a personal computer system 580, which is communicatively coupled to a gateway, network controller and/or access point 585 via one or more communication networks 505 (e.g., the Internet, a telecommunication network, cable television network, satellite communication network, local area network, personal area network, metropolitan area network, wide area network, campus network, home network, etc.). Such network(s) may, for example, be wired, wireless RF, tethered optical, non-tethered optical, etc. - The gateway, network controller and/or access point 585, in turn, is communicatively coupled to a variety of devices via one or more local communication networks 506 (e.g., local area network(s), personal area network(s), home network(s), office network(s), etc.). Such local communication network(s) may, for example, be wired, wireless RF, tethered optical, non-tethered optical, etc.
- As a non-limiting example, the personal computer system 580 (e.g., a laptop or notebook computer, desktop computer, handheld computer, cellular telephone with computing capability, personal digital assistant, etc.) may comprise one or
more processors 560 and a database 561. The personal computer system 580 may, for example, operate to perform any or all of the method steps discussed previously. For example, the database 561 may comprise any or all database characteristics discussed herein. - The
personal computer system 580 may, for example and without limitation, operate to provide for user selection of an electronic device in a network that comprises any or all of the previously exemplified thermostat 531, DVR 532, stereo 533, television 534, STB/Rcvr 535, power switch 536 and camera 537. As a non-limiting example, the processor 560 may operate to provide for user selection of any or all of such devices in a same communication network and/or in separate communication networks. - For example, the
personal computer system 580 may, for example, operate to interact with the user to implement any of the above-mentioned functionality. The personal computer system 580 may (e.g., utilizing a communication module) operate to establish a communication link with any of the devices illustrated (e.g., including the camera 537) via the communication network(s) 505; local gateway, network controller and/or access point 585; and communication network(s) 506; and provide the capability to view and/or select a device. Such operation may, for example, comprise interfacing with the camera 537 to obtain a photographic image of a device environment and/or obtaining image or map information from the database 561. - The
personal computer system 580 may, for example, operate to receive image selection input from a user of such system, identify the user-selected device, and perform continued processing regarding such user-selected device. -
FIG. 6 shows a block diagram of a non-limiting exemplary system 600 for providing remote user selection of a device, in accordance with various aspects of the present invention. The exemplary system 600 (or components thereof) may operate to perform any or all of the functional characteristics discussed herein (e.g., with regard to the exemplary methods illustrated in FIGS. 1 and 2, with regard to the exemplary database illustrated in FIG. 3, with regard to the exemplary systems 400 and 500 illustrated in FIGS. 4 and 5, etc.). - The
exemplary system 600 may, for example, comprise a personal computer system 680, which is communicatively coupled to a gateway, network controller and/or access point 685 via one or more communication networks 605 (e.g., the Internet, a telecommunication network, cable television network, satellite communication network, local area network, personal area network, metropolitan area network, wide area network, campus network, home network, etc.). Such network(s) may, for example, be wired, wireless RF, tethered optical, non-tethered optical, etc. - The gateway, network controller and/or
access point 685, in turn, is communicatively coupled to a variety of devices via one or more local communication networks 606 (e.g., local area network(s), personal area network(s), home network(s), office network(s), etc.). Such local communication network(s) may, for example, be wired, wireless RF, tethered optical, non-tethered optical, etc. - As a non-limiting example, the gateway, network controller and/or
access point 685 may comprise one or more processors 660 and a database 661. The gateway, network controller and/or access point 685 may, for example, operate to perform any or all of the method steps discussed previously. For example, the database 661 may comprise any or all database characteristics discussed herein. - The gateway, network controller and/or
access point 685 may, for example and without limitation, operate to provide for user selection of an electronic device in a network that comprises any or all of the previously exemplified thermostat 631, DVR 632, stereo 633, television 634, STB/Rcvr 635, power switch 636 and camera 637. As a non-limiting example, the processor 660 may operate to provide for user selection of any or all of such devices in a same communication network and/or in separate communication networks. - For example, the
personal computer system 680 and/or local gateway(s), network controller(s) and/or access point(s) 685 may, for example, operate to perform any of the above-mentioned functionality. For example, the personal computer system 680 may operate to receive an indication from the user that the user desires to select a device from a particular environment (e.g., a particular room, premises, office, etc.). The personal computer system 680 may then, for example, operate to establish a communication link with the local gateway(s), network controller(s) and/or access point(s) 685. In such an exemplary scenario, functional aspects discussed previously (apart from direct user interaction) may be performed by the local gateway, network controller and/or AP 685 instead of by the personal computer system 680. Alternatively, in such an exemplary scenario, the personal computer system 680 and local gateway, network controller or AP 685 may distribute performance of any of the previously discussed method steps among the various system entities. - For example, in an exemplary scenario, where the user desires a real-time image of a particular environment, the local gateway, network controller and/or
access point 685 may operate to provide a communication link between the personal computer system 680 and the camera, by which the personal computer system 680 may obtain an image from the camera and present such image to the user. Such a communication link may also comprise characteristics providing for user control of the camera 637. - Also for example, in an exemplary scenario where the user desires a non-real time image, the
personal computer system 680 may operate to establish a communication link with the local gateway, network controller and/or access point 685 over which such desired image may be communicated from the database 661 to the personal computer system 680. - In the particular example illustrated in
FIG. 6, the processor 660 may operate to perform any or all of the device-identifying functionality discussed previously and/or any or all of the continued processing functionality discussed previously. - For example, the gateway, network controller and/or
access point 685 may (e.g., utilizing one or more communication modules) operate to establish a communication link with a user-selected device via the communication network(s) 606, and establish a communication pathway between the personal computer system 680 and the user-selected device. The gateway, network controller and/or access point 685 may then operate to provide the capability to monitor and/or control the selected device. - Note that, as opposed to the
exemplary systems 400 and 500 illustrated in FIGS. 4-5, the exemplary system 600 locates most of the previously discussed functionality in the gateway, network controller and/or access point 685 (e.g., as opposed to the personal computer system 680). Such an implementation may, for example, remove system complexity from the personal computer 680, which may for example have limited energy, memory and/or processing capabilities, and place such complexity in a central location, which may for example have relatively increased energy, memory and/or processing capabilities. Such an arrangement also provides for a plurality of personal computer systems to perform the disclosed operation with a minimum of additional complexity. -
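The FIG. 6 division of labor, in which the gateway relays monitoring/control traffic between the user's computer and the selected local device, might be sketched as follows. The class, handler, and result-dictionary shapes are hypothetical, introduced only to illustrate the relay role.

```python
# Hypothetical sketch: a gateway that holds local device handlers and
# forwards a monitoring/control command to the user-selected device.
class Gateway:
    def __init__(self, local_devices):
        self.local_devices = local_devices  # name -> command handler callable

    def forward(self, device_name, command):
        """Establish the pathway and forward the command to the device."""
        handler = self.local_devices.get(device_name)
        if handler is None:
            return {"ok": False, "error": "unknown device"}
        return {"ok": True, "result": handler(command)}

gw = Gateway({"thermostat": lambda cmd: "ack:" + cmd})
```

Placing this logic in the gateway, rather than in each personal computer system, matches the complexity trade-off described above: many lightweight clients can share one central relay.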
FIG. 7 shows a block diagram of a non-limiting exemplary device (or system) 700 for providing remote user selection of a device, in accordance with various aspects of the present invention. The exemplary device 700 (or various components thereof) may, for example, operate to perform any or all functionality discussed previously with regard to FIGS. 1-6. Also for example, the exemplary device 700 may share any or all characteristics with the database system 300 and/or personal computer systems 480, 580 and 680 discussed previously. Additionally for example, the exemplary device 700 may share any or all characteristics with the local gateways, network controllers and/or access points 585, 685 discussed previously. Further for example, the exemplary device 700 may share any or all characteristics with the exemplary devices (e.g., terminal devices) 331-337, 431-437, 531-537 and 631-637 discussed previously. - The
exemplary device 700 may, for example, comprise a first communication interface module 710. The first communication interface module 710 may, for example, operate to communicate over any of a variety of communication media and utilizing any of a variety of communication protocols. For example, though the first communication interface module 710 is illustrated coupled to a wireless RF antenna via a wireless port 712, the wireless medium is merely illustrative and non-limiting. The first communication interface module 710 may, for example, operate to communicate with one or more communication networks (e.g., cable television networks, satellite television networks, telecommunication networks, the Internet, local area networks, personal area networks, metropolitan area networks, etc.) via which data is communicated. - The
exemplary device 700 comprises a second communication interface module 720. The second communication interface module 720 may, for example, operate to communicate over any of a variety of communication media and utilizing any of a variety of communication protocols. For example, the second communication interface module 720 may communicate via a wireless RF communication port 722 and antenna, or may communicate via a non-tethered optical communication port 724 (e.g., utilizing laser diodes, photodiodes, etc.). Also for example, the second communication interface module 720 may communicate via a tethered optical communication port 726 (e.g., utilizing a fiber optic cable), or may communicate via a wired communication port 728 (e.g., utilizing coaxial cable, twisted pair, HDMI cable, Ethernet cable, any of a variety of wired component and/or composite video connections, etc.). The second communication interface module 720 may, for example, operate to communicate with one or more communication networks (e.g., cable television networks, satellite television networks, telecommunication networks, the Internet, local area networks, personal area networks, metropolitan area networks, etc.) via which data is communicated. Also for example, the second communication module 720 may operate to communicate with local devices. - The
exemplary device 700 may also comprise additional communication interface modules, which are not illustrated. Such additional communication interface modules may, for example, share any or all aspects with the first 710 and second 720 communication interface modules discussed above. - The
exemplary device 700 may also comprise a communication module 730. The communication module 730 may, for example, operate to control and/or coordinate operation of the first communication interface module 710 and the second communication interface module 720 (and/or additional communication interface modules as needed). The communication module 730 may, for example, provide a convenient communication interface by which other components of the device 700 may utilize the first 710 and second 720 communication interface modules. Additionally, for example, in an exemplary scenario where a plurality of communication interface modules are sharing a medium and/or network, the communication module 730 may coordinate communications to reduce collisions and/or other interference between the communication interface modules 710, 720. The communication module 730 may, for example, operate to utilize one or more of the communication interface modules 710, 720 to perform any of the communication functionality discussed herein (e.g., with regard to the exemplary methods illustrated in FIGS. 1 and 2 and discussed previously). - The
exemplary device 700 may additionally comprise one or more user interface modules 740. The user interface module(s) 740 may generally operate to provide user interface functionality to a user of the device 700. For example, the user interface module(s) 740 may operate to perform any or all of the exemplary user interface functionality discussed herein (e.g., with regard to any of FIGS. 1-6). For example, and without limitation, the user interface module(s) 740 may operate to provide for user control of any or all standard device commands. The user interface module(s) 740 may, for example, operate and/or respond to user commands utilizing user interface features disposed on the device 700 (e.g., buttons, touch screen, microphone, etc.) and may also utilize the communication module 730 (and/or first 710 and second 720 communication interface modules) to communicate with the device 700 and/or any device that is communicatively coupled thereto (e.g., to control any monitoring and/or controlling functionality discussed herein). The user interface module 740 may also operate to interface with and/or control operation of any of a variety of sensors that may be utilized to ascertain an on-screen pointing location. As discussed elsewhere herein, the user may specify location and/or other characteristics of a device. - The
exemplary device 700 may comprise a display 750. As discussed elsewhere herein, the user may utilize a display to view and/or interact with maps and/or images of network environments (e.g., for device location input and/or definition, for device selection, etc.). Such interaction may, for example, utilize a display 750 of the device 700, which may also, for example, be utilized for user input (e.g., as a touch screen, utilizing cursor control, etc.). - The
exemplary device 700 may comprise one or more processors 760. The processor(s) 760 may, for example, comprise a general purpose processor, digital signal processor, application-specific processor, microcontroller, microprocessor, etc. The processor(s) 760 may, for example, share any or all characteristics with processors discussed elsewhere herein (e.g., processors 360, 460, 560 and 660). The processor 760 may operate in accordance with software (or firmware) instructions. As mentioned previously, any or all functionality discussed herein may be performed by a processor executing instructions. For example, though various modules are illustrated as separate blocks or modules in FIG. 7 for illustrative clarity, such illustrative modules, or a portion thereof, may be implemented by the processor(s) 760. For example, the processor(s) 760 may operate to perform any or all of the steps discussed previously with regard to FIGS. 1 and 2. - The
exemplary device 700 may comprise one or more memories 761. As discussed above, various aspects may be performed by one or more processors executing instructions. Such instructions may, for example, be stored in the one or more memories 761. Such memory 761 may, for example, comprise characteristics of any of a variety of types of memory. For example and without limitation, such memory 761 may comprise one or more memory chips (e.g., ROM, RAM, EPROM, EEPROM, flash memory, one-time-programmable OTP memory, etc.), hard drive memory, CD memory, DVD memory, etc. Also for example, the memory 761 may share any or all characteristics with any of the databases discussed herein (e.g., databases 361, 461, 561 and 661). - The
exemplary device 700 may also comprise one or more image (e.g., pictorial image, map image, etc.) presentation modules 771. Such module(s) 771 may, for example and without limitation, operate to perform any or all of the functionality discussed herein with regard to presenting an image of a device environment to a user (e.g., at steps 120 and 220). - The
exemplary device 700 may also comprise one or more user device selection modules 772. Such module(s) 772 may, for example and without limitation, operate to perform any or all of the functionality discussed herein with regard to receiving user selection input (e.g., at steps 130 and 230). - The
exemplary device 700 may also comprise one or more device identification modules 774. Such module(s) 774 may, for example and without limitation, operate to perform any or all of the functionality discussed herein with regard to identifying a user-selected device (e.g., at steps 140 and 240). - The
exemplary device 700 may also comprise one or more database interface modules 790. Such module(s) 790 may, for example and without limitation, operate to perform any or all of the database interface functionality discussed herein. -
FIG. 8 shows a block diagram of a non-limiting exemplary device 800 for providing remote user selection of a device, in accordance with various aspects of the present invention. FIG. 7 provided a diagram illustrating an exemplary device (or system) 700 in accordance with various aspects of the present invention. FIG. 8 provides another diagram illustrating an exemplary device (or system) 800 in accordance with various aspects of the present invention. - The
exemplary device 800 may share any or all aspects with any of the devices (e.g., portable computer devices, access points, gateways, network controllers, terminal devices, etc.) discussed herein (e.g., with regard to FIGS. 1-7). For example, the exemplary device 800 (or various modules thereof) may operate to perform any or all functionality discussed herein. As with the exemplary device 700, the components of the exemplary device 800 may be co-located in a single housing. - For example, the
device 800 comprises a processor 860. Such a processor 860 may, for example, share any or all characteristics with the processor 760 discussed with regard to FIG. 7. Also for example, the device 800 comprises a memory 861. Such memory 861 may, for example, share any or all characteristics with the memory 761 discussed with regard to FIG. 7. - Also for example, the exemplary device (or system) 800 may comprise any of a variety of user interface module(s) 840. Such user interface module(s) 840 may, for example, share any or all characteristics with the user interface module(s) 740 discussed previously with regard to
FIG. 7. For example and without limitation, the user interface module(s) 840 may comprise: a display device, a camera (for still or moving picture acquisition), a speaker, an earphone (e.g., wired or wireless), a microphone, a video screen (e.g., a touch screen display), a vibrating mechanism, a keypad, a remote control interface, and/or any of a variety of other user interface devices (e.g., a mouse, a trackball, a touch pad, touch screen, light pen, game controlling device, etc.). - The
exemplary device 800 may also, for example, comprise any of a variety of communication modules (805, 806, and 830). Such communication module(s) may, for example, share any or all characteristics with the communication interface module(s) 710, 720 and the communication module 730 discussed previously with regard to FIG. 7. For example and without limitation, the communication interface module(s) 830 may comprise: a Bluetooth interface module; an IEEE 802.11, 802.15, 802.16 and/or 802.20 module; any of a variety of cellular telecommunication interface modules (e.g., GSM/GPRS/EDGE, CDMA/CDMA2000/1x-EV-DO, WCDMA/HSDPA/HSUPA, TDMA/PDC, WiMAX, etc.); any of a variety of position-related communication interface modules (e.g., GPS, A-GPS, etc.); any of a variety of wired/tethered communication interface modules (e.g., USB, FireWire, RS-232, HDMI, component and/or composite video, Ethernet, wire line and/or cable modem, etc.); any of a variety of communication interface modules related to communicating with external memory devices; etc. The exemplary device 800 is also illustrated as comprising various wired 806 and/or wireless 805 front-end modules that may, for example, be included in the communication interface modules and/or utilized thereby. - The exemplary device (or system) 800 may also comprise any of a variety of signal processing module(s) 865. Such signal processing module(s) 865 may, for example, be utilized to assist in processing various types of information discussed previously (e.g., with regard to sensor processing, position or location determination, orientation determination, video processing, image processing, audio processing, general user interface information data processing, etc.). For example and without limitation, the signal processing module(s) 865 may comprise: video/graphics processing modules (e.g.
MPEG-2, MPEG-4, H.263, H.264, JPEG, TIFF, 3-D, 2-D, MDDI, etc.); audio processing modules (e.g., MP3, AAC, MIDI, QCELP, AMR, CMX, etc.); and/or tactile processing modules (e.g., Keypad I/O, touch screen processing, motor control, etc.).
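The grouping of the exemplary device 800 into user interface, communication, and signal processing modules can be sketched as a simple capability registry. The `Module` and `Device` classes and the category strings below are purely illustrative assumptions for this sketch and do not appear in the disclosure:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Module:
    """One functional module of a device (names here are illustrative)."""
    name: str
    category: str  # e.g., "communication", "signal_processing", "user_interface"


@dataclass
class Device:
    """Hypothetical registry grouping modules the way device 800 is described."""
    modules: List[Module] = field(default_factory=list)

    def add_module(self, module: Module) -> None:
        self.modules.append(module)

    def modules_in(self, category: str) -> List[str]:
        # Enumerate module names belonging to one category, mirroring how
        # the description groups communication interface modules,
        # signal processing modules, and user interface modules.
        return [m.name for m in self.modules if m.category == category]


device = Device()
device.add_module(Module("Bluetooth", "communication"))
device.add_module(Module("IEEE 802.11", "communication"))
device.add_module(Module("H.264 video/graphics", "signal_processing"))
device.add_module(Module("touch screen", "user_interface"))

print(device.modules_in("communication"))  # ['Bluetooth', 'IEEE 802.11']
```

A network controller selecting among remote devices could query such a registry to learn which communication paths a device offers before presenting it to the user.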
In summary, various aspects of the present invention provide a system and method providing remote user selection (e.g., image-based selection) of a device. While the invention has been described with reference to certain aspects and embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.
Claims (30)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/042,198 US20110248877A1 (en) | 2010-04-12 | 2011-03-07 | System and method providing remote user selection of a device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US32322310P | 2010-04-12 | 2010-04-12 | |
US13/042,198 US20110248877A1 (en) | 2010-04-12 | 2011-03-07 | System and method providing remote user selection of a device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20110248877A1 true US20110248877A1 (en) | 2011-10-13 |
Family
ID=44760542
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/042,198 Abandoned US20110248877A1 (en) | 2010-04-12 | 2011-03-07 | System and method providing remote user selection of a device |
US13/042,165 Active 2031-04-08 US8812656B2 (en) | 2010-04-12 | 2011-03-07 | System and method for automatically managing a network of user-selectable devices |
US13/042,223 Abandoned US20110252328A1 (en) | 2010-04-12 | 2011-03-07 | System and method in a network controller for remotely monitoring and/or controlling devices |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/042,165 Active 2031-04-08 US8812656B2 (en) | 2010-04-12 | 2011-03-07 | System and method for automatically managing a network of user-selectable devices |
US13/042,223 Abandoned US20110252328A1 (en) | 2010-04-12 | 2011-03-07 | System and method in a network controller for remotely monitoring and/or controlling devices |
Country Status (1)
Country | Link |
---|---|
US (3) | US20110248877A1 (en) |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20120057734A (en) * | 2010-11-22 | 2012-06-07 | 삼성전자주식회사 | Server, device accessing server and control method |
TWI433568B (en) * | 2011-05-05 | 2014-04-01 | Univ Nat Taiwan Science Tech | Human-environment interactive system and portable device using the same |
US8612530B1 (en) * | 2011-05-27 | 2013-12-17 | Mu Dynamics, Inc. | Pass-through testing using message exchange identifiers |
MX342956B (en) * | 2011-08-30 | 2016-10-19 | Allure Energy Inc | Resource manager, system, and method for communicating resource management information for smart energy and media resources. |
US9201432B2 (en) * | 2011-12-15 | 2015-12-01 | Verizon Patent And Licensing Inc. | Home monitoring settings based on weather forecast |
US9784836B2 (en) * | 2013-11-08 | 2017-10-10 | Sharper Shape Oy | System for monitoring power lines |
US9661483B2 (en) * | 2014-06-20 | 2017-05-23 | Enrico Bastianelli | Multi-function emergency assistance request, tracking, and communication system |
US9491587B2 (en) * | 2015-03-17 | 2016-11-08 | The Boeing Company | Spatially mapping radio frequency data in 3-dimensional environments |
CN105915359A (en) * | 2015-10-22 | 2016-08-31 | 乐视致新电子科技(天津)有限公司 | Method for controlling equipment networking condition and device and system thereof |
US10404697B1 (en) | 2015-12-28 | 2019-09-03 | Symantec Corporation | Systems and methods for using vehicles as information sources for knowledge-based authentication |
US10326733B2 (en) | 2015-12-30 | 2019-06-18 | Symantec Corporation | Systems and methods for facilitating single sign-on for multiple devices |
US10116513B1 (en) | 2016-02-10 | 2018-10-30 | Symantec Corporation | Systems and methods for managing smart building systems |
US10163219B2 (en) * | 2016-06-24 | 2018-12-25 | Fca Us Llc | Machine vision cargo monitoring in a vehicle |
US10375114B1 (en) | 2016-06-27 | 2019-08-06 | Symantec Corporation | Systems and methods for enforcing access-control policies |
US10462184B1 (en) | 2016-06-28 | 2019-10-29 | Symantec Corporation | Systems and methods for enforcing access-control policies in an arbitrary physical space |
US10469457B1 (en) | 2016-09-26 | 2019-11-05 | Symantec Corporation | Systems and methods for securely sharing cloud-service credentials within a network of computing devices |
US10812981B1 (en) | 2017-03-22 | 2020-10-20 | NortonLifeLock, Inc. | Systems and methods for certifying geolocation coordinates of computing devices |
TWI690788B (en) * | 2018-09-17 | 2020-04-11 | 王啟祥 | Situation Mode Editing System |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100340253B1 (en) | 1997-06-25 | 2002-06-12 | 윤종용 | Improved home network, browser based, command and control |
US8073921B2 (en) * | 1997-07-01 | 2011-12-06 | Advanced Technology Company, LLC | Methods for remote monitoring and control of appliances over a computer network |
US6430629B1 (en) * | 1999-06-10 | 2002-08-06 | Sony Corporation | Methods and apparatus for monitoring a 1394 home network |
WO2000078001A2 (en) * | 1999-06-11 | 2000-12-21 | Microsoft Corporation | General api for remote control of devices |
US6463343B1 (en) * | 1999-08-10 | 2002-10-08 | International Business Machines Corporation | System and method for controlling remote devices from a client computer using digital images |
US7337217B2 (en) * | 2000-07-21 | 2008-02-26 | Samsung Electronics Co., Ltd. | Architecture for home network on world wide web |
US7690017B2 (en) * | 2001-05-03 | 2010-03-30 | Mitsubishi Digital Electronics America, Inc. | Control system and user interface for network of input devices |
US7127271B1 (en) * | 2001-10-18 | 2006-10-24 | Iwao Fujisaki | Communication device |
KR100474483B1 (en) | 2002-03-12 | 2005-03-09 | 삼성전자주식회사 | Aparatus for providing device information via network and method thereof |
US8116889B2 (en) * | 2002-06-27 | 2012-02-14 | Openpeak Inc. | Method, system, and computer program product for managing controlled residential or non-residential environments |
US7987489B2 (en) | 2003-01-07 | 2011-07-26 | Openpeak Inc. | Legacy device bridge for residential or non-residential networks |
JP2004266453A (en) * | 2003-02-28 | 2004-09-24 | Toshiba Corp | Network system, server equipment, and communication method |
US7085838B2 (en) * | 2003-08-04 | 2006-08-01 | Sbc Knowledge Ventures, Lp | Communications system for identifying remote digital subscriber line (DSL) customer premises equipment (CPE) devices during a discovery phase |
WO2005094270A2 (en) * | 2004-03-24 | 2005-10-13 | Sharp Laboratories Of America, Inc. | Methods and systems for a/v input device to display networking |
JP2005292879A (en) * | 2004-03-31 | 2005-10-20 | Fujitsu Ltd | Photographic information server and photographic information transmission system |
US7463304B2 (en) * | 2004-05-06 | 2008-12-09 | Sony Ericsson Mobile Communications Ab | Remote control accessory for a camera-equipped wireless communications device |
CN101208933A (en) | 2005-06-23 | 2008-06-25 | 皇家飞利浦电子股份有限公司 | An apparatus and method of configuring a device in a network |
JP2007013694A (en) | 2005-06-30 | 2007-01-18 | Sony Corp | Interactive communication instrument and method of connection |
US8374623B2 (en) * | 2006-07-21 | 2013-02-12 | Microsoft Corporation | Location based, software control of mobile devices |
US8619136B2 (en) * | 2006-12-01 | 2013-12-31 | Centurylink Intellectual Property Llc | System and method for home monitoring using a set top box |
US20080136972A1 (en) * | 2006-12-12 | 2008-06-12 | Blankenburg Carl J | Control system and user interface for network of input devices |
EP3445021B1 (en) * | 2007-03-29 | 2020-08-05 | Signify Holding B.V. | Networked control system using logical addresses |
WO2009146199A2 (en) * | 2008-04-16 | 2009-12-03 | Deka Products Limited Partnership | Systems, apparatus, and methods for the management and control of remotely controlled devices |
US20100137020A1 (en) * | 2008-12-02 | 2010-06-03 | Broadcom Corporation | Communications device with millimeter wave remote control and methods for use therewith |
WO2010073732A1 (en) | 2008-12-26 | 2010-07-01 | パナソニック株式会社 | Communication device |
GB0823591D0 (en) * | 2008-12-30 | 2009-01-28 | Eldon Technology Ltd | A remote control device for controlling the presentation of broadcast programming |
US9014685B2 (en) * | 2009-06-12 | 2015-04-21 | Microsoft Technology Licensing, Llc | Mobile device which automatically determines operating mode |
US8508478B2 (en) * | 2009-09-25 | 2013-08-13 | At&T Intellectual Property I, Lp | Devices, systems and methods for remote control input |
US8606896B2 (en) | 2009-10-08 | 2013-12-10 | Sony Corporation | Home network component controlling data and function of another home network component |
US9413836B2 (en) * | 2010-04-08 | 2016-08-09 | At&T Intellectual Property I, L.P. | Communication routing based on presence in a confined wireless environment |
- 2011-03-07 US US13/042,198 patent/US20110248877A1/en not_active Abandoned
- 2011-03-07 US US13/042,165 patent/US8812656B2/en active Active
- 2011-03-07 US US13/042,223 patent/US20110252328A1/en not_active Abandoned
Patent Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6020881A (en) * | 1993-05-24 | 2000-02-01 | Sun Microsystems | Graphical user interface with method and apparatus for interfacing to remote devices |
US6037936A (en) * | 1993-09-10 | 2000-03-14 | Criticom Corp. | Computer vision system with a graphic user interface and remote camera control |
US7188139B1 (en) * | 1999-11-18 | 2007-03-06 | Sony Corporation | Portable information processing terminal, information input/output system and information input/output method |
US20020141586A1 (en) * | 2001-03-29 | 2002-10-03 | Aladdin Knowledge Systems Ltd. | Authentication employing the bluetooth communication protocol |
US20030037341A1 (en) * | 2001-08-17 | 2003-02-20 | Van Der Meulen Pieter Sierd | System for remotely controlling consumer electronics using a web-cam image |
US20030126239A1 (en) * | 2001-12-31 | 2003-07-03 | Hwang Hye-Sook | Mobile communication terminal, network access system and method thereof using the same |
US20040203906A1 (en) * | 2002-04-19 | 2004-10-14 | Takayuki Kato | Monitoring device and monitoring system for monitoring the location of communication devices |
US20100211912A1 (en) * | 2002-08-08 | 2010-08-19 | Rf Check, Inc. | Interactive Graphical User Interface for an Internet Site Providing Data Related to Radio Frequency Emitters |
US7710456B2 (en) * | 2006-03-09 | 2010-05-04 | Fujifilm Corporation | Remote control device, method and system |
US7986299B2 (en) * | 2006-03-09 | 2011-07-26 | Fujifilm Corporation | Remote control apparatus, remote control system and device-specific information display method |
US20090285443A1 (en) * | 2008-05-15 | 2009-11-19 | Sony Ericsson Mobile Communications Ab | Remote Control Based on Image Recognition |
US20090307255A1 (en) * | 2008-06-06 | 2009-12-10 | Johnson Controls Technology Company | Graphical management of building devices |
US20110037851A1 (en) * | 2009-08-14 | 2011-02-17 | Lg Electronics Inc. | Remote control device and remote control method using the same |
US20110037609A1 (en) * | 2009-08-14 | 2011-02-17 | Lg Electronics Inc. | Remote control device and remote control method using the same |
US20110138416A1 (en) * | 2009-12-04 | 2011-06-09 | Lg Electronics Inc. | Augmented remote controller and method for operating the same |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103869776A (en) * | 2014-03-07 | 2014-06-18 | 南通大学 | System and method for campus environment monitoring and information processing |
EP2942677A1 (en) * | 2014-05-09 | 2015-11-11 | GIRA GIERSIEPEN GmbH & Co. KG | Building automation system |
CN106200433A (en) * | 2016-08-17 | 2016-12-07 | 北京小米移动软件有限公司 | Equipment long-range control method and device, electronic equipment |
US20190122538A1 (en) * | 2017-10-25 | 2019-04-25 | Sony Interactive Entertainment Inc. | Spatial Remote Control |
US10475332B2 (en) * | 2017-10-25 | 2019-11-12 | Sony Interactive Entertainment Inc. | Spatial remote control |
Also Published As
Publication number | Publication date |
---|---|
US20110252328A1 (en) | 2011-10-13 |
US8812656B2 (en) | 2014-08-19 |
US20110252131A1 (en) | 2011-10-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20110248877A1 (en) | System and method providing remote user selection of a device | |
US11609684B2 (en) | Timeline-video relationship presentation for alert events | |
US11386285B2 (en) | Systems and methods of person recognition in video streams | |
US20220215664A1 (en) | Timeline-Video Relationship Processing for Alert Events | |
US11256951B2 (en) | Systems and methods of person recognition in video streams | |
EP2375695B1 (en) | System and method for managing a network of user-selectable devices | |
KR20170015622A (en) | User terminal apparatus and control method thereof | |
US11671275B2 (en) | Method and system of controlling device using real-time indoor image | |
JP7419495B2 (en) | Projection method and projection system | |
JP6658519B2 (en) | Information processing apparatus, information processing system, control method of information processing apparatus, and program | |
US20230418908A1 (en) | Systems and Methods of Person Recognition in Video Streams | |
US20170221219A1 (en) | Method and apparatus for surveillance using location-tracking imaging devices | |
CN113727193A (en) | Method and system for processing connection of multimedia content, and storage medium | |
US10582130B1 (en) | System and method for connecting a network camera | |
US11756302B1 (en) | Managing presentation of subject-based segmented video feed on a receiving device | |
KR102008672B1 (en) | System for Performing Linkage Operation of Augmented Reality and Event in Association with Camera and Driving Method Thereof | |
CN115243085A (en) | Display device and device interconnection method | |
CN109194920B (en) | Intelligent object searching method based on high-definition camera | |
US20150048173A1 (en) | Method of processing at least one object in image in computing device, and computing device | |
US20230388447A1 (en) | Subject-based smart segmentation of video feed on a transmitting device | |
US20240080366A1 (en) | Systems and methods for controlling network devices in an augmented reality environment | |
US20240078761A1 (en) | Systems and methods for managing network devices using augmented reality |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: BROADCOM CORPORATION, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KARAOGUZ, JEYHAN;REEL/FRAME:026284/0165 Effective date: 20110303 |
|
AS | Assignment |
Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001 Effective date: 20160201 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:041706/0001 Effective date: 20170120 |
|
AS | Assignment |
Owner name: BROADCOM CORPORATION, CALIFORNIA Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041712/0001 Effective date: 20170119 |