US20160183604A1 - Wearable imaging sensor with wireless remote communications - Google Patents
Wearable imaging sensor with wireless remote communications
- Publication number
- US20160183604A1 (application US 15/062,867)
- Authority
- US
- United States
- Prior art keywords
- camera
- image
- garment
- data interface
- description information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
- H04N7/185—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
-
- A—HUMAN NECESSITIES
- A41—WEARING APPAREL
- A41D—OUTERWEAR; PROTECTIVE GARMENTS; ACCESSORIES
- A41D1/00—Garments
- A41D1/002—Garments adapted to accommodate electronic equipment
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
-
- G06F17/30265—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G06K9/6215—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/02—Constructional features of telephone sets
- H04M1/0202—Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
- H04M1/026—Details of the structure or mounting of specific components
- H04M1/0264—Details of the structure or mounting of specific components for a camera module assembly
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00095—Systems or arrangements for the transmission of the picture signal
- H04N1/00103—Systems or arrangements for the transmission of the picture signal specially adapted for radio transmission, e.g. via satellites
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00095—Systems or arrangements for the transmission of the picture signal
- H04N1/00111—Systems or arrangements for the transmission of the picture signal specially adapted for optical transmission
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/51—Housings
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/631—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/661—Transmitting camera control signals through networks, e.g. control via the Internet
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H04N5/2252—
-
- H04N5/23238—
-
- H04N5/23293—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2101/00—Still video cameras
Definitions
- the present description relates to wearable sensors and in particular to a wearable camera capable of connection to display and information systems.
- a variety of applications have been developed that allow smartphone users to use the built-in cameras that are included in many such phones.
- data from the camera is sent by the smartphone to servers for some purpose.
- the phone may be used to send images to friends, upload pictures to social networking or photo sharing sites, or to find product information using a camera image of a Quick Response Code.
- FIG. 1 is a diagram of a wearable camera as part of a garment and a mobile computer system according to an embodiment of the invention.
- FIG. 2 is a diagram of a wearable camera and a mobile computer system as part of a garment according to an embodiment of the invention.
- FIG. 3 is a signaling diagram of sending an image to a server through a portable device according to an embodiment of the invention.
- FIG. 4 is a signaling diagram of sending an image to a server according to an embodiment of the invention.
- FIG. 5 is a signaling diagram of sending an image to a server and receiving information at a portable device according to an embodiment of the invention.
- FIG. 6A is a top elevation view of a camera module with a horizontal panoramic field of view suitable for use with the present invention.
- FIG. 6B is a perspective view of the camera module of FIG. 6A .
- FIG. 7A is a top elevation view of a camera sensor system with multiple camera modules suitable for use with the present invention.
- FIG. 7B is a perspective view of the camera sensor system of FIG. 7A .
- FIG. 8 is a block diagram of a processing system suitable for use with a camera sensor according to an embodiment of the invention.
- FIG. 9A is a process flow diagram of a first usage scenario according to an embodiment of the invention.
- FIG. 9B is a process flow diagram of a second usage scenario according to an embodiment of the invention.
- FIG. 10 is a block diagram of a computing device according to an embodiment of the invention.
- Wearable technology can extend mobile computer usage beyond its current state.
- the user may be required to hold the device one way to image an object and a different way to use the device.
- because sensors such as cameras are directional, with a limited field of view of typically 30 to 70 degrees horizontal, the camera can capture only a limited view of the area surrounding a user, not a full 180 degrees around the consumer. Even this limited directional view can be captured only when the camera is held in a particular way.
- the sensor element may be integrated into clothing either as an attachable element such as a pin or as a semi-permanent attached element.
- the cameras or sensors may then transfer the image to the user's handheld or other type of mobile device.
- the sensor may be equipped with circuitry and a wireless antenna to transmit data to a mobile device.
- the sensor may be equipped with fiber optic ports to seamlessly transfer data at a high rate of speed to the mobile device.
- the captured images may then be used with image recognition software to help improve the overall mobile experience of the user.
- FIG. 1 is a diagram of a wearable camera and computer system according to an embodiment of the invention.
- a garment 10 , for example a shirt, carries a camera sensor 12 that is coupled through a wire interface 14 to an electronic device on the garment.
- the electronic device may include a system on a chip 16 coupled to a battery 18 or other type of power supply and an antenna 20 for external communications.
- the camera sensor 12 captures images and conveys them through the wired connection 14 to a data interface of the system on a chip 16 .
- the wired connection may be electrical, optical, or wireless.
- the wire may be embedded into the garment or separate from the garment and attached to the garment.
- the data interface of the system on a chip 16 receives the images and processes them depending on the particular application.
- the external antenna 20 allows the processor to communicate with external devices 22 such as a smartphone, tablet, external computer, wearable computer, or data relay terminal.
- the external device is a smartphone with a user interface 24 , a display 26 , and an external antenna 28 for communication with the wearable camera system 20 and with external servers (not shown).
- while processing resources 16 and power supply 18 are shown as being separate and apart from the camera module 12 , this is not required. All components may be integrated into a single camera module which transmits information directly to an external device. The images captured by the camera may be further processed by the camera module 12 or by the connected processor 16 . Alternatively, raw image data may be sent directly to an external device 22 for processing.
- FIG. 2 shows an alternative implementation integrated with a garment 30 , in this case a shirt.
- any other suitable garment may be used including a blouse, a jacket, a coat, a hat, or pants.
- the garment has a camera module 32 in the shape of a shirt button which is sewn onto the garment in a conventional manner.
- the camera module is connected through a communication link 34 which may be electrical or optical or wireless to a processing system that includes a system on a chip 36 and a power supply 38 .
- the processing module is also coupled to a display 42 which, in this case, is connected or embedded into a sleeve of the shirt.
- the display may be a touch screen display or it may include user interface buttons that allow the user to view the display and send commands and instructions from the display or associated components 42 back to the processing system 36 .
- the shirt 30 is a wearable computer with an awareness of its surrounding environment through the camera module 32 .
- the imaging system is provided with an awareness of the surrounding conditions in front of the user, which is the direction in which the user is usually headed. This allows the system to provide the user with information about what is in the user's path.
- the camera 12 , 32 and SOC 16 , 36 may be embedded into or incorporated into or attached to the garment in any of a variety of different ways. They may be connected using a pin through the fabric of the garment so that the camera may easily be removed and then attached to other garments. Straps and belts may alternatively be used. Similarly hook and loop fasteners may be used to hold the camera sensor, SOC, and screen to the garment. They may be held in some type of holder incorporated into the garment such as a special pocket, flap, or tab. They may be sewn into the garment as a separate structure such as the button camera 32 of FIG. 2 or woven into the fabric. Some parts may be attached in one way and other parts in different ways.
- FIG. 3 is a signal flow diagram to show communications between a camera 52 , a processing device 54 , and an external server 56 .
- the camera system sends an image in signal (a) to the processing device.
- the image (a) is then forwarded from the processing device 54 to the server 56 .
- the server analyzes the image and returns descriptive information related to the image back to the processing device. This information may be a description of an item in the image, purchasing, historical, or status information about an item in the image, or information about objects or services near an object in the image, among other kinds of information.
- the processing device 54 may be an integral part of the imaging system such as the wearable computer 36 of FIG. 2 , an external handheld or portable device, such as the smartphone 22 of FIG. 1 , or a larger, more fixed computing device such as a desktop, workstation, or other fixed computer.
- the communication of FIG. 3 allows the camera system to observe the surroundings in front of the user, send information about these surroundings to an information source 56 , and then provide information to the user through the user's smartphone 22 .
- This allows the user to hold the smartphone in any position and yet have full situational awareness about the environment immediately in front of the user. So for example as a tourist walking down the street, the user can be informed of buildings which are coming into view through a display on his smartphone 22 .
- a plant maintenance worker can receive information on the smartphone about equipment, fixtures, and rigs which come into view of the camera system 12 . This can be used for systems that are distant or for very close objects.
- the tourist may obtain information about specific items displayed on store shelves or about monuments in a city park.
- the maintenance worker may obtain information about large systems or detailed service information about a specific piece of equipment at a facility.
- the wearable camera sensor system requires only a low power, short range connection to the smart phone, such as Bluetooth, Ultra-Low Power WiFi, or NFC (Near Field Communications). This allows for a lighter, smaller system with less need for recharging.
- the smart phone may then use a higher power long range radio, such as mobile cellular.
- a wearable system may be used in the same way except that the user views the information on the sleeve display screen 42 .
- the sleeve display may be held in any position and yet the system obtains information about the environment in front of the user.
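The FIG. 3 exchange described above can be sketched as a short program: the camera sends an image to a processing device, the device forwards it to a server, and the server returns descriptive information. This is a minimal illustrative sketch; the class names, method names, and the hard-coded description are assumptions, not part of the patent.

```python
# Hypothetical sketch of the FIG. 3 signal flow. All names and the
# returned description are illustrative stand-ins.

class Server:
    def analyze(self, image):
        # Stand-in for server-side image recognition: return description
        # information related to an item in the image.
        return {"item": "building", "info": "historic landmark"}

class ProcessingDevice:
    def __init__(self, server):
        self.server = server

    def forward(self, image):
        # (b) forward the image to the server, (c) receive the description
        return self.server.analyze(image)

class Camera:
    def __init__(self, device):
        self.device = device

    def capture_and_send(self):
        image = b"raw-image-bytes"      # (a) the captured image
        return self.device.forward(image)

description = Camera(ProcessingDevice(Server())).capture_and_send()
print(description["item"])
```

The same shape covers both variants described above: the `ProcessingDevice` may be the wearable computer 36, the smartphone 22, or a fixed workstation.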
- FIG. 4 presents a signaling diagram for an alternative signaling communication.
- the camera 52 sends an image signal (a) directly to a server 56 .
- the server 56 then sends a description (b) back to the camera 52 .
- the camera can then present the information on an incorporated display 42 or send the descriptive information to an external device 22 for consideration by the user.
- FIG. 5 presents another alternative signaling diagram.
- the camera 52 sends an image signal (a) directly to a server 56 .
- the server sends the description information not to the camera but to the processing device 54 .
- if the camera system has a wireless antenna 20 , 40 attached to its processing resources 16 , 36 , it may be able to communicate through a cellular telephone network or a WiFi network directly to a remote server to obtain the desired information. This simplifies communications for the external device 22 .
- the server can then send information directly to the user for display on the external device.
- FIG. 6A is a top elevation view of a camera system for detecting images.
- a photo detector or image sensor 72 is coupled to an imaging processor 74 that generates an output signal for wired, optical, or radio communication.
- the sensor 72 sees the surrounding environment through a wide field view lens 76 .
- the wide field of view lens may be a fish eye type lens or any other appropriate type lens.
- the lens is shown as having a 180 degree panoramic horizontal field of view so that the camera can observe everything that is in front of the user.
- the panoramic view may be more or less than 180 degrees, depending on the particular implementation.
- FIG. 6B is a perspective view of the same camera module as in FIG. 6A .
- This view shows that the fish eye lens 76 has a 180 degree field of view in the horizontal direction and a much narrower field of view, for example 60 degrees, in the vertical direction. Reducing the field of view in the vertical direction simplifies the lens, the imaging requirements of the sensor 72 , and the processing requirements of the image processor 74 , while still detecting most everything of interest to the viewer and the user.
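The asymmetric field of view described above can be illustrated with simple arithmetic. The angular resolution of 0.1 degree per pixel is an assumed figure for illustration only; the patent does not specify sensor resolution.

```python
# Illustrative arithmetic for a 180-degree horizontal by 60-degree
# vertical field of view. The 0.1 degree-per-pixel target resolution
# is an assumption, not taken from the patent.

H_FOV_DEG = 180.0
V_FOV_DEG = 60.0
DEG_PER_PIXEL = 0.1   # assumed angular resolution per pixel

def pixels_for(fov_deg, deg_per_pixel):
    # round() avoids float truncation (e.g. 180/0.1 -> 1799.999...).
    return round(fov_deg / deg_per_pixel)

h_pixels = pixels_for(H_FOV_DEG, DEG_PER_PIXEL)   # 1800
v_pixels = pixels_for(V_FOV_DEG, DEG_PER_PIXEL)   # 600
aspect = h_pixels / v_pixels                      # 3:1 panoramic format
print(h_pixels, v_pixels, aspect)
```

The 3:1 aspect shows why narrowing the vertical view cuts the sensor and processing burden to a third of what a square 180-by-180 view would require.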
- FIG. 7A is a top elevation view of an alternative type of camera sensor system suitable for use as described with embodiments of the present invention.
- five separate camera modules are coupled together to a single image processor 88 .
- Each camera module includes a lens 86 - 1 to 86 - 5 for imaging a narrower field of view onto a photo detector 82 - 1 to 82 - 5 .
- the data from each image sensor is then processed in a separate image processor 84 - 1 to 84 - 5 . All of the processed image information is then consolidated and combined by an additional image processor 88 .
- the additional image processor 88 is then coupled through an output interface to the processing resources 16 , 36 of the system.
- each photodetector may be coupled into the same processor so that a single processor receives the raw data and generates a consolidated image. In such an implementation, the separate image processors 84 - 1 to 84 - 5 would not be needed.
- Each camera module of FIG. 7A may be simpler and less expensive than that of FIG. 6A which may allow for a less expensive and yet more detailed view of the surroundings of the user.
- the camera modules may be independently switched on and off to save power and to switch between a panoramic view and a more detailed single camera view.
- the camera sensors may also be activated one at a time as a scanning array or a multiple spot imaging array. While five camera modules are shown, there may be fewer or many more. Additional or different types of camera sensors may be used including artificial compound eye sensors with many more than five individual camera modules.
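The consolidation step performed by the additional image processor 88 can be sketched as stitching adjacent slices. This is a rough sketch under stated assumptions: each of five modules covers an equal 36-degree slice, and images are modeled as nested lists; the actual stitching in a real system would involve calibration and blending.

```python
# Minimal sketch of consolidating five camera-module outputs into one
# panoramic frame. The list-of-rows image model is illustrative only.

NUM_MODULES = 5
SLICE_DEG = 180 // NUM_MODULES   # 36 degrees per module (assumed equal slices)

def capture_slice(module_index, rows=2, cols=3):
    # Stand-in for a photodetector readout: tag pixels with the module index.
    return [[module_index] * cols for _ in range(rows)]

def consolidate(slices):
    # Stitch the slices side by side, row by row, into one wide frame.
    return [sum((s[r] for s in slices), []) for r in range(len(slices[0]))]

panorama = consolidate([capture_slice(i) for i in range(NUM_MODULES)])
print(len(panorama[0]))   # 15 columns: 5 modules x 3 columns each
```

Switching a module off would simply drop its slice from the list, matching the power-saving and single-camera modes described above.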
- Any type of camera module may be attached to a garment 10 , 30 using a clip or a pin.
- the camera module may also be permanently attached by being sewn on such as the example of the camera sensor 32 of FIG. 2 which appears as and operates as a button to hold the shirt together.
- the camera module may also be outfitted with a special holder in the garment to hold the camera module in place.
- the processing system 16 , 36 may take a variety of different forms. A simple example is shown in the block diagram of FIG. 8 .
- the processing system 90 is coupled to the camera module 92 through a data interface 94 .
- the data interface may connect to the camera module using a wired or wireless connector 91 , as explained above .
- the data interface is coupled to a controller 96 of the processor 90 which has internal memory resources 97 that may or may not be available to other components.
- the controller may be a simple FPGA (Field Programmable Gate Array), a complex microprocessor with many functions, or any other suitable processing device.
- the memory resources 97 may be magnetic, solid state, phase change, or any other type of memory depending on the particular implementation.
- the controller 96 may also be connected to a communications interface 98 with, for example, an antenna 99 with which to send and receive data with external devices.
- data from the camera 92 may be delivered through the data interface 94 directly through the communications interface 98 to be sent through the antenna 99 to other devices.
- information may be received through the communications interface 98 by the controller for communication to the user.
- a common bus connects the data interface, communications interface and controller to each other to exchange data, signaling, and commands.
- the memory may be connected directly to the bus or connected through the controller depending on the implementation.
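The FIG. 8 arrangement can be sketched in a few lines: a data interface and a communications interface attached to a common bus, with camera data passed straight through to the radio. The class and method names are illustrative assumptions, not the patent's terminology.

```python
# Rough sketch of the FIG. 8 processing system: devices attached to a
# shared bus, with the direct camera-to-radio path described above.

class Bus:
    def __init__(self):
        self.devices = {}

    def attach(self, name, device):
        self.devices[name] = device

    def send(self, target, payload):
        return self.devices[target].receive(payload)

class CommsInterface:
    def __init__(self):
        self.sent = []

    def receive(self, payload):
        self.sent.append(payload)   # transmit via the antenna (stubbed)
        return "sent"

class DataInterface:
    def __init__(self, bus):
        self.bus = bus

    def on_camera_data(self, image):
        # Deliver camera data directly through the communications interface.
        return self.bus.send("comms", image)

bus = Bus()
comms = CommsInterface()
bus.attach("comms", comms)
data_if = DataInterface(bus)
print(data_if.on_camera_data(b"frame-1"))
```

A controller with its memory would attach to the same bus in the same way; the direct path shown here corresponds to sending raw images onward without local processing.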
- FIG. 9A shows an example usage scenario as a process flow diagram for the camera and capture systems described above.
- the camera sensor captures an image using the photo detectors as described above.
- the image is sent to an external information source. This may be sent directly from a camera module or from a larger external device. The image may be sent to a remote server or to a local device for analysis, depending on the type of analysis and the capabilities of the local resources.
- the image is analyzed to find information, metadata, or other resources that are related to the image.
- the analysis is sent back to the user for use. The analysis may be displayed on a screen as described or it may be presented with sound, such as a simulated voice or in any other way.
- This process may be repeated as the camera sensor continues to capture images.
- the process may be timed so that a new image is sent at specific time intervals, such as once a second, once a minute, once every ten minutes, etc.
- the process may be triggered by user command or by a remote command or by the system.
- the camera sensor may capture images and perform an analysis to determine if the scene has significantly changed. A significant change may be used as a trigger to send a new image.
- the image may be sent with additional information or commands based on an application currently in use or a request from the user.
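The timed and change-based triggers described above can be sketched as a single decision function. The difference metric and threshold here are illustrative assumptions; the patent does not specify how a "significant change" is measured.

```python
# Hedged sketch of the trigger logic: send a new image when the timer
# interval elapses or when the scene has changed significantly.
# Frames are modeled as flat lists of pixel intensities.

def frame_difference(prev, cur):
    # Mean absolute pixel difference between two equal-length frames.
    return sum(abs(a - b) for a, b in zip(prev, cur)) / len(cur)

def should_send(elapsed_s, interval_s, prev_frame, cur_frame, threshold=10.0):
    if elapsed_s >= interval_s:
        return True                                             # timed trigger
    return frame_difference(prev_frame, cur_frame) > threshold  # change trigger

print(should_send(61, 60, [0, 0], [0, 0]))    # interval elapsed
print(should_send(5, 60, [0, 0], [50, 50]))   # scene changed
print(should_send(5, 60, [0, 0], [1, 1]))     # neither trigger fired
```

A user or remote command would simply bypass this check, matching the command-triggered path described above.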
- FIG. 9B shows a related process as a process flow diagram from the perspective of the server.
- an image is received from the camera sensor at the server. While use with a remote server is described, the same functions may be performed locally on an external device or by a wearable computer.
- the source of the image is identified together with any commands, related applications, or other information that may be useful in analyzing the image.
- the source that sent the image may be associated with a user account, with network identification, with cookies in a web interaction or in any other way.
- the source is identified as having a particular IP (Internet Protocol) address and specific preferences for the type of information desired and where that information may be sent.
- the server analyzes the image and determines an information set for the user based on the image and general information about what information is preferred by other users.
- the image may contain EXIF (Exchangeable Image File Format) or similar data providing an identification of the camera, the conditions when taking the image, and the time and location of the camera when the image was taken. This information together with any past images may be used when analyzing the image.
- the image is analyzed and at 128 , the analysis is sent back to the source of the image.
- the analysis may be sent to the same device that sent the image or to a related device, such as a smart phone, a mobile computer, a fixed workstation, or a different remote server, depending upon how the camera system is configured and any preferences provided by the user.
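The server-side flow of FIG. 9B can be sketched as a single handler: receive the image, identify the source and its preferences, analyze, and route the result to the preferred device. The preference table, field names, and the canned analysis are illustrative assumptions.

```python
# Illustrative sketch of the FIG. 9B server flow. The IP-keyed preference
# store and the returned analysis are stand-ins, not the patent's design.

USER_PREFS = {
    "10.0.0.7": {"reply_to": "smartphone", "detail": "brief"},
}

def analyze_image(image_bytes):
    # Stand-in for recognition: return a description of the imaged item.
    return {"item": "equipment rack", "status": "due for maintenance"}

def handle_upload(source_ip, image_bytes):
    # Identify the source and its preferences; fall back to replying
    # to the camera itself when the source is unknown.
    prefs = USER_PREFS.get(source_ip, {"reply_to": "camera", "detail": "brief"})
    analysis = analyze_image(image_bytes)
    return {"deliver_to": prefs["reply_to"], "analysis": analysis}

result = handle_upload("10.0.0.7", b"...")
print(result["deliver_to"])
```

Routing on `reply_to` mirrors the description above: the analysis may go back to the camera system, to a related smartphone, or to another configured device.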
- FIG. 10 illustrates a computing device 500 in accordance with one implementation of the invention that may be used for the camera modules, the processing system 16 , 36 , or the external device.
- the computing device 500 houses a board 502 .
- the board 502 may include a number of components, including but not limited to a processor 504 and at least one communication chip 506 .
- the processor 504 is physically and electrically coupled to the board 502 .
- the at least one communication chip 506 is also physically and electrically coupled to the board 502 .
- the communication chip 506 is part of the processor 504 .
- computing device 500 may include other components that may or may not be physically and electrically coupled to the board 502 .
- these other components include, but are not limited to, volatile memory (e.g., DRAM) 508 , non-volatile memory (e.g., ROM) 509 , flash memory (not shown), a graphics processor 512 , a digital signal processor (not shown), a crypto processor (not shown), a chipset 514 , an antenna 516 , a display 518 such as a touchscreen display, a touchscreen controller 520 , a battery 522 , an audio codec (not shown), a video codec (not shown), a power amplifier 524 , a global positioning system (GPS) device 526 , a compass 528 , an accelerometer (not shown), a gyroscope (not shown), a speaker 530 , a camera 532 , and a mass storage device (such as hard disk drive) 510 .
- the communication chip 506 enables wireless and/or wired communications for the transfer of data to and from the computing device 500 .
- wireless and its derivatives may be used to describe circuits, devices, systems, methods, techniques, communications channels, etc., that may communicate data through the use of modulated electromagnetic radiation through a non-solid medium. The term does not imply that the associated devices do not contain any wires, although in some embodiments they might not.
- the communication chip 506 may implement any of a number of wireless or wired standards or protocols, including but not limited to Wi-Fi (IEEE 802.11 family), WiMAX (IEEE 802.16 family), IEEE 802.20, long term evolution (LTE), Ev-DO, HSPA+, HSDPA+, HSUPA+, EDGE, GSM, GPRS, CDMA, TDMA, DECT, Bluetooth, Ethernet, derivatives thereof, as well as any other wireless and wired protocols that are designated as 3G, 4G, 5G, and beyond.
- the computing device 500 may include a plurality of communication chips 506 .
- a first communication chip 506 may be dedicated to shorter range wireless communications such as Wi-Fi and Bluetooth and a second communication chip 506 may be dedicated to longer range wireless communications such as GPS, EDGE, GPRS, CDMA, WiMAX, LTE, Ev-DO, and others.
- the processor 504 of the computing device 500 includes an integrated circuit die packaged within the processor 504 .
- the term “processor” may refer to any device or portion of a device that processes electronic data from registers and/or memory to transform that electronic data into other electronic data that may be stored in registers and/or memory.
- the computing device 500 may be a laptop, a netbook, a notebook, an ultrabook, a smartphone, a tablet, a personal digital assistant (PDA), an ultra mobile PC, a mobile phone, a desktop computer, a server, a printer, a scanner, a monitor, a set-top box, an entertainment control unit, a digital camera, a portable music player, or a digital video recorder.
- the computing device 500 may be any other electronic device that processes data.
- Embodiments may be implemented as a part of one or more memory chips, controllers, CPUs (Central Processing Unit), microchips or integrated circuits interconnected using a motherboard, an application specific integrated circuit (ASIC), and/or a field programmable gate array (FPGA).
- references to “one embodiment”, “an embodiment”, “example embodiment”, “various embodiments”, etc. indicate that the embodiment(s) of the invention so described may include particular features, structures, or characteristics, but not every embodiment necessarily includes the particular features, structures, or characteristics. Further, some embodiments may have some, all, or none of the features described for other embodiments.
- Coupled is used to indicate that two or more elements co-operate or interact with each other, but they may or may not have intervening physical or electrical components between them.
- Some embodiments pertain to an apparatus that comprises a camera to capture images with a wide field of view, a data interface to send camera images to an external device, and a power supply to power the camera and the data interface.
- the camera, data interface, and power supply are attached to a garment.
- the camera is integrated into the garment, such as by using a pin, or being sewn to the garment.
- the camera has a panoramic field of view or a 180 degree horizontal field of view.
- the data interface is an optic fiber interface.
- the data interface is a wireless interface.
- the external device is a cellular telephone.
- the apparatus further includes a processor and image recognition software to process images captured by the camera before sending by the data interface.
- an imaging and communication system comprises a camera to capture images and to send the images through a short range wireless interface, a data interface to receive the images from the camera through the short range wireless interface, a processor coupled to the data interface to process the images for analysis, and a long range wireless interface coupled to the processor to send the processed images to a remote device.
- the long range wireless image is further to receive information about the sent images from the remote device, a display is coupled to the processor to display the received information about the images, and a control interface coupled to the display to allow user control of the displayed information.
- a method comprises capturing an image in a camera attached to a garment, sending the image to an external device for analysis, receiving the analysis from the external device, and presenting the analysis to a user of the garment.
- sending the image comprises sending the image to a local portable device and the local portable device sending the image to a remote server.
- the local portable device is also attached to the garment, or the local portable device is a cellular telephone.
- sending the image further comprises sending additional information about the image including time and location, presenting the analysis comprises presenting information about the image on a display that is movable independent of the camera.
Abstract
A wearable image sensor is described. In one example an apparatus includes a camera attached to a garment to capture an image of a view of an area surrounding a user that is wearing the garment, the image including an item. A data interface is attached to the garment and coupled to the camera to send the camera image to an external device and to receive description information about the item from the external device. A power supply is attached to the garment and coupled to the camera and the data interface to power the camera and the data interface. The apparatus presents the received description information to a user of the garment.
Description
- The present application is a continuation of prior U.S. patent application Ser. No. 13/717,254, filed Dec. 17, 2012, entitled Wearable Imaging Sensor for Communications, by Ravi Pillarisetty, et al., the priority of which is hereby claimed and the contents of which are hereby incorporated by reference herein.
- The present description relates to wearable sensors and in particular to a wearable camera capable of connection to display and information systems.
- A variety of applications have been developed that allow smartphone users to use the built-in cameras included in many such phones. In some cases, data from the camera is sent by the smartphone to servers for some purpose. As examples, the phone may be used to send images to friends, to upload pictures to social networking or photo sharing sites, or to find product information using a camera image of a Quick Response Code.
- Embodiments of the invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which like reference numerals refer to similar elements.
-
FIG. 1 is a diagram of a wearable camera as part of a garment and a mobile computer system according to an embodiment of the invention. -
FIG. 2 is a diagram of a wearable camera and a mobile computer system as part of a garment according to an embodiment of the invention. -
FIG. 3 is a signaling diagram of sending an image to a server through a portable device according to an embodiment of the invention. -
FIG. 4 is a signaling diagram of sending an image to a server according to an embodiment of the invention. -
FIG. 5 is a signaling diagram of sending an image to a server and receiving information at a portable device according to an embodiment of the invention. -
FIG. 6A is a top elevation view of a camera module with a horizontal panoramic field of view suitable for use with the present invention. -
FIG. 6B is a perspective view of the camera module of FIG. 6A. -
FIG. 7A is a top elevation view of a camera sensor system with multiple camera modules suitable for use with the present invention. -
FIG. 7B is a perspective view of the camera sensor system of FIG. 7A. -
FIG. 8 is a block diagram of a processing system suitable for use with a camera sensor according to an embodiment of the invention. -
FIG. 9A is a process flow diagram of a first usage scenario according to an embodiment of the invention. -
FIG. 9B is a process flow diagram of a second usage scenario according to an embodiment of the invention. -
FIG. 10 is a block diagram of a computing device according to an embodiment of the invention. - Wearable technology can extend mobile computer usage beyond its current state. When a sensor is integrated into a mobile device, the user may be required to hold the device one way to image an object and a different way to use the device. In addition, because sensors such as cameras are directional, with a limited field of view of typically 30 to 70 degrees horizontal, the camera can capture only a limited view of the area surrounding a user, not a full 180 degrees around the consumer. Even this limited directional view can be captured only when the camera is held in a particular way.
- By mounting, attaching, integrating, or embedding the camera into a shirt, hat, or pants, all objects in front of the user may be sensed, imaged, and captured in a panoramic 180 degree visual plane. The sensor element may be integrated into clothing either as an attachable element such as a pin or as a semi-permanent attached element.
- The cameras or sensors may then transfer the image to the user's handheld or other type of mobile device. In one embodiment, the sensor may be equipped with circuitry and a wireless antenna to transmit data to a mobile device. In another example, the sensor may be equipped with fiber optic ports to seamlessly transfer data at a high rate of speed to the mobile device. The captured images may then be used with image recognition software to help improve the overall mobile experience of the user.
-
FIG. 1 is a diagram of a wearable camera and computer system according to an embodiment of the invention. A garment 10, for example a shirt, carries a camera sensor 12 that is coupled through a wire interface 14 to an electronic device on the garment. The electronic device may include a system on a chip 16 coupled to a battery 18 or other type of power supply and an antenna 20 for external communications. The camera sensor 12 captures images and conveys them through the wired connection 14 to a data interface of the system on a chip 16. The wired connection may be electrical, optical, or wireless. The wire may be embedded into the garment or separate from the garment and attached to it. - The data interface of the system on a
chip 16 receives the images and processes them depending on the particular application. The external antenna 20 allows the processor to communicate with external devices 22 such as a smartphone, tablet, external computer, wearable computer, or data relay terminal. In the example of FIG. 1 the external device is a smartphone with a user interface 24, a display 26, and an external antenna 28 for communication with the wearable camera system 20 and with external servers (not shown). - While the
processing resources 16 and power supply 18 are shown as being separate and apart from the camera module 12, this is not required. All components may be integrated into a single camera module which transmits information directly to an external device. The images captured by the camera may be further processed by the camera module 20 or by the connected processor 16. Alternatively, raw image data may be sent directly to an external device 22 for processing. -
FIG. 2 shows an alternative implementation integrated with a garment 30, in this case a shirt. However, any other suitable garment may be used, including a blouse, a jacket, a coat, a hat, or pants. The garment has a camera module 32 in the shape of a shirt button which is sewn onto the garment in a conventional manner. The camera module is connected through a communication link 34, which may be electrical, optical, or wireless, to a processing system that includes a system on a chip 36 and a power supply 38. - The processing module is also coupled to a
display 42 which, in this case, is connected or embedded into a sleeve of the shirt. The display may be a touch screen display or it may include user interface buttons that allow the user to view the display and send commands and instructions from the display or associated components 42 back to the processing system 36. In this example the shirt 30 is a wearable computer with an awareness of its surrounding environment through the camera module 32. - While only one camera module is shown in the examples of
FIGS. 1 and 2, additional camera modules may be provided in other positions on the garment. Using a front facing camera as shown, the imaging system is provided with an awareness of the surrounding conditions in front of the user, which is the direction in which the user is usually headed. This allows the system to provide the user with information about what is in the user's path. - The
camera 12, 32 and SOC 16, 36 may be embedded into, incorporated into, or attached to the garment in any of a variety of different ways. They may be connected using a pin through the fabric of the garment so that the camera may easily be removed and then attached to other garments. Straps and belts may alternatively be used. Similarly, hook and loop fasteners may be used to hold the camera sensor, SOC, and screen to the garment. They may be held in some type of holder incorporated into the garment, such as a special pocket, flap, or tab. They may be sewn into the garment as a separate structure, such as the button camera 32 of FIG. 2, or woven into the fabric. Some parts may be attached in one way and other parts in different ways. - A variety of different kinds of communications are possible using the wearable camera system shown in
FIGS. 1 and 2. As an example, FIG. 3 is a signal flow diagram to show communications between a camera 52, a processing device 54, and an external server 56. In the example of FIG. 3 the camera system sends an image in signal (a) to the processing device. The image (a) is then forwarded from the processing device 54 to the server 56. The server analyzes the image and returns descriptive information related to the image back to the processing device. This information may be a description of an item in the image; purchasing, historical, or status information about an item in the image; or information about objects or services near an object in the image, among other kinds of information. The processing device 54 may be an integral part of the imaging system, such as the wearable computer 36 of FIG. 2, an external handheld or portable device, such as the smartphone 22 of FIG. 1, or a larger, more fixed computing device such as a desktop, workstation, or other fixed computer. - The communication of
FIG. 3 allows the camera system to observe the surroundings in front of the user, send information about these surroundings to an information source 56, and then provide information to the user through the user's smartphone 22. This allows the user to hold the smartphone in any position and yet have full situational awareness of the environment immediately in front of the user. So, for example, as a tourist walking down the street, the user can be informed of buildings which are coming into view through a display on his smartphone 22. In another example, a plant maintenance worker can receive information on the smartphone about equipment, fixtures, and rigs which come into view of the camera system 12. This can be used for systems that are distant or for very close objects. - The tourist may obtain information about specific items displayed on store shelves or about monuments in a city park. Similarly, the maintenance worker may obtain information about large systems or detailed service information about a specific piece of equipment at a facility.
- Using a smart phone, the wearable camera sensor system requires only a low power, short range connection to the smart phone, such as Bluetooth, Ultra-Low Power WiFi, or NFC (Near Field Communications). This allows for a lighter, smaller system with less need for recharging. The smart phone may then use a higher power long range radio, such as mobile cellular. Alternatively, a wearable system may be used in the same way except that the user views the information on the
sleeve display screen 42. Using a camera mounted in a separate independent position, the sleeve display may be held in any position and yet the system obtains information about the environment in front of the user. -
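The relay pattern of FIG. 3 can be modeled in a few lines. The sketch below is illustrative only; the function names and the stand-in analysis step are assumptions for the sketch, not part of this description:

```python
# Illustrative sketch of the FIG. 3 relay flow: the camera sends a captured
# image to a local processing device, which forwards it to a server and
# returns the server's description for display. All names are hypothetical.

def server_analyze(image: bytes) -> dict:
    """Stand-in for the remote server's analysis step (signal (b))."""
    return {"description": "analyzed %d bytes" % len(image)}

def processing_device_relay(image: bytes) -> dict:
    """The smartphone or wearable computer forwards the image (signal (a))
    to the server and passes the returned description back for display."""
    return server_analyze(image)

def camera_send(image: bytes) -> dict:
    """The camera module pushes a frame over its short range link."""
    return processing_device_relay(image)

info = camera_send(b"\x00" * 1024)
print(info["description"])  # analyzed 1024 bytes
```

The indirection through the processing device is the point of the design: the camera needs only a short range, low power link, while the relay owns the long range connection to the server.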
FIG. 4 presents a signaling diagram for an alternative signaling communication. In the example of FIG. 4 the camera 52 sends an image signal (a) directly to a server 56. The server 56 then sends a description (b) back to the camera 52. The camera can then present the information on an incorporated display 42 or send the descriptive information to an external device 22 for consideration by the user. -
FIG. 5 presents another alternative signaling diagram. The camera 52 sends an image signal (a) directly to a server 56. However, the server sends the description information not to the camera but to the processing device 54. Because the camera system has a wireless antenna 20, 40 attached to its processing resources 16, 36, it may be able to communicate through a cellular telephone network or a WiFi network directly to a remote server to obtain the desired information. This simplifies communications for the external device 22. The server can then send information directly to the user for display on the external device. - In the described examples the
camera system 12, 32 may take many different configurations. It may be attached to the garment as an accessory or integrated into the garment as a button or a nonfunctional item. FIG. 6A is a top elevation view of a camera system for detecting images. A photo detector or image sensor 72 is coupled to an imaging processor 74 that generates an output signal for wired, optical, or radio communication. The sensor 72 sees the surrounding environment through a wide field of view lens 76. The wide field of view lens may be a fish eye type lens or any other appropriate type of lens. In the example of FIG. 6A the lens is shown as having a 180 degree panoramic horizontal field of view so that the camera can observe everything that is in front of the user. The panoramic view may be more or less than 180 degrees, depending on the particular implementation. -
FIG. 6B is a perspective view of the same camera module as in FIG. 6A. This view shows that the fish eye lens 76 has a 180 degree field of view in a horizontal direction and a much narrower field of view, for example 60 degrees, in a vertical direction. Reducing the field of view in the vertical direction simplifies the lens, the imaging requirements of the sensor 72, and the processing requirements of the image processor 74, while still detecting almost everything of interest to the viewer and the user. -
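As a rough illustration of that saving, pixel count at a fixed angular resolution scales with the product of the horizontal and vertical fields of view. The resolution figure below is an illustrative assumption, not a value from this description:

```python
# Back-of-the-envelope estimate (illustrative, not from the patent) of why
# narrowing the vertical field of view reduces sensor and processing load:
# at a fixed angular resolution, the pixel budget scales with the covered
# angular area, i.e. horizontal FOV times vertical FOV.

def pixel_count(h_fov_deg, v_fov_deg, pixels_per_degree=10):
    """Pixels needed to cover the field at a fixed angular resolution."""
    return (h_fov_deg * pixels_per_degree) * (v_fov_deg * pixels_per_degree)

full_dome = pixel_count(180, 180)  # full 180 x 180 degree coverage
slab = pixel_count(180, 60)        # the narrower 60 degree vertical band
print(full_dome // slab)           # the narrow band needs a third of the pixels
```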
FIG. 7A is a top elevation view of an alternative type of camera sensor system suitable for use with embodiments of the present invention. In this example five separate camera modules are coupled together to a single image processor 88. Each camera module includes a lens 86-1 to 86-5 for imaging a narrower field of view onto a photo detector 82-1 to 82-5. The data from each image sensor is then processed in a separate image processor 84-1 to 84-5. All of the processed image information is then consolidated and combined by an additional image processor 88. The additional image processor 88 is then coupled through an output interface to the processing resources 16, 36 of the system. Alternatively, each photodetector may be coupled into the same processor so that a single processor receives the raw data and generates a consolidated image. In such an implementation, the separate image processors 84-1 to 84-5 would not be needed. - Each camera module of
FIG. 7A may be simpler and less expensive than that of FIG. 6A, which may allow for a less expensive and yet more detailed view of the surroundings of the user. In one embodiment the camera modules may be independently switched on and off to save power and to switch between a panoramic view and a more detailed single camera view. The camera sensors may also be activated one at a time as a scanning array or a multiple spot imaging array. While five camera modules are shown, there may be fewer or many more. Additional or different types of camera sensors may be used, including artificial compound eye sensors with many more than five individual camera modules. - Any type of camera module may be attached to a
garment, such as the button camera sensor 32 of FIG. 2, which appears as and operates as a button to hold the shirt together. The camera module may also be outfitted with a special holder in the garment to hold the camera module in place. - The
processing system 16, 36 may take a variety of different forms. A simple example is shown in the block diagram of FIG. 8. The processing system 90 is coupled to the camera module 92 through a data interface 94. The data interface may connect to the camera module using a wired or wireless connector 91, as explained above. The data interface is coupled to a controller 96 of the processor 90, which has internal memory resources 97 that may or may not be available to other components. The controller may be a simple FPGA (Field Programmable Gate Array), a complex microprocessor with many functions, or any other suitable control logic. The memory resources 97 may be magnetic, solid state, phase change, or any other type of memory, depending on the particular implementation. - The controller 96 may also be connected to a
communications interface 98 with, for example, an antenna 99 with which to send and receive data with external devices. Depending on the particular implementation, data from the camera 92 may be delivered through the data interface 94 directly through the communications interface 98 to be sent through the antenna 99 to other devices. Similarly, information may be received through the communications interface 98 and passed to the controller for communication to the user. As shown, a common bus connects the data interface, communications interface, and controller to each other to exchange data, signaling, and commands. The memory may be connected directly to the bus or connected through the controller, depending on the implementation. -
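The data path just described can be sketched as a small software model. The class and method names below are hypothetical, chosen only to mirror the reference numerals of FIG. 8:

```python
# Hypothetical sketch of the FIG. 8 processing system: frames arrive at the
# data interface (94), the controller (96) keeps a copy in its internal
# memory resources (97), and the communications interface (98/99) passes
# each frame onward to external devices. Names are illustrative only.

class ProcessingSystem:
    def __init__(self, send):
        self.memory = []   # internal memory resources (97)
        self.send = send   # communications interface (98) / antenna (99)

    def on_frame(self, frame: bytes) -> None:
        """Data interface (94) delivers a frame from the camera (92)."""
        self.memory.append(frame)  # retain a copy for later processing
        self.send(frame)           # pass through to an external device

sent = []
system = ProcessingSystem(send=sent.append)
system.on_frame(b"frame-1")
system.on_frame(b"frame-2")
print(len(sent), len(system.memory))  # 2 2
```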
FIG. 9A shows an example usage scenario as a process flow diagram for the camera and capture systems described above. At 102, the camera sensor captures an image using the photo detectors as described above. At 104, the image is sent to an external information source. This may be sent directly from a camera module or from a larger external device. The image may be sent to a remote server or to a local device for analysis, depending on the type of analysis and the capabilities of the local resources. At 106, the image is analyzed to find information, metadata, or other resources that are related to the image. Finally at 108, the analysis is sent back to the user for use. The analysis may be displayed on a screen as described or it may be presented with sound, such as a simulated voice or in any other way. - This process may be repeated as the camera sensor continues to capture images. The process may be timed so that a new image is sent a specific time intervals, such as once a second, once a minute, once every ten minutes, etc. The process may be triggered by user command or by a remote command or by the system. For example, the camera sensor may capture images and perform an analysis to determine if the scene has significantly changed. A significant change may be used as a trigger to send a new image. In addition, the image may be sent with additional information or commands based on an application currently in use or a request from the user.
-
FIG. 9B shows a related process as a process flow diagram from the perspective of the server. At 122, an image is received from the camera sensor at the server. While use with a remote server is described, the same functions may be performed locally on an external device or by a wearable computer. At 124, the source of the image is identified together with any commands, related applications, or other information that may be useful in analyzing the image. The source that sent the image may be associated with a user account, with network identification, with cookies in a web interaction, or in any other way. In one embodiment the source is identified as having a particular IP (Internet Protocol) address and specific preferences for the type of information desired and where that information may be sent. In another embodiment, the server analyzes the image and determines an information set for the user based on the image and general information about what information is preferred by other users. The image may contain EXIF (Exchangeable Image File Format) or similar data providing an identification of the camera, the conditions when taking the image, and the time and location of the camera when the image was taken. This information, together with any past images, may be used when analyzing the image. At 126 the image is analyzed and at 128, the analysis is sent back to the source of the image. The analysis may be sent to the same device that sent the image or to a related device, such as a smart phone, a mobile computer, a fixed workstation, or a different remote server, depending upon how the camera system is configured and any preferences provided by the user. -
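The server-side steps of FIG. 9B can be sketched as a small handler. The metadata field names and the preference table below are hypothetical, not from this description:

```python
# Hypothetical sketch of the FIG. 9B server flow: identify the sender from
# metadata accompanying the image (124), run a stand-in analysis (126), and
# route the reply to the destination that sender prefers (128).

PREFERENCES = {  # per-source preferences: where replies should be delivered
    "camera-001": {"reply_to": "smartphone"},
}

def handle_image(image: bytes, metadata: dict) -> dict:
    source = metadata["source_id"]                  # identify the sender
    prefs = PREFERENCES.get(source, {"reply_to": source})
    analysis = {                                    # stand-in analysis step
        "bytes": len(image),
        "taken_at": metadata.get("time"),           # EXIF-like capture time
        "location": metadata.get("location"),       # EXIF-like GPS data
    }
    return {"deliver_to": prefs["reply_to"], "analysis": analysis}

reply = handle_image(b"img", {"source_id": "camera-001", "time": "12:00"})
print(reply["deliver_to"])  # smartphone
```

An unknown source simply gets its reply routed back to itself, matching the default case of sending the analysis to the same device that sent the image.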
FIG. 10 illustrates a computing device 500 in accordance with one implementation of the invention that may be used for the camera modules, the processing system 16, 36, or the external device. The computing device 500 houses a board 502. The board 502 may include a number of components, including but not limited to a processor 504 and at least one communication chip 506. The processor 504 is physically and electrically coupled to the board 502. In some implementations the at least one communication chip 506 is also physically and electrically coupled to the board 502. In further implementations, the communication chip 506 is part of the processor 504. - Depending on its applications,
computing device 500 may include other components that may or may not be physically and electrically coupled to the board 502. These other components include, but are not limited to, volatile memory (e.g., DRAM) 508, non-volatile memory (e.g., ROM) 509, flash memory (not shown), a graphics processor 512, a digital signal processor (not shown), a crypto processor (not shown), a chipset 514, an antenna 516, a display 518 such as a touchscreen display, a touchscreen controller 520, a battery 522, an audio codec (not shown), a video codec (not shown), a power amplifier 524, a global positioning system (GPS) device 526, a compass 528, an accelerometer (not shown), a gyroscope (not shown), a speaker 530, a camera 532, and a mass storage device (such as a hard disk drive) 510. These components may be connected to the system board 502, mounted to the system board, or combined with any of the other components. - The
communication chip 506 enables wireless and/or wired communications for the transfer of data to and from the computing device 500. The term "wireless" and its derivatives may be used to describe circuits, devices, systems, methods, techniques, communications channels, etc., that may communicate data through the use of modulated electromagnetic radiation through a non-solid medium. The term does not imply that the associated devices do not contain any wires, although in some embodiments they might not. The communication chip 506 may implement any of a number of wireless or wired standards or protocols, including but not limited to Wi-Fi (IEEE 802.11 family), WiMAX (IEEE 802.16 family), IEEE 802.20, long term evolution (LTE), Ev-DO, HSPA+, HSDPA+, HSUPA+, EDGE, GSM, GPRS, CDMA, TDMA, DECT, Bluetooth, Ethernet, derivatives thereof, as well as any other wireless and wired protocols that are designated as 3G, 4G, 5G, and beyond. The computing device 500 may include a plurality of communication chips 506. For instance, a first communication chip 506 may be dedicated to shorter range wireless communications such as Wi-Fi and Bluetooth and a second communication chip 506 may be dedicated to longer range wireless communications such as GPS, EDGE, GPRS, CDMA, WiMAX, LTE, Ev-DO, and others. - The processor 504 of the
computing device 500 includes an integrated circuit die packaged within the processor 504. The term "processor" may refer to any device or portion of a device that processes electronic data from registers and/or memory to transform that electronic data into other electronic data that may be stored in registers and/or memory. - In various implementations, the
computing device 500 may be a laptop, a netbook, a notebook, an ultrabook, a smartphone, a tablet, a personal digital assistant (PDA), an ultra mobile PC, a mobile phone, a desktop computer, a server, a printer, a scanner, a monitor, a set-top box, an entertainment control unit, a digital camera, a portable music player, or a digital video recorder. In further implementations, the computing device 500 may be any other electronic device that processes data. - Embodiments may be implemented as a part of one or more memory chips, controllers, CPUs (Central Processing Units), microchips, or integrated circuits interconnected using a motherboard, an application specific integrated circuit (ASIC), and/or a field programmable gate array (FPGA).
- References to “one embodiment”, “an embodiment”, “example embodiment”, “various embodiments”, etc., indicate that the embodiment(s) of the invention so described may include particular features, structures, or characteristics, but not every embodiment necessarily includes the particular features, structures, or characteristics. Further, some embodiments may have some, all, or none of the features described for other embodiments.
- In the following description and claims, the term “coupled” along with its derivatives, may be used. “Coupled” is used to indicate that two or more elements co-operate or interact with each other, but they may or may not have intervening physical or electrical components between them.
- As used in the claims, unless otherwise specified, the use of the ordinal adjectives “first”, “second”, “third”, etc., to describe a common element, merely indicate that different instances of like elements are being referred to, and are not intended to imply that the elements so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
- The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, orders of processes described herein may be changed and are not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts necessarily need to be performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples. Numerous variations, whether explicitly given in the specification or not, such as differences in structure, dimension, and use of material, are possible. The scope of embodiments is at least as broad as given by the following claims.
- The following examples pertain to further embodiments. The various features of the different embodiments may be variously combined with some features included and others excluded to suit a variety of different applications. Some embodiments pertain to an apparatus that comprises a camera to capture images with a wide field of view, a data interface to send camera images to an external device, and a power supply to power the camera and the data interface. The camera, data interface, and power supply are attached to a garment. In further embodiments, the camera is integrated into the garment, such as by using a pin, or being sewn to the garment.
- In further embodiments, the camera has a panoramic field of view or a 180 degree horizontal field of view. In further embodiments, the data interface is an optic fiber interface, the data interface is a wireless interface, and the external device is a cellular telephone. In further embodiments the apparatus further includes a processor and image recognition software to process images captured by the camera before sending by the data interface.
- In another embodiment an imaging and communication system comprises a camera to capture images and to send the images through a short range wireless interface, a data interface to receive the images from the camera through the short range wireless interface, a processor coupled to the data interface to process the images for analysis, and a long range wireless interface coupled to the processor to send the processed images to a remote device.
- In further embodiments, the long range wireless interface is further to receive information about the sent images from the remote device, a display is coupled to the processor to display the received information about the images, and a control interface is coupled to the display to allow user control of the displayed information.
- In another embodiment, a method comprises capturing an image in a camera attached to a garment, sending the image to an external device for analysis, receiving the analysis from the external device, and presenting the analysis to a user of the garment.
- In further embodiments, sending the image comprises sending the image to a local portable device and the local portable device sending the image to a remote server. In further embodiments, the local portable device is also attached to the garment, or the local portable device is a cellular telephone. In further embodiments, sending the image further comprises sending additional information about the image, including time and location. In further embodiments, presenting the analysis comprises presenting information about the image on a display that is movable independently of the camera.
Claims (20)
1. An apparatus comprising:
a camera attached to a garment to capture an image of a view of an area surrounding a user that is wearing the garment, the image including an item;
a data interface attached to the garment and coupled to the camera to send the camera image to an external device and to receive description information about the item from the external device; and
a power supply attached to the garment and coupled to the camera and the data interface to power the camera and the data interface,
wherein the apparatus presents the received description information to a user of the garment.
2. The apparatus of claim 1 , wherein the camera has a panoramic field of view.
3. The apparatus of claim 2 , wherein the camera has a 180 degree horizontal field of view.
4. The apparatus of claim 1 , wherein the data interface is an optic fiber interface coupled to a local processing device.
5. The apparatus of claim 4 , wherein the local processing device is a separate mobile device.
6. The apparatus of claim 1 , wherein the data interface is a wireless interface through a mobile device and the external device is a remote server.
7. The apparatus of claim 1 , further comprising a processor and image recognition software to process a sequence of images captured by the camera and to select an image in the sequence that shows a significant change from an image before the selected image and wherein the data interface is to send the selected image to the external device.
8. The apparatus of claim 1 , further comprising a display coupled to the data interface to present the received description information to the user.
9. The apparatus of claim 8 , further comprising a handheld device with a wireless connection to the data interface and wherein the handheld device comprises the display.
10. The apparatus of claim 9 , wherein the handheld device comprises a long range radio and wherein the data interface sends to the external device through the long range radio of the handheld device.
11. The apparatus of claim 1 , wherein the apparatus presents the received description information as sound.
12. The apparatus of claim 1 , wherein the description information is historical and status information about the item.
13. An imaging and communication system comprising:
a camera attached to a garment to capture images of a view of an area surrounding a user that is wearing the garment, the images including an item, and to send the images through a short range wireless interface;
a handheld device having a data interface to receive the images from the camera through the short range wireless interface;
a processor of the handheld device coupled to the data interface;
a long range wireless interface of the handheld device coupled to the processor to send the images to an external device and to receive description information about the item from the external device; and
a display of the handheld device to present the received description information.
14. The system of claim 13 , further comprising a control interface of the handheld device coupled to the display to allow user control of the displayed information.
15. The system of claim 13 , wherein the description information is historical and status information about the item.
16. A method comprising:
capturing an image in a camera attached to a garment, the image including an item;
sending the image to an external device for analysis;
receiving description information about the item from the external device; and
presenting the received description information to a user of the garment.
17. The method of claim 16 , wherein sending the image comprises sending the image to a local portable device and the local portable device sending the image to a remote server.
18. The method of claim 17 , wherein the local portable device is also attached to the garment.
19. The method of claim 18 , wherein presenting the received description information comprises presenting information about the item on a display of the local portable device.
20. The method of claim 16 , wherein sending the image further comprises sending additional information about the image including time and location.
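Claim 7 recites processing a sequence of images and selecting the one that shows a significant change from the image before it. The patent does not specify a change metric, so the sketch below assumes a mean absolute pixel difference against a threshold; both the metric and the threshold value are assumptions for illustration.

```python
def mean_abs_diff(a, b):
    # Average absolute per-pixel difference between two equal-length frames
    # (frames modeled as flat sequences of 0-255 intensity values).
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def select_changed_frames(frames, threshold=30.0):
    # Keep only the frames that differ "significantly" from their
    # predecessor, per claim 7; only these would then be sent through the
    # data interface to the external device. Metric and threshold are
    # assumptions, as the patent leaves the change detection unspecified.
    selected = []
    for prev, cur in zip(frames, frames[1:]):
        if mean_abs_diff(prev, cur) > threshold:
            selected.append(cur)
    return selected

frames = [
    [10, 10, 10, 10],       # static scene
    [12, 11, 10, 9],        # minor sensor noise: below threshold, dropped
    [200, 190, 180, 170],   # large scene change: selected for upload
]
picked = select_changed_frames(frames)
```

Filtering this way keeps the short range link and the remote server from handling near-duplicate frames, which is the apparent purpose of the claimed selection step.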
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/062,867 US20160183604A1 (en) | 2012-12-17 | 2016-03-07 | Wearable imaging sensor with wireless remote communications |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/717,254 US9374509B2 (en) | 2012-12-17 | 2012-12-17 | Wearable imaging sensor for communications |
US15/062,867 US20160183604A1 (en) | 2012-12-17 | 2016-03-07 | Wearable imaging sensor with wireless remote communications |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/717,254 Continuation US9374509B2 (en) | 2012-12-17 | 2012-12-17 | Wearable imaging sensor for communications |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160183604A1 true US20160183604A1 (en) | 2016-06-30 |
Family
ID=50911843
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/717,254 Expired - Fee Related US9374509B2 (en) | 2012-12-17 | 2012-12-17 | Wearable imaging sensor for communications |
US15/062,867 Abandoned US20160183604A1 (en) | 2012-12-17 | 2016-03-07 | Wearable imaging sensor with wireless remote communications |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/717,254 Expired - Fee Related US9374509B2 (en) | 2012-12-17 | 2012-12-17 | Wearable imaging sensor for communications |
Country Status (3)
Country | Link |
---|---|
US (2) | US9374509B2 (en) |
CN (1) | CN103873750B (en) |
BR (1) | BR102013030317A2 (en) |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160234358A9 (en) * | 2012-05-09 | 2016-08-11 | Tara Chand Singhal | Apparatus and method for an alternate form-factor handheld smart phone device |
US10122963B2 (en) | 2013-06-11 | 2018-11-06 | Milestone Av Technologies Llc | Bidirectional audio/video: system and method for opportunistic scheduling and transmission |
US10880712B2 (en) | 2014-09-14 | 2020-12-29 | Eugene Joseph Bransfield | Multifunction scanner and computer |
US10791415B2 (en) * | 2014-09-14 | 2020-09-29 | Eugene Joseph Bransfield | Hand-held, simplified WiFi scanner |
CN104317393B (en) * | 2014-09-25 | 2018-05-08 | 小米科技有限责任公司 | Method for information display and device, electronic equipment |
CN107003514A (en) * | 2014-12-26 | 2017-08-01 | 英特尔公司 | Wear-type wearable device power-supply system |
WO2016196411A1 (en) | 2015-05-30 | 2016-12-08 | Jordan Frank | Electronic utility strap |
US20190273133A1 (en) * | 2016-12-14 | 2019-09-05 | Intel Corporation | Transistor source/drain amorphous interlayer arrangements |
IT201700081272A1 (en) * | 2017-07-18 | 2019-01-18 | Eurotech S P A | CLOTHING FOR USE IN FIRE AND SIMILAR SHUTDOWN OPERATIONS |
GB2599356A (en) * | 2020-09-16 | 2022-04-06 | Haddad Nafeesa | Video capture device for attaching to an article of clothing |
WO2022141348A1 (en) * | 2020-12-31 | 2022-07-07 | SZ DJI Technology Co., Ltd. | Systems, devices, and methods supporting multiple photography modes with a control device |
WO2022195342A1 (en) * | 2021-03-19 | 2022-09-22 | Lui Ying Lun | Imaging device and system, method of capturing images and computer readable storage medium |
CN114051722A (en) * | 2021-03-19 | 2022-02-15 | 吕应麟 | Imaging apparatus and system, method of capturing image, and computer-readable storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030133008A1 (en) * | 1999-02-02 | 2003-07-17 | Stanley W. Stephenson | Wearable panoramic imager |
US20040056957A1 (en) * | 2002-09-20 | 2004-03-25 | Crandall John Christopher | System and method for capturing images based upon subject orientation |
US20060055775A1 (en) * | 2004-09-16 | 2006-03-16 | Seong Taeg Nou | System and method for providing image service using telematics system |
US20100245585A1 (en) * | 2009-02-27 | 2010-09-30 | Fisher Ronald Eugene | Headset-Based Telecommunications Platform |
US20130308920A1 (en) * | 2011-06-10 | 2013-11-21 | Lucas J. Myslinski | Method of and system for fact checking recorded information |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8577204B2 (en) * | 2006-11-13 | 2013-11-05 | Cyberlink Corp. | System and methods for remote manipulation of video over a network |
CN201393284Y (en) * | 2009-04-03 | 2010-01-27 | 周业如 | Wearable wireless audio-video interactive system |
CN102158686A (en) * | 2011-01-13 | 2011-08-17 | 重庆桓浩科技发展有限公司 | Novel security management system for Internet of things residential district |
- 2012-12-17 US US13/717,254 patent/US9374509B2/en not_active Expired - Fee Related
- 2013-11-26 BR BR102013030317A patent/BR102013030317A2/en not_active Application Discontinuation
- 2013-12-16 CN CN201310688947.1A patent/CN103873750B/en not_active Expired - Fee Related
- 2016-03-07 US US15/062,867 patent/US20160183604A1/en not_active Abandoned
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190109966A1 (en) * | 2017-10-05 | 2019-04-11 | Haddon Spurgeon Kirk, III | System for Live Streaming and/or Video Recording of Platform Tennis Matches |
US11050905B2 (en) * | 2017-10-05 | 2021-06-29 | Haddon Spurgeon Kirk, III | System for live streaming and/or video recording of platform tennis matches |
EP3621288A1 (en) | 2018-09-07 | 2020-03-11 | Bundesdruckerei GmbH | Arrangement and method for optically detecting objects and / or persons to be checked |
DE102018121901A1 (en) * | 2018-09-07 | 2020-03-12 | Bundesdruckerei Gmbh | Arrangement and method for the optical detection of objects and / or people to be checked |
Also Published As
Publication number | Publication date |
---|---|
CN103873750A (en) | 2014-06-18 |
BR102013030317A2 (en) | 2015-12-08 |
CN103873750B (en) | 2018-01-30 |
US20140168355A1 (en) | 2014-06-19 |
US9374509B2 (en) | 2016-06-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9374509B2 (en) | Wearable imaging sensor for communications | |
EP2974268B1 (en) | Always-on camera sampling strategies | |
US10028080B2 (en) | Method and apparatus for establishing communication between an image photographing apparatus and a user device | |
JP7248304B2 (en) | Image display method, electronic device, computer-readable storage medium and computer program | |
US10514708B2 (en) | Method, apparatus and system for controlling unmanned aerial vehicle | |
US10165182B1 (en) | Panoramic imaging systems based on two laterally-offset and vertically-overlap camera modules | |
CN105138126A (en) | Unmanned aerial vehicle shooting control method and device and electronic device | |
US8478308B2 (en) | Positioning system for adding location information to the metadata of an image and positioning method thereof | |
US9871956B2 (en) | Multiple lenses in a mobile device | |
US20150381886A1 (en) | Camera Controlling Apparatus For Controlling Camera Operation | |
CN112822387A (en) | Combined images from front and rear cameras | |
WO2021175097A1 (en) | Not-line-of-sight object imaging method, and electronic device | |
US20160155253A1 (en) | Electronic device and method of displaying images on electronic device | |
JP2013219608A (en) | Information processing apparatus, control method for information processing apparatus, and program | |
WO2021164387A1 (en) | Early warning method and apparatus for target object, and electronic device | |
US10148874B1 (en) | Method and system for generating panoramic photographs and videos | |
US11146741B2 (en) | Electronic device and method for capturing and displaying image | |
CN115119135A (en) | Data sending method, receiving method and device | |
US20240013490A1 (en) | Augmented live content | |
JP7082175B2 (en) | A system including a terminal device for displaying a virtual object and a server device, and the server device. | |
JP6815439B2 (en) | A system including a terminal device and a server device for displaying a virtual object, and the server device. | |
JP2013114653A (en) | Information processor, method of controlling the same, and program | |
CN114143588A (en) | Play control method and electronic equipment | |
WO2018222532A1 (en) | Rfid based zoom lens tracking of objects of interest | |
Stanimirovic et al. | [Poster] Smartwatch-aided handheld augmented reality |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: INTEL CORPORATION, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PILLARISETTY, RAVI;AGRAHARAM, SAIRAM;GUZEK, JOHN S.;AND OTHERS;SIGNING DATES FROM 20121217 TO 20130204;REEL/FRAME:041702/0879 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |