WO2016085669A1 - A modular image capture device - Google Patents

A modular image capture device


Publication number
WO2016085669A1
Authority
WO
WIPO (PCT)
Prior art keywords
image capture
module
data
image
ports
Prior art date
Application number
PCT/US2015/060481
Other languages
French (fr)
Inventor
Amir Ehsani ZONOUZ
Original Assignee
University Of Massachusetts
Priority date
Filing date
Publication date
Priority to US62/085,247
Application filed by University Of Massachusetts
Publication of WO2016085669A1


Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B37/00Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • G03B37/04Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with cameras or projectors providing touching or overlapping fields of view
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, video cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/225Television cameras ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules specially adapted for being embedded in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/2258Cameras using two or more image sensors, e.g. a CMOS sensor for video and a CCD for still image
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B2206/00Systems for exchange of information between different pieces of apparatus, e.g. for exchanging trimming information, for photo finishing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/243Image signal generators using stereoscopic image cameras using three or more 2D image sensors

Abstract

An apparatus includes first, second, and third ports respectively extending from a main body along first, second, and third axes that span a plane. Each of the first, second, and third ports is electrically and mechanically connectable to one of two or more image capture modules. The apparatus includes one or more processing devices configured to execute machine-readable instructions to perform operations. The operations include identifying a type of the one image capture module connected to one of the first, second, and third ports. The operations also include receiving imagery data from the connected image capture module. The operations further include transmitting data representative of the received imagery data to a computing device separate from the image capture module.

Description

A MODULAR IMAGE CAPTURE DEVICE
CROSS-REFERENCE TO RELATED APPLICATION
This application claims the benefit of U.S. Provisional Application No.
62/085,247 filed on November 27, 2014, the entire contents of which is incorporated herein by reference.
TECHNICAL FIELD
This disclosure relates to image capturing by an image capturing device using an image capture module.
BACKGROUND
A conventional digital camera can include a single lens capable of capturing images of an environment. The digital camera can include a display for playing back images captured of the environment. The image data can be stored on an internal data storage device or an external data storage device connected to the digital camera using a wired connection. The digital camera can capture still images or record video of the environment.
A user can interact with user interface devices of the digital camera, including buttons and switches. The user can use these user interface devices to select settings or control operations of the digital camera. The user can control mechanical settings of the lens attached to the camera, such as a focal length or an optical magnification distance. The user can also control processes implemented by the digital camera, such as postprocessing operations for the images.
The user can connect external devices to the digital camera, such as an external memory storage element. These external devices are connected to ports dedicated to those types of devices. For example, the digital camera can include a Secure Digital (SD) card port for the external memory storage element. The digital camera can also include a micro Universal Serial Bus (USB) port that can be used to connect the digital camera to a remote computing device. The digital camera can include a battery port to connect a removable energy source.
SUMMARY
In one aspect, an apparatus includes first, second, and third ports respectively extending from a main body along first, second, and third axes that span a plane. Each of the first, second, and third ports is electrically and mechanically connectable to one of two or more image capture modules. The apparatus includes one or more processing devices configured to execute machine-readable instructions to perform operations. The operations include identifying a type of the one image capture module connected to one of the first, second, and third ports. The operations further include receiving imagery data from the connected image capture module. The operations also include transmitting data representative of the received imagery data to a computing device separate from the image capture module.
In some examples, the operations can include recording processed image data received through at least one of the first, second, and third ports.
In some examples, the operations can include controlling the image capture module based on the identified type.
In some examples, the first, second, and third axes can be rotationally symmetric about an intersection of the first, second, and third axes.
In some examples, the apparatus can include a first image capture module being connectable to one of the ports and including one or more lenses. The apparatus can include a second image capture module. The second image capture module can be connectable to a second of the ports and can include one or more lenses. The first image capture module can provide one viewing perspective. The second image capture module can provide a different viewing perspective. The one or more processing devices can be configured to process imagery data from the first and second image capture modules to produce three-dimensional imagery. The apparatus can include a third image capture module. The third image capture module can be connectable to a third of the ports and can include one or more lenses. The one or more processing devices can be configured to process imagery data from the first, second, and third image capture modules to produce three-hundred-sixty-degree imagery.
In some examples, the apparatus can include a fourth port. The fourth port can extend from the main body along a fourth axis. The fourth axis can be angled away from the plane. The fourth axis can intersect the first, second, and third axes at the intersection of the first, second, and third axes. In some examples, the apparatus can include an image presenting module. The image presenting module can be connectable to one of the ports. The image presenting module can include a display. The image presenting module can include a projector.
In some examples, the apparatus can include a location detecting module. The location detecting module can include a global positioning system (GPS) receiver.
In some examples, the apparatus can include a night vision module. The night vision module can include an infrared receiver.
In some examples, the apparatus can include an energy source module for providing electrical energy to at least one of the main body and one or more other modules.
In some examples, the operations can include wirelessly receiving control data from the computing device separate from the one image capture module. The operations can include transmitting control signals to the one image capture module to control the one image capture module. Wirelessly receiving the control data can be provided by a wide area network. Wirelessly receiving the control data can be provided by a local area network. Wirelessly receiving the control data can be provided by near-field communication.
In another aspect, a computing device implemented method includes identifying a type of an image capture module. The image capture module is electrically and mechanically connected to one of first, second, and third ports of an image capture device. The first, second, and third ports each respectively extend from the image capture device along one of first, second, and third axes that span a plane. The method further includes receiving imagery data from the connected image capture module. The method also includes generating processed imagery data from the received imagery data based on the identified type of the image capture module. The method further includes transmitting the processed imagery data to a computing device separate from the image capture device and the connected image capture module.
In some examples, the method can further include receiving control data from the computing device. The method can also include controlling the image capture module based on the received control data.
In a further aspect, one or more computer readable media storing instructions are executable by a processing device. The instructions, upon such execution, cause the processing device to perform operations including identifying a type of an image capture module. The image capture module is electrically and mechanically connected to one of first, second, and third ports of an image capture device. The first, second, and third ports each respectively extend from the image capture device along one of first, second, and third axes that span a plane. The operations further include receiving imagery data from the connected image capture module. The operations also include generating processed imagery data from the received imagery data based on the identified type of the image capture module. The operations include transmitting the processed imagery data to a computing device separate from the image capture device and the connected image capture module.
In some examples, the operations can further include receiving control data from the computing device. The operations can also include controlling the image capture module based on the received control data.
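The identify-receive-process-transmit flow recited in these aspects can be sketched as a single capture cycle. This is an illustrative sketch only: the `run_capture_cycle` and `process_for_type` functions, and the port and remote interfaces they call, are hypothetical stand-ins, not structures disclosed in the patent.

```python
def run_capture_cycle(port, remote):
    """One pass of the claimed operations: identify the connected module,
    receive imagery, process it per module type, transmit the result, and
    apply any control data from the remote computing device. All callables
    are hypothetical stand-ins for hardware/network interfaces."""
    module_type = port.identify_module()
    raw = port.receive_imagery()
    processed = process_for_type(raw, module_type)
    remote.transmit(processed)
    control = remote.receive_control()
    if control is not None:
        port.send_control(control)

def process_for_type(raw, module_type):
    # Illustrative processing step: tag the imagery with the module type
    # that produced it, so downstream consumers can handle it accordingly.
    return {"type": module_type, "frames": raw}
```

In practice, the processing step would branch on the identified module type (e.g., stitching for multi-lens setups, pass-through for a single lens).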
These features and other features described in this disclosure can provide the following advantages. Multiple lenses can be connected to the apparatus to generate complex image data usable by the apparatus to generate complex images. These complex images can capture perspectives of an environment of the apparatus that a single lens would typically be unable to capture.
The ports of the apparatus can further be connected to various types of modules, allowing a single port to support many types of functionality. These functionalities can include recharging an energy source of the apparatus, capturing image data for the apparatus to process, driving a mechanism to move the apparatus around the environment, etc. The multi-functionality of the ports further allows modules to be easily and conveniently changed and replaced. The modularity of the apparatus further provides for easy customization by the user for different applications of image and video recording.
The details of one or more implementations are set forth in the accompanying drawings and the description below. Other features, objects, and advantages will be apparent from the description and drawings, and from the claims.
DESCRIPTION OF DRAWINGS
FIGS. 1A and 1B are an exploded front view and an exploded rear view, respectively, of an image capture device with connected image capture modules.
FIG. 2 is a top view of an image capture device.
FIGS. 3A and 3B are a side view and a front view, respectively, of an image capture module. FIG. 3C is an exploded front perspective view of the image capture module of FIGS. 3A and 3B.
FIGS. 4A and 4B are top schematic views of an image capture device with an image capture module connectable to the image capture device.
FIGS. 5A, 5B, 5C, 5D, and 5E are a front-right perspective view, a front-left perspective view, a front perspective view, a rear perspective view, and a top perspective view, respectively, of an image capture device with connected image capture modules.
FIG. 5F is a schematic top view of the image capture device with the connected image capture modules of FIGS. 5A-5E.
FIG. 6A is a perspective view of a display module.
FIG. 6B is a top view of the display module of FIG. 6A connected to an image capture device.
FIG. 7 is a top perspective view of an energy source module connected to an image capture device.
FIGS. 8A and 8B are a side view and a top view, respectively, of a module extension.
FIG. 8C is a top perspective view of module extensions connected to an image capture device.
FIG. 8D is a top perspective view of the module extensions connected to the image capture device of FIG. 8C.
FIG. 9A is a front perspective view of a projector module connected to an image capture device.
FIG. 9B is a top perspective view of the projector module of FIG. 9A.
FIG. 10A is a block diagram of an image capture device connected to image capture modules and a computing device.
FIG. 10B is a block diagram of systems of an image capture module.
FIG. 11A is a block diagram of a control system for an image capture device.
FIG. 11B is a block diagram for a module port interface.
FIG. 12 is a flowchart of operations executable by a processor.
FIG. 13 is a schematic of a computing system and a mobile computing system.
Like reference symbols in the various drawings indicate like elements.
DETAILED DESCRIPTION
Image capture devices can capture images from their surroundings using lenses that receive light from the surroundings. These image capture devices can be capable of displaying the images for playback for a user. The image capture devices can also be mounted or attached to surfaces or other devices to enable the user to use the image capture devices in, for example, hands-free or self-portrait modes. The user can further control the image capture devices using a remote computing device, such as a smartwatch, smartphone, desktop computer, laptop computer, or other appropriate computing device.
A modular architecture of image capture devices enables users to rapidly swap modules in and out so that the image capture devices can perform a wide array of photographic and videographic operations, including two-dimensional, three-dimensional, wide-angle, multi-angle, and three-hundred-sixty-degree imaging. Recorded images and videos from the modules attached to the image capture device can be viewed on a display that is part of a module attached to the image capture device or a display that is part of the image capture device. The recorded images and videos can also be viewed separately on remote devices, such as smartphones, laptops, and other remote computing devices that are connected to the image capture devices using a wired or wireless connection. One or more processors of or connected to the image capture device can process and merge recorded videos and images from the image capture modules into a single file that enables, for example, three-hundred-sixty-degree images to be viewed using virtual reality goggles, or hemispherical and 3D display screens.
In some examples, an image capture device can be connected to various modules that enable a user to control a functionality of the image capture device. An example of an image capture system 100 is illustrated in FIGS. 1A and 1B, which respectively show exploded front and rear perspective views of the image capture system 100. The image capture system 100 includes an image capture device 102. Also referring to FIG. 2, which depicts a top view of the image capture device 102, multiple independent modules 104a, 104b, 104c, 104d can be connected to ports 106a, 106b, 106c, 106d of the image capture device 102. Each of the ports 106a-106d extends in a different direction from the image capture device 102. The ports 106a-106d enable the image capture device 102 to be electrically and mechanically connected to the modules 104a-104d.
Each of these modules 104a-104d can enable different functionalities of the image capture device 102 when connected to it. For example, the module 104a can be an image capture module that generates image data based on received and sensed light in an environment of the image capture system 100. In particular, the module 104a can include a lens 108 that receives and senses the light from the environment to generate the image data for the image capture device 102. The module 104b can also include a lens 110 that receives and senses the light from the environment of the image capture system 100 to generate image data for the image capture device 102. When the modules 104a, 104b are connected to the image capture device 102, the image capture device 102 can process the image data received from the module 104a and the module 104b to generate images depicting multi-angular perspectives of the environment of the image capture system 100.
In some examples, the module 104c can also include a lens 112 that receives and senses light from the environment. The module 104c can provide additional image data that the image capture device 102 can use to add additional perspectives to the images generated from the image data from the modules 104a, 104b. In some examples, the module 104d can also include a lens (not shown) that receives and senses light from the environment. The module 104d can provide yet additional image data that the image capture device 102 can use to add additional perspectives to the images generated from the image data from the modules 104a-104c. Using the image data from each of the modules 104a-104d, the image capture device 102 can generate a hemispherical view of the environment. In some examples, the image capture device uses image data from the modules 104a-104c to generate a three-hundred-sixty-degree view of the environment. In some cases, the image capture device uses image data from the modules 104a-104b to generate a three-dimensional view or a stereoscopic view of the environment.
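The module combinations above suggest a simple mapping from populated ports to an output mode. The following sketch is an assumed, illustrative policy: the port names follow FIG. 2, but the decision logic is not claimed by the patent.

```python
def output_mode(connected_ports: set[str]) -> str:
    """Choose an imaging mode from the set of populated ports.

    Illustrative policy only: three in-plane lenses plus the top port
    yield a hemispherical view; three in-plane lenses yield a
    three-hundred-sixty-degree view; two yield stereoscopic 3D."""
    in_plane = {"106a", "106b", "106c"} & connected_ports
    if len(in_plane) == 3 and "106d" in connected_ports:
        return "hemispherical"
    if len(in_plane) == 3:
        return "360-degree"
    if len(in_plane) == 2:
        return "stereoscopic-3d"
    if len(in_plane) == 1:
        return "2d"
    return "none"
```

For example, with all four ports populated the policy selects the hemispherical mode, matching the module combinations described in the text.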
Each of these modules 104a-104d can record images, a series of images, or videos. In some cases, these modules 104a-104d may also provide manually adjustable focus, optically adjustable magnification, etc. In some cases, the image capture device 102 can include an audio recording system that enables audio recording concurrent with image or video recording by the modules 104a-104d.
In some implementations, instead of or in addition to being an image capture module, the module 104c can include a display. When the module 104c is connected to the image capture device 102, the image capture device 102 can transmit user display signals and/or user interface signals to the display of the module 104c. The display of the module 104c can display images captured by, for example, the modules 104a, 104b, 104c. The display can be a touchscreen display with which the user can interact to control operations of the image capture device 102. For example, the user may be able to use the touchscreen display to control image capture settings of each of the image capture modules 104a, 104b. In some implementations, the module 104c can further include buttons or other user input devices in addition to or as an alternative to the touchscreen display. In some cases, the image capture device 102 includes integral user input devices with which the user can interact to control the operations of the image capture device 102 and any connected modules (e.g., the modules 104a, 104b, 104c).
In general, the image capture device 102 includes a main body 200, as illustrated in the top view of the image capture device 102 shown in FIG. 2. Also referring to FIGS. 1A and 1B, the main body 200 may include a base 202 that can rest on or be mounted to a flat surface 204. The base 202 has a planar bottom surface 206 that enables it to rest on the flat surface 204. In some arrangements, other types of mounting structures may be employed, for example, to mount the image capture device 102 on a vehicle, an article of clothing (e.g., a bike helmet), etc. In some cases, the main body 200 can include an adaptor for a detachable strap. The strap can be used to attach the image capture device 102 to, for example, a chest or head of a user. The adaptor can also be used for an extension pole or a tripod.
The ports 106a-106d extend from the main body 200 in varying directions. For example, the ports 106a-106c can extend within a plane offset from and parallel to the planar bottom surface 206 of the base 202. The ports 106a-106c can extend along first, second, and third axes 208a, 208b, 208c, respectively. These axes 208a-208c can define the plane that is offset from and parallel to the planar bottom surface 206. In some examples, the axes 208a-208c intersect at an intersection point 210. The axes 208a-208c can extend from the intersection point 210 such that they are rotationally symmetric about the intersection point 210. For example, the axis 208a and the axis 208b can form a 120 degree angle, the axis 208b and the axis 208c can form a 120 degree angle, and the axis 208c and the axis 208a can form another 120 degree angle. The ports 106a-106c can also be rotationally symmetric about the intersection point 210. When image capture modules are connected to the ports 106a-106c, the rotational symmetry of the ports 106a-106c can enable the image capture device 102 to receive image data corresponding to a three-hundred-sixty-degree view of the environment of the image capture device 102. While the angles between the axes are equivalent in this example, in some arrangements the angles may not be equivalent (e.g., the first and second ports may be separated by a smaller angle than the first and third ports).
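The rotationally symmetric axis layout described above can be checked numerically. The following is a minimal sketch, not part of the patent: the axis vectors and helper functions are illustrative constructions of the 120-degree geometry.

```python
import math

def port_axes(n: int = 3) -> list[tuple[float, float]]:
    """Unit vectors for n in-plane port axes, rotationally symmetric
    about the intersection point (illustrative geometry only)."""
    return [(math.cos(2 * math.pi * k / n), math.sin(2 * math.pi * k / n))
            for k in range(n)]

def angle_between(u: tuple[float, float], v: tuple[float, float]) -> float:
    """Angle in degrees between two unit vectors."""
    dot = u[0] * v[0] + u[1] * v[1]
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))

axes = port_axes(3)
# Each pair of adjacent axes in a three-port layout is 120 degrees apart.
pairwise = [angle_between(axes[i], axes[(i + 1) % 3]) for i in range(3)]
```

With four in-plane ports instead of three, `port_axes(4)` would place adjacent axes 90 degrees apart, consistent with the non-equivalent-angle arrangements the text allows.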
The port 106d has an axis 208d that extends away from the plane defined by the axes 208a-208c. In some examples, the axis 208d is perpendicular to each of the axes 208a-208c, while in other cases, the axis 208d extends upwardly at a non-perpendicular angle with each of the axes 208a-208c.
The ports 106a-106d enable mechanical connections of modules (e.g., the modules 104a-104d of FIGS. 1A and 1B) to the main body 200 and one or more electrical connections of the modules to circuitry (e.g., a processor-based circuit) housed in the main body 200. The ports 106a-106d enable the modules to be removably attached through, for example, mating threads on the ports 106a-106d and the modules, a latching mechanism, a bayonet mounting mechanism, a magnetic mounting mechanism, or another appropriate reversible attachment mechanism.
Each of the ports 106a-106d can include an electrical port 212 that enables the electrical connection between the modules and the ports 106a-106d. The electrical port 212 can facilitate data transfer between, for example, a processing device of a connected module and a processing device of the image capture device 102. The electrical port 212 can also allow energy transfer (e.g., electrical energy) between the module and the image capture device 102.
While four ports and four modules are described with respect to FIGS. 1A and 1B, in some implementations, the image capture device can include two ports, three ports, or more than four ports (e.g., five ports, six ports, etc.). In some implementations, the ports may extend toward the planar bottom surface of the base. In some examples, the ports in the plane parallel to the planar surface of the base can include more than three ports.
In some implementations, the main body 200 is waterproof and/or water resistant. The main body 200, which can house the electrical components of the image capture device 102, can protect those electrical components from wetness when the main body 200 is immersed in water. The main body 200 can be resistant to water depths between, for example, 30 and 60 meters, 60 and 90 meters, 90 and 120 meters, or more.
As described above, a module (e.g., one or more of the modules 104a-104d) can be an image capture module capable of generating image data based on detecting light in the environment of the image capture device 102. FIGS. 3A and 3B respectively show side and front views of an image capture module 300. The image capture module 300 can include a module port 302 that can be connected to a port of the image capture device (e.g., the ports 106a-106d described with respect to FIG. 2).
As illustrated in the exploded perspective view of the image capture module shown in FIG. 3C, the module port 302 can include a mechanical connector 304 and an electrical connector 306 that extends through the mechanical connector 304. When the image capture module 300 is connected to the port of the image capture device, the mechanical connector 304 mechanically couples the image capture device with the image capture module 300. The electrical connector 306 electrically connects circuitry (e.g., a processing unit 308) of the image capture module 300 with circuitry (e.g., a processing device) of the image capture device.
In this arrangement, the processing unit 308 receives imaging data from an image sensor 310. The image sensor 310 receives light through a lens 312 of the image capture module 300 and then generates image data to be sent to the processing unit 308. The image sensor 310 can be, for example, a charge-coupled device (CCD) image sensor. The processing unit 308, in some cases, can perform a processing operation on the image data, as described in greater detail herein. In some cases, the processing unit 308 can transmit the image data or processed image data through the electrical connector 306 to the image capture device 102.
The image capture module 300 can include other components that can optically manipulate light received and detected by the image capture module 300. For example, the lens 312 can be a zoom lens, a wide-angle lens, or a fish-eye lens. The image capture module 300 can also include polarizers, filters, anti-glare accessories, and other accessories that modify qualities of the light received by the lens 312. These accessories can be mounted onto the lens 312 or integral to the lens 312 of the image capture module 300.
FIGS. 4A and 4B depict an example of attaching and detaching a module 300 (e.g., one of the modules 104a-104d described with respect to FIGS. 1A and 1B) to the image capture device 102. As shown in FIG. 4A, a user can attach the module 300 such that a module axis 302 aligns with the axis 208a of the port 106a. The user can attach the module 300 by pushing, screwing, etc. the module 300 into the port 106a. The alignment allows, for example, a lens axis of a lens of the module 300 to be aligned with the port axis 208a. As shown in FIG. 4B, after attaching the module 300, the user can detach the module 300 from the port 106a. The user can, for example, detach the module 300 by pulling, unscrewing, etc. the module 300 away from the port 106a. In cases where an image capture module is connected to each of the ports 106a-106c, the lens axes of the lenses of these image capture modules can be aligned with the port axes 208a-208c. Image data captured by each of the image capture modules can therefore produce images that are properly aligned with one another and usable to generate, for example, a three-hundred-sixty-degree view of the environment of the image capture device 102.
As described herein, the image capture device 102 can be used with image capture modules to capture images of the environment of the image capture device 102. In some examples, as illustrated in FIGS. 5A-5F showing different views of the image capture device 102, each of the ports 106a-106d of the image capture device 102 can be connected to image capture modules 500a, 500b, 500c, 500d. Each of the image capture modules 500a-500d can include a lens. The image capture modules 500a-500d, in some cases, can further include image sensors and image processing units.
The image capture modules 500a-500c are connected to the ports 106a-106c such that the lens axes of these image capture modules 500a-500c are in the same plane. This plane can be parallel to the bottom surface 206 of the base 202. The lens axes of the image capture modules 500a-500c can be angled relative to one another such that each pair of axes forms a 120 degree angle. The image capture module 500d connected to the port 106d has a lens axis that can be perpendicular to the lens axes of the other image capture modules 500a-500c.
Each individual image capture module 500a-500d records a specific and different portion of the environment of the image capture device 102. Referring to FIG. 5F, which shows a schematic top view of the image capture device 102, each of the image capture modules 500a-500c has a distinct angle of view 505a, 505b, 505c, respectively. The angles of view 505a-505c demarcate a field of view sensible by the image capture modules 500a-500c. In particular, features in the environment within the angles of view 505a-505c are optically detectable by the image capture modules 500a-500c. The center axes of the angles of view 505a-505c correspond to the lens axes 507a-507c of the image capture modules 500a-500c.
As shown in FIG. 5F, the angles of view 505a-505c are sufficiently wide that the portions of the environment sensed by each of the image capture modules 500a-500c can overlap. Because the lens axes 507a-507c are angled approximately 120 degrees relative to one another, the angles of view 505a-505c can be between, for example, 120 degrees and 180 degrees so that the angles of view 505a-505c overlap. The image data generated by each of the image capture modules 500a-500c can be used to form a three-hundred-sixty-degree image of the environment around the image capture device 102.
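The coverage condition implied here is that each lens must see at least the angular spacing between adjacent lens axes. This can be expressed as a small calculation; the helper functions are illustrative, not from the patent:

```python
def covers_full_circle(num_lenses: int, angle_of_view_deg: float) -> bool:
    """True if lenses spaced evenly around 360 degrees leave no gap:
    each angle of view must be at least the spacing between lens axes."""
    spacing = 360.0 / num_lenses
    return angle_of_view_deg >= spacing

def overlap_deg(num_lenses: int, angle_of_view_deg: float) -> float:
    """Angular overlap between adjacent fields of view.
    A negative result indicates a gap in coverage."""
    return angle_of_view_deg - 360.0 / num_lenses

# Three lenses 120 degrees apart: a 150-degree angle of view yields
# 30 degrees of overlap between neighbors; a 100-degree angle of view
# leaves 20-degree gaps.
```

The overlap regions are what allow image data from adjacent modules to be merged into a seamless three-hundred-sixty-degree image.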
While FIG. 5F shows the angles of view 505a-505c as two-dimensional angles of view, the angles of view 505a-505c of each of the image capture modules 500a-500c extend three-dimensionally, thus enabling the image capture modules 500a-500c to sense three-dimensional portions of the environment. The image capture module 500d also has an angle of view that can overlap with the angles of view 505a-505c of the image capture modules 500a-500c. The overlapping angles of view 505a-505c and the angle of view of the image capture module 500d enable the image data generated by the image capture modules 500a-500d to be usable to produce a hemispherical view of the environment around the image capture device 102.
In some implementations, one of the ports of the image capture device 102 can be connected to a display module. A display module 600, which is shown in a perspective view of FIG. 6A, includes a display 602. When the display module 600 is connected to the image capture device 102, as shown in the perspective view of the image capture device 102 in FIG. 6B, the display 602 can display images corresponding to the image data received by the image capture device 102. The display module 600, in FIG. 6B, is connected to the port 106d, though, in some implementations, the display module 600 can be connected to any of the ports 106a-106d available on the image capture device 102. Various types of display technologies may be employed by the display 602; for example, liquid crystal display (LCD), plasma, electrochromic, electrophoretic, etc.
To display the images, the display module 600 can include one or more processing units, including a graphics processing unit. The graphics processing unit can cause the display 602 to display a user interface for the user. When the image capture device 102 captures images using, for example, image capture modules connected to the image capture device 102, the user can invoke the user interface to view the images. The image data sent from the image capture modules to the processor of the image capture device 102 can be subsequently transmitted to the graphics processing unit of the display module 600. The graphics processing unit can then generate user display data to cause the display 602 to display the images captured by the image capture modules.
The displayed image can be a single image captured by one of the image capture modules. In some implementations, the displayed image can be processed such that the displayed image represents a merge of the image data generated by each of the image capture modules. In these cases, the displayed image can represent, for example, a three-hundred-sixty-degree view of the environment of the image capture device 102, a hemispherical view of the environment, or a three-dimensional view of the environment of the image capture device 102.
In some implementations, the display 602 of the display module 600 can be an autostereoscopic display device that enables the display 602 to represent to the user a three-dimensional view through a two-dimensional display. For example, if the image capture device 102 forms a stereoscopic image from image data generated by two image capture modules connected to the image capture device 102, the display 602 can include a parallax barrier that allows the stereoscopic image to be viewed without additional equipment, such as stereoscopic glasses.
The display module 600 can further include one or more user interface devices with which the user can interact to transmit instructions to the image capture device 102 when the display module 600 is connected to the image capture device 102. For example, the display 602 can, in addition to displaying images to the user, serve as a user interface device. The display 602 can be a touchscreen display. In some implementations, the display module 600 further includes multi-functional buttons 604 that transmit control signals to a processing unit of the display module 600 and/or the processor of the image capture device 102.
The display module 600 can also include speakers or a speaker system that allows for audio playback through the display module 600. In this regard, when the user initiates playback of a video, the corresponding audio can also play back. The display module 600 therefore can receive both the appropriate image data and audio data for the video playback to occur. The multi-functional buttons 604 could be used to adjust volume, brightness, and other features of the video and audio playback.
In some implementations, the image capture device 102 can include an energy source that is rechargeable or replaceable. The energy source can be connected to the image capture device 102 independently of its ports 106a-106d. The energy source of the image capture device 102 can provide electrical energy to modules electrically connected to the image capture device 102 as well as to electrical systems of the image capture device 102.
In some examples, as shown in FIG. 7, an energy source module 700 can be connected to any of the ports 106a-106d. The energy source module 700 includes a rechargeable energy source whose electrical energy can be transmitted through the port of the image capture device 102 to electrical systems of the image capture device 102. The energy source module 700 can provide energy to the energy source of the image capture device 102. In some implementations, the energy source module 700 can provide energy directly to other modules connected to the image capture device 102. The energy source module 700 can lengthen the duration of use of the image capture device 102 without having to charge an internal energy source of the image capture device 102.
Alternatively or additionally, the energy source module can be a data storage module that also stores data or provides other types of functionality. For example, the energy source module 700 can additionally include available data storage for images recorded by image capture modules. In some implementations, a data storage module without an energy source can be connected to the image capture device to provide data storage for the image capture device.
To enable stereoscopic imaging using the image capture device 102, a module extension 800 depicted in FIGS. 8A and 8B can be attached to one or more ports of the image capture device 102. As shown in FIG. 8A depicting a side view of the module extension 800, the module extension 800 has a module port 802 that connects to a module to be connected to the image capture device 102. As shown in FIG. 8B depicting a top view of the module extension 800, the module extension 800 also has an image capture device port 804 that connects to the image capture device 102. The module port 802 and the image capture device port 804 enable mechanical and electrical connections with modules and the image capture device 102.
The ports 802, 804 of the module extension 800 are arranged such that, when an image capture module is connected to the module port 802 and the image capture device port 804 is connected to the image capture device 102, the lens axis of the lens of the image capture module is contained within the plane formed by the port axes of the ports of the image capture device 102.
FIGS. 8C and 8D show perspective and top views, respectively, of the image capture device 102 with two module extensions 800a, 800b attached to the ports 106a, 106b of the image capture device 102. Image capture modules 806a, 806b (e.g., the image capture module 300) are connected to the module extensions 800a, 800b, respectively. When these image capture modules 806a, 806b are connected to the image capture device 102 through the module extensions 800a, 800b, lens axes 808a, 808b of the image capture modules 806a, 806b are within the plane formed by the port axes of the ports 106a-106c. The lens axes 808a, 808b are also substantially parallel to one another and face a substantially similar direction. In this configuration of the image capture modules 806a, 806b and their lens axes 808a, 808b as depicted in FIGS. 8C and 8D, image data generated by the image capture modules 806a, 806b provide a stereoscopic view of a portion of the environment. The processor of the image capture device 102 can use the image data to produce a stereoscopic image.
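The parallel-axis arrangement above is the standard geometry for stereo imaging. As an illustrative sketch (not part of the patent disclosure; the formula is the conventional pinhole-stereo relation, and the numeric values are made-up examples), two modules separated by a baseline B see the same feature at slightly different horizontal pixel positions, and the depth of the feature follows from that disparity:

```python
# Illustrative sketch: depth from disparity for two image capture
# modules with parallel lens axes. For a feature seen at horizontal
# pixel positions x_left and x_right, the pinhole-stereo relation is
#   Z = f * B / (x_left - x_right)
# where f is the focal length in pixels, B is the baseline between the
# two lens axes, and (x_left - x_right) is the disparity.

def depth_from_disparity(focal_px, baseline_m, x_left_px, x_right_px):
    """Depth (meters) of a feature from its stereo disparity."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("feature must appear further left in the left image")
    return focal_px * baseline_m / disparity

# f = 800 px, baseline = 0.25 m, disparity = 40 px -> depth of 5.0 m
print(depth_from_disparity(800, 0.25, 540, 500))  # 5.0
```

Per-pixel disparities computed this way over the overlapping portions of the two images are what allow a stereoscopic (3D) view to be reconstructed.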
In some examples, the image capture device 102 can display images through a projector attached to a port of the image capture device 102. As shown in FIG. 9A depicting a perspective view of the image capture device 102, a projector module 900 includes a projector lens 902 that projects images corresponding to image data received by the projector module 900. As shown in FIG. 9B depicting a perspective view of the projector module 900, the projector module 900 includes a port 904 that can be mechanically and electrically connected to the port of the image capture device 102. The image data can be received from the image capture device 102 when the port 904 of the projector module 900 is connected to the image capture device 102.
As depicted in the block diagram shown in FIG. 10A, the image capture device 102 can be connected to four separate image capture modules 300a-300d. When the image capture device 102 is connected to the image capture modules 300a-300d, the processor of the image capture device 102 can electrically communicate with each of the processors of the image capture modules 300a-300d.
The processors of the image capture modules 300a-300d can transmit data to the processor of the image capture device 102. As shown in the block diagram of FIG. 10B and described with respect to FIGS. 3A, 3B, and 3C, the image capture module 300 includes the module port 302, the processing unit 308, the image sensor 310, and the lens 312. The lens 312 receives the light for the image sensor 310 to generate image data 1000. The image data 1000 can undergo noise reduction by a noise reduction system 1002, e.g., one that filters out noise that can affect image quality. In some cases, the image data 1000 is sent directly through the module port 302 to the processor of the image capture device 102. In some cases, the image data 1000 is transmitted to the processing unit 308. The processing unit 308 can process the image data 1000 to form processed image data 1004 that is then transmitted through the module port 302 to the processor of the image capture device 102.
The processing unit 308, in some examples, can also receive data through the module port 302. The data can include control signals that adjust settings of the image sensor 310, the noise reduction system 1002, the processing unit 308, etc. The settings can include, for example, digital contrast, brightness, color, saturation, and other photographic editing adjustments that the processing unit 308 could implement prior to transmitting the processed image data 1004 to the image capture device 102. In some implementations, the processing unit 308 can control an optical magnification of the lens 312. For example, the processing unit 308 can operate an actuator or drive mechanism contained within the image capture module 300 that moves the lens 312 to achieve a desired optical magnification. In some implementations, the processing unit 308 can receive control signals to adjust a digital magnification of the processed image data.
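The data flow through an image capture module described above (sensor output, noise reduction, optional processing under settings received over the module port) can be sketched as follows. This is a hypothetical illustration, not the patent's implementation: the class, method names, and the simple moving-average filter standing in for the noise reduction system 1002 are all assumptions.

```python
# Hypothetical sketch of the module-side pipeline: raw sensor data is
# denoised, then adjusted per control-signal settings, then handed to
# the module port. Names and filter choice are illustrative only.

class ImageCaptureModule:
    def __init__(self):
        # Settings adjustable via control signals from the device.
        self.settings = {"brightness": 0, "contrast": 1.0}

    def apply_control_signals(self, **settings):
        """Control signals received through the module port."""
        self.settings.update(settings)

    def denoise(self, pixels):
        # Stand-in for the noise reduction system: a 3-sample moving
        # average over a 1-D row of pixel values.
        out = []
        for i in range(len(pixels)):
            window = pixels[max(0, i - 1):i + 2]
            out.append(sum(window) / len(window))
        return out

    def process(self, pixels):
        # Stand-in for the processing unit: brightness/contrast edits
        # applied before transmission to the image capture device.
        b, c = self.settings["brightness"], self.settings["contrast"]
        return [p * c + b for p in pixels]

    def capture(self, raw_pixels):
        """Produce processed image data ready for the module port."""
        return self.process(self.denoise(raw_pixels))

module = ImageCaptureModule()
module.apply_control_signals(brightness=10, contrast=2.0)
print(module.capture([100, 100, 100]))  # [210.0, 210.0, 210.0]
```

The same structure accommodates the bypass path in the text: a module could return `denoise(raw_pixels)` directly to the port when no processing is requested.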
Referring back to FIG. 10A, the image capture device 102, in addition to communicating with the image capture modules 300a-300d, can also communicate with a computing device 1010 (e.g., a remotely located computer system, smartphone, etc.). The image capture device 102 can include an electrical port separate from the ports 106a-106d described with respect to FIGS. 1A and 1B that can be connected to the computing device 1010 through a wired electrical connection. For example, the electrical port can be a Universal Serial Bus (USB) port, a mini USB port, a power port, a communication port, a Secure Digital (SD) card port, a microSD card port, or other appropriate port for data and/or power transfer. In some examples, the computing device 1010 can include a memory storage element on which image data or processed image data received by the processor of the image capture device 102 is stored.
In some implementations, the image capture device 102 can wirelessly communicate with the computing device 1010. The image capture device 102 can include a wireless transceiver. The wireless transceiver can connect the image capture device 102 to a local area network (LAN), a wide area network (WAN), or other appropriate network that enables data transfer between devices connected to the network. The computing device 1010 can also connect to the network. The computing device 1010 and the image capture device 102 can communicate data with one another using the network.
In some implementations, the image capture device 102 can include a wireless transceiver that communicates with wireless transceivers of the image capture modules 300a-300d. The image capture modules 300a-300d can further include their own energy sources. In such implementations, the processor of the image capture device 102 can wirelessly receive data from and transmit data to the processors of the image capture modules 300a-300d. The image capture modules 300a-300d can mechanically connect to the image capture device 102 so that the image capture modules 300a-300d are properly positioned relative to one another to form, for example, stereoscopic images, three-hundred-sixty-degree images, or hemispheric images.

The image capture device 102 can include one or more processing devices that control the systems of the image capture device 102 and connected modules. A control system 1100 for the image capture device 102, which can include the processor of the image capture device 102 described herein, is schematically shown in the block diagram of FIG. 11A. The control system 1100 includes a processor 1102 that communicates with a module controller 1122, a module port interface 1124, a user interface 1114, an audio codec 1120, a buffer memory 1104, a firmware memory 1106, a wired interface 1108, a power system 1110, a wireless connectivity interface 1112, a display memory 1116, and a display interface 1118. The control system 1100 can communicate with and operate these systems to control the operations of the image capture device 102 and the various modules described herein that can be connected to the image capture device 102. For example, the systems of the control system 1100 can cooperate to provide a variety of functions, including processing and coordinating graphics and audio data associated with visual and audio information in the environment surrounding the image capture device 102.
The video and audio information can be received by modules connected to the image capture device 102, or, in some cases, by integral audio and image recording systems of the image capture device 102.
As described herein, the image capture device 102 can be connected to various modules that enable the image capture device 102 to perform various functions depending on the hardware and software contained within those modules. The module port interface 1124 enables the electrical connection through the ports of the image capture device 102 (e.g., the ports 106a-106d). The module port interface 1124 can provide an interface for the various types of modules that can be connected to the image capture device 102.
The block diagram of FIG. 11B schematically depicts various interfaces that can be included in the module port interface 1124. The module port interface 1124 can include an image capture interface 1126, an energy source interface 1128, a drone interface 1130, a night vision interface 1132, a display interface 1134, a projector interface 1136, and a Global Positioning System (GPS) receiver interface 1138. The module port interface 1124 can activate one or more of these interfaces depending on the module or modules connected to the module port interface 1124. In some examples, a single module connected to the module port interface 1124 can activate multiple interfaces, and in other examples, a single module connected to the module port interface 1124 activates a single interface. The module port interface 1124 can enable a port if the port has a connected module and can disable the port if the port does not have a connected module.
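The activation behavior described above can be sketched as a small state machine. This is a hypothetical illustration, not the patent's implementation: the class name, the capability strings, and the idea of a per-port "enabled/active" record are assumptions chosen to mirror the text.

```python
# Hypothetical sketch of the module port interface: a port stays
# disabled until a module is connected, then the interface(s) matching
# the module's capabilities are activated. Names are illustrative.

INTERFACES = {
    "image_capture", "energy_source", "drone", "night_vision",
    "display", "projector", "gps_receiver",
}

class ModulePortInterface:
    def __init__(self, port_names):
        # Every port starts disabled with no active interfaces.
        self.ports = {name: {"enabled": False, "active": set()}
                      for name in port_names}

    def connect(self, port, capabilities):
        """Enable the port and activate one interface per capability."""
        unknown = set(capabilities) - INTERFACES
        if unknown:
            raise ValueError(f"unsupported capabilities: {unknown}")
        self.ports[port] = {"enabled": True, "active": set(capabilities)}

    def disconnect(self, port):
        """Disable a port when its module is removed."""
        self.ports[port] = {"enabled": False, "active": set()}

mpi = ModulePortInterface(["106a", "106b", "106c", "106d"])
# A drone module that also carries a lens activates two interfaces,
# matching the single-module/multiple-interface case in the text.
mpi.connect("106a", ["drone", "image_capture"])
print(sorted(mpi.ports["106a"]["active"]))  # ['drone', 'image_capture']
print(mpi.ports["106b"]["enabled"])         # False
```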
Also referring to FIG. 11A, the image capture interface 1126 can enable the module port interface 1124 to receive imagery data from an image capture module 1140, such as the image capture module 300 described with respect to FIGS. 3A-3C. The image capture interface 1126 can also enable the module controller 1122 to transmit control signals through the module port interface 1124 to control settings of the image capture module 1140. For example, the module controller 1122, upon detecting that the image capture interface 1126 is being utilized by a connected image capture module 300, can implement an auto-focusing operation to focus images captured by the image capture module 300. Other operations can include controlling a flash or light associated with the image capture module 300, a shutter speed of a shutter of the image capture module 300, or an aperture size of the shutter of the image capture module 300. In some implementations, the module controller 1122 can also control an exposure index or an ISO speed. The module controller 1122 can also control a timer of the image capture module 300.
The energy source interface 1128 can enable the module port interface 1124 to receive electrical energy through the module port interface 1124. The electrical energy received through the energy source interface 1128 can be directly transmitted to the subsystems of the control system 1100. An external energy source module 1142 can be connected through the module port interface 1124. As shown in FIG. 11A, the control system 1100 can be structured such that the external energy source module 1142 is directly controlled by the power system 1110. The power system 1110 can distribute the electrical energy of the external energy source module 1142 to the subsystems requiring electrical energy. In some implementations, the power system 1110 can use the electrical energy from the external energy source module 1142 to charge an internal energy source 1144 of the image capture device 102.
The drone interface 1130 can enable the module port interface 1124 to control a drone module 1145 connected to the image capture device 102. The drone module 1145 can include, for example, a motorized rotor that the control system 1100 can power and control. The module controller 1122, upon identifying the drone module 1145 as a connected module, can coordinate transfer of power to the drone module 1145 to activate the motorized rotor. The motorized rotor may be configured to provide enough force to cause the image capture device 102 to hover. In some implementations, the drone module 1145 can include various sensors to determine motion parameters of the drone module 1145 connected to the image capture device 102. For example, the drone module 1145 can include an accelerometer, a position sensor, and/or a speed sensor. The drone module 1145 can further include force sensors, infrared sensors, proximity sensors, ultrasonic or acoustic sensors, and other appropriate sensors to detect characteristics of the environment surrounding the image capture device 102. The module port interface 1124 and the drone interface 1130 can enable intake of data generated by the sensors of the drone module 1145. In some implementations, the drone module 1145 can activate other interfaces of the module port interface 1124. For example, the drone module 1145 can also include an optical lens and can activate the image capture interface 1126 when connected to the image capture device 102.
The night vision interface 1132 enables the module port interface 1124 to receive data associated with a night vision imaging device. For example, a night vision image capture module 1146 can include thermal sensors or infrared sensors that enable it to detect light outside of the visible light range (e.g., in the infrared portion of the electromagnetic spectrum). Accordingly, the night vision image capture module 1146 can generate images of the environment of the image capture device 102 when there are low light conditions in the environment. The module controller 1122 can receive image data generated by the night vision image capture module 1146 and can also control operations of the night vision image capture module 1146.
The display interface 1134 enables the module controller 1122 to control a display module 1148 (e.g., the display module 600 as described with respect to FIGS. 6A and 6B) connected to the module port interface 1124. In some implementations, the control system 1100 includes the display interface 1118 as an interface independent from the display interface 1134 of the module port interface 1124. A display can be connected through the display interface 1118 and another display can be connected through the display interface 1134 of the module port interface 1124. When the display module 1148 is connected through the module port interface 1124, the module controller 1122 can identify the display module 1148, generate user display data, and transmit the user display data to the display module 1148. In some implementations, the display module 1148 further includes user interface elements, such as buttons or a touchscreen display. The display module 1148, when connected to the display interface 1134, can therefore also communicate through the user interface 1114.

The projector interface 1136 enables a projector module 1150 (e.g., the projector module 900 of FIGS. 9A and 9B) to be operated through the module port interface 1124. Through the module port interface 1124, the control system 1100 can provide electrical energy and display data to the projector module 1150 so that the projector module 1150 can project an image into the environment.
The GPS receiver interface 1138 enables a GPS receiver of a module to be operable with the module controller 1122 so that the module controller 1122 can determine a position of the image capture device 102. For example, the modules described herein can include a GPS receiver. The image capture module 1140, the drone module 1145, and/or the night vision image capture module 1146 can, for example, include a GPS receiver. The module controller 1122 can coordinate sensed data from these modules with GPS receiver data so that the sensed data can be associated with a particular location.
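One simple way the coordination described above could work is by matching timestamps: each piece of sensed data is paired with the GPS fix recorded closest in time. The sketch below is a hypothetical illustration under that assumption; the function name, data shapes, and the nearest-timestamp rule are not specified in the patent.

```python
# Illustrative sketch: associate each (timestamp, data) sample with
# the GPS fix whose timestamp is nearest, so images or sensor readings
# carry a location. Names and data layout are assumptions.

def geotag(samples, gps_fixes):
    """Attach to each (timestamp, data) sample the nearest GPS fix.

    samples:   list of (t, data) tuples
    gps_fixes: list of (t, (lat, lon)) tuples, assumed non-empty
    Returns a list of (data, (lat, lon)) pairs.
    """
    tagged = []
    for t, data in samples:
        _, pos = min(gps_fixes, key=lambda fix: abs(fix[0] - t))
        tagged.append((data, pos))
    return tagged

fixes = [(0.0, (42.39, -71.03)), (10.0, (42.40, -71.02))]
frames = [(1.0, "frame-a"), (9.5, "frame-b")]
print(geotag(frames, fixes))
# [('frame-a', (42.39, -71.03)), ('frame-b', (42.4, -71.02))]
```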
As described above with respect to the module port interface 1124, the control system 1100 can receive inputs from removable modules connected through the module port interface 1124. The control system 1100 can also receive inputs from systems integral to the image capture device 102. For example, the user interface 1114 can be connected to buttons 1152 positioned on the main body of the image capture device. These buttons 1152 can each be multi-function buttons that the user can press to control operations of the image capture device and connected modules. The processor 1102 can receive a signal in response to the buttons 1152 being pressed and then can generate control signals to control operations of the control system 1100 or to control operations of a connected module.
The control system 1100 can further receive inputs from the user through the wireless connectivity interface 1112, which enables wireless data connectivity with the processor 1102. The user can interact with, for example, a smartphone, a smartwatch, a virtual reality set, a laptop computer, or other remote computing device to provide inputs to the control system 1100. The user can use these remote computing devices to control the operations of the image capture device 102 and its connected modules.
The control system 1100 can also receive input data from microphones 1154a, 1154b. The microphones 1154a, 1154b can form a stereo microphone system 1156. The microphones 1154a, 1154b can record audio occurring in the environment of the image capture device 102. If one or more image capture modules are connected through the module port interface 1124, the microphones 1154a, 1154b can record the audio as the image capture modules record images of the environment. A timing system 1158 of the module controller 1122 can coordinate the received image and audio data such that, when the audio and image data are stored, the timing of the image data matches the timing of the audio data. In some implementations, the processor 1102 coordinates the timing of the received data.
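The timing coordination described above can be sketched as matching each video frame to the audio chunk that covers its capture instant. This is a hypothetical illustration of one way such a timing system could align the two streams; the function name, data shapes, and interval-matching rule are assumptions, not the patent's design.

```python
# Hypothetical sketch: align each recorded video frame with the audio
# chunk containing the same capture timestamp, so stored image timing
# matches stored audio timing. Names are illustrative.

def align_audio_to_frames(frames, audio_chunks):
    """Pair each video frame with the audio chunk covering its timestamp.

    frames:       list of (timestamp, frame_id)
    audio_chunks: list of (start, end, chunk_id), sorted, non-overlapping
    Returns a list of (frame_id, chunk_id or None).
    """
    pairs = []
    for t, frame_id in frames:
        chunk_id = None
        for start, end, cid in audio_chunks:
            if start <= t < end:
                chunk_id = cid
                break
        pairs.append((frame_id, chunk_id))
    return pairs

audio = [(0.0, 1.0, "a0"), (1.0, 2.0, "a1")]
video = [(0.04, "f0"), (0.08, "f1"), (1.50, "f2")]
print(align_audio_to_frames(video, audio))
# [('f0', 'a0'), ('f1', 'a0'), ('f2', 'a1')]
```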
The microphones 1154a, 1154b can be positioned symmetrically about the main body of the image capture device 102 so that the microphones can receive audio information in the environment from multiple directions. In some implementations, the stereo microphone system 1156 can include three or more microphones. The module port interface 1124, in some cases, can also interface with a microphone included on a connected module.
The audio codec 1120 can provide functionality such as converting between analog and digital forms of data, encoding digitized data, etc. The audio codec 1120 enables transfer of audio data between the processor 1102 and audio input and output devices. For example, analog signals generated by the microphones 1154a, 1154b in response to audio information in the environment can be converted to digital signals usable by the processor 1102. The audio codec 1120 can also execute operations such as decoding data, converting data received from the processor 1102 into analog signals for audio playback on a speaker system 1159, etc. The speaker system 1159 can include one or more speakers situated at various positions on the image capture device 102 for audio playback. The speaker system 1159 can include multiple speakers for stereo audio playback. In some implementations, the module port interface 1124 can also interface with an external speaker located on a connected module.
The buffer memory 1104 can be an internal volatile memory that facilitates execution of processes implemented by the processor 1102. The buffer memory 1104 can therefore be used for data processing. The buffer memory 1104 can also serve as temporary storage for data received from the modules, e.g., imagery data from the image capture module and/or video data from the image capture module. During the operation of an image capture module, for example, the image capture module can capture images, record image data, and temporarily store the image data in the buffer memory 1104. The buffer memory 1104 can also be used for management of data transmitted through and received from the wireless connectivity interface 1112 and the wired interface 1108. For example, data received through a USB connection or through a near field communication wireless connection can be temporarily stored on the buffer memory 1104.
The firmware memory 1106 can store information and instructions for the processor 1102. The processor 1102 can in turn provide these instructions to other systems of the control system 1100. The firmware memory 1106 can store firmware and settings of the systems of the control system 1100. In some cases, the firmware memory 1106 includes default settings associated with various modules that can be connected to the image capture device.
The wired interface 1108 can enable, for example, a data storage device to be connected to the control system 1100. The data storage device can be connected through, for example, a micro USB port 1160 or a micro SD port 1162. The processor 1102 can store data on the data storage device. For example, the processor 1102 can store image data, video data, or other sensed data on the data storage device. In some implementations, the control system 1100 can include an internal data storage device that stores the data received from the modules connected to the image capture device. In some implementations, the control system 1100 can use the wireless connectivity interface 1112 to send the data to a remote computing device, which can store the data in a remote data storage device.
The wireless connectivity interface 1112 may include interfaces to WiFi, Bluetooth, near field communication, and other communication protocols. The user can use a mobile device to remotely control the image capture device by connecting to the image capture device using a wireless communication protocol. In some cases, the wireless connectivity interface 1112 can transmit and receive radio signals. The wireless connectivity interface 1112 can also include line-of-sight receivers and transmitters, such as an infrared transceiver. These line-of-sight receivers and transmitters can communicate with infrared remote controls so that a user can remotely control the image capture device using the remote control.
The power system 1110 can control operations of energy sources connected to the control system 1100. For example, the connected energy sources can include the internal energy source 1144 that supplies electrical energy to the systems of the control system 1100. The internal energy source 1144 can be a rechargeable and removable energy source. In some implementations, the external energy source module 1142 (e.g., the energy source module 700) can be connected through the module port interface 1124. As described herein, the power system 1110 can coordinate the electrical energy supplied by the external energy source module 1142.
The display memory 1116 can serve as a graphics buffer that stores data to be displayed on a display connected to the control system 1100. For example, the image capture device 102 can include an integral display that receives display data through the display interface 1118. The display memory 1116 is a temporary data storage element for the display data. In some implementations, the display interface 1118 is connected to the display module 1148. The display memory 1116 can receive and store display data for various functions. The display data can be for image playback or can be used to display user interface elements on a touchscreen.
In some implementations, the control system 1100 can operate in various modes selected by the user (e.g., through a remote computing device, the buttons 1152, or other user interface devices referred to as user interface module 1153). These modes can include an image capture mode in which image capture modules connected to the image capture device can record images of the environment. The modes can also include an image playback mode in which a display module connected to the image capture device can play back images for the user to view. The modes can include a recording mode that enables both image capturing and audio recording to occur. The modes can include a playback mode that enables both image playback and audio playback. The modes can further include selection of a video recording mode or an image capture mode. In the video recording mode, the user can further select a frames-per-second setting in addition to other settings of the image capture module.
In the image capture and/or video recording modes, the user can further select from, for example, an action mode, a three-hundred-sixty-degree mode, a hemispheric view mode, and/or a 3D mode. In the action mode, each image capture module attached to the image capture device can record video and images separately and independently from the other image capture modules. The output image and video data of the image capture modules can be stored in a data storage device as described herein. In the three-hundred-sixty-degree mode, three of the image capture modules can cooperate to record a portion of the image capture device's environment corresponding to a three-hundred-sixty-degree view around the image capture device. The image capture device can merge the recorded videos and images from the image capture modules into one unique file to create an image usable by, for example, a virtual reality headset to create a three-hundred-sixty-degree representation of the environment. In the hemispheric view mode, four of the image capture modules can cooperate to record a portion of the image capture device's environment corresponding to a hemispheric view around the image capture device. The image capture device can merge the recorded videos and images from the image capture modules into one unique file to create an image also usable by a virtual reality headset to create a hemispheric representation of the environment. In the 3D mode, the image capture device can record video and image data using the module extensions described with respect to FIGS. 8A-8D attached to image capture modules. The image data and video data can be merged to create a stereoscopic view of the environment.
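The mode descriptions above can be summarized as a small dispatch table: each mode implies how many image capture modules must cooperate and whether their outputs are stored separately or merged into one file. The sketch below is a hypothetical illustration paraphrasing the text; the mode keys, function name, and return shape are assumptions.

```python
# Hypothetical sketch of the capture modes: module count and merge
# behavior per mode, paraphrased from the description above.

MODES = {
    # mode: (image capture modules required, merge outputs into one file?)
    "action":      (1, False),  # each module records independently
    "360":         (3, True),   # three modules -> one 360-degree file
    "hemispheric": (4, True),   # four modules -> one hemispheric file
    "3d":          (2, True),   # two modules on extensions -> stereo
}

def plan_capture(mode, connected_modules):
    """Validate a mode against the connected modules; describe the output."""
    required, merge = MODES[mode]
    if len(connected_modules) < required:
        raise ValueError(f"{mode} mode needs {required} image capture "
                         f"modules, found {len(connected_modules)}")
    if merge:
        # Cooperative modes use exactly the required number of modules
        # and merge their recordings into one unique file.
        return {"inputs": connected_modules[:required], "outputs": 1}
    # Action mode: every connected module records its own output.
    return {"inputs": connected_modules, "outputs": len(connected_modules)}

print(plan_capture("360", ["500a", "500b", "500c"]))
# {'inputs': ['500a', '500b', '500c'], 'outputs': 1}
```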
FIG. 12 shows a flowchart 1200 representing operations of one or more processors associated with the image capture device 102. The image capture device 102 can include a processor, and the modules connected to the image capture device 102 can also include processors. The operations can be executed by one or more processors included in the image capture device 102. In some arrangements, operations may be distributed among processors included in the image capture device and one or more processors included in one or more connected modules. A remote computing device connected to the image capture device 102 using a wired or wireless connection can also include a processor that may perform a portion of the operations (e.g., some or all of the operations) described herein.
Operations of the processor can include identifying 1205 a type of an image capture module. The image capture module can be electrically and/or mechanically connected to a port of the image capture device. The type can specify that the image capture module is, for example, a night vision image capture module or a visible light image capture module. In some implementations, the processor included in the image capture device 102 identifies a type of a module that can be an image capture module and/or another type of module. For example, data may be provided from the module to assist with the identification, or data may be exchanged between the module and the image capture device (e.g., a handshake operation) for identification. Data may be stored on the module and/or the image capture device to assist with the identification. Beyond being an image capture module, the type can specify that the module is one or more of, for example, a display module, a drone module, an external energy source module, an image capture module, a projector module, a GPS receiver module, etc.
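The identification step 1205 can be sketched as a small handshake routine. Everything concrete here is hypothetical: the one-byte request/reply protocol, the type codes, and the port interface are assumptions standing in for whatever electrical identification scheme an implementation would actually use.

```python
# Hypothetical type codes a module might store and report during the
# handshake; the disclosure names the module types but not any encoding.
MODULE_TYPES = {
    0x01: "visible_light_image_capture",
    0x02: "night_vision_image_capture",
    0x03: "display",
    0x04: "drone",
    0x05: "external_energy_source",
    0x06: "projector",
    0x07: "gps_receiver",
}

def identify_module(port):
    """Exchange a handshake with the module on `port` and return its type.

    `port` is assumed to expose byte-oriented write/read; the module is
    assumed to answer an identification request with its stored type code."""
    port.write(b"\x10")        # hypothetical "identify yourself" request
    code = port.read(1)[0]     # module replies with a one-byte type code
    return MODULE_TYPES.get(code, "unknown")
```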
Operations of the processor can include receiving 1210 imagery data from the image capture module. The imagery data can be visible light imagery data, night vision imagery data, or other imagery data generated by the image capture module. In some cases, the processor can receive other data from a module connected to the processor. If the module includes a user interface element, the operations can include receiving control signals to control operations of the image capture module, to control operations of the image capture device, or to control operations of other systems described herein. The control signals may be indicative of settings of the image capture module or the image capture device. In some implementations, the processor can also receive audio data from an audio recording device, such as a microphone.
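The receiving step 1210 handles several kinds of incoming data: imagery, control signals, and audio. A minimal routing sketch follows; the payload shape (a dict with `kind` and `data` keys) is an assumption made for illustration, not a format from the disclosure.

```python
def route_payloads(payloads):
    """Sort incoming payloads from connected modules into the three data
    paths named in the description: imagery data, control signals from
    user interface elements, and audio data from a microphone."""
    routed = {"imagery": [], "control": [], "audio": []}
    for payload in payloads:
        routed[payload["kind"]].append(payload["data"])
    return routed
```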
Operations of the processor can include generating 1215 processed imagery data from the imagery data based on the identified type of the image capture module. The generating 1215 can include various post-processing operations to edit the imagery data. For example, dependent upon whether the image capture module is capable of providing data for "action", "3D", or "360°" imagery, different processing operations may be executed. For example, imagery to be used for producing 3D imagery may be processed in one manner (e.g., filtered) while imagery for presenting a 360° image, a night vision image, etc. may be processed in a different manner. In some arrangements, the generating 1215 can include compressing raw imagery data using an appropriate codec or resizing images using interpolation techniques. The generating 1215 can also include image editing operations, such as executing color and saturation enhancement processes on the imagery data, increasing sharpness or blur of the imagery data, and reducing noise in the imagery data. The generating 1215 can also include combining imagery data from multiple image capture modules to produce a three-hundred-sixty-degree image or a hemispheric image.
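The type- and mode-dependent dispatch in step 1215 can be reduced to a small selector. The real work (codecs, interpolation, stitching) is replaced here by labels; merging frame groups by concatenation is an assumption standing in for actual stitching, and the type and mode names are illustrative.

```python
def process_imagery(frame_groups, module_type, mode):
    """Pick a post-processing path from the identified type and mode.

    `frame_groups` is a list of per-module frame lists. For 360-degree or
    hemispheric output, the cooperating modules' frames are combined into
    one result (a stand-in for real stitching)."""
    if mode in ("360", "hemispheric"):
        merged = [f for group in frame_groups for f in group]
        return {"frames": [merged], "pipeline": f"stitched-{mode}"}
    if module_type == "night_vision_image_capture":
        # night vision imagery gets its own processing path
        return {"frames": frame_groups, "pipeline": "night-vision"}
    return {"frames": frame_groups, "pipeline": "standard"}
```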
Operations of the processor can include transmitting 1220 the processed imagery data to a computing device. The computing device can be a computing device separate from the main body and the connected image capture module. In some cases, the operations further include receiving control data from the computing device. In some arrangements, the transmitting and receiving operations are executed in a wireless manner. The operations can also include controlling the image capture module based on the received control data.
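Step 1220 together with the control path back from the remote computing device can be sketched as one exchange. The transport (a socket-like object with `send`/`recv`) and the JSON message framing are assumptions for illustration; the disclosure does not specify a wire format.

```python
import json

def exchange(transport, processed_imagery, module_settings):
    """Send processed imagery to the remote computing device, then apply
    any control data it returns to the module's settings."""
    transport.send(json.dumps({"imagery": processed_imagery}).encode())
    reply = transport.recv()            # control data, or empty if none
    if reply:
        control = json.loads(reply.decode())
        # e.g. {"settings": {"fps": 60}} updates the capture settings
        module_settings.update(control.get("settings", {}))
    return module_settings
```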
FIG. 13 shows an example computing device 1300 and an example mobile computing device 1350, which can be used to implement the techniques described herein. For example, a portion or all of the operations of the control system 1100, the processor 1102, and other systems shown in FIG. 11A may be executed by the computing device 1300 and/or the mobile computing device 1350. Operations of the processors of modules connected to the image capture device 102 can also be executed by the computing device 1300 and/or the mobile computing device 1350. Computing device 1300 is intended to represent various forms of digital computers, including, e.g., laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Computing device 1350 is intended to represent various forms of mobile devices, including, e.g., personal digital assistants, tablet computing devices, cellular telephones, smartphones, and other similar computing devices. The components shown here, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the techniques described and/or claimed in this document.
Computing device 1300 includes processor 1302, memory 1304, storage device 1306, high-speed interface 1308 connecting to memory 1304 and high-speed expansion ports 1310, and low speed interface 1312 connecting to low speed bus 1314 and storage device 1306. Each of components 1302, 1304, 1306, 1308, 1310, and 1312, are interconnected using various busses, and can be mounted on a common motherboard or in other manners as appropriate. Processor 1302 can process instructions for execution within computing device 1300, including instructions stored in memory 1304 or on storage device 1306 to display graphical data for a GUI on an external input/output device, including, e.g., display 1316 coupled to high speed interface 1308. In other implementations, multiple processors and/or multiple busses can be used, as appropriate, along with multiple memories and types of memory. Also, multiple computing devices 1300 can be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system).
Memory 1304 stores data within computing device 1300. In one implementation, memory 1304 is a volatile memory unit or units. In another implementation, memory 1304 is a non-volatile memory unit or units. Memory 1304 also can be another form of computer-readable medium (e.g., a magnetic or optical disk). Memory 1304 may be non-transitory.
Storage device 1306 is capable of providing mass storage for computing device 1300. In one implementation, storage device 1306 can be or contain a computer-readable medium (e.g., a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, such as devices in a storage area network or other configurations.) A computer program product can be tangibly embodied in a data carrier. The computer program product also can contain instructions that, when executed, perform one or more methods (e.g., those described above.) The data carrier is a computer- or machine-readable medium, (e.g., memory 1304, storage device 1306, memory on processor 1302, and the like.)
High-speed controller 1308 manages bandwidth-intensive operations for computing device 1300, while low speed controller 1312 manages lower bandwidth-intensive operations. Such allocation of functions is an example only. In one implementation, high-speed controller 1308 is coupled to memory 1304, display 1316 (e.g., through a graphics processor or accelerator), and to high-speed expansion ports 1310, which can accept various expansion cards (not shown). In the implementation, low-speed controller 1312 is coupled to storage device 1306 and low-speed expansion port 1314. The low-speed expansion port, which can include various communication ports (e.g., USB, Bluetooth®, Ethernet, wireless Ethernet), can be coupled to one or more input/output devices, (e.g., a keyboard, a pointing device, a scanner, or a networking device including a switch or router, e.g., through a network adapter.)
Computing device 1300 can be implemented in a number of different forms, as shown in the figure. For example, it can be implemented as standard server 1320, or multiple times in a group of such servers. It also can be implemented as part of rack server system 1324. In addition or as an alternative, it can be implemented in a personal computer (e.g., laptop computer 1322.) In some examples, components from computing device 1300 can be combined with other components in a mobile device (not shown), e.g., device 1350. Each of such devices can contain one or more of computing device 1300, 1350, and an entire system can be made up of multiple computing devices 1300, 1350 communicating with each other.
Computing device 1350 includes processor 1352, memory 1364, an input/output device (e.g., display 1354, communication interface 1366, and transceiver 1368) among other components. Device 1350 also can be provided with a storage device, (e.g., a microdrive or other device) to provide additional storage. Each of components 1350, 1352, 1364, 1354, 1366, and 1368, are interconnected using various buses, and several of the components can be mounted on a common motherboard or in other manners as appropriate.
Processor 1352 can execute instructions within computing device 1350, including instructions stored in memory 1364. The processor can be implemented as a chipset of chips that include separate and multiple analog and digital processors. The processor can provide, for example, for coordination of the other components of device 1350, e.g., control of user interfaces, applications run by device 1350, and wireless communication by device 1350.
Processor 1352 can communicate with a user through control interface 1358 and display interface 1356 coupled to display 1354. Display 1354 can be, for example, a TFT LCD (Thin-Film-Transistor Liquid Crystal Display) or an OLED (Organic Light Emitting Diode) display, or other appropriate display technology. Display interface 1356 can comprise appropriate circuitry for driving display 1354 to present graphical and other data to a user. Control interface 1358 can receive commands from a user and convert them for submission to processor 1352. In addition, external interface 1362 can communicate with processor 1352, so as to enable near area communication of device 1350 with other devices. External interface 1362 can provide, for example, for wired communication in some implementations, or for wireless communication in other implementations, and multiple interfaces also can be used.
Memory 1364 stores data within computing device 1350. Memory 1364 can be implemented as one or more of a computer-readable medium or media, a volatile memory unit or units, or a non-volatile memory unit or units. Expansion memory 1374 also can be provided and connected to device 1350 through expansion interface 1372, which can include, for example, a SIMM (Single In Line Memory Module) card interface. Such expansion memory 1374 can provide extra storage space for device 1350, or also can store applications or other data for device 1350. Specifically, expansion memory 1374 can include instructions to carry out or supplement the processes described above, and can include secure data also. Thus, for example, expansion memory 1374 can be provided as a security module for device 1350, and can be programmed with instructions that permit secure use of device 1350. In addition, secure applications can be provided through the SIMM cards, along with additional data, (e.g., placing identifying data on the SIMM card in a non-hackable manner.)
The memory can include, for example, flash memory and/or NVRAM memory, as discussed below. In one implementation, a computer program product is tangibly embodied in a data carrier. The computer program product contains instructions that, when executed, perform one or more methods, e.g., those described above. The data carrier is a computer- or machine-readable medium (e.g., memory 1364, expansion memory 1374, and/or memory on processor 1352), which can be received, for example, over transceiver 1368 or external interface 1362. Device 1350 can communicate wirelessly through communication interface 1366, which can include digital signal processing circuitry where necessary. Communication interface 1366 can provide for communications under various modes or protocols (e.g., GSM voice calls, SMS, EMS, or MMS messaging, CDMA, TDMA, PDC, WCDMA, CDMA2000, or GPRS, among others.) Such communication can occur, for example, through radio-frequency transceiver 1368. In addition, short-range communication can occur, e.g., using a Bluetooth®, WiFi, or other such transceiver (not shown). In addition, GPS (Global Positioning System) receiver module 1370 can provide additional navigation- and location-related wireless data to device 1350, which can be used as appropriate by applications running on device 1350. Sensors and modules such as cameras, microphones, compasses, accelerometers (for orientation sensing), etc. may be included in the device.
Device 1350 also can communicate audibly using audio codec 1360, which can receive spoken data from a user and convert it to usable digital data. Audio codec 1360 can likewise generate audible sound for a user, (e.g., through a speaker in a handset of device 1350.) Such sound can include sound from voice telephone calls, can include recorded sound (e.g., voice messages, music files, and the like) and also can include sound generated by applications operating on device 1350.
Computing device 1350 can be implemented in a number of different forms, as shown in the figure. For example, it can be implemented as cellular telephone 1380. It also can be implemented as part of smartphone 1382, personal digital assistant, or other similar mobile device.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor. The programmable processor can be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software, software applications or code) include machine instructions for a programmable processor, and can be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms machine-readable medium and computer-readable medium refer to a computer program product, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having a device for displaying data to the user (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor), and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be a form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user can be received in a form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a backend component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a frontend component (e.g., a client computer having a user interface or a Web browser through which a user can interact with an implementation of the systems and techniques described here), or a combination of such back end, middleware, or frontend components. The components of the system can be interconnected by a form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (LAN), a wide area network (WAN), and the Internet.
The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
In some implementations, the engines described herein can be separated, combined or incorporated into a single or combined engine. The engines depicted in the figures are not intended to limit the systems described here to the software architectures shown in the figures.
A number of embodiments have been described. Nevertheless, it will be understood that various modifications can be made without departing from the spirit and scope of the processes and techniques described herein. In addition, the logic flows depicted in the figures do not require the particular order shown, or sequential order, to achieve desirable results. In addition, other steps can be provided, or steps can be eliminated, from the described flows, and other components can be added to, or removed from, the described systems. Accordingly, other embodiments are within the scope of the following claims.

Claims

WHAT IS CLAIMED IS:
1. An apparatus comprising:
first, second, and third ports respectively extending from a main body along first, second, and third axes that span a plane, each of the first, second, and third ports being electrically and mechanically connectable to one of a plurality of image capture modules; and
one or more processing devices configured to execute machine-readable instructions to perform operations comprising:
identifying a type of the one image capture module connected to one of the first, second, and third ports,
receiving imagery data from the connected image capture module, and
transmitting data representative of the received imagery data to a computing device separate from the image capture module.
2. The apparatus of claim 1, the operations further comprising:
recording processed image data received through at least one of the first, second, and third ports.
3. The apparatus of claim 1, the operations further comprising:
controlling the image capture module based on the identified type.
4. The apparatus of claim 1, wherein the first, second, and third axes are rotationally symmetric about an intersection of the first, second, and third axes.
5. The apparatus of claim 1, further comprising:
a first image capture module being connectable to one of the ports and including one or more lenses.
6. The apparatus of claim 5, further comprising:
a second image capture module being connectable to a second of the ports and including one or more lenses.
7. The apparatus of claim 6, wherein the first image capture module provides one viewing perspective and the second image capture module provides a different viewing perspective.
8. The apparatus of claim 6, wherein the one or more processing devices are configured to process imagery data from the first and second image capture modules to produce three-dimensional imagery.
9. The apparatus of claim 6, further comprising:
a third image capture module being connectable to a third of the ports and including one or more lenses.
10. The apparatus of claim 9, where the one or more processing devices are configured to process imagery data from the first, second and third image capture modules to produce three-hundred-sixty-degree imagery.
11. The apparatus of claim 1, further comprising a fourth port extending from the main body along a fourth axis angled away from the plane, the fourth axis intersecting the first, second, and third axes at the intersection of the first, second, and third axes.
12. The apparatus of claim 1, further comprising:
an image presenting module being connectable to one of the ports.
13. The apparatus of claim 12, wherein the image presenting module includes a display.
14. The apparatus of claim 12, wherein the image presenting module includes a projector.
15. The apparatus of claim 1, further comprising a location detecting module including a global positioning system (GPS) receiver.
16. The apparatus of claim 1, further comprising a night vision module including an infrared receiver.
17. The apparatus of claim 1, further comprising an energy source module for providing electrical energy to at least one of the main body and one or more other modules.
18. The apparatus of claim 1, the operations further comprising:
wirelessly receiving control data from the computing device separate from the one image capture module; and
transmitting control signals to the one image capture module to control the one image capture module.
19. The apparatus of claim 18, wherein wirelessly receiving the control data is provided by a wide area network.
20. The apparatus of claim 18, wherein wirelessly receiving the control data is provided by a local area network.
21. The apparatus of claim 18, wherein wirelessly receiving the control data is provided by near field communication.
22. A computing device implemented method comprising:
identifying a type of an image capture module, the image capture module being electrically and mechanically connected to one of a first, second, and third ports of an image capture device, the first, second, and third ports each respectively extending from the image capture device along one of a first, second, and third axes that span a plane;
receiving imagery data from the connected image capture module;
generating processed imagery data from the received imagery data based on the identified type of the image capture module; and
transmitting the processed imagery data to a computing device separate from the image capture device and the connected image capture module.
23. The method of claim 22, further comprising:
receiving control data from the computing device; and
controlling the image capture module based on the received control data.
24. One or more computer readable media storing instructions that are executable by a processing device, and upon such execution cause the processing device to perform operations comprising:
identifying a type of an image capture module, the image capture module being electrically and mechanically connected to one of a first, second, and third ports of an image capture device, the first, second, and third ports each respectively extending from the image capture device along one of a first, second, and third axes that span a plane;
receiving imagery data from the connected image capture module;
generating processed imagery data from the received imagery data based on the identified type of the image capture module; and
transmitting the processed imagery data to a computing device separate from the image capture device and the connected image capture module.
25. The computer readable media of claim 24, the operations further comprising:
receiving control data from the computing device; and
controlling the image capture module based on the received control data.
PCT/US2015/060481 2014-11-27 2015-11-12 A modular image capture device WO2016085669A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201462085247P 2014-11-27 2014-11-27
US62/085,247 2014-11-27

Publications (1)

Publication Number Publication Date
WO2016085669A1 true WO2016085669A1 (en) 2016-06-02


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108008595A (en) * 2017-11-28 2018-05-08 北京航天计量测试技术研究所 A kind of distributed infrared integrated optics system

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5880815A (en) * 1996-01-17 1999-03-09 Nec Corporation Image pickup apparatus capable of preventing overlap or lack of image
US20040027451A1 (en) * 2002-04-12 2004-02-12 Image Masters, Inc. Immersive imaging system
US20040246333A1 (en) * 2003-06-03 2004-12-09 Steuart Leonard P. (Skip) Digital 3D/360 degree camera system
US20080170123A1 (en) * 2007-01-12 2008-07-17 Jacob C Albertson Tracking a range of body movement based on 3d captured image streams of a user
US20100321475A1 (en) * 2008-01-23 2010-12-23 Phillip Cox System and method to quickly acquire three-dimensional images
US20130111464A1 (en) * 2011-10-27 2013-05-02 3Dmedia Corporation Modular and open platform image capture devices and related methods
US20130237148A1 (en) * 2012-03-12 2013-09-12 Research In Motion Limited Wireless local area network hotspot registration using near field communications
US20140104378A1 (en) * 2011-04-08 2014-04-17 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Capturing panoramic or semi-panoramic 3d scenes
US20140168386A1 (en) * 2012-12-18 2014-06-19 Delta Electronics, Inc. Projection system and projection method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15862935

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15862935

Country of ref document: EP

Kind code of ref document: A1