WO2018017625A1 - User interface for smart digital camera
- Publication number
- WO2018017625A1 (PCT/US2017/042685)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- combinations
- focus
- exposure
- auto
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/667—Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
Definitions
- An advantage of electronic viewfinders is that the user sees exactly what the camera's sensor sees, and the view of a scene is never obstructed when taking a photo (the view is momentarily blocked when taking photos on DSLR cameras).
- Some cameras also augment the EVF display in various ways, such as by highlighting areas in focus (focus 'peaking'), simulating the motion blur that will appear if a photo is taken, and automatically boosting brightness when shooting very dark scenes.
- Adjusting precapture settings and performing postcapture editing on typical DSLRs and mobile camera-enabled devices involves the inconvenience of taking one hand off the camera to make touch-screen or button-actuated adjustments to imaging parameters, which tends to temporarily destabilize the camera as an image capture device or as a viewer. It is desired to have a camera that allows camera users to smoothly and conveniently adjust precapture settings and perform postcapture editing.
- Figure 1 schematically illustrates a front perspective view of a first digital camera in accordance with certain embodiments.
- Figure 2 schematically illustrates a back perspective view of a digital camera with a movable viewfinder in accordance with certain embodiments.
- Figures 3A-3D schematically illustrate a linear slider for adjusting one or more image capture parameters and/or editing a captured image in accordance with certain embodiments.
- Figures 4A-4C schematically illustrate front views of example digital cameras that each include multiple LEDs for illuminating objects to be imaged in accordance with certain embodiments.
- Figure 5 schematically illustrates a perspective view of a digital camera that includes multiple microphones in accordance with certain embodiments.
- Figure 6 schematically illustrates a back view of a digital camera that includes a display screen and various buttons for image capture and/or editing control, including buttons for capture type control (e.g., video, time lapse, slow motion), secondary controls such as timer and flash, adjustment controls, global controls such as gallery, app store and settings, and a thumbnail of a previous image capture in accordance with certain embodiments.
- Figure 7 schematically illustrates a back view of a digital camera that includes a display screen and various buttons for image capture and/or editing control, including buttons for adjusting a time parameter and/or scrolling through a sequence of images, for selecting and editing various parameters using smart menus and a linear slider for adjusting, scrolling or showing a current time parameter disposed between a start time and an end time for the sequence of images, in accordance with certain embodiments.
- Figure 8 schematically illustrates a back view of a digital camera that includes a display screen and a smart reset button.
- Figure 9 schematically illustrates a back view of a digital camera that includes a display screen such as a touch screen, a smart button, a value indicator, smart correction and/or scrolling button, and a linear slider for adjusting parameters such as exposure, contrast, fill-flash, face priority and various other image capture and/or editing parameters, in accordance with certain embodiments.
- Figure 10 schematically illustrates a back view of a digital camera that includes a display screen showing a live image, a favorite select button, a delete select button, a global control button, and advanced edits and share buttons, in accordance with certain embodiments.
- Figure 11 schematically illustrates a back view of a digital camera that includes a display screen showing a feedback bubble that a user can accept, reject or ignore in accordance with certain embodiments.
- Figure 12 schematically illustrates a back view of a digital camera that includes a display screen and buttons for crop control and other adjustment controls, and a button for confirming a crop or other adjustment, and cancel and smart buttons, in accordance with certain embodiments.
- Figure 13 schematically illustrates a back view of a digital camera that includes a display screen and a timeline with indicators of original and current time values disposed between start and end times, and buttons for canceling to exit adjustment mode without saving and for confirming to save changes, and a smart button, in accordance with certain embodiments.
- Figure 14 schematically illustrates a back view of a digital camera that includes a display screen showing a selected image for sharing, and buttons for email, text, facebook, and networked second camera or other device, in accordance with certain embodiments.
- Figures 15A-15B schematically illustrate a back view of a digital camera that includes a display screen that shows a level guide that auto appears when the camera is not leveled and disappears when the level is restored in accordance with certain embodiments.
- Figure 16 schematically illustrates a digital camera display screen showing a column of menu or executable icons and a photographic scene overlayed by a column of translucent menu or executable icons in accordance with certain embodiments.
- Figure 17 schematically illustrates a digital camera display screen showing a column of menu or executable icons and a photographic scene overlayed by a first column of translucent menu or executable icons and a second column of translucent menu or executable text items in accordance with certain embodiments.
- Figure 18 schematically illustrates a digital camera display screen showing a photographic scene overlayed by two columns of translucent menu or executable icons and two columns of translucent menu or executable text items in accordance with certain embodiments.
- Figure 19 illustrates a digital camera display screen showing a photographic scene overlayed by multiple columns of translucent icons and indicating a depth selection as a highlighted row of icons across the multiple columns in accordance with certain embodiments.
- Figure 20 illustrates a digital camera display screen showing a photographic scene overlayed by multiple columns of translucent icons and showing a column of translucent, selectable motion option icons in accordance with certain embodiments.
- Figure 21 illustrates a display screen result of scrolling to and selecting one of the multiple icons within the column of translucent, selectable motion option icons of Figure 20, including adding highlighting of the selected motion option icon and extending a row of highlighted icons in accordance with certain embodiments.
- Figure 22 illustrates a digital camera display screen showing a photographic scene overlayed by multiple columns of translucent icons and showing a column of translucent, selectable light option icons in accordance with certain embodiments.
- Figure 23 illustrates a display screen result of scrolling to and selecting one of the multiple icons within the column of translucent, selectable light option icons of Figure 22, including adding highlighting of the selected light option icon and extending a row of highlighted icons in accordance with certain embodiments.
- Figure 24 illustrates a display screen result of selecting one of multiple translucent icons from a first column of translucent icons in accordance with certain embodiments, the selected icon being indicated by universal brightening or white or gray translucent overlaying of an icon- containing pixel area, while maintaining the highlighting illustrated at Figure 23 of an extended row of icons across an adjacent block of multiple other columns.
- Figure 25 illustrates a digital camera display screen result of selecting a "clear" icon in accordance with certain embodiments to remove the universal brightening or white or gray translucent overlaying of the icon-containing area illustrated at Figure 24.
- Figure 26 illustrates a display screen that is overlayed in certain embodiments by an additional column of translucent icons on the opposite side of the display screen from the other columns illustrated at Figure 22.
- Figure 27 illustrates a digital camera display screen in accordance with certain embodiments.
- Figure 28 illustrates a digital camera display screen in accordance with certain embodiments.
- Figure 29 illustrates a digital camera display screen including a translucent cursor overlay in accordance with certain embodiments, e.g., an unfilled circle appearing to enclose a circular pixel area of the display screen and showing a plus sign at a center of the circular pixel area.
- Figure 30 illustrates a digital camera display screen including a translucent cursor overlay as in Figure 29 and a translucent, broken-circle-shaped overlay that includes selectable depth, motion and light circle segments in accordance with certain embodiments.
- Figure 31 illustrates a display screen result in accordance with certain embodiments of selecting, indicated by highlighting, of a location within the depth segment of the translucent, broken-circle shaped overlay of Figure 30 that corresponds to a particular value of depth of field.
- Figure 32 illustrates a display screen result in accordance with certain embodiments of adjusting the value of depth of field by sliding to or otherwise selecting, indicated by highlighting, of a different location within the depth segment of the translucent, broken-circle shaped overlay of Figure 30.
- Figure 33 illustrates a digital camera display screen including multiple translucent columns that appear in certain embodiments indicating values of aperture from which a user may choose to manually select.
- Figure 34 illustrates a digital camera display screen including multiple translucent columns that appear in certain embodiments indicating values of ISO, or sensitivity to available light, from which a user may choose to manually select.
- Figure 35 illustrates a digital camera display screen including multiple translucent columns that appear in certain embodiments indicating values of shutter speed from which a user may choose to manually select.
- Figure 36 illustrates a digital camera display screen including multiple translucent columns that appear in certain embodiments indicating values of white balance from which a user may choose to manually select.
- Figure 37 illustrates a digital camera display screen showing an indication of recent capture of two minutes and fifteen seconds of video, the display screen also showing multiple translucent columns that appear in certain embodiments including a row of highlighted icons indicating a manually-selected value of 1/60 second as a camera shutter speed.
- Figure 38 illustrates a display screen result of removing the multiple translucent columns indicating a manual selection of a 1/60 second shutter speed as shown in Figure 37.
- Figure 39 illustrates a digital camera display screen including icons that a user may touch or otherwise select to execute video record capture, video record pause and video clip delete commands in accordance with certain embodiments.
- Figure 40 illustrates a camera display for a digital camera in a default camera state for a viewfinder mode with no user interface in accordance with certain embodiments.
- Figure 41 illustrates a camera display for a digital camera after a single tap instantiates appearance of a key zone object in accordance with certain embodiments.
- Figure 42 illustrates a camera display for a digital camera including a pinch and zoom feature to vary the size of a key zone object in accordance with certain embodiments.
- Figure 43 illustrates a camera display for a digital camera including a key zone controls panel for selecting light, focus and speed adjustment in accordance with certain embodiments.
- Figure 44 illustrates a camera display for a digital camera including a key zone adjustor panel for a light parameter selected from a key zone control panel in accordance with certain embodiments.
- Figure 45 illustrates a camera display for a digital camera upon user interaction with the key zone adjustor panel of Figure 44 to adjust a light parameter.
- Figure 46 illustrates a camera display for a digital camera upon user interaction with a key zone adjustor panel for a focus parameter selected from a key zone control panel in accordance with certain embodiments.
- Figure 47 illustrates a camera display for a digital camera upon user interaction with a key zone adjustor panel for a speed parameter selected from a key zone control panel in accordance with certain embodiments.
- Figure 48 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller and navigation panel of a main user interface without a key zone in accordance with certain embodiments.
- Figure 49 illustrates capture formats that may be selected in accordance with certain embodiments.
- Figure 50 illustrates a camera display for a digital camera including a selected secondary control setting panel next to a secondary control panel in accordance with certain embodiments.
- Figure 51 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller panel of a main user interface with a key zone in accordance with certain embodiments.
- Figure 52 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller panel of a main user interface with a key zone and a key zone controls panel in accordance with certain embodiments.
- Figure 53 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller panel of a main user interface with a key zone and a key zone controls panel and a key zone controls adjustor panel in accordance with certain embodiments.
- Figure 54 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller panel of a main user interface and a pre-capture filters pull-up panel in accordance with certain embodiments.
- Figure 55 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller panel of a main user interface and an activities pull-down panel in accordance with certain embodiments.
- Figure 56 illustrates a dashboard interface for accessing the gallery and the Android environment in accordance with certain embodiments.
- Figure 57 illustrates a camera display for a digital camera including a default beginning configuration for a gallery in accordance with certain embodiments.
- Figure 58 illustrates a camera display for a digital camera including a gallery with select mode active in accordance with certain embodiments.
- Figure 59 illustrates a camera display for a digital camera including an opened individual photo in a default state with options including an edit option in accordance with certain embodiments.
- Figure 60 illustrates a camera display for a digital camera including an opened individual photo with user interface dismissed in accordance with certain embodiments.
- Figure 61 illustrates a camera display for a digital camera including an opened individual photo and an edit options panel in accordance with certain embodiments.
- Figure 62 illustrates a camera display for a digital camera including an opened individual photo and a filters, effects and frames panel in accordance with certain embodiments.
- Figure 63 illustrates an android apps and environment screen in accordance with certain embodiments.
- Figure 64A illustrates an expert mode menu of adjustable primary control settings for shutter speed, aperture and ISO in accordance with certain embodiments.
- Figure 64B illustrates an expert mode menu of adjustable secondary control settings for white balance in accordance with certain embodiments.
- Figure 64C illustrates a camera display for a digital camera operating in an expert mode, including a selected secondary control setting panel next to a secondary control panel in accordance with certain embodiments.
- Figures 65A-65B illustrate a two-level user interface in accordance with certain embodiments including an example top level activities interface plug-in over a hidden bottom level interface including primary and secondary controls panels, a capture mode scroller and navigation panel, and a scene display.
- Figure 66A illustrates example steps for guided usage with a soft focus portrait interface plug-in in accordance with certain embodiments.
- Figure 66B illustrates example steps for guided usage with a wedding shoot setup interface plug-in in accordance with certain embodiments.
- Figures 67A-67B illustrate simple usage modes of a manual operation user interface for primary and secondary controls in accordance with certain embodiments.
- Figures 67C-67D illustrate expert usage modes of a manual operation user interface for primary and secondary controls in accordance with certain embodiments.
- Figure 1 schematically illustrates a front perspective view of a first digital camera in accordance with certain embodiments.
- the digital camera shown in Figure 1 includes a grip 2, a lens 4, a hot shoe 6 and a viewfinder 8. Although not shown in Figure 1, the camera may be equipped with flash illumination.
- the grip 2 includes a capacitive touch sensor 10 and battery compartment 12.
- the capacitive touch sensor 10 may be used for scrolling through a menu of processing functions or for moving a cursor on a display screen or for another function that is typically available to a user by way of a mouse or keypad of a computer or other processor-based device.
- the capacitive touch sensor 10 may be used as an image capture button that may have both full press shutter trigger and half press settings adjustment functionality.
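- As an illustration of the half-press/full-press behavior described above, the following minimal sketch models the sensor as a small state machine; the pressure thresholds, callback names and normalization are assumptions for illustration and are not taken from the disclosure.

```python
# Minimal sketch of half-press / full-press handling for a capacitive
# shutter sensor. Thresholds and callbacks are illustrative assumptions.

HALF_PRESS = 0.3   # normalized pressure that locks focus/exposure
FULL_PRESS = 0.8   # normalized pressure that triggers capture

class ShutterSensor:
    def __init__(self, on_half_press, on_full_press, on_release):
        self.on_half_press = on_half_press
        self.on_full_press = on_full_press
        self.on_release = on_release
        self.state = "idle"

    def update(self, pressure):
        """Feed a normalized pressure reading (0.0-1.0) from the sensor."""
        if self.state == "idle" and pressure >= HALF_PRESS:
            self.state = "half"
            self.on_half_press()          # e.g. run autofocus / autoexposure
        if self.state == "half" and pressure >= FULL_PRESS:
            self.state = "full"
            self.on_full_press()          # e.g. trigger image capture
        if self.state in ("half", "full") and pressure < HALF_PRESS:
            self.state = "idle"
            self.on_release()

# Example usage with placeholder callbacks
sensor = ShutterSensor(
    on_half_press=lambda: print("lock focus/exposure"),
    on_full_press=lambda: print("capture image"),
    on_release=lambda: print("reset"),
)
for p in (0.1, 0.4, 0.9, 0.0):
    sensor.update(p)
```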
- the lens 4 may be replaceable with one or more other lenses having different optical properties.
- the lens 4 may be movable relative to an image sensor of the digital camera.
- the camera may include a CCD sensor cover that slides over the CCD sensor to keep dust off the sensor and to prevent it from being physically touched, e.g., when replacing a lens 4.
- the lens 4 may be one of multiple lenses contained within a lens holder 14.
- the lens 4 may be movable relative to one or more other lenses contained within the lens holder 14, and multiple lenses may be movable together relative to the image sensor.
- the hot shoe 6 includes a mechanical and/or electrical coupling interface for a peripheral such as a secondary flash or a secondary image capture device or SICD.
- the secondary flash or SICD may be directly coupled to the digital camera housing at the hot shoe interface 6.
- Bluetooth or other wireless coupling interface may be included at the hot shoe 6 or otherwise within the digital camera for coupling the camera to a secondary display, secondary or primary flash or SICD, or secondary image processing or file sharing device.
- the viewfinder 8 is shown in a stowed or inactive position.
- the viewfinder 8 of Figure 1 is configured to be moveable, e.g., rotatable, between active and inactive positions.
- Figure 2 schematically illustrates a back perspective view of a digital camera with a movable viewfinder in accordance with certain embodiments.
- Figure 2 shows a back perspective view of a digital camera that includes a grip 42, hot shoe 46, viewfinder 48, lens holder 54, display 56 and compartment access door 58.
- the viewfinder 48 is shown in two positions in Figure 2. In a first stowed or inactive position A, the viewfinder 48 is out of the way of the display 56 and stowed similar to the viewfinder 8 illustrated schematically in Figure 1. In a second active position B, the viewfinder 48 is overlapping a portion of the display 56. In the example of Figure 2, an upper left corner section of the display 56 is overlapped by the viewfinder 48 when in the active position B.
- the viewfinder 48 may be moved between positions A and B by rotation about an axis that is approximately normal to the optical axis of the digital camera.
- the movement of the viewfinder 48 from the stowed position A to the active position B may in certain embodiments trigger a thumbnail to appear on the overlapped portion of the display for viewing through a viewfinder window 60 an approximately same or similar image as may be viewed on the display 56 when the viewfinder 48 is stowed, and as may be captured by full-pressing the image capture button (not shown in Figure 2, but see element 10 of Figure 1).
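- The behavior described above can be sketched as a simple layout decision: when the viewfinder is detected in the active position, the live preview is redrawn as a thumbnail in the overlapped corner. The region sizes, corner names and dimensions below are illustrative assumptions.

```python
# Illustrative sketch: when the viewfinder swings over a corner of the rear
# display, draw the live preview as a thumbnail in that corner so it can be
# viewed through the viewfinder window. Region sizes are assumed values.

def preview_region(display_w, display_h, viewfinder_active,
                   corner="top_left", scale=0.25):
    """Return (x, y, w, h) of the area where the live preview is drawn."""
    if not viewfinder_active:
        # Full-screen preview when the viewfinder is stowed.
        return (0, 0, display_w, display_h)
    thumb_w, thumb_h = int(display_w * scale), int(display_h * scale)
    if corner == "top_left":        # corner overlapped in Figure 2, position B
        return (0, 0, thumb_w, thumb_h)
    if corner == "top_center":      # e.g. a viewfinder attached at the hot shoe
        return ((display_w - thumb_w) // 2, 0, thumb_w, thumb_h)
    raise ValueError("unsupported corner: " + corner)

print(preview_region(1280, 720, viewfinder_active=False))
print(preview_region(1280, 720, viewfinder_active=True))
```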
- Figure 2 also shows an eyebrow rest 62 to assist the user to position and stabilize his or her eye when using the viewfinder 48.
- a rear screen 56 of the digital camera may be disposed at an image plane, while the viewfinder 48 may be configured such that the window 60 or a lens inside the window 60 has a significant magnification, like a magnifying glass, to serve as a loupe for viewing the image on the display 56.
- the image plane may also be a separate CCD, CMOS or other image detector, such that the image data may be processed through an ISP or other processor and provided as a thumbnail or small image on the screen 56 or portion thereof which is viewable through the viewfinder 48.
- the viewfinder 48 can be retracted when the photographer wants to use the viewfinder 48 and moved aside when the user would like a full view of the screen 56.
- the viewfinder 48 may be adjustable to suit the distinct eyesight of one or more individual viewers.
- the viewfinder 48 can use various areas of the screen depending on the resolution that is selected automatically by the camera or manually by a user.
- the screen 56 can automatically adjust based on detection of when the viewfinder 48 is placed in position to provide the viewfinder image and when the viewfinder is stowed to the side of the screen 56.
- the viewfinder 48 may be assembled as part of a digital camera, as shown in Figure 2, or may be selectably attached and removed as a peripheral device.
- the attachment of the viewfinder can be performed in certain embodiments by sliding the viewfinder into the hot shoe 46. In the case of attachment of the viewfinder 48 to the hot shoe 46, an image may be provided at the center-top of the screen 56 beneath the hot shoe 46 in the example of Figure 2 for viewing through viewfinder 48.
- the position of the hot shoe 46 may be anywhere around the camera periphery and the image may be provided at a screen location proximate or adjacent or convenient to the location of the hot shoe 46.
- the viewfinder may be configured to be adjustable such that different screen locations may be viewed through it.
- the viewfinder includes a hinged extension arm that folds out and may be rotated using a ball bearing coupling to view any or most any or a substantial or significant amount of selected screen portions.
- the viewfinder 48 may be selectably stowed at position A or put into position B for use by a hinge mechanism with locking recesses at positions A and B.
- the viewfinder 48 and grip 42 may be interchangeable either left and right or right and left to accommodate different dominant eyes of users.
- the viewfinder 48 in certain embodiments is designed with blinders or polarization filters or baffling or reflectors on the sides so that stray light is prevented from penetrating from the sides to advantageously provide a better contrast ratio.
- the viewfinder 48 may have a rubber cup eye socket interface (not shown) to stabilize the user at the viewfinder and reduce stray light.
- the viewfinder 48 can be adjusted in certain embodiments to multiple different magnifications, and in embodiments having less versatility in the selection of magnification, one or more image parameters may alternatively be adjustable.
- a viewfinder view may be observable through a translucent display screen portion.
- a portion of the display or the entire display may be translucent such that images are viewable in a viewing mode, while the translucent display or translucent display portion may provide a view of the scene through the viewfinder in a viewfinder mode through the translucent display or translucent display portion.
- a viewfinder may include one or more lenses or curved mirrors, and may be multi-focal or anamorphic or asymmetric or non-symmetric or aspherical in magnification or optical power, or may be customized in accordance with an optical prescription of a particular camera user.
- the viewfinder may include a mirror for changing a direction of viewing relative to the direction of the object.
- the viewfinder may have a half-silvered mirror for partially viewing two different regions of a scene, or of two scenes to be combined or captured in succession.
- the viewfinder may have a pair of mirrors.
- a periscopic or telescopic attachment may be available as an accessory for coupling into the optical path of the viewfinder in certain embodiments.
- Figures 3A-3C schematically illustrate examples of touch slider display objects that, in a first example, a user may view on the display screen 16 while actuating a touch slider with a thumb or finger, or that, in a second example, a user may both view and actuate on the touch screen as a displayed touch slider object, for selecting and adjusting an imaging parameter in accordance with certain embodiments.
- Any touch slider other than a touch screen display object slider may include multiple posts or pegs formed together in an array of pixels disposed at an accessible area of the camera housing, either in an overall touch slider recess or in two or more touch slider region recesses; alternatively, each post or peg may recess into its own individual post or peg recess when not in use, and then protrude out of the housing when a user decides to use the touch slider.
- a slider may include a fixed touchpad surface.
- the touch slider 120 illustrated schematically in Figure 3A is divided into four regions: flash 122, exposure 124, focus 126 and auto/smart 128.
- the number of touch slider regions may be more or less than four and the regions 122, 124, 126, 128 may be disposed in a circular shape or in another curved shape or in a linear or rectangular shaped region, and the sub-regions may be polygonal or curved in shape while the overall touch sensor region may be shaped differently.
- the touch slider 120 may overlap a preview of an image on the display screen or may be disposed to the side or above or below a preview image on the display screen or there may be separate display screens for the user interface and preview or postcapture images.
- a user may initiate an adjustment of flash, exposure or focus or another parameter by tapping the touch slider region designated for the parameter that is to be adjusted.
- the touch slider changes to a different touch slider 130, such as that shown in Figure 3B, for adjusting a value of the selected parameter. Tapping the auto/smart region 128 of the touch slider 120 would leave the parameter at the default settings or an auto/smart mode.
- the user can tap one of the numbers shown in the example slider 130 of Figure 3B to adjust the value of the selected parameter by the indicated amount, e.g., +2 or -1.
- the user may in certain embodiments use a touch screen display slider object 130 by sliding a finger or thumb in one direction to increase the value of the parameter or in the opposite direction to reduce the value of the parameter.
- the touch slider display object 130 may alternatively show actual values of the parameter that may be selected directly by tapping the slider, or by finger or thumb sliding left or right to respectively decrease or increase the value of the parameter by an amount proportional to the sliding distance or another quantity that may be detected or computed for the finger or thumb movement, such as slide speed or downward pressure.
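- A minimal sketch of the two-stage interaction described for Figures 3A-3B follows: a tap on the selection slider picks a parameter, and relative sliding then changes its value proportionally. The region mapping, gain and value ranges are assumptions for illustration.

```python
# Sketch of the two-stage slider interaction: a tap on the selection slider
# (Figure 3A) picks a parameter, then relative sliding on the value slider
# (Figure 3B) changes it proportionally. Gains and ranges are assumptions.

REGIONS = ["flash", "exposure", "focus", "auto"]   # regions 122-128

class TwoStageSlider:
    def __init__(self, gain=0.01):
        self.gain = gain            # parameter change per unit slide distance
        self.selected = None
        self.values = {"flash": 0.0, "exposure": 0.0, "focus": 0.0}

    def tap(self, x, slider_len=100):
        """Tap at position x along the selection slider picks a region."""
        region = REGIONS[min(int(x / slider_len * len(REGIONS)), len(REGIONS) - 1)]
        if region == "auto":
            self.selected = None    # leave settings to the auto/smart defaults
        else:
            self.selected = region
        return self.selected

    def slide(self, dx):
        """Relative slide dx adjusts the selected parameter proportionally."""
        if self.selected is not None:
            self.values[self.selected] += dx * self.gain
        return self.values

slider = TwoStageSlider()
slider.tap(30)                      # lands in the 'exposure' region
print(slider.slide(+80))            # slide right to raise exposure
print(slider.slide(-40))            # slide left to lower it
```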
- Figure 3C illustrates a view through a viewfinder, e.g., viewfinder 48 of Figure 2.
- An image 136 appears in the viewfinder illustrated in the example of Figure 3C.
- a touch slider display object 120 is shown just above the image 136 in Figure 3C, and a touch slider display object 130 is shown just below the image 136.
- one touch slider 120 or 130 would appear at a time, respectively, for selecting a parameter to adjust or for adjusting a selected parameter.
- a touch slider object may be divided functionally into two or more regions, including a region operating in accordance with touch slider 120 and a region operating in accordance with touch slider 130.
- one slider 120 may operate in accordance with touch slider 120, while another separate slider 130 may operate in accordance with touch slider 130.
- Figure 3D illustrates another view through a viewfinder, e.g., viewfinder 48 of Figure 2.
- An image 136 appears in the viewfinder illustrated in the example of Figure 3D.
- a touch slider display object 120 is shown near the top overlapping the image 136 in Figure 3D and a touch slider display object 130 is shown near the bottom also overlapping the image 136.
- there may be three touch slider display objects, e.g., one for focus, aperture and/or depth of field, one for brightness or exposure, and one for motion blur or shutter duration control.
- Various numbers of touch slider display objects may be provided each corresponding to a different parameter that is amenable to manual user pre-capture or post-capture control.
- the objects 120, 130 in the example of Figure 3D may be translucent so that the image can be seen even where the display object 120, 130 also occupies a same display screen portion.
- one touch slider 120 or 130 would appear at a time, respectively, e.g., for first selecting a parameter to adjust and for next adjusting the selected parameter or for first adjusting a first parameter and for next adjusting a second parameter (then a third parameter, etc.).
- the touch sliders 120, 130 may be embodied in an array of touch sensitive elements coupled onto a digital camera housing or exposed through a cavity or recess in a digital camera housing, or provided as an object on a touch sensitive digital camera display screen, or combinations thereof.
- a camera processor is programmed to interpret a touching, tapping or sensed proximity of a finger, thumb or stylus or other tool of a user, or some combination thereof, to a specific region of the touch slider as a user command to initiate a process for adjusting a value of a specific imaging parameter.
- a length or duration of a sliding movement or double tap time, or a tap pressure, or a sliding movement between specific regions, or another sensed movement or characteristic of a sensed movement, such as an area of a closed path, may be assigned to a specific imaging parameter.
- Imaging parameters may include precapture settings for the digital camera such as an intensity of flash or other light source illumination, a selection of one or more of multiple available flash choices such as a xenon or krypton flash and one or more LEDs, and/or a duration or sequence or direction or spectral range or divergence or whether to use a Fresnel lens, or a length of exposure, or aperture size, or selection of a single or multiple still image capture, or one of multiple video capture modes, or a specific audio capture mode such as selecting from multiple available microphones, wavelength ranges to include or exclude, microphone direction, stereo balance or other available audio options, or a parameter that may be adjusted by altering a configuration of the optics of the camera, e.g., a focus or zoom setting may be adjusted by moving a lens relative to the image sensor, or magnification of a viewfinder may be adjusted by moving a magnifying lens within the viewfinder, or a parameter of a captured image such as exposure, contrast, brightness, focus distance, depth of field, or white balance.
- an elongated slider 120 has been separated into four regions along its length.
- the four regions of the slider 120 in Figure 3A are labeled flash, exposure, focus and smart/auto mode.
- the user may tap the exposure region, e.g., and a touch slider object 130 would show exposure values ordered from low to high values within some reasonable number of regions of the touch slider 120.
- a region may be then tapped which would adjust the exposure to the value provided in that region, or a sliding movement may be used to raise or lower the exposure value by a proportional amount to the distance, speed, pressure, duration or other determinable characteristic of the relative movement sensed by the touch slider 130.
- a touch slider may be deemed or referred to as a linear slider in certain embodiments wherein a camera user may adjust a value of a selected imaging parameter in an amount that is proportional to a relative movement along a directional axis defined within the plane of the slider surface such as a sliding distance of a user's finger along an axis defined in the plane of the linear slider.
- the slider 120, 130 may have a width as small as a single pixel such that relative movements can only be detected in one direction along a single axis of the slider.
- Two or more touch sensitive pixels may be provided in certain embodiments along a second directional axis of the slider 120, 130 such that relative finger or thumb motion may be detected along two axes that define a plane or other contour of the camera housing surface where the slider is located.
- a touch screen object slider 120, 130 or touchpad slider may have an elongated shape in certain embodiments or a circular, elliptical, square or other polygon or closed shape having some combination of curved and straight segments.
- a touch slider may also be coupled to the camera housing or configured as a display object that functions like a mouse wheel.
- the mouse wheel slider may appear like a mouse wheel.
- using a clickable mouse wheel, a user may manually select an imaging or editing parameter by turning the wheel to scroll through parameters, clicking the mouse wheel on a parameter to open a menu of values, scrolling again with the mouse wheel to a value, and clicking the wheel again to select the value.
- a double click of the wheel may return the camera back to an auto mode.
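- The wheel-and-click flow described above can be sketched as a small menu state machine: scroll to a parameter, click to open its values, scroll to a value, click to apply, double-click to return to auto. The parameter names and values below are illustrative assumptions.

```python
# Sketch of the wheel-and-click flow described above: scroll to a parameter,
# click to open its values, scroll to a value, click to apply, double-click
# to return to auto mode. Parameter lists and values are illustrative.

PARAMS = {"exposure": [-2, -1, 0, 1, 2], "focus": [1, 2, 5, 10], "flash": [0, 1]}

class WheelUI:
    def __init__(self):
        self.mode = "params"            # or "values"
        self.param_idx = 0
        self.value_idx = 0
        self.applied = {}

    def scroll(self, steps):
        if self.mode == "params":
            self.param_idx = (self.param_idx + steps) % len(PARAMS)
        else:
            values = list(PARAMS.values())[self.param_idx]
            self.value_idx = (self.value_idx + steps) % len(values)

    def click(self):
        name = list(PARAMS)[self.param_idx]
        if self.mode == "params":       # open the value menu for this parameter
            self.mode, self.value_idx = "values", 0
        else:                           # apply the highlighted value
            self.applied[name] = PARAMS[name][self.value_idx]
            self.mode = "params"

    def double_click(self):
        self.applied.clear()            # back to auto mode
        self.mode = "params"

ui = WheelUI()
ui.scroll(1); ui.click()                # select 'focus', open its values
ui.scroll(2); ui.click()                # pick the third focus value
print(ui.applied)                       # {'focus': 5}
```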
- a mouse wheel and/or left-click, right-click and/or single click button may be provided on the camera housing, e.g., on the top or back or side of the grip or on the opposite side of the camera.
- Combinations of a mouse wheel and clickable region are also provided in certain embodiments on the camera housing.
- combinations of a mouse wheel and/or clickable region are provided along with a touchpad slider on the camera housing and/or as display objects or as mix of one or more display objects and touch, click and/or wheel regions on the camera housing.
- Quantities associated with a third dimension normal to the plane of the slider 120, 130 which is coplanar with the camera housing surface and/or with the display screen surface, such as downward force or pressure or proximity, may be utilized by assigning certain commands to them in certain embodiments.
- a touch slider 120, 130 in accordance with certain embodiments may have the functionality of a mouse, joystick, or game controller or may be limited to a short list of imaging parameters as in the illustrative example of Figure 2 or something in between.
- the digital camera may be programmed to process a tap in a same or similar manner as a mouse click and to process a relative movement of a finger or thumb of a user or a stylus or other tool held by a user along the length or within the area of the slider in a same or similar manner as a movement of a mouse.
- a touch slider may be located at the top of the camera housing or the front of the camera housing, or the rear of the camera housing or grip, or on a touch screen display, and may be located on either side of the camera.
- the touch slider can be activated using a haptic mechanism such as a touch screen or a touch slider haptic mechanism.
- the camera may be configured for finger or thumb actuated haptic activation of the touch slider.
- An imaging parameter may be adjusted using the touch slider as a single parameter adjustment axis, or correction may be performed using the touch slider as a complex combination of some of the parameters above.
- the touch slider may be used in conjunction with a duplicate visual display and/or may be functionally divided into two regions: one object on the display and/or one region of the touch slider being configured for selecting a mode of correction and the other object on the display and/or other region of the touch slider being configured for selecting a quantity of correction.
- both of the objects illustrated in Figures 3A and 3B may be provided together at the same time and/or side by side on the camera display and/or the touch sliders 120, 130 may be functionally separated into a mode selection region and a quantity of correction region (e.g., upper half and lower half or left side and right side).
- a digital camera in accordance with certain embodiments may include a pair of touch sliders that are functionally distinguished as a mode selection slider and a quantity of correction slider.
- a fingerprint reader may be included as part of one of the touchpad or touchscreen sensors for security.
- the camera may include a rear camera that also provides some security such as by identifying the camera user and/or by monitoring and/or recording and analyzing what is behind the camera user, and, e.g., signaling the user or sounding an alarm when a dangerous condition is detected.
- the display may be configured to flip from side to side depending on which camera's images are desired to be on a single sided display or double-sided display or in picture-in-picture or side-by-side format, or if only one camera's images are to be selected to be displayed.
- a touchpad sensor, or mouse wheel, or touchscreen object or other built-in device coupled accessibly at an exterior location of the camera housing may be used for scrolling through menus and executing programs and otherwise controlling the camera manually through the user interface appearing on the display.
- an autofocus position may be selected within an image and/or amongst the display screen pixels in a set autofocus mode or otherwise during a precapture mode of the camera when a touch pad surface is mapped proportionally to the display screen.
- a location on the touchpad may be tapped causing a location on the display screen and/or within an image to be captured or video being captured to be set as a user-selected auto-focus location.
- more than one auto-focus location can be selected, such as a primary and secondary or multiple faces in no particular order of importance.
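- A minimal sketch of mapping the touchpad proportionally to the display screen and recording one or more tapped auto-focus locations follows; the touchpad and screen dimensions are assumed values.

```python
# Sketch of mapping a tap on the touchpad to an autofocus location on the
# display, as described above. Touchpad and screen dimensions are assumed,
# and multiple AF points are kept in tap order.

def pad_to_screen(tap_x, tap_y, pad_w, pad_h, screen_w, screen_h):
    """Proportionally map touchpad coordinates to display pixel coordinates."""
    return (int(tap_x / pad_w * screen_w), int(tap_y / pad_h * screen_h))

af_points = []                              # primary first, then secondary, ...

def on_pad_tap(tap_x, tap_y):
    point = pad_to_screen(tap_x, tap_y, pad_w=60, pad_h=40,
                          screen_w=1280, screen_h=720)
    af_points.append(point)                 # more than one AF location allowed
    return point

print(on_pad_tap(30, 20))                   # center of pad -> center of screen
print(on_pad_tap(15, 10))                   # a second, upper-left AF point
print(af_points)
```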
- Figures 4A-4C schematically illustrate front views of example digital cameras that each include multiple LEDs 141A for illuminating objects to be imaged in accordance with certain embodiments.
- the example illustration of Figure 4A shows four LEDs 141A disposed across the front of the camera.
- multiple LEDs 141A are disposed across the front of the camera and behind an elongated Fresnel lens.
- a camera in accordance with a multiple LED embodiment may include as few as two LEDs 141A that may be built-in to the camera or attachable at a hot shoe bracket or detachable for adjusting a position or angle of illumination during image capture, or remotely controlled by the camera as a peripheral accessory.
- the LEDs 141A may be relatively disposed in various ways and embodiments of digital cameras herein generally may include no flash LEDs or any number of LEDs as flash illumination components.
- a microphone 141B is also shown in the example of Figure 4A to the right of the lens 114.
- three microphones 141B are disposed in the plane of the front surface of the camera that form a triangle such as a right triangle or otherwise to permit, e.g., stereo audio sound recording when the camera is in different orientations such as landscape and portrait orientations.
- the LEDs 141A may be clustered in groups as illustrated schematically in Figure 4B.
- two groups 143 of two LEDs 141A are disposed to the left and right of a third group 145 of three LEDs 141A.
- the LEDs 141A of any of the groups 143, 145 may be selected to complement other LEDs 141A in the group.
- a group 143, 145 may include LEDs that offer different wavelength spectra, different color temperature, different intensity, different divergence (spot, wide), different duration, or may be positioned at a slightly different angle.
- the LEDs 141A of a group 143, 145 may have different delays so that the LEDs 141A flash at slightly offset times.
- the LEDs 141A of a group 143, 145 may be activated at different times to allow the capture of multiple images each illuminated by one LED flash 141A or a subset of LEDs 141A. In one embodiment, these multiple images may be captured and/or illuminated using various light angles.
- the groups 143, 145 may be activated at a same time or at different times or for different durations, e.g., to provide a "back to front flash" effect.
- a specific selection of LEDs 141A and/or groups 143, 145 of LEDs 141A to use in capturing a specific image may provide specific fill light characteristics for the image.
- the camera may be configured to determine depth by using the disparity of the lights, e.g., to generate a 3D image or to adjust focus.
- a movement effect may be created by activating multiple LEDs 141A in a particular order or sequence while capturing multiple images each with a different LED 141A lighting the scene.
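- The sequenced-LED capture described above can be sketched as a loop that fires one LED (or a subset of LEDs) per frame of a burst; the camera and LED driver objects below are placeholders standing in for whatever hardware layer a real implementation would use.

```python
# Sketch of capturing a burst where each frame is lit by a different LED (or
# subset of LEDs) fired in a chosen order, as described above. The camera and
# LED driver objects are placeholders for whatever hardware layer is present.

import time

def capture_light_sequence(camera, leds, order, exposure_s=0.01):
    """Capture one frame per entry in `order`, each lit from a different angle.

    `order` is a list of LED index tuples, e.g. [(0,), (1,), (2, 3)].
    Returns the list of captured frames for later combination (fill light,
    depth-from-disparity, or a movement effect).
    """
    frames = []
    for subset in order:
        for i in subset:
            leds[i].on()
        frames.append(camera.capture(exposure_s))
        for i in subset:
            leds[i].off()
        time.sleep(0.005)           # small gap between flashes (assumed value)
    return frames

# Minimal stand-ins so the sketch runs without hardware.
class FakeLED:
    def on(self): pass
    def off(self): pass

class FakeCamera:
    def capture(self, exposure_s):
        return {"exposure_s": exposure_s}

frames = capture_light_sequence(FakeCamera(), [FakeLED() for _ in range(4)],
                                order=[(0,), (1,), (2, 3)])
print(len(frames), "frames captured")
```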
- the camera may be programmed to automatically determine a desired fill light based on an analysis of preview images. This determination may depend on a location of a main light source such as the sun, an external light or a main camera flash, a direction of shadows, overall color balance, and/or other parameters.
- One or more LEDs 141A may be used in combination with a xenon flash to provide short/long lighting.
- a xenon flash may be attachable at a hot shoe bracket or built-in such as a pop-up flash 129.
- Figure 4C schematically illustrates a digital camera with multiple LEDs 141A disposed on the camera lens holder around the optical path of the camera at the periphery of a light collecting area of the lens at the object end of the lens holder.
- Six LEDs 141A are shown in Figure 4C disposed on the lens holder 124, although any number of LEDs 141A may be disposed on the lens holder 124 in various embodiments.
- One or more LEDs 141A may be disposed on the lens holder as in Figure 4C, while one or more LEDs 141A may be disposed on the camera housing such as in Figures 4A-4B.
- LEDs 141A may be disposed within recesses defined in the housing when not in use.
- When an LED 141A is to be used to provide illumination during image capture, the LED 141A may protrude out of the recess to provide the illumination during the image capture and then recede back into the recess.
- An optional pop-up flash 129 may also be configured to recede into the housing when not in use.
- Another digital camera includes an image sensor within a camera housing, an optical assembly including one or more lenses for forming images on the image sensor, a display screen for viewing the images, a processor and a lens mounted flash coupled to the lens housing for providing illumination during image capture.
- Figure 5 schematically illustrates a perspective view of a digital camera that includes multiple microphones 160 in accordance with certain embodiments.
- the example of Figure 5 includes three microphones 160 that are directed for receiving sounds from the front of the camera and one microphone that is directed for receiving sounds from the rear of the camera.
- One of the front-facing microphones 160 is located near the grip 2 and near the top of the camera, and may be otherwise disposed within the grip and/or nearer the bottom of the camera.
- Another front-facing microphone 160 is disposed at the top-right of the front surface of the camera in the example of Figure 5.
- This microphone 160 may be disposed within a viewfinder 108 such as in the example of Figure 2, such that the microphone would face front when the viewfinder 108 is stowed and would face to the side when the viewfinder 108 is in use.
- the microphone 160 may be disposed below and/or to the left of the viewfinder 108 such as to face front notwithstanding the configuration of the viewfinder 108.
- a third front-facing microphone 160 is disposed at the bottom-right of the front surface of the camera.
- the rear-facing microphone 160 is disposed between two front facing microphones 160 near the top of the camera in the example of Figure 5. This rear-facing microphone 160 may be located below and/or to the side of a hot shoe bracket.
- Various embodiments of digital cameras are provided that include multiple microphones aligned with the optical assembly to record sound during image capture.
- the camera includes at least three positioned microphones to generate the horizontal disparity in both portrait and landscape mode. Each pair of spaced microphones is disposed to capture stereo sound and signal processing may be used to further separate the right and left channels by subtracting left information from the right channel and vice versa.
- microphones A and B can be used for landscape mode.
- microphones B and C may be used.
- microphones A and C may be used.
- Rotating the camera from landscape to portrait may be monitored by an accelerometer.
- the microphone pair being used during a sound recording may be changed one or more times as changes in camera orientation are determined by the accelerometer.
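- A minimal sketch of selecting the stereo microphone pair from accelerometer-reported orientation follows, using the A/B, B/C and A/C pairings described above; the roll-angle thresholds are assumptions.

```python
# Sketch of picking a stereo microphone pair from the three front-facing
# microphones (A, B, C in Figure 5) based on camera orientation reported by
# the accelerometer. The pair assignments follow the text above; the angle
# thresholds are assumptions.

def select_mic_pair(roll_degrees):
    """Return the microphone pair that gives horizontal stereo disparity."""
    roll = roll_degrees % 360
    if roll < 45 or roll >= 315:
        return ("A", "B")          # landscape
    if 45 <= roll < 135:
        return ("B", "C")          # portrait, rotated one way
    if 225 <= roll < 315:
        return ("A", "C")          # portrait, rotated the other way
    return ("A", "B")              # upside-down landscape: reuse A and B

for angle in (0, 90, 180, 270):
    print(angle, select_mic_pair(angle))
```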
- the camera may include three front-facing unidirectional microphones, or the A, B and C mics in Figure 5, in order to provide a stereo image with low background noise. Unlike omnidirectional microphones, unidirectional microphones do not pick up sound well in their rear direction.
- a rear facing directional microphone is also included in certain embodiments, e.g., as illustrated schematically at Figure 5.
- the rear facing mic, or the "Z" mic may be used to pick up the voice of the camera user or other sound coming from the rear of the camera.
- the Z microphone can be omnidirectional, or unidirectional or bidirectional.
- the level of user pickup can be altered in certain embodiments by changing the sensitivity of this rear facing microphone.
- the polarity of microphone Z may be selectively altered from positive to negative to further cancel the sound from the camera user during those times when the camera user does not want their voice recorded. Alternately the sound from the user's voice may simply be subtracted from the other microphones using an algorithm.
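- A simple sketch of subtracting the rear ("Z") microphone signal from the front channels to reduce the camera user's voice follows; the fixed cancellation gain is an assumption, and a real implementation would likely estimate it adaptively.

```python
# Sketch of reducing the camera user's voice in the recording by subtracting
# a scaled copy of the rear-facing ("Z") microphone signal from the front
# channels, per the text above. The cancellation gain is an assumed constant.

def cancel_rear_pickup(left, right, rear, gain=0.6):
    """Subtract rear-mic leakage from the left and right front channels."""
    clean_left = [l - gain * z for l, z in zip(left, rear)]
    clean_right = [r - gain * z for r, z in zip(right, rear)]
    return clean_left, clean_right

left  = [0.2, 0.5, 0.1]
right = [0.1, 0.4, 0.2]
rear  = [0.1, 0.3, 0.1]     # mostly the user's voice
print(cancel_rear_pickup(left, right, rear))
```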
- the rear facing mic, or Z mic, or a different internal mic may be used to cancel a certain type of noise or certain types of noises such as lens motor noise, shutter noise and/or camera handling noise.
- Sound gathering and/or sound forming may be provided automatically or with manual touch screen input based on face or other object detection, tracking and/or recognition including blind source recognition, and/or on motion detection. For example, a moving face or other object may be tracked and the camera audio may be dynamically adjusted based on information gathered during the tracking, such as direction and volume of sounds.
- Three unidirectional electret microphones may be used to capture sound from the front of the camera as illustrated at Figure 5.
- Figures 6-15 schematically illustrate digital cameras that are programmed to capture images that have desired quality characteristics. Precapture settings may be adjusted before capture.
- Captured images may also be edited or combined to form new or processed images, and sequences of images may be captured as video or to enhance still image quality.
- An advantageous user interface, image processor and program code embedded on storage media within the digital camera housing facilitate the capture and processing of quality images and video, as well as the display, storage and transmission of those quality images and video.
- Examples are provided and schematically illustrated in Figures 6-15 of images of various objects and user interface tools that may be provided on a rear display screen of a digital camera that is configured in accordance with certain embodiments to receive user input by manipulation of one or more touch sliders, e.g., as illustrated at Figures 3A-3D, by manipulating a touch screen display slider object or a touchpad on the camera housing.
- Figure 6 schematically illustrates a back view of a digital camera that includes a display screen and various buttons for image capture and/or editing control, including buttons for capture type control (e.g., video, time lapse, slow motion, panorama, 3D, cinemagraph, 3D audio, moment), secondary controls such as timer and flash, adjustment controls, global controls such as gallery, app store and settings, and a thumbnail of a previous image capture in accordance with certain embodiments.
- Figure 7 schematically illustrates a back view of a digital camera that includes a display screen and various buttons for image capture and/or editing control, including buttons for adjusting a time parameter and/or scrolling through a sequence of images, for selecting and editing various parameters using smart menus and a touch slider or linear slider for selecting an image parameter for adjustment and then adjusting the image parameter, and/or for scrolling or showing a current time parameter disposed between a start time and an end time for the sequence of images.
- in certain embodiments the slider object changes between parameter selecting and adjusting modes, while in other embodiments two different slider objects appear on the display.
- Figure 8 schematically illustrates a back view of a digital camera that includes a display screen and a smart reset button.
- Figure 9 schematically illustrates a back view of a digital camera that includes a display screen such as a touch screen, a smart button, a value indicator, smart correction and/or scrolling button, and a linear slider or touch slider for adjusting parameters such as exposure, contrast, fill-flash, face priority, brightness, focus, and various other image capture and/or editing parameters, in accordance with certain embodiments.
- the camera is programmed to provide image quality alerts as a smart capture feature.
- the camera will notify a user that a specific parameter is poor, e.g., the captured image may be too dark or too blurry.
- One or more thumbnails of recent images or shots captured may have frames of different colors based on image quality, e.g., red for poor, yellow for so-so, and green for good.
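- The color-coded quality alert can be sketched as a simple grading function over brightness and sharpness measures; the metrics and thresholds below are illustrative assumptions rather than the camera's actual scoring.

```python
# Sketch of the color-coded quality alert: grade a captured frame from simple
# brightness and sharpness measures and return a thumbnail frame color. The
# metrics and thresholds here are illustrative assumptions.

def grade_capture(mean_luma, sharpness):
    """Return 'red', 'yellow' or 'green' for the thumbnail frame."""
    too_dark   = mean_luma < 40          # 8-bit luma average
    too_bright = mean_luma > 230
    blurry     = sharpness < 0.15        # normalized edge-energy score
    problems = sum([too_dark, too_bright, blurry])
    if problems >= 2:
        return "red"                     # poor: e.g. dark *and* blurry
    if problems == 1:
        return "yellow"                  # so-so: one issue to warn about
    return "green"                       # good

print(grade_capture(mean_luma=25, sharpness=0.05))   # red
print(grade_capture(mean_luma=120, sharpness=0.10))  # yellow
print(grade_capture(mean_luma=120, sharpness=0.40))  # green
```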
- Figure 10 schematically illustrates a back view of a digital camera that includes a display screen showing a live image, a favorite select button, a delete select button, a global control button, and advanced edits and share buttons, in accordance with certain embodiments.
- Figure 11 schematically illustrates a back view of a digital camera that includes a display screen showing a feedback bubble that a user can accept, reject or ignore in accordance with certain embodiments.
- Figure 12 schematically illustrates a back view of a digital camera that includes a display screen and buttons for crop control and other adjustment controls, and a button for confirming a crop or other adjustment, and cancel and smart buttons, in accordance with certain embodiments.
- Figure 13 schematically illustrates a back view of a digital camera that includes a display screen and a timeline with indicators of original and current time values disposed between start and end times, and buttons for canceling to exit adjustment mode without saving and for confirming to save changes, and a smart button, in accordance with certain embodiments.
- Figure 14 schematically illustrates a back view of a digital camera that includes a display screen showing a selected image for sharing, and buttons for email, text, facebook, and networked second camera or other device, in accordance with certain embodiments.
- Figures 15A-15B schematically illustrate a back view of a digital camera that includes a display screen that shows a level guide that auto appears when the camera is not leveled and disappears when the level is restored in accordance with certain embodiments.
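- The auto-appearing level guide can be sketched as a threshold on the accelerometer-reported roll angle; the tolerance value below is an assumption.

```python
# Sketch of the auto-appearing level guide: show the overlay when the roll
# angle reported by the accelerometer exceeds a small tolerance, and hide it
# again once the camera is level. The tolerance value is an assumption.

LEVEL_TOLERANCE_DEG = 1.5

def level_guide_visible(roll_degrees):
    """True when the level guide overlay should be drawn on the display."""
    return abs(roll_degrees) > LEVEL_TOLERANCE_DEG

for roll in (0.4, 3.2, -7.0, 1.0):
    state = "show" if level_guide_visible(roll) else "hide"
    print(f"roll={roll:+.1f} deg -> {state} level guide")
```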
- a camera in accordance with certain embodiments may include a steganographic watermarking feature that may be automatic and/or may have default or customized settings for embedding hidden information within digital images.
- hidden information may include the identity of the photographer or owner of the camera, a time/date stamp, geo/GPS settings, or any selected message or parameter value that is not designed for editing and is configured to be unique, such that an unauthorized phony or edited image could be identified as not containing the proper steganographic watermark.
- the hidden information may be converted to noise when added to a camera image.
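- As one illustration of converting hidden information to image noise, the following sketch embeds a short message in the least-significant bits of pixel values; this generic LSB scheme is an assumption for illustration, not the camera's actual watermarking method.

```python
# Minimal sketch of embedding hidden watermark text into the least-significant
# bits of pixel values, so the information reads as noise in the image.

def embed_watermark(pixels, message):
    """Hide `message` (bytes) in the LSBs of a flat list of 8-bit pixels."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("image too small for message")
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit       # overwrite the lowest bit only
    return out

def extract_watermark(pixels, length):
    """Recover `length` bytes previously embedded with embed_watermark()."""
    data = bytearray()
    for b in range(length):
        byte = 0
        for i in range(8):
            byte |= (pixels[b * 8 + i] & 1) << i
        data.append(byte)
    return bytes(data)

image = [128] * 256                          # stand-in for real pixel data
marked = embed_watermark(image, b"owner:cam01")
print(extract_watermark(marked, len(b"owner:cam01")))
```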
- a camera in accordance with certain embodiments may be configured with multiple screens or divisions of a single screen that may simultaneously display images in precapture, capture and/or post-capture editing modes. For example, an image to be captured may be displayed for editing precapture parameters on one screen while a previously captured image may be displayed for post-capture editing, transmitting or managed storing on another screen.
- an area may be selected for directing an auto-focus feature within a scene by instantiating an object on the camera display screen having a certain size and shape and overlaying a certain subset of pixels within an image about to be captured.
- This auto-focus object may be adjustable in size and/or shape, which may be rectangular, elliptical, circular, square, or arbitrarily curved or including straight and/or curved continuous or dashed segments of arbitrary or adjustable length or spacing.
- the auto-focus object may be used to sweep through different aperture and/or shutter speed settings, e.g., to set aperture and/or shutter speed priorities.
- the auto-focus area may also be used by the camera for auto-adjusting exposure and/or for manually adjusting auto defaults, which can be remembered by the camera and used for determining auto-adjustments in future images.
- both an auto-focus object and an auto-exposure object are provided on a camera display within an image to be captured.
- the auto-focus object and auto-exposure object may have the same or different sizes or shapes, and may be centered or located within the image to enclose entirely different subsets of pixels, identical subsets of pixels, or combinations of same and different pixels within the image to be captured.
- multiple auto-focus objects and/or multiple auto-exposure objects may be provided at different locations within an image to be captured or having different sizes or shapes.
- the camera may capture as many images as there are auto-focus and/or auto-exposure objects, each image prioritizing focus and/or exposure in accordance with one of the objects, and the images may be combined to produce a final image that reflects the multiple prioritized auto-focus and/or auto-exposure areas.
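A minimal sketch of one way such prioritized captures could be merged is shown below, assuming each capture is accompanied by a weight mask marking the pixels its auto-focus and/or auto-exposure object prioritized; the feathered weighted average used here is illustrative only, not the combination method of any particular embodiment.

```python
import numpy as np

def blend_prioritized_captures(captures, masks, base):
    """Merge several captures, each sharpest/best exposed inside its own priority mask.

    captures: list of (H, W, 3) images, one per auto-focus/auto-exposure object
    masks:    list of (H, W) float32 weight maps in [0, 1], 1.0 inside the object's area
    base:     fallback image used wherever no object claimed the pixels
    """
    acc = np.zeros_like(base, dtype=np.float32)
    weight = np.zeros(base.shape[:2], dtype=np.float32)
    for img, m in zip(captures, masks):
        acc += img.astype(np.float32) * m[..., None]
        weight += m
    leftover = np.clip(1.0 - weight, 0.0, None)            # area not covered by any object
    acc += base.astype(np.float32) * leftover[..., None]
    weight += leftover
    return (acc / np.maximum(weight, 1e-6)[..., None]).astype(base.dtype)
```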
- a camera in accordance with certain embodiments may be configured to compose a collage of multiple images that may be posted together, e.g., on a social network that permits only one posting at a time.
- a camera in accordance with certain embodiments may be configured to network with other cameras at an event. Images from two or more cameras may be combined into a single image or video and/or information may be gathered by one camera and used for capturing or editing still or video images or sounds captured with another camera. Two or more networked cameras may coordinate for exposure setting. Time-based and/or location-based social networking and/or camera to camera sharing is provided in certain embodiments. A connection of two cameras on the network may be established when the two cameras become closer than a preset distance and/or the two cameras may be disconnected when they become separated by more than a preset distance.
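One way the distance-based connect/disconnect behavior could be implemented is sketched below, with a hysteresis band between the connect and disconnect thresholds so that cameras hovering near the boundary do not repeatedly pair and unpair. The threshold values, the GPS-fix inputs and the class name are assumptions of this sketch.

```python
import math

class ProximityLink:
    """Connect to a peer camera when it comes within `connect_m` metres and drop the
    link only after it moves beyond `disconnect_m`."""

    def __init__(self, connect_m: float = 30.0, disconnect_m: float = 50.0):
        self.connect_m = connect_m
        self.disconnect_m = disconnect_m
        self.connected = False

    @staticmethod
    def distance_m(lat1, lon1, lat2, lon2) -> float:
        # Equirectangular approximation; adequate for tens of metres at an event.
        r = 6371000.0
        x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
        y = math.radians(lat2 - lat1)
        return r * math.hypot(x, y)

    def update(self, own_fix, peer_fix) -> bool:
        d = self.distance_m(*own_fix, *peer_fix)
        if not self.connected and d < self.connect_m:
            self.connected = True        # e.g. begin sharing exposure metadata here
        elif self.connected and d > self.disconnect_m:
            self.connected = False       # tear the session down
        return self.connected
```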
- a stored image with desired image qualities may be selected in a precapture stage indicating to the camera to adjust precapture settings to capture an image of a current scene having approximately the desired image qualities of the selected stored image.
- a camera in accordance with certain embodiments may be configured to provide long exposure images of live scenes including moving objects without blurriness.
- the long exposure images are provided by combining data from multiple short exposure images. For example, a ten second exposure night shot may be provided by combining a thousand 0.01 second shots.
- Objects moving left or right may be edge-matched and translated to a common position, objects moving toward or away may be reduced or enlarged, respectively, to a common size, and rotating objects may be counter-rotated to a common directional orientation. Blocked portions of objects in a subset of images may be taken into account by multiplying parametric contributions from other images wherein the object portions are unblocked.
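A minimal sketch of the stacking idea is shown below; the edge-matching, rescaling and counter-rotation of moving objects described above are abstracted into an optional `align` callback, which is an assumption of this example rather than part of the described method.

```python
import numpy as np

def synthetic_long_exposure(frames, align=None):
    """Average many short exposures into one long exposure.

    frames: iterable of uint8 frames (H, W, 3); e.g. one thousand 0.01 second frames
            approximate a single ten second exposure.
    align:  optional callable mapping each frame onto the first frame's coordinates
            (the translation/rescaling/counter-rotation described above); identity
            when omitted.
    """
    acc, count = None, 0
    for frame in frames:
        f = frame.astype(np.float32)
        if align is not None:
            f = align(f)
        acc = f if acc is None else acc + f
        count += 1
    if count == 0:
        raise ValueError("no frames supplied")
    # Averaging preserves static detail while per-frame read noise falls roughly as
    # 1/sqrt(N); gain or tone mapping can then emulate the accumulated brightness of
    # a true long exposure without blowing out highlights.
    return np.clip(acc / count, 0, 255).astype(np.uint8)
```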
- a camera in accordance with certain embodiments may have a low power consumption motion sensor that initiates a camera boot-up process upon sensing that the camera is being picked up or otherwise manipulated into a position indicative of a camera user's desire to capture a picture.
- the sensed event that initiates the camera boot-up process may include a touch of the camera, a touch by a human hand, or a touch by a particular human hand on the grip with a finger on a fingerprint reader.
- An accelerometer built into the camera may sense motion to trigger start-up, or a lens cap being removed may trigger start-up.
- the sensed event that initiates camera boot-up may also include a turning on of lights in a room where the camera is sitting or an approach by a human being detected from a characteristic heat or sound signature.
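The accelerometer path of such a wake-up trigger might look like the following sketch, assuming readings in units of g and hypothetical `read_accel` and `boot_camera` hooks; fingerprint, lens-cap, light-level and heat/sound-signature triggers would feed the same decision point.

```python
import time

class WakeTrigger:
    """Boot the camera when a low-power accelerometer reports motion consistent with
    the camera being picked up: a sustained deviation of the acceleration magnitude
    from 1 g (its value at rest)."""

    def __init__(self, threshold_g: float = 0.15, hold_samples: int = 3):
        self.threshold_g = threshold_g
        self.hold_samples = hold_samples
        self._hits = 0

    def feed(self, ax: float, ay: float, az: float) -> bool:
        magnitude = (ax * ax + ay * ay + az * az) ** 0.5
        self._hits = self._hits + 1 if abs(magnitude - 1.0) > self.threshold_g else 0
        return self._hits >= self.hold_samples      # True -> start the boot-up process

def wait_for_pickup(read_accel, boot_camera, sample_hz: float = 10.0) -> None:
    trigger = WakeTrigger()
    while not trigger.feed(*read_accel()):          # read_accel returns (ax, ay, az) in g
        time.sleep(1.0 / sample_hz)
    boot_camera()
```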
- a camera in accordance with certain embodiments may be configured to detect words spoken nearby the camera while capturing video images and to discern whether to include the words as commentary or discard the words as noise or to provide an option to the user to include or discard the words and/or other discernible sounds or noises captured with the video.
- Figure 16 schematically illustrates a digital camera display screen showing an opaque column of selectable icons of a user interface and a photographic scene overlayed by a column of selectable translucent icons in accordance with certain embodiments.
- the selectable icons in the opaque column include a navigator or task manager icon, a home screen icon, a camera capture modes icon and a precapture settings icon.
- the navigator may be selected to access a dashboard, Apps (including open/live and recently-used apps), and Activities.
- the home screen icon may be selected to access the home screen of the user interface.
- the column of selectable translucent icons shown in Figure 16 appears when the camera capture modes icon is selected.
- the selectable translucent icons in this column include a swap camera icon for selecting between a front camera and a rear camera, a still capture mode icon, a video capture mode icon, and a flash settings icon.
- a thumbnail is also available for viewing recent photos or other stored photos.
- Figure 17 schematically illustrates a digital camera display screen showing the opaque column and the column of selectable translucent icons of Figure 16, as well as a second column of selectable translucent icons for selecting from several camera capture modes including auto, smart, moment, pano, manual, slo-mo, hyper and lapse.
- Figure 18 schematically illustrates a digital camera display screen showing a photographic scene overlayed by the opaque column and the first and second columns of selectable translucent icons of Figure 17, as well as third and fourth columns of selectable translucent icons.
- the third column of selectable translucent icons appears when the smart icon is selected from the second column of selectable translucent icons.
- the selectable translucent icons of the third column include depth, motion and light.
- the fourth column of selectable translucent icons appears when the depth icon is selected from the third column and includes values of depth of field that may be selected from.
- Figure 19 illustrates a digital camera display screen showing a photographic scene overlayed by the opaque column and four columns of selectable translucent icons of Figure 18. A specific depth selection appears as a highlighted row of icons across the second, third and fourth columns of selectable translucent icons.
- Figure 20 schematically illustrates a digital camera display screen showing a photographic scene overlayed by the opaque column and the first and second columns of selectable translucent icons of Figure 17, as well as third and fourth columns of selectable translucent icons.
- the third column of selectable translucent icons appears when the smart icon is selected from the second column of selectable translucent icons.
- the selectable translucent icons of the third column include depth, motion and light.
- the fourth column of selectable translucent icons appears when the motion icon is selected from the third column and includes values of motion that may be selected from.
- Figure 21 schematically illustrates a display screen result of scrolling to and selecting one of the multiple icons representing a specific value of motion.
- the specific motion value selection appears as a highlighted row of icons across the second, third and fourth columns of selectable translucent icons.
- Figure 22 illustrates a digital camera display screen showing a photographic scene overlayed by the opaque column and the first, second and third columns of selectable translucent icons of Figures 17-21, as well as a fourth column of selectable translucent icons representing several light value options that may be selected from.
- Figure 23 schematically illustrates a display screen result of scrolling to and selecting one of the multiple icons representing a specific light value option.
- the specific light value option selection appears as a highlighted row of icons across the second, third and fourth columns of selectable translucent icons.
- Figure 24 illustrates a display screen result of selecting one of multiple translucent icons from the first column of selectable translucent icons in accordance with certain embodiments, the selected still photo mode icon being indicated by universal brightening or white or gray translucent overlaying of a still photo mode icon-containing pixel area, while maintaining the highlighting illustrated at Figure 23 of an extended row of icons across the second, third and fourth columns indicating a specific light value option.
- Figure 25 illustrates a digital camera display screen result of selecting a "clear" icon in accordance with certain embodiments and removing the universal brightening or white or gray translucent overlaying of the still photo mode icon-containing area illustrated at Figure 24.
- Figure 26 illustrates a display screen that is overlayed in certain embodiments by a fifth column of selectable translucent icons on the opposite side of the display screen from the opaque column and first to fourth columns of selectable translucent icons of Figure 22.
- Figure 27 illustrates a digital camera display screen in accordance with certain embodiments that is overlayed by two additional columns of selectable translucent icons on the opposite side of the display screen from the opaque column and first to fourth columns of selectable translucent icons, including a highlighted row of two icons selected each from one of the two additional columns, e.g., "HDR" and "on."
- Figure 28 illustrates a digital camera display screen in accordance with certain embodiments that is overlayed by two additional columns of selectable translucent icons on the opposite side of the display screen from the opaque column and first to fourth columns of selectable translucent icons, including a highlighted row of two icons having an effect of overlaying a grid onto the display screen.
- Figure 29 illustrates a digital camera display screen in accordance with certain embodiments including a translucent key zone cursor overlay, e.g., an unfilled circle or dashed circle appearing to enclose a circular pixel area of the display screen and showing a plus sign at a center of the circular pixel area.
- Figure 30 illustrates a digital camera display screen that includes the opaque column and photographic scene overlayed by the first column of selectable translucent icons of Figure 16, and a translucent key zone cursor overlay as in Figure 29, and a translucent, broken-circle-shaped overlay that includes selectable depth, motion and light circle segments in accordance with certain embodiments.
- Figure 31 illustrates the display screen of Figure 30, and a highlighted depth segment at a selected location within the depth segment of the translucent, broken-circle shaped overlay that corresponds to a particular value of depth of field.
- Figure 32 illustrates a display screen result in accordance with certain embodiments of adjusting the value of depth of field by sliding to or otherwise selecting, indicated by highlighting, of a different location within the depth segment of the translucent, broken-circle shaped overlay of Figure 31.
- Figure 33 schematically illustrates a digital camera display screen showing a photographic scene overlayed by the opaque column and the first and second columns of selectable translucent icons of Figure 17, as well as third and fourth columns of selectable translucent icons.
- the third column of selectable translucent icons appears when the manual icon is selected from the second column of selectable translucent icons.
- the selectable translucent icons of the third column include aperture or A, ISO, shutter speed or S, white balance or WB, and exposure or EXP.
- the fourth column of selectable translucent icons appears when the aperture or A icon is selected from the third column and includes values of aperture, e.g., F2, F2.8, F4, F5.6, F8, F11 and F16, from which a user may choose to manually select.
- Figure 34 schematically illustrates a digital camera display screen showing a photographic scene overlayed by the opaque column and the first and second columns of selectable translucent icons of Figure 17, as well as third and fourth columns of selectable translucent icons.
- the third column of selectable translucent icons appears when the manual icon is selected from the second column of selectable translucent icons.
- the selectable translucent icons of the third column include aperture or A, ISO, shutter speed or S, white balance or WB, and exposure or EXP.
- the fourth column of selectable translucent icons appears when the ISO icon is selected from the third column and includes values of ISO, or sensitivity to available light, e.g., 100, 200, 400, 800, 1600, 3200, and 6400, from which a user may choose to manually select.
- Figure 35 schematically illustrates a digital camera display screen showing a photographic scene overlayed by the opaque column and the first and second columns of selectable translucent icons of Figure 17, as well as third and fourth columns of selectable translucent icons.
- the third column of selectable translucent icons appears when the manual icon is selected from the second column of selectable translucent icons.
- the selectable translucent icons of the third column include aperture or A, ISO, shutter speed or S, white balance or WB, and exposure or EXP.
- the fourth column of selectable translucent icons appears when the shutter speed or S icon is selected from the third column and includes values of shutter speed, e.g., 4s, 1s, 1/4s, 1/15s, 1/60s, 1/250s, 1/1000s, 1/4000s and 1/16000s, from which a user may choose to manually select.
- Figure 36 schematically illustrates a digital camera display screen showing a photographic scene overlayed by the opaque column and the first and second columns of selectable translucent icons of Figure 17, as well as third and fourth columns of selectable translucent icons.
- the third column of selectable translucent icons appears when the manual icon is selected from the second column of selectable translucent icons.
- the selectable translucent icons of the third column include aperture or A, ISO, shutter speed or S, white balance or WB, and exposure or EXP.
- the fourth column of selectable translucent icons appears when the white balance or WB icon is selected from the third column and includes icons representing several white balance options from which a user may choose to manually select.
- Figure 37 illustrates a digital camera display screen showing an indication of recent capture of two minutes and fifteen seconds of video, the display screen also showing the opaque column and first to fourth columns of selectable translucent icons of Figure 35, including a row of highlighted icons indicating a manually-selected value of 1/60 second as a camera shutter speed.
- Figure 38 illustrates the display screen of Figure 37 without the second to fourth columns of selectable translucent icons indicating the manual selection of a 1/60 second shutter speed.
- Figure 39 illustrates the digital camera display screen of Figure 38, as well as icons that a user may touch or otherwise select to execute video clip play, video play or record pause and video clip delete commands in accordance with certain embodiments.
- Figure 40 illustrates a camera display for a digital camera in a default camera state for a viewfinder mode with no user interface in accordance with certain embodiments.
- the default fullscreen display size may be 1920 x 1080 pixels (16:9 ratio) in certain embodiments.
- the camera may be in 100% automatic mode and ready for capture using the camera's physical shutter button.
- a main UI symbol is shown in Figure 40, which indicates a swipeable interaction that can also be tapped. Swiping or tapping the main UI symbol would cause a capture mode/navigation bar to appear on the right side of the display screen and a secondary controls bar on the left side.
- Figure 41 illustrates a camera display for a digital camera after a single tap instantiates appearance of a key zone object in accordance with certain embodiments.
- the key zone object may be moved for location control of one or more parameters or groups or general categories of parameters within a photographic scene, e.g., a first group may include light, exposure and/or ISO, a second group may include focus, aperture and/or depth of field, and a third group may include shutter speed, motion, crispness and/or blur.
- These groups may be combined and invoked by a single tap at any point in the screen.
- the image may be automatically focused on the portion in the scene enclosed by the key zone at the location of the tapping and/or the overall lighting may be automatically adjusted to favor that same location of the scene.
- a dismiss glyph may appear next to the key zone symbol that the user can tap on to delete the key zone object. The user may move the key zone object around by dragging it.
- Figure 42 illustrates a camera display for a digital camera including a pinch and zoom feature to vary the size of a key zone object in accordance with certain embodiments.
- the user may use two-fingered pinch and zoom gestures to adjust the size of the key zone in certain embodiments. This allows for enlarging or reducing the area the user wishes to be in focus and/or to have optimal lighting. Any interaction with the key zone object, to move and/or resize it, redisplays the "TAP AND HOLD TO ADJUST" label in certain embodiments. Pinch and Zoom resizing of the key zone object will not trigger the display of adjustment controls as long as the two fingers are moving.
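A minimal sketch of the key zone object's tap, drag and pinch interactions is given below; the circular shape, the pixel radii and the `apply_auto` hook into the camera's focus/exposure pipeline are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class KeyZone:
    """Circular key zone: instantiated by a single tap, moved by dragging, resized by
    two-fingered pinch and zoom."""
    x: float
    y: float
    radius: float = 80.0                    # pixels on the preview display (assumed default)

    def drag_to(self, x: float, y: float) -> None:
        self.x, self.y = x, y

    def pinch(self, scale: float, min_r: float = 24.0, max_r: float = 400.0) -> None:
        # scale > 1 when the fingers spread apart, < 1 when they pinch together
        self.radius = max(min_r, min(max_r, self.radius * scale))

    def contains(self, px: float, py: float) -> bool:
        return (px - self.x) ** 2 + (py - self.y) ** 2 <= self.radius ** 2

def on_single_tap(x: float, y: float, apply_auto) -> KeyZone:
    """Create a key zone at the tap point and re-run auto focus/exposure weighted toward
    the pixels it encloses; `apply_auto` is a hypothetical hook into the camera's
    focus/exposure pipeline."""
    zone = KeyZone(x, y)
    apply_auto(zone)
    return zone
```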
- Figure 43 illustrates a camera display for a digital camera including a key zone controls panel for selecting light, focus and shutter speed adjustment in accordance with certain
- the key zone object may include three adjustment control options that are initially displayed on a thin right side bar and presented with symbols and labels that reflect how certain image qualities are being adjusted. These parameters may or may not map directly to aperture, shutter speed and ISO, particularly when the goal qualities being sought involve more than one.
- light adjustment may involve a combination of exposure adjustment and one or more other parameters, such as aperture, shutter speed, and/or ISO, and may depend on conditions, and may depend on other adjustment control settings.
- Focus adjustment may involve aperture setting and/or varying the depth of field in the scene, and may involve more than one underlying parameter.
- Speed adjustments will depend on whether objects being captured are moving and at what speed, and this may also involve multiple underlying parameters.
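The split of a single user-facing control across several underlying exposure parameters could, for example, be resolved as in the sketch below. The specific mapping, value ranges and scene-brightness heuristic are assumptions of this sketch, not the mapping used in any particular embodiment.

```python
def resolve_key_zone_controls(light: float, focus: float, speed: float, scene_lux: float):
    """Translate the three user-facing key zone controls (each in [-1, 1], 0 = auto)
    into candidate exposure parameters: 'focus' drives aperture (depth of field),
    'speed' drives shutter time, and 'light' is spread across exposure compensation
    and an ISO bias."""
    apertures = [2.0, 2.8, 4.0, 5.6, 8.0, 11.0, 16.0]                     # f-numbers
    shutters = [1.0, 1 / 4, 1 / 15, 1 / 60, 1 / 250, 1 / 1000, 1 / 4000]  # seconds

    # focus: -1 = shallowest depth of field (widest aperture), +1 = deepest.
    f_number = apertures[round((focus + 1) / 2 * (len(apertures) - 1))]
    # speed: -1 = most motion blur (slowest shutter), +1 = crispest (fastest shutter).
    shutter_s = shutters[round((speed + 1) / 2 * (len(shutters) - 1))]
    # light: partly as exposure compensation (EV), partly as an ISO bias.
    ev_bias = round(light * 1.5, 2)
    base_iso = 100 if scene_lux > 1000 else 800
    iso = min(6400, max(100, int(base_iso * 2 ** light)))

    return {"f_number": f_number, "shutter_s": shutter_s, "iso": iso, "ev_bias": ev_bias}

# e.g. a bright scene with a crispness request and slightly shallow depth of field:
print(resolve_key_zone_controls(light=0.0, focus=-0.3, speed=0.8, scene_lux=5000))
```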
- Figure 44 illustrates a camera display for a digital camera including a key zone adjustor panel for a light parameter selected from a key zone control panel in accordance with certain embodiments.
- three key zone adjustors can be selected one at a time, e.g., each producing a vertical panel pulling out to the left of an adjust bar on the display screen.
- the currently selected parameter may be colored yellow.
- symbols may appear on either end of an adjustor bar indicating qualities the user may expect by sliding in one direction or the other.
- the light adjustor may work best by starting in the middle and allowing the user to adjust up or down, while the focus adjustor may already begin at a particular setting on a linear lowest-to-highest aperture scale, with its initial location on the slider bar reflecting that setting.
- Figure 45 illustrates a camera display for a digital camera upon user interaction with the key zone adjustor panel of Figure 44 to adjust a light parameter. The initial default position of the adjustor may reflect the camera's automatic setting in certain embodiments.
- Figure 46 illustrates a camera display for a digital camera upon user interaction with a key zone adjustor panel for a focus parameter selected from a key zone control panel in accordance with certain embodiments.
- the focus adjustor may work similarly in certain embodiments, e.g., bringing up a slider bar adjustor control with an initial default setting reflecting the camera's automatic setting.
- the symbol indicates graphically the decreasing (downward) or increasing (upward) depth of field in the preview image. As the user interacts with this slider, the image will reflect and show this change in depth of field in certain embodiments.
- the size of the key zone will determine the portion of the scene that will remain in focus, even with the focus adjustor pulled all the way down.
- Figure 47 illustrates a camera display for a digital camera upon user interaction with a key zone adjustor panel for a speed parameter selected from a key zone control panel in accordance with certain embodiments.
- the speed adjustor may work similarly to the light and focus adjustors, e.g., bringing up a slider bar adjustor control with an initial default setting reflecting the camera's automatic setting.
- the symbol may graphically indicate decreasing (downward) or increasing (upward) shutter speed.
- shutter speed has traditionally offered users no intuitive way to visualize what effect it will have until after an image is captured.
- an artificially induced blur-trailing for moving objects may appear in a preview display that can be visually reduced by this slider until the moving objects are sharp and have no motion blurring.
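One way such a blur-trailing preview could be approximated is sketched below, by averaging however many recent preview frames span the currently selected shutter time; the preview frame rate and buffer size are assumptions of the sketch.

```python
import collections
import numpy as np

class BlurPreview:
    """Approximate, in the live preview, the motion blur a chosen shutter speed would
    produce by averaging however many recent preview frames span that shutter time.
    Sliding the speed adjustor toward faster shutter speeds uses fewer frames, so
    moving objects sharpen on screen before any image is captured."""

    def __init__(self, preview_fps: float = 30.0, max_frames: int = 32):
        self.frame_time = 1.0 / preview_fps
        self.history = collections.deque(maxlen=max_frames)

    def push(self, frame: np.ndarray) -> None:
        self.history.append(frame.astype(np.float32))

    def render(self, shutter_s: float) -> np.ndarray:
        if not self.history:
            raise ValueError("push at least one preview frame first")
        n = max(1, min(len(self.history), round(shutter_s / self.frame_time)))
        recent = list(self.history)[-n:]
        return (sum(recent) / n).astype(np.uint8)
```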
- Figure 48 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller and navigation panel of a main user interface without a key zone in accordance with certain embodiments.
- a main UI may be displayed by swiping in any direction, by tapping, or by pressing a dedicated button. In the example of Figure 48, the user has not set a key zone.
- Several capture modes are illustrated at Figure 48 that may be selected from a capture mode scroller panel.
- Secondary controls may include numerous parameters, and/or a simple mode may include just a subset of these. Additional secondary controls may be accessible by scrolling down (dragging upward) or by tapping on the "..." ellipsis symbol for more.
- a navigator or task manager may provide quick and easy access to a dashboard, Apps, settings, open/live and recently-used apps, and/or activities.
- a user may tap and hold to display a dashboard until release. This affordance may provide efficiency and reduced navigational steps for many frequently-used features.
- Figure 49 illustrates capture formats that may be selected in accordance with certain embodiments. A pre-capture filters pull-up panel and an activities pull-down panel may be caused to appear by the user.
- Tap to go to the most recent capture, and tap and hold to go to the Gallery, may be provided as options.
- a capture modes scroller is also apparent in Figure 49.
- Figure 50 illustrates a camera display for a digital camera including a selected secondary control setting panel next to a secondary control panel in accordance with certain embodiments. The user may tap on any secondary control to open a vertical panel immediately to the right on the display screen in certain embodiments.
- Figure 51 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller panel of a main user interface with a key zone in accordance with certain embodiments.
- the key zone and its associated controls may operate independently from the main UI in certain embodiments.
- Figure 52 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller panel of a main user interface with a key zone and a key zone controls panel in accordance with certain embodiments.
- Figure 53 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller panel of a main user interface with a key zone and a key zone controls panel and a key zone controls adjustor panel in accordance with certain embodiments.
- Figure 54 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller panel of a main user interface and a pre-capture filters pull-up panel in accordance with certain embodiments.
- the user may swipe up from the bottom or tap on the up-pointing chevron to display the pre-capture filters.
- To close this panel the user can reverse the gesture, e.g., swiping down or tapping on the chevron, which has changed to a down-pointing chevron.
- Choosing a filter will in certain embodiments instantly apply the filter to a live viewfinder image.
- Filters may be configured so as not to alter a base master capture, but to provide a way to add a desired processing feature as the image is taken, thereby skipping post-processing steps.
- Figure 55 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller panel of a main user interface and an activities pull-down panel in accordance with certain embodiments.
- the user may swipe down from the top or tap on the down-pointing chevron to display activities.
- Activities may include structured guides for the user to control one or more image processing steps to achieve a desired outcome.
- desired outcomes may be a particular look that a user has seen an example of or a custom set of steps that send captured images to particular directories, or another type of setup and/or sequential procedure.
- This approach advantageously combines use of filters with more complex or multi-step procedures involving camera setup, pre-capture adjustments, and/or post-capture manipulation of destinations. Additional means may be provided to access recently-used activities, as well as to discard unwanted activities and to access additional activities.
- Figure 56 illustrates a dashboard interface for accessing the gallery and Android in accordance with certain embodiments.
- Figure 57 illustrates a camera display for a digital camera including a default beginning configuration for a gallery in accordance with certain embodiments.
- a selected view of the gallery may include small thumbnails, medium thumbnails, large thumbnails, list (with tiny icon) or map view, or combinations thereof.
- a selected media type may include all media, all photos, moments, stills, panoramas, all videos, slow motion, time lapse, or stop motion, or combinations thereof.
- a selected sort ordering of media may include newest to oldest, oldest to newest, or by location, and there may be groups such as highest level groupings, all favorites, albums, collections and/or trash.
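For illustration, a gallery query combining the media-type filter and sort ordering described above might be organized as follows; the field names and media-kind labels are assumptions of the sketch.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class MediaItem:
    path: str
    kind: str                 # e.g. "still", "moment", "panorama", "video", "slow_motion"
    taken: datetime
    location: str = ""
    favorite: bool = False

def gallery_query(items: List[MediaItem], media_type: str = "all",
                  order: str = "newest", favorites_only: bool = False) -> List[MediaItem]:
    """Apply the gallery's media type filter and sort ordering to a flat media list."""
    if media_type != "all":
        items = [m for m in items if m.kind == media_type]
    if favorites_only:
        items = [m for m in items if m.favorite]
    if order == "newest":
        return sorted(items, key=lambda m: m.taken, reverse=True)
    if order == "oldest":
        return sorted(items, key=lambda m: m.taken)
    if order == "location":
        return sorted(items, key=lambda m: (m.location, m.taken))
    raise ValueError(f"unknown sort order: {order}")
```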
- Figure 58 illustrates a camera display for a digital camera including a gallery with select mode active in accordance with certain embodiments.
- When a select option in the top bar is tapped, its label turns yellow and any media thumbnail or list item tapped will be selected and denoted by a yellow frame around it. Tapping on a selected thumbnail will unselect it.
- When the select mode is active, e.g., highlighted in yellow, small translucent symbols may appear on thumbnails to indicate whether they are a moment and/or whether they have been uploaded to a cloud server. These may be displayed just when the select mode is active in certain embodiments.
- batch editing may be disabled in this mode making an edit option unavailable. The user may in certain embodiments favorite, share, or delete selected media individually or as a batch.
- Figure 59 illustrates a camera display for a digital camera including an opened individual photo in a default state with options including an edit option in accordance with certain embodiments.
- Figure 60 illustrates a camera display for a digital camera including an opened individual photo with user interface dismissed in accordance with certain embodiments.
- the user may swipe in any direction to return the UI.
- Figure 61 illustrates a camera display for a digital camera including an opened individual photo and an edit options panel in accordance with certain embodiments.
- Certain edit options may include crop, rotate and adjustments for color, contrast, and brightness.
- Figure 62 illustrates a camera display for a digital camera including an opened individual photo and a filters, effects and frames panel in accordance with certain embodiments.
- Figure 63 illustrates an android apps and environment screen in accordance with certain embodiments.
- Figure 64A illustrates an expert mode menu of adjustable primary control settings for shutter speed, aperture and ISO in accordance with certain embodiments.
- Figure 64B illustrates an expert mode menu of adjustable secondary control settings for white balance in accordance with certain embodiments.
- Figures 65A-65B illustrate a two level user interface in accordance with certain embodiments including an example top level activities interface plug-in over a hidden bottom level interface including primary and secondary controls panels, a capture mode scroller and navigation panel, and a scene display.
- the top level activities interface is what the user sees and thinks of as the user interface or UI, and includes display screen objects that the user may interact with and control in configuring precapture settings, capturing images and editing images post-capture, as well as managing and communicating images.
- Each activity may include or be configured as an applet or plug-in. Activities may include higher level apps that are privileged above regular Android apps that may be accessible elsewhere. Activities may be first party and/or created in-house or by premium partners.
- Activities can in certain embodiments guide users through multi-step procedures to achieve desired results and goals. Activities can in certain embodiments include a default general usage option and a full manual usage option. Examples of activities may include an out-of-the-box introduction, general simple operation, a photo cookbook or step-by-step guides, daily or periodic contests or challenges, a community feed or curated photo blog, premium front ends to Instagram or other social media, tutorials for guiding advanced photography, and custom editing and effects. Activities can be updated and revised in real time.
- the bottom level interface may be hidden or optionally accessible.
- the bottom level interface may be controlled and/or configured by one or more top level activity plug-ins in accordance with certain embodiments.
- the bottom level interface provides a common underlying manual interface and/or expert interface.
- the bottom level interface is designed to work with applets and/or plug-ins in certain embodiments.
- the bottom level interface provides an efficient default and/or customized UI for manual operation.
- the bottom level interface holds a full range of available options and settings.
- the bottom level interface lets the top level interface choose which options and settings are shown to the user.
- the combination of top level and bottom level interfaces provides in certain embodiments a universal tool, e.g., based on a "Steering Wheel, Accelerator, Brake" model, for manual and/or expert operational control.
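The relationship between the two levels could be organized along the lines of the following sketch, in which a top level activity plug-in declares which bottom level controls remain visible and presets the rest; the control names, the `BottomLevelUI` stand-in and the example activity are assumptions for illustration only.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Full set of controls held by the (hidden) bottom level interface; names are assumed.
BOTTOM_LEVEL_CONTROLS: List[str] = [
    "shutter_speed", "aperture", "iso", "white_balance", "flash", "timer", "grid",
]

class BottomLevelUI:
    """Minimal stand-in for the hidden bottom level manual/expert interface."""

    def __init__(self) -> None:
        self.values: Dict[str, object] = {}
        self.visible: List[str] = list(BOTTOM_LEVEL_CONTROLS)

    def set(self, control: str, value: object) -> None:
        self.values[control] = value

    def show_only(self, controls: List[str]) -> None:
        self.visible = [c for c in BOTTOM_LEVEL_CONTROLS if c in controls]

@dataclass
class Activity:
    """Top level activity plug-in: chooses which bottom level controls the user sees
    and may preset values for the rest."""
    name: str
    visible: List[str] = field(default_factory=list)
    presets: Dict[str, object] = field(default_factory=dict)

    def configure(self, bottom: BottomLevelUI) -> None:
        for control, value in self.presets.items():
            bottom.set(control, value)
        bottom.show_only(self.visible)

# Example: a soft focus portrait activity surfaces only aperture and presets the rest.
soft_focus_portrait = Activity(
    name="Soft Focus Portrait",
    visible=["aperture"],
    presets={"shutter_speed": 1 / 250, "iso": "auto", "white_balance": "auto"},
)
ui = BottomLevelUI()
soft_focus_portrait.configure(ui)   # ui.visible is now ["aperture"]
```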
- Figure 66A illustrates example steps for guided usage with a soft focus portrait interface plug-in in accordance with certain embodiments.
- Figure 66B illustrates example steps for guided usage with a wedding shoot setup interface plug-in in accordance with certain embodiments.
- Figures 67A-67B illustrate simple usage modes of a manual operation user interface for primary and secondary controls in accordance with certain embodiments.
- Figures 67C-67D illustrate expert usage modes of a manual operation user interface for primary and secondary controls in accordance with certain embodiments.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Studio Devices (AREA)
Abstract
A digital camera is programmed according to a user-selectable combination of manual and automatic image capture settings for specific parameters or groups of parameters based on one or more preview images captured using default and/or automatically preselected focus, exposure and shutter speed settings, and overlaying one or more of the preview images with two or more translucent display objects to guide the camera user through selecting a combination of manual and automatic image capture settings for specific parameters and/or groups of parameters.
Description
USER INTERFACE FOR SMART DIGITAL CAMERA
PRIORITY AND RELATED APPLICATIONS
This application claims priority to United States provisional patent application Serial No. 62/363,835, filed July 18, 2016, which is hereby incorporated by reference.
This application is related to U.S. patent applications Serial nos. 15/131,374 (US2016-0309092A1); 15/131,407 (US2016-0309076A1); 15/131,434 (US2016-0309063A1); 15/131,529 (US2016-0309069A1); and 15/131,547, each filed April 18, 2016. Each of these applications is hereby incorporated by reference.
This application is also related to PCT application Serial No. PCT/US16/28145 (WO2016/168838), filed April 18, 2016, which is hereby incorporated by reference.
This application is also related to United States provisional patent applications Serial Nos. 62/149,406, 62/149,433, 62/149,452, and 62/149,475, each filed April 17, 2015. Each of these applications is hereby incorporated by reference.
BACKGROUND
An advantage of electronic viewfinders is you get to see exactly what the camera's sensor sees and your view of a scene is never obstructed when taking a photo (your view is momentarily blocked when taking photos on DSLR cameras). Some cameras also augment the EVF display in various ways, such as by highlighting areas in focus ('peaking' autofocus), simulating the motion blur you'll see if you take a photo and automatically boosting brightness when shooting very dark scenes.
Adjusting precapture settings and performing postcapture editing on typical DSLRs and mobile camera-enabled devices involve the inconvenience of taking one hand off the camera to make touch-screen or button-actuated adjustments to imaging parameters that tend to temporarily destabilize the camera as an image capture device or as a viewer. It is desired to have a camera that allows camera users to smoothly and conveniently adjust precapture settings and perform postcapture editing.
BRIEF DESCRIPTIONS OF THE DRAWINGS
Figure 1 schematically illustrates a front perspective view of a first digital camera in accordance with certain embodiments.
Figure 2 schematically illustrates a back perspective view of a digital camera with a movable viewfinder in accordance with certain embodiments.
Figures 3A-3D schematically illustrate a linear slider for adjusting one or more image capture parameters and/or editing a captured image in accordance with certain embodiments.
Figures 4A-4C schematically illustrate front views of example digital cameras that each include multiple LEDs for illuminating objects to be imaged in accordance with certain embodiments.
Figure 5 schematically illustrates a perspective view of a digital camera that includes multiple microphones in accordance with certain embodiments.
Figure 6 schematically illustrates a back view of a digital camera that includes a display screen and various buttons for image capture and/or editing control, including buttons for capture type control (e.g., video, time lapse, slow motion), secondary controls such as timer and flash, adjustment controls, global controls such as gallery, app store and settings, and a thumbnail of a previous image capture in accordance with certain embodiments.
Figure 7 schematically illustrates a back view of a digital camera that includes a display screen and various buttons for image capture and/or editing control, including buttons for adjusting a time parameter and/or scrolling through a sequence of images, for selecting and editing various parameters using smart menus and a linear slider for adjusting, scrolling or showing a current time parameter disposed between a start time and an end time for the sequence of images, in accordance with certain embodiments.
Figure 8 schematically illustrates a back view of a digital camera that includes a display screen and a smart reset button.
Figure 9 schematically illustrates a back view of a digital camera that includes a display screen such as a touch screen, a smart button, a value indicator, smart correction and/or scrolling button, and a linear slider for adjusting parameters such as exposure, contrast, fill-flash, face priority and various other image capture and/or editing parameters, in accordance with certain embodiments.
Figure 10 schematically illustrates a back view of a digital camera that includes a display screen showing a live image, a favorite select button, a delete select button, a global control button, and advanced edits and share buttons, in accordance with certain embodiments.
Figure 11 schematically illustrates a back view of a digital camera that includes a display screen showing a feedback bubble that a user can accept, reject or ignore in accordance with certain embodiments.
Figure 12 schematically illustrates a back view of a digital camera that includes a display screen and buttons for crop control and other adjustment controls, and a button for confirming a crop or other adjustment, and cancel and smart buttons, in accordance with certain embodiments.
Figure 13 schematically illustrates a back view of a digital camera that includes a display screen and a timeline with indicators of original and current time values disposed between start and end times, and buttons for canceling to exit adjustment mode without saving and for confirming to save changes, and a smart button, in accordance with certain embodiments.
Figure 14 schematically illustrates a back view of a digital camera that includes a display screen showing a selected image for sharing, and buttons for email, text, facebook, and networked second camera or other device, in accordance with certain embodiments.
Figures 15A-15B schematically illustrate a back view of a digital camera that includes a display screen that shows a level guide that automatically appears when the camera is not level and disappears when level is restored in accordance with certain embodiments.
Figure 16 schematically illustrates a digital camera display screen showing a column of menu or executable icons and a photographic scene overlayed by a column of translucent menu or executable icons in accordance with certain embodiments.
Figure 17 schematically illustrates a digital camera display screen showing a column of menu or executable icons and a photographic scene overlayed by a first column of translucent menu or executable icons and a second column of translucent menu or executable text items in accordance with certain embodiments.
Figure 18 schematically illustrates a digital camera display screen showing a photographic scene overlayed by two columns of translucent menu or executable icons and two columns of translucent menu or executable text items in accordance with certain embodiments.
Figure 19 illustrates a digital camera display screen showing a photographic scene overlayed by multiple columns of translucent icons and indicating a depth selection as a highlighted row of icons across the multiple columns in accordance with certain embodiments.
Figure 20 illustrates a digital camera display screen showing a photographic scene overlayed by multiple columns of translucent icons and showing a column of translucent, selectable motion option icons in accordance with certain embodiments.
Figure 21 illustrates a display screen result of scrolling to and selecting one of the multiple icons within the column of translucent, selectable motion option icons of Figure 20, including adding highlighting of the selected motion option icon and extending a row of highlighted icons in accordance with certain embodiments.
Figure 22 illustrates a digital camera display screen showing a photographic scene overlayed by multiple columns of translucent icons and showing a column of translucent, selectable light option icons in accordance with certain embodiments.
Figure 23 illustrates a display screen result of scrolling to and selecting one of the multiple icons within the column of translucent, selectable light option icons of Figure 22, including adding highlighting of the selected light option icon and extending a row of highlighted icons in accordance with certain embodiments.
Figure 24 illustrates a display screen result of selecting one of multiple translucent icons from a first column of translucent icons in accordance with certain embodiments, the selected icon being indicated by universal brightening or white or gray translucent overlaying of an icon- containing pixel area, while maintaining the highlighting illustrated at Figure 23 of an extended row of icons across an adjacent block of multiple other columns.
Figure 25 illustrates a digital camera display screen result of selecting a "clear" icon in accordance with certain embodiments to remove the universal brightening or white or gray translucent overlaying of the icon-containing area illustrated at Figure 24.
Figure 26 illustrates a display screen that is overlayed in certain embodiments by an additional column of translucent icons on the opposite side of the display screen from the other columns illustrated at Figure 22.
Figure 27 illustrates a digital camera display screen in accordance with certain embodiments that is overlayed by two additional columns of translucent icons on the opposite side of the display screen from the other columns illustrated at Figure 22, including a highlighted row of two icons selected each from one of the two additional columns, e.g., "HDR" and "on."
Figure 28 illustrates a digital camera display screen in accordance with certain embodiments that is overlayed by two additional columns of translucent icons on the opposite side of the display screen from the other columns illustrated at Figure 22, including a highlighted row of two icons selected each from one of the two additional columns, e.g., "#" and "#," and having an effect of overlaying a grid onto the display screen.
Figure 29 illustrates a digital camera display screen including a translucent cursor overlay in accordance with certain embodiments, e.g., an unfilled circle appearing to enclose a circular pixel area of the display screen and showing a plus sign at a center of the circular pixel area.
Figure 30 illustrates a digital camera display screen including a translucent cursor overlay as in Figure 29 and a translucent, broken-circle-shaped overlay includes selectable depth, motion and light circle segments in accordance with certain embodiments.
Figure 31 illustrates a display screen result in accordance with certain embodiments of selecting, indicated by highlighting, of a location within the depth segment of the translucent, broken-circle shaped overlay of Figure 30 that corresponds to a particular value of depth of field.
Figure 32 illustrates a display screen result in accordance with certain embodiments of adjusting the value of depth of field by sliding to or otherwise selecting, indicated by highlighting, of a different location within the depth segment of the translucent, broken-circle shaped overlay of Figure 30.
Figure 33 illustrates a digital camera display screen including multiple translucent columns that appear in certain embodiments indicating values of aperture from which a user may choose to manually select.
Figure 34 illustrates a digital camera display screen including multiple translucent columns that appear in certain embodiments indicating values of ISO, or sensitivity to available light, from which a user may choose to manually select.
Figure 35 illustrates a digital camera display screen including multiple translucent columns that appear in certain embodiments indicating values of shutter speed from which a user may choose to manually select.
Figure 36 illustrates a digital camera display screen including multiple translucent columns that appear in certain embodiments indicating values of white balance from which a user may choose to manually select.
Figure 37 illustrates a digital camera display screen showing an indication of recent capture of two minutes and fifteen seconds of video, the display screen also showing multiple translucent columns that appear in certain embodiments including a row of highlighted icons indicating a manually-selected value of 1/60 second as a camera shutter speed.
Figure 38 illustrates a display screen result of removing the multiple translucent columns indicating a manual selection of a 1/60 second shutter speed as shown in Figure 37.
Figure 39 illustrates a digital camera display screen including icons that a user may touch or otherwise select to execute video record capture, video record pause and video clip delete commands in accordance with certain embodiments.
Figure 40 illustrates a camera display for a digital camera in a default camera state for a viewfinder mode with no user interface in accordance with certain embodiments.
Figure 41 illustrates a camera display for a digital camera after a single tap instantiates appearance of a key zone object in accordance with certain embodiments.
Figure 42 illustrates a camera display for a digital camera including a pinch and zoom feature to vary the size of a key zone object in accordance with certain embodiments.
Figure 43 illustrates a camera display for a digital camera including a key zone controls panel for selecting light, focus and speed adjustment in accordance with certain embodiments.
Figure 44 illustrates a camera display for a digital camera including a key zone adjustor panel for a light parameter selected from a key zone control panel in accordance with certain embodiments.
Figure 45 illustrates a camera display for a digital camera upon user interaction with the key zone adjustor panel of Figure 44 to adjust a light parameter.
Figure 46 illustrates a camera display for a digital camera upon user interaction with a key zone adjustor panel for a focus parameter selected from a key zone control panel in accordance with certain embodiments.
Figure 47 illustrates a camera display for a digital camera upon user interaction with a key zone adjustor panel for a speed parameter selected from a key zone control panel in accordance with certain embodiments.
Figure 48 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller and navigation panel of a main user interface without a key zone in accordance with certain embodiments.
Figure 49 illustrates capture formats that may be selected in accordance with certain embodiments.
Figure 50 illustrates a camera display for a digital camera including a selected secondary control setting panel next to a secondary control panel in accordance with certain embodiments.
Figure 51 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller panel of a main user interface with a key zone in accordance with certain embodiments.
Figure 52 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller panel of a main user interface with a key zone and a key zone controls panel in accordance with certain embodiments.
Figure 53 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller panel of a main user interface with a key zone and a key zone controls panel and a key zone controls adjustor panel in accordance with certain embodiments.
Figure 54 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller panel of a main user interface and a pre-capture filters pull-up panel in accordance with certain embodiments.
Figure 55 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller panel of a main user interface and an activities pull-down panel in accordance with certain embodiments.
Figure 56 illustrates a dashboard interface for accessing the gallery and Android in accordance with certain embodiments.
Figure 57 illustrates a camera display for a digital camera including a default beginning configuration for a gallery in accordance with certain embodiments.
Figure 58 illustrates a camera display for a digital camera including a gallery with select mode active in accordance with certain embodiments.
Figure 59 illustrates a camera display for a digital camera including an opened individual photo in a default state with options including an edit option in accordance with certain embodiments.
Figure 60 illustrates a camera display for a digital camera including an opened individual photo with user interface dismissed in accordance with certain embodiments.
Figure 61 illustrates a camera display for a digital camera including an opened individual photo and an edit options panel in accordance with certain embodiments.
Figure 62 illustrates a camera display for a digital camera including an opened individual photo and a filters, effects and frames panel in accordance with certain embodiments.
Figure 63 illustrates an android apps and environment screen in accordance with certain embodiments.
Figure 64A illustrates an expert mode menu of adjustable primary control settings for shutter speed, aperture and ISO in accordance with certain embodiments.
Figure 64B illustrates an expert mode menu of adjustable secondary control settings for white balance in accordance with certain embodiments.
Figure 64C illustrates a camera display for a digital camera operating in an expert mode, including a selected secondary control setting panel next to a secondary control panel in accordance with certain embodiments.
Figures 65A-65B illustrate a two level user interface in accordance with certain embodiments including an example top level activities interface plug-in over a hidden bottom level interface including primary and secondary controls panels, a capture mode scroller and navigation panel, and a scene display.
Figure 66A illustrates example steps for guided usage with a soft focus portrait interface plug-in in accordance with certain embodiments.
Figure 66B illustrates example steps for guided usage with a wedding shoot setup interface plug-in in accordance with certain embodiments.
Figures 67A-67B illustrate simple usage modes of a manual operation user interface for primary and secondary controls in accordance with certain embodiments.
Figures 67C-67D illustrate expert usage modes of a manual operation user interface for primary and secondary controls in accordance with certain embodiments.
DETAILED DESCRIPTIONS OF THE EMBODIMENTS
Figure 1 schematically illustrates a front perspective view of a first digital camera in accordance with certain embodiments. The digital camera shown in Figure 1 includes a grip 2, a lens 4, a hot shoe 6 and a view finder 8. Although not shown in Figure 1, the camera may be equipped with flash illumination.
The grip 2 includes a capacitive touch sensor 10 and battery compartment 12. The capacitive touch sensor 10 may be used for scrolling through a menu of processing functions or for moving a cursor on a display screen or for another function that is typically available to a user by way of a mouse or keypad of a computer or other processor-based device. The capacitive touch sensor 10 may be used as an image capture button that may have both full press shutter trigger and half press settings adjustment functionality.
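By way of illustration, the half-press/full-press behavior of such a capture button might be handled as in the following sketch, where a light press locks focus and exposure and a firmer press fires the shutter; the pressure thresholds and the `lock_3a`/`capture` hooks are assumptions of the sketch, not the camera's actual firmware interface.

```python
class CaptureButton:
    """Interpret pressure (or capacitance) readings from the grip sensor as a two-stage
    shutter button: a light/half press locks focus and exposure, a firm/full press
    triggers capture."""

    HALF, FULL = 0.3, 0.8            # normalized pressure thresholds (assumed)

    def __init__(self, lock_3a, capture):
        self.lock_3a = lock_3a       # hook: run and lock auto focus/exposure
        self.capture = capture       # hook: fire the shutter
        self.state = "idle"

    def update(self, pressure: float) -> None:
        if self.state == "idle" and pressure >= self.HALF:
            self.lock_3a()
            self.state = "half"
        elif self.state == "half":
            if pressure >= self.FULL:
                self.capture()
                self.state = "fired"
            elif pressure < self.HALF:
                self.state = "idle"  # released without capturing
        elif self.state == "fired" and pressure < self.HALF:
            self.state = "idle"
```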
The lens 4 may be replaceable with one or more other lenses having different optical properties. The lens 4 may be movable relative to an image sensor of the digital camera. The camera may include a CCD sensor cover that slides over the CCD sensor to prevent dust from getting on the CCD sensor and to prevent physical touching of the CCD sensor, e.g., when replacing a lens 4. The lens 4 may be one of multiple lenses contained within a lens holder 14. The lens 4 may be movable relative to one or more other lenses contained within the lens holder 14, and multiple lenses may be movable together relative to the image sensor.
The hot shoe 6 includes a mechanical and/or electrical coupling interface for a peripheral such as a secondary flash or a secondary image capture device or SICD. The secondary flash or SICD may be directly coupled to the digital camera housing at the hot shoe interface 6. A Bluetooth or other wireless coupling interface may be included at the hot shoe 6 or otherwise within the digital camera for coupling the camera to a secondary display, secondary or primary flash or SICD, or secondary image processing or file sharing device.
The viewfinder 8 is shown in a stowed or inactive position. The viewfinder 8 of Figure 1 is configured to be moveable, e.g., rotatable, between active and inactive positions.
VIEWFINDER
Figure 2 schematically illustrates a back perspective view of a digital camera with a movable viewfinder in accordance with certain embodiments. Figure 2 shows a back perspective view of a digital camera that includes a grip 42, hot shoe 46, viewfinder 48, lens holder 54, display 56 and compartment access door 58. The viewfinder 48 is shown in two positions in Figure 2. In a first stowed or inactive position A, the viewfinder 48 is out of the way of the display 56 and stowed similar to the viewfinder 8 illustrated schematically in Figure 1. In a second active position B, the viewfinder 48 is overlapping a portion of the display 56. In the example of Figure 2, an upper left corner section of the display 56 is overlapped by the viewfinder 48 when in the active position B.
The viewfinder 48 may be moved between positions A and B by rotation about an axis that is approximately normal to the optical axis of the digital camera. The movement of the viewfinder 48 from the stowed position A to the active position B may in certain embodiments trigger a thumbnail to appear on the overlapped portion of the display for viewing through a viewfinder window 60 an approximately same or similar image as may be viewed on the display 56 when the viewfinder 48 is stowed, and as may be captured by full-pressing the image capture button (not shown in Figure 2, but see element 10 of Figure 1). Figure 2 also shows an eyebrow rest 62 to assist the user to position and stabilize his or her eye when using the viewfinder 48.
Among the advantages of a digital camera with a viewfinder in accordance with embodiments illustrated in the examples of Figures 1-2 and other figures described herein below, a rear screen 56 of the digital camera, or a small portion of the screen 56 as illustrated in the example of Figure 2, may be disposed at an image plane, while the viewfinder 48 may be configured such that the window 60 or a lens inside the window 60 has a significant magnification like a magnifying glass to serve as a loupe for viewing the image on the display 56. The image plane may also be a separate CCD, CMOS or other image detector, such that the image data may be processed through an ISP or other processor and provided as a thumbnail or small image on the screen 56 or portion thereof which is viewable through the viewfinder 48.
The viewfinder 48 can be moved into position when the photographer wants to use it and moved aside when the user would like a full view of the screen 56. The viewfinder 48 may be adjustable to suit the distinct eyesight of one or more individual viewers.
The viewfinder 48 can use various areas of the screen depending on the resolution that is selected automatically by the camera or manually by a user. The screen 56 can automatically adjust based on detection of when the viewfinder 48 is placed in position to provide the viewfinder image and when the viewfinder is stowed to the side of the screen 56.
The viewfinder 48 may be assembled as part of a digital camera, as shown in Figure 2, or may be selectably attached and removed as a peripheral device. The attachment of the viewfinder can be performed in certain embodiments by sliding the viewfinder into the hot shoe 46. In the case of attachment of the viewfinder 48 to the hot shoe 46, an image may be provided at the center-top of the screen 56 beneath the hot shoe 46 in the example of Figure 2 for viewing through viewfinder 48. The position of the hot shoe 46 may be anywhere around the camera periphery and the image may be provided at a screen location proximate or adjacent or convenient to the location of the hot shoe 46. The viewfinder may be configured to be adjustable such that different screen locations may be viewed through it. In one embodiment, the viewfinder includes a hinged extension arm that folds out and may be rotated using a ball bearing coupling to view any or most any or a substantial or significant amount of selected screen portions.
The viewfinder 48 may be selectably stowed at position A or put into position B for use by a hinge mechanism with locking recesses at positions A and B.
The viewfinder 48 and grip 42 may be interchangeable either left and right or right and left to accommodate different dominant eyes of users. The viewfinder 48 in certain embodiments is designed with blinders or polarization filters or baffling or reflectors on the sides so that stray light is prevented from penetrating from the sides to advantageously provide a better contrast ratio.
The viewfinder 48 may have a rubber cup eye socket interface (not shown) to stabilize the user at the viewfinder and reduce stray light. The viewfinder 48 can be adjusted in certain embodiments to multiple different magnifications, and in embodiments having less versatility in the selection of magnification, one or more image parameters may alternatively be adjustable.
In another embodiment, a viewfinder view may be observable through a translucent display screen portion. A portion of the display or the entire display may be translucent such that images are viewable in a viewing mode, while the translucent display or translucent display portion may provide a view of the scene through the viewfinder in a viewfinder mode through the translucent display or translucent display portion.
In other embodiments, a viewfinder may include one or more lenses or curved mirrors, and may be multi-focal or anamorphic or asymmetric or non-symmetric or aspherical in magnification or optical power, or may be customized in accordance with an optical prescription of a particular camera user. The viewfinder may include a mirror for changing a direction of viewing relative to the direction of the object. The viewfinder may have a half-silvered mirror for partially viewing two different regions of a scene, or of two scenes to be combined or captured in succession. The
viewfinder may have a pair of mirrors. A periscopic or telescopic attachment may be available as an accessory for coupling into the optical path of the viewfinder in certain embodiments.
TOUCH SLIDER
Figures 3A-3C schematically illustrate examples of touch slider display objects that, in a first example, a user may view on the display screen 16 while thumb or finger actuating a touch slider, or that, in a second example, a user may both view and touch screen actuate a displayed touch slider object, for selecting and adjusting an imaging parameter in accordance with certain embodiments. Any touch slider other than a touch screen display object slider may include multiple posts or pegs formed together in an array of pixels that may be disposed at an accessible area of the camera housing in an overall touch slider recess or in two or more touch slider region recesses or each post or peg may recess into its own individual post or peg recess when not in use, and then protrude out of the housing when a user decides to use the touch slider. In other embodiments, a slider may include a fixed touchpad surface.
The touch slider 120 illustrated schematically in Figure 3 A is divided into four regions: flash 122, exposure 124, focus 126 and auto/smart 128. The number of touch slider regions may be more or less than four and the regions 122, 124, 126, 128 may be disposed in a circular shape or in another curved shape or in a linear or rectangular shaped region, and the sub-regions may be polygonal or curved in shape while the overall touch sensor region may be shaped differently. The touch slider 120 may overlap a preview of an image on the display screen or may be disposed to the side or above or below a preview image on the display screen or there may be separate display screens for the user interface and preview or postcapture images. There may be another region that would forward to a next set of parameters that may be selected to adjust, and there may be as many sliders generated in this manner as there are sets of parameters that may be adjusted. In certain embodiments, a user may initiate an adjustment of flash, exposure or focus or another parameter by tapping the touch slider region designated for the parameter that is to be adjusted. When any of the regions designated flash, exposure or focus is selected by the user by tapping region 122, 124 or 126, respectively, then the touch slider changes to a different touch slider 130 such as that shown in Figure 3B for adjusting a value of the selected parameter. Tapping the auto/smart region 128 of the touch slider 120 would leave it to the default settings or a
programmed process for setting imaging parameters that have not been specifically set by the user.
In certain embodiments, the user can tap one of the numbers shown in the example slider 130 of Figure 3B to adjust the value of the selected parameter by the indicated amount, e.g., +2 or -1. The user may in certain embodiments use a touch screen display slider object 130 by sliding a
finger or thumb in one direction to increase the value of the parameter or in the opposite direction to reduce the value of the parameter. The touch slider display object 130 may alternatively show actual values of the parameter that may be selected directly by tapping the slider or by finger or thumb sliding left or right to respectively decrease or increase the value of the parameter by an amount proportional to the sliding distance or other quantity that may be detected or computed for the finger or thumb movement such as slide speed or downward pressure.
Figure 3C illustrates a view through a viewfinder, e.g., viewfinder 48 of Figure 2. An image 136 appears in the viewfinder illustrated in the example of Figure 3C. A touch slider display object 120 is shown just above the image 136 in Figure 3C, and a touch slider display object 130 is shown just below the image 136. In other embodiments, one touch slider 120 or 130 would appear at a time, respectively, for selecting a parameter to adjust or for adjusting a selected parameter. A touch slider object may be divided functionally into two or more regions, including a region operating in accordance with touch slider 120 and a region operating in accordance with touch slider 130. In another example, one slider 120 may operate in accordance with touch slider 120, while another separate slider 130 may operate in accordance with touch slider 130.
Figure 3D illustrates another view through a viewfinder, e.g., viewfinder 48 of Figure 2. An image 136 appears in the viewfinder illustrated in the example of Figure 3D. A touch slider display object 120 is shown near the top overlapping the image 136 in Figure 3D and a touch slider display object 130 is shown near the bottom also overlapping the image 136. In certain embodiments, there may be three touch slider display objects, e.g., one for focus, aperture and/or depth of field, one for brightness or exposure, and one for motion blur or shutter duration control. Various numbers of touch slider display objects may be provided each corresponding to a different parameter that is amenable to manual user pre-capture or post-capture control. The objects 120, 130 in the example of Figure 3D may be translucent so that the image can be seen even where the display object 120, 130 also occupies a same display screen portion. In other embodiments, one touch slider 120 or 130 would appear at a time, respectively, e.g., for first selecting a parameter to adjust and for next adjusting the selected parameter or for first adjusting a first parameter and for next adjusting a second parameter (then a third parameter, etc.).
The touch sliders 120, 130 may be embodied in an array of touch sensitive elements coupled onto a digital camera housing or exposed through a cavity or recess in a digital camera housing, or provided as an object on a touch sensitive digital camera display screen, or combinations thereof. In certain embodiments, a camera processor is programmed to interpret a touching, tapping or sensed proximity of a finger, thumb or stylus or other tool of a user, or some combination thereof, to a specific region of the touch slider as a user command to initiate a
process for adjusting a value of a specific imaging parameter. Alternatively, a length or duration of a sliding movement or double tap time, or a tap pressure, or a sliding movement between specific regions, or another sensed movement or characteristic of a sensed movement, such as an area of a closed path, may be assigned to a specific imaging parameter.
Imaging parameters may include precapture settings for the digital camera such as an intensity of flash or other light source illumination, a selection of one or more of multiple available flash choices such as a xenon or krypton flash and one or more LEDs, and/or a duration or sequence or direction or spectral range or divergence or whether to use a Fresnel lens, or a length of exposure, or aperture size, or selection of a single or multiple still image capture, or one of multiple video capture modes, or a specific audio capture mode such as selecting from multiple available microphones, wavelength ranges to include or exclude, microphone direction, stereo balance or other available audio options, or a parameter that may be adjusted by altering a configuration of the optics of the camera, e.g., a focus or zoom setting may be adjusted by moving a lens relative to the image sensor, or magnification of a viewfinder may be adjusted by moving a magnifying lens within the viewfinder, or a parameter of a captured image such as exposure, contrast, brightness, focus distance, depth of field, white balance, digital fill flash or focal point.
In the example of Figure 3A, which is simplified for illustrative purposes, an elongated slider 120 has been separated into four regions along its length. The four regions of the slider 120 in Figure 3A are labeled flash, exposure, focus and smart/auto mode. The user may tap the exposure region, e.g., and a touch slider object 130 would show exposure values ordered from low to high values within some reasonable number of regions of the touch slider 120. A region may then be tapped which would adjust the exposure to the value provided in that region, or a sliding movement may be used to raise or lower the exposure value by an amount proportional to the distance, speed, pressure, duration or other determinable characteristic of the relative movement sensed by the touch slider 130.
A touch slider may be deemed or referred to as a linear slider in certain embodiments wherein a camera user may adjust a value of a selected imaging parameter in an amount that is proportional to a relative movement along a directional axis defined within the plane of the slider surface such as a sliding distance of a user's finger along an axis defined in the plane of the linear slider. The slider 120, 130 may have a width as small as a single pixel such that relative movements can only be detected in one direction along a single axis of the slider. Two or more touch sensitive pixels may be provided in certain embodiments along a second directional axis of the slider 120, 130 such that relative finger or thumb motion may be detected along two axes that define a plane or other contour of the camera housing surface where the slider is located. A touch
screen object slider 120, 130 or touchpad slider may have an elongated shape in certain embodiments or a circular, elliptical, square or other polygon or closed shape having some combination of curved and straight segments.
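By way of a non-limiting illustration that is not part of the original disclosure, the following Python sketch shows one way the proportional mapping described above might be implemented. The class name LinearSlider, the slider length in pixels, the exposure range and the sensitivity scaling are assumptions chosen only for the example.

# Illustrative sketch: mapping a slide gesture on a linear touch slider to a
# proportional change in a selected imaging parameter. Names are hypothetical.
class LinearSlider:
    def __init__(self, length_px, min_value, max_value, sensitivity=1.0):
        self.length_px = length_px          # slider length along its single axis, in pixels
        self.min_value = min_value
        self.max_value = max_value
        self.sensitivity = sensitivity      # optional scaling, e.g. derived from slide speed or pressure

    def apply_slide(self, current_value, slide_distance_px):
        """Return a new parameter value proportional to the sliding distance."""
        value_range = self.max_value - self.min_value
        delta = (slide_distance_px / self.length_px) * value_range * self.sensitivity
        new_value = current_value + delta
        return max(self.min_value, min(self.max_value, new_value))

# Example: an exposure-compensation slider spanning -3 EV to +3 EV over 300 pixels.
exposure_slider = LinearSlider(length_px=300, min_value=-3.0, max_value=3.0)
print(exposure_slider.apply_slide(current_value=0.0, slide_distance_px=100))   # +1.0 EV
print(exposure_slider.apply_slide(current_value=0.0, slide_distance_px=-150))  # -1.5 EV

The clamping step keeps the adjusted value within the parameter's valid range regardless of how far the user slides.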
A touch slider may also be coupled to the camera housing or configured as a display object that functions like a mouse wheel. As a display object, the mouse wheel slider may appear like a mouse wheel. With a clickable mouse wheel, a user may manually select an imaging or editing parameter by turning the wheel to scroll through parameters, clicking the mouse wheel on a parameter to open a menu of values, scrolling again with the mouse wheel to a value, and clicking the wheel again to select the value. A double click of the wheel may return the camera back to an auto mode.
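The scroll/click interaction just described can be viewed as a small state machine. The sketch below, which is not part of the original disclosure, assumes a hypothetical WheelController class and placeholder parameter names and values.

# Hypothetical sketch of the wheel scroll/click interaction described above.
PARAMETERS = {
    "exposure": [-2, -1, 0, +1, +2],
    "focus":    ["near", "mid", "far"],
    "flash":    ["off", "auto", "on"],
}

class WheelController:
    def __init__(self):
        self.mode = "select_parameter"   # or "select_value"
        self.param_index = 0
        self.value_index = 0
        self.settings = {}               # user-selected values; absent keys stay in auto mode

    def scroll(self, steps):
        if self.mode == "select_parameter":
            self.param_index = (self.param_index + steps) % len(PARAMETERS)
        else:
            self.value_index = (self.value_index + steps) % len(self.current_values())

    def click(self):
        if self.mode == "select_parameter":
            self.mode = "select_value"        # open the menu of values for this parameter
            self.value_index = 0
        else:
            name = self.current_parameter()
            self.settings[name] = self.current_values()[self.value_index]
            self.mode = "select_parameter"    # value chosen; back to parameter scrolling

    def double_click(self):
        self.settings.clear()                 # return every parameter to auto mode
        self.mode = "select_parameter"

    def current_parameter(self):
        return list(PARAMETERS)[self.param_index]

    def current_values(self):
        return PARAMETERS[self.current_parameter()]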
Left-click, right-click and/or single-click mouse-like display objects are also included in certain embodiments. A mouse wheel and/or left-click, right-click and/or single-click button may be provided on the camera housing, e.g., on the top or back or side of the grip or on the opposite side of the camera. Combinations of a mouse wheel and clickable region are also provided in certain embodiments on the camera housing. In certain embodiments, combinations of a mouse wheel and/or clickable region are provided along with a touchpad slider on the camera housing and/or as display objects or as a mix of one or more display objects and touch, click and/or wheel regions on the camera housing.
Quantities associated with a third dimension normal to the plane of the slider 120, 130 which is coplanar with the camera housing surface and/or with the display screen surface, such as downward force or pressure or proximity, may be utilized by assigning certain commands to them in certain embodiments. A touch slider 120, 130 in accordance with certain embodiments may have the functionality of a mouse, joystick, or game controller or may be limited to a short list of imaging parameters as in the illustrative example of Figure 3A or something in between. For example, the digital camera may be programmed to process a tap in a same or similar manner as a mouse click and to process a relative movement of a finger or thumb of a user or a stylus or other tool held by a user along the length or within the area of the slider in a same or similar manner as a movement of a mouse.
A touch slider may be located at the top of the camera housing or the front of the camera housing, or the rear of the camera housing or grip, or on a touch screen display, and may be located on either side of the camera. The touch slider can be activated using a haptic mechanism such as a touch screen or a touch slider haptic mechanism. The camera may be configured for finger or thumb actuated haptic activation of the touch slider.
An imaging parameter may be adjusted using the touch slider as a single parameter adjustment axis, or correction may be performed using the touch slider as a complex combination of some of the parameters above. The touch slider may be used in conjunction with a duplicate visual display and/or may be functionally divided into two regions: one object on the display and/or one region of the touch slider being configured for selecting a mode of correction and the other object on the display and/or other region of the touch slider being configured for selecting a quantity of correction. In one example, both of the objects illustrated in Figures 3 A and 3B may be provided together at the same time and/or side by side on the camera display and/or the touch sliders 120, 130 may be functionally separated into a mode selection region and a quantity of correction region (e.g., upper half and lower half or left side and right side). Alternatively, a digital camera in accordance with certain embodiments may include a pair of touch sliders that are functionally distinguished as a mode selection slider and a quantity of correction slider.
A fingerprint reader may be included as part of one of the touchpad or touchscreen sensors for security. The camera may include a rear camera that also provides some security such as by identifying the camera user and/or by monitoring and/or recording and analyzing what is behind the camera user, and, e.g., signaling the user or sounding an alarm when a dangerous condition is detected. The display may be configured to flip from side to side depending on which camera's images are desired to be on a single sided display or double-sided display or in picture-in-picture or side-by-side format, or if only one camera's images are to be selected to be displayed.
A touchpad sensor, or mouse wheel, or touchscreen object or other built-in device coupled accessibly at an exterior location of the camera housing may be used for scrolling through menus and executing programs and otherwise controlling the camera manually through the user interface appearing on the display. In certain embodiments, an autofocus position may be selected within an image and/or amongst the display screen pixels in a set autofocus mode or otherwise during a precapture mode of the camera when a touch pad surface is mapped proportionally to the display screen. When the camera is in this set autofocus mode, a location on the touchpad may be tapped causing a location on the display screen and/or within an image to be captured or video being captured to be set as a user-selected auto-focus location. In certain embodiments, more than one auto-focus location can be selected, such as a primary and secondary or multiple faces in no particular order of importance.
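As a non-limiting illustration of the proportional mapping between a touchpad surface and the display screen described above, the following Python sketch maps a tap location on a touchpad to an auto-focus location in the preview. The touchpad and screen dimensions, and the idea of keeping multiple focus points in a list, are assumptions for the example.

# Illustrative sketch: map a touchpad tap to an auto-focus location in the preview image.
def touchpad_to_focus_point(tap_x, tap_y, touchpad_w, touchpad_h, screen_w, screen_h):
    """Proportionally map touchpad coordinates onto display/preview pixel coordinates."""
    fx = int(round((tap_x / touchpad_w) * (screen_w - 1)))
    fy = int(round((tap_y / touchpad_h) * (screen_h - 1)))
    return fx, fy

# More than one auto-focus location may be kept, e.g., a primary and a secondary face.
focus_points = []
focus_points.append(touchpad_to_focus_point(30, 12, touchpad_w=64, touchpad_h=40,
                                            screen_w=1920, screen_h=1080))
focus_points.append(touchpad_to_focus_point(50, 30, touchpad_w=64, touchpad_h=40,
                                            screen_w=1920, screen_h=1080))
print(focus_points)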
LED LIGHTING
Figures 4A-4C schematically illustrate front views of example digital cameras that each include multiple LEDs 141 A for illuminating objects to be imaged in accordance with certain
embodiments. The example illustration of Figure 4A shows four LEDs 141 A disposed across the front of the camera. In some embodiments, multiple LEDs 141A are disposed across the front of the camera and behind an elongated Fresnel lens. A camera in accordance with a multiple LED embodiment may include as few as two LEDs 141 A that may be built-in to the camera or attachable at a hot shoe bracket or detachable for adjusting a position or angle of illumination during image capture, or remotely controlled by the camera as a peripheral accessory. The LEDs 141 A may be relatively disposed in various ways and embodiments of digital cameras herein generally may include no flash LEDs or any number of LEDs as flash illumination components.
A microphone 141B is also shown in the example of Figure 4A to the right of the lens 114. In certain embodiments, three microphones 141B are disposed in the plane of the front surface of the camera that form a triangle such as a right triangle or otherwise to permit, e.g., stereo audio sound recording when the camera is in different orientations such as landscape and portrait orientations.
The LEDs 141 A may be clustered in groups as illustrated schematically in Figure 4B. In the example of Figure 4B, two groups 143 of two LEDs 141 A are disposed to the left and right of a third group 145 of three LEDs 141 A. The LEDs 141 A of any of the groups 143, 145 may be selected to complement other LEDs 141 A in the group. For example, a group 143, 145 may include LEDs that offer different wavelength spectra, different color temperature, different intensity, different divergence (spot, wide), different duration, or may be positioned at a slightly different angle.
The LEDs 141 A of a group 143, 145 may have different delays so that the LEDs 141 A flash at slightly offset times. The LEDs 141 A of a group 143, 145 may be activated at different times to allow the capture of multiple images each illuminated by one LED flash 141 A or a subset of LEDs 141 A. In one embodiment, these multiple images may be captured and/or illuminated using various light angles.
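A minimal sketch of this multi-illumination capture sequence is given below; it is not part of the original disclosure. The DummyLED and DummyCamera classes, the per-group delay and the fire/capture calls are placeholders standing in for whatever hardware interface an implementation would expose.

import time

class DummyLED:
    def fire(self):            # placeholder: pulse this LED for the duration of the exposure
        pass

class DummyCamera:
    def capture(self):         # placeholder: return one short-exposure frame
        return "frame"

def capture_multi_illumination_burst(camera, led_groups, offset_s=0.005):
    """Capture one frame per LED group, firing each group at a slightly offset time."""
    frames = []
    for index, group in enumerate(led_groups):
        time.sleep(index * offset_s)        # stagger the flashes by a small per-group delay
        for led in group:
            led.fire()
        frames.append(camera.capture())     # each frame is lit by a different LED or group
    return frames

groups = [[DummyLED(), DummyLED()], [DummyLED(), DummyLED(), DummyLED()], [DummyLED(), DummyLED()]]
burst = capture_multi_illumination_burst(DummyCamera(), groups)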
The groups 143, 145 may be activated at a same time or at different times or for different durations, e.g., to provide a "back to front flash" effect. A specific selection of LEDs 141 A and/or groups 143, 145 of LEDs 141 A to use in capturing a specific image may provide specific fill light characteristics for the image.
The camera may be configured to determine depth by using the disparity of the lights, e.g., to generate a 3D image or to adjust focus. A movement effect may be created by activating multiple LEDs 141 A in a particular order or sequence while capturing multiple images each with a different LED 141 A lighting the scene. The camera may be programmed to automatically determine a desired fill light based on an analysis of preview images. This determination may
depend on a location of a main light source such as the sun, an external light or a main camera flash, a direction of shadows, overall color balance, and/or other parameters.
One or more LEDs 141 A may be used in combination with a xenon flash to provide short/long lighting. A xenon flash may be attachable at a hot shoe bracket or built-in such as a pop-up flash 129.
Figure 4C schematically illustrates a digital camera with multiple LEDs 141 A disposed on the camera lens holder around the optical path of the camera at the periphery of a light collecting area of the lens at the object end of the lens holder. Six LEDs 141 A are shown in Figure 4C disposed on the lens holder 124, although any number of LEDs 141 A may be disposed on the lens holder 124 in various embodiments. One or more LEDs 141 A may be disposed on the lens holder as in Figure 4C, while one or more LEDs 141 A may be disposed on the camera housing such as in
Figure 4A or Figure 4B.
LEDs 141 A may be disposed within recesses defined in the housing when not in use.
When a LED 141 A is to be used to provide illumination during image capture, the LED 141 A may protrude out of the recess to provide illumination during an image capture and then recede back into the recess. An optional pop-up flash 129 may also be configured to recede into the housing when not in use.
Another digital camera is provided that includes an image sensor within a camera housing, an optical assembly including one or more lenses for forming images on the image sensor, a display screen for viewing the images, a processor and a lens mounted flash coupled to the lens housing for providing illumination during image capture.
AUDIO
Figure 5 schematically illustrates a perspective view of a digital camera that includes multiple microphones 160 in accordance with certain embodiments. The example of Figure 5 includes three microphones 160 that are directed for receiving sounds from the front of the camera and one microphone that is directed for receiving sounds from the rear of the camera. One of the front-facing microphones 160 is located near the grip 2 and near the top of the camera, and may be otherwise disposed within the grip and/or nearer the bottom of the camera.
Another front-facing microphone 160 is disposed at the top-right of the front surface of the camera in the example of Figure 5. This microphone 160 may be disposed within a viewfinder 108 such as in the example of Figure 2, such that the microphone would face front when the viewfinder 108 is stowed and would face to the side when the viewfinder 108 is in use.
Alternatively, the microphone 160 may be disposed below and/or to the left of the viewfinder 108 such as to face front notwithstanding the configuration of the viewfinder 108.
A third front-facing microphone 160 is disposed at the bottom-right of the front surface of the camera. The rear-facing microphone 160 is disposed between two front facing microphones 160 near the top of the camera in the example of Figure 5. This rear-facing microphone 160 may be located below and/or to the side of a hot shoe bracket.
Various embodiments of digital cameras are provided that include multiple microphones aligned with the optical assembly to record sound during image capture. In certain embodiments, the camera includes at least three positioned microphones to generate the horizontal disparity in both portrait and landscape mode. Each pair of spaced microphones is disposed to capture stereo sound and signal processing may be used to further separate the right and left channels by subtracting left information from the right channel and vice versa. Referring to Figure 5, for landscape mode, microphones A and B can be used. For portrait mode, microphones B and C may be used. In the event that sound is being recorded with the camera rotated 45° then microphones A and C may be used. Rotating the camera from landscape to portrait may be monitored by an accelerometer. In certain embodiments, the microphone pair being used during a sound recording may be changed one or more times as changes in camera orientation are determined by the accelerometer.
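The orientation-dependent microphone pairing described above is sketched below as a non-limiting illustration. The labels A, B and C, the roll-angle convention and the switching thresholds are assumptions chosen only for the example.

# Illustrative sketch: select the stereo microphone pair based on accelerometer roll angle.
def select_stereo_pair(roll_degrees):
    """Pick the microphone pair with the largest horizontal separation for the current orientation."""
    roll = roll_degrees % 180
    if roll < 22.5 or roll >= 157.5:
        return ("A", "B")        # landscape: A and B give horizontal disparity
    elif 67.5 <= roll < 112.5:
        return ("B", "C")        # portrait: B and C give horizontal disparity
    else:
        return ("A", "C")        # roughly 45 degrees: A and C

# The active pair may be re-evaluated whenever the accelerometer reports a new orientation,
# so the stereo image stays correct if the camera is rotated while recording.
print(select_stereo_pair(0))     # ('A', 'B')
print(select_stereo_pair(90))    # ('B', 'C')
print(select_stereo_pair(45))    # ('A', 'C')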
In certain embodiments, the camera may include three front-facing unidirectional microphones, or the A, B and C mics in Figure 5, in order to provide a stereo image with low background noise. Unlike omnidirectional microphones, unidirectional microphones do not pick up sound well in their rear direction.
A rear facing directional microphone is also included in certain embodiments, e.g., as illustrated schematically at Figure 5. The rear facing mic, or the "Z" mic, may be used to pick up the voice of the camera user or other sound coming from the rear of the camera. The Z microphone can be omnidirectional, or unidirectional or bidirectional. The level of user pickup can be altered in certain embodiments by changing the sensitivity of this rear facing microphone. In certain embodiments, the polarity of microphone Z may be selectively altered from positive to negative to further cancel the sound from the camera user during those times when the camera user does not want their voice recorded. Alternatively, the sound from the user's voice may simply be subtracted from the other microphones using an algorithm. The rear facing mic, or Z mic, or a different internal mic may be used to cancel a certain type of noise or certain types of noises such as lens motor noise, shutter noise and/or camera handling noise. Sound gathering and/or sound forming may be provided automatically or with manual touch screen input based on face or other object detection, tracking and/or recognition including blind source recognition, and/or on motion detection. For example, a moving face or other object may be tracked and the camera audio may be dynamically adjusted based on information gathered during the tracking, such as direction and volume of sounds.
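A minimal sketch of the subtraction-based cancellation mentioned above is given below, assuming time-aligned NumPy sample arrays; the fixed leakage factor is an assumption, and a real implementation would estimate it adaptively.

import numpy as np

def cancel_rear_pickup(front_left, front_right, rear_z, leakage=0.6):
    """Subtract a scaled copy of the rear-facing (Z) signal from the front channels."""
    left_clean = front_left - leakage * rear_z      # leakage is an assumed, fixed coupling factor
    right_clean = front_right - leakage * rear_z
    return left_clean, right_clean

# Example with synthetic signals: a "voice" present in all three microphones.
t = np.linspace(0, 1, 48000)
voice = 0.5 * np.sin(2 * np.pi * 220 * t)
scene = 0.3 * np.sin(2 * np.pi * 880 * t)
left, right = cancel_rear_pickup(scene + 0.6 * voice, scene + 0.6 * voice, voice)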
Three unidirectional electret microphones may be used to capture sound from the front of the camera, as illustrated at Figure 5.
IMAGE CAPTURE AND PROCESSING INTERFACE
Figures 6-15 schematically illustrate digital cameras that are programmed to capture images that have desired quality characteristics. Precapture settings may be adjusted
automatically by the camera based on information gathered from preview images or user input or programming or combinations thereof. Captured images may also be edited or combined to form new or processed images, and sequences of images may be captured as video or to enhance still image quality. An advantageous user interface, image processor and program code embedded on storage media within the digital camera housing facilitate the capture and processing of quality images and video, as well as the display, storage and transmission of those quality images and video. Examples are provided and schematically illustrated in Figures 6-15, which show images of various objects and user interface tools that may be provided on a rear display screen of a digital camera that is configured in accordance with certain embodiments to receive user input by manipulation of one or more touch sliders, e.g., as illustrated at Figures 3A-3D, by manipulating a touch screen display slider object or a touchpad on the camera housing.
Figure 6 schematically illustrates a back view of a digital camera that includes a display screen and various buttons for image capture and/or editing control, including buttons for capture type control (e.g., video, time lapse, slow motion, panorama, 3D, cinemagraph, 3D audio, moment), secondary controls such as timer and flash, adjustment controls, global controls such as gallery, app store and settings, and a thumbnail of a previous image capture in accordance with certain embodiments.
Figure 7 schematically illustrates a back view of a digital camera that includes a display screen and various buttons for image capture and/or editing control, including buttons for adjusting a time parameter and/or scrolling through a sequence of images, for selecting and editing various parameters using smart menus and a touch slider or linear slider for selecting an image parameter for adjustment and then adjusting the image parameter, and/or for scrolling or showing a current time parameter disposed between a start time and an end time for the sequence of images. In certain embodiments the slider object changes between parameter selecting and
adjusting modes, while in other embodiments, two different slider objects appear on the display.
Figure 8 schematically illustrates a back view of a digital camera that includes a display screen and a smart reset button.
Figure 9 schematically illustrates a back view of a digital camera that includes a display screen such as a touch screen, a smart button, a value indicator, smart correction and/or scrolling button, and a linear slider or touch slider for adjusting parameters such as exposure, contrast, fill-flash, face priority, brightness, focus, and various other image capture and/or editing parameters, in accordance with certain embodiments. The camera is programmed to provide image quality alerts as a smart capture feature. In certain embodiments, the camera will notify a user that a specific parameter is poor, e.g., the captured image may be too dark or too blurry. One or more thumbnails of recent images or shots captured may have frames of different colors based on image quality, e.g., red for poor, yellow for so-so, and green for good.
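The quality-alert grading could be implemented in many ways; the sketch below, which is not from the disclosure, uses a brightness mean and a Laplacian-variance blur proxy with assumed thresholds to choose the red/yellow/green frame color.

import numpy as np

def grade_capture(gray_image):
    """Return ('green'|'yellow'|'red', message) for an 8-bit grayscale image as a float array."""
    brightness = gray_image.mean()
    # Variance of a simple Laplacian as a blur proxy: low variance suggests a blurry image.
    lap = (np.roll(gray_image, 1, 0) + np.roll(gray_image, -1, 0) +
           np.roll(gray_image, 1, 1) + np.roll(gray_image, -1, 1) - 4 * gray_image)
    sharpness = lap.var()
    if brightness < 30:
        return "red", "Image may be too dark"
    if sharpness < 50:
        return "red", "Image may be too blurry"
    if brightness < 60 or sharpness < 150:
        return "yellow", "Image quality is marginal"
    return "green", "Image quality looks good"

frame = (np.random.rand(480, 640) * 255).astype(np.float64)
print(grade_capture(frame))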
Figure 10 schematically illustrates a back view of a digital camera that includes a display screen showing a live image, a favorite select button, a delete select button, a global control button, and advanced edits and share buttons, in accordance with certain embodiments.
Figure 11 schematically illustrates a back view of a digital camera that includes a display screen showing a feedback bubble that a user can accept, reject or ignore in accordance with certain embodiments.
Figure 12 schematically illustrates a back view of a digital camera that includes a display screen and buttons for crop control and other adjustment controls, and a button for confirming a crop or other adjustment, and cancel and smart buttons, in accordance with certain embodiments.
Figure 13 schematically illustrates a back view of a digital camera that includes a display screen and a timeline with indicators of original and current time values disposed between start and end times, and buttons for canceling to exit adjustment mode without saving and for confirming to save changes, and a smart button, in accordance with certain embodiments.
Figure 14 schematically illustrates a back view of a digital camera that includes a display screen showing a selected image for sharing, and buttons for email, text, facebook, and networked second camera or other device, in accordance with certain embodiments.
Figures 15A-15B schematically illustrate a back view of a digital camera that includes a display screen that shows a level guide that auto appears when the camera is not leveled and disappears when the level is restored in accordance with certain embodiments.
A camera in accordance with certain embodiments may include a steganographic watermarking feature that may be automatic and/or may have default or customized settings for
embedding hidden information within digital images. Such hidden information may include the identity of the photographer or owner of the camera or a time/date stamp or geolocation (GPS) data or any selected message or parameter value that is not designed for editing and configured to be unique such that an unauthorized phony or edited image could be identified as not containing the proper steganographic watermark. The hidden information may be converted to noise when added to a camera image.
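One common way to embed a message so that it behaves like low-amplitude noise is least-significant-bit encoding; the sketch below illustrates that general technique and is not necessarily the method contemplated by the disclosure. The grayscale NumPy image and the owner string are assumptions for the example.

import numpy as np

def embed_watermark(image, message):
    """Write the message bits into the least-significant bits of an 8-bit grayscale image."""
    bits = np.unpackbits(np.frombuffer(message.encode("utf-8"), dtype=np.uint8))
    flat = image.flatten().copy()
    if bits.size > flat.size:
        raise ValueError("message too long for this image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits     # overwrite least-significant bits
    return flat.reshape(image.shape)

def extract_watermark(image, message_length_bytes):
    bits = image.flatten()[:message_length_bytes * 8] & 1
    return np.packbits(bits).tobytes().decode("utf-8")

img = (np.random.rand(64, 64) * 255).astype(np.uint8)
marked = embed_watermark(img, "owner:camera123 2017-07-18")
print(extract_watermark(marked, len("owner:camera123 2017-07-18")))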
A camera in accordance with certain embodiments may be configured with multiple screens or divisions of a single screen that may simultaneously display images in precapture, capture and/or post-capture editing modes. For example, an image to be captured may be displayed for editing precapture parameters on one screen while a previously captured image may be displayed for post-capture editing, transmitting or managed storing on another screen.
In certain embodiments, an area may be selected for directing an auto-focus feature within a scene by instantiating an object on the camera display screen having a certain size and shape and overlaying a certain subset of pixels within an image about to be captured. This auto-focus object may be adjustable in size and/or shape, which may be rectangular, elliptical, circular, square, or arbitrarily curved or including straight and/or curved continuous or dashed segments of arbitrary or adjustable length or spacing. The auto-focus object may be used to sweep through different aperture and/or shutter speed settings, e.g., to set aperture and/or shutter speed priorities. The auto-focus area may also be used by the camera for auto-adjusting exposure and/or for manually adjusting auto defaults, which can be remembered by the camera and used for determining auto- adjustments in future images.
In certain embodiments, both an auto-focus object and an auto-exposure object are provided on a camera display within an image to be captured. The auto-focus object and auto-exposure object may have same or different sizes or shapes, and may be centered or located within the image to enclose entirely different subsets of pixels, identical subsets of pixels or combinations of same and different pixels within the image to be captured. In addition, multiple auto-focus objects and/or multiple auto-exposure objects may be provided at different locations within an image to be captured or having different sizes or shapes. The camera may capture multiple images, each prioritizing auto-focus and/or auto-exposure in accordance with one of the objects, and the images may be combined to produce a final image that reflects the multiple prioritized auto-focus and/or auto-exposure areas.
A camera in accordance with certain embodiments may be configured to compose a collage of multiple images that may be posted together, e.g., on a social network that permits only one posting at a time.
A camera in accordance with certain embodiments may be configured to network with other cameras at an event. Images from two or more cameras may be combined into a single image or video and/or information may be gathered by one camera and used for capturing or editing still or video images or sounds captured with another camera. Two or more networked cameras may coordinate for exposure setting. Time-based and/or location-based social networking and/or camera to camera sharing is provided in certain embodiments. A connection of two cameras on the network may be established when the two cameras become closer than a preset distance and/or the two cameras may be disconnected when they become separated by more than a preset distance.
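As a non-limiting illustration of distance-based pairing between two networked cameras, the sketch below connects when the cameras come within a preset distance and disconnects when they separate. The thresholds, the hysteresis gap and the equirectangular GPS-to-meters approximation are assumptions for the example, not details from the disclosure.

import math

CONNECT_DISTANCE_M = 25.0
DISCONNECT_DISTANCE_M = 40.0     # larger than the connect threshold to avoid flapping

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate ground distance in meters between two GPS fixes (equirectangular)."""
    mean_lat = math.radians((lat1 + lat2) / 2)
    dx = math.radians(lon2 - lon1) * math.cos(mean_lat) * 6371000.0
    dy = math.radians(lat2 - lat1) * 6371000.0
    return math.hypot(dx, dy)

def update_link(connected, my_pos, peer_pos):
    d = distance_m(*my_pos, *peer_pos)
    if not connected and d < CONNECT_DISTANCE_M:
        return True       # establish a sharing session with the nearby camera
    if connected and d > DISCONNECT_DISTANCE_M:
        return False      # tear down the session once the cameras separate
    return connected

print(update_link(False, (37.7749, -122.4194), (37.77492, -122.41939)))  # True: a few meters apart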
In certain embodiments, a stored image with desired image qualities may be selected in a precapture stage indicating to the camera to adjust precapture settings to capture an image of a current scene having approximately the desired image qualities of the selected stored image.
A camera in accordance with certain embodiments may be configured to provide long exposure images of live scenes including moving objects without blurriness. The long exposure images are provided by combining data from multiple short exposure images. For example, a ten second exposure night shot may be provided by combining a thousand 0.01 second shots. Objects moving left or right may be edge-matched and translated to a common position, while objects moving toward or away may be reduced or enlarged, respectively, to a common size, and rotating objects may be counter-rotated to a common directional orientation, and blocked portions of objects in a subset of images may be taken into account by multiplying parametric contributions from other images wherein the object portions are unblocked.
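A simplified sketch of combining many short exposures into one long exposure is shown below. It only averages optionally translated frames and is not the full alignment, scaling and occlusion handling described above; the frame sizes and random test data are assumptions.

import numpy as np

def stack_short_exposures(frames, offsets=None):
    """frames: list of float HxW arrays; offsets: optional per-frame (dy, dx) from object tracking."""
    acc = np.zeros_like(frames[0], dtype=np.float64)
    for i, frame in enumerate(frames):
        if offsets is not None:
            dy, dx = offsets[i]
            frame = np.roll(np.roll(frame, -dy, axis=0), -dx, axis=1)   # translate to a common position
        acc += frame
    return acc / len(frames)        # averaging keeps the result in the original intensity range

# e.g., a ten second night exposure approximated by one thousand 0.01 second frames
frames = [np.random.rand(120, 160).astype(np.float32) * 0.02 for _ in range(1000)]
long_exposure = stack_short_exposures(frames)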
A camera in accordance with certain embodiments may have a low power consumption motion sensor that initiates a camera boot-up process upon sensing that the camera is being picked up or otherwise manipulated into a position indicative of a camera user's desire to capture a picture. The sensation that initiates the camera boot-up process may include a touching of the camera or touching by a human hand or by a particular human hand on a grip with a finger on a fingerprint reader. An accelerometer built-in to the camera may sense motion to trigger start up, or a lens cap being removed may trigger start up. The sensation that initiates camera boot up may include a turning on of lights in a room where the camera is sitting or an approach by a human being based on a characteristic heat or sound signature.
A camera in accordance with certain embodiments may be configured to detect words spoken nearby the camera while capturing video images and to discern whether to include the words as commentary or discard the words as noise or to provide an option to the user to include or discard the words and/or other discernible sounds or noises captured with the video.
Figure 16 schematically illustrates a digital camera display screen showing an opaque column of selectable icons of a user interface and a photographic scene overlayed by a column of selectable translucent icons in accordance with certain embodiments. The selectable icons in the opaque column include a navigator or task manager icon, a home screen icon, a camera capture modes icon and a precapture settings icon. The navigator may be selected to access a dashboard, Apps (including open/live and recently-used apps) and Activities. The home screen icon may be selected to access the home screen of the user interface.
The column of selectable translucent icons shown in Figure 16 appears when the camera capture modes icon is selected. The selectable translucent icons in this column include a swap camera icon for selecting between a front camera and a rear camera, a still capture mode icon, a video capture mode icon, and a flash settings icon. A thumbnail is also available for viewing recent photos or other stored photos.
Figure 17 schematically illustrates a digital camera display screen showing the opaque column and the column of selectable translucent icons of Figure 16, as well as a second column of selectable translucent icons for selecting from several camera capture modes including auto, smart, moment, pano, manual, slo-mo, hyper and lapse.
Figure 18 schematically illustrates a digital camera display screen showing a photographic scene overlayed by the opaque column and the first and second columns of selectable translucent icons of Figure 17, as well as third and fourth columns of selectable translucent icons. The third column of selectable translucent icons appears when the smart icon is selected from the second column of selectable translucent icons. The selectable translucent icons of the third column include depth, motion and light. The fourth column of selectable translucent icons appears when the depth icon is selected from the third column and includes values of depth of field that may be selected from.
Figure 19 illustrates a digital camera display screen showing a photographic scene overlayed by the opaque column and four columns of selectable translucent icons of Figure 18. A specific depth selection appears as a highlighted row of icons across the second, third and fourth columns of selectable translucent icons.
Figure 20 schematically illustrates a digital camera display screen showing a photographic scene overlayed by the opaque column and the first and second columns of selectable translucent icons of Figure 17, as well as third and fourth columns of selectable translucent icons. The third column of selectable translucent icons appears when the smart icon is selected from the second column of selectable translucent icons. The selectable translucent icons of the third column include depth, motion and light. The fourth column of selectable translucent icons appears when
the motion icon is selected from the third column and includes values of motion that may be selected from.
Figure 21 schematically illustrates a display screen result of scrolling to and selecting one of the multiple icons representing a specific value of motion. The specific motion value selection appears as a highlighted row of icons across the second, third and fourth columns of selectable translucent icons.
Figure 23 schematically illustrates a display screen result of scrolling to and selecting one of the multiple icons representing a specific light value option. The specific light value option selection appears as a highlighted row of icons across the second, third and fourth columns of selectable translucent icons.
Figure 22 illustrates a digital camera display screen showing a photographic scene overlayed by the opaque column and the first, second and third columns of selectable translucent icons of Figures 17-21, as well as a fourth column of selectable translucent icons representing several light value options that may be selected from.
Figure 24 illustrates a display screen result of selecting one of multiple translucent icons from the first column of selectable translucent icons in accordance with certain embodiments, the selected still photo mode icon being indicated by universal brightening or white or gray translucent overlaying of a still photo mode icon-containing pixel area, while maintaining the highlighting illustrated at Figure 23 of an extended row of icons across the second, third and fourth columns indicating a specific light value option.
Figure 25 illustrates a digital camera display screen result of selecting a "clear" icon in accordance with certain embodiments and removing the universal brightening or white or gray translucent overlaying of the still photo mode icon-containing area illustrated at Figure 24.
Figure 26 illustrates a display screen that is overlayed in certain embodiments by a fifth column of selectable translucent icons on the opposite side of the display screen from the opaque column and first to fourth columns of selectable translucent icons of Figure 22.
Figure 27 illustrates a digital camera display screen in accordance with certain
embodiments that is overlayed by the opaque column and the first column of selectable translucent icons of Figure 16 and the fifth column of selectable translucent icons of Figure 26, as well as a sixth column of selectable translucent icons including on, off and auto HDR settings. A highlighted row of two icons selected each from the fifth and sixth columns indicates that HDR on has been selected.
Figure 28 illustrates a digital camera display screen in accordance with certain
embodiments that includes the opaque column and photographic scene overlayed by the first
column of selectable translucent icons of Figure 16, and the fifth column of selectable translucent icons of Figure 26, as well as a sixth column of selectable translucent icons including off and three grid mode settings. A highlighted row of two icons selected from the fifth and sixth columns indicates that a grid mode has been selected, e.g., "#" and "#," having the effect of overlaying a grid onto the display screen.
Figure 29 illustrates a digital camera display screen in accordance with certain
embodiments that includes the opaque column and photographic scene overlayed by the first column of selectable translucent icons of Figure 16, and a translucent key zone cursor overlay in accordance with certain embodiments, e.g., an unfilled circle or dashed circle appearing to enclose a circular pixel area of the display screen and showing a plus sign at a center of the circular pixel area.
Figure 30 illustrates a digital camera display screen that includes the opaque column and photographic scene overlayed by the first column of selectable translucent icons of Figure 16, and a translucent key zone cursor overlay as in Figure 29, and a translucent, broken-circle-shaped overlay that includes selectable depth, motion and light circle segments in accordance with certain embodiments.
Figure 31 illustrates the display screen of Figure 30, and a highlighted depth segment at a selected location within the depth segment of the translucent, broken-circle shaped overlay that corresponds to a particular value of depth of field.
Figure 32 illustrates a display screen result in accordance with certain embodiments of adjusting the value of depth of field by sliding to or otherwise selecting, indicated by highlighting, of a different location within the depth segment of the translucent, broken-circle shaped overlay of Figure 31.
Figure 33 schematically illustrates a digital camera display screen showing a photographic scene overlayed by the opaque column and the first and second columns of selectable translucent icons of Figure 17, as well as third and fourth columns of selectable translucent icons. The third column of selectable translucent icons appears when the manual icon is selected from the second column of selectable translucent icons. The selectable translucent icons of the third column include aperture or A, ISO, shutter speed or S, white balance or WB, and exposure or EXP. The fourth column of selectable translucent icons appears when the aperture or A icon is selected from the third column and includes values of aperture, e.g., F2, F2.8, F4, F5.6, F8, F11 and F16, from which a user may choose to manually select.
Figure 34 schematically illustrates a digital camera display screen showing a photographic scene overlayed by the opaque column and the first and second columns of selectable translucent
icons of Figure 17, as well as third and fourth columns of selectable translucent icons. The third column of selectable translucent icons appears when the manual icon is selected from the second column of selectable translucent icons. The selectable translucent icons of the third column include aperture or A, ISO, shutter speed or S, white balance or WB, and exposure or EXP. The fourth column of selectable translucent icons appears when the ISO icon is selected from the third column and includes values of ISO, or sensitivity to available light, e.g., 100, 200, 400, 800, 1600, 3200, and 6400, from which a user may choose to manually select.
Figure 35 schematically illustrates a digital camera display screen showing a photographic scene overlayed by the opaque column and the first and second columns of selectable translucent icons of Figure 17, as well as third and fourth columns of selectable translucent icons. The third column of selectable translucent icons appears when the manual icon is selected from the second column of selectable translucent icons. The selectable translucent icons of the third column include aperture or A, ISO, shutter speed or S, white balance or WB, and exposure or EXP. The fourth column of selectable translucent icons appears when the shutter speed or S icon is selected from the third column and includes values of shutter speed, e.g., 4s, 1s, 1/4s, 1/15s, 1/60s, 1/250s, 1/1000s, 1/4000s and 1/16000s, from which a user may choose to manually select.
Figure 36 schematically illustrates a digital camera display screen showing a photographic scene overlayed by the opaque column and the first and second columns of selectable translucent icons of Figure 17, as well as third and fourth columns of selectable translucent icons. The third column of selectable translucent icons appears when the manual icon is selected from the second column of selectable translucent icons. The selectable translucent icons of the third column include aperture or A, ISO, shutter speed or S, white balance or WB, and exposure or EXP. The fourth column of selectable translucent icons appears when the white balance or WB icon is selected from the third column and includes icons representing several white balance options from which a user may choose to manually select.
Figure 37 illustrates a digital camera display screen showing an indication of recent capture of two minutes and fifteen seconds of video, the display screen also showing the opaque column and first to fourth columns of selectable translucent icons of Figure 35 including a row of highlighted icons indicating a manually-selected value of 1/60 seconds as a camera shutter speed.
Figure 38 illustrates the display screen of Figure 37 without the second to fourth columns of selectable translucent icons indicating the manual selection of a 1/60 second shutter speed.
Figure 39 illustrates the digital camera display screen of Figure 38, as well as icons that a user may touch or otherwise select to execute video clip play, video play or record pause and video clip delete commands in accordance with certain embodiments.
Figure 40 illustrates a camera display for a digital camera in a default camera state for a viewfinder mode with no user interface in accordance with certain embodiments. The default fullscreen display size may be 1920 x 1080 pixels (16:9 ratio) in certain embodiments. By default the camera may be in 100% automatic mode and ready for capture using the camera's physical shutter button. A main UI symbol is shown in Figure 40, which indicates a swipeable interaction that can also be tapped. Swiping or tapping the main UI symbol would cause a capture mode/navigation bar to appear on the right side of the display screen and a secondary controls bar on the left side.
Figure 41 illustrates a camera display for a digital camera after a single tap instantiates appearance of a key zone object in accordance with certain embodiments. In certain
embodiments, the key zone object may be moved for location control of one or more parameters or groups or general categories of parameters within a photographic scene, e.g., a first group may include light, exposure and/or ISO, a second group may include focus, aperture and/or depth of field, and a third group may include shutter speed, motion, crispness and/or blur. These groups may be combined and invoked by a single tap at any point in the screen. In certain embodiments, by default the image may be automatically focused on the portion in the scene enclosed by the key zone at the location of the tapping and/or the overall lighting may be automatically adjusted to favor that same location of the scene. A dismiss glyph may appear next to the key zone symbol that the user can tap on to delete the key zone object. The user may move the key zone object around by dragging it. The guiding label "TAP AND HOLD TO ADJUST" may appear briefly when the key zone is set and then fade away.
Figure 42 illustrates a camera display for a digital camera including a pinch and zoom feature to vary the size of a key zone object in accordance with certain embodiments. The user may use two-fingered pinch and zoom gestures to adjust the size of the key zone in certain embodiments. This allows for enlarging or reducing the area the user wishes to be in focus and/or to have optimal lighting. Any interaction with the key zone object, to move and/or resize it, redisplays the "TAP AND HOLD TO ADJUST" label in certain embodiments. Pinch and Zoom resizing of the key zone object will not trigger the display of adjustment controls as long as the two fingers are moving. If the two fingers stay stationary for a second, a right side adjust bar will appear that provides touch access to light, focus, and shutter speed adjustment objects or object areas or segments.
Figure 43 illustrates a camera display for a digital camera including a key zone controls panel for selecting light, focus and shutter speed adjustment in accordance with certain
embodiments. For ease of use and minimal UI visual clutter in certain embodiments, the key zone object may include three adjustment control options that are initially displayed on a thin right side bar and presented with symbols and labels that reflect how certain image qualities are being adjusted. These parameters may or may not map directly to aperture, shutter speed and ISO, particularly when the goal qualities being sought involve more than one. For example, light adjustment may involve a combination of exposure adjustment and one or more other parameters, such as aperture, shutter speed, and/or ISO, and may depend on conditions, and may depend on other adjustment control settings. Focus adjustment may involve aperture setting and/or varying the depth of field in the scene, and may involve more than one underlying parameter. Speed adjustments will depend on whether objects being captured are moving and at what speed, and this may also involve multiple underlying parameters.
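As a non-limiting illustration of how a single "light" control might be spread across more than one underlying parameter, the Python sketch below splits the requested change between shutter time and ISO. The split ratio, the EV range, the base values and the ISO limits are assumptions for the example only.

def apply_light_adjustment(light, base_iso=100, base_shutter_s=1 / 125):
    """light in [-1.0, +1.0]: negative darkens, positive brightens the key zone."""
    ev = 2.0 * light                      # total change expressed in EV (up to +/- 2 EV)
    ev_from_shutter = ev * 0.5            # half of the change from shutter time...
    ev_from_iso = ev - ev_from_shutter    # ...and the remainder from sensor gain
    shutter_s = base_shutter_s * (2 ** ev_from_shutter)
    iso = min(6400, max(100, base_iso * (2 ** ev_from_iso)))
    return {"shutter_s": shutter_s, "iso": round(iso)}

print(apply_light_adjustment(+0.5))   # brighten by about 1 EV, split between shutter and ISO
print(apply_light_adjustment(-1.0))   # darken by 2 EV

A practical implementation would further condition the split on scene content and on the other adjustors, as noted above.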
Figure 44 illustrates a camera display for a digital camera including a key zone adjustor panel for a light parameter selected from a key zone control panel in accordance with certain embodiments. In certain embodiments, three key zone adjustors can be selected one at a time, e.g., each producing a vertical panel pulling out to the left of an adjust bar on the display screen. The currently selected parameter may be colored yellow. In certain embodiments, symbols may appear on either end of an adjustor bar indicating qualities the user may expect by sliding in one direction or the other. The light adjustor may work best by starting in the middle and allowing the user to adjust up or down, while the focus adjustor may already begin at a particular setting on a linear lowest-to-highest aperture scale, and its position may reflect that in its initial location on the slider bar.
Figure 45 illustrates a camera display for a digital camera upon user interaction with the key zone adjustor panel of Figure 44 to adjust a light parameter. The initial default
position/setting on an adjustor bar is shown as a blue circular slider handle in certain
embodiments. As soon as the user moves it, it changes to yellow, indicating a user's adjusted setting that differs from a default automatic setting, while a blue smart target may appear in certain embodiments below the slider bar. Tapping the smart target may return the setting to automatic control in certain embodiments, e.g., a blue circle may be located where the camera calculates the automatic setting to be. As the user interacts with these controls, a preview image may change in certain embodiments to show effects of an adjustment on the image. Dismissing the key zone adjustor bar will preserve any changes that have been made.
Figure 46 illustrates a camera display for a digital camera upon user interaction with a key zone adjustor panel for a focus parameter selected from a key zone control panel in accordance with certain embodiments. The focus adjustor may work similarly in certain embodiments, e.g., bringing up a slider bar adjustor control with an initial default setting reflecting the camera's automatic setting. The symbol indicates graphically the decreasing (downward) or increasing (upward) depth of field in the preview image. As the user interacts with this slider, the image will reflect and show this change in depth of field in certain embodiments. The size of the key zone will determine the portion of the scene that will remain in focus, even with the focus adjustor pulled all the way down.
Figure 47 illustrates a camera display for a digital camera upon user interaction with a key zone adjustor panel for a speed parameter selected from a key zone control panel in accordance with certain embodiments. The speed adjustor may work similarly to the light and focus adjustors, e.g., bringing up a slider bar adjustor control with an initial default setting reflecting the camera's automatic setting. The symbol may indicate graphically the decreasing (downward) or increasing (upward) shutter speed. Unlike lightening/darkening and adjusting depth of field, shutter speed has not traditionally had an intuitive way for users to visualize its effect until after an image is captured. In certain embodiments, however, an artificially induced blur-trailing for moving objects may appear in a preview display that can be visually reduced by this slider until the moving objects are sharp and have no motion blurring.
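One simple way to approximate the artificial blur-trailing in a preview is to average shifted copies of a moving region along its motion direction, with the trail length proportional to the selected shutter time. The sketch below illustrates that idea only; the motion vector would come from frame-to-frame tracking and is passed in directly here as an assumption.

import numpy as np

def preview_motion_blur(region, motion_px_per_s, shutter_s):
    """Blur a moving region along its motion axis by an amount proportional to shutter time."""
    trail = max(1, int(round(abs(motion_px_per_s) * shutter_s)))   # blur length in pixels
    direction = 1 if motion_px_per_s >= 0 else -1
    acc = np.zeros_like(region, dtype=np.float64)
    for step in range(trail):
        acc += np.roll(region, direction * step, axis=1)           # shift along the motion axis
    return acc / trail

region = np.random.rand(100, 200)
sharp = preview_motion_blur(region, motion_px_per_s=300, shutter_s=1 / 1000)   # 1 px trail: sharp
blurred = preview_motion_blur(region, motion_px_per_s=300, shutter_s=1 / 30)   # 10 px trail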
Figure 48 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller and navigation panel of a main user interface without a key zone in accordance with certain embodiments. A main UI may be displayed by swiping in any direction or by tapping or executing a dedicated button. In the example of Figure 48, the user has not set a key zone. Several capture modes are illustrated at Figure 48 that may be selected from a capture mode scroller panel. Secondary controls may include numerous parameters and/or a simple mode may include just a subset of these. Additional secondary controls may be accessible by scrolling down (dragging upward) or by tapping on the "· · ·" ellipses symbol for more. A navigator or task manager may provide quick and easy access to a dashboard, Apps, settings, open/live and recently-used apps, and/or activities. A user may tap and hold to display a dashboard until release. This affordance may provide efficiency and reduced navigational steps for many frequently-used features. A pre-capture filters pull-up panel and activities pull-down panel may be caused to appear by the user. Tapping to open the most recent capture, and tapping and holding to open the Gallery, may be provided as options. A capture modes scroller is also apparent in Figure 49.
Figure 50 illustrates a camera display for a digital camera including a selected secondary control setting panel next to a secondary control panel in accordance with certain embodiments. The user may tap on any secondary control to open a vertical panel immediately to the right on the display screen in certain embodiments.
Figure 51 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller panel of a main user interface with a key zone in accordance with certain embodiments. The key zone and its associated controls may operate independently from the main UI in certain embodiments.
Figure 52 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller panel of a main user interface with a key zone and a key zone controls panel in accordance with certain embodiments.
Figure 53 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller panel of a main user interface with a key zone and a key zone controls panel and a key zone controls adjustor panel in accordance with certain embodiments.
Figure 54 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller panel of a main user interface and a pre-capture filters pull-up panel in accordance with certain embodiments. The user may swipe up from the bottom or tap on the up-pointing chevron to display the pre-capture filters. There may be additional means to access recently-used filters as well as to discard unwanted filters and to access additional filters. To close this panel the user can reverse the gesture, e.g., swiping down or tapping on the chevron, which has changed to a down-pointing chevron. Choosing a filter will in certain embodiments instantly apply the filter to a live viewfinder image. Filters may be configured so as not to alter a base master capture, but to provide a way to add a desired processing feature as the image is taken, thereby skipping post-processing steps.
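One way such a filter pipeline could keep the master capture untouched is sketched below; Frame, PreCaptureFilter, and PreCaptureFilterPipeline are hypothetical names, and real pixel handling would be considerably more involved.

```kotlin
typealias Frame = IntArray                 // stand-in for pixel data
typealias PreCaptureFilter = (Frame) -> Frame

class PreCaptureFilterPipeline {
    var activeFilter: PreCaptureFilter? = null

    // The live viewfinder shows the filtered frame as soon as a filter is chosen.
    fun previewFrame(raw: Frame): Frame = activeFilter?.invoke(raw) ?: raw

    // The base master capture is stored untouched; the filter only shapes the derived
    // output, so post-processing steps can be skipped without losing the original.
    fun capture(raw: Frame): Pair<Frame, Frame> {
        val master = raw.copyOf()
        val rendered = activeFilter?.invoke(raw) ?: raw
        return master to rendered
    }
}
```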
Figure 55 illustrates a camera display for a digital camera including a secondary control panel and a capture mode scroller panel of a main user interface and an activities pull-down panel in accordance with certain embodiments. The user may swipe down from the top or tap on the down-pointing chevron to display activities. Activities may include structured guides for the user to control one or more image processing steps to achieve a desired outcome. Such desired outcomes may be a particular look that a user has seen an example of, a custom set of steps that sends captured images to particular directories, or another type of setup and/or sequential procedure. This approach advantageously combines use of filters with more complex or multi-step procedures involving camera setup, pre-capture adjustments, and/or post-capture manipulation of destinations. Additional means may be provided to access recently-used activities as well as to discard unwanted activities and to access additional activities.
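A guided activity of this kind could be modeled roughly as an ordered list of steps, as in the sketch below; ActivityStep and GuidedActivity are assumed names and the structure is illustrative only.

```kotlin
data class ActivityStep(val prompt: String, val perform: () -> Unit)

class GuidedActivity(val title: String, private val steps: List<ActivityStep>) {
    private var cursor = 0
    val finished: Boolean get() = cursor >= steps.size

    // The user sees one prompt at a time and advances through the procedure step by step.
    fun currentPrompt(): String? = steps.getOrNull(cursor)?.prompt

    fun completeCurrentStep() {
        steps.getOrNull(cursor)?.perform?.invoke()
        cursor++
    }
}
```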
Figure 56 illustrates a dashboard interface for accessing the gallery and Android in accordance with certain embodiments.
Figure 57 illustrates a camera display for a digital camera including a default beginning configuration for a gallery in accordance with certain embodiments. A selected view of the gallery may include small thumbnails, medium thumbnails, large thumbnails, list (with tiny icon) or map view, or combinations thereof. A selected media type may include all media, all photos, moments, stills, panoramas, all videos, slow motion, time lapse, or stop motion, or combinations thereof. A selected sort ordering of media may include newest to oldest, oldest to newest, or by location, and there may be groups such as highest level groupings, all favorites, albums, collections and/or trash.
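These selections could be captured in a simple query object, sketched here with assumed enum and field names.

```kotlin
enum class GalleryView { SMALL_THUMBS, MEDIUM_THUMBS, LARGE_THUMBS, LIST, MAP }
enum class MediaType { ALL, PHOTOS, MOMENTS, STILLS, PANORAMAS, VIDEOS, SLOW_MOTION, TIME_LAPSE, STOP_MOTION }
enum class SortOrder { NEWEST_FIRST, OLDEST_FIRST, BY_LOCATION }

// One gallery "view state" combining the selected view, media type and sort order.
data class GalleryQuery(
    val view: GalleryView = GalleryView.MEDIUM_THUMBS,
    val mediaType: MediaType = MediaType.ALL,
    val sort: SortOrder = SortOrder.NEWEST_FIRST,
)
```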
Figure 58 illustrates a camera display for a digital camera including a gallery with select mode active in accordance with certain embodiments. When a select option in the top bar is tapped, its label turns yellow and any media thumbnail or list item tapped will be selected and denoted by a yellow frame around it. Tapping on a selected thumbnail will unselect it. When the select mode is active, e.g., highlighted in yellow, small translucent symbols may appear on thumbnails to indicate whether they are a moment and/or whether they have been uploaded to a cloud server. These may be displayed just when the select mode is active in certain embodiments. In certain embodiments, batch editing may be disabled in this mode, making an edit option unavailable. The user may in certain embodiments favorite, share, or delete selected media individually or as a batch.
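A minimal sketch of the select-mode behavior, assuming string media identifiers; the class and method names are illustrative.

```kotlin
class GallerySelection {
    private val selected = mutableSetOf<String>()   // media ids; the identifier type is assumed
    var selectModeActive = false

    // Tapping a thumbnail toggles it in and out of the selection while select mode is active.
    fun toggle(mediaId: String) {
        if (!selectModeActive) return
        if (!selected.add(mediaId)) selected.remove(mediaId)
    }

    // Batch editing is unavailable in select mode; favorite/share/delete act on the whole set.
    fun canBatchEdit(): Boolean = !selectModeActive
    fun forEachSelected(action: (String) -> Unit) = selected.forEach(action)
}
```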
Figure 59 illustrates a camera display for a digital camera including an opened individual photo in a default state with options including an edit option in accordance with certain embodiments.
Figure 60 illustrates a camera display for a digital camera including an opened individual photo with user interface dismissed in accordance with certain embodiments. When an individual photo is opened and the UI is dismissed, the user may swipe in any direction to return the UI.
Figure 61 illustrates a camera display for a digital camera including an opened individual photo and an edit options panel in accordance with certain embodiments. Certain edit options may include crop, rotate and adjustments for color, contrast, and brightness.
Figure 62 illustrates a camera display for a digital camera including an opened individual photo and a filters, effects and frames panel in accordance with certain embodiments.
Figure 63 illustrates an Android apps and environment screen in accordance with certain embodiments.
Figure 64A illustrates an expert mode menu of adjustable primary control settings for shutter speed, aperture and ISO in accordance with certain embodiments.
Figure 64B illustrates an expert mode menu of adjustable secondary control settings for white balance in accordance with certain embodiments.
Figures 65A-65B illustrate a two level user interface in accordance with certain embodiments including an example top level activities interface plug-in over a hidden bottom level interface including primary and secondary controls panels, a capture mode scroller and navigation panel, and a scene display. The top level activities interface is what the user sees and thinks of as the user interface or UI, and includes display screen objects that the user may interact with and control in configuring precapture settings, capturing images and editing images post-capture, as well as managing and communicating images. Each activity may include or be configured as an applet or plug-in. Activities may include higher level apps that are privileged above regular Android apps that may be accessible elsewhere. Activities may be first party and/or created in-house or by premium partners. Activities can in certain embodiments guide users through multi-step procedures to achieve desired results and goals. Activities can in certain embodiments include a default general usage option and a full manual usage option. Examples of activities may include an out of the box introduction, general simple operation, a photo cookbook or step-by-step guides, daily or periodic contests or challenges, a community feed or curated photo blog, premium front ends to Instagram or other social media, tutorials for guiding advanced photography, and custom editing and effects. Activities can be updated and revised in real time.
The bottom level interface may be hidden or optionally accessible. The bottom level interface may be controlled and/or configured by one or more top level activity plug-ins in accordance with certain embodiments. The bottom level interface provides a common underlying manual interface and/or expert interface. The bottom level interface is designed to work with applets and/or plug-ins in certain embodiments. The bottom level interface provides an efficient default and/or customized UI for manual operation. The bottom level interface holds a full range of available options and settings. The bottom level interface lets the top level interface choose which options and settings are shown to the user. The combination of top level and bottom level interfaces provides in certain embodiments a universal tool, e.g., based on a "Steering Wheel, Accelerator, Brake" model, for manual and/or expert operational control.
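A sketch of this two-level arrangement, in which a top-level activity plug-in chooses which bottom-level manual controls are surfaced; the interface, enum, and example plug-in names are assumptions made for illustration.

```kotlin
enum class ManualControl { SHUTTER_SPEED, APERTURE, ISO, WHITE_BALANCE, FOCUS, EXPOSURE_COMP }

interface ActivityPlugin {
    val name: String
    fun exposedControls(): Set<ManualControl>   // subset of controls shown to the user
}

// The bottom level holds the full range of options; the top level decides what is visible.
class BottomLevelInterface(private val allControls: Set<ManualControl> = ManualControl.values().toSet()) {
    fun controlsFor(plugin: ActivityPlugin): Set<ManualControl> =
        plugin.exposedControls().intersect(allControls)
}

// Example: a "soft focus portrait" plug-in only surfaces aperture and exposure compensation.
class SoftFocusPortrait : ActivityPlugin {
    override val name = "Soft Focus Portrait"
    override fun exposedControls() = setOf(ManualControl.APERTURE, ManualControl.EXPOSURE_COMP)
}
```

A plug-in such as the example above would leave the full control set available in the bottom level while presenting only the controls relevant to its guided task.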
Figure 66A illustrates example steps for guided usage with a soft focus portrait interface plug-in in accordance with certain embodiments.
Figure 66B illustrates example steps for guided usage with a wedding shoot setup interface plug-in in accordance with certain embodiments.
Figures 67A-67B illustrate simple usage modes of a manual operation user interface for primary and secondary controls in accordance with certain embodiments.
Figures 67C-67D illustrate expert usage modes of a manual operation user interface for primary and secondary controls in accordance with certain embodiments.
While exemplary drawings and specific embodiments of the present invention have been described and illustrated, it is to be understood that the scope of the present invention is not to be limited to the particular embodiments discussed. Thus, the embodiments shall be regarded as illustrative rather than restrictive, and it should be understood that variations may be made in those embodiments by workers skilled in the arts without departing from the scope of the present invention.
In addition, in methods that may be performed according to embodiments herein and that may have been described above, the operations have been described in selected typographical sequences. However, the sequences have been selected and so ordered for typographical convenience and are not intended to imply any particular order for performing the operations, except for those where a particular order may be expressly set forth or where those of ordinary skill in the art may deem a particular order to be necessary.
A group of items linked with the conjunction "and" in the above specification should not be read as requiring that each and every one of those items be present in the grouping in accordance with all embodiments of that grouping, as various embodiments will have one or more of those elements replaced with one or more others. Furthermore, although items, elements or components of the invention may be described or claimed in the singular, the plural is
contemplated to be within the scope thereof unless limitation to the singular is explicitly stated or clearly understood as necessary by those of ordinary skill in the art.
The presence of broadening words and phrases such as "one or more," "at least," "but not limited to" or other such phrases in some instances shall not be read to mean that the narrower case is intended or required in instances where such broadening phrases may be absent. The use of the term "assembly" does not imply that the components or functionality described or claimed as part of the assembly are all configured in a common package. Indeed, any or all of the various components of an assembly, e.g., an optical assembly or a camera assembly, may be combined in a single package or separately maintained and may further be manufactured, assembled or distributed at or through multiple locations.
Claims
1. A digital camera-enabled mobile device, comprising:
a mobile device housing including a camera aperture;
a processor within the housing;
an image sensor within the housing;
an optical assembly for focusing light entering through the camera aperture onto the image sensor;
a display viewable by a camera user;
one or more storage devices for storing images captured on the image sensor and for having stored therein code for programming the processor to perform a method of adjusting a user-selectable combination of manual and automatic image capture settings for specific parameters or groups of parameters based on one or more preview images, wherein the method comprises:
capturing a series of still or video preview images using default and/or automatically preselected focus, exposure and shutter speed settings;
displaying one or more of the preview images on the display;
instantiating two or more translucent display objects to guide the camera user through selecting said combination of manual and automatic image capture settings for said specific parameters and groups of parameters, including overlaying respective subsets of pixels on said display and having default and/or automatically preselected sizes, shapes and locations within the one or more preview images on the display;
receiving manual inputs at a user interface from the camera user communicating selected combinations of parameters or groups of parameters, or combinations thereof, for manual or automatic calibration of image capture settings represented in the sizes, shapes, and/or locations of said two or more translucent display objects and respective subsets of pixels;
adjusting one or more image capture settings including light, exposure, ISO, focus, aperture, depth of field, shutter speed, motion, crispness or blur, or combinations thereof, manually or automatically in accordance with said communicating of said selected combinations by the camera user, predicted to enhance image quality by improving focus, exposure or shutter speed, or combinations thereof, for said respective subsets of pixels overlayed, as adjusted, by the display objects;
capturing one or more digital images in accordance with said one or more adjusted image capture settings; and
storing, displaying, communicating and/or editing processed or unprocessed versions of said one or more captured digital images, or combinations thereof.
2. The digital camera-enabled mobile device of claim 1, wherein the method comprises instantiating an adjusted translucent display object comprising a rectangular, elliptical, circular, square, or other closed shape formed by a combination of arbitrarily curved or straight, continuous or dashed, segments of arbitrary or adjustable length or spacing, or combinations thereof.
3. The digital camera-enabled mobile device of claim 1, wherein the method comprises instantiating at least one translucent display object that is configured for sweeping through different aperture and/or shutter speed settings and comparing focus qualities of images captured at said different aperture and/or shutter speed settings.
4. The digital camera-enabled mobile device of claim 1, wherein the method comprises instantiating a translucent display object for manually adjusting auto defaults, which are remembered by the camera and used for determining auto-adjustments in future images.
5. The digital camera-enabled mobile device of claim 1, wherein said two or more translucent display objects comprise an auto-focus object and an auto-exposure object that differ in size, shape and/or location on said display or relative to objects within one or more preview images.
6. The digital camera-enabled mobile device of claim 1, wherein the method comprises instantiating multiple translucent auto-focus objects and/or multiple translucent auto-exposure objects at different locations within one or more preview images and/or having different sizes and/or shapes.
7. The digital camera-enabled mobile device of claim 1, wherein the method comprises identifying multiple objects within the preview images and prioritizing auto-focus and/or auto- exposure in accordance with one or more of the identified objects, and capturing and combining multiple images each prioritizing auto-focus and/or auto-exposure for a different identified object and stitching a composite image that reflects the multiple prioritized auto-focus and/or auto- exposure objects.
8. The digital camera-enabled mobile device of claim 1, comprising a low power consumption motion, touch, audio, visual, thermal or other sensation sensor, or combinations thereof, that is configured to program the processor to initiate a camera boot-up process upon matching a measured sensation with a sensation predetermined to be indicative of a camera user's intention to capture a picture.
9. The digital camera-enabled mobile device of claim 8, wherein the sensation comprises the camera being picked up or otherwise manipulated into a position indicative of a camera user's desire to capture a picture.
10. The digital camera-enabled mobile device of claim 8, wherein the sensation comprises a touching of the camera or a touching by a human hand or by a particular human hand on a grip with a finger on a fingerprint reader, or combinations thereof.
11. The digital camera-enabled mobile device of claim 8, comprising an accelerometer built-in to the camera to sense a certain motion or sequence of motions or to sense a lens cap being removed, or combinations thereof, predetermined to be indicative of a camera user's intention to capture a picture.
12. The digital camera-enabled mobile device of claim 8, wherein the sensation comprises a turning on of lights in a room where the camera is sitting or an approach by a human being based on a characteristic heat or sound signature, or combinations thereof, predetermined to be indicative of a camera user's intention to capture a picture.
13. The digital camera-enabled mobile device of claim 8, wherein the sensation comprises detecting words or certain words spoken nearby the camera predetermined to be indicative of a camera user's intention to capture a picture.
14. The digital camera-enabled mobile device of claim 8, comprising a voice recognition program for detecting words or certain words spoken nearby the camera while capturing video images, wherein the method comprises discerning whether to include the words as commentary or to discard the words as noise or to provide an option to the user to include or discard the words and/or other discernible sounds or noises captured with the video.
15. The digital camera-enabled mobile device of claim 1, wherein the display comprises a touch screen user interface including one or more selectable opaque objects and said two or more selectable and/or adjustable translucent display objects.
16. The digital camera-enabled mobile device of claim 1, wherein the processor is programmed to instantiate a translucent display object at a location on the display or within a preview image, or both, touched by the camera user within one or more displayed preview images on the display.
17. The digital camera-enabled mobile device of claim 1, wherein the processor is programmed to move and/or adjust the size and/or shape of a user-instantiated translucent display object based on touch screen menu selections and adjustments of one or more of said image capture settings for a subset of pixels overlayed by the translucent display object within the one or more preview images.
18. The digital camera-enabled mobile device of claim 1, wherein groups of image capture settings may include two or more of light, exposure or ISO in an auto-exposure group, focus, aperture or depth of field in an auto-focus group, or motion, crispness or blur in a shutter speed group, or combinations of two or more groups, or a group with a parameter, or
combinations thereof, such that the camera user has an option to adjust exposure, focus or shutter speed generally, or to adjust light, exposure or ISO, focus, aperture or depth of field, or motion, crispness or blur, to specifically adjust a parameter respectively within said auto-exposure, auto- focus or auto-shutter speed groups.
19. The digital camera-enabled mobile device of claim 1, wherein the camera user has an option to form a group by selecting combinations of two or more parameters, or by selecting a parameter to add or remove from an existing group, or by combining two or more existing groups, or combinations thereof, and a further option to instantiate a translucent display object for manually selecting a subset of pixels on the display within a preview image to program the processor to optimize generally in accordance with programming code or to specifically optimize for certain parameters or groups or to specifically select values for one or more parameters or groups of parameters for said subset of pixels, or combinations thereof.
20. The digital camera-enabled mobile device of claim 1, wherein exposure, focus and shutter speed groups are preselected to respectively comprise light, exposure and ISO, focus, aperture and depth of field, and motion, crispness and blur, respectively, such that the camera user has an option to select subsets of pixels for optimizing with regard to exposure, focus, and/or shutter speed generally, and/or with regard to light, exposure, ISO, focus, aperture, depth of field, motion, crispness or blur, or combinations thereof.
21. The digital camera-enabled mobile device of claim 1, wherein the method comprises instantiating a translucent display object for a group that comprises two or more adjustment controls for respective specific parameters within the group.
22. The digital camera-enabled mobile device of claim 1, wherein the method comprises instantiating a translucent display object for a light adjustment group that comprises a combination of exposure adjustment and one or more other parameters including aperture, shutter speed, or ISO, or combinations thereof.
23. The digital camera-enabled mobile device of claim 1, wherein the method comprises instantiating a translucent display object for a focus adjustment group comprising aperture setting or varying depth of field in a scene, or both.
24. The digital camera-enabled mobile device of claim 1, wherein the method comprises instantiating a translucent display object for a shutter speed adjustment comprising adjusting values of relative speeds of moving objects or of a moving object relative to a background or other static object within the scene, or combinations thereof.
25. The digital camera-enabled mobile device of claim 1, wherein said user interface comprises a top level interface and a bottom level interface, wherein the bottom level interface comprises a hidden or optionally accessible component, or both.
26. The digital camera-enabled mobile device of claim 25, wherein the bottom level interface is controlled by one or more top level activity plug-ins to provide a common underlying manual and/or expert interface.
27. One or more processor readable storage devices for storing images captured on a digital camera and having stored therein code for programming a processor to perform a method of adjusting a user-selectable combination of manual and automatic image capture settings for specific parameters or groups of parameters based on one or more preview images, wherein the method comprises:
capturing a series of still or video preview images using default and/or automatically preselected focus, exposure and shutter speed settings;
displaying one or more of the preview images on the display;
instantiating two or more translucent display objects to guide the camera user through selecting said combination of manual and automatic image capture settings for said specific parameters and groups of parameters, including overlaying respective subsets of pixels on said display and having default and/or automatically preselected sizes, shapes and locations within the one or more preview images on the display;
receiving manual inputs at a user interface from the camera user communicating selected combinations of parameters or groups of parameters, or combinations thereof, for manual or automatic calibration of image capture settings represented in the sizes, shapes, and/or locations of said two or more translucent display objects and respective subsets of pixels;
adjusting one or more image capture settings including light, exposure, ISO, focus, aperture, depth of field, shutter speed, motion, crispness or blur, or combinations thereof, manually or automatically in accordance with said communicating of said selected combinations by the camera user, predicted to enhance image quality by improving focus, exposure or shutter speed, or combinations thereof, for said respective subsets of pixels overlayed, as adjusted, by the display objects;
capturing one or more digital images in accordance with said one or more adjusted image capture settings; and
storing, displaying, communicating and/or editing processed or unprocessed versions of said one or more captured digital images, or combinations thereof.
28. The one or more processor readable storage devices of claim 27, wherein the method comprises instantiating an adjusted translucent display object comprising a rectangular, elliptical, circular, square, or other closed shape formed by a combination of arbitrarily curved or straight, continuous or dashed, segments of arbitrary or adjustable length or spacing, or combinations thereof.
29. The one or more processor readable storage devices of claim 27, wherein the method comprises instantiating at least one translucent display object that is configured for sweeping through different aperture and/or shutter speed settings and comparing focus qualities of images captured at said different aperture and/or shutter speed settings.
30. The one or more processor readable storage devices of claim 27, wherein the method comprises instantiating a translucent display object for manually adjusting auto defaults, which are remembered by the camera and used for determining auto-adjustments in future images.
31. The one or more processor readable storage devices of claim 27, wherein said two or more translucent display objects comprise an auto-focus object and an auto-exposure object that differ in size, shape and/or location on said display or relative to objects within one or more preview images, or combinations thereof.
32. The one or more processor readable storage devices of claim 27, wherein the method comprises instantiating multiple translucent auto-focus objects and/or multiple translucent auto- exposure objects at different locations within one or more preview images and/or having different sizes and/or shapes.
33. The one or more processor readable storage devices of claim 27, wherein the method comprises identifying multiple objects within the preview images and prioritizing auto-focus and/or auto-exposure in accordance with one or more of the identified objects, and capturing and combining multiple images each prioritizing auto-focus and/or auto-exposure for a different identified object and stitching a composite image that reflects the multiple prioritized auto-focus and/or auto-exposure objects.
34. The one or more processor readable storage devices of claim 27, wherein the method comprises initiating a camera boot-up process upon matching a measured sensation with a stored low power consumption motion, touch, audio, visual, thermal or other stored sensation, or combinations thereof, that is predetermined to be indicative of a camera user's intention to capture a picture.
35. The one or more processor readable storage devices of claim 34, wherein the sensation comprises the camera being picked up or otherwise manipulated into a position indicative of a camera user's desire to capture a picture.
36. The one or more processor readable storage devices of claim 34, wherein the sensation comprises a touching of the camera or a touching by a human hand or by a particular human hand on a grip with a finger on a fingerprint reader, or combinations thereof.
37. The one or more processor readable storage devices of claim 34, wherein the method comprises sensing a certain motion or sequence of motions or sensing a lens cap being removed, or combinations thereof, that are predetermined to be indicative of a camera user's intention to capture a picture.
38. The one or more processor readable storage devices of claim 34, wherein the sensation comprises a turning on of lights in a room where the camera is sitting or an approach by a human being based on a characteristic heat or sound signature, or combinations thereof, that are predetermined to be indicative of a camera user's intention to capture a picture.
39. The one or more processor readable storage devices of claim 34, wherein the sensation comprises detecting words or certain words spoken nearby the camera, or combinations thereof, that are predetermined to be indicative of a camera user's intention to capture a picture.
40. The one or more processor readable storage devices of claim 34, wherein the method comprises detecting words or certain words spoken nearby the camera while capturing video images, and discerning whether to include the words as commentary or to discard the words as noise or to provide an option to the user to include or discard the words and/or other discernible sounds or noises captured with the video, or combinations thereof.
41. The one or more processor readable storage devices of claim 27, wherein the method comprises instantiating one or more selectable opaque objects and said two or more selectable and/or adjustable translucent display objects based on touch screen input from a camera user.
42. The one or more processor readable storage devices of claim 27, wherein the method comprises instantiating a translucent display object at a location on the display or within a preview
image, or both, touched by the camera user within one or more displayed preview images on the display.
43. The one or more processor readable storage devices of claim 27, wherein the method comprises moving and/or adjusting a size and/or shape of a user-instantiated translucent display object based on touch screen menu selections and adjustments of one or more of said image capture settings for a subset of pixels overlayed by the translucent display object within the one or more preview images.
44. The one or more processor readable storage devices of claim 27, wherein groups of image capture settings include two or more of light, exposure or ISO in an auto-exposure group, focus, aperture or depth of field in an auto-focus group, or motion, crispness or blur in a shutter speed group, or combinations of two or more groups, or a group with a parameter, or
combinations thereof, and the method comprises providing the camera user an option to adjust exposure, focus and/or shutter speed generally, or to adjust light, exposure or ISO, focus, aperture or depth of field, or motion, crispness or blur, or combinations thereof, to specifically adjust one or more parameters respectively within said auto-exposure, auto-focus and/or auto-shutter speed groups.
45. The one or more processor readable storage devices of claim 27, wherein the method comprises providing the camera user an option to form a group by selecting combinations of two or more parameters, or by selecting a parameter to add or remove from an existing group, or by combining two or more existing groups, or combinations thereof, and a further option to instantiate a translucent display object for manually selecting a subset of pixels on the display within a preview image to program the processor to optimize generally in accordance with programming code or to specifically optimize for certain parameters or groups or to specifically select values for one or more parameters or groups of parameters for said subset of pixels, or combinations thereof.
46. The one or more processor readable storage devices of claim 27, wherein the method comprises preselecting exposure, focus and shutter speed groups to respectively comprise light, exposure and ISO, focus, aperture and depth of field, and motion, crispness and blur, and providing the camera user an option to select subsets of pixels for optimizing with regard to
exposure, focus, and/or shutter speed generally, and/or light, exposure, ISO, focus, aperture, depth of field, motion, crispness or blur, or combinations thereof.
47. The one or more processor readable storage devices of claim 27, wherein the method comprises instantiating a translucent display object for a group that comprises two or more adjustment controls for respective specific parameters within the group.
48. The one or more processor readable storage devices of claim 27, wherein the method comprises instantiating a translucent display object for a light adjustment group that comprises a combination of exposure adjustment and one or more other parameters including aperture, shutter speed, or ISO, or combinations thereof.
49. The one or more processor readable storage devices of claim 27, wherein the method comprises instantiating a translucent display object for a focus adjustment group comprising aperture setting or varying depth of field in a scene, or both.
50. The one or more processor readable storage devices of claim 27, wherein the method comprises instantiating a translucent display object for a shutter speed adjustment comprising adjusting values of relative speeds of moving objects or of a moving object relative to a background or other static object within the scene, or combinations thereof.
51. The one or more processor readable storage devices of claim 27, wherein said user interface comprises a top level interface and a bottom level interface, wherein the bottom level interface comprises a hidden or optionally accessible component, or both.
52. The one or more processor readable storage devices of claim 51, wherein the bottom level interface is controlled by one or more top level activity plug-ins to provide a common underlying manual and/or expert interface.
53. A method of adjusting a user-selectable combination of manual and automatic image capture settings for specific parameters or groups of parameters based on one or more preview images, comprising:
capturing a series of still or video preview images using default and/or automatically preselected focus, exposure and shutter speed settings;
displaying one or more of the preview images on the display;
instantiating two or more translucent display objects to guide the camera user through selecting said combination of manual and automatic image capture settings for said specific parameters and groups of parameters, including overlaying respective subsets of pixels on said display and having default and/or automatically preselected sizes, shapes and locations within the one or more preview images on the display;
receiving manual inputs at a user interface from the camera user communicating selected combinations of parameters or groups of parameters, or combinations thereof, for manual or automatic calibration of image capture settings represented in the sizes, shapes, and/or locations of said two or more translucent display objects and respective subsets of pixels;
adjusting one or more image capture settings including light, exposure, ISO, focus, aperture, depth of field, shutter speed, motion, crispness or blur, or combinations thereof, manually or automatically in accordance with said communicating of said selected combinations by the camera user, predicted to enhance image quality by improving focus, exposure or shutter speed, or combinations thereof, for said respective subsets of pixels overlayed, as adjusted, by the display objects;
capturing one or more digital images in accordance with said one or more adjusted image capture settings; and
storing, displaying, communicating and/or editing processed or unprocessed versions of said one or more captured digital images, or combinations thereof.
54. The method of claim 53, comprising instantiating an adjusted translucent display object comprising a rectangular, elliptical, circular, square, or other closed shape formed by a combination of arbitrarily curved or straight, continuous or dashed, segments of arbitrary or adjustable length or spacing, or combinations thereof.
55. The method of claim 53, comprising instantiating at least one translucent display object that is configured for sweeping through different aperture and/or shutter speed settings and comparing focus qualities of images captured at said different aperture and/or shutter speed settings.
56. The method of claim 53, comprising instantiating a translucent display object for manually adjusting auto defaults, which are remembered by the camera and used for determining auto-adjustments in future images.
57. The method of claim 53, wherein said two or more translucent display objects comprise an auto-focus object and an auto-exposure object that differ in size, shape or location on said display or relative to objects within one or more preview images.
58. The method of claim 53, comprising instantiating multiple translucent auto-focus objects and/or multiple translucent auto-exposure objects at different locations within one or more preview images and/or having different sizes and/or shapes.
59. The method of claim 53, comprising identifying multiple objects within the preview images and prioritizing auto-focus and/or auto-exposure in accordance with one or more of the identified objects, and capturing and combining multiple images each prioritizing auto-focus and/or auto-exposure for a different identified object and stitching a composite image that reflects the multiple prioritized auto-focus and/or auto-exposure objects.
60. The method of claim 53, comprising initiating a camera boot-up process upon matching a measured sensation with a stored low power consumption motion, touch, audio, visual, thermal or other stored sensation, or combinations thereof, that is predetermined to be indicative of a camera user's intention to capture a picture.
61. The method of claim 60, wherein the sensation comprises the camera being picked up or otherwise manipulated into a position indicative of a camera user's desire to capture a picture.
62. The method of claim 60, wherein the sensation comprises a touching of the camera or a touching by a human hand or by a particular human hand on a grip with a finger on a fingerprint reader, or combinations thereof.
63. The method of claim 60, comprising sensing a certain motion or sequence of motions or sensing a lens cap being removed, or combinations thereof, that are predetermined to be indicative of a camera user's intention to capture a picture.
64. The method of claim 60, wherein the sensation comprises a turning on of lights in a room where the camera is sitting or an approach by a human being based on a characteristic heat or sound signature, or combinations thereof, that are predetermined to be indicative of a camera user's intention to capture a picture.
65. The method of claim 60, wherein the sensation comprises detecting words or certain words spoken nearby the camera, or combinations thereof, that are predetermined to be indicative of a camera user's intention to capture a picture.
66. The method of claim 60, comprising detecting words or certain words spoken nearby the camera while capturing video images, and discerning whether to include the words as commentary or to discard the words as noise or to provide an option to the user to include or discard the words and/or other discernible sounds or noises captured with the video, or
combinations thereof.
67. The method of claim 53, comprising instantiating one or more selectable opaque objects and said two or more selectable and/or adjustable translucent display objects based on touch screen input from a camera user.
68. The method of claim 53, comprising instantiating a translucent display object at a location on the display or within a preview image touched by the camera user within one or more displayed preview images on the display.
69. The method of claim 53, comprising moving and/or adjusting a size and/or shape of a user-instantiated translucent display object based on touch screen menu selections and adjustments of one or more of said image capture settings for a subset of pixels overlayed by the translucent display object within the one or more preview images.
70. The method of claim 53, wherein groups of image capture settings include two or more of light, exposure or ISO in an auto-exposure group, focus, aperture or depth of field in an auto-focus group, or motion, crispness or blur in a shutter speed group, or combinations of two or more groups, or a group with a parameter, or combinations thereof, and the method comprises providing the camera user an option to adjust exposure, focus and/or shutter speed generally, or to adjust light, exposure or ISO, focus, aperture or depth of field, or motion, crispness or blur, or combinations thereof, to specifically adjust a parameter respectively within said auto-exposure, auto-focus and/or auto-shutter speed groups.
71. The method of claim 53, comprising providing the camera user an option to form a group by selecting combinations of two or more parameters, or by selecting a parameter to add or remove from an existing group, or by combining two or more existing groups, or combinations thereof, and a further option to instantiate a translucent display object for manually selecting a subset of pixels on the display within a preview image to program the processor to optimize generally in accordance with programming code or to specifically optimize for certain parameters or groups or to specifically select values for one or more parameters or groups of parameters for said subset of pixels, or combinations thereof.
72. The method of claim 53, comprising preselecting exposure, focus and shutter speed groups to respectively comprise light, exposure and ISO, focus, aperture and depth of field, and motion, crispness and blur, and providing the camera user an option to select subsets of pixels for optimizing with regard to exposure, focus, and/or shutter speed generally, and/or light, exposure, ISO, focus, aperture, depth of field, motion, crispness or blur, or combinations thereof.
73. The method of claim 53, comprising instantiating a translucent display object for a group that comprises two or more adjustment controls for respective specific parameters within the group.
74. The method of claim 53, comprising instantiating a translucent display object for a light adjustment group that comprises a combination of exposure adjustment and one or more other parameters including aperture, shutter speed, or ISO, or combinations thereof.
75. The method of claim 53, comprising instantiating a translucent display object for a focus adjustment group comprising aperture setting or varying depth of field in a scene, or both.
76. The method of claim 53, comprising instantiating a translucent display object for a shutter speed adjustment comprising adjusting values of relative speeds of moving objects or of a moving object relative to a background or other static object within the scene, or combinations thereof.
77. The method of claim 53, wherein said user interface comprises a top level interface and a bottom level interface, wherein the bottom level interface comprises a hidden or optionally accessible component, or both.
78. The method of claim 53, wherein the bottom level interface is controlled by one or more top level activity plug-ins to provide a common underlying manual and/or expert interface.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662363835P | 2016-07-18 | 2016-07-18 | |
US62/363,835 | 2016-07-18 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018017625A1 true WO2018017625A1 (en) | 2018-01-25 |
Family
ID=60992566
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2017/042685 WO2018017625A1 (en) | 2016-07-18 | 2017-07-18 | User interface for smart digital camera |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2018017625A1 (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140184852A1 (en) * | 2011-05-31 | 2014-07-03 | Mobile Imaging In Sweden Ab | Method and apparatus for capturing images |
US20130272673A1 (en) * | 2012-03-13 | 2013-10-17 | Lee Eugene Swearingen | System and method for guided video creation |
US20150350504A1 (en) * | 2014-06-03 | 2015-12-03 | 2P & M Holdings, LLC | RAW Camera Peripheral for Handheld Mobile Unit |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10122914B2 (en) | 2015-04-17 | 2018-11-06 | mPerpetuo, Inc. | Method of controlling a camera using a touch slider |
US11962889B2 (en) | 2016-06-12 | 2024-04-16 | Apple Inc. | User interface for camera effects |
CN108833781A (en) * | 2018-06-15 | 2018-11-16 | Oppo广东移动通信有限公司 | Image preview method, apparatus, terminal and computer readable storage medium |
CN108833781B (en) * | 2018-06-15 | 2021-03-26 | Oppo广东移动通信有限公司 | Image preview method, device, terminal and computer readable storage medium |
US20220053121A1 (en) * | 2018-09-11 | 2022-02-17 | Profoto Aktiebolag | A method, software product, camera device and system for determining artificial lighting and camera settings |
US11611691B2 (en) | 2018-09-11 | 2023-03-21 | Profoto Aktiebolag | Computer implemented method and a system for coordinating taking of a picture using a camera and initiation of a flash pulse of at least one flash device |
US11595564B2 (en) * | 2019-01-18 | 2023-02-28 | Christophe Seyve | Universal control interface for camera |
WO2020149853A1 (en) * | 2019-01-18 | 2020-07-23 | Seyve Christophe | Universal control interface for camera |
US11863866B2 (en) | 2019-02-01 | 2024-01-02 | Profoto Aktiebolag | Housing for an intermediate signal transmission unit and an intermediate signal transmission unit |
US11653085B2 (en) | 2019-07-10 | 2023-05-16 | Schölly Fiberoptic GmbH | Image recording system, which suggests situation-dependent adaptation proposals, and associated image recording method |
DE102019118752B4 (en) | 2019-07-10 | 2023-06-15 | Schölly Fiberoptic GmbH | Method for adaptive functional reassignment of operating elements of an image recording system and associated image recording system |
DE102019118752A1 (en) * | 2019-07-10 | 2021-02-25 | Schölly Fiberoptic GmbH | Method for adaptive function reassignment of control elements of an image recording system and the associated image recording system |
DE102019118750A1 (en) * | 2019-07-10 | 2021-01-14 | Schölly Fiberoptic GmbH | Medical image recording system, which submits adaptation proposals depending on the situation, as well as the associated image recording process |
US11974719B2 (en) | 2019-07-10 | 2024-05-07 | Schölly Fiberoptic GmbH | Method for adaptive functional reconfiguration of operating elements of an image acquisition system and corresponding image acquisition system |
US12081862B2 (en) | 2020-06-01 | 2024-09-03 | Apple Inc. | User interfaces for managing media |
US12101567B2 (en) | 2021-04-30 | 2024-09-24 | Apple Inc. | User interfaces for altering visual media |
US20220382440A1 (en) * | 2021-06-01 | 2022-12-01 | Apple Inc. | User interfaces for managing media styles |
US12112024B2 (en) * | 2021-06-01 | 2024-10-08 | Apple Inc. | User interfaces for managing media styles |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018017625A1 (en) | User interface for smart digital camera | |
US10931866B2 (en) | Methods and apparatus for receiving and storing in a camera a user controllable setting that is used to control composite image generation performed after image capture | |
JP6748582B2 (en) | Imaging device, control method thereof, program, and recording medium | |
JP6083987B2 (en) | Imaging apparatus, control method thereof, and program | |
US9280223B2 (en) | Imaging apparatus and continuous shooting imaging methods using a continuous touch user input | |
US10491806B2 (en) | Camera device control related methods and apparatus | |
US9554031B2 (en) | Camera focusing related methods and apparatus | |
KR101231469B1 (en) | Method, apparatusfor supporting image processing, and computer-readable recording medium for executing the method | |
CN104349051B (en) | The control method of object detection device and object detection device | |
CN101772952B (en) | Imaging device | |
KR101918760B1 (en) | Imaging apparatus and control method | |
WO2015030126A1 (en) | Image processing device and image processing program | |
CN106688227B (en) | More photographic devices, more image capture methods | |
US20060044399A1 (en) | Control system for an image capture device | |
CN113395419A (en) | Electronic device, control method, and computer-readable medium | |
JP2009278623A (en) | Method and apparatus for performing touch-based adjustments within imaging device | |
US10958825B2 (en) | Electronic apparatus and method for controlling the same | |
US20120105588A1 (en) | Image capture device | |
US20200120269A1 (en) | Double-selfie system for photographic device having at least two cameras | |
JP2013009189A (en) | Imaging device and imaging method | |
JP5830564B2 (en) | Imaging apparatus and mode switching method in imaging apparatus | |
JP2014017665A (en) | Display control unit, control method for display control unit, program, and recording medium | |
US9214193B2 (en) | Processing apparatus and method for determining and reproducing a number of images based on input path information | |
KR20120054407A (en) | Apparatus for processing digital image and method for controlling thereof | |
CN110537363A (en) | Digital camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17831730; Country of ref document: EP; Kind code of ref document: A1 |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | 32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 27.05.2019) |
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 17831730; Country of ref document: EP; Kind code of ref document: A1 |