GB2334591A - Remote controlled and synchronized camera and flashlamp - Google Patents

Remote controlled and synchronized camera and flashlamp

Info

Publication number
GB2334591A
GB2334591A
Authority
GB
United Kingdom
Prior art keywords
camera
cybernetic
light
photoborg
photography system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB9902328A
Other versions
GB2334591B (en)
GB9902328D0 (en)
Inventor
William Stephen George Mann
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to GB0203423A (GB2370958B)
Publication of GB9902328D0
Publication of GB2334591A
Application granted
Publication of GB2334591B
Anticipated expiration
Status: Expired - Fee Related

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 15/00 Special procedures for taking photographs; Apparatus therefor
    • G03B 15/02 Illuminating scene
    • G03B 15/03 Combinations of cameras with lighting apparatus; Flash units
    • G03B 15/05 Combinations of cameras with electronic flash apparatus; Electronic flash units
    • G03B 15/08 Trick photography
    • G03B 29/00 Combinations of cameras, projectors or photographic printing apparatus with non-photographic non-optical apparatus, e.g. clocks or weapons; Cameras having the shape of other objects
    • G03B 33/00 Colour photography, other than mere exposure or projection of a colour film
    • G03B 33/08 Sequential recording or projection
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C 19/00 Electric signal transmission systems

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

A cybernetic photographic system has a camera and an illumination source held or worn at a remote location by a photoborg. The camera and illumination source are synchronized and remotely activated by an actuator which may be on the illumination source. The brightness of the illumination source may be varied and there may be more than one illumination source. Also provided are: a photorendering system including the computation of photoquantigraphic quantities; a photographic system for taking several pictures of different portions of an object; a photographic system for taking pictures with different degrees of brightness; a photographic system including a colour coordinate transformation means; a head-worn apparatus for indicating the centre of the picture frame; an apparatus for processing a plurality of exposures; a flash lamp for use in the production of light vectors; a controller for a camera and several light sources usable in different combinations, or made to flicker and flash at different times; a flashlamp with a viewfinder; a lightsweep; and a camera indicating whether the user is in the field of view.

Description

FIELD OF THE INVENTION

Generally this invention pertains to photographic methods, apparatus, and systems involving multiple exposures of the same subject matter to differing illumination.
BACKGROUND OF THE INVENTION

In photography (and in movie and video production), it is desirable to capture a broad dynamic range from the scene. Often the dynamic range of the scene exceeds that which can be captured by the recording medium. Therefore, it is not possible to rely on a light meter or automatic setting on the camera. Even if the photographer takes a reading from various areas of the scene that he/she considers important, it is seldom that the estimate of exposure will lead to an optimum picture. Results from cameras that attempt to do this automatically (e.g. by assuming the central area is important, and maybe measuring a few other image areas) are usually even worse.
Still-photographers attempt to address this problem by a process called bracketing the exposures. This process involves measuring (or guessing) the correct exposure, and then taking a variety of exposures around this value (e.g. overexposing one by a factor of two, one by a factor of four, and underexposing one by a factor of two, etc.). From this set of pictures, they select the single picture that has the best overall appearance. A photographer might typically take half a dozen or so pictures of each pose or each scene. These pictures are usually taken in rapid succession, and the aperture is opened one stop (or 1/3 of a stop) between each exposure and the next, or the shutter speed is equivalently adjusted between one exposure and the next. When the pictures are developed they are usually arranged in a row (say left to right) ordered from lightest to darkest, and one of them is chosen by visual comparison to the others. The remaining pictures are usually disposed of or not used at all.
In situations where there is high contrast, extended-response film may be used.
Many modern films exhibit an extended response and are capable of capturing a broad dynamic range. Extended response film was invented by Charles Wyckoff, as described in U.S. Pat. No. 3,663,228. In variations of the Wyckoff film in which the different exposures are separately addressable, it is also possible to apply a new image processing means and apparatus, as described in U.S. Pat. No. 5,828,793.
Often in an indoor setting, there is a window in the background, and we wish to capture both the indoor foreground (lit by low-power household lamps) and the outdoor scene (which might be lit by bright sunlight). This situation is usually dealt with by adjusting the lighting. Often a fill-flash is used, sometimes leading to unnatural pictures. It is difficult to tell exactly how much fill-flash to use, and excessive fill-flash leads to visually unpleasant results, while insufficient fill-flash fails to reduce the dynamic range of the scene sufficiently. Still-photographers address this problem, again, by bracketing, but now they must bracket over two variables: (1) the exposure for the background lighting, and (2) the exposure for the flash. This is generally done by noting that the shutter speed does not affect the flash exposure but only affects the exposure to background light, while the aperture affects both. Thus the photographer will expose for a variety of both shutter speeds and apertures. Alternatively, a flash with adjustable output may be used, and the photographer will make a variety of exposures attempting to bracket through all possible combinations of flash output and exposure to natural light. While there are many automatic systems that combine "intelligent" flash and light meter functionality, the results are often unacceptable, or at best, still fall short of the results that can be obtained by bracketing over the two variables - flash and natural light.
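To make the two-variable bracketing concrete, the following sketch (in Python, with illustrative numbers not taken from this disclosure) shows why both controls must be varied: exposure to continuous background light scales with aperture area and shutter time, while exposure to the flash burst scales with aperture area alone, because the flash is far shorter than any practical shutter speed.

    # Illustrative sketch: how shutter speed and aperture each affect
    # ambient and flash exposure (all numbers are hypothetical).
    f_numbers = [4.0, 5.6, 8.0]            # aperture bracket
    shutter_times = [1/250, 1/60, 1/15]    # shutter-speed bracket, in seconds

    for N in f_numbers:
        area = 1.0 / N**2                  # relative aperture area
        for t in shutter_times:
            ambient = area * t             # background-light exposure
            flash = area                   # flash exposure (independent of t)
            print(f"f/{N}, {t:.4f} s: ambient {ambient:.2e}, "
                  f"flash:ambient ratio {flash / ambient:.0f}")

Changing the shutter speed changes only the flash-to-ambient ratio, while changing the aperture scales both exposures together, which is exactly why the photographer must bracket over both variables.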
Alternatively, especially in commercial photography, movie production, or video production, great effort is expended to reduce the dynamic range of the scene. Large sheets of light-reducing dark grey transparency material are used to cover windows.
Powerful lamps are temporarily set up in office or home interiors. Again, it is very difficult to adjust the balance between the various lamps. Most professional photographers bracket the exposures over a variety of combinations of lamp output levels.
As one can imagine, the number of possible permutations grows astronomically with the number of separate lights. Furthermore, the dimension of color is often involved.
It is common, for example, in pictures used in annual reports or magazine advertisements, for different colored filters to be placed over each lamp. For example, it is common to cover the lamps in the background with strongly colored (e.g. dark blue) filters. The exact effect is not predictable. Most professional photographers test with Polaroid film first but Polaroid is not good enough for the final use. A high-quality film is inserted in place of the Polaroid, and the same exposure is made there. Because of differences between the response of the two films, it is still necessary to bracket over the balance of the various lights. Furthermore, it is impossible to predict the exact wishes of the client, or other possible end uses of the image, and it is usually necessary to take both "normal" pictures with no filters on the lights, as well as "dramatic" pictures with colored lights. Therefore, it is also common to bracket over colors (e.g. take one picture with a bright yellow background, another with plain white, and another with deep blue, etc). Thus the resulting shot can appear in both a more traditional publication and a more artistic/advertising-related publication.
This dual-use can save the photographer from having to do a re-shoot when another use of the image arises, or should the client have a slight change of heart, as is often the case.
As an alternative to using many different lights, it is common in commercial photography to leave the shutter open and move around through the scene with a hand-held flash unit. The photographer aims the flash unit at various parts of the scene and triggers the flash manually. Apart from its economy (only one flash lamp is needed, rather than tens or hundreds of flashlamps that would be needed to create the same effect with a single short exposure), the method is often preferred for certain applications even when more flash lamps are available to the photographer.
The method, which is called painting with light, or, more succinctly, lightpainting, has a certain expressive artistic quality that makes it popular among commercial photographers, particularly in the advertising industry. The reason for this appeal is that the light sources can be placed right in view of the camera. At one instant the photographer may stand right in full view of the camera and point the light to the side, flashing on some object in the scene. The photographer does not show up in the picture because the light is aimed away from his/her body, and the camera only "sees" the object that the flash is aimed at. If that were the only flash of light, the picture would be entirely black except for that one object in the scene. However, the photographer moves through the scene and illuminates many different parts of the scene in a similar way. Thus, using a single flash lamp, the scene can be illuminated in ways that are simply not possible in a single short exposure, even with access to an unlimited number of flash lamps. This is because a plurality of lamps placed in the scene at the same time would illuminate each other, or, for example, light from one flashlamp may illuminate the light stand upon which another flashlamp is attached.
Often, in lightpainting, various colored filters are held over the lamp each time it is flashed.
Soft-focus and diffusion filters are frequently used in commercial photography.
These filters create pleasing halos around specular highlights in the scene. They may or may not reduce resolution (e.g. the ability to read a newspaper positioned in the scene), since the image detail can remain yet be seen through a "soft and dreamy" world. It is often desirable to either blur or diffuse some areas of the scene but not others. Sometimes a soft-focus filter with a hole in the middle is used to blur the edges of the image (usually corresponding to background material) while leaving the center (usually the main subject matter) unaffected.
Another creative effect that has become quite popular in commercial photography is called split diffusion. Split-diffusion is created by separating the control of the foreground lighting from the control of the background lighting. The foreground lights are turned on and one exposure is made. The foreground lights are turned off, and the background lights are turned on. A diffusion filter is placed over the lens and a second exposure is made on the same piece of film. The split-diffusion effect may also be created with flash. The foreground flashlamps are activated, the diffusion filter is moved over the lens, and the background flashlamps are then activated.
Split-diffusion is also routinely applied within the context of lightpainting. The diffusion filter is often moved by an assistant, or electrically, back and forth in front of the lens or away from the lens, while the photographer flashes at different parts of the scene, some flashes with and some without the diffusion.
SUMMARY OF THE INVENTION

The invention facilitates a new form of visual art, in which a fixed point of view is chosen for the base station camera, and then, once the camera is secured on a tripod, a photoborg can walk around and use various sources of illumination to sequentially build up an image layer-upon-layer in a manner analogous to paint brushes upon canvas, and the cumulative effect embodied therein. To the extent that the artist's light sources can be made far more powerful than the natural ambient light levels, the artist may have a tremendous degree of control over the illumination in the scene.
The resulting image is therefore a result of what is actually present in the scene, together with a potentially very visually rich illumination sculpture surrounding it.
Typically the illumination sources that the artist carries are powered by batteries, and therefore, owing to limitations on the output capabilities of these light sources, the art is practiced in spaces that may be darkened sufficiently, or, in the case of outdoor scenes, at times when the natural light levels are least.
By "photoborg", what is meant is one who is either a photographic cyborg (cybernetic organism), a lighting technician, a photographer, or an artist using the apparatus of the invention. By virtue of the communications link between the photoborg and the base station, the photoborg may move through the space, including the space in view of the camera, and the photoborg may selectively illuminate objects that are at least partially within the field of view of the camera. Typically the photoborg will produce multiple exposures of the same scene or object. These multiple exposures are typically each stored as separate files, and are typically combined at the base station, either by remote control of the photoborg (e.g. by way of wearable computer remotely logged into the base station computer), or by a director or manager at the base station.
In a typical application, the artist may, for example, position the camera upon a hillside, or on the roof of a building, overlooking a portion of a city. The artist may then roam about the city, walking down various streets, and use the light sources to illuminate various buildings one-at-a-time. Typically, in order that the wearable or portable light sources be of sufficient strength compared to the natural light in the scene (e.g. so that it is not necessary to shut off the electricity to the entire city to darken it sufficiently that the artist's light source be of greater relative brightness) some form of electronic flash is used as the light source. In some embodiments of the invention, an FT-623 lamp is used, housed in a lightweight 30 inch highly polished reflector, with a handle which allows it to be easily held in one hand. The communications infrastructure is established such that the camera is only sensitive to light for a short time period (e.g. typically approximately 1/500 of a second), during the instant that the flash lamp produces light. In this manner a comparatively small lamp (e.g. a lamp and housing which can be held in one hand) may illuminate a large skyscraper or office tower in such a manner that, in the final image, the flashlamp is the dominant light source, compared to fluorescent lights and the like that might have been left turned on upon the various floors of the building, or to moonlight, or light from streetlamps which cannot be easily turned off.
Typically, the photoborg's wearable computer system comprises a visual display which is capable of displaying the image from the camera (typically sent wirelessly over a data communications link from the computer that controls the camera). Typically, also, this display is updated with each new exposure. The display update is typically switchable between a mode that shows only the new exposure, and a cumulative mode that shows a photoquantigraphic summation over time to show the new exposure photoquantigraphically added to previous exposures. This temporally cumulative display makes the device useful to the photoborg because it helps in the envisioning of a completed lightmodule painting. The temporally cumulative display is also useful in certain applications of the apparatus to gaming. For example, a game can be devised in which two players compete against each other, One player may try to paint the subject matter before the camera red, and the other will try to paint the subject matter blue. When the subject matter is an entire cityscape as seen from a camera located on the roof of a tall building, the game can be quite competitive and interesting. Additionally, photoborgs can either work cooperatively on the same team, or competitively, as when two teams each try to paint the city a different color, and "claim" territory with their color. In some embodiments of the game the photoborgs can also shoot at each other with the flashguns. For example, if a photoborg from the "red" team "paints" a blue-team photoborg red, he may disable or "kill" the blue-team photoborg, shutting down his flashgun. In other embodiments, the "kill" and "shoot" aspects can be removed, in which case the game is similar to a game like squash, where the opponents work in a collegial fashion, getting out of each other's way while each side takes turns shooting. The red team flashgun(s) and blue team flashgun(s) can be fired alternately by a free running base-station camera, or they can all fire together. When they fire alternately there is no problem disambiguating them.
When they fire together, there is preferably a blue filter over each of the flashguns of the blue team, and a red filter over each of the flashguns of the red team, so that flashes of light from each team can be disambiguated.
The wearable computer is generally controllable by the photoborg through a chording keyboard mounted into the handle of each light source, so that it is not necessary to carry a separate keyboard. In this manner, whichever light source the photoborg plugs into the body-worn system becomes the device for controlling the process. Typically, also, exposures are maintained as separate image files in addition to a combined cumulative exposure that appears on the photoborg's screen. The exposures being in separate image files allows the photoborg to selectively delete the most recent exposure, or any of the other exposures previously combined into the running sum on the screen. This capability is quite useful, compared to the process of painting on canvas, where one must paint over mistakes rather than simply being able to turn off brushstrokes. Furthermore, exposures to light can be adjusted either during the shooting or afterwards, and then re-combined. The capability of doing this during the shooting is an important aspect of the invention, because it allows the photoborg to capture additional exposures if necessary, and thus to remain at the site until a satisfactory final picture is produced. The final picture as well as the underlying dataset of separately adjustable exposures, and the weighting that was selected to generate the final picture, is typically sent wirelessly to other sites (e.g. on the World Wide Web) so that others (e.g. art directors or other collaborators) can manipulate the various exposures and combine them in different ways, and send comments back to the photoborg by email. This additional communication facilitates the collection of additional exposures if it turns out that certain areas of the scene or object could be better served if they were more accurately or more expressively described in the dataset.
Each of these exposures is called a lightstroke. A lightstroke is analogous to an artist's brushstroke, and it is the plurality of lightstrokes combined together that gives the invention described here its unique ability to capture the way that a scene or object responds to various forms of light.
Furthermore, a particular lightstroke may be repeated (e.g. the same exposure may be repeated in almost exactly the same way, holding the light in the same position, more than once). These seemingly identical lightstrokes may be averaged together to obtain a single lightstroke of improved signal to noise ratio. This signal averaging technique of repeating a given lightstroke may also be generalized to the extent that the lamp output may be varied for each repetition, but otherwise held in the same position and pointed in the same direction at the scene. The resulting collection of differently exposed pictures may be combined to produce a lightstroke that captures a broad dynamic range.
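A minimal sketch of this signal-averaging step, assuming (purely for illustration) a simple power-law camera response rather than a measured response curve:

    import numpy as np

    def average_lightstrokes(images, gamma=2.2):
        # Average repeated, nominally identical lightstrokes to improve the
        # signal to noise ratio.  Averaging is done in the linear
        # (photoquantity) domain, not on the pixel values themselves.
        linear = [np.power(im.astype(np.float64) / 255.0, gamma) for im in images]
        mean = np.mean(linear, axis=0)
        return np.power(mean, 1.0 / gamma)   # back to a displayable picture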
A typical attribute of the images produced using the apparatus of the invention is that of extreme exposure. Some portions of the image are often deliberately overexposed by as much as 10 f-stops or more, while other areas of the image are deliberately underexposed. In this way, selected features of the scene or object are emphasized. Typically, pictures produced using the apparatus of the invention span a very wide range of colorspace. Typically the deliberate overexposure is combined with very strongly saturated colors, so that the portions of the image extend to the boundaries of the color gamut. Accordingly, what is observed in some areas of the images is extreme shadow detail that would not show up in a normally exposed picture. In other areas of the picture, one might see extreme highlight details that would not show up in a normally exposed picture. Thus in order to capture information pertaining to the extreme dynamic range necessary to be able to render images of such extreme exposure range, lightstrokes of extended dynamic range are extremely useful. Moreover, lightstrokes of extended dynamic range may be useful for other reasons such as the synthesis of split-diffusion effects which become more numerically stable and immune to quantization noise or the like, when the input lightstrokes have extended dynamic range.
Finally, it may, at times, be desirable to have a real or virtual assistant at the camera, to direct/advise the photoborg. In this case, the photoborg's viewfinder which presents an image from the perspective of the fixed camera also affords the photoborg with a view of what the assistant sees. Similarly, it is advantageous at times that the assistant have a view from the perspective of the photoborg. To accomplish this, the photoborg may have a second camera of a wearable form. Through this second camera, the photoborg allows the assistant to observe the scene from the photoborg's perspective. Thus the photoborg and assistant may collaborate by exchange of viewpoints, as if each had the eyes of the other.
The photoborg's camera may alternatively be attached to and integrated with the light source (e.g. flashlamp) in such a way that it provides a preview of the coverage of the flashlamp. Thus when this camera output is sent to the photoborg's own wearable computer screen, a flashlamp viewfinder results. The flashlamp viewfinder allows the photoborg to aim the flashlamp, and allows the photoborg to see what is included within the cone of light that the flashlamp will produce. Furthermore, when viewpoints are exchanged, the assistant at the main camera can see what the flashlamp is pointed at prior to activation of the flash.
Typically there is a command that may be entered to switch between local mode (where the photoborg sees the flash viewfinder) and exchanged mode (where the photoborg sees out through the main camera and the assistant at the main camera sees out through the photoborg's typically wearable camera).
In many embodiments of the invention the flashlamp is wearable. The flashlamp may also be an EyeTap (TM) flashlamp. An EyeTap flashlamp is one in which the effective source of light is co-incident with an eye of the wearer of the flashlamp.
One aspect of the invention allows a photographer to use a flashlamp and always end up with the ability to produce a picture where there is just the right proportion of flash in relation to the total exposure, and where the photographer may even change the apparent amount of flash after a set of basis pictures has been taken. Using the apparatus of the invention, the photographer simply pushes a button and the apparatus takes, for example, a picture at a shutter speed of 1/250 sec with the flash, then automatically turns off the flash and quickly takes another picture at 1/30 sec.
The look and feel of the system is no different from that of an ordinary camera, and the fact that two or more pictures are taken need not be evident to those being photographed, or to the photographer, since the flash will only fire once, and the second click of the camera shutter, if it is of a mechanical variety, is seldom perceptible if it happens quickly after the first. Preferably a non-mechanical camera is used so that a possibly distracting double or multiple clicking is not perceptible.
After acquiring this pair of "basis pictures", various combinations of the flash and non-flash exposures may be synthesized and displayed on a computer screen, either after the camera is brought to a base station for processing, or directly upon the screen of a wearable computer that the photographer is using, or perhaps directly inside the viewfinder of the camera itself, if it has an electronic viewfinder. The picture that best matches personal preference may be selected and printed. Thus the desired ratio of flash to ambient light can be selected AFTER the basis pictures have been taken.
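As a sketch of this after-the-fact selection (the function name and weights are hypothetical, and a simple power-law response stands in for a real camera's response curve):

    import numpy as np

    def mix_flash_ambient(flash_im, ambient_im, w_flash, w_ambient, gamma=2.2):
        # Linearize both basis pictures, combine the photoquantities with the
        # desired weights, and map the sum back through the response curve.
        q_flash = np.power(flash_im.astype(np.float64) / 255.0, gamma)
        q_ambient = np.power(ambient_im.astype(np.float64) / 255.0, gamma)
        q = w_flash * q_flash + w_ambient * q_ambient
        return np.power(np.clip(q, 0.0, 1.0), 1.0 / gamma)

    # e.g. preview a range of flash-to-ambient ratios and pick the best:
    # candidates = [mix_flash_ambient(f, a, w, 1.0 - w) for w in (0.2, 0.4, 0.6, 0.8)]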
Furthermore, color correction can be done on the flash and ambient components of the picture separately (automatically or manually). If the picture was taken in an office, the greenish cast of the fluorescent lights can be removed without altering the face of someone lit mostly by the flash.
Furthermore, the background may be colored for interesting effects. For example suppose the background is mostly sky. The flash image may be left unaltered, resulting in a normal color balance for flesh tones, and the sky may be made a nice blue color, even though it might have been grey in reality. This effect works really nicely for night time portraits where the sky in the background would otherwise tend to appear green or dark brown, and changing it to a deep blue by traditional global color balance adjustment of the prior art would lend an unpleasant blue cast to the faces of the people in the picture.
Each of the two basis pictures may be generated in accordance with a Wyckoff principle ("definition enhancement") as follows: the flash may be activated multiple times. Without loss of generality, consider an example where the flash is activated 3 times with low, medium and high output levels, and where 3 non-flash pictures are also taken in rapid succession with three different exposures as well. Two basis images of extended dynamic range are then synthesized from each set of three pictures using the Wyckoff principle.
More generally, any number of pictures with any particular ratio of flash and ambient exposure may be collectively used to estimate a two dimensional manifold in the MN dimensional picture space defined by a picture of dimensions M by N.
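In equation form (the notation here is a reconstruction from the surrounding description, not taken from the disclosure itself), each captured picture may be written as

    p_i = f(\alpha_i q_a + \beta_i q_f), \qquad p_i \in \mathbb{R}^{MN}

where f is the camera's pointwise response function, q_a and q_f are the photoquantities due to the ambient light and the flash respectively, and \alpha_i and \beta_i are the exposure weights of the i-th picture. As the two scalar weights vary, the pictures p_i sweep out a two dimensional manifold in the MN dimensional picture space.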
The major aspect of this invention involves the lightpainting method described earlier. The invention permits the photographer to capture the result of exposure to each flash of light (called a "lightstroke", analogous to an artist's brush stroke) separately. The lightstrokes can be electronically combined in various ways before or after the photographer has packed up the camera and left the scene. In lightpainting, photographers often place colored filters over the flash, to simulate a scene lit by multiple sources of different colored lights. Using the apparatus of the invention, no filters are needed, because the color of each lightstroke may be assigned electronically after the photographer has left the scene, although optional filters may still be used in addition to electronic colour selection. Therefore, the photographer is not necessarily committed to decisions about the choice of color, or the relative intensity of the various lightstrokes, and is also free to make decisions regarding whether or not to apply, and in what extent to apply split-diffusion, after leaving the scene.
These collections of lightstrokes are referred to as a "lightspace". The image pairs in the above flash/no-flash example are a special case of a lightspace where the flash picture is one lightstroke and the non-flash picture is another. In the case of black and white (greyscale) images, the lightspace is homomorphically equivalent to a vector space, where the coefficients in the vector sum are a scalar field. This process is a generalization of homomorphic filtering, where a pointwise transfer function is applied to each entire image, a weighted sum is taken, and then the inverse transfer function is applied to this sum. In practice, with typical cameras, a sufficiently pleasing image results if each image is cubed, the results added together with the desired weighting, and the cube root of the sum is computed. In the case of color images, the vector space is generalized to a module space, for colour coordinate transformations and various filtering, blurring, and diffusion operations. Alternatively the process may be regarded as three separate vector spaces, one for each colour channel.
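The cube and cube-root recipe just described can be sketched directly (a minimal illustration only: images are assumed to be floating-point arrays scaled to [0, 1], and the pseudocolour variant suggests the module-space generalization):

    import numpy as np

    def cement(lightstrokes, weights):
        # Homomorphic combination: cube each image (the pointwise transfer
        # function), take the weighted sum, then take the cube root.
        total = np.zeros_like(lightstrokes[0], dtype=np.float64)
        for im, w in zip(lightstrokes, weights):
            total += w * np.power(im, 3.0)
        return np.cbrt(np.clip(total, 0.0, 1.0))

    def cement_colour(lightstrokes, colours):
        # Module-space variant: each greyscale lightstroke is assigned an RGB
        # colour vector, so the coefficients act per colour channel.
        rows, cols = lightstrokes[0].shape
        total = np.zeros((rows, cols, 3), dtype=np.float64)
        for im, rgb in zip(lightstrokes, colours):
            total += np.power(im, 3.0)[..., None] * np.asarray(rgb, np.float64)
        return np.cbrt(np.clip(total, 0.0, 1.0))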
Another aspect of the invention is that the photographer need not work in total darkness as is typically the case with ordinary lightpainting. With a typical electronic flash, and even with a mechanical shutter (as is used with photographic film), the shutter is open for only 1/500 sec or so for each "lightstroke". Thus the lightpainting can be done under normal lighting conditions (e.g. the room lights may often be left on). This aspect of the invention pertains both to traditional lightpainting (where the invention allows multiple flash-synched exposures to be made on the same piece of film), and to the use of separate recording media (e.g. separate film frames or electronic image captures) for each lightstroke. The invention makes use of innovative communications protocols and a user-interface that maintain the illusion that the system is immune to ambient light, while requiring no new skills beyond those of traditional lightpainting. The communications protocols typically include a full-duplex radio communications link so that a button on the flash sends a signal to the camera to make the shutter open, and at the same time, a radio wired to the flash sync contacts of the camera is already "listening" for when the shutter opens. The fact that the button is right on the flash gives the user the illusion that he or she is just pushing the lamp test button of a flash as in normal lightpainting, and the fact that there is really any communications link at all is hidden by this ergonomic user interface.
The invention also includes a variety of options for making the lightpainting task easier and more controlled. These include such innovations as a means for the photoborg to determine if he or she can be "seen" by the camera (e.g. means to indicate the extent of the camera's coverage), various compositional aids, means of providing workspace illumination that has no effect on the picture, and some innovative light sources.
Other innovations such as EyeTap cameras, EyeTap light sources, etc., and further means of collaboration among a community of photoborgs are also included in the invention.
BRIEF DESCRIPTION OF THE DRAWINGS

The invention will now be described in more detail, by way of examples which in no way are meant to limit the scope of the invention, but, rather, these examples will serve to illustrate with reference to the accompanying drawings, in which:

FIG. 1 is a diagram of a typical illumination source used in conjunction with the invention, comprising a data entry or command entry device such as a pushbutton switch, which, when pressed, causes the lamp to flash, but not directly; instead the lamp flashes as a result of a bi-directional communications protocol.
FIG. 2a is a diagram of the camera and base station that receives the signal from the light source shown in FIG. 1, wherein the signal causes the camera shutter to open, and it is the opening of the camera shutter which sends back a confirmation signal to the illumination source of FIG. 1, causing the flash to be triggered.
FIG. 2b is a diagram of the camera and base station that uses a flash detector instead of an explicit inbound radio channel for synchronization.
FIG. 3 shows a typical usage pattern of the invention in which the fixed (static) nature of the camera and base station is emphasized by way of concentric circles denoting radio waves sent to (inbound) it and radio waves sent from (outbound) it, and where a hand-held version of the light source depicted in FIG. 1 is flashed three times at three different locations.
FIG. 4 shows a detailed view of the user-interface typical of an instrument depicted in FIG. 1, where the data entry device comprises a series of pushbutton switches and where there is also a data display screen affixed to the source of illumination.
FIG. 5 shows an example where pictures of dimension 640x480 are captured by repeated flashing of the apparatus of FIG. 1 at different locations, where the picture from each exposure is represented as a point in a 307200 (640x480) dimensional photoquantigraphic imagespace.
FIG. 6 shows how some of these pictures, which are called lightvectors when represented in photoquantigraphic imagespace, may fall in a subspace of the 307200 dimensional photoquantigraphic imagespace.
FIG. 7a shows an example of a two dimensional subspace of the 307200 dimensional lightvectorspace, where the corresponding pictures associated with the lightvectors in this space are positioned on the page to correspond with their coordinates in the plane of the page.
FIG. 7b shows how the three pictures along the first row of the photoquantigraphic subspace of Fig. 7a are generated.
FIG. 8 shows a photoquantigraphic coordinate transformation, which appears as a coordinate transformation of the two dimensional space depicted in FIG. 7a.
FIG. 9a shows the method by which pictures are converted to lightvectors by applying a linearizing inverse transfer function, after which the lightvectors are added together (possibly with different weighting) and the resulting lightvector sum is converted back to a picture by way of the forward transfer function (inverse of that used to convert the incoming images to lightvectors).
FIG. 9b shows the calculation of a photoquantigraphic sum in pseudocolor modulespace.
FIG. 9c shows photorendering (painting with lightmodules), e.g. calculation of a photoquantigraphic sum in pseudocolor modulespace.
FIG. 9d shows a phlashlamp made from 8 flashlamps, used to generate some of the lightvectors of Fig. 9c.
FIG. 9e shows lightspace rendering in CMYK colorspace.
FIG. 9f shows the inverse gamut warning aspect of the invention.
FIG. 10a shows a general philter operation, implemented by applying a photoquantigraphic filter (e.g. by converting to lightvectorspace, filtering, and converting back).
FIG. 10b shows the implementation of split diffusion using a philter on one lightvectorspace quantity and no filter on the other quantity.
FIG. 10c shows an image edit operation, such as a pheathering operation, implemented by applying a photoquantigraphic edit operation (e.g. photoquantigraphic feathering).
FIG. 10d shows a philter operation applied over an ensemble of input images.
FIG. 11 shows how the estimate of a single lightvector (such as v4 of Fig. 5) may be improved by analyzing three different but collinear lightvectors.
FIG. 12a shows the converse of Fig. 11, namely to illustrate the fact that to generate a Wyckoff set (as strongly colored lightvectors approximately do over their color channels), one desires to begin with a great deal of dynamic range, as might be captured by a Wyckoff set.
FIG. 12b attempts to make this point of Fig. 12a all the more clear by showing that a strongly colored filter exhibits an approximation to the Wyckoff effect by virtue of the different degrees of attenuation in different spectral bands of a color camera.
FIG. 12c shows a true Wyckoff effect implemented for a scene that is monochromatic and a color camera with strongly colored filter.
FIG. 13a shows the EyeTap (TM) flashlamp or phlashlamp aspect of the invention.
FIG. 13b shows a wide angle embodiment of the EyeTap (TM) flashlamp or phlashlamp.
FIG. 14a shows an EyeTap (TM) camera with planar diverter.
FIG. 14b shows an EyeTap (TM) camera with curved diverter which is also part of the optical system for the camera.
FIG. 15 shows an embodiment of the finder light or hiding light, which helps a photoborg determine where the camera is, or whether or not he or she is hidden from view of the camera.
FIG. 16 shows an embodiment of the lightsweep (pushbroom light).
FIG. 17a shows an embodiment of the flash sequencer aspect of the invention.
FIG. 17b shows an embodiment of the invention for acquiring lightvector spaces, using special flashlamps that do not require a sequencer controller.
FIG. 18 shows the user interface to a typical session of the Computer Enhanced Multiple Exposure Numerical Technique (CEMENT) program.
While the invention shall now be described with reference to the preferred embodiments shown in the drawings, it should be understood that the intent is not to limit the invention only to the particular embodiments shown, but rather to cover all alterations, modifications and equivalent arrangements possible within the scope of the appended claims.
In all aspects of the present invention, references to "camera" mean any device or collection of devices capable of simultaneously determining a quantity proportional to the amount of light arriving from a plurality of directions and/or at a plurality of locations.
References to "photography", "photographic", and the like, may also be taken to include "videography", "videographic", and the like. Thus the final result may be a video or other sequence of images, and need not be limited to a single picture.
Indeed, the term "picture" may mean a motion picture, in addition to just simply a still picture.
Similarly references to "data entry device" shall not be limited to traditional keyboards and pointing devices such as mice, but shall also include input devices more suitable to the "wearable computers" of the invention, as well as to portable devices. Such input devices may include both analog and digital devices as simple as a single pushbutton switch or as sophisticated as a voice controlled or brainwave, respiration, or heart rate controlled device, or devices controlled by a combination of these or other biosignals. The input devices may also include possible inferences made as to when to capture a picture or trigger an event in a manner that does not necessarily require or involve conscious thought or effort.
Moreover, references to "inbound channel" shall not be limited to radio communications devices as depicted in the drawings through the use of the standard international symbol for antenna, but shall also include communications over wire (twisted pair, coax, or otherwise), infrared communications, or any other communications medium from a user to the camera base station. References to base station also do not limit it to a station that is permanent or semi-permanent; base stations may include mobile units mounted on wheels or vehicles, and units mounted or carried on other persons.
Similarly, references to "outbound channel" shall not be limited to radio communication, as depicted in the drawings, but may also include other means of communication from the camera to the user, such as the ability of a user to hear the click of a camera shutter, perhaps in conjunction with steps taken to make the sound of the shutter louder, or to add other audible, visual, or the like, events to the opening of the shutter. The "outbound channel" may also include means by which a photoborg can confirm that a camera shutter is open for an extended period of time, or means of making a photoborg aware of the progression of time for which a shutter is open.
The use of "shutter" is not meant to limit the scope of the invention. While the drawings depict a mechanical shutter with solenoid, the invention may be (and is more often) practiced with electronic cameras that do not have explicit shutters, but, rather, the shuttering operation may comprise electronic control of a sensor array, or simply the selection of the appropriate frame(s) from a video sequence. Thus when reference is made to the time during which the camera is "sensitive to light", what is meant is that there is an intent or action that collects information from the camera during that time period, so that this intent or action itself serves to take the place of an actual shutter.
Likewise, while the drawings and explanation involve two separate communications channels for the inbound and outbound channels, operating at different radio frequencies, it will be understood that the invention is typically practiced using a single bidirectional communications link implemented via TCP/IP communications protocols between a wearable computer system and a stationary computer at the base station, but even this method of communication is not meant to limit the scope of the invention. The communications channel could comprise, for example, a single piece of string or rope, where the user tugs on the rope to cause a picture to be taken, and the rope is tugged back by the camera to activate the user's light source. Moreover, this communication need not be bidirectional, and may, for example, be implemented simply by having suitable timing between the flash and the camera, so that a signal need only be sent from the flash to the camera, and then the flash may be fired at the appropriate interval, by a timing circuit contained therein, so that there is no explicit outbound communications channel or need for one. In this case, it will be understood that the outbound communications channel may comprise synchronized timing devices, at least one of which is associated with a photoborg's apparatus and at least one of which is associated with a camera at the base station.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS, WITH REFERENCE TO DRAWINGS

In using the invention to selectively and sequentially illuminate portions of a scene or object, a photoborg (e.g. a photographer, artist, lighting technician, hobbyist, professional or other user of the invention) will typically point a light source at some object or portion of the scene in view of the camera, and issue a command through a wearable computer system, or through a portable system of some sort, to acquire a picture which is affected, at least in part, by the light source.
In the embodiment presented in Fig 1 this command is issued by pressing button 110. Button 110 turns on radio transmitter 120, causing it to send a signal out through its antenna 130. Transmitter 120 is designated as "iTx" where "i" denotes inbound. The inbound pathway is the communications pathway from a photoborg to the base station.
The signal from iTx 120 is received at a base station and remote camera, depicted in Fig 2a, by antenna 210, where it is demodulated by inbound receiver iRx 220.
Receiver 220 may be something as simple as an envelope detector, or an LM567 tone decoder that activates a relay. In some embodiments it is a communications protocol running over amateur packet radio, using a terminal node controller in KISS mode (TCP/IP). In a commercially manufactured embodiment, however, it would preferably be a communications channel that does not require a special radio license, or the like. In the simple example illustrated here, inbound receiver 220 activates shutter solenoid 230.
What is depicted in this drawing is, for illustrative purposes, approximately typical of a 1940s press camera fitted with the standard 6 volt solenoid shutter release.
Preferably, however, the synchronization does not involve a mechanical shutter, and thus no shutter contacts are actually involved. Instead a sensor array is preferably used. A satisfactory camera having a sensor array is the Kodak (TM) DCS-260.
Continuing with the illustrative embodiment, shutter flash synchronization contacts 240 (denoted "X" in Fig. 2a) activate outbound transmitter oTx 250, causing a signal to be sent out antenna 260. This outbound signal from the base station is received by a photoborg by way of antenna 140. This received signal causes the outbound receiver oRx 150 of the outbound channel to activate an electronic flash, typically via an optocoupler such as a Motorola MOC3020 or the like, which is typically connected to the synchronization terminals 160 (denoted "X" in Fig. 1) of the flash unit 170. An opening 180 on the light source allows light to emerge to illuminate the scene or objects that the source is pointed at.
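The round trip just described (button 110, inbound channel, shutter, sync contacts 240, outbound channel, flash 170) can be sketched with TCP sockets standing in for the two radio links; all names, port numbers, and timings below are illustrative only:

    import socket, threading, time

    def base_station(port=9999, shutter_time=1/500):
        # Fig. 2a: the inbound message opens the shutter, and it is the
        # shutter's own sync contact that triggers the outbound flash signal.
        srv = socket.socket()
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("", port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            if conn.recv(16).strip() == b"TAKE":   # from iTx 120
                print("shutter opens")             # solenoid 230
                conn.sendall(b"SYNC\n")            # X contacts 240 -> oTx 250
                time.sleep(shutter_time)           # camera sensitive to light
                print("shutter closes")
        srv.close()

    def photoborg(host="localhost", port=9999):
        # Fig. 1: the button sends the inbound request; the flash fires only
        # when the camera's confirmation comes back on the outbound channel.
        s = socket.socket()
        s.connect((host, port))
        s.sendall(b"TAKE\n")                       # button 110 pressed
        if s.recv(16).strip() == b"SYNC":          # via oRx 150
            print("flash fires")                   # sync terminals 160
        s.close()

    threading.Thread(target=base_station, daemon=True).start()
    time.sleep(0.1)
    photoborg()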
It should be noted that many older cameras have a so-called "M" sync contact which was meant for firing magnesium flashbulbs. This sync contact fires before the shutter is open. It is often useful to use such a sync contact with the invention as it may account for delay in the outbound channel, or equivalently allow for a form of pulse compression such as a chirp to be sent over the outbound channel, so that the invention will enjoy a greater robustness to noise and improved signal range. Similarly where the camera is an electronic imaging device, it may be desirable to design it so that it provides a sync signal in advance of becoming sensitive to light. The advance sync may then be used together with pulse compression.
Typically, the components of Fig 1 are spread out upon the body of the photoborg and incorporated into a wearable computer system or the like, but alternatively, if desired, the entire source may be contained inside a single box, together with all the communications hardware, power source, and perhaps a display means. In this alternative hand-holdable embodiment, a photoborg will then not need to wear any special clothing or apparatus.
A photoborg will typically wear black clothing and hold the light source depicted in Fig 1 using a black glove, although this is not absolutely necessary. Accordingly, the housing of the apparatus 190 will typically be flat black in colour, and might have only two openings, one for the button 110, and one for the light opening 180, thereby hiding the complexity of the contents from the photoborg so as to make the device more intuitive for use by those not technically skilled or inclined.
The purpose of this aspect of the invention, illustrated in Fig. 1 and Fig. 2a is to obtain a plurality of pictures of the same subject matter, where the subject matter is differently illuminated in each of the pictures. There are various other embodiments of this aspect of the invention, which also allow this process to be performed. For example, the camera set up at the base station may be a video camera, in which case the photoborg can walk around with a flashlamp and flash the lamp at various portions of the subject matter in the scene.
Afterwards, the photoborg or another person can play back the video, and extract the frames of video during which a flashlamp was fired. Alternatively, a computer can be used to analyze the video being played back, and can automatically detect which frames include subject matter illuminated by flash, and can then mark the locations of these frames or separate them from the entire video sequence. If a computer is going to be used to analyze the video afterwards, it may also analyze the video during capture. This analysis would greatly reduce the storage space needed because the system could just wait until a flashlamp was fired, and then automatically store the pictures in which subject matter was illuminated (in whole or in part) by a flashlamp.
Fig. 2b depicts such a system. Video camera 265 is used to take the pictures. Video camera 265, denoted CAM, is connected to video capture device 270, denoted CAP.
The capture device 270 captures video continuously, and sends digitized video to the processor, PROC, 275. A satisfactory connection between CAP 270 and PROC 275 is an IEEE 1394 connection. In particular, a satisfactory unit that can be used as a substitute for both CAP 270 and PROC 275 is a digital video camera such as a SONY PC7, which outputs a digital video signal.
Processor 275 initially captures one or more frames of video from the scene under ambient illumination. If more than one frame is captured, the frames may be photoquantigraphically averaged together. By this means, or by other similar means, a background frame is stored in memory 280, denoted MEM. The stored image can then be compared against further incoming images to see if there are regions that differ. If there is sufficient difference over a region of a new incoming frame of video to overcome a certain noise threshold setting, then the new frame is captured. The comparison frame may also be updated between flashes, in case the ambient light is slowly changing. In this case, PROC 275 processes with an assumption that flashes are occasional, e.g. that there will likely be many frames of video before and after each flash, so that changes in ambient light can be tracked. This tracking can also accommodate sudden changes, as when lights turn on inside a building by timer control, since such changes will be more like a step function, while the flashlamp is more like a Dirac delta measure (e.g. the ambient lights may quickly change state, but they don't usually go on and then off again in a very short time).
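A sketch of this detect-and-track logic (the threshold, region size, and tracking rate are illustrative tuning values, not figures from the disclosure):

    import numpy as np

    def detect_flash_frames(frames, ambient, threshold=10.0, min_region=500):
        # Keep only frames in which some region is sufficiently brighter than
        # the stored ambient (background) frame; between flashes, slowly track
        # gradual changes in the ambient light.
        kept = []
        background = ambient.astype(np.float64)
        for frame in frames:
            diff = frame.astype(np.float64) - background
            if np.count_nonzero(diff > threshold) >= min_region:
                kept.append(frame)                     # flash-lit: store it
            else:
                background = 0.95 * background + 0.05 * frame
        return kept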
In this way, there is no need for any explicit radio communications system or the like. The photoborg simply sets up the camera and leaves it running, and then takes an ordinary flashlamp and walks around lighting up different parts of the scene.
Optionally the photoborg may have a one-way communications system from the camera so that he can see what effect his flashlamp is having on the scene. This communications system may be as simple as having a television screen at the base station, where the television screen is big enough for him to see from some distance away. Alternatively, an actual transmitter, such as a radio transmitter, may be used to send signals from the base station to a small television or EyeTap (TM) display built into his eyeglasses.
Each image, for which a flashlamp illumination was detected, may optionally be sent back to the photoborg by way of communications system 285, denoted COMM, and connected to antenna 290. Images due to each flash of light, as well as one or more images due only to ambient light, are saved on disk 295, denoted DISK.
The components of Fig. 2b may be spread out over the Internet, if desired. The base station is free-running and does not require any human operator, although a manager may remotely log into the base station if it is connected to the Internet. In this case a manager can cement the images together into a photorendering, and she can select certain lightvectors of particular interest to send to a photoborg. There may also be more than one photoborg working together on a project.
A satisfactory camera for camera 265 is an ordinary video camera. Preferably, however, camera 265 would be a specially built camera in which each pixel functions as a lock-in amplifier, to lock onto a known waveform emitted from a specially designed electronic flashlamp. The camera of this embodiment of the invention is then referred to as a lock-in camera.
Fig 3 depicts the typical usage pattern of the source depicted in Fig 1. A fixed camera 300 (depicted here with a single antenna as may be typical of a system running with a terminal node controller over a TCP/IP communications link), is used together with a hand-held illumination source which is flashed at one location 310, then moved to a new location 320, flashed again, and so on, ending up at its final location 330 where it is flashed for the last time. Alternatively, a number of photoborgs, each carrying a flashlamp, may selectively illuminate the scene.
Fig 4 depicts a view of a self-contained illumination source. While the art is most frequently practiced using a wearable computer with head-up display or the like, it is illustrative to consider a self-contained unit with a screen right on it (although there is then the problem that the screen is lit up and may spoil a picture if it becomes visible to the camera, whereas a head mounted display painted black and properly fitted to the eye with suitable polarizers will produce much less environmental light pollution).
This source has pushbuttons 410 denoted by color (e.g. "R" for red, where the button may be colored or illuminated in red, "G" for green, etc.). These pushbuttons may be wired so that they take exposures directly in the indicated color (e.g. so that pushing 410 R will cause the apparatus to request a red exposure of the camera), or they may be wired so that pushing R marks the system for red, but does nothing until FILM WRITE (W) 420 is pressed. Pressing W will then send a signal to the camera requesting a red exposure, which typically happens via a spinning filter wheel in front of the camera, wherein the camera shutter opens but the flash sync pulse is not sent back right away. Instead the base station waits until the instant that the red filter is in front of the lens and then, at that exact instant, sends back a flash sync pulse, activating flash 430 so that it sends a burst of illumination out opening 440. Alternatively, these color selections may be made electronically, wherein the only difference between pressing, for example, R, and pressing G is in the form of information appended to the header of the image file in the form of a comment. For example, if the captured image is a Portable PixMap (PPM), the image content is exactly the same except that the comment "#dustcolo 1 0 0" is at the beginning of the image file, as opposed to "#dustcolo 0 1 0" if the green button were pressed.
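For illustration, such a colour tag could be prepended to a binary PPM file as follows (the helper name is hypothetical; the comment text follows the example above):

    def tag_ppm_colour(path, r, g, b):
        # Insert a colour-assignment comment (e.g. "#dustcolo 1 0 0" for red)
        # immediately after the PPM magic number; the pixel data is unchanged.
        with open(path, "rb") as f:
            data = f.read()
        assert data.startswith(b"P6"), "expects a binary PPM"
        with open(path, "wb") as f:
            f.write(data[:2] + f"\n#dustcolo {r} {g} {b}".encode() + data[2:])

    # tag_ppm_colour("v852.ppm", 1, 0, 0)   # mark this exposure as red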
The effect of pressing the red button might show up on screen 450 in the form of an additional red splotch of light upon the scene, if, for example, there is a single object illuminated with the white flash of light to which the color red has been assigned.
In this way, it is possible to watch the lightpainting build up slowly. In the case of an electronic imaging system, any of the effects of these flashes (called lightstrokes) may be deleted or changed in colour after they have been made (e.g. since the colour choice is just header information in the computer file corresponding to each picture).
However, in the case of a film-based camera, it is not possible to change the color, so a great deal of film could be wasted were it not for the preview button on control panel 420. In particular, the preview button (P) performs the operation on a lower resolution electronic camera, to allow the photoborg to see its effect, prior to writing to film (W).
Also, in practice, not every flash exposure need be written to a separate frame of film, and in fact, if desired, all of the flash exposures can be written to a single frame of film, to avoid the need to scan the film and combine the exposures later. In this way, a photoborg can preview (P) over and over again, and then practice several times each lightstroke of a long lightpainting sequence, view the final result, and then execute the exact same lightpainting sequence onto a single frame of film.
In practice, when using film, the art of this invention is practiced somewhere between the two extremes of having a lightpainting on one single frame and having each lightstroke on its own frame. Typically portions of the lightpainting are practiced using P 420 and then written onto a frame of film using several presses of W 420.
Then the film is advanced one frame (by pressing all three buttons on panel 420 together, or by some other special signal sent to the camera) and another portion of the lightpainting is completed. A typical lightpainting then comprises fewer than 36 frames of film and can thus be captured on the same negative strip, which simplifies the mathematical analysis of the film once it is scanned, since each lightvector will have undergone the exact same film aging prior to development, as well as the exact same development.
In the situation where film is not being used (e.g. in an embodiment of the invention using a completely electronic camera), lightvectors may still be grouped together if it is decided that they never need to be independently accessed. In this case, a new single image file is created from a plurality of lightvectors. Typically a new file format is needed in order to preserve the full dynamic range, especially if a Wyckoff effect (combining differently exposed lightvectors) is involved. Typically these combined files will take the form of a so-called "Portable DoubleMap (PDM)", and will have a file header of the form:

P8
#photoborg 8.5 has image address space v850 to v899
#photoborg 8.5 selected:
#v852.pgm 0 0 1 g *2.5
#v853.pgm 0 0 1 g *2.5
#v854.pgm 0 0 1 g *2.5
#cement to v855.pdm
1536
1024
255

where the P8 indicates image type PDM, and lines beginning with the # symbol are comments. In particular, v855.pdm is the file name of the file containing both this ascii header and the raw binary data, and the numbers following each constituent filename (e.g. v852.pgm) indicate how that image is to be cemented into the viewfinder with the other images. The first three numbers after the filename indicate the color blue (in RGB); the next symbol, "g", indicates that only the green channel of the image is to be considered (e.g. instead of mapping the whole image to blue, which would only consider the blue channel, the green channel is mapped to blue); and the "*2.5" indicates a photoquantigraphic multiplicative factor (e.g. this lightvector will be boosted to 2.5 times its normal strength). This line, together with the next two lines, indicates what will be combined to make the new file v855.pdm.
After that, the next two lines indicate the file dimensions, and the last line of the header indicates the default range of the image data. Since PDM files are of type double (e.g. the numbers themselves range up to more than 10^308), this number does not indicate the limit of the data type, but, rather, the limit of the data stored in the datatype (e.g. the maximum entry in the image array).
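A minimal sketch of parsing the selection lines of such a header follows (the helper name is hypothetical; only the fields described above are assumed):

```python
def parse_pdm_selections(header_lines):
    """Parse comment lines of the form '#v852.pgm 0 0 1 g *2.5' into
    (source file, target RGB colour, source channel, gain) tuples."""
    selections = []
    for line in header_lines:
        parts = line.lstrip("#").split()
        if len(parts) == 6 and parts[0].endswith(".pgm"):
            name = parts[0]
            rgb = tuple(float(c) for c in parts[1:4])  # e.g. (0, 0, 1) = blue
            channel = parts[4]                         # 'g' = take green channel
            gain = float(parts[5].lstrip("*"))         # '*2.5' -> boost 2.5x
            selections.append((name, rgb, channel, gain))
    return selections

print(parse_pdm_selections(["#v852.pgm 0 0 1 g *2.5",
                            "#v853.pgm 0 0 1 g *2.5",
                            "#v854.pgm 0 0 1 g *2.5"]))
```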
Each combined (PDM) lightvector (e.g. v855.pdm) of the above file size occupies 36 megabytes of disk space. Therefore, preferably, when a large number of lightvectors are being acquired, the JPEG file format is used to store the individual lightvectors.
A new file format is invented for storing combined lightvectors. This new format is a compressed version of the PDM file format, and normally has the file extension ".jdg". Thus the above file would become "v855.jdg" in a combined form.
Fig 5 depicts lightvectors in the multidimensional photoquantigraphic space formed by either converting a picture to lightspace coordinates (by applying the inverse nonlinear transfer function of the camera to the picture) or by shooting with a camera that provides lightspace (linearized) output directly, and then unrolling a picture so taken into a long vector. For example, a picture of pixel dimension 640x480 may be regarded as a single point in a 307200 dimensional lightvectorspace. Likewise, continuous pictures on film may be transformed (by application of the inverse response of film and scanner) to points in infinite dimensional lightvectorspace, but for purposes of illustration, consider the case when the pixel count is finite while the pixel values remain unquantized. The first ellipsis 580 ("...") denotes a continuation up to v307200, denoted 590; v307201, denoted 591; and beyond to, for example, v999999, denoted 599 in the figure. However, in practice, due to limitations of film, there are typically far fewer lightvectors than the dimension of the space, e.g. 36 lightvectors if using a typical 35mm still film camera; in the case of a motion picture film camera or most electronic cameras, the number of lightvectors is typically not more than 999 in many applications of the invention.
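For illustration, a sketch of forming such a lightvector from a picture follows, assuming the generic cube-law inverse response discussed later in this disclosure (all names are hypothetical):

```python
import numpy as np

def to_lightvector(picture, f_inverse):
    """Convert a picture into a point in lightvectorspace: undo the
    camera's nonlinear response, then unroll into one long vector."""
    q = f_inverse(np.asarray(picture, dtype=np.float64))
    return q.reshape(-1)                 # 480x640 -> one point in R^307200

f_inverse = lambda x: x ** 3.0           # generic cube-law inverse response
v = to_lightvector(np.random.rand(480, 640), f_inverse)
print(v.shape)                           # (307200,)
```

In practice only a handful of such points are ever captured, so they span a very low-dimensional subspace of this lightvectorspace.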
Accordingly, in many previous embodiments of the invention where lightvectors were each stored as a separate file on a hard disk, the filenames of these lightvectors were numbered with three digit alphanumeric filenames of the form v000, v001, ..., v123, if, for example, there were 124 lightvectors, so that they would list in numerical order on a typical UNIX-based computer using the "ls" command. For each scene or object being photographed, a new directory was created to hold the corresponding set of lightvectors.
Fig 6 depicts some pictures that represent linear combinations of two light sources and are therefore in a two dimensional lightvector subspace of the 307200 dimensional space. Lightvector 610 denotes the vector spanned by 520, 530, and 540 of Fig 5.
Light vector 630 denotes the lightvector spanned by 560 and 570 of Fig 5. Lightvectors 620 and 640 denoted in bold together with 610 and 630 span a two dimensional lightvector subspace.
Fig. 7a depicts two pictures, 710 taken with a slow shutter (long exposure) and no flash to record the natural light and 720 taken with a fast shutter and flash to record the response of the scene to the flash. In 710, the entire image is properly exposed, while in 720 the foreground objects are properly exposed while the background objects are underexposed. These two images represent two points in the 307200 dimensional space of Fig. 5 and Fig. 6. Any two such noncollinear (e.g. corresponding to differently lit pictures) points span a two dimensional space, depicted by the plane of the paper upon which Fig. 7a is printed. The two axes of this space are the ambient light axis 730 (labeled 740 with numerals 750) and the flash axis 760 (labeled 770 with numerals 780).
The manner in which the other images are calculated will now be described with reference to Fig. 7b, which depicts the top row of images in Fig. 7a, namely images 790, 792, and 794.
The basis image of Fig. 7a denoted 710 is depicted as function f1 in Fig. 7b.

The basis image 720 is denoted as function f2. These functions are functions of the quantity of light falling on the image sensor due to each of the sources of light.

The quantity of light falling on the image sensor due to the natural illumination is q1(x,y). That due to the flashlamp is q2(x,y). Thus the picture 710 is given by f1(x,y) = f(q1(x,y)). Similarly, picture 720 is given by f2(x,y) = f(q2(x,y)). Thus, referring to Fig. 7b, passing f1 and f2 through the inverse camera response function, f^-1, results in q1 and q2, which are then distributed through vectorspace weights w11 through w32. These vectorspace weights are denoted by circles along the signal flow paths in Fig. 7b.
The vectorspace weights wij map the basis quantities qj to lightvectors q1i, to form the first (top) row of pictures depicted in Fig. 7a, according to the following equation:

q1i = wi1 q1 + wi2 q2  (i = 1, 2, 3)

which corresponds to the operation performed in Fig. 7b. The linear vectorspace spanned by the q1i is called the lightvector space. The nonlinear space spanned by the f1i = f(q1i) is called the lightstroke space.
The other two rows of pictures in Fig. 7a are formed similarly. Clearly we need not limit ourselves to a 3 by 3 grid of pictures in Fig. 7a. In general we will have a continuous two-dimensional lightstroke space, from which infinitely many pictures can be generated within a continuous plane like that of Fig. 7a.
Any image that would have resulted from any amount of flash or natural light mixture falls somewhere on the page (plane) depicted in Fig. 7a. For example, at coordinates (0,2), which correspond to zero ambient exposure (fast shutter) and two units of flash, we have an image 790 where the foreground objects are overexposed while the background objects are still underexposed. At (1,2) we have an image 792 in which the foreground is grossly overexposed and the background is normally exposed, while at (2,2) we have an image 794 in which the foreground is so heavily overexposed that these objects are completely white, while the background subject matter is somewhat overexposed. Therefore, lightvectors 710 and 720 are all that are needed to render any of the other lightvectors (and thus to render any of the other images). Thus a photographer who is uncertain exactly how much flash to use may simply capture a small number (at least two) of noncollinear lightvectors (e.g. take two pictures that have different ratios of flash and natural light) and can later render any desired fill-flash ratio. Thus we see that 710 and 720 form a basis for rendering any of the other nine images presented on this page, and in fact any of the infinitely many images at other coordinates on this two dimensional page. In practice, however, due to noise (quantization noise owing to the fact that the camera may be digital, as well as other forms of noise), a better image will be rendered if the two lightvectors are determined more accurately than by just taking one picture for each (two pictures total). By taking more than just one picture each, a better estimate is possible. For example, taking ten identical pictures for 710 and averaging them together, as well as taking another ten identical pictures for 720 and averaging them together, will result in much better lightvectors. Moreover, instead of merely taking multiple identically exposed images for each lightvector, the process will be better served by taking multiple differently exposed images for each lightvector.
This is in fact the scenario depicted in Fig 5 where for example, lightvector v2,3,4 is determined from three lightvectors 520, 530, 540, using the Wyckoff principle. The Wyckoff principle is a generalization of signal averaging known to those skilled in the art of image processing.
Finally, a further generalization of signal averaging, which is also a generalization of the Wyckoff principle, is the improved estimation of the lightstroke subspace through capture of lightvectors off the axes. For example, suppose that image 794 was not rendered from 710 and 720, but instead suppose that image 794 was captured directly from the scene with two units of flash and two units of ambient light. From these three pictures, 710, 720, and 794, any other picture in the two dimensional lightvector subspace can be rendered. These three pictures form "basis" images.
Since the "bases" are overdetermined (e.g. three vectors that define a plane), there is redundant information, and redundant information assists in combating noise just as the redundant information of signal averaging with identical exposures (identical points in the 307200 dimensional space) or implementing the Wyckoff principle with collinear lightvectors (collinear points in the 307200 dimensional space) did. Thus the photographer may capture a variety of images of different combinations of flash and natural illumination, and use these to render still others, having combinations of flash and natural illumination different from any that were actually taken.
Often it is not possible to have the shutter be fast enough to completely exclude background illumination. In particular, lightpainting is practiced normally in dark places, and it would be desirable that this art could be practiced in places that cannot be darkened completely as might arise when streetlamps cannot be shut off, or when a full moon is present, or when one might wish to have the comfort and utility of working in an environment that is not totally dark. Accordingly, a coordinate transformation may be applied to all lightvectors with respect to the ambient lightvector. The ambient lightvector may be obtained by simply taking one picture with no activation of flash (to capture the natural light in the scene). In practice, many identical pictures may be taken with no flash, so that photoquantigraphic signal averaging can be used to determine the ambient lightvector. Preferably various differently exposed ambient light pictures are taken to calculate an extended response picture for the ambient light.
The photoquantigraphic subtraction of the ambient light image f0 from another image fi is given by the expression: f(f^-1(fi) - f^-1(f0)), where f is the response function of the camera. More generally, an entire lightvector space may be photoquantigraphically coordinate transformed.
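A sketch of this subtraction, under the same generic power-law assumption, follows:

```python
import numpy as np

f_inverse = lambda x: np.asarray(x, dtype=np.float64) ** 3.0  # generic cube law
f = lambda q: q ** (1.0 / 3.0)

def photoquantigraphic_subtract(f_i, f_0):
    """Compute f(f^-1(f_i) - f^-1(f_0)): remove the ambient lightvector
    f_0 from picture f_i in lightspace, then return to picture space."""
    q = f_inverse(f_i) - f_inverse(f_0)
    return f(np.clip(q, 0.0, None))   # negative residue is noise; clip it
```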
An example of a photoquantigraphic coordinate transformation is depicted in Fig 8, where the ambient light axis 810 remains fixed, but the space is sheared along the flash axis owing to the change to a new axis 820, now called "Total illumination" 830. The numerals now extend further 840, owing to the fact that the new coordinates capture the essence of images such as 850, which is now greatest along axis 820. Mathematically, the example coordinate transformation given in Fig. 8 may be written:
[ ambient ]     ( [ 1  0 ] [ f^-1(ambient) ] )
[ total   ] = f ( [ 1  1 ] [ f^-1(flash)   ] )

which, through example, illustrates what is meant by a "photoquantigraphic coordinate transformation".
Fig 9a illustrates what is meant by a photoquantigraphic summation, and illustrates through example one of the many useful mathematical operations that can be performed in lightspace. Images 910 are processed through inverse (f^-1) transfer functions 920 to obtain photoquantigraphic measurements (lightvectors) 930. These photoquantigraphic measurements are denoted q1 through qn.
Typically the inverse transfer functions 920 expand the dynamic range of the images, since most cameras have been built to compress the dynamic range of typical scenes onto some recording medium of limited dynamic range. In particular, inverse transfer functions 920 are preferably such that they undo the inherent dynamic range compression process performed by the specific camera in use. In the absence of knowledge about the specific camera being used, the inverse transfer function 920 may be estimated by comparing two or more pictures that differ only in exposure, as described in U.S. Pat. No. 5,828,793. Alternatively, a generic inverse transfer function may be used. A satisfactory generic inverse transfer function is the function f^-1(fi) = fi^3. Thus a satisfactory operation is to cube each of the incoming images, although it would be preferable to try to actually estimate or determine f^-1.
The transfer functions drawn in boxes 920 are actually a plot of the function f^-1(fi) = fi^2, simply because the parabolic shape is one of the easiest concave-upward plots to draw by hand, and is typical of the shape of many inverse transfer functions, e.g. it is visually similar to the actual shapes of curves typically used. In this illustration, then, every pixel of each incoming image 910 is squared to make a new image 930, which is the lightvector. These squared images are summed 940 and the square root of the sum is computed by transfer function 950 to produce output image 960.
Optionally, the photoquantigraphic summation may be a weighted summation, in which case weights 935 may be adjusted as desired.
Suppose that each of the input images in Fig. 9a corresponded to a set of pictures that differed only in illumination, and that each of these pictures corresponded to a picture taken by the apparatus of Fig. 3 with the flashlamp in each of the positions depicted in Fig. 3. Then image 960 would have the visual appearance of an image that would have been taken if the three flashes depicted in Fig. 3 were simultaneously activated at the three locations of Fig 3, rather than in sequence.
It should be noted that merely adding the images together will not produce the desired result because the images do not record the quantity of light, but, rather, some compressed version of it.
Similarly, the example depicts a squaring, when in fact the actual inverse function needed for most cameras is closer to raising to an exponent between about three (cubing) and five (raising to the fifth power).
As stated above, inverse function 920 might cube the images, and forward function 950 might extract the cube root of the sum, or in the case of a typical film scanned by PhotoCD, it has been found by experiment that the exponent of 4.22 for 920 and (1/4.22) for 950 is satisfactory. Moreover, a more sophisticated transfer function other than simply raising images to exponents is often used when practicing the invention presented here. Typically the curves 920 are monotonic and also of monotonic slope.
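For illustration, a sketch of such a photoquantigraphic summation using the exponent found for PhotoCD-scanned film follows (the helper name is hypothetical):

```python
import numpy as np

def photoquantigraphic_sum(pictures, weights=None, gamma=4.22):
    """Sum pictures in lightspace using a generic power-law inverse
    response (920), then compress back to picture space (950)."""
    pictures = [np.asarray(p, dtype=np.float64) for p in pictures]
    if weights is None:
        weights = [1.0] * len(pictures)   # unweighted photoquantigraphic sum
    q_total = sum(w * p ** gamma for w, p in zip(pictures, weights))
    return q_total ** (1.0 / gamma)
```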
Lastly if and when cameras are made to directly support the art of this invention, these cameras would provide measurements linearly proportional to the quantity of light received, and therefore the images would themselves embody a lightvector space directly.
In general, the input images will typically be color pictures, and the notion of photoquantigraphic vectorspace implicit in Fig. 9a is replaced with that of photoquantigraphic modulespace. Typically a color camera involves the use of three separate color channels (red, green, and blue). Thus the inverse transfer functions 920 will apply a separate inverse transfer function to each of the three channels. In practice, the three separate inverse transfer functions for a particular camera are quite similar, so it may be possible to apply a single inverse transfer function to each of the three color channels. Once these color inverse transfer functions are applied, the quantities of light 930 are color quantities, and weights 935 are colour weights.
In general, weights 935 will be three by three matrices (e.g. matrices with nine elements). Thus instead of a single scalar constant as with greyscale images, there are nine scalar constants for color images. These constants 935 amount to a generalized color coordinate transformation, with scaling of each of the colour components. The resulting color quantities are then added together, where adder 940 is now a three channel adder. Forward transfer function 950 is also a color transfer function (e.g. comprises three scalar transfer functions, one for each channel). Output image 960 is thus a color image.
In some cases, it is preferable to completely ignore the color information in the original scene, while still producing a color output image. For example, a photoborg may wish to ignore color information present in some of the lightstrokes, while still imparting a strong color effect in the output. Such lightstrokes will be referred to as pseudocolor lightstrokes. An example of when such a lightstroke is useful is when shooting late at night, when one wishes to have a blue sky background in a picture, and the sky is not blue. For example, suppose that the sky is green, or greenish/reddish brown, as is typically the case for a night time sky. A color image of the sky is captured, and converted to greyscale. The greyscale image is converted back to color by repeating the same greyscale entry three times. In this way the file and data type is compatible with color images but contains no color information. Accordingly, it may be colorized as desired; in particular, a weighting causing it to appear in or affect only the blue channel of the output image may be made, notwithstanding the fact that there was little if any blue content in the original color image before it was converted to greyscale. An example in which two greyscale images are combined to produce a pseudocolor image is depicted in Fig. 9b.
Specifically, Fig. 9b depicts this variation of the photoquantigraphic modulespace in which the color coordinate transformation matrices 935 (of Fig. 9a) are:

[ 0.299  0.587  0.114 ]
[ 0.299  0.587  0.114 ]
[ 0.299  0.587  0.114 ]

where the square matrix is formed by repeating the luminance row of the standard YIQ transformation three times. Thus it is clear that this matrix will destroy any color information present in the input image, yet still allow the output image to be colorful (by way of the ability to adjust weights wR, wG, and wB).
In Fig. 9b there is depicted a situation involving two input images, so the corresponding mathematical operation is that of a photoquantigraphic pseudocolor modulespace given by:

qR = wR1 qy1 + wR2 qy2
qG = wG1 qy1 + wG2 qy2
qB = wB1 qy1 + wB2 qy2

where qy1 and qy2 are the greyscale quantities of light due to the two input images, and the output image is (f(qR), f(qG), f(qB)).
This mathematical operation can be simplified by just using a greyscale camera.
In fact it is often desirable to use only a greyscale camera, and simply paint the scene with pseudocolor lightvectors. This strategy is particularly useful when the scene includes apartment buildings or dwellings, so that an infrared camera and infrared flashlamp may be used. In this way a colorful lightvector painting can be made without awakening or disturbing residents. For example, it may be desired to create a colorful lightvector painting of an entire city, by walking down each street with a flashlamp and flashing light at the houses and buildings along the street, without disturbing the residents. In this situation, a satisfactory camera is the Kodak (TM) DCS-460 in which the sensor array is specially manufactured with no color filters over any of the pixel cells, and with no infrared rejection filter. Such a specially manufactured camera will have a tremendously increased sensitivity compared to the standard DCS-460, allowing a small handheld flashlamp to illuminate a large building.
A satisfactory flashlamp is a specially modified Lumedyne 468 system in which a quartz infrared flashlamp is fitted, and in which an infrared filter is placed over the lamp head reflector. A satisfactory reflector is the Norman-2H sports reflector, which will also fit on the Lumedyne lamp head. Preferably a cooling fan is installed in the reflector to dissipate excess heat buildup on account of the infrared filter that makes the lamp flashes invisible to the human eye.
In Fig. 9b, what is shown is two input images that have either already been converted to greyscale, or were greyscale already, on account of their being taken with a greyscale system, such as the infrared camera and flashlamp described above.
These two greyscale input images are denoted fy1 and fy2 in Fig. 9b. The images then pass through the inverse transfer function, denoted f^-1, producing qy1 and qy2. These quantities contain no color information from the original scene. However, it is desired to colorize them into a color output image. Accordingly, quantity qy1 is spread out into three identical copies, each passing through weights wR1, wG1, and wB1. Similarly, quantity qy2 is spread out into three identical copies, each passing through weights wR2, wG2, and wB2. A total quantity of red is obtained at qR, a total quantity of green at qG, and a total quantity of blue at qB. These total quantities are then converted into a picture by passing them through three separate forward transfer functions f.
In practice each of these three transfer functions is similar enough that they may be regarded as identical, but if desired, may also be calculated independently if there is reason to believe that the camera compresses the dynamic range of its three color channels differently.
In a typical scenario, an image of a building interior may be taken with the infrared camera and infrared flashlamp described above. This interior may, for example, be the stairwell of an apartment building in which there are glass windows showing the stairs to the outside. It is desired to capture an expressive architectural image of the building.
A photoborg climbing the stairs flashes a burst of infrared light at each floor, to light up the inside stairs. The images arising from these bursts are captured by an infrared camera fixed outside. The camera is preferably fixed by a heavy cast iron surveyor's tripod registered on three stakes driven into the ground, or the like. After the photoborg has done each floor, the resulting images are photoquantigraphically averaged together as was shown in Fig. 9a. The photoquantigraphic average is the image fy1 depicted in Fig. 9b.
Then the photoborg leaves the building and illuminates the exterior. Again, an infrared flashlamp is used so as not to awaken or disturb residents of the building.
A large number of exterior pictures are taken, while the photoborg walks around and illuminates the outside concrete structure of the building. These images of the exterior are photoquantigraphically averaged to obtain fy2, depicted in Fig. 9b.
In the case of a small building, a single shot may provide sufficient coverage and Signal to Noise Ratio (SNR), but often multiple shots are photoquantigraphically averaged as described.
Then the photoborg selects the weights. A common selection for the weights in the scenario described above is wR1 = 1, wG1 = 1, wB1 = 0, to give the building interior a welcoming yellow appearance, and wR2 = 0, wG2 = 0, wB2 = 1, to give the exterior a "midnight blue" appearance. Thus, although the camera captured no color information from the scene, a colorful expressive image as might be printed on the cover of an architectural magazine using high quality color reproduction may be produced.
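A sketch of such a pseudocolor combination, using the yellow-interior/blue-exterior weights just described and a generic power-law response, follows (all names are hypothetical):

```python
import numpy as np

Y_ROW = np.array([0.299, 0.587, 0.114])      # luminance row of the YIQ transform

f_inverse = lambda x: np.asarray(x, dtype=np.float64) ** 3.0
f = lambda q: np.clip(q, 0.0, None) ** (1.0 / 3.0)

def pseudocolor_sum(images_and_weights):
    """Combine greyscale lightvectors into a colour output image.
    Each entry is (picture, (wR, wG, wB))."""
    q_total = 0.0
    for picture, (wR, wG, wB) in images_and_weights:
        qy = f_inverse(picture)
        if qy.ndim == 3:                     # colour input: destroy its colour
            qy = qy @ Y_ROW
        q_total = q_total + np.stack([wR * qy, wG * qy, wB * qy], axis=-1)
    return f(q_total)

interior = np.random.rand(480, 640)          # infrared exposures, greyscale
exterior = np.random.rand(480, 640)
out = pseudocolor_sum([(interior, (1, 1, 0)),    # welcoming yellow
                       (exterior, (0, 0, 1))])   # midnight blue
```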
The above scenario is not entirely ideal because it may be desired to mix color lightstrokes with pseudocolor lightstrokes in the same image. Accordingly, a more preferable scenario is depicted in Fig. 9c.
Fig. 9c depicts a simplified diagram showing only some of the steps involved in making a typical lightmodule painting.
The process begins by calculating one or more ambient lightmodules. This estimate is useful either for photoquantigraphically subtracting from each image that will later be taken, or simply to fill in a background level of detail. In the latter case, the ambient lightmodule typically comprises a daytime estimate multiplied by the color blue, added to a night time estimate multiplied by the color yellow, and added to the overall image in addition to the lightstrokes made with the photoborg's flashlamp.
There may be more than one ambient lightvector, as indicated here (e.g. one for daytime lighting to create a blue sky in the final picture, and one for nighttime lighting to create yellow lights inside all the buildings in the picture). Sometimes there are hundreds of different ambient lightvectors computed as the sun passes through the sky, so that each time of day provides different shadow conditions from which other desired lightmodule spaces are computed.
In this simple example, it is assumed that only one ambient lightmodule is to be computed. This ambient lightmodule is typically computed as follows: A photoborg first issues a command from his WearComp (wearable computer) to the base station to instruct it to construct an estimate of the background ambient illumination. The computer at the base station directs the camera at the base station to acquire a variety of differently exposed pictures. In this simple example, sixteen pictures are captured at 1/2000th of a second shutter speed.
These pictures are stored in files with filenames ranging from v000.jpg to v015.jpg.
Note that v000.jpg, etc., are not usually lightvectors until they pass through the camera's inverse transfer function, unless the camera already shoots in lightvectorspace (e.g. is a linearized camera). The signal v000 in Fig. 9c denotes the image stored in file v000.jpg, the signal v001 in Fig. 9c denotes the image stored in file v001.jpg, and so on. These sixteen images are photoquantigraphically averaged. By photoquantigraphic averaging, what is meant is that each is passed through an inverse transfer function, f^-1, to arrive at the quantities of light falling on the image sensor, and then these quantities are averaged. These values are denoted q000 through q015 in Fig. 9c. Each of these values may be stored in a double precision image array, although preferably the process is done pixelwise or in smaller blocks so that the amount of memory required in the base station computer is reduced. The average of these sixteen photoquantigraphic quantities is denoted v0-15 in Fig. 9c. It should be noted that average and sum are conceptually identical, and that the extra factor of division by 16 may be incorporated into the weight w0-15 to be described later.
Then the base station computer continues to instruct the camera to acquire sixteen pictures at a shutter speed of 1/250 sec. The picture signals associated with these sixteen pictures are denoted v016 through v031 in Fig. 9c. These signals are used to estimate the photoquantigraphic quantities q016 through q031. These photoquantigraphic signals are averaged together to arrive at lightmodule v16-31.
Then the base station computer continues to instruct the camera to acquire sixteen pictures at a shutter speed of 1/8 sec. The picture signals associated with these sixteen pictures are denoted v032 through v047 in Fig. 9c. These signals are used to estimate the photoquantigraphic quantities q032 through q047. These photoquantigraphic signals are averaged together to arrive at lightmodule v32-47.
The three lightmodules v0-15, v16-31, and v32-47 are further processed by weighting each of them in accordance with the shutter speeds. Thus v0-15 is multiplied by 2^3 = 8, while v32-47 is multiplied by 2^-5 = 1/32. Lightmodule v16-31 is multiplied by 1 (e.g. it is left as it is, since it has been selected as the reference image).
In this way, all lightmodules are scaled according to the shutter speeds, so that each will be an equivalent estimate of the quantity of light arriving at the image sensor, except for the fact that quantization noise, and other forms of noise, and the like, will tend to cause the highlight detail of v0-15 to be best, while the shadow details will be best captured by lightmodule v32-47.
This preference for highlight detail from v0-15, midtone detail from v16-31, and shadow detail from v32-47 is captured by certainty functions c0-15, c16-31, and c32-47, shown in Fig. 9c. After applying these certainty functions, a weighted summation is made, to arrive at lightmodule signal v0, which is the estimate of the ambient light.
Lightmodule v0 is typically a double-precision 3 channel (color) array of the same dimensions as the input images. However, v0 is in photoquantigraphic units (which are neither irradiance nor illuminance, but, rather, are characterized by the spectral response in each of the three color bands over which they are taken).
Typically, lightmodule v0 is actually computed over many more exposure steps, e.g. 256 pictures at every possible shutter speed the camera is capable of.
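For illustration, a sketch of computing the ambient lightmodule v0 from groups of differently exposed pictures follows. The Gaussian certainty functions are illustrative assumptions only; the disclosure specifies which tonal range each exposure should dominate, not the exact shape of the functions.

```python
import numpy as np

f_inverse = lambda x: np.asarray(x, dtype=np.float64) ** 3.0  # assumed generic law

def ambient_lightmodule(groups):
    """Estimate v0 from groups of pictures. Each group is
    (pictures, gain, certainty): pictures share one shutter speed,
    gain brings the group into tonal register (e.g. 8 for 1/2000 s
    against a 1/250 s reference, 1/32 for 1/8 s), and certainty(q)
    weights the tonal range this exposure records best."""
    num, den = 0.0, 1e-12
    for pictures, gain, certainty in groups:
        q = gain * np.mean([f_inverse(p) for p in pictures], axis=0)
        c = certainty(q)
        num, den = num + c * q, den + c
    return num / den                     # certainty-weighted combination

bump = lambda mu: (lambda q: np.exp(-((q - mu) ** 2) / 0.05))  # assumed shape
fast = [np.random.rand(480, 640) for _ in range(16)]   # 1/2000 s pictures
mid = [np.random.rand(480, 640) for _ in range(16)]    # 1/250 s pictures
slow = [np.random.rand(480, 640) for _ in range(16)]   # 1/8 s pictures
v0 = ambient_lightmodule([(fast, 8.0, bump(0.8)),         # trusts highlights
                          (mid, 1.0, bump(0.5)),          # trusts midtones
                          (slow, 1.0 / 32.0, bump(0.2))]) # trusts shadows
```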
After the ambient light quantity v0 is determined, control of the camera is returned to one or more photoborgs, who can then select a portion of the scene or objects in view of the camera to illuminate. A photoborg generally illuminates the scene with a flashlamp or phlashlamp. A phlashlamp is a photoquantigraphic flashlamp, as illustrated in Fig. 9d.
Fig. 9d shows the Medusa8 (TM) flashlamp, which is made from eight ordinary flashlamps. A satisfactory configuration is made from eight of the most powerful Metz (TM) flashlamps mounted to a frame with grip handles M8G. Grips M8G are preferably smooth and easy to grab onto. A satisfactory material for handles M8G is cherry, or other hardwood. Grips M8G allow the photoborg operator to hold the bank of eight flashlamps and aim the entire bank of flashlamps at the subject matter of interest.
One of the grips M8G preferably contains a chording keyboard built into the grip, which allows the photoborg to type commands into a wearable computer system used together with the apparatus of Fig. 9d. When the photoborg has selected subject matter of interest, and aimed the phlashlamp at this subject matter, the photoborg issues an acquire lightmodule command. This command is transmitted to the base station computer, causing four pictures to be taken in rapid succession. Each of these pictures generates a sync pulse transmitted from the base station to the photoborg.
There is contained in the phlashlamp a sequencer computer, M8C, which fires the four flashlamps designated M8F when the first synchronization pulse is received. Alternatively, the sequencing may be performed on the body-worn computer (WearComp) often worn by a photoborg. The sequencing computer M8C fires the two flashlamps M8T when the next sync pulse is received. It then fires the single flashlamp designated M8O when the third sync pulse is received. Finally it fires the flashlamp M8H at half power when the fourth sync pulse is received. In order for this sequencing to take place, the eight flashlamps are connected to the sequencing computer M8C by way of wires M8W leading from each of the "hot shoe" connectors M8HS ordinarily found on many flashlamps. Typically hot shoe connectors M8HS are located on the bottom of the flashlamp bodies M8B. The flashlamp bodies M8B can usually be folded to one side, so that the flashlamp heads M8F, M8T, M8O, and M8H can all be clustered together in approximately the same space.
In this way, the Medusa8 (TM) flashlamp causes a plurality of pictures of different exposure levels to be taken. This plurality of pictures (in this case, four differently exposed pictures) is designated v049 through v052 in Fig 9c.
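A sketch of the sequencing logic follows; the callback interface and class name are hypothetical, while the 4/2/1/half firing order is as described above.

```python
FIRING_ORDER = [
    ("M8F", False),   # pulse 1: the bank of four lamps designated M8F
    ("M8T", False),   # pulse 2: the pair of lamps designated M8T
    ("M8O", False),   # pulse 3: the single lamp M8O
    ("M8H", True),    # pulse 4: lamp M8H, fired at half power
]

class SequencerM8C:
    """Each incoming sync pulse fires the next group in the sequence,
    yielding four exposures roughly one stop apart per aim point."""
    def __init__(self, fire_bank):
        self.fire_bank = fire_bank   # callback closing the hot-shoe contacts
        self.step = 0

    def on_sync_pulse(self):
        if self.step < len(FIRING_ORDER):
            designation, half = FIRING_ORDER[self.step]
            self.fire_bank(designation, half_power=half)
            self.step += 1

seq = SequencerM8C(lambda bank, half_power: print("fire", bank, half_power))
for _ in range(4):       # four sync pulses from the base station
    seq.on_sync_pulse()
```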
Alternatively (and often preferably), a flashlamp comprises a single lamp head, with a single flashtube, which is flashed at a plurality of different levels in rapid succession, by, for example, rapidly switching differently sized capacitor banks into the system. In this way, all flashes of light come from exactly the same direction, as opposed to the Medusa8 approach in which flashes of light come from slightly different directions, owing to the fact that different flashtubes are being used.
A flashlamp may be fired repeatedly as a photoborg walks around and illuminates different objects in the scene. Alternatively, several photoborgs carrying phlashlamps may illuminate the scene either at the same time (synchronized to each other), or may take turns firing their respective phlashlamps. For example, another object is selected by another photoborg, and this photoborg aims the flashlamp at this other object, and another acquire lightmodule command is issued. Another four pictures are taken in rapid succession, and these are designated as v812 through v815 in Fig 9c.
Typically each photoborg is given a range of images, so, for example, photoborg 1 may have image space from v100 to v199, and photoborg 8 will have image filenames v800 to v899. Alternatively, the photoborg's UID and GID may be inserted automatically in each filename header, together with his or her heart rate, physical coordinates, etc., and any other information which may help to automate the process.
In this way, the base station may also, through Intelligent Signal Processing, as described in Proc. IEEE, Vol. 86, No. 11, make an inference as to which lightmodules are most important or most interesting to each of the photoborgs.
In the situation depicted in Fig 9c, photoborg 1 has decided to cement his lightvector into the sum with weight w1, while photoborg 8 has selected weight w2. Additionally, photoborg 8 has decided to cement his contribution into the sum as a greyscale image but with color weight w2. As will be seen, although w2 affects the color of the lightmodule as it appears in the final Output Image, no color information from v2 gets to the Output Image.
The lightmodule from photoborg 1 is computed automatically by setting wf = 1/4, wt = 1/2, wo = 1, and wh = 2. In this way, the four-flash image is scaled down four times, the two-flash image down two times, and the half-power-flash image up two times, so that all four photoquantigraphic estimates q100 through q103 are brought into tonal register. Then certainty functions are applied. The four-flash certainty function, cf, weights the darker pixels in q100 most heavily. The two-flash certainty function, ct, weights the darker midtones most heavily, while the one-flash certainty function, co, weights the brighter midtones most heavily. The half-power-flash certainty function, ch, weights the highlights (brightest areas of the scene) most heavily. The result is that the weighted sum of these four inputs gives lightmodule v1. Lightmodule v1 then continues on toward the total photoquantigraphic sum, with a weighting w1 selected by photoborg 1.
In the situation in which a pseudocolor lightmodule is desired, as is illustrated in Fig. 9c with q812 through q815, the color lightmodule v2 is computed just as in the above case, but instead, this lightmodule is converted to greyscale and typecast back to color again as follows: lightmodule v2 passes through color separator CS and is broken down into separate Red (R), Green (G), and Blue (B) channels. Each of these has a weight associated with it. The weights are designated wR, wG, and wB in Fig. 9c. The default weights are those of the standard YIQ transformation if none are specified by the photoborg.
Color depth is often expressed in bits per pixel, e.g. 8 bit precision is often referred to as "24 bit color" (meaning 24 bits total over all three channels). Likewise, a double precision variable (e.g. a REAL*8 variable, in standard IEEE floating point arithmetic) occupies 64 bits for each of the red, green, and blue channels, and is thus designated as 192 bit color. Hence the designations in Fig 9c showing where the signals have a color depth of 192 bits. After the color separator CS, the signals in each channel have a depth of 64 bits (greyscale), passing through the weights. After passing through the weights, certainty functions are computed based on exposure in each color band. Thus, for example, if the red channel is overexposed, as is often the case where tungsten lights are concerned, then the highlight details can come from the blue channel. Photoborg 8 may also deliberately use a colored gel over a flashlamp, in addition to or instead of using a plurality of flashlamps as in a phlashlamp.
For example, a red gel over an ordinary flashlamp with a deliberate overexposure will conveniently overexpose the red channel. Typically the blue channel will be underexposed. The green channel will typically fall somewhere in between.
Accordingly, certainty functions cR, cG, and cB will often help extend the dynamic range of the greyscale image through the process of deriving a greyscale image from a color image. A weighted sum, including weighting by these certainty functions, is produced at v2y, which is still a 64 bits per pixel greyscale image. This image is replicated three times and supplied to color combiner CC. Ordinarily, a color combiner will take separate R, G, and B inputs and combine them into a single color image.
However, when fed with three identical inputs, color combiner CC simply converts the greyscale image into a datatype that is compatible with color images. The result, v2yc, is a 192 bits per pixel greyscale image in a format in which the subject matter is simply repeated three times. This signal may now be passed through weight w2, where it may be assigned to the final output image. Weight w2 might, for example, be [5 5 0], in which case lightmodule v2yc will appear as yellow in the final image, with a strength five times the default strength.
Ordinarily images are produced in 24 bit Red Green Blue (RGB), and converted to 32 bit CMYK (Cyan, Magenta, Yellow, blacK) for printing. However, if printing is desired, it will be advantageous to do the conversion to CMYK in lightspace prior to converting back to a 32 bit CMYK picture. Accordingly, the three lightmodules v0, v1, and v2yc are weighted as desired (these final weights are selected for the desired visual effect), a weighted sum is taken in 192 bit color, and converted to 256 bit CMYK colorspace by the block denoted RGBtoCMYK in Fig. 9c.
Ordinarily there is some color shift in conversion from RGB to CMYK, and most conversion programs are optimized for mid-key, e.g. fleshtones, or the like. However, a feature of the images produced by the apparatus of the invention is that much of the image content exists at extremes of the color gamut, so it is desirable that, when converting to CMYK colorspace, the resulting image stretch out toward the different boundaries of the CMYK space. The CMYK space is quite different from RGB space in the sense that there are colors that can be obtained in RGB that cannot be obtained in CMYK, and vice-versa. However, what is desired is an image that hits the edges of whatever colorspace it is to exist in. Most notably, color hue fidelity is typically less important than simply the fact that the image should touch the boundaries of the colorspace. Thus it will typically be desired to convert from RGB to LAB, HSV, or HSL space, then increase the saturation, and then convert to CMYK, where colors will be clipped off for being out of gamut.
Ironically, it is preferable that colors be clipped off as being out of gamut, rather than having them fall within the color gamut boundaries after having been previously clipped in the old colorspace. Thus the block denoted SAT in Fig. 9c will in fact use the output of block GBM (Gamut Boundary Manager), which detects where the colors were originally at the boundaries of the RGB colorspace. In this way, block SAT will adjust the CMYK input in accordance with where the RGB signals were at their extrema, and ensure that none of these colors get mapped to the interior of the CMYK space. The block denoted SAT performs an optimization in 256 bit lightspace and attempts to map any colors that were clipped to a clipped part of the new gamut.
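For illustration, a rough sketch of how SAT might act on GBM's output follows. The normalization used here is an assumption; the disclosure specifies only the goal of forcing such colors to the outer boundary of the CMYK gamut, accepting hue shifts.

```python
import numpy as np

def sat_with_gbm(q_cmyk, rgb_boundary_mask, strength=1.0):
    """Wherever GBM found a pixel at the RGB gamut boundary, push its
    CMYK value out to the CMYK gamut boundary too, so brightgrey
    regions stay vibrant (hue distortion is accepted)."""
    peak = q_cmyk.max(axis=-1, keepdims=True) + 1e-12
    pushed = q_cmyk / peak              # rescale so the largest channel hits 1
    pushed[..., 3] = 0.0                # never grey a bright colour with black ink
    m = rgb_boundary_mask[..., None]    # (H, W, 1), broadcast over C, M, Y, K
    return np.where(m, (1 - strength) * q_cmyk + strength * pushed, q_cmyk)
```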
Fig 9f depicts a color coordinate transformation from domain DOM to range RAN, where the domain DOM may, for example, be an RGB colorspace, or a higher dimensional colorspace as might be captured using a filter wheel over a camera. The range, RAN is typically a CMYK colorspace suitable for normal printing, or another colorspace, such as the Hexachrome (TM) colorspace described in U.S. Pat. No.
5,734,800, "Six-color process system", issued Nov. 29, 1994, invented by Richard Herbert and Al DiBernardo, and assigned to Pantone, Inc.
Ordinarily the two colorspaces have different color gamuts, so that there will be some colors in the domain DOM that will get clipped (distorted) when converted to the range RAN. Colors that are clipped are denoted CL in Fig. 9f.
Conversely, there will also be colors BC that are not necessarily distorted by the colorspace conversion, but were at the gamut boundaries in the domain DOM and exist inside the boundaries in the range RAN. Consider two colors in lightspace, BC1 and BC2, where BC1 is just at the boundary of the domain DOM, while BC2 is beyond the domain boundary DOM. The camera will map both of these colors to BC1, since BC2 is beyond its gamut. For example, both may be bright blue, and both may get mapped to RGB = [0 0 1]. However, after the conversion to the range RAN, both will appear within the boundary of what the new colorspace could achieve.
Colors BC1 and BC2 represent single pixels, in isolation. In practice, however, it is evident from a picture, by the context of surrounding pixels, when a region of the picture goes beyond the gamut of the colorspace. For example, an extremely bright red light in a picture will often register as white, and then have a yellow halo around it, and then bloom out to red further out. When such an image is rendered anywhere other than at the boundary of the new colorspace RAN, the appearance is not visually appealing, and will be referred to as "brightgrey". The term brightgrey denotes colors that should be bright but register as greyish, for example, colors that were bright and vibrant in DOM, but may appear greyish in RAN, especially when RAN is a CMYK colorspace or the like. For example, a bright magenta in RGB may register as a dull greyish magenta in CMYK, even though the color is not distorted.
In fact it is the very fact that the color is not distorted that is the problem; e.g. since CMYK is capable of producing a very strong magenta, there is a perception of the magenta being weak when it is faithfully reproduced in CMYK. Instead of faithfully reproducing it in CMYK, it is preferable, within the context of the invention, to distort the magenta from its original RGB value to something that is much stronger than it was in RGB. Typically this may be done by intensifying the magenta and reducing the amount of cyan, or the like, that might be causing the magenta to appear brightgrey. (Cyan and black tend to darken certain colors.) When the camera is a lightspace camera, e.g. one that implements a Wyckoff effect, or is otherwise based on a plurality of differently exposed images, it is possible to determine the actual quantity of light arriving in each of the three color spectral bands, and therefore it is possible to identify colors that are outside the RGB colorspace one would ordinarily have for taking a picture. When these colors would be further distorted by clipping, in conversion to the new colorspace, the appearance is not so bad as when they would fall in the interior of the new colorspace, so the emphasis of this invention is to address the brightgrey colors (colors denoted BC, or BC2 having been clipped to BC1 and then existing in the interior of RAN).
Most notably, there are two ways, within the context of the present invention, to obtain a vibrantly colored lightmodule painting: (1) use a plurality of input images, preferably differing only in exposure, to calculate each lightvector, and then do all calculations and colorspace conversions in lightspace, prior to converting back to an image by applying a pointwise nonlinearity, f; or (2) accept the fact that incoming lightstrokes will have been limited by domain DOM, and attempt to stretch them out in colorspace so that regions such as BC1 will be stretched out further toward the boundaries of the range RAN (e.g. BC1 would move out toward BC2 or beyond).
It is understood that this second method will involve some distortion of the colors and it is understood that this distortion is acceptable because often the apparatus of the invention is used to create expressive lightmodule paintings in which colors are not natural to begin with.
Fig. 9c includes the saturation booster SAT and the Gamut Boundary Manager GBM. The effect of the SAT block with the GBM input is to ensure that, for example, a portion of the image that a photoborg deliberately overexposed by 12 f-stops and then mapped through a dark blue filter weight, e.g. wi = [0 0 2^-12], will not come out with a greying effect in the CMYK space. It is not uncommon to deliberately overexpose by a dozen or so f-stops when using a dark blue (e.g. pure blue as in RGB = [0,0,1]) filter. Ordinarily such an image is shot overexposed in order to deliberately blow away any appreciable detail. Thus a textured door, or rough wall, will have an appearance as if a blob of deep blue paint were splashed on the image to obliterate any detail. Such an image creates the visual percept of something that is extremely bright. Thus should it land anywhere but at the outer edge of the CMYK gamut, it will create a very unsightly appearance. This appearance is hard to describe, other than by saying it looks "bright bluish grey". Obviously such a bright splotch of lightmodule paint should not be printed in any way that contains grey (e.g. contains black ink in CMYK). Thus SAT together with GBM must ensure, at all costs, that such a color maps to something at the outer boundary of CMYK space, even if it means that the hue must be shifted. Indeed, it is preferable that the hue does shift.
For example, it would be preferable that the blue be shifted to pure cyan, rather than risk having it fall anywhere but at the extreme outer boundary of CMYK space.
It is understood and expected that additional information will be lost when converting to CMYK. In fact, it is the very fact that methods of converting from RGB to CMYK of the prior art try to preserve information that leads to this problem. Thus an important aspect of the present invention is a means of converting from RGB to CMYK where hue fidelity is of relatively little importance, and where maintaining detail in the image is of relatively little importance compared to the importance of maintaining extremely bright vibrant colors.
Once the image has been adjusted in 256 bit CMYK lightspace, so that all colors that were bright and vibrant in the input image are also bright and vibrant in the CMYK lightspace (even if it was necessary to distort their hues, or destroy large amounts of highlight detail to do so), then the lightspace is passed through a nonlinearity f which compresses its dynamic range. The nonlinearity f may be the forward transfer function of the camera itself, or some other desired transfer function that compresses the dynamic range of the image in a slightly different way. After passing through f, the result is quantized to 32 bit color, so that it can be saved in a standard CMYK file format, such as TIFF. In this way, it can be sent either to a digital press, such as a Heidelberg digital press, or it can be used to make four color separation films which in turn can be used to prepare four metal plates for a traditional printing press. The resulting image will thus have very rich vibrant colors, and exhibit no noticeable quantization contouring (e.g. have no solarized appearance or contour line appearance). Typically the resulting images, when printed on a high quality press, such as is used for a magazine cover, will have a much richer tonal range, and much better color, than is possible with photographic film, because of the capabilities of the lightspace processing of the invention.
In practice, only some of the lightstrokes are offenders, contributing to or containing brightgrey portions. Accordingly, it is preferable to alter only the offending lightvectors, or to alter the worst offenders more severely. Accordingly, Fig. 9e depicts a plurality of quantities of light arriving from a cybernetic photography system. These quantities are denoted qRGB0, qRGB1, ..., qRGBN and are linearly proportional, in each of a plurality of spectral bands (typically at least three spectral bands, such as red, green, and blue), to the scene radiance integrated with the spectral response in each of these spectral bands. Such quantities are referred to as photoquantigraphic.
These quantities, qRGB0, qRGB1, and qRGBN, may be arrived at by applying the inverse camera response function, f^-1, to each of a plurality of pictures, or alternatively, a photoquantigraphic camera may be constructed, in which the output of the camera is in these photoquantigraphic units.
Each of these photoquantigraphs is typically due to a different source of light, e.g. qRGB0 might be a photoquantigraph taken in the daytime, qRGB1 a long exposure photoquantigraph taken at night, and qRGBN taken with a flashlamp over a short (e.g. 1/500 sec) exposure.
Each photoquantigraph is first converted immediately to CMYK. In Fig. 9e, the conversion from qRGB0 to CMYK is denoted by CMYK0, and the result of the conversion is denoted by qCMYK0; the conversion from qRGB1 to CMYK is denoted by CMYK1, and the result of the conversion is denoted by qCMYK1; ... the conversion from qRGBN to CMYK is denoted by CMYKN, and the result of the conversion is denoted by qCMYKN. Typically CMYK0, CMYK1, ... CMYKN are identical conversion processes, even though the input photoquantigraphs (and hence the outputs) are typically different. Each of these conversion processes is done independently in such a way as to minimize brightgrey, and thus contribute to a vibrant lightmodule painting. Thus qCMYK0 is generated from qRGB0 by also looking at the gamut boundaries. Gamut boundary manager 0, denoted GBM0, looks at the gamut boundaries of qRGB0, with particular emphasis on where the gamut boundaries are reached in the domain colorspace but not the range. Thus GBM0 controls SAT0 to resaturate and expand the gamut of qCMYK0, as well as deliberately distort the hue, and deliberately truncate highlight detail as needed to boost the brightgrey regions out to the edges of the new CMYK gamut. Similarly, GBM1 controls SAT1 to resaturate and expand the gamut of qCMYK1, ... and GBMN controls SATN to resaturate and expand the gamut of qCMYKN. The gamut boundary managers and saturation boosters are also responsive to the overall photoquantigraphic sum, e.g. to the sum in lightspace after it has passed through a forward transfer function, fCMYK. The forward transfer function fCMYK is semimonotonically increasing, but is level or concave down (has nonpositive second derivative) in each of the four C, M, Y, and K channels.
Fig. 10a depicts a photoquantigraphic filter, which will be referred to as a "philter". A philter is obtained by first computing the photoquantigraphic quantity 1030, denoted q, from the input image 1010, by way of applying the inverse response function of the camera, 1020. Then an ordinary filter, such as a linear time invariant filter, 1040, is applied. The result, 1045, is passed through camera response function 1050, to produce output image 1060.
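A minimal software sketch of a philter follows, using a Gaussian blur as the ordinary filter 1040; the choice of filter and the power-law response are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

f_inverse = lambda x: np.asarray(x, dtype=np.float64) ** 3.0  # 1020 (assumed law)
f = lambda q: np.clip(q, 0.0, None) ** (1.0 / 3.0)            # 1050 (assumed law)

def philter(picture, sigma=2.0):
    """Apply an ordinary linear filter to the photoquantigraphic
    quantity q (1030) rather than to the picture itself."""
    q = f_inverse(picture)                    # 1010 -> 1030: undo camera response
    q_filtered = gaussian_filter(q, sigma)    # 1040: ordinary LTI filter
    return f(q_filtered)                      # 1045 -> 1060: back to picture space
```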
Fig. 10b depicts an implementation of split diffusion in lightvectorspace. Split diffusion is useful because it is often desired that some lightvectors will contribute in a blurry fashion while others will contribute in a sharper fashion. More generally, it is often desirable that there will be different filters associated with different lightvectors or sets of lightvectors.
Referring to Fig. 10b, one or more lightvectors to be blurred, 1031, are passed through blurring filter 1040. It is assumed that lightvector(s) 1031 is (are) already in lightvectorspace (e.g. already passed through an inverse transfer function, or taken with a camera that outputs in lightspace).
The output of blurring filter 1040 is added to one or more other lightvectors by adder 1042, and the result is passed through a semimonotonic function of nonpositive second derivative 1050, giving an output image 1060.
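A sketch of this split diffusion follows (the helper name is hypothetical; the inputs are assumed to be in lightspace already, as stated above):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def split_diffusion(q_blurry, q_sharp, f, sigma=4.0):
    """Blur some lightvectors (1031 -> 1040), add the others unfiltered
    (1042), and compress the total back to an output image (1050 -> 1060)."""
    q_total = gaussian_filter(sum(q_blurry), sigma) + sum(q_sharp)
    return f(q_total)

f = lambda q: np.clip(q, 0.0, None) ** (1.0 / 3.0)   # assumed response function
out = split_diffusion([np.random.rand(480, 640)],    # lightvectors to be blurred
                      [np.random.rand(480, 640)], f) # lightvectors kept sharp
```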
A drawback of traditional image editing, where a feather radius is used to move pieces of an image from one place to another, is that there is a "brightgrey" effect when a bright area overlaps with a dark area. Typically, the bright area gets added to the dark area in regions of overlap, and what results is a grey area with clipped highlight details. Clipping is visually acceptable at the upper boundary of greyvalues, but when values are clipped and then reduced from white to grey, the appearance of bright lights or the like (such as in a region of the picture where bright lights were shining into the camera) is unacceptable when mixed with dark areas. For example, a "brightgrey" rendering of a portion of the scene where there was a bare lightbulb blasting light into the camera at very high intensity is not acceptable.
Accordingly, the notion of philters may be used to build up a complete image processing toolkit, which might, for example, include image editing tools, etc. These tools can all be adapted to lightspace, so that, for example, during image editing, the selection of a region may be done in lightvectorspace, so that if a feather radius is used, the feathering happens in lightvectorspace. Such a feathering operation will be referred to as a pheathering operation.
The pheathering operation is depicted in Fig. 10c. Here the philter operation is denoted 1041, and comprises the editing of the image in lightvectorspace.
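A pheathering step might be sketched as follows, assuming greyscale photoquantigraphic quantities and a binary selection mask; the blend is a crossfade of light quantities rather than an addition of clipped pixel values, which is what avoids the brightgrey artifact described above:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def pheather_paste(q_dest, q_src, mask, f, feather_sigma=8.0):
    """Feathered paste performed on photoquantigraphic quantities, so a
    bright source blends into a dark region without turning grey."""
    alpha = gaussian_filter(mask.astype(float), feather_sigma)  # feathered selection
    alpha = np.clip(alpha, 0.0, 1.0)
    q_out = alpha * q_src + (1.0 - alpha) * q_dest  # blend in lightvectorspace
    return f(q_out)  # back through the camera response for display
```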
When filtering operations, editing operations, or split diffusion operations are tonally drastic, e.g. when one wishes to perform strong sharpening or blurring operations on images, the effects of limited dynamic range often become evident. In this case, it is preferable that the inputs have extended dynamic range. Accordingly, Fig. 10d shows an example of a set of collinear lightvectors 1010, 1011, and 1012, which are processed by Wyckoff Principle block 1025, implementing the Wyckoff Principle as described in U.S. Pat. No. 5,828,793.
The result, qToT, denoted 1031 in the figure, is passed through the filter 1040.
Since filter 1040 is operating in lightvectorspace, it is a philter. The result is then converted to an image with a semimonotonic function of nonpositive second derivative 1050, giving an output image 1060.
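The Wyckoff combination of block 1025 can be sketched as a certainty-weighted fusion of the differently exposed pictures; the Gaussian certainty function below is an assumption for illustration, not the form prescribed by the cited patent:

```python
import numpy as np

def wyckoff_combine(images, exposures, f_inverse):
    """Fuse a Wyckoff set into one extended-range estimate q_tot (1031):
    each differently exposed picture estimates q up to its exposure factor,
    and mid-tone values are trusted more than clipped or noisy ones."""
    certainty = lambda v: np.exp(-((v - 0.5) ** 2) / 0.08)  # assumed weighting
    num, den = 0.0, 0.0
    for im, k in zip(images, exposures):
        num = num + certainty(im) * (f_inverse(im) / k)
        den = den + certainty(im)
    return num / np.maximum(den, 1e-9)
```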
Fig. 11 gives an illustration of the Wyckoff principle, and in particular, the fact that taking a plurality of differently exposed pictures gives rise to a decomposition of the light falling on the image sensor into a plurality of collinear lightvectors, denoted by H1, H2, and H3 in this figure.
Typically when practicing the invention, very strong deliberate overexposure is used for at least some of the lightvectors (in greyscale images) or lightmodules (in color images). For example, a photoborg may deliberately overexpose a section of the image and then apply a very strong color such as pure red or pure blue to this lightvector.
Portions spilling over into the adjacent color channels will thus be moderately exposed, or underexposed. Thus there is an inverse Wyckoff effect in the rendering of the lightvector into the sum. Accordingly, Fig. 12a shows this inverse Wyckoff effect, in which a Wyckoff set is captured and passed through a combiner (synthesis) to generate a composite image. The composite image is then split up. This splitting up is inherent in the use of a strongly colored lightmodule coefficient. For example, using a bright red lightmodule coefficient of RGB = [1.0, 0.1, 0.01] will result in an approximation to the Wyckoff effect in the output, in which the blue channel will contain an underexposed version of the image that will show much of the highlight detail in the image, the green channel will contain a moderate exposure, and the green and red together form yellow. The yellow output Y0 will likely have a red halo around it, as blooming into adjacent pixels or sensor elements will be weaker than the central beam, and will thus only expose the red channel.
A ray of really strong white light, W3, will pass through the filter and emerge as white, W0, since it will be strong enough to saturate all three spectral bands of the sensor (assuming a three-band RGB sensor). Although the red component is stronger than the others, all components are strong enough to saturate the respective sensors to their maximum value. The white output W0 will likely have a yellow halo around it, and, further out, a red halo, as light spilling over to adjacent pixels or sensor elements will be weaker than the central beam, and will thus create behaviour similar to that of Y0 further out, and R0 still further out.
It will be understood that to render this kind of effect, it will not be sufficient to just have a normal picture and computationally apply a red virtual filter to it in lightmodulespace, but, rather, it will be preferable to capture a picture of extremely broad dynamic range so that this inverse Wyckoff effect can be synthesized, resulting in a natural looking image in which the red channel is extremely overexposed, the green channel is moderately exposed, and the blue channel is possibly underexposed.
Such a picture will appear white in areas of overexposure, yellow in areas of moderate exposure, and red in areas of weaker exposure (and of course dark red or black in areas of still weaker exposure).
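This inverse Wyckoff rendering of a single strongly colored lightmodule can be sketched as follows, assuming an extended-range greyscale quantity q and a response function f:

```python
import numpy as np

def render_red_lightmodule(q, f, coeff=(1.0, 0.1, 0.01)):
    """Scale an extended-range greyscale lightvector q by a strongly red
    lightmodule coefficient, then clip through the response: the strongest
    regions saturate all three channels (white), moderate regions saturate
    red and green (yellow), and weaker regions expose only red."""
    q_rgb = q[..., None] * np.asarray(coeff)  # (H, W) -> (H, W, 3)
    return np.clip(f(q_rgb), 0.0, 1.0)
```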
Accordingly, an important aspect of the invention is photorendering in lightmodulespace, with lightmodules being derived from a phlashlamp or the like. Another important aspect of the invention is the application of various philters to lightvectors of extended response.
Fig. 13a depicts an EyeTap (TM) flashlamp. The EyeTap flashlamp produces rays of light that effectively emanate from an eye of a photoborg. Light source 1310 is collimated with optics 1320, and aimed at diverter 1340. Diverter 1340 is typically a mirror or beamsplitter. Diverter 1340 creates an image of light source 1310 as if it originated from an eye 1331 of a photoborg. Preferably the light effectively originates from the center of the lens 1330 of the eye 1331.
Optionally, an aiming aid 1350 reflects off the back of beamsplitter or mirror 1340.
If 1340 is a mirror, it should be a two-sided mirror if there is an aiming aid 1350.
Aiming aid 1350 may be an aremac, projector, television, or the like, which serves as a viewfinder for aiming the EyeTap flashlamp apparatus of the invention.
Fig. 13b depicts a wide-angle embodiment in which eye 1331 is a right eye, so optional aiming aid 1350 can extend behind the eye, to the right side of the face of a photoborg using the apparatus of the invention.
Fig. 14a depicts an EyeTap (TM) camera system. An EyeTap camera system provides a camera with effective center of projection co-incident with the center of the lens 1330 of an eye 1331 of the user of the EyeTap camera system. Preferably the EyeTap camera system is wearable.
Rays of light from subject matter 1300 are diverted by diverter 1340, and pass through optics 1313, to form an image on sensor array 1311, which is connected to a camera control unit (CCU) 1312. Preferably diverter 1340 is a beamsplitter so that it does not appreciably obstruct the vision of the user of the apparatus. Optionally, optics 1313 may be controlled by focus control unit (FCU) 1314.
The EyeTap camera system, in some embodiments, may include a second similar apparatus for a second eye of the user. In this way, a binocular video signal may be captured, depicting exactly what the user sees.
The image from the EyeTap camera system may be transmitted as live video to a remote manager so that she can experience what the user experiences. Typically the user is a photoborg, who may also communicate with a remote manager.
Optionally the EyeTap camera system may also include a display means which may show the output of a remote fixed camera at a remote base station.
Fig. 14b depicts an alternate embodiment of the EyeTap camera system. A curved diverter 1341 serves also as at least part of the image forming optics. A satisfactory curved diverter is a toroidal mirror, which forms an image on sensor array 1311 without the need for a separate lens, or with only a small correction lens needed.
Typically, diverter 1341 forms an image with considerable distortion.
Distortion is acceptable, so long as the image is sharp. Distortion is acceptable because CCU 1312 is connected to a coordinate transformation means 1315 which corrects for the distortion. Thus output 1316 is free of distortion notwithstanding distortion that may have been introduced by the use of a curved diverter. Preferably the diverter and sensor array are aligned in such a way as to meet the EyeTap criterion in which the effective location of the camera is the eye 1331 of the user, as closely as possible. The effective center of projection of the camera should match closely with the location of the center of the lens 1330 of the eye 1331. This embodiment of the EyeTap camera can be made with reduced size and weight, and reduced cost.
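The coordinate transformation of block 1315 amounts to an inverse-mapped resampling. A minimal sketch, in which inverse_map is a hypothetical calibrated function (for the particular curved diverter) giving, for each output pixel, the source location in the distorted sensor image; image is assumed (H, W, C):

```python
import numpy as np

def correct_distortion(image, inverse_map):
    """Resample the distorted sensor image so output 1316 is rectilinear,
    using bilinear interpolation at the mapped source coordinates."""
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    sx, sy = inverse_map(xs, ys)                  # float source coordinates
    x0 = np.clip(np.floor(sx).astype(int), 0, w - 2)
    y0 = np.clip(np.floor(sy).astype(int), 0, h - 2)
    fx = np.clip(sx - x0, 0.0, 1.0)[..., None]    # bilinear weights
    fy = np.clip(sy - y0, 0.0, 1.0)[..., None]
    return ((1 - fx) * (1 - fy) * image[y0, x0]
            + fx * (1 - fy) * image[y0, x0 + 1]
            + (1 - fx) * fy * image[y0 + 1, x0]
            + fx * fy * image[y0 + 1, x0 + 1])
```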
Eyewear in which the apparatus of the invention may be built is preferably protective of the eyes against excessive exposure to light, as might happen if a flashlamp of the invention is fired into the eyes of a photoborg, especially when large flashlamps are used to light up tall skyscrapers in a large cityscape. Accordingly, the eyewear should incorporate an automatic darkening feature such as is typical of Quickshade (TM) welding glasses, or Crystal Eyes (TM) 3-D glasses. The automatic darkening feature can either be synched to the flashlamps, or be triggered by photocells or a wearable camera that is part of certain embodiments of the invention.
Fig. 15 shows an embodiment of a finder light or hiding light. The finder light is used to find where the camera is located, particularly when shooting a large cityscape, where the camera might, for example, be located on the roof of a building a few hundred meters away. In this case, a light source 1510 may be remotely activated.
Together with optics 1520 and field of view limiter 1521, a very bright light is produced as rays 1511, which partially pass through a 45 degree beamsplitter and are wasted as rays 1512, and are partially reflected as rays 1513. Alternatively, the light source may be placed next to the camera and facing in the same direction, if the losses of a beamsplitter are unacceptable. A satisfactory light source is a 1000 watt halogen lamp, or arc lamp, which can be detected from among other lights in a large city by way of the fact that a photoborg has remote control of it. Alternatively, lamp 1510 may be a flashlamp that the photoborg can remotely flash, in which case it is also quite visible from a great distance, notwithstanding other bright lights in an urban setting.
In addition to helping to find the camera, the finder light can also be used to determine whether one is within its field of coverage. For this purpose, camera 1500 has the same field of view as the light source, so that one can make this determination.
In some embodiments, barrier 1521 is a colored filter, so that the light appears a different color when one is within the field of view of the camera, but can still be seen when one is outside the camera's field of view.
At close range, the light is strong enough to light up the scene, and thus also functions as a worklight so that a photoborg can see where he or she is going. Preferably, in this use, another worklight, off-axis, is used so that the camera finder light is not left on continuously, which would attract insects toward the camera and degrade the image in the time immediately following the shutting off of light 1510.
As a hiding light, light 1510 can be illuminated so that a photoborg can see whether he or she is casting a visible shadow. A visible shadow, readily seen by the finder light at close range, indicates that the photoborg does not blend into the background (assuming black clothing, which would otherwise blend with a long-range open space behind the photoborg).
Fig. 16 shows the lightsweep apparatus of the invention. A row of lamps (as few as 5 or 7 lamps, but preferably more, such as 16 or 32 lamps) is sequenced as it is moved through space, during which time the shutter of the camera is either held open in a long exposure, or the camera rapidly acquires multiple exposures which are later photoquantigraphically summed. During this time, the lamps on frame 1600 are turned on and off. In the figure, the letter "A" has just been drawn in mid-air by the device, and lamp 1601 is still on, while lamp 1607 has turned off. The path of frame 1600 through space leaves behind a ribbon of light in the photograph. For example, element 1610 persists even though the frame is no longer there.
Typically the device is used with graphics rather than text. For example, a circle may be drawn using sin and cos lookup tables. A solid filled-in circle is often drawn in mid-air, often not directly into the camera, but, instead, pointing away from the camera so that it is only seen indirectly as its effect of illumination. In this way, frame 1600 can be used to synthesize any arbitrary shape of light, such as a softbox in mid air (if a rectangle is chosen), or a light more like an umbrella diffuser if a circle is chosen.
Rather than program the shape of light a priori, it is sometimes preferable to simply sequence the lamps while recording the scene at video frame rates, and then use photoquantigraphic weighted summation, setting weights to zero to achieve the equivalent effect of turning off certain lights.
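A sketch of this weighted summation, where a zero weight is equivalent to having left the corresponding lamp off (q_frames are assumed to be photoquantigraphic quantities, one per recorded frame):

```python
def lightsweep_render(q_frames, weights, f):
    """Weighted photoquantigraphic summation over the recorded frames of a
    lamp sequence; weights of zero simply drop those lamps from the sum."""
    q_total = sum(w * q for w, q in zip(weights, q_frames) if w != 0.0)
    return f(q_total)
```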
Fig. 17a shows a lamp sequencer of the invention, in which processor 1700 captures images from camera 1500 while it controls a sequence of lamps 1701, 1702, 1703, .... Subject matter 1720 may be a person or include people, in which case lamps 1701, 1702, 1703, ... are preferably flashlamps and camera 1500 is preferably a high-speed video camera; or subject matter 1720 may be a still life scene, in which case lamps 1701, 1702, 1703, ... may be ordinary tungsten lamps or the like, and camera 1500 an ordinary digital still camera or the like.
In the former case, wires 1730 are flash sync cables, while in the latter case, wires 1730 may be the actual power cords for the lamps. In either case, no preparation of lamps is needed and ordinary lamps may be used. Thus the innovation of the invention is in processor 1700, which may include a computer-controlled multichannel light dimmer, or a flash sequencer.
In the situation illustrated here, five pictures of the same subject matter are captured, and in each of the five pictures, the subject matter is differently illuminated.
These five pictures are then passed to a lightspace rendering program which allows for the generation of a lightmodule painting. Typically in a studio setting, there are preferred default settings for the lightvectors. For example, the lightmodule weight for the picture corresponding to lamp 1703 is typically set to blue, and split diffusion is used to run it through a photoquantigraphic blurring filter prior to the computation of a photoquantigraphic sum.
The apparatus of Fig. 17a may be used for the production of still pictures or for the production of motion pictures. When still pictures are being produced, ordinarily the lamps are sequenced through only once. When a motion picture is being produced, camera 1500 is a high speed motion picture camera, and the lamps are sequenced through periodically. In this example, since there are five lamps, the motion picture camera must shoot at a frame rate or field rate at least five times the desired output frame rate or field rate. For example, if we desire a motion picture shot at 24 frames per second, then the motion picture camera must shoot at least 120 frames per second.
Each set of five pictures, corresponding to one cycle around the lamps, is used to photoquantigraphically render a single frame of output picture.
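The grouping of high-speed frames into rendered output frames can be sketched as follows (here 5 lamps at a desired 24 frames per second output implies a camera rate of at least 5 x 24 = 120 frames per second); the function name is illustrative:

```python
def render_motion_picture(q_frames, weights, f, n_lamps=5):
    """Group the high-speed camera's frames into cycles of n_lamps exposures
    and photoquantigraphically render one output frame per cycle."""
    output = []
    for i in range(0, len(q_frames) - n_lamps + 1, n_lamps):
        cycle = q_frames[i:i + n_lamps]  # one differently-lit exposure per lamp
        output.append(f(sum(w * q for w, q in zip(weights, cycle))))
    return output
```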
In the case of motion pictures, camera 1500 may be mobile, if desired, and lamps 1701, 1702, 1703 may also be mobile, if desired. In this case, preferably motion picture camera 1500 will be an even higher speed camera than strictly necessary. For example, if it is a 240 frames per second camera, it can cycle through all five lights, and then wait a brief interval, before cycling through once again. In this way, there are fewer misregistration artifacts. Additionally, or alternatively, a registration algorithm can be applied to the images from camera 1500 to compensate for the fact that the subject matter may have changed slightly from the time the first lamp 1701 was fired to the time the fifth lamp was fired.
Fig. 17b shows a system in which a lightvectorspace is generated without explicit use of any controller. Instead, special flashlamps, 1801, 1802, 1803, ..., are used.
These are all connected directly to the camera. If camera 1500 is a video camera, all the flashlamps may be supplied with the video signal to lock onto. If the camera 1500 is a still picture camera, then all the flashlamps may simply receive a flash sync signal from the camera.
Wires 1730 from each of the flashlamps are connected to an output 1840 of camera 1500. Alternatively, some of the flashlamps may be daisy chained to others, e.g. by connections 1831, since all the flashlamps only need to be connected in parallel, and no longer need separate connections to any central controller. Alternatively the connection may be wireless, and each flashlamp may act as a slave unit.
Ordinarily, in the prior art, all flashlamps would fire simultaneously when camera 1500 took a picture. However, in the context of the present invention, flashlamps 1801, 1802, 1803, ... are special flashlamps that can be set to respond only to every Nth pulse, starting at pulse M, where N and M are user-selectable. In this case, all flashlamps may be set to N=5 when we are using 5 flashlamps. In general, N is set to the desired number of lightvectors. Then the values for M are selected, e.g. lamp 1801 is set to M=1, lamp 1802 to M=2, and so on. Thus lamp 1801 fires once every 5 pulses, starting on the first pulse; lamp 1802 fires once every 5 pulses, starting on the second pulse; and so on.
The number of lamps can be greater than the number of lightvectors. For example, we may set N=2 on each lamp, so that, for example, three of them will fire on even numbered pulses and the other two will fire on odd numbered pulses. This setting would give us a two-dimensional lightvectorspace.
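The firing rule of such a flashlamp reduces to a simple pulse divider; a sketch, using the description's convention that pulses are counted from 1 and M selects the starting pulse:

```python
def should_fire(pulse_index, n, m):
    """Fire on every nth sync pulse starting at pulse m, matching the text:
    n=5, m=2 fires on pulses 2, 7, 12, ..."""
    return pulse_index >= m and (pulse_index - m) % n == 0
```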
In any case, the novelty is in the design of the flashlamps when using the system depicted in Fig. 17b. Thus the camera can be an ordinary camera, and the user simply purchases the desired number of special flashlamps of the invention. Since many flashlamps already have menus and means for adjusting and programming various settings, it is not hard to manufacture flashlamps with the capabilities of the invention.
Ideally the flashlamps may each contain a slave sensor, infrared sensor, radio receiver, or the like, so that they can operate within the context of the invention without the need for wires connecting them. If wires are to be used to power the separate lamps, the necessary synchronization signals may be sent over these power lines.
Fig. 18 shows a typical session using the lightspace rendering (photoquantigraphic rendering) system. Preferably this system is on the Internet so that it can be accessed by any of the photoborgs by way of a WearComp (wearable computer) system. Additionally, one or more remote managers can also visit this site. Accordingly, a preferred embodiment is a WWW page implementation.
Here ten pictures are shown on a WWW browser 1800. These ten have been selected from a set of 1000 pictures, by visiting another WWW page upon which a selection process is done. All of the photoborgs have agreed that these ten images are the ones they wish to use to make the final rendering. All of these ten images 1810 are pictures of the same subject matter under different illumination.
Below each image 1810 is a set of controls 1820. These controls include a Y channel selector 1830 for greyscale pseudocolor modulespace selection, together with three color sliders 1840, an overall weighting 1850, and a focus adjust 1860. Focus adjust 1860 blurs or sharpens the image photoquantigraphically.
To observe the output, another WWW page is visited. Each time that page is reloaded, the photorendering is produced according to the weights set here in 1800.
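A minimal sketch of such a reload-driven rendering step, in Python; the control dictionary keys are illustrative assumptions, not the disclosed implementation:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def render_from_controls(q_list, controls, f):
    """Recompute the photorendering from the per-image control settings each
    time the output page is reloaded."""
    total = 0.0
    for q, c in zip(q_list, controls):
        rgb = np.asarray(c["color"]) * c["weight"]   # sliders 1840, weight 1850
        q_i = q[..., None] * rgb                     # greyscale q -> lightmodule
        if c["blur_sigma"] > 0:                      # focus adjust 1860
            q_i = gaussian_filter(q_i, (c["blur_sigma"], c["blur_sigma"], 0))
        total = total + q_i
    return f(total)
```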
OTHER EMBODIMENTS

From the foregoing description, it will thus be evident that the present invention provides a design for a system that uses a plurality of pictures or exposures to produce a picture that is improved, or where there is some extended expressive or artistic capability. As various changes can be made in the above embodiments and operating methods without departing from the spirit or scope of the following claims, it is intended that all matter contained in the above description or shown in the accompanying drawings be interpreted as illustrative and not in a limiting sense.
Variations or modifications to the design and construction of this invention, within the scope of the appended claims, may occur to those skilled in the art upon reviewing the disclosure herein. Such variations or modifications, if within the spirit of this invention, are intended to be encompassed within the scope of any claims to patent protection issuing upon this invention.

Claims (1)

    CLAIMS
    The embodiments of my invention in which I claim an exclusive property or privilege are defined as follows:
    1. A cybernetic photography system, said cybernetic photography system including: a camera to be placed at a fixed location; an inbound channel, said inbound channel for carrying information from at least one photoborg to said camera; an outbound channel, said outbound channel for carrying information from said camera back to said at least one photoborg; at least one source of illumination, said source of illumination being one of: - a hand-holdable light source carryable by said photoborg; and - a wearable light source wearable by said photoborg, where said cybernetic photography system further includes remote activation means of said camera, said remote activation means operable by said photoborg, and synchronization means, said synchronization means comprising means of causing said source of illumination to produce light during the time interval in which said camera becomes sensitive to light.
    2. A cybernetic photography system as described in Claim 1 where said remote activation means includes a switch affixed to said source of illumination, where said switch provides said photoborg with a means of repeated activation of said camera.
    3. A cybernetic photography system as described in Claim 2, where said source of illumination is an electronic flashlamp.
    4. A cybernetic photography system as described in Claim 1 where said outbound channel includes a machine-readable signal sent to at least one WearComp wearable by said at least one photoborg.
    5. A cybernetic photography system as described in Claim 1, further including a display, said display for viewing by said photoborg, and said display being responsive to an output of said camera.
    6. A cybernetic photography system as described in Claim 5 where said display means includes means of displaying a result of a photoquantigraphic summation.
    7. A cybernetic photography system as described in Claim 5, in which said display means is worn by said photoborg.
    8. A cybernetic photography system as described in Claim 5, in which said display means is affixed to said source of illumination.
    9. A cybernetic photography system as described in Claim 5 including means of updating an image displayed on said display means each time said camera is activated.
    10. A cybernetic photography system as described in Claim 9 where said means of updating said image includes the computation of a photoquantigraphic quantity q(x, y) determined by applying the inverse response function of said camera to a picture output from said camera.
    11. A cybernetic photography system as described in Claim 9 where said means of updating said image includes the computation of a photoquantigraphic sum from a picture taken when said camera is activated and at least one other picture taken during previous times said camera was activated.
    12. A cybernetic photography system as described in Claim 9 where said means of updating said image includes the computation of a photoquantigraphic vectorspace from a picture taken when said camera is activated and at least one other picture taken during previous times said camera was activated.
    13. A photorendering system for computing an output picture from a plurality of input pictures, said plurality of input pictures having been derived from the same subject matter under differing illumination, said photorendering system including the steps of: computation of photoquantigraphic quantities q1, q2, ..., for each of said input pictures; computation of a weighted sum, q = w1q1 + w2q2 + ....
    14. A cybernetic photography system as described in Claim 1 further including a photorendering system as described in Claim 13.
    15. A cybernetic photography system as described in Claim 5, further including a photorendering system as described in Claim 13 where said display means includes means of displaying said output picture.
    16. A cybernetic photography system as described in Claim 9, further including a photorendering system as described in Claim 13 where said means of updating an image comprises means of displaying said output picture, where said output picture is computed from a picture taken when said camera is activated and at least one other picture taken during previous times said camera was activated.
    17. A cybernetic photography system as described in Claim 9 including a method of updating said image comprising steps of: determining from said camera a spatially varying quantity linearly proportional to the photoquantigraphic quantity, q(x, y), over spatial coordinates (x, y) of light falling on the image plane or image sensor of said camera for each of a plurality of exposures; computing a weighted sum q(x, y) over said plurality of exposures, said weighted sum being given by q(x, y) = w1q1(x, y) + w2q2(x, y) + ...; applying an essentially semi-monotonic transfer function, f(q), to said sum, q(x, y), to obtain a picture f(q(x, y)), where said essentially semi-monotonic transfer function, f(q), has essentially semi-monotonic slope; displaying said picture f(q(x, y)) on said display.
    18. Means and apparatus as described in Claim 1 where said outbound channel comprises a human-readable signal that indicates when the camera has become sensitive to light, and where said signal comprises at least one of: an audible signal reproduced from a signal sent via said outbound channel; an audible signal produced in the vicinity of said camera, where said outbound channel comprises the ability of sound to travel through the air between said camera and said photoborg; a vibrotactile signal perceptible by said photoborg; direct electrical stimulation of the body of said photoborg; a visual signal visible to said photoborg; a negated visual signal, said negated visual signal comprising a lamp turning off while said camera is sensitive to light; an acknowledgement provided by virtue of the display of an image, transmitted over said outbound channel, where said image is responsive to an output of said camera.
    19. A photography system including a flashlamp, where said flashlamp may be held by a hand of a user of said flashlamp and where said photography system also includes a radio transmitter and a radio receiver borne by said flashlamp.
    20. A cybernetic flashlamp including means of repeatedly activating a remote camera, where said cybernetic flashlamp further includes means of synchronization with said remote camera.
    21. A cybernetic flashlamp including: a light source; an activator for signalling a remote camera to take an exposure each time said activator is activated; a synchronizer for flashing said light source in synchronism with exposures of said camera.
    22. The flashlamp of Claim 21 including means to disable said light source, while leaving said activator enabled.
    23. A cybernetic flashlamp as described in Claim 21 further including means of determining the field of coverage of the illumination of said cybernetic flashlamp.
    24. A cybernetic flashlamp system as described in Claim 21, together with remote control means for positioning at least one of a variety of remotely selectable filters in front of or within said camera, where said means of remote control is operable by said photoborg.
    25. A cybernetic flashlamp system as described in Claim 21, said cybernetic flashlamp system including a plurality of pushbuttons borne by said source of illumination, and means for remote activation of said camera by pressing at least one of said pushbuttons, where at least three of said pushbuttons select from at least three colors, said colors being either of: the color of a filter as described in Claim 24; and the color assigned to a lightmodule in a photorendering.
    26. A cybernetic photography system as described in Claim 1 including means for said photoborg to specify a color choice together with each of a plurality of lightstrokes acquired by said camera, where said color choices may be used in a photorendering process as described in Claim 13.
    27. A cybernetic photography system including: a camera for placement at a fixed location; a portable light source; a portable user actuator which, when actuated by a user, sends a signal to said camera causing said camera to take an exposure; means to synchronize said light source with said camera such that said light source flashes when said camera takes an exposure.
    28. The system of Claim 27 wherein said portable user actuator and said portable light source comprise an integral unit.
    29. The system of Claim 27 wherein said portable user actuator is voice actuated.
    30. A cybernetic photography system, said cybernetic photography system including: a camera to be placed at a fixed location; at least one source of illumination, said source of illumination being one of: - a hand-held light source carryable by a photoborg; and - a wearable light source wearable by a photoborg, where said cybernetic photography system includes means of taking a plurality of pictures while said photoborg directs said source of illumination at different portions of subject matter in view of said camera, and where said source of illumination is activated in synchronization with at least some of said plurality of pictures.
    31. A cybernetic photography system as described in Claim 27, where said camera includes means of detecting that subject matter in view of said camera is being illuminated with said source of illumination.
    32. A cybernetic photography system as described in Claim 27, where said camera takes at least one picture of said subject matter with no use of said source of illumination, and then where said cybernetic photography system uses said at least one picture of said subject matter to compare with further pictures of said subject matter to determine whether or not said subject matter is being illuminated with said source of illumination.
    33. A cybernetic photography system as described in Claim 32 where said cybernetic photography system includes means of recording pictures that are determined to have been pictures of said subject matter illuminated with said source of illumination, and not recording pictures that are determined to have been pictures of said subject matter not illuminated with said source of illumination.
    34. A cybernetic photography system as described in Claim 27 where said camera is a video camera, and where said source of illumination flashes repeatedly at the frame rate of said video camera.
    35. A cybernetic photography system as described in Claim 34, including means of turning said source of illumination on and off, where said source of illumination produces repeated rapid bursts of light when it is turned on, and no light when it is turned off, and where said video camera records while said source of illumination is turned on, and stops recording during at least some of the time for which said source of illumination is turned off.
    36. A cybernetic photography system, said cybernetic photography system including: a lock-in camera to be placed at a fixed location; at least one source of illumination, said source of illumination being one of: - a hand-held light source carried by said photoborg; and - a wearable light source worn by said photoborg, where said source of illumination produces a periodically varying level of intensity, and where said cybernetic photography system includes means of taking at least one picture with said lock-in camera.
    37. A phlashlamp, where said phlashlamp includes means of producing at least three flashes of different strengths in rapid succession.
    38. A phlashlamp photography system, including a phlashlamp as described in Claim 37, where said phlashlamp includes remote control means for a camera, said remote control means including means for taking at least three pictures in rapid succession, where said at least three pictures are pictures of the same subject matter exposed to different quantities of light.
    39. A cybernetic photography system, including at least one flashlamp, where said flashlamp is at least one of: wearable; and hand-holdable, and where said cybernetic photography system includes means of producing a plurality of flashes of light in rapid succession, where said cybernetic photography system further includes means of remotely activating a camera to take a plurality of pictures in rapid succession, where at least some of said pictures are pictures of subject matter that has been affected by at least one of said flashes of light.
    40. A cybernetic photography system including a photorendering system as described in Claim 13 where said cybernetic photography system further includes a virtual control panel presented upon a video display means, and where said virtual control panel comprises lightmodule weight selection means.
    41. A cybernetic photography system as described in Claim 40, where said virtual control panel is operable by a Web browser, running on a computer connected to the Internet.
    42. A cybernetic photography system as described in Claim 40, further including the features of Claim 27, where said video display means is viewable by said photoborg.
    43. A cybernetic photography system including color coordinate transformation means, together with brightgrey warning means, said brightgrey warning means including means of indicating image areas that correspond to regions of colorspace at the gamut boundary of the domain of said color coordinate transformation, but not at the gamut boundary of the range of said color coordinate transformation.
    44. A cybernetic photography system including lightspace rendering means, and color coordinate transformation means in lightspace coordinates, together with brightgrey reduction means, said brightgrey reduction means including means of identifying regions of colorspace at the gamut boundary of the domain of said color coordinate transformation, but not at the gamut boundary of the range of said color coordinate transformation, said cybernetic photography system including means of adjusting said color coordinate transformation means to reduce the amount of brightgrey image content.
    45. A cybernetic photography system as described in Claim 44, where said lightspace rendering is from a plurality of lightmodules, said brightgrey reduction means including means of identifying brightgrey regions in each of said plurality of lightmodules, and where said adjustment of said color coordinate transformation includes separate color coordinate transformations in each of said lightmodules, said separate color coordinate transformations including at least one of: deliberate distortion of color hue to reduce the amount of brightgrey contribution; and deliberate destruction of highlight detail by clipping, to reduce the amount of brightgrey contribution.
    46. A cybernetic photography system including color coordinate transformation means, together with brightgrey reduction means, said brightgrey reduction means including means of identifying regions of colorspace at the gamut boundary of the domain of said color coordinate transformation, but not at the gamut boundary of the range of said color coordinate transformation, said cybernetic photography system including means of adjusting said color coordinate transformation means to reduce the amount of brightgrey image content.
    47. A cybernetic photography system as described in Claim 46, where said adjustment of said color coordinate transformation includes at least one of: deliberate distortion of color hue; and deliberate destruction of highlight detail by clipping.
    48. A cybernetic photography system as described in Claim 5 where said display means includes inverse gamut warning means, where said inverse gamut warning means includes means of indicating image areas of a photoquantigraphic summation that correspond to regions of colorspace at the boundary of a domain gamut but not at the boundary of a range gamut.
    49. A method for facilitating combining pictures of a given scene or object, comprising: capturing photoquantigraphic quantities, q1, q2, ..., one from each of a plurality of pictures of said given scene or object, at least some of said pictures taken under different illuminations.
    50. The method of Claim 49, further comprising: computing a weighted sum from said photoquantigraphic quantities, said weighted sum given by q = w1q1 + w2q2 + ...
    51. A photorendering method for computing an output picture from a plurality of input pictures, said plurality of input pictures having been derived from the same scene or object under differing illumination, said photorendering method including: capturing photoquantigraphic quantities q1, q2, ..., one from each of said plurality of input pictures; computing a weighted sum from said photoquantigraphic quantities, said weighted sum given by q = w1q1 + w2q2 + ...; colorspace coordinate transformation of said weighted sum q.
    52. A photorendering system as described in Claim 51 further including gamut boundary management means.
    53. A photorendering system as described in Claim 51 where the range of said colorspace coordinate transformation is a CMYK colorspace, and where said colorspace coordinate transformation is followed by steps that include the steps of: application of a pointwise nonlinearity, where said pointwise nonlinearity is semimonotonically increasing, and where the slope of said pointwise nonlinearity is semimonotonically decreasing; quantization.
    54. A cybernetic photography system as described in Claim 27, where said source of light is wearable.
    55. A cybernetic photography system as described in Claim 27, including optics to effectively locate said source of light near the center of a lens of the eye of said photoborg.
    56. A cybernetic photography system as described in Claim 27, where said source of light is a phlashlamp, and where said cybernetic photography system includes optics to effectively locate said source of light near the center of a lens of the eye of said photoborg.
    57. A wearable photography apparatus, comprising: headgear; a camera borne by said headgear; optics borne by said headgear and arranged to locate the effective center of projection of said camera near the center of a lens of the eye of the wearer of said wearable photography apparatus.
    58. A wearable photography apparatus as described in Claim 57, where said headgear is a pair of eyeglasses.
    59. A wearable cybernetic photography apparatus including a camera, together with optics to effectively locate said camera near the center of a lens of the eye of the wearer of said wearable cybernetic photography apparatus.
    60. A cybernetic photography apparatus, including a camera, where said cybernetic photography apparatus is wearable, and where said camera has an effective center of projection in or near the lens of an eye of the wearer of said cybernetic photography apparatus.
    61. A cybernetic photography apparatus as described in Claim 57, where said camera is a left camera with effective center of projection in the left eye of the wearer of said cybernetic photography apparatus, and where said cybernetic photography apparatus further includes a right camera with effective center of projection in the right eye of said wearer.
    62. A cybernetic photography system as described in Claim 1, together with a source of illumination providing sustained light output where said sustained light output includes means of adjusting a light output level of said source of illumination and where said outbound channel comprises at least one of the following: an audible tone, originating from the vicinity of said camera, but loud enough to be heard some distance away from said camera; an audible tone broadcast to said photoborg, and reproduced by an audible transducer on said photoborg's person; a vibrotactile signal reproduced within said source of illumination and felt through a handle or other point of contact between said source of illumination and a photoborg's body; a vibrotactile signal reproduced by a wearable computer or communications apparatus upon the body of a photoborg, and where said cybernetic photography system further includes remote control means to repeatedly remotely activate said camera, so that a plurality of long exposure pictures can be taken of subject matter with different illumination in each of said plurality of long exposure pictures.
    63. Means and apparatus as described in Claim 62 where said remote control means is operable by a switch affixed to said source of light.
    64. Means and apparatus as described in Claim 62 where said means of adjusting light output level includes a switch, selecting from among two light output levels, and where said switch also operates said remote control means.
    65. Means and apparatus as described in Claim 62 where said means of adjusting light output level includes a spring-loaded lever, and where said spring-loaded lever also operates said remote control means.
    66. Means and apparatus as described in Claim 62 where said means of adjusting said light output level includes a squeezable spring-loaded trigger and where said spring-loaded trigger also operates said remote control means when it is squeezed beyond a certain threshold.
    67. Means and apparatus as described in Claim 62 where said outbound channel provides a confirmation that said camera has become sensitive to light, and an indication of how long said camera remains sensitive to light.
    68. A source of illumination, as outlined in Claim 62, together with a data entry device where said data entry device issues a command via said inbound channel to activate said camera in a manner in which information is also passed to said camera to select, specify, or continuously vary during an exposure at least one of: aperture of said camera; degree of sensitivity or gain of said camera; shutter speed of said camera; degree of openness of shutter of said camera; focus of said camera; degree of filtration applied optically or electronically to said camera affecting spectral sensitivity of said camera; degree of filtration applied optically or electronically to said camera affecting sharpness or clarity of said camera.
    69. A source of illumination as outlined in Claim 62, together with an input device, where said input device includes at least one spring-loaded switch and where said switch operates said remote control means, and where said camera remains sensitive to light for as long as said switch is depressed.
    70. A cybernetic photography system as described in Claim 27, where said portable light source comprises a pushbroom light, said pushbroom light including a plurality of light emitting elements of separately controllable intensity mounted to a frame such that a photoborg may grasp said frame and move about with it, said cybernetic photography system including means of dynamically varying the output level of each of said plurality of light emitting elements.
    71. A cybernetic photography system as described in Claim 27, further including worklights, said worklights allowing said photoborg to see, said cybernetic photography system further including means of turning off said worklights during said time interval in which said camera becomes sensitive to light.
    72. A cybernetic photography system as described in Claim 1, further including room light controlling means in a working environment such as a photographic studio, where the room lighting itself may be controlled by an electric circuit, said room light controlling means including means of automatically turning said room lighting off during at least one time interval in which said camera becomes sensitive to light.
    73. A cybernetic photography system as described in Claim 1, further including at least one indicator lamp fixed in the vicinity of said camera, said indicator lamp viewable by said photoborg when said photoborg is within the field of view of said camera.
    74. A cybernetic photography system as described in Claim 1, further including at least one indicator means by which said photoborg can determine whether or not said photoborg is within the field of view of said camera.
    75. A cybernetic photography system as described in Claim 1, further including at least one indicator light source fixed in the vicinity of said camera, said indicator light source having an attribute viewable by said photoborg when said photoborg is within the field of view of said camera, and said attribute of said light source not viewable by said photoborg when said photoborg is not within the field of view of said camera.
    76. A cybernetic photography system as described in Claim 75, in which said attribute is a color of said light.
    77. A cybernetic photography system as described in Claim 27, where said cy bernetic photography system further includes a hiding test light, and remote activation means of said hiding test light operable by said photoborg.
    78. Apparatus for processing a plurality of exposures of the same scene or object, comprising: image buffers each for storing one of said plurality of exposures, means for obtaining photoquantigraphic quantities, one for each of said plurality of exposures; and means for producing a weighted photoquantigraphic summation of said photoquantigraphic quantities.
    79. A cybernetic photography system for acquiring a plurality of pictures of the same subject matter under differently illuminated conditions, said cybernetic photography system including a fixed camera and a plurality of flashlamps, together with means for sequentially activating each of said flashlamps each time one of said plurality of pictures is taken, each of said flashlamps being activated during the time interval in which said camera is sensitive to light.
    80. A cybernetic photography system, as described in Claim 79, including means of sequentially firing a plurality of flashlamps, sequencing from one of said flashlamps to the next at a video rate, and where said camera is a video camera, and where said cybernetic photography system further includes means of recording video output from said video camera.
    81. Means and apparatus as described in Claim 78 where, prior to computing said weighted photoquantigraphic summation, at least some of said exposures may be photoquantigraphically blurred.
    82. The apparatus of Claim 5, where said display is a first display for viewing by a photoborg, and further including at least a second display for viewing by a second photoborg, said second display also being responsive to an output of said camera.
    83. A game or amusement device as described in Claim 82 in which a plurality of photoborgs each have one or more wearable light sensitive devices.
    84. A controller for a camera and at least one light source, such that said camera acquires a pair of pictures in rapid succession with at least one picture acquired with illumination from said light source, and at least one other picture acquired without illumination from said light source.
    85. An apparatus which includes a camera and light source, where said apparatus includes means for acquiring a pair of images in rapid succession where one image is acquired with greater influence from said light source than the other image, and where said influence is judged in comparison to a somewhat constant degree of illumination which is external and not controllable by the apparatus.
    86. An apparatus which includes a camera and a plurality of light sources, where said apparatus includes means of acquiring a plurality of images where said images differ primarily in the relative amount of influence that each of said plurality of light sources has had on each of said images.
    87. A flashlamp for use in production of lightvectorspaces, where said flashlamp includes a synchronization input, where said flashlamp is responsive only to every nth signal received by said synchronization input, and where the first m < n synchronization signals are ignored by said flashlamp, and where m and n are user selectable.
    88. A flashlamp as described in Claim 87, where n may be set to 2, and where m may be set to 0 or 1, so that when m = 0 said flashlamp fires on odd-numbered synchronization signals, and when m = 1 said flashlamp fires on even-numbered synchronization signals.
    ... application of a pointwise nonlinear function to each of said images, where said function is approximately monotonically increasing and has an approximately monotonically increasing slope; pointwise addition of the results obtained from the above step; applying a different pointwise nonlinearity to said sum, where said different pointwise nonlinearity is approximately monotonically increasing and has an approximately monotonically decreasing slope.
    92. An apparatus including a motion picture camera with film, videotape, or electronic recording medium, and a plurality of light sources, where said apparatus includes means of sequentially activating said light sources so that each of said light sources periodically flickers or flashes with a period that is an integer fraction of the frame-rate or field-rate of said camera, and where said various light sources reach their peak levels of illumination at different times.
    93. Means and apparatus including a motion picture camera and a light source fixed to said camera, together with means of controlling said light source in such a manner that it flashes periodically with a period of one half the field rate or frame rate of said motion picture camera, such that said light source affects even frames or fields to a different degree than it affects odd frames or fields, and where said means and apparatus also includes means of producing a new image sequence from the image sequence acquired by said camera, where said new image sequence is made at half the field or frame rate of the original image sequence by pairwise processing of adjacent pairs of pictures, said pairwise processing including at least one of the following: a photoquantigraphic summation; an implementation of split diffusion in lightvectorspace; calculation of a photoquantigraphic vectorspace.
    94. A cybernetic hand-held flashlamp including means of synchronizing the flash from said flashlamp with a remote camera, and further including means of repeatedly activating said remote camera.
    95. An apparatus which comprises a means of activating a fixed camera by a remote control attached to a hand-held flash unit, and where each time said remote control is activated, said camera briefly admits light to an image recording medium and said flash unit is activated by said apparatus with the correct timing such that said flash unit illuminates at least a portion of the subject matter of said camera during the brief time that said camera admits light to said image recording medium.
    96. An apparatus which includes an electronic camera at a fixed location, and a hand-held light source, where said camera is capable of capturing and integrating a plurality of image captures together into a single image, and where said integration is initiated and terminated using a remote control located in the vicinity of said light source.
    97. A cybernetic flashlamp where said cybernetic flashlamp includes a viewfinder means through which a user may look to determine the extent of illumination of said flashlamp.
    98. A lightsweep where said lightsweep includes a frame upon which a plurality of light sources is mounted, and means to vary the quantity of illumination produced by each of said light sources through a data entry device affixed to said lightsweep.
    99. A camera for use at a fixed location, including a visual indication means by which a person may discern whether or not he or she is within the field of view of said camera, where said means includes sources of light visible from a distance of at least 1000 meters from said camera.
GB9902328A 1998-02-02 1999-02-02 Means and apparatus for acquiring, processing and combining multiple exposures of the same scene or objects to different illuminations Expired - Fee Related GB2334591B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0203423A GB2370958B (en) 1998-02-02 1999-02-02 Means and apparatus for acquiring, processing and combining multiple exposures of the same scene or objects to different illuminations

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CA002228403A CA2228403A1 (en) 1998-02-02 1998-02-02 Means and apparatus for aquiring, processing, and combining multiple exposures of the same scene or objects to different illuminations

Publications (3)

Publication Number Publication Date
GB9902328D0 GB9902328D0 (en) 1999-03-24
GB2334591A true GB2334591A (en) 1999-08-25
GB2334591B GB2334591B (en) 2002-06-05

Family

ID=4162052

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9902328A Expired - Fee Related GB2334591B (en) 1998-02-02 1999-02-02 Means and apparatus for acquiring, processing and combining multiple exposures of the same scene or objects to different illuminations

Country Status (2)

Country Link
CA (1) CA2228403A1 (en)
GB (1) GB2334591B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7035534B2 (en) 2004-06-16 2006-04-25 Eastman Kodak Company Photographic lightmeter-remote, system, and method

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SE533375C2 (en) 2008-03-12 2010-09-07 Mindy Ab Device and method of digital photography
US11892751B2 (en) * 2020-09-18 2024-02-06 Robert House System and method for product photography

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1558839A (en) * 1976-01-23 1980-01-09 Bron Elektronik Ag Remote control device for photographic flash unit
US4368966A (en) * 1980-09-24 1983-01-18 Nippon Kogaku K.K. Photographic system including remote controllable flash unit
US4754295A (en) * 1986-03-20 1988-06-28 Scott T D Camera flash attachment switch
US4920371A (en) * 1987-04-27 1990-04-24 Fuji Photo Film Co., Ltd. Camera control device

Also Published As

Publication number Publication date
GB2334591B (en) 2002-06-05
CA2228403A1 (en) 1999-08-02
GB9902328D0 (en) 1999-03-24

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20170202