GB2463375A - Colour editing using algorithm stored in image data - Google Patents

Colour editing using algorithm stored in image data

Info

Publication number
GB2463375A
GB2463375A GB0916127A GB0916127A GB2463375A GB 2463375 A GB2463375 A GB 2463375A GB 0916127 A GB0916127 A GB 0916127A GB 0916127 A GB0916127 A GB 0916127A GB 2463375 A GB2463375 A GB 2463375A
Authority
GB
United Kingdom
Prior art keywords
algorithm
image data
data stream
image
colour
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0916127A
Other versions
GB0916127D0 (en)
Inventor
Stephen David Brett
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pandora International Ltd
Original Assignee
Pandora International Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pandora International Ltd filed Critical Pandora International Ltd
Publication of GB0916127D0 publication Critical patent/GB0916127D0/en
Publication of GB2463375A publication Critical patent/GB2463375A/en
Withdrawn legal-status Critical Current

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N1/32101Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00127Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N1/00281Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a telecommunication apparatus, e.g. a switched network of teleprinters for the distribution of text-based information, a selective call terminal
    • H04N1/00283Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a telecommunication apparatus, e.g. a switched network of teleprinters for the distribution of text-based information, a selective call terminal with a television apparatus
    • H04N1/00286Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a telecommunication apparatus, e.g. a switched network of teleprinters for the distribution of text-based information, a selective call terminal with a television apparatus with studio circuitry, devices or equipment, e.g. television cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N1/32101Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N1/32128Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title attached to the image data, e.g. file header, transmitted message header, information on the same page or in the same computer file as the image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46Colour picture communication systems
    • H04N1/56Processing of colour picture signals
    • H04N1/60Colour correction or control
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/73Colour balance circuits, e.g. white balance circuits or colour temperature control
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3225Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
    • H04N2201/3242Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document of processing required or performed, e.g. for reproduction or before recording
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3225Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
    • H04N2201/3256Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document colour related metadata, e.g. colour, ICC profiles
    • H04N2201/3259Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document colour related metadata, e.g. colour, ICC profiles relating to the image, page or document, e.g. intended colours
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3274Storage or retrieval of prestored additional information

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Processing (AREA)

Abstract

A method of storing an image appearance alteration for altering the appearance of an image 22, such as color corrections, in an input image data stream 20 to produce an output image data stream 40 comprising: defining the image appearance alteration as an algorithm, which can comprise discontinuous functions or conditional constructs; and storing the algorithm as metadata 24 in association with a part of the input image data stream. The image appearance alteration may then be applied to the image data by a method comprising the steps of: retrieving an image appearance alteration algorithm from the input image data stream 20; retrieving the image data from the input image data stream; and generating an output image data stream 40 by applying the image appearance alteration algorithm to the input image data.

Description

INTELLECTUAL PROPERTY OFFICE
Application No. GB0916127.4 RTM Date: 16 December 2009
The following terms are registered trademarks and should be read as such wherever they occur in this document: Xilinx, Altera
Intellectual Property Office is an operating name of the Patent Office www.ipo.gov.uk

Colour Editing

In entertainment media it is often desired to alter the appearance or look of motion image sequences. For example, film may be shot on a dull day, but the story to be told requires the scene to be in sunshine. Another example is where a scene is shot on video for logistical reasons in the day, but the plot requires this scene to be at night. A further scene may be shot where a car is red, whereas the story requires a green car.
Many techniques are known to accomplish these appearance changes, dating from photographic changes made in the development or print processing, through to the analogue electronic systems such as those launched by Da Vinci in 1978. These analogue systems provided alteration facilities for any one of six colours: Red, Green, Blue, Cyan, Magenta, or Yellow. This was facilitated by having six analogue processing channels, one for each of the colours.
With the advent of high speed digital computing, these processors became digital.
One of the first of these devices was the DCP (Digital Colour Processor) produced by the applicant in 1992. This device is well described in GB 2278514. The advantages of digital processing were very popular: precise replicability with none of the 'drift' inherent in analogue processing.
Meanwhile, post-production boomed. It was found to be much more cost effective to 'fix it in post' than to shoot the correct material. As post-production systems became more versatile, more productive, and less expensive, this trend continued. With many vendors of processing equipment now available, it became important to end users that all of this equipment be interoperable, and pressure was exerted on the manufacturers to achieve this. The single most important piece of interoperability was to define lists, primarily for editing. This led to the definition of 'Edit Decision Lists' (EDLs).

An Edit Decision List or EDL is a way of representing a film or video edit. It contains an ordered list of reel and time-code data representing where each clip can be obtained in order to conform (or assemble) the final cut of the work being produced.
EDLs are often created by offline editing systems, or can be paper documents constructed by hand. These days, linear editing systems have been superseded by non-linear editing systems which can output EDLs electronically to allow autoconform on an online editing system: the recreation of an edited programme from the original sources and the editing decisions in the EDL. With the trend to store and process image sequences digitally, EDLs are now often used in the digital video editing world, so rather than referring to reels they can refer to sequences of images stored on disk. EDLs can refer not just to 'cuts' from one image sequence to another, but also to fades and 'transitions' (for example 'wipes' between scenes).
Whilst several manufacturers extended the EDL formats to contain proprietary information, the basic 'cut list' became an industry standard.
It was then desired by end users to be able to use colour edit information in an interoperable way. For example, it is useful to be able to view a motion sequence from film or video, experiment with colour grading or alteration, store the resulting changes as numeric control values (for example, Red + 5 units) and to be able to apply these control values on another system from a different manufacturer and see the same effect.
Colour edit information can be stored alongside the digital image sequence data as metadata. Such metadata is typically attached to a particular frame of the image sequence and may apply only to that frame, or alternatively may continue to apply until cancelled or superseded by subsequent metadata. For example, an operator may set a particular green correction for a scene, e.g. Green + 3 units. That correction is then stored alongside the frames of that scene as metadata associated with that scene. When the image sequence and metadata are read back in by another colour corrector (not necessarily from the same manufacturer), the metadata is used to recreate the colour correction which the operator had previously chosen, i.e. in this example, the scene is once again corrected by adding 3 units to the Green.
However, this system is unsatisfactory as it will not produce quite the same colour correction on machines made by different manufacturers. To explain this further, in earlier analogue colour correctors, an operator would define colour corrections by altering the behaviour of an electronic circuit. The resulting colour correction would be a curve of input against output. For example, a correction to make the reds redder could have a smaller effect on the low-end reds, a medium effect on the mid-range reds and a stronger effect on the high-end reds. The exact shape of the curve was dependent on the particular electronic circuit.
When digital systems were introduced, it was desirable to reproduce the shape of this curve so that an adjustment set by an operator would reproduce the tonal characteristics that had previously been produced by the electronic circuit.
Therefore, the manufacturers loaded their new digital machines with lookup tables which would reproduce the behaviour of the previous analogue machines by implementing the same shape of adjustment curve. The problem is that each manufacturer used a different electronic circuit and therefore implemented a different tonal characteristic curve.
When metadata such as Green + 3 units is read into one of these machines, the green adjustment curve will be produced by multiplying the tonal characteristic curve by 3. However, as the characteristic curve is manufacturer-dependent, the resulting colour correction curve will vary between different manufacturers, meaning that a colour correction defined in one manufacturer's hardware is not reproducible on another manufacturer's hardware.
This problem led to the development of the 'Colour Decision List' or CDL. The American Society of Cinematographers (ASC) have proposed a standard for this colour decision list as follows.
The formula for ASC CDL colour correction is:

out = (i * s + o)^p

where:

out is the color graded pixel code value
i is the input pixel code value (0 = black, 1 = white)
s is slope (any number 0 or greater, nominal value is 1.0)
o is offset (any number, nominal value is 0)
p is power (any number greater than 0, nominal value is 1.0)

In other words, the formula defines a colour correction curve from a straight line, raised to the power p. If s = 1.0, o = 0.0 and p = 1.0, then output equals input and no correction is applied.
The ASC standard aims to define a standard characteristic curve to be applied by all manufacturers. The parameters to this curve (i.e. the slope, offset and power) can then be stored as metadata and will produce the same colour correction curve regardless of the manufacturer of the hardware which is used to implement the correction.
The formula is applied to the three color values for each pixel using the corresponding slope, offset, and power numbers for each color channel.
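By way of illustration only (this code is not part of the ASC specification text), a minimal C++ sketch of applying the CDL transfer function per channel might look like the following; the structure, function names and the clamp before the power step are assumptions.

```cpp
#include <algorithm>
#include <cmath>

// One set of CDL parameters per colour channel (slope, offset, power).
struct CdlChannel {
    double slope  = 1.0;  // nominal: no change
    double offset = 0.0;  // nominal: no change
    double power  = 1.0;  // nominal: no change
};

// Apply the CDL transfer function to a single normalised code value
// (0.0 = black, 1.0 = white); clamping avoids raising a negative
// intermediate value to a fractional power.
double applyCdl(double in, const CdlChannel& c) {
    double v = in * c.slope + c.offset;
    v = std::max(v, 0.0);
    return std::pow(v, c.power);
}

// The same formula is applied independently to R, G and B, each channel
// using its own slope/offset/power values.
void applyCdlRgb(double rgb[3], const CdlChannel cdl[3]) {
    for (int ch = 0; ch < 3; ++ch)
        rgb[ch] = applyCdl(rgb[ch], cdl[ch]);
}
```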
Whilst this is a big step forward in interoperability, it only deals with part of the colour correction process. Colour correction conventionally consists of a stage of processing each of the channels (usually Red, Green, and Blue) individually (primary colour correction), then a process of a multi-dimensional nature, for example altering the red dependent on the amount of blue (secondary colour correction).
Thus it can be seen that the ASC approach tackles only the first part of the problem.
Additionally, the functions that can be described using this technique are restricted, as the colour correction function must be capable of being described by the ASC algorithm. For example, the ASC formula cannot represent a discontinuous function.
In post production, special effects are often obtained by loading discontinuous functions into the 1-D lookup tables of the colour corrector. Such special effects cannot therefore be implemented with the ASC algorithm.
The invention may be viewed from many different aspects. According to one aspect, the invention provides a method of storing an image appearance alteration for altering the appearance of an image in an input image data stream to produce an output image data stream comprising: defining said image appearance alteration as an algorithm which produces said output data stream dependent upon said input data stream; and storing said image appearance alteration by storing said algorithm in association with a part of said input image data stream to which it applies.
The algorithm may fully define the image appearance alteration. The algorithm may comprise at least one variable which can be adjusted by an operator. The algorithm may comprise a default value for each of said at least one variable.
The algorithm may define a discontinuous function. The algorithm may comprise conditional constructs. The algorithm may be stored as metadata of the input image data stream.
Viewed from another aspect, the invention provides a method of applying an image appearance alteration to image data in an input image data stream comprising: retrieving an image appearance alteration algorithm from said input image data stream; retrieving said image data from said input image data stream; and generating an output image data stream by applying said image appearance alteration algorithm to said input image data.
The algorithm may comprise at least one parameter which may be varied and the method may further comprise a step of obtaining operator input to set the value of said at least one variable parameter of said algorithm.
The step of obtaining operator input may comprise the step of associating the algorithm with at least one control of data processing apparatus. Such a control may be a simple on/off control. However, the algorithm may comprise at least one variable parameter which can be adjusted by the operator and the value of said at least one variable parameter may be set by adjusting said at least one control of said data processing apparatus. Said control may comprise buttons and/or sliders and/or dials.
At least one of the at least one controls may be a hardware control. At least one of the at least one controls may be implemented in software.
The at least one software control may be implemented in response to retrieving the image appearance alteration algorithm. The at least one software control may be implemented in dependence upon the image appearance alteration algorithm.
The algorithm may be loaded into hardware which applies said algorithm to said input image data. The algorithm may be in a form which can be directly loaded into hardware. The algorithm may be in the form of VHDL code.
The algorithm may be used to generate a lookup table which may in turn be used to apply the image appearance alteration to the input image data.
The image appearance alteration may be performed on a sequence of video images substantially in real time.
The invention also extends to a software product comprising instructions which when executed on data processing apparatus cause the apparatus to carry out any of the methods described above.
The software product may be a physical data carrier or signals transmitted from a remote location.
The invention also extends to a method of manufacturing a software product which is in the form of a physical data carrier, comprising storing on the data carrier instructions which when executed on data processing apparatus will cause the apparatus to operate in accordance with any of the methods described above.
The invention also extends to a method of providing a software product to a remote location, by means of transmitting data to data processing apparatus at that remote location, the data comprising instructions which when executed on data processing apparatus cause the apparatus to operate in accordance with any of the methods described above.
In accordance with the present invention, it has been found that storing the algorithm as part of the metadata gives significantly improved interoperability. Rather than being constrained by selecting parameters for the fixed ASC algorithm, far more complex and useful functionality can be described in a manufacturer-independent manner. For example, complex interoperable processing can be built up. Consider a case where the operator wants more Blue. The operator can simply increase the overall blue by specifying functionality that causes the algorithm Blue_out = 1.1 * Blue_in.
In this case, all values of Blue will be increased by 10%. Limiting will automatically be implemented to stop numeric overflow. An offset of value Off_Set can be added, and the resulting algorithm will then be Blue_out = Off_Set + 1.1 * Blue_in. This algorithm is then added to the image sequence metadata, as an algorithm (rather than just the parameters to an algorithm). Note that at this point no irretrievable processing of the image has taken place. Some systems incorporate the modification into the image data, which destroys 'bit accuracy' as, if this is applied in reverse to get back to the original, small errors occur due to the limited precision of calculation. In the system of the present invention, if the modification is deleted, the original image is still available, accurate everywhere to the last bit of precision.
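A minimal sketch, assuming 8-bit code values, of how such a gain-and-offset algorithm with automatic limiting could be evaluated; the names follow the example above, while the types, rounding and clamping details are assumptions.

```cpp
#include <algorithm>
#include <cstdint>

// Hypothetical sketch: apply "Blue_out = Off_Set + 1.1 * Blue_in" to an
// 8-bit blue channel, with limiting (clamping) to stop numeric overflow.
// The original image data is never modified; the algorithm is evaluated
// on the fly, so deleting it recovers the source bit-for-bit.
std::uint8_t blueAlgorithm(std::uint8_t blue_in,
                           double gain = 1.1, double off_set = 0.0) {
    double blue_out = off_set + gain * static_cast<double>(blue_in);
    blue_out = std::clamp(blue_out, 0.0, 255.0);  // automatic limiting
    return static_cast<std::uint8_t>(blue_out + 0.5);  // round to nearest code value
}
```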
Furthermore, rapid transitions in the resulting lookup tables can be specified by the general algorithmic form. For example, it is possible to remove all very dark tones and turn them white, resulting in an algorithm of the form:
IF Blue_in LESS THAN 50, THEN Blue_out = 255
This can obviously be applied to all three channels, Red, Green, and Blue. It can also be applied simultaneously with the processing described above, where Blue is increased by 10%.
Using this method, we can also specify multi-dimensional, or 'cross channel', processing (secondary corrections). An example of this may be to make the whites whiter, but to leave the saturated Reds, Greens, and Blues untouched (this is something that cannot be done with one-dimensional processing alone).
The resulting algorithms may be of the form:

IF (Blue_in GREATER THAN 200) AND (Red_in GREATER THAN 200) AND (Green_in GREATER THAN 200) THEN Blue_out = 255 AND Red_out = 255 AND Green_out = 255

Obviously, this system can be extended to produce any desired effect. Effects are not limited simply to colour corrections, but may involve other image characteristics such as texture or sharpness. The resulting algorithms and the associated alteration parameters are stored with the image as metadata. This is particularly useful 'on set'.
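As an illustration only, the two conditional rules above could be expressed as a per-pixel function in C++ as sketched below; the pixel structure is an assumption, and only the thresholds quoted in the text are taken from the examples.

```cpp
#include <cstdint>

struct Pixel { std::uint8_t r, g, b; };

// Sketch of the rules described above, applied per pixel:
//  - discontinuous primary: turn very dark blues white,
//  - cross-channel secondary: push near-white pixels to full white
//    while leaving saturated primaries untouched.
Pixel applyRules(Pixel p) {
    if (p.b < 50) {
        p.b = 255;                 // IF Blue_in LESS THAN 50 THEN Blue_out = 255
    }
    if (p.r > 200 && p.g > 200 && p.b > 200) {
        p.r = p.g = p.b = 255;     // whites whiter, saturated colours untouched
    }
    return p;
}
```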
In these cases, either film is shot, processed, and digitised, or digital video is captured directly. When the material is reviewed on set, it is necessary to make the creative decisions as to whether the resulting material is of sufficient quality for the final production. It is therefore desirable to correct the acquired material to a near 'final' form, to ensure that the material is suitable for its final usage. The material, together with the associated metadata, is then passed to post-production, where they can either use the material and its metadata 'as is', make further changes to the parameters in the currently specified algorithms, add further algorithmic processing, or delete all of the previous processing metadata and start with the material as originally captured. Note that the colour metadata is independent of the resolution of the material. It may be that 'on set' acquisition is taking place at 4K resolution (conventionally 4096 pixels x 3172 lines) for high quality portrayal. The on-set colour processing may be limited, so it may be desirable to make a lower quality 'viewing' copy, say at 2K, and preview colour editing at this resolution. When the material is reviewed in the post-production facility, the resultant algorithmic and parametric values can be applied to the 4K version seamlessly, as a starting point for 'in house' colour correction at the post-production facility.
S.'...
Whilst it is possible to describe changes to a single film or video frame, it would be preferable to grade in a 'scene' mode. This is a well known technique, where a work is divided into 'scenes', and each scene has an associated grading.
A further advantage of an algorithmic technique is that it is possible to specify processing dependent on frame numbers. An example of this is where it is desired for the sun to get brighter throughout a given scene. By making the sun brightness dependent on the frame number (or more preferably on an index derived from how far through the scene a given frame is) this effect can be accomplished easily.
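A sketch, with assumed names and gain values, of driving a brightness parameter from an index derived from how far through the scene a frame is:

```cpp
// Hypothetical sketch: ramp a brightness gain across a scene so that, for
// example, the sun gets brighter as the scene progresses. The gain is driven
// by the frame's position within the scene rather than by an absolute
// frame number, so the effect survives re-editing of the scene boundaries.
double sceneBrightnessGain(int frameInScene, int sceneLength,
                           double startGain = 1.0, double endGain = 1.3) {
    double t = (sceneLength > 1)
                   ? static_cast<double>(frameInScene) / (sceneLength - 1)
                   : 0.0;                       // 0.0 at scene start, 1.0 at scene end
    return startGain + t * (endGain - startGain);
}
```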
Further developments of this 'algorithm and parameter' approach take us into spatial processing. The simplest example here may be a frame, or sequence of frames, where there are two red objects in the frame. The first is in the top half of the frame, whilst the second is in the bottom half. It is required to make the red object in the top half of the scene brighter, whilst keeping the red object in the bottom half the same. This calls for algorithmic and parametric processing of the form:

IF (Red_in GREATER THAN Threshold) AND Line_Number LESS THAN (Max_Line_Number/2) THEN Red_out = Red_in * 1.1

Yet again, this can be extended further, to describe more detailed areas of the screen.
Delineation of a given frame may be implemented in vectors, where a starting point is given, together with a distance and direction to the next point, or as a series of geometric primitives (circles, squares, ellipses) to delineate an area for algorithmic and parametric processing. Alternative techniques may involve delineation using splines, where the spline algorithm and parameters are stored as metadata.
It must be appreciated that in many cases this delineation can be very approximate. It is sometimes necessary merely to delineate between different objects. For example, consider a scene with two red cars, where it is desired to change one of these cars to green, whilst leaving the other as red. In this case a simple rough circle containing one of the cars is all that is needed to differentiate between the two cars.
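A minimal sketch of the rough-circle idea: a colour change gated on a circular region whose centre and radius would be stored as parameters alongside the algorithm. The pixel structure, the crude 'is red' test and the recolouring rule are assumptions for illustration only.

```cpp
#include <cstdint>

struct Pixel { std::uint8_t r, g, b; };

// Hypothetical sketch: recolour red pixels to green only inside a rough
// circular region (centre cx, cy and radius held as metadata parameters),
// leaving an identically coloured object outside the circle untouched.
Pixel recolourInsideCircle(Pixel p, int x, int y, int cx, int cy, int radius) {
    long dx = x - cx;
    long dy = y - cy;
    bool inside = dx * dx + dy * dy <= static_cast<long>(radius) * radius;
    if (inside && p.r > 150 && p.g < 100 && p.b < 100) {  // crude "is red" test
        std::uint8_t tmp = p.r;                           // swap red into green
        p.r = p.g;
        p.g = tmp;
    }
    return p;
}
```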
Further extensions to the algorithmic and parametric metadata alteration can be obtained by inter-frame processing. For example, a motion blur effect can be obtained by averaging between frames. Consider a stationary camera scene, with a static background. In this scene, a sports car enters the left hand side of the frame, drives through the scene, and exits from the right side. An algorithmic processing of the form

Frame(N)(Pixel(x, y)) = {Frame(N-1)(Pixel(x, y)) + Frame(N+1)(Pixel(x, y))} / 2

will obtain a motion blur effect for the sports car. Since the background does not change from Frame(N-1) to Frame(N+1) there will be no modification. Obviously a greater motion blur effect can be obtained by bringing Frame(N-2) and Frame(N+2) into the equations.
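A sketch of the frame-averaging rule above applied to one 8-bit channel; the buffer layout and function name are assumptions, and both frames are assumed to be the same size.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical sketch of the inter-frame rule
//   Frame(N)(x, y) = (Frame(N-1)(x, y) + Frame(N+1)(x, y)) / 2
// applied to a single 8-bit channel. Static background pixels are identical
// in both neighbouring frames and are therefore unchanged; only the moving
// object picks up the motion-blur effect.
std::vector<std::uint8_t> motionBlurFrame(const std::vector<std::uint8_t>& prev,
                                          const std::vector<std::uint8_t>& next) {
    std::vector<std::uint8_t> out(prev.size());
    for (std::size_t i = 0; i < prev.size(); ++i)
        out[i] = static_cast<std::uint8_t>((prev[i] + next[i]) / 2);
    return out;
}
```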
In the above, the process of recording image changes using algorithms and parameters has been described. To see the results of these changes, as applied to their relevant image sequences, it is necessary to process the stored image sequence.
Therefore, preferably the equations are stored in a high level programming language such as 'C' or 'C++', or as FPGA code such as VHDL, the common entry level language for logic circuits. When an image sequence is to be displayed, the algorithmic and parametric metadata is read and a processing path is set up, either by compiling the C code (for example) and executing it, or by loading VHDL-compiled instructions into FPGAs for the processing. In this way, extraction of the defined colour corrections from the metadata and loading of those colour corrections into the colour correcting hardware is simple and fast.
To further aid the creative process in this system, a 'soft' control panel can be used. Such panels are used in mobile phones and other computer applications, where no physical controls exist, but 'soft buttons' or sliders appear on a display panel, which then respond to touch. Alternatively, buttons or sliders can be created on the screen of a personal computer or of a dedicated colour controller and can be operated and/or adjusted by the use of a keyboard, mouse or trackball.
In embodiments of the present invention, an optimal operational method is to allow operators to specify an 'effect' (e.g. to 'make the red colours redder') and to then have this effect produce a soft control on the control panel. Because 'redder' metadata has been created, for the algorithm and parameters, this will cause this control to appear whenever this control sequence appears. This simplifies the controls greatly, as only controls that are active appear on the control panel. Thus if the operator makes a 'make the browns muddier' control, a slider or rotary control will appear for this, and will reappear every time his image sequence is recalled. A possible working methodology may be to have a team of two operational staff, wherein the first has more engineering and programming skills, and defines the algorithms for the desired changes, under the direction of the creative colourist. The colourist would then use these functions to produce the resulting imagery that he wanted.
Instead of storing in the metadata the function together with all the relevant parameter values for that function which define a particular colour correction, the function can be stored with one or more parameters left variable. When such variable metadata is encountered by a colour corrector, the colour corrector provides a control such as a dial or a slider to the operator so that the operator can select a value for that variable, thus completing the full definition of the colour correction.
A default or starting value for this variable can also be stored in the metadata.
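One way this 'algorithm plus variable parameter with default' record could be represented is sketched below in C++; the record layout and field names are illustrative assumptions rather than a defined metadata format.

```cpp
#include <map>
#include <string>

// Hypothetical sketch of a metadata record carrying an algorithm whose
// parameters may be left variable. Each variable parameter carries a default
// (starting) value; a colour corrector reading this record would present one
// control per variable parameter, initialised to its default.
struct AlgorithmMetadata {
    std::string name;                          // e.g. "increase blue"
    std::string source;                        // algorithm text, e.g. C or VHDL
    std::map<std::string, double> parameters;  // parameter name -> default value
};

AlgorithmMetadata exampleRecord() {
    return AlgorithmMetadata{
        "increase blue",
        "Blue_out = Off_Set + Gain * Blue_in",  // stored as an algorithm, not just values
        {{"Gain", 1.1}, {"Off_Set", 0.0}},      // defaults the operator can later adjust
    };
}
```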
In this way, an operator who is working on a particular colour correction can save his work before it is finished, move on to a new, more urgent project, then return to the previous work at a later time to finish it off. When returning to the work, the colour corrector will load the functions which are stored in the metadata, including any variable parts of those functions and the values which the operator had previously selected. The operator will thus be reminded of what he was doing before he left the work and will be able to continue working on it straight away.
What is more, as the function itself is stored in the metadata, the operator need not reload the work onto the same hardware. He may choose to use a different machine, made by a different manufacturer, and the colour corrections will be exactly reproduced as he had previously defined them. In this way, the invention provides great interoperability.
In yet another implementation, the metadata need not explicitly specify which parts of the function are variable. Instead, all variable parts of the function can be automatically detected and appropriate controls can be provided to the user.
As described above, this system simplifies the controls for the operator. Instead of requiring a control panel with a control (e.g. a knob or slider) for each of a large number of possible colour corrections, controls can be dynamically created or allocated based on the corrections which are actually to be applied to the current scene. For example, in a simple scene, the operator may only have defined a correction to make the reds redder. It is therefore not necessary to provide a complex control panel including controls for the red, green, blue and other characteristics. Instead, the system can dynamically create a single 'soft' control for the operator or can dynamically allocate a single control from a reduced-size 'hard' control panel.
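A non-authoritative sketch of this dynamic allocation, assuming a list of variable parameters like the record sketched earlier; the control and parameter structures are illustrative only.

```cpp
#include <string>
#include <vector>

// Hypothetical sketch: create one on-screen ("soft") control per variable
// parameter found in the loaded algorithm metadata, instead of presenting a
// fixed bank of physical controls for every possible correction.
struct VariableParameter {
    std::string name;      // e.g. "make the reds redder"
    double defaultValue;   // starting value stored in the metadata
};

struct SoftControl {
    std::string label;     // shown next to the slider or rotary control
    double value;          // current setting, initialised to the default
};

std::vector<SoftControl> buildControlPanel(const std::vector<VariableParameter>& params) {
    std::vector<SoftControl> controls;
    for (const auto& p : params)
        controls.push_back({p.name, p.defaultValue});  // only active corrections get a control
    return controls;
}
```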
It is possible to implement the above ideas in many ways. One way is to use general purpose computers, such as those that conform to the PC Standard, or Apple Macs.
A second is in dedicated hardware. It is a fairly simple task to construct hardware circuits containing registers, adders, and multipliers to implement the equations and parametric methods described earlier.
Another possible technique for implementation is to use FPGAs (Field Programmable Gate Arrays). These devices contain uncommitted gates, adders, and multipliers which the user configures, usually by programming in a language such as VHDL. Such devices are readily available from suppliers such as Xilinx and Altera.
One further way is to combine the features of this invention into the systems described in GB 2413232. This proposes the use of either one 'channel' of hardware colour correction, which is used to load a 3D colour lookup table, or alternatively general purpose computing power used to load a 3D colour lookup table.
The parametric equations described in this application are used as the 'software algorithm' (GB 2413232A, page 5, line 13) which creates a lookup table for the cardinal colour points of the colour channel.
By the publication of standards for this programming, it is possible to obtain a very wide-ranging series of effects and alterations, and to see identical results across systems from different vendors.
Certain preferred embodiments of the invention will now be described, by way of example only, and with reference to the accompanying drawings, in which:

Figure 1 shows a prior art colour correction system;
Figure 2 shows another prior art colour correction system;
Figure 3 shows a colour correction system in accordance with the invention;
Figure 4 shows a controller; and
Figure 5 shows another controller.
In the colour correction systems shown in Figures 1, 2 and 3, an input data stream 20 comprises image data 21 and associated input metadata 24. The image data comprises a sequence of input frame images 22. In the examples shown, the sequence of input frames 22 is divided into scenes and the metadata 24 is associated with each scene. It will be understood, however, that metadata 24 can be associated with the image data 21 in many other ways; for example, it could be associated with the whole sequence of frames 22, with individual frames 22 and/or with individual lines or pixels within a frame 22. The input data stream 20 is input into a colour corrector 30 which processes the image data 21 and the metadata 24 and outputs an output data stream 40, which also consists of a sequence of frames 42. The colour corrector 30 is controlled by a controller 50 which in turn is operated by an operator 60.
The controller 50 (described in more detail below) may be either a dedicated colour corrector controller (such as the Applicant's POGLE controller) or it may be a general purpose computer. The operator 60 is normally a human operator, typically a colourist, although it could also be automated if desired.
Likewise, the colour corrector 30 may be a dedicated piece of hardware or it may be a general purpose computer. In the case that it is a general purpose computer, it may also perform the function of the controller 50, thus reducing the amount of hardware required. However, in most cases a dedicated hardware colour corrector 30 is preferred as such devices are fast enough to carry out colour corrections in real time on High Definition image streams such as HDTV or the 2K or 4K standards.
Figure 1 shows a prior art system in which colour corrections are stored in the metadata 24 simply as numeric values. In this system, the colour corrector 30 combines that metadata 24 with its own internal tonal characteristic curve 32 (typically stored in a one dimensional lookup table) to produce a colour correction curve which is used to apply the colour correction to the pixels of the input images, thus producing the output images. As the internal tonal characteristic curve 32 is manufacturer-dependent and is hard-wired into the colour corrector 30, the input data stream 20 will produce a different output data stream 40, depending on which manufacturer's colour corrector is used.
Figure 2 shows another prior art system in which the proposed ASC Colour Decision List standard is used. In this system, the metadata 24 in the input data stream defines colour corrections by providing the slope, offset and power parameters which are to be used with a standardised formula 34 (out = [in * slope + offset]^power). The standard formula 34 is hard-wired into the colour corrector 30 and is used in combination with the metadata 24 to produce a colour correction curve which is used to apply the colour correction to the pixels of the input images, thus producing the output images. As the formula 34 is a standard, the same equation will be programmed into every manufacturer's colour corrector 30, so the same input data stream 20 should produce the same output data stream 40 regardless of the choice of colour corrector 30. However, the system lacks flexibility as the standard formula 34 only allows a restricted set of colour correction curves to be reproduced. In particular, discontinuous colour corrections cannot be applied, and secondary colour corrections (correcting one colour in dependence on other parameters, such as the other colours, texture or sharpness) cannot be applied with this system.
Figure 3 shows a colour correction system according to one embodiment of the invention. In this system, the metadata 24 in the input data stream 20 defines colour corrections in terms of algorithms and parameters. The colour corrector 30 loads the algorithms and parameters for the colour corrections and uses those algorithms and parameters to perform the image adjustments on the input images to produce the output images.
In the systems in accordance with the invention, the algorithms and parameters may either be used directly to perform the image adjustments, e.g. by loading the algorithm into an FPGA which then processes each pixel in turn, or they may be used to create a lookup table for processing the image data. A lookup table of the appropriate number of dimensions can be created; e.g. a primary colour correction will only require a one-dimensional lookup table, whereas a secondary colour correction could require a lookup table of two or more dimensions. The choice of direct or indirect processing can be made by the colour corrector 30 at the time of processing according to which method will produce the faster processing.
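As a sketch of the indirect path (assuming an 8-bit channel), the stored algorithm can be evaluated once per code value to fill a 1-D lookup table, which is then applied per pixel; the function signatures are assumptions.

```cpp
#include <array>
#include <cstdint>
#include <functional>

// Hypothetical sketch of the indirect processing path: evaluate the stored
// algorithm once for every possible 8-bit code value to build a 1-D lookup
// table, which is then applied to each pixel with a simple array index.
std::array<std::uint8_t, 256>
buildLut(const std::function<std::uint8_t(std::uint8_t)>& algorithm) {
    std::array<std::uint8_t, 256> lut{};
    for (int v = 0; v < 256; ++v)
        lut[v] = algorithm(static_cast<std::uint8_t>(v));
    return lut;
}

// Applying the correction then reduces to a table lookup per pixel, which is
// typically faster than re-evaluating a complex algorithm for every pixel.
inline std::uint8_t correct(std::uint8_t in, const std::array<std::uint8_t, 256>& lut) {
    return lut[in];
}
```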
As the algorithm is to be loaded directly into an FPGA or a computer, it is preferable that the metadata stores the algorithm in a form in which it can be directly loaded or compiled into the relevant device. For example, the algorithm may be stored as a function written in the C programming language or it may be stored as VHDL instructions for programming an FPGA (note that for simplicity the metadata in the drawings are not shown in a particular language).
In Figure 3, Scene 1 is shown with a colour correction corresponding to an ASC formula, and also a secondary colour correction in which the correction is conditional upon more than one input colour. Scene 2 is shown with a discontinuous blue adjustment and an adjustment which is conditional upon a spatial characteristic (line number).
Figure 4 is a schematic representation of a controller 50 for controlling a colour corrector 30. The controller 50 has a screen area 52 for displaying images or sequences of images (with or without applied corrections), and various slider controls 54 and rotary controls 56. The slider and rotary controls may be used for varying the magnitude of an adjustment and/or for setting upper and lower limits of ranges for corrections. The controller 50 represented in Figure 4 has six colour correction channels, and a slider and two rotary controls are provided for each channel. If complex colour corrections are being applied to a particular scene, all six channels may be in use simultaneously. In simpler cases, only a single channel may be required for a single colour correction. However, as the controls 54, 56 are a physical part of the controller 50, the controls 54, 56 for all six channels are still presented to the operator 60, even though five of them are currently not set to do anything. Thus in this particular instance the controller 50 is needlessly complex and confusing for the operator 60.
Figure 5 is a schematic representation of a controller 50' according to a preferred embodiment of the invention. The controller 50' of Figure 5 differs from that of Figure 4 in that the channel controls 54, 56 are not physical sliders and knobs.
Instead, a touch sensitive screen 58 is provided below the main screen 52. With this controller 50', it is possible to create and remove channel controls 54, 56 on the fly.
Therefore the controller 50' can display only those controls 54, 56 which are currently active. In the state shown in Figure 5, the controller is displaying only a single slider 54 and two knobs 56, i.e. the controls for a single channel. Compared with the controller 50 shown in Figure 4, the controls 54, 56 are far simpler and less confusing to the operator 60.
In the controller 50' shown in Figure 5, the controls 54, 56 are dynamically created as and when they are needed. When an adjustment is loaded from the metadata 24, one or more controls 54, 56 are displayed on the touch sensitive screen 58 and can be operated by the operator 60 to make or alter the corresponding colour correction.
When new metadata is encountered (e.g. upon moving to the next scene), the controls 54, 56 are redefined accordingly.
Although Figure 5 shows the controller 50' in a simple state with controls 54, 56 for only one channel, it will be understood that where a plurality of colour corrections are associated with the frame 22 or the sequence of frames 22, or where more complex colour corrections are being made (e.g. secondary colour corrections), more controls 54, 56 will be provided on the touch sensitive screen 58. Where necessary, the controller 50' can provide the same number or a greater number of controls 54, 56 as are provided on the hard-wired controller 50 of Figure 4.

Claims (27)

  1. 1. A method of storing an image appearance alteration for altering the appearance of an image in an input image data stream to produce an output image data stream comprising: defining said image appearance alteration as an algorithm which produces said output data stream dependent upon said input data stream; and storing said image appearance alteration by storing said algorithm in association with a part of said input image data stream to which it applies.
  2. 2. A method as claimed in claim 1, wherein said algorithm fully defines the image appearance alteration.
  3. 3. A method as claimed in claim 1, wherein the algorithm comprises at least one variable which can be adjusted by an operator.
  4. 4. A method as claimed in claim 3, wherein the algorithm comprises a default value for each of said at least one variable.
  5. 5. A method as claimed in any preceding claim, wherein the algorithm defines a discontinuous function.
  6. 6. A method as claimed in any preceding claim, wherein the algorithm comprises conditional constructs.
  7. 7. A method as claimed in any preceding claim, wherein the algorithm is stored as metadata of the input image data stream.
  8. 8. A method of applying an image appearance alteration to image data in an input image data stream comprising: retrieving an image appearance alteration algorithm from said input image data stream; retrieving said image data from said input image data stream; and generating an output image data stream by applying said image appearance alteration algorithm to said input image data.
  9. 9. A method as claimed in claim 8, wherein the algorithm comprises at least one parameter which may be varied and the method further comprises a step of obtaining operator input to set the value of said at least one variable parameter of said algorithm.
  10. 10. A method as claimed in claim 9, wherein the step of obtaining operator input comprises the step of associating the algorithm with at least one control of data processing apparatus.
  11. 11. A method as claimed in claim 10, wherein the algorithm comprises at least one variable parameter which can be adjusted by the operator and wherein the value of said at least one variable parameter is set by adjusting said at least one control of said data processing apparatus.
  12. 12. A method as claimed in claim 10 or 11, wherein at least one of the at least one controls is a hardware control.
  13. 13. A method as claimed in claim 10 or 11, wherein at least one of the at least one controls is implemented in software.
  14. 14. A method as claimed in claim 13, wherein the at least one software control is implemented in response to retrieving the image appearance alteration algorithm.
  15. 15. A method as claimed in claim 13 or 14, wherein the at least one software control is implemented in dependence upon the image appearance alteration algorithm.
  16. 16. A method as claimed in any of claims 8 to 15, wherein the algorithm is loaded into hardware which applies said algorithm to said input image data.
  17. 17. A method as claimed in claim 16, wherein the algorithm is in a form which can be directly loaded into hardware.
  18. 18. A method as claimed in claim 17, wherein the algorithm is in the form of VHDL code.
  19. 19. A method as claimed in any of claims 8 to 15, wherein the algorithm is used to generate a lookup table which is in turn used to apply the image appearance alteration to the input image data.
  20. 20. A method as claimed in any of claims 8 to 19, wherein the image appearance alteration is performed on a sequence of video images substantially in real time.
  21. 21. A software product comprising instructions which when executed on data processing apparatus cause the apparatus to carry out the method of any preceding claim.
  22. 22. A software product as claimed in claim 21, wherein the software product is a physical data carrier.
  23. 23. A software product as claimed in claim 21, wherein the software product comprises signals transmitted from a remote location.
  24. 24. A method of manufacturing a software product which is in the form of a physical data carrier, comprising storing on the data carrier instructions which when executed on data processing apparatus will cause the apparatus to operate in accordance with the method of any of claims 1 to 20.
  25. 25. A method of providing a software product to a remote location, by means of transmitting data to data processing apparatus at that remote location, the data comprising instructions which when executed on data processing apparatus cause the apparatus to operate in accordance with the method of any of claims 1 to 20.
  26. 26. A method of storing an image appearance alteration, substantially as hereinbefore described, with reference to the accompanying drawings.
  27. 27. A method of applying an image appearance alteration to image data, substantially as hereinbefore described, with reference to the accompanying drawings.
GB0916127A 2008-09-12 2009-09-14 Colour editing using algorithm stored in image data Withdrawn GB2463375A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0816768A GB0816768D0 (en) 2008-09-12 2008-09-12 Colour editing

Publications (2)

Publication Number Publication Date
GB0916127D0 GB0916127D0 (en) 2009-10-28
GB2463375A true GB2463375A (en) 2010-03-17

Family

ID=39930110

Family Applications (2)

Application Number Title Priority Date Filing Date
GB0816768A Ceased GB0816768D0 (en) 2008-09-12 2008-09-12 Colour editing
GB0916127A Withdrawn GB2463375A (en) 2008-09-12 2009-09-14 Colour editing using algorithm stored in image data

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GB0816768A Ceased GB0816768D0 (en) 2008-09-12 2008-09-12 Colour editing

Country Status (1)

Country Link
GB (2) GB0816768D0 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013147925A1 (en) * 2012-03-27 2013-10-03 Thomson Licensing Color grading preview method and apparatus
US9501817B2 (en) 2011-04-08 2016-11-22 Dolby Laboratories Licensing Corporation Image range expansion control methods and apparatus
US10194162B2 (en) 2013-07-18 2019-01-29 Koninklijke Philips N.V. Methods and apparatuses for creating code mapping functions for encoding an HDR image, and methods and apparatuses for use of such encoded images

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999055088A1 (en) * 1998-04-22 1999-10-28 Siemens Aktiengesellschaft Method for transmitting supplementary information signals in blanking intervals of a video signal, and a device for receiving these supplementary information signals
US6335983B1 (en) * 1998-09-28 2002-01-01 Eastman Kodak Company Representing an extended color gamut digital image in a limited color gamut color space
US20030184652A1 (en) * 2002-03-28 2003-10-02 Fuji Photo Film Co., Ltd. Digital camera and image data processing system
JP2003324746A (en) * 2002-05-07 2003-11-14 Fuji Photo Film Co Ltd Digital camera
US20060092293A1 (en) * 2001-04-11 2006-05-04 Fuji Photo Film Co., Ltd. Printer system and image processing system having image correcting function
WO2008088868A2 (en) * 2007-01-19 2008-07-24 Bioptigen, Inc. Methods, systems and computer program products for processing images generated using fourier domain optical coherence tomography (fdoct)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999055088A1 (en) * 1998-04-22 1999-10-28 Siemens Aktiengesellschaft Method for transmitting supplementary information signals in blanking intervals of a video signal, and a device for receiving these supplementary information signals
US6335983B1 (en) * 1998-09-28 2002-01-01 Eastman Kodak Company Representing an extended color gamut digital image in a limited color gamut color space
US20060092293A1 (en) * 2001-04-11 2006-05-04 Fuji Photo Film Co., Ltd. Printer system and image processing system having image correcting function
US20030184652A1 (en) * 2002-03-28 2003-10-02 Fuji Photo Film Co., Ltd. Digital camera and image data processing system
JP2003324746A (en) * 2002-05-07 2003-11-14 Fuji Photo Film Co Ltd Digital camera
WO2008088868A2 (en) * 2007-01-19 2008-07-24 Bioptigen, Inc. Methods, systems and computer program products for processing images generated using fourier domain optical coherence tomography (fdoct)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9501817B2 (en) 2011-04-08 2016-11-22 Dolby Laboratories Licensing Corporation Image range expansion control methods and apparatus
US10395351B2 (en) 2011-04-08 2019-08-27 Dolby Laboratories Licensing Corporation Image range expansion control methods and apparatus
WO2013147925A1 (en) * 2012-03-27 2013-10-03 Thomson Licensing Color grading preview method and apparatus
CN104205795A (en) * 2012-03-27 2014-12-10 汤姆逊许可公司 Color grading preview method and apparatus
CN104205795B (en) * 2012-03-27 2017-05-03 汤姆逊许可公司 Color grading preview method and apparatus
US10194162B2 (en) 2013-07-18 2019-01-29 Koninklijke Philips N.V. Methods and apparatuses for creating code mapping functions for encoding an HDR image, and methods and apparatuses for use of such encoded images

Also Published As

Publication number Publication date
GB0816768D0 (en) 2008-10-22
GB0916127D0 (en) 2009-10-28

Similar Documents

Publication Publication Date Title
US6873344B2 (en) Media production system using flowgraph representation of operations
US5404316A (en) Desktop digital video processing system
US5412773A (en) Computerized interactive menu-driven video signal processing apparatus and method
EP0947955B1 (en) Apparatus for generating custom gamma curves for color correction equipment
EP2792138B1 (en) Editing color values using graphical representation of the color values
US6445816B1 (en) Compositing video image data
US20090204913A1 (en) User interfaces for managing image colors
US20030169373A1 (en) Method and apparatus for creating non-linear motion picture transitions
US7623722B2 (en) Animated display for image manipulation and correction of digital image
US20070253640A1 (en) Image manipulation method and apparatus
GB2254517A (en) Video signal colour correction
US5487020A (en) Refinement of color images using reference colors
CN105874786A (en) Image processing apparatus, image processing method, and computer-readable recording medium
GB2463375A (en) Colour editing using algorithm stored in image data
JP5105806B2 (en) Color correction apparatus and color correction method
US9723286B2 (en) Image processing apparatus and control method thereof
Annum et al. Image colouration in adobe photoshop: A digital painting technique for transforming grayscale photographs into colour mode
Ganbar Nuke 101: professional compositing and visual effects
US7190391B2 (en) Image processing
US6295369B1 (en) Multi-dimensional color image mapping apparatus and method
Lanier Compositing Visual Effects in After Effects: Essential Techniques
KR100545116B1 (en) Color Correction Device and Color Correction Method, Image Processing Device and Image Processing Method
US7532219B2 (en) Image processing
US20240078648A1 (en) Tone Mapping for Preserving Contrast of Fine Features in an Image
Paolini Apple Pro Training Series: Shake 4

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)