EP0538462A1 - Human/computer interface for performing data transformations by manipulating graphical objects on a video display - Google Patents
Human/computer interface for performing data transformations by manipulating graphical objects on a video display
- Publication number
- EP0538462A1 (application EP19920913518 / EP92913518A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- user
- transform
- data
- definition
- array
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/0035—User-machine interface; Control console
- H04N1/00405—Output means
- H04N1/00408—Display of information to the user, e.g. menus
- H04N1/00413—Display of information to the user, e.g. menus using menus, i.e. presenting the user with a plurality of selectable options
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/0035—User-machine interface; Control console
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/0035—User-machine interface; Control console
- H04N1/00405—Output means
- H04N1/00408—Display of information to the user, e.g. menus
- H04N1/00411—Display of information to the user, e.g. menus the display also being used for user input, e.g. touch screen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/60—Colour correction or control
- H04N1/6011—Colour correction or control with simulation on a subsidiary picture reproducer
Definitions
- the invention relates generally to the interface between a human operator and a computer hardware and software system and in particular to a system for performing data transformations by manipulating graphical objects on a video display monitor.
- a preferred embodiment of the invention relates to color image processing systems employing look-up tables for transforming from a first coordinate space to a second coordinate space.
- Color image processing systems typically include an input device for generating an electronic representation of a color image. The input device provides the electronic image representation to a computer workstation, which processes the image in accordance with a user's instructions and forwards the processed image to a color monitor for display.
- the user interacts with the workstation, typically through input devices such as a mouse and keyboard, and an output device such as a control monitor, repeatedly instructing the computer to adjust the electronic image until the color monitor displays a desired image.
- the user can also generate a hard copy of the image by instructing the workstation to provide the processed electronic image to a selected printing device.
- the electronic image processed by the workstation consists of a two dimensional array of picture elements (pixels).
- the color of each pixel may be represented in any of a variety of color notations or "color spaces.”
- the RGB color space represents pixel colors according to the relative contributions of three primary colors, red, green and blue.
- each pixel of the monitor's display contains three primary color phosphors.
- the monitor stimulates each primary phosphor with an intensity determined by the corresponding R, G, B value.
- the CMYK color space represents color using four variables, C, M, Y, K, each corresponding to the relative (subtractive) contribution of the colorants, cyan, magenta, yellow and black.
- each of the values C, M, Y, K determines the amount of a colorant (e.g. ink, dye) used by the printer in producing a desired color.
- Black is used to increase printed ink density and minimize the use of costly colored colorants, in situations where the overlay of multiple colorants would appear substantially black.
- Color spaces such as linear RGB and CMYK are useful for image scanning devices and image printing devices, respectively, since each parameter of the color space closely corresponds to a physical mechanism by which these devices measure and generate color.
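The role of the black channel can be made concrete with the textbook "naive" RGB-to-CMYK conversion sketched below. It is only an illustration of how K can stand in for the common grey component of the colored inks; real printer characterizations rely on measured transforms rather than a closed-form formula, and the function name and 0-to-1 normalization are assumptions of the example.

```python
def rgb_to_cmyk(r, g, b):
    """Naive RGB (0..1) to CMYK conversion with full black generation."""
    c, m, y = 1.0 - r, 1.0 - g, 1.0 - b
    k = min(c, m, y)                  # grey component handled by black ink
    if k >= 1.0:                      # pure black: no colored ink needed
        return 0.0, 0.0, 0.0, 1.0
    scale = 1.0 - k
    return (c - k) / scale, (m - k) / scale, (y - k) / scale, k
```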
- the three parameters R, G, B define a three dimensional, linear color space, each point within the space corresponding to a unique color.
- a selected change in the values of the parameters may not result in a commensurate change in color perceived by a human viewer.
- in some regions of the color space, increasing the parameter R by n units yields little perceived change in color.
- in other regions, increasing R by the same n units yields a dramatic change in the perceived color. Accordingly, it may be difficult for a user to manipulate the primaries R, G, B to achieve a desired change in color.
- the "u'v'L*" space for example, is a three dimensional color space defined by the parameters u', v', L*.
- the chromaticity of each color in this space is uniformly characterized by the parameters u', v'.
- the parameter L* characterizes perceptually uniform variations in the lightness of the color.
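The patent text does not spell out the u'v'L* formulas; for orientation only, the standard CIE 1976 quantities can be computed from tristimulus values X, Y, Z (with Yn the luminance of the reference white) as in the sketch below. The function name is an assumption of the example.

```python
def xyz_to_uvL(X, Y, Z, Yn=1.0):
    """CIE 1976 u', v' chromaticity and L* lightness from tristimulus XYZ."""
    denom = X + 15.0 * Y + 3.0 * Z
    u = 4.0 * X / denom if denom else 0.0
    v = 9.0 * Y / denom if denom else 0.0
    t = Y / Yn
    # L* follows a cube-root law above the dark threshold, a linear law below it
    L = 116.0 * t ** (1.0 / 3.0) - 16.0 if t > (6.0 / 29.0) ** 3 else (29.0 / 3.0) ** 3 * t
    return u, v, L
```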
- To process a color image change within the "u'v'L*" color space, the workstation simply maps each point (u'0, v'0, L*0) in the color space to a new point (u'1, v'1, L*1). For example, if the user desires to display the image on a monitor, he may wish to adjust the colors of the image to compensate for the lighting conditions of the room. Accordingly, the user selects a transform which maps each point (u'0, v'0, L*0) to a new point having the same values u'0, v'0 but having a greater luminance value.
- the image processing system typically contains a predetermined transform definition for each such color image transformation. Based on a selected definition, the system maps certain points of the color space to new points. Accordingly, the color at each pixel of an electronic image is sequentially mapped in accordance with the transform definition to yield the desired visual effect. To perform another image transformation, the system remaps the color values to yet another point in accordance with a second transform definition. Any number of transformations can thus be performed by sequentially mapping color values according to the available predetermined transform definitions. However, such sequential processing of images can be extremely time consuming, particularly if a large number of predetermined transforms are selected and a large number of data points must be transformed.
- the method features the steps of receiving a user's selection of an image transformation to be performed on the array of input pixel values.
- a plurality of transform definitions are automatically selected from stored transform definitions.
- Each transform definition includes sample values in three dimensions representing an input/output relation of a predetermined image transformation.
- a composite transform definition is generated containing sample values of an input/output relation of a composite image transformation which is equivalent to the several image transformations effectively selected by the user.
- the composite transform is preferably compiled and implemented sufficiently quickly (e.g., in real time) to allow the user to interactively change his selections until a desired composite transformation is created.
- At least one sample value is selected from the composite transform definition, based on the value of an input color to be modified.
- a processed color value is then determined based on the at least one selected sample value. For example, in one embodiment, a nearest neighbor of the input color value is selected as the sample value. In another embodiment, a plurality of sample values are selected and the processed color value is determined by interpolating between these values.
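As a minimal sketch of the two lookup strategies just described (not the patent's implementation), a FUT can be modeled as a regular N x N x N grid of output samples; nearest-neighbour lookup picks the closest grid sample, while interpolation blends the eight surrounding samples. The array layout, the normalization to [0, 1] and the names are assumptions of the example.

```python
import numpy as np

def sample_fut(lut, color, interpolate=True):
    """Sample a 3-D lookup table ("FUT") at an input color in [0, 1]^3.

    lut has shape (N, N, N, 3): output samples on a regular input grid.
    """
    n = lut.shape[0]
    pos = np.clip(np.asarray(color, dtype=float), 0.0, 1.0) * (n - 1)
    if not interpolate:                       # nearest-neighbour variant
        i, j, k = np.rint(pos).astype(int)
        return lut[i, j, k]
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = pos - lo                              # fractional offsets within the cell
    out = np.zeros(3)
    for dx in (0, 1):                         # trilinear blend of the 8 cell corners
        for dy in (0, 1):
            for dz in (0, 1):
                w = ((f[0] if dx else 1 - f[0]) *
                     (f[1] if dy else 1 - f[1]) *
                     (f[2] if dz else 1 - f[2]))
                out += w * lut[(hi if dx else lo)[0],
                               (hi if dy else lo)[1],
                               (hi if dz else lo)[2]]
    return out
```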
- the stored transform definitions may include custom transform definitions made in accordance with a user's instruction as well as the predetermined transform definitions.
- a user's instructions specifying desired color changes are received.
- a custom transform definition is then prepared for implementing the selected color changes.
- Such custom transform definitions may be created by modifying predetermined transform definitions or by generating entirely new transform definitions based on the user's input.
- the three-dimensional transform definitions are functional look-up tables and are referred to herein as "FUTs.”
- "CHI" refers herein to the computer/human interface of the invention.
- the apparatus of the invention, in a first preferred embodiment, generates an array of modified data values related to an array of input data values in response to commands from a user.
- the apparatus includes storage means for storing at least one data transform definition and data processing means for identifying at least one of the transform definitions in response to a user's selection of at least one data transformation and for applying the identified transform definitions to the array of input data values.
- Interface means are provided between the user and the data processing means for receiving instructions from the user and for indicating the outputs of the data processing means.
- the interface means comprises an input device for the user to transmit instructions, signal generating means for generating signals representative of the one transform definition and the user instructions, and display means.
- the display means displays a graphical representation of the signals representing the user instructions and the transform definitions.
- Signal processing means are also included for receiving the user instructions to select and move on the display means the graphical representation of at least one of said transform definitions.
- the signal processing means generates commands to the data processing means to identify the selected at least one transform definition and to apply the selected transform definitions to the array of input data values to arrive at modified values.
- a second preferred embodiment of the invention further comprises means for generating a single composite transform definition, such that application of the single composite transform definition to the input data values generates the same output data values as would the individual serial application of a selected at least two transform definitions.
- the invention interface includes means for selecting a primary range of the input data array to be transformed according to a selected transform definition and designating at least one secondary range of the input data array to be transformed according to a transform definition that provides a selected rate of change of the transformation of data values within the secondary range of said input data array with respect to the proximity of said data values to the primary range of said input data array.
- the interface includes means for sending a message to and means for receiving a message from tools and objects, at least one data transform definition tool having means for sending a message to and receiving a message from other tools and objects and an object representing the input data values having means for sending a message to and means for receiving a message from tools, where the message sent by the transform tool to the data object includes a message to apply a selected transform to the input data values.
- the user interface further includes signal processing means for receiving from the user commands to identify on the display means the graphical representation of at least one master transform definition and to generate commands to the transform controller to generate a changeable working transform definition that is the same as the identified master transform definition, other than the ability to be changed.
- Figure 1 is a schematic diagram of the RGB color space.
- Figure 2 is a block diagram of an image processing system used in connection with the claimed invention.
- Figure 3 is a block diagram of a variety of transform operations in accordance with instructions generated by the user through a CHI of the invention.
- Figure 4 is a flow chart schematically showing the basic operation of a hand of the CHI of the invention.
- Figure 5 is a flow chart schematically showing the basic operation of tools of the CHI of the invention which are used to create and modify data FUTs when the tools are activated or applied.
- Figure 6 is a flow chart schematically showing the basic operation of tools of the CHI of the invention which are used to create and modify data FUTs when the tools are released on an object.
- Figures 7a-7f are a flow chart schematically showing the basic steps a user would follow using the method of the invention to display a picture on a monitor, try various modifications to that picture and finally make permanent the alterations to the picture.
- Figure 8 is a flow chart schematically showing the general steps the user would take according to the CHI of the invention to change a FUT relating to the grey balance of a picture.
- Figure 9 is a schematic, showing basic elements of the CHI, including a room, door and picture objects and hand, input, output, adjustment, grey balance, tonal, monitor, work order, memory storage, and inking tools.
- Figure 10 is a schematic representation of the hardware elements of the CHI of the invention.
- Figure 11 is a schematic, showing the basic actions that will take place as a result of a user's activation of various combinations of two mouse buttons.
- Figure 12 is a schematic representation of the tool room of the CHI of the invention.
- Figure 13 is a schematic representation of an activated tonal tool of the CHI of the invention.
- Figure 14 is a schematic representation of an activated grey balance tool of the CHI of the invention.
- Figure 15 is a schematic representation of an activated monitor tool of the CHI of the invention applied to a picture object.
- Figure 16 is a schematic representation of an activated work order tool, including a grey balance tool adjusted to affect shadows, another grey balance tool adjusted to affect highlights and a tonal tool adjusted to affect the entire range.
- an image processing system 8 includes a plurality of input devices 10 for scanning a source image (such as a photograph, film, or a scene within the view of a camera) to create an electronic digital representation of the image.
- the electronic representation is provided to an image processor 14 which adjusts the colors of the electronic image and either stores the adjusted image on storage device 17 (e.g., for later retrieval and processing) or forwards it to various output devices 16 for printing, display or transmission over a network 15 or any other communication channel.
- Image processor 14 is connected to a user interface 22 through which a user indicates the desired transformations to be performed on an electronic image.
- image processor 14 and user interface 22 are implemented with a properly programmed general purpose computer or computer workstation.
- the principal aspect of the present invention is a user interface, described below.
- a transform controller 20 selects a set of FUTs from a collection 19 of stored predetermined FUTs.
- transform controller 20 may be implemented as separate hardware elements.
- Each predetermined FUT describes a unique transform for mapping the values representing each color of an image in a first color space to a different set of values (e.g., a different color in the same or a different color space) thus yielding a desired image transformation.
- the user can also create his own custom FUT in accordance with the invention.
- the user interface allows the user to select a set of colors to be changed (e.g., from a palette of possible colors) .
- the user can then specify the desired changes to these colors, (e.g., a specific increase in the brightness) .
- the controller can then prepare a custom FUT corresponding to the user's selections.
- controller 20 can compose the selected FUTs into a single composite FUT 28 as illustrated in Figure 3 for processing and displaying (or FUT 32 for printing) the image from the input device without intermediate storage of the image.
- This selection and composition is performed with sufficient speed to allow the user to interact with the system, altering his selections until the system displays a desired image. Processing at a speed whereby the user experiences only a minimal or no delay between making a command and the system's response to that command is referred to as "real-time" processing.
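One way to build such a composite definition (an assumption for illustration, not the patent's stated algorithm) is to push every grid sample of the first table through the second table, reusing sample_fut from the sketch above; the result applied once is then equivalent to applying the two tables in sequence, provided the first table's outputs are normalized to the second table's [0, 1] input range.

```python
def compose_futs(first, second, interpolate=True):
    """Compose two FUTs: applying the result once is equivalent to applying
    `first` and then `second` (outputs of `first` assumed in [0, 1]^3)."""
    n = first.shape[0]
    composite = np.empty_like(first)
    for i in range(n):
        for j in range(n):
            for k in range(n):
                composite[i, j, k] = sample_fut(second, first[i, j, k], interpolate)
    return composite
```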
- controller 20 provides the composite FUT 28 to transform processor 18 which implements the transform in accordance with the composite FUT 28.
- Transform processor 18 is also preferably implemented in software on a general purpose computer, performing any transform specified in a given FUT, although, it may also be composed of individual hardware elements.
- the user may instruct image processor 14 to accept an electronic image from scanner 10(a) , perform selected transformations of the image, and display the transformed image on color monitor 16(a).
- controller 20 first (and preferably automatically) selects input FUT 12(a) for converting the electronic image from the particular scanner 10 (a) into a "reference" color space used in performing subsequent transformations.
- the defined input transform maps each point in the scanner's RGB space to a corresponding point in the perceptually based color space, u, v, L (i.e., the "reference” space). In performing this translation, the input transform compensates for idiosyncrasies of the associated scanner 10(a).
- each scanner 10(a) , 10(b) may generate coordinates in the RGB space different from each other. Accordingly, the input FUTs 12(a), 12(b) (see Fig. 4) are calibrated to compensate for idiosyncrasies of each scanner such that each scanner generates the same point in the reference space uvL when scanning the same color.
- FUT 12(a) is set once for a particular scanner and is not modified by the user thereafter. However, the user may select from a limited number of such FUTs, depending on input conditions.
- input transform definition 12(a) can be applied to the input data by transform processor 18 alternatively before image store 17, or after as part of a composite transform definition.
- the user may request that the image from scanner 10(a), after calibration 12(a), receive further "input” processing through modifiable input FUT 13, and then be stored on storage device 17 before being further processed and displayed.
- the user modifiable input transform definition 13 may be used by a user, for instance, to change the effects of calibration transform definition 12(a), if it does not suit his particular needs. Accordingly, after controller 20 instructs transform processor 18 to process the electronic image according to input FUT 12(a) it then instructs processing according to input FUT 13.
- the resultant calibrated, transformed image is then stored in a storage device 17.
- the electronic image is stored as an array of data values in reference space for each segment of the picture.
- controller 20 automatically selects three FUTs 26, 30, and 15(a) required to implement the task.
- Display calibration FUT 15(a) is designed to convert the image from reference color space into the RGB color space required by the specific monitor 16(a), calibrating the image to compensate for the characteristics of the monitor so that it appears to the user on display 16(a) as it would be printed on printer 16(b) .
- Gamut compression FUT 30 is designed to modify the image so that it will appear the same on monitor 16(a) as if printed on printer 16(b). For example, in most cases, electronic printers cannot print as wide a range of colors as a monitor can display. Thus, it is not useful to display these colors on a monitor. If the electronic image contains colors which the printer cannot display (but the monitor can) , the gamut compression transform maps these colors to acceptably similar colors within the printer's gamut.
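The patent does not specify how out-of-gamut colors are mapped; one simple strategy, sketched below purely as an illustration, pulls the chromaticity of an offending color toward the neutral point (D65 white is roughly u' = 0.1978, v' = 0.4683) until a caller-supplied gamut test accepts it. The predicate, step count and names are assumptions of the example.

```python
def compress_to_gamut(u, v, L, in_gamut, neutral=(0.1978, 0.4683), steps=20):
    """Pull an out-of-gamut (u', v', L*) color toward neutral until the
    in_gamut(u, v, L) predicate (e.g. a printer gamut test) accepts it."""
    nu, nv = neutral
    for i in range(steps + 1):
        t = i / steps                          # 0 = original color, 1 = fully neutral
        cu, cv = u + t * (nu - u), v + t * (nv - v)
        if in_gamut(cu, cv, L):
            return cu, cv, L
    return nu, nv, L                           # fall back to the neutral point
```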
- User modifiable display FUT 26 is used to adjust the calibration of display 16(a) if the standard calibration FUT 15(a) does not suit its needs, for instance with respect to room light. FUT 26 adjusts all colors to compensate for the ambient lighting conditions surrounding display 16(a) .
- the three FUTs 26, 30 and 15(a) are typically selected once for a given site to accommodate the monitor, room and printer.
- controller 20 After selecting FUTs 26, 30, and 15(a), controller 20 next selects pleasing FUT 24 corresponding to the color transformations requested by the user for a specific image.
- Pleasing FUT 24, for example, may increase the luminance parameter L of certain blue colors in the image.
- a user might, alternatively, for instance, be processing a picture taken by a photographer who habitually overexposes the photos.
- a different pleasing transform 24 might decrease the brightness over all ranges to counteract this incorrect overexposure.
- Controller 20 composes the four selected FUTs into a single composite FUT 28. It then fetches the previously stored data array representing the original image from storage 17 and instructs processor 18 to process the image according to composite FUT 28 for display on monitor 16(a) .
- controller 20 dynamically composes the FUTs 24, 26, 30 and 15(a) into the composite FUT 28.
- the electronic image from scanner 10(a) is then processed in a single transformation step in accordance with composite FUT 28 and thereafter is displayed on color monitor 16(a).
- transform processor 18 applied input calibration FUT 12(a) and user modifiable input FUT 13 sequentially to the input data and then the data was stored. Afterward, composite FUT 28 was applied to the data which had already been transformed through FUTs 12(a) and 13.
- transform controller 20 can compose FUTs 12(a) and 13 together before image store 17 and instruct transform processor 18 to apply the composed FUT to the data and then store the resultant.
- the invention may also be practiced by storing the data in image store 17 just as it is scanned, without passing through any FUTs, and then including those FUTs 12(a) and 13 in the group of FUTs composed by transform controller 20 after image store 17.
- the user if satisfied with the displayed image, next instructs the controller to generate a hard copy of the image on printer 16(b). Accordingly, the controller selects the FUTs 24, 30 and 15(b).
- the first two of the three definitions, pleasing FUT 24 and gamut compression FUT 30, are identical to the pleasing and gamut FUTs used in generating the display image above.
- Printer calibration FUT 15(b) is designed to convert the image into the CMYK color space, calibrating the image to compensate for characteristics of a specific printer 16(b), which have been previously determined. If input FUTs 12(a) and 13 had not been applied to the data before image store 17, then they too would be composed with FUTs 24, 30 and 15(b) to create output composite FUT 32.
- the foregoing description is of the hardware and software components used to implement a user's commands.
- the subject invention relates principally to the interface (22 Fig. 2) between a user and the hardware elements.
- the invention has broad applicability to many types of data array processing, a particularly preferred embodiment is with respect to color image processing. Such an embodiment of a user interface is described below.
- a typical user interface 22 includes an input pointing device, such as a mouse 120 or trackball (not shown) having at least one and beneficially two user activated buttons L, R.
- a video display 122 is used to display icons or graphical objects representing data arrays that the user will instruct the transform processor 18 to use for transformation operations and the results of his instructions on those data arrays.
- the mouse 120 or trackball can be moved by the user.
- a sensor in the mouse senses its relative motions and transmits signals representing those motions to a processor that causes the display 122 to display a correspondingly moving cursor 100.
- a keyboard 124 may also be used for this purpose.
- the terminal 122 is not the same as the monitor 16(a), which is also part of the user interface, upon which the picture image is displayed. However, it is possible to dedicate a portion of a single display device to both tasks, according to well known techniques.
- the cursor 100 (Fig. 9) is an icon that resembles a hand. As explained below, the user uses the hand to accomplish certain tasks.
- the user interface also includes tools, which are displayed on the terminal as graphical icons.
- Additional tools include a scanner input tool 102, a monitor output tool 104, a hard copy printer tool 106, a memory storage tool 108, a grey balance tool 110, a tonality tool 112, a work order tool 117, an adjustment tool 114 and an inking tool 116.
- a picture object 118 is also shown.
- a scanner input tool allows the user to control a scanner to scan pictures into the system.
- a picture object allows the user to select data for a given picture to be treated.
- a monitor tool allows the user to control a color monitor to display and view pictures as they might be modified by use of the other tools.
- An output tool allows the user to control a printer to make hard copies such as prints or films of the image displayed on the monitor.
- a memory storage tool allows the user to direct that image data be written into or read from computer disk memory.
- a grey balance tool allows the user to add or remove color cast from a picture, either as displayed on the monitor or as stored or printed.
- a tonal tool allows the user to adjust the brightness and contrast of a picture over specified tonal ranges.
- a work order tool allows the user to combine the effects of several tools into one tool.
- An inking tool specifies the press/proof conditions under which to display or output a picture and permits control of under color removal.
- the processor of the user interface generates icons displayed on monitor 122 representing each of the tools shown in Fig. 9.
- each tool is controlled by a program entity according to methods well known in the art. As discussed below, there can be multiple instances of the same tool, either identically parameterized, or variably parameterized. Each instance of a tool is represented by a tool data element, and each type of tool is represented by a software routine.
- the tools may be thought of as independent processors that receive instructions from the user and act according to those instructions.
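The message-passing style described here can be sketched with a few lines of object-oriented code; this is only a schematic reading of the text, and the class and method names (Tool, receive, on_process and so on) are assumptions of the example rather than anything named in the patent.

```python
class Tool:
    """Minimal message-passing entity: a data element (its attributes) plus
    a routine that dispatches incoming messages to handlers."""

    def __init__(self, name, location):
        self.name = name
        self.location = location              # icon position on the control display

    def receive(self, message, sender=None):
        handler = getattr(self, "on_" + message, None)
        return handler(sender) if handler else None


class PictureObject(Tool):
    """An object representing input data values; it answers a "process"
    message by saying where its data array is stored."""

    def __init__(self, name, location, memory_location):
        super().__init__(name, location)
        self.memory_location = memory_location

    def on_process(self, sender):
        return ("process_picture_at", self.memory_location)
```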
- the user activates and implements the tools by manipulating the icons of the tools with the hand icon, which is a cursor.
- the user can invoke in the hand two states of openness, i.e. open or closed, and two states of activation, i.e. activated and not activated.
- the states are invoked by use of two buttons L (left) or R (right) on mouse 120.
- the hand 100 can be closed by pressing the left mouse button.
- the hand icon has a "hot spot" (i.e. the "+" in Fig. 11, bottom). If the hot spot of the hand icon is overlapping any part of any other icon on the terminal screen 122 at the instant it is closed, it will grasp that object, as shown at 130.
- the icon representing the hand changes shape to show a hand grasping the tool or object being grasped, to signal to the user the grasped state of the tool.
- the drawing of a hand 98 is not part of the user interface, but is a representation of an actual hand of a human user, pressing the buttons of the mouse 120.
- This representational aspect of the user interface i.e. the change of appearance of the hand icon when it changes state, closely represents the actions of three dimensional objects in the real world.
- This aspect is implemented in other fashions, as will be discussed, with respect to other actions implemented by the user.
- the tools are program entities that have attributes, such as display location, applied state, active state, control settings, etc.
- Each tool has an icon associated with it.
- the user actually causes the tool object, the program entity, to move in a space that is represented by the display 122. This is shown to the user by a change in appearance of the icon e.g. it moves on the screen, or opens up to reveal additional controls.
- the hand opens 132 and its icon changes shape. If the hand was grasping an object before it was opened, it will release that object. If the hand is open and the right button is pushed, the hand activates any object or tool (or portion of a tool) to which its pointing finger is adjacent 134. The effect of activating a tool differs from tool to tool, as is discussed.
- when the user commands the hand to apply a grasped tool, the grasped tool is applied 136. If the grasped tool is adjacent to another object or another tool to which the grasped tool can be applied, then some activity will occur with respect to that second tool or object. If the grasped tool is not adjacent to any object or tool upon which it can be applied, the grasped tool and hand icon pair will change shape to indicate to the user that the grasped tool is being applied, but there is no object for its application. For instance, the adjustment tool 114 rotates as would a screwdriver in use, to illustrate its use and aid in learning.
- the interaction between the user, the hand 100 and other tools or objects in general is shown schematically in Fig. 4.
- the hand is initialized 200 and is made open 202.
- the system evaluates 204 the user command. If the user commands the hand to grasp (by pressing the L button), at 208 the system evaluates if there is an object at that display location (which location would be part of the data parameters defining the program or processor entity that constitutes that object). If there is, at 206 the system sends a message identifying the hand's location on the display 122 along with a message that whatever is at that location is being grasped. Then at 210 the object receives the instruction to be grasped and sends back a message that something has been grasped and what it is.
- the hand icon changes shape or replaces the cursor with a view of the object held by a hand, and the grasped object moves with the hand cursor 100 in response to the user's motion of the mouse.
- otherwise, the user interface operating system would indicate that there was no object, and the hand would close empty handed at 212.
- In practice, it is the system (or a tool, as the case may be) that does the sending of a message including the hand's (or a tool's) location, which message is intercepted by the object at that location.
- An alternative is for the system or tool to examine the location beneath the hand and determine if there is another object there. If so, the system or tool sends the message. If not, it sends no message.
- In what follows, the general case will be discussed as if the hand (or a tool) simply sends the message to the second tool, without regard to the manner by which it is determined whether a second tool is at the location, or the fact that the operating system, rather than the hand, sends and receives messages.
- if the user had next commanded the hand to open, it would return to 202 and be made open. If at 214 the user had transmitted any other command, such as no command or a movement of the mouse, the hand 100 would remain closed.
- the hand, which is grasping something, would evaluate the user command at 216. If the user had pressed the L button, directing the hand 100 to apply the held tool or object, the hand 100 would send 217 an "Apply" message to the held tool or object to apply itself. The held object would at 218 act accordingly upon receiving a command to apply itself, depending on its environment. A specific implementation of a tool responding to an "apply" message is shown schematically in Fig. 5 and is discussed below. After the object has been applied, the hand 100 again evaluates 216 the user command and proceeds as described.
- if the user command is to "release" (also referred to as "drop") the object, the hand 100 sends a "Drop" message to the grasped object that it is released, and the hand drops 228 the object.
- the specific routine for the object being released is invoked at 230.
- a routine for a specific object having been dropped is shown schematically at Fig. 6. After the completion of such a routine, the hand 100 is made open at 202 and again evaluates a user command.
- if at 216 the user command is neither "apply" nor "release", the hand again evaluates the user command at 216. If, for instance, the user command is "move", the hand returns to 216 and then, if an "apply" command is in place, it again goes to the specific routine for the application of that object.
- if the user command is to activate, the hand 100 sends 220 a message that whatever is at that location is being activated and the hand icon changes shape, e.g. to a hand with a pointing finger.
- the object receives 224 the instruction to be activated.
- the routine for the particular object, when activated, is invoked 226.
- Such a procedure for a particular activated object is shown schematically at Fig. 5.
- an object can be applied without being activated, depending on whether that condition would be an appropriate analogy to the real world.
- if the hand 100 is not over an object, it will change its shape to point to nothing at 240, to illustrate its use and aid in learning.
- the hand will then evaluate the user command. If the user command is "deactivate", the hand returns to 202 and is made open, proceeding as before. If the command is other than "deactivate", such as "no command" or a movement of the mouse, the hand remains in its activated state.
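Pulling the branches of Fig. 4 together, the hand's control flow can be sketched as a small loop over abstract user commands; the mapping of the two mouse buttons onto these commands is left out, and the command strings, the find_object_at helper and the receive interface (from the Tool sketch above) are assumptions of the example.

```python
def hand_loop(commands, find_object_at):
    """Schematic control flow of the hand (Fig. 4).  `commands` yields
    (command, location) pairs; `find_object_at` returns the object whose icon
    lies under the hand's hot spot, or None."""
    grasped = None
    for command, location in commands:
        if command == "grasp" and grasped is None:
            grasped = find_object_at(location)      # 206/208: close on an object, if any
            if grasped is not None:
                grasped.receive("grasp")            # 210: the object learns it is held
        elif command == "apply" and grasped is not None:
            grasped.receive("apply")                # 217/218: held tool runs its Fig. 5 routine
        elif command == "release" and grasped is not None:
            grasped.receive("drop")                 # 228/230: Fig. 6 routine, hand opens again
            grasped = None
        elif command == "activate" and grasped is None:
            target = find_object_at(location)
            if target is not None:
                target.receive("activate")          # 220/224/226: target runs its Fig. 5 routine
```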
- if a tool is released or activated, that tool invokes its own specific routine for being released or activated.
- the routine for released and activated tools of the type that "create" a FUT is shown schematically at Fig. 5.
- Such tools include the grey balance tool 110, the tonal tool 112 and the work order tool 117. It will be understood to those of ordinary skill in the art that tools that "create" FUTs do so by instructing transform controller 20 to create a FUT.
- the tool creates a FUT, albeit at a distance, and through the agency of another program entity, the transform controller 20; similarly, with respect to modifying and composing FUTs.
- the tool enters the routine from the stage discussed above at A or C of Fig. 4.
- the tool receives 304 a message, if any, from the target.
- the tool evaluates 306 the message. If 308 there is no message from a target, then the tool returns to Fig. 4. Such a situation could arise if there is no object under the tool, or if the object under the tool is not the type that can receive a FUT. If the message is "process the picture at memory location X", the tool gets 310 the data at memory location X and transforms 312 the data according to the FUT then associated with the tool. The tool writes 314 the data to the memory location X and returns 316 to Fig. 4.
- the tool computes a new FUT if necessary 318 and sends 320 the FUT to the target object. The tool then returns 322 to Fig. 4. The target object will proceed along its own command path. A typical such command path is discussed with respect to a monitor tool 104 and Fig. 7.
- the routine for a dropped object is shown schematically at Fig. 6, which is invoked at B of Fig. 4.
- the tool receives 328 a message from the object.
- the tool evaluates 330 the received message. If the received message is "send your FUT,” the tool sends 332 its FUT and returns 334. If at 330 the message is anything other than "send your FUT, " the tool returns at 334 without sending its FUT.
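Combining the Fig. 5 and Fig. 6 behaviour, a FUT-creating tool can be sketched as a subclass of the Tool class above that reacts to the two messages named in the text; the image_store dictionary, the message tuples and the receive_fut call on the target are assumptions of the example, and sample_fut is the lookup sketch given earlier.

```python
import numpy as np

class FutTool(Tool):
    """Sketch of a FUT-creating tool (grey balance, tonal, work order)."""

    def __init__(self, name, location, image_store, grid_size=17):
        super().__init__(name, location)
        self.image_store = image_store
        grid = np.linspace(0.0, 1.0, grid_size)
        # start from an identity table; subclasses rebuild it from their controls
        self.fut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)

    def handle_target_message(self, message, target):
        if message is None:
            return                                       # nothing under the tool (308)
        kind, payload = message
        if kind == "process_picture_at":                 # Fig. 5: transform the stored data (310-314)
            data = self.image_store[payload]
            self.image_store[payload] = np.apply_along_axis(
                lambda c: sample_fut(self.fut, c), -1, data)
        elif kind == "send_your_fut":                    # Fig. 6: hand the table over (332)
            target.receive_fut(self.fut)                 # target composes or applies it
```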
- the tool must compute a new FUT.
- This routine is shown schematically in Fig. 8 with respect to the grey balance tool 110, shown in more detail in Fig. 14.
- the grey balance tool 110 is used to add or remove a color cast to an image.
- the grey balance tool permits a color cast change with respect to all or only specified parts of the tonal range of an image. For instance, if the highlights of an image are too blue, it is possible to reduce the blueness of only the highlights of the picture, leaving untouched the blueness of all other tonal portions of the picture e.g. the shadows and midtones.
- the grey balance tool 110 is divided into zones, which zones act as controls to be separately activated by the user.
- the grey balance tool 110 (Fig. 14) includes a color control 520, a maximum effect control 522 and decreasing and increasing effect controls 526, 524 respectively.
- the user indicates with the hand 100 a location on the color circle 520, for instance as marked by the small x.
- the color circle represents all hues from neutral to their most saturated forms; the center represents neutral and the outside edge represents the most saturated form of each hue. Hues vary continuously around the circle.
- the letters C, G, Y, R, M, B stand for cyan, green, yellow, red, magenta and blue, respectively. Pointing with the activated hand 100 at the center of the color circle indicates no color change.
- Pointing the hand 100 at a location away from the center indicates the amount of a color change (not the absolute amount of color cast).
- if the user wants to remove a purple color cast, he adds a yellowish green color cast to all hues in the picture by pointing at the spot indicated by the x. If the user finds that the added cast was correct in color, but too weak in saturation, the user points further out along the radius on which x lies, until the desired color cast is achieved.
- the user must also select the range of tones over which the color cast is desired. This is done with adjustment of the maximum effect control 522, increasing effect control 524 and decreasing effect control 526.
- the scale 528 indicates schematically what range of tonality will be affected by application of the three other controls. The shadows are indicated on the left of the scale 528 and the highlights on the right. Because, as shown in this example, the maximum effect control extends equally about the mid region of tonality, the full effect of the yellowish green color shift will take effect only in regions of midrange brightness.
- the slope of the edges of increasing effect control 524 and decreasing effect control 526 can be moved by activating the hand in that region and moving the hand while activated.
- if the increasing effect control 524 and decreasing effect control 526 were not inclined, the range of maximum effect would span the tonal range and the indicated color cast change would be applied fully to all tonality ranges, from the darkest shadows to the lightest highlights. If, however, the increasing effect control 524 is inclined as indicated, then tonal regions that are only slightly darker than the mid tonal region indicated by the maximum effect control 522 will have their color cast changed, but at less than the full change. The further from the midtones that a tonal range is, the less will there be an effect in parts of the image having that tone. Inclining the increasing effect control 524 more steeply, so that it does not intersect the side margin of the panel, as shown with respect to the grey balance tool 110b of Fig. 16, will result in no color change whatsoever at the darkest shadows.
- the decreasing effect control 526 works similarly with respect to the other side of the area of maximum effect. It is also possible, in a preferred embodiment, to move the area of maximum effect from left to right or vice versa, i.e. from darker to lighter tonalities, by activating the maximum effect control 522 with the hand and moving it, side to side, in the desired direction.
- the width of the region of maximum effect can also be widened or narrowed by activating the maximum effect control 522 and moving up or down. Both side to side and up and down motion may be done in combination.
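Reading the three range controls as a trapezoid-shaped weighting over tone is one plausible interpretation (the patent gives no formulas); the sketch below scales the selected chromaticity shift by that weight. The parameter names, the 0-to-1 tone scale and the 0-to-100 L* scale are assumptions of the example.

```python
def tonal_weight(tone, max_lo, max_hi, rise, fall):
    """Weight (0..1) of the cast change at a tone in [0, 1] (0 = darkest
    shadow, 1 = brightest highlight): full effect between max_lo and max_hi,
    ramping over `rise` on the shadow side and `fall` on the highlight side."""
    if max_lo <= tone <= max_hi:
        return 1.0
    if tone < max_lo:
        return max(0.0, 1.0 - (max_lo - tone) / rise) if rise > 0 else 0.0
    return max(0.0, 1.0 - (tone - max_hi) / fall) if fall > 0 else 0.0

def grey_balance(u, v, L, cast_u, cast_v, max_lo, max_hi, rise, fall):
    """Shift chromaticity (u', v') by the chosen cast, scaled by the tonal
    weight; lightness L* (assumed 0..100) is left unchanged."""
    w = tonal_weight(L / 100.0, max_lo, max_hi, rise, fall)
    return u + w * cast_u, v + w * cast_v, L
```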
- the effect of moving these controls on the FUT generated by the grey balance tool is shown schematically in Fig. 8. The grey balance tool starts the routine at 350.
- if the hand is in the color circle control 520, the tool computes 354 the color parameters for its FUT based on the location of the hand, recomputes 356 its FUT and returns 358 to the main routine shown in Fig. 6(a), then moving on to send 320 its FUT to the target object.
- if the hand is not in the color circle control 520, the grey balance tool evaluates if the hand is in the maximum effect control 522. If so, the grey balance tool identifies 362 the tonal range of the maximum effect based on the location of the hand and the parameters that will be used to generate the new FUT, and then proceeds to recompute 356 the new FUT as described above.
- if the hand is not in the maximum effect control 522, the tool evaluates 364 if it is in the increasing effect control 524. If it is, the grey balance tool computes 366 the degree of increasing effect based on the location of the hand, and identifies the parameters that will be used to generate the new FUT, then proceeding to generate the new FUT as before. If the hand is not in the increasing effect control 524, the tool evaluates 368 whether it is in the decreasing effect control 526, with similar results. If the hand is not in the decreasing effect control 526, then the tool returns 372 to the main routine shown in Fig. 6(a), then moving on to send 320 its FUT to the target object.
- the tool will generate a new FUT for each new control location and send that FUT to the target object, for instance a monitor tool.
- the system reviews the location of the hand at least two times per second. This is fast enough to provide real-time processing.
- the monitor tool 104 allows the user to control the high resolution monitor 16(a), and more. In its inactive state, the monitor tool icon 104 appears as shown in Fig. 9. When dropped on a picture, the monitor tool icon enlarges to surround the icon of the picture object 118 as shown in Fig. 15.
- the monitor tool includes several controls, which work in the same general fashion as the controls of the grey balance tool.
- a zoom slider 111 is used to control the degree of magnification that the high resolution monitor will show the picture.
- the view area control 113 can be moved around the monitor, by activating with the hand, to select the portion of the picture to be displayed on the high resolution monitor.
- the zoom setting indicator 115 displays the magnification of the chosen degree of zoom.
- the hand 100 grasps the monitor tool 104 when the user presses the L button, according to the message interchange discussed above with respect to the general operation of the hand 100.
- the user moves 408 the monitor tool 104 to a picture object 118 and releases 410 the monitor tool.
- the monitor tool sends 412 a message "process.”
- the picture object, being at the location of the monitor tool, receives 414 the message and sends a message "a data array for the picture to be processed is at memory location [specified]".
- the monitor tool 104 receives 416 the message from the picture tool, gets the data array from the specified memory location and processes the picture data using the FUTs then in place. It will be assumed for the purpose of this discussion that the only FUT in place is a display calibration FUT 15(a) (Fig. 3).
- the monitor tool has associated with it a composite FUT 28, which may combine a plurality of individual FUTs into one, thereby minimizing the computation time.
- composite FUT 28 is the same as display calibration FUT 15(a) .
- the monitor tool 104 stores the data for the transformed array in memory at a location different from that where the original picture data resides. Thus, the original data remains intact for further manipulation.
- the monitor tool causes the data for the transformed array to be sent to the color monitor 16(a), where it is displayed for the user's inspection. It will be understood that the data array that results from the application of composite FUT 28 will be in RGB space, which are the type of signals that the display device 16(a) uses to display an image. If the user wants to add a gamut compression FUT 30, to display on the monitor the image as it would actually be printed by a specific printer, at 420 he causes the hand 100 to grasp an adjustment tool 114, which in a preferred embodiment appears as a screwdriver. The hand moves 422 the adjustment tool 114 over the monitor tool and applies 424 the adjustment tool.
- the adjustment tool sends a message "adjust.”
- the monitor tool receives the message and the icon changes 426 appearance to reveal an adjustment panel 105.
- the adjustment panel 105 includes a zone 107, referred to as an "inking sticky pad.”
- the hand moves away 428 from the monitor and releases the adjustment tool 114, which remains where released and causes no further action.
- the user moves 430 the hand 100 to and grasps an ink tool/object 116.
- the user moves 432 the ink object over the sticky pad and causes the hand to send a "release" message to the inking tool 116.
- the inking tool 116 sends 434 a "process" message.
- the monitor tool receives the message and sends 436 a "send your FUT" message, which is received by the inking tool.
- the inking tool 116 sends 438 its FUT for the gamut compression of a particular inking process and that FUT is received by the monitor tool 104.
- the monitor tool 104 composes 440 the gamut compression FUT with whatever FUTs it has been previously set up with. This results in a composite FUT 28 (fig. 3), which provides the same transformation as would first application of gamut compression FUT 30 and next display calibration FUT 15(a).
- the monitor tool gets the originally stored picture data and processes the data through the newly composed FUT, and displays the picture as modified on the color monitor 16(a) .
- the user interface provides an elegant method by which a user can cause a composite FUT to be created from two individual FUTs, simply by selecting tools on the display and moving those tools relative to one another and picture objects.
- the user may want to effectuate additional transformations to the image being worked upon, which are also of a general nature.
- the room in which the monitor 16(a) is used may be unusually bright and thus a change in the brightness is desired so that the image appears as would a printed image in normal lighting conditions. Further, it may be desired to offset an undesired color cast caused by the lighting in the viewing room.
- This second type of adjustment is a grey balance adjustment.
- a work order tool 117 is a tool that combines other tools which have FUTs associated with them, such as inking tools, grey balance tools and tonality tools.
- a work order tool 117 is shown in an activated configuration in fig. 16.
- the work order tool shown has three other tools associated with it: a grey balance tool 110a for affecting the shadows, a grey balance tool 110b for affecting the highlights and a tonal tool 112a for an overall effect.
- the work order tool can accommodate a large number of tools. It should be noted that there is no limit to the number of identical types of tools that a user can use in one image processing environment.
- new tonal tools 112 can be freely made by copying from existing tonal tools, and then modifying them. If a tool is copied, according to the software embodiment of the invention, a new data structure is created that is initially virtually identical to the data structure of the tool copied.
- the user simply causes the hand 100 to grasp the desired tool and release it on the work order tool 117.
- the released object sends a "process" message to the work order tool.
- the work order tool sends a message that is received by the released tool to "send FUT".
- the work order tool receives the FUT, and combines it with the previously combined FUTs received from tools already a part of the work order.
- the work order tool then graphically merges the added tool into the work order.
- Each "pocket" of a work order tool is actually a sticky pad.
- the work order tool is another tool by which the user can combine transformations, simply by causing the hand to move graphical objects relative to one another on a terminal display.
- the user grasps 444 a previously prepared work order tool 117 and moves it to the sticky pad 109 on the monitor tool adjustment panel 105, where it is released.
- the work order tool 117 sends its composed FUT to the monitor tool, which receives it and composes 450 a new FUT based on the FUTs already installed, for instance the display calibration FUT 15(a) and gamut compression FUT 30 generated by inking tool 116, stuck onto sticky pad 107.
- the order in which the monitor tool composes the FUTs is important. Further, it is not always possible for the monitor tool to simply compose a new FUT, for instance from the work order tool 117, with a previously existing one, e.g. from the display calibration combined with the gamut compression. Sometimes, depending on the types of FUTs involved, it is necessary to begin anew with its uncomposed source FUTs, and recompose them all.
- the order of composition is to first compose customer modifiable monitor-to-proof ("CMMP") FUTs, such as grey balance, tonal tool or work order FUTs, with inking FUTs, and next with display calibration FUTs, such as 15(a) for a monitor or 15(b) for a printer.
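Assuming that composition order mirrors application order, the monitor tool's recomposition can be sketched as a left-to-right fold over the ordered tables, reusing compose_futs from the earlier sketch; the function and argument names are assumptions of the example.

```python
def build_monitor_composite(cmmp_futs, inking_fut, calibration_fut):
    """Compose in the stated order: CMMP tables first, then the inking
    (gamut) table, then the display calibration table."""
    composite = None
    for fut in list(cmmp_futs) + [inking_fut, calibration_fut]:
        composite = fut if composite is None else compose_futs(composite, fut)
    return composite
```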
- the monitor tool 104 gets 452 the original picture data and processes the data through the newly composed FUT and redisplays the altered picture on color monitor 16(a) .
- the foregoing changes are the type of changes that the user must make so that the picture is displayed as it will be printed on the chosen printer.
- the user may also want to add an artificial or artistic alteration to the picture, such as to increase the contrast of the midtones of the picture.
- the hand 100 grasps a tonal tool 112 and moves 456 it to the monitor, where it is released and then opens up.
- the user points 458 the hand at the tonal tool 112 to activate it, whereby the tonal tool sends its FUT to the monitor tool 104, which composes a new FUT as described above in connection with step 450.
- the monitor tool applies the new FUT to the data for the picture and displays the further modified picture on the color monitor 16(a) .
- the tonal tool is shown in detail in fig. 13.
- the user may, for instance, desire to increase the brightness and contrast of certain portions of the picture, for instance those presently of medium brightness.
- the tonal tool includes a brightness control 502 and a contrast control 504, a maximum effect control 506, an increasing effect control 508, a decreasing effect control 510, and region 599 of both brightness and contrast control together.
- the user points 462 the hand 100 at the contrast control 504 and presses the R button sending 464 a message to activate the slider control 504.
- the slider control 504 sends 466 a message to the tonal tool 112, indicating its current position.
- the tonal tool 112 computes 468 its FUT based on the location of the contrast slider control 504.
- the means by which the parameters of the FUT vary with respect to the location of the controls is unimportant with respect to the present invention.
- changes in location of the contrast control invoke changes in the FUT such that when the FUT is applied to data, the picture generated changes with respect to its contrast. Similarly with respect to brightness.
- the tonal tool sends 470 the newly computed FUT to the monitor tool as already discussed, and the monitor tool recomputes 472 its FUT including the new FUT from the tonal tool 112 and recomputes 474 and redisplays the picture, indicating the effect of the tonal adjustment. If the user again moves the contrast control 504, at 476 the tonal tool loops back to 466 and generates a new FUT, which is sent to the monitor tool and used as discussed.
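How the brightness and contrast settings map onto a FUT is left open by the text; a minimal sketch, assuming L* runs from 0 to 100 and contrast pivots about mid-grey, is shown below, with the tonal-range weight playing the same role as in the grey balance sketch.

```python
def tonal_curve(L, brightness=0.0, contrast=0.0, weight=1.0):
    """Adjust lightness L* (0..100): `brightness` shifts it, `contrast`
    pivots about mid-grey, and `weight` (e.g. from tonal_weight above)
    restricts the adjustment to the selected tonal range."""
    adjusted = L + brightness + contrast * (L - 50.0)
    adjusted = min(100.0, max(0.0, adjusted))
    return L + weight * (adjusted - L)
```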
- the pixel elements that will be affected are highlighted in some fashion, for instance all are made orange or all are made white.
- when the R button is released, the actual contrast transformation takes place.
- the user grasps 478 the monitor tool and removes it from the picture object 118.
- the monitor 16(a) will no longer display 480 the picture in question.
- the user grasps the tonal tool 112 with the hand and applies 482 it to the picture object 118.
- the tonal tool gets the picture data from memory and transforms it according to the FUT generated by the tonal tool 112 according to its settings 484. This data is stored in the location for the picture data and the picture data is thus permanently changed.
- the user interface of the invention keeps inviolate the data for the picture during the interactive user modifications made while the monitor tool is displaying the picture.
- the original data is kept in one memory location and modified picture data is kept in another memory location.
- a tool room is shown in Fig. 12.
- multiple "rooms" are provided.
- the user can move objects or tools from one room to another, by grasping the object and carrying it to a door 121.
- the door opens and the tool or object can be carried to the adjoining room.
- the user can create new rooms and connect them to each other as desired.
- the CHI of the invention includes a room with special properties, referred to as a "tool room, " which acts as a room having an endless supply of standard tools.
- the tool room is shown in Fig. 12 and includes one master copy of each tool that will be used, set up according to default parameters.
- the tool room creates a working copy of the tool which the user takes to the new room.
- this creation amounts to the generation of a copy of the data structure for the tool.
- the creation is effected by the activation of an additional processor designed to carry out the activities of the tool.
- the user goes to the tool room and simply takes the tool desired. It is not necessary to consciously keep track of the last copy of a tool, so that it is not inadvertently destroyed or altered. Further the master copies of the tools are all kept in a central place.
- the invention also includes a method and apparatus for visual or graphical programming.
- the user programs the transform controller 20 by selecting and moving graphical objects on a screen relative to each other. For instance, as described in connection with Fig. 7, the user programmed the transform controller 20 to select first FUTs 15(a), 30, 26 and 24 and to apply those to the data array for purposes of viewing on monitor 16(a). Next, the user programmed the controller 20 to select FUTs 24, 27, 30 and 15(b) and apply them to the data for purposes of printing. All of this programming was done by manipulating graphics rather than by typing commands in a word-based programming language.
- the CHI of the invention is also useful in connection with hardware that operates at a high enough instruction rate that it is not necessary to compose multiple FUTs together in order to achieve real time image processing.
- the claimed invention is intended to cover those embodiments that do not include composition of FUTs.
- the invention may be implemented in software or hardware.
- each tool is controlled by a processor or a portion of a processor, which generates signals causing the terminal to display the icon of the tool.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Processing Or Creating Images (AREA)
Abstract
The apparatus of this invention generates an array of modified data values related to an array of input data values in response to commands issued by a user. The invention is particularly suited to color image processing. The apparatus includes storage for storing at least one data transform definition and data processing means for identifying at least one of the transform definitions in response to the user's selection of at least one data transformation and for applying the identified transform definitions to the array of input data values. A user interface is described. The interface includes an input device and a display device. The display device displays a graphical representation of the user's instructions and of the transform definitions. The user selects the graphical representation of at least one of the definitions and moves it on the display device. A signal processor identifies the selected definition or definitions and applies the selected transform definitions to the array of input data values to arrive at the modified values. In this way, for example, the tonal characteristics of an image can be modified. The invention further includes means for generating a single composite transform definition, such that application of the composite definition to the input data values generates the same output data values as would the individual, serial application of at least two selected definitions.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US69328191A | 1991-04-26 | 1991-04-26 | |
US693281 | 1991-04-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
EP0538462A1 true EP0538462A1 (fr) | 1993-04-28 |
Family
ID=24784041
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19920913518 Withdrawn EP0538462A1 (fr) | 1991-04-26 | 1992-04-15 | Interface homme/ordinateur servant a effectuer des transformations de donnees en manipulant des objets graphiques sur un affichage video |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP0538462A1 (fr) |
JP (1) | JPH06502055A (fr) |
WO (1) | WO1992020184A1 (fr) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0772767A (ja) * | 1993-06-15 | 1995-03-17 | Xerox Corp | Interactive user support system |
US5502580A (en) * | 1993-10-06 | 1996-03-26 | Fuji Photo Film Co., Ltd. | Color reproduction system |
EP0674430A1 (fr) * | 1994-03-24 | 1995-09-27 | Eastman Kodak Company | Method and device for interactive transformation of color values between color spaces |
JPH0816129A (ja) * | 1994-04-27 | 1996-01-19 | Canon Inc | Image processing apparatus |
JP2914227B2 (ja) * | 1995-07-11 | 1999-06-28 | Fuji Xerox Co., Ltd. | Image processing apparatus and image processing method |
JP3163987B2 (ja) * | 1995-09-04 | 2001-05-08 | Fuji Xerox Co., Ltd. | Image processing apparatus and gamut adjustment method |
US5910796A (en) * | 1996-05-20 | 1999-06-08 | Ricoh Corporation | Monitor gamma determination and correction |
SG75190A1 (en) | 1998-12-14 | 2000-09-19 | Canon Kk | Image processing method and apparatus, image processing system and storage medium |
JP3478329B2 (ja) * | 1999-10-01 | 2003-12-15 | Seiko Epson Corp. | Image processing apparatus and image processing method |
US7342682B2 (en) * | 2002-12-05 | 2008-03-11 | Canon Kabushiki Kaisha | Incremental color transform creation |
JP2006074331A (ja) * | 2004-09-01 | 2006-03-16 | Ricoh Co Ltd | Image processing apparatus, image processing program, storage medium, image processing control method for image processing apparatus, and image forming apparatus |
EP1821518B1 (fr) * | 2006-02-16 | 2013-05-22 | Hewlett-Packard Development Company, L.P. | Personalized color reproduction |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2794714B2 (ja) * | 1988-06-29 | 1998-09-10 | Sony Corp | Printer |
JPH0659085B2 (ja) * | 1988-07-12 | 1994-08-03 | Dainippon Screen Mfg. Co., Ltd. | Image simulation method |
JP2695484B2 (ja) * | 1989-09-08 | 1997-12-24 | Fuji Photo Film Co., Ltd. | Color scanner |
1992
- 1992-04-15 EP EP19920913518 patent/EP0538462A1/fr not_active Withdrawn
- 1992-04-15 JP JP4511852A patent/JPH06502055A/ja active Pending
- 1992-04-15 WO PCT/US1992/003135 patent/WO1992020184A1/fr not_active Application Discontinuation
Non-Patent Citations (1)
Title |
---|
See references of WO9220184A1 * |
Also Published As
Publication number | Publication date |
---|---|
JPH06502055A (ja) | 1994-03-03 |
WO1992020184A1 (fr) | 1992-11-12 |
Similar Documents
Publication | Publication Date | Title
---|---|---
US5898436A (en) | | Graphical user interface for digital image editing
US5254978A (en) | | Reference color selection system
US6031543A (en) | | Image processing apparatus for correcting color space coordinates and method
EP0160548B1 (fr) | | Method and apparatus for selecting and producing colors
US5311212A (en) | | Functional color selection system
US6266103B1 (en) | | Methods and apparatus for generating custom gamma curves for color correction equipment
EP0177146B1 (fr) | | Image retouching
US7796296B2 (en) | | Personalized color reproduction
US5627950A (en) | | Real-time three-dimensional color look-up table interactive editor system and method
CA2079918C (fr) | | Image editing system and method with improved multidimensional editing controls
JP3869910B2 (ja) | | Image processing method and apparatus, and storage medium
US5420979A (en) | | Method and apparatus for using composite transforms to form intermediary image data metrics which achieve device/media compatibility for subsequent imaging applications
US7110595B2 (en) | | Method of and apparatus for image processing, and computer product
EP0503051B1 (fr) | | Color image processing system for preparing a composite image transform module for performing a plurality of selected image transforms
US20070222789A1 (en) | | Image processing method, image processing apparatus, storage medium and program
JPH0830772A (ja) | | Interactive color conversion apparatus and method between color spaces
EP0070680B1 (fr) | | Color image reproduction
EP0310388B1 (fr) | | Interactive image modification
US6057931A (en) | | Method and apparatus for controlling color image reproduction
JP2003087591A (ja) | | Image processing method and image processing apparatus
EP0538462A1 (fr) | | Human/computer interface for performing data transformations by manipulating graphical objects on a video display
US6456293B1 (en) | | Method and apparatus for working with constrained color on a computer terminal display
EP0741492A1 (fr) | | Selective color correction applied to a plurality of local color gamuts
JPH11196285A (ja) | | Image processing method, apparatus, and recording medium
US6441869B1 (en) | | Systems and methods for color specific hue control
Legal Events
Date | Code | Title | Description
---|---|---|---
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): DE FR GB
| 17P | Request for examination filed | Effective date: 19930421
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN
| 18W | Application withdrawn | Withdrawal date: 19931223