CA1324321C - System and method for color image enhancement - Google Patents
System and method for color image enhancement
- Publication number
- CA1324321C
- Authority
- CA
- Canada
- Prior art keywords
- image
- color
- mask
- region
- frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Landscapes
- Image Processing (AREA)
Abstract
A method for selectively coloring at least one picture element of an image comprised of a multiplicity of picture elements combines image information from mask-defined regions of the image with color information. The invention provides an improvement consisting of assigning to an image to be colored, mask-defining region information and color information from a previous image, and moving at least one of the mask-defined regions a predetermined distance along a predetermined axis to a moved location and storing mask-defining region information for the image to be colored at the moved location.
Description
SYSTEM AND METHOD FOR COLOR IMAGE ENHANCEMENT
This application is a division of application Serial No. 573,741 filed August 3, 1988.
Field of the Invention
The present invention pertains to a system and method for the color enhancement of monochrome or multi-color images by relation of digitized gray-scale information from the original image to unique values (triplets) of hue, luminance, and saturation derived from a color transfer function applied to the achromatic gray-scale information.
Background of the Invention
In strictly mathematical terms, the image produced on black and white film is one dimensional (brightness), while an original scene can be viewed as three dimensional (using the HLS model of color, where hue, luminance, and saturation are the dimensions). The chemical reaction of the film emulsion, and the camera mechanisms and filters, essentially reduce the three dimensional information to one dimension by complex mapping, and forever obliterate the original hue, luminance, and saturation information.
Note: It should be pointed out that luminance and brightness are not the same. Since brightness of the black and white image results from the complex interactions of the three elements of the HLS system, there is only a statistical correlation between brightness and luminance, but they are not identical.
Color Representation Systems
Color is both an absolute quality of light (its wavelength or frequency) and a relative quality which depends on
the human perceiver. For this reason, various color "gamuts"
have been proposed which allow perceived colors to be quantified. Among these models is the CIE chromaticity scale, which permits quantification of color by combinations of so-called primaries (which are, by themselves, not visible to human observers). (See Foley & VanDam, Fundamentals of Interactive Computer Graphics, Addison Wesley, Reading, MA
198~ pp. 606-608.) In addition to theoretical models of color (CIE Chromaticity, Hue-Luminance-Saturation, and Red-Green-Blue), there are many representational systems for color reproduction in various media such as the RGB system for cathode ray tubes, the process color system for printing, and the NTSC, PAL, and SECAM systems for broadcast video. Each of these models is, in fact, a relatively small subset of both the theoretical absolute range of colors and of the actual range of humanly perceivable colors.
One of the theoretical models (Ostwald, W., Colour Science, Winsor & Winson, London, 1931) which is especially useful for analysis of color is the HLS model of hue, luminance, and saturation. This model uses a three dimensional cylindrical universe (the absolute theoretical color space), inside of which is a space bounded by a pair of cones having a common base (the perception space). The cylinder's axis (from apex to apex of the cones) is the luminance axis, with the lower apex having a luminance of zero (all black) and the apex of the upper cone having a luminance of 1 (all white). Distance from the luminance axis (normal to the axis) determines saturation (proportion of a color which is made up of its complementary color), with the axis representing zero saturation (equal amounts of the primary color and its complement and therefore no color) and the surface of the cone representing maximum perceived saturation for a given luminance. Finally, angular displacement around the luminance axis represents hue, as with a conventional color wheel, with complements separated by 180 degrees.
The solid space defined by the HLS conic model is pointed at its top and bottom to represent the fact that as colors become whiter or blacker, they are perceived to lose hue.
Saturation is a characteristic which depends chiefly on the narrowness or purity of the band of wavelengths comprising a color. At the exact center of the space is a medium gray which exhibits a moderate emittance or reflectance for all wavelengths of light.
The RGB system is used, for example, in CRT displays, where each of the three electron guns (red, green, and blue) is controlled directly by a signal indicating its respective intensity. HLS color may be converted to the RGB
representation (see Foley & VanDam, pp. 648-9); however, there are many HLS combinations which must be "clipped" to be represented in RGB. The same is true of NTSC video (also known as YIQ), which has an even more restricted range than RGB. Thus, not every "real world" color can be reproduced by video systems using either RGB or NTSC standards. According to Foley & VanDam, "neither the V in HSV [Hue, Saturation, Value] nor the L in HLS correspond to luminance in the YIQ
model, so two different colors defined in either space can easily have the same luminance, and be indistinguishable on black and white TV or videotape." (Foley & VanDam p. 618) Since approximately 1970, several techniques have been developed for the addition of color to monochrome images.
Often, these techniques rely on video technology to augment the video signal in a defined manner, typically by adding a chrominance component signal to the pre-existing luminance signal. This technique results in an appearance not unlike that achieved by Thomas Edison in his early efforts to add tint to black and white film by painting the frames: a colored area over a black and white picture. Because of their reliance on video technology, all of these systems suffer from lack of vividness of colors, and from an inability to accurately reproduce colors.
Brief Description of the Invention
The present invention comprises a method for selectively
coloring at least one picture element of an image comprised of a multiplicity of picture elements by combining image information from mask-defined regions of the image with color information, the improvement consisting of: (a) assigning to an image to be colored, mask-defining region information and color information from a previous image; (b) moving at least one of said mask-defined regions a predetermined distance along a predetermined axis to a moved location and storing mask-defining region information for said image to be colored at the moved location.
Brief Description of the Figures
Figure 1 depicts in schematic form the creation of two adjacent frames of graphic information according to the method of the present invention.
Figure 2 is a schematic representation of a portion of the image memory structure according to one embodiment of the method of the present invention.
Figure 3 depicts the replication of a region memory portion of the image memory structure of the present invention.
Figure 4 depicts the correction of a value stored in the replicated region memory portion of Fig. 3.
Figure 5 depicts the application of the HLS color model to the coloring of an array of picture elements stored in the image memory structure of Figs. 2-4.
Figure 6 depicts a schematic representation of the "histogram slide" operation according to the method of the present invention.
Figure 7 depicts a schematic representation of the "histogram multiplication down" operation according to the method of the present invention.
Figure 8 depicts a schematic representation of the "histogram multiplication up" operation according to the method of the present invention.
Figure 9 depicts a schematic representation of the "color
bending" operation according to the method of the present invention.
Detailed Description of the Invention
General Process Overview
The film to be colored or color enhanced is first digitally recorded at a preselected resolution. In instances where the final product will be placed on conventional video tape, appropriate resolutions are employed (e.g. NTSC=525 lines, PAL=625, HDTV=1125-1500). Where the final product will be used for theatrical presentation, resolutions of 1024 lines or more may be employed. The digitization process (depicted in Fig. 1) comprises projecting each frame of the original (typically black and white) film through a liquid gate device which masks scratches and other imperfections. The projected frame is captured using a video camera such as a saticon tube and the video signal is digitized and stored in a random access memory frame buffer. The digitization process allows adjustment of the intensity of light falling on the film in order to achieve the best contrast for the particular frame being digitized (gamma correction). After acquisition of a frame, the digital representation of the frame is stored in non-volatile storage for further processing. The digitization process is repeated on a frame-by-frame basis, marking each frame with a time and sequence indication (SMPTE time code).
For each scene (sequence of frames), a frame is selected as being representative of the majority of the elements present in the sequence. This "key frame" is used as a template for the enhancement of other frames within the scene sequence. For this reason, the greatest care is applied to the processing of key frames.
After key frames are color-enhanced and defined, they are passed with the rest of the frames comprising a scene to a work station where the regions defining the application of
color are tracked from frame to frame. After each frame has been enhanced, it is restored in non-volatile storage and sent to a central storage where all enhanced scenes are eventually reassembled according to their original sequence as a final product motion picture.
Key Frame Coloring
The first task of the key frame colorist is to identify areas of the image which comprise objects which will be designated to be of similar hue. Several software tools are available to the colorist to aid in this computer-interactive definition process.
A "free hand" mode allows drawing and erasing as well as control over a variably-sized cursor. A pointing device controls an interactive cursor displayed on a screen overlaying the key frame to be processed. Movement of the pointing device effects an analogous movement of the cursor and the picture elements selected by this movement (which are displayed on a graphics plane known as the "mask plane") are used to define the edge of an area. Because the cursor size may be reduced to only one pixel, the smallest region which may be produced is also one pixel.
A second drawing tool available to the colorist is one which defines polygons, rather than completely free-hand shapes. Using the pointing device, the colorist may indicate the precise placement of vertices which are intended to define the perimeter of a polygonal area. Each vertex may be assigned or removed and, when complete, the vertices may be connected to define a closed path.
After definition of a closed perimeter by free-hand or polygon methods, the enclosed area may be assigned to the region (by setting the bits in the mask plane) by using one of two filling algorithms. An "area fill" assigns all picture elements within the closed shape and up to the defined edge to the area. A "cartoon fill" combines the defined edges with edge recognition algorithms in order to fill an area which is bounded by a border line such as the black pen lines which are typical of cartoons. Cartoon fill mode fills up to, or slightly over such heavy border lines.
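By way of illustration, an "area fill" of the kind described above can be realized with a conventional stack-based flood fill over the mask plane. The following sketch is not taken from the patent listings; the array layout, the names (mask, WIDTH, HEIGHT, EMPTY, EDGE, FILLED), and the 4-connected neighbourhood are assumptions. A cartoon fill would additionally treat sufficiently dark picture elements of the underlying image as edges.

/* Illustrative sketch only: stack-based, 4-connected area fill that assigns
 * every picture element inside a closed perimeter to the area.  Names and
 * data layout are assumptions, not taken from the patent listings. */
#include <stdlib.h>

#define WIDTH  512
#define HEIGHT 512
#define EMPTY  0    /* picture element not yet assigned to anything   */
#define EDGE   1    /* picture element drawn as part of the perimeter */
#define FILLED 2    /* picture element assigned to the enclosed area  */

static unsigned char mask[HEIGHT][WIDTH];   /* mask plane, one byte per element */

void area_fill(int seed_x, int seed_y)
{
    long cap = 8L * WIDTH * HEIGHT + 2;     /* worst case: 4 pushes per filled element */
    int *stack, top = 0;

    if (mask[seed_y][seed_x] != EMPTY)
        return;
    stack = (int *)malloc(cap * sizeof(int));
    if (stack == NULL)
        return;

    stack[top++] = seed_x;
    stack[top++] = seed_y;
    while (top > 0) {
        int y = stack[--top];
        int x = stack[--top];

        if (x < 0 || x >= WIDTH || y < 0 || y >= HEIGHT)
            continue;
        if (mask[y][x] != EMPTY)            /* stop at the drawn edge or filled area */
            continue;
        mask[y][x] = FILLED;

        /* push the four neighbours */
        stack[top++] = x + 1; stack[top++] = y;
        stack[top++] = x - 1; stack[top++] = y;
        stack[top++] = x;     stack[top++] = y + 1;
        stack[top++] = x;     stack[top++] = y - 1;
    }
    free(stack);
}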
After areas have been defined for all objects having a common hue, the final step is the selection and application of a given hue to that area. Colors may be selected by interactively pointing to a color wheel displayed on a video screen, and then by manipulating control indicators on the video screen which allow refinement of the gray-scale information for the area as described below. After selection, the color may be displayed by its application to the areas (which are collectively known as a color mask). After a color mask has been defined, its associated picture elements are identified as a "region", the mask plane is cleared, and additional masks are defined in a similar manner, one for each hue which will be displayed in the frame.
Once all regions are defined, the regions, the underlying frame, and the colors assigned to each region are stored for later retrieval by colorists who will reiterate the process using the subsequent frames of each scene and the key frame which has been defined.
An additional tool for defining regions is the threshold region definition tool. The threshold region definition tool permits the automatic definition of regions to be colored.
First, a brush size (a form of cursor) is selected. Next, an upper or lower threshold gray-scale value is selected. The upper/lower gray-scale value is representative of the edge of a contiguous group of picture elements having common brightness. An upper gray-scale value is selected if the threshold region definition tool is to be used to select picture elements below a given upper threshold value.
Alternatively, a lower threshold gray-scale value is selected to designate a region above said lower threshold value. The picture elements located within the brush tip are analyzed to determine the median brightness of those picture elements. If the median brightness within the brush tip is within the designated threshold region (either below the upper threshold region or above the lower threshold region), the picture elements are included in the designated region. The picture elements above an upper threshold region or below a lower threshold region are excluded from the new region.
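A minimal sketch of the brush test just described, assuming an 8-bit gray-scale image, a square brush tip of side "brush" (at most 64), and a median taken over the picture elements under the tip; all names and the square brush shape are illustrative assumptions rather than the patent's own routine.

/* Illustrative sketch: include the picture elements under a square brush tip
 * in the current region when their median brightness falls on the selected
 * side of the threshold.  Image layout, brush shape, and names are assumed. */
#include <stdlib.h>

#define WIDTH  512
#define HEIGHT 512

extern unsigned char gray[HEIGHT][WIDTH];   /* 8-bit gray-scale image     */
extern unsigned char mask[HEIGHT][WIDTH];   /* mask plane (1 = in region) */

static int cmp_uchar(const void *a, const void *b)
{
    return (int)(*(const unsigned char *)a) - (int)(*(const unsigned char *)b);
}

/* use_upper != 0: include when the median is below the upper threshold;
 * use_upper == 0: include when the median is above the lower threshold.  */
void threshold_brush(int cx, int cy, int brush, int threshold, int use_upper)
{
    unsigned char sample[64 * 64];          /* assumes brush size <= 64 */
    int x, y, n = 0, half = brush / 2, median;

    /* gather the picture elements under the brush tip */
    for (y = cy - half; y <= cy + half; y++)
        for (x = cx - half; x <= cx + half; x++)
            if (x >= 0 && x < WIDTH && y >= 0 && y < HEIGHT)
                sample[n++] = gray[y][x];
    if (n == 0)
        return;

    qsort(sample, n, sizeof(unsigned char), cmp_uchar);
    median = sample[n / 2];

    /* exclude the footprint when the median fails the selected test */
    if ((use_upper && median >= threshold) || (!use_upper && median <= threshold))
        return;

    /* otherwise include the brush footprint in the designated region */
    for (y = cy - half; y <= cy + half; y++)
        for (x = cx - half; x <= cx + half; x++)
            if (x >= 0 && x < WIDTH && y >= 0 && y < HEIGHT)
                mask[y][x] = 1;
}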
The process of extrapolating color enhancement from key frames to entire scenes is performed by identifying and tracking motion as well as appearances and disappearances of objects. Generally fewer than 5% of the picture elements in a frame of a scene change in the succeeding frame. Because motion pictures are filmed at 24 frames per second, increments of motion on all but the most rapidly moving objects are extremely small and easily followed. Once motion has been identified, adjustment of the regions and application of the key frames' predefined colors to those regions is all that is required.
Implicit in all frame-to-frame tracking is the ability to copy predefined parts of the preceding or key frame's region memory contents and to use those parts in subsequent frame processing. After copying, small adjustments may be made to the new region memory contents and they may be saved with the new frame. Importantly, because the digitization process assures consistency of brightness and gray scale, and because colors are predefined by an art director working with key frames, overall consistency from frame-to-frame, and scene-to-scene is also assured.
The simplest form of tracking from frame to frame is where motion in the frame is apparent only as a result of a camera pan. In such instances, the entire region memory may be copied from that of the preceding frame and "shifted" in the direction of the pan. Such a shift will create an undefined area at the end opposite the motion. This area must then be added to the existing regions in order to completely apply color.
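The pan case reduces to copying the previous frame's region memory with an offset. A sketch, assuming one region code per byte and an UNDEFINED marker for the uncovered strip (both assumptions for illustration):

/* Illustrative sketch: copy the preceding frame's region memory shifted by a
 * pan offset (dx, dy).  Picture elements uncovered by the shift are marked
 * UNDEFINED so they can be added to existing regions afterwards.  The 4-bit
 * region codes are held one per byte here for simplicity (an assumption). */
#define WIDTH     512
#define HEIGHT    512
#define UNDEFINED 0xFF

void shift_region_memory(const unsigned char prev[HEIGHT][WIDTH],
                         unsigned char next[HEIGHT][WIDTH],
                         int dx, int dy)
{
    int x, y, sx, sy;

    for (y = 0; y < HEIGHT; y++) {
        for (x = 0; x < WIDTH; x++) {
            sx = x - dx;                      /* source location in the previous frame */
            sy = y - dy;
            if (sx >= 0 && sx < WIDTH && sy >= 0 && sy < HEIGHT)
                next[y][x] = prev[sy][sx];
            else
                next[y][x] = UNDEFINED;       /* strip opposite the pan direction */
        }
    }
}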
When only some of the objects of a frame move, a corresponding portion of the region memory may be translated as well. In this operation, that portion of the region memory which is to be moved is designated by the operator and is moved manually to its new location, thus overwriting the former contents of the region memory at the new location. The old location must then be re-defined and/or added to one or more pre-existing regions in order to apply color.
In highly complex scenes, where several types of motion are present (especially compound motion of subject and camera), a "differencing" procedure is employed. A frame for which areas have been defined and to which color has been applied is subtracted from a next subsequent frame in order to reveal those picture elements which have changed (in gray-scale value). These areas of difference are then displayed as a contrasting overlay on the subsequent frame to be processed.
Alternatively, an algorithm for edge detection in gray-scale images may be used and may be correlated to the previous frame's region memory. Detected differences may then cause the subsequent frame's regions to be adjusted or deleted, as required. By examining the coincidence of this overlay and the region structure which has been duplicated from the preceding frame, the production colorist is able to quickly designate areas which should be added to the region structure, and those which must be removed from a given region and reassigned.
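The differencing step itself is a per-element comparison of the two frames' gray-scale values. A sketch, with an assumed tolerance parameter and names chosen for illustration:

/* Illustrative sketch: flag picture elements whose gray-scale value changed
 * between the previously enhanced frame and the next frame to be processed.
 * The flagged elements form the contrasting overlay described above.  The
 * tolerance parameter and all names are assumptions. */
#include <stdlib.h>

#define WIDTH  512
#define HEIGHT 512

void difference_overlay(const unsigned char prev[HEIGHT][WIDTH],
                        const unsigned char next[HEIGHT][WIDTH],
                        unsigned char overlay[HEIGHT][WIDTH],
                        int tolerance)
{
    int x, y;

    for (y = 0; y < HEIGHT; y++)
        for (x = 0; x < WIDTH; x++)
            overlay[y][x] =
                (abs((int)next[y][x] - (int)prev[y][x]) > tolerance) ? 1 : 0;
}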
The final operation is to assure that no objects have either completely appeared or disappeared from the frame. If an object disappears, its region must be removed. Similarly, newly appearing objects must have a region defined for them "on the fly" and a color assigned as well. After adjusting the region for the next subsequent image, the colorist again stores that image together with its region memory and repeats the process for each subsequent frame.
To permit quality control and review of a sequence of color-enhanced frames, two modes of viewing are used: Cine mode and Cycle mode. In cine mode, a series of frames is sequentially displayed, shuttling first in a forward direction, and then immediately in reverse. The speed may be controlled by the operator. In cycle mode, display direction and speed are directly tied to the pointing device to allow fine control over viewing frame-to-frame transitions.
Hardware Overview
The method of the present invention is implemented on a special purpose version of an RTI-Station image processing system interfaced to a personal computer (such as the IBM*
personal computer AT). The RTI image processing subsystem (available from Recognition Technology, Inc. of Holliston, MA) comprises a video digitizer, a pipeline architecture pixel processor, and image memory which is all interconnected both by a host computer bus and by a synchronous video bus. The video digitizer takes its input from a video camera such as that described above for digitizing film frames, and outputs its information to a high definition color CRT (though not necessarily one interfaced by conventional NTSC video circuitry). The host computer bus permits the image processing subsystem to communicate with the user through the host CPU, its memory, associated mass storage devices, and I/O
devices including pointing devices and keyboards.
The image processing subsystem of the preferred embodiment of the present invention comprises an image memory having 512 x 512 spatial resolution, 8 bits of brightness resolution, and 4 bits for region identification.
Alternatively, the same system may be configured for 6 bits of brightness resolution and 6 bits of region identification or for 7 bits of brightness resolution and 5 bits of region identification. (One additional bit is reserved for use by the system in construction of the mask bit plane.) General information regarding the structure and operation of the RTI
Station and its associated software library (RTILIB) may be found in the RTILIB/500 Tutorial Revision l.Oo and the RTILIB/500 User Manual Revision A, which are published by Recognition Technology, Inc.
Although the currently used implementation employs a 512 line spatial resolution, it will be recognized by those skilled in the art that increases to and above 2000 line spatial resolution are within the scope of the present invention. Additionally, the image memory may be a part of a larger frame store memory which may be, for instance, 4096 x 4096 x 16 bits. The display of frames may then be controlled by a memory pointer into this frame store.
For the purpose of color selection, the host CPU is provided with a 24 bit RGB video board (such as the Targa*24, a product of AT&T Information Systems, Inc.). A color wheel which corresponds to the common base of the HLS cone is displayed on a video screen. Using a pointing device, the colorist selects a particular color on the wheel, thus defining both hue (angular position on the color wheel) and saturation (distance from the center). As described more fully below, this unique 24 bit RGB value is translated immediately into the HLS coordinate space, correlated to the modified image gray-scale, and retranslated for output on an RGB monitor.
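The color-wheel pick and the return trip to RGB can be sketched as follows. Deriving hue from the angular position and saturation from the radial distance follows directly from the description above; the HLS-to-RGB conversion shown is the standard textbook algorithm (Foley & VanDam), not necessarily the routine used in the RTI software, and the function names are illustrative.

/* Illustrative sketch: hue and saturation from a pointer position on the
 * displayed color wheel, and the standard textbook HLS-to-RGB conversion
 * for display.  The patent's own routines are not reproduced here. */
#include <math.h>

/* hue in degrees [0..360), sat in [0..1000], from pointer offset (dx, dy)
 * relative to the wheel centre, for a wheel of the given radius. */
void wheel_pick(double dx, double dy, double radius, int *hue, int *sat)
{
    double angle = atan2(dy, dx) * 180.0 / 3.14159265358979;
    double r     = sqrt(dx * dx + dy * dy);

    if (angle < 0.0) angle += 360.0;
    if (r > radius)  r = radius;
    *hue = (int)angle;
    *sat = (int)(1000.0 * r / radius);
}

static double hls_value(double n1, double n2, double hue)
{
    if (hue >= 360.0) hue -= 360.0;
    if (hue < 0.0)    hue += 360.0;
    if (hue < 60.0)   return n1 + (n2 - n1) * hue / 60.0;
    if (hue < 180.0)  return n2;
    if (hue < 240.0)  return n1 + (n2 - n1) * (240.0 - hue) / 60.0;
    return n1;
}

/* h in degrees, l and s in [0..1]; r, g, b returned in [0..1]. */
void hls_to_rgb(double h, double l, double s, double *r, double *g, double *b)
{
    double m2 = (l <= 0.5) ? l * (1.0 + s) : l + s - l * s;
    double m1 = 2.0 * l - m2;

    if (s == 0.0) {            /* achromatic: gray at the given luminance */
        *r = *g = *b = l;
        return;
    }
    *r = hls_value(m1, m2, h + 120.0);
    *g = hls_value(m1, m2, h);
    *b = hls_value(m1, m2, h - 120.0);
}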
Software Overview
Referring now to Figs. 1-4, according to the method of the present invention, a series of digitized monochrome images may be stored in a random access memory of a computer as an array of gray-scale values. For example, 8 bits of information may be used to represent each picture element, thus permitting a total of 256 unique gray-scale values to be represented. (Of course, more bits may be used to achieve better resolution and dynamic range for the gray-scale information.) The digitization step may optionally include the use of look-up tables in order to enhance contrast, adjust gamma, enhance edge definition or "stretch" the gray-scale of a faded image. Associated with the memory used to store the gray-scale values is a region memory used to uniquely identify each picture element with one of a number of regions (depicted as a four bit memory structure in Figs. 2-5). In addition to the image memory and region memory structures, a mask plane memory is also associated with the image for use in identifying picture elements. Through conventional techniques, the contents or parts of the contents of these memory structures may be copied from one structure to another (for example, from the region memory to the mask plane memory) or from one frame's memory to another (as depicted in Fig. 3 for the transfer of frame 1 region memory to frame 2 region memory). The copied contents may, of course, be altered by the operator to conform to the underlying frame contents as depicted in Fig. 4.
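One possible per-picture-element layout for the memory structures just described, packing the 8 bits of gray-scale, 4 bits of region identifier, and the mask-plane bit of the Hardware Overview into a single 16-bit word, is sketched below. The actual RTI memory organization is not given in the text, so this packing and all names are assumptions.

/* Illustrative sketch: an assumed per-picture-element packing -- 8 bits of
 * gray-scale, 4 bits of region identifier, and 1 mask-plane bit in one
 * 16-bit word.  The RTI hardware's real layout is not specified here. */
typedef unsigned short pel_t;   /* 16-bit picture element word */

#define GRAY_MASK    0x00FF     /* bits 0-7  : gray-scale value  */
#define REGION_SHIFT 8
#define REGION_MASK  0x0F00     /* bits 8-11 : region identifier */
#define MASK_BIT     0x1000     /* bit 12    : mask-plane bit    */

static int get_gray(pel_t p)   { return p & GRAY_MASK; }
static int get_region(pel_t p) { return (p & REGION_MASK) >> REGION_SHIFT; }
static int get_mask(pel_t p)   { return (p & MASK_BIT) != 0; }

static pel_t set_region(pel_t p, int r)
{
    return (pel_t)((p & ~REGION_MASK) | ((r & 0x0F) << REGION_SHIFT));
}

static pel_t set_mask(pel_t p, int on)
{
    return (pel_t)(on ? (p | MASK_BIT) : (p & ~MASK_BIT));
}

/* Copying a region memory from one frame to another (Fig. 3) then amounts
 * to copying the region fields while leaving each frame's gray-scale intact. */
void copy_regions(const pel_t *from, pel_t *to, long count)
{
    long i;
    for (i = 0; i < count; i++)
        to[i] = set_region(to[i], get_region(from[i]));
}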
As depicted in Figs. 5-9, the colorist has at his or her disposal a number of software-implemented tools which control the construction of the color transfer function. For computational efficiency, the luminance axis of HLS
cylindrical space is represented as an integer range of [0..1000] (as opposed to the conventional representation as a real number in the range [0..1.0]). The gray-scale of a frame is linearly related to the integer range by defaulting a gray-scale value of 128 to a luminance value of 500, and corresponding each discrete gray-scale unit to each integer of luminance. (Thus, 255 gray-scale corresponds by default to a luminance of 627.) In order to affect the color transfer function constructed by the system, the colorist actually manipulates the frame gray-scale information by applying histogramming techniques such as translation, compression, and stretching.
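The default correspondence can be written out directly: gray-scale 128 maps to luminance 500 and each gray-scale step adds one integer of luminance, so gray-scale 255 maps to 627. The clamping of out-of-range values in this sketch is an assumption.

/* Illustrative sketch of the default gray-scale to luminance correspondence:
 * gray-scale 128 maps to luminance 500, each gray-scale step is one integer
 * of luminance, so gray-scale 255 maps to 627.  Clamping to [0..1000] for
 * values pushed out of range by later operations is an assumption. */
int default_luminance(int gs)           /* gs in [0..255] */
{
    int lum = 500 + (gs - 128);         /* i.e. lum = gs + 372 */

    if (lum < 0)    lum = 0;
    if (lum > 1000) lum = 1000;
    return lum;
}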
Specifically, the entire gray-scale may be made to appear either more or less luminous by a simple "slide" of the gray-scale (GS) with respect to the luminance scale (Fig. 6).
Instead of corresponding GS = 128 to L = 500, the operator may instead choose to brighten the frame by choosing L = 750.
Similarly, the frame may be darkened by choosing L = 350. As depicted in Fig. 6, the portion of the color transfer function falling within the HLS conic space which corresponds to the gray-scale is controlled by this slide operation. At its extremes, the operation creates a threshold in the transfer function beyond which all gray-scale values are displayed as either white (above the upper threshold), or as black (below the lower one).
Listing 1

void slide_map();       /* slide brightness mapping                */
void stretch_map();     /* stretch or compress brightness mapping  */
void bend_on();         /* enable color bending                    */
void bend_off();        /* disable color bending                   */
void slide_bend();      /* slide region of color bending           */
void stretch_bend();    /* stretch region of color bending         */

/* Local Constants ------------------------------------------------ */
#define TRUE  1
#define FALSE 0

/* max/min helper macros (assumed; not shown in the printed listing) */
#define max(a,b) ((a) > (b) ? (a) : (b))
#define min(a,b) ((a) < (b) ? (a) : (b))

/* Local Static Data ----------------------------------------------- */
static int brt_l = 0;       /* brightness value for start of ramp      */
static int brt_u = 1000;    /* brightness value for end of ramp        */
static int bend  = FALSE;   /* logical - TRUE if color bend enabled    */
static int mark1;           /* lower color bending overlap marker      */
static int mark2;           /* upper color bending overlap marker      */

/* Global Data (and externals) ------------------------------------- */
extern int hue_1;           /* primary hue value          - [0..360]   */
extern int sat_1;           /* primary saturation value   - [0..1000]  */
extern int hue_2;           /* secondary hue value        - [0..360]   */
extern int sat_2;           /* secondary saturation value - [0..1000]  */
                            /* secondaries valid for bend = TRUE only  */

void slide_map(dbrt)
int dbrt;                   /* change in brightness - [-999..999]      */
{
    /* make sure that the requested change is in range */
    dbrt = max(dbrt, -999);
    dbrt = min(dbrt,  999);

    /* check for limiting case */
    if ((dbrt < 0) && (brt_l < -dbrt)) {
        stretch_map(-brt_l, 1000 - brt_u);
        return;
    }

    /* Simple case: apply the change */
    brt_l += dbrt;
    brt_u += dbrt;
    if (bend) {
        mark1 += dbrt;
        mark2 += dbrt;
    }
}   /* End of slide_map */
Another operation available to the colorist is a ratiometric multiplication of the gray-scale by a value (depicted in Figs. 7-8). When the multiplier value is a fraction less than one, a ratiometric compression of the gray-scale occurs which relates the 256 discrete gray-scale values to fewer than 256 integer luminance values. When the multiplier value is greater than unity, a "stretch" of the gray-scale occurs which results in a correspondence of the 256 gray-scale values to a range of luminance greater than 256.
These multiplications may also occur using either end of the gray-scale as a fixed point, yielding 4 variations as depicted in Figs. 7-8.
Listing 2

void stretch_map(db_low, db_up)
int db_low;     /* change in lower ramp point */
int db_up;      /* change in upper ramp point */
{
    int newbrt_l, newbrt_u;
    double fmark1, fmark2, rlen, newrlen;

    /* Check that change parameters are within range */
    db_low = min(db_low,  999);
    db_low = max(db_low, -999);
    db_up  = min(db_up,   999);
    db_up  = max(db_up,  -999);

    /* Apply change to lower ramp point */
    newbrt_l = brt_l + db_low;
    newbrt_l = max(newbrt_l, 0);
    newbrt_l = min(newbrt_l, 999);

    /* Apply change to upper ramp point */
    newbrt_u = brt_u + db_up;
    newbrt_u = max(newbrt_u, 1);
    newbrt_u = min(newbrt_u, 1000);

    /* Make sure that the new combination is valid */
    if ((db_up == 0) && (newbrt_l >= newbrt_u))
        newbrt_l = newbrt_u - 1;
    else if (newbrt_u <= newbrt_l)
        newbrt_u = newbrt_l + 1;

    /* Reset color-bend markers for the new ramp */
    if (bend) {
        rlen    = (double)(brt_u - brt_l);        /* current ramp length */
        newrlen = (double)(newbrt_u - newbrt_l);  /* new ramp length     */
        fmark1  = (double)(mark1 - brt_l) / rlen;
        fmark2  = (double)(mark2 - brt_l) / rlen;
        mark1   = (int)(fmark1 * newrlen) + newbrt_l;
        mark2   = (int)(fmark2 * newrlen) + newbrt_l;
    }

    /* Move new ramp points into official ramp points */
    brt_l = newbrt_l;
    brt_u = newbrt_u;
}   /* End of stretch_map */
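As a usage sketch (not part of the patent listings), the four variations of Figs. 7-8 can be expressed as calls to stretch_map() that hold one end of the current ramp fixed while scaling its length; the helper names and calling convention are assumptions, and the helpers are assumed to live in the same module as the ramp variables above.

/* Illustrative usage sketch only: ratiometric multiplication of the ramp
 * about either fixed end, expressed in terms of stretch_map() above. */
void multiply_about_lower(double factor)    /* lower end of ramp held fixed */
{
    int len = brt_u - brt_l;
    stretch_map(0, (int)((factor - 1.0) * len));
}

void multiply_about_upper(double factor)    /* upper end of ramp held fixed */
{
    int len = brt_u - brt_l;
    stretch_map((int)((1.0 - factor) * len), 0);
}

/* factor < 1.0 compresses the gray-scale ("histogram multiplication down");
 * factor > 1.0 stretches it ("histogram multiplication up"). */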
In depicting reflected light (either originating from a colored source, reflecting from a colored surface, or both), it is necessary to "bend" the color transfer function. As shown in Fig. 9, the colorist has available the option of setting a pair of "bending points" for the function which determine the nature and extent of the bending. These points, depicted in Fig. 9 as points A and B, define the endpoints of three line segments (OB, BA, and A1). Of these segments, the first and last each have associated with them unique, operator-selected values of hue, while the AB segment represents a mixture of hues and saturations. The locations of points A and B are defaulted to 25% and 75% of the current gray-scale range, thus providing color mixing over one-half of the gray-scale. These locations, however, may be changed by the operator to change the "rate" of color bending.
Listing 3

void bend_on()
{
    int midpoint, ramplen;

    if (bend) return;                   /* already done, do nothing    */
    ramplen  = brt_u - brt_l + 1;       /* length of current ramp      */
    midpoint = brt_l + ramplen / 2;     /* midpoint of current ramp    */
    mark1 = midpoint - ramplen / 4;     /* bottom of color overlap     */
    mark2 = midpoint + ramplen / 4;     /* top of color overlap        */
    bend  = TRUE;                       /* enable color bending        */
}   /* End of bend_on */

void bend_off()
{
    bend = FALSE;
}   /* End of bend_off */

void slide_bend(dbend)
int dbend;      /* change in region of bend - [-999..999] */
{
    /* If bend not enabled, do nothing */
    if (!bend) return;

    /* Make sure that there is room to go in the requested direction */
    if ((dbend < 0) && (mark1 == brt_l)) return;
    if ((dbend > 0) && (mark2 == brt_u)) return;

    /* Make sure that the requested change is within range */
    dbend = (dbend < 0) ? max(dbend, brt_l - mark1)
                        : min(dbend, brt_u - mark2);

    /* Apply the change */
    mark1 += dbend;     /* new mark1 position */
    mark2 += dbend;     /* new mark2 position */
}   /* End of slide_bend */

void stretch_bend(sfact)
int sfact;      /* stretch factor (-100 to 100) */
{
    double fact, hmarklen, markfact, marklen, midpoint, ramplen;

    /* Compute a floating point stretch factor */
    fact = (sfact >= 0) ? 1.0 + (double)sfact / 100.0
                        : 1.0 - (double)(-sfact) / 100.0;

    /* Get floating point ramp length, overlap length and midpoint of bend */
    ramplen  = (double)(brt_u - brt_l + 1);
    marklen  = (double)(mark2 - mark1 + 1);
    midpoint = (double)mark1 + marklen / 2.0;

    /* Compute and adjust the percentage overlap */
    markfact  = marklen / ramplen;
    markfact *= fact;                           /* apply change         */
    hmarklen  = (ramplen * markfact) / 2.0;     /* half new mark length */

    /* Compute new mark points */
    mark1 = max((int)(midpoint - hmarklen), brt_l);
    mark2 = min((int)(midpoint + hmarklen), brt_u);
}   /* End of stretch_bend */
As an example of bending, consider the selection of an OB
hue as flesh tone, and the A1 hue as orange (as would be the case if a face were viewed by reflected fire light). The BA
segment would contain hues which are flesh tones in the less bright areas but which, for brighter ones, take on a more and more orange appearance. Of course, it will be appreciated that the selection of the locations of points A and B or the inclusion of additional bending points is fully within the spirit of the technique.
Because regions are defined so as to include all objects of a similar hue, regions contain a wide range of picture element gray-scale values. In order to accurately represent the gray-scale values in the final colored image, a color
transfer function must be defined for converting the one-dimensional gray-scale to a set of three dimensional HLS
values. From the point on the displayed color wheel chosen by a key frame colorist, a line segment is extended vertically parallel to the axis of the cylindrical HLS universe. This line segment passes through the points in the HLS cones which represent the colors to be applied within the region, and comprises the color transfer function. For example, an 8 bit picture element having a gray-scale value of 190 is represented on the gray-scale of Figure 5. The corresponding point on the upper segment of the color transfer function defines a unique HLS value which will be applied to those picture elements within the region having the 190 gray-scale value. Where the line segment does not fall within the HLS
conic space, the representation is either pure white (in the region above L = 0.5) or pure black (in the region below L = 0.5). It will be recognized that because the color transfer function traverses a portion of the altitude of HLS
coordinate space, points which fall at either extreme of the gray-scale are correspondingly less saturated when displayed in color. This renders the displayed colors much more realistic than systems using constant saturations.
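Putting the ramp of the Software Overview together with the conic boundary gives a sketch of the transfer function for a single picture element. The linear placement of the gray-scale along the [brt_l..brt_u] ramp and the double-cone saturation limit used here (twice the luminance below mid-scale, twice its complement above, on the 0..1000 scale) are assumptions consistent with the text; elements falling outside the cone are forced to pure white or pure black as described above.

/* Illustrative sketch: color transfer function for one picture element.
 * The gray-scale value is placed linearly on the luminance ramp
 * [brt_l..brt_u]; where the chosen saturation exceeds the double-cone
 * boundary at that luminance, the element is forced to pure white (above
 * mid-luminance) or pure black (below).  The linear ramp placement and the
 * cone-boundary formula are assumptions, not taken from the patent listings. */
void transfer_element(int gs,                 /* gray-scale value [0..255]          */
                      int hue, int sat,       /* region color: [0..360], [0..1000]  */
                      int brt_l, int brt_u,   /* current luminance ramp             */
                      int *h, int *l, int *s) /* resulting HLS triplet              */
{
    int lum, max_sat;

    lum = brt_l + (int)((long)gs * (brt_u - brt_l) / 255);

    /* maximum saturation representable at this luminance (double cone) */
    max_sat = (lum <= 500) ? 2 * lum : 2 * (1000 - lum);

    if (sat > max_sat) {                 /* outside the perceptual cone         */
        *h = hue;
        *s = 0;
        *l = (lum > 500) ? 1000 : 0;     /* pure white above, pure black below  */
    } else {
        *h = hue;
        *s = sat;
        *l = lum;
    }
}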
Specular highlights are those areas of particularly bright reflection due to angle of incidence effects. Colorless specular highlights can be produced by applying the histogram stretch technique described above to "pull" brighter gray-scale values outside of the HLS cone, thus making them appear white. The extent of this stretch controls the appearance of the specular highlight.
Referring again to Figure 9, there is depicted a schematic representation of the method by which both specular highlights and reflection of colored light sources may be achieved. Although human perception is, in fact, best represented by the HLS double cone, the actual physics of color and light are best represented by an HLS cylinder which has luminance as its axis, and saturation as its radius.
Within this cylinder is inscribed the conic perceptual space.
This conic space defines the limits of human perception, and the precise shape and size of the cones vary from person to person. (A colorblind person, for example, has a cone with a "flattened" side, while one with night-blindness has an extremely truncated lower cone.) By manipulating gray-scale information representing an image, the color transfer function by which gray-scale is transformed into hue, luminance, and saturation is affected, and a variety of effects are produced.
Glossary of Terms
Image - monochrome or color analog picture
Picture Elements - discrete locations in an image
Brightness - the absolute amount of light transmitted through a monochrome image (dependent on incident light intensity)
Capturing an Image Digitally - recording digital data representative of an image and from which an image can be re-created.
Gray-Scale Value - relative amount of light transmitted through a point on a monochrome image (dependent on incident light intensity)
Luminance - the absolute amount of light emitted by, or transmitted through, a point on a color image
Hue - the peak or primary wavelength emitted by or transmitted through a point on a color image
Saturation - the relative number of primary to complementary points in a given area of a color image (more primary = higher saturation)
Region - a data structure in memory used to identify a group of picture elements to which a defined color transfer function is to be applied.
Mask - a data structure (commonly a graphics plane) in memory used to identify a group of picture elements to be acted upon, such as by their inclusion within a region.
Map - a data structure (commonly a multi-bit graphics plane) for identifying one or more regions of an image, each such region comprising picture elements representative of one or more objects or image areas having a common property.
Specular Highlight - region of high brightness (monochrome) or high luminance (color) brought about by direct (angle of incidence) reflection of a light source.
Statement of Industrial Utility
The method and apparatus of the present invention are useful in the color enhancement of achromatic images such as still photographs, motion pictures, video taped images, telemetry data, and the like.
SYSTEM AND METHOD FOR COLOR IMAGE ENH~NCEMENT
This application is a division of application Serial No. 573,741 filed August 3, 1988.
Field of the Invention The present invention pertains to a system and method for the color enhancement of monochrome or multi-color images by relation of digitized gray-scale information from the original image to unique values (triplets) of hue, luminance, and saturation derived from a color transfer function applied to the achromatic gray-scale information.
Backaround of the Invention In strictly mathematical terms, the image produced on black and white film is one dimensional (brightness), while an . original-scene can be viewed as three dimensional (using the HLS model of color, where hue, luminance, and saturation are the dimensions). The chemical reaction of the film emulsion, and the camera mechanisms and filters, essentially reduce the three dimensional information to one dimension by complex mapping, and forever obliterate the original hue, luminance, and saturation information.
Note: It should be pointed out that luminance and brightness are not the same. Since brightness of the black and white image results from the complex interactions of the three elements of the HLS system, there is only a statistical correlation between brightness and luminance, but they are not identical.
Color Representstion ~yste~s Color is both an absolute quality of light (its wave-length or frequency) and a relative quality which depends on 2 132~32~
the human perceiver. For this reason, various color "gamuts"
have been proposed which allow perceived colors to be quantified. Among these models is the CIE chromaticity scale, which permits quantification of color by combinations of so-called primaries (which are, by themselves, not visible tohuman observers). (See Foley & VanDam, Fundamentals of Interactive Computer Graphics, Addison Wesley, Reading, MA
198~ pp. 606-608.) In addition to theoretical models of color (CIE Chromaticity, Hue-Luminance-Saturation, and Red-Green-Blue), there are many representational systems for colorreproduction in various media such as the RGB system for cathode ray tubes, the process color system for printing, and the NTSC, PAL, and SECAM systems for broadcast video. Each of these models is, in fact, a relatively small subset of both the theoretical absolute range of colors and of the actual range of humanly perceivable colors.
One of the theoretical models (Ostwald, W., Colour Science, Winsor & Winson, London, 1931) which is especially useful for analysis of color is the HLS model of hue, luminance, and saturation. This model uses a three dimensional cylindrical universe (the absolute theoretical color space), inside of which is a space bounded by a pair of cones having a common base (the perception space~. The cylinder's axis (from apex to apex of the cones) is the luminance axis, with the lower apex having a luminance of zero (all black) and the apex of the upper cone having a luminance of 1 (all white). Distance from the luminance axis (normal to the axis) determines saturation (proportion of a color which is made up of its complementary color), with the axis representing zero saturation (equal amounts of the primary color and its complement and therefore no color) and the surface of the cone representing maximum perceived saturation for a given luminance. Finally, angular displacement around the luminance axis represents hue, as with a conventional color wheel, with complements separated by 180 degrees.
The solid space defined by the HLS conic model is pointed at its tcp and bottom to represent the fact that as colors 3 ~324321 become whiter or blacker, they are perceived to lose hue.
Saturation is a characteristic which depends chiefly on the narrowness or purity of the band of wavelengths comprising a color. At the exact center of the space is a medium gray which exhibits a moderate emittance or reflectance for all wavelengths of light.
The RGB system, is used, for example, in CRT displays, where each of the three electron quns (red, green, and blue) is controlled directly by a signal indicating its respective iO intensity. HLS color may be converted to the RGB
representation (see Foley ~ VanDam, pp. 648-9.), however there are many HLS combinations which must be "clipped" to be represented in RGB. The same is true of NTSC video (also known as YIQ), which has an even more restricted range than RGB. Thus, not every "real world" color can be reproduced by video systems using either RGB or NTSC standards. According to Foley & VanDam, "neither the V in ~SV [Hue, Saturation, Value] nor the L in HLS correspond to luminance in the YIQ
model, so two different colors defined in either space can easily have the same luminance, and be indistinguishable on black and white TV or videotape." ~Foley & VanDam p. 618) Since approximately 1970, several techniques have been developed for the addition of color to monochrome images.
Often, these techniques rely on video technology to augment the video signal in a defined manner, typically by adding a chrominance component signal to the pre-existing luminance siqnal. This technique results in an appearance not unlike that achieved by Thomas Edison in his early efforts to add tint to black and white film by painting the frames: a colored area over a black and white picture. Because of their reliance on video technology, all of these systems suffer from lack of vividness of colors, and from an inability to accurately reproduce colors.
,,:
Brief Description of the Invention ~' The present invention comprises a method for selectively ;
4 132~32~
coloring at least one picture element of an image comprised of a multiplicity of picture elements by combining image information ~rom mask-defined regions of the image with color information, the improvement consisting of: (a) assigning to an image to be colored, mask-defining region information and color information from a previous image; (b) moving at least one of said mask-defined regions a predetermined distance along a predetermined axis to a moved location and storing mask-defining region information for said image to be colored at the moved location.
Brief Description of the Fiaures Figure 1 depicts in schematic form the creation of two adjacent frames of graphic information according to the method of the present invention.
Figure 2 is a schematic representation of a portion of the image memory structure according to one embodiment of the method of the present invention.
Figure 3 depicts the replication of a region memory portion of the image memory structure of the present invention.
Figure 4 depicts the correction of a value stored in the replicated region memory portion of Fig. 3.
Figure 5 depicts the application of the HLS color model to the coloring o~ an array of picture elements stored in the image memory structure of Figs. 2-4.
Figure 6 depicts a schematic representation of the "histogram slide" operation according to the method of the present invention.
Figure 7 depicts a schematic representation of the "histogram multiplication down" operation according to the method of the present invention.
Figure 8 depicts a ~chematic representation of the "histogram multiplication up" operation according to the method of the present invention.
Figure 9 depicts a schematic representation of the "color 132~32~
bending" operation according to the method of the present invention.
Detailed Description of the Invention General Process overview The film to be colored or color enhanced is first digitally recorded at a preselected resolution. In instances where the final product will be placed on conventional video tape, appropriate resolutions are employed. (E.g. NTSC=525 lines, PAL=640, HDTV=1125-1500). Where the final product will be used for theatrical presentation, resolutions of 1024 lines or more may be employed. The digitization process (depicted in Fig. 1) compri6es projecting each frame of the original (typically black and white) film through a liquid gate device which masks scratches and other imperfections. The projected ; 15 frame is captured using a video camera such as a saticon tube and the video signal is digitized and stored in a random access memory frame buffer. The digitization process allows ad~ustment of the intensity of light falling on the film in order to achieve the best contrast for the particular frame being digitized (gamma correction). After acquisition of a frame, the digital representation o~ the frame is stored in non-volatile storage for ~urther processing. The digitization process is repeated on a frame-by-frame basis, marking each frame with a time and sequence indication (SMPTE time code).
For each scene (sequence of frames), a frame is selected as being representative of the majority of the elements present in the sequence. This "key frame" is used as a template for the enhancement o~ other frames within the scene sequence. For this reason, the greatest care is applied to the processing of key frames.
After kay frames are color-enhanced and defined, they are passed with the rest of the frames comprising a scene to a work station where the regions defining the application of 6 1~2~32~
color are tracked from frame to frame. After each frame has been enhanced, it is restored in non-volatile storage and sent to a central storage where all enhanced scenes are eventually reassembled according to their original sequence as a final product motion pictuxe.
Key Frame Coloring The first task of the key frame colorist is to identify areas of the image which comprise objects which will be designated to be of similar hue. Several software tools are available to the colorist to aid in this computer-interactive definition process.
A "free hand" mode allows drawing and erasing as well as control over a variably-sized cursor. A pointing device controls an interactive cursor displayed on a screen overlaying the key frame to be processed. Movement of the pointing device effects an analogous movement of the cursor and the picture elements selected by this movement (which are displayed on a graphics plane known as the "mask plane") are used to define the edge of an area. Because the cursor size ma~ be reduced to only one pixel, the smallest region which may be produced is also one pixel.
A second drawing tool available to the colorist is one which defines polygons, rather than completely free-hand shapes. Using the pointing devices, the colorist may indicate the precise placement of vertices which are intended to define the perimeter of a polygonal area. Each vertex may be assigned or removed and, when complete, the vertices may be completed to define a closed path.
After definition of a closed perimeter by free-hand or polygon methods, the enclosed area may be assigned to the region (by setting the bits in the mask plane) by using one of two filling algorith~s. An "area fill" assigns all picture elements within the closed shape and up to the defined edge to ; the area. A "cartoon fill" combines the defined edges with edge recognition algorithms in order to fill an area which is 7 132~321 bounded by a border line such as the black pen lines which are typical of cartoons. Cartoon fill mode fills up to, or slightly over such heavy border lines.
After areas have been defined for all objects having a common hue, the final step is the selection and application of a given hue to that area. ~olors may be selected by interactively pointing to a color wheel displayed on a video screen, and then by manipulating control indicators on the video screen which allow refinement of the gray-scale information for the area as described below. After selection, the color may be displayed by its application to the areas (which are collectively known as a color mask.) After a color mask has been defined, its associated picture elements are identified as a "region", the mask plane is cleared, and additional masks are defined in a similar manner, one for each hue which will be displayed in the frame.
Once all regions are defined, the regions, the underlying frame, and the colors assigned to each region are stored for later retrieval by colorists who will reiterate the process using the subsequent frames of each scene and the key frame which has been defined.
An additional tool for defining regions is the threshold region definition tool. The threshold region definition tool permits the automatic definition of regions to be colored.
First, a brush size (a form of cursor) is selected. Next, an upper or lower threshold gray-scale value is selected. The upper/lower gray-scale value is representative of the edge of a con~iguous group of picture elements having common brightness. An upper gray-scale value is selected if the threshold region definition tool is to be used to select picture elements below a given upper threshold value.
Alternatively, a lower threshold gray-scale value is selected to designate a region above said lower threshold value. The picture elements located within the brush tip are analyzed to determine the medium brightness o~ those picture elements. If the medium brightness within the brush tip is within the designated threshold region (either below the upper threshold 8 132~321 region or above the lower threshold region), the picture elements are included in the designated region. The picture elements above an upper threshold region or below a lower threshold region are excluded from the new region.
The process of extrapolating color enhancement from key frames to entire scenes is performed by identifying and tracking motion as well as appearances and disappearances of objects. Generally fewer than 5% of the picture elements in a frame of a scene change in the succeeding frame. ~ecause motion pictures are filmed at 24 frames per second, increments of motion on all but the most rapidly moving objects are extremely small and easily followed. Once motion has been identified, adjustment of the regions and application of the key frames' predefined colors to those regions is all that is required.
Implicit in all frame-to-frame tracking is the ability to copy predefined parts of the preceding or key frame's re~ion memory contents and to use those parts in subsequent frame processing. After copying, small adjustments may be made to the new region memory contents and they may be saved with the new frame. Importantly, because the digitization process assures consistency of brightness and gray scale, and because colors are predefined by an art director working with ~ey frames, overall consistency ~rom frame-to-frame, and scene-to-scene is also assured.
The simplest form of tracking from frame to frame is where motion in the frame is apparent only as a result of a camera pan. In such instances, the entire region memory may be copied from that of the preceding frame and "shifted" in the direction of the pan. Such a shift will create an undefined area at the end opposite the motion. This area must then be added to the existing regions in order to completely apply color.
When only some of the objects of a frame move, a corresponding portion of the region memory may be translated as well. In this operation, that portion of the region memory which is to be moved i8 designated by the operator and is 9 1~24~21 moved manually to its new location, thus overwriting the former contents of the region memory at the new location. The old location must then be re-defined and/or added to one or more pre-existing regions in order to apply color.
In highly complex scenes, where several types of motion are present, (especially compound motion of subject and camera) a "differencing" procedure is employed. A frame for which areas have been de~ined and to which color has been applied is subtracted from a next subsequent frame in order to reveal those picture elements which have changed (in gray-scale value). These areas of difference are then displayed as a contrasting overlay on the subsequent frame to be processed.
Alternatively, an algorithm for edge detection in gray-scale images may be used and may be correlated to the previous frame's region ~emory. Detected differences may then cause the subsequent frame's regions to be adjusted or deleted, as required. By examining the coincidence of this overlay and the region structure whi~h has been duplicated from the preceding frame, the production colorist is able to quickly designate areas which should be added to the region structure, and those which must be removed from a given region and reassigned.
The final operation is to assure that no objects have either completely appeared or disappeared from the frame. If an object disappears, its region must be removed. Similarly, newly appearing objects must have a region defined for them "on the fly" and a color assigned as well. After adjusting the region for the next subsequent image, the colorist again stores that image to~ether with its region memory and repeats the process for each subseguent Prame.
To permit quality control and review of a sequence of color-enhanced frames, two modes of viewing are used: Cine mode and Cycle mode. In cine mode, a series of frames is sequentially displayed, shuttling first in a forward direction, and then immediately in reverse. The speed may be controlled by the operator. In cycle mode, display direction and speed are directly tied to the pointing device to allow 10 132~321 fine control over viewing frame-to-~rame transitions.
Hardw~re Overview The method of the present invention is implemented on a special purpose version of an RTI-Station image processing system interfaced to a personal computer (such as the IBM*
personal computer AT). The RTI image processing subsystem (available from Recognition Technology, Inc. of Holliston, MA) comprises a video digitizer, a pipeline architecture pixel processor, and image memory which is all interconnected both by a host computer bus and by a synchronous video bus. The video digitizer takes its input from a video camera such as that described above for digitizing film frames, and outputs its information to a high definition color CRT (though not necessarily one interfaced by conventional NTSC video circuitry). The host computer bus permits the image processing subsystem to communicate with the user through the host CPU, its memory, associated mass storage devices, and I/O
devices including pointing devices and keyboards.
The image processing subsystem of the preferred embodiment of the present invention comprises an image memory having 51Z x 512 spatial resolution, 8 bits of brightness resolution, and 4 bits for region identification.
Alternatively, the same system may be configured for 6 bits of brightness resolution and 6 bits of region identification or for 7 bits of brightness resolution and 5 bits of region identification. (One additional bit is reserved for use by the system in construction of the mask bit plane.) General information regarding the structure and operation of the RTI
Station and its associated software library (RTILIB) may be found in the RTILIB/500 Tutorial Revision l.Oo and the RTILIB/500 User Manual Revision A, which are published by Recognition Technology, Inc.
Although ~he currently used implementation employs a 512 * Trademark 11 ~324321 line spatial resolution, it will be recognized by those skilled in the art that increases to and above 2000 line spatial resolution are within the scope of the present invention. Additionally, the image memory may be a part of a larger frame store memory which may be, for instance, 4096 x 4096 x 16 bits. The display of frames may then be controlled by a memory pointer into this frame store.
For the purpose of color selection, the host CPU is provided with a 24 bit RGB video board (such as the Targa*24, a product of AT~T Information Systems, Inc.) A color wheel which corresponds to the common base of the HLS cone is displayed on a video screen. Using a pointing device, the colorist selects a particular color on the wheel, thus defining both hue (angular position on the color wheel) and saturation (distance from the center). As deRcribed more fully below, this unique 24 bit RGB value is translated immediately into the HLS coordinate space, correlated to the modified image gray-scale, and retranslated for output on a RGB monitor.
~Ortware overvlew Referring now to Figs. 1-4, according to the method of the present invention, a series of digitized monochrome images may be stored in a random access memory of a computer as an array of gray-scale values. For example, 8 bits of information may be used to represent each picture element, thus permitting a total of 256 unique gray-scale values to be represented. (Of course, more bits may be used to achieve better resolution and dynamic range for the gray-scale ` information.) The digitization step may optionally include the use of look-up tables in order to enhance contrast, adjust gamma, enhance edge definition or "stretch" the gray-scale of a faded image. Associated with the memory used to store the gray-scale values is a region memory used to uniguely identi~y * Trademark each picture element with one of a number of regions (depicted as a four bit memory structure in Figs. 2-5). In addition to the image memory and region memory structures, a mask plane memory is also associated with the image for use in identifying picture elements. T~rough conventional techniques, the contents or parts of the contents of these memory structures may be copied from one structure to another (for example, from the region memory to the mask plane memory) or from one frame's memory to another (as depicted in Fig. 3 for the transfer of frame 1 region memory to frame 2 region memory.) The copied contents may, of course, be altered by the operator to conform to the underlying frame contents as depicted in Fig. 4.
As depicted in Figs. 5-9, the colorist has at his or her disposal a number of software-implemented tools which control the construction of the color transfer function. For computational efficiency, the luminance axis of HLS
cylindrical space is represented as an integer range of [0..1000] (as opposed to the conventional representation as a real number in the range [0..1.0]). The gray-scale of a frame is linearly related to the integer range by defaulting a gray-scale value of 128 to a luminance value of 500, and corresponding each discrete gray-scale unit to each integer of luminance. (Thus, 255 gray-scale corresponds by default to a luminance of 627.) In order to affect the color transfer function constructed by the system, the colorist actually manipulates the frame gray-scale information by applying histogramming techniques such as translation, compression, and stretching.
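Before any of those manipulations are applied, the default correspondence just described amounts to a one-line function (a sketch only; clipping at the ends of the luminance range is handled by the transfer function itself):

/* Default gray-scale to luminance mapping: GS 128 -> L 500, one luminance
 * unit per gray-scale step, so GS 0 -> 372 and GS 255 -> 627.             */
int default_luminance(int gs)        /* gs in [0..255]  */
{
    return 500 + (gs - 128);         /* L in [372..627] */
}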
Specifically, the entire gray-scale may be made to appear either more or less luminous by a simple "slide" of the gray-scale (GS) with respect to the luminance scale (Fig. 6).
Instead of corresponding GS = 128 to L = 500, the operator may choose to brighten the frame by choosing L = 750. Similarly, the frame may be darkened by choosing L = 350. As depicted in Fig. 6, the portion of the color transfer function falling within the HLS conic space which corresponds to the gray-scale is controlled by this slide operation. At its extremes, the operation creates a threshold in the transfer function beyond which all gray-scale values are displayed as either white (above the upper threshold) or as black (below the lower one).
Listing 1

void slide_map();      /* slide brightness mapping               */
void stretch_map();    /* stretch or compress brightness mapping */
void bend_on();        /* enable color bending                   */
void bend_off();       /* disable color bending                  */
void slide_bend();     /* slide region of color bending          */
void stretch_bend();   /* stretch region of color bending        */

/* Local Constants */
#define TRUE  1
#define FALSE 0

/* Local Static Data */
static int brt_l = 0;       /* brightness value for start of ramp   */
static int brt_u = 1000;    /* brightness value for end of ramp     */
static int bend  = FALSE;   /* logical - TRUE if color bend enabled */
static int mark1;           /* lower color bending overlap marker   */
static int mark2;           /* upper color bending overlap marker   */

/* Global Data (and externals) */
extern int hue_1;           /* primary hue value - [0..360]            */
extern int sat_1;           /* primary saturation value - [0..1000]    */
extern int hue_2;           /* secondary hue value - [0..360]          */
extern int sat_2;           /* secondary saturation value - [0..1000]  */
                            /* secondaries valid for bend == TRUE only */

void slide_map(dbrt)
int dbrt;                   /* change in brightness - [-999..999]      */
{
    /* make sure that the requested change is in range */
    dbrt = max(dbrt, -999);
    dbrt = min(dbrt, 999);

    /* check for limiting case */
    if ((dbrt < 0) && (brt_l < -dbrt)) {
        stretch_map(-brt_l, 1000 - brt_u);
        return;
    }

    /* Simple case: apply the change */
    brt_l += dbrt;
    brt_u += dbrt;
    if (bend) {
        mark1 += dbrt;
        mark2 += dbrt;
    }
}   /* End of slide_map */
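A short usage sketch for Listing 1 follows; the numeric comments assume that GS = 128 sits at the midpoint of the ramp, as in the default mapping described earlier.

void example_slide()
{
    slide_map(250);     /* brighten: the midpoint moves from L = 500 toward 750 */
    slide_map(-400);    /* darken:   the midpoint moves back down toward 350    */
}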
Another operation available to the colorist is a ratiometric multiplication of the gray-scale by a value (depicted in Figs. 7-8). When the multiplier value is a fraction less than one, a ratiometric compression of the gray-scale occurs which relates the 256 discrete gray-scale values to fewer than 256 integer luminance values. When the multiplier value is greater than unity, a "stretch" of the gray-scale occurs which results in a correspondence of the 256 gray-scale values to a range of luminance greater than 256.
These multiplications may also occur using either end of the gray-scale as a fixed point, yielding 4 variations as depicted in Figs. 7-8.
Listing 2

void stretch_map(db_low, db_up)
int db_low;                 /* change in lower ramp point */
int db_up;                  /* change in upper ramp point */
{
    int newbrt_l, newbrt_u;
    double fmark1, fmark2, rlen, newrlen;

    /* Check that change parameters are within range */
    db_low = min(db_low, 999);
    db_low = max(db_low, -999);
    db_up  = min(db_up, 999);
    db_up  = max(db_up, -999);

    /* Apply change to lower ramp point */
    newbrt_l = brt_l + db_low;
    newbrt_l = max(newbrt_l, 0);
    newbrt_l = min(newbrt_l, 999);

    /* Apply change to upper ramp point */
    newbrt_u = brt_u + db_up;
    newbrt_u = max(newbrt_u, 1);
    newbrt_u = min(newbrt_u, 1000);

    /* Make sure that the new combination is valid */
    if ((db_up == 0) && (newbrt_l >= newbrt_u))
        newbrt_l = newbrt_u - 1;
    else if (newbrt_u <= newbrt_l)
        newbrt_u = newbrt_l + 1;

    /* Reset color-bend markers for the new ramp */
    if (bend) {
        rlen    = (double)(brt_u - brt_l);        /* current ramp length */
        newrlen = (double)(newbrt_u - newbrt_l);  /* new ramp length     */
        fmark1  = (double)(mark1 - brt_l) / rlen;
        fmark2  = (double)(mark2 - brt_l) / rlen;
        mark1   = (int)(fmark1 * newrlen) + newbrt_l;
        mark2   = (int)(fmark2 * newrlen) + newbrt_l;
    }

    /* Move new ramp points into official ramp points */
    brt_l = newbrt_l;
    brt_u = newbrt_u;
}   /* End of stretch_map */
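A usage sketch for Listing 2, showing the compression, stretch, and fixed-endpoint variations discussed above (the argument values are illustrative only):

void example_stretch()
{
    stretch_map(100, -100);   /* compress: both ramp points move inward          */
    stretch_map(-50, 50);     /* stretch:  both ramp points move outward         */
    stretch_map(0, 200);      /* stretch with the lower end held as fixed point  */
}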
In depicting reflected light (either originating from a colored source, reflecting from a colored surface, or both), it is necessary to "bend" the color transfer function. As shown in Fig. 9, the colorist has available the option of setting a pair of "bending points" for the function which determine the nature and extent of the bending. These points, depicted in Fig. 9 as points A and B, define the endpoints of three line segments (OB, BA, and A1). Of these segments, the first and last each have associated with them unique, operator-selected values of hue, while the BA segment represents a mixture of hues and saturations. The locations of points A and B are defaulted to 25% and 75% of the current gray-scale range, thus providing color mixing over one-half of the gray-scale. These locations, however, may be changed by the operator to change the "rate" of color bending.
Listing 3

void bend_on()
{
    int midpoint, ramplen;

    if (bend) return;                   /* already done, do nothing  */

    ramplen  = brt_u - brt_l + 1;       /* length of current ramp    */
    midpoint = brt_l + ramplen / 2;     /* midpoint of current ramp  */
    mark1 = midpoint - ramplen / 4;     /* bottom of color overlap   */
    mark2 = midpoint + ramplen / 4;     /* top of color overlap      */
    bend  = TRUE;                       /* enable color bending      */
}   /* End of bend_on */

void bend_off()
{
    bend = FALSE;
}   /* End of bend_off */

void slide_bend(dbend)
int dbend;                              /* change in region of bend - -999 to 999 */
{
    /* If bend not enabled, do nothing */
    if (!bend) return;

    /* Make sure that there is room to go in the requested direction */
    if ((dbend < 0) && (mark1 == brt_l)) return;
    if ((dbend > 0) && (mark2 == brt_u)) return;

    /* Make sure that the requested change is within range */
    dbend = (dbend < 0) ? max(dbend, brt_l - mark1)
                        : min(dbend, brt_u - mark2);

    /* Apply the change */
    mark1 += dbend;                     /* new mark1 position */
    mark2 += dbend;                     /* new mark2 position */
}   /* End of slide_bend */

void stretch_bend(sfact)
int sfact;                              /* stretch factor (-100 to 100) */
{
    double fact, hmarklen, markfact, marklen, midpoint, ramplen;

    /* Compute a floating point stretch factor */
    fact = (sfact >= 0) ? 1.0 + (double)sfact / 100.0
                        : 1.0 - (double)(-sfact) / 100.0;

    /* Get floating point ramp length, overlap length and midpoint of bend */
    ramplen  = (double)(brt_u - brt_l + 1);
    marklen  = (double)(mark2 - mark1 + 1);
    midpoint = (double)mark1 + marklen / 2.0;

    /* Compute and adjust the percentage overlap */
    markfact  = marklen / ramplen;
    markfact *= fact;                            /* apply change         */
    hmarklen  = (ramplen * markfact) / 2.0;      /* half new mark length */

    /* Compute new mark points */
    mark1 = max((int)(midpoint - hmarklen), brt_l);
    mark2 = min((int)(midpoint + hmarklen), brt_u);
}   /* End of stretch_bend */
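A usage sketch for Listing 3 (argument values are illustrative only):

void example_bend()
{
    bend_on();           /* mark1/mark2 default to 25% and 75% of the ramp    */
    slide_bend(100);     /* shift the overlap region toward brighter values   */
    stretch_bend(50);    /* widen the overlap, changing the "rate" of bending */
    bend_off();
}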
As an example of bending, consider the selection of an OB
hue as flesh tone, and the A1 hue as orange (as would be the case if a face were viewed by reflected firelight). The BA segment would contain hues which are flesh tones in the less bright areas but which take on a more and more orange appearance in the brighter ones. Of course, it will be appreciated that the selection of the locations of points A and B or the inclusion of additional bending points is fully within the spirit of the technique.
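The description does not give the exact mixing rule applied over the BA segment; a simple linear blend between the primary and secondary hue/saturation pairs of Listing 1, across the [mark1, mark2] interval, is one plausible sketch (it ignores hue wrap-around at 360 degrees for simplicity).

/* Sketch: color at luminance "lum" along a bent transfer function, using the
 * hue_1/sat_1, hue_2/sat_2, mark1 and mark2 variables of Listings 1 and 3.  */
void blended_color(int lum, int *hue, int *sat)
{
    double t;

    if (lum <= mark1) {                  /* OB segment: primary color only   */
        *hue = hue_1;  *sat = sat_1;
    } else if (lum >= mark2) {           /* A1 segment: secondary color only */
        *hue = hue_2;  *sat = sat_2;
    } else {                             /* BA segment: mixture of the two   */
        t = (double)(lum - mark1) / (double)(mark2 - mark1);
        *hue = (int)((1.0 - t) * hue_1 + t * hue_2);
        *sat = (int)((1.0 - t) * sat_1 + t * sat_2);
    }
}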
Because regions are defined so as to include all objects of a similar hue, regions contain a wide range of picture element gray-scale values. In order to accurately represent the gray-scale values in the final colored image, a color transfer function must be defined for converting the one-dimensional gray-scale to a set of three-dimensional HLS
values. From the point on the displayed color wheel chosen by a key frame colorist, a line segment is extended vertically parallel to the axis of the cylindrical HLS universe. This line segment passes through the points in the HLS cones which represent the colors to be applied within the region, and comprises the color transfer function. For example, an 8 bit picture element having a gray-scale value of 190 is represented on the gray-scale of Figure 5. The corresponding point on the upper segment of the color transfer function defines a unique HLS value which will be applied to those picture elements within the region having the 190 gray-scale value. Where the line segment does not fall within the HLS
conic space, the representation is either pure white (in the region above L = 0.5) or pure black (in the region below L = 0.5). It will be recognized that because the color transfer function traverses a portion of the altitude of HLS
coordinate space, points which fall at either extreme of the gray-scale are correspondingly less saturated when displayed in color. This renders the displayed colors much more realistic than those produced by systems using constant saturation.
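The sketch below shows one way such a vertical-line transfer function could be evaluated. The integer ranges, the default gray-scale mapping, and the linear formula for the cone boundary are assumptions made for illustration, not the patent's exact arithmetic.

struct hls { int hue; int lum; int sat; };   /* hue [0..360), lum and sat [0..1000] */

/* Follow the vertical line at the selected (hue, sat) up the luminance axis;
 * where it leaves the double cone, force pure white or pure black.           */
struct hls transfer(int gs, int hue, int sat)
{
    struct hls out;
    int lum = 500 + (gs - 128);                         /* default mapping         */
    int cone_max = (lum <= 500) ? 2 * lum               /* cone boundary: maximum  */
                                : 2 * (1000 - lum);     /* saturation at this lum  */

    out.hue = hue;
    out.lum = lum;
    if (sat > cone_max) {                               /* outside the conic space */
        out.sat = 0;
        out.lum = (lum > 500) ? 1000 : 0;               /* pure white / pure black */
    } else {
        out.sat = sat;
    }
    return out;
}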
Specular highlights are those areas of particularly bright reflection due to angle-of-incidence effects. Colorless specular highlights can be produced by applying the histogram stretch technique described above to "pull" brighter gray-scale values outside of the HLS cone, thus making them appear white. The extent of this stretch controls the appearance of the specular highlight.
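In terms of Listing 2, such a highlight might be produced as sketched below; the stretch amount is illustrative only.

void example_specular()
{
    /* Hold the lower ramp point and pull the upper one upward so the
     * brightest gray-scale values leave the cone and display as white. */
    stretch_map(0, 300);
}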
Referring again to Figure 9, there is depicted a schematic representation of the method by which both specular highlights and reflection of colored light sources may be achieved. Although human perception is, in fact, best represented by the HLS double cone, the actual physics of color and light are best represented by an HLS cylinder which has luminance as its axis, and saturation as its radius.
Within this cylinder is inscribed the conic perceptual space.
This conic space defines the limits of human perception, and the precise shape and size of the cones vary from person to person. (A colorblind person, for example, has a cone with a "flattened" side, while one with night-blindness has an extremely truncated lower cone.) By manipulating gray-scale information representing an image, the color transfer function by which gray-scale is transformed into hue, luminance, and saturation is affected, and a variety of effects are produced.
Glossary of Terms

Image - monochrome or color analog picture
Picture Elements - discrete locations in an image
Brightness - the absolute amount of light transmitted through a monochrome image (dependent on incident light intensity)
Capturing an Image Digitally - recording digital data representative of an image and from which an image can be re-created.
Gray-Scale Value - relative amount of light transmitted through a point on a monochrome image (dependent on incident light intensity)
Luminance - the absolute amount of light emitted by, or transmitted through, a point on a color image
Hue - the peak or primary wavelength emitted by or transmitted through a point on a color image
Saturation - the relative number of primary to complementary points in a given area of a color image (more primary = higher saturation)
Region - a data structure in memory used to identify a group of picture elements to which a defined color transfer function is to be applied.
Mask - a data structure (commonly a graphics plane) in memory used to identify a group of picture elements to be acted upon, such as by their inclusion within a region.
Map - a data structure (commonly a multi-bit graphics plane) for identifying one or more regions of an image, each such region comprising picture elements representative of one or more objects or image areas having the common property.
Specular Highlight - region of high brightness (monochrome) or high luminance (color) brought about by direct (angle of incidence) reflection of a light source.
Statement of Industrial Utility

The method and apparatus of the present invention are useful in the color enhancement of achromatic images such as still photographs, motion pictures, video taped images, telemetry data, and the like.
Claims
Claims:
1. In a method for selectively coloring at least one picture element of an image comprised of a multiplicity of picture elements by combining image information from mask-defined regions of the image with color information, the improvement consisting of:
(a) assigning to an image to be colored, mask-defining region information and color information from a previous image;
(b) moving at least one of said mask-defined regions a predetermined distance along a predetermined axis to a moved location and storing mask-defining region information for said image to be colored at the moved location.
2. The method of claim 1 wherein said at least one of said mask-defined regions consists of all mask-defined regions in said previous image.
3. The method of claim 1 including the further step of assigning those pixels which are within a mask-defined region of the previous image and are not within a corresponding moved mask-defined region to an additional mask-defined region.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA000616362A CA1324321C (en) | 1987-08-03 | 1992-04-27 | System and method for color image enhancement |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US8111687A | 1987-08-03 | 1987-08-03 | |
US081,116 | 1987-08-03 | ||
US222,543 | 1988-07-25 | ||
US07/222,543 US4984072A (en) | 1987-08-03 | 1988-07-25 | System and method for color image enhancement |
CA000573741A CA1315206C (en) | 1987-08-03 | 1988-08-03 | System and method for color image enhancement |
CA000616362A CA1324321C (en) | 1987-08-03 | 1992-04-27 | System and method for color image enhancement |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA000573741A Division CA1315206C (en) | 1987-08-03 | 1988-08-03 | System and method for color image enhancement |
Publications (1)
Publication Number | Publication Date |
---|---|
CA1324321C true CA1324321C (en) | 1993-11-16 |
Family
ID=27168011
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA000616362A Expired - Lifetime CA1324321C (en) | 1987-08-03 | 1992-04-27 | System and method for color image enhancement |
Country Status (1)
Country | Link |
---|---|
CA (1) | CA1324321C (en) |
- 1992
  - 1992-04-27 CA CA000616362A patent/CA1324321C/en not_active Expired - Lifetime
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US4984072A (en) | System and method for color image enhancement | |
EP0302454B1 (en) | System and method for color image enhancement | |
US5093717A (en) | System and method for digitally coloring images | |
JP3095818B2 (en) | Method and apparatus for mapping a color image to a black and white image | |
US4384336A (en) | Method and apparatus for lightness imaging | |
US7447351B2 (en) | Color level graphical user interface | |
US3772465A (en) | Image modification of motion pictures | |
AU735613B2 (en) | Method for image processing | |
US6351557B1 (en) | Method and apparatus for color manipulation | |
DE60126554T2 (en) | Image display system, image processing method and information storage medium | |
US7064864B2 (en) | Method and apparatus for compressing reproducible color gamut | |
US6236406B1 (en) | Three-dimensional color space display | |
US6496599B1 (en) | Facilitating the compositing of video images | |
AU2003298666A1 (en) | Reality-based light environment for digital imaging in motion pictures | |
Sloan Jr et al. | Color map techniques | |
US6292167B1 (en) | Electronic graphic system | |
CA2014631A1 (en) | Color information storage and processing system | |
CA1324321C (en) | System and method for color image enhancement | |
JPH0325493A (en) | Color display | |
US20030071897A1 (en) | Image processing | |
Marcu et al. | Three-dimensional histogram visualization in different color spaces and applications | |
US6295369B1 (en) | Multi-dimensional color image mapping apparatus and method | |
Plataniotis et al. | Color spaces | |
JP2756047B2 (en) | Color image color adjustment method and apparatus, and image synthesis method | |
CA2192104A1 (en) | Apparatus for use in color correction in an electronic imaging system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | MKEX | Expiry | Effective date: 20101116 |