GB2312120A - Producing a transition region surrounding an image - Google Patents
- Publication number
- GB2312120A GB9607650A
- Authority
- GB
- United Kingdom
- Prior art keywords
- image
- boundary
- data
- pixels
- values
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/272—Means for inserting a foreground image in a background image, i.e. inlay, outlay
- H04N5/275—Generation of keying signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/64—Circuits for processing colour signals
- H04N9/74—Circuits for processing colour signals for obtaining special effects
- H04N9/75—Chroma key
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Processing Or Creating Images (AREA)
Abstract
The present invention relates to the field of processing image data and particularly, although not exclusively, to a method of processing image data wherein a displayable matte image is represented as an array of pixels. A displayable matte image 400 is surrounded by an outer boundary comprising a plurality of Bezier curves 501 - 507. An intermediate region between the outer boundary and the matte image is shaded with pixel intensity values having a gradual transition between a first intensity and a second intensity. The method may be applied to give a soft edge to a keyed-in matte image.
Description
Title: PROCESSING IMAGE DATA
The present invention relates to a method of processing image data in the field of post-production video and film editing.
INTRODUCTION
Systems for processing image data in response to manual operation of a control device are known, in which image data is stored as an array of pixel values. Each pixel value may represent a luminance level or, in a full-color system, pixel values may be stored for color components such as additive red, green and blue components; subtractive cyan, magenta, yellow and black components; or, particularly in video systems, luminance plus chrominance color-difference components.
In addition to application in television post-production facilities, systems of this type are being used increasingly for the production of cinematographic film, where the information content of each frame of film is substantially higher. Film clips are scanned on a frame-by-frame basis, producing large volumes of image data for subsequent manipulation. Pixel data may be processed under program control to produce visual effects, and a system of this type is produced by the present applicant and distributed under the trade mark "FLAME". In addition to manipulating images as part of a post-production process, it may become necessary to make manual modifications to image data, possibly to remove unwanted items from an image or to change or add color etc.
Traditionally, in order to make manual modifications, each frame must be examined individually so that changes can be made to the image data on a frame-by-frame basis. This takes a considerable amount of time and limits the extent to which techniques of this type may be used for particular productions. In known systems, pixel data from an image frame is read from storage and manipulated in response to manual operation of a device, possibly a manually operated stylus upon a graphics touch tablet, resulting in modifications being made to the pixel data, whereafter said pixel data is written back to the stored frame.
One way of modifying the image data on a frame-by-frame basis is to key a first video image into a second video image, to produce a composite video image. The first video image comprises a key matte. The key matte may be abstracted from a video image frame by defining a plurality of points around a perimeter of a feature of the video image frame and then applying a bezier function, by means of a computer algorithm, to the selected points in order to define a line around the perimeter of the feature. By selecting a plurality of points around the perimeter and fitting a plurality of bezier curves which link up with each other, the perimeter of the feature can be defined by the bezier curves and the feature abstracted from the video image on a frame-by-frame basis, to produce a video matte which can be keyed into the second video image on a frame-by-frame basis.
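Purely as an illustrative aside, the following is a minimal sketch of keying a matte into a background frame. It assumes the frames and the matte are held as NumPy arrays normalised to the range 0 to 1; the function name `key_matte` and the array layout are assumptions made for this sketch, not details taken from the patent.

```python
import numpy as np

def key_matte(foreground: np.ndarray,
              background: np.ndarray,
              matte: np.ndarray) -> np.ndarray:
    """Composite a foreground frame over a background frame using a matte.

    foreground, background: (H, W, 3) float arrays in the range 0..1
    matte: (H, W) float array in the range 0..1, where 1 means fully keyed in
    """
    alpha = matte[..., np.newaxis]  # broadcast the matte over the colour channels
    return alpha * foreground + (1.0 - alpha) * background
```

With a hard 0/1 matte taken directly from the fitted bezier outline, a composite of this kind shows exactly the abrupt edge discussed below; softening the matte values near its perimeter is what produces the gradual transition addressed by the present method.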
However, there is a problem in that the bezier function produces an abruptly defined edge which is visually distinguishable in the finished composite image.
It is known to "paint" over the bezier edge for each frame; however, such a process is slow.
SUMMARY OF THE INVENTION
Aspects of the present invention are defined by the claims herein.
BRIEF DESCRIPTION OF THE DRAWINGS
For a better understanding of the invention, and to show how the same may be carried into effect, reference will now be made to the accompanying drawings in which:
Figure 1 shows a post-production video artist making manual modifications to a composite frame of video image data using image processing apparatus embodying the present invention, connected to a display monitor and a bulk storage device;
Figure 2 shows a schematic representation of the processing apparatus identified in Figure 1;
Figure 3 details the video monitor shown in Figure 1;
Figure 4 shows a video matte image;
Figure 5 shows the video matte image of Figure 4, having applied thereto an outer boundary comprising a set of bezier curves;
Figure 6 shows a region peripheral to the video matte of Figure 4; and
Figure 7 shows the matte image having a perimeter with graded shading.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS AND METHODS
The invention will now be described by way of example only, with reference to the accompanying drawings identified above.
A post-production facility is illustrated in Figure 1, in which a video artist 101 is seated at a processing station 102. Images are displayed to the artist via a visual display unit 103 and manual modifications to displayed images are effected in response to manual operation of a stylus 104 upon a graphics touch tablet 105. In addition, a conventional keyboard 106 is provided to allow alpha-numeric values to be entered directly. The monitor 103, tablet 105 and keyboard 106 are interfaced to an image processor 107, which may be based substantially upon a graphics workstation executing the UNIX operating system.
Image data is supplied to the image processor 107 via a digital video tape recorder 108, which may be configured to supply full bandwidth broadcast quality video data to the image processor at video rate.
Alternatively, general purpose data storage tape drives may be used, and image frames substantially larger than video frames, such as image frames derived from cinematographic film, may be received and processed within the system.
Image processor 107 is detailed in Figure 2. The processor includes a central processing unit 201, arranged to receive program instructions from an addressable random access memory 202. The processing unit 201 may also access a hardware graphics card 203, provided as part of the UNIX environment, allowing computationally extensive operations to be effected at very high speed.
Image data is held within the random access memory 202 as modifications are taking place and large data volumes are held on a disk store 204, preferably taking the form of an array of concurrently accessible disks. The processing unit 201 communicates with the display unit 103, the graphics tablet 105, the keyboard 106 and the video recorder 108 via interface circuits 205 and additional interface circuits may be provided, such as an SCSI interface etc, to allow communication with conventional data manipulation and storage devices etc.
In response to program instructions read from RAM 202, the CPU 201 generates image data which is in turn displayed on the monitor 103. An image of the type displayed on said display unit 103 is illustrated in Figure 3.
Display unit 103 is a twenty inch non-interlaced visual display unit. The displayed image 301 may be considered as being made up of two component parts, taking the form of a working "canvas" 302 and a control panel 303. Image data is displayed in the region of said canvas 302 either as individual frames or as a moving video/film clip.
While an image is being displayed on the canvas 302, pixel data may be modified in response to manual operations of the stylus 104 upon the touch tablet 105. The position of the stylus 104 is identified to the artist 101 by means of a cursor 304, which tracks the position of the stylus 104 as it is moved over the touch tablet 105. Thus, for example, a color process may be selected, resulting in modifications being made to pixel colors as the stylus is moved over the touch tablet. As the cursor 304 moves across the canvas 302, pixels within a region associated with said cursor appear to undergo color modifications, in a process which appears similar to applying conventional paint upon a physical canvas. However, although the process appears to operate in this way, the pixel modification actually occurs in a separate object, with new pixel data being created in response to movement of the stylus for subsequent combination with the existing data for display purposes.
As the new pixel data is being created, the existing pixel data remains unmodified; the new object data is combined with the existing data in real time, as part of a real-time mixing process, to allow the creation and modification of said data interactively in response to manual operation of the control device, that is to say, in response to manual operation of the stylus 104 upon the touch tablet 105. Subsequently, when the required modification has been made, the old data may be finally replaced with the new data, whereafter the image may be written back to the tape recorder 108 for subsequent on-line editing. The monitor 103 includes a control panel 305 for the control of monitor variables, as is well known in the art.
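As a hedged sketch of this layered approach, the stroke below is written into a separate layer and coverage mask and is only mixed over the unmodified original for display; the circular hard-edged brush, the array names and the mixing function are illustrative assumptions rather than details of the FLAME system.

```python
import numpy as np

def stamp_brush(layer: np.ndarray, mask: np.ndarray,
                cx: int, cy: int, radius: int, colour) -> None:
    """Write `colour` into the separate layer (and coverage into `mask`) around the cursor."""
    h, w = mask.shape
    ys, xs = np.ogrid[:h, :w]
    hit = (xs - cx) ** 2 + (ys - cy) ** 2 <= radius ** 2
    layer[hit] = colour   # new pixel data goes into the separate object only
    mask[hit] = 1.0

def mix_for_display(original: np.ndarray, layer: np.ndarray,
                    mask: np.ndarray) -> np.ndarray:
    """Combine the new object data over the untouched original, as for real-time display."""
    a = mask[..., np.newaxis]
    return a * layer + (1.0 - a) * original
```

Because the original array is never written to until the modification is accepted, discarding the layer and mask leaves the stored frame exactly as it was.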
It will be appreciated that images displayed on the display unit 103 are modified in response to operations of the tablet and stylus. Modifications to the displayed images correspond to modifications made to the image data by the processor 107, and all modifications to images referred to hereafter correspond to modifications to image data.
Referring to Figure 4 of the accompanying drawings, there is shown a single frame of a video matte 400. The video matte comprises an enclosed region defining an image feature. At the perimeter 401 of the matte there is an abrupt transition from high-intensity pixels to a uniform-intensity background portion 402 of the frame.
Referring to Figure 5 of the accompanying drawings, there is shown the video matte 400 displayed on the display device 103. The editor, using the graphics tablet 105 and stylus 104, may define a plurality of points on the displayed frame image at positions outside and around the matte image 400.
A plurality of such points A, B, C, D, E, ..., N are shown in Figure 5. The plurality N of points A - N may be linked into a continuous outer ring, outside the matte image 400, by fitting a plurality of bezier curves 501, 502, 503, 504, 505, 506, 507 to the plurality N of selected data positions in the image frame.
A parametric bezier curve is shown in Figure 6. The bezier curve consists of a locus of points moving from a first control point a at one end of the curve to a second control point b at the other end of the curve. The positions of curvature control points c and d control the curvature of the bezier curve. The locus of points is calculated over the parameter range t = 0 to t = 1. For each value of t, a point may be calculated which receives contributions from the positions of points a, b, c and d. The respective proportions of the contributions which go to make up the bezier curve are determined from a bicubic function; an example of a bicubic function is the Bernstein blending function shown in Figure 6b.
Points along the curve are calculated by substituting the values of the x, y and z coordinates of the control points a, b, c and d into the bicubic function in order to calculate revised values of x, y and z as the parameter t varies between zero and one.
Bezier curves have the beneficial property that a curve may be defined at any required resolution from input data consisting only of the coordinates of points a, b, c and d. A whole bezier curve can be manipulated by manipulating the positions of only the four control points defining the curve.
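By way of illustration, a sketch of evaluating such a curve is given below, using the cubic Bernstein blending weights and naming the end points a and b and the curvature control points c and d as in the text; the sampling resolution and function names are arbitrary choices made for this sketch.

```python
def bezier_point(a, b, c, d, t):
    """Return the curve point at parameter t (0 <= t <= 1).

    a and b are the end control points; c and d are the curvature control points.
    Each point is a tuple of coordinates, e.g. (x, y) or (x, y, z).
    """
    u = 1.0 - t
    # Cubic Bernstein blending weights for the contributions of a, c, d and b.
    w = (u ** 3, 3.0 * u ** 2 * t, 3.0 * u * t ** 2, t ** 3)
    return tuple(w[0] * pa + w[1] * pc + w[2] * pd + w[3] * pb
                 for pa, pc, pd, pb in zip(a, c, d, b))

def bezier_curve(a, b, c, d, samples=64):
    """Sample the curve at `samples` evenly spaced values of t, at any required resolution."""
    return [bezier_point(a, b, c, d, i / (samples - 1)) for i in range(samples)]
```

At t = 0 the curve passes through point a and at t = 1 through point b, consistent with the tangency properties described next.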
When points a, b, c and d are connected by vectors ac, cd and db, the bezier curve is tangential to vector ac at point a and tangential to vector db at point b. Additionally, the positions of points c and d control the extent of curvature of the curve; such points are referred to as curvature control points.
Referring again to Figure 5 of the accompanying drawings, the outer ring comprising bezier curves 501 - 507 surrounds the matte image 400.
Bezier curves may be advantageously used to define the outer ring comprising an outer boundary surrounding the matte, since the bezier curves can be defined with any required resolution by specifying a relatively low number of end points and control points, and using functions which do not require a heavy computational effort.
The perimeter 401 of the matte image defines a first, inner, boundary, and the bezier ring 501 - 507 defines a second, outer, boundary. Between the first and second boundaries is an intermediate region 520. The intermediate region is divided into a plurality of individual polygons, or substantially polygon-shaped areas 530, by division lines extending across the intermediate region between the first and second boundaries. The number of such polygons may be varied by a preselected number entered via the keyboard 106 of the processing workstation. The plurality of polygons may be defined automatically, in accordance with a predetermined algorithm, by operation of a single key or of an interface device, e.g. a mouse or stylus, or the user may individually define each polygon by pointing, with the stylus and touch tablet, to the appropriate positions on the first and second boundaries between which a dividing line is to extend.
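One possible automatic division is sketched below, under the assumption that the inner and outer boundaries have each been sampled into the same number of points; it pairs corresponding points on the two boundaries into a ring of quadrilaterals. The pairing scheme is an assumption, since the patent leaves the division algorithm open.

```python
def divide_intermediate_region(inner_pts, outer_pts):
    """Pair corresponding inner/outer boundary points into quadrilateral polygons.

    inner_pts, outer_pts: equal-length lists of (x, y) tuples sampled in the
    same winding order around the matte.
    """
    assert len(inner_pts) == len(outer_pts)
    n = len(inner_pts)
    polygons = []
    for i in range(n):
        j = (i + 1) % n  # wrap around so the last polygon closes the ring
        polygons.append((inner_pts[i], inner_pts[j], outer_pts[j], outer_pts[i]))
    return polygons
```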
Referring to Figure 7 herein, there is shown, as an example, one such polygon 701 between the first, inner, and second, outer, boundaries.
The pixel data corresponding to the polygon may be rendered or shaded in intensity. For example, where the matte image 400 is of a blue color, pixels immediately adjacent the first, inner, boundary may be rendered to be of the same intensity and color as the matte image, and pixels nearest the second, outer, boundary may be rendered such that, at the outer boundary, pixels have zero intensity in the blue component. Between the inner and outer boundaries, the intensity of the blue component of the pixels may be increased gradually towards the inner boundary, with the overall effect that the area is shaded by an intensity transition between zero intensity at the outer boundary and full intensity, corresponding to the intensity of the matte image, at the inner boundary.
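A minimal sketch of this graded shading is given below, assuming each pixel of the polygon has been reduced to a single parameter s between 0 (on the inner boundary) and 1 (on the outer boundary); how s is obtained, for example as the pixel's relative distance between the two boundaries, is an assumption not specified in the text.

```python
def shaded_intensity(inner_value: float, outer_value: float, s: float) -> float:
    """Linearly interpolate the intensity across the intermediate region.

    s = 0 gives the matte intensity at the inner boundary,
    s = 1 gives the background intensity at the outer boundary.
    """
    return (1.0 - s) * inner_value + s * outer_value

# Example for the blue matte described above:
# shaded_intensity(1.0, 0.0, 0.25) -> 0.75 (a quarter of the way towards the outer boundary)
```

Applied independently to each of the red, green and blue components, the same interpolation also gives the colour transitions described further below.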
Shading may alternatively be carried out by introducing pixels of full intensity into the polygon area but varying the spatial density of those pixels, from full density (i.e. complete coverage) adjacent the inner boundary, to sparse density near the outer boundary, and to zero density at the outer boundary.
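A sketch of this density-based alternative is shown below; using a random threshold to decide whether a full-intensity pixel is written is one assumed way of realising the "sparse density" described, not a method given by the patent.

```python
import random

def density_shaded_pixel(full_value: float, s: float) -> float:
    """Write the full-intensity value with probability (1 - s), otherwise leave zero.

    s = 0 (inner boundary) gives complete coverage; s = 1 (outer boundary) gives none.
    """
    return full_value if random.random() < (1.0 - s) else 0.0
```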
The above example relates to a matte image having a blue component only; however, pixels having red, green and blue components may be present in the polygon in various shades. Thus, a transition from a completely blue component near the inner boundary to a completely red component near the outer boundary, with grades of red/blue therebetween, may be achieved. The same applies to other, more complex, color combinations involving red, green and blue components.
Each successive polygon may be rendered on a polygon-by-polygon basis. The graded shading process is carried out on the image data by a computer algorithm executed in the central processing unit.
Referring to Figure 7 of the accompanying drawings, there is shown schematically the matte image 400 having a graded shaded intermediate region between the inner and outer boundaries to produce a matte having a "soft" edge in accordance with the above preferred method.
Claims (10)
1. A method of processing image data wherein a displayable matte image is stored as image data representing an array of pixel values, the method comprising the steps of:
(a) defining a first boundary edge of said matte image;
(b) displaying said image on a display device;
(c) defining a second boundary around said matte image such as to create an intermediate region between said first boundary and said second boundary;
(d) allocating first pixel intensity values to pixels in the intermediate region in the immediate vicinity of said first boundary;
(e) allocating second pixel intensity values to pixels in the intermediate region at or near the second boundary; and
(f) for pixels in the intermediate region between said first and second boundaries, allocating intensity values between said first and said second pixel intensity values.
2. A method according to Claim 1 in which said step of defining a second boundary comprises selecting a start point data and an end point data and calculating a Bezier curve between said start point data and said end point data.
3. A method according to Claims 1 or 2, in which in said step (d), said first pixel intensity values are selected at a value or values the same or substantially the same as intensity values corresponding to pixels at a region of the matte image at or near the first boundary.
4. A method according to any one of the preceding claims, in which in said step (e), said second pixel values are selected at a value or values the same or substantially the same as pixel intensity values of pixels in the region outside of and immediately surrounding said second boundary.
5. A method according to any one of the preceding claims, in which in said step (f), said pixels between said first and second boundaries are allocated intensity values corresponding to a gradual transition between the first and second intensity values.
6. A method according to any one of the preceding claims, in which said intermediate region created in step (c) is divided into a plurality of substantially polygonal areas.
7. A method according to any one of the preceding claims, in which said second boundary is defined by selecting a start data corresponding to a start position, an end data, corresponding to an end position, and at least one control data corresponding to a control position, and fitting a curve to said start data and said end data in accordance with said control data.
8. A method according to Claim 7, in which said curve is determined by a polynomial function.
9. A method according to Claims 7 or 8, in which said curve is fitted to said start and end data, by a Bezier function.
10. A method of processing image data, wherein a displayable image is stored as image data representing an array of pixel values, the method comprising the steps of:
obtaining said image data;
defining a first boundary edge of said image;
calculating a second boundary edge which is either totally enclosed by or totally surrounds said first boundary edge, such that an intermediate region is defined between said first and second boundary edges;
dividing said intermediate region into a plurality of polygons; and
calculating pixel values within each of said polygons to provide an intensity transition between said defined first boundary edge and said calculated second boundary edge.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB9607650A GB2312120A (en) | 1996-04-12 | 1996-04-12 | Producing a transition region surrounding an image |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB9607650A GB2312120A (en) | 1996-04-12 | 1996-04-12 | Producing a transition region surrounding an image |
Publications (2)
Publication Number | Publication Date |
---|---|
GB9607650D0 (en) | 1996-06-12 |
GB2312120A (en) | 1997-10-15 |
Family
ID=10791994
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB9607650A Withdrawn GB2312120A (en) | 1996-04-12 | 1996-04-12 | Producing a transition region surrounding an image |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2312120A (en) |
-
1996
- 1996-04-12 GB GB9607650A patent/GB2312120A/en not_active Withdrawn
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2262860A (en) * | 1991-12-21 | 1993-06-30 | Philips Nv | Chroma key signal generator with soft zone type keying |
GB2270439A (en) * | 1992-09-03 | 1994-03-09 | Philips Electronics Nv | Chromakey method and apparatus for picture signals |
EP0701379A2 (en) * | 1994-09-10 | 1996-03-13 | Philips Patentverwaltung GmbH | Process and circuit arrangement for color keying |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6351557B1 (en) | 1998-04-03 | 2002-02-26 | Avid Technology, Inc. | Method and apparatus for color manipulation |
US6417891B1 (en) | 1999-04-16 | 2002-07-09 | Avid Technology, Inc. | Color modification on a digital nonlinear editing system |
US6552731B1 (en) | 1999-04-16 | 2003-04-22 | Avid Technology, Inc. | Multi-tone representation of a digital image on a digital nonlinear editing system |
US7973800B2 (en) | 1999-04-16 | 2011-07-05 | Avid Technology, Inc. | Source color modification on a digital nonlinear editing system |
US6477271B1 (en) | 2000-04-07 | 2002-11-05 | Avid Technology, Inc. | Secondary color modification of a digital image |
US6763134B2 (en) | 2000-04-07 | 2004-07-13 | Avid Technology, Inc. | Secondary color modification of a digital image |
US8452105B2 (en) | 2008-05-28 | 2013-05-28 | Apple Inc. | Selecting a section of interest within an image |
US8331685B2 (en) | 2008-05-28 | 2012-12-11 | Apple Inc. | Defining a border for an image |
EP2458552A1 (en) * | 2008-05-28 | 2012-05-30 | Apple Inc. | Defining a selection border in an image as a pair of deformable curves |
US8548251B2 (en) | 2008-05-28 | 2013-10-01 | Apple Inc. | Defining a border for an image |
US8571326B2 (en) | 2008-05-28 | 2013-10-29 | Apple Inc. | Defining a border for an image |
US8619093B2 (en) | 2010-07-20 | 2013-12-31 | Apple Inc. | Keying an image |
US8675009B2 (en) | 2010-07-20 | 2014-03-18 | Apple Inc. | Keying an image in three dimensions |
US8743139B2 (en) | 2010-07-20 | 2014-06-03 | Apple Inc. | Automatically keying an image |
US8582834B2 (en) | 2010-08-30 | 2013-11-12 | Apple Inc. | Multi-image face-based image processing |
US8760464B2 (en) | 2011-02-16 | 2014-06-24 | Apple Inc. | Shape masks |
US8891864B2 (en) | 2011-02-16 | 2014-11-18 | Apple Inc. | User-aided image segmentation |
Also Published As
Publication number | Publication date |
---|---|
GB9607650D0 (en) | 1996-06-12 |
Similar Documents
Publication | Title |
---|---|
US6445816B1 (en) | Compositing video image data | |
US5754183A (en) | Image processing apparatus and method for producing pixel data in dependence upon the shape of a sectional line extending between boundary lines of an object | |
US6337692B1 (en) | Primary and secondary color manipulations using hue, saturation, luminance and area isolation | |
US6897880B2 (en) | User interface for generating parameter values in media presentations based on selected presentation instances | |
CA2079918C (en) | Image editing system and method having improved multi-dimensional editing controls | |
EP0947955B1 (en) | Apparatus for generating custom gamma curves for color correction equipment | |
US6751347B2 (en) | Color diamond chroma keying | |
US5077610A (en) | Previewing cuts and transformations in an electronic image composition system | |
US6850249B1 (en) | Automatic region of interest tracking for a color correction system | |
US7003178B2 (en) | Color matching image data | |
WO1992021096A1 (en) | Image synthesis and processing | |
Wallace | Merging and transformation of raster images for cartoon animation | |
US20070253640A1 (en) | Image manipulation method and apparatus | |
US6757425B2 (en) | Processing image data to transform color volumes | |
JP2000149053A (en) | Image processor including blending processing and its method | |
GB2312120A (en) | Producing a transition region surrounding an image | |
GB2236638A (en) | Predicted graphic image movement path display using keyframes | |
US5621821A (en) | Apparatus and method for detecting distortions in processed image data | |
US6052109A (en) | Processing image data | |
US5151686A (en) | Electronic brush generation | |
GB2312123A (en) | Post production film or video editing using "warping" | |
US6295369B1 (en) | Multi-dimensional color image mapping apparatus and method | |
EP0947957B1 (en) | Defining color regions for applying image processing in a color correction system | |
EP1160727A2 (en) | Apparatus, method and system for selective image processing | |
Moioli | Render, Compositing, and Video Editing |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) | |