WO2013122022A1 - Image Evaluation Apparatus and Pattern Shape Evaluation Apparatus
- Publication number: WO2013122022A1 (PCT/JP2013/053175)
- Authority: WIPO (PCT)
- Prior art keywords: pattern, image, contour, model, unit
Classifications
- G06T7/337 — Image registration using feature-based methods involving reference images or patches
- G06T7/001 — Industrial image inspection using an image reference approach
- H01L22/12 — Measuring as part of the manufacturing process for structural parameters, e.g. thickness, line width, defects, optical inspection
- H01J37/222 — Image processing arrangements associated with the discharge tube
- H01J37/28 — Electron or ion microscopes with scanning beams
- G01B2210/56 — Measuring geometric parameters of semiconductor structures, e.g. profile, critical dimensions or trench depth
- G06T2207/10061 — Microscopic image from scanning electron microscope
- G06T2207/10144 — Varying exposure (special mode during image acquisition)
- G06T2207/30148 — Semiconductor; IC; Wafer (industrial image inspection)
- H01J2237/221 — Image processing (treatment of data)
- H01J2237/24578 — Spatial variables, e.g. position, distance
- H01J2237/24592 — Inspection and quality control of devices
- H01J2237/2801 — Scanning microscopes, details
- H01J2237/2817 — Pattern inspection (scanning microscopes characterised by the application)
Description
- The present invention relates to an apparatus for evaluating the shape of a semiconductor pattern.
- In particular, it relates to a pattern shape evaluation apparatus and the like suited to finding appropriate semiconductor manufacturing conditions, or to extracting the parameters needed to find such conditions.
- Conventionally, a length-measurement SEM has been used to evaluate whether a formed pattern is as designed: the width of a line pattern, the diameter of a hole, and so on are measured, and the pattern shape is managed through these dimensions.
- With the miniaturization of semiconductors, it has become common to form patterns smaller than the exposure wavelength, and super-resolution techniques such as modified illumination and optical proximity correction have been introduced. Deformations that are difficult to measure with pattern dimensions alone arise, such as the inclination of the pattern side wall, rounding and constriction at pattern corners, and deformation of the pattern caused by changes in the aberration of the exposure tool. For this reason, a technique is known (see Patent Document 1) that generates contour lines for the top and bottom of the pattern side wall and evaluates the inclination of the side wall from the two-dimensional shape of the pattern and the width of the white band.
- In the projection exposure method for transferring a semiconductor pattern onto a wafer, exposure light is applied to a photomask of shielding material on which the pattern to be printed is drawn, and the photomask image is projected through a lens system onto the resist on the wafer.
- The transfer is controlled by determining the focus and the exposure amount.
- If the exposure tool drifts, the focus and exposure amount shift, the dimensions and shape of the transferred pattern may change, and a normal pattern may not be obtained. The focus may also be lost because of non-flatness or because of lens aberration, including that caused by the photomask.
- Astigmatism causes a phenomenon in which the condensing position differs between the horizontal direction and the vertical direction; as a result, the pattern becomes elliptical, with different vertical and horizontal dimensions.
- A method of obtaining a focus value using line patterns is known (see Patent Document 2): a focus value in the horizontal direction is obtained using a vertical line pattern, and a focus value in the vertical direction is obtained using a horizontal line pattern.
- Patent Document 1: JP 2004-228394 A; Patent Document 2: JP 2005-64023 A (corresponding US Pat. No. 6,929,892); Patent Document 3: JP 2008-140911 A (corresponding US Pat. No. 8,023,759)
- In the following, a pattern shape evaluation apparatus is proposed whose second purpose is to evaluate the exposure conditions in the X direction and/or the Y direction and to output those exposure conditions, or adjustment conditions for the exposure conditions.
- As one aspect, an image evaluation apparatus for obtaining the exposure conditions of a semiconductor pattern from an image captured with an electron beam is proposed, comprising: a storage unit that stores a model indicating the relationship between feature amounts obtained from a plurality of contour lines generated from an SEM image and the exposure conditions, together with the contour generation parameter information corresponding to that model; a contour generation unit that generates a plurality of contour lines from the SEM image using the contour generation parameter information; and an estimation unit that obtains the exposure conditions using the model and the feature amounts obtained from the plurality of contour lines generated by the contour generation unit.
- Further, an image evaluation apparatus is proposed in which the contour generation unit generates three or more contour lines.
- Further, an image evaluation apparatus is proposed in which the contour generation parameter information is the information the contour generation unit uses to generate contour lines, namely the number of contour lines and the parameters for generating each contour line corresponding to that number.
- Further, an image evaluation apparatus is proposed that creates a model using a plurality of SEM images, comprising: a contour generation unit that generates a plurality of contour lines from each SEM image using contour generation parameter information; and a model generation unit that creates a model expression from the feature amounts obtained from the plurality of contour lines generated by the contour generation unit and the exposure conditions corresponding to the SEM images; a plurality of contour generation parameter sets are passed through the contour generation unit and the model generation unit, so that a model is created for each parameter set.
- Further, to achieve the second purpose above, a pattern shape evaluation apparatus is proposed that obtains feature amounts of the target pattern in a plurality of directions, assigns weights to the feature amounts for those directions, and, based on the weights, obtains the parameters required to adjust the exposure conditions in a specific direction.
- With this configuration, the parameters required for the exposure conditions in the X and/or Y direction can be obtained from feature amounts in directions other than X and Y. Exposure conditions can therefore be adjusted appropriately using actual patterns, in which the amount of X- and Y-direction edges may be insufficient compared with a dedicated test pattern. Moreover, since feature amounts in the X and Y directions can be extracted from edges in other directions as well, a highly accurate evaluation based on a sufficient amount of information becomes possible.
- A diagram showing an embodiment of the image evaluation apparatus; a diagram showing an embodiment of the contour generation unit.
- A diagram showing an embodiment of the GUI of the display means; a diagram showing the …
- A diagram showing the direction and feature amounts of a contour line; a diagram showing an embodiment of the horizontal direction estimation unit and the vertical direction estimation unit.
- The image evaluation apparatus exemplified in the embodiments below relates to a method and apparatus for monitoring process fluctuations from the pattern image data obtained by SEM imaging. As a specific example, process variation is detected from the image data using the two-dimensional shape of a plurality of contour lines of the pattern.
- In the following, a charged particle beam apparatus is illustrated as the apparatus that forms the image, and an example using an SEM is described as one aspect. This is not restrictive: a focused ion beam (FIB) apparatus that scans a beam over the sample to form an image may also be employed as the charged particle beam apparatus.
- FIG. 19 is a schematic explanatory diagram of a measurement / inspection system in which a plurality of measurement or inspection devices are connected to a network.
- The system connects to the network, among others, a CD-SEM 2401, which measures the pattern dimensions of semiconductor wafers, photomasks, and the like, and a defect inspection apparatus 2402, which irradiates a sample with an electron beam to acquire an image and extracts defects by comparing that image with a pre-registered reference image.
- Also connected to the network are a condition setting device 2403, which sets measurement positions and measurement conditions on the design data of the semiconductor device; a simulator 2404, which simulates pattern quality from the design data and the manufacturing conditions of the semiconductor manufacturing equipment; and a storage medium 2405, which stores design data in which the layout data and manufacturing conditions of semiconductor devices are registered.
- The design data is expressed, for example, in the GDS or OASIS format and stored in a predetermined form. Any type of design data is acceptable as long as the software that displays it can reproduce its format and handle it as graphic data.
- The storage medium 2405 may instead be built into the measuring device, the control device of the inspection device, the condition setting device 2403, or the simulator 2404.
- The CD-SEM 2401 and the defect inspection device 2402 are each provided with a control device that performs the control the device requires; these control devices may also be given the simulator functions and the measurement-condition setting functions.
- In the SEM, an electron beam emitted from an electron source is focused by a plurality of lenses, and the focused beam is scanned one- or two-dimensionally over the sample by a scanning deflector. Secondary electrons (SE) or backscattered electrons (BSE) emitted from the sample during the scan are detected and stored in a frame memory in synchronization with the scanning of the deflector. The image signals stored in the frame memory are integrated by an arithmetic device mounted in the control device. Scanning by the deflector is possible for any size, position, and direction.
- The control described above is performed by the control device of each SEM, and the images and signals obtained by the electron beam scan are sent to the condition setting device 2403 via a communication network.
- In this description the control device that controls the SEM and the condition setting device 2403 are treated as separate units, but this is not restrictive: device control and measurement processing may be performed together in the condition setting device 2403, or SEM control and measurement processing may both be carried out in each control device.
- The condition setting device 2403 or the control device stores a program for executing the measurement process, and measurement or calculation is performed according to that program.
- The condition setting device 2403 has the function of creating a program (recipe) for controlling the operation of the SEM based on semiconductor design data; that is, it functions as a recipe setting unit. Specifically, it sets, on design data, pattern contour data, or simulated design data, the positions at which processing required by the SEM is to be performed, such as desired measurement points, autofocus, auto-astigmatism, and addressing points, and based on these settings it creates a program that automatically controls the sample stage, deflector, and so on of the SEM. In addition, to create the templates described later, it extracts the information of the region that will serve as a template from the design data, and a processor that creates templates from the extracted information, or a program that causes a general-purpose processor to create templates, is built in or stored.
- FIG. 20 is a schematic configuration diagram of a scanning electron microscope.
- An electron beam 2503 extracted from an electron source 2501 by an extraction electrode 2502 and accelerated by an acceleration electrode (not shown) is focused by a condenser lens 2504, one form of focusing lens, and then scanned one- or two-dimensionally over a sample 2509 by a scanning deflector 2505.
- The electron beam 2503 is decelerated by a negative voltage applied to an electrode built into the sample stage 2508, focused by the lens action of the objective lens 2506, and irradiated onto the sample 2509.
- When the sample is irradiated, electrons 2510 such as secondary electrons and backscattered electrons are emitted from the irradiated portion.
- The emitted electrons 2510 are accelerated toward the electron source by the accelerating action of the negative voltage applied to the sample, collide with a conversion electrode 2512, and generate secondary electrons 2511.
- The secondary electrons 2511 emitted from the conversion electrode 2512 are captured by a detector 2513, whose output I changes with the amount of captured secondary electrons; the brightness of a display device (not shown) changes according to this output.
- For example, when forming a two-dimensional image, an image of the scanning region is formed by synchronizing the deflection signal to the scanning deflector 2505 with the output I of the detector 2513, as sketched below.
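The following minimal sketch is an illustrative toy model of that synchronization only, not the patent's implementation; `detector_output` is a hypothetical stand-in for the physical detector signal:

```python
import numpy as np

def acquire_sem_image(detector_output, width, height):
    """Form an image by sampling the detector output I in sync with the
    raster scan of the deflector (simplified toy model)."""
    image = np.zeros((height, width))
    for y in range(height):        # slow scan direction
        for x in range(width):     # fast scan direction
            # pixel (x, y) brightness follows the detector output at the
            # moment the beam is deflected to (x, y)
            image[y, x] = detector_output(x, y)
    return image

# toy detector: brighter response along a vertical "pattern edge" (white band)
demo = acquire_sem_image(lambda x, y: 200 if 40 <= x <= 44 else 30, 96, 96)
```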
- The scanning electron microscope illustrated in FIG. 20 also includes a deflector (not shown) that moves the scanning region of the electron beam.
- The example above converts the electrons emitted from the sample at a conversion electrode before detecting them, but the invention is of course not limited to this configuration: for example, the detection surface of an electron multiplier tube or of the detector may be arranged on the orbit of the accelerated electrons.
- The control device 2514 controls each component of the scanning electron microscope, and has a function of forming an image from the detected electrons and a function of measuring the width of a pattern formed on the sample from the intensity distribution of the detected electrons, called a line profile.
- The image evaluation device 1 can be built into the control device 2514, image evaluation can be executed by a computing device with built-in image processing, or image evaluation can be performed by an external computing device (for example, the condition setting device 2403) connected via the network.
- FIG. 1A is a diagram illustrating an example of the model creation unit 1 of the image evaluation apparatus, which creates a model expressing the relationship between SEM images and exposure conditions and outputs the contour generation parameters used for the model.
- An FEM (Focus Exposure Matrix) wafer, on which a pattern is printed while changing the exposure conditions (focus, exposure amount) for each shot (one exposure unit), is photographed with the SEM in advance; because the exposure conditions of each shot can be identified from the position on the wafer, this correspondence is used as the exposure condition information 30.
- To create the model, the information 30 on the plurality of different exposure conditions (focus, exposure amount) and the photographed SEM images 31 are used.
- The contour line generation unit 11 generates a plurality of contour lines from the SEM image 31 based on the contour line generation parameter 32.
- The contour generation parameter 32 holds the number of contour lines to be generated by the contour line generation unit 11 and the parameters for generating each contour line.
- The model generation unit 12 obtains feature amounts from the plurality of contour data generated by the contour generation unit 11, associates them with the exposure condition (focus, dose) information, and creates a model showing the relationship between the feature amounts and the exposure conditions. The evaluation unit 13 then evaluates the model created by the model generation unit 12.
- FIG. 13 shows an example of changes in the pattern side wall depending on the focus.
- As the focus changes to F1, F2, and F3, the upper part of the resist shrinks and the upper point PA moves to the right, while the lower point PB hardly changes.
- FIG. 14 shows the relationship between the threshold of the contour line and the pattern side wall.
- This is a hole pattern; (a) is a cross section of the pattern side wall along line i of the SEM image in (b).
- In the SEM image, the pattern shape appears as a white band, as shown in (b).
- The white band arises because the amount of secondary electrons from the pattern side wall increases, which raises the brightness, so a bright band appears along the pattern shape.
- The portion where the slope of the side-wall cross section is steepest corresponds to the position Pp of the luminance peak of the white band in the profile of (c).
- The pattern top PA and bottom PB correspond to the skirts Pa and Pb at the two ends of the peak of the white-band luminance profile. Which end of the white band corresponds to the top or the bottom of the pattern can be determined from information such as whether the inside or the outside of the pattern is concave or convex.
- The two ends of the white band face the inside and the outside of the pattern; since this is a hole pattern, the circle inside the white band is concave and corresponds to the bottom, while the circle outside the white band corresponds to the top.
- In this way, the top and bottom of the pattern, positions between them, and so on can be captured from the luminance profile.
- For example, with the white-band peak defined as 100%, a contour point is generated at the position inside the white band (to the right of the peak position) where the luminance falls to 50%.
- The line created by connecting, along the pattern shape, the points at the 50% position inside each white-band luminance profile is defined as the contour line created at 50% inside.
- In the same way, four contour lines can be created using the four thresholds 30%, 50%, 70%, and 90%.
- In this case, th (threshold) 30, th50, th70, and th90 are set as the parameters for generating the contour lines, with the peak taken as 100%.
- When a contour line outside the white band is to be used, a parameter such as "outside, threshold 50" may be set.
- The per-contour-line parameters held for the number of contour lines in the contour generation parameter 32 are this threshold information.
- In other words, each contour line is created with an arbitrary threshold corresponding to a height position on the pattern side wall.
- As described above, by using three or more contour lines, the amount of change at a plurality of resist height positions can be obtained, which improves the focus-estimation accuracy. If too many contour lines are used, however, the processing time becomes impractical, so ten or fewer lines are considered sufficient; with that many lines, fixed thresholds may be used without adjustment. When the number of contour lines must be kept small because of processing time, it is effective to target the resist height positions that change with focus: for example, contour lines are first created with thresholds covering many different resist height positions, the thresholds between which the change with focus is largest are identified, and the contour lines at the selected thresholds are used in the actual evaluation. A sketch of the threshold-based contour-point generation follows.
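As a minimal illustration of generating a contour point at a given percentage of the white-band peak (a sketch assuming a 1-D luminance profile across the white band; not the patent's implementation):

```python
import numpy as np

def contour_position(profile, threshold_pct, side="inside"):
    """Sub-pixel position on a white-band luminance profile where the
    brightness falls to threshold_pct% of the peak (peak = 100%).
    side="inside" searches to the right of the peak, "outside" to the left."""
    profile = np.asarray(profile, dtype=float)
    p = int(np.argmax(profile))                 # peak position Pp
    level = profile[p] * threshold_pct / 100.0
    step = 1 if side == "inside" else -1
    i = p
    while 0 <= i + step < len(profile):
        a, b = profile[i], profile[i + step]
        if a >= level >= b:                     # crossing found
            frac = (a - level) / (a - b) if a != b else 0.0
            return i + step * frac              # linear interpolation
        i += step
    return None                                 # threshold never reached

profile = [20, 40, 90, 140, 100, 60, 35, 25]    # toy profile across a white band
points = {th: contour_position(profile, th) for th in (30, 50, 70, 90)}
```

Connecting the points obtained for the profiles along the pattern yields the contour line for that threshold.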
- Models are therefore created while varying the contour generation parameter 32, and the evaluation unit 13 evaluates each resulting model. A contour generation parameter 32 that yields a well-evaluated model can be regarded as a parameter that generates contour lines which more effectively express the features needed to obtain the exposure conditions, that is, contour lines that better capture the shape change of the pattern side wall.
- FIG. 11 (a) shows an example of a plurality of contour generation parameters.
- The contour generation parameter 32 consists of the number of contour lines and the height positions (threshold values) corresponding to that number; information on the inside/outside of the white band and on the concavity/convexity of the pattern may be added.
- One contour generation parameter set is indicated on each line, and the list is processed in order from the top. Since the number of contour lines can be determined from the number of height-position (threshold) entries without stating it explicitly, the parameter may also simply be the list of height-position (threshold) values used to generate the contour lines, as shown in FIG. 11B.
- FIG. 2 shows an example of the contour line generator.
- The contour generation unit 11 generates a plurality of contour data 11a from the SEM image 31 based on the contour generation parameter 32.
- The parameters for generating the respective contour lines are read into the n contour line creation units 1101 to 11n, which then generate n contour lines from the SEM image.
- The generated contour data may be stored in the contour storage unit 1100.
- Here, n contour line creation units are used, but a single contour line creation unit may instead be invoked n times.
- Fig. 3 shows an example of the model generation unit.
- The model generation unit obtains feature amounts from the contour line data generated by the contour generation unit 11 and creates a model indicating the relationship between the focus and dose values and the feature amounts.
- The feature amount calculation unit 121 aligns the contour line data with a reference pattern and obtains the distance between each pixel of the contour line and the corresponding pixel position of the reference pattern.
- The alignment of the contour line data and the reference pattern can be performed, for example, by imaging both, dilating each, and applying a matching process using normalized correlation; alternatively, the center of gravity of each image may be obtained and the centers of gravity aligned. The alignment is not limited to these methods and can use any known matching technique.
- The reference pattern may be design data, simulation data, or image data or contour line data created from one or more SEM images.
- One of the plurality of contour data generated by the contour generation unit 11 may also be used as the reference pattern.
- After alignment, each contour pixel is associated with the reference-pattern pixel closest to it, and the distance between the associated pixels is obtained.
- The distances to the corresponding reference-pattern pixels are obtained for all contour pixels, and statistics of these distances over all pixels, for example the mean and the variance, are obtained as feature amounts.
- A plurality of feature amounts may be used, and the feature amounts are obtained for each contour line.
- Here the association is made starting from the contour pixels, but it may instead be made starting from the pixels of the reference pattern.
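A minimal sketch of this distance-statistics feature (assuming the contour and reference pattern are given as already-aligned point lists; the brute-force nearest-neighbour search is for illustration only):

```python
import numpy as np

def distance_features(contour_pts, reference_pts):
    """For each contour pixel, find the nearest reference-pattern pixel
    (assumed already aligned) and return distance statistics as features."""
    c = np.asarray(contour_pts, dtype=float)    # shape (N, 2)
    r = np.asarray(reference_pts, dtype=float)  # shape (M, 2)
    # distance of each contour pixel to its closest reference pixel
    d = np.linalg.norm(c[:, None, :] - r[None, :, :], axis=2).min(axis=1)
    return {"mean": float(d.mean()), "variance": float(d.var())}

# toy example: a slightly shifted contour against a reference outline
ref = [(x, 0) for x in range(10)] + [(x, 9) for x in range(10)]
con = [(x + 0.5, 0.3) for x in range(10)] + [(x, 8.6) for x in range(10)]
print(distance_features(con, ref))
```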
- It is also known that line edge roughness changes with the exposure conditions. Since line edge roughness has periodicity, it is also conceivable to obtain a spatial-frequency feature expressing that periodicity, for example via the Fourier transform (FFT), and to use it in determining the focus value.
- A model is created by the modeling unit 122 using the per-contour-line feature amounts obtained by the feature amount calculation unit 121 and the exposure condition (focus value, dose value) information 30.
- The model may be created by obtaining a regression equation, or by using linear programming.
- For example, the exposure condition Y can be expressed as a linear sum of weighting factors X1, X2, ..., Xn applied to the feature amounts A1, A2, ..., An: Y = X1·A1 + X2·A2 + ... + Xn·An. The model holds the values of the weighting factors X1, X2, ..., Xn.
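A minimal sketch of fitting and applying such a linear model with ordinary least squares, one plausible way to "obtain a regression equation" (not necessarily the patent's exact procedure):

```python
import numpy as np

def fit_exposure_model(feature_rows, exposure_values):
    """Fit the weighting factors X so that Y ~ X1*A1 + ... + Xn*An,
    where each row holds the feature amounts A1..An of one SEM image."""
    A = np.asarray(feature_rows, dtype=float)       # (num_images, n)
    Y = np.asarray(exposure_values, dtype=float)    # (num_images,)
    X, *_ = np.linalg.lstsq(A, Y, rcond=None)       # least-squares weights
    return X

def estimate_exposure(X, feature_row):
    """Apply the model: Y = X1*A1 + X2*A2 + ... + Xn*An."""
    return float(np.dot(X, feature_row))

# toy data: 4 images, 2 per-contour feature amounts each
X = fit_exposure_model([[1.0, 2.0], [2.0, 1.0], [3.0, 3.0], [1.5, 0.5]],
                       [0.30, 0.25, 0.55, 0.18])
print(estimate_exposure(X, [2.0, 2.0]))
```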
- The modeling unit 122 is provided with a storage unit such as a memory, in which the exposure condition information and the feature amounts obtained from the contour lines of the SEM images are stored.
- Fig. 4 shows an example of the evaluation unit.
- The evaluation unit 13 evaluates the plurality of models obtained by the model creation unit 1 and selects, based on the evaluation values, the best model and contour generation parameters from among them.
- The model evaluation unit 131 evaluates the goodness of fit of each model. For this evaluation, for example, the degrees-of-freedom-adjusted coefficient of determination may be obtained and used as the evaluation value; the evaluation value may instead be based on the Akaike information criterion (AIC), or other known techniques may be used.
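For reference, a sketch of the degrees-of-freedom-adjusted coefficient of determination (the standard textbook formula; whether the patent uses exactly this form is not stated):

```python
import numpy as np

def adjusted_r2(y_true, y_pred, n_params):
    """Degrees-of-freedom-adjusted coefficient of determination."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    n = len(y_true)
    ss_res = float(np.sum((y_true - y_pred) ** 2))   # residual sum of squares
    ss_tot = float(np.sum((y_true - y_true.mean()) ** 2))
    r2 = 1.0 - ss_res / ss_tot
    return 1.0 - (1.0 - r2) * (n - 1) / (n - n_params - 1)

# the fit score drops as parameters (e.g. per-contour weights) are added
print(adjusted_r2([0.1, 0.2, 0.3, 0.4, 0.5],
                  [0.12, 0.18, 0.31, 0.41, 0.48], n_params=2))
```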
- The evaluation value obtained by the model evaluation unit 131, the model, and the contour generation parameter are stored in the storage/selection unit 132, which outputs, from the stored set, the model 34 and the contour generation parameter 35 with the best evaluation value. Since the processing time grows with the number of contour lines generated, the evaluation value need not reflect the model fit alone: it may also be penalized as the number of contour lines increases, so that a smaller number of lines can improve the evaluation value.
- FIG. 1B is a diagram illustrating an example of the exposure condition estimation unit 2 of the image evaluation apparatus, which estimates the exposure conditions from an SEM image using the model 34 and the contour generation parameter 35 obtained by the image evaluation apparatus 1.
- Based on the contour generation parameter 35 obtained by the model creation unit 1, the contour generation unit 21 generates a plurality of contour lines from the SEM image; feature amounts are obtained from the generated contour lines, and the estimation unit 22 estimates the exposure conditions using the model obtained by the model creation unit 1.
- The contour generation unit 21 is the same as the contour generation unit 11 shown in FIG. 2. FIG. 5 shows an embodiment of the estimation unit.
- The feature amount calculation unit 221 calculates the feature amounts from the plurality of contour lines generated by the contour line generation unit 21; it is the same as the feature amount calculation unit 121 described with reference to FIG. 3.
- The model calculation unit 222 estimates the exposure conditions from the feature amounts obtained by the feature amount calculation unit 221, using the model obtained by the model creation unit 1; in this way the focus and exposure amount can be estimated.
- The estimated result may be displayed, as shown in FIG. 16, by presenting the exposure conditions estimated from the SEM images at the corresponding positions on the wafer map 386 of the GUI screen 38.
- The difference between the exposure condition set for the pattern and the exposure condition estimated by the model may also be indicated, and the display color may be changed according to the magnitude of the difference.
- A setting unit 381 is provided for setting the parameters used to generate the three or more contour lines from which the feature amounts used by the estimation unit are obtained. These settings may be entered manually by the user, or stored in a file in advance when the model is created and read from that file.
- The contour line generation parameters are the number of contour lines to be generated and their respective threshold values.
- When the exposure condition model shown in FIG. 1A is created, the set exposure condition and the exposure condition estimated by the model may be displayed side by side for the SEM image corresponding to each position indicated by the wafer map 386, as shown in FIG. 15. A window 383 may be provided that displays each model obtained using the plurality of contour generation parameters together with its evaluation value. One model may also be selected from the plurality of models, and the set exposure condition and the exposure condition estimated by the selected model may be displayed side by side at the position corresponding to the SEM imaging position on the wafer. A similar display is conceivable when the created model is used for estimation.
- It is also conceivable to provide a setting unit 382 that sets the maximum number of contour lines according to the allowable processing time, to have the contour line generation unit generate no more than the set maximum number of contour lines, and to obtain the model with the best evaluation value together with its contour line generation parameter.
- The contour generation parameters may also be determined automatically by generating a plurality of candidate parameter sets, without setting them from the outside; in that case, for example, an automatic mode may be selectable in the setting unit 382.
- For example, with the increment of the height position (threshold value) set to 10, eleven parameters from 0 up to the maximum value of 100 may be generated automatically in steps of 10. It is conceivable to obtain the model with the best evaluation value and its contour line generation parameter by varying the thresholds of a plurality of contour lines in this way, including ones on the inner and outer sides of the pattern (the top and bottom of the pattern side wall).
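A minimal sketch of such automatic parameter generation (the (side, threshold) pairing is an illustrative encoding, not the patent's data format):

```python
def auto_contour_parameters(step=10, max_lines=None):
    """Automatically generate candidate contour-generation parameters:
    height-position thresholds 0, step, ..., 100 (11 values for step=10),
    optionally capped at a maximum number of contour lines."""
    thresholds = list(range(0, 101, step))
    if max_lines is not None:
        thresholds = thresholds[:max_lines]
    # pair each threshold with a side of the white band: "inside" thresholds
    # follow the pattern bottom, "outside" thresholds the pattern top
    return [(side, t) for side in ("inside", "outside") for t in thresholds]

candidates = auto_contour_parameters()   # 22 (side, threshold) pairs
```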
- FIG. 6 shows a processing flow of an image evaluation method for obtaining exposure conditions.
- A model for estimating the exposure conditions is created in the model creation stage S10, and in the exposure condition estimation stage S20 the exposure conditions are estimated from an SEM image using the created model.
- Fig. 7 shows the processing flow of the model creation unit.
- In the contour generation parameter setting of S11, the model creation unit sets a plurality of contour generation parameters.
- In the contour line generation stage of S12, contour lines are generated based on one of the set contour line generation parameters.
- In S13, the contour generation stage is repeated until contour lines have been generated for all SEM images.
- In S14, the model creation stage is entered.
- Feature amounts are obtained from each of the plurality of contour lines generated from all the SEM images, and a model indicating the relationship with the exposure conditions is created.
- In S15, the fit of the created model is evaluated, and the evaluation result, the model, and its contour generation parameter are stored.
- FIG. 8 shows a processing flow in the contour generation stage.
- A contour generation parameter is read in S121, and a plurality of contour lines are generated in S122 based on the number of contour lines in the parameter and their respective threshold values.
- The generated contour data is stored in the storage means.
- Figure 9 shows the processing flow at the model creation stage.
- First, the contour line data corresponding to the SEM images and the exposure condition information are read.
- Next, the contour lines and the reference pattern are aligned.
- The distance from each contour pixel to the corresponding reference-pattern pixel is obtained, and in S144 the statistics of the distance values are calculated and stored in the storage means as feature amounts.
- Finally, the model is obtained by a regression equation, linear programming, or the like, based on the feature amounts and the exposure conditions.
- FIG. 10 shows the processing flow of the exposure condition estimation stage.
- In S21, the contour generation parameter and the model obtained in the model creation stage S10 are set, and in the contour generation stage of S22 a plurality of contour lines are generated based on the number of contour lines in the set parameter and their respective threshold values.
- In the estimation stage of S23, feature amounts are obtained from the plurality of contour lines, and the exposure conditions are estimated from them using the model.
- The contour generation stage of S22 is the same processing as the contour generation stage described above.
- Fig. 11 shows the processing flow in the estimation stage.
- In step S231, the contour lines and the reference pattern are aligned.
- In step S232, the distance from each contour pixel to the corresponding reference-pattern pixel is obtained.
- In step S233, the statistics (feature amounts) of the distance values over all pixels are obtained.
- In S234, the model is applied to the obtained feature amounts, and the exposure conditions are obtained.
- In the above embodiment the model is created by regression or linear programming, but a table may instead be created that returns the exposure conditions using the feature amounts obtained from the plurality of contour lines as an address, and the created table may then be used for the evaluation.
- In that case, the exposure conditions may be obtained by looking up the table entry whose feature values are closest to the feature values obtained from the plurality of contour lines.
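A minimal sketch of this table-based alternative (nearest-entry lookup; the Euclidean metric is an assumption):

```python
import numpy as np

def table_lookup_exposure(table_features, table_exposures, query_features):
    """Table-based alternative to a regression model: return the exposure
    condition whose stored feature vector is closest to the query."""
    F = np.asarray(table_features, dtype=float)    # (entries, n_features)
    q = np.asarray(query_features, dtype=float)
    nearest = int(np.argmin(np.linalg.norm(F - q, axis=1)))
    return table_exposures[nearest]

# toy table: feature vectors from three known (focus, dose) conditions
table = [[0.10, 0.30], [0.25, 0.20], [0.40, 0.45]]
conditions = [(-0.05, 30.0), (0.00, 32.0), (+0.05, 34.0)]
print(table_lookup_exposure(table, conditions, [0.27, 0.22]))  # -> (0.0, 32.0)
```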
- Further, when the diameter K of a hole pattern is smaller than a certain threshold, it is conceivable, when setting the contour line generation parameters, to set the thresholds of the plurality of contour lines so as to concentrate on the lower part of the pattern.
- FIG. 17 shows the change with focus at the line end (end point) portion of a pattern, viewed from above: the magnitude of the change between the shape at focus FA and the shape at focus FB differs between the center portion LC and the end portion L of the line end. Because the light intensity varies from part to part of the pattern owing to the wraparound of the exposure light, the change in the pattern side wall is considered to differ for each part, such as corner portions, straight portions, and line end (end point) portions. Therefore, when obtaining the exposure conditions from various patterns, it is conceivable to generate a plurality of contour lines for each part from the SEM image, obtain feature amounts, and create a model for each part.
- In that case, a contour generation parameter is obtained for each part.
- For example, the pattern is divided into the line end portions 151c and 152c, the straight portions 151a, 152a, 153a, and 154a, and the corner portion 15b; the model and its evaluation are performed for each part, and a model and a contour generation parameter are obtained per part.
- In the following, a pattern shape evaluation apparatus is described that includes a feature amount extraction unit for obtaining feature amounts from a pattern image including a closed curve, and an estimation unit that estimates the exposure conditions in at least the two directions, vertical and horizontal, using the feature amounts obtained by the feature amount extraction unit.
- Further, a pattern shape evaluation apparatus is proposed in which the feature amount extraction unit obtains feature amounts in at least the vertical and horizontal directions based on the direction information obtained by a direction detection unit that detects the direction of the white band, pattern edge, or contour line.
- Further, a pattern shape evaluation apparatus is proposed in which the feature amount extraction unit obtains the feature amounts separately, for each of two or more directions, within regions designated by the user.
- Further, a pattern shape evaluation apparatus is proposed in which the estimation unit estimates using a model or table indicating the relationship between the per-direction feature amounts obtained by the feature amount extraction unit and the exposure conditions.
- Further, a pattern shape evaluation apparatus is proposed that includes a direction-specific feature calculation unit which calculates, from the direction information obtained by the direction detection unit, the ratios for splitting a feature amount into at least the vertical and horizontal directions, and which thereby obtains the per-direction feature amounts.
- Further, a pattern shape evaluation apparatus is proposed that includes a feature amount extraction unit for obtaining feature amounts in at least the vertical and horizontal directions from a pattern image including a closed curve, and a model creation unit that creates a model using the per-direction feature amounts obtained by the feature amount extraction unit and the exposure conditions corresponding to the SEM images.
- Further, a pattern shape evaluation apparatus is proposed in which the model creation unit creates a model showing the relationship between the feature amounts and the exposure amount in at least the vertical and horizontal directions.
- Further, a pattern shape evaluation apparatus is proposed that includes a display unit capable of switching the display of the exposure conditions for each direction on a wafer map.
- Further, a pattern shape evaluation apparatus is proposed that includes an instruction unit allowing the user to designate the regions in which the per-direction feature amounts are obtained.
- As one aspect, a pattern shape evaluation apparatus for obtaining the exposure conditions of a semiconductor pattern from an image captured with an electron beam is proposed, comprising: a direction detection unit for obtaining direction information of the white band, pattern edge, or contour line from a pattern image including a closed curve; a feature amount extraction unit that obtains feature amounts in at least the vertical and horizontal directions based on the direction information of the direction detection unit; and an estimation unit that estimates focus values in at least the vertical and horizontal directions based on a model or table indicating the relationship between the per-direction feature amounts obtained by the feature amount extraction unit and the focus value.
- As another aspect, a pattern shape evaluation apparatus for obtaining the exposure conditions of a semiconductor pattern from an image captured with an electron beam is proposed, comprising: a feature amount extraction unit for obtaining feature amounts from a pattern image including a closed curve; a contribution rate calculation unit that calculates, from the directions of the image edges, the contribution rates for splitting the feature amounts into at least the vertical and horizontal directions; and an estimation unit that estimates focus values in at least the vertical and horizontal directions based on a model or table indicating the relationship among the feature amounts, the contribution rates, the per-direction feature amounts, and the focus value.
- Further, for obtaining the exposure conditions of a semiconductor pattern from an image captured with an electron beam, a pattern shape evaluation apparatus is proposed that includes a feature amount extraction unit for obtaining feature amounts in at least the vertical and horizontal directions from a pattern image including a closed curve, and a model creation unit that creates a model indicating the relationship between the per-direction feature amounts and the focus value from the per-direction feature amounts obtained by the feature amount extraction unit and the exposure conditions corresponding to the SEM images.
- The pattern shape evaluation apparatus exemplified in the embodiments below relates to a method and apparatus for monitoring exposure conditions, including the horizontal and vertical focus, from the image data of curved patterns obtained by SEM imaging. As a specific example, exposure conditions including the horizontal and vertical focus are detected from the image data of a curved pattern using the two-dimensional shape of its contour line.
- FIGS. 21A and 21B illustrate an example of an image processing apparatus 2102 that detects exposure conditions including the horizontal and vertical focus from the image data of a pattern including a closed curve, using the two-dimensional shape of the contour line.
- An FEM (Focus Exposure Matrix) wafer, on which a pattern is printed in advance while changing the exposure conditions (focus, exposure amount) for each shot (one exposure unit), is SEM-photographed; because the exposure conditions of each photographed image can be identified from the position on the wafer, this correspondence is referred to as the exposure condition information 2107.
- To create the model, the information 2107 on the plurality of different exposure conditions (focus, exposure amount) and SEM images 2105 containing the target pattern to be measured are used. SEM images in which the pattern has collapsed are not suitable for model creation and may be removed in advance.
- The feature quantity extraction unit 2124 uses the SEM images 2105 and the reference pattern 2103 to extract feature amounts that change with the exposure conditions. The direction-specific exposure condition model creation unit 2123 then uses the feature amounts obtained by the feature quantity extraction unit 2124 and the exposure condition information 2107 to create a direction-specific model 2104 indicating the relationship between the feature amounts and the exposure conditions for each direction, such as horizontal and vertical.
- As the exposure condition information, exposure conditions including a focus value for each direction, such as horizontal and vertical, may be given. A dimension value at the optimum focus may also be given. Alternatively, taking the focus value in one direction as a reference, the focus value in the other direction may be given as a ratio or a difference.
- At estimation time, the feature amount extraction unit 2121 extracts the feature amounts, and the direction-specific exposure condition estimation unit 2122 estimates the direction-specific exposure conditions from these feature amounts using the direction-specific model obtained in (b).
- The estimated direction-specific exposure condition information can be fed back to the exposure tool to correct the exposure conditions for each direction.
- The reference pattern may be image data obtained by drawing design data, image data obtained from a well-formed photographed pattern image, image data obtained from a plurality of photographed pattern images, or image data obtained from a simulation image.
- FIG. 22 shows an example of the feature quantity extraction unit.
- The feature amount extraction unit 2121 extracts, from the SEM image 2101 and the reference pattern 2103, feature amounts that change with the exposure conditions along the contour line of the SEM image 2101.
- the feature quantity extraction units 2121 and 2124 in FIGS. 21A and 21B can have the same configuration.
- The contour line extraction unit 211 extracts the contour line of the SEM image 2101.
- The alignment unit 212 aligns the contour line data obtained by the contour line extraction unit 211 with the image data of the reference pattern 2103.
- After alignment, the distance value calculation unit 213 obtains, for each contour pixel, the distance to the corresponding pixel position of the reference pattern; these distance values are used as the feature amounts.
- The association of contour pixels with reference-pattern pixels may be performed by searching along the line perpendicular to the direction of each contour pixel, or by taking the reference pixel at the closest distance.
- FIG. 23 shows an example of the contour line extraction unit.
- An edge image is obtained from the SEM image 2101 by filter processing that emphasizes edges, such as a Laplacian filter, in the edge detection unit 2111; it is binarized at an arbitrary threshold by the binarization unit 2112 and thinned by the thinning unit 2113, yielding the contour line.
- Alternatively, the white band may be smoothed, binarized, and thinned to obtain the contour line; any method that yields a contour line capturing the pattern shape may be used.
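A minimal sketch of this filter-binarize-thin pipeline (assuming SciPy and scikit-image are available; the smoothing sigma and threshold are illustrative choices, not values from the patent):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, laplace
from skimage.morphology import skeletonize

def extract_contour(sem_image, edge_threshold, sigma=1.0):
    """Contour extraction in the spirit of FIG. 23: edge-emphasis filtering
    (Laplacian), binarization at an arbitrary threshold, then thinning."""
    smoothed = gaussian_filter(np.asarray(sem_image, dtype=float), sigma)
    edges = np.abs(laplace(smoothed))      # edge-emphasis (Laplacian) filter
    binary = edges > edge_threshold        # binarization unit
    return skeletonize(binary)             # thinning unit: 1-pixel-wide contour
```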
- The alignment unit 212 can operate by general methods for aligning two images, such as dilating the contour pixels and shifting one image over the other to find the position with the highest normalized-correlation value, or obtaining the center of gravity of each image and aligning the centers of gravity; these methods are not described further here.
- FIG. 27 shows an example of an image obtained by aligning and superimposing two images of the contour line image and the reference pattern image.
- In this example the reference pattern is a circle and the contour of the SEM image is an ellipse, illustrating alignment between images with different shapes.
- The pixel at point A on the contour corresponds to the pixel at point A′ on the reference pattern, and the distance values of all pixel pairs, such as the distance between the A and A′ pixels and the distance between the B and B′ pixels, are obtained as the feature amounts.
- FIG. 24 shows an example of the direction-specific exposure condition estimation unit.
- The direction-specific exposure condition estimation unit 2422 splits the per-pixel feature amounts obtained by the feature amount extraction unit 2121 into per-direction feature amounts in the direction-specific feature separation unit 2401, using the contour line data; the horizontal direction estimation unit 2402 and the vertical direction estimation unit 2403 then estimate the exposure conditions in their respective directions using the direction-specific model.
- FIG. 25 shows an example of the direction-specific feature separation unit.
- The direction-specific feature separation unit obtains the per-pixel feature amounts obtained by the feature amount extraction unit 2121 separately for each direction.
- The direction detection unit 2211 obtains the direction of each target contour pixel using the contour line data around the pixel from which the feature amount was obtained.
- The horizontal/vertical ratio determination unit 2212 determines the ratios at which the feature amount contributes to the change in the exposure conditions in the horizontal and vertical directions.
- The horizontal ratio multiplication unit 2213 and the vertical ratio multiplication unit 2214 multiply the per-pixel feature amount by the respective ratios obtained by the horizontal/vertical ratio determination unit 2212 to obtain the per-direction feature amounts in the horizontal and vertical directions.
- For example, the direction may be detected by pattern matching against arrangements of contour pixels in a 3×3 pixel matrix, as shown in FIG. 26; the detectable directions (angle θ) are 0°, 22.5°, 45°, 67.5°, 90°, 112.5°, 135°, and 157.5°.
- More detailed directions can be obtained by enlarging the referenced neighborhood to a 5×5 pixel matrix or the like and increasing the number of direction patterns.
- The horizontal/vertical ratio determination unit 2212 determines, according to the contour direction obtained by the direction detection unit 2211, the ratio in which the feature amount affects the horizontal focus and the vertical focus. For example, at point A in FIG. 27 the contour direction is 0°, so the ratio affecting the vertical focus is 1.0 and the ratio affecting the horizontal focus is 0.0. At point B the contour direction is 90°, so the ratio affecting the vertical focus is 0.0 and the ratio affecting the horizontal focus is 1.0. At point C the contour direction is 45°, so the ratio affecting the vertical focus is 0.5 and the ratio affecting the horizontal focus is 0.5.
- As a formula, using the direction angle θ (°) obtained by the direction detection unit 2211, the ratio of the feature amount affecting the vertical focus may be taken as cosθ / (sinθ + cosθ), and the ratio affecting the horizontal focus as sinθ / (sinθ + cosθ).
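A direct transcription of this ratio formula, together with the three worked examples from the text (the function name is illustrative; the formula is well defined for 0° ≤ θ ≤ 90°, where sinθ + cosθ does not vanish):

```python
import math

def focus_ratios(theta_deg):
    """Return (vertical-focus weight, horizontal-focus weight) for a
    contour direction theta in degrees."""
    s = math.sin(math.radians(theta_deg))
    c = math.cos(math.radians(theta_deg))
    return c / (s + c), s / (s + c)

print(focus_ratios(0))   # (1.0, 0.0)   -- point A: horizontal contour
print(focus_ratios(90))  # ~(0.0, 1.0)  -- point B: vertical contour
print(focus_ratios(45))  # ~(0.5, 0.5)  -- point C: 45-degree contour
```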
- As another way of obtaining the direction, an approximate straight line may be fitted to a plurality of sample pixels around the target contour pixel. It is also conceivable to extract the contour line after sufficiently smoothing the SEM image 2101.
- The ratios for each direction are sent to the horizontal ratio multiplication unit 2213 and the vertical ratio multiplication unit 2214, where they are multiplied by the per-pixel feature amount to output the feature amounts for each direction.
- The horizontal ratio multiplication unit 2213 and the vertical ratio multiplication unit 2214 can each be realized by a multiplier.
- The feature amount obtained by the horizontal ratio multiplication unit 2213 enters the horizontal direction estimation unit 2402, and the feature amount obtained by the vertical ratio multiplication unit 2214 enters the vertical direction estimation unit 2403.
- FIG. 28 shows an example of the horizontal direction estimation unit and the vertical direction estimation unit.
- In the statistic calculation unit 2221 of the horizontal direction estimation unit 2402 in FIG. 28A, the total of the feature values of all contour pixels obtained by the horizontal ratio multiplication unit 2213 is computed, along with statistics such as the average and variance; the skewness and kurtosis may also be obtained.
- The model calculation unit 2222 obtains the horizontal focus value and the exposure amount from the obtained statistics using the direction-specific model, i.e., the model corresponding to the horizontal exposure condition. For example, the exposure condition may be calculated by applying weights corresponding to the plurality of statistics.
- Similarly, the vertical direction estimation unit 2403 in FIG. 28B obtains the exposure condition in the vertical direction.
- The exposure conditions obtained in the horizontal and vertical directions are output as the direction-specific exposure condition information 8.
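As a minimal sketch of this estimation path (assuming, as above, per-pixel feature arrays and a linear per-direction model; the particular statistics and names are illustrative):

```python
import numpy as np

def stats_vector(features):
    """Total, mean, variance and skewness of the per-pixel feature values."""
    m, s = features.mean(), features.std()
    skewness = ((features - m) ** 3).mean() / s ** 3 if s > 0 else 0.0
    return np.array([features.sum(), m, features.var(), skewness])

def estimate_exposure(features, weights, bias):
    """Apply a learned per-direction model (weights, bias) to the statistics."""
    return float(stats_vector(features) @ weights + bias)
```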
- The direction-specific feature separation unit 231 is the same as the direction-specific feature separation unit 2401 of the direction-specific exposure condition estimation unit 2122 in FIG. 21A, and outputs the feature amounts separately for the horizontal and vertical directions.
- The horizontal model creation unit 232 creates a model using the exposure condition information 2107 and the horizontal feature amounts.
- The vertical model creation unit 233 creates a model using the exposure condition information 7 and the vertical feature amounts.
- FIG. 30 shows an example of a horizontal model creation unit and a vertical model creation unit.
- In the statistic calculation unit 2321 of the horizontal model creation unit 232 in FIG. 30A, the total of the feature values of all contour pixels obtained by the horizontal ratio multiplication unit 2313 is computed, and statistics such as the average, variance, skewness, or kurtosis may be obtained.
- Using the obtained statistics and the exposure condition information, the model creation unit 2322 creates a model indicating the relationship between the statistics and the horizontal exposure condition.
- The model may be a regression equation that obtains the exposure condition as a linear sum of a plurality of statistics multiplied by coefficients.
- For example, the exposure condition Y can be expressed as Y = X1A1 + X2A2 + … + XnAn + b, a linear sum of a plurality of statistics A1, A2, … An weighted by coefficients X1, X2, … Xn.
- In this case the model consists of the values of the weight coefficients X1, X2, … Xn and b. The model may also be obtained by nonlinear regression, or linear programming may be used. The weights may be obtained by learning from a plurality of statistics and their corresponding exposure conditions.
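One way to learn such weights — a sketch assuming ordinary least squares, which is only one of the regression options the text mentions; the matrix layout and names are illustrative:

```python
import numpy as np

def fit_direction_model(A, Y):
    """Fit Y = X1*A1 + ... + Xn*An + b by least squares.
    A: (num_images, num_stats) statistic vectors; Y: exposure values."""
    A1 = np.hstack([A, np.ones((A.shape[0], 1))])  # append intercept column
    coef, *_ = np.linalg.lstsq(A1, Y, rcond=None)
    return coef[:-1], coef[-1]  # weights X1..Xn, bias b

# Synthetic check: recover known weights from noiseless data.
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 3))               # 20 calibration images, 3 stats
Y = A @ np.array([0.5, -1.0, 2.0]) + 0.3   # true weights and bias
weights, bias = fit_direction_model(A, Y)
print(weights, bias)                       # ~[0.5, -1.0, 2.0], ~0.3
```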
- Similarly, in the statistic calculation unit 2331 of the vertical model creation unit 233 in FIG. 30B, the total of the feature values of all contour pixels obtained by the vertical ratio multiplication unit 2314 is computed and statistics such as the average and variance are obtained. Then, using these statistics and the exposure condition information 7, the model creation unit 2332 creates a model indicating the relationship between the statistics and the vertical exposure condition.
- Each model created by the model creation units 2322 and 2332 is output as a direction-specific model 2104.
- In this way, the statistics of the feature amounts are weighted for each of a plurality of angles (directions) and summed to obtain the exposure condition in the X or Y direction (a specific direction); that is, the exposure condition is obtained by referring to feature quantities in a plurality of directions rather than a single one. Even if the information in a specific direction is insufficient, information from the other directions can compensate for the shortage, so an appropriate exposure condition can still be obtained. Exposure conditions can also be determined with reference to the behavior of patterns oriented in directions other than X and Y. Although in the above formula the exposure condition Y is obtained by summing weighted measurement values from a plurality of directions, a statistical value (such as an average) may be used instead.
- The exposure conditions referred to here may be any parameters required for adjusting the exposure condition in a specific direction: for example, the exposure conditions of the exposure apparatus at the time of monitoring, or an adjustment amount (e.g., the ideal conditions minus the conditions at the time of monitoring).
- The adjustment distance of a scale bar for adjusting the exposure condition on a GUI screen may likewise be obtained from the regression equation, a table, or the like.
- In the direction-specific feature separation unit 2401 described in FIG. 25, the direction detection unit 2211 and the horizontal/vertical ratio determination unit 2212 may be replaced by a vertical line/horizontal line detection unit 2215, as shown in FIG. 31, which detects horizontal-line and vertical-line linear patterns. For a pixel detected as a horizontal line, the ratio of the horizontal ratio multiplication unit is set to 1.0 and that of the vertical ratio multiplication unit to 0.0; for a pixel detected as a vertical line, the ratio of the horizontal ratio multiplication unit is set to 0.0 and that of the vertical ratio multiplication unit to 1.0.
- For a pixel detected as neither a horizontal line nor a vertical line, both ratios may be set to 0.0.
- In the above, the direction is obtained from the contour line. Alternatively, the direction of the reference pattern may be used: the reference pattern, instead of the contour line, is fed into the direction detection unit 2211 in FIG. 25 or the vertical line/horizontal line detection unit 2215 in FIG. 31, and the horizontal and vertical feature ratios are obtained from it. When the configurations of FIGS. 25 and 31 are changed in this way, the corresponding change is also made to the direction-specific exposure condition model creation unit.
- FIG. 32 shows an example of an image evaluation apparatus when the user designates a region for which a feature amount is to be obtained for each direction.
- a direction-specific area designation unit 3209 is added to the image evaluation apparatus in FIG.
- The direction-specific area designation unit 3209 provides display means and area designation means: a GUI on which, over a contour line obtained from an SEM image or over a displayed reference pattern, the user can freely specify a rectangle representing a horizontal feature region (1) and a rectangle representing a vertical feature region (2), as shown in FIG.
- Information on the regions specified by the direction-specific area designation unit 3209 enters the feature amount extraction unit 25.
- FIG. 34 shows an example of the feature quantity extraction unit.
- A contour line is extracted from the SEM image 1 by the contour line extraction unit 211, and alignment with the reference pattern is performed by the alignment unit 212; up to this point the processing is the same as in the feature amount extraction unit 2121.
- The vertical/horizontal direction region determination unit 216 then determines whether the target pixel is included in the user-specified horizontal feature region or in the vertical feature region. If it is a pixel in the horizontal feature region, the feature amount corresponding to the target pixel after alignment (the distance value between the target contour pixel and the corresponding reference pattern pixel) is output as the value of the horizontal direction distance value calculation unit 214 (the horizontal feature amount).
- If it is a pixel in the vertical feature region, the feature value obtained from the target pixel is output as the value of the vertical direction distance value calculation unit 215 (the vertical feature amount). Pixels that lie in neither a horizontal nor a vertical feature region may be excluded.
- The target pixel judged by the vertical/horizontal direction region determination unit 216 may be either a contour pixel or the corresponding reference pattern pixel.
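A minimal sketch of this region-routing step, assuming the user-specified regions are axis-aligned rectangles given as (x0, y0, x1, y1) tuples (the rectangle format and names are assumptions):

```python
def classify_pixel(x, y, h_rects, v_rects):
    """Route a pixel to 'horizontal', 'vertical', or None (excluded)
    according to the user-specified direction regions."""
    def inside(rects):
        return any(x0 <= x <= x1 and y0 <= y <= y1 for x0, y0, x1, y1 in rects)
    if inside(h_rects):
        return "horizontal"
    if inside(v_rects):
        return "vertical"
    return None  # in neither region: excluded from the statistics

print(classify_pixel(5, 5, h_rects=[(0, 0, 10, 10)], v_rects=[]))  # horizontal
```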
- the value of the horizontal direction distance value calculation unit 214 and the value of the vertical direction distance value calculation unit 215 enter the direction-specific exposure condition estimation unit 3226.
- FIG. 35 shows an example of the direction-specific exposure condition estimation unit.
- The horizontal direction estimation unit 261 estimates the exposure condition in the horizontal direction using the value of the horizontal direction distance value calculation unit 214 and the direction-specific model.
- The vertical direction estimation unit 262 estimates the exposure condition in the vertical direction using the value of the vertical direction distance value calculation unit 215 and the direction-specific model.
- The horizontal direction estimation unit 261 and the vertical direction estimation unit 262 can be realized with the same configuration as the horizontal direction estimation unit 2402 and the vertical direction estimation unit 2403 described in FIG. 28.
- the direction-specific exposure condition model creation unit 3228 will be described with reference to FIG.
- The horizontal feature amounts and vertical feature amounts obtained by the direction-specific exposure condition estimation unit 3227 enter the horizontal direction model creation unit 281 and the vertical direction model creation unit 282, respectively; using the exposure condition information 7, the horizontal direction model creation unit 281 creates a model for obtaining the horizontal exposure condition, and the vertical direction model creation unit 282 creates a model for obtaining the vertical exposure condition.
- The horizontal direction model creation unit 281 and the vertical direction model creation unit 282 can be realized with the same configuration as the horizontal model creation unit 232 and the vertical model creation unit 233 described with reference to FIG. 30.
- In the feature amount extraction unit 2121 of FIGS. 21 and 32, the feature amount is obtained by extracting a contour line from the SEM image and comparing it with the reference pattern; alternatively, it is conceivable to use information on the white band of the SEM image as the feature quantity.
- In that case, the binarization unit 291 binarizes the SEM image with an arbitrary threshold, so that the white band along the pattern shape of the SEM image becomes white pixels and everything else becomes black pixels.
- The binarized image is then thinned by the thinning unit 293 to obtain a contour line image.
- The binary image and the contour image enter the direction-specific feature separation unit of the direction-specific exposure condition estimation unit shown in FIG.
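A minimal sketch of this binarize-then-thin route, using scikit-image's `skeletonize` as a stand-in for the thinning unit (the default threshold value and function names are assumptions):

```python
import numpy as np
from skimage.morphology import skeletonize

def white_band_and_contour(sem_image, threshold=128):
    """Binarize the SEM image so the bright white band becomes True,
    then thin it to a one-pixel-wide contour image."""
    binary = sem_image > threshold   # white band -> white pixels
    contour = skeletonize(binary)    # thinned centerline of the band
    return binary, contour
```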
- As shown in FIG., a direction-specific width detection unit 2216 is added to the direction-specific exposure condition estimation unit, and the output of the direction-specific width detection unit 2216 is used as the feature value in place of the output of the distance value calculation unit 213.
- For each contour pixel, the width of the white band is obtained in the direction perpendicular to the contour direction.
- For example, at a pixel where the contour direction is horizontal, the width W1 of the white band (white pixels) is measured in the vertical direction.
- At the point B pixel, where the contour direction is vertical, the width of the white pixels is measured in the horizontal direction.
- The white-pixel width obtained in this way is used as the feature amount.
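A minimal sketch of this perpendicular width measurement, assuming a boolean white-band image and a unit step vector perpendicular to the local contour direction (all names are illustrative):

```python
import numpy as np

def band_width(binary, y, x, perpendicular):
    """Count contiguous white pixels through (y, x) along +/- the
    perpendicular step, e.g. (1, 0) for a horizontal contour."""
    dy, dx = perpendicular
    width = 1  # the contour pixel itself lies inside the white band
    for sign in (+1, -1):  # walk outward in both directions
        yy, xx = y + sign * dy, x + sign * dx
        while (0 <= yy < binary.shape[0] and 0 <= xx < binary.shape[1]
               and binary[yy, xx]):
            width += 1
            yy, xx = yy + sign * dy, xx + sign * dx
    return width

band = np.zeros((9, 9), dtype=bool)
band[3:6, :] = True                    # a horizontal white band, 3 px wide
print(band_width(band, 4, 4, (1, 0)))  # -> 3
```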
- In this way, the exposure conditions for each direction can be obtained using these direction-specific features.
- FIG. 40 shows an example of a GUI for input/output display. The exposure conditions may be displayed per direction: for example, when the exposure condition is shown on the wafer map 4004, an instruction unit 4001 is provided to switch the display between the horizontal exposure condition and the vertical exposure condition. An instruction unit 4002 may also be provided to obtain and display the maximum, minimum, average, etc. of the exposure conditions in each direction or over all directions, and an instruction unit 4003 may be provided to display the model information used to estimate the exposure condition for each direction, for example the coefficients corresponding to the feature amounts.
- Although the vertical and horizontal exposure conditions have been described, exposure conditions may be determined in the same way for any direction, such as 45° or 30° oblique.
- The user can also specify an area for each such direction.
- A display unit 4005 for displaying a reference pattern or a contour line of an SEM image is provided, and regions can be specified by direction: for example, a rectangular region b indicating a horizontal region, a rectangular region a indicating a vertical region, a rectangular region c indicating an oblique 45° region, and a region d indicating a 60° region.
- The user can also designate an omnidirectional region.
- The processing of the above apparatus may be performed in software on a personal computer; implementing it in an LSI is also conceivable.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Manufacturing & Machinery (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Power Engineering (AREA)
- Quality & Reliability (AREA)
- Microelectronics & Electronic Packaging (AREA)
- Computer Hardware Design (AREA)
- Length-Measuring Devices Using Wave Or Particle Radiation (AREA)
- Testing Or Measuring Of Semiconductors Or The Like (AREA)
Abstract
Description
In this case, the model consists of the values of the weight coefficients X1, X2, … Xn and b for each feature amount. Although not shown, the modeling unit 122 is provided with a storage unit such as a memory, in which the exposure condition information and the feature amounts obtained from the contour lines of the SEM images are stored.
Y = X1A1 + X2A2 + … + XnAn + b
with which the exposure amount can be estimated.
In this case, the model consists of the values of the weight coefficients X1, X2, … Xn and b for each statistic. The model may also be obtained by nonlinear regression, or linear programming may be used. The weights may be obtained by learning from a plurality of statistics and their exposure conditions. Similarly, the statistic calculation unit 2331 of the vertical direction estimation unit 233 in FIG. 10(b) computes the total of the feature values of all contour pixels obtained by the vertical ratio multiplication unit 2314 and obtains statistics such as their average and variance. Then, using these statistics and the exposure condition information 7, the model creation unit 2332 creates a model indicating the relationship between the statistics and the horizontal exposure condition. The models created by the model creation units 2322 and 2332 are output as the direction-specific model 2104.
2 exposure condition estimation unit
11, 21 contour line generation unit
11n, 1101, 1102 contour line creation unit
12 model generation unit
13 evaluation unit
22 estimation unit
30, 37 exposure condition information
31, 36 SEM image
32, 35 contour line generation parameter
33 reference pattern
34 model
121, 221 feature amount calculation unit
122 modeling unit
131 model evaluation unit
132 storage/selection unit
222 model calculation unit
1100 contour line storage unit
Claims (26)
- In an image evaluation apparatus for obtaining the exposure conditions of a semiconductor pattern from an image captured using an electron beam, comprising:
a setting unit that sets parameters for generating three or more contour lines from an SEM image;
a contour line generation unit that generates three or more contour lines from the SEM image using the parameters set in the setting unit; and
an estimation unit that obtains the exposure conditions using feature amounts obtained from a reference pattern and the three or more contour lines generated by the contour line generation unit. - The image evaluation apparatus according to claim 1, wherein the feature amounts are statistics relating to the distances between the reference pattern and each of the generated contour lines.
- The image evaluation apparatus according to claim 2, wherein the reference pattern is a pattern based on design data, a simulation pattern, or a pattern created from one or more captured SEM images.
- In an image evaluation apparatus for obtaining the exposure conditions of a semiconductor pattern from an image captured using an electron beam, comprising:
a storage unit that stores a model indicating the relationship between exposure conditions and feature amounts obtained by generating three or more contour lines from an SEM image, together with contour line generation parameter information for generating the three or more contour lines corresponding to the model;
a contour line generation unit that generates three or more contour lines from an SEM image using the contour line generation parameter information; and
an estimation unit that obtains the exposure conditions using the model and feature amounts obtained from a reference pattern and the three or more contour lines generated by the contour line generation unit. - The image evaluation apparatus according to claim 4, wherein the contour line generation parameter information is information for generating contour lines in the contour line generation unit, namely the number of contour lines and information for generating each of that number of contour lines.
- The image evaluation apparatus according to claim 5, wherein the feature amounts are statistics relating to the distances between the reference pattern and each of the generated contour lines.
- The image evaluation apparatus according to claim 6, wherein the reference pattern is a pattern based on design data, a simulation pattern, or a pattern created from one or more captured SEM images.
- The image evaluation apparatus according to claim 7, further comprising monitor means for displaying a value relating to the exposure conditions obtained by the estimation unit on a monitor, wherein the value is displayed at a position corresponding to the imaging position of the SEM image.
- In an image evaluation apparatus for creating a model from the exposure conditions of a plurality of semiconductor patterns and a plurality of corresponding SEM images, comprising:
a contour line generation unit that generates a plurality of contour lines from an SEM image using contour line generation parameter information;
a model generation unit that creates a model from the exposure conditions corresponding to the SEM image and feature amounts obtained from the plurality of contour lines generated by the contour line generation unit; and
an evaluation unit that obtains a model for each of a plurality of sets of contour line generation parameter information via the contour line generation unit and the model generation unit, and selects, from the obtained models, a well-fitting model and its corresponding contour line generation parameter information. - The image evaluation apparatus according to claim 9, wherein the contour line generation unit generates three or more contour lines.
- The image evaluation apparatus according to claim 10, wherein the contour line generation parameter information is information for generating contour lines in the contour line generation unit, namely the number of contour lines and information for generating each of that number of contour lines.
- The image evaluation apparatus according to claim 11, wherein the number of contour lines in the contour line generation parameter information is three or more.
- The image evaluation apparatus according to claim 12, wherein the feature amounts are statistics relating to the distances between a reference pattern and each of the generated contour lines.
- The image evaluation apparatus according to claim 13, wherein the reference pattern is a pattern based on design data, a simulation pattern, or a pattern created from one or more captured SEM images.
- The image evaluation apparatus according to claim 14, wherein the evaluation by the evaluation unit evaluates the goodness of fit of the models.
- An image evaluation apparatus for obtaining the exposure conditions of a semiconductor pattern from an image captured using an electron beam, wherein a plurality of contour lines are generated, the dimensions of the plurality of contour lines relative to a reference pattern are measured, and the exposure conditions are output with reference to a model that indicates the relationship between values based on the measurement results and the exposure conditions and that allows different weights to be set for the values based on the measurement results of each of the plurality of contour lines.
- In a pattern shape evaluation apparatus comprising an image processing apparatus for evaluating a target pattern contained in an image formed by an image acquisition apparatus,
the image processing apparatus obtains feature amounts of the target pattern in a plurality of directions, applies the weights assigned to the plurality of directions to those feature amounts, and, based on the weighting, obtains a parameter required for adjusting the exposure condition in a specific direction. - According to claim 17,
the image processing apparatus obtains the feature amounts for each direction of the white band, pattern edge, or contour line of the target pattern. - According to claim 17,
the specific directions are the two directions X and Y, and the image processing apparatus obtains the parameter based on a plurality of feature amounts including feature amounts in directions other than the X and Y directions. - According to claim 17,
the image processing apparatus obtains the parameter using a model or a table indicating the relationship between the feature amounts in the plurality of directions and the parameter. - According to claim 17,
the image processing apparatus calculates the weighting coefficients based on the specific direction. - According to claim 17,
the image processing apparatus compares the target pattern contained in the image formed by the image acquisition apparatus with a reference pattern stored in advance to evaluate the pattern contained in the image. - According to claim 17,
the image processing apparatus comprises a display unit capable of displaying the exposure conditions on a wafer map, switching the display by direction. - According to claim 17,
the image processing apparatus comprises an instruction unit with which a user can designate a region in which the feature amount for each direction is obtained. - In a pattern shape evaluation apparatus comprising an image processing apparatus for evaluating a target pattern contained in an image formed by an image acquisition apparatus,
the image processing apparatus obtains feature amounts of the target pattern in a plurality of directions and creates a model indicating the relationship between those feature amounts and a parameter required for adjusting the exposure condition in a specific direction. - According to claim 25,
the specific directions are the two directions vertical and horizontal, and the image processing apparatus creates, for each specific direction, a model indicating the relationship between the feature amount and the exposure amount.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/377,728 US9830705B2 (en) | 2012-02-14 | 2013-02-12 | Image evaluation apparatus and pattern shape evaluation apparatus |
JP2013558679A JP6043735B2 (ja) | 2012-02-14 | 2013-02-12 | 画像評価装置及びパターン形状評価装置 |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012029058 | 2012-02-14 | ||
JP2012-029058 | 2012-02-14 | ||
JP2012274191 | 2012-12-17 | ||
JP2012-274191 | 2012-12-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013122022A1 true WO2013122022A1 (ja) | 2013-08-22 |
Family
ID=48984133
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/053175 WO2013122022A1 (ja) | 2012-02-14 | 2013-02-12 | 画像評価装置及びパターン形状評価装置 |
Country Status (3)
Country | Link |
---|---|
US (1) | US9830705B2 (ja) |
JP (1) | JP6043735B2 (ja) |
WO (1) | WO2013122022A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016104342A1 (ja) * | 2014-12-26 | 2016-06-30 | 株式会社 日立ハイテクノロジーズ | 露光条件評価装置 |
JP2020173904A (ja) * | 2019-04-08 | 2020-10-22 | 株式会社日立ハイテク | パターン断面形状推定システム、およびプログラム |
KR20210099998A (ko) * | 2020-02-05 | 2021-08-13 | 가부시끼가이샤 히다치 세이사꾸쇼 | 화상을 생성하는 시스템 |
JP2021135893A (ja) * | 2020-02-28 | 2021-09-13 | 株式会社東芝 | 検査装置、検査方法、及びプログラム |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10102619B1 (en) * | 2011-03-28 | 2018-10-16 | Hermes Microvision, Inc. | Inspection method and system |
KR20170019236A (ko) * | 2015-08-11 | 2017-02-21 | 삼성전자주식회사 | 특이 부분의 검출 방법 및 이를 이용한 측정 장치의 ap 설정 방법 |
US10790114B2 (en) * | 2017-06-29 | 2020-09-29 | Kla-Tencor Corporation | Scanning electron microscope objective lens calibration using X-Y voltages iteratively determined from images obtained using said voltages |
JP2019204618A (ja) * | 2018-05-22 | 2019-11-28 | 株式会社日立ハイテクノロジーズ | 走査型電子顕微鏡 |
JP7187384B2 (ja) * | 2019-05-17 | 2022-12-12 | 株式会社日立製作所 | 検査装置 |
US11854184B2 (en) * | 2021-01-14 | 2023-12-26 | Applied Materials Israel Ltd. | Determination of defects and/or edge roughness in a specimen based on a reference image |
KR20230037102A (ko) * | 2021-09-08 | 2023-03-16 | 삼성전자주식회사 | 주사 전자 현미경 장치, 반도체 생산 장치, 및 그 제어 방법 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003059813A (ja) * | 2001-08-20 | 2003-02-28 | Hitachi Ltd | 電子線を用いたプロセス変動監視システムおよび方法 |
JP2005123318A (ja) * | 2003-10-15 | 2005-05-12 | Tokyo Seimitsu Co Ltd | 解像性評価方法及び装置、並びに電子線露光システム |
JP2005286095A (ja) * | 2004-03-30 | 2005-10-13 | Hitachi High-Technologies Corp | 露光プロセスモニタ方法及びその装置 |
JP2007129059A (ja) * | 2005-11-04 | 2007-05-24 | Hitachi High-Technologies Corp | 半導体デバイス製造プロセスモニタ装置および方法並びにパターンの断面形状推定方法及びその装置 |
JP2008205017A (ja) * | 2007-02-16 | 2008-09-04 | Hitachi High-Tech Science Systems Corp | 露光状態表示方法及びシステム |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004228394A (ja) | 2003-01-24 | 2004-08-12 | Hitachi High-Technologies Corp | 半導体ウェーハのパターン形状評価システム |
JP4065817B2 (ja) | 2003-08-12 | 2008-03-26 | 株式会社日立ハイテクノロジーズ | 露光プロセスモニタ方法 |
JP2006234588A (ja) * | 2005-02-25 | 2006-09-07 | Hitachi High-Technologies Corp | パターン測定方法、及びパターン測定装置 |
JP2008140911A (ja) | 2006-11-30 | 2008-06-19 | Toshiba Corp | フォーカスモニタ方法 |
2013
- 2013-02-12 WO PCT/JP2013/053175 patent/WO2013122022A1/ja active Application Filing
- 2013-02-12 US US14/377,728 patent/US9830705B2/en active Active
- 2013-02-12 JP JP2013558679A patent/JP6043735B2/ja not_active Expired - Fee Related
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003059813A (ja) * | 2001-08-20 | 2003-02-28 | Hitachi Ltd | 電子線を用いたプロセス変動監視システムおよび方法 |
JP2005123318A (ja) * | 2003-10-15 | 2005-05-12 | Tokyo Seimitsu Co Ltd | 解像性評価方法及び装置、並びに電子線露光システム |
JP2005286095A (ja) * | 2004-03-30 | 2005-10-13 | Hitachi High-Technologies Corp | 露光プロセスモニタ方法及びその装置 |
JP2007129059A (ja) * | 2005-11-04 | 2007-05-24 | Hitachi High-Technologies Corp | 半導体デバイス製造プロセスモニタ装置および方法並びにパターンの断面形状推定方法及びその装置 |
JP2008205017A (ja) * | 2007-02-16 | 2008-09-04 | Hitachi High-Tech Science Systems Corp | 露光状態表示方法及びシステム |
Non-Patent Citations (1)
Title |
---|
CHIE SHISHIDO ET AL.: "Dose and focus estimation using top-down SEM images", PROCEEDINGS OF SPIE, vol. 5038, 2 June 2003 (2003-06-02), pages 1071 - 1079 * |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016104342A1 (ja) * | 2014-12-26 | 2016-06-30 | 株式会社 日立ハイテクノロジーズ | 露光条件評価装置 |
US10558127B2 (en) | 2014-12-26 | 2020-02-11 | Hitachi High-Technologies Corporation | Exposure condition evaluation device |
JP2020173904A (ja) * | 2019-04-08 | 2020-10-22 | 株式会社日立ハイテク | パターン断面形状推定システム、およびプログラム |
JP7199290B2 (ja) | 2019-04-08 | 2023-01-05 | 株式会社日立ハイテク | パターン断面形状推定システム、およびプログラム |
KR20210099998A (ko) * | 2020-02-05 | 2021-08-13 | 가부시끼가이샤 히다치 세이사꾸쇼 | 화상을 생성하는 시스템 |
KR102502486B1 (ko) | 2020-02-05 | 2023-02-23 | 가부시끼가이샤 히다치 세이사꾸쇼 | 화상을 생성하는 시스템 |
JP2021135893A (ja) * | 2020-02-28 | 2021-09-13 | 株式会社東芝 | 検査装置、検査方法、及びプログラム |
US11587223B2 (en) | 2020-02-28 | 2023-02-21 | Kabushiki Kaisha Toshiba | Inspection apparatus that detects defect in image and inspection method and storage medium thereof |
JP7273748B2 (ja) | 2020-02-28 | 2023-05-15 | 株式会社東芝 | 検査装置、検査方法、及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
US20150287201A1 (en) | 2015-10-08 |
US9830705B2 (en) | 2017-11-28 |
JPWO2013122022A1 (ja) | 2015-05-11 |
JP6043735B2 (ja) | 2016-12-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6043735B2 (ja) | 画像評価装置及びパターン形状評価装置 | |
US10937146B2 (en) | Image evaluation method and image evaluation device | |
JP7144244B2 (ja) | パターン検査システム | |
JP5639797B2 (ja) | パターンマッチング方法,画像処理装置、及びコンピュータプログラム | |
US20150212019A1 (en) | Pattern inspection device and pattern inspection method | |
JP2009198338A (ja) | 電子顕微鏡システム及びそれを用いたパターン寸法計測方法 | |
JP2009222454A (ja) | パターン測定方法及びパターン測定装置 | |
JP5966087B2 (ja) | パターン形状評価装置及び方法 | |
JP2011165479A (ja) | パターン検査方法、パターン検査プログラム、電子デバイス検査システム | |
JP2011137901A (ja) | パターン計測条件設定装置 | |
JP2005322423A (ja) | 電子顕微鏡装置およびそのシステム並びに電子顕微鏡装置およびそのシステムを用いた寸法計測方法 | |
JP6286544B2 (ja) | パターン測定条件設定装置、及びパターン測定装置 | |
TW202418220A (zh) | 圖像處理程式、圖像處理裝置、圖像處理方法及缺陷檢測系統 | |
US10558127B2 (en) | Exposure condition evaluation device | |
JP2013200319A (ja) | 電子顕微鏡システム及びそれを用いたパターン寸法計測方法 | |
JP5604208B2 (ja) | 欠陥検出装置及びコンピュータプログラム | |
WO2013180043A1 (ja) | 計測方法、画像処理装置、及び荷電粒子線装置 | |
JP4700772B2 (ja) | 画像照合方法および画像照合プログラム | |
JP2006029891A (ja) | パターン画像計測方法及びその方法を用いたパターン画像計測装置 | |
JP2023001367A (ja) | 画像処理システムおよび画像処理方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13749530 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2013558679 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14377728 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13749530 Country of ref document: EP Kind code of ref document: A1 |