

Title:
AUGMENTED REALITY (AR) ASSISTED PARTICLE CONTAMINATION DETECTION
Document Type and Number:
WIPO Patent Application WO/2022/175025
Kind Code:
A1
Abstract:
An augmented reality (AR) headset includes one or more sensors and a processor coupled to the one or more sensors. The one or more sensors may be configured to scan a surface of an object during an inspection. The processor may be configured to process, in real-time during the inspection, information obtained using the one or more sensors. The processor may be further configured to determine, based on the processed information, whether a contaminant is present on the surface of the object. And, the processor may be further configured to, in response to determining that the contaminant is present on the surface of the object, display an image of the contaminant to a user of the AR headset.

Inventors:
T DA SILVA MICHAEL (US)
WEINLANDT THOMAS (US)
Application Number:
PCT/EP2022/051507
Publication Date:
August 25, 2022
Filing Date:
January 24, 2022
Assignee:
ASML HOLDING NV (NL)
International Classes:
G02B27/01; G01N21/94; G03F7/00; G03F7/20
Domestic Patent References:
WO2019079065A1 (2019-04-25)
Foreign References:
US7383129B1 (2008-06-03)
US20190011418A1 (2019-01-10)
US20180285649A1 (2018-10-04)
US5978078A (1999-11-02)
EP1452851A1 (2004-09-01)
DE102019200855A1 (2020-07-30)
US7511799B2 (2009-03-31)
Other References:
"CONTAMINANT ANALYZING METROLOGY SYSTEM, LITHOGRAPHIC APPARATUS, AND METHODS THEREOF", RESEARCH DISCLOSURE, KENNETH MASON PUBLICATIONS, HAMPSHIRE, UK, GB, vol. 675, no. 81, 1 July 2020 (2020-07-01), pages 1208, XP007148518, ISSN: 0374-4353, [retrieved on 20200623]
Attorney, Agent or Firm:
ASML NETHERLANDS B.V. (NL)
Claims:
CLAIMS

1. An augmented reality (AR) headset comprising: one or more sensors configured to scan a surface of an object during an inspection; and a processor coupled to the one or more sensors and configured to: process, in real-time during the inspection, information obtained using the one or more sensors; determine, based on the processed information, whether a contaminant is present on the surface of the object; and in response to determining that the contaminant is present on the surface of the object, display an image of the contaminant to a user of the AR headset.

2. The AR headset of claim 1, wherein: the one or more sensors comprise one or more image capturing devices configured to capture a video feed during the inspection; the processor is further configured to analyze the video feed to determine whether a change in a reflected luminosity from two or more spots on a surface of the object occurred; and the processor is further configured to determine that the contaminant is present on the object in response to determining that the change in the reflected luminosity occurred.

3. The AR headset of claim 1, wherein: the one or more sensors comprise one or more ultrasonic sensors configured to, at time t1, direct an ultrasonic wave to reflect from the object and, at time t2, to receive the reflected wave for two or more spots on a surface of the object; the processor is further configured to measure a distance to the object by measuring a time between t1 and t2 at each of the two or more spots; and to determine whether the contaminant is present on the surface of the object, the processor is further configured to determine that the contaminant is present in response to determining a change in the distance between the surface of the object and the AR headset by comparing the distance at the two or more spots.

4. The AR headset of claim 1, wherein: the processor is further configured to generate a three-dimensional (3D) map of the object; the processor is further configured to combine the 3D map with the information from the one or more sensors to identify a location of a contaminant on the object; and the processor is further configured to provide a user with location information of the contaminant.

5. A method comprising: scanning, using one or more sensors of an augmented reality (AR) headset, a surface of an object during an inspection; processing, in real-time during the inspection, information obtained using the one or more sensors; determining, based on the processed information, whether a contaminant is present on the surface of the object; and in response to determining that the contaminant is present on the surface of the object, displaying an image of the contaminant to a user of the AR headset.

6. The method of claim 5, further comprising: using one or more image capturing devices as the one or more sensors, the one or more image capturing devices being configured to capture a video feed during the inspection, and the processing comprises analyzing the video feed to determine whether a change in a reflected luminosity from two or more spots on a surface of the object occurred.

7. The method of claim 6, wherein to determine whether the contaminant is present on the surface of the object comprises determining that the contaminant is present in response to determining that the change in the reflected luminosity occurred.

8. The method of claim 5, further comprising: using one or more ultrasonic sensors as the one or more sensors, the one or more ultrasonic sensors being configured to, at time t1, direct an ultrasonic wave to reflect from the object and, at time t2, to receive the reflected wave for two or more spots on a surface of the object; and the processing comprising measuring a distance to the object by measuring a time between t1 and t2 at each of the two or more spots.

9. The method of claim 8, wherein to determine whether the contaminant is present on the surface of the object comprises determining that the contaminant is present in response to determining a change in the distance between the surface of the object and the AR headset by comparing the distance at the two or more spots.

10. The method of claim 5, further comprising: generating a three-dimensional (3D) map of the object; and combining the 3D map with the information from the one or more sensors to identify a location of a contaminant on the object.

11. The method of claim 10, further comprising providing a user with location information of the contaminant.

12. A non-transitory computer readable medium storing one or more sequences of one or more instructions for execution by one or more processors to perform operations, comprising: scanning, using one or more sensors of an augmented reality (AR) headset, a surface of an object during an inspection; processing, in real-time during the inspection, information obtained using the one or more sensors; determining, based on the processed information, whether a contaminant is present on the surface of the object; and in response to determining that the contaminant is present on the surface of the object, displaying an image of the contaminant to a user of the AR headset.

13. The non-transitory computer readable medium of claim 12, the operations further comprising generating a three-dimensional (3D) map of the object.

14. The non-transitory computer readable medium of claim 13, the operations further comprising combining the 3D map with the information from the one or more sensors to identify a location of a contaminant on the object.

15. The non-transitory computer readable medium of claim 13, the operations further comprising logging the location of the contaminant in a database.

Description:
AUGMENTED REALITY (AR) ASSISTED PARTICLE CONTAMINATION DETECTION

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims priority of U.S. Provisional Patent Application No. 63/149,783, which was filed on February 16, 2021, and which is incorporated herein in its entirety by reference.

FIELD

[0002] The present disclosure relates to lithographic systems, for example, inspection systems for detecting contaminants.

BACKGROUND

[0003] A lithographic apparatus is a machine that applies a desired pattern onto a substrate, usually onto a target portion of the substrate. A lithographic apparatus can be used, for example, in the manufacture of integrated circuits (ICs). In that instance, a patterning device, which is alternatively referred to as a mask or a reticle, can be used to generate a circuit pattern to be formed on an individual layer of the IC. This pattern can be transferred onto a target portion (e.g., comprising part of, one, or several dies) on a substrate (e.g., a silicon wafer). Transfer of the pattern is typically via imaging onto a layer of radiation-sensitive material (resist) provided on the substrate. In general, a single substrate will contain a network of adjacent target portions that are successively patterned. Known lithographic apparatus include so-called steppers, in which each target portion is irradiated by exposing an entire pattern onto the target portion at one time, and so-called scanners, in which each target portion is irradiated by scanning the pattern through a radiation beam in a given direction (the “scanning” direction) while synchronously scanning the target portions parallel or anti-parallel to this scanning direction. It is also possible to transfer the pattern from the patterning device to the substrate by imprinting the pattern onto the substrate.

[0004] Another lithographic system is an interferometric lithographic system where there is no patterning device, but rather a light beam is split into two beams, and the two beams are caused to interfere at a target portion of the substrate through the use of a reflection system. The interference causes lines to be formed at the target portion of the substrate.

[0005] During lithographic operation, different processing steps may require different layers to be sequentially formed on the substrate. Sequencing of layers is typically accomplished by exchanging different reticles, according to the desired pattern for each layer, for each pattern transfer process. A typical lithographic system works within sub-nanometer tolerances regarding patterns on the reticle and patterns transferred onto the wafer from the reticle. A contaminant particle on a reticle may introduce errors to transferred patterns. Therefore, it is desirable to maintain contaminant-free reticles capable of accurately transferring patterns onto wafers with sub-nanometer accuracy.

[0006] Within the environment of the lithographic apparatus, highly dynamic processes take place, e.g., reticle hand-off, wafer hand-off, controlled gas flows, outgassing of vacuum chamber walls, liquid dispensing (e.g., photoresist coating), temperature variations, metal deposition, rapid movement of numerous actuatable components, and wear of structures. Over time, dynamic processes introduce and build up contaminant particles within the lithographic apparatus.

SUMMARY

[0007] There is a need to provide improved inspection techniques to detect contaminants.

[0008] In some embodiments, an augmented reality (AR) headset includes one or more sensors and a processor coupled to the one or more sensors. The one or more sensors may be configured to scan a surface of an object during an inspection. The processor may be configured to process, in real-time during the inspection, information obtained using the one or more sensors. The processor may be further configured to determine, based on the processed information, whether a contaminant is present on the surface of the object. And, the processor may be further configured to, in response to determining that the contaminant is present on the surface of the object, display an image of the contaminant to a user of the AR headset.
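The luminosity-based variant of this embodiment (claim 2) compares the brightness reflected from two or more spots on the surface. The following sketch illustrates one way such a check could be written; the function names, the patch size, and the threshold are illustrative assumptions and not part of the application:

```python
# Hedged sketch of the reflected-luminosity check: sample the brightness
# at two or more spots of a video frame and flag a contaminant when the
# reflected luminosity differs across the spots. The frame is modelled
# as a 2D list of intensity values; patch size and threshold are
# hypothetical choices for illustration.

def spot_luminosity(frame, spot, size=3):
    """Mean intensity of the (2*size+1)^2 patch centred on (row, col)."""
    r, c = spot
    patch = [frame[i][j]
             for i in range(r - size, r + size + 1)
             for j in range(c - size, c + size + 1)]
    return sum(patch) / len(patch)

def contaminant_present(frame, spots, threshold=25.0):
    """True when reflected luminosity changes across the sampled spots."""
    values = [spot_luminosity(frame, s) for s in spots]
    return max(values) - min(values) > threshold
```

In use, a clean uniform surface yields nearly equal spot values, while a particle scatters light and perturbs the luminosity at its spot beyond the threshold.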

[0009] In some embodiments, a method includes the following operations. Scanning, using one or more sensors of an augmented reality (AR) headset, a surface of an object during an inspection. Processing, in real-time during the inspection, information obtained using the one or more sensors. Determining, based on the processed information, whether a contaminant is present on the surface of the object. In response to determining that the contaminant is present on the surface of the object, displaying an image of the contaminant to a user of the AR headset.
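The ultrasonic variant of the method (claims 8 and 9) measures the round-trip time of an echo at each spot, converts it to a distance, and infers a contaminant from a change in distance between spots. A minimal sketch of that arithmetic follows; the speed of sound and the change threshold are assumed values for illustration only:

```python
# Hedged time-of-flight sketch: a wave emitted at t1 and received at t2
# travels twice the headset-to-surface distance, so
#   distance = speed_of_sound * (t2 - t1) / 2.
# A particle on the surface shortens the distance at its spot relative
# to clean spots.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C (assumption)

def distance(t1, t2):
    """Headset-to-surface distance in metres from a round-trip echo time."""
    return SPEED_OF_SOUND * (t2 - t1) / 2.0

def tof_contaminant_present(echo_times, threshold=0.5e-3):
    """True when the distance differs across the sampled spots by more
    than `threshold` metres, suggesting something sits on the surface."""
    distances = [distance(t1, t2) for (t1, t2) in echo_times]
    return max(distances) - min(distances) > threshold
```

For example, echoes corresponding to 0.300 m and 0.299 m at two spots differ by 1 mm, which exceeds the assumed 0.5 mm threshold and would be flagged.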

[0010] In some embodiments, a non-transitory computer readable medium stores one or more sequences of one or more instructions for execution by one or more processors to perform the following operations. Scanning, using one or more sensors of an augmented reality (AR) headset, a surface of an object during an inspection. Processing, in real-time during the inspection, information obtained using the one or more sensors. Determining, based on the processed information, whether a contaminant is present on the surface of the object. In response to determining that the contaminant is present on the surface of the object, displaying an image of the contaminant to a user of the AR headset.
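Claims 10 through 15 further combine the sensor information with a three-dimensional map of the object to locate each contaminant and log it in a database. The sketch below models the 3D map as a plain dictionary from sensor spots to coordinates and the database as an in-memory list; both representations, and all names, are assumptions made for illustration:

```python
# Hedged sketch of the map-and-log flow: look up the 3D coordinate of a
# detected spot on a (hypothetical) 3D map of the object, then record it
# so a user can be given the contaminant's location later.

from dataclasses import dataclass, field

@dataclass
class ContaminantLog:
    """In-memory stand-in for the database of claim 15."""
    entries: list = field(default_factory=list)

    def record(self, object_id, location):
        """Log the (x, y, z) location of a contaminant on an object."""
        self.entries.append({"object": object_id, "location": location})

def locate_on_map(map_3d, spot):
    """Map a 2D sensor spot to its stored 3D coordinate, if any."""
    return map_3d.get(spot)  # None when the spot is off the mapped surface

log = ContaminantLog()
map_3d = {(5, 5): (0.05, 0.05, 0.0)}  # hypothetical single-entry 3D map
location = locate_on_map(map_3d, (5, 5))
if location is not None:
    log.record("reticle-01", location)
```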

[0011] Further features of the present disclosure, as well as the structure and operation of various embodiments, are described in detail below with reference to the accompanying drawings. It is noted that the present disclosure is not limited to the specific embodiments described herein. Such embodiments are presented herein for illustrative purposes only. Additional embodiments will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein.

BRIEF DESCRIPTION OF THE DRAWINGS/FIGURES

[0012] The accompanying drawings, which are incorporated herein and form part of the specification, illustrate the present disclosure and, together with the description, further serve to explain the principles of the present disclosure and to enable a person skilled in the relevant art(s) to make and use embodiments described herein.

[0013] FIG. 1A shows a schematic of a reflective lithographic apparatus, according to some embodiments.

[0014] FIG. 1B shows a schematic of a transmissive lithographic apparatus, according to some embodiments.

[0015] FIG. 2 shows a more detailed schematic of the reflective lithographic apparatus, according to some embodiments.

[0016] FIG. 3 shows a schematic of a lithographic cell, according to some embodiments.

[0017] FIG. 4 shows a schematic of an augmented reality (AR) device, according to some embodiments.

[0018] The features of the present disclosure will become more apparent from the detailed description set forth below when taken in conjunction with the drawings, in which like reference characters identify corresponding elements throughout. In the drawings, like reference numbers generally indicate identical, functionally similar, and/or structurally similar elements. Additionally, generally, the left-most digit(s) of a reference number identifies the drawing in which the reference number first appears. Unless otherwise indicated, the drawings provided throughout the disclosure should not be interpreted as to-scale drawings.

DETAILED DESCRIPTION

[0019] This specification discloses one or more embodiments that incorporate the features of the present disclosure. The disclosed embodiment(s) are provided as examples. The scope of the present disclosure is not limited to the disclosed embodiment(s). Claimed features are defined by the claims appended hereto.

[0020] The embodiment(s) described, and references in the specification to “one embodiment,” “an embodiment,” “an example embodiment,” etc., indicate that the embodiment(s) described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is understood that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.

[0021] Spatially relative terms, such as “beneath,” “below,” “lower,” “above,” “on,” “upper” and the like, can be used herein for ease of description to describe one element or feature’s relationship to another element(s) or feature(s) as illustrated in the figures. The spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. The apparatus can be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein may likewise be interpreted accordingly.

[0022] The term “about” as used herein indicates the value of a given quantity that can vary based on a particular technology. Based on the particular technology, the term “about” can indicate a value of a given quantity that varies within, for example, 10-30% of the value (e.g., ±10%, ±20%, or ±30% of the value).

[0023] Embodiments of the disclosure can be implemented in hardware, firmware, software, or any combination thereof. Embodiments of the disclosure may also be implemented as instructions stored on a machine-readable medium, which can be read and executed by one or more processors. A machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computing device). For example, a machine-readable medium may include read only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; electrical, optical, acoustical or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.), and others. Further, firmware, software, routines, and/or instructions can be described herein as performing certain actions. However, it should be appreciated that such descriptions are merely for convenience and that such actions in fact result from computing devices, processors, controllers, or other devices executing the firmware, software, routines, instructions, etc. The term “non-transitory” may be used herein to characterize computer readable media used for storing data, information, instructions, and the like, with the sole exception being a transitory, propagating signal.

[0024] Before describing such embodiments in more detail, however, it is instructive to present an example environment in which embodiments of the present disclosure can be implemented.

[0025] Example Lithographic Systems

[0026] FIGS. 1A and 1B show schematic illustrations of a lithographic apparatus 100 and lithographic apparatus 100’, respectively, in which embodiments of the present disclosure may be implemented. Lithographic apparatus 100 and lithographic apparatus 100’ each include the following: an illumination system (illuminator) IL configured to condition a radiation beam B (for example, deep ultraviolet or extreme ultraviolet radiation); a support structure (for example, a mask table) MT configured to support a patterning device (for example, a mask, a reticle, or a dynamic patterning device) MA and connected to a first positioner PM configured to accurately position the patterning device MA; and a substrate table (for example, a wafer table) WT configured to hold a substrate (for example, a resist-coated wafer) W and connected to a second positioner PW configured to accurately position the substrate W. Lithographic apparatus 100 and 100’ also have a projection system PS configured to project a pattern imparted to the radiation beam B by patterning device MA onto a target portion (for example, comprising one or more dies) C of the substrate W. In lithographic apparatus 100, the patterning device MA and the projection system PS are reflective. In lithographic apparatus 100’, the patterning device MA and the projection system PS are transmissive.

[0027] The illumination system IL may include various types of optical components, such as refractive, reflective, catadioptric, magnetic, electromagnetic, electrostatic, or other types of optical components, or any combination thereof, for directing, shaping, or controlling the radiation beam B.

[0028] The support structure MT holds the patterning device MA in a manner that depends on the orientation of the patterning device MA with respect to a reference frame, the design of at least one of the lithographic apparatus 100 and 100’, and other conditions, such as whether or not the patterning device MA is held in a vacuum environment. The support structure MT may use mechanical, vacuum, electrostatic, or other clamping techniques to hold the patterning device MA. The support structure MT may be a frame or a table, for example, which may be fixed or movable, as required. By using sensors, the support structure MT may ensure that the patterning device MA is at a desired position, for example, with respect to the projection system PS.

[0029] The term “patterning device” MA should be broadly interpreted as referring to any device that may be used to impart a radiation beam B with a pattern in its cross-section, such as to create a pattern in the target portion C of the substrate W. The pattern imparted to the radiation beam B may correspond to a particular functional layer in a device being created in the target portion C to form an integrated circuit.

[0030] The patterning device MA may be transmissive (as in lithographic apparatus 100’ of FIG. 1B) or reflective (as in lithographic apparatus 100 of FIG. 1A). Examples of patterning devices MA include reticles, masks, programmable mirror arrays, or programmable LCD panels. Masks are well known in lithography, and include mask types such as binary, alternating phase shift, or attenuated phase shift, as well as various hybrid mask types. An example of a programmable mirror array employs a matrix arrangement of small mirrors, each of which may be individually tilted so as to reflect an incoming radiation beam in different directions. The tilted mirrors impart a pattern in the radiation beam B, which is reflected by the matrix of small mirrors.

[0031] The term “projection system” PS may encompass any type of projection system, including refractive, reflective, catadioptric, magnetic, electromagnetic and electrostatic optical systems, or any combination thereof, as appropriate for the exposure radiation being used, or for other factors, such as the use of an immersion liquid on the substrate W or the use of a vacuum. A vacuum environment may be used for EUV or electron beam radiation since other gases may absorb too much radiation or electrons. A vacuum environment may therefore be provided to the whole beam path with the aid of a vacuum wall and vacuum pumps.

[0032] Lithographic apparatus 100 and/or lithographic apparatus 100’ may be of a type having two (dual stage) or more substrate tables WT (and/or two or more mask tables). In such “multiple stage” machines, the additional substrate tables WT may be used in parallel, or preparatory steps may be carried out on one or more tables while one or more other substrate tables WT are being used for exposure. In some situations, the additional table may not be a substrate table WT.

[0033] The lithographic apparatus may also be of a type wherein at least a portion of the substrate may be covered by a liquid having a relatively high refractive index, e.g., water, so as to fill a space between the projection system and the substrate. An immersion liquid may also be applied to other spaces in the lithographic apparatus, for example, between the mask and the projection system. Immersion techniques are well known in the art for increasing the numerical aperture of projection systems. The term “immersion” as used herein does not mean that a structure, such as a substrate, must be submerged in liquid, but rather only means that liquid is located between the projection system and the substrate during exposure.

[0034] Referring to FIGS. 1A and 1B, the illuminator IL receives a radiation beam from a radiation source SO. The source SO and the lithographic apparatus 100, 100’ may be separate physical entities, for example, when the source SO is an excimer laser. In such cases, the source SO is not considered to form part of the lithographic apparatus 100 or 100’, and the radiation beam B passes from the source SO to the illuminator IL with the aid of a beam delivery system BD (in FIG. 1B) including, for example, suitable directing mirrors and/or a beam expander. In other cases, the source SO may be an integral part of the lithographic apparatus 100, 100’, for example, when the source SO is a mercury lamp. The source SO and the illuminator IL, together with the beam delivery system BD, if required, may be referred to as a radiation system.

[0035] The illuminator IL may include an adjuster AD (in FIG. 1B) for adjusting the angular intensity distribution of the radiation beam. Generally, at least the outer and/or inner radial extent (commonly referred to as “s-outer” and “s-inner,” respectively) of the intensity distribution in a pupil plane of the illuminator may be adjusted. In addition, the illuminator IL may comprise various other components (in FIG. 1B), such as an integrator IN and a condenser CO. The illuminator IL may be used to condition the radiation beam B to have a desired uniformity and intensity distribution in its cross section.

[0036] Referring to FIG. 1A, the radiation beam B is incident on the patterning device (for example, mask) MA, which is held on the support structure (for example, mask table) MT, and is patterned by the patterning device MA. In lithographic apparatus 100, the radiation beam B is reflected from the patterning device (for example, mask) MA. After being reflected from the patterning device (for example, mask) MA, the radiation beam B passes through the projection system PS, which focuses the radiation beam B onto a target portion C of the substrate W. With the aid of the second positioner PW and position sensor IF2 (for example, an interferometric device, linear encoder, or capacitive sensor), the substrate table WT may be moved accurately (for example, so as to position different target portions C in the path of the radiation beam B). Similarly, the first positioner PM and another position sensor IF1 may be used to accurately position the patterning device (for example, mask) MA with respect to the path of the radiation beam B. Patterning device (for example, mask) MA and substrate W may be aligned using mask alignment marks M1, M2 and substrate alignment marks P1, P2.

[0037] Referring to FIG. 1B, the radiation beam B is incident on the patterning device (for example, mask MA), which is held on the support structure (for example, mask table MT), and is patterned by the patterning device. Having traversed the mask MA, the radiation beam B passes through the projection system PS, which focuses the beam onto a target portion C of the substrate W. The projection system has a pupil conjugate PPU to an illumination system pupil IPU. Portions of radiation emanate from the intensity distribution at the illumination system pupil IPU and traverse a mask pattern without being affected by diffraction at the mask pattern and create an image of the intensity distribution at the illumination system pupil IPU.

[0038] The projection system PS projects an image MP’ of the mask pattern MP, where image MP’ is formed by diffracted beams produced from the mask pattern MP by radiation from the intensity distribution, onto a photoresist layer coated on the substrate W. For example, the mask pattern MP may include an array of lines and spaces. A diffraction of radiation at the array and different from zeroth order diffraction generates diverted diffracted beams with a change of direction in a direction perpendicular to the lines. Undiffracted beams (i.e., so-called zeroth order diffracted beams) traverse the pattern without any change in propagation direction. The zeroth order diffracted beams traverse an upper lens or upper lens group of the projection system PS, upstream of the pupil conjugate PPU of the projection system PS, to reach the pupil conjugate PPU. The portion of the intensity distribution in the plane of the pupil conjugate PPU and associated with the zeroth order diffracted beams is an image of the intensity distribution in the illumination system pupil IPU of the illumination system IL. The aperture device PD, for example, is disposed at or substantially at a plane that includes the pupil conjugate PPU of the projection system PS.

[0039] The projection system PS is arranged to capture, by means of a lens or lens group L, not only the zeroth order diffracted beams, but also first-order or first- and higher-order diffracted beams (not shown). In some embodiments, dipole illumination for imaging line patterns extending in a direction perpendicular to a line may be used to utilize the resolution enhancement effect of dipole illumination. For example, first-order diffracted beams interfere with corresponding zeroth-order diffracted beams at the level of the wafer W to create an image of the line pattern MP at highest possible resolution and process window (i.e., usable depth of focus in combination with tolerable exposure dose deviations). In some embodiments, astigmatism aberration may be reduced by providing radiation poles (not shown) in opposite quadrants of the illumination system pupil IPU. Further, in some embodiments, astigmatism aberration may be reduced by blocking the zeroth order beams in the pupil conjugate PPU of the projection system associated with radiation poles in opposite quadrants. This is described in more detail in US 7,511,799 B2, issued Mar. 31, 2009, which is incorporated by reference herein in its entirety.

[0040] With the aid of the second positioner PW and position sensor IF (for example, an interferometric device, linear encoder, or capacitive sensor), the substrate table WT may be moved accurately (for example, so as to position different target portions C in the path of the radiation beam B). Similarly, the first positioner PM and another position sensor (not shown in FIG. 1B) may be used to accurately position the mask MA with respect to the path of the radiation beam B (for example, after mechanical retrieval from a mask library or during a scan).

[0041] In general, movement of the mask table MT may be realized with the aid of a long-stroke module (coarse positioning) and a short-stroke module (fine positioning), which form part of the first positioner PM. Similarly, movement of the substrate table WT may be realized using a long-stroke module and a short-stroke module, which form part of the second positioner PW. In the case of a stepper (as opposed to a scanner), the mask table MT may be connected to a short-stroke actuator only or may be fixed. Mask MA and substrate W may be aligned using mask alignment marks M1, M2, and substrate alignment marks P1, P2. Although the substrate alignment marks (as illustrated) occupy dedicated target portions, they may be located in spaces between target portions (known as scribe-lane alignment marks). Similarly, in situations in which more than one die is provided on the mask MA, the mask alignment marks may be located between the dies.

[0042] Mask table MT and patterning device MA may be in a vacuum chamber V, where an in-vacuum robot IVR may be used to move patterning devices such as a mask in and out of the vacuum chamber. Alternatively, when mask table MT and patterning device MA are outside of the vacuum chamber, an out-of-vacuum robot may be used for various transportation operations, similar to the in-vacuum robot IVR. Both the in-vacuum and out-of-vacuum robots need to be calibrated for a smooth transfer of any payload (e.g., mask) to a fixed kinematic mount of a transfer station.

[0043] The lithographic apparatus 100 and 100’ may be used in at least one of the following modes:

[0044] 1. In step mode, the support structure (for example, mask table) MT and the substrate table WT are kept essentially stationary, while an entire pattern imparted to the radiation beam B is projected onto a target portion C at one time (i.e., a single static exposure). The substrate table WT is then shifted in the X and/or Y direction so that a different target portion C may be exposed.

[0045] 2. In scan mode, the support structure (for example, mask table) MT and the substrate table WT are scanned synchronously while a pattern imparted to the radiation beam B is projected onto a target portion C (i.e., a single dynamic exposure). The velocity and direction of the substrate table WT relative to the support structure (for example, mask table) MT may be determined by the (de-)magnification and image reversal characteristics of the projection system PS.
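The scan-mode kinematics described above can be sketched numerically. The 4x reduction factor, the sign convention for image reversal, and the function name below are illustrative assumptions, not details taken from this disclosure:

```python
def substrate_table_velocity(mask_velocity_mm_s: float,
                             reduction_factor: float = 4.0,
                             image_reversed: bool = True) -> float:
    """Velocity of the substrate table WT relative to the mask table MT
    in scan mode: the mask-table speed is scaled down by the projection
    demagnification, and the direction is reversed when the projection
    system PS reverses the image."""
    v = mask_velocity_mm_s / reduction_factor
    return -v if image_reversed else v
```

For example, with an assumed 4x reduction and image reversal, a mask table scanning at 400 mm/s corresponds to a substrate table moving at 100 mm/s in the opposite direction.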

[0046] 3. In another mode, the support structure (for example, mask table) MT is kept substantially stationary holding a programmable patterning device, and the substrate table WT is moved or scanned while a pattern imparted to the radiation beam B is projected onto a target portion C. A pulsed radiation source SO may be employed and the programmable patterning device is updated as required after each movement of the substrate table WT or in between successive radiation pulses during a scan. This mode of operation may be readily applied to maskless lithography that utilizes a programmable patterning device, such as a programmable mirror array.

[0047] Combinations and/or variations on the described modes of use or entirely different modes of use may also be employed.

[0048] In some embodiments, a lithographic apparatus may generate DUV and/or EUV radiation. For example, lithographic apparatus 100’ may be configured to operate using a DUV source. In another example, lithographic apparatus 100 includes an extreme ultraviolet (EUV) source, which is configured to generate a beam of EUV radiation for EUV lithography. In general, the EUV source is configured in a radiation system, and a corresponding illumination system is configured to condition the EUV radiation beam of the EUV source.

[0049] FIG. 2 shows the lithographic apparatus 100 in more detail, including the source collector apparatus SO, the illumination system IL, and the projection system PS. The source collector apparatus SO is constructed and arranged such that a vacuum environment may be maintained in an enclosing structure 220 of the source collector apparatus SO. An EUV radiation emitting plasma 210 may be formed by a discharge produced plasma source. EUV radiation may be produced by a gas or vapor, for example Xe gas, Li vapor, or Sn vapor in which the very hot plasma 210 is created to emit radiation in the EUV range of the electromagnetic spectrum. The very hot plasma 210 is created by, for example, an electrical discharge causing an at least partially ionized plasma. Partial pressures of, for example, 10 Pa of Xe, Li, Sn vapor, or any other suitable gas or vapor may be required for efficient generation of the radiation. In some embodiments, a plasma of excited tin (Sn) is provided to produce EUV radiation.

[0050] The radiation emitted by the hot plasma 210 is passed from a source chamber 211 into a collector chamber 212 via an optional gas barrier or contaminant trap 230 (in some cases also referred to as a contaminant barrier or foil trap), which is positioned in or behind an opening in source chamber 211. The contaminant trap 230 may include a channel structure. Contaminant trap 230 may also include a gas barrier or a combination of a gas barrier and a channel structure. The contaminant trap or contaminant barrier 230, as further referenced herein, at least includes a channel structure.

[0051] The collector chamber 212 may include a radiation collector CO, which may be a so-called grazing incidence collector. Radiation collector CO has an upstream radiation collector side 251 and a downstream radiation collector side 252. Radiation that traverses collector CO may be reflected off a grating spectral filter 240 to be focused in a virtual source point IF. The virtual source point IF is commonly referred to as the intermediate focus, and the source collector apparatus is arranged such that the intermediate focus IF is located at or near an opening 219 in the enclosing structure 220. The virtual source point IF is an image of the radiation emitting plasma 210. Grating spectral filter 240 is used in particular for suppressing infra-red (IR) radiation.

[0052] Subsequently the radiation traverses the illumination system IL, which may include a faceted field mirror device 222 and a faceted pupil mirror device 224 arranged to provide a desired angular distribution of the radiation beam 221, at the patterning device MA, as well as a desired uniformity of radiation intensity at the patterning device MA. Upon reflection of the beam of radiation 221 at the patterning device MA, held by the support structure MT, a patterned beam 226 is formed and the patterned beam 226 is imaged by the projection system PS via reflective elements 228, 229 onto a substrate W held by the wafer stage or substrate table WT.

[0053] More elements than shown may generally be present in illumination optics unit IL and projection system PS. The grating spectral filter 240 may optionally be present, depending upon the type of lithographic apparatus. Further, more mirrors may be present than those shown in FIG. 2; for example, one to six additional reflective elements may be present in the projection system PS beyond those shown in FIG. 2.

[0054] Collector optic CO, as illustrated in FIG. 2, is depicted as a nested collector with grazing incidence reflectors 253, 254, and 255, just as an example of a collector (or collector mirror). The grazing incidence reflectors 253, 254, and 255 are disposed axially symmetric around an optical axis O and a collector optic CO of this type is preferably used in combination with a discharge produced plasma source, often called a DPP source.

[0055] Exemplary Lithographic Cell

[0056] FIG. 3 shows a lithographic cell 300, also sometimes referred to as a lithocell or cluster, according to some embodiments. Lithographic apparatus 100 or 100’ may form part of lithographic cell 300. Lithographic cell 300 may also include one or more apparatuses to perform pre- and post-exposure processes on a substrate. Conventionally these include spin coaters SC to deposit resist layers, developers DE to develop exposed resist, chill plates CH, and bake plates BK. A substrate handler, or robot, RO picks up substrates from input/output ports I/O1, I/O2, moves them between the different process apparatuses, and delivers them to the loading bay LB of the lithographic apparatus 100 or 100’. These devices, which are often collectively referred to as the track, are under the control of a track control unit TCU, which is itself controlled by a supervisory control system SCS, which also controls the lithographic apparatus via lithography control unit LACU. Thus, the different apparatuses may be operated to maximize throughput and processing efficiency.

[0057] Exemplary Contaminant Inspection Apparatus

[0058] In some embodiments, the lithographic apparatus 100 of FIG. 1A and/or the lithographic apparatus 100’ of FIG. 1B, and tools associated with servicing such apparatus(es), require inspection to determine their cleanliness in order to avoid imperfections, defects, blemishes, or the like during manufacturing processes. Inspection techniques may be performed such that undesirable defects on a surface (e.g., a surface of a reticle or substrate) are prevented. Inspection techniques can include visually inspecting the object with the naked eye. The quality of such inspections can depend on the acuity of the vision of the individual inspecting the object, as well as how attentive the individual is during inspection. Furthermore, some contaminants may be too small for the naked eye to detect.

[0059] The terms “imperfection,” “defect,” “blemish,” and the like may be used herein to refer to deviations or non-uniformities of structures from a specified tolerance. For example, a flat surface may have defects such as scratches, holes or recesses, foreign particles, stains, and the like.

[0060] In the context of imperfections, the terms “foreign particle,” “contaminant particle,” “contaminant,” and the like may be used herein to refer to unexpected, atypical, undesirable, or the like (herein undesirable) particulate matter that is present in a region or on a surface that was not designed to tolerate the presence of the undesirable particulate matter or that otherwise adversely impacts operation of the apparatus on which the particulate matter is present. Such “foreign particles” and “contaminants” may include, but are not limited to, organic materials, such as human tissue/cells, and inorganic contaminants, such as metal/alloy shavings. Organic contaminants may be the result of how parts/modules are handled during production, shipping, and assembly. The inorganic contaminants may be the result of technological processes used to manufacture the parts/modules. For example, parts/modules may be processed on lathes or milling machines, thereby generating a large number of small particles on the parts/modules that, even with multiple subsequent cleaning steps, may still be found on tested surfaces. Some examples of inorganic contaminants may include dust, stray photoresist, or other dislodged materials within the lithographic apparatus. Examples of dislodged materials may include steel, Au, Ag, Al, Cu, Pd, Pt, Ti, and the like. Material dislodging may occur due to, e.g., processes of fabricating metal interconnects on substrates and friction and impacts of actuated structures. Contaminants may make their way onto sensitive parts in the lithographic apparatus (e.g., reticle or substrate) and increase the likelihood of errors in lithographic processes. Embodiments of the present disclosure provide structures and functions for detecting defects on sensitive parts of a lithographic apparatus or process.

[0061] FIG. 4 shows a schematic of an augmented reality (AR) headset 400, according to some embodiments. In some embodiments, AR headset 400 may include one or more sensors 402, a processor 404, and a display 406, e.g., one or more lenses for displaying images to a user. It should be understood by those of ordinary skill in the art that the AR headset 400 may also include a frame having arm portions connected to a lens holding portion. Images may be projected onto at least one lens of the display 406 disposed in an opening of the frame. In some embodiments, the AR headset 400 may be used to inspect an object 410, such as in a lithographic apparatus 100 of FIG. 1A or in a lithographic apparatus 100’ of FIG. 1B. As another example, object 410 may be a tool used to service either the lithographic apparatus 100 of FIG. 1A or the lithographic apparatus 100’ of FIG. 1B.

[0062] In some embodiments, the one or more sensors 402 may be one or more image capturing devices, e.g., a camera, configured to capture one or more images of the object 410. For example, the one or more images can be a video feed. In some embodiments, the one or more sensors 402 may be one or more ultrasonic sensors configured to emit an ultrasonic wave and receive a reflected wave reflected back from the object 410. In some embodiments, the one or more sensors 402 may be one or more hyperspectral imaging sensors, e.g., spatial scanners, spectral scanners, snapshot hyperspectral imaging scanners, or the like, configured to collect and process information from across an electromagnetic spectrum to obtain a spectrum for each pixel in an image of the object 410. In some embodiments, the one or more sensors may be any combination of image capturing devices, ultrasonic sensors, and hyperspectral imaging sensors. It should be understood by those of ordinary skill in the art that these are merely examples of sensors that can be used, and that other sensors are further contemplated in accordance with aspects of the present disclosure.

[0063] In some embodiments, the processor 404 may process information from the one or more sensors 402 in real-time during the inspection of the object 410 to detect a presence of a contaminant on a surface of the object 410. For example, the one or more image capturing devices may capture the video feed during the inspection. In this example, the processor 404 may analyze the video feed to detect a change in a reflected luminosity between two or more spots on a surface of the object 410. In some embodiments, the change in the reflected luminosity between the two or more spots may indicate the presence of the contaminant.
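The luminosity comparison in this example can be sketched as follows. The window size, the threshold value, and the function names are illustrative assumptions; the disclosure does not specify how the change is quantified:

```python
def mean_luminosity(frame, cx, cy, half=1):
    """Mean pixel value in a (2*half+1) x (2*half+1) window centred on
    (cx, cy) of a grayscale frame (a list of rows of pixel values)."""
    vals = [frame[y][x]
            for y in range(cy - half, cy + half + 1)
            for x in range(cx - half, cx + half + 1)]
    return sum(vals) / len(vals)

def contaminant_between_spots(frame, spot_a, spot_b, threshold=30.0):
    """Flag a possible contaminant when the reflected luminosity at two
    spots on the inspected surface differs by more than `threshold`."""
    la = mean_luminosity(frame, *spot_a)
    lb = mean_luminosity(frame, *spot_b)
    return abs(la - lb) > threshold
```

A contaminant that scatters or absorbs light differently from the clean surface would show up as a dark or bright spot, so a luminosity difference exceeding the (assumed) threshold suggests its presence.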

[0064] As another example, for the one or more ultrasonic sensors, the processor 404 may measure a distance between the AR headset 400 and the object 410 by measuring a time between the emission and reception of the ultrasonic wave at two or more spots on the surface of the object 410. For example, the one or more ultrasonic sensors may, at a first time t1, direct an ultrasonic wave to reflect from the object 410 and, at a second time t2, receive the reflected wave for the two or more spots on the surface of the object 410. The processor 404 may measure a distance to the object by measuring the time between t1 and t2 at each of the two or more spots. In some embodiments, the processor 404 may be further configured to compare a current distance to a preceding distance. In this example, the processor 404 may determine that the contaminant is present in response to determining a change in the distance between the surface of the object and the AR headset by comparing the distance at the two or more spots.
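The time-of-flight computation in this example can be sketched as follows. The speed of sound, the sub-millimeter change threshold, and the function names are illustrative assumptions:

```python
SPEED_OF_SOUND_M_S = 343.0  # dry air at ~20 degrees C; an assumed medium

def spot_distance_m(t_emit_s: float, t_receive_s: float) -> float:
    """One-way distance from the round-trip time of flight: the interval
    between emission (t1) and reception (t2) covers the path twice."""
    return SPEED_OF_SOUND_M_S * (t_receive_s - t_emit_s) / 2.0

def contaminant_by_distance(times, min_change_m=0.0005):
    """Compare distances measured at two or more spots; a spread larger
    than `min_change_m` (here an assumed 0.5 mm) suggests a particle
    raised above the surrounding surface."""
    distances = [spot_distance_m(t1, t2) for (t1, t2) in times]
    return max(distances) - min(distances) > min_change_m
```

In practice the resolution of such a measurement depends on the sensor's timing precision; the threshold here is only a placeholder.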

[0065] In a further example, for the one or more hyperspectral imaging sensors, the processor 404 may analyze the electromagnetic spectrum for each pixel in the image of the object 410. In this example, based on a change in the electromagnetic spectrum from one pixel to another at two or more spots on the surface of the object 410, the processor 404 may determine that the contaminant is present on the object 410.
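One way to quantify a per-pixel spectral change is the spectral angle between two spectra, which is small for pixels imaging the same material and larger when a pixel images a foreign material. The metric choice, the threshold, and the names below are assumptions; the disclosure does not specify how the change is measured:

```python
import math

def spectral_angle_rad(spectrum_a, spectrum_b):
    """Angle between two per-pixel spectra treated as vectors: near zero
    for similar materials, larger when the pixels image different ones."""
    dot = sum(a * b for a, b in zip(spectrum_a, spectrum_b))
    na = math.sqrt(sum(a * a for a in spectrum_a))
    nb = math.sqrt(sum(b * b for b in spectrum_b))
    # Clamp for floating-point safety before taking the arccosine.
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

def contaminant_between_pixels(spectrum_a, spectrum_b, max_angle_rad=0.1):
    """Flag a possible contaminant when the spectra at two spots diverge
    by more than `max_angle_rad`."""
    return spectral_angle_rad(spectrum_a, spectrum_b) > max_angle_rad
```

A useful property of the spectral angle is that it is insensitive to overall illumination level: scaling a spectrum by a constant leaves the angle unchanged, so shading differences alone do not trigger a detection.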

[0066] In some embodiments, in response to detecting the presence of the contaminant, the

AR headset 400 may display the detected contaminant to the user using display 406. In some embodiments, the detected contaminant may be magnified or enlarged for display. For example, using the one or more sensors, the AR headset 400 may capture and enlarge an image of the contaminant on the surface of the object 410 for illustration on the display 406. [0067] In some embodiments, the AR headset 400 may generate a three-dimensional (3D) map of the object 410. The AR headset 400 may combine the 3D map with the information from the one or more sensors to identify a location of a contaminant on the object 410. Thus, in some embodiments, the AR headset 400 may provide a user with location information of the contaminant. In some embodiments, the AR headset 400 may log the location information of the contaminant in a repository, e.g., store the location information in a memory or database.
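The combination of the 3D map with a detection, and the logging of the resulting location, can be sketched as follows. Representing the 3D map as a depth grid over the scanned surface, and the in-memory list standing in for the repository, are illustrative assumptions:

```python
contaminant_log = []  # stands in for the repository (memory or database)

def log_contaminant(pixel_xy, depth_map, cell_size_mm=1.0):
    """Combine a 2D detection (a pixel coordinate) with the 3D map (here
    a depth grid over the scanned surface, in mm) to record the
    contaminant's location on the object."""
    px, py = pixel_xy
    location = (px * cell_size_mm, py * cell_size_mm, depth_map[py][px])
    contaminant_log.append({"location_mm": location})
    return location
```

The logged entries could then be surfaced to the user on the headset display or exported for later cleaning and verification, per paragraph [0067].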

[0068] The embodiments may further be described using the following clauses:

1. An augmented reality (AR) headset comprising: one or more sensors configured to scan a surface of an object during an inspection; and a processor coupled to the one or more sensors and configured to: process, in real-time during the inspection, information obtained using the one or more sensors; determine, based on the processed information, whether a contaminant is present on the surface of the object; and in response to determining that the contaminant is present on the surface of the object, display an image of the contaminant to a user of the AR headset.

2. The AR headset of clause 1, wherein: the one or more sensors comprise one or more image capturing devices configured to capture a video feed during the inspection, and the processor is further configured to analyze the video feed to determine whether a change in a reflected luminosity from two or more spots on a surface of the object occurred.

3. The AR headset of clause 2, wherein the processor is further configured to determine that the contaminant is present on the object in response to determining that the change in the reflected luminosity occurred.

4. The AR headset of clause 1, wherein: the one or more sensors comprise one or more ultrasonic sensors configured to, at time t1, direct an ultrasonic wave to reflect from the object and, at time t2, to receive the reflected wave for two or more spots on a surface of the object; and the processor is further configured to measure a distance to the object by measuring a time between t1 and t2 at each of the two or more spots.

5. The AR headset of clause 4, wherein, to determine whether the contaminant is present on the surface of the object, the processor is further configured to determine that the contaminant is present in response to determining a change in the distance between the surface of the object and the AR headset by comparing the distance at the two or more spots.

6. The AR headset of clause 1, wherein the processor is further configured to generate a three-dimensional (3D) map of the object.

7. The AR headset of clause 6, wherein the processor is further configured to combine the 3D map with the information from the one or more sensors to identify a location of a contaminant on the object.

8. The AR headset of clause 7, wherein the processor is further configured to provide a user with location information of the contaminant.

9. A method comprising: scanning, using one or more sensors of an augmented reality (AR) headset, a surface of an object during an inspection; processing, in real-time during the inspection, information obtained using the one or more sensors; determining, based on the processed information, whether a contaminant is present on the surface of the object; and in response to determining that the contaminant is present on the surface of the object, displaying an image of the contaminant to a user of the AR headset.

10. The method of clause 9, further comprising: using one or more image capturing devices as the one or more sensors, the one or more image capturing devices being configured to capture a video feed during the inspection, wherein the processing comprises analyzing the video feed to determine whether a change in a reflected luminosity from two or more spots on a surface of the object occurred.

11. The method of clause 10, wherein to determine whether the contaminant is present on the surface of the object comprises determining that the contaminant is present in response to determining that the change in the reflected luminosity occurred.

12. The method of clause 9, further comprising: using one or more ultrasonic sensors as the one or more sensors, the one or more ultrasonic sensors being configured to, at time t1, direct an ultrasonic wave to reflect from the object and, at time t2, to receive the reflected wave for two or more spots on a surface of the object; and the processing comprising measuring a distance to the object by measuring a time between t1 and t2 at each of the two or more spots.

13. The method of clause 12, wherein to determine whether the contaminant is present on the surface of the object comprises determining that the contaminant is present in response to determining a change in the distance between the surface of the object and the AR headset by comparing the distance at the two or more spots.

14. The method of clause 9, further comprising generating a three-dimensional (3D) map of the object.

15. The method of clause 14, further comprising combining the 3D map with the information from the one or more sensors to identify a location of a contaminant on the object.

16. The method of clause 15, further comprising providing a user with location information of the contaminant.

17. A non-transitory computer readable medium storing one or more sequences of one or more instructions for execution by one or more processors to perform operations, the operations comprising: scanning, using one or more sensors of an augmented reality (AR) headset, a surface of an object during an inspection; processing, in real-time during the inspection, information obtained using the one or more sensors; determining, based on the processed information, whether a contaminant is present on the surface of the object; and in response to determining that the contaminant is present on the surface of the object, displaying an image of the contaminant to a user of the AR headset.

18. The non-transitory computer readable medium of clause 17, the operations further comprising generating a three-dimensional (3D) map of the object.

19. The non-transitory computer readable medium of clause 18, the operations further comprising combining the 3D map with the information from the one or more sensors to identify a location of a contaminant on the object.

20. The non-transitory computer readable medium of clause 19, the operations further comprising logging the location of the contaminant in a database.

[0069] Although specific reference can be made in this text to the use of lithographic apparatus in the manufacture of ICs, it should be understood that the lithographic apparatus described herein may have other applications, such as the manufacture of integrated optical systems, guidance and detection patterns for magnetic domain memories, flat-panel displays, LCDs, thin-film magnetic heads, etc. The skilled artisan will appreciate that, in the context of such alternative applications, any use of the terms “wafer” or “die” herein can be considered as synonymous with the more general terms “substrate” or “target portion”, respectively. The substrate referred to herein can be processed, before or after exposure, in for example a track unit (a tool that typically applies a layer of resist to a substrate and develops the exposed resist), a metrology unit and/or an inspection unit. Where applicable, the disclosure herein can be applied to such and other substrate processing tools. Further, the substrate can be processed more than once, for example in order to create a multi-layer IC, so that the term substrate used herein may also refer to a substrate that already contains multiple processed layers.

[0070] Although specific reference may have been made above to the use of embodiments of the present disclosure in the context of optical lithography, it will be appreciated that the present disclosure can be used in other applications, for example imprint lithography, and where the context allows, is not limited to optical lithography. In imprint lithography a topography in a patterning device defines the pattern created on a substrate. The topography of the patterning device can be pressed into a layer of resist supplied to the substrate whereupon the resist is cured by applying electromagnetic radiation, heat, pressure or a combination thereof. The patterning device is moved out of the resist leaving a pattern in it after the resist is cured.

[0071] It is to be understood that the phraseology or terminology herein is for the purpose of description and not of limitation, such that the terminology or phraseology of the present disclosure is to be interpreted by those skilled in relevant art(s) in light of the teachings herein.

[0072] The term “substrate” as used herein describes a material onto which material layers are added. In some embodiments, the substrate itself can be patterned and materials added on top of it may also be patterned, or may remain without patterning.

[0073] Although specific reference can be made in this text to the use of the apparatus and/or system according to the present disclosure in the manufacture of ICs, it should be explicitly understood that such an apparatus and/or system has many other possible applications. For example, it can be employed in the manufacture of integrated optical systems, guidance and detection patterns for magnetic domain memories, LCD panels, thin-film magnetic heads, etc. The skilled artisan will appreciate that, in the context of such alternative applications, any use of the terms “reticle,” “wafer,” or “die” in this text should be considered as being replaced by the more general terms “mask,” “substrate,” and “target portion,” respectively.

[0074] While specific embodiments of the present disclosure have been described above, it will be appreciated that the present disclosure can be practiced otherwise than as described. The description is not intended to limit the present disclosure.

[0075] It is to be appreciated that the Detailed Description section, and not the Summary and Abstract sections, is intended to be used to interpret the claims. The Summary and Abstract sections may set forth one or more but not all exemplary embodiments of the present disclosure as contemplated by the inventor(s), and thus, are not intended to limit the present disclosure and the appended claims in any way.

[0076] The present disclosure has been described above with the aid of functional building blocks illustrating the implementation of specified functions and relationships thereof. The boundaries of these functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternate boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed.

[0077] The foregoing description of the specific embodiments will so fully reveal the general nature of the present disclosure that others can, by applying knowledge within the skill of the art, readily modify and/or adapt for various applications such specific embodiments, without undue experimentation, without departing from the general concept of the present disclosure. Therefore, such adaptations and modifications are intended to be within the meaning and range of equivalents of the disclosed embodiments, based on the teaching and guidance presented herein.

[0078] The breadth and scope of protected subject matter should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.