Title:
METHOD FOR ASCERTAINING AN IMAGE OF AN OBJECT
Document Type and Number:
WIPO Patent Application WO/2022/002558
Kind Code:
A1
Abstract:
To ascertain an image of an object (5) which emerges when the object (5) is illuminated with illumination light (1) from a partly coherent light source (17) with a target illumination setting having an illumination-side numerical aperture NA_illu and an imaging-side numerical aperture NA_detection, the following procedure is performed: initially, a section (3) of the object (5) is illuminated with illumination light (1) from a coherent measurement light source with an illumination setting having an illumination-side numerical aperture NA_i, which is at least as large as NA_detection. Then, a diffraction image of the illuminated section (3) is recorded. This is implemented by way of a spatially resolved detection, using a plurality of sensor pixels, in a far field detection plane (8a) of a diffraction intensity of illumination light (16) diffracted by the illuminated section (3) with a recording-side numerical aperture NA. This recording-side aperture must be greater than or equal to the maximum of NA_illu and NA_detection. The image of the section (3) of the object (5) for the target illumination setting is then ascertained from the recorded diffraction image data. An apparatus for carrying out the method comprises a measurement light source for providing the illumination light (1) and a spatially resolving detector (8), arranged in the detection plane (8a), for recording the diffraction image. This yields a method and an apparatus by means of which a flexible image ascertainment of sections of the object is facilitated, in particular for different target illumination settings.

Inventors:
HUSEMANN CHRISTOPH (DE)
SEIDEL DIRK (DE)
MOUT MARCO (DE)
Application Number:
PCT/EP2021/065644
Publication Date:
January 06, 2022
Filing Date:
June 10, 2021
Assignee:
ZEISS CARL SMT GMBH (DE)
International Classes:
G03F1/84; G02B27/09; G06T7/00
Domestic Patent References:
WO2016012426A12016-01-28
Foreign References:
US20190391087A12019-12-26
US20170059845A12017-03-02
DE102020208045A2020-06-29
US20130335552A12013-12-19
Other References:
POTIER J ET AL: "Experimental comparison of full and partial coherent illumination in coherent diffraction imaging reconstructions", JOURNAL OF PHYSICS: CONFERENCE SERIES, INSTITUTE OF PHYSICS PUBLISHING, BRISTOL, GB, vol. 425, no. 19, 22 March 2013 (2013-03-22), pages 192009, XP020243122, ISSN: 1742-6596, DOI: 10.1088/1742-6596/425/19/192009
B. ZHANG ET AL.: "Quantitative tabletop coherent diffraction imaging microscope for EUV lithography mask inspection", PROCEEDINGS OF SPIE 9050, METROLOGY, INSPECTION, AND PROCESS CONTROL FOR MICROLITHOGRAPHY, vol. XXVIII, 2 April 2014 (2014-04-02), pages 90501D
F. ZHANG ET AL.: "Translation position determination in ptychographic coherent diffraction imaging", OPTICS EXPRESS, vol. 21, no. 11, 2013
A. WOJDYLA: "EUV photolithography mask inspection using Fourier ptychography", PROCEEDINGS SPIE 10656, IMAGE SENSING TECHNOLOGIES: MATERIALS, DEVICES, SYSTEMS, AND APPLICATIONS, vol. V, 29 May 2018 (2018-05-29), pages 106560W
A. MAIDEN ET AL.: "Further improvements to the ptychographical iterative engine", OPTICA, vol. 4, no. 7, 2017
H.M.L. FAULKNER ET AL.: "Movable Aperture Lensless Transmission Microscopy: A Novel Phase Retrieval Algorithm", PHYS. REV. LETT., vol. 93, no. 2, 2004, pages 2, XP055155406, DOI: 10.1103/PhysRevLett.93.023903
D.F. GARDNER ET AL.: "High numerical aperture reflection mode coherent diffraction microscopy using off-axis apertured illumination", OPTICS EXPRESS, vol. 20, no. 17, 2012
B. ZHANG ET AL.: "Full field tabletop EUV coherent diffractive imaging in a transmission geometry", OPTICS EXPRESS, vol. 21, no. 19, 2013, XP055443889, DOI: 10.1364/OE.21.021970
HEINRICH KIRCHAUER: "Photolithography Simulation", March 1998, TU VIENNA
Attorney, Agent or Firm:
RAU, SCHNECK & HÜBNER PATENTANWÄLTE RECHTSANWÄLTE PARTGMBB (DE)
Claims:

1. Method for ascertaining an image of an object (5) emerging when the object (5) is illuminated with illumination light (1) from a partly coherent light source (17) with a target illumination setting having an illumination-side numerical aperture NA_illu and an imaging-side numerical aperture NA_detection, including the following steps:
illuminating a section (3) of the object (5) with illumination light (1) from a coherent measurement light source (6) with an illumination setting having an illumination-side numerical aperture NA_i, which is at least as large as NA_detection,
recording a diffraction image of the illuminated section (3) by way of a spatially resolved detection, by way of a plurality of sensor pixels, in a far field detection plane (8a) of a diffraction intensity of illumination light (16) diffracted by the illuminated section (3) with a recording-side numerical aperture NA, which is greater than or equal to the maximum of NA_illu and NA_detection,
ascertaining the image of the section (3) of the object (5) for the target illumination setting from the recorded diffraction image data.

2. Method according to Claim 1, characterized in that the ascertainment of the image includes a reconstruction of amplitude and phase of a coherent object illumination light field (ψ) spatially downstream of the section (3) of the object (5) using the recorded diffraction image data, wherein the object illumination light field (ψ) would arise in the case of an illumination of the section (3) of the object (5) with illumination light in the form of a plane wave.

3. Method according to Claim 1 or 2, characterized in that a simulation is carried out when ascertaining the image, said simulation arising from a virtual illumination of the object diffraction field (ψ) in an object plane (4) using a diffraction-limited illumination spot (F).

4. Method according to Claim 3, characterized in that the following steps are carried out in the simulation:
simulating aerial image data of an object point by way of a virtual illumination of the object diffraction light field (ψ) in the object plane (4) using a diffraction-limited illumination spot (F) with a simulated numerical aperture NA, which at most is as large as the illumination-side numerical aperture NA_i,
summing the pixel data which have arisen by way of the simulation for the object section (3) to be detected, for the pixels whose arrangement corresponds to the illumination directions of the illumination setting,
scanning the object diffraction field within the section (3) of the object (5) using the illumination spot (F) and repeating the steps of "simulating" and "summing" for each grid point.

5. Method according to Claim 3 or 4, characterized in that the field of the diffraction-limited illumination spot (F) is multiplied by the reconstructed object illumination light field (ψ) during the simulation and in that the multiplication result is Fourier transformed, as a result of which aerial image data of the object point in the far field arise.

6. Apparatus for carrying out a method according to any one of Claims 1 to 5,
comprising a measurement light source (6) for providing the illumination light (1),
comprising a spatially resolving detector (8), arranged in the detection plane (8a), for recording the diffraction image.

7. Apparatus according to Claim 6, comprising an aperture stop (15) for specifying the illumination-side numerical aperture NA_i as selection from a plurality of specifiable illumination-side numerical apertures.

Description:
Method for ascertaining an image of an object

The present patent application claims the priority of the German patent application DE 10 2020 208 045.3, the content of which is incorporated by reference herein.

The invention relates to a method for ascertaining an image of an object. Further, the invention relates to an apparatus for carrying out the method.

A method and respectively an apparatus for detecting a structure of a lithography mask are known from WO 2016/012426 A1. In the latter, a 3D aerial image measurement takes place in the region around an image plane during the imaging of a lithography mask arranged in an object plane.

The specialist article "Quantitative tabletop coherent diffraction imaging microscope for EUV lithography mask inspection" by B. Zhang et al., Proceedings of SPIE 9050, Metrology, Inspection, and Process Control for Microlithography XXVIII, 90501D (2 April 2014) discloses a structure detection method for the inspection of lithography masks. The specialist article "Translation position determination in ptychographic coherent diffraction imaging" by F. Zhang et al., Optics Express, Vol. 21, No. 11, 2013 discloses a position estimation method using ptychography. The specialist article "EUV photolithography mask inspection using Fourier ptychography" by A. Wojdyla, Proceedings SPIE 10656, Image Sensing Technologies: Materials, Devices, Systems, and Applications V, 106560W (29 May 2018) discloses an apparatus for inspecting EUV lithography masks using Fourier ptychography.

It is an object of the present invention to develop a method and an apparatus of the type set forth at the outset in such a way that a flexible image ascertainment of sections of the object, in particular for different target illumination settings, is made possible.

In respect of the method, the object is achieved according to the invention by a method having the features specified in Claim 1.

The invention has recognized that method steps known firstly from coherent diffractive imaging (CDI) and secondly from microscopic spot imaging can be combined with one another in such a way that a diffraction image of an extended object section illuminated with a defined illumination-side numerical aperture is detected and the diffraction image data can be used to simulate, in particular, a different illumination situation of the illuminated object section and, more particularly, a selected target illumination setting. Thus, when carrying out the ascertainment method, it is not necessary for the object to be actually illuminated with the target illumination setting; instead, it is enough if the object is illuminated using a different illumination setting, more particularly an illumination setting that is easier to generate, of the coherent measurement light source.

The target illumination setting can be a used illumination setting, utilized within the scope of projection lithography. In particular, it is possible to ensure an image ascertainment for various target illumination settings, the only precondition being that a numerical aperture of the respective target illumination setting is at most as large as the numerical aperture of the measurement illumination. A coherent CDI measurement of the object allows, in particular, images to be predicted which would arise when the object is illuminated using a partially coherent illumination setting. In addition to the combined use of CDI and microscopic spot imaging, it is also possible, in principle, within the scope of the ascertainment method to carry out the simulation with the aid of artificial intelligence processes (machine learning) and convolution/deconvolution algorithms, particularly in conjunction with the processing of the diffraction image data.

In this case, the term numerical aperture (NA) is used synonymously with an illumination-side or imaging-side stop, which restricts the angle spectrum of the light in a plane that is conjugated to the object.

In the method according to Claim 2, it is possible to use ptychography algorithms which are described in the specialist article by A. Maiden et al., OPTICA, Vol. 4, No. 7, 2017 and "Movable Aperture Lensless Transmission Microscopy: A Novel Phase Retrieval Algorithm" by H.M.L. Faulkner et al., Phys. Rev. Lett., Vol. 93, No. 2, 2004. The object illumination light field ψ can be reconstructed for frequencies up to an aperture of the sum of the illumination-side aperture NA_i and the imaging-side aperture NA of the CDI structure, since all these frequencies contribute to the sensor signal. In the image ascertainment, it is the object illumination light field or the object exposure field that is reconstructed, and hence not the object itself. This optical field is reconstructed following an interaction with the object, i.e., for example, following a passage through the object. On account of the interaction with the object, the reconstructed field contains the complete structure information. Therefore, the object illumination light field is observed spatially downstream of the object section in order to take account of the interaction with the object section. A distance between, for example, a plane in which amplitude and phase of the object exposure field are reconstructed and an arrangement plane of the object section can be 0. This distance can also be greater than 0.

The method according to Claim 3 renders insights from scanning microscopy usable, which insights are known, for example, from US 2013/0335552 A1.

This applies accordingly to the simulation method according to Claim 4.

Precise aerial image data arise in the simulation according to Claim 5.

The advantages of an apparatus according to Claim 6 correspond to those which have already been explained above with reference to the image ascertainment method. The apparatus can be used as a measurement system for mask qualification or else as a registration tool.

An aperture stop according to Claim 7 increases the variability of the measurement apparatus. In particular, it is possible to specify a maximum numerical aperture of the illumination.

Exemplary embodiments of the invention are explained in greater detail below with reference to the drawing. In said drawing:

Fig. 1 shows highly schematically, in a plan view with the viewing direction perpendicular to a plane of incidence, a measurement system for detecting a structure of a lithography mask by detection of at least one diffraction image, for measuring a reflective lithography mask;

Fig. 2 shows schematically and for elucidation in relation to a transparent object, a measurement arrangement of the measurement system which is illuminating a section of the lithography mask, as object to be measured, using illumination light of a partly coherent measurement light source with an illumination-side numerical aperture and with a recording-side or detection-side numerical aperture for recording a diffraction image of the illuminated object section; and

Fig. 3 shows, schematically and in a manner similar to Figure 2, a configuration for elucidating aperture data when ascertaining an image of the object section for a target illumination setting in the form of a simulation scenario with an illumination-side numerical aperture and a likewise chosen detection-side numerical aperture from the diffraction image recording data recorded with the measurement arrangement according to Figure 2.

Figure 1 shows in a view corresponding to a meridional section a beam path of EUV illumination light and imaging light 1 in a metrology system 2 for the examination of an object 5, arranged in an object field 3 in an object plane 4, in the form of a reticle or a lithography mask or a section thereof with the EUV illumination light 1.

The metrology system 2 is used as an apparatus for analysis of a diffraction image and serves to detect a structure of the lithography mask, which is in turn used during EUV projection exposure for the production of semiconductor components. The mask structure detected by the metrology system 2 can then be used for example to determine effects of properties of the lithography mask on the optical imaging by projection optical units within a projection exposure apparatus. The metrology system 2, in a similar manner to the system known from WO 2016/012426 A1, can be a system for mask qualification. In this case, the structure to be detected of the lithography mask 5 is the mask structure to be imaged itself.

As an alternative or in addition thereto, the structure to be detected of the lithography mask 5 can be a position marker or a used structure on the lithography mask 5. The detection of such a position marker or a used structure can be used to detect or to measure an exact position of a plurality of position markers with respect to one another or of a plurality of used structures with respect to one another or used structure(s) relative to position marker(s) on the lithography mask 5. The metrology system 2 then finds application as a registration tool. One registration tool is known under the tradename PROVE. The measurement wavelength of the illumination light 1 can correspond to an actual projection exposure wavelength.

In order to facilitate the presentation of positional relationships, a Cartesian xyz-coordinate system is used hereinafter. The x-axis extends perpendicular to the plane of the drawing and out of the latter in Figure 1. The y-axis extends upward in Figure 1. The z-axis extends to the left in Figure 1.

The object plane 4 is parallel to the xy-plane. The illumination light 1 is reflected and diffracted at the object 5. A plane of incidence of the illumination light 1 lies parallel to the yz-plane. Depending on the embodiment of the metrology system 2, the latter can be used for a reflective or for a transmissive object 5. One example of a transmissive object is a phase mask.

The EUV illumination light 1 is generated by a measurement light source 6. This can be a light source in the visible range, in the near, middle or far UV range or in the EUV range. The light source 6 can be a laser plasma source (LPP; laser produced plasma) or a discharge source (DPP; discharge produced plasma). It is also possible to use a synchrotron-based light source, or a Free Electron Laser (FEL). The light source 6 can comprise a device for generating high harmonics of a fundamental wavelength (High Harmonic Generation, HHG). A used wavelength of the EUV light source can be e.g. in the range between 5 nm and 30 nm. However, longer or shorter wavelengths are also possible. In principle, in the case of a variant of the metrology system 2, a light source for another used light wavelength can also be used instead of the light source 6, for example a light source for a DUV used wavelength of 193 nm. The light source 6 is a coherent light source.

An illumination optical unit 7 of the metrology system 2 is arranged between the light source 6 and the object 5. The illumination optical unit 7 serves for the illumination of the object 5 to be examined with a defined illumination intensity distribution over the object field 3 and at the same time with a defined illumination angle or a defined illumination angle distribution with which the field points of the object field 3 are illuminated. With the lithography mask 5 arranged in the object field 3, the object field 3 simultaneously constitutes an illuminated portion of the lithography mask 5. The object field 3 is illuminated in the metrology system 2 in such a way that a part of the object 5 is illuminated in each case by way of a macroscopic spot of the illumination light 1. The spot can have a diameter of several micrometres. By way of example, this diameter can range between 2 µm and 30 µm. This spot of the illumination light 1 is scanned laterally over the entire object field 3. During this scanning procedure, the spots of the illumination light 1 overlap at adjacent scanning points. A diffraction image is recorded at each scanning point, i.e., at each current position of the spot of the illumination light 1 on the object field 3, as will be explained in more detail below.
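The spot scan described above, in which overlapping illumination spots tile the object field, can be sketched as follows. This is a minimal illustrative sketch; the function name, the square-grid layout and the overlap parameter are our own assumptions, not taken from the patent text.

```python
import numpy as np

def scan_positions(field_size_um, spot_diameter_um, overlap=0.6):
    """Square grid of spot centres covering the object field.

    `overlap` is the fraction by which adjacent spots overlap, so the
    scan step is spot_diameter_um * (1 - overlap).  All values and
    names here are illustrative assumptions, not from the patent.
    """
    step = spot_diameter_um * (1.0 - overlap)
    n = int(np.ceil(field_size_um / step)) + 1
    coords = np.arange(n) * step
    xs, ys = np.meshgrid(coords, coords, indexing="ij")
    return np.stack([xs.ravel(), ys.ravel()], axis=1)

# Example: a 30 µm field scanned with 10 µm spots that overlap by 60 %.
positions = scan_positions(field_size_um=30.0, spot_diameter_um=10.0)
```

One diffraction image would then be recorded at each of these spot positions.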

After reflection at the object 5 or transmission through the object 5, the diffracted illumination or imaging light 1 impinges on a spatially resolving detection unit or detection device 8 of the metrology system 2. Thus, a diffraction measurement occurs, which is also referred to as CDI (coherent diffractive imaging) below. The detection device 8 is embodied as CCD or CMOS detector, for example, and comprises a plurality of sensor pixels, which are arranged in rows and columns in the form of an array and which are not illustrated in any more detail in the drawing. A spatial resolution of the sensor or detector arises by way of a corresponding pixel division. The spatially resolving sensor of the detection device 8 can be delimited in a square or rectangular fashion. The CCD or CMOS detector is arranged in a detection plane 8a. The detection device 8 detects a diffraction intensity during the recording of the diffraction image of the lithography mask 5. As indicated in Figure 1, a deflecting and/or beam-influencing optical unit 9 can be arranged between the object 5 and the detection device 8. This is not mandatory, however. In other words, it is also possible for no optical element and/or no beam-influencing element at all to be arranged between the object 5 and the detection device 8. The detection device 8 is signal connected to a digital image processing device 10a.

The object 5 is carried by a mask or object holder 10. The latter can be displaced by way of a displacement drive 11 on the one hand parallel to the xy-plane and on the other hand perpendicularly to this plane, that is to say in the z-direction, e.g. in Δz increments. The mask holder 10 is displaceable for changing between portions to be illuminated of the lithography mask 5. The mask holder 10 can additionally be embodied as tiltable about the x-axis and/or about the y-axis.
The displacement drive 11, as also the entire operation of the metrology system 2, is controlled by a central control device 12, which, in a way that is not illustrated in more specific detail, is signal connected to the components to be controlled.

Figure 2 elucidates the aperture ratios, firstly when illuminating the lithography mask 5 with the illumination light 1 and secondly when detecting the illumination light 1 diffracted by the lithography mask 5. Once again, the case of a reflective lithography mask 5 is illustrated. The illustration of Figure 2 is not true to scale in respect of the shown numerical apertures or aperture angles.

The illumination light 1 impinges on the object field 3 with a chief ray angle CRA between an illumination-side chief ray CRAO and a normal N to the object plane 4 of 6°. The object field 3 arises as illumination spot of the illumination light 1.

A different chief ray angle CRA in the range between 3° and 8° is also possible, in particular. Proceeding from the object field 3, a zero order of diffraction of the illumination light 1 propagates with an image-side chief ray angle between an image-side chief ray CRAI and the normal N, which in turn has the value of the incidence-side chief ray angle CRA.

Figure 2 shows illumination parameters of a measurement arrangement which is realized by the metrology system 2. The object 5 in the object plane 4 is illustrated as a transparent object that transmits illumination light 1 in Figure 2. An example of such a transparent object 5 is a lithography mask in the form of a transmissive phase mask. At the same time, the illustration of Figure 2 serves for a better elucidation on account of the better separation of the illumination light side from the detection side in comparison with the illustration of Figure 1 with the reflective object 5.

In the measurement structure according to Figure 2, the object 5 is illuminated with an illumination-side numerical aperture NA_i. A section of the object 5 in the form of the object field 3 extending in the xy-plane is illuminated.

The illumination-side numerical aperture NA_i is specified by an illumination-side aperture stop 15.

A diffraction image of the object field 3 is recorded by the detector 8 in the detection plane 8a, the detector possibly being a CMOS or a CCD pixel array. This is implemented by spatially resolved detection of a diffraction intensity of the diffraction light 16, i.e., the illumination light 1 diffracted by the illuminated object field 3. This spatially resolved detection is implemented by the detection device 8 in the detection plane 8a, which represents a far field detection plane. This detection-side detection of the illumination light 16 is implemented with a recording-side numerical aperture NA.

In Figure 2, the recording-side numerical aperture NA is illustrated in exemplary fashion for the diffraction light 16 emanating from an object point. In fact, the detection device 8 records the diffraction light 16 emanating from the entire object field 3.

A partly coherent image of the section of the object 5 arranged in the object field 3 can be ascertained from the diffraction image data recorded by the detection device 8. In this case, it is possible to specify the target illumination setting NA_illu and the detection setting NA_detection with which the image to be ascertained is illuminated and detected in the simulation. In this case, NA_illu represents a parameter that is relevant to the target illumination setting, specifically the maximum numerical aperture thereof (maximum radius of a pupil of the target illumination setting in an illumination pupil plane). Additionally, the target illumination setting can satisfy further parameters, which yield defined full illumination of an illumination pupil plane. By way of example, for the target illumination setting, use can be made of a used illumination setting from projection lithography, known from the prior art.
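A target illumination setting bounded by NA_illu can be represented, for the later simulation, as a binary pupil fill a(k). A minimal sketch for an annular setting follows; the grid normalization, the annulus bounds and all names are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def annular_setting(n, sigma_inner, sigma_outer):
    """Binary illumination pupil a(k) on an n x n grid: 1 where the
    target setting is lit, 0 elsewhere.  Pupil coordinates are
    normalized so that radius 1.0 corresponds to NA_illu; the bounds
    sigma_inner/sigma_outer of the annulus are illustrative choices.
    """
    k = np.linspace(-1.0, 1.0, n)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    r = np.hypot(kx, ky)
    return ((r >= sigma_inner) & (r <= sigma_outer)).astype(float)

# Example: annular setting filling 50 % to 90 % of the NA_illu radius.
a = annular_setting(64, sigma_inner=0.5, sigma_outer=0.9)
```

Other used settings (dipole, quadrupole, conventional) would simply replace the annulus condition by a different pupil-fill condition.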

Figure 3 shows, in exemplary fashion, a variant of an illumination and detection situation as a simulation scenario, for which the simulated recorded image can be ascertained from the ascertained diffraction image data. In the simulation scenario according to Figure 3, an illumination-side numerical aperture NA_illu of a simulation illumination with a simulation light source 17 is specified, the latter illuminating the object 5 in the simulation object plane 4. Moreover, Figure 3 illustrates the specification of a numerical detection aperture NA_detection to be simulated. Then, the result of the ascertainment is a light intensity in a simulation image plane 18. Figure 3 additionally schematically illustrates three lens elements 19 for elucidating the two numerical apertures NA_illu and NA_detection. The three lens elements 19 are a condenser lens element, which is arranged between the illumination-side numerical aperture NA_illu and the object field 3 in Figure 3, and two objective lens elements or tube lens elements, which image the object field 3 into the simulation image plane 18.

The following procedure is carried out to ascertain an image of the object section in the object field 3, which is illuminated and detected like in the simulation scenario with the illumination-side numerical aperture NA_illu (target illumination setting) and the detection-side numerical aperture NA_detection:

Initially, the object section on the object field 3 is coherently illuminated with the illumination light 1 and the illumination-side numerical aperture NA_i using the measurement structure according to Figure 2 and all frequencies less than the image-side aperture NA are recorded by the detection unit 8, i.e., on a pixelated sensor in the far field. This maximum recorded frequency is defined to be the imaging-side aperture NA even though no physical stop needs to be present in the CDI measurement system; instead, the maximum recorded frequency may also arise from the lateral extent of the sensor.

What holds true as a matter of principle is that this imaging aperture NA is greater than or equal to the largest aperture value of the two numerical apertures NA_illu and NA_detection to be simulated. It thus holds true that: NA ≥ max(NA_detection, NA_illu).

What must furthermore hold true is that the illumination-side aperture NA_i is greater than or equal to the aperture NA_detection to be simulated, i.e., the following holds true:

NA_i ≥ NA_detection.
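The two aperture preconditions stated above can be collected in a small helper; the function name and example values are illustrative choices, not from the patent.

```python
def apertures_valid(na_record, na_i, na_illu, na_detection):
    """Check the two preconditions stated in the text:
    NA >= max(NA_illu, NA_detection) and NA_i >= NA_detection."""
    return na_record >= max(na_illu, na_detection) and na_i >= na_detection

# Example: a recording-side NA of 0.9 admits a 0.8 / 0.33 simulation target.
ok = apertures_valid(na_record=0.9, na_i=0.5, na_illu=0.8, na_detection=0.33)
```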

The aperture stop 15 determines the coherent illumination field for the running CDI measurement. In principle, it is possible to illuminate the object 5 with a comparatively simple illumination field. The selection of the target illumination setting to be simulated, which is then considered during the computational image ascertainment, is independent thereof.

In particular, the aperture stop 15 can be configured with adjustable components, for example in the style of an iris diaphragm or else in the style of a revolver interchanger, by means of which different interchangeable aperture stops, which can each be used to specify a certain illumination, can be inserted into a beam path of the illumination light 1. A corresponding interchanger can be motor driven and can be controlled by way of the central control device 12.

Köhler illumination or else a critical illumination can be used as a target illumination of the object field used for the image to be ascertained.

The intensity of the simulated image, i.e., of the object image to be ascertained, when illuminated by the target illumination setting can be written as follows:

I(x) = Σ_k a_k · |∫ P(p) ψ(p − k) e^{i p·x} dp|²  (1)

In this case: I(x) is the arising light intensity in the illumination to be ascertained with the target illumination setting; x is an image field dimension in the detection plane 8a (correspondingly also y; x and y span the detection plane 8a); k is the illumination direction of the illumination light 1 (wave vector); a_k is the target illumination setting to be simulated; p is the detection direction for recording the diffraction light 16 (wave vector); ψ is the Fourier transform of the object section in the object field 3;

P is the detection aperture NA_detection. Formula (1) allows the image to be emulated to be simulated with the target illumination setting if the object structure in the illuminated object section ψ(x) is known. As a matter of principle, the latter can be directly calculated using known reconstruction algorithms, which are known under the heading CDI (coherent diffraction imaging).
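A minimal numerical sketch of Formula (1) follows, assuming a 1-D periodic discretization in which illumination directions act as integer shifts of the object spectrum. The function name and the discretization are our own illustrative assumptions, not a verbatim implementation from the patent.

```python
import numpy as np

def abbe_image(psi, pupil, a_k, k_shifts):
    """Partially coherent image intensity as in Formula (1), 1-D sketch.

    psi      : complex object transmission, sampled on n points
    pupil    : detection pupil P(p) over the n FFT frequencies
    a_k      : weights a_k of the target illumination setting
    k_shifts : integer frequency shifts modelling the directions k
    """
    psi_hat = np.fft.fft(psi)
    intensity = np.zeros(len(psi))
    for a, k in zip(a_k, k_shifts):
        # A tilted plane wave shifts the object spectrum by k ...
        shifted = np.roll(psi_hat, k)
        # ... the detection pupil P(p) then filters it ...
        field = np.fft.ifft(pupil * shifted)
        # ... and intensities of mutually incoherent directions add up.
        intensity += a * np.abs(field) ** 2
    return intensity
```

Each illumination direction contributes a coherently filtered image; the incoherent sum over directions, weighted by a_k, yields the partly coherent image.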

An alternative method, which has greater stability in comparison with the ascertainment method explained above, is described below. Initially, a propagation signal S_xs is calculated in an intermediate step as a function of the propagation direction k, which emanates from a spot at the location x_s of the illuminated object section on the object field 3. Here it holds true that:

S_xs(k) = |∫ F_xs(x) ψ(x) e^{−i k·x} dx|²  (2)

Here, F_xs is a function reproducing the selected spot (for a punctiform spot made of one pixel, F_xs = 1 in the case of x = x_s; otherwise 0).

The signal S_xs is described as measurement variable in US 2013/0335552 A1. Depending on the size of the illumination field in the CDI measurement setup, the matrix S_xs is similar to the CDI sensor signal. Both are identical if the CDI illumination field equals F_xs. If the CDI illumination field has a greater extent (as is conventional), there needs to be a conversion, for example once again by way of the reconstructed illuminated object section ψ(x) from a CDI reconstruction. The similarity of S_xs to the CDI sensor signal is the reason for the greater stability of the second proposed method.

The image intensity to be simulated or to be ascertained in the simulation scenario according to Figure 3 can now be calculated directly from the signal S_xs. Here, use is once again made of the intermediate step of calculating the image intensity at a location x_s which corresponds to the location of the spot position, as addressed above in conjunction with Formula (2). The following holds true for the image intensity I(x_s) to be ascertained at the location x_s:

I(x_s) = Σ_k a_k · S_xs(k)  (3)

In this case, a_k is the target illumination setting to be simulated (within the numerical aperture NA_illu: 1 where there is illumination in the pupil plane in the case of the target illumination setting; 0 otherwise). The calculation steps as per Formulas (2) and (3) are then carried out for all array positions x_s = (x_i, y_j) of the sensor pixels of an image sensor to be simulated in the simulation image plane 18. Then, the result is an image intensity I(x, y) to be ascertained of the partly coherent image to be ascertained of the section of the object 5, which corresponds to the object field 3.
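The route via Formulas (2) and (3) can be sketched in a 1-D periodic setting: scan the spot over the object, form the far-field intensity of the spot-illuminated section, and sum it over the lit illumination directions. Names and discretization are illustrative assumptions, not from the patent.

```python
import numpy as np

def simulated_image(psi, spot, a_k):
    """Image intensity via Formulas (2) and (3), 1-D periodic sketch.

    For each spot position x_s, S_xs(k) is the far-field intensity of
    the spot-illuminated object (Formula (2)); the image value at x_s
    is its sum over the directions k, weighted by a_k (Formula (3)).
    """
    n = len(psi)
    image = np.zeros(n)
    for x_s in range(n):
        # Formula (2): far field of the object lit by the spot at x_s.
        s = np.abs(np.fft.fft(np.roll(spot, x_s) * psi)) ** 2
        # Formula (3): weighted sum over the illumination directions.
        image[x_s] = np.sum(a_k * s)
    return image
```

For a punctiform spot this reduces, as expected, to the pointwise object intensity, which is an easy sanity check for the discretization.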

Such a reconstruction can also be carried out by means of a ptychography method. In this case, it is also possible to use iterative Fourier transform algorithms (IFTAs).

In this case, the coherent object illumination light field is reconstructed from the recorded diffraction image data immediately after the considered section 3 of the object 5, wherein the illumination field ψ considered is that which would result when the object 5 is illuminated with illumination light in the form of a plane wave.

By way of example, the object structure ψ(x) is reconstructed using a method from coherent diffractive imaging (CDI). Such a method is known from the specialist article "High numerical aperture reflection mode coherent diffraction microscopy using off-axis apertured illumination" by D.F. Gardner et al., Optics Express, Vol. 20, No. 17, 2012.

Basic principles of the iterative Fourier transform algorithm (IFTA) are found in the specialist article "Further improvements to the ptychographical iterative engine" by A. Maiden et al., OPTICA, Vol. 4, No. 7, 2017 and "Movable Aperture Lensless Transmission Microscopy: A Novel Phase Retrieval Algorithm" by H.M.L. Faulkner et al., Phys. Rev. Lett., Vol. 93, No. 2, 2004.

A further reference for the use of a diffractive image recording in structure detection is the specialist article "Full field tabletop EUV coherent diffractive imaging in a transmission geometry" by B. Zhang et al., Optics Express, Vol. 21, No. 19, 2013.
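A single update step of such an IFTA/ePIE-style reconstruction can be sketched as follows. This is a simplified single-position illustration with our own names and update weighting, not the published algorithms verbatim.

```python
import numpy as np

def ifta_step(obj, probe, measured_amplitude):
    """One Fourier-magnitude projection step at a single scan position.

    Propagate the exit wave to the far field, replace its modulus by
    the measured diffraction amplitude while keeping the phase,
    propagate back, and update the object estimate with an ePIE-style
    conjugate-probe weighting.  Simplified illustration only.
    """
    exit_wave = probe * obj
    far = np.fft.fft2(exit_wave)
    # Enforce the measured diffraction amplitude, keep the phase.
    far_corrected = measured_amplitude * np.exp(1j * np.angle(far))
    exit_new = np.fft.ifft2(far_corrected)
    # ePIE-style object update weighted by the conjugate probe.
    update = np.conj(probe) / (np.abs(probe).max() ** 2 + 1e-12)
    return obj + update * (exit_new - exit_wave)
```

Iterating such steps over all overlapping spot positions drives the object estimate toward consistency with every recorded diffraction image; a fixed point is reached when the simulated far-field amplitudes match the measured ones.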

Simulation methods which can be used in this context are moreover known from the dissertation "Photolithography Simulation" by Heinrich Kirchauer, TU Vienna, March 1998.