

Title:
RECURRENT EQUIVARIANT INFERENCE MACHINES FOR CHANNEL ESTIMATION
Document Type and Number:
WIPO Patent Application WO/2024/064354
Kind Code:
A1
Abstract:
Methods, systems, and devices for wireless communications are described. A wireless device may receive an assignment of a set of resources associated with a channel, where the set of resources includes a first subset of resources allocated for data transmission and a second subset of resources allocated for a reference signal. The wireless device may generate, from the reference signal in accordance with a minimum mean square estimation (MMSE) operation, a first set of multiple channel estimations per layer of the channel. The wireless device may generate, in accordance with a nonlinear two-dimensional interpolation of the channel, a second set of multiple channel estimations per layer of the channel and may perform a refinement operation utilizing the estimations to generate a channel estimation associated with multiple layers.

Inventors:
PRATIK, Kumar (US)
BEHBOODI, Arash (US)
SADEGHI, Pouriya (US)
SRIKRISHNAN, Tharun Adithya (US)
PIERROT, Alexandre (US)
SORIAGA, Joseph (US)
HARIHARAN, Gautham (US)
BHATTACHARJEE, Supratik (US)
Application Number:
PCT/US2023/033506
Publication Date:
March 28, 2024
Filing Date:
September 22, 2023
Assignee:
QUALCOMM INC (US)
International Classes:
H04L25/02; G06N3/045
Domestic Patent References:
WO2022074639A2, 2022-04-14
Other References:
MUTLU URAL ET AL: "Deep Learning Aided Channel Estimation Approach for 5G Communication Systems", 2022 4TH GLOBAL POWER, ENERGY AND COMMUNICATION CONFERENCE (GPECOM), IEEE, 14 June 2022 (2022-06-14), pages 655 - 660, XP034146953, DOI: 10.1109/GPECOM55404.2022.9815811
ZIMAGLIA ELISA ET AL: "A Deep Learning-based Approach to 5G-New Radio Channel Estimation", 2021 JOINT EUROPEAN CONFERENCE ON NETWORKS AND COMMUNICATIONS & 6G SUMMIT (EUCNC/6G SUMMIT), IEEE, 8 June 2021 (2021-06-08), pages 78 - 83, XP033946083, DOI: 10.1109/EUCNC/6GSUMMIT51104.2021.9482426
Attorney, Agent or Firm:
LARSEN, Per (US)
Claims:
CLAIMS

What is claimed is:

1. An apparatus for wireless communication at a wireless communication device, comprising: one or more memories; and one or more processors coupled with the one or more memories and configured to cause the wireless communication device to: receive an assignment of a set of resources associated with a channel comprising a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal; generate, from the reference signal received over the second subset of resources in accordance with a minimum mean square estimation operation, a first plurality of channel estimations associated with respective layers of a plurality of layers of the channel for the set of resources; generate, in accordance with a nonlinear two-dimensional interpolation of the channel for the set of resources, a second plurality of channel estimations and a plurality of values of a latent variable, the second plurality of channel estimations and the plurality of values associated with respective layers of the plurality of layers of the channel for the set of resources, wherein the nonlinear two-dimensional interpolation of the channel is based at least in part on the first plurality of channel estimations; and perform a refinement operation on the second plurality of channel estimations comprising one or more iterations, wherein each iteration of the one or more iterations is performed in accordance with a same set of machine learning parameters, and wherein, to perform each iteration of the one or more iterations, the one or more processors are configured to cause the wireless communication device to: generate respective gradients associated with the second plurality of channel estimations based at least in part on the second plurality of channel estimations for the second subset of resources and measured observations of the second subset of resources; generate, based at least in part on a first set of values of the plurality of values of the 
latent variable, the second plurality of channel estimations, the set of machine learning parameters, and the respective gradients, a second set of values of the latent variable; and modify the second plurality of channel estimations associated with the plurality of layers based at least in part on the second set of values of the latent variable, the second plurality of channel estimations, the set of machine learning parameters, and the respective gradients.

2. The apparatus of claim 1, wherein the refinement operation is a first refinement operation and the set of machine learning parameters is a first set of machine learning parameters, and the one or more processors are configured to cause the wireless communication device to: perform a second refinement operation on the second plurality of channel estimations, the second refinement operation comprising one or more second iterations performed in accordance with a same second set of machine learning parameters, wherein the first refinement operation and the second refinement operation are associated with a respective attention calculation of a plurality of attention calculations.

3. The apparatus of claim 2, wherein the plurality of attention calculations comprises an intra-physical resource block group calculation, an inter-physical resource block group calculation, a cross-multiple-input and multiple-output calculation, a multi-layer perceptron calculation, or any combination thereof.

4. The apparatus of claim 1, wherein the one or more processors are configured to cause the wireless communication device to: perform the minimum mean square estimation operation based on a resource configuration pattern of the second subset of resources allocated for the reference signal, the reference signal comprising a demodulation reference signal.

5. The apparatus of claim 1, wherein, to generate the respective gradients, the one or more processors are configured to cause the wireless communication device to: generate respective sets of values of a residual variable based at least in part on a difference between the measured observations of the second subset of resources and the second plurality of channel estimations for the second subset of resources; and combine the respective sets of values of the residual variable, the second subset of resources, and a quantity of mask bits.

6. The apparatus of claim 1, wherein, to generate the second set of values of the latent variable, the one or more processors are configured to cause the wireless communication device to: combine the second plurality of channel estimations for the second subset of resources, the respective gradients, and respective values of the first set of values of the latent variable based at least in part on generation of the respective gradients.

7. The apparatus of claim 1, wherein, to generate the second set of values of the latent variable, the one or more processors are configured to cause the wireless communication device to: model a correlation between resources of each resource block of a group of resource blocks and other resource blocks of the group of resource blocks.

8. The apparatus of claim 1, wherein, to generate the second set of values of the latent variable, the one or more processors are configured to cause the wireless communication device to: model a correlation between resources of each group of a plurality of groups of resources of the set of resources and other groups of the plurality of groups of resources, wherein each group of the plurality of groups of resources comprises a plurality of resource blocks.

9. The apparatus of claim 1, wherein, to generate the second set of values of the latent variable, the one or more processors are configured to cause the wireless communication device to: model a correlation between each layer of the plurality of layers for the set of resources.

10. The apparatus of claim 1, wherein, to modify the second plurality of channel estimations, the one or more processors are configured to cause the wireless communication device to: combine the second set of values of the latent variable, the second plurality of channel estimations, and the respective gradients based at least in part on the set of machine learning parameters.

11. The apparatus of claim 1, wherein the nonlinear two-dimensional interpolation of the channel is based at least in part on a machine learning model.

12. The apparatus of claim 1, wherein the first plurality of channel estimations and the second plurality of channel estimations are associated with a plurality of single-input and single-output antenna pairs.

13. The apparatus of claim 1, wherein: each iteration of the one or more iterations is performed by a refinement network comprising a likelihood module, an encoder module, and a decoder module, the refinement network comprising a machine learning model; and each refinement network executes according to the same set of machine learning parameters.

14. A method for wireless communication at a wireless communication device, comprising: receiving an assignment of a set of resources associated with a channel comprising a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal; generating, from the reference signal received over the second subset of resources in accordance with a minimum mean square estimation operation, a first plurality of channel estimations associated with respective layers of a plurality of layers of the channel for the set of resources; generating, in accordance with a nonlinear two-dimensional interpolation of the channel for the set of resources, a second plurality of channel estimations and a plurality of values of a latent variable, the second plurality of channel estimations and the plurality of values associated with respective layers of the plurality of layers of the channel for the set of resources, wherein the nonlinear two-dimensional interpolation of the channel is based at least in part on the first plurality of channel estimations; and performing a refinement operation on the second plurality of channel estimations comprising one or more iterations, wherein each iteration of the one or more iterations is performed in accordance with a same set of machine learning parameters, and wherein each iteration of the one or more iterations comprises: generating respective gradients associated with the second plurality of channel estimations based at least in part on the second plurality of channel estimations for the second subset of resources and measured observations of the second subset of resources; generating, based at least in part on a first set of values of the plurality of values of the latent variable, the second plurality of channel estimations, the set of machine learning parameters, and the respective gradients, a second set of values of the latent variable; and modifying the second plurality of channel 
estimations associated with the plurality of layers based at least in part on the second set of values of the latent variable, the second plurality of channel estimations, the set of machine learning parameters, and the respective gradients.

15. The method of claim 14, wherein the refinement operation is a first refinement operation and the set of machine learning parameters is a first set of machine learning parameters, the method further comprising: performing a second refinement operation on the second plurality of channel estimations, the second refinement operation comprising one or more second iterations performed in accordance with a same second set of machine learning parameters, wherein the first refinement operation and the second refinement operation are associated with a respective attention calculation of a plurality of attention calculations.

16. The method of claim 15, wherein the plurality of attention calculations comprises an intra-physical resource block group calculation, an inter-physical resource block group calculation, a cross-multiple-input and multiple-output calculation, a multi-layer perceptron calculation, or any combination thereof.

17. The method of claim 14, further comprising: performing the minimum mean square estimation operation based on a resource configuration pattern of the second subset of resources allocated for the reference signal, the reference signal comprising a demodulation reference signal.

18. The method of claim 14, wherein generating the respective gradients comprises: generating respective sets of values of a residual variable based at least in part on a difference between the measured observations of the second subset of resources and the second plurality of channel estimations for the second subset of resources; and combining the respective sets of values of the residual variable, the second subset of resources, and a quantity of mask bits.

19. The method of claim 14, wherein generating the second set of values of the latent variable comprises: combining the second plurality of channel estimations for the second subset of resources, the respective gradients, and respective values of the first set of values of the latent variable based at least in part on generating the respective gradients.

20. The method of claim 14, wherein generating the second set of values of the latent variable comprises: modeling a correlation between resources of each resource block of a group of resource blocks and other resource blocks of the group of resource blocks.

21. The method of claim 14, wherein generating the second set of values of the latent variable comprises: modeling a correlation between resources of each group of a plurality of groups of resources of the set of resources and other groups of the plurality of groups of resources, wherein each group of the plurality of groups of resources comprises a plurality of resource blocks.

22. The method of claim 14, wherein generating the second set of values of the latent variable comprises: modeling a correlation between each layer of the plurality of layers for the set of resources.

23. The method of claim 14, wherein modifying the second plurality of channel estimations comprises: combining the second set of values of the latent variable, the second plurality of channel estimations, and the respective gradients based at least in part on the set of machine learning parameters.

24. The method of claim 14, wherein the nonlinear two-dimensional interpolation of the channel is based at least in part on a machine learning model.

25. The method of claim 14, wherein the first plurality of channel estimations and the second plurality of channel estimations are associated with a plurality of single-input and single-output antenna pairs.

26. The method of claim 14, wherein: each iteration of the one or more iterations is performed by a refinement network comprising a likelihood module, an encoder module, and a decoder module, the refinement network comprising a machine learning model; and each refinement network executes according to the same set of machine learning parameters.

27. A wireless communication device for wireless communication, comprising: means for receiving an assignment of a set of resources associated with a channel comprising a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal; means for generating, from the reference signal received over the second subset of resources in accordance with a minimum mean square estimation operation, a first plurality of channel estimations associated with respective layers of a plurality of layers of the channel for the set of resources; means for generating, in accordance with a nonlinear two-dimensional interpolation of the channel for the set of resources, a second plurality of channel estimations and a plurality of values of a latent variable, the second plurality of channel estimations and the plurality of values associated with respective layers of the plurality of layers of the channel for the set of resources, wherein the nonlinear two-dimensional interpolation of the channel is based at least in part on the first plurality of channel estimations; and means for performing a refinement operation on the second plurality of channel estimations comprising one or more iterations, wherein each iteration of the one or more iterations is performed in accordance with a same set of machine learning parameters, and wherein the means for performing each iteration of the one or more iterations comprise: means for generating respective gradients associated with the second plurality of channel estimations based at least in part on the second plurality of channel estimations for the second subset of resources and measured observations of the second subset of resources; means for generating, based at least in part on a first set of values of the plurality of values of the latent variable, the second plurality of channel estimations, the set of machine learning parameters, and the respective gradients, a second set of values of 
the latent variable; and means for modifying the second plurality of channel estimations associated with the plurality of layers based at least in part on the second set of values of the latent variable, the second plurality of channel estimations, the set of machine learning parameters, and the respective gradients.

28. The wireless communication device of claim 27, wherein the refinement operation is a first refinement operation and the set of machine learning parameters is a first set of machine learning parameters, the wireless communication device further comprising: means for performing a second refinement operation on the second plurality of channel estimations, the second refinement operation comprising one or more second iterations performed in accordance with a same second set of machine learning parameters, wherein the first refinement operation and the second refinement operation are associated with a respective attention calculation of a plurality of attention calculations.

29. A non-transitory computer-readable medium storing code for wireless communication at a wireless communication device, the code comprising instructions executable by one or more processors to cause the wireless communication device to: receive an assignment of a set of resources associated with a channel comprising a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal; generate, from the reference signal received over the second subset of resources in accordance with a minimum mean square estimation operation, a first plurality of channel estimations associated with respective layers of a plurality of layers of the channel for the set of resources; generate, in accordance with a nonlinear two-dimensional interpolation of the channel for the set of resources, a second plurality of channel estimations and a plurality of values of a latent variable, the second plurality of channel estimations and the plurality of values associated with respective layers of the plurality of layers of the channel for the set of resources, wherein the nonlinear two-dimensional interpolation of the channel is based at least in part on the first plurality of channel estimations; and perform a refinement operation on the second plurality of channel estimations comprising one or more iterations, wherein each iteration of the one or more iterations is performed in accordance with a same set of machine learning parameters, and wherein the instructions to perform each iteration of the one or more iterations are executable to: generate respective gradients associated with the second plurality of channel estimations based at least in part on the second plurality of channel estimations for the second subset of resources and measured observations of the second subset of resources; generate, based at least in part on a first set of values of the plurality of values of the latent variable, the second plurality of channel estimations, 
the set of machine learning parameters, and the respective gradients, a second set of values of the latent variable; and modify the second plurality of channel estimations associated with the plurality of layers based at least in part on the second set of values of the latent variable, the second plurality of channel estimations, the set of machine learning parameters, and the respective gradients.

30. The non-transitory computer-readable medium of claim 29, wherein the refinement operation is a first refinement operation and the set of machine learning parameters is a first set of machine learning parameters, and the instructions are further executable by the one or more processors to cause the wireless communication device to: perform a second refinement operation on the second plurality of channel estimations, the second refinement operation comprising one or more second iterations performed in accordance with a same second set of machine learning parameters, wherein the first refinement operation and the second refinement operation are associated with a respective attention calculation of a plurality of attention calculations.

Description:
RECURRENT EQUIVARIANT INFERENCE MACHINES FOR CHANNEL ESTIMATION

CROSS REFERENCE

[0001] The present Application for Patent claims priority to U.S. Patent Application No. 18/472,083 by PRATIK et al., entitled “RECURRENT EQUIVARIANT INFERENCE MACHINES FOR CHANNEL ESTIMATION,” filed September 21, 2023, and to U.S. Patent Application No. 17/952,203 by PRATIK et al., entitled “RECURRENT EQUIVARIANT INFERENCE MACHINES FOR CHANNEL ESTIMATION,” filed September 23, 2022, each of which is assigned to the assignee hereof, and expressly incorporated by reference in its entirety herein.

INTRODUCTION

[0002] The following relates to wireless communications, and more specifically to estimating a channel using machine learning models.

[0003] Wireless communications systems are widely deployed to provide various types of communication content such as voice, video, packet data, messaging, broadcast, and so on. These systems may be capable of supporting communication with multiple users by sharing the available system resources (e.g., time, frequency, and power). Examples of such multiple-access systems include fourth generation (4G) systems such as Long Term Evolution (LTE) systems, LTE-Advanced (LTE-A) systems, or LTE-A Pro systems, and fifth generation (5G) systems which may be referred to as New Radio (NR) systems. These systems may employ technologies such as code division multiple access (CDMA), time division multiple access (TDMA), frequency division multiple access (FDMA), orthogonal FDMA (OFDMA), or discrete Fourier transform spread orthogonal frequency division multiplexing (DFT-S-OFDM). A wireless multiple-access communications system may include one or more base stations, each supporting wireless communication for communication devices, which may be known as user equipment (UE).

SUMMARY

[0004] The described techniques relate to improved methods, systems, devices, and apparatuses that support recurrent equivariant inference machines for channel estimation. For example, the described techniques provide for calculating channel estimations using recurrent equivariant inference machines. In some cases, a wireless communications system may place pilot symbols (e.g., demodulation reference signal (DMRS) symbols) in transmission slots according to a known pattern, thus allowing wireless devices to estimate the unknown resources of the channel based on the known resources (e.g., DMRS symbols). For example, a wireless device may receive an assignment of a set of resources associated with a channel where the set of resources includes a first subset of resources allocated for data transmission and a second subset of resources allocated for a reference signal (e.g., DMRS). The wireless device may generate multiple channel estimations per layer of the channel (e.g., single-input and single-output (SISO) channel estimations) and perform a refinement operation utilizing the estimations to generate a channel estimation associated with multiple layers (e.g., a multiple-input and multiple-output (MIMO) channel estimation). In some cases, the refinement operation may include multiple iterations. For example, each iteration may include generating respective gradients associated with each per layer channel estimation based on the per layer channel estimations and observed resources of the channel (e.g., the known DMRS resources); generating a current set of values of a latent variable (e.g., an inferred variable based on observed variables) based on a previous set of values of the latent variable, the respective gradients, and the per layer channel estimations; and modifying (e.g., refining, updating, improving) the channel estimations based on the current set of values of the latent variable, the per layer channel estimations, and the respective gradients. 
In some cases, the refinement operation may be performed by a refinement network that includes a likelihood module, an encoder module, and a decoder module.
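The per-iteration steps outlined above can be illustrated with a small numerical sketch. The following Python fragment is a hypothetical stand-in, not the patented refinement network: the learned encoder and decoder modules are replaced by simple scalar-weight updates (the names `refinement_iteration`, `W_z`, `W_g`, `W_h`, and `W_o` are all illustrative), but the data flow mirrors the three described steps of gradient generation from pilot residuals, a latent-variable update, and a modification of the channel estimate.

```python
import numpy as np

def refinement_iteration(h_est, latent, y_obs, pilot_mask, params, step=0.1):
    """One illustrative refinement iteration over a per-layer channel estimate.

    h_est: current channel estimate over the resource grid
    latent: latent-variable values from the previous iteration
    y_obs: measured observations (valid only where pilot_mask is 1)
    pilot_mask: 1 at reference-signal resources, 0 elsewhere
    params: shared (hypothetical) machine learning parameters
    """
    # (1) gradient from residuals at the observed pilot resources
    residual = (y_obs - h_est) * pilot_mask
    grad = -residual  # gradient of the squared error w.r.t. the estimate
    # (2) update the latent variable from the prior latent values,
    #     the current estimate, and the gradients
    latent = np.tanh(params["W_z"] * latent + params["W_g"] * grad
                     + params["W_h"] * h_est)
    # (3) modify the channel estimate using the new latent values
    h_est = h_est - step * grad + params["W_o"] * latent
    return h_est, latent
```

Because the same `params` dictionary would be passed to every call, repeated invocations model the sharing of one set of machine learning parameters across all iterations described above.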

[0005] A method for wireless communication at a wireless communication device is described. The method may include receiving an assignment of a set of resources associated with a channel including a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal, generating, from the reference signal received over the second subset of resources in accordance with a minimum mean square estimation (MMSE) operation, a first set of multiple channel estimations associated with respective layers of a set of multiple layers of the channel for the set of resources, generating, in accordance with a nonlinear two-dimensional interpolation of the channel for the set of resources, a second set of multiple channel estimations and a set of multiple values of a latent variable, the second set of multiple channel estimations and the set of multiple values associated with respective layers of the set of multiple layers of the channel for the set of resources, where the nonlinear two-dimensional interpolation of the channel is based on the first set of multiple channel estimations, and performing a refinement operation on the second set of multiple channel estimations including one or more iterations, where each iteration of the one or more iterations is performed in accordance with a same set of machine learning parameters. 
Performing each iteration of the one or more iterations may include generating respective gradients associated with the second set of multiple channel estimations based on the second set of multiple channel estimations for the second subset of resources and measured observations of the second subset of resources, generating, based on a first set of values of the set of multiple values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients, a second set of values of the latent variable, and modifying the second set of multiple channel estimations associated with the set of multiple layers based on the second set of values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients.
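The initial per-layer estimates over the reference-signal resources can be sketched as a generic, textbook-style MMSE pilot estimator. This is not the specific operation claimed: the channel autocorrelation matrix `r_hh` and the noise variance are assumed known here, and all names are illustrative.

```python
import numpy as np

def mmse_pilot_estimate(y_pilots, x_pilots, r_hh, noise_var):
    """MMSE channel estimate at pilot (e.g., DMRS) resource elements.

    y_pilots: received symbols at the pilot resource elements
    x_pilots: known transmitted pilot symbols
    r_hh: channel autocorrelation across the pilot tones (assumed known)
    noise_var: estimated noise variance
    """
    h_ls = y_pilots / x_pilots  # per-pilot least-squares estimate
    n = h_ls.shape[0]
    # the MMSE filter shrinks the LS estimate according to the SNR
    w = r_hh @ np.linalg.inv(r_hh + noise_var * np.eye(n))
    return w @ h_ls
```

With `noise_var = 0` the filter reduces to the identity and the least-squares estimate passes through unchanged; as the noise grows, the estimate is shrunk toward zero.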

[0006] An apparatus for wireless communication at a wireless communication device is described. The apparatus may include one or more memories and one or more processors coupled with the one or more memories. The one or more processors may be configured to cause the wireless communication device to receive an assignment of a set of resources associated with a channel including a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal, generate, from the reference signal received over the second subset of resources in accordance with an MMSE operation, a first set of multiple channel estimations associated with respective layers of a set of multiple layers of the channel for the set of resources, generate, in accordance with a nonlinear two-dimensional interpolation of the channel for the set of resources, a second set of multiple channel estimations and a set of multiple values of a latent variable, the second set of multiple channel estimations and the set of multiple values associated with respective layers of the set of multiple layers of the channel for the set of resources, where the nonlinear two-dimensional interpolation of the channel is based on the first set of multiple channel estimations, and perform a refinement operation on the second set of multiple channel estimations including one or more iterations, where each iteration of the one or more iterations is performed in accordance with a same set of machine learning parameters. 
To perform each iteration of the one or more iterations, the one or more processors may be configured to cause the wireless communication device to generate respective gradients associated with the second set of multiple channel estimations based on the second set of multiple channel estimations for the second subset of resources and measured observations of the second subset of resources, generate, based on a first set of values of the set of multiple values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients, a second set of values of the latent variable, and modify the second set of multiple channel estimations associated with the set of multiple layers based on the second set of values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients.
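The nonlinear two-dimensional interpolation referred to above is described as based on a machine learning model; as a crude, non-learned stand-in, a separable spline over the time-frequency grid conveys the idea of spreading per-pilot estimates across all resource elements. The helper below is hypothetical and uses SciPy's `RectBivariateSpline`; the interpolation in the described techniques is a learned model, not this spline.

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

def interpolate_grid(h_pilot, pilot_syms, pilot_tones, n_syms, n_tones):
    """Spread per-pilot channel estimates over the full time-frequency grid.

    h_pilot: complex estimates at pilot positions,
             shape (len(pilot_syms), len(pilot_tones))
    pilot_syms, pilot_tones: sorted pilot symbol/tone indices
    n_syms, n_tones: size of the full resource grid
    """
    syms = np.arange(n_syms)
    tones = np.arange(n_tones)
    # spline degree must be below the number of pilot points per axis
    kx = min(3, len(pilot_syms) - 1)
    ky = min(3, len(pilot_tones) - 1)
    # interpolate real and imaginary parts separately
    re = RectBivariateSpline(pilot_syms, pilot_tones, h_pilot.real, kx=kx, ky=ky)
    im = RectBivariateSpline(pilot_syms, pilot_tones, h_pilot.imag, kx=kx, ky=ky)
    return re(syms, tones) + 1j * im(syms, tones)
```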

[0007] Another apparatus for wireless communication is described. The apparatus may include means for receiving an assignment of a set of resources associated with a channel including a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal, means for generating, from the reference signal received over the second subset of resources in accordance with an MMSE operation, a first set of multiple channel estimations associated with respective layers of a set of multiple layers of the channel for the set of resources, means for generating, in accordance with a nonlinear two-dimensional interpolation of the channel for the set of resources, a second set of multiple channel estimations and a set of multiple values of a latent variable, the second set of multiple channel estimations and the set of multiple values associated with respective layers of the set of multiple layers of the channel for the set of resources, where the nonlinear two-dimensional interpolation of the channel is based on the first set of multiple channel estimations, and means for performing a refinement operation on the second set of multiple channel estimations including one or more iterations, where each iteration of the one or more iterations is performed in accordance with a same set of machine learning parameters. 
The means for performing each iteration of the one or more iterations may include means for generating respective gradients associated with the second set of multiple channel estimations based on the second set of multiple channel estimations for the second subset of resources and measured observations of the second subset of resources, means for generating, based on a first set of values of the set of multiple values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients, a second set of values of the latent variable, and means for modifying the second set of multiple channel estimations associated with the set of multiple layers based on the second set of values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients.

[0008] A non-transitory computer-readable medium storing code for wireless communication at a wireless communication device is described. The code may include instructions executable by one or more processors to cause the wireless communication device to receive an assignment of a set of resources associated with a channel including a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal, generate, from the reference signal received over the second subset of resources in accordance with an MMSE operation, a first set of multiple channel estimations associated with respective layers of a set of multiple layers of the channel for the set of resources, generate, in accordance with a nonlinear two-dimensional interpolation of the channel for the set of resources, a second set of multiple channel estimations and a set of multiple values of a latent variable, the second set of multiple channel estimations and the set of multiple values associated with respective layers of the set of multiple layers of the channel for the set of resources, where the nonlinear two-dimensional interpolation of the channel is based on the first set of multiple channel estimations, and perform a refinement operation on the second set of multiple channel estimations including one or more iterations, where each iteration of the one or more iterations is performed in accordance with a same set of machine learning parameters.
The instructions to perform each iteration of the one or more iterations may be executable by the one or more processors to cause the wireless communication device to generate respective gradients associated with the second set of multiple channel estimations based on the second set of multiple channel estimations for the second subset of resources and measured observations of the second subset of resources, generate, based on a first set of values of the set of multiple values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients, a second set of values of the latent variable, and modify the second set of multiple channel estimations associated with the set of multiple layers based on the second set of values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients.

[0009] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, the refinement operation is a first refinement operation and the set of machine learning parameters is a first set of machine learning parameters. Some examples of the method, apparatuses, and non-transitory computer-readable medium described herein may include operations, features, means, or instructions for performing a second refinement operation on the second set of multiple channel estimations, the second refinement operation including one or more second iterations performed in accordance with a same second set of machine learning parameters, where the first refinement operation and the second refinement operation are associated with a respective attention calculation of a set of multiple attention calculations.

[0010] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, the set of multiple attention calculations includes an intra-physical resource block (PRB) group calculation, an inter-PRB group calculation, a cross-multiple-input and multiple-output (MIMO) calculation, a multilayer perceptron (MLP) calculation, or any combination thereof.
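The axis-wise attention calculations listed above can be illustrated with a small sketch. This is not the disclosed implementation: the tensor layout, the use of unprojected scaled dot-product attention, and all shapes below are illustrative assumptions. The same routine serves as an intra-PRB-group, inter-PRB-group, or cross-MIMO calculation depending on which axis is treated as the attention sequence.

```python
import numpy as np

def axis_attention(x, axis):
    """Scaled dot-product self-attention applied along one axis of x.

    Moving the chosen axis into the sequence position lets one routine
    model correlation within a PRB group (tones axis), across PRB groups
    (groups axis), or across MIMO layers (layers axis).
    """
    x = np.moveaxis(x, axis, -2)                         # (..., seq, feat)
    scores = x @ np.swapaxes(x, -1, -2)                  # (..., seq, seq)
    scores = scores / np.sqrt(x.shape[-1])
    # Numerically stable softmax over the sequence dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    out = weights @ x                                    # (..., seq, feat)
    return np.moveaxis(out, -2, axis)

# Hypothetical tensor: (layers, PRB groups, tones per group, features).
x = np.random.default_rng(1).standard_normal((2, 3, 12, 8))
intra = axis_attention(x, axis=2)   # tones attend within a PRB group
inter = axis_attention(x, axis=1)   # PRB groups attend to each other
cross = axis_attention(x, axis=0)   # MIMO layers attend to each other
print(intra.shape, inter.shape, cross.shape)
```

Each call preserves the tensor shape, so the calculations can be chained in any order, consistent with combining them in a single refinement operation.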

[0011] Some examples of the method, apparatuses, and non-transitory computer-readable medium described herein may include operations, features, means, or instructions for performing the MMSE operation based on a resource configuration pattern of the second subset of resources allocated for the reference signal, the reference signal including a demodulation reference signal (DMRS).

[0012] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, generating the respective gradients may include operations, features, means, or instructions for generating respective sets of values of a residual variable based on a difference between the measured observations of the second subset of resources and the second set of multiple channel estimations for the second subset of resources, and combining the respective sets of values of the residual variable, the second subset of resources, and a quantity of mask bits.
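The residual-and-mask construction described in the paragraph above can be sketched as follows. All array names and shapes, and the choice of stacking real and imaginary parts with the mask bits, are illustrative assumptions rather than details from this disclosure.

```python
import numpy as np

def gradient_inputs(y_obs, h_est, pilot_mask):
    """Build per-layer gradient inputs from a residual variable and mask bits.

    y_obs      : measured observations on the reference-signal resources
    h_est      : current channel estimates on the same resource grid
    pilot_mask : 1 where a resource carries the reference signal, else 0
    """
    # Residual variable: difference between the measured observations and
    # the channel estimations, restricted to the reference-signal resources.
    residual = (y_obs - h_est) * pilot_mask
    # Combine the residual, the observations, and the mask bits into one
    # feature tensor for the refinement network.
    return np.stack([residual.real, residual.imag,
                     y_obs.real, y_obs.imag,
                     pilot_mask.astype(float)], axis=-1)

# Toy 4-subcarrier x 2-symbol grid with pilots on the first symbol only.
rng = np.random.default_rng(0)
y = rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2))
h = rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2))
mask = np.zeros((4, 2))
mask[:, 0] = 1
feats = gradient_inputs(y, h, mask)
print(feats.shape)  # one 5-channel feature vector per resource element
```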

[0013] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, generating the second set of values of the latent variable may include operations, features, means, or instructions for combining the second set of multiple channel estimations for the second subset of resources, the respective gradients, and respective values of the first set of values of the latent variable based on generating the respective gradients.

[0014] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, generating the second set of values of the latent variable may include operations, features, means, or instructions for modeling a correlation between resources of each resource block of a group of resource blocks and other resource blocks of the group of resource blocks.

[0015] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, generating the second set of values of the latent variable may include operations, features, means, or instructions for modeling a correlation between resources of each group of a set of multiple groups of resources of the set of resources and other groups of the set of multiple groups of resources, where each group of the set of multiple groups of resources includes a set of multiple resource blocks.

[0016] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, generating the second set of values of the latent variable may include operations, features, means, or instructions for modeling a correlation between each layer of the set of multiple layers for the set of resources.

[0017] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, modifying the second set of multiple channel estimations may include operations, features, means, or instructions for combining the second set of values of the latent variable, the second set of multiple channel estimations, and the respective gradients based on the set of machine learning parameters.

[0018] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, the nonlinear two-dimensional interpolation of the channel is based on a machine learning model.

[0019] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, the first set of multiple channel estimations and the second set of multiple channel estimations are associated with a set of multiple single-input and single-output (SISO) antenna pairs.

[0020] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, each iteration of the one or more iterations is performed by a refinement network including a likelihood module, an encoder module, and a decoder module, the refinement network including a machine learning model. In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, each refinement network executes according to the same set of machine learning parameters.

[0021] A method for wireless communication is described. The method may include receiving an assignment of a set of resources associated with a channel including a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal, generating a set of multiple channel estimations associated with respective layers of a set of multiple layers of the channel for the set of resources, and performing a refinement operation on the set of multiple channel estimations including one or more iterations, where each iteration of the one or more iterations may include operations, features, means, or instructions for generating respective gradients associated with the set of multiple channel estimations based on the set of multiple channel estimations for the second subset of resources and measured observations of the second subset of resources, generating, based on a first set of values of a latent variable, the set of multiple channel estimations, and the respective gradients, a second set of values of the latent variable, and modifying the set of multiple channel estimations associated with the set of multiple layers based on the second set of values of the latent variable, the set of multiple channel estimations, and the respective gradients.

[0022] An apparatus for wireless communication at a wireless communication device is described. The apparatus may include one or more memories and one or more processors coupled with the one or more memories. The one or more processors may be configured to cause the wireless communication device to receive an assignment of a set of resources associated with a channel including a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal, generate a set of multiple channel estimations associated with respective layers of a set of multiple layers of the channel for the set of resources, and perform a refinement operation on the set of multiple channel estimations including one or more iterations, where, for each iteration of the one or more iterations, the one or more processors may be configured to cause the wireless communication device to generate respective gradients associated with the set of multiple channel estimations based on the set of multiple channel estimations for the second subset of resources and measured observations of the second subset of resources, generate, based on a first set of values of a latent variable, the set of multiple channel estimations, and the respective gradients, a second set of values of the latent variable, and modify the set of multiple channel estimations associated with the set of multiple layers based on the second set of values of the latent variable, the set of multiple channel estimations, and the respective gradients.

[0023] Another apparatus for wireless communication is described. The apparatus may include means for receiving an assignment of a set of resources associated with a channel including a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal, means for generating a set of multiple channel estimations associated with respective layers of a set of multiple layers of the channel for the set of resources, and means for performing a refinement operation on the set of multiple channel estimations including one or more iterations, where the means for each iteration of the one or more iterations include means for generating respective gradients associated with the set of multiple channel estimations based on the set of multiple channel estimations for the second subset of resources and measured observations of the second subset of resources, means for generating, based on a first set of values of a latent variable, the set of multiple channel estimations, and the respective gradients, a second set of values of the latent variable, and means for modifying the set of multiple channel estimations associated with the set of multiple layers based on the second set of values of the latent variable, the set of multiple channel estimations, and the respective gradients.

[0024] A non-transitory computer-readable medium storing code for wireless communication is described. The code may include instructions executable by one or more processors to receive an assignment of a set of resources associated with a channel including a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal, generate a set of multiple channel estimations associated with respective layers of a set of multiple layers of the channel for the set of resources, and perform a refinement operation on the set of multiple channel estimations including one or more iterations, where the instructions for each iteration of the one or more iterations are executable to generate respective gradients associated with the set of multiple channel estimations based on the set of multiple channel estimations for the second subset of resources and measured observations of the second subset of resources, generate, based on a first set of values of a latent variable, the set of multiple channel estimations, and the respective gradients, a second set of values of the latent variable, and modify the set of multiple channel estimations associated with the set of multiple layers based on the second set of values of the latent variable, the set of multiple channel estimations, and the respective gradients.

[0025] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, generating the respective gradients may include operations, features, means, or instructions for generating respective sets of values of a residual variable based on a difference between the measured observations of the second subset of resources and the set of multiple channel estimations for the second subset of resources and combining the respective sets of values of the residual variable, the measured observations of the second subset of resources, and a quantity of mask bits.

[0026] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, generating the second set of values of the latent variable may include operations, features, means, or instructions for combining the set of multiple channel estimations for the second subset of resources, the respective gradients, and respective values of the first set of values of the latent variable based on generating the respective gradients.

[0027] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, generating the second set of values of the latent variable may include operations, features, means, or instructions for modeling correlation between resources of each resource block of a group of resource blocks and other resource blocks of the group of resource blocks.

[0028] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, generating the second set of values of the latent variable may include operations, features, means, or instructions for modeling correlation between resources of each group of a set of multiple groups of resources of the set of resources and other groups of the set of multiple groups of resources, where each group of the set of multiple groups of resources includes a set of multiple resource blocks.

[0029] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, generating the second set of values of the latent variable may include operations, features, means, or instructions for modeling correlation between each layer of the set of multiple layers for the set of resources.

[0030] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, modifying the set of multiple channel estimations may include operations, features, means, or instructions for combining the second set of values of the latent variable, the set of multiple channel estimations, and the respective gradients.

[0031] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, initial values of the set of multiple channel estimations may be associated with SISO antenna pairs.

[0032] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, the second subset of resources may be configured according to a resource configuration pattern of a set of resource configuration patterns.

[0033] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, the set of resource configuration patterns may be a set of DMRS patterns.

[0034] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, each iteration may be performed by a refinement network including a likelihood module, an encoder module, and a decoder module, and each refinement network further includes a respective parameter associated with a machine learning operation.

[0035] In some examples of the method, apparatuses, and non-transitory computer-readable medium described herein, the set of resources includes one or more groups of resources, and each respective layer of the set of multiple layers may be associated with a respective antenna pair of a set of multiple SISO antenna pairs.

BRIEF DESCRIPTION OF THE DRAWINGS

[0036] FIG. 1 shows an example of a wireless communications system that supports recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure.

[0037] FIG. 2 shows an example of a network architecture that supports recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure.

[0038] FIGs. 3-6 show examples of networks that support recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure.

[0039] FIGs. 7 and 8 show block diagrams of devices that support recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure.

[0040] FIG. 9 shows a block diagram of a communications manager that supports recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure.

[0041] FIG. 10 shows a diagram of a system including a user equipment (UE) that supports recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure.

[0042] FIG. 11 shows a diagram of a system including a network entity that supports recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure.

[0043] FIGs. 12 through 15 show flowcharts illustrating methods that support recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure.

DETAILED DESCRIPTION

[0044] Some wireless communications systems may support channel estimations. For example, a wireless device may estimate resources of a channel to maintain high data throughput. To accomplish channel estimation, some communications systems may utilize a tracking reference signal (TRS) to calculate channel characteristics (e.g., Doppler, delay spread, signal-to-noise ratio (SNR), and the like) and estimate the channel resources based on the channel characteristics. However, TRS may include a relatively large overhead (e.g., memory, computation) and not all wireless communications systems may continuously (e.g., periodically or relatively regularly over time) transmit TRS, which may affect channel estimations. Additionally, some wireless communications systems may support multiple-input and multiple-output (MIMO) communications in which multiple resource layers may interfere with each other, further reducing the accuracy of the channel estimations. A resource layer may refer to a spatial layer of a wireless channel. For MIMO communications, multiple antennas or antenna ports of a transmitting device may each be associated with a respective layer, which may be transmitted over the same time-frequency resources. To account for inconsistent TRS availability, as well as for cross-MIMO interference, the channel estimation procedure may be updated.

[0045] Some wireless communication devices support machine learning-based channel estimation operations. The techniques described herein provide for applying minimum mean square estimation (MMSE) operations to machine learning-based channel estimation techniques to improve channel estimation. For example, performing a combination of machine learning-based and MMSE-based channel estimation may provide enhanced channel estimation through machine learning capabilities while maintaining existing channel estimation and demapping hardware at the wireless device used for the MMSE operation, which may reduce memory consumption and computational costs, among other possibilities. MMSE operations may include a wireless device estimating a channel by minimizing a mean square error of variables associated with the channel. The techniques described herein may provide for calculating channel estimations using recurrent equivariant inference machines, which may be a type of machine learning model that provides relatively reliable and accurate estimations based on a given set of inputs. In some cases, a wireless communications system may place pilot symbols (e.g., demodulation reference signal (DMRS) symbols, other types of reference symbols, or any other pilot symbols) in transmission slots according to a known pattern, thus allowing wireless devices to estimate the unknown resources of the channel based on the known resources (e.g., DMRS symbols). For example, a wireless device may receive an assignment of a first set of resources within a channel that are allocated for data transmission and a second set of resources within the channel that are allocated for a reference signal (e.g., DMRS). The wireless device may perform an MMSE operation to generate a first set of multiple channel estimations per layer of the channel (e.g., single-input and single-output (SISO) channel estimations).
The MMSE operation may be an estimation technique that utilizes linear equalization to estimate a channel. MMSE may be supported by hardware within the wireless device, which may provide for reduced complexity and processing as compared with performing an initial estimation based on machine learning. The wireless device may subsequently utilize a nonlinear two-dimensional interpolation to modify the MMSE estimates and generate a second set of multiple channel estimates per layer of the channel. For example, the wireless device may build on the first set of multiple channel estimates using interpolation in a time and frequency domain in accordance with a machine learning model. In some cases, the nonlinear two-dimensional interpolation may utilize a machine learning model to improve an accuracy of the channel estimates produced by the MMSE operation and to apply the channel estimates to the time and frequency domain.

[0046] The wireless device may perform a refinement operation utilizing the second set of multiple channel estimations to generate a channel estimation associated with multiple layers (e.g., a MIMO channel estimation). That is, in one example, the wireless device may take the channel estimates generated via the MMSE operation and subsequent machine learning interpolation, and the wireless device may refine the channel estimates and combine the estimates for each layer of the channel into a single MIMO channel estimation. In some cases, the refinement operation may include multiple iterations of refinement. Each refinement iteration may generate respective gradients associated with the per-layer channel estimations based on the per-layer channel estimations and observed resources of the channel (e.g., the known DMRS resources). Each refinement iteration may further generate a current set of values of a latent variable (e.g., an inferred variable based on observed variables).
The current set of values of the latent variable may be generated based on a previous set of values of the latent variable, the respective gradients, and the per-layer channel estimations. Each refinement iteration may further modify (e.g., refine, update, improve) the channel estimations based on the current set of values of the latent variable, the per-layer channel estimations, and the respective gradients. That is, in one example, each refinement iteration may further improve the channel estimations and generate new or improved values of an inferred latent variable. In some cases, the refinement operation may be performed by a refinement network, which may be a machine learning model that includes a likelihood module, an encoder module, and a decoder module.
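The iterative refinement described above can be sketched in the style of a recurrent inference machine, in which every iteration reuses the same set of parameters. The linear encoder and decoder below stand in for the refinement network's machine learning modules; every function, name, and shape here is an illustrative assumption, not the disclosed design.

```python
import numpy as np

def refine(h_est, latent, y_obs, pilot_mask, params, num_iters=3):
    """Shared-parameter refinement loop: each iteration computes a
    gradient-like residual (likelihood step), updates the latent variable
    (encoder step), and modifies the estimates (decoder step)."""
    W_enc, W_dec = params          # the same parameters for every iteration
    for _ in range(num_iters):
        # Likelihood: residual on the reference-signal resources only.
        grad = (y_obs - h_est) * pilot_mask
        # Encoder: new latent values from the previous latent values,
        # the current estimates, and the gradients.
        features = np.stack([h_est.real, h_est.imag,
                             grad.real, grad.imag], axis=-1)
        latent = np.tanh(features @ W_enc + latent)
        # Decoder: modify the channel estimates from the new latent values.
        delta = latent @ W_dec
        h_est = h_est + delta[..., 0] + 1j * delta[..., 1]
    return h_est, latent

rng = np.random.default_rng(2)
shape = (4, 2)                                 # toy (subcarrier, symbol) grid
y = rng.standard_normal(shape) + 1j * rng.standard_normal(shape)
h0 = np.zeros(shape, dtype=complex)            # initial estimate
mask = np.zeros(shape)
mask[:, 0] = 1                                 # pilots on the first symbol
params = (0.1 * rng.standard_normal((4, 8)),   # hypothetical encoder weights
          0.1 * rng.standard_normal((8, 2)))   # hypothetical decoder weights
h_final, z_final = refine(h0, np.zeros(shape + (8,)), y, mask, params)
print(h_final.shape, z_final.shape)
```

Because `params` is captured once and applied in every pass of the loop, the sketch reflects the claimed property that each iteration is performed in accordance with a same set of machine learning parameters.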

[0047] The techniques described herein may provide for the wireless device to estimate a channel reliably and accurately with relatively low memory consumption and processing. For example, by first utilizing an MMSE operation to generate channel estimations, the wireless device may reduce memory consumption and processing as compared with machine learning-based estimations or other techniques. Additionally, or alternatively, by utilizing MMSE for channel estimation, the wireless device may support reduced hardware costs and complexities as compared with other machine learning-based channel estimation techniques, at least because the MMSE operation may be supported by current hardware components of the wireless device.
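For context, the MMSE initializer credited above with low complexity can be illustrated with a textbook linear MMSE (LMMSE) estimate at pilot positions, h_mmse = R (R + sigma^2 I)^{-1} h_ls, where h_ls is the least-squares estimate. The covariance model and noise level below are illustrative assumptions, not values from this disclosure.

```python
import numpy as np

def lmmse_pilot_estimate(y_pilots, x_pilots, R_hh, noise_var):
    """Textbook LMMSE channel estimate at pilot positions.

    y_pilots : received symbols on the pilot resources
    x_pilots : known transmitted pilot symbols
    R_hh     : channel covariance across the pilot positions
    """
    h_ls = y_pilots / x_pilots                         # least-squares estimate
    n = R_hh.shape[0]
    W = R_hh @ np.linalg.inv(R_hh + noise_var * np.eye(n))
    return W @ h_ls

# Hypothetical exponential correlation over 8 pilot subcarriers.
n = 8
idx = np.arange(n)
R = 0.9 ** np.abs(idx[:, None] - idx[None, :])
rng = np.random.default_rng(3)
h_true = np.linalg.cholesky(R + 1e-9 * np.eye(n)) @ (
    rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
x = np.ones(n, dtype=complex)                          # known pilot symbols
y = x * h_true + 0.1 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
h_hat = lmmse_pilot_estimate(y, x, R, noise_var=0.02)
print(h_hat.shape)
```

The matrix `W` depends only on the covariance and noise level, which is one reason an MMSE front end can be precomputed or implemented in fixed hardware before a learned interpolation and refinement stage.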

[0048] Aspects of the disclosure are initially described in the context of wireless communications systems. Aspects of the disclosure are then described in the context of networks. Aspects of the disclosure are further illustrated by and described with reference to apparatus diagrams, system diagrams, and flowcharts that relate to recurrent equivariant inference machines for channel estimation.

[0049] FIG. 1 shows an example of a wireless communications system 100 that supports recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure. The wireless communications system 100 may include one or more network entities 105, one or more UEs 115, and a core network 130. In some examples, the wireless communications system 100 may be an LTE network, an LTE-A network, an LTE-A Pro network, an NR network, or a network operating in accordance with other systems and radio technologies, including future systems and radio technologies not explicitly mentioned herein.

[0050] The network entities 105 may be dispersed throughout a geographic area to form the wireless communications system 100 and may include devices in different forms or having different capabilities. In various examples, a network entity 105 may be referred to as a network element, a mobility element, a radio access network (RAN) node, or network equipment, among other nomenclature. In some examples, network entities 105 and UEs 115 may wirelessly communicate via one or more communication links 125 (e.g., a radio frequency (RF) access link). For example, a network entity 105 may support a coverage area 110 (e.g., a geographic coverage area) over which the UEs 115 and the network entity 105 may establish one or more communication links 125. The coverage area 110 may be an example of a geographic area over which a network entity 105 and a UE 115 may support the communication of signals according to one or more radio access technologies (RATs).

[0051] The UEs 115 may be dispersed throughout a coverage area 110 of the wireless communications system 100, and each UE 115 may be stationary, or mobile, or both at different times. The UEs 115 may be devices in different forms or having different capabilities. Some example UEs 115 are illustrated in FIG. 1. The UEs 115 described herein may be capable of supporting communications with various types of devices, such as other UEs 115 or network entities 105, as shown in FIG. 1. The UEs 115 may include a communications manager 101 configured to transmit and receive communications to and from a network entity 105. In some examples, the communications manager 101 may be configured to receive an assignment of a set of resources including resources allocated for a data signal and resources allocated for a reference signal. Additionally, or alternatively, the communications manager 101 may be configured to generate channel estimations based on an MMSE operation, a nonlinear two-dimensional interpolation of a channel, a refinement operation, or any combination thereof.

[0052] As described herein, a node, which may be referred to as a node, a network node, a network entity, or a wireless node, may be a base station (e.g., any base station described herein), a UE (e.g., any UE described herein), a network controller, an apparatus, a device, a computing system, one or more components, and/or another suitable processing entity configured to perform any of the techniques described herein. For example, a network node may be a UE. As another example, a network node may be a base station. As another example, a first network node may be configured to communicate with a second network node or a third network node. In one aspect of this example, the first network node may be a UE, the second network node may be a base station, and the third network node may be a UE. In another aspect of this example, the first network node may be a UE, the second network node may be a base station, and the third network node may be a base station. In yet other aspects of this example, the first, second, and third network nodes may be different relative to these examples. Similarly, reference to a UE, base station, apparatus, device, computing system, or the like may include disclosure of the UE, base station, apparatus, device, computing system, or the like being a network node. For example, disclosure that a UE is configured to receive information from a base station also discloses that a first network node is configured to receive information from a second network node. Consistent with this disclosure, once a specific example is broadened in accordance with this disclosure (e.g., a UE is configured to receive information from a base station also discloses that a first network node is configured to receive information from a second network node), the broader example of the narrower example may be interpreted in the reverse, but in a broad open-ended way. 
In the example above where a UE being configured to receive information from a base station also discloses that a first network node being configured to receive information from a second network node, the first network node may refer to a first UE, a first base station, a first apparatus, a first device, a first computing system, a first one or more components, a first processing entity, or the like configured to receive the information; and the second network node may refer to a second UE, a second base station, a second apparatus, a second device, a second computing system, a second one or more components, a second processing entity, or the like.

[0053] As described herein, communication of information (e.g., any information, signal, or the like) may be described in various aspects using different terminology. Disclosure of one communication term includes disclosure of other communication terms. For example, a first network node may be described as being configured to transmit information to a second network node. In this example and consistent with this disclosure, disclosure that the first network node is configured to transmit information to the second network node includes disclosure that the first network node is configured to provide, send, output, communicate, or transmit information to the second network node. Similarly, in this example and consistent with this disclosure, disclosure that the first network node is configured to transmit information to the second network node includes disclosure that the second network node is configured to receive, obtain, or decode the information that is provided, sent, output, communicated, or transmitted by the first network node.

[0054] In some examples, network entities 105 may communicate with the core network 130, or with one another, or both. For example, network entities 105 may communicate with the core network 130 via one or more backhaul communication links 120 (e.g., in accordance with an S1, N2, N3, or other interface protocol). In some examples, network entities 105 may communicate with one another via a backhaul communication link 120 (e.g., in accordance with an X2, Xn, or other interface protocol) either directly (e.g., directly between network entities 105) or indirectly (e.g., via a core network 130). In some examples, network entities 105 may communicate with one another via a midhaul communication link 162 (e.g., in accordance with a midhaul interface protocol) or a fronthaul communication link 168 (e.g., in accordance with a fronthaul interface protocol), or any combination thereof. The backhaul communication links 120, midhaul communication links 162, or fronthaul communication links 168 may be or include one or more wired links (e.g., an electrical link, an optical fiber link), one or more wireless links (e.g., a radio link, a wireless optical link), among other examples or various combinations thereof. A UE 115 may communicate with the core network 130 via a communication link 155.

[0055] One or more of the network entities 105 described herein may include or may be referred to as a base station 140 (e.g., a base transceiver station, a radio base station, an NR base station, an access point, a radio transceiver, a NodeB, an eNodeB (eNB), a next-generation NodeB or a giga-NodeB (either of which may be referred to as a gNB), a 5G NB, a next-generation eNB (ng-eNB), a Home NodeB, a Home eNodeB, or other suitable terminology).
In some examples, a network entity 105 (e.g., a base station 140) may be implemented in an aggregated (e.g., monolithic, standalone) base station architecture, which may be configured to utilize a protocol stack that is physically or logically integrated within a single network entity 105 (e.g., a single RAN node, such as a base station 140). The network entities 105 may include a communications manager 102 configured to transmit and receive communications to and from a UE 115. In some examples, the communications manager 102 may be configured to transmit an assignment of a set of resources including resources allocated for a data signal and resources allocated for a reference signal. Additionally, or alternatively, the communications manager 102 may be configured to generate channel estimations based on an MMSE operation, a nonlinear two-dimensional interpolation of a channel, a refinement operation, or any combination thereof.

[0056] Techniques described herein, in addition to or as an alternative to being carried out between UEs 115 and network entities 105, may be implemented via additional or alternative wireless devices, including IAB nodes 104, distributed units (DUs) 165, centralized units (CUs) 160, radio units (RUs) 170, and the like. For example, in some implementations, aspects described herein may be implemented in the context of a disaggregated radio access network (RAN) architecture (e.g., open RAN architecture). In a disaggregated architecture, the RAN may be split into three areas of functionality corresponding to the CU 160, the DU 165, and the RU 170. The split of functionality between the CU 160, DU 165, and RU 170 is flexible and as such gives rise to numerous permutations of different functionalities depending upon which functions (e.g., MAC functions, baseband functions, radio frequency functions, and any combinations thereof) are performed at the CU 160, DU 165, and RU 170. For example, a functional split of the protocol stack may be employed between a DU 165 and an RU 170 such that the DU 165 may support one or more layers of the protocol stack and the RU 170 may support one or more different layers of the protocol stack.

[0057] In some wireless communications systems (e.g., wireless communications system 100), infrastructure and spectral resources for NR access may additionally support wireless backhaul link capabilities to supplement wireline backhaul connections, providing an IAB network architecture. One or more network entities 105 may include CUs 160, DUs 165, and RUs 170 and may be referred to as donor network entities 105 or IAB donors. One or more DUs 165 (e.g., and/or RUs 170) associated with a donor network entity 105 may be partially controlled by CUs 160 associated with the donor network entity 105.
The one or more donor network entities 105 (e.g., IAB donors) may be in communication with one or more additional network entities 105 (e.g., IAB nodes 104) via supported access and backhaul links. IAB nodes 104 may support mobile terminal (MT) functionality controlled and/or scheduled by DUs 165 of a coupled IAB donor. In addition, the IAB nodes 104 may include DUs 165 that support communication links with additional entities (e.g., IAB nodes 104, UEs 115, etc.) within the relay chain or configuration of the access network (e.g., downstream). In such cases, one or more components of the disaggregated RAN architecture (e.g., one or more IAB nodes 104 or components of IAB nodes 104) may be configured to operate according to the techniques described herein.

[0058] In some examples, the wireless communications system 100 may include a core network 130 (e.g., a next generation core network (NGC)), one or more IAB donors, IAB nodes 104, and UEs 115, where IAB nodes 104 may be partially controlled by each other and/or the IAB donor. The IAB donor and IAB nodes 104 may be examples of aspects of network entities 105. An IAB donor and one or more IAB nodes 104 may be configured as (e.g., or in communication according to) some relay chain.

[0059] For instance, an access network (AN) or RAN may refer to communications between access nodes (e.g., IAB donor), IAB nodes 104, and one or more UEs 115. The IAB donor may facilitate connection between the core network 130 and the AN (e.g., via a wireline or wireless connection to the core network 130). That is, an IAB donor may refer to a RAN node with a wireline or wireless connection to core network 130. The IAB donor may include a CU 160 and at least one DU 165 (e.g., and RU 170), where the CU 160 may communicate with the core network 130 over an NG interface (e.g., some backhaul link). The CU 160 may host layer 3 (L3) (e.g., Radio Resource Control (RRC), service data adaption protocol (SDAP), PDCP, etc.) functionality and signaling. The at least one DU 165 and/or RU 170 may host lower layers, such as layer 1 (L1) and layer 2 (L2) (e.g., RLC, MAC, physical (PHY), etc.) functionality and signaling, and may each be at least partially controlled by the CU 160. The DU 165 may support one or multiple different cells. IAB donor and IAB nodes 104 may communicate over an F1 interface according to some protocol that defines signaling messages (e.g., F1AP protocol). Additionally, CU 160 may communicate with the core network over an NG interface (which may be an example of a portion of backhaul link), and may communicate with other CUs 160 (e.g., a CU 160 associated with an alternative IAB donor) over an Xn-C interface (which may be an example of a portion of a backhaul link).

[0060] An IAB node 104 may refer to a RAN node that provides IAB functionality (e.g., access for UEs 115, wireless self-backhauling capabilities, etc.). An IAB node 104 may include a DU 165 and an MT. A DU 165 may act as a distributed scheduling node towards child nodes associated with the IAB node 104, and the MT may act as a scheduled node towards parent nodes associated with the IAB node 104. That is, an IAB donor may be referred to as a parent node in communication with one or more child nodes (e.g., an IAB donor may relay transmissions for UEs through one or more other IAB nodes 104). Additionally, an IAB node 104 may also be referred to as a parent node or a child node to other IAB nodes 104, depending on the relay chain or configuration of the AN. Therefore, the MT entity of IAB nodes 104 (e.g., MTs) may provide a Uu interface for a child node to receive signaling from a parent IAB node 104, and the DU interface (e.g., DUs 165) may provide a Uu interface for a parent node to signal to a child IAB node 104 or UE 115.

[0061] For example, an IAB node 104 may be referred to as a parent node associated with another IAB node 104, and as a child node associated with an IAB donor. The IAB donor may include a CU 160 with a wireline (e.g., optical fiber) or wireless connection to the core network and may act as parent node to IAB nodes 104. For example, the DU 165 of IAB donor may relay transmissions to UEs 115 through IAB nodes 104, and may directly signal transmissions to a UE 115. The CU 160 of IAB donor may signal communication link establishment via an F1 interface to IAB nodes 104, and the IAB nodes 104 may schedule transmissions (e.g., transmissions to the UEs 115 relayed from the IAB donor) through the DUs 165. That is, data may be relayed to and from IAB nodes 104 via signaling over an NR Uu interface to MT of the IAB node 104. Communications with one IAB node 104 may be scheduled by the DU 165 of the IAB donor, and communications with another IAB node 104 may be scheduled by the DU 165 of an intermediate IAB node 104.

[0062] In the case of the techniques described herein applied in the context of a disaggregated RAN architecture, one or more components of the disaggregated RAN architecture (e.g., one or more IAB nodes 104 or components of IAB nodes 104) may be configured to support recurrent equivariant inference machines for channel estimation as described herein. For example, some operations described as being performed by a UE 115 or a network entity 105 may additionally or alternatively be performed by components of the disaggregated RAN architecture (e.g., IAB nodes, DUs, CUs, etc.).

[0063] In some examples, a network entity 105 may be implemented in a disaggregated architecture (e.g., a disaggregated base station architecture, a disaggregated RAN architecture), which may be configured to utilize a protocol stack that is physically or logically distributed among two or more network entities 105, such as an IAB network, an open RAN (O-RAN) (e.g., a network configuration sponsored by the O-RAN Alliance), or a virtualized RAN (vRAN) (e.g., a cloud RAN (C-RAN)). For example, a network entity 105 may include one or more of a CU 160, a DU 165, an RU 170, a RAN Intelligent Controller (RIC) 175 (e.g., a Near-Real Time RIC (Near-RT RIC), a Non-Real Time RIC (Non-RT RIC)), a Service Management and Orchestration (SMO) 180 system, or any combination thereof. An RU 170 may also be referred to as a radio head, a smart radio head, a remote radio head (RRH), a remote radio unit (RRU), or a transmission reception point (TRP). One or more components of the network entities 105 in a disaggregated RAN architecture may be co-located, or one or more components of the network entities 105 may be located in distributed locations (e.g., separate physical locations). In some examples, one or more network entities 105 of a disaggregated RAN architecture may be implemented as virtual units (e.g., a virtual CU (VCU), a virtual DU (VDU), a virtual RU (VRU)).

[0064] The split of functionality between a CU 160, a DU 165, and an RU 170 is flexible and may support different functionalities depending on which functions (e.g., network layer functions, protocol layer functions, baseband functions, RF functions, and any combinations thereof) are performed at a CU 160, a DU 165, or an RU 170. For example, a functional split of a protocol stack may be employed between a CU 160 and a DU 165 such that the CU 160 may support one or more layers of the protocol stack and the DU 165 may support one or more different layers of the protocol stack. In some examples, the CU 160 may host upper protocol layer (e.g., layer 3 (L3), layer 2 (L2)) functionality and signaling (e.g., RRC, SDAP, PDCP). The CU 160 may be connected to one or more DUs 165 or RUs 170, and the one or more DUs 165 or RUs 170 may host lower protocol layers, such as layer 1 (L1) (e.g., PHY layer) or L2 (e.g., RLC layer, medium access control (MAC) layer) functionality and signaling, and may each be at least partially controlled by the CU 160. Additionally, or alternatively, a functional split of the protocol stack may be employed between a DU 165 and an RU 170 such that the DU 165 may support one or more layers of the protocol stack and the RU 170 may support one or more different layers of the protocol stack. The DU 165 may support one or multiple different cells (e.g., via one or more RUs 170). In some cases, a functional split between a CU 160 and a DU 165, or between a DU 165 and an RU 170 may be within a protocol layer (e.g., some functions for a protocol layer may be performed by one of a CU 160, a DU 165, or an RU 170, while other functions of the protocol layer are performed by a different one of the CU 160, the DU 165, or the RU 170). A CU 160 may be functionally split further into CU control plane (CU-CP) and CU user plane (CU-UP) functions.
A CU 160 may be connected to one or more DUs 165 via a midhaul communication link 162 (e.g., F1, F1-c, F1-u), and a DU 165 may be connected to one or more RUs 170 via a fronthaul communication link 168 (e.g., open fronthaul (FH) interface). In some examples, a midhaul communication link 162 or a fronthaul communication link 168 may be implemented in accordance with an interface (e.g., a channel) between layers of a protocol stack supported by respective network entities 105 that are in communication via such communication links.

[0065] In wireless communications systems (e.g., wireless communications system 100), infrastructure and spectral resources for radio access may support wireless backhaul link capabilities to supplement wired backhaul connections, providing an IAB network architecture (e.g., to a core network 130). In some cases, in an IAB network, one or more network entities 105 (e.g., IAB nodes 104) may be partially controlled by each other. One or more IAB nodes 104 may be referred to as a donor entity or an IAB donor. One or more DUs 165 or one or more RUs 170 may be partially controlled by one or more CUs 160 associated with a donor network entity 105 (e.g., a donor base station 140). The one or more donor network entities 105 (e.g., IAB donors) may be in communication with one or more additional network entities 105 (e.g., IAB nodes 104) via supported access and backhaul links (e.g., backhaul communication links 120). IAB nodes 104 may include an IAB mobile termination (IAB-MT) controlled (e.g., scheduled) by DUs 165 of a coupled IAB donor. An IAB-MT may include an independent set of antennas for relay of communications with UEs 115, or may share the same antennas (e.g., of an RU 170) of an IAB node 104 used for access via the DU 165 of the IAB node 104 (e.g., referred to as virtual IAB-MT (vIAB-MT)). In some examples, the IAB nodes 104 may include DUs 165 that support communication links with additional entities (e.g., IAB nodes 104, UEs 115) within the relay chain or configuration of the access network (e.g., downstream). In such cases, one or more components of the disaggregated RAN architecture (e.g., one or more IAB nodes 104 or components of IAB nodes 104) may be configured to operate according to the techniques described herein.

[0066] For instance, an access network (AN) or RAN may include communications between access nodes (e.g., an IAB donor), IAB nodes 104, and one or more UEs 115. The IAB donor may facilitate connection between the core network 130 and the AN (e.g., via a wired or wireless connection to the core network 130). That is, an IAB donor may refer to a RAN node with a wired or wireless connection to core network 130. The IAB donor may include a CU 160 and at least one DU 165 (e.g., and RU 170), in which case the CU 160 may communicate with the core network 130 via an interface (e.g., a backhaul link). IAB donor and IAB nodes 104 may communicate via an F1 interface according to a protocol that defines signaling messages (e.g., an F1AP protocol). Additionally, or alternatively, the CU 160 may communicate with the core network via an interface, which may be an example of a portion of backhaul link, and may communicate with other CUs 160 (e.g., a CU 160 associated with an alternative IAB donor) via an Xn-C interface, which may be an example of a portion of a backhaul link.

[0067] An IAB node 104 may refer to a RAN node that provides IAB functionality (e.g., access for UEs 115, wireless self-backhauling capabilities). A DU 165 may act as a distributed scheduling node towards child nodes associated with the IAB node 104, and the IAB-MT may act as a scheduled node towards parent nodes associated with the IAB node 104. That is, an IAB donor may be referred to as a parent node in communication with one or more child nodes (e.g., an IAB donor may relay transmissions for UEs through one or more other IAB nodes 104). Additionally, or alternatively, an IAB node 104 may also be referred to as a parent node or a child node to other IAB nodes 104, depending on the relay chain or configuration of the AN. Therefore, the IAB-MT entity of IAB nodes 104 may provide a Uu interface for a child IAB node 104 to receive signaling from a parent IAB node 104, and the DU interface (e.g., DUs 165) may provide a Uu interface for a parent IAB node 104 to signal to a child IAB node 104 or UE 115.

[0068] For example, IAB node 104 may be referred to as a parent node that supports communications for a child IAB node, or referred to as a child IAB node associated with an IAB donor, or both. The IAB donor may include a CU 160 with a wired or wireless connection (e.g., a backhaul communication link 120) to the core network 130 and may act as parent node to IAB nodes 104. For example, the DU 165 of IAB donor may relay transmissions to UEs 115 through IAB nodes 104, or may directly signal transmissions to a UE 115, or both. The CU 160 of IAB donor may signal communication link establishment via an F1 interface to IAB nodes 104, and the IAB nodes 104 may schedule transmissions (e.g., transmissions to the UEs 115 relayed from the IAB donor) through the DUs 165. That is, data may be relayed to and from IAB nodes 104 via signaling over an NR Uu interface to MT of the IAB node 104. Communications with one IAB node 104 may be scheduled by a DU 165 of the IAB donor, and communications with another IAB node 104 may be scheduled by the DU 165 of an intermediate IAB node 104.

[0069] In the case of the techniques described herein applied in the context of a disaggregated RAN architecture, one or more components of the disaggregated RAN architecture may be configured to support recurrent equivariant inference machines for channel estimation as described herein. For example, some operations described as being performed by a UE 115 or a network entity 105 (e.g., a base station 140) may additionally, or alternatively, be performed by one or more components of the disaggregated RAN architecture (e.g., IAB nodes 104, DUs 165, CUs 160, RUs 170, RIC 175, SMO 180).

[0070] A UE 115 may include or may be referred to as a mobile device, a wireless device, a remote device, a handheld device, or a subscriber device, or some other suitable terminology, where the “device” may also be referred to as a unit, a station, a terminal, or a client, among other examples. A UE 115 may also include or may be referred to as a personal electronic device such as a cellular phone, a personal digital assistant (PDA), a tablet computer, a laptop computer, or a personal computer. In some examples, a UE 115 may include or be referred to as a wireless local loop (WLL) station, an Internet of Things (IoT) device, an Internet of Everything (IoE) device, or a machine type communications (MTC) device, among other examples, which may be implemented in various objects such as appliances, vehicles, or meters, among other examples.

[0071] The UEs 115 described herein may be able to communicate with various types of devices, such as other UEs 115 that may sometimes act as relays as well as the network entities 105 and the network equipment including macro eNBs or gNBs, small cell eNBs or gNBs, or relay base stations, among other examples, as shown in FIG. 1.

[0072] The UEs 115 and the network entities 105 may wirelessly communicate with one another via one or more communication links 125 (e.g., an access link) using resources associated with one or more carriers. The term “carrier” may refer to a set of RF spectrum resources having a defined physical layer structure for supporting the communication links 125. For example, a carrier used for a communication link 125 may include a portion of an RF spectrum band (e.g., a bandwidth part (BWP)) that is operated according to one or more physical layer channels for a given radio access technology (e.g., LTE, LTE-A, LTE-A Pro, NR). Each physical layer channel may carry acquisition signaling (e.g., synchronization signals, system information), control signaling that coordinates operation for the carrier, user data, or other signaling. The wireless communications system 100 may support communication with a UE 115 using carrier aggregation or multi-carrier operation. A UE 115 may be configured with multiple downlink component carriers and one or more uplink component carriers according to a carrier aggregation configuration. Carrier aggregation may be used with both frequency division duplexing (FDD) and time division duplexing (TDD) component carriers. Communication between a network entity 105 and other devices may refer to communication between the devices and any portion (e.g., entity, subentity) of a network entity 105. For example, the terms “transmitting,” “receiving,” or “communicating,” when referring to a network entity 105, may refer to any portion of a network entity 105 (e.g., a base station 140, a CU 160, a DU 165, a RU 170) of a RAN communicating with another device (e.g., directly or via one or more other network entities 105).

[0073] Signal waveforms transmitted via a carrier may be made up of multiple subcarriers (e.g., using multi-carrier modulation (MCM) techniques such as orthogonal frequency division multiplexing (OFDM) or discrete Fourier transform spread OFDM (DFT-S-OFDM)). In a system employing MCM techniques, a resource element may refer to resources of one symbol period (e.g., a duration of one modulation symbol) and one subcarrier, in which case the symbol period and subcarrier spacing may be inversely related. The quantity of bits carried by each resource element may depend on the modulation scheme (e.g., the order of the modulation scheme, the coding rate of the modulation scheme, or both), such that a relatively higher quantity of resource elements (e.g., in a transmission duration) and a relatively higher order of a modulation scheme may correspond to a relatively higher rate of communication. A wireless communications resource may refer to a combination of an RF spectrum resource, a time resource, and a spatial resource (e.g., a spatial layer, a beam), and the use of multiple spatial resources may increase the data rate or data integrity for communications with a UE 115.
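The relationship stated above between resource elements, modulation order, and communication rate can be sketched as a simple calculation. The function name and the linear bits-per-resource-element model are illustrative assumptions, not part of this disclosure.

```python
def data_bits(n_resource_elements: int, bits_per_symbol: int, coding_rate: float) -> float:
    """Approximate information bits carried: REs x modulation bits per symbol x code rate."""
    return n_resource_elements * bits_per_symbol * coding_rate

# A relatively higher quantity of resource elements and a relatively higher-order
# modulation scheme correspond to a relatively higher rate of communication.
qpsk = data_bits(1000, 2, 0.5)    # QPSK (2 bits/symbol), rate 1/2
qam64 = data_bits(1500, 6, 0.75)  # 64-QAM (6 bits/symbol), rate 3/4, more REs
assert qam64 > qpsk
```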

[0074] The time intervals for the network entities 105 or the UEs 115 may be expressed in multiples of a basic time unit which may, for example, refer to a sampling period of T_s = 1/(Δf_max · N_f) seconds, for which Δf_max may represent a maximum supported subcarrier spacing, and N_f may represent a supported discrete Fourier transform (DFT) size. Time intervals of a communications resource may be organized according to radio frames each having a specified duration (e.g., 10 milliseconds (ms)). Each radio frame may be identified by a system frame number (SFN) (e.g., ranging from 0 to 1023).
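As a worked example of the basic time unit relation above, the following sketch uses Δf_max = 480 kHz and N_f = 4096 as assumed illustrative values; the 10 ms radio frame then spans a whole number of basic time units.

```python
delta_f_max = 480e3  # assumed maximum supported subcarrier spacing, in Hz (illustrative)
N_f = 4096           # assumed supported DFT size (illustrative)

# Basic time unit: T_s = 1 / (delta_f_max * N_f), in seconds.
T_s = 1.0 / (delta_f_max * N_f)

# A 10 ms radio frame expressed in multiples of the basic time unit.
frame_duration_s = 10e-3
units_per_frame = frame_duration_s / T_s
```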

[0075] Each frame may include multiple consecutively-numbered subframes or slots, and each subframe or slot may have the same duration. In some examples, a frame may be divided (e.g., in the time domain) into subframes, and each subframe may be further divided into a quantity of slots. Alternatively, each frame may include a variable quantity of slots, and the quantity of slots may depend on subcarrier spacing. Each slot may include a quantity of symbol periods (e.g., depending on the length of the cyclic prefix prepended to each symbol period). In some wireless communications systems 100, a slot may further be divided into multiple mini-slots associated with one or more symbols. Excluding the cyclic prefix, each symbol period may be associated with one or more (e.g., N_f) sampling periods. The duration of a symbol period may depend on the subcarrier spacing or frequency band of operation.
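The dependence of slot quantity on subcarrier spacing noted above can be sketched as follows, assuming an illustrative numerology in which subcarrier spacing scales as 15 kHz × 2^μ and a 15 kHz slot lasts 1 ms; these values are assumptions for illustration, not definitions from this disclosure.

```python
def slots_per_frame(scs_khz: int) -> int:
    """Slots in a 10 ms frame when slot duration scales inversely with subcarrier spacing."""
    mu = {15: 0, 30: 1, 60: 2, 120: 3}[scs_khz]  # assumed numerology index
    return 10 * (2 ** mu)  # 10 subframes per frame, 2**mu slots per subframe

assert slots_per_frame(15) == 10    # wider subcarrier spacing -> shorter slots,
assert slots_per_frame(120) == 80   # hence more slots in the same 10 ms frame
```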

[0076] A subframe, a slot, a mini-slot, or a symbol may be the smallest scheduling unit (e.g., in the time domain) of the wireless communications system 100 and may be referred to as a transmission time interval (TTI). In some examples, the TTI duration (e.g., a quantity of symbol periods in a TTI) may be variable. Additionally, or alternatively, the smallest scheduling unit of the wireless communications system 100 may be dynamically selected (e.g., in bursts of shortened TTIs (sTTIs)).

[0077] Physical channels may be multiplexed for communication using a carrier according to various techniques. A physical control channel and a physical data channel may be multiplexed for signaling via a downlink carrier, for example, using one or more of time division multiplexing (TDM) techniques, frequency division multiplexing (FDM) techniques, or hybrid TDM-FDM techniques. A control region (e.g., a control resource set (CORESET)) for a physical control channel may be defined by a set of symbol periods and may extend across the system bandwidth or a subset of the system bandwidth of the carrier. One or more control regions (e.g., CORESETs) may be configured for a set of the UEs 115. For example, one or more of the UEs 115 may monitor or search control regions for control information according to one or more search space sets, and each search space set may include one or multiple control channel candidates in one or more aggregation levels arranged in a cascaded manner. An aggregation level for a control channel candidate may refer to an amount of control channel resources (e.g., control channel elements (CCEs)) associated with encoded information for a control information format having a given payload size. Search space sets may include common search space sets configured for sending control information to multiple UEs 115 and UE-specific search space sets for sending control information to a specific UE 115.

[0078] In some examples, a network entity 105 (e.g., a base station 140, an RU 170) may be movable and therefore provide communication coverage for a moving coverage area 110. In some examples, different coverage areas 110 associated with different technologies may overlap, but the different coverage areas 110 may be supported by the same network entity 105. In some other examples, the overlapping coverage areas 110 associated with different technologies may be supported by different network entities 105. The wireless communications system 100 may include, for example, a heterogeneous network in which different types of the network entities 105 provide coverage for various coverage areas 110 using the same or different radio access technologies.

[0079] The wireless communications system 100 may be configured to support ultra-reliable communications or low-latency communications, or various combinations thereof. For example, the wireless communications system 100 may be configured to support ultra-reliable low-latency communications (URLLC). The UEs 115 may be designed to support ultra-reliable, low-latency, or critical functions. Ultra-reliable communications may include private communication or group communication and may be supported by one or more services such as push-to-talk, video, or data. Support for ultra-reliable, low-latency functions may include prioritization of services, and such services may be used for public safety or general commercial applications. The terms ultra-reliable, low-latency, and ultra-reliable low-latency may be used interchangeably herein.

[0080] In some examples, a UE 115 may be configured to support communicating directly with other UEs 115 via a device-to-device (D2D) communication link 135 (e.g., in accordance with a peer-to-peer (P2P), D2D, or sidelink protocol). In some examples, one or more UEs 115 of a group that are performing D2D communications may be within the coverage area 110 of a network entity 105 (e.g., a base station 140, an RU 170), which may support aspects of such D2D communications being configured by (e.g., scheduled by) the network entity 105. In some examples, one or more UEs 115 of such a group may be outside the coverage area 110 of a network entity 105 or may be otherwise unable to or not configured to receive transmissions from a network entity 105. In some examples, groups of the UEs 115 communicating via D2D communications may support a one-to-many (1:M) system in which each UE 115 transmits to each of the other UEs 115 in the group. In some examples, a network entity 105 may facilitate the scheduling of resources for D2D communications. In some other examples, D2D communications may be carried out between the UEs 115 without an involvement of a network entity 105.

[0081] The core network 130 may provide user authentication, access authorization, tracking, Internet Protocol (IP) connectivity, and other access, routing, or mobility functions. The core network 130 may be an evolved packet core (EPC) or 5G core (5GC), which may include at least one control plane entity that manages access and mobility (e.g., a mobility management entity (MME), an access and mobility management function (AMF)) and at least one user plane entity that routes packets or interconnects to external networks (e.g., a serving gateway (S-GW), a Packet Data Network (PDN) gateway (P-GW), or a user plane function (UPF)). The control plane entity may manage non-access stratum (NAS) functions such as mobility, authentication, and bearer management for the UEs 115 served by the network entities 105 (e.g., base stations 140) associated with the core network 130. User IP packets may be transferred through the user plane entity, which may provide IP address allocation as well as other functions. The user plane entity may be connected to IP services 150 for one or more network operators. The IP services 150 may include access to the Internet, Intranet(s), an IP Multimedia Subsystem (IMS), or a Packet-Switched Streaming Service.

[0082] The wireless communications system 100 may operate using one or more frequency bands, which may be in the range of 300 megahertz (MHz) to 300 gigahertz (GHz). Generally, the region from 300 MHz to 3 GHz is known as the ultra-high frequency (UHF) region or decimeter band because the wavelengths range from approximately one decimeter to one meter in length. UHF waves may be blocked or redirected by buildings and environmental features, which may be referred to as clusters, but the waves may penetrate structures sufficiently for a macro cell to provide service to the UEs 115 located indoors. Communications using UHF waves may be associated with smaller antennas and shorter ranges (e.g., less than 100 kilometers) compared to communications using the lower frequencies and longer waves of the high frequency (HF) or very high frequency (VHF) portion of the spectrum below 300 MHz.
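The decimeter-to-meter wavelength range stated above follows directly from λ = c / f. A quick check (variable and function names are illustrative):

```python
C_M_PER_S = 299_792_458.0  # speed of light in vacuum, m/s

def wavelength_m(freq_hz: float) -> float:
    """Free-space wavelength for a given carrier frequency."""
    return C_M_PER_S / freq_hz

# Endpoints of the 300 MHz - 3 GHz UHF region: roughly 1 m down to 1 dm.
assert abs(wavelength_m(300e6) - 1.0) < 0.01
assert abs(wavelength_m(3e9) - 0.1) < 0.001
```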

[0083] The wireless communications system 100 may utilize both licensed and unlicensed RF spectrum bands. For example, the wireless communications system 100 may employ License Assisted Access (LAA), LTE-Unlicensed (LTE-U) radio access technology, or NR technology using an unlicensed band such as the 5 GHz industrial, scientific, and medical (ISM) band. While operating using unlicensed RF spectrum bands, devices such as the network entities 105 and the UEs 115 may employ carrier sensing for collision detection and avoidance. In some examples, operations using unlicensed bands may be based on a carrier aggregation configuration in conjunction with component carriers operating using a licensed band (e.g., LAA). Operations using unlicensed spectrum may include downlink transmissions, uplink transmissions, P2P transmissions, or D2D transmissions, among other examples.

[0084] The electromagnetic spectrum is often subdivided, based on frequency/wavelength, into various classes, bands, channels, etc. In 5G NR, two initial operating bands have been identified as frequency range designations FR1 (410 MHz - 7.125 GHz) and FR2 (24.25 GHz - 52.6 GHz). It should be understood that although a portion of FR1 is greater than 6 GHz, FR1 is often referred to (interchangeably) as a “Sub-6 GHz” band in various documents and articles. A similar nomenclature issue sometimes occurs with regard to FR2, which is often referred to (interchangeably) as a “millimeter wave” band in documents and articles, despite being different from the extremely high frequency (EHF) band (30 GHz - 300 GHz) which is identified by the International Telecommunications Union (ITU) as a “millimeter wave” band.

[0085] The frequencies between FR1 and FR2 are often referred to as mid-band frequencies. Recent 5G NR studies have identified an operating band for these midband frequencies as frequency range designation FR3 (7.125 GHz - 24.25 GHz). Frequency bands falling within FR3 may inherit FR1 characteristics and/or FR2 characteristics, and thus may effectively extend features of FR1 and/or FR2 into midband frequencies. In addition, higher frequency bands are currently being explored to extend 5G NR operation beyond 52.6 GHz. For example, three higher operating bands have been identified as frequency range designations FR4a or FR4-1 (52.6 GHz - 71 GHz), FR4 (52.6 GHz - 114.25 GHz), and FR5 (114.25 GHz - 300 GHz). Each of these higher frequency bands falls within the EHF band.

[0086] With the above aspects in mind, unless specifically stated otherwise, it should be understood that the term “sub-6 GHz” or the like if used herein may broadly represent frequencies that may be less than 6 GHz, may be within FR1, or may include mid-band frequencies. Further, unless specifically stated otherwise, it should be understood that the term “millimeter wave” or the like if used herein may broadly represent frequencies that may include mid-band frequencies, may be within FR2, FR4, FR4-a or FR4-1, and/or FR5, or may be within the EHF band.

[0087] A network entity 105 (e.g., a base station 140, an RU 170) or a UE 115 may be equipped with multiple antennas, which may be used to employ techniques such as transmit diversity, receive diversity, MIMO communications, or beamforming. The antennas of a network entity 105 or a UE 115 may be located within one or more antenna arrays or antenna panels, which may support MIMO operations or transmit or receive beamforming. For example, one or more base station antennas or antenna arrays may be co-located at an antenna assembly, such as an antenna tower. In some examples, antennas or antenna arrays associated with a network entity 105 may be located at diverse geographic locations. A network entity 105 may include an antenna array with a set of rows and columns of antenna ports that the network entity 105 may use to support beamforming of communications with a UE 115. Likewise, a UE 115 may include one or more antenna arrays that may support various MIMO or beamforming operations. Additionally, or alternatively, an antenna panel may support RF beamforming for a signal transmitted via an antenna port.

[0088] The network entities 105 or the UEs 115 may use MIMO communications to exploit multipath signal propagation and increase spectral efficiency by transmitting or receiving multiple signals via different spatial layers. Such techniques may be referred to as spatial multiplexing. The multiple signals may, for example, be transmitted by the transmitting device via different antennas or different combinations of antennas. Likewise, the multiple signals may be received by the receiving device via different antennas or different combinations of antennas. Each of the multiple signals may be referred to as a separate spatial stream and may carry information associated with the same data stream (e.g., the same codeword) or different data streams (e.g., different codewords). Different spatial layers may be associated with different antenna ports used for channel measurement and reporting. MIMO techniques include single-user MIMO (SU-MIMO), for which multiple spatial layers are transmitted to the same receiving device, and multiple-user MIMO (MU-MIMO), for which multiple spatial layers are transmitted to multiple devices.

[0089] Beamforming, which may also be referred to as spatial filtering, directional transmission, or directional reception, is a signal processing technique that may be used at a transmitting device or a receiving device (e.g., a network entity 105, a UE 115) to shape or steer an antenna beam (e.g., a transmit beam, a receive beam) along a spatial path between the transmitting device and the receiving device. Beamforming may be achieved by combining the signals communicated via antenna elements of an antenna array such that some signals propagating along particular orientations with respect to an antenna array experience constructive interference while others experience destructive interference.
The adjustment of signals communicated via the antenna elements may include a transmitting device or a receiving device applying amplitude offsets, phase offsets, or both to signals carried via the antenna elements associated with the device. The adjustments associated with each of the antenna elements may be defined by a beamforming weight set associated with a particular orientation (e.g., with respect to the antenna array of the transmitting device or receiving device, or with respect to some other orientation).

[0090] A network entity 105 or a UE 115 may use beam sweeping techniques as part of beamforming operations. For example, a network entity 105 (e.g., a base station 140, an RU 170) may use multiple antennas or antenna arrays (e.g., antenna panels) to conduct beamforming operations for directional communications with a UE 115. Some signals (e.g., synchronization signals, reference signals, beam selection signals, or other control signals) may be transmitted by a network entity 105 multiple times along different directions. For example, the network entity 105 may transmit a signal according to different beamforming weight sets associated with different directions of transmission. Transmissions along different beam directions may be used to identify (e.g., by a transmitting device, such as a network entity 105, or by a receiving device, such as a UE 115) a beam direction for later transmission or reception by the network entity 105.

[0091] Some signals, such as data signals associated with a particular receiving device, may be transmitted by a transmitting device (e.g., a transmitting network entity 105, a transmitting UE 115) along a single beam direction (e.g., a direction associated with the receiving device, such as a receiving network entity 105 or a receiving UE 115). In some examples, the beam direction associated with transmissions along a single beam direction may be determined based on a signal that was transmitted along one or more beam directions. For example, a UE 115 may receive one or more of the signals transmitted by the network entity 105 along different directions and may report to the network entity 105 an indication of the signal that the UE 115 received with a highest signal quality or an otherwise acceptable signal quality.

[0092] In some examples, transmissions by a device (e.g., by a network entity 105 or a UE 115) may be performed using multiple beam directions, and the device may use a combination of digital precoding or beamforming to generate a combined beam for transmission (e.g., from a network entity 105 to a UE 115). The UE 115 may report feedback that indicates precoding weights for one or more beam directions, and the feedback may correspond to a configured set of beams across a system bandwidth or one or more sub-bands. The network entity 105 may transmit a reference signal (e.g., a cell-specific reference signal (CRS), a channel state information reference signal (CSI-RS)), which may be precoded or unprecoded. The UE 115 may provide feedback for beam selection, which may be a precoding matrix indicator (PMI) or codebook-based feedback (e.g., a multi-panel type codebook, a linear combination type codebook, a port selection type codebook). Although these techniques are described with reference to signals transmitted along one or more directions by a network entity 105 (e.g., a base station 140, an RU 170), a UE 115 may employ similar techniques for transmitting signals multiple times along different directions (e.g., for identifying a beam direction for subsequent transmission or reception by the UE 115) or for transmitting a signal along a single direction (e.g., for transmitting data to a receiving device).

[0093] A receiving device (e.g., a UE 115) may perform reception operations in accordance with multiple receive configurations (e.g., directional listening) when receiving various signals from a transmitting device (e.g., a network entity 105), such as synchronization signals, reference signals, beam selection signals, or other control signals. For example, a receiving device may perform reception in accordance with multiple receive directions by receiving via different antenna subarrays, by processing received signals according to different antenna subarrays, by receiving according to different receive beamforming weight sets (e.g., different directional listening weight sets) applied to signals received at multiple antenna elements of an antenna array, or by processing received signals according to different receive beamforming weight sets applied to signals received at multiple antenna elements of an antenna array, any of which may be referred to as “listening” according to different receive configurations or receive directions. In some examples, a receiving device may use a single receive configuration to receive along a single beam direction (e.g., when receiving a data signal). The single receive configuration may be aligned along a beam direction determined based on listening according to different receive configuration directions (e.g., a beam direction determined to have a highest signal strength, highest SNR, or otherwise acceptable signal quality based on listening according to multiple beam directions).

[0094] The wireless communications system 100 may be a packet-based network that operates according to a layered protocol stack. In the user plane, communications at the bearer or PDCP layer may be IP-based. An RLC layer may perform packet segmentation and reassembly to communicate via logical channels. A MAC layer may perform priority handling and multiplexing of logical channels into transport channels. The MAC layer also may implement error detection techniques, error correction techniques, or both to support retransmissions to improve link efficiency. In the control plane, an RRC layer may provide establishment, configuration, and maintenance of an RRC connection between a UE 115 and a network entity 105 or a core network 130 supporting radio bearers for user plane data. A PHY layer may map transport channels to physical channels.

[0095] In some cases, the wireless communications system 100 may support SISO communication, MIMO communication, or both. For example, SISO communication may include a communication between a single transmitter and a single receiver, whereas MIMO communication may include a communication between multiple transmitters and multiple receivers. In some cases, MIMO communications may include multiple SISO communications via a transmitter and receiver pair (e.g., a transmitting antenna and a receiving antenna pair). For example, a network entity 105 may include a first and a second transmitting antenna and a UE 115 may include a first and a second receiving antenna. The antenna pairs may include the first transmitting antenna with the first receiving antenna, the first transmitting antenna with the second receiving antenna, the second transmitting antenna with the first receiving antenna, and the second transmitting antenna with the second receiving antenna. Each antenna pair may modify a signal utilizing a beamforming matrix (e.g., a precoding matrix, an orthogonal matrix) prior to transmission of the signal to minimize interference (e.g., linear precoders, or beamformers, may create beams that focus energy for each receive antenna by weighting the phase and magnitude of the transmission antennas). Because the beamforming matrix may be unknown to the receiver (e.g., the UE 115, the network entity 105), the receiver may estimate the pre-coded channel.
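
As an illustration of the antenna-pair view above, a 2x2 MIMO link and a precoded variant can be sketched numerically. This is a toy example, not part of the described techniques; the precoder W below is a hypothetical random unitary matrix standing in for an unknown beamforming matrix.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy 2x2 MIMO link: each (transmit, receive) antenna pair has its own
# SISO coefficient, collected in the channel matrix H.
H = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
x = np.array([1 + 0j, -1 + 0j])   # one symbol per transmit antenna
n = 0.01 * (rng.standard_normal(2) + 1j * rng.standard_normal(2))
y = H @ x + n                     # one received sample per receive antenna

# With a precoder W applied at the transmitter (unknown to the receiver),
# the receiver effectively sees the pre-coded channel H @ W, and therefore
# estimates H @ W rather than H itself.
W, _ = np.linalg.qr(rng.standard_normal((2, 2))
                    + 1j * rng.standard_normal((2, 2)))
y_precoded = (H @ W) @ x + n
```

Because W mixes the transmit antennas, the pre-coded channel differs from the physical channel, which motivates estimating the pre-coded channel directly as stated above.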

[0096] In some examples, the wireless communications system 100 may support channel estimations. For example, the channel estimation may be for resource grid based (e.g., slot based) wireless MIMO-OFDM systems. The channel estimation may be a 5G NR channel estimation with varying DMRS patterns, quantity of resource blocks, and the like. The channel estimation may be for super-resolution or signal recovery based on sparse observations.

[0097] In MIMO communication, multiple layers of information (e.g., communication) may interfere with each other, making the channel estimations more complex. In some cases, orthogonal cover codes may be used to remove the interference with a de-spreading step. De-spreading steps, however, may be inadequate for frequency selective or fast fading channels (e.g., high delay spread and high Doppler response). Additionally, narrowband MIMO communication (e.g., communication using relatively small chunks of bandwidth parts) may utilize different precoding matrices for each group of resources (e.g., physical resource groups (PRGs)) that may be unknown to a UE 115. The unknown precoding matrices may add a higher complexity to channel estimation, such that some wireless communications systems may not use correlation between non-contiguous PRGs in the channel estimation. For example, some estimation techniques (e.g., MMSE) may utilize a TRS, or another continuous reference signal, to calculate channel characteristics (e.g., Doppler, delay spread, SNR, and the like) and estimate the channel resources based on the channel characteristics.

[0098] In some cases, some estimation techniques may include least squares and linear MMSE (LMMSE). Least squares estimation may not use information about channel statistics or noise variance and may not model correlations across different PRGs and MIMO layers, thus making it relatively simple to implement with low computational expenses. However, the estimation accuracy may be inadequate for most use cases (e.g., practical applications). LMMSE may utilize second-order channel statistics and noise variance (e.g., binning based strategies based on estimated channel parameters, such as Doppler, delay spread, and the like). LMMSE may have a high computational expense and a relatively low estimation error under some conditions. However, LMMSE may not model correlations across different PRGs and MIMO layers.
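
The contrast between the two estimators above can be illustrated for a single layer. The following sketch is not part of the described techniques: the unit-power channel taps and known noise variance are simplifying assumptions, and the per-tone shrinkage shown is a special case of the full second-order-statistics LMMSE.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-layer setup: known pilots x, unknown channel h, noise n.
n_tones = 64
x = np.exp(1j * 2 * np.pi * rng.integers(0, 4, n_tones) / 4)  # QPSK pilots
h_true = (rng.standard_normal(n_tones)
          + 1j * rng.standard_normal(n_tones)) / np.sqrt(2)   # unit-power taps
noise_var = 0.1
n = np.sqrt(noise_var / 2) * (rng.standard_normal(n_tones)
                              + 1j * rng.standard_normal(n_tones))
y = h_true * x + n

# Least squares: divide out the known pilot; no channel statistics used,
# so it is simple but models no correlation across PRGs or layers.
h_ls = y / x

# LMMSE (special case): with unit-power uncorrelated taps and known noise
# variance, the estimator reduces to per-tone shrinkage of the LS estimate.
snr = 1.0 / noise_var
h_lmmse = (snr / (snr + 1.0)) * h_ls

mse_ls = np.mean(np.abs(h_ls - h_true) ** 2)        # roughly the noise variance
mse_lmmse = np.mean(np.abs(h_lmmse - h_true) ** 2)  # typically slightly lower
```

In this simplified form LMMSE only rescales the LS estimate; the binning-based variants described above additionally condition on estimated parameters such as Doppler and delay spread.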

[0099] In some cases, some estimation techniques may include deep learning based techniques. The deep learning based techniques may not utilize explicit information of channel statistics and may utilize nonlinear interpolation. Unlike LMMSE, the deep learning techniques may not utilize big matrix inversion operations. The deep learning techniques may utilize a separate network for every DMRS pattern. In some cases, the deep learning techniques may not consider correlation across different PRGs.

[0100] The techniques described herein provide for calculating channel estimations using recurrent equivariant inference machines, which may result in a channel estimation based on DMRS symbols that utilizes the correlation between non-contiguous PRGs (e.g., without knowledge of precoders). In some cases, a wireless communications system 100 (e.g., OFDM systems) may deploy pilot-based channel estimation techniques for obtaining CSI with relative accuracy (e.g., obtaining accurate CSI may help maintain high data throughput, for example, in a fast fading environment). The pilot symbols may be referred to as DMRS symbols. The DMRS symbols may be inserted in transmission slots according to a known DMRS pattern, thus allowing wireless devices to estimate the unknown resources (e.g., non-DMRS locations, resources allocated for a data signal) of the channel based on the known resources (e.g., DMRS symbols). In some cases, the DMRS patterns may be preconfigured (e.g., a fixed set of possible DMRS patterns). A DMRS pattern may be used based on the channel characteristics.

[0101] In some examples, a wireless device may receive an assignment of a set of resources associated with a channel where the set of resources includes a first subset of resources allocated for data transmission (e.g., non-DMRS) and a second subset of resources allocated for a reference signal (e.g., DMRS). The wireless device may generate multiple channel estimations per layer of the channel (e.g., SISO channel estimations) and perform a refinement operation utilizing the estimations to generate a channel estimation associated with multiple layers (e.g., a MIMO channel estimation). In some cases, the refinement operation may include multiple iterations. For example, each iteration may include generating respective gradients associated with each per layer channel estimation based on the per layer channel estimations and observed resources of the channel (e.g., the known DMRS resources); generating a current set of values of a latent variable (e.g., an inferred variable based on observed variables) based on a previous set of values of the latent variable, the respective gradients, and the per layer channel estimations; and modifying (e.g., refining, updating, improving) the channel estimations based on the current set of values of the latent variable, the per layer channel estimations, and the respective gradients. In some cases, the refinement operation may be performed by a refinement network that includes a likelihood module, an encoder module, and a decoder module.
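
The iterative refinement described above can be sketched as a loop over (1) a likelihood/gradient step at the observed DMRS tones, (2) a latent-state update, and (3) an estimate update. The following skeleton is a hypothetical illustration only: the encoder and decoder here are fixed arithmetic placeholders, whereas the described refinement network would use learned modules.

```python
import numpy as np

def refinement_loop(h_layers, y_obs, dmrs_mask, x_pilot, n_iters=3):
    """Hypothetical skeleton of the iterative refinement described above.

    h_layers: list of per-layer channel estimates (2-D complex grids).
    y_obs: observed resource grid; dmrs_mask: True at DMRS resource elements.
    The encoder/decoder updates below are fixed arithmetic placeholders
    standing in for the learned modules of the refinement network.
    """
    latent = [np.zeros_like(h) for h in h_layers]  # latent variable per layer
    for _ in range(n_iters):
        # 1) Likelihood module: gradient of the data-fit term, nonzero only
        #    at observed (DMRS) resource elements.
        grads = [dmrs_mask * np.conj(x_pilot) * (h * x_pilot - y_obs)
                 for h in h_layers]
        # 2) Encoder (placeholder): current latent values from the previous
        #    latent values, the gradients, and the current estimates.
        latent = [0.9 * z + 0.1 * (g + h)
                  for z, g, h in zip(latent, grads, h_layers)]
        # 3) Decoder (placeholder): refine each per-layer estimate.
        h_layers = [h - 0.5 * g + 0.01 * z
                    for h, g, z in zip(h_layers, grads, latent)]
    return h_layers
```

Each iteration pulls the per-layer estimates toward agreement with the observed DMRS tones while the latent state accumulates context across iterations.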

[0102] In some cases, the wireless communications system 100 may incorporate end-to-end (E2E) use of neural networks for channel state feedback. Such a neural network structure may be used for channel state information feedback (CSF) by providing an intermediate channel representation to the wireless communications network, and a wireless device (e.g., a receiving wireless device, a network entity 105) may reconstruct the channel (e.g., a split implementation of the proposed method at the UE side and the network side), such that this type of model architecture may require specification to some degree for interoperability.

[0103] FIG. 2 shows an example of a network architecture 200 (e.g., a disaggregated base station architecture, a disaggregated RAN architecture) that supports recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure. The network architecture 200 may illustrate an example for implementing one or more aspects of the wireless communications system 100. The network architecture 200 may include one or more CUs 160-a that may communicate directly with a core network 130-a via a backhaul communication link 120-a, or indirectly with the core network 130-a through one or more disaggregated network entities 105 (e.g., a Near-RT RIC 175-b via an E2 link, or a Non-RT RIC 175-a associated with an SMO 180-a (e.g., an SMO Framework), or both). A CU 160-a may communicate with one or more DUs 165-a via respective midhaul communication links 162-a (e.g., an F1 interface). The DUs 165-a may communicate with one or more RUs 170-a via respective fronthaul communication links 168-a. The RUs 170-a may be associated with respective coverage areas 110-a and may communicate with UEs 115-a via one or more communication links 125-a. In some implementations, a UE 115-a may be simultaneously served by multiple RUs 170-a.

[0104] Each of the network entities 105 of the network architecture 200 (e.g., CUs 160-a, DUs 165-a, RUs 170-a, Non-RT RICs 175-a, Near-RT RICs 175-b, SMOs 180-a, Open Clouds (O-Clouds) 205, Open eNBs (O-eNBs) 210) may include one or more interfaces or may be coupled with one or more interfaces configured to receive or transmit signals (e.g., data, information) via a wired or wireless transmission medium. Each network entity 105, or an associated processor (e.g., controller) providing instructions to an interface of the network entity 105, may be configured to communicate with one or more of the other network entities 105 via the transmission medium. For example, the network entities 105 may include a wired interface configured to receive or transmit signals over a wired transmission medium to one or more of the other network entities 105. Additionally, or alternatively, the network entities 105 may include a wireless interface, which may include a receiver, a transmitter, or transceiver (e.g., an RF transceiver) configured to receive or transmit signals, or both, over a wireless transmission medium to one or more of the other network entities 105.

[0105] In some examples, a CU 160-a may host one or more higher layer control functions. Such control functions may include RRC, PDCP, SDAP, or the like. Each control function may be implemented with an interface configured to communicate signals with other control functions hosted by the CU 160-a. A CU 160-a may be configured to handle user plane functionality (e.g., CU-UP), control plane functionality (e.g., CU-CP), or a combination thereof. In some examples, a CU 160-a may be logically split into one or more CU-UP units and one or more CU-CP units. A CU-UP unit may communicate bidirectionally with the CU-CP unit via an interface, such as an E1 interface when implemented in an O-RAN configuration. A CU 160-a may be implemented to communicate with a DU 165-a, as necessary, for network control and signaling.

[0106] A DU 165-a may correspond to a logical unit that includes one or more functions (e.g., base station functions, RAN functions) to control the operation of one or more RUs 170-a. In some examples, a DU 165-a may host, at least partially, one or more of an RLC layer, a MAC layer, and one or more aspects of a PHY layer (e.g., a high PHY layer, such as modules for FEC encoding and decoding, scrambling, modulation and demodulation, or the like) depending, at least in part, on a functional split, such as those defined by the 3rd Generation Partnership Project (3GPP). In some examples, a DU 165-a may further host one or more low PHY layers. Each layer may be implemented with an interface configured to communicate signals with other layers hosted by the DU 165-a, or with control functions hosted by a CU 160-a.

[0107] In some examples, lower-layer functionality may be implemented by one or more RUs 170-a. For example, an RU 170-a, controlled by a DU 165-a, may correspond to a logical node that hosts RF processing functions, or low-PHY layer functions (e.g., performing fast Fourier transform (FFT), inverse FFT (iFFT), digital beamforming, physical random access channel (PRACH) extraction and filtering, or the like), or both, based at least in part on the functional split, such as a lower-layer functional split. In such an architecture, an RU 170-a may be implemented to handle over the air (OTA) communication with one or more UEs 115-a. In some implementations, real-time and non-real-time aspects of control and user plane communication with the RU(s) 170-a may be controlled by the corresponding DU 165-a. In some examples, such a configuration may enable a DU 165-a and a CU 160-a to be implemented in a cloud-based RAN architecture, such as a vRAN architecture.

[0108] The SMO 180-a may be configured to support RAN deployment and provisioning of non-virtualized and virtualized network entities 105. For non-virtualized network entities 105, the SMO 180-a may be configured to support the deployment of dedicated physical resources for RAN coverage requirements which may be managed via an operations and maintenance interface (e.g., an O1 interface). For virtualized network entities 105, the SMO 180-a may be configured to interact with a cloud computing platform (e.g., an O-Cloud 205) to perform network entity life cycle management (e.g., to instantiate virtualized network entities 105) via a cloud computing platform interface (e.g., an O2 interface). Such virtualized network entities 105 can include, but are not limited to, CUs 160-a, DUs 165-a, RUs 170-a, and Near-RT RICs 175-b. In some implementations, the SMO 180-a may communicate with components configured in accordance with a 4G RAN (e.g., via an O1 interface). Additionally, or alternatively, in some implementations, the SMO 180-a may communicate directly with one or more RUs 170-a via an O1 interface. The SMO 180-a also may include a Non-RT RIC 175-a configured to support functionality of the SMO 180-a.

[0109] The Non-RT RIC 175-a may be configured to include a logical function that enables non-real-time control and optimization of RAN elements and resources, Artificial Intelligence (AI) or Machine Learning (ML) workflows including model training and updates, or policy-based guidance of applications/features in the Near-RT RIC 175-b. The Non-RT RIC 175-a may be coupled to or communicate with (e.g., via an A1 interface) the Near-RT RIC 175-b. The Near-RT RIC 175-b may be configured to include a logical function that enables near-real-time control and optimization of RAN elements and resources via data collection and actions over an interface (e.g., via an E2 interface) connecting one or more CUs 160-a, one or more DUs 165-a, or both, as well as an O-eNB 210, with the Near-RT RIC 175-b.

[0110] In some examples, to generate AI/ML models to be deployed in the Near-RT RIC 175-b, the Non-RT RIC 175-a may receive parameters or external enrichment information from external servers. Such information may be utilized by the Near-RT RIC 175-b and may be received at the SMO 180-a or the Non-RT RIC 175-a from non-network data sources or from network functions. In some examples, the Non-RT RIC 175-a or the Near-RT RIC 175-b may be configured to tune RAN behavior or performance. For example, the Non-RT RIC 175-a may monitor long-term trends and patterns for performance and employ AI or ML models to perform corrective actions through the SMO 180-a (e.g., reconfiguration via O1) or via generation of RAN management policies (e.g., A1 policies).

[0111] FIG. 3 shows an example of a network 300 that supports recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure. In some examples, the network 300 may be implemented by aspects of the wireless communications system 100. For example, the network 300 may be implemented by a UE 115, a network entity 105, or both, as described herein with reference to FIGs. 1 and 2.

[0112] In some examples, a wireless device (e.g., a UE 115, a network entity 105) may transmit a signal via one or more slots (e.g., a frequency and time grid) using multiple resources (e.g., time resources, frequency resources, and the like). A resource element (e.g., symbol 315, DMRS 320, a single subcarrier for a single OFDM symbol) may be grouped with multiple resource elements to form a physical resource block (PRB) (e.g., PRB 310). Each column of resources (e.g., resource elements of a same time resource and different frequency resources) of the PRB 310 may be considered a single resource block (e.g., an OFDM symbol). In some cases, the PRB 310 may include twelve subcarrier frequencies (of the frequency domain) and fourteen OFDM symbols (of the time domain). Multiple PRBs 310 may be bundled (e.g., grouped, combined) to form a single PRG, such as PRG 305. In some cases, the PRG 305 may include a quantity of PRBs 310 determined by a bundle size parameter (e.g., bundleSize, a quantity of consecutive PRBs stacked together in a single PRG). For example, the bundle size parameter may indicate two PRBs 310 or four PRBs 310 (e.g., four consecutive resource blocks) for a narrowband precoding operation or zero PRBs 310 (e.g., no stacked PRBs) for a wideband precoding operation, and the PRGs may form a portion of a bandwidth part.
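
The PRB-to-PRG bundling described above can be sketched as simple index grouping. The helper name below (prbs_to_prgs) is hypothetical, and treating a bundle size of zero as wideband precoding (one PRG spanning all PRBs) follows the description in this paragraph.

```python
def prbs_to_prgs(n_prbs, bundle_size):
    """Group consecutive PRB indices into PRGs.

    bundle_size == 0 is treated here as wideband precoding,
    i.e., a single PRG spanning all PRBs, per the description above.
    """
    if bundle_size == 0:
        return [list(range(n_prbs))]
    # Consecutive PRBs are stacked together; the last PRG may be shorter.
    return [list(range(i, min(i + bundle_size, n_prbs)))
            for i in range(0, n_prbs, bundle_size)]

print(prbs_to_prgs(8, 2))  # → [[0, 1], [2, 3], [4, 5], [6, 7]]
```

With a bundle size of two, eight PRBs form four PRGs, each of which may then be precoded independently as described below.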

[0113] In some cases, a wireless communications system (e.g., the wireless communications system 100) may utilize pilot-based channel estimation techniques. The pilot symbols may be referred to as DMRS symbols (e.g., a DMRS symbol 320). For example, the wireless device may transmit a signal including one or more PRBs 310 via a physical downlink shared channel (PDSCH) (e.g., a channel for user data). The PRBs 310 may include various symbols 315 (e.g., resource elements allocated for a data signal), various DMRS symbols 320 (e.g., resource elements allocated for a reference signal for demodulation), and, in some cases, empty symbols (e.g., resource elements with no data allocated).

[0114] The DMRS symbols 320 may be inserted in various resource elements of a PRB 310 according to a resource configuration pattern (e.g., a DMRS pattern). For example, the wireless device may be configured with various DMRS patterns and select a DMRS pattern 345 for the signal. In some cases, the various DMRS patterns may include DMRS symbols inserted in adjacent resource elements, in every other resource element, in a single resource block of a PRB, or in multiple resource blocks of a PRB, among other potential configurations.

[0115] In some examples, the DMRS pattern 345 may be known by the receiving wireless device. For example, the receiving wireless device may receive the signal and determine which of the symbols of the PRG 305 include the various DMRS 320 based on the DMRS pattern 345. In order to extract (e.g., process, determine, decode, estimate) the data at the symbols 315, the receiving wireless device may perform a channel estimation procedure. For example, the received DMRS 320 (e.g., y_i) may be equal to a combination of noise (e.g., interference, n_i) and a product of the channel (e.g., PDSCH, h_i) and the original data (e.g., x_i), according to Equation 1.

y_i = h_i * x_i + n_i     (Equation 1)

Because the DMRS 320 is a pilot symbol known by both the transmitting wireless device (e.g., a UE 115, a network entity 105) and the receiving wireless device (e.g., a UE 115, a network entity 105), the receiving wireless device may estimate h_i (e.g., extract the channel at DMRS locations) based on the known y_i and x_i (given some noise). The receiving wireless device may then interpolate (e.g., inpaint, as for an image) the estimated channel across the various symbols 315 (e.g., the remaining resource elements) to extract (e.g., calculate) the data at the symbols 315. In some cases, n_i may include cross-MIMO interference, inter-PRG interference, intra-PRG interference, and the like. Some channel estimation techniques may not account for (e.g., calculate) these types of interference (e.g., noise), which may lead to inaccurate channel estimations and inaccurate data estimations.
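
A minimal single-layer illustration of this two-step procedure (extract the channel at DMRS tones, then interpolate across the grid) might look as follows. The grid dimensions, DMRS pattern, and smooth toy channel are assumptions for demonstration; a practical receiver would use the configured DMRS pattern and a more capable interpolator.

```python
import numpy as np

# Hypothetical 1-PRB grid: 12 subcarriers x 14 OFDM symbols, single layer.
n_sc, n_sym = 12, 14
rng = np.random.default_rng(1)

# A smooth toy channel over the grid (not a standardized channel model).
h = np.exp(1j * (0.2 * np.arange(n_sc)[:, None]
                 + 0.1 * np.arange(n_sym)[None, :]))

# Illustrative DMRS pattern: every other subcarrier on symbols 2 and 11.
dmrs_syms = [2, 11]
dmrs_scs = np.arange(0, n_sc, 2)
x_pilot = 1.0 + 0j  # known pilot value

noise = 0.05 * (rng.standard_normal((n_sc, n_sym))
                + 1j * rng.standard_normal((n_sc, n_sym)))
# For brevity the whole grid is modeled as carrying the pilot; only the
# DMRS locations are actually used below.
y = h * x_pilot + noise

# Step 1: extract the channel at DMRS locations (least squares).
h_hat = np.zeros((n_sc, n_sym), dtype=complex)
for s in dmrs_syms:
    h_hat[dmrs_scs, s] = y[dmrs_scs, s] / x_pilot

# Step 2: interpolate ("inpaint") across frequency, then across time,
# handling real and imaginary parts separately.
for s in dmrs_syms:
    h_hat[:, s] = (np.interp(np.arange(n_sc), dmrs_scs, h_hat[dmrs_scs, s].real)
                   + 1j * np.interp(np.arange(n_sc), dmrs_scs,
                                    h_hat[dmrs_scs, s].imag))
for k in range(n_sc):
    h_hat[k, :] = (np.interp(np.arange(n_sym), dmrs_syms, h_hat[k, dmrs_syms].real)
                   + 1j * np.interp(np.arange(n_sym), dmrs_syms,
                                    h_hat[k, dmrs_syms].imag))

err = np.mean(np.abs(h_hat - h) ** 2)  # residual interpolation + noise error
```

The residual error here reflects both noise and the limits of linear interpolation, which is one motivation for the nonlinear two-dimensional interpolation described elsewhere in this disclosure.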

[0116] In some implementations, the signal may include multiple PRGs 305, where each PRG 305 may be configured (e.g., precoded) according to a unique precoding matrix (e.g., W_prg). For example, the signal may include four PRGs 305, each precoded according to a unique precoding matrix. The unique precoding matrix may be found by performing singular value decomposition (SVD) over resource blocks within the PRG 305 or as a random (e.g., orthogonal) precoder. In some cases, the effective channel for a first PRG 305 may be equal to a product between the channel and the unique precoding matrix (W_prg) for the first PRG 305, according to Equation 2.

h_eff = h * W_prg     (Equation 2)

In some cases, the receiving wireless device may not know which precoding matrix is applied to which PRG 305, and thus the unique precoding of channels per PRG 305 may prohibit smooth interpolation of the channel between PRGs 305.

[0117] In some implementations, MIMO communication may introduce additional complexities to the channel estimation formulas (e.g., relative to SISO channel estimation formulas). For example, at the DMRS tone inference (e.g., extraction) step, the formulation may include multiple equations with multiple unknown variables, which may lead to solving an underdetermined inverse problem. A complexity may include determining the alignment of the channel across different PRG bundles, and then exploiting the correlation across them. Additionally, the multiple layers may add additional multiplexing information (e.g., correlation across receiver and transmitter antenna pairs) that may be utilized to enhance the channel estimation performance.
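
The effect described above (per-PRG precoding breaking smoothness across PRG boundaries) can be demonstrated numerically. The variable name w_prg below is a hypothetical stand-in for the per-PRG precoding matrix, generated here by QR factorization as a random orthonormal precoder rather than the SVD-based construction described above.

```python
import numpy as np

rng = np.random.default_rng(2)
n_tx, n_layers, n_prgs = 4, 2, 4

# Toy physical channel row, taken as identical in every PRG for clarity.
h = rng.standard_normal((1, n_tx)) + 1j * rng.standard_normal((1, n_tx))

effective = []
for _ in range(n_prgs):
    # Random orthonormal per-PRG precoder (stand-in for an SVD-derived one).
    a = (rng.standard_normal((n_tx, n_layers))
         + 1j * rng.standard_normal((n_tx, n_layers)))
    w_prg, _ = np.linalg.qr(a)
    # The receiver observes the precoded (effective) channel per PRG.
    effective.append(h @ w_prg)

# The physical channel is constant across PRGs, yet the effective channels
# differ per PRG, so interpolating across PRG boundaries without knowing
# the precoders is not smooth.
gap = np.linalg.norm(effective[0] - effective[1])
```

A nonzero gap between adjacent PRGs' effective channels illustrates why, without knowledge of the precoders, some systems forgo cross-PRG correlation entirely.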

[0118] The techniques described herein provide for calculating channel estimations using recurrent equivariant inference machines, for example, utilizing a single neural network estimator for multiple use cases of channel estimation (e.g., channel profile estimations, various DMRS pattern configurations, SNR, cross-MIMO estimation, inter-PRG estimation, intra-PRG estimation, and the like) with a modular and interpretable model design. In some implementations, the channel estimation may include multiple phases (e.g., steps). For example, a first phase may include solving a SISO channel estimation (e.g., for each transmitter and receiver antenna pair). A second phase may include solving a MIMO channel estimation using the SISO channel estimations (e.g., learning the correlations between the antenna pairs). A third phase may include solving MIMO channel estimations within PRG bundles (e.g., intra-PRG, learning correlations between resource elements within each PRG). A fourth phase may include solving MIMO channel estimations across PRG bundles (e.g., inter-PRG, learning correlations between resource elements across PRGs).

[0119] Additionally, or alternatively, in some examples as described herein, channel estimations may be calculated using recurrent equivariant inference machines and based on MMSE operations. An MMSE operation may be an estimation technique that utilizes linear equalization to estimate a channel. MMSE may be supported by hardware within the wireless device, which may provide for reduced complexity and processing as compared with performing an initial estimation based on machine learning. A recurrent equivariant inference machine may represent an example of a type of machine learning model that provides relatively reliable and accurate estimations based on a given set of inputs. For example, a first phase of the channel estimation may include solving a SISO channel estimation (e.g., for each transmitter and receiver antenna pair) based on an MMSE operation 350 (e.g., which may also be referred to as an average MMSE (AMMSE) operation). A first set of multiple channel estimations per layer of the channel may be generated based on the MMSE operation 350. In such cases, the remaining phases may build on the MMSE-generated estimations. In some cases, the various phases of the channel estimation may be performed iteratively (e.g., include multiple iterations of the various phases). In some cases, one or more of the phases may utilize an SNR estimation (e.g., a genie value). Although four phases are described, a channel estimation procedure utilizing the described techniques may include more or fewer phases, phases including various other steps, phases without one or more of the steps described, or any combination thereof. While the phases are described as four separate phases, they may be considered as one continuous process.
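A per-tone linear MMSE estimate of the kind the first phase may use can be sketched as follows (assuming a Gaussian channel prior of variance prior_var; the function name is an illustrative assumption):

```python
import numpy as np

def mmse_pilot_estimate(y_pilot, x_pilot, noise_var, prior_var=1.0):
    """Per-tone linear MMSE channel estimate at the DMRS positions,
    assuming h ~ CN(0, prior_var) and y = h * x + n, n ~ CN(0, noise_var).
    The estimate shrinks toward zero as the SNR drops."""
    return (prior_var * np.conj(x_pilot) * y_pilot) / (
        prior_var * np.abs(x_pilot) ** 2 + noise_var
    )
```

At high SNR this reduces to the least-squares extraction y/x, while at low SNR the prior dominates, which is the regularization a pure LS estimate lacks.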

[0120] In some cases, a network may perform the various phases. For example, the network may be a recurrent network that includes a coarse network, as described herein with reference to FIG. 3, and a refinement network, as described herein with reference to FIGs. 4-6. In some cases, the network may include a u-net type (e.g., u-net 525) encoder (e.g., encoder 430) and decoder (e.g., decoder 435) convolutional block followed by an attention-based (e.g., attention 605) refinement network for longer range correlations. In some examples, the coarse network 325 may provide an initial channel estimate (e.g., a rough estimate) per PRG per antenna pair (e.g., transmitter and receiver antenna pair) through a learned interpolation (e.g., smoothing). Additionally, or alternatively, the MMSE operation 350 may provide an initial channel estimate per PRG per antenna pair, and the coarse network 325 may modify or update the channel estimate through the learned interpolation.

[0121] The network 300 may represent the first phase. A wireless device (e.g., a UE 115, a network entity 105, or both, as described herein with reference to FIG. 1) may receive an assignment of a set of resources associated with a channel (e.g., a PDSCH channel). The set of resources may include a first subset of resources allocated for a data signal (e.g., symbols 315) and a second subset of resources allocated for a reference signal (e.g., DMRS symbols 320).

[0122] In some cases, the wireless device may perform an MMSE operation 350 to generate a first set of multiple channel estimations 355 associated with respective layers 360 of the channel for the set of resources (e.g., an LS channel estimate at the DMRS symbols 320). For example, the set of resources may include one or more PRGs 305 and the second subset of resources may include various DMRS symbols 320 inserted into respective resource elements based on a resource configuration pattern (e.g., a pattern 345) of a set of resource configuration patterns (e.g., a set of DMRS patterns). In some examples, one or more of the first set of resources including non-DMRS symbols 315 may be initialized with zero entries. Each respective layer 360 may be associated with a respective antenna pair (e.g., a transmitter and receiver pair) of multiple SISO antenna pairs. In some implementations, initial values of the first set of multiple channel estimations 355 may be associated with SISO antenna pairs. The MMSE operation 350 may be associated with relatively low computation costs and processing as compared with other channel estimation techniques. As described herein, the network 300 may utilize the MMSE channel estimates as an input and may build on top of the MMSE estimates.
For example, the MMSE operation 350 may be followed by one or more attention-based refinement operations (e.g., machine learning-based channel estimation) for relatively long range correlations, which may reduce complexity and processing as compared with other channel estimation techniques.

[0123] After performing the MMSE operation 350, the wireless device may use the first set of multiple channel estimations 355 to generate a second set of multiple channel estimations 335 using a network 325 (e.g., a coarse network). The network 325 may, in some examples, include or be based on a machine learning model for channel estimation. In some examples, the network 325 may perform a nonlinear two-dimensional interpolation of the channel based on the first set of multiple channel estimations 355. For example, the network 325 may interpolate the channel in the time and frequency domains (e.g., two dimensions).

[0124] The network 325 may involve (e.g., input) the first set of channel estimations 355 generated by the MMSE operation 350 based on the PRG 305. The network 325 may perform various iterations 330 on the first set of channel estimations 355. For example, the network 325 may include a u-net encoder-decoder fully convolutional network in which each iteration may include gated and gated-dilated convolutional units. In some cases, for one or more iterations 330, the network 325 may copy and concatenate the results of previous iterations 330 to the one or more iterations 330. In some examples, the first set of multiple channel estimations 355 and the second set of multiple channel estimations 335 may be channel estimations (e.g., h^{m,k}_{i,j}(f, t), where h is the channel estimation, m is the PRG index, k is the PRB index, (i, j) is a MIMO index (Tx_i, Rx_j) for a given antenna pair, and (f, t) is a resource element index inside the PRB two-dimensional grid) for a respective SISO antenna pair per PRG bundle. For example, the wireless device may receive respective PRG bundles per antenna pair. If the MIMO communication includes two transmitting antennas and two receiving antennas, there may be four antenna pairs with multiple PRG bundles for each pair. The wireless device may utilize the MMSE operation 350 to generate a first channel estimation 355 for each antenna pair and each PRG bundle 305, and the network 325 to further generate a second channel estimation 335 for each antenna pair and each PRG bundle 305. In some cases, the network 325 may output a maximum a posteriori channel estimate (e.g., argmax over h^{m,k}_{i,j} of the channel posterior). The network 325 may additionally generate a latent variable. For example, the latent variable may be an estimate 340 (e.g., a z estimate) that may not be directly observed, but rather inferred from other observed parameters. The latent variable may be an abstract representation of underlying channel characteristics (e.g., Doppler shift, delay spread).
The network 325 (e.g., an embed network) may be used as a feature extractor to produce an initial latent (e.g., z^{τ=0}), which may be further used by subsequent refinement modules. In some examples, the output of the network 325 may be the input (e.g., input 415) of the network 400.
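One gated-dilated convolutional unit of the kind the coarse-network iterations may include can be sketched as follows (a minimal one-dimensional numpy sketch; the kernel values, the residual connection, and the function names are illustrative assumptions):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def dilate(w, d):
    """Spread kernel taps d positions apart (dilated convolution)."""
    out = np.zeros((len(w) - 1) * d + 1)
    out[::d] = w
    return out

def gated_conv_unit(x, w_filter, w_gate, dilation=1):
    """One gated (optionally dilated) convolutional unit: a tanh filter
    branch modulated elementwise by a sigmoid gate branch, with a
    residual connection so each iteration refines the estimate."""
    f = np.convolve(x, dilate(w_filter, dilation), mode="same")
    g = np.convolve(x, dilate(w_gate, dilation), mode="same")
    return x + np.tanh(f) * sigmoid(g)
```

When the gate saturates closed, the unit passes its input through unchanged; when open, the filter branch contributes an update, which matches the interpolation-as-refinement role of the iterations 330.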

[0125] In some cases, the techniques described herein may result in various advantages over the other channel estimation techniques. For example, the channel estimation through the recurrent equivariant inference machines (e.g., networks 300, 400, 500, and 600) may offer various signal processing and deep learning advantages. For example, the signal processing may be based on DMRS (e.g., excluding dependence on TRS, with the exception of SNR), exclude explicit parameter estimation (e.g., Doppler shift, delay spread, and the like), avoid legacy binning strategies, utilize relatively less memory and computational overhead (e.g., reduced maintenance of a bank of parameters), model additional interactions (e.g., interference, cross-MIMO, intra-PRG, inter-PRG processing gain), abstract the orthogonal cover code (OCC) despreading step (as described herein with reference to FIG. 1), or any combination thereof, thus circumventing the associated computational costs and performance loss. The deep learning techniques may include a variable quantity of PRG bundles, a variable quantity of bundle sizes (e.g., PRBs per PRG), multiple DMRS patterns (e.g., multiple input DMRS configurations, quantity of additional columns, configuration type), underlying mathematical symmetries, a forward model into the network design, and a modular and interpretable architecture (e.g., possible to perform ablation study and gauge component significance). By utilizing the network 325 to build on top of the MMSE operation 350 as described herein, one or more nonlinear interpolation gains may be added to the MMSE estimate. The MMSE operation 350 may provide for a partial machine learning channel estimation solution that enables machine learning capabilities while maintaining existing channel estimation and demapping hardware at the wireless device, which may reduce memory consumption and computational costs, among other possibilities.

[0126] FIG. 4 shows an example of a network 400 that supports recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure. In some examples, the network 400 may be implemented by aspects of the wireless communications system 100. For example, the network 400 may be implemented by a UE 115, a network entity 105, or both, as described herein with reference to FIGs. 1 and 2.

[0127] In some cases, the network 400 may support channel estimation. For example, a channel estimation problem may be defined as maximizing a posterior of the channel (e.g., H) given an observed signal (e.g., y) and the signal (e.g., x) (e.g., max_{H,z} [ln p_z(H | y, x)] ∝ max_{H,z} [ln p_z(y | H, x) + ln p_z(H)]). In some cases, the conditional probability distribution (e.g., p_z(H | y, x)) may be parametrized by channel characteristics (e.g., delay-Doppler profile). The latent variable (e.g., z), as described herein with reference to FIG. 3, may be an abstract representation of the underlying channel characteristics (e.g., Doppler, delay spread, and the like).

[0128] In some cases, an output of the network 300 may be an input of the network 400. For example, an input 415 may include one or more channel estimations (e.g., a channel estimation 335 per PRG 305 per SISO antenna pair) and respective estimates 340 (e.g., respective latent variables per channel estimation). The input 415 may be an input for the refinement network 405 that may include various iterations 410. For example, a wireless device (e.g., a network entity 105, a UE 115) may perform a refinement operation (e.g., via the refinement network 405) on the channel estimations. The refinement operation may include multiple iterations of generating respective gradients (e.g., gradient 480), as described herein with reference to FIG. 4 (e.g., module 425), based on the channel estimations for a subset of resources (e.g., DMRS symbols) and measured observations of the subset of resources (e.g., DMRS symbols 320); generating a second set of latent variables, as described herein with reference to FIGs. 4 and 5 (e.g., module 430), based on the channel estimations and the respective gradients; and modifying the channel estimations associated with multiple layers based on the second set of latent variables, the channel estimations, and the respective gradients, as described herein with reference to FIG. 5 (e.g., module 435).

[0129] In some cases, the refinement network 405 may include an iterative refinement by various refinement units. For example, each iteration 410 of the refinement network 405 may be performed by various modules (e.g., three or four unique refinement modules). For example, an iteration 410 may include a module 425 (e.g., a likelihood module), a module 430 (e.g., an encoder module), and a module 435 (e.g., a decoder module). Additionally, or alternatively, the iteration 410 may include various other modules for performing other tasks not illustrated in FIG. 4. In some examples, the refinement network 405 may include a respective set of machine learning parameters 485 associated with a machine learning operation. For example, the set of machine learning parameters 485 may be denoted as θ. The set of machine learning parameters 485 may be set during a machine learning simulation (e.g., a pre-field operation). In some cases, the set of machine learning parameters 485 may include a respective parameter that is unique to each iteration 410 (e.g., θ^1, θ^2, ..., θ^T for T iterations, where θ^1 ≠ θ^2 ≠ ... ≠ θ^T), as inputs to each iteration may vary (e.g., different from other, traditional, encoder and decoder machine learning implementations). In some other cases, the set of machine learning parameters 485 may be common across each iteration 410 to reduce complexity or size of implementation of the refinement network 405. That is, each iteration 410 may use a same parameter set (e.g., θ^1 = θ^2 = ... = θ^T for T iterations).
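The per-iteration flow through the likelihood, encoder, and decoder modules can be sketched as a loop (a schematic sketch; the encoder and decoder are passed in as callables, and reusing the same callables across iterations corresponds to the shared-parameter case θ^1 = ... = θ^T):

```python
import numpy as np

def refine(h0, z0, y_dmrs, x_dmrs, mask, noise_var, encoder, decoder,
           n_iters=20):
    """Skeleton of the iterative refinement: each iteration runs the
    likelihood module (masked residual gradient), then the encoder
    (latent update), then the decoder (channel update)."""
    h, z = h0, z0
    for _ in range(n_iters):
        # Likelihood module (module 425): gradient of the log-likelihood
        grad = mask * (y_dmrs - h * x_dmrs) * np.conj(x_dmrs) / noise_var
        z = encoder(z, h, grad)   # encoder module (module 430)
        h = decoder(z, h, grad)   # decoder module (module 435)
    return h, z
```

With a toy identity encoder and a gradient-ascent decoder, the loop already converges toward the true channel on noiseless observations, illustrating why the learned modules only need to supply the prior structure.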

[0130] In some cases, the module 425 may output to the module 430 and the module 435, the module 430 may output to the module 435, and the module 435 may output to a next iteration 410 or be the final (e.g., last, ultimate) output (e.g., output 420) of the refinement network 405. In some examples, the output of the module 435 (e.g., output 420) may include a channel estimate (e.g., a set of MIMO channel estimations for each of the PRGs, PRBs, and antenna pairs at the respective iteration 410) that is output to a second module 425, a second module 430, and a second module 435 of the next iteration 410. For example, the set of channel estimations (e.g., H^τ) for an iteration 410 (e.g., τ) may be equal to {h^{m,τ}_{i,j,k} ∀ {i, j, k, m}}. In some cases, the output of the module 435 may include a latent variable (e.g., a set of latent variables for each channel estimate) that is output to the second module 425 and the second module 430 of the next iteration 410. For example, the set of latent variables (e.g., z^τ) for an iteration 410 may be equal to {z^{m,τ}_{i,j,k} ∀ {i, j, k, m}}.

[0131] In some examples, at least a portion of the input 415 may be an input for the module 425. For example, the second set of multiple channel estimations 335, as described herein with reference to FIG. 3, may be input to the module 425.

Additionally, or alternatively, the observed DMRS symbols (e.g., y_dmrs) and the known DMRS symbols may be input to the module 425 (e.g., (y_dmrs, x_dmrs, σ)). In some cases, the module 425 may coordinate descent on z and H and use recurrent inference as a gradient. For example, the module 425 may generate respective sets of values of a residual variable 465 (e.g., δy_dmrs) based on a difference between the measured observations of a subset of resources (e.g., DMRS symbols 320) and the second set of multiple channel estimations 335 associated with the subset of resources (e.g., H^τ x_dmrs), according to Equation 3. The module 425 may combine the respective sets of values of the residual variable 465, the known observations 470 associated with the subset of resources (e.g., x_dmrs), and a quantity of mask bits 475 (e.g., a binary mask M), according to Equation 3:

Equation 3: ∇^τ_{y|H} = ∂ ln p(y | H^τ, x) / ∂H^τ = (1/σ²) · M ⊙ (y_dmrs − H^τ x_dmrs) · x*_dmrs

[0132] Thus, the module 425 may generate respective gradients 480 (e.g., ∇^τ_{y|H}). For example, each antenna pair may have a respective gradient based on channel components associated with the respective antenna pair. In some cases, as observations are sparse, the feedback from the module 425 (e.g., the feedback module) may also be sparse.
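The masked-residual gradient of the likelihood module can be sketched directly (assuming a per-element Gaussian likelihood; the function name is an illustrative assumption):

```python
import numpy as np

def likelihood_gradient(y, h, x, mask, noise_var):
    """Gradient of the Gaussian observation log-likelihood with respect
    to the channel estimate (Equation 3): the residual at observed DMRS
    tones, correlated with the known pilots and zeroed elsewhere by the
    binary mask, so the feedback is sparse where observations are sparse."""
    residual = y - h * x              # delta-y per resource element
    return mask * residual * np.conj(x) / noise_var
```

The gradient vanishes both off the masked pilot positions (sparse feedback) and when the estimate already matches the observation.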

[0133] In some examples, the module 430 may include various steps. For example, the module 430 (e.g., an encoder module) may, at 440, receive an input (e.g., z^{m,τ}_{i,j,k}, h^{m,τ}_{i,j,k}, and ∇^{m,τ}_{i,j,k} ∀ {i, j, k, m}) including a first set of values of a latent variable, the channel estimations, and the gradients 480; at 445, perform a fusion of the inputs, as described herein with reference to FIG. 5; at 450, perform various attention calculations (e.g., an intra-PRG calculation at 450-a, an inter-PRG calculation at 450-b, and a cross-MIMO calculation at 450-c), as described herein with reference to FIG. 6; at 455, perform a multi-layer perceptron (MLP) procedure; and at 460, output a second set of values of the latent variable (e.g., z^{m,τ+1}_{i,j,k} ∀ {i, j, k, m}). In some cases, the output of each step may be concatenated with the output of the next step. The module 430 may generate, via the various steps, the second set of values of the latent variable according to Equation 4:

Equation 4: z^{m,τ+1}_{i,j,k} = f^θ_enc(z^{m,τ}_{i,j,k}, h^{m,τ}_{i,j,k}, ∇^{m,τ}_{i,j,k})

[0134] In some cases, the module 435 may receive the second set of values of the latent variable and modify the channel estimations associated with the multiple layers based on the second set of values of the latent variable, the channel estimations (e.g., the channel estimations of the previous iteration), and the respective gradients. For example, the channel estimations for this iteration 410 may be generated according to Equation 5:

Equation 5: h^{m,τ+1}_{i,j,k} = f^θ_dec(z^{m,τ+1}_{i,j,k}, h^{m,τ}_{i,j,k}, ∇^{m,τ}_{i,j,k})

[0135] as described herein with reference to FIG. 5.

[0136] In some examples, each iteration 410 of the refinement network 405 may utilize a same set of one or more machine learning parameters 485. For example, a first iteration 410 may operate according to the set of machine learning parameters 485. Each of the modules 425, 430, and 435 may operate according to respective machine learning parameters from the set of machine learning parameters 485. Remaining iterations 410 of the refinement network 405 may utilize the same set of machine learning parameters 485 for estimating a given attention calculation. The set of machine learning parameters 485 may be generated based on training of the refinement network 405 according to one or more training parameter sets. The same set of machine learning parameters 485 may be generated for each iteration 410 if weights are shared during the training operation (e.g., shared weights for θ^1, θ^2, ..., θ^T). The set of machine learning parameters 485 may be stored at the refinement network 405 and may be utilized during operation of the refinement network 405. In some examples, different refinement operations may be performed for different attention calculations, and all iterations 410 of a given refinement operation may share a respective same set of machine learning parameters 485, where the sets of machine learning parameters for different refinement operations may be different. The attention calculations may include, for example, an intra-PRB calculation, an inter-PRB calculation, a cross-MIMO calculation, an MLP calculation, or any combination thereof. A size of the shared parameter set may be independent of a quantity of resources and may be based on a quantity of machine learning model parameters for a given attention calculation.
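The storage effect of sharing one parameter set across iterations can be illustrated with simple arithmetic (the counts below are assumed for illustration only):

```python
# Storage comparison between per-iteration parameter sets and one shared
# set; the parameter count and iteration count are illustrative assumptions.
params_per_iteration = 1_000_000   # assumed size of one theta
n_iterations = 8                   # assumed number of iterations T

unshared = params_per_iteration * n_iterations   # distinct theta^1..theta^T
shared = params_per_iteration                    # one theta reused by all T
savings = 1 - shared / unshared
```

Under these assumed counts, sharing reduces stored parameters by a factor of T (here 87.5%), independent of the quantity of resources being estimated.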

[0137] Sharing the same set of machine learning parameters 485 across all iterations 410 may provide for a reduction in size of the refinement network 405 and a reduction in complexity associated with a refinement operation as compared with refinement operations that utilize different parameters for each iteration 410. The reduction in size and complexity may be obtained while the reliability and accuracy of the refinement operation remain relatively unchanged. For example, a quantity of parameters in a set of parameters for an iteration 410 of the refinement network may be in the range of 100 K to 2 M parameters, and thus sharing a same set of parameters across the iterations 410 may provide substantial benefits in storage space for the set of parameters. Thus, a quantity of parameters that may be generated when training the refinement network 405 may be smaller than if there are different parameters for each iteration 410, and the refinement operation may still produce a reliable and accurate output 420 associated with a refined channel estimation and latent variable.

[0138] FIG. 5 shows an example of a network 500 that supports recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure. In some examples, the network 500 may be implemented by aspects of the wireless communications system 100, the network 400, or both. For example, the network 500 may be implemented by a UE 115, a network entity 105, the encoder 430, the decoder 435, or any combination thereof, as described herein with reference to FIGs. 1 and 4.

[0139] In some cases, the network 500 may perform a fusion associated with neural networks. For example, the network 500 may be an example of a convolutional neural network (CNN) (e.g., a neural network that uses convolution in place of a more general matrix multiplication for multiple layers). In some cases, fusion associated with a CNN may fuse (e.g., combine, compress) two or more convolutional layers (e.g., weights associated with each layer) together.

[0140] In some examples, one or more modules (e.g., module 430, module 435) of a refinement network (e.g., refinement network 405), as described herein with reference to FIG. 4, may utilize the network 500. For example, an encoder module (e.g., module 430) may perform a fusion operation (e.g., at 445). The encoder module may generate a second set of values of a latent variable associated with multiple channel estimations based on combining (e.g., fusing) the channel estimations, respective gradients (e.g., gradients 480), and respective values of a first set of values of the latent variable. For example, the encoder module may generate output 530 (e.g., z^{m,τ+1}_{i,j}) based on Equation 6:

Equation 6: z^{m,τ+1}_{i,j} = FuCNN(z^{m,τ}_{i,j}, h^{m,τ}_{i,j}, ∇^{m,τ}_{i,j})

where FuCNN is the fusion operation, z^{m,τ}_{i,j} represents the first set of values of the latent variable (e.g., a first portion of estimate 505), h^{m,τ}_{i,j} represents the channel estimations (e.g., a second portion of estimate 505), and ∇^{m,τ}_{i,j} represents the respective gradients (e.g., gradient 510). For example, the encoder module may fuse the gradient 510 (e.g., from the likelihood module 425) into the hidden state variable (e.g., the first set of values of the latent variable), thus modeling MIMO multiplexing (e.g., multiplexing MIMO phenomena). The fusing operation may act independently on each PRG (e.g., PRG 305) of the channel, and incorporate the gradient 510 (e.g., gradient information) into the latent state (e.g., performing a fusion operation for each PRG of each antenna pair).

[0141] In some cases, the fusion operation may comprise various steps. For example, the encoder module may receive as input the estimates 505 (e.g., the channel estimations and the latent variable values of a previous iteration 410) and the gradients 510 (e.g., gradients 480 of a same iteration 410), as described herein with reference to FIG. 4. The encoder module may combine (e.g., fuse, concatenate) the estimates 505 and the gradients 510 to generate combination 520 (e.g., a concatenation of z^{m,τ}_{i,j}, h^{m,τ}_{i,j}, and ∇^{m,τ}_{i,j}). In some cases, the combination 520 may be input to a u-net 525 (e.g., a tiny u-net). In some examples, the u-net 525 may be an example of a type of CNN that utilizes upsampling operators (e.g., upsampling operators with a relatively large quantity of feature channels). The u-net 525 may perform various calculations (e.g., combinations, operations) associated with a neural network to generate the output 530 (e.g., z^{m,τ+1}_{i,j}).
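The concatenate-then-process fusion can be sketched as follows (a per-feature linear mixing stands in for the tiny u-net 525; the function name and weights are illustrative assumptions):

```python
import numpy as np

def fuse(z, h, grad, weights):
    """Fusion sketch (FuCNN in Equations 6 and 7): stack the latent, the
    channel estimate, and the gradient along a new feature axis (the
    combination 520), then mix the features back to one output map.
    The mixing weights would be learned in the actual network."""
    stacked = np.stack([z, h, grad], axis=0)          # (3, ...) features
    return np.tensordot(weights, stacked, axes=([0], [0]))
```

Selecting one-hot weights recovers an individual input, which makes the role of the learned mixing explicit: the network decides, per feature, how much of the latent, the estimate, and the gradient to carry forward.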

[0142] In some examples, a decoder module (e.g., module 435) may perform a fusion operation. The decoder module may modify the channel estimations (e.g., generate a third iteration of channel estimations) based on combining (e.g., fusing) the second set of values of the latent variable (e.g., the output from step 460), the second set of multiple channel estimations, and respective gradients (e.g., gradients 480). For example, the decoder module may generate output 530 (e.g., h^{m,τ+1}_{i,j}) based on Equation 7:

Equation 7: h^{m,τ+1}_{i,j} = FuCNN(z^{m,τ+1}_{i,j}, h^{m,τ}_{i,j}, ∇^{m,τ}_{i,j})

where FuCNN is the fusion operation, z^{m,τ+1}_{i,j} represents the second set of values of the latent variable (e.g., a first portion of estimate 505), h^{m,τ}_{i,j} represents the channel estimations (e.g., a second portion of estimate 505), and ∇^{m,τ}_{i,j} represents the respective gradients (e.g., gradient 510). For example, the decoder module may receive as input the estimates 505 (e.g., the channel estimations and the latent variable values of a same iteration 410) and the gradients 510 (e.g., gradients 480 of the same iteration 410), as described herein with reference to FIG. 4. The decoder module may combine (e.g., fuse, concatenate) the estimates 505 and the gradients 510 to generate combination 520 (e.g., a concatenation of z^{m,τ+1}_{i,j}, h^{m,τ}_{i,j}, and ∇^{m,τ}_{i,j}). In some cases, the combination 520 may be input to the u-net 525 (e.g., a tiny u-net). The u-net 525 may perform various calculations (e.g., combinations, operations) associated with a neural network to generate the output 530 (e.g., h^{m,τ+1}_{i,j}). The decoder module may utilize information from the likelihood module and the various sub-modules (steps 440 through 460) of the encoder module to update the latent variable, the channel estimates, or both. The decoder module may act independently on each PRG, utilizing the updated latent variable (e.g., the second set of values) along with the gradient information to improve the channel estimation (e.g., bring the channel estimation closer to the actual channel).

[0143] FIG. 6 shows an example of a network 600 that supports recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure. In some examples, the network 600 may be implemented by aspects of the wireless communications system 100, the network 400, or both. For example, the network 600 may be implemented by a UE 115, a network entity 105, the encoder 430, or any combination thereof, as described herein with reference to FIGs. 1 and 4.

[0144] In some cases, the network 600 may perform an attention operation (e.g., attention 605) associated with neural networks. For example, the attention 605 may include weighting portions of the input data (e.g., input 610) differently than other portions of the input data (e.g., enhancing portions of the data while diminishing other portions of the data). In some cases, applying the attention 605 may focus (e.g., modify, align) the input data (e.g., the observed DMRS symbols) with the known data (e.g., the known DMRS symbols). In some examples, the attention 605 may model interaction (e.g., correlation) within the data elements 620. For example, a data element 620-a, a data element 620-b, a data element 620-c, and a data element 620-d may interact with a data element 620-e, a data element 620-f, a data element 620-g, and a data element 620-h, and vice versa.

[0145] In some cases, an encoder module (e.g., module 430) may perform the attention 605 (e.g., a self-attention), for example, at various steps (e.g., steps 450) of a channel estimation process, as described herein with reference to FIG. 4. For example, the encoder module may (e.g., at 450-a) perform an intra-PRG attention operation to model correlation between resources (e.g., resource elements) of each PRB (e.g., PRB 310) of a PRG (e.g., PRG 305) and the other PRBs of the PRG (e.g., PRBs belonging to a single PRG bundle). The encoder module may determine a set of values of a latent variable (e.g., output 530) from a fusion operation, as described herein with reference to FIG. 5. The encoder module may flatten each subset of values associated with each PRB of the PRG. For example, the set of values (e.g., z^{m,τ+1}_{i,j}) may include four subsets of values (e.g., one subset per PRB, z^{m,1,τ+1}_{i,j} through z^{m,4,τ+1}_{i,j}). The encoder module may flatten (e.g., combine, compress to a single frequency row) the four subsets into the data element 620-a, the data element 620-b, the data element 620-c, and the data element 620-d (e.g., input 610). In some cases, the data element 620-e, the data element 620-f, the data element 620-g, and the data element 620-h may be duplicates (e.g., copies) of the data element 620-a, the data element 620-b, the data element 620-c, and the data element 620-d, respectively. In some examples, the intra-PRG attention may be utilized for modeling long range correlations (e.g., PRBs separated across the frequency axis).

[0146] In some cases, the encoder module may (e.g., at 450-b) perform an inter-PRG attention operation to model correlation between resources (e.g., resource elements) of each PRG (e.g., PRG 305) of a MIMO communication. The encoder module may determine the flattened subset of values associated with each PRB of each PRG and combine (e.g., concatenate, average, mean pooling) the flattened subsets (e.g., embedded subsets) into a single set of values per PRG as the data element 620-a, the data element 620-b, the data element 620-c, and the data element 620-d (e.g., input 610), respectively. In some cases, the data element 620-e, the data element 620-f, the data element 620-g, and the data element 620-h may be represented by the pooled values for the first through fourth PRGs, respectively. The encoder module may combine (e.g., add) the respective outputs (e.g., output 615, residual) of the attention 605 with the flattened subsets of values (e.g., the subsets before averaging) as output 615 for the inter-PRG attention. In some examples, the inter-PRG attention may facilitate information exchange across different PRG bundles.

[0147] In some cases, the encoder module may (e.g., at 450-c) perform a cross-MIMO attention operation to model correlation between each layer of multiple layers associated with the MIMO communication (e.g., interaction between antenna pairs per PRG bundle per PRB). In some cases, the MIMO communication may be the set of resources including the DMRS symbols 320 and the data symbols 315, as described herein with reference to FIG. 3. The encoder module may determine a source block (e.g., z^{m,τ+1}_{i,j}) per MIMO layer (e.g., per antenna pair). For example, in a two-dimensional grid, the data element 620-a, the data element 620-b, the data element 620-c, and the data element 620-d (e.g., input 610 for the cross-MIMO attention operation) may be represented by z^{m,τ+1}_{1,1}, z^{m,τ+1}_{1,2}, z^{m,τ+1}_{2,1}, and z^{m,τ+1}_{2,2}, respectively. In some cases, the data element 620-e, the data element 620-f, the data element 620-g, and the data element 620-h may be represented by z^{m,k,τ+1}_{1,1}, z^{m,k,τ+1}_{1,2}, z^{m,k,τ+1}_{2,1}, and z^{m,k,τ+1}_{2,2}, respectively. The encoder module may generate output 615 for the cross-MIMO attention. In some examples, the cross-MIMO attention may model interaction (e.g., correlation) between different transmission links per PRG per PRB (e.g., between different MIMO links in an equivariant way). The encoder module may utilize a same set of machine learning parameters for all iterations of a given attention calculation, in some examples. That is, of multiple attention calculations, each individual attention calculation may utilize a respective set of machine learning parameters, and the sets of machine learning parameters may differ across attention calculations.
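The self-attention step shared by the intra-PRG, inter-PRG, and cross-MIMO calculations can be sketched as follows (identity query/key/value projections are assumed to keep the sketch minimal; the learned projections would differ per calculation):

```python
import numpy as np

def self_attention(tokens):
    """Scaled dot-product self-attention over a set of flattened data
    elements (one token per PRB, PRG, or MIMO layer, depending on which
    attention step runs), with a residual connection on the output."""
    d = tokens.shape[-1]
    scores = tokens @ tokens.T / np.sqrt(d)        # pairwise interaction
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)       # row-wise softmax
    return tokens + attn @ tokens                  # residual connection
```

Note that the output is permutation-equivariant in the tokens: reordering the MIMO links reorders the outputs identically, which matches the equivariant treatment of links described above.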

[0148] FIG. 7 shows a block diagram 700 of a device 705 that supports recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure. The device 705 may be an example of aspects of a UE 115 or a network entity 105 as described herein. The device 705 may include a receiver 710, a transmitter 715, and a communications manager 720. The device 705, or one or more components of the device 705 (e.g., the receiver 710, the transmitter 715, and the communications manager 720), may include at least one processor (or processing circuitry), which may be coupled with at least one memory (or memory circuitry), to, individually or collectively, support or enable the described techniques. Each of these components may be in communication with one another (e.g., via one or more buses).

[0149] The receiver 710 may provide a means for receiving information such as packets, user data, control information, or any combination thereof associated with various information channels (e.g., control channels, data channels, information channels related to recurrent equivariant inference machines for channel estimation). Information may be passed on to other components of the device 705. The receiver 710 may utilize a single antenna or a set of multiple antennas.

[0150] The transmitter 715 may provide a means for transmitting signals generated by other components of the device 705. For example, the transmitter 715 may transmit information such as packets, user data, control information, or any combination thereof associated with various information channels (e.g., control channels, data channels, information channels related to recurrent equivariant inference machines for channel estimation). In some examples, the transmitter 715 may be co-located with a receiver 710 in a transceiver module. The transmitter 715 may utilize a single antenna or a set of multiple antennas.

[0151] The communications manager 720, the receiver 710, the transmitter 715, or various combinations thereof or various components thereof may be examples of means for performing various aspects of recurrent equivariant inference machines for channel estimation as described herein. For example, the communications manager 720, the receiver 710, the transmitter 715, or various combinations or components thereof may be capable of performing one or more of the functions described herein.

[0152] In some examples, the communications manager 720, the receiver 710, the transmitter 715, or various combinations or components thereof may be implemented in hardware (e.g., in communications management circuitry). The circuitry may comprise a processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described in the present disclosure. In some examples, at least one processor (or processing circuitry) and at least one memory (or memory circuitry) coupled with the at least one processor may be configured to perform one or more of the functions described herein (e.g., by one or more processors, individually or collectively, executing instructions stored in the at least one memory).

[0153] In another implementation, the communications manager 720, the receiver 710, the transmitter 715, or various combinations or components thereof may be implemented in code (e.g., as communications management software or firmware) executed by at least one processor. If implemented in code executed by at least one processor, the functions of the communications manager 720, the receiver 710, the transmitter 715, or various combinations or components thereof may be performed by a general-purpose processor, a DSP, a CPU, an ASIC, an FPGA, a microcontroller, or any combination of these or other programmable logic devices (e.g., configured as or otherwise supporting, individually or collectively, a means for performing the functions described in the present disclosure).

[0154] In some examples, the communications manager 720 may be configured to perform various operations (e.g., receiving, obtaining, determining, monitoring, outputting, transmitting) using or otherwise in cooperation with the receiver 710, the transmitter 715, or both. For example, the communications manager 720 may receive information from the receiver 710, send information to the transmitter 715, or be integrated in combination with the receiver 710, the transmitter 715, or both to obtain information, output information, or perform various other operations as described herein.

[0155] The communications manager 720 may support wireless communication in accordance with examples as disclosed herein. For example, the communications manager 720 is capable of, configured to, or operable to support a means for receiving an assignment of a set of resources associated with a channel including a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal. The communications manager 720 is capable of, configured to, or operable to support a means for generating, from the reference signal received over the second subset of resources in accordance with an MMSE operation, a first set of multiple channel estimations associated with respective layers of a set of multiple layers of the channel for the set of resources. The communications manager 720 is capable of, configured to, or operable to support a means for generating, in accordance with a nonlinear two-dimensional interpolation of the channel for the set of resources, a second set of multiple channel estimations and a set of multiple values of a latent variable, the second set of multiple channel estimations and the set of multiple values associated with respective layers of the set of multiple layers of the channel for the set of resources, where the nonlinear two-dimensional interpolation of the channel is based on the first set of multiple channel estimations. The communications manager 720 is capable of, configured to, or operable to support a means for performing a refinement operation on the second set of multiple channel estimations including one or more iterations, where each iteration of the one or more iterations is performed in accordance with a same set of machine learning parameters. 
In some examples, to perform each iteration of the one or more iterations, the communications manager 720 may be configured as or otherwise support a means for generating respective gradients associated with the second set of multiple channel estimations based on the second set of multiple channel estimations for the second subset of resources and measured observations of the second subset of resources, generating, based on a first set of values of the set of multiple values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients, a second set of values of the latent variable, and modifying the second set of multiple channel estimations associated with the set of multiple layers based on the second set of values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients.
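
One iteration of the refinement operation described above (a likelihood gradient computed from the pilot observations, a latent-variable update, and a modification of the channel estimates, all under one shared parameter set) might be sketched as follows. All names, shapes, and the specific update rules are illustrative assumptions, not the claimed implementation.

```python
import numpy as np

def refine(h, z, y_pilot, mask, params, num_iters=3):
    """Hypothetical sketch of the iterative refinement loop.

    h      : current channel estimate over the resource grid (n,)
    z      : latent state, one embedding per resource element (n, d)
    y_pilot: measured observations on the reference-signal REs
    mask   : boolean mask selecting the reference-signal REs
    params : one shared parameter set reused for every iteration
    """
    w_enc, w_dec = params
    for _ in range(num_iters):
        # Likelihood step: gradient of the data-fit term, nonzero only
        # where observations exist (the second subset of resources).
        grad = np.zeros_like(h)
        grad[mask] = h[mask] - y_pilot
        # Encoder step: a new latent from (h, grad, previous latent).
        z = np.tanh(np.stack([h, grad], axis=-1) @ w_enc + z)
        # Decoder step: modify the estimate from (z, h, grad).
        h = h + (z @ w_dec).squeeze(-1) - 0.1 * grad
    return h, z

rng = np.random.default_rng(0)
n, d = 6, 4
mask = np.zeros(n, dtype=bool)
mask[::2] = True                       # pilots on every other RE
y_pilot = np.ones(mask.sum())
h0, z0 = np.zeros(n), np.zeros((n, d))
params = (rng.standard_normal((2, d)) * 0.1,
          rng.standard_normal((d, 1)) * 0.1)
h_out, z_out = refine(h0, z0, y_pilot, mask, params)
```

Note that `params` is constructed once and reused across all iterations, matching the statement that each iteration is performed in accordance with a same set of machine learning parameters.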

[0156] By including or configuring the communications manager 720 in accordance with examples as described herein, the device 705 (e.g., at least one processor controlling or otherwise coupled with the receiver 710, the transmitter 715, the communications manager 720, or a combination thereof) may support techniques for more accurate channel estimations, more efficient utilization of communication resources, and decreased memory and computational overhead.

[0157] FIG. 8 shows a block diagram 800 of a device 805 that supports recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure. The device 805 may be an example of aspects of a device 705, a UE 115, or a network entity 105 as described herein. The device 805 may include a receiver 810, a transmitter 815, and a communications manager 820. The device 805, or one or more components of the device 805 (e.g., the receiver 810, the transmitter 815, and the communications manager 820), may include at least one processor (or processing circuitry), which may be coupled with at least one memory (or memory circuitry), to support the described techniques. Each of these components may be in communication with one another (e.g., via one or more buses).

[0158] The receiver 810 may provide a means for receiving information such as packets, user data, control information, or any combination thereof associated with various information channels (e.g., control channels, data channels, information channels related to recurrent equivariant inference machines for channel estimation). Information may be passed on to other components of the device 805. The receiver 810 may utilize a single antenna or a set of multiple antennas.

[0159] The transmitter 815 may provide a means for transmitting signals generated by other components of the device 805. For example, the transmitter 815 may transmit information such as packets, user data, control information, or any combination thereof associated with various information channels (e.g., control channels, data channels, information channels related to recurrent equivariant inference machines for channel estimation). In some examples, the transmitter 815 may be co-located with a receiver 810 in a transceiver module. The transmitter 815 may utilize a single antenna or a set of multiple antennas.

[0160] The device 805, or various components thereof, may be an example of means for performing various aspects of recurrent equivariant inference machines for channel estimation as described herein. For example, the communications manager 820 may include a scheduling component 825, an MMSE component 830, a coarse network component 835, a refinement network component 840, or any combination thereof. The communications manager 820 may be an example of aspects of a communications manager 720 as described herein. In some examples, the communications manager 820, or various components thereof, may be configured to perform various operations (e.g., receiving, obtaining, monitoring, outputting, transmitting) using or otherwise in cooperation with the receiver 810, the transmitter 815, or both. For example, the communications manager 820 may receive information from the receiver 810, send information to the transmitter 815, or be integrated in combination with the receiver 810, the transmitter 815, or both to obtain information, output information, or perform various other operations as described herein.

[0161] The communications manager 820 may support wireless communication in accordance with examples as disclosed herein. The scheduling component 825 is capable of, configured to, or operable to support a means for receiving an assignment of a set of resources associated with a channel including a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal. The MMSE component 830 is capable of, configured to, or operable to support a means for generating, from the reference signal received over the second subset of resources in accordance with an MMSE operation, a first set of multiple channel estimations associated with respective layers of a set of multiple layers of the channel for the set of resources. 
The coarse network component 835 is capable of, configured to, or operable to support a means for generating, in accordance with a nonlinear two- dimensional interpolation of the channel for the set of resources, a second set of multiple channel estimations and a set of multiple values of a latent variable, the second set of multiple channel estimations and the set of multiple values associated with respective layers of the set of multiple layers of the channel for the set of resources, where the nonlinear two-dimensional interpolation of the channel is based on the first set of multiple channel estimations. The refinement network component 840 is capable of, configured to, or operable to support a means for performing a refinement operation on the second set of multiple channel estimations including one or more iterations, where each iteration of the one or more iterations is performed in accordance with a same set of machine learning parameters. In some examples, to perform each iteration of the one or more iterations, the likelihood component 845 may be configured as or otherwise support a means for generating respective gradients associated with the second set of multiple channel estimations based on the second set of multiple channel estimations for the second subset of resources and measured observations of the second subset of resources, the encoder component 850 may be configured as or otherwise support a means for generating, based on a first set of values of the set of multiple values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients, a second set of values of the latent variable, and the decoder component 855 may be configured as or otherwise support a means for modifying the second set of multiple channel estimations associated with the set of multiple layers based on the second set of values of the latent variable, the second set of multiple channel estimations, 
the set of machine learning parameters, and the respective gradients.

[0162] Additionally, or alternatively, the communications manager 820 may support wireless communication in accordance with examples as disclosed herein. The scheduling component 825 may be configured as or otherwise support a means for receiving an assignment of a set of resources associated with a channel including a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal. The coarse network component 835 may be configured as or otherwise support a means for generating a set of multiple channel estimations associated with respective layers of a set of multiple layers of the channel for the set of resources. The refinement network component 840 may be configured as or otherwise support a means for performing a refinement operation on the set of multiple channel estimations including one or more iterations. In some examples, to perform each iteration of the one or more iterations, the likelihood component 845 may be configured as or otherwise support a means for generating respective gradients associated with the set of multiple channel estimations based on the set of multiple channel estimations for the second subset of resources and measured observations of the second subset of resources, the encoder component 850 may be configured as or otherwise support a means for generating, based on a first set of values of a latent variable, the set of multiple channel estimations, and the respective gradients, a second set of values of the latent variable, and the decoder component 855 may be configured as or otherwise support a means for modifying the set of multiple channel estimations associated with the set of multiple layers based on the second set of values of the latent variable, the set of multiple channel estimations, and the respective gradients.

[0163] FIG. 9 shows a block diagram 900 of a communications manager 920 that supports recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure. The communications manager 920 may be an example of aspects of a communications manager 720, a communications manager 820, or both, as described herein. The communications manager 920, or various components thereof, may be an example of means for performing various aspects of recurrent equivariant inference machines for channel estimation as described herein. For example, the communications manager 920 may include a scheduling component 925, an MMSE component 930, a coarse network component 935, a refinement network component 940, a likelihood component 945, an encoder component 950, a decoder component 955, or any combination thereof. Each of these components, or components or subcomponents thereof (e.g., one or more processors, one or more memories), may communicate, directly or indirectly, with one another (e.g., via one or more buses) which may include communications within a protocol layer of a protocol stack, communications associated with a logical channel of a protocol stack (e.g., between protocol layers of a protocol stack, within a device, component, or virtualized component associated with a network entity 105, between devices, components, or virtualized components associated with a network entity 105), or any combination thereof.

[0164] The communications manager 920 may support wireless communication in accordance with examples as disclosed herein. The scheduling component 925 is capable of, configured to, or operable to support a means for receiving an assignment of a set of resources associated with a channel including a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal. The MMSE component 930 is capable of, configured to, or operable to support a means for generating, from the reference signal received over the second subset of resources in accordance with an MMSE operation, a first set of multiple channel estimations associated with respective layers of a set of multiple layers of the channel for the set of resources. The coarse network component 935 is capable of, configured to, or operable to support a means for generating, in accordance with a nonlinear two- dimensional interpolation of the channel for the set of resources, a second set of multiple channel estimations and a set of multiple values of a latent variable, the second set of multiple channel estimations and the set of multiple values associated with respective layers of the set of multiple layers of the channel for the set of resources, where the nonlinear two-dimensional interpolation of the channel is based on the first set of multiple channel estimations. The refinement network component 940 is capable of, configured to, or operable to support a means for performing a refinement operation on the second set of multiple channel estimations including one or more iterations, where each iteration of the one or more iterations is performed in accordance with a same set of machine learning parameters. 
In some examples, to perform each iteration of the one or more iterations, the likelihood component 945 is capable of, configured to, or operable to support a means for generating respective gradients associated with the second set of multiple channel estimations based on the second set of multiple channel estimations for the second subset of resources and measured observations of the second subset of resources, the encoder component 950 is capable of, configured to, or operable to support a means for generating, based on a first set of values of the set of multiple values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients, a second set of values of the latent variable, and the decoder component 955 is capable of, configured to, or operable to support a means for modifying the second set of multiple channel estimations associated with the set of multiple layers based on the second set of values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients.

[0165] In some examples, the refinement operation is a first refinement operation and the set of machine learning parameters is a first set of machine learning parameters, and the refinement network component 940 is capable of, configured to, or operable to support a means for performing a second refinement operation on the second set of multiple channel estimations, the second refinement operation including one or more second iterations performed in accordance with a same second set of machine learning parameters, where the first refinement operation and the second refinement operation are associated with a respective attention calculation of a set of multiple attention calculations.

[0166] In some examples, the set of multiple attention calculations includes an intra- PRB group calculation, an inter-PRB group calculation, a cross-MIMO calculation, an MLP calculation, or any combination thereof.

[0167] In some examples, the MMSE component 930 is capable of, configured to, or operable to support a means for performing the MMSE operation based on a resource configuration pattern of the second subset of resources allocated for the reference signal, the reference signal including a DMRS.
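
As a point of reference, a textbook linear MMSE interpolation from pilot (e.g., DMRS) observations, assuming a known channel correlation across resource elements, looks like the following. This is a standard LMMSE sketch under an assumed exponential correlation model, not the specific MMSE operation of the disclosure; all names and values are illustrative.

```python
import numpy as np

def lmmse_estimate(y_pilot, pilot_idx, corr, noise_var):
    """Linear MMSE interpolation: h_hat = R_hp (R_pp + s^2 I)^-1 y_p.

    y_pilot  : observations on the pilot resource elements
    pilot_idx: indices of the pilot REs within the grid
    corr     : assumed channel correlation matrix over all REs
    noise_var: assumed observation noise variance
    """
    R_pp = corr[np.ix_(pilot_idx, pilot_idx)]   # pilot-pilot correlation
    R_hp = corr[:, pilot_idx]                   # grid-pilot correlation
    A = R_hp @ np.linalg.inv(R_pp + noise_var * np.eye(len(pilot_idx)))
    return A @ y_pilot

# Exponential correlation across 12 REs, pilots on every fourth RE
n = 12
idx = np.arange(n)
corr = 0.9 ** np.abs(idx[:, None] - idx[None, :])
pilots = np.array([0, 4, 8])
h_true = np.ones(n)
y = h_true[pilots] + 0.01 * np.random.default_rng(1).standard_normal(3)
h_hat = lmmse_estimate(y, pilots, corr, noise_var=1e-4)
```

The pilot index pattern here plays the role of the resource configuration pattern mentioned above: changing the DMRS pattern changes `pilots` and therefore the interpolation weights.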

[0168] In some examples, to support generating the respective gradients, the likelihood component 945 is capable of, configured to, or operable to support a means for generating respective sets of values of a residual variable based on a difference between the measured observations of the second subset of resources and the second set of multiple channel estimations for the second subset of resources. In some examples, to support generating the respective gradients, the likelihood component 945 is capable of, configured to, or operable to support a means for combining the respective sets of values of the residual variable, the measured observations of the second subset of resources, and a quantity of mask bits.
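
The residual-plus-mask construction described in this paragraph can be sketched as a small feature builder: a residual on the observed resource elements, combined (here, stacked) with the observations and a mask bit per resource element. The feature layout and names are illustrative assumptions.

```python
import numpy as np

def likelihood_features(h_est, y_obs, mask):
    """Combine residual, observations, and mask bits per RE.

    h_est: current channel estimates over the grid
    y_obs: measured observations (entries where mask is False unused)
    mask : one mask bit per RE, True on the reference-signal subset
    """
    residual = np.where(mask, y_obs - h_est, 0.0)
    observed = np.where(mask, y_obs, 0.0)
    return np.stack([residual, observed, mask.astype(float)], axis=-1)

h = np.array([0.8, 1.1, 0.9, 1.0])
y = np.array([1.0, 0.0, 1.0, 0.0])        # observations
m = np.array([True, False, True, False])  # reference-signal REs only
feat = likelihood_features(h, y, m)
```

The mask bits let downstream modules distinguish "residual is zero because the estimate matches" from "residual is zero because no observation exists at this resource element."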

[0169] In some examples, to support generating the second set of values of the latent variable, the encoder component 950 is capable of, configured to, or operable to support a means for combining the second set of multiple channel estimations for the second subset of resources, the respective gradients, and respective values of the first set of values of the latent variable based on generating the respective gradients.

[0170] In some examples, to support generating the second set of values of the latent variable, the encoder component 950 is capable of, configured to, or operable to support a means for modeling a correlation between resources of each resource block of a group of resource blocks and other resource blocks of the group of resource blocks.

[0171] In some examples, to support generating the second set of values of the latent variable, the encoder component 950 is capable of, configured to, or operable to support a means for modeling a correlation between resources of each group of a set of multiple groups of resources of the set of resources and other groups of the set of multiple groups of resources, where each group of the set of multiple groups of resources includes a set of multiple resource blocks.

[0172] In some examples, to support generating the second set of values of the latent variable, the encoder component 950 is capable of, configured to, or operable to support a means for modeling a correlation between each layer of the set of multiple layers for the set of resources.

[0173] In some examples, to support modifying the second set of multiple channel estimations, the decoder component 955 is capable of, configured to, or operable to support a means for combining the second set of values of the latent variable, the second set of multiple channel estimations, and the respective gradients based on the set of machine learning parameters.

[0174] In some examples, the nonlinear two-dimensional interpolation of the channel is based on a machine learning model.
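
A minimal stand-in for a learned nonlinear two-dimensional interpolation might map each (frequency, time) grid coordinate, together with a nearby pilot estimate, through a small MLP. Everything here (the nearest-pilot lookup, the layer sizes, the random weights) is an assumption for illustration, not the coarse network of the disclosure.

```python
import numpy as np

def coarse_interpolate(h_pilot, pilot_ft, grid_ft, w1, b1, w2, b2):
    """Nonlinear 2-D interpolation sketch via a tiny MLP.

    h_pilot : per-pilot channel estimates (p,)
    pilot_ft: (frequency, time) coordinates of each pilot (p, 2)
    grid_ft : (frequency, time) coordinates of each grid RE (n, 2)
    """
    # Nearest pilot (in frequency-time distance) for each grid point
    d = np.linalg.norm(grid_ft[:, None, :] - pilot_ft[None, :, :], axis=-1)
    nearest = d.argmin(axis=1)
    # MLP input: grid coordinate plus the nearest pilot's estimate
    x = np.concatenate([grid_ft, h_pilot[nearest, None]], axis=-1)
    hidden = np.tanh(x @ w1 + b1)
    return (hidden @ w2 + b2).squeeze(-1)

rng = np.random.default_rng(2)
pilot_ft = np.array([[0., 0.], [4., 0.], [8., 0.]])
grid_ft = np.stack(np.meshgrid(np.arange(9.), np.arange(2.)),
                   axis=-1).reshape(-1, 2)   # 9 subcarriers x 2 symbols
h_pilot = np.array([1.0, 0.5, 1.0])
hdim = 8
w1 = rng.standard_normal((3, hdim)) * 0.1
b1 = np.zeros(hdim)
w2 = rng.standard_normal((hdim, 1)) * 0.1
b2 = np.zeros(1)
h_coarse = coarse_interpolate(h_pilot, pilot_ft, grid_ft, w1, b1, w2, b2)
```

Because the mapping is learned rather than fixed, such a network can capture nonlinear variation across both frequency and time, unlike a purely linear interpolator.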

[0175] In some examples, the first set of multiple channel estimations and the second set of multiple channel estimations are associated with a set of multiple single-input and single-output antenna pairs.

[0176] In some examples, each iteration of the one or more iterations is performed by a refinement network including a likelihood module, an encoder module, and a decoder module, the refinement network including a machine learning model. In some examples, each refinement network executes according to the same set of machine learning parameters.

[0177] The communications manager 920 may support wireless communication in accordance with examples as disclosed herein. The scheduling component 925 may be configured as or otherwise support a means for receiving an assignment of a set of resources associated with a channel including a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal. The coarse network component 935 may be configured as or otherwise support a means for generating a set of multiple channel estimations associated with respective layers of a set of multiple layers of the channel for the set of resources. The refinement network component 940 may be configured as or otherwise support a means for performing a refinement operation on the set of multiple channel estimations including one or more iterations. In some examples, to perform each iteration of the one or more iterations, the likelihood component 945 may be configured as or otherwise support a means for generating respective gradients associated with the set of multiple channel estimations based on the set of multiple channel estimations for the second subset of resources and measured observations of the second subset of resources, the encoder component 950 may be configured as or otherwise support a means for generating, based on a first set of values of a latent variable, the set of multiple channel estimations, and the respective gradients, a second set of values of the latent variable, and the decoder component 955 may be configured as or otherwise support a means for modifying the set of multiple channel estimations associated with the set of multiple layers based on the second set of values of the latent variable, the set of multiple channel estimations, and the respective gradients. 
[0178] In some examples, to support generating the respective gradients, the likelihood component 945 may be configured as or otherwise support a means for generating respective sets of values of a residual variable based on a difference between the measured observations of the second subset of resources and the set of multiple channel estimations for the second subset of resources. In some examples, to support generating the respective gradients, the likelihood component 945 may be configured as or otherwise support a means for combining the respective sets of values of the residual variable, the measured observations of the second subset of resources, and a quantity of mask bits.

[0179] In some examples, to support generating the second set of values of the latent variable, the encoder component 950 may be configured as or otherwise support a means for combining the set of multiple channel estimations for the second subset of resources, the respective gradients, and respective values of the first set of values of the latent variable based on generating the respective gradients.

[0180] In some examples, to support generating the second set of values of the latent variable, the encoder component 950 may be configured as or otherwise support a means for modeling correlation between resources of each resource block of a group of resource blocks and other resource blocks of the group of resource blocks.

[0181] In some examples, to support generating the second set of values of the latent variable, the encoder component 950 may be configured as or otherwise support a means for modeling correlation between resources of each group of a set of multiple groups of resources of the set of resources and other groups of the set of multiple groups of resources, where each group of the set of multiple groups of resources includes a set of multiple resource blocks.

[0182] In some examples, to support generating the second set of values of the latent variable, the encoder component 950 may be configured as or otherwise support a means for modeling correlation between each layer of the set of multiple layers for the set of resources.

[0183] In some examples, to support modifying the set of multiple channel estimations, the decoder component 955 may be configured as or otherwise support a means for combining the second set of values of the latent variable, the set of multiple channel estimations, and the respective gradients.

[0184] In some examples, initial values of the set of multiple channel estimations are associated with SISO antenna pairs.

[0185] In some examples, the second subset of resources are configured according to a resource configuration pattern of a set of resource configuration patterns.

[0186] In some examples, the set of resource configuration patterns is a set of DMRS patterns.

[0187] In some examples, each iteration is performed by a refinement network including a likelihood module, an encoder module, and a decoder module, and each refinement network further includes a respective parameter associated with a machine learning operation.

[0188] In some examples, the set of resources includes one or more groups of resources, and each respective layer of the set of multiple layers is associated with a respective antenna pair of a set of multiple SISO antenna pairs.

[0189] FIG. 10 shows a diagram of a system 1000 including a device 1005 that supports recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure. The device 1005 may be an example of or include the components of a device 705, a device 805, or a UE 115 as described herein. The device 1005 may communicate (e.g., wirelessly) with one or more network entities 105, one or more UEs 115, or any combination thereof. The device 1005 may include components for bi-directional voice and data communications including components for transmitting and receiving communications, such as a communications manager 1020, an input/output (I/O) controller 1010, a transceiver 1015, an antenna 1025, at least one memory 1030 (or memory circuitry), code 1035, and at least one processor 1040 (or processing circuitry). These components may be in electronic communication or otherwise coupled (e.g., operatively, communicatively, functionally, electronically, electrically) via one or more buses (e.g., a bus 1045).

[0190] The I/O controller 1010 may manage input and output signals for the device 1005. The I/O controller 1010 may also manage peripherals not integrated into the device 1005. In some cases, the I/O controller 1010 may represent a physical connection or port to an external peripheral. In some cases, the I/O controller 1010 may utilize an operating system such as iOS®, ANDROID®, MS-DOS®, MS-WINDOWS®, OS/2®, UNIX®, LINUX®, or another known operating system. Additionally, or alternatively, the I/O controller 1010 may represent or interact with a modem, a keyboard, a mouse, a touchscreen, or a similar device. In some cases, the I/O controller 1010 may be implemented as part of one or more processors, such as the at least one processor 1040 (or processing circuitry). In some cases, a user may interact with the device 1005 via the I/O controller 1010 or via hardware components controlled by the I/O controller 1010.

[0191] In some cases, the device 1005 may include a single antenna 1025. However, in some other cases, the device 1005 may have more than one antenna 1025, which may be capable of concurrently transmitting or receiving multiple wireless transmissions. The transceiver 1015 may communicate bi-directionally, via the one or more antennas 1025, wired, or wireless links as described herein. For example, the transceiver 1015 may represent a wireless transceiver and may communicate bi-directionally with another wireless transceiver. The transceiver 1015 may also include a modem to modulate the packets, to provide the modulated packets to one or more antennas 1025 for transmission, and to demodulate packets received from the one or more antennas 1025. The transceiver 1015, or the transceiver 1015 and one or more antennas 1025, may be an example of a transmitter 715, a transmitter 815, a receiver 710, a receiver 810, or any combination thereof or component thereof, as described herein.

[0192] The at least one memory 1030 (or memory circuitry) may include random access memory (RAM) and read-only memory (ROM). The at least one memory 1030 may store computer-readable, computer-executable code 1035 including instructions that, when executed by the at least one processor 1040, cause the device 1005 to perform various functions described herein. The code 1035 may be stored in a non-transitory computer-readable medium such as system memory or another type of memory. In some cases, the code 1035 may not be directly executable by the at least one processor 1040 but may cause a computer (e.g., when compiled and executed) to perform functions described herein. In some cases, the at least one memory 1030 may contain, among other things, a basic I/O system (BIOS) which may control basic hardware or software operation such as the interaction with peripheral components or devices.

[0193] The at least one processor 1040 (or processing circuitry) may include an intelligent hardware device (e.g., a general-purpose processor, a DSP, a CPU, a microcontroller, an ASIC, an FPGA, a programmable logic device, a discrete gate or transistor logic component, a discrete hardware component, or any combination thereof). In some cases, the at least one processor 1040 may be configured to operate a memory array using a memory controller. In some other cases, a memory controller may be integrated into the at least one processor 1040. The at least one processor 1040 may be configured to execute computer-readable instructions stored in a memory (e.g., the at least one memory 1030) to cause the device 1005 to perform various functions (e.g., functions or tasks supporting recurrent equivariant inference machines for channel estimation). For example, the device 1005 or a component of the device 1005 may include at least one processor 1040 (or processing circuitry) and at least one memory 1030 (or memory circuitry) coupled with or to the at least one processor 1040, the at least one processor 1040 and at least one memory 1030 configured to perform various functions described herein. In some examples, the at least one processor 1040 may include multiple processors and the at least one memory 1030 may include multiple memories. One or more of the multiple processors may be coupled with one or more of the multiple memories, which may, individually or collectively, be configured to perform various functions herein. 
In some examples, the at least one processor 1040 may be a component of a processing system, which may refer to a system (such as a series) of machines, circuitry (including, for example, one or both of processor circuitry (which may include the at least one processor 1040) and memory circuitry (which may include the at least one memory 1030)), or components, that receives or obtains inputs and processes the inputs to produce, generate, or obtain a set of outputs. The processing system may be configured to perform one or more of the functions described herein. As such, the at least one processor 1040 or a processing system including the at least one processor 1040 may be configured to, configurable to, or operable to cause the device 1005 to perform one or more of the functions described herein. Further, as described herein, being “configured to,” being “configurable to,” and being “operable to” may be used interchangeably and may be associated with a capability, when executing code stored in the at least one memory 1030 or otherwise, to perform one or more of the functions described herein.

[0194] The communications manager 1020 may support wireless communication in accordance with examples as disclosed herein. For example, the communications manager 1020 is capable of, configured to, or operable to support a means for receiving an assignment of a set of resources associated with a channel including a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal. The communications manager 1020 is capable of, configured to, or operable to support a means for generating, from the reference signal received over the second subset of resources in accordance with an MMSE operation, a first set of multiple channel estimations associated with respective layers of a set of multiple layers of the channel for the set of resources. The communications manager 1020 is capable of, configured to, or operable to support a means for generating, in accordance with a nonlinear two-dimensional interpolation of the channel for the set of resources, a second set of multiple channel estimations and a set of multiple values of a latent variable, the second set of multiple channel estimations and the set of multiple values associated with respective layers of the set of multiple layers of the channel for the set of resources, where the nonlinear two-dimensional interpolation of the channel is based on the first set of multiple channel estimations. The communications manager 1020 is capable of, configured to, or operable to support a means for performing a refinement operation on the second set of multiple channel estimations including one or more iterations, where each iteration of the one or more iterations is performed in accordance with a same set of machine learning parameters. 
In some examples, to perform each iteration of the one or more iterations, the communications manager 1020 may be configured as or otherwise support a means for generating respective gradients associated with the second set of multiple channel estimations based on the second set of multiple channel estimations for the second subset of resources and measured observations of the second subset of resources, generating, based on a first set of values of the set of multiple values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients, a second set of values of the latent variable, and modifying the second set of multiple channel estimations associated with the set of multiple layers based on the second set of values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients.
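For illustration only, one iteration of the refinement operation described above might be sketched as follows. The array shapes, the dictionary of shared parameters `theta`, and the specific `tanh`-based update rule are assumptions made for this sketch; they are not the disclosed refinement network. The sketch does show the three claimed steps in order: a gradient computed only over the reference-signal subset of resources, a latent-variable update, and a modification of the per-layer channel estimates, with the same parameters reused at every iteration.

```python
import numpy as np

def refinement_iteration(h_est, latent, y_pilot, pilot_mask, theta):
    """One illustrative iteration of the recurrent refinement.

    h_est      : (L, T, F) complex per-layer channel estimates
    latent     : (L, T, F) real latent variable values
    y_pilot    : (L, T, F) measured observations on reference-signal resources
    pilot_mask : (T, F) boolean, True where reference signals were sent
    theta      : dict of machine learning parameters shared across iterations
    """
    # Gradient of a data-fidelity term, evaluated only on the
    # reference-signal subset of resources (zero elsewhere).
    grad = np.where(pilot_mask, h_est - y_pilot, 0.0)

    # Generate new latent values from the previous latent values, the
    # estimates, and the gradients, using the same parameters theta
    # at every iteration (illustrative update rule).
    latent_new = np.tanh(theta["w_z"] * latent
                         + theta["w_h"] * np.abs(h_est)
                         + theta["w_g"] * np.abs(grad))

    # Modify the channel estimates based on the new latent values,
    # the current estimates, and the gradients.
    h_new = h_est - theta["step"] * grad * latent_new
    return h_new, latent_new
```

Because `theta` is passed unchanged into every call, running this function repeatedly corresponds to the claimed behavior of performing each iteration in accordance with a same set of machine learning parameters.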

[0195] Additionally, or alternatively, the communications manager 1020 may support wireless communication in accordance with examples as disclosed herein. For example, the communications manager 1020 may be configured as or otherwise support a means for receiving an assignment of a set of resources associated with a channel including a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal. The communications manager 1020 may be configured as or otherwise support a means for generating a set of multiple channel estimations associated with respective layers of a set of multiple layers of the channel for the set of resources. The communications manager 1020 may be configured as or otherwise support a means for performing a refinement operation on the set of multiple channel estimations including one or more iterations. In some examples, to perform each iteration of the one or more iterations, the communications manager 1020 may be configured as or otherwise support a means for generating respective gradients associated with the set of multiple channel estimations based on the set of multiple channel estimations for the second subset of resources and measured observations of the second subset of resources, generating, based on a first set of values of a latent variable, the set of multiple channel estimations, and the respective gradients, a second set of values of the latent variable, and modifying the set of multiple channel estimations associated with the set of multiple layers based on the second set of values of the latent variable, the set of multiple channel estimations, and the respective gradients.

[0196] By including or configuring the communications manager 1020 in accordance with examples as described herein, the device 1005 may support techniques for improved communication reliability, reduced latency, improved user experience related to reduced processing, reduced power consumption, more efficient utilization of communication resources, more accurate channel estimations, and decreased memory and computational overhead.

[0197] In some examples, the communications manager 1020 may be configured to perform various operations (e.g., receiving, monitoring, transmitting) using or otherwise in cooperation with the transceiver 1015, the one or more antennas 1025, or any combination thereof. Although the communications manager 1020 is illustrated as a separate component, in some examples, one or more functions described with reference to the communications manager 1020 may be supported by or performed by the at least one processor 1040, the at least one memory 1030, the code 1035, or any combination thereof. For example, the code 1035 may include instructions executable by the at least one processor 1040 to cause the device 1005 to perform various aspects of recurrent equivariant inference machines for channel estimation as described herein, or the at least one processor 1040 and the at least one memory 1030 may be otherwise configured to, individually or collectively, perform or support such operations.

[0198] FIG. 11 shows a diagram of a system 1100 including a device 1105 that supports recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure. The device 1105 may be an example of or include the components of a device 705, a device 805, or a network entity 105 as described herein. The device 1105 may communicate with one or more network entities 105, one or more UEs 115, or any combination thereof, which may include communications over one or more wired interfaces, over one or more wireless interfaces, or any combination thereof. The device 1105 may include components that support outputting and obtaining communications, such as a communications manager 1120, a transceiver 1110, an antenna 1115, at least one memory 1125 (or memory circuitry), code 1130, and at least one processor 1135 (or processing circuitry). These components may be in electronic communication or otherwise coupled (e.g., operatively, communicatively, functionally, electronically, electrically) via one or more buses (e.g., a bus 1140).

[0199] The transceiver 1110 may support bi-directional communications via wired links, wireless links, or both as described herein. In some examples, the transceiver 1110 may include a wired transceiver and may communicate bi-directionally with another wired transceiver. Additionally, or alternatively, in some examples, the transceiver 1110 may include a wireless transceiver and may communicate bi-directionally with another wireless transceiver. In some examples, the device 1105 may include one or more antennas 1115, which may be capable of transmitting or receiving wireless transmissions (e.g., concurrently). The transceiver 1110 may also include a modem to modulate signals, to provide the modulated signals for transmission (e.g., by one or more antennas 1115, by a wired transmitter), to receive modulated signals (e.g., from one or more antennas 1115, from a wired receiver), and to demodulate signals. In some implementations, the transceiver 1110 may include one or more interfaces, such as one or more interfaces coupled with the one or more antennas 1115 that are configured to support various receiving or obtaining operations, or one or more interfaces coupled with the one or more antennas 1115 that are configured to support various transmitting or outputting operations, or a combination thereof. In some implementations, the transceiver 1110 may include or be configured for coupling with one or more processors or one or more memory components that are operable to perform or support operations based on received or obtained information or signals, or to generate information or other signals for transmission or other outputting, or any combination thereof. 
In some implementations, the transceiver 1110, or the transceiver 1110 and the one or more antennas 1115, or the transceiver 1110 and the one or more antennas 1115 and one or more processors or one or more memory components (e.g., the at least one processor 1135, the at least one memory 1125, or both), may be included in a chip or chip assembly that is installed in the device 1105. In some examples, the transceiver 1110 may be operable to support communications via one or more communications links (e.g., a communication link 125, a backhaul communication link 120, a midhaul communication link 162, a fronthaul communication link 168).

[0200] The at least one memory 1125 may include RAM, ROM, or any combination thereof. The at least one memory 1125 may store computer-readable, computer-executable code 1130 including instructions that, when executed by one or more of the at least one processor 1135, cause the device 1105 to perform various functions described herein. The code 1130 may be stored in a non-transitory computer-readable medium such as system memory or another type of memory. In some cases, the code 1130 may not be directly executable by a processor of the at least one processor 1135 but may cause a computer (e.g., when compiled and executed) to perform functions described herein. In some cases, the at least one memory 1125 may contain, among other things, a BIOS which may control basic hardware or software operation such as the interaction with peripheral components or devices. In some examples, the at least one processor 1135 may include multiple processors and the at least one memory 1125 may include multiple memories. One or more of the multiple processors may be coupled with one or more of the multiple memories which may, individually or collectively, be configured to perform various functions herein (for example, as part of a processing system).

[0201] The at least one processor 1135 may include an intelligent hardware device (e.g., a general-purpose processor, a DSP, an ASIC, a CPU, an FPGA, a microcontroller, a programmable logic device, discrete gate or transistor logic, a discrete hardware component, or any combination thereof). In some cases, the at least one processor 1135 may be configured to operate a memory array using a memory controller. In some other cases, a memory controller may be integrated into one or more of the at least one processor 1135. The at least one processor 1135 may be configured to execute computer-readable instructions stored in a memory (e.g., one or more of the at least one memory 1125) to cause the device 1105 to perform various functions (e.g., functions or tasks supporting recurrent equivariant inference machines for channel estimation). For example, the device 1105 or a component of the device 1105 may include at least one processor 1135 (or processing circuitry) and at least one memory 1125 (or memory circuitry) coupled with one or more of the at least one processor 1135, the at least one processor 1135 and the at least one memory 1125 configured to perform various functions described herein. The at least one processor 1135 may be an example of a cloud-computing platform (e.g., one or more physical nodes and supporting software such as operating systems, virtual machines, or container instances) that may host the functions (e.g., by executing code 1130) to perform the functions of the device 1105. The at least one processor 1135 may be any one or more suitable processors capable of executing scripts or instructions of one or more software programs stored in the device 1105 (such as within one or more of the at least one memory 1125). In some examples, the at least one processor 1135 may include multiple processors and the at least one memory 1125 may include multiple memories. 
One or more of the multiple processors may be coupled with one or more of the multiple memories, which may, individually or collectively, be configured to perform various functions herein. In some examples, the at least one processor 1135 may be a component of a processing system, which may refer to a system (such as a series) of machines, circuitry (including, for example, one or both of processor circuitry (which may include the at least one processor 1135) and memory circuitry (which may include the at least one memory 1125)), or components, that receives or obtains inputs and processes the inputs to produce, generate, or obtain a set of outputs. The processing system may be configured to perform one or more of the functions described herein. As such, the at least one processor 1135 or a processing system including the at least one processor 1135 may be configured to, configurable to, or operable to cause the device 1105 to perform one or more of the functions described herein. Further, as described herein, being “configured to,” being “configurable to,” and being “operable to” may be used interchangeably and may be associated with a capability, when executing code stored in the at least one memory 1125 or otherwise, to perform one or more of the functions described herein.

[0202] In some examples, a bus 1140 may support communications of (e.g., within) a protocol layer of a protocol stack. In some examples, a bus 1140 may support communications associated with a logical channel of a protocol stack (e.g., between protocol layers of a protocol stack), which may include communications performed within a component of the device 1105, or between different components of the device 1105 that may be co-located or located in different locations (e.g., where the device 1105 may refer to a system in which one or more of the communications manager 1120, the transceiver 1110, the at least one memory 1125, the code 1130, and the at least one processor 1135 may be located in one of the different components or divided between different components).

[0203] In some examples, the communications manager 1120 may manage aspects of communications with a core network 130 (e.g., via one or more wired or wireless backhaul links). For example, the communications manager 1120 may manage the transfer of data communications for client devices, such as one or more UEs 115. In some examples, the communications manager 1120 may manage communications with other network entities 105, and may include a controller or scheduler for controlling communications with UEs 115 in cooperation with other network entities 105. In some examples, the communications manager 1120 may support an X2 interface within an LTE/LTE-A wireless communications network technology to provide communication between network entities 105.

[0204] The communications manager 1120 may support wireless communication in accordance with examples as disclosed herein. For example, the communications manager 1120 is capable of, configured to, or operable to support a means for receiving an assignment of a set of resources associated with a channel including a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal. The communications manager 1120 is capable of, configured to, or operable to support a means for generating, from the reference signal received over the second subset of resources in accordance with an MMSE operation, a first set of multiple channel estimations associated with respective layers of a set of multiple layers of the channel for the set of resources. The communications manager 1120 is capable of, configured to, or operable to support a means for generating, in accordance with a nonlinear two-dimensional interpolation of the channel for the set of resources, a second set of multiple channel estimations and a set of multiple values of a latent variable, the second set of multiple channel estimations and the set of multiple values associated with respective layers of the set of multiple layers of the channel for the set of resources, where the nonlinear two-dimensional interpolation of the channel is based on the first set of multiple channel estimations. The communications manager 1120 is capable of, configured to, or operable to support a means for performing a refinement operation on the second set of multiple channel estimations including one or more iterations, where each iteration of the one or more iterations is performed in accordance with a same set of machine learning parameters. 
In some examples, to perform each iteration of the one or more iterations, the communications manager 1120 may be configured as or otherwise support a means for generating respective gradients associated with the second set of multiple channel estimations based on the second set of multiple channel estimations for the second subset of resources and measured observations of the second subset of resources, generating, based on a first set of values of the set of multiple values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients, a second set of values of the latent variable, and modifying the second set of multiple channel estimations associated with the set of multiple layers based on the second set of values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients.

[0205] By including or configuring the communications manager 1120 in accordance with examples as described herein, the device 1105 may support techniques for improved communication reliability, reduced latency, improved user experience related to reduced processing, reduced power consumption, more efficient utilization of communication resources, more accurate channel estimations, and decreased memory and computational overhead.

[0206] In some examples, the communications manager 1120 may be configured to perform various operations (e.g., receiving, obtaining, monitoring, outputting, transmitting) using or otherwise in cooperation with the transceiver 1110, the one or more antennas 1115 (e.g., where applicable), or any combination thereof. Although the communications manager 1120 is illustrated as a separate component, in some examples, one or more functions described with reference to the communications manager 1120 may be supported by or performed by the transceiver 1110, one or more of the at least one processor 1135, one or more of the at least one memory 1125, the code 1130, or any combination thereof (for example, by a processing system including at least a portion of the at least one processor 1135, the at least one memory 1125, the code 1130, or any combination thereof). For example, the code 1130 may include instructions executable by one or more of the at least one processor 1135 to cause the device 1105 to perform various aspects of recurrent equivariant inference machines for channel estimation as described herein, or the at least one processor 1135 and the at least one memory 1125 may be otherwise configured to, individually or collectively, perform or support such operations.

[0207] FIG. 12 shows a flowchart illustrating a method 1200 that supports recurrent equivariant inference machines for channel estimation in accordance with aspects of the present disclosure. The operations of the method 1200 may be implemented by a UE or a network entity or its components as described herein. For example, the operations of the method 1200 may be performed by a UE 115 or a network entity as described with reference to FIGs. 1 through 11. In some examples, a UE or a network entity may execute a set of instructions to control the functional elements of the UE or the network entity to perform the described functions. Additionally, or alternatively, the UE or the network entity may perform aspects of the described functions using special-purpose hardware.

[0208] At 1205, the method may include receiving an assignment of a set of resources associated with a channel including a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal. The operations of block 1205 may be performed in accordance with examples as disclosed herein. In some examples, aspects of the operations of 1205 may be performed by a scheduling component 925 as described with reference to FIG. 9.

[0209] At 1210, the method may include generating, from the reference signal received over the second subset of resources in accordance with an MMSE operation, a first set of multiple channel estimations associated with respective layers of a set of multiple layers of the channel for the set of resources. The operations of block 1210 may be performed in accordance with examples as disclosed herein. In some examples, aspects of the operations of 1210 may be performed by an MMSE component 930 as described with reference to FIG. 9.
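For illustration, a per-layer MMSE estimate at the reference-signal positions might take the following linear MMSE (LMMSE) form. The correlation matrix `R` and the noise variance are assumed known here, which is a simplification for this sketch; the disclosure does not specify this particular filter.

```python
import numpy as np

def mmse_pilot_estimate(y_pilot, x_pilot, R, noise_var):
    """Illustrative LMMSE channel estimate at reference-signal positions
    for one layer of the channel.

    y_pilot   : (P,) received samples on the P reference-signal resources
    x_pilot   : (P,) known transmitted reference-signal symbols
    R         : (P, P) channel correlation matrix across pilot positions
    noise_var : noise variance (assumed known for this sketch)
    """
    # Least-squares estimate per pilot: divide out the known symbol.
    h_ls = y_pilot / x_pilot
    # LMMSE filter: R (R + sigma^2 I)^-1 applied to the LS estimate.
    W = R @ np.linalg.inv(R + noise_var * np.eye(len(y_pilot)))
    return W @ h_ls
```

Repeating this per layer yields the first set of multiple channel estimations associated with respective layers, as described above.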

[0210] At 1215, the method may include generating, in accordance with a nonlinear two-dimensional interpolation of the channel for the set of resources, a second set of multiple channel estimations and a set of multiple values of a latent variable, the second set of multiple channel estimations and the set of multiple values associated with respective layers of the set of multiple layers of the channel for the set of resources, where the nonlinear two-dimensional interpolation of the channel is based on the first set of multiple channel estimations. The operations of block 1215 may be performed in accordance with examples as disclosed herein. In some examples, aspects of the operations of 1215 may be performed by a coarse network component 935 as described with reference to FIG. 9.
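As an illustrative sketch of a nonlinear two-dimensional interpolation that produces both channel estimates and initial latent values over the full resource grid, the following interpolates separably over time and frequency and then applies a pointwise nonlinearity. The parameter pair `theta` and the `tanh`-based nonlinearity are assumptions for illustration; the disclosed coarse network may differ.

```python
import numpy as np

def coarse_interpolation(h_pilot, pilot_t, pilot_f, T, F, theta=(1.0, 0.1)):
    """Illustrative nonlinear 2-D interpolation from pilot estimates to the
    full grid, returning per-resource estimates and initial latent values.

    h_pilot : (Pt, Pf) complex per-layer estimates at pilot positions
    pilot_t : (Pt,) pilot symbol (time) indices, increasing
    pilot_f : (Pf,) pilot subcarrier (frequency) indices, increasing
    T, F    : full grid size in time and frequency
    theta   : illustrative parameters of the pointwise nonlinearity
    """
    # Separable linear interpolation: frequency axis first, then time
    # (real and imaginary parts interpolated independently).
    freq_full = np.empty((len(pilot_t), F), dtype=complex)
    for i in range(len(pilot_t)):
        freq_full[i] = (np.interp(np.arange(F), pilot_f, h_pilot[i].real)
                        + 1j * np.interp(np.arange(F), pilot_f, h_pilot[i].imag))
    grid = np.empty((T, F), dtype=complex)
    for j in range(F):
        grid[:, j] = (np.interp(np.arange(T), pilot_t, freq_full[:, j].real)
                      + 1j * np.interp(np.arange(T), pilot_t, freq_full[:, j].imag))
    # Pointwise magnitude nonlinearity with illustrative parameters (a, b),
    # making the overall interpolation nonlinear in its inputs.
    a, b = theta
    mag = np.abs(grid)
    scale = np.where(mag > 0, np.tanh(a * mag + b) / np.maximum(mag, 1e-12), 1.0)
    estimates = grid * scale
    # Initialize latent values from the interpolated estimates.
    latent = np.tanh(np.abs(estimates))
    return estimates, latent
```

Applying this per layer yields the second set of multiple channel estimations and the set of multiple values of the latent variable described in block 1215.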

[0211] At 1220, the method may include performing a refinement operation on the second set of multiple channel estimations including one or more iterations, where each iteration of the one or more iterations is performed in accordance with a same set of machine learning parameters. In some examples, each iteration of the one or more iterations may include generating respective gradients associated with the second set of multiple channel estimations based on the second set of multiple channel estimations for the second subset of resources and measured observations of the second subset of resources, generating, based on a first set of values of the set of multiple values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients, a second set of values of the latent variable, and modifying the second set of multiple channel estimations associated with the set of multiple layers based on the second set of values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients. The operations of block 1220 may be performed in accordance with examples as disclosed herein. In some examples, aspects of the operations of 1220 may be performed by a refinement network component 940 as described with reference to FIG. 9.

[0212] FIG. 13 shows a flowchart illustrating a method 1300 that supports recurrent equivariant inference machines for channel estimation in accordance with aspects of the present disclosure. The operations of the method 1300 may be implemented by a UE or a network entity or its components as described herein. For example, the operations of the method 1300 may be performed by a UE 115 or a network entity as described with reference to FIGs. 1 through 11. In some examples, a UE or a network entity may execute a set of instructions to control the functional elements of the UE or the network entity to perform the described functions. Additionally, or alternatively, the UE or the network entity may perform aspects of the described functions using special-purpose hardware.

[0213] At 1305, the method may include receiving an assignment of a set of resources associated with a channel including a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal. The operations of block 1305 may be performed in accordance with examples as disclosed herein. In some examples, aspects of the operations of 1305 may be performed by a scheduling component 925 as described with reference to FIG. 9.

[0214] At 1310, the method may include generating, from the reference signal received over the second subset of resources in accordance with an MMSE operation, a first set of multiple channel estimations associated with respective layers of a set of multiple layers of the channel for the set of resources. The operations of block 1310 may be performed in accordance with examples as disclosed herein. In some examples, aspects of the operations of 1310 may be performed by an MMSE component 930 as described with reference to FIG. 9.

[0215] At 1315, the method may include generating, in accordance with a nonlinear two-dimensional interpolation of the channel for the set of resources, a second set of multiple channel estimations and a set of multiple values of a latent variable, the second set of multiple channel estimations and the set of multiple values associated with respective layers of the set of multiple layers of the channel for the set of resources, where the nonlinear two-dimensional interpolation of the channel is based on the first set of multiple channel estimations. The operations of block 1315 may be performed in accordance with examples as disclosed herein. In some examples, aspects of the operations of 1315 may be performed by a coarse network component 935 as described with reference to FIG. 9.

[0216] At 1320, the method may include performing a refinement operation on the second set of multiple channel estimations including one or more iterations, where each iteration of the one or more iterations is performed in accordance with a same set of machine learning parameters. In some examples, each iteration of the one or more iterations may include generating respective gradients associated with the second set of multiple channel estimations based on the second set of multiple channel estimations for the second subset of resources and measured observations of the second subset of resources, generating, based on a first set of values of the set of multiple values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients, a second set of values of the latent variable, and modifying the second set of multiple channel estimations associated with the set of multiple layers based on the second set of values of the latent variable, the second set of multiple channel estimations, the set of machine learning parameters, and the respective gradients. 
The operations of block 1320 may be performed in accordance with examples as disclosed herein. In some examples, aspects of the operations of 1320 may be performed by a refinement network component 940 as described with reference to FIG. 9.

[0217] At 1325, the method may include performing a second refinement operation on the second set of multiple channel estimations, the second refinement operation including one or more second iterations performed in accordance with a same second set of machine learning parameters, where the first refinement operation and the second refinement operation are associated with a respective attention calculation of a set of multiple attention calculations. The operations of block 1325 may be performed in accordance with examples as disclosed herein. In some examples, aspects of the operations of 1325 may be performed by a refinement network component 940 as described with reference to FIG. 9.

[0218] FIG. 14 shows a flowchart illustrating a method 1400 that supports recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure. The operations of the method 1400 may be implemented by a UE or its components as described herein. For example, the operations of the method 1400 may be performed by a UE 115 as described with reference to FIGs. 1 through 13. In some examples, a UE may execute a set of instructions to control the functional elements of the UE to perform the described functions. Additionally, or alternatively, the UE may perform aspects of the described functions using special-purpose hardware.

[0219] At 1405, the method may include receiving an assignment of a set of resources associated with a channel including a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal. Receiving the assignment may include identifying time-frequency resources over which the assignment is transmitted, demodulating a transmission received over those time-frequency resources, and decoding the demodulated transmission to obtain bits that indicate the assignment. The assignment may be received via DCI in a downlink control channel. The operations of 1405 may be performed in accordance with examples as disclosed herein. In some examples, aspects of the operations of 1405 may be performed by a scheduling component 925 as described with reference to FIG. 9.
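
As a concrete illustration of the resulting resource split, the sketch below builds boolean masks for the data subset and the reference-signal subset of an assigned grid, assuming a simple comb-type reference-signal pattern; the grid dimensions and pattern parameters are hypothetical and not taken from the disclosure.

```python
import numpy as np

def split_resources(n_symbols, n_subcarriers, rs_symbols, rs_sc_step):
    """Split an assigned resource grid into data and reference-signal subsets.

    Assumes a comb-type pattern: in each reference-signal symbol, every
    `rs_sc_step`-th subcarrier carries the reference signal; all other
    resources in the assignment carry data.
    """
    rs_mask = np.zeros((n_symbols, n_subcarriers), dtype=bool)
    for sym in rs_symbols:
        rs_mask[sym, ::rs_sc_step] = True
    data_mask = ~rs_mask
    return data_mask, rs_mask
```

For example, a 14-symbol, 12-subcarrier grid with reference-signal symbols 2 and 11 and a comb spacing of 2 yields two masks that partition the grid with no overlap.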

[0220] At 1410, the method may include generating a set of multiple channel estimations associated with respective layers of a set of multiple layers of the channel for the set of resources. Generating the set of multiple channel estimations may include performing various interpolation techniques, as described herein with reference to FIG. 3, to calculate an estimation per PRG per antenna pair (e.g., a SISO channel estimation). The operations of 1410 may be performed in accordance with examples as disclosed herein. In some examples, aspects of the operations of 1410 may be performed by a coarse network component 935 as described with reference to FIG. 9.
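
A minimal per-antenna-pair (SISO) coarse estimate can be sketched as least-squares estimation at the pilot tones followed by interpolation across frequency. The sketch below uses linear frequency-only interpolation as a simple stand-in for the interpolation techniques referenced above; the function name and array shapes are assumptions.

```python
import numpy as np

def coarse_siso_estimate(y_rs, x_rs, rs_idx, n_subcarriers):
    """Coarse channel estimate for one antenna pair.

    y_rs   : received samples at the reference-signal subcarriers
    x_rs   : known transmitted reference-signal symbols
    rs_idx : subcarrier indices carrying the reference signal (increasing)
    """
    h_ls = y_rs / x_rs  # per-pilot least-squares estimates
    tones = np.arange(n_subcarriers)
    # Interpolate real and imaginary parts separately across frequency.
    h_re = np.interp(tones, rs_idx, h_ls.real)
    h_im = np.interp(tones, rs_idx, h_ls.imag)
    return h_re + 1j * h_im
```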

[0221] At 1415, the method may include performing a refinement operation on the set of multiple channel estimations including one or more iterations. Performing the refinement operation may include calculating (e.g., updating, generating), via a refinement network as described herein with reference to FIGs. 4-6, various channel estimations (e.g., MIMO channel estimation) and refining the estimations over various iterations of a machine learning operation. In some examples, each iteration of the one or more iterations may include generating respective gradients associated with the set of multiple channel estimations based on the set of multiple channel estimations for the second subset of resources and measured observations of the second subset of resources, generating, based on a first set of values of a latent variable, the set of multiple channel estimations, and the respective gradients, a second set of values of the latent variable, and modifying the set of multiple channel estimations associated with the set of multiple layers based on the second set of values of the latent variable, the set of multiple channel estimations, and the respective gradients. The operations of 1415 may be performed in accordance with examples as disclosed herein. In some examples, aspects of the operations of 1415 may be performed by a refinement network component 940 as described with reference to FIG. 9.

[0222] FIG. 15 shows a flowchart illustrating a method 1500 that supports recurrent equivariant inference machines for channel estimation in accordance with one or more aspects of the present disclosure. The operations of the method 1500 may be implemented by a UE or its components as described herein. For example, the operations of the method 1500 may be performed by a UE 115 as described with reference to FIGs. 1 through 14. In some examples, a UE may execute a set of instructions to control the functional elements of the UE to perform the described functions. Additionally, or alternatively, the UE may perform aspects of the described functions using special-purpose hardware.

[0223] At 1505, the method may include receiving an assignment of a set of resources associated with a channel including a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal. Receiving the assignment may include identifying time-frequency resources over which the assignment is transmitted, demodulating a transmission received over those time-frequency resources, and decoding the demodulated transmission to obtain bits that indicate the assignment. The assignment may be received via DCI in a downlink control channel. The operations of 1505 may be performed in accordance with examples as disclosed herein. In some examples, aspects of the operations of 1505 may be performed by a scheduling component 925 as described with reference to FIG. 9.

[0224] At 1510, the method may include generating a set of multiple channel estimations associated with respective layers of a set of multiple layers of the channel for the set of resources. Generating the set of multiple channel estimations may include performing various interpolation techniques, as described herein with reference to FIG. 3, to calculate an estimation per PRG per antenna pair (e.g., a SISO channel estimation). The operations of 1510 may be performed in accordance with examples as disclosed herein. In some examples, aspects of the operations of 1510 may be performed by a coarse network component 935 as described with reference to FIG. 9.

[0225] At 1515, the method may include performing a refinement operation on the set of multiple channel estimations including one or more iterations. Performing the refinement operation may include calculating (e.g., updating, generating), via a refinement network as described herein with reference to FIGs. 3-5, various channel estimations (e.g., MIMO channel estimation) and refining the estimations over various iterations of a machine learning operation. In some examples, each iteration of the one or more iterations may include generating respective gradients associated with the set of multiple channel estimations based on the set of multiple channel estimations for the second subset of resources and measured observations of the second subset of resources; generating a second set of values of a latent variable (e.g., z_T) based at least in part on a first set of values of the latent variable, the set of multiple channel estimations, the respective gradients, and modeling correlation between resources of each group of a set of multiple groups of resources of the set of resources and other groups of the set of multiple groups of resources, where each group of the set of multiple groups of resources includes a set of multiple resource blocks; and modifying the set of multiple channel estimations associated with the set of multiple layers based on the second set of values of the latent variable, the set of multiple channel estimations, and the respective gradients. The operations of 1515 may be performed in accordance with examples as disclosed herein. In some examples, aspects of the operations of 1515 may be performed by a likelihood module (e.g., module 425), an encoder module (e.g., module 430), or a decoder module (e.g., module 435). In some examples, aspects of the operations of 1515 may be performed by a refinement network component 940 as described with reference to FIG. 9.
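
The intra-group and inter-group correlation modeling described above can be illustrated with plain scaled dot-product attention applied first within each group of resource blocks and then across per-group summaries. This is a hedged NumPy sketch: the single attention head, the mean-pooled group summary, and the additive recombination are assumptions for illustration, not the disclosed refinement-network architecture.

```python
import numpy as np

def attention(q, k, v):
    """Scaled dot-product attention (single head, no learned projections)."""
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(q.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    w = np.exp(scores)
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v

def grouped_attention(x, group_size):
    """x: (n_blocks, dim) features, one row per resource block.

    Intra-group: attention among the blocks of each group.
    Inter-group: attention among per-group mean summaries, broadcast back
    to every block of the corresponding group.
    """
    n, d = x.shape
    groups = x.reshape(n // group_size, group_size, d)
    intra = attention(groups, groups, groups)           # within each group
    summaries = intra.mean(axis=1)                      # one vector per group
    inter = attention(summaries, summaries, summaries)  # across groups
    out = intra + inter[:, None, :]                     # add back per group
    return out.reshape(n, d)
```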

[0226] The following provides an overview of aspects of the present disclosure:

[0227] Aspect 1: An apparatus for wireless communication at a wireless communication device, comprising: one or more memories storing processor-executable code; and one or more processors coupled with the one or more memories and configured to execute the code to cause the wireless communication device to: receive an assignment of a set of resources associated with a channel comprising a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal; generate, from the reference signal received over the second subset of resources in accordance with an MMSE operation, a first plurality of channel estimations associated with respective layers of a plurality of layers of the channel for the set of resources; generate, in accordance with a nonlinear two-dimensional interpolation of the channel for the set of resources, a second plurality of channel estimations and a plurality of values of a latent variable, the second plurality of channel estimations and the plurality of values associated with respective layers of the plurality of layers of the channel for the set of resources, wherein the nonlinear two-dimensional interpolation of the channel is based at least in part on the first plurality of channel estimations; and perform a refinement operation on the second plurality of channel estimations comprising one or more iterations, wherein each iteration of the one or more iterations is performed in accordance with a same set of machine learning parameters, and wherein, to perform each iteration of the one or more iterations, the one or more processors are configured to cause the wireless communication device to: generate respective gradients associated with the second plurality of channel estimations based at least in part on the second plurality of channel estimations for the second subset of resources and measured observations of the second subset of resources; generate, based at least in part on a first set of values of the plurality of values of
the latent variable, the second plurality of channel estimations, the set of machine learning parameters, and the respective gradients, a second set of values of the latent variable; and modify the second plurality of channel estimations associated with the plurality of layers based at least in part on the second set of values of the latent variable, the second plurality of channel estimations, the set of machine learning parameters, and the respective gradients.

[0228] Aspect 2: The apparatus of aspect 1, wherein the refinement operation is a first refinement operation and the set of machine learning parameters is a first set of machine learning parameters, and wherein the one or more processors are configured to cause the wireless communication device to: perform a second refinement operation on the second plurality of channel estimations, the second refinement operation comprising one or more second iterations performed in accordance with a same second set of machine learning parameters, wherein the first refinement operation and the second refinement operation are associated with a respective attention calculation of a plurality of attention calculations.

[0229] Aspect 3: The apparatus of aspect 2, wherein the plurality of attention calculations comprises an intra-PRB group calculation, an inter-PRB group calculation, a cross-MIMO calculation, an MLP calculation, or any combination thereof.

[0230] Aspect 4: The apparatus of any of aspects 1 through 3, wherein the one or more processors are configured to cause the wireless communication device to: perform the MMSE operation based on a resource configuration pattern of the second subset of resources allocated for the reference signal, the reference signal comprising a DMRS.

[0231] Aspect 5: The apparatus of any of aspects 1 through 4, wherein, to generate the respective gradients, the one or more processors are configured to cause the wireless communication device to: generate respective sets of values of a residual variable based at least in part on a difference between the measured observations of the second subset of resources and the second plurality of channel estimations for the second subset of resources; and combine the respective sets of values of the residual variable, the second subset of resources, and a quantity of mask bits.

[0232] Aspect 6: The apparatus of any of aspects 1 through 5, wherein, to generate the second set of values of the latent variable, the one or more processors are configured to cause the wireless communication device to: combine the second plurality of channel estimations for the second subset of resources, the respective gradients, and respective values of the first set of values of the latent variable based at least in part on generating the respective gradients.

[0233] Aspect 7: The apparatus of any of aspects 1 through 6, wherein, to generate the second set of values of the latent variable, the one or more processors are configured to cause the wireless communication device to: model a correlation between resources of each resource block of a group of resource blocks and other resource blocks of the group of resource blocks.

[0234] Aspect 8: The apparatus of any of aspects 1 through 6, wherein, to generate the second set of values of the latent variable, the one or more processors are configured to cause the wireless communication device to: model a correlation between resources of each group of a plurality of groups of resources of the set of resources and other groups of the plurality of groups of resources, wherein each group of the plurality of groups of resources comprises a plurality of resource blocks.

[0235] Aspect 9: The apparatus of any of aspects 1 through 6, wherein, to generate the second set of values of the latent variable, the one or more processors are configured to cause the wireless communication device to: model a correlation between each layer of the plurality of layers for the set of resources.

[0236] Aspect 10: The apparatus of any of aspects 1 through 9, wherein, to modify the second plurality of channel estimations, the one or more processors are configured to cause the wireless communication device to: combine the second set of values of the latent variable, the second plurality of channel estimations, and the respective gradients based at least in part on the set of machine learning parameters.

[0237] Aspect 11: The apparatus of any of aspects 1 through 10, wherein the nonlinear two-dimensional interpolation of the channel is based at least in part on a machine learning model.

[0238] Aspect 12: The apparatus of any of aspects 1 through 11, wherein the first plurality of channel estimations and the second plurality of channel estimations are associated with a plurality of SISO antenna pairs.

[0239] Aspect 13: The apparatus of any of aspects 1 through 12, wherein each iteration of the one or more iterations is performed by a refinement network comprising a likelihood module, an encoder module, and a decoder module, the refinement network comprising a machine learning model; and each refinement network executes according to the same set of machine learning parameters.

[0240] Aspect 14: A method for wireless communication at a wireless communication device, comprising: receiving an assignment of a set of resources associated with a channel comprising a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal; generating, from the reference signal received over the second subset of resources in accordance with an MMSE operation, a first plurality of channel estimations associated with respective layers of a plurality of layers of the channel for the set of resources; generating, in accordance with a nonlinear two-dimensional interpolation of the channel for the set of resources, a second plurality of channel estimations and a plurality of values of a latent variable, the second plurality of channel estimations and the plurality of values associated with respective layers of the plurality of layers of the channel for the set of resources, wherein the nonlinear two-dimensional interpolation of the channel is based at least in part on the first plurality of channel estimations; and performing a refinement operation on the second plurality of channel estimations comprising one or more iterations, wherein each iteration of the one or more iterations is performed in accordance with a same set of machine learning parameters, and wherein each iteration of the one or more iterations comprises: generating respective gradients associated with the second plurality of channel estimations based at least in part on the second plurality of channel estimations for the second subset of resources and measured observations of the second subset of resources; generating, based at least in part on a first set of values of the plurality of values of the latent variable, the second plurality of channel estimations, the set of machine learning parameters, and the respective gradients, a second set of values of the latent variable; and modifying the second plurality of channel estimations 
associated with the plurality of layers based at least in part on the second set of values of the latent variable, the second plurality of channel estimations, the set of machine learning parameters, and the respective gradients.

[0241] Aspect 15: The method of aspect 14, wherein the refinement operation is a first refinement operation and the set of machine learning parameters is a first set of machine learning parameters, the method further comprising: performing a second refinement operation on the second plurality of channel estimations, the second refinement operation comprising one or more second iterations performed in accordance with a same second set of machine learning parameters, wherein the first refinement operation and the second refinement operation are associated with a respective attention calculation of a plurality of attention calculations.

[0242] Aspect 16: The method of aspect 15, wherein the plurality of attention calculations comprises an intra-PRB group calculation, an inter-PRB group calculation, a cross-MIMO calculation, an MLP calculation, or any combination thereof.

[0243] Aspect 17: The method of any of aspects 14 through 16, further comprising: performing the MMSE operation based on a resource configuration pattern of the second subset of resources allocated for the reference signal, the reference signal comprising a DMRS.

[0244] Aspect 18: The method of any of aspects 14 through 17, wherein generating the respective gradients comprises: generating respective sets of values of a residual variable based at least in part on a difference between the measured observations of the second subset of resources and the second plurality of channel estimations for the second subset of resources; and combining the respective sets of values of the residual variable, the second subset of resources, and a quantity of mask bits.

[0245] Aspect 19: The method of any of aspects 14 through 18, wherein generating the second set of values of the latent variable comprises: combining the second plurality of channel estimations for the second subset of resources, the respective gradients, and respective values of the first set of values of the latent variable based at least in part on generating the respective gradients.

[0246] Aspect 20: The method of any of aspects 14 through 19, wherein generating the second set of values of the latent variable comprises: modeling a correlation between resources of each resource block of a group of resource blocks and other resource blocks of the group of resource blocks.

[0247] Aspect 21: The method of any of aspects 14 through 19, wherein generating the second set of values of the latent variable comprises: modeling a correlation between resources of each group of a plurality of groups of resources of the set of resources and other groups of the plurality of groups of resources, wherein each group of the plurality of groups of resources comprises a plurality of resource blocks.

[0248] Aspect 22: The method of any of aspects 14 through 19, wherein generating the second set of values of the latent variable comprises: modeling a correlation between each layer of the plurality of layers for the set of resources.

[0249] Aspect 23: The method of any of aspects 14 through 22, wherein modifying the second plurality of channel estimations comprises: combining the second set of values of the latent variable, the second plurality of channel estimations, and the respective gradients based at least in part on the set of machine learning parameters.

[0250] Aspect 24: The method of any of aspects 14 through 23, wherein the nonlinear two-dimensional interpolation of the channel is based at least in part on a machine learning model.

[0251] Aspect 25: The method of any of aspects 14 through 24, wherein the first plurality of channel estimations and the second plurality of channel estimations are associated with a plurality of SISO antenna pairs.

[0252] Aspect 26: The method of any of aspects 14 through 25, wherein each iteration of the one or more iterations is performed by a refinement network comprising a likelihood module, an encoder module, and a decoder module, the refinement network comprising a machine learning model; and each refinement network executes according to the same set of machine learning parameters.

[0253] Aspect 27: An apparatus for wireless communication at a wireless communication device, comprising at least one means for performing a method of any of aspects 14 through 26.

[0254] Aspect 28: A non-transitory computer-readable medium storing code for wireless communication at a wireless communication device, the code comprising instructions executable by one or more processors to cause the wireless communication device to perform a method of any of aspects 14 through 26.

[0255] Aspect 29: A method for wireless communication, comprising: receiving an assignment of a set of resources associated with a channel comprising a first subset of resources allocated for a data signal and a second subset of resources allocated for a reference signal; generating a plurality of channel estimations associated with respective layers of a plurality of layers of the channel for the set of resources; and performing a refinement operation on the plurality of channel estimations comprising one or more iterations, wherein each iteration of the one or more iterations comprises: generating respective gradients associated with the plurality of channel estimations based at least in part on the plurality of channel estimations for the second subset of resources and measured observations of the second subset of resources; generating, based at least in part on a first set of values of a latent variable, the plurality of channel estimations, and the respective gradients, a second set of values of the latent variable; and modifying the plurality of channel estimations associated with the plurality of layers based at least in part on the second set of values of the latent variable, the plurality of channel estimations, and the respective gradients.

[0256] Aspect 30: The method of aspect 29, wherein generating the respective gradients comprises: generating respective sets of values of a residual variable based at least in part on a difference between the measured observations of the second subset of resources and the plurality of channel estimations for the second subset of resources; and combining the respective sets of values of the residual variable, the measured observations of the second subset of resources, and a quantity of mask bits.

[0257] Aspect 31: The method of any of aspects 29 through 30, wherein generating the second set of values of the latent variable comprises: combining the plurality of channel estimations for the second subset of resources, the respective gradients, and respective values of the first set of values of the latent variable based at least in part on generating the respective gradients.

[0258] Aspect 32: The method of any of aspects 29 through 31, wherein generating the second set of values of the latent variable comprises: modeling correlation between resources of each resource block of a group of resource blocks and other resource blocks of the group of resource blocks.

[0259] Aspect 33: The method of any of aspects 29 through 32, wherein generating the second set of values of the latent variable comprises: modeling correlation between resources of each group of a plurality of groups of resources of the set of resources and other groups of the plurality of groups of resources, wherein each group of the plurality of groups of resources comprises a plurality of resource blocks.

[0260] Aspect 34: The method of any of aspects 29 through 33, wherein generating the second set of values of the latent variable comprises: modeling correlation between each layer of the plurality of layers for the set of resources.

[0261] Aspect 35: The method of any of aspects 29 through 34, wherein modifying the plurality of channel estimations comprises: combining the second set of values of the latent variable, the plurality of channel estimations, and the respective gradients.

[0262] Aspect 36: The method of any of aspects 29 through 35, wherein initial values of the plurality of channel estimations are associated with SISO antenna pairs.

[0263] Aspect 37: The method of any of aspects 29 through 36, wherein the second subset of resources are configured according to a resource configuration pattern of a set of resource configuration patterns.

[0264] Aspect 38: The method of aspect 37, wherein the set of resource configuration patterns is a set of demodulation reference signal patterns.

[0265] Aspect 39: The method of any of aspects 29 through 38, wherein each iteration is performed by a refinement network comprising a likelihood module, an encoder module, and a decoder module, and each refinement network further comprises a respective parameter associated with a machine learning operation.

[0266] Aspect 40: The method of any of aspects 29 through 39, wherein the set of resources comprises one or more groups of resources, and each respective layer of the plurality of layers is associated with a respective antenna pair of a plurality of SISO antenna pairs.

[0267] Aspect 41: An apparatus for wireless communication, comprising one or more memories storing processor-executable code, and one or more processors coupled with the one or more memories and individually or collectively operable to execute the code to cause the apparatus to perform a method of any of aspects 29 through 40.

[0268] Aspect 42: An apparatus for wireless communication, comprising at least one means for performing a method of any of aspects 29 through 40.

[0269] Aspect 43: A non-transitory computer-readable medium storing code for wireless communication, the code comprising instructions executable by a processor to perform a method of any of aspects 29 through 40.

[0270] It should be noted that the methods described herein describe possible implementations, and that the operations and the steps may be rearranged or otherwise modified and that other implementations are possible. Further, aspects from two or more of the methods may be combined.

[0271] Although aspects of an LTE, LTE-A, LTE-A Pro, or NR system may be described for purposes of example, and LTE, LTE-A, LTE-A Pro, or NR terminology may be used in much of the description, the techniques described herein are applicable beyond LTE, LTE-A, LTE-A Pro, or NR networks. For example, the described techniques may be applicable to various other wireless communications systems such as Ultra Mobile Broadband (UMB), Institute of Electrical and Electronics Engineers (IEEE) 802.11 (Wi-Fi), IEEE 802.16 (WiMAX), IEEE 802.20, Flash-OFDM, as well as other systems and radio technologies not explicitly mentioned herein.

[0272] Information and signals described herein may be represented using any of a variety of different technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.

[0273] The various illustrative blocks and components described in connection with the disclosure herein may be implemented or performed using at least one general-purpose processor (or processing circuitry), a DSP, an ASIC, a CPU, an FPGA or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor but, in the alternative, the processor may be any processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices (e.g., a combination of a DSP and a microprocessor, multiple microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration). Any functions or operations described herein as being capable of being performed by a processor may be performed by multiple processors that, individually or collectively, are capable of performing the described functions or operations.

[0274] The functions described herein may be implemented using hardware, software executed by a processor, processing circuitry, firmware, or any combination thereof. If implemented using software executed by a processor, the functions may be stored as or transmitted using one or more instructions or code of a computer-readable medium. Other examples and implementations are within the scope of the disclosure and appended claims. For example, due to the nature of software, functions described herein may be implemented using software executed by a processor, hardware, firmware, hardwiring, or combinations of any of these. Features implementing functions may also be physically located at various positions, including being distributed such that portions of functions are implemented at different physical locations.

[0275] Computer-readable media includes both non-transitory computer storage media and communication media including any medium that facilitates transfer of a computer program from one location to another. A non-transitory storage medium may be any available medium that may be accessed by a general-purpose or special-purpose computer. By way of example, and not limitation, non-transitory computer-readable media may include RAM, ROM, electrically erasable programmable ROM (EEPROM), flash memory, compact disk (CD) ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other non-transitory medium that may be used to carry or store desired program code means in the form of instructions or data structures and that may be accessed by a general-purpose or special-purpose computer, or a general-purpose or special-purpose processor. Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of computer-readable medium. Disk and disc, as used herein, include CD, laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc. Disks may reproduce data magnetically, and discs may reproduce data optically using lasers. Combinations of the above are also included within the scope of computer-readable media. Any functions or operations described herein as being capable of being performed by a memory may be performed by memory circuitry and/or multiple memories that, individually or collectively, are capable of performing the described functions or operations.

[0276] As used herein, including in the claims, “or” as used in a list of items (e.g., a list of items prefaced by a phrase such as “at least one of” or “one or more of”) indicates an inclusive list such that, for example, a list of at least one of A, B, or C means A or B or C or AB or AC or BC or ABC (i.e., A and B and C). Also, as used herein, the phrase “based on” shall not be construed as a reference to a closed set of conditions. For example, an example step that is described as “based on condition A” may be based on both a condition A and a condition B without departing from the scope of the present disclosure. In other words, as used herein, the phrase “based on” shall be construed in the same manner as the phrase “based at least in part on.”

[0277] As used herein, including in the claims, the article “a” before a noun is open-ended and understood to refer to “at least one” of those nouns or “one or more” of those nouns. Thus, the terms “a,” “at least one,” “one or more,” “at least one of one or more” may be interchangeable. For example, if a claim recites “a component” that performs one or more functions, each of the individual functions may be performed by a single component or by any combination of multiple components. Thus, the term “a component” having characteristics or performing functions may refer to “at least one of one or more components” having a particular characteristic or performing a particular function. Subsequent reference to a component introduced with the article “a” using the terms “the” or “said” may refer to any or all of the one or more components. For example, a component introduced with the article “a” may be understood to mean “one or more components,” and referring to “the component” subsequently in the claims may be understood to be equivalent to referring to “at least one of the one or more components.” Similarly, subsequent reference to a component introduced as “one or more components” using the terms “the” or “said” may refer to any or all of the one or more components. For example, referring to “the one or more components” subsequently in the claims may be understood to be equivalent to referring to “at least one of the one or more components.” As used herein, including in the claims, the terms “set” or “subset” may be understood to refer to one or more entries. For example, referring to “a set of objects” or “a subset of objects” may be understood to be equivalently referring to one object or multiple objects.

[0278] The term “determine” or “determining” encompasses a variety of actions and, therefore, “determining” can include calculating, computing, processing, deriving, investigating, looking up (such as via looking up in a table, a database or another data structure), ascertaining and the like. Also, “determining” can include receiving (e.g., receiving information), accessing (e.g., accessing data stored in memory) and the like. Also, “determining” can include resolving, obtaining, selecting, choosing, establishing, and other such similar actions.

[0279] In the appended figures, similar components or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label by a dash and a second label that distinguishes among the similar components. If just the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label, or other subsequent reference label.

[0280] The description set forth herein, in connection with the appended drawings, describes example configurations and does not represent all the examples that may be implemented or that are within the scope of the claims. The term “example” used herein means “serving as an example, instance, or illustration,” and not “preferred” or “advantageous over other examples.” The detailed description includes specific details for the purpose of providing an understanding of the described techniques. These techniques, however, may be practiced without these specific details. In some instances, known structures and devices are shown in block diagram form in order to avoid obscuring the concepts of the described examples.

[0281] The description herein is provided to enable a person having ordinary skill in the art to make or use the disclosure. Various modifications to the disclosure will be apparent to a person having ordinary skill in the art, and the generic principles defined herein may be applied to other variations without departing from the scope of the disclosure. Thus, the disclosure is not limited to the examples and designs described herein but is to be accorded the broadest scope consistent with the principles and novel features disclosed herein.