In which we discuss interesting developments in ICLR 2023 reviews and how much pretrained Transformers rely on input-dependent attention.
Update #37: ICLR Reviews Get Spicy and How Much Attention Actually Attends