How I Study AI - Learn AI Papers & Lectures the Easy Way

Papers (1)

#Positional Interpolation

Beyond Real: Imaginary Extension of Rotary Position Embeddings for Long-Context LLMs

Intermediate
Xiaoran Liu, Yuerong Song et al. · Dec 8 · arXiv

Large language models use RoPE to encode token order, but standard attention keeps only the real part of RoPE's complex-valued inner product, discarding the imaginary half.
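To make that concrete, here is a minimal PyTorch sketch of the mechanism the summary refers to: RoPE views channel pairs as complex numbers and rotates them by a position-dependent angle, and the usual attention score equals the real part of the complex inner product between a rotated query and key. This is an illustration of standard RoPE, not the paper's implementation; the helper name `rope_rotate`, the head dimension, and the positions are assumptions.

```python
import torch

def rope_rotate(x: torch.Tensor, pos: float, theta_base: float = 10000.0) -> torch.Tensor:
    """Standard RoPE: view channel pairs as complex numbers and rotate
    pair j by angle pos * theta_j, with geometrically decaying theta_j."""
    d = x.shape[-1]
    freqs = theta_base ** (-torch.arange(0, d, 2, dtype=torch.float32) / d)
    xc = torch.view_as_complex(x.float().reshape(*x.shape[:-1], d // 2, 2))
    rot = torch.polar(torch.ones_like(freqs), pos * freqs)  # unit-magnitude rotations
    return xc * rot

# Hypothetical head dimension 64; query at position m=5, key at position n=2.
q, k = torch.randn(64), torch.randn(64)
qm, kn = rope_rotate(q, pos=5.0), rope_rotate(k, pos=2.0)

score = (qm * kn.conj()).sum()  # complex inner product; depends only on m - n
real_part = score.real          # what standard RoPE attention keeps
imag_part = score.imag          # discarded by standard RoPE; the paper's
                                # "imaginary attention" puts this half to use
```

In practice the rotation is applied across a whole sequence of positions at once; this scalar version just isolates the real/imaginary split that the paper targets.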

#RoPE++ #Rotary Position Embeddings #Imaginary Attention