How I Study AI - Learn AI Papers & Lectures the Easy Way

Papers (943)


RealGen: Photorealistic Text-to-Image Generation via Detector-Guided Rewards

Intermediate
Junyan Ye, Leiqi Zhu et al. · Nov 29 · arXiv

RealGen is a new way to make computer-made pictures look so real that they can fool expert AI-image detectors and even careful human judges.

#photorealistic text-to-image · #detector-guided rewards · #reinforcement learning
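The core trick can be sketched in a few lines. Below is a hypothetical reward function, not RealGen's actual code: `detector` is a placeholder for any model that scores how AI-generated an image looks, and the simple `1 - P(fake)` reward shape is an assumption for illustration.

```python
# Hypothetical sketch, NOT RealGen's actual code: `detector` is a placeholder
# for any AI-image detector that returns P(image is fake) per image.
import torch

def detector_guided_reward(images: torch.Tensor, detector) -> torch.Tensor:
    """images: generated batch (B, C, H, W); returns a reward per image."""
    with torch.no_grad():
        p_fake = detector(images)   # higher = looks more AI-generated
    return 1.0 - p_fake             # assumed reward: favor "real-looking" images

# In RL fine-tuning, a reward like this would be combined with a
# text-alignment score so realism does not come at the cost of the prompt.
```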

Visual Generation Tuning

Intermediate
Jiahao Guo, Sinan Du et al. · Nov 28 · arXiv

Before this work, big vision-language models (VLMs) were great at understanding pictures and words together but not at making new pictures.

#Visual Generation Tuning · #VGT-AE · #Vision-Language Models

VQRAE: Representation Quantization Autoencoders for Multimodal Understanding, Generation and Reconstruction

Intermediate
Sinan Du, Jiahao Guo et al. · Nov 28 · arXiv

VQRAE is a new kind of image tokenizer that lets one model both understand images (continuous features) and generate/reconstruct them (discrete tokens).

#VQRAE · #Vector Quantization · #Representation Autoencoder
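Here is a minimal sketch of the vector-quantization step such a tokenizer builds on (generic VQ-VAE-style, not VQRAE's exact architecture): each continuous feature is snapped to its nearest codebook vector, giving a discrete token id for generation and a continuous code for understanding.

```python
# Generic vector-quantization sketch: a codebook maps continuous encoder
# features to discrete token ids and back to continuous vectors.
import torch

codebook = torch.randn(512, 64)  # 512 code vectors of dimension 64

def quantize(features: torch.Tensor):
    """features: (N, 64) continuous encoder outputs.
    Returns (token_ids, quantized_features)."""
    d = torch.cdist(features, codebook)   # distance to every code, (N, 512)
    token_ids = d.argmin(dim=1)           # discrete tokens for generation
    quantized = codebook[token_ids]       # continuous vectors for understanding
    return token_ids, quantized

feats = torch.randn(4, 64)
ids, q = quantize(feats)
print(ids.shape, q.shape)  # torch.Size([4]) torch.Size([4, 64])
```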

ThreadWeaver: Adaptive Threading for Efficient Parallel Reasoning in Language Models

Intermediate
Long Lian, Sida Wang et al. · Nov 24 · arXiv

ThreadWeaver teaches a language model to split big problems into smaller parts it can solve at the same time, like teammates working in parallel.

#adaptive parallel reasoning · #fork–join · #threaded inference
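A rough fork–join sketch of the idea in plain Python: `ask_llm` is a hypothetical stand-in for any model call, and the split into sub-questions is hard-coded here, whereas ThreadWeaver trains the model to decide when and how to fork.

```python
# Fork-join sketch of parallel reasoning with a generic LLM call (stubbed).
from concurrent.futures import ThreadPoolExecutor

def ask_llm(prompt: str) -> str:
    return f"<answer to: {prompt}>"  # stub; replace with a real model call

def solve(question: str) -> str:
    # Fork: split the problem into independent sub-questions.
    subqs = [f"{question} (part {i})" for i in (1, 2, 3)]
    with ThreadPoolExecutor() as pool:
        partials = list(pool.map(ask_llm, subqs))  # solved concurrently
    # Join: merge the partial answers into one final response.
    return ask_llm("Combine these partial answers: " + " | ".join(partials))

print(solve("Prove the identity holds for all n"))
```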

Recurrent Neural Networks (RNNs): A gentle Introduction and Overview

Beginner
Robin M. Schmidt · Nov 23 · arXiv

Recurrent Neural Networks (RNNs) are special neural networks that learn from sequences, like sentences or time series, by remembering what came before.

#Recurrent Neural Network · #Backpropagation Through Time · #Truncated BPTT
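The "remembering" part is easy to see in a minimal vanilla RNN forward pass: the hidden state computed at one step is fed back in at the next, so each output depends on everything seen so far. (Weight shapes here are arbitrary illustration choices.)

```python
# Minimal vanilla RNN in NumPy: the hidden state carries memory across steps.
import numpy as np

rng = np.random.default_rng(0)
W_xh = rng.normal(size=(3, 5)) * 0.1   # input (dim 3) -> hidden (dim 5)
W_hh = rng.normal(size=(5, 5)) * 0.1   # hidden -> hidden (the recurrence)
b_h  = np.zeros(5)

def rnn_forward(xs):
    """xs: sequence of input vectors, shape (T, 3). Returns all hidden states."""
    h = np.zeros(5)
    states = []
    for x in xs:                          # one step per sequence element
        h = np.tanh(x @ W_xh + h @ W_hh + b_h)
        states.append(h)
    return np.stack(states)               # (T, 5); BPTT backprops through this loop

print(rnn_forward(rng.normal(size=(4, 3))).shape)  # (4, 5)
```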

Attention Is All You Need

Intermediate
Ashish Vaswani, Noam Shazeer et al. · Jun 12 · arXiv

The paper introduces the Transformer, a model that understands and generates sequences (like sentences) using only attention, without RNNs or CNNs.

#Transformer · #Self-Attention · #Multi-Head Attention
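The heart of the model is scaled dot-product attention from the paper, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A minimal single-head version, leaving out masking and the multi-head projections:

```python
# Scaled dot-product attention: each token's output is a weighted sum of
# value vectors, with weights given by query-key similarity.
import numpy as np

def attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over keys
    return weights @ V                                # weighted sum of values

rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))  # 4 tokens, dim 8
print(attention(Q, K, V).shape)  # (4, 8)
```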

Enriching Word Vectors with Subword Information

Intermediate
Piotr Bojanowski, Edouard Grave et al. · Jul 15 · arXiv

This paper teaches computers to understand words by also looking at the smaller pieces inside words, like 'un-', 'play', and '-ing'.

#subword embeddings · #character n-grams · #skip-gram
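The subword idea is easy to see in code. Following the paper, each word is wrapped in boundary markers '<' and '>' and broken into character n-grams (n from 3 to 6 in the paper); the word's vector is then the sum of its n-gram vectors. A sketch of just the n-gram step:

```python
# fastText-style character n-gram extraction with boundary markers.
def char_ngrams(word: str, n_min: int = 3, n_max: int = 6):
    w = f"<{word}>"  # '<' and '>' mark word boundaries, as in the paper
    return [w[i:i + n] for n in range(n_min, n_max + 1)
                       for i in range(len(w) - n + 1)]

print(char_ngrams("playing", 3, 4))
# ['<pl', 'pla', 'lay', 'ayi', 'yin', 'ing', 'ng>', '<pla', 'play', ...]
```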