This document outlines a 6-hour course on representation learning for natural language processing (NLP). The course is divided into four modules, each pairing theoretical explanations with code implementations. Module 1 introduces text representation and common encoding methods. Module 2 focuses on word vectors, covering models such as CBOW, Skip-gram, and GloVe, along with techniques such as negative sampling and t-SNE visualization. Module 3 covers sentence and document representations with models such as PV-DM, PV-DBOW, and Skip-Thoughts. Finally, Module 4 introduces character-level representations using RNNs, LSTMs, and character embeddings. Throughout, the course aims to build both conceptual understanding and hands-on coding experience with these representation learning techniques.