Academic Research & Papers

[1] Vaswani, A., et al. (2017). "Attention Is All You Need." Advances in Neural Information Processing Systems, 30.
[2] Brown, T. B., et al. (2020). "Language Models are Few-Shot Learners." Advances in Neural Information Processing Systems, 33, 1877–1901.
[3] Devlin, J., et al. (2019). "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding." Proceedings of NAACL-HLT 2019.
[4] Radford, A., et al. (2019). "Language Models are Unsupervised Multitask Learners." OpenAI Technical Report.

Books & Publications

[5] Russell, S., & Norvig, P. (2020). "Artificial Intelligence: A Modern Approach" (4th ed.). Pearson.
[6] Goodfellow, I., Bengio, Y., & Courville, A. (2016). "Deep Learning." MIT Press.
[7] Mitchell, M. (2019). "Artificial Intelligence: A Guide for Thinking Humans." Farrar, Straus and Giroux.

Technical Documentation & Reports

[8] OpenAI. (2023). "GPT-4 Technical Report." OpenAI Research.
[9] Anthropic. (2024). "Claude 3 Model Card." Anthropic Documentation.
[10] Google Research. (2023). "PaLM 2 Technical Report." Google AI.

📚 This is a preview showing 10 sample references. The complete book includes 150+ academic citations, research papers, technical reports, and authoritative sources used throughout all 15 chapters.
