Links for February 4th

Here are some links I found interesting this week.

[1711.00937] Neural Discrete Representation Learning

The Magic of Small Databases

OpenAI CEO Sam Altman | AI for the Next Era - YouTube

ElevenLabs || Prime Voice AI

CS25 I Stanford Seminar - Self Attention and Non-parametric transformers (NPTs) - YouTube

Vexu/bog: Small, strongly typed, embeddable language.

Patterns | Build next-gen AI systems

Replacing a SQL analyst with 26 recursive GPT prompts | Patterns

The Foundationalist Manifesto: The Politics of Future Past • The Worthy House

Large Transformer Model Inference Optimization | Lil'Log

Malum in se - Wikipedia

Malum prohibitum - Wikipedia

[2009.06732] Efficient Transformers: A Survey

Non-local means - Wikipedia

[2101.03961] Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity

Neural Machine Translation by Jointly Learning to Align and Translate

Efficiently Scaling Transformer Inference

News — Magrathea Metals

[0804.2996] The Epic Story of Maximum Likelihood

[1707.06347] Proximal Policy Optimization Algorithms

Conspiracy Theory as a Hegemonic Weapon of Psychological Warfare

[2302.00270v1] Internally Rewarded Reinforcement Learning

Exclusive Q&A: John Carmack's 'Different Path' to Artificial General Intelligence » Dallas Innovates

Note 10: Self-Attention & Transformers

Manifold | ChatGPT, LLMs, and AI — #29