Latent Space
Deep technical AI engineering content. The go-to podcast for AI builders.
183 episodes curated
Episodes
RLHF 201 - with Nathan Lambert of AI2 and Interconnects
In 2023 we did a few Fundamentals episodes covering Benchmarks 101, Datasets 101, FlashAttention, and Transformers Math, and it turns out those were some of your evergreen favorites! So we are experimenting with more educational/survey content in the mix alongside our regular founder and event coverage. Please request more! We have a new calendar for events; join to be notified of upcoming things in 2024! Today we visit the shoggoth mask factory: how do transformer models go from trawling a deeply learned latent space for next-token prediction to a helpful, honest, harmless chat assistant?
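The standard pipeline behind that transformation is supervised finetuning, then a reward model trained on human preference pairs, then RL against that reward. As a rough illustration (our sketch, not from the episode): the reward model step typically minimizes a Bradley-Terry preference loss over scored completion pairs.

```python
import math

def preference_loss(r_chosen: float, r_rejected: float) -> float:
    """Bradley-Terry loss used to train RLHF reward models: push the
    reward score of the human-preferred completion above the rejected one.
    loss = -log(sigmoid(r_chosen - r_rejected))"""
    return -math.log(1.0 / (1.0 + math.exp(-(r_chosen - r_rejected))))

# When both completions score equally the loss is log(2) ≈ 0.693;
# it shrinks toward zero as the chosen completion pulls ahead.
```

The policy model is then optimized (e.g. with PPO) to produce completions the reward model scores highly, usually with a KL penalty to keep it close to the base model.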
The Accidental AI Canvas - with Steve Ruiz of tldraw
Happy 2024! We appreciated all the feedback on the listener survey (still open, link here)! Surprising to see that some people’s favorite episodes were others’ least, but we’ll always work on improving our audio quality and booking great guests. Help us out by leaving reviews on Twitter, YouTube, and Apple Podcasts! 🙏 Big thanks to Chris Anderson for the latest review - be like Chris! Note to the Audio-only Listener: because of the nature of today’s topic, it makes the most sense to follow along with the demo on video rather than audio. There’s also about 30 mins of demos and technical detail…
The AI-First Graphics Editor - with Suhail Doshi of Playground AI
We are running an end of year survey for our listeners! Please let us know any feedback you have, what episodes resonated with you, and guest requests for 2024! Survey link here! Listen to the end for a little surprise from Suhail. Before language models became all the rage in November 2022, image generation was the hottest space in AI (it was the subject of our first piece on Latent Space!) In our interview with Sharif Shameem from Lexica we talked through the launch of StableDiffusion and the early days of that space. At the time, the toolkit was still pretty rudimentary: Lexica made it ea…
The "Normsky" architecture for AI coding agents — with Beyang Liu + Steve Yegge of SourceGraph
We are running an end of year survey for our listeners. Let us know any feedback you have for us, what episodes resonated with you the most, and guest requests for 2024! RAG has emerged as one of the key pieces of the AI Engineer stack. Jerry from LlamaIndex called it a “hack”, Bryan from Hex compared it to “a recommendation system from LLMs”, and even LangChain started with it. RAG is crucial in any AI coding workflow. We talked about context quality for code in our Phind episode. Today’s guests, Beyang Liu and Steve Yegge from SourceGraph, have been focused on code indexing and retrieval…
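Stripped to its essentials, the “hack” is just retrieval stuffed into the prompt: fetch the most relevant chunks, paste them above the question. A minimal sketch — a toy keyword-overlap retriever stands in for a real embedding index, and all names here are illustrative:

```python
def retrieve(query: str, corpus: dict[str, str], k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the query.
    (A real system would use embeddings + a vector index.)"""
    q = set(query.lower().split())
    scored = sorted(
        corpus.items(),
        key=lambda kv: len(q & set(kv[1].lower().split())),
        reverse=True,
    )
    return [doc_id for doc_id, _ in scored[:k]]

def build_prompt(query: str, corpus: dict[str, str], k: int = 2) -> str:
    """Stuff the retrieved chunks into the prompt — the 'hack' in question."""
    context = "\n".join(corpus[d] for d in retrieve(query, corpus, k))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

Everything interesting in production RAG — chunking, embedding quality, reranking, code-aware indexing of the kind SourceGraph works on — lives inside that `retrieve` step.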
The Busy Person's Intro to Finetuning & Open Source AI - Wing Lian, Axolotl
The Latent Space crew will be at NeurIPS on Tuesday! Reach out with any parties and papers of interest. We have also been incubating a smol daily AI Newsletter and Latent Space University is making progress. Good open models like Llama 2 and Mistral 7B (which has just released an 8x7B MoE model) have enabled their own sub-industry of finetuned variants for a myriad of reasons: * Ownership & Control - you take responsibility for serving the models * Privacy - not having to send data to a third party vendor * Customization - improving some attribute (censorship, multiturn chat and chain of thought…
Notebooks = Chat++ and RAG = RecSys! — with Bryan Bischof of Hex Magic
Catch us at Modular’s ModCon next week with Chris Lattner, and join our community! 2024 note: Hex is now hiring AI Engineers. Due to Bryan’s very wide-ranging experience in data science and AI across Blue Bottle (!), StitchFix, Weights & Biases, and now Hex Magic, this episode can be considered a two-parter. Notebooks = Chat++ We’ve talked a lot about AI UX (in our meetups, writeups, and guest posts), and today we’re excited to dive into a new old player in AI interfaces: notebooks! Depending on your background, you either Don’t Like or you Like notebooks — they are the most popular exa…
AGI is Being Achieved Incrementally (OpenAI DevDay w/ Simon Willison, Alex Volkov, Jim Fan, Raza Habib, Shreya Rajpal, Rahul Ligma, et al)
SF folks: join us at the AI Engineer Foundation’s Emergency Hackathon tomorrow and consider the Newton if you’d like to cowork in the heart of the Cerebral Arena. Our community page is up to date as usual! ~800,000 developers watched OpenAI Dev Day, ~8,000 of whom listened along live on our ThursdAI x Latent Space, and ~800 of whom got tickets to attend in person: OpenAI’s first developer conference easily surpassed most people’s lowballed expectations - they simply did everything short of announcing GPT-5, including: * ChatGPT (the consumer facing product) * GPT4 Turbo already in ChatGPT…
Beating GPT-4 with Open Source LLMs — with Michael Royzen of Phind
At the AI Pioneers Summit we announced Latent Space Launchpad, an AI-focused accelerator in partnership with Decibel. If you’re an AI founder or enterprise early adopter, fill out this form and we’ll be in touch with more details. We also have a lot of events coming up as we wrap up the year, so make sure to check out our community events page and come say hi! We previously interviewed the founders of many developer productivity startups embedded in the IDE, like Codium AI, Cursor, and Codeium. We also covered Replit’s (former) SOTA model, replit-code-v1-3b, and most recently had Amjad an…
Powering your Copilot for Data – with Artem Keydunov of Cube.dev
The first workshops and talks from the AI Engineer Summit are now up! Join the >20k viewers on YouTube, find clips on Twitter (we’re also clipping @latentspacepod), and chat with us on Discord! Text-to-SQL was one of the first applications of NLP. Thoughtspot offered “Ask your data questions” as their core differentiation compared to traditional dashboarding tools. In a way, they provide a much friendlier interface to your own structured (aka “tabular”, as in “SQL tables”) data, the same way that RLHF and Instruction Tuning helped turn the GPT-3 of 2020 into the ChatGPT of 2022. Today, n…
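The common LLM-era baseline for this is schema-in-prompt: show the model the table definitions and ask it to emit a query. A minimal sketch — the prompt wording and table are ours, not from Cube.dev or the episode:

```python
def text_to_sql_prompt(question: str, schema: str) -> str:
    """Build a schema-in-prompt request asking an LLM to translate a
    natural-language question into SQL. The model's completion after
    'SQL:' is the candidate query."""
    return (
        "Given the following SQL tables, write a single SQL query "
        "that answers the question.\n\n"
        f"Tables:\n{schema}\n\n"
        f"Question: {question}\nSQL:"
    )

# Hypothetical example schema and question:
schema = "CREATE TABLE orders (id INT, customer TEXT, total REAL);"
prompt = text_to_sql_prompt("What is the total revenue?", schema)
```

Semantic layers like Cube.dev improve on this baseline by giving the model curated metrics and dimensions to target instead of raw tables.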
The End of Finetuning — with Jeremy Howard of Fast.ai
Thanks to the over 17,000 people who have joined the first AI Engineer Summit! A full recap is coming. Last call to fill out the State of AI Engineering survey! See our Community page for upcoming meetups in SF, Paris and NYC. This episode had good interest on Twitter and was discussed on the Vanishing Gradients podcast. Fast.ai’s “Practical Deep Learning” courses have been watched by over 6,000,000 people, and the fastai library has over 25,000 stars on GitHub. Jeremy Howard, one of the creators of Fast.ai, is now one of the most prominent and respected voices in the machine learning industry; but…
Why AI Agents Don't Work (yet) - with Kanjun Qiu of Imbue
Thanks to the over 11,000 people who joined us for the first AI Engineer Summit! A full recap is coming, but you can 1) catch up on the fun and videos on Twitter and YouTube, 2) help us reach 1000 people for the first comprehensive State of AI Engineering survey and 3) submit projects for the new AI Engineer Foundation. See our Community page for upcoming meetups in SF, Paris, NYC, and Singapore. This episode had good interest on Twitter. Last month, Imbue was crowned as AI’s newest unicorn foundation model lab, raising a $200m Series B at a >$1 billion valuation. As “stealth” foundation…
[AIE Summit Preview #2] The AI Horcrux — Swyx on Cognitive Revolution
This is a special double weekend crosspost of AI podcasts, helping attendees prepare for the AI Engineer Summit next week. After our first friendly feedswap with the Cognitive Revolution pod, swyx was invited for a full episode to go over the state of AI Engineering and to preview the AI Engineer Summit Schedule, where we share many former CogRev guests as speakers. For those seeking to understand how two top AI podcasts think about major top-of-mind AI Engineering topics, this should be the perfect place to get up to speed, which will be a preview of many of the conversations taking place during…
[AIE Summit Preview #1] Swyx on Software 3.0 and the Rise of the AI Engineer
This is a special double weekend crosspost of AI podcasts, helping attendees prepare for the AI Engineer Summit next week. Swyx gave a keynote on the Software 3.0 Landscape recently (referenced in our recent Humanloop episode) and was invited to go deeper in podcast format, and to preview the AI Engineer Summit Schedule. For those seeking to ramp up on the current state of thinking on AI Engineering, this should be the perfect place to start, alongside our upcoming Latent Space University course (which is being tested live for the first time at the Summit workshops). While you are listening…
RAG Is A Hack - with Jerry Liu from LlamaIndex
Want to help define the AI Engineer stack? >800 folks have weighed in on the top tools, communities and builders for the first State of AI Engineering survey, which we will present for the first time at next week’s AI Engineer Summit. Join us online! This post had robust discussion on HN and Twitter. In October 2022, Robust Intelligence hosted an internal hackathon to play around with LLMs which led to the creation of two of the most important AI Engineering tools: LangChain 🦜⛓️ (our interview with Harrison here) and LlamaIndex 🦙 by Jerry Liu, which we’ll cover today. In less than a year…
Building the Foundation Model Ops Platform — with Raza Habib of Humanloop
Want to help define the AI Engineer stack? >500 folks have weighed in on the top tools, communities and builders for the first State of AI Engineering survey! Please fill it out (and help us reach 1000!) The AI Engineer Summit schedule is now live! We are running two Summits and judging two Hackathons this Oct. As usual, see our Discord and community page for all events. A rite of passage for every AI Engineer is shipping a quick and easy demo, and then having to cobble together a bunch of solutions for prompt sharing and versioning, running prompt evals and monitoring, storing data and finetuning…