Rise of the AI PC & local LLMs
Manage episode 421994411 series 2385063
We’ve seen a rise in interest recently and a number of major announcements related to local LLMs and AI PCs. NVIDIA, Apple, and Intel are getting into this along with models like the Phi family from Microsoft. In this episode, we dig into local AI tooling, frameworks, and optimizations to help you navigate this AI niche, and we talk about how this might impact AI adoption in the longer term.
Changelog++ members save 5 minutes on this episode because they made the ads disappear. Join today!
Sponsors:
- Ladder Life Insurance – 100% digital — no doctors, no needles, no paperwork. Don’t put it off until the very last minute to get term coverage life insurance through Ladder. Find out if you’re instantly approved. They’re rated A and A plus. Life insurance costs more as you age, now’s the time to cross it off your list.
- Neo4j – Is your code getting dragged down by JOINs and long query times? The problem might be your database… Try simplifying the complex with graphs. Stop asking relational databases to do more than they were made for. Graphs work well for use cases with lots of data connections like supply chain, fraud detection, real-time analytics, and genAI. With Neo4j, you can code in your favorite programming language and against any driver. Plus, it’s easy to integrate into your tech stack.
- Fly.io – The home of Changelog.com — Deploy your apps and databases close to your users. In minutes you can run your Ruby, Go, Node, Deno, Python, or Elixir app (and databases!) all over the world. No ops required. Learn more at fly.io/changelog and check out the speedrun in their docs.
Featuring:
Show Notes:
- Ollama
- LM Studio
- llama.cpp
- OpenVINO
- MLPerf client working group
- Article - 5 top small language models
- GPTQ article
- Article - Which quantization method is right for you
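The quantization articles linked above are about shrinking model weights so they fit on consumer hardware. A rough back-of-the-envelope sketch of why bit-width matters for local inference (the 7B parameter count and bit-widths below are illustrative assumptions, not figures from the episode):

```python
# Approximate weight memory for a local LLM at different quantization
# levels. Ignores activations and KV cache, which add real overhead.

def model_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Weight memory in gigabytes: params * bits / 8 bits-per-byte / 1e9."""
    return n_params * bits_per_weight / 8 / 1e9

n_params = 7e9  # a typical "small" local model, e.g. a 7B-class LLM
for label, bits in [("fp16", 16), ("int8", 8), ("int4", 4)]:
    print(f"{label:>5}: {model_size_gb(n_params, bits):5.1f} GB")
```

This is why 4-bit methods like GPTQ come up so often in the local-LLM conversation: they take the same 7B model from roughly 14 GB of weights at fp16 down to about 3.5 GB, which fits comfortably in the RAM of an ordinary laptop.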
Something missing or broken? PRs welcome!
Chapters
1. Welcome to Practical AI (00:00:00)
2. Correcting our mistakes (00:00:43)
3. Local offline AI PCs (00:03:34)
4. Local model applications (00:09:23)
5. Hosted vs Local (00:12:32)
6. Sponsor: Ladder Life Insurance (00:16:01)
7. Exploring local options (00:18:00)
8. AI PC prices (00:21:54)
9. Naming conventions (00:24:04)
10. Sponsor: Neo4j (00:25:40)
11. Alphabet soup (00:26:56)
12. CPU derivative (00:29:22)
13. Thanks for joining us! (00:34:09)
14. Outro (00:34:48)
302 episodes