In this edition we’ll be covering…
Merriam-Webster's word of the year reflects AI's impact
Google's new unified API for models and agents
NVIDIA's strategic moves in AI infrastructure
5 trending AI signals from the week
3 cutting-edge AI tools to boost your workflow
And much more…
The Latest in AI
The Dictionary Gets Real About AI Content
Merriam-Webster just made it official: "slop" is the word of 2025. And yes, they're talking about AI.
The dictionary defines slop as "digital content of low quality that is produced usually in quantity by means of artificial intelligence." Apparently, when your social feeds look like they've been run through a content blender, there's now an official term for it...
What's driving this?
Nearly 75% of new web content published earlier this year involved AI in some form, according to a May study from Ahrefs.
The rise of AI tools like OpenAI's Sora and Google's Veo has created what critics call a "slop economy" where low-quality AI content can be monetized through advertising.
The term has expanded beyond media to describe AI's impact on cybersecurity reports, legal briefings, and even college essays.
So What?
This isn't just a vocabulary update. It's a cultural marker. When a dictionary picks up a term that mocks the very technology everyone's investing billions into, it signals we've hit a new phase. AI has officially moved from "revolutionary" to "ubiquitous nuisance" in the public consciousness.
The real question isn't whether AI can generate content anymore. It's whether we can tell the difference between what's worth reading and what's just... well, slop. And according to Merriam-Webster, most of us already can't.
Innovation Showcase
Build AI Agents with Google’s Interactions API
Google just dropped the Interactions API, and it's a game-changer for building AI agents.
Instead of managing complex state and context manually, the Interactions API gives you a unified endpoint for both models like Gemini 3 Pro and agents like Gemini Deep Research. It handles server-side state management, background execution, and even supports Model Context Protocol (MCP) servers.
Here's how to get started with the Interactions API:
Sign up for Google AI Studio and get your Gemini API key.
Make your first interaction with a model:
import time

from google import genai

client = genai.Client()

interaction = client.interactions.create(
    input="Summarize the key features of transformer models",
    model="gemini-3-pro-preview"
)
print(interaction.outputs[-1].text)

Try it with an agent for long-running research tasks:
interaction = client.interactions.create(
    input="Research the history of GPU computing",
    agent='deep-research-pro-preview-12-2025',
    background=True
)

# Poll for results
while True:
    interaction = client.interactions.get(interaction.id)
    if interaction.status == "completed":
        print(interaction.outputs[-1].text)
        break
    time.sleep(10)

🔥 Power User Tip: The Interactions API supports streaming responses and automatic reconnection for long-running tasks. Set stream=True to get real-time updates on research progress without keeping a constant connection open.
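If you reach for polling often, the bare while loop above can be wrapped in a small reusable helper with a timeout. This is only a sketch of the pattern: the client.interactions.get call and the status values mirror the snippet above, and the "failed" status and wait_for_interaction name are our own assumptions, not documented API.

```python
import time


def wait_for_interaction(client, interaction_id, poll_seconds=10, timeout=600):
    """Poll an interaction until it finishes (assumed statuses: completed/failed),
    or raise TimeoutError once the deadline passes."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        interaction = client.interactions.get(interaction_id)
        if interaction.status in ("completed", "failed"):
            return interaction
        time.sleep(poll_seconds)
    raise TimeoutError(f"Interaction {interaction_id} did not finish in {timeout}s")
```

The timeout uses time.monotonic() rather than wall-clock time, so the deadline can't be thrown off by system clock changes during a long research run.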
Using the Interactions API is just scratching the surface of building AI agents. Check out our platform for the most comprehensive way to build them for your use cases!
Industry Intel
NVIDIA Buys the Racetrack
NVIDIA just threw down the gauntlet with a double announcement that shows they're not letting anyone catch up in the AI race.
First, they unveiled Nemotron 3, a new family of open-source AI models designed to compete head-on with the surge of Chinese offerings from DeepSeek, Moonshot AI, and Alibaba. Then they acquired SchedMD, the company behind Slurm—the workload management system running more than half of the world's top supercomputers.
What's happening here?
Nemotron 3 Nano (30B parameters) is available now on Hugging Face, with larger models rolling out in 2026. These models are built for agentic AI systems and available completely open-source.
The SchedMD acquisition ensures NVIDIA controls the critical infrastructure needed for AI training and inference at scale—Slurm is used by foundation model developers and will remain open-source and vendor-neutral.
This move comes as Chinese AI firms are flooding the market with high-quality open-source models, intensifying global competition in the AI space.
So What?
NVIDIA isn't just selling chips anymore. They're building the entire stack—from silicon to scheduling software to the models themselves. By keeping Slurm open-source while owning its development, they're positioning themselves as both the enabler and the leader of the AI infrastructure revolution.
While everyone else is racing to train bigger models, NVIDIA just bought the racetrack. And they're inviting everyone to keep running on it—as long as they remember who owns the place.
Quick Bites
Stay updated with our favorite highlights, and dive in for the full flavor of the coverage!
Google Translate brings live speech translations to any headphones on Android, supporting over 70 languages—no more Pixel Buds required.
Trump launches "U.S. Tech Force" to hire 1,000 engineers for AI infrastructure projects, partnering with Amazon, Apple, Google, Microsoft, NVIDIA, OpenAI, and others for two-year federal service programs.
OpenAI hires veteran Google executive as corporate development VP, continuing the talent shuffle between AI giants.
Time magazine names the architects of AI as "Person of the Year," featuring Jensen Huang, Elon Musk, Dario Amodei, Lisa Su, Mark Zuckerberg, Demis Hassabis, Fei-Fei Li, and Sam Altman.
Barry Zhang and Mahesh Murag from Anthropic highlight Agent Skills in their latest AIE talk.
Trending Tools
🔍 Google Deep Research Agent - An autonomous research agent that plans, executes, and synthesizes multi-step research tasks powered by Gemini 3 Pro.
🔗 n8n 2.0 - Open-source workflow automation with enhanced security, reliability, and a new Publish/Save paradigm for safer production updates.
🤖 GPT-5.2 - OpenAI's most advanced model for professional work, achieving 70.9% on GDPval knowledge tasks and setting new records in coding and long-context reasoning.
Until we Type Again…
Thank you for reading yet another edition of the Digestibly Newsletter!