Gemini CLI 6.0: They JUST ADDED a TON OF NEW FEATURES for GEMINI-3!
By AICodeKing
Summary
## Key takeaways
- **Gemini CLI: From Chat to Developer Platform**: The Gemini CLI has evolved beyond a simple terminal chat into a scriptable assistant with extensions, smart model routing, and an autonomous sidekick for background coding tasks. [00:20], [00:33]
- **v0.11.0: Stream JSON & Markdown Toggle**: Version 0.11.0 introduces streamable JSON output for headless runs and a Markdown toggle for easy switching between rendered and raw text, improving visibility and workflow. [01:29], [01:39]
- **v0.12.0: Smart Model Routing & Code Investigator**: Version 0.12.0 makes model selection first-class with smart routing (Flash for quick tasks, Pro for heavy ones) and introduces a Codebase Investigator to explore your workspace for context. [02:24], [02:46]
- **Jules Integration: Autonomous Background Coding**: The Jules integration acts as an autonomous sidekick, cloning your repo, installing dependencies, and modifying files in a managed VM, allowing you to delegate background coding tasks. [04:04], [04:10]
- **Extensions: Expandable Functionality**: Extensions like Hugging Face, Monday.com, and Data Commons can be installed, listed, and updated directly within the CLI, expanding its capabilities. [03:11], [03:18]
- **Practical Workflows: Export & Stream**: New workflows allow exporting conversations with tool calls (Markdown/JSON) and streaming JSONL headlessly for real-time monitoring of agent progress. [06:24], [06:36]
Topics Covered
- Gemini CLI evolves into a scriptable platform.
- Streamable JSON output for headless automation.
- Model routing optimizes for performance and quota.
- Codebase investigator brings workspace awareness to the CLI.
- Jules: Your autonomous sidekick for background coding tasks.
Full Transcript
[Music]
[Applause]
Hi, welcome to another video. Today I'm
looking at the Gemini CLI upgrades after
V0.9.0
and I'm folding in the Jules integration
because it meaningfully changes how you
work.
Quick vibe check. This is no longer just
a chat in your terminal.
It's a scriptable assistant with extensions, smarter model routing, and now an autonomous sidekick: you can delegate background coding tasks to Jules. It's free to try, setup is not bad, and the polish shows.
Let's start at v0.10.0. This release is heads-down polish.
Interactive tool calling got better. So
if you need to run a TTY tool inside the
CLI, it handles that without bouncing
you to another shell. Alt plus key
support is broader, which helps if you
live in tiling terminals. Telemetry now
tracks diff stats, lines changed by me
versus the model, useful for CI
visibility, and understanding how much
automation actually edits your code.
It's small stuff, but you feel it day-to-day.
Then v0.11.0. Two things I care about: orchestration and visibility.
There's a proper streamable JSON output mode (--output-format stream-json) so you can tail agent progress in headless runs.
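For reference, a minimal sketch of what a headless streaming run could look like, based on the flag named in the video; the prompt flag and the exact event shape are assumptions that may differ by CLI version.

```bash
# Hedged sketch: run Gemini CLI headlessly and capture the streamed JSONL events.
# --output-format stream-json is the flag mentioned in the video; --prompt for
# non-interactive input is assumed here and may be named differently in your version.
gemini --prompt "summarize the open TODOs in this repo" \
       --output-format stream-json \
  | tee agent-events.jsonl
```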
Markdown has a toggle (Alt+M or Ctrl+M) for switching between rendered and raw text, which is handy for clean copy-paste. You can edit queued messages with the up arrow when the input is empty, so prompt iteration doesn't break your flow.
Web fetch now presents non-HTML payloads like JSON to the model correctly, and you can run MCP slash commands non-interactively from the command line. Some deprecated flags are gone; you'll switch to newer patterns or env vars.
Now, version 0.12.0 is the platform inflection point.
Model selection is first-class with /model, and model routing sends quick queries to Flash while heavier creative or analytical tasks go to Pro. It's practical: you preserve quota without babysitting. You can opt out and pin a model if you need deterministic runs.
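As a rough illustration of that workflow (the slash command is from the video, but the argument form shown below is hypothetical):

```bash
# Inside an interactive session: check routing or pin a model for deterministic runs.
/model                   # open the model picker / see what routing would choose
/model gemini-2.5-pro    # hypothetical argument form: pin Pro explicitly
```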
The other big piece is the Codebase Investigator sub-agent. Turn it on in /settings and it will explore your workspace, resolve relevant files, and bring context into the session. Limit its turns if you prefer guardrails.
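A minimal sketch of what that configuration might look like; the settings file location is the CLI's usual one, but the key names below are hypothetical stand-ins for whatever /settings actually exposes in your version.

```bash
# Hedged sketch: Gemini CLI keeps user settings in ~/.gemini/settings.json.
# The keys below are hypothetical placeholders for the options mentioned in the
# video (enable the investigator, cap its turns, tune compression); check
# /settings for the real names before editing anything.
cat > ~/gemini-settings.example.json <<'EOF'
{
  "codebaseInvestigator": { "enabled": true, "maxTurns": 10 },
  "compressionThreshold": 0.7
}
EOF
```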
It's basically an indexer for the assistant, and it helps with multi-file changes and refactors. Extensions keep growing: Hugging Face, Monday.com, Data Commons. You install with gemini extensions install, enable it, list to confirm, update when needed, and you can manage from inside the session with /extensions list and /extensions update.
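Put together, the lifecycle from the shell looks roughly like this, per the commands named in the video (the URL is a placeholder for whichever extension you're installing):

```bash
# Extension lifecycle from the shell.
gemini extensions install <github-url-or-local-folder>
gemini extensions list      # confirm it's installed and enabled
gemini extensions update    # pull the latest changes
# ...and mid-session: /extensions list, /extensions update
```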
There's also extension explorer, which
just opens the catalog in your default
browser.
Compression thresholds are configurable in /settings.
API key entry now has a secure dialog. No more sprinkling secrets in env vars if you don't want to.
Sequential approvals let you approve multiple tool calls in a row, which reduces the tap dance on longer executions.
All right, let's talk Jules, because this is the new mental model. Jules integrates as an autonomous sidekick that you command from Gemini CLI. It runs in a managed VM, clones your repo, installs dependencies, modifies files, and can submit changes to a new branch. You stay in flow in the terminal; Jules does medium-span work in the background.
Here's the setup, straight from the article. You need a Jules account, and to connect your GitHub repo in the Jules console. Install the extension, then use it with slash prompts like /jules convert CommonJS modules to ES modules, and check status with /jules what is the status of my last task. What this means practically: you can offload background bug fixes, mechanical refactors, or format conversions while you keep shipping in Gemini CLI.
Jules handles the VM work: clones, dependency installs, edits, and pushes results to a new branch. Treat it like CI: scope prompts clearly, review diffs, and gate merges.
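In session, that workflow boils down to something like the sketch below, using the slash prompts quoted in the video (the extension install and the GitHub repo connection are assumed to be done already):

```bash
# Delegate a background task to Jules from inside a Gemini CLI session.
/jules convert CommonJS modules to ES modules
# ...keep working; check in later:
/jules what is the status of my last task
# Jules pushes its changes to a new branch; review the diff and gate the merge
# yourself, the same way you'd treat CI output.
```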
Let me do a quick explanation of the updates as well. You can start a session, then run /model to pin Pro for heavier coding work, or switch to Flash for quick notes. You can use this to adjust compression and enable the Codebase Investigator in /settings, capping its turns for guardrails.
You can use this to install extensions from a GitHub URL or local folder with gemini extensions install, enable or disable them, list to see what's installed, update to pull the latest changes, and scaffold new ones with new if you're building. You can use this to manage extensions mid-session with /extensions list or /extensions update.
You can use this to open the gallery with /extension explorer and discover community, partner, and Google-built integrations.
You can use this to set show status in title to true, so your terminal title shows live status and thoughts while juggling panes.
You can use this to export a conversation with tool calls included, using /chat share with a Markdown or JSON file, for PRs and postmortems.
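For example, something along these lines (the file names are placeholders; the video mentions Markdown and JSON as the two export formats):

```bash
# Export the current conversation, tool calls included, for a PR or postmortem.
/chat share session-notes.md     # rendered Markdown export
/chat share session-notes.json   # structured JSON export
```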
You can use this to stream JSONL headlessly with --output-format stream-json to monitor agent progress in real time.
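A rough sketch of that monitoring loop; the field name used in the jq filter is an assumption about the stream's event shape, not a documented field:

```bash
# Hedged sketch: watch a headless run's events as they stream.
gemini --prompt "run the test suite and summarize failures" \
       --output-format stream-json \
  | jq -c 'select(.type? != null) | {type: .type}'   # "type" is an assumed field name
```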
And you can use this to delegate background tasks with /jules prompts, check status when you need, and review the resulting branch diffs before merging. That's the flow, and it lands cleanly. They are trying to turn the CLI into a platform: you bring your stack to the agent, which is quite awesome. Model routing is sensible and quota-smart, and pinning models gives deterministic behavior when you want it.
The Codebase Investigator adds workspace awareness, which I've really wanted for feature work. Jules is the async layer that offloads medium-span tasks and returns clean branches for review.
I really liked it and have been using
it. That's why I thought to share it
with you guys as well.
Stream JSON is super cool for automation, and the secure API key dialog reduces friction around secrets. The Markdown toggle, queued message editing, and non-interactive MCP prompts are quality-of-life wins that make daily use smoother. However, there are limitations.
Extensions need initial setup, auth configuration, and a bit of yak shaving on first run, so if you wanted instant magic, that's a bummer. Headless approvals are powerful, but you'll want guardrails: trusted folders, sandboxed workspaces, clear policies, and a tight list of allowed tools.
IDE plug-in maturity will vary by
editor. Some experiences will be richer
sooner. Others will lag.
Compression tuning can get fiddly: too aggressive and you lose context fidelity, too loose and your runs get heavier. With model routing, if you care about reproducible outputs in tests, you'll use /model to pin rather than leaving routing on. And with Jules, prompt precision matters; noisy diffs are on you if you underspecify tasks.
So there's that. Personal take: this direction makes Gemini CLI feel like a serious platform. You can use this to wire up extensions, delegate to Jules, stream telemetry, and export clean artifacts with tool calls included. They are surely trying to make it better for Gemini 3. Overall, it's pretty cool. Anyway, share your thoughts below
cool. Anyway, share your thoughts below
and subscribe to the channel. You can
also donate via super thanks option or
join the channel as well and get some
perks. I'll see you in the next video.
Bye.
[Music]