PgAdmin 4 9.13 with AI Assistant Panel
79 points by __natty__ 9 hours ago | 21 comments

panzi 5 hours ago
Yeah, no thanks. I switched to dbeaver already anyway, because pgadmin was picky about which postgres versions it could connect to. Too much of a hassle to set up a new version from source back when I tried. With dbeaver I just run ./dbeaver from the extracted .tar.gz. dbeaver is also not a web interface, but a real desktop application (Java, though).
reply
chaz6 7 hours ago
When I got the update I looked through the settings and there appears to be no way to disable it. I do not want AI anywhere near my database. At least I only use it for testing/staging, so I hopefully won't have to worry about it wrecking production.
reply
ziml77 6 hours ago
What's the danger? It can see the schemas to help it generate the queries but it can't run anything on its own. Also you have to give the application credentials to an AI provider for the feature to work. So, you can just not do that.
reply
adamas 6 hours ago
There don't need to be potential dangers to not want non-deterministic features in an application.
reply
imjared 7 hours ago
The docs suggest that you can set the default provider to "None" to disable AI features: https://www.pgadmin.org/docs/pgadmin4/9.13/preferences.html#...
reply
smartbit 6 hours ago
Note: AI features must be enabled in the server configuration

  LLM_ENABLED = True 
in config.py for these preferences to be available.
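For the opposite case (keeping the panel off for good), pgAdmin's usual convention is to put site-local overrides in `config_local.py` rather than editing `config.py` directly, so the setting survives upgrades. A minimal sketch, assuming the `LLM_ENABLED` flag named in the comment above:

```python
# config_local.py -- pgAdmin reads this file for site-local overrides,
# so the setting is not clobbered when config.py changes on upgrade.
# LLM_ENABLED is the flag quoted in the comment above; verify it against
# your pgAdmin version's config.py before relying on it.
LLM_ENABLED = False  # keep the AI Assistant preferences hidden
```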
reply
OptionOfT 5 hours ago
I did not enable this and yet I got the panel in the UI.
reply
zenmac 6 hours ago
It is nice that they have the default set to "None". However, having this feature in pgAdmin is a distraction from the project.

If it is just calling an API anyway, then I don't want to have this in my db admin tool. It also exposes a surface area for potential data leakage.

reply
bensyverson 6 hours ago
Worth pointing out that Postgres is perfectly usable without an admin dashboard at all
reply
lateforwork 4 hours ago
Did you miss this:

"This feature requires an AI provider to be configured in Preferences > AI."

And then you have to supply an API key (see here https://www.pgedge.com/blog/ai-features-in-pgadmin-configura... )

You don't get AI for free!

reply
rubicon33 6 hours ago
What do you do in production?
reply
vavkamil 6 hours ago
Quick fix based on https://github.com/pgadmin-org/pgadmin4/issues/9696#issuecom...

Click on the "Reset layout" button in the query tool (located in the top right corner), and it will move the "AI Assistant" tab to the right. Now, when you query a table, it will default to the Query tab as always.

reply
jplaz 3 hours ago
Switched from DBeaver to DataGrip and I couldn't be happier.
reply
aitchnyu 6 hours ago
Might as well choose our AI subscription for our tools. I always hated the sparkle icons in Mongodb Compass (db browsing tool), Cloudwatch (logs) etc which is wired to a useless model. So I always chose to write Python scripts to query Postgres and other DBs and render pretty tables to CLI.
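The script-and-pretty-tables approach could look something like this minimal sketch. sqlite3 stands in for Postgres so the example is self-contained; with psycopg2 the cursor code is the same, since both follow DB-API 2.0. The table and data are invented for illustration:

```python
# Sketch: query a database from a script and render rows as an aligned
# plain-text table on the CLI. sqlite3 is a stand-in for Postgres here;
# swap sqlite3.connect() for psycopg2.connect() against a real database.
import sqlite3

def render_table(headers, rows):
    """Render rows as a simple column-aligned CLI table."""
    cols = [headers] + [[str(v) for v in row] for row in rows]
    widths = [max(len(r[i]) for r in cols) for i in range(len(headers))]
    sep = "-+-".join("-" * w for w in widths)
    lines = [" | ".join(h.ljust(w) for h, w in zip(headers, widths)), sep]
    for row in rows:
        lines.append(" | ".join(str(v).ljust(w) for v, w in zip(row, widths)))
    return "\n".join(lines)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "Ada"), (2, "Grace")])
cur = conn.execute("SELECT id, name FROM customers ORDER BY id")
print(render_table([d[0] for d in cur.description], cur.fetchall()))
```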
reply
zbentley 6 hours ago
Eh, as someone generally on the skeptical end of the spectrum for a lot of AI-assisted ops tasks, exploratory query generation is a great use case for it.

I’m highly proficient in code, only average at SQL, and am routinely tasked to answer one-off questions or prototype reporting queries against highly complex schemas of thousands of tables (owned by multiple teams and changing all the time, with wildly insufficient shared DAO libraries or code APIs for constructing novel queries). My skill breakdown and situation aren’t optimal, certainly, but they aren’t uncommon either.

In that context, being able to ask "write a query that returns the last ten addresses of each of the highest-spending customers, but only if those addresses are in the shipment system and are residences, not businesses" is a huge time-saver. Like, I could figure out the schemas of the ten tables involved in those queries and write those joins by hand, slowly. That would take time and, depending on the data, the approach might get stale fast.

reply
stuaxo 6 hours ago
If I can use this with a local LLM it could be useful.
reply
kay_o 3 hours ago
Ollama is included by default; just add the endpoint URL yourself.
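Wiring a local endpoint up could look roughly like this. This is a sketch, not pgAdmin's actual integration: the model name and prompt wording are assumptions, and it expects a local Ollama server on its default port 11434 with the model already pulled:

```python
# Sketch: send a schema-aware SQL-generation prompt to a local Ollama
# server instead of a hosted provider. Model name, prompt shape, and the
# default-port assumption are all illustrative.
import json
import urllib.request

def build_prompt(question, schema_ddl):
    """Combine schema DDL and a natural-language question into one prompt."""
    return (
        "Given this PostgreSQL schema:\n"
        f"{schema_ddl}\n"
        f"Write a single SQL query that answers: {question}"
    )

def generate_sql(question, schema_ddl, model="llama3.2",
                 host="http://localhost:11434"):
    """Call Ollama's /api/generate endpoint and return the model's text."""
    payload = {
        "model": model,
        "prompt": build_prompt(question, schema_ddl),
        "stream": False,  # one JSON response instead of a token stream
    }
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```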
reply
zbentley 6 hours ago
Yeah. This seems like an area where a “tiny” (2-4GB) local model would be more than sufficient to generate very high quality queries and schema answers to the vast majority of questions. To the point that it feels outright wasteful to pay a frontier model for it.
reply
msavara 5 hours ago
No thank you. One of the worst ads for python that exists. The only one worse than pgAdmin is Windows 11.
reply
allthetime 2 hours ago
Postico is really nice on macOS
reply
naranha 6 hours ago
The only interface that works efficiently for me with LLMs is the chatbot interface. I'd rather copy and paste snippets into the chat box than have IDEs and other tools guess what I might want to ask an AI.

The first thing I do with these integrations is look for how to remove them.

reply