The latest feature: google_lens_detect uses OpenCV to find objects in an image, crops each one, and sends them to Google Lens for identification. GPT-OSS-120B, a text-only model with
zero vision support, correctly identified an NVIDIA DGX Spark and a SanDisk USB drive from a desk photo.
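Roughly, the flow is: detect candidate object regions with OpenCV, crop each one, and send the crops to Lens. Here is a simplified sketch, not the actual implementation: the contour detector below is a stand-in, and lens_identify is just a placeholder for the step that uploads a crop to Google Lens (presumably what the Playwright dependency is for).

```python
# Simplified sketch of the detect -> crop -> identify flow described above.
# The contour-based detector is a stand-in for whatever google_lens_detect
# actually uses, and lens_identify is a hypothetical placeholder for the
# Google Lens upload step.
import cv2

def find_object_crops(image_path: str, min_area: int = 5000):
    """Return cropped regions of the image that likely contain distinct objects."""
    image = cv2.imread(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(cv2.GaussianBlur(gray, (5, 5), 0), 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    crops = []
    for contour in contours:
        x, y, w, h = cv2.boundingRect(contour)
        if w * h >= min_area:  # drop tiny regions and edge noise
            crops.append(image[y:y + h, x:x + w])
    return crops

def lens_identify(crop) -> str:
    """Hypothetical: upload the crop to Google Lens and return its label."""
    raise NotImplementedError

def google_lens_detect(image_path: str) -> list[str]:
    # The labels come back as plain text, which is why a text-only model
    # like GPT-OSS-120B can work with the results.
    return [lens_identify(crop) for crop in find_object_crops(image_path)]
```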
Also includes Google Search, News, Shopping, Scholar, Maps, Finance, Weather, Flights, Hotels, Translate, Images, Trends, and more. 17 tools total.
Two commands: pip install noapi-google-search-mcp && playwright install chromium
GitHub: https://github.com/VincentKaufmann/noapi-google-search-mcp
PyPI: https://pypi.org/project/noapi-google-search-mcp/
Booyah!

But wasn't it Google Lens that actually identified them?
If something was built by violating TOS, and you use that thing to commit more TOS violations against the ones who committed TOS violations to build it in the first place, do they cancel each other out?
Not about GPT-OSS specifically, but say you used Gemma instead for the same purpose in this hypothetical.
> What exact llama model (+ quant I suppose) is it that you've had better results against
Not llama, but Qwen3-coder-next is on top of my list right now. Q8_K_XL. It's incredible (not just for coding).
> Jinja threw a bunch of errors and GPT-OSS couldn't make tool calls.
This was an issue for a week or two when GPT-OSS initially launched, as none of the inference engines had properly implemented support for it, especially around tool calling. I'm running GPT-OSS-120B MXFP4 with LM Studio and directly with llama.cpp; the recent versions handle it well and I get no errors.
However, when I've tried either 120B or 20B with additional quantization (not the "native" MXFP4 builds), I've seen them have trouble with the tool syntax too.
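For what it's worth, a quick way to check whether a given quant handles tool calls is to hit the local OpenAI-compatible endpoint (both llama-server and LM Studio expose one) with a trivial tool and see whether a structured tool_calls entry comes back. A rough sketch; the port, model name, and tool definition are placeholders:

```python
# Rough tool-call check against a local OpenAI-compatible server
# (llama.cpp's llama-server or LM Studio). Port and model name are placeholders.
import json
import requests

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "gpt-oss-120b",  # whatever name the local server reports
        "messages": [{"role": "user", "content": "What's the weather in Oslo?"}],
        "tools": tools,
    },
    timeout=120,
)
message = resp.json()["choices"][0]["message"]
# A quant with broken tool syntax tends to dump malformed markup into
# `content` instead of returning a structured `tool_calls` list here.
print(json.dumps(message.get("tool_calls"), indent=2))
```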
> Not llama
What does your original comment mean, then? You said llama was "strictly" better than GPT-OSS; which specific model variant are you talking about, or did you miswrite?
Why do I need gpt-oss-120B at all in this scenario? Couldn't I just call e.g. the gemini-3-pro API directly from the Python script?
What part here is the knowing or understanding? Does solving an integral symbolically provide more knowledge than numerically or otherwise?
Understanding the underlying functions themselves and the areas they sweep: has substitution or integration by parts actually provided you with that?
OP says “I taught LLM how to see”, and that should mean the LLM (which is capable of being taught, of learning) internalized how to see. It did not; it was given a tool that does the seeing and tells it what things are.
People are very interested in getting good local LLMs with vision integrated, and so they want to read about it. Next to nobody would click on the honest “I enabled an LLM to use a Google service to identify objects in images”, which is what OP actually did.
I'm under the impression I'm being hampered by a separation of 'brain' and 'eyes': I have yet to find a reasoning + vision local model that fits on my Mac, and I've played with two instances of Qwen (one vision, one reasoning) to try to work around it, but no real breakthroughs yet. The requirements I've given myself are fully local models, and no reading data from the ROM that the human player cannot be aware of.
I was hoping OP had managed to retrofit vision onto blind models, not just offload it to a cloud model. It's still an interesting write-up, but I for sure got click-baited.
In 1D, substitution by linear functions like "t = 3x + 1" is very insightful. It's a pity that sometimes we don't have time to analyze it more deeply. Other substitutions may or may not be insightful. Some tricks like "t = sin(x)" have a nice geometrical interpretation, but it's never explained; we don't really teach it anymore anyway.
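One concrete case where the sine substitution is literally geometry (writing it in the x = sin t direction): the quarter-circle area, where the new variable is just the angle on the unit circle.

```latex
% Quarter-circle area via the sine substitution: the new variable t is the
% angle on the unit circle, which is the geometric content of the trick.
\[
  \int_0^1 \sqrt{1-x^2}\,dx
  \;\overset{x=\sin t}{=}\;
  \int_0^{\pi/2} \cos^2 t\,dt
  \;=\; \int_0^{\pi/2} \frac{1+\cos 2t}{2}\,dt
  \;=\; \frac{\pi}{4}.
\]
```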
Integration by parts is not very insightful until you get to the 3rd or 4th year and learn Sobolev spaces or advanced Electrodynamics. I'd like to drop it, but other courses require it and I'd be fired.
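To spell out the payoff: with a compactly supported test function the boundary term disappears, and the resulting identity is exactly what gets promoted to the definition of the weak derivative in Sobolev spaces. A 1D sketch:

```latex
% Integration by parts against a test function \varphi with compact support:
% the boundary term vanishes, and the identity becomes the definition of the
% weak derivative v of u (1D case, \Omega an open interval).
\[
  \int_\Omega u\,\varphi'\,dx \;=\; -\int_\Omega u'\,\varphi\,dx
  \qquad \text{for all } \varphi \in C_c^\infty(\Omega),
\]
\[
  W^{1,p}(\Omega) \;=\; \Bigl\{\, u \in L^p(\Omega) \;:\; \exists\, v \in L^p(\Omega),\;
  \int_\Omega u\,\varphi'\,dx = -\int_\Omega v\,\varphi\,dx
  \;\;\forall\, \varphi \in C_c^\infty(\Omega) \,\Bigr\}.
\]
```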
In some cases, parity and other symmetries are interesting, but those tricks are mostly taught in Physics rather than in Math.
Also, in the second year we get 2D and 3D integrals, which allow a lot of interesting changes of variables, and things like the Gauss theorem and its relation to conservation laws.
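And the Gauss theorem point, spelled out: applying the divergence theorem to the local conservation law turns it into the statement that what's inside a volume only changes by the flux through its boundary.

```latex
% Divergence (Gauss) theorem, and the local conservation law
% \partial_t \rho + \nabla\cdot\mathbf{j} = 0 integrated over a fixed volume V.
\[
  \int_V \nabla\!\cdot\mathbf{j}\;dV \;=\; \oint_{\partial V} \mathbf{j}\cdot\mathbf{n}\;dS,
  \qquad
  \frac{d}{dt}\int_V \rho\;dV \;=\; -\oint_{\partial V} \mathbf{j}\cdot\mathbf{n}\;dS.
\]
```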