At one point, seemingly out of nowhere, he pointed at his screen share: "Look at how many tokens I've used this month. I run so much Opus." It was a number that was offensively large.
I remember thinking, "That's a really odd flex; this crap is so expensive that using so much of it should be a red flag."
He demonstrated a number of Claude Code use cases he had for managing and tweaking AWS infrastructure that made me, an old greybeard sysadmin older than the internet, think "You've used AI to do something that was a single command."
So this story makes sense. They were being encouraged to just blast away at it six plus months ago.
But if you hit "tab" it'll claim that as an AI-edited line, LOL.
(A lot of the rest of it is stuff I could already have been doing just as fast if I'd ever bothered to learn to use multiple cursors, learned vim navigation, or set up some macros—I never did because my getting-code-on-the-screen speed without those has never been slow enough to hold anything up, in practice)
Probably there's no real dichotomy here and it depends on multiple factors, but it's so weird to see reports that differ so wildly from one another.
Yes, and that's a good thing! This is in fact where a lot of AI value lies. You don't need to know that command anymore - knowing the functional contract is now sufficient to perform the requisite work duties. This is huge!
Of course I lose about as much time as I save to its fuck-ups, so I'd still have been better off learning to actually use a text editor properly. Though (as I mentioned in another post) part of why I've never done that in 25ish years of writing code for pay is that my code-writing speed has never been too slow for any of the businesses I've worked in, i.e. other things move slowly enough that it never mattered.
I find it hard to read "You can do things without knowing things" as a positive improvement in work, society, life, anywhere
Now, they might be; they've certainly used silly metrics in the past (LoC, commit count, etc.) without ever fully acknowledging it. But I don't believe that it's as simple as more tokens = more better.
[1] https://locusmag.com/feature/cory-doctorow-full-employment/
This isn't like that, as it isn't funded through taxes. This is private companies experimenting with their money, and risking downstream cost increases that may cause people to go elsewhere, as they do when they try anything new.
This is much better than just funding people regardless of productivity through forced taxes.
[0] https://nintil.com/the-soviet-union-achieving-full-employmen...
I don't think USSR poverty rates surpassed those of the Tsarist Russia that preceded it. To their credit, I think ideological competition between the capitalist and communist blocs was part of what allowed the improvement of workers' living conditions in capitalist countries after WWII. Fear of revolutions kept the one-percenters from taking all the productivity gains of the period. They had to share some to keep the guillotines away. As soon as things went south in the USSR, from the 70s onwards, and capitalism took over the whole world, lacking any viable extant competition, we reverted to the old norm: workers have been denied their share of productivity gains ever since, and here we are now. A regime premised on free competition was undermined by the lack of competition to itself.
They're using tokens for pointless stuff right now in order to figure out use cases where it helps. You can't do that without also learning where it doesn't help.
My company is doing the same thing.
This reminds me of the story of how the USSR nearly made whales extinct to meet a quota for whale meat that nobody wanted to eat.
The USSR accounted for barely 15% of the global catch (with Japan as the leader).
> that nobody wanted to eat
unsubstantiated.
How are we sliding face first into “snowpiercer but dumber”?
There's definitely some pressure from managers when they hear about N00% productivity boosts in internal presentations, but where I am they'd figure out pretty quickly if you were making up tasks rather than working; the real pressure comes from aggressive deadlines and a shift from the yearly OP1 process to a more agile one.
One person I've talked to has someone in their org who is running GasTown and chews through tokens 24/7. They don't contribute very much, but they're comfortably in the #1 spot.
Add a pre-commit hook to re-create the diagrams on every commit (in case anything changed, of course), that way you can really burn tokens and look good to management.
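For the satirically inclined, a minimal sketch of such a hook - assuming a hypothetical `make diagrams` target standing in for whatever token-hungry command actually regenerates the diagrams, and a hypothetical `docs/diagrams/` output directory:

```shell
#!/bin/sh
# .git/hooks/pre-commit -- regenerate diagrams before every commit.
# "make diagrams" is a stand-in for whatever command actually
# produces the diagrams in your repo (e.g. an AI tool invocation).
make diagrams || exit 1
# Stage any regenerated diagrams so they ride along with the commit.
git add docs/diagrams/
```

Mark it executable (`chmod +x .git/hooks/pre-commit`) and every commit burns tokens whether or not anything changed.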
If you can't change your company, change your company!
I've chosen the wrong profession.
I believe there has to be some downward pressure on these executives to make these decisions, but I'd like to know where exactly it's coming from and what the logic behind it is. Is it some big institution like BlackRock with leverage over many of these companies? That's always been my bet, but I've never known for sure.
Tokens are just another proxy for business value.
The problem they face is that if everybody is judged by business value in dollars, crappy managers are the first to go.
This is analogous to measuring productivity by LoC output.
True, but it looks like productivity to people whose own productivity is measured by how busy their subordinates appear to be.
Burn resources at all costs to appear productive and use proxy metrics to measure success.
Fire productive employees to ensure we have resources to fund the proxy metrics.
AI slop fool’s gold is the product.
We are living in a totally bonkers time.
Things that rhyme with this have indeed been happening at the biggest names.
We have always been living in bonkers times.
And the fact that it's an industry-wide meme at this point makes bright red flashing lights and klaxons go off in my mind: a catastrophic reckoning can't be far off. There's not enough money in the world to keep this up for long.
Negative 2000 Lines of Code
https://news.ycombinator.com/item?id=44381252