What are the most upvoted users of Hacker News commenting on? Powered by the /leaders top 50 and updated every thirty minutes. Made by @jamespotterdev.
> However, they recently switched to investing in private equity funds and now they are getting much better returns, without all the pesky moral issues involved with it.
Investors Warn of 'Rot in Private Equity' as Funds Strike Circular Deals - https://news.ycombinator.com/item?id=46380751 - December 2025
Once Wall Street’s High Flyer, Private Equity Loses Its Luster - https://news.ycombinator.com/item?id=46364566 - December 2025
Private Equity’s Latest Financial Alchemy Is Worrying Investors - https://news.ycombinator.com/item?id=44891882 - August 2025
People Are Worried About Private Market Liquidity - https://www.bloomberg.com/opinion/newsletters/2025-06-10/peo... | https://archive.today/wJ3Uf - June 10th, 2025
Private Equity Fundraising Plunges Amid Struggle to Return Cash - https://www.bloomberg.com/news/articles/2025-05-27/private-e... | https://archive.today/hxvzb - May 27th, 2025
Private Equity Firms Hunt for Alternate Ways to Return Investor Cash - https://www.bloomberg.com/news/newsletters/2025-05-14/privat... | https://archive.today/6UzBk - May 14th, 2025
Unlocking a potential US$3.8 trillion opportunity for private equity firms - https://www.deloitte.com/us/en/insights/industry/financial-s... - December 16th, 2024
> I am old. This immediately triggered "Perl!?" in me...
Same same.
Yup. This is precisely why the first image seems to have oscillating brightness, with clear sharp peaks at yellow and cyan. It's because it's not just changing color, it's literally twice as much light. It goes:
Red - 1x
Yellow - 2x
Green - 1x
Cyan - 2x
Blue - 1x
Magenta - 2x
(Of course magenta is not part of the spectrum.) A very first step towards a better spectrum is simply to maintain constant output brightness (accounting for gamma). There will still be perceptual differences in brightness, as we naturally perceive green as brighter than blue.
Obviously this gets taken into account by the time the author gets to the CIE color model. But there are a number of "intermediate" improvements like that, which you can make.
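The 1x/2x oscillation can be sketched in a few lines (an illustrative Python check of the corner colors, using linear channel values rather than gamma-encoded ones):

```python
import colorsys

# Walk the six "corners" of the RGB hue wheel and sum the channel values.
# Secondary colors (yellow, cyan, magenta) light up two channels at full
# strength, so they emit twice the light of the primaries.
for name, hue in [("red", 0/6), ("yellow", 1/6), ("green", 2/6),
                  ("cyan", 3/6), ("blue", 4/6), ("magenta", 5/6)]:
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    print(f"{name:8s} -> total channel output {r + g + b:.0f}x")
```

Dividing each sample by that sum is the "constant output brightness" first step described above.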
(1) The tests on icons are really not fair, I mean, PNG is always going to win on that kind of image.
(2) It is little realized that AVIF kinda sucks for high-quality photographic images. I mean, if you are making a hero image for your blog and you want it big and fast-loading and it doesn't have to stand up to close inspection, it is great. But if you took it with your mirrorless and you want it to look like you took it with a mirrorless, you do very well with JPEG or (better) WebP. AVIF is meant for encoding video, where people don't look closely at individual frames. JPEG XL rules for high-quality photographs.
It kind of depends on where in the globe one is. Many countries see programming as a white-collar job that happens to be better paid than a plain secretarial one, a mere transition phase into management; that is the success story for most families, the programmer turned boss.
So for those of us who fight against becoming managers, it was for love of programming and the related technical details, even though that usually comes with a pay and career ceiling.
And being unemployed beyond 50 years old, in the many countries that see being a programmer as yet another office job, means being too old for employment and too young for retirement.
Same with me. It's just an expression. One definition of "indulge" is "to take unrestrained pleasure in" (MW). I just read it as an activity the kid really really enjoys.
Or literally a part of the self, which is what the OP was getting at, I think. And there is plenty of that in the software world: "I'm a Rubyist", "I'm a Pythonista", "a Rustacean", and so on. There is plenty of identity ridiculousness. I've been a C programmer, but I've also been a BASIC programmer, an assembly-language programmer, a PHP programmer, a FORTH programmer, and a whole list of others. To me that collapses to "I'm a programmer" (even if the sage advice on HN by the gurus is to never call yourself a programmer, I'm more than happy to do so). It defines what I do, not what or who I am, and it only defines a very small part of what I do. That's one reason why I can't stand the us-vs-them mentality that some programming languages seem to instill in their practitioners.
Unfortunately these are a form of "achievement laundering". When I hear about some young person who won a national science fair I assume their parents are upper-middle class or higher and had connections with a research lab. See
https://www.nature.com/articles/s41562-022-01425-4
If the problem in that paper is not addressed, any kind of DEI in academia is hollow, e.g. "if you are black and your parents are professors you get 500 job offers."
It is very hard to make any progress in this area because there is a lot of demand for status to be hereditary. The more strident anyone is about "meritocracy", the more concerned they are that people in powerful roles be perceived as deserving of them, and the less concerned that talented people find opportunity.
On the PC you can distinguish between the two of them in "raw" mode, but almost all keyboard maps flatten them both into the same key.
The only time I've seen them mapped differently is in games.
It was the electrics and the power train that were the problem. Oh, that and process.
Only because they were originally designed for Web 3D.
> The change to VB/Access and SQL
Brazil had a vibrant and omnipresent Clipper developer ecosystem until VB and Access ate their lunch. This also made a lot of businesses adopt windows.
Nice one, I will link to this from Pianojacq.com if you're ok with that?
It is indeed, and it is very much ripe for a serious review. Which is a pity because I think it is one of HN's most powerful pieces.
My introduction to databases was via dBASE III Plus, shortly followed by Clipper Summer '87, and Clipper 5.x (already OOP and some C++ like constructs).
The change to VB/Access and SQL later on took some mind shifting as the concepts on how to design a database are quite different.
Additionally, the productivity that xBase offered is quite remarkable: a constrained environment like MS-DOS, an automatically memory-managed language, and AOT compilation (when using Clipper, FoxPro and co).
Under-appreciated factor: the problem with decentralization is that it pushes work on to the end user, who is least equipped to deal with it. People actively want centralization of things like anti-spam because it lightens the load. The fact that this gets paid for in insidious ways rather than directly paying for a service causes all sorts of weird market distortions.
Note that Discord doesn't just replace IRC; it also competes with TeamSpeak, since there's a whole voice-and-video sub-feature to it. Not everybody uses it, but the fact that it's available in the same software was advantageous to the original market, gamers.
Submarines work on the principle of the arch: a spherical or cylindrical hull section transfers all the force into compression of the material so there is no net "inwards" force.
The weak points then turn out to be joints, material defects (the famous Titan failure), windows and other piercing points, and any unexpected shear forces.
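The compression argument can be put in numbers with the thin-wall pressure-vessel approximation (a rough sketch with illustrative figures, not a design calculation):

```python
# Thin-wall approximation: external pressure P on a round hull of radius r
# and wall thickness t loads the material in pure compression, with no net
# "inwards" bending force -- as long as the shape stays perfectly round.
def hoop_stress_cylinder(p, r, t):
    return p * r / t          # compressive hoop stress in a cylinder

def stress_sphere(p, r, t):
    return p * r / (2 * t)    # a sphere halves the stress for the same wall

# Illustrative numbers: ~300 m depth (~3 MPa), 3 m radius, 40 mm wall.
p = 3.0e6
print(f"cylinder: {hoop_stress_cylinder(p, 3.0, 0.04) / 1e6:.0f} MPa")
print(f"sphere:   {stress_sphere(p, 3.0, 0.04) / 1e6:.0f} MPa")
```

The formula also hints at why the weak points are what they are: a joint, defect, or window interrupts the uniform compression path the approximation relies on.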
This is rather like my observation about British car companies in the late 20th century:
- large factory of British workers + British management: strife, strikes, disaster, bankruptcy (British Leyland)
- small factory of British workers + British management: success, on a small scale (lots of the F1 industry, McLaren etc; also true of non-car manufacturing)
- large factory of British workers with overseas management: success (Nissan Sunderland, BMW era Mini, etc)
I just bought this the other day, https://www.retro-gamer.de/shop/heft/retro-gamer-2-26-einzel...
The Xbox 360 is now considered a retro gaming device; that was such a reminder of how old I am now, given that my first home computer was a Timex 2068.
> "person who likes making things chooses making things over Netflix"
This is subtly different. It's not clear that the people depicted like making things, in the sense of enjoying the process. The narrative is about LLMs fitting into the already-existing startup culture. There's already a blurry boundary between "risky investment" and "gambling", given that most businesses (of all types, not just startups) have a high failure rate. The socially destructive characteristic identified here is: given more opportunity to pull the handle on the gambling machine, people are choosing to do that at the expense of other parts of their life.
But yes, this relies on a subjective distinction between "building, but with unpredictable results" and "gambling, with its associated self-delusions".
Current AI systems aren't biomimicry; they run a simulation of something vaguely similar to neurons. This is rather like "why does it take more processing power to emulate a PS2 than the original PS2 had".
They shouldn't show as visual representations, but some "ASCII" charts show the IBM PC character set instead of the ASCII set. IIRC, ISO 8859-1 and Unicode share the same code points up to 0xFF; the difference is that UTF-8 encodes the upper half as multi-byte sequences.
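That closeness is easy to check with Python's built-in codecs (a minimal sketch):

```python
# ISO 8859-1 (Latin-1) code points 0x00-0xFF map to the same Unicode code
# points; the difference is on the wire: UTF-8 spends two bytes on the
# upper half (0x80-0xFF), while Latin-1 stays one byte per character.
for ch in ["A", "é", "ÿ"]:
    print(f"{ch!r}: U+{ord(ch):04X}  "
          f"latin-1={ch.encode('latin-1').hex()}  "
          f"utf-8={ch.encode('utf-8').hex()}")
```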
It's hard to tell now, but most likely your second post got "second chanced". That's where they go through things that they think might be popular and put them back on the front page, usually a couple days after they were initially posted.
We only have to write div soups to style our websites because people keep misusing a platform for interactive documents as an OS abstraction.
I have seen CMS systems and asset-management products whose translation and design teams are now mostly gone, thanks to AI taking care of their work.
How many translation jobs, or asset creation jobs are still available?
I also have witnessed backend teams being reduced, thanks to SaaS and iPaaS cloud products that remove the need for backend development; now one only needs to plug a couple of products together, do some AI-based integrations in Boomi, Workato, n8n,... create a frontend with Vercel's v0, and be done with it.
I am under no illusion that it will come for me as well, and I had better slide into some other alternative skill set; at least I am closer to retirement than to hunting for my first job.
Very nice overview, however just like 30 years ago, neural networks and deep learning stuff is not for me, regardless of the tutorials.
Yet, 2D and 3D graphics feel relatively natural, maybe because at least I can visualize that kind of math.
Given what most C compilers are written in, are C programmers also C implementers?
I suspect it also depends on who exactly the compiler writers are; the GCC and LLVM guys seem to have more theoreticians/academics and thus think of the language more abstractly, leading to UB being treated as truly inexplicable and exempt from reasoning, while MSVC and ICC are more on the practical side, and their interpretation of it is, as the standard says, "in a documented manner characteristic of the environment". IMHO the "spirit of C" and the more commonsense approach is definitely the latter, and K&R themselves have always leaned in that direction. This is very much a "letter of the law vs. spirit of the law" argument. The fact that these two different sides have produced compilers with nearly the same performance characteristics shows IMHO that the argument that exploiting UB is mandatory for performance is a debunked myth.
> Adam Mosseri, who has led Instagram for eight years
This is the light of moral clarity in Mountain View that champions Instagram for Kids [1].
[1] https://www.npr.org/2021/12/08/1062576576/instagrams-ceo-ada...
They are open source cathedrals.
What is the equipment at [1] in the video? It looks like a huge room of large centrifugal blowers not connected to any ductwork. Is that some random AI-generated picture? From Los Alamos National Laboratory?
For comparison, here are some large centrifugal blowers from The New York Blower Company.[2] They're a standard industrial item, but unrelated to X-ray machines.
> behave the way the C implementers want them to
If you don't please your users, you won't have any users.
The classic phrase for this is the "reserve army of the unemployed".[1] That goes back to Engels and Marx, around 1845. That's surprisingly early for industrial unemployment. The Industrial Revolution was still starting up.
Farming economies ran out of land, not jobs.
Note that all the examples come from lack of bounds checking.
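As a sketch of that failure class (a hypothetical buffer reader, not one of the linked examples), the fix is always the same check before touching memory:

```python
def read_field(buf: bytes, offset: int, length: int) -> bytes:
    """Return a slice of buf, refusing out-of-bounds requests.

    The bug class: trusting an attacker-controlled offset/length pair.
    In a language without implicit checks, skipping this test reads
    (or writes) adjacent memory instead of raising an error.
    """
    if offset < 0 or length < 0 or offset + length > len(buf):
        raise ValueError("field out of bounds")
    return buf[offset:offset + length]

print(read_field(b"abcdef", 1, 3))
```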
That is one of the reasons; the other is that everything is in the box, hence why Next.js is also the React approach the same enterprises will reach for.
It's also why enterprises love Java with Java/Jakarta EE, .NET with the full Visual Studio install, Rails, Django,...
> Today, 1 in 4 unemployed people, or 1.8 million Americans, have been job searching for over half a year, which in most cases means they’ve also exhausted their unemployment insurance benefits. Benefits vary by state but on average replace less than 40% of a person’s previous income.
The 6-12 months needed to find a job is a worrisome economic predictor and isn’t effectively communicated by unemployment rates alone.
White Americans’ feelings of being “last place” are associated with anti-DEI attitudes, Trump support, and Trump vote during the 2024 U.S. presidential election - https://advances.in/psychology/10.56296/aip00046/ | https://doi.org/10.56296/aip00046
Abstract: Due to racial wealth inequality in the U.S.—inequality that benefits White Americans on average—many Americans associate White people with wealth. Yet, many White Americans report feeling like they, personally, are “falling behind.” We conducted a five-wave longitudinal study with a representative quota sample of non-Hispanic, White Americans (N = 506) during the 2024 U.S. presidential election. We found that White Americans who feel they are falling behind White and Asian Americans, while also being close to being passed by Black and Hispanic Americans, within a perceived tight status hierarchy, reported the most support for DEI bans and Trump, controlling for objective status. Further, White Americans with these status perceptions were most likely to vote for Trump in the 2024 election. We conclude that White Americans’ subjective perceptions of their position in the racial economic hierarchy meaningfully relate to political attitudes and behavior.
The Findings: Using a statistical technique called Latent Profile Analysis (LPA), we identified distinct groups based on where people subjectively ranked themselves and other racial groups on the American status ladder.
* We found a specific group of White Americans (~15% of our sample) who perceived themselves as "tied for last place" with Black Americans.
* Crucially: This group was the most likely to vote for Donald Trump and support bans on Diversity, Equity, and Inclusion (DEI) initiatives.
* Importantly, this effect held true even when we controlled for their actual income, education, age, and gender. In other words, feeling like you are losing status predicted voting behavior more strongly than actually having low status.
Reddit AmA with the authors: https://old.reddit.com/r/politics/comments/1qz9158/we_are_pr...
The general sentiment from musicians and collectors seems to be that they don't want a bunch of scientists coming into their world and telling them what they are or are not hearing, or they just don't understand why controlled tests are required.
There seems to be the same sentiment from audiophiles against testing their ridiculously overpriced placebos, although sometimes it does happen and the results are exactly as you'd expect: https://news.ycombinator.com/item?id=47015987
> The issue isn't AI, it's effort asymmetry
Effort asymmetry is inherent to AI's raison d'être. (One could argue that's true for most consumer-facing technology.)
The problem is AI.
The headline is misleading.
It's simply so now-widows can have another child (or two) with their late husband.
It's motivation for the soldiers -- even if you don't make it, you can still have another child after your death, and in that way you'll live on.
Cracking IDA yourself was, and maybe still is, a "rite of passage" in certain communities.
I've seen messages like that from junior employees up to CEOs. I think it's mainly because most people simply can't type and think quickly enough (150-200 wpm) for their thoughts to naturally become words with next to no effort.
Stradivarius instruments deserve being put on a pedestal for historical reasons. Stradivari basically defined the sound of the modern violin, using flatter arching and f holes with smaller hole areas than the Amatis, which resulted in a significantly more powerful instrument that was better suited to playing in a concert hall (vs. the chamber music of earlier times). Stradivarius violins are also noted for their extremely fine craftsmanship and attention to detail. The majority of modern violins are still modeled after Stradivarius examples (with a probably smaller number modeled after del Gesu instruments and some other makers). Most top soloists play on (heavily modified) Strads, and so it seems pretty clear that, at the very least, Strads are not holding any soloists back - and that is not the case for Amati instruments, for example, which despite being coveted for their age and history just don't have the same power and sound projection as Strads.
But, as other comments have said, there have been at this point a good slew of blind tests, and Strads are hardly ever recognized better than chance when compared to modern instruments, even when played by experts and judged by experts. People have been studying and modeling after Strads for so long it would be pretty shocking if we couldn't make instruments that sounded as good. In my mind that doesn't make Strads any less valuable - an original Picasso is still valued so highly because it was created by the master that invented Cubism, but that doesn't mean that a modern painter couldn't create a Cubist painting that was "just as good", objectively.
Can you please teach me how to use the CAPS LOCK key as a push-to-talk?
So, I had a friend. Had because he's dead. He did this work for a decade and a half and then couldn't deal with it anymore. In that time he put countless assholes behind bars. At some point he stopped responding to my emails so I called the unit and they were absolutely devastated, this guy was the backbone of their operation, the one with by far the most computer experience of all of them. RIP Ronald.
It is very hard to imagine what the life of someone on the frontline is like, the ones that are really battling online scum. So take that 'think of the children' thing and realize that there are people who really do think of the children and it is one of the hardest jobs on the planet.
Quote from TFA:
"The BBC asked Facebook why it couldn't use its facial recognition technology to assist the hunt for Lucy. It responded: "To protect user privacy, it's important that we follow the appropriate legal process, but we work to support law enforcement as much as we can."
So, privacy matters to FB when it is to protect the abusers of children. How low can you go...
Looks to me that the issue is with the PR process, not with open-source.
From the article -
> It's gotten so bad, GitHub added a feature to disable Pull Requests entirely. Pull Requests are the fundamental thing that made GitHub popular. And now we'll see that feature closed off in more and more repos.
I don't have a solution for this, I'm pointing to the flaw in the assumption that AI is destroying open-source.
Trying to pay a bill. On the website ... it took 24 minutes to navigate to the right place. Then they needed 2FA, so they emailed it to her.
Now she's supposed to open her email while keeping the web page open. It took 5 minutes to do that, find the email, copy down the code, close the email ...
Web site has timed out.
Just one of many examples.
One reason soldering with an iron can be difficult is that your hand is so far away from the tip, like trying to write with a pen held by the end.
Newer irons, especially for SMD work, have gotten smaller and the grip-to-tip distance also shrunk; here's a good visual comparison:
https://www.eevblog.com/forum/reviews/grip-to-tip-distance-o...
It's worth noting that the longest one there is already much shorter than the classic mid-century unregulated irons, and all of those can be held like a pencil.
No... you have to actually be important to countersignal with your clothing.
And yes, those plenty of executives are precisely in the "no signaling" category.
Mere executives don't get to countersignal with their clothing in such a visible way. Majority owners do.
In a funny inversion of the normal analogy to machine code and compilers, you could say the same thing about people using decompilation rather than getting gud at reading ARM assembly.
It would be very UK to set up their own system which is only used in the UK instead of jumping onto Wero, the emerging European system, which has a potential user base of ~400m. Or even dovetailing with PayPay, the Japanese system.
“We find evidence that drivers experienced more unpaid idle time and longer distances driven between tasks…Using a simple model of the labor market for platform delivery drivers, we show that our evidence is consistent with free entry of drivers into the delivery market driving down the task-finding rate until expected earnings return to their pre-reform level.”
I thought the effect would be take-home pay remaining constant because hours worked fell. This is sort of worse, unless idling isn’t really an issue for a gig worker.
Also, you can read the plate from much farther away than the TPMS sensors.
Abstract excerpt
> We provide systematic evidence on the economic damages from espionage to US firms and industries.. revenues and R&D expenditures at targeted firms decline by roughly 40% within five years.. exports in targeted sectors decline by 60% over a decade.. espionage has clear economic harms to targeted firms and US industry, but firms are puzzlingly unresponsive in how they manage innovation.
I think this is more interesting as a rubric than as a prediction. I agree with some of it and not with others; I don't know if we're "cooked" or not; I do like how they've broken vertical software's moats down though.
1. I don't buy that chat interfaces will replace existing user interfaces. I'm in particular a little bit familiar with Bloomberg's user culture. I don't know that I buy that it's going to be replaced with LLM chat prompts. But software agents are going to make faithfully reproducing those existing user interfaces much easier, so: half credit?
2. Half credit again on LLMs vaporizing the "business logic" moat, because the vertical-specific rules that justified the original software market are I think a lot harder to encode in Markdown than the 1 week they gave it, and also verification becomes a bear as more ground-truth business logic is replaced with nondeterministic AI output. There's a thing happening here for sure, I just don't buy it's as decisive as they say.
3. Public data access: I 100% buy this. If this was a real moat, it's dead.
4. Talent scarcity: same deal. Remember, we're talking about vertical software, where the underlying technical work is fairly repetitive and best-practices driven; it's the exact slice of software development work LLMs excel at.
5. Bundling (you get IB messaging along with your charting and your news service); maybe. This point feels tautological. Work out what LLMs do to each of the bundled experiences and there's your answer for how resilient that moat is.
6. Proprietary data: I think they're just dead on right here, and it does indeed seem to be a good time to be a company like Bloomberg?
7. Regs lock-in: half credit, because AI does make regs compliance a lot easier, and I think we're at the very early stages of seeing how.
8. Network effects seems like a repetition of "bundling" and if I have a qualm about this rubric it's that they made it look like an even 10, so they could have clean wins and losses.
9. Transaction embedding (ie, being a payment processor or a loan originator) also seems tautological; it's a moat, sure, but they're begging the question of whether AI enables people to stand up viable competitors.
10. I think "system of record" and "transaction embedding" are kind of the same moat.
I wish people would not blog on X (I will call it X when it's used as a crappy blog platform); these ASCII charts are awful. But that's neither here nor &c.
Well done! Have you uploaded these scans to the Internet Archive? If not, please consider doing so.
https://help.archive.org/help/uploading-a-basic-guide/
https://help.archive.org/help/managing-and-editing-your-item...
Trail Crew Stories and Mountain Gazette might also be interested in this.
> The overall index has been pretty well flat. What sectors gained?
Data centers and AI.
The current US economy is flat except for AI and data centers.[1]
[1] https://fortune.com/2025/10/07/data-centers-gdp-growth-zero-...
Disappointing article. On one hand, we have direct costs and overprescription. On the other hand, we have folks being educated that (a) they have a condition they should speak to their doctor about or (b) a condition they knew they had has a new treatment option. To answer the headline question you need to at least attempt to measure each of those.
> it is important to note that this is not necessarily what people have in mind when they think of "LLMs generating skills
I’m reading this paper as don’t do this. If you deploy agents to your workforce and tell them to use skills, don’t. Tell them to give it tasks. This sounds obvious but might not be to everyone. (And in any case, it’s nice for researchers to have confirmed pre-prompt skill writing doesn’t work. It would have been neat if it had.)
I use handy as well, and love it.
Can consciousness ever be understood — this side of death?
One could say even that they are trying to hasten the fall of one empire to avoid a protracted period of wars for dominance before the next one rises.
Ok. So I have a bunch of them here, different sizes, both SSD and spinning rust. The big ones are all consumer-grade drives with a little adapter board like you describe. The small ones are a mix of single custom boards with a USB connector and adapter-board-based ones. The telltale is the outer dimension along the length: if the case is a little bit longer than a standard drive, you have a very good chance of having one with an adapter board; if it is the same size or even smaller than the standard format, then almost all of them are custom boards. The really nice ones have NVMe guts in them that you can immediately repurpose.
The AI world moves at a blistering pace. Academic publishing does not. In this particular case the "random dude on HN" is probably six to nine months ahead of the academic publication, not in the sense of being that much smarter but literally just being that much further progressed through time relative to the academic publication pipeline.
This is a bit in the same direction as Epstein's horrible tech hygiene: using computers with outdated software, little or no cryptography, and so on. Another person summarised it quite well: "too rich to care".
People react differently towards me depending on how I dress. It's quite noticeable. The sensible thing to do is take advantage of it.
> It doesn’t feel like you’re playing when you use it
That's a feature, not a bug. Gamifying nature is a bad idea. It's tourism, but with the worst kind of tourists.
> isn't this more a trait of autism than anything else?
No. It’s a sign of drive and discipline.
The latter, specifically the focus element, overlaps with autism. But more broadly it does not. (There are a lot of impressive teenagers applying themselves diligently to impressive ends. Most of them are not on the spectrum, though I suspect mild autism is slightly over-represented in that set.)
The Soviets used the "iron broom" (i.e. murder) on the wealthy people.
It didn't make anyone better off.
Most of the logic of this post would be incoherent in a world where AI has replaced software jobs wholesale. You have to pick a lane. Is it so effective that it (and the labor market more broadly) needs to be aggressively regulated, or is it not very useful for anything but trolling? It can't be both.
Two more recent articles by this author:
https://0byte.io/articles/neuron.html
https://0byte.io/articles/helloml.html
He also publishes to YouTube where he has clear explanations and high production values that deserve more views.
https://www.youtube.com/watch?v=dES5Cen0q-Y (part 2 https://www.youtube.com/watch?v=-HhE-8JChHA) is the video to accompany https://0byte.io/articles/helloml.html
>On the positive side of this, research papers by competent people read very clearly with readable sentences
That's because it's their PhDs that did the actual work...
Marketing cost vs improved hiring, comp, quality of life, etc (which is expensive and hard).
Ironically, the linked text by this Kellog guy is 100% AI slop itself.
Cost is irrelevant if they get more out of doing it than the processing costs.
Not to mention millions of ex-white-collar workers, or student-age people who would have tried to be white-collar, rushing to become carpenters and roofers, bringing the pay and the quality of those jobs way down...
The top 30 finalists are listed here:
https://www.societyforscience.org/jic/2025-project-showcase/
> Either way, there's no reason to name numbers until AFTER the company makes an offer with included compensation package details.
I agree that a candidate shouldn't name numbers until after an offer.
But I think the company should give a range as early as possible. This is because of point #2 above. As an engineering manager I've had at least one heartbreaking experience where we took a candidate through the hiring cycle and then found out we and they were way out of line re: comp. Hiring sucks enough without that curveball.
That's why, for all the warts, I'm a fan of salary disclosure laws (like those in Colorado, USA). Yes, it's hard to have an accurate range, because jobs and skills are squishy. Yes, candidates anchor towards the top. Yes, it's weird for a buyer of a thing (labor) to state a price.
But companies have more power in the hiring process (there are, after all, many employees working for a company, but usually only one company an employee works for). Companies, or the hiring managers, also have a budget.
If you are a hiring manager, I'd encourage you to have your salary range shared with candidates as early as possible in the process.
TIL a new shorthand for "the real problem is capitalism", thanks!
>Tattoo-Associated Uveitis: An Emerging Eye-Health Challenge
I was able to source from Best Buy, thank you!
> My best guess is that comfortable clothes are necessary but you also need something high value in addition
I’m just a regular. The point is I’m not signaling anything, I’m just not bothering with a signal because I have other things (namely, being recognized) that will e.g. ensure I get a table even if it’s a busy night.
If I go to Vegas I may grab a silk shirt because, yes, my service experience absolutely varies based on that, and I don’t want to have to wait until they see what I order or get to the check-in counter to start being paid attention to. (Which is annoying. And I prefer my t-shirts with cat holes in them. But I don’t like waiting in lines more than I dislike having to do my hair.)
(I do maybe counter signal in Palo Alto, where I refuse to wear a blazer or a Palo-Alto-grey hoodie. But that’s less of a power move than me inviting attention as a now outsider.)
I know it's popular comparing coding agents to slot machines right now, but the comparison doesn't entirely hold for me.
It's more like being hooked on a slot machine which pays out 95% of the time because you know how to trick it.
(I saw "no actual evidence pointing to these improvements" with a footnote and didn't even need to click that footnote to know it was the METR thing. I wish AI holdouts would find a few more studies.)
Steve Yegge of all people published something the other day that has similar conclusions to this piece - that the productivity boost for coding agents can lead to burnout, especially if companies use it to drive their employees to work in unsustainable ways: https://steve-yegge.medium.com/the-ai-vampire-eda6e4f07163
Related:
Europe's $24T Breakup with Visa and Mastercard Has Begun - https://news.ycombinator.com/item?id=46958399 - February 2026 (1020 comments)
https://news.ycombinator.com/item?id=46963089 (Wero subthread)
No, it does not mean that. In the purest sense it means 'fractional ownership', which can or may lead to profits.
Most reporting I've seen rhymes with this, from last year https://www.theguardian.com/technology/2025/jun/05/english-s...
https://www.edf.fr/en/the-edf-group/dedicated-sections/journ...
(TLDR The deployment of more utility scale battery storage is required in France)
I'd like to see some concrete examples that illustrate this - as it stands this feels like an opinion piece that doesn't attempt to back up its claims.
(Not necessarily disagreeing with those claims, but I'd like to see a more robust exploration of them.)