What are the most upvoted users of Hacker News commenting on? Powered by the /leaders top 50 and updated every thirty minutes. Made by @jamespotterdev.
It's one of those glib statements that I never really believed.
If you go at something with the wrong mental model you are always going to be pushing a bubble around under the rug and you're going to feel tied up by constraints, finding "correct" always elusive no matter how much time you take.
If you go at something with the right mental model it often falls into place.
The "fast vs cheap" dilemma in a well-run operation is that once you have got efficient development under control (cheap for real) you can spend money to accelerate the schedule. "Efficient" and "under control" pretty much require correct. On the other hand I'd expect the average person using this slogan glibly is working on a project that will be late, expensive, and terribly incorrect.
> there is at least one tape that we know used to have Doctor Who on it but which now has another programme.
Recording over another recording does not completely erase the other. I wonder if it could be recovered.
Personally I've lived in the world of "small entrants" and can see that but I think the average voter doesn't really understand that "just anybody" could have created an online service. That is, they think you have to have VC money, be based in Silicon Valley, that it's a right for "them" and not for "us".
Cost, maybe? It is one thing to ship up a valuable satellite (which they all can do). But to ship up 1000s of satellites (and keep doing it in perpetuity, because IIRC they don't have a long lifetime[0]) gets expensive.
0: Looks like 5 years. https://www.space.com/spacex-starlink-satellites.html
TLDR
* Destructuring via Record Patterns
The most prominent feature is the ability to use a record pattern on the left-hand side of a local variable declaration. This allows you to "destructure" an object and initialize multiple variables in a single statement.
Traditional way:
Point p = getPoint();
int x = p.x();
int y = p.y();
Enhanced way: Point(int x, int y) = getPoint();
This also supports nested patterns, allowing you to reach deep into an object hierarchy in one go: Circle(Point(int x, int y), double radius) = getCircle();
* Pattern Matching in Enhanced for Loops
You can now use these same record patterns in the header of an enhanced for loop to extract data from every element in a collection or array.
for (Circle(Point(int x, int y), double radius) : circles) {
// Directly use x, y, and radius here
}
They don't really have a choice. The launch window is small and they either make it or they don't.
Reminds me of the PowerPC era, where Macs were ridiculously faster than anything x86.
I'm not sure people realize that HN is already at the most libertarian end, and all the discourse spaces which are much closer to actual power and legislation are much less pro-privacy.
"There are several writeups of large backends ported from node/python/ruby to Go which resulted in dramatic speedups, including drop in P99 and P99.9 latencies by 10x"
But that's not comparing apples to apples. When you get a dramatic speedup, you will also see big drops in the P99 and P99.9 latencies because what stressed out the scripting language is a yawn to a compiled language. Just going from stressed->yawning will do wonders for all your latencies, tail latencies included.
That doesn't say anything about what will happen when the load increases enough to start stressing the compiled language.
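The stressed-to-yawning effect can be made concrete with a toy M/M/1 queueing model (my illustration, not from the comment; real backends are messier):

```python
# Mean time in system for an M/M/1 queue: W = 1 / (mu - lam),
# where mu is the service rate (req/s the server can handle)
# and lam is the arrival rate.
def mean_latency(service_rate, arrival_rate):
    assert arrival_rate < service_rate, "system is overloaded"
    return 1.0 / (service_rate - arrival_rate)

lam = 90.0                        # 90 req/s arriving
slow = mean_latency(100.0, lam)   # stressed: 90% utilization
fast = mean_latency(1000.0, lam)  # same load on a 10x-faster service

print(round(slow / fast, 1))  # latency improves ~91x, not 10x
```

A 10x speedup near saturation cuts latency far more than 10x, which is exactly why tail latencies collapse after a port, and why they will blow up again once load catches up with the faster runtime.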
Dueling physical analogies is never a productive way to resolve a conversation like this. It just diverts all useful energy into arguing about which analogy is more accurate but it doesn't matter because the people pushing this law don't care about any of them and aren't going to stop even if the entire internet manages to agree about an analogy. This needs to be fought directly.
The link is the YouTube short; full video below.
https://www.youtube.com/watch?v=gtiiHXsnsrY
Original title "Why have far-forward nominal Treasury rates increased so much in the past few years? Old risks reemerge in an era of Fed credibility" compressed to fit within title limits.
This came up a few weeks ago. I don't think it's true. This lawsuit from 26 years ago is the only example anybody has come up with. Among the problems with this claim:
* Nobody can find a police department that administers any kind of general cognitive test.
* There are large states with statewide written police aptitude tests that are imperfect but correlated to general cognitive ability, and maximizing scores on that test is the universal correct strategy.
* It's a luridly stupid policy and most municipalities aren't luridly stupid.
I think this happened like, once or twice, in one or two of the 20,000 police departments across the United States, many of which are like one goober and his sidekick (no offense to them; just, you live in gooberville, you're a goober), and now it's an Internet meme that police departments specifically hire for midwittery. Nah.
> At least the US and Israel have a chance of improving their position in the geopolitical landscape.
This seems, uh, awfully optimistic.
Turbo Vision and Clipper want their glory MS-DOS days back.
"Mate, what on Earth is this absolute shithousery."
is either handwritten or Grok! It's definitely a problem that you can get typecast by your own blog. I mean, I kinda quit updating mine around the time my strategy was in rapid flux and the landing page doesn't make sense at all.
"This might sound familiar. It is how I have characterized mediocrity in the past. That’s what comes after uselessness."
I mean, never mind younger us. I have an M5 MBP and even I am tempted by a Neo for travelling.
Back in the late 1980s, people thought about color quantization a lot, because many computers of the time had 16 or 256 colors you could choose out of a larger palette, and if you chose well you could do pretty well with photographic images.
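The crudest baseline (my sketch, not something the comment describes; the era's real workhorses were median-cut and octree quantizers that adapted the palette to the image) is fixed 3-3-2 quantization, squeezing 24-bit RGB into one byte:

```python
# Uniform 3-3-2 quantization: 8 levels of red and green, 4 of blue,
# giving a fixed 256-color "palette" with no per-image adaptation.
def quantize_332(r, g, b):
    # keep the top 3 bits of red and green and the top 2 bits of blue
    return (r & 0xE0) | ((g & 0xE0) >> 3) | (b >> 6)

def dequantize_332(c):
    # expand the packed byte back to an approximate 24-bit color
    return ((c & 0xE0), (c & 0x1C) << 3, (c & 0x03) << 6)

c = quantize_332(200, 120, 130)
print(c, dequantize_332(c))  # note the rounding error in each channel
```

The visible banding this produces is why the adaptive palette-selection algorithms (and dithering) mattered so much.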
"In other words, AI is not only capital intensive—it is labor destructive."
I feel something here is a bit off, like everything is mashed up together. Something that hasn't happened since 1983 is "unprecedented", and Marxism-Leninism has made peace with the Prisoner's Dilemma and the mindset of the young Republican college-educated man who pushes crypto and pop economics.
"A Chromebook doesn’t teach you that"
But seriously, if you go to school today they will make a point to put you on the slowest, weakest Chromebook available because they're terrified that you're going to play Krunker.
The result is you become an adult and you'll never buy a Chromebook. It's the same way that being bullied on the schoolbus means you become an adult who'll never ride public transit.
will sell you a desktop computer for around $150 (e.g. four of them for the price of a Neo) which will put an enterprising young person on a much better path to learn about computers than the Neo ever will.
Now you might say I'm being Orwellian but I really think Swift/Xcode/iOS is "slavery" and the web platform is "freedom". I mean dev tools are fully competitive for the web platform, I can write my application once and run it on desktop and phone and tablet and VR headset and game consoles and all sorts of things I don't even know about. I never have to ask permission if I want to deploy an app or update it. I don't have to pay anyone 30% of my revenue.
Meta get to impose verified ID on everyone and link it to their advertisers, AND kill competing networks.
> This launch marks a major milestone in our commitment to the Linux community and the Arm ecosystem.
So does Chrome finally hardware-accelerate YouTube on GNU/Linux, and support WebGPU, just like on Android/Linux and ChromeOS/Linux?
Corporate has the magic touch to do that to any programming language.
Except you are missing the part that the CLR has a type system designed specifically for cross-language interop, and it is taken into account in the design of WinRT as well.
Common Language Specification - https://simple.wikipedia.org/wiki/Common_Language_Specificat...
> The CLS was designed to be large enough to include the language constructs that are commonly needed by developers, yet small enough that most languages are able to support it. Any language construct that makes it impossible to quickly confirm the type safety of code was excluded from the CLS so that all languages that can work with CLS can produce verifiable code if they choose to do so.
WinRT Type System - https://learn.microsoft.com/en-us/uwp/winrt-cref/winrt-type-...
Because then you cannot meet the KPIs of using AI tools on the job. /s
Is anyone from the Computer History Museum listening? If they could do that, as well as scans with “exploded” parts it’d be a boon for both students and enthusiasts, who’d be able to 3D print replacements for many parts.
It lacks a custom scheduler option, like the Java and .NET runtimes offer; unfortunately that is too many knobs for the usual Go approach to language design.
Something like an interface for how the scheduler is supposed to behave, a runtime.SetScheduler() or similar, but it won't happen.
>Heavy alcohol use and marijuana are both known to impact memory and recall directly.
And who said they don't do this (long term) exactly through their effect on the gut microbiome?
I've never encountered a person who was attracted to a stupid person.
BTW, the Flintstones is just The Honeymooners without Jackie Gleason. One could also argue that Family Guy and The Simpsons are also reboots of The Honeymooners.
> who have strangely short haircuts and go hunting the way people go to work today
"They're the modern stone age family" are the words in the Flintstones' theme song.
Why go halfway? Embrace compiled languages on the backend.
Fast all the way down, especially when coupled with REPL tooling.
Could be intentional dark UI, to get people to put even more trust in the LLM.
"So they don't want to just let Claude do it? Start asking 10x the confirmations"
Never make it to management that way. Other people have to do the failing.
Texaco + Mexico = Texico? The Japanese never fail to amuse foreigners with their naming.
> How is this the fault of AI
It isn't, the article doesn't claim (or even imply) that it is "the fault" of AI, only that AI was part of the chain of events, and nothing is the fault of AI until AI is sufficiently advanced to constitute a moral actor. “At the source of every error which is blamed on the computer, you will find at least two human errors, one of which is the error of blaming it on the computer” remains true.
OTOH, it is potentially the fault of the reliance human actors put on an AI determination.
If you put part of the address in the body space, you can't encrypt the entire body.
IPv6 adoption has been linear for the last two decades. Currently, 48% of Google traffic is IPv6.[1] It was 30% in 2020. That's low, because Google is blocked in China. Google sees China as 6% IPv6, but China is really around 77%.
Sometimes it takes a long time to convert infrastructure. Half the Northeast Corridor track is still on 25Hz. There's still some 40Hz power around Niagara Falls. San Francisco got rid of the last PG&E DC service a few years ago. It took from 1948 to 1994 to convert all US freight rail stock to roller bearings.[2] European freight rail is still using couplers obsolete and illegal in the US since 1900. (There's an effort underway to fix this. Hopefully it will go better than Eurocoupler from the 1980s. Passenger rail uses completely different couplers, and doesn't uncouple much.)[3]
[1] https://www.google.com/intl/en/ipv6/statistics.html
[2] https://www.youtube.com/watch?v=R-1EZ6K7bpQ
[3] https://rail-research.europa.eu/european-dac-delivery-progra...
Indeed, a single high-end desktop today at full load would probably use more power than everything in that room.
_thinks: "gotta be a sarcastic comment"_
Not enough memory -> can't do it.
Not enough CPU -> can do it, but it's slow.
(Ubuntu with the OOM killer - could do it, but when it filled half of memory, it was killed.)
I did some RE'ing of BIOS code back in the days of the first SDR SDRAM and the calibration part was reasonably straightforward; basically sweeping some controller registers through their ranges while doing repeated read/write operations with lots of transitions in the data (e.g. 0x55555555 <> 0xAAAAAAAA) to find the boundaries where errors occurred, and then choosing the middle of the range.
While the article does mention periodic calibration, I wonder if there are controllers which will automatically and continuously adapt the signal to keep the eye centered, like a PLL.
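The sweep described above can be sketched in a few lines (hypothetical register interface; real memory controllers expose this very differently):

```python
# Sweep a delay setting, hammer memory with high-transition patterns,
# and pick the centre of the error-free window, as described above.
PATTERNS = [0x55555555, 0xAAAAAAAA]

def passes(delay, memory_test):
    # a delay setting passes only if every stress pattern survives
    return all(memory_test(delay, p) for p in PATTERNS)

def calibrate(delay_range, memory_test):
    passing = [d for d in delay_range if passes(d, memory_test)]
    if not passing:
        raise RuntimeError("no working delay found")
    # assume one contiguous window; pick its midpoint (the eye centre)
    return (passing[0] + passing[-1]) // 2

# Fake hardware for demonstration: delays 5..12 work, others corrupt data.
fake = lambda d, p: 5 <= d <= 12
print(calibrate(range(16), fake))  # -> 8
```

Continuous tracking as suggested would amount to rerunning a tiny version of this in the background and nudging the delay whenever the margins become asymmetric.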
> The Papal ban on children for priests is perhaps the only instance where a theocracy managed to prevent this slide.
Pretty impressive effect, given that there is no such ban. There are a number of other rules which can combine to make it look approximately like there is, but there isn't.
> How is this the fault of AI?
AI is being used by bureaucrats and enforcers to justify lazy, harmful conclusions. You don't live in the real world if you think "just punish the bureaucrats, don't make it about AI" is going to remotely rectify this toxic feedback loop and ecosystem.
Absolutely everybody has face doubles.
Identikit got pretty close and there weren't that many bits in there and quite a few of them were hairstyles and that's a choice, not genetics. How many head shapes, noses, eyes, mouths and ears can there be?
A few million? Then everybody has a few thousand doubles. 100 Million? Still 80.
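The arithmetic behind those figures is straight pigeonhole division (my sketch, assuming a world population of about 8 billion and faces spread roughly evenly across the distinguishable types):

```python
# People per distinguishable face, for the two guesses above.
population = 8_000_000_000

for distinct_faces in (2_000_000, 100_000_000):
    doubles = population // distinct_faces
    print(distinct_faces, doubles)  # a few million -> thousands; 100M -> 80
```

The even-distribution assumption is generous to the argument; real faces cluster, so common face types would have far more doubles than this average suggests.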
That's not who I was responding to and you know that full well.
> that money goes to a mega-corp and the savings is passed on to execs
And the execs invest that money back into the economy.
This seems to be mostly re-litigating COVID lockdowns, but without any details or analysis or data.
(As usual, the answer to the headline question is no)
He writes three outreach emails and says he's a hustler?
A business owner lamented to me recently that it wasn't the taxes that were crushing his business, but the costly regulations that keep on coming.
The harder the government makes it to operate a business, the fewer businesses there will be.
You got a lot further than I did.
This is proposed legislation, isn't it? Proposed bills are generally off-topic on HN; they're a dime a dozen.
Can't wait to see how this plays with Vision Pro
You can use gen AI entirely in the spirit of craft. For instance if you need to consume, implement or extend some open source software you can load it up in an agent IDE and ask "How do I?" questions or "How is it that?" questions that put you on a firm footing.
A Dutch journalist was rightly kicked out of Ukraine early in the war for his absolutely stupendously stupid scoop with pictures of Russian rockets landing in a city. The moron was all indignant about it too.
https://nltimes.nl/2022/04/04/dutch-journalist-expelled-ukra...
There are far more divides than just that one.
For instance, the ones that look at it from an economics perspective, security perspective, long term maintainability perspective and so on. For each of these there are pros and cons.
This sounds right to me:
> Before AI, both camps were doing the same thing every day. Writing code by hand. Using the same editors, the same languages, the same pull request workflows. The craft-lovers and the make-it-go people sat next to each other, shipped the same products, looked indistinguishable. The motivation behind the work was invisible because the process was identical.
Helps explain why some people are delighted to have AI write code for them while others are unhappy that the part they enjoyed so much has been greatly reduced.
Similar note from Kellan (a clear member of the make-it-go group) in https://laughingmeme.org/2026/02/09/code-has-always-been-the... :
> That feeling of loss though can be hard to understand emotionally for people my age who entered tech because we were addicted to feeling of agency it gave us. The web was objectively awful as a technology, and genuinely amazing, and nobody got into it because programming in Perl was somehow aesthetically delightful.
> I feel like I'm going crazy with this narrative.
We're only getting warmed up. There are programmers on HN that will take the output of their favorite AI, paste it and run it. And we're supposed to be the ones that know better.
What do you think an ordinary person is going to do in the presence of something that they cannot relate to anything else except an oracle, assuming they know the term? You put anything in there and out pops this extremely polished looking document, something that looks better than whatever you would put together yourself, with a bunch of information on it that contains all kinds of juicy language geared up to make you believe the payload. And it does that in a split second. It's absolutely magical to those in the know, let alone to those that are not.
They're going to fall for it, without a second thought.
And they're going to draw consequences from it that you thought could use a little skepticism. Too late now.
I've added an instruction: "do not implement anything unless the user approves the plan using the exact word 'approved'".
This has fixed all of it; the agent now waits until I explicitly approve.
Well, there's always wars as the way to get rid of people. I really don't rule out that the people that benefit from this sort of thing will purposefully steer the world in that direction because the poor won't have any choice other than to enlist as a way out of their situation, and never mind the consequences. You can already see some of this happening.
If this is satire, it's not that funny. If you're serious, it's a good example of 'the ugly American.'
> we really need a fix for this. When cops are grossly negligent the money should come out of their aggregate pension fund
This is on us as voters. If we didn't piss our pants every time a police union sneezed, we'd realize wholesale restarting police departments is precedented in even our largest cities.
It is an AI error, but also an error on the part of the cops, the prosecutors, the judge, and the county sheriff (who is responsible for the jail inmates). I hope everyone involved in this travesty is sued into oblivion and unable to hide behind their immunity defenses. Facial recognition should never be the sole basis for a warrant.
What "serious" tasks does banking involve?
I log in to transfer money, to take a photo of a check to deposit it, to check my balance.
All of that is fine on a phone screen. Actually, it's a lot easier to take the check photo.
And a banking app is a whole lot more secure than a browser tab running extensions that might get hijacked, on a desktop OS whose architecture allows things like widespread disk access, keyloggers, etc.
The key difference is this would be identifiable foreigners doing it.
> I deeply wish the democrats had run a better campaign in 2024.
No, the damage was done before that. Harris ran the best campaign she was capable of running. We know that because she ran a terrible campaign in 2019, even with all the Obama people backing her. I went to Iowa during the 2019 primary campaigning. I saw Harris several times, including at a small event focused on Asian voters. She's an abysmal retail politician. Warren was hugging people and taking selfies while Harris was hiding in her tour bus. Harris is obviously an introvert who doesn't really like people.
Given Biden’s age and early talk of being a one-term president, the smart choice was to nominate Elizabeth Warren, who is a fantastic campaigner. But Harris was the choice to appease the identity activists. They killed Dems’ chances in 2024 even before Biden’s term began. That’s a gift that will keep giving because South Carolina is now Democrats’ first primary. If Harris runs again she’s virtually guaranteed to begin the campaign with a strong primary start.
I've seen NetWare, Vines, some proprietary hacks to form the backbone.
It's worse than that though. They are dismantling things that Congress has mandated, and also just not making legally mandated payments, and no one is stopping them.
Congress has abdicated their duty of checks and balances. In a functioning government, the executive would have already been removed for not following legally mandated spending.
It's very true for healthcare (especially mental healthcare) and education today as well, because for most people, the choice isn't LLM vs. human attention - it's LLM vs. no access at all.
Right. What banks do is sell loans. That's the profit center. Teller windows, vaults, and cash handling are all low or no revenue cost items.
So newer bank branches look like car dealership offices. There are many little glass rooms where you sit down with a bank employee and discuss loans and other financial products. That's where the money is made.
There's a small area in back with traditional tellers. It's not where the money is made.
Japanese food and Indian food are as different from each other as Indian food and Italian food.
Because bash and sed and suchlike turn out to be the most useful tools for unlocking the abilities of AI agents to do interesting things - more so than previous attempts like MCP.
... well, well, well. I spent a lot of the 2010s revisiting symbolic AI and I'd say the worst problem it had was "reasoning with uncertainty". If you consider a medical diagnosis system like
https://en.wikipedia.org/wiki/Mycin
the result is probabilistic in nature, there's always some chance you'll get it wrong.
Language processing is the same. Language is ambiguous, there are thousands of possible parse trees for a common sentence. You might be talking with somebody and then get a piece of information that revises your interpretation of what they said an hour ago. It's just like that.
In that time frame I was very interested in the idea that decision theory was the key link between computation and action, whether you were using symbolic methods (e.g. a very plausible set of rules for address matching might be 99.9% reliable in some cases, 97% in others, 2% in others) or learned methods. A model for predicting market prices is priceless, but put that together with a Kelly bettor and you've got a trading strategy.
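The Kelly link between prediction and action is just the standard formula (my sketch, not from the comment): with win probability p and net odds b, stake the fraction f* = p - (1 - p)/b of your bankroll.

```python
# Kelly criterion: turn a probabilistic prediction into a bet size.
def kelly_fraction(p, b):
    f = p - (1.0 - p) / b
    return max(f, 0.0)  # never bet when the edge is negative

# A model that predicts a 60% win probability at even odds (b = 1)
print(round(kelly_fraction(0.60, 1.0), 2))  # stake 20% of bankroll
```

This is why a merely probabilistic model is still actionable: the decision layer converts "97% reliable in some cases, 2% in others" into how much to risk, including risking nothing at all.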
Maybe there is more to his argument than I got, but as I see it he's defending a boundary that isn't there.
It is a thing; I've been hearing it for at least six months. There's a lot of people who really hate AI and want nothing to do with it.
For what it's worth I thought the modal dialog on the original was worse than the pop-over ad on the copy.
Wild Moose just made a blog post[0] about this. They found that putting things into foundation models wasn't cutting it, that you had to have small finely-tuned models along with deterministic processes to use AI for RCA.
[0] https://www.wildmoose.ai/post/micro-agents-ai-powered-invest...
> Good for the bottom line, bad for the worker
Bad for most workers. Good for a cabal. The Jones Act directly led to the failure of American shipbuilding.
They’ve gotten largely more repairable since then, including adhesives you can electrically debond.
You don't want to give the agent a raw key, so you give it a dummy one which will automatically be converted into the real key in the proxy.
So how does that help exactly? The agent can still do exactly what it could have done if it had the real key.
"This post is the first in a series. We are extending this analysis to more realistic workloads beyond artificial SWE benchmarks. Follow the account and stay tuned.---"
Did something get cut off at the end?
... and really weird hallucinations if you stay awake instead of using it as directed, although the usual response to the weirdness is "go to bed"
I read through the thing and don't quite understand what this adds that the dozens of LLM coding wrappers don't already do.
You write a markdown spec.
The script takes it and feeds it to an LLM API.
The API generates code.
Okay? Where is this "next-generation programming language" they talk about?
Correct. The story isn’t correct even in the original formulation. US population increased by 50% from 1980 to 2010, and the economy became far more financialized. But the number of bank teller jobs barely grew during that period, even before the iPhone.
Yamaha Motor Corporation
https://global.yamaha-motor.com/news/2026/0226/subsidiary.ht...
Cost-cutting measure in an unfavorable market environment, including a lack of growth, that they expect to persist:
> Yamaha Motor Co., Ltd. is undertaking structural reforms aimed at improving the profitability of its U.S. operations in response to cost increases resulting from U.S. tariffs and changes in the market environment. In addition to implementing cross-business cost reduction initiatives, the Company seeks over the medium to long term to build a profit structure that is not solely dependent on top-line growth, thereby transforming itself into a more resilient and robust organization capable of adapting to change.
https://connectder.com/ avoids needing a new panel or line side tap if your authority having jurisdiction (AHJ) allows meter socket adapters. No affiliation, have one on hand as a demo unit to show people. Any electrician is going to verify your panel from a load calculation perspective in person for free as part of a free estimate.
If there are any incentives available, recommend load center upgrades when funds permit for future proofing and capturing incentives while they're available to offset personal cost outlay.
I pay for https://karakeep.app/ (also open at https://github.com/karakeep-app/karakeep/), create a list per friend, and then share that list to the friend (who can also subscribe using an RSS feed). Items to share are added to the respective lists. URLs added via Karakeep are automatically captured and archived in my instance.
This model also enables federation between Karakeep instances, if so desired. Mobile apps are available.
> This is just another form of that "shadow banking" system isn't it?
Private-credit lenders are literally shadow banks [1]. But I'd be cautious about linking any shadow banking with crisis. Tons of useful finance occurs outside banks (and governments). One could argue a classic VC buying convertible debt met the definition.
That said, the parallel to 2008 is that this sector of shadow banking has a unique set of transmission channels to our banks. The unexpected one is purely psychological: when a bank-affiliated shadow bank gates redemptions, investors punish the bank per se.
[1] https://en.wikipedia.org/wiki/Non-bank_financial_institution
" maintained by people that nobody technically employs"
In my opinion, it is similar to Parkinson's Law [1] about work filling into the time given to it, but replace work with bureaucracy filling the aggregate enterprise revenue on offer. Atlassian's work, one might argue, is "done" but has such cashflow from business customers that they can continue to spend above and beyond what is needed to maintain what has been built to service their customer base. They could be 37signals/Basecamp, but they are enabled beyond that (from the business customer cashflows mentioned), and so these actions occur until an innovator comes along to replace them (and potentially, the cycle repeats due to enterprise sales cycle durations, inertia, etc).
You see this with all manner of large enterprise in my experience, where what they continue to do is "good enough" to allow for these inefficiencies and actions because they are, on some spectrum, "money printers" due to their moat and inertia. Creative destruction is not a forgone conclusion, nor fast. Is the incumbent exploring the problem spaces adjacent to their core business(es) to increase their TAM to increase shareholder value? Are they innovating? Or are they just churning and burning up revenue on meaningless work?
All of these companies doing layoffs to invest in AI is not about AI specifically, it is about reaching for profits and yield in a challenging business landscape and macro post zero interest rate policy ("ZIRP") imho. They are desperate for productivity growth, whether that is doing more with less people, AI, offshoring, whatever because money now has a cost.
You can quickly find historical availability & consumption data and I don't think it supports any trivially obvious hypotheses like these. You'll find headlines saying things like that we're at a low point in vegetable consumption going back to 1988, but I'm reading an NIH paper charting 1970-2010 and the patterns look stable, except for increases in total calories, in dairy, and in added dairy fats and oils.
Whatever's going on, it's probably going to end up being complicated and multifactorial.
(I do love me a crucifer, though).
They may be, but if there are no elections, there is no United States. Constitutionally, its government is predicated on having elected representatives.
I could see Trump trying this, but I also can see dozens of other people or groups, some richer, more powerful, more competent, and more ruthless than Trump, just waiting in the wings for the guardrails to come off to make a play to rule the territory of the former United States. If he tries and succeeds at this it's open-season. It's not a Trump dictatorship, it's a civil war, akin to the Chinese Civil War after the emperor fell or the Syrian civil war after the Arab Spring.
It's got to be a major struggle maintaining motivation in the face of aggressive and ungrateful users. It's bad enough when they're giving you money, it must be much worse when they're not.
Money.
The general public thinks phones and computers are fundamentally different. Heck, I remember arguing this point even on HN back when smart phones were first coming out and being generally on the losing side as people got very excited about "app stores" and such. I see no practical path to getting to the point that enough of us realize that there is simply no reason for our phones to be locked down the way they are that the companies are forced to undo it, especially with our elites pushing with all they are worth to lock things down harder.
The companies take that confusion to the bank.
There have been numerous attempts at making phone/laptop crossovers, where you can plug your phone into a dock and get a computer, or slide your phone into a laptop case, etc. Some of them are even still around, but they're all definitely second-class citizens. There's a variety of problems that I think they've had in the market, not least of which is the fact that the average person still sees "phones" and "computers" as fundamentally different so the product makes no sense to them, but another issue that I think has held them back is that the product inevitably works by porting the limitations of the phone into the computer, rather than porting the freedom of the computer into the phone.
In the USB-C era, there is no excuse for every phone not having a mode where you can plug it into any ol' USB-C hub/dock and be able to get a desktop environment, even down to the "middle-of-the-line" phones. It would require in most cases no extra hardware. They just don't.