What are the most upvoted users of Hacker News commenting on? Powered by the /leaders top 50 and updated every thirty minutes. Made by @jamespotterdev.
Several times I have rewritten an overly multithreaded (and intermittently buggy) process as a single-threaded version, reducing LoC to roughly 1/20th and binary size to 1/10th, while also getting a severalfold speedup, lower memory usage, and eliminating whole classes of bugs entirely.
And then I keep using it in bed afterwards, and wake them up with my tapping.
"Is Dark Mode Good for Your Eyes" (2020), 300 comments, https://news.ycombinator.com/item?id=33947820
This is the new system for emergency communications? TfL just finished up an upgrade on that in 2021. That upgrade was built by Thales.[1] That system is purely for operational use, and is not cell phone compatible. It's compatible with the gear cops and fire brigades use. Is it being replaced?
As late as 2018, the classic century-old system, with two bare wires on insulators on the tunnel walls, was still maintained.[2] Clipping a telephone handset to the two wires would connect to a dispatcher, and the wires were placed so that reaching out of the driver's cab to do this was possible. In addition, squeezing the wires together by hand would trip a relay and cut traction power. Is that still operational? The 2011 replacement was ISDN.
[1] https://www.thalesgroup.com/en/news-centre/press-releases/th...
[2] https://www.railengineer.co.uk/communications-on-the-central...
That's a neat little system.
Two surprising design decisions:
- One-way messages. You send, then, in a separate operation, you wait for a reply. This happens at each end. That means two extra trips through the scheduler and more time jitter. QNX has a blocking "MsgSend" which sends and waits for a reply. The scheduler transfers control to the receiving thread in the common case where the receiver is waiting, which behaves like a coroutine with bounded latency. It's a subtle point, but one of the reasons QNX is so well behaved about jitter. (See the sketch after this list.)
- Interprocess communication by memory remapping instead of copying. This is high overhead for small messages; only above some fairly large message size does it become a win. Remapping pages means a lot of MMU and cache churn, and the cost varies with the CPU and memory architecture. Mach worked that way, and the overhead was high. Not sure how expensive it is with modern MMUs. Do you have to stop other threads that might have access to the page about to be unmapped?
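For the curious, a minimal sketch of that send/receive/reply pattern, assuming the standard QNX Neutrino calls (client and server squeezed into one process for brevity; it only builds on QNX, and the message contents are made up):

    /* Blocking send/receive/reply on a QNX channel. */
    #include <sys/neutrino.h>
    #include <pthread.h>
    #include <stdio.h>

    static int chid; /* channel id shared between the two threads */

    static void *server(void *arg) {
        char msg[64];
        (void)arg;
        for (;;) {
            /* Blocks until a client MsgSend()s; rcvid identifies the sender. */
            int rcvid = MsgReceive(chid, msg, sizeof msg, NULL);
            if (rcvid == -1)
                break;
            /* Unblocks exactly that sender. In the common case the kernel
               hands the CPU straight back to it, coroutine-style. */
            MsgReply(rcvid, 0, "pong", 5);
        }
        return NULL;
    }

    int main(void) {
        pthread_t tid;
        char reply[8];

        chid = ChannelCreate(0);
        pthread_create(&tid, NULL, server, NULL);

        /* nd=0, pid=0: attach to a channel in this same process. */
        int coid = ConnectAttach(0, 0, chid, _NTO_SIDE_CHANNEL, 0);

        /* One kernel call: send AND block for the reply. There is no
           separate "now wait for the answer" step, hence less jitter. */
        MsgSend(coid, "ping", 5, reply, sizeof reply);
        printf("got: %s\n", reply);
        return 0;
    }

The whole point is that last MsgSend: send and wait-for-reply are a single operation, so the scheduler can hand control straight to the receiver rather than making two separate trips.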
"This company predicts software development is a dead occupation"
Citation needed?
Closest I've seen to that was Dario saying AI would write 90% of the code, but that's very different from declaring the death of software development as an occupation.
> everyone should abandon their US free trade agreements
Do you have a link to Doctorow's argument? On its face, this is incredibly stupid -- for most economies, the cost of losing an FTA is well above any of the tariff levels being discussed.
> Every single one of these icons should be available to choose by the user.
They are. You can replace the icon of any app straight in Finder, in the Get Info window. Copy the icon from somewhere else, click the icon in Get Info to select it, and Cmd+V to paste.
I mean, you'll need to get the original icon, but that's not too much work. I don't think Apple themselves should be shipping every high-resolution icon they've ever used for every app. OS's are already large enough.
> Pen pots aren't a thing that most people are familiar with
Personally, no. Cognitively? We've been seeing quills and ink in children's stories for centuries. One doesn't have to have used a bubble level to get the analogy in the iOS Level app.
> pen and paper, but now a) everyone knows what a pen and paper look like
A quill and ink are conventionally portrayed in relation to writing. A pen and paper could refer to e.g. sketching.
I'm obviously nitpicking. But I reject the notion that we have to oversimplify to the degree you're suggesting.
> it literally says the name of the app
The OS does this almost everywhere apps exist. Putting the name in the logo is superfluous.
Gray Mode, obviously. /s
Seriously, the best UIs let users adjust things to their preferences instead of forcing a single choice, or two polar-opposite ones.
HN cut it off at "karab" and I thought this was the generic name of some new drug.
File a complaint with your state Attorney General and provide receipts. Google doesn’t care, but regulators do.
Counterpoint: perhaps it's not about escaping all the details, just the irrelevant ones, and the need to have them figured out up front. Making the process more iterative, an exploration of the medium under the supervision or assistance of a domain expert, turns it more into a journey of creation and discovery, in which you learn what you need (and learn what you need to learn) just-in-time.
I see no reason why this wouldn't be achievable. Having lived most of my life in the land of details, country of software development, I'm acutely aware 90% of effort goes into giving precise answers to irrelevant questions. In almost all problems I've worked on, whether at tactical or strategic scale, there's either a single family of answers, or a broad class of different ones. However, no programming language supports the notion of "just do the usual" or "I don't care, pick whatever, we can revisit the topic once the choice matters". Either way, I'm forced to pick and spell out a concrete answer myself, by hand. Fortunately, LLMs are slowly starting to help with that.
It failed because, for management and HR, DevOps means ops; for them it is the new buzzword for systems integration, integration engineers, sysadmins, and so on.
I have been foolish enough to accept a few project proposals with a DevOps role, which in the end meant ops work dealing with VMs, networking and the like.
I really enjoyed the full piece by Dave Kiss that this piece quoted: https://davekiss.com/blog/agentic-coding/
Personally, I am attracted to the simplicity of the I Ching in that I can do a reading for someone very quickly in the field without a lot of explaining (off-brand as-a-fox), so I pack an assortment of coins in my tail (ahem… backpack), though I am still looking for a second pocket I Ching so I have something other than Wilhelm.
It's a fun class; worth keeping in mind that several topics with 1-2 units here are whole specializations in the field, including:
* memory safety and exploitation (the "buffer overflow" section is about 20 years out of date, though super appropriate for a first course)
* the WebPKI/certificates thing
* messaging security and messaging cryptosystems
* microarchitectural security and hardware side channels.
Multiple full courses on each of these subjects would bring you up to "practitioner" levels of expertise.
My experience from experiments with AI-generated code is that you have to specify it like a programmer would, i.e. you have to be a programmer.
Strongly agree, but this gets expensive fast when you account for all of the network, logging, and security trimmings.
In all seriousness, I assumed that was part of why the US models itself on Rome rather than Greece. Not that there was no homosexuality in the days of the Roman empire, but there was a lot more performative masculinity to make up for it.
The US will only learn by feeling pain, both the administration and the electorate. So, make them feel it politically and economically.
no paywall: https://www.wsj.com/tech/ai/google-deepmind-documentary-yout...
I thought that there were some that were visible without an account:
---
The Mandalorian leading a birdwatching group:
This is the jay.
---
The Mandalorian after winning a preliminary injunction:
This is the stay.
---
The Mandalorian making cheese:
This is the whey.
---
"Boyd" (2004) is a bio of John Boyd. Boyd was a USAF fighter jock and a very good one, and his OODA (Observe, Orient, Decide, Act) cycle is a Marine icon. The USAF never really liked him, because he was loudly critical of some high level USAF aircraft decisions.
Boyd had a lot of influence on the design of the F-16, the fighter jock's dream airplane. He also pushed for, of all things, the A-10 Warthog, hated by fighter jocks, loved by ground troops who really needed close air support.
Reading up on the OODA loop is useful. Boyd himself barely wrote anything about it. Others have written volumes.
He means other humans.
The stupid point it makes is "since we have AI now, why even accept human contributions to a FOSS project?"
The post itself is AI (or co-AI) slop, which means it can be just ignored.
>If an idiot like me can clone a product that costs $30k per month — in two hours — what even is software development? (...) The new software developer is no longer a craftsman. It’s the average operator, empowered
If entire industries' employee counts are decimated and/or commodified, this means the "new software developer" won't find people to pay for what they create (whether software or a software-driven service). For the majority, it also means further degradation of the kind of jobs one will be able to find.
Without trying to be glib about it, this post sounds like a description of second-system syndrome, applied to entrepreneurship. It happens to all of us.
https://www.theverge.com/2023/3/13/23637401/samsung-fake-moo...
Are we sure the planets are real?
More like, why have it regurgitate something likely to have been in its training data?
Because they are the ones who create wealth from their labor. All the very wealthy do is capture it via paperwork and legal frameworks. The pendulum is simply swinging back towards fairly compensating labor for its work.
There was a "Never Trump" movement of Republican leaders. It's dead.[1] By now, most of the Never Trumpers are either out of power or have groveled to Trump. The National Review, a conservative publication, wrote: "At no point did Never Trump possess the basic traits of a political movement: a small number of leaders and large number of followers." It was all leaders, or former leaders, or people who thought they should be leaders. The article says Never Trump was composed of "1) experts in foreign policy, economics, and law ... 2) campaign professionals ... and 3) public intellectuals ..." Not Republican governors and members of Congress. Not big donors. Those people only matter when they're in power. There are small conservative journals in which they still write. Few read them. They're not on Fox News.
It's not at all clear what the GOP looks like after Trump. The most likely Republican successors are said to be Vance, Rubio, and DeSantis. The last two have failed badly at presidential bids before.
[1] https://www.nationalreview.com/corner/the-end-of-never-trump...
There are still many companies with cubicles, although they do seem to be getting rarer.
When in the 1990s-00s people posted Dilbert strips, it wasn't, IME, because they identified with the character Dilbert.
They did it because they saw in their work environment echoes of the environment portrayed in the comic, of which Dilbert was as much a part as the PHB.
Well, PHB is not about simply being a manager, but about being a certain type of manager, so he might very well be justified in his wall decoration.
I used to say seeing Dilbert strips in the office is a warning sign. People shouldn’t identify with Dilbert.
I liked this post. I may have some minor qualms (e.g. while I think execs should be proxies for the customer, they have many other competing motivations that can push customer needs way down), but I especially liked the closing section:
> Understanding before criticizing
> Large software companies have real problems. Some are structural. Some are cultural. Many are self-inflicted. But many of the behaviors people complain about are not pathologies – they are consequences.
> If you want to criticize how large organizations operate, it helps to first understand why they operate that way. Without that understanding, the criticism may feel sharp, but it will not be useful.
I see that kind of "criticizing before understanding" all the time on HN, and while that's probably just inherent to an open forum, commenters who do that should realize it makes them come across as "less than insightful", to put it generously. Like, I often see tons of comments about how managers only get to their position through obsequious political plots. And sure, that may exist in some orgs. But you can always tell when folks have never even considered the competing forces that act on managers (i.e. not just the folks they directly manage, but requirements coming from higher-ups, and peer teams, and somehow being responsible for a ton when you actually have few direct levers to push) and solely view things through the lens of someone being managed.
Pretty much everyone believes they can do the boss' job better than the boss. Until they get promoted and become like every other boss.
When you start your own business, though, you have nobody to blame but yourself.
What were you trying to “revert back”? You should have been able to just stop using jj, there’s nothing to revert back to. It’s also possible that I’m misunderstanding what you mean.
Many of your examples came from people who were funded by Universities in the 80s, which was basically the VC of the time. And in the 90s, a lot of the core committers of those projects were already working at VC funded companies.
Back then it was very normal to get VC funding and then hire the core committers of your most important open source software and pay them to keep working on it. I worked at Sendmail in the 90s and we had Sendmail committers (obviously) but also BSD core devs and linux core devs on staff. We also had IETF members on staff.
And we weren't unique, this happened a lot.
One requires more electricity than the other, and custodians with somewhat different skills. A sysadmin is a librarian and custodian with technology skills. If you can vault and care for physical archives at scale, you can do the same for digital data (imho, based on experience with both). You’re simply building resilient systems on durable primitives.
I’m hopeful for a future where you can carry all recorded knowledge on a device and media that fit in something human-portable [1]. But until then, interested humans will maintain and continually improve archival and information retrieval systems to preserve knowledge and make it accessible.
[1] SPhotonix – 360TB into 5-inch glass disc with femtosecond laser - https://news.ycombinator.com/item?id=46268911 - December 2025 (27 comments)
Title does not match link, flagged for mod attention.
> If institutional investors don't raise housing prices, then how can they profit?
FTA: “large investors may well have economies of scale. They might hire maintenance people who work for them, and thus have an adequately suited number of tasks, rather than having to contract out to workers who charge a premium for not being guaranteed work, or be better at managing properties with software, or perhaps be a more trustworthy borrower and pay less for capital.”
TL;DR: they make money on rents, not appreciation. Obviously appreciation doesn’t hurt, which is why “them owning lots of properties may increase their market power, and thus they price the homes above what is optimal” is also tested.
> Everyone I know who has built a house has thought very much about sun, season and temperature.
I've lived in houses that certainly did not take into account sun, season and temperature. I learned a lot from that experience. My current house is optimized for it. I've learned a few more things about it, and could do better.
> the idea that it has something to teach modern architects and builders is pure fantasy
Not my experience with architects and builders.
For example, how many houses have a cupola? They're common on older homes, but non-existent on modern ones. The roof accelerates the wind moving over it, and the air vents in the cupola then let that wind through, which sucks the heat out of the attic.
Another design element is eaves. Eaves shade the house in summer and don't shade it in winter (for more heat gain). Eaves also keep the sides of the house dry, which means your siding and paint and window frames last a lot longer. Mine are 1.5 feet. Most houses around here have tiny or even non-existent eaves.
The advent of air-conditioning is when architects stopped paying attention to the sun.
You can be a super productive Python coder without any clue how assembly works. Vibe coding is just one more level of abstraction.
Just like how we still need assembly and C programmers for the most critical use cases, we'll still need Python and Golang programmers for things that need to be more efficient than what was vibe coded.
But do you really need your $whatever to be super efficient, or is it good enough if it just works?
How does this work on laptop screens? E.g. running Chrome on my MBA with a notch, the Chrome menus take up 3/4 of the screen width, and the remaining ~6 icons there is space for are utilities I need. There are even a couple more icons I regularly use, and I have to switch to Finder to access them, just because it has fewer menus. The idea is interesting, but it's not clear at all from the homepage how/if this works on laptops as opposed to large monitors, when you're using an application with lots of menus.
I'm also curious how this compares to other similar solutions -- QuickCMD, Raycast, Keyboard Maestro, Command Keeper, etc. It seems clear that its featureset is different, but it's hard to figure out which ones do which things. A comparison feature chart might be helpful so potential customers can see what makes this one unique -- i.e. it's the only one that does X and Y and Z, while every other app does only two of the three.
I’ve been an HN participant for ~14 years; the vibe ebbs and flows. You can hide and ignore the folks you mention. What’s important imho is that the mods continue to aggressively cultivate a specific community vibe. Nothing’s perfect, but this is as close as it’s going to get to perfect imho as it relates to intellectual curiosity and understanding how the systems we exist in work.
It's an interesting article on this one particular mansion, but the idea that "the same tricks for more efficient heating can be used in modern designs" seems pretty silly.
We don't use fireplaces anymore (a major "trick" being to put them in the middle of the house rather than in the exterior walls), and while using large windows to capture sunlight and heat works great in the winter, it also leads to overheating in the summer and thus more energy for air conditioning.
> These are modest changes, imperceptible to most, and they won't enable us to forgo active heating and cooling entirely. But they do echo a way of thinking which, today, is oft ignored. Hardwick Hall was designed with Sun, season and temperature in mind.
Everyone I know who has built a house has thought very much about sun, season and temperature. This is very much a factor in determining the sizes and quantity of windows on south-facing vs. north-facing walls, for example.
Again, it's a very interesting article on this one particular castle, but the idea that it has something to teach modern architects and builders is pure fantasy. We're already well aware of all these factors and how they interact with materials and design.
An em-dash (in this use; there are others where the normal style differs) set with regular spaces around it isn’t a grammar problem; it is a less common style preference (usually they are set closed—without spaces—or surrounded by thin spaces, or an en-dash surrounded by regular spaces is used instead).
I was investigating a fun webcam-to-ASCII project, so now I am tempted to try porting the logic from the blog post into something reusable.
It appears to be a combination of racism, “crabs in the pot” mentality, tribalism and in-group motivations, and people looking for another group to look down on because they have no opportunity ahead of them between now and death. Happiness is reality minus expectations.
His supporters still say “It’s not great but I’d vote for him again.” Well, the unfortunate news is he’s near end of life and they footgunned their own economic opportunity light cone. The global economy is going to route around the US accordingly, because it cannot be trusted to trade as an adult vs a bully. They will continue to have their vote and mental model regardless of rationality and logical reasoning. All you can do as a nation state counterparty is defend against military action and disconnect economically.
Same vibes as "If you can convince the lowest white man he's better than the best colored man, he won't notice you're picking his pocket. Hell, give him somebody to look down on, and he'll empty his pockets for you." — LBJ
(derived from first principles)
It stops drafts from the window before they reach occupants. Yes, it is less efficient in terms of total heat inside the structure, but it's more effective at avoiding uncomfortably cold spots, which is (in most places at most times of year) more important. Plus, the utility lost to the occupied under-window space is less than the utility that would be lost for the same space elsewhere, since the window already limits alternate uses.
See the recent news about Canada strengthening economic ties with China and welcoming them into their auto market. This wouldn’t have happened in a million years had it not been for US tariffs and hostilities towards Canada. America is truly uniting the world (against them).
Discussion: https://news.ycombinator.com/item?id=46659677
> I believe that explicitly teaching students how to use AI in their learning process
I'm a bit nervous about that one.
I very firmly believe that learning well from AI is a skill that can and should be learned, and can be taught.
What's an open question for me is whether kids can learn that skill early in their education.
It seems likely to me that you need a strong baseline of understanding in a whole array of areas - what "truth" means, what primary sources are, extremely strong communication and text interpretation skills - before you can usefully dig into the subtleties of effectively using LLMs to help yourself learn.
Can kids be leveled up to that point? I honestly don't know.
People can play a role and clearly see the role they play as well.
Plenty of managers see the absurdity in a lot of what they have to do, but it's mandated by the people above them.
> I don’t believe I’ve ever seen shape utilized in generated ASCII art, and I think that’s because it’s not really obvious how to consider shape when building an ASCII renderer.
Not to take away from this truly amazing write-up (wow), but there's at least one generator that uses shape:
https://meatfighter.com/ascii-silhouettify/
See particularly the image right above where it says "Note how the algorithm selects the largest characters that fit within the outlines of each colored region."
There's also a description at the bottom of how its algorithm works, if anyone wants to compare.
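For anyone who doesn't want to click through, here's a toy sketch in C of the quoted selection rule as I read it (not meatfighter's actual code; the 3x3 glyph coverage masks are made up for illustration): per cell, pick the glyph with the most ink whose ink stays entirely inside the silhouette.

    #include <stdio.h>

    #define GLYPHS 4

    /* Hypothetical 3x3 "ink" masks, from sparse to dense coverage. */
    static const char glyph_ch[GLYPHS]     = { '.', '-', 'o', '@' };
    static const int  glyph_ink[GLYPHS][9] = {
        { 0,0,0, 0,1,0, 0,0,0 },  /* . */
        { 0,0,0, 1,1,1, 0,0,0 },  /* - */
        { 1,1,1, 1,0,1, 1,1,1 },  /* o */
        { 1,1,1, 1,1,1, 1,1,1 },  /* @ */
    };

    /* Largest glyph whose ink falls entirely inside the cell's
       silhouette mask; space if nothing fits. */
    static char pick(const int mask[9]) {
        char best = ' ';
        int best_ink = 0;
        for (int g = 0; g < GLYPHS; g++) {
            int ink = 0, fits = 1;
            for (int p = 0; p < 9 && fits; p++) {
                if (glyph_ink[g][p]) {
                    if (mask[p]) ink++;
                    else fits = 0;
                }
            }
            if (fits && ink > best_ink) { best = glyph_ch[g]; best_ink = ink; }
        }
        return best;
    }

    int main(void) {
        /* One cell fully inside the silhouette, one straddling its edge. */
        const int full[9] = { 1,1,1, 1,1,1, 1,1,1 };
        const int edge[9] = { 0,0,0, 1,1,1, 1,1,1 };
        printf("full cell -> '%c', edge cell -> '%c'\n", pick(full), pick(edge));
        return 0;
    }

The full cell gets the densest glyph ('@'), while the edge cell falls back to '-'; that's the mechanism behind the big characters filling the outlines.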
The risk is the same as it was a decade ago, and the decade before that.
The startup will take over your life. You will pour every bit of time and money you have into it. You will lose your sanity. And it’s pretty much a guarantee that you will fail.
Unless you already have a comfortable financial cushion and/or industry connections that will guarantee funding and partnerships, you have to be truly crazy to try that life. But, ultimately it’s the crazy ones that end up winning.
> Understanding this doesn’t mean rejecting new tools. It means using them with clear expectations about what they can provide and what will always require human judgment.
Speaking of tools, that style of writing rings a bell... Ben Affleck made a similar point about the evolving use of computers and AI in filmmaking, wielded with creativity by humans with lived experiences: https://www.youtube.com/watch?v=O-2OsvVJC0s. Faster visual effects production enables more creative options.
It reads a bit like my current position after two decades of on-and-off-GTD and ~three years of PARA: the project/area/resource distinction is practical, but not earth-shattering.
But what's really working is GTD, which the article doesn't call out, but implicitly lumps together with PARA: actionable next tasks and collecting everything in some kind of inbox.
I haven't found much use for PARA itself in my personal life, but for organizing my work OneDrive it shines.
"This is the Lockpicking Robot, and what I have for you today is 34 hours of brute-forcing a master lock."
I'd ask online how to solve this billionth problem I've had with computers, get an answer, follow it and go on with my life, like I did when my OS updated and video files opened with MediaPlayer instead of VLC.
That didn't get thousands of upvotes, or any rage, let me tell you.
Be the change you want to see in the world.
> Until now, I've always been competing against other flesh-and-blood humans
Unless you're a few centuries old, you haven't. You've had the potential to be competing against industrial and computational technology your whole life. Go back further, and the prevalence of slaves served as a similar cost differential (free humans versus enslaved, human versus AI).
Indeed, in 1998 I wanted a Voodoo for some 3D work I was going to do; naturally I wanted the best, and also to play with Glide.
However, it had a problem with my PCI version, and the shop guy was nice enough to trade it back for a TNT, which not only had no issues with my motherboard, but also made me an early Nvidia customer.
I'm not exactly sure what you're asking, but if you pop into the DBOS discord, they can probably help you out.
> Self driving is not a commodity, because it is not fungible - you cannot take BYD's self driving and put it in your Tesla
Currently, no. All evidence points to it as a feature approaching fungibility once it’s good enough and plentiful.
Like, tires aren’t perfectly fungible. But functionally, they’re close to it.
The barber I patronized for many years was a boat person. Wonderful people.
What goes around comes around.
Back in the '80s, with OOP starting to take off, it was "Software ICs"; then components (famously successful during the '90s in the Windows world); now it is IKEA.
>…a trend in which Mr. Trump has used the unfettered presidential clemency power to reward allies and those who have paid his associates or donated to his political operation.
This is a race to the bottom. Criminals confer political advantage by operating outside the law. If we tolerate this from one side we implicitly require it from all.
We need to strike the pardon power. Hamilton thought “the dread of being accused of weakness or connivance, would beget…circumspection” when it came to corrupt pardons [1]. That has not occurred.
Meanwhile, his argument against the Congress granting pardons through law was that “in seasons of insurrection or rebellion, there are often critical moments, when a welltimed offer of pardon to the insurgents or rebels may restore the tranquillity of the commonwealth,” and that such a deliberative process “would have a tendency to embolden guilt.”
We’ve had a civil war. It was not prevented by pardons. At its conclusion, various groups in the executive and the Congress deliberated and passed measures of amnesty [2]. Our modern pardon process, moreover, has evolved into—more often than not—an entire office. And under Trump, its concentrated and corrupt application has directly emboldened those who seek to overthrow our Constitution by violent means.
[1] https://avalon.law.yale.edu/18th_century/fed74.asp
[2] https://www.archives.gov/files/research/naturalization/411-c...
Who decides what information is "accurate"?
My trust in what the experts say has declined drastically over the last 10 years.
> highly inaccurate authority.
The presentation style of most LLMs is confident and authoritative, even when totally wrong. That's the problem.
Systems that ingest social media and then return it as authoritative information are doomed to do things like this. We're seeing it in other contexts too: systems trusting everything in their prompt history equally, leading to security holes.
I believe it's $10,000/year for the top level brand plus $600/year/"affiliate account" https://help.x.com/en/using-x/premium-business
> The US? No, Trump.
No, the US, through its government (which is not just the executive branch) as chosen (in theory, via election) and, in practice, tolerated by its population at large.
It's not just Trump. If the US decided not to follow him he would have no power.
Do you have a more current example?
While I have at least a decade or so before my son is an adult, I would be proud for him to be a stay at home son if that was his choice. As long as he's happy and cared for, he always has a place in our family's home.
WYSIWYG came about when displays became bit-mapped graphics with a sufficient number of dots per inch.
Previously, displays used a character generator ROM chip which mapped each ASCII code to a fixed glyph. For a terminal I designed and built in those days, I used an off-the-shelf character generator chip which had a 5x7 font.
The original IBM PC used a character generator.
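To make that concrete, here's a toy model in C of what a character-generator chip did: the ASCII code selects a glyph, a scanline counter from the video timing selects one of its seven 5-bit rows, and the row is shifted out as pixels. (The 'A' bitmap below is made up for illustration; the real chips held a full font in mask ROM.)

    #include <stdio.h>

    /* 7 rows of 5 pixels per glyph, one bit per pixel, bit 4 = leftmost. */
    static const unsigned char rom['Z' + 1][7] = {
        ['A'] = { 0x0E, 0x11, 0x11, 0x1F, 0x11, 0x11, 0x11 },
    };

    int main(void) {
        unsigned char c = 'A';
        for (int row = 0; row < 7; row++) {      /* scanline counter */
            unsigned char bits = rom[c][row];    /* ROM output for this row */
            for (int col = 4; col >= 0; col--)   /* the video shift register */
                putchar((bits >> col) & 1 ? '#' : ' ');
            putchar('\n');
        }
        return 0;
    }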
I play disco music to keep the kids off my lawn.
Increased productivity increases the value of work and the number of areas where it is useful to apply it. Yes, if you are working for a non-growth firm with basically fixed sales, a productivity increase translates to a headcount decrease in that firm, but across the industry it means more jobs at higher pay, as shown by the whole history of productivity improvements in software development.
I don't think the Great Papago Escape was that great - "Over the next few weeks, all of the escapees were eventually recaptured without bloodshed."
The thing that makes this balloon escape story so enthralling is that it actually worked.
> Too often devs look at QA groups as someone to whom they can offload their grunt work they don't want to do.
That's a perfectly legitimate thing to do, and doing grunt work is a perfectly legitimate job to have.
Elimination of QA jobs - as well as many other specialized white collar jobs in the office, from secretaries to finance clerks to internal graphics departments - is just false economy. The work itself doesn't disappear - but instead of being done efficiently and cheaply by dedicated specialists, it's dumped on everyone else, on top of their existing workloads. So now you have bunch of lower-skill busy-work distracting the high-paid people from doing the high-skill work they were hired for. But companies do this, because extra salaries are legible in the books, while heavy loss of productivity isn't (instead it's a "mysterious force", or a "cost disease").
In a computing system, LLMs aren't substituting for code, they're substituting for humans. Treat them accordingly.
> Even an ATM will resist power tools far longer than it will take the cops to show up.
It'll also spit in your face with a paint that's incredibly hard to wash off.
It's all fun and games until it changes the name of a drug on your prescription.
Do you think having your conversation on speakerphone in public is the same as talking to someone?
>If they produce content with some truth in them, they're truthful, regardless of whether an AI agent did or didn't write them
Nope, they're still slop. Just like a spam message about a product you actually like is still spam.
>People - here in Germany as well as abroad - forget too easily what a sinister but also ridiculous state the GDR was
Wait till you hear how sinister its precursor state was
I didn't believe such conspiracy theories, until one day I noticed Sonnet 4.5 (which I had been using for weeks to great success) perform much worse, very visibly so. A few hours later, Opus 4.5 was released.
Now I don't know what to think.
> Quality absolutely matters, but it's hyper context dependent.
There are very few software development contexts where the quality metric of “does the project build and run at all” doesn’t matter quite a lot.
I need a name for people who dismiss an entirely new and revolutionary class of technology without even trying it, so much so that they'll not even read about any new ideas that involve it.
One might say the same about HN comments.
"Person already sceptical of microplastics papers is convinced by a single counter-paper that he was right all along", news at 11!
$3,500 divided by $199 [https://www.rayneo.com/products/rayneo-air-3s-xr-glasses?var...] = 17: close enough!
Imagine trying to live your life where other people’s desires by default overrode your own.
Unfortunately that happens a lot; it's called the government.