What are the most upvoted users of Hacker News commenting on? Powered by the /leaders top 50 and updated every thirty minutes. Made by @jamespotterdev.
Not on my part, and I'm also not familiar with the new spelling agreement for the Portuguese language.
It's funny but... Sometimes I play a lot of Beat Saber, which has a thin multiplayer mode where you get in a room of randos and get put into a really structured kind of environment where you can't hassle people, but you can friend people and then invite them into a room where you are in a voice conference.
A lot of the people I've met are retirees who enjoy social VR and like to post pano videos they take on cruise ships -- it is good clean fun, but it must drive Mark Zuckerberg up the wall since he's looking for a younger and more impressionable audience.
Are you saying that has failed? It isn't obvious to me from that page that anything in particular is going wrong. I don't think anyone is daft enough to claim that AI solves the "Iowa remains unplantable due to winter conditions" problem.
It also ignores the fact that your backpack needs change.
At various points in my life I've needed:
- A huge backpack, then a small one
- Water bottle holders weren't important, then they were
- Straps I could tighten to hold a yoga mat weren't important, then they were
- A laptop slot wasn't important, then it was critical
Plus my preference in color has changed, as well as my aesthetic preferences.
Paying $200 for a backpack would be insane when I'll have different needs in a few years anyway. I buy cheap-ish backpacks, and I've never had a zipper or seam fail on me before I needed to buy a new one for a different reason anyway. Or it was just stolen/lost.
My general life philosophy is to buy the cheapest thing that meets my needs generally, replace as necessary (since I often need to replace/upgrade for functional reasons anyways), and buy a very few expensive high-quality items that I know are actually worth it. Like a mid-tier espresso machine, a good leather jacket, quality boots, a decent home speaker, and... I'm honestly struggling to think of anything else.
Note that the judge is bound by precedent and law as to what "unreasonable" means, they can't just make it up as they go along unless there is no precedent. Otherwise the case can be reversed on appeal.
I was on a jury recently where we had to swap out judges in the last couple days of the trial. The reason was because the judge had been assigned another case where the defendant had not waived his right to a speedy trial. The judge wanted to finish his existing case first, the defense lawyers said "You can't do that", the judge looked it up and found out that indeed they were right, so off he went to start the new case and handed off the existing one to a colleague. In my experience judges really do take the law seriously - that's how they get to be judges.
As someone who spends part of their life in that city: the new metro experience is great, and so is how they integrated the stations into the ancient Greek infrastructure.
I only mourn the loss of the jobs that could have been part of the metro, if the wagons weren't automated.
Same, I was gifted a Sandqvist backpack ten years ago. I travel with it as my sole piece of luggage (you can imagine the overstuffing). It has outlived three laptop backpacks that don't go through half as much as it has.
If you have a large pile of spare cash and want your own gem museum, there's one closing down: https://www.bbc.co.uk/news/articles/c937d7p0gzpo
What taught me how to write a compiler was the BYTE magazine 1978-08 .. 09 issues which had a listing for a Tiny Pascal compiler. Reading the listing was magical.
What taught me how to write an optimizer was a Stanford summer course taught by Ullman and Hennessy.
The code generator was my own concoction, and is apparently quite unlike any other one out there!
I have the Dragon Book, but have never actually read it. So sue me.
You're not seriously trying to help.
One thing that may help resolve your issue is that while I do agree with the Sam Vimes theory, it is also not guaranteed. There are also scenarios where the $50 boots will last forever, or you can buy $2 boots that will only last five years... but across your entire lifetime will still be cheaper. Or you can account for the fact that you take better care of your stuff than most people and the cheap thing may in fact be fine for a long time. Or you buy the cheap thing twice and maybe in 15 years when you have more disposable income buy the thing that lasts. Or buy the cheap thing and hit the occasional estate sale and eventually find a thing that lasts, but for dirt cheap prices, because you weren't in a hurry because your immediate needs were met and you had the time to wait for a deal.
The meta-lesson of the Vimes theory is really more that you need to think about these things, but it's not guaranteed that the expensive thing will be better in the longterm on a bang-for-the-buck basis. For furniture, there is something to be said for the technique beloved by the just-starting-out set of buying "whatever I scrounged together from garage sales", and there's something to be said for "I outfitted my apartment from Ikea". Yeah, it's cheap and one way or another you're going to pay for that cheapness, but it's so much cheaper than the alternative that as long as you aren't practicing your wrestling moves on the Ikea end tables, you can get a long way with them even if you're replacing them every 10 years.
And, per your last point... at least when you buy cheap, you know you bought cheap. I found myself in need of a dining room table light a few years back. We went to a lighting store and I stood there staring at all the bespoke LEDs that I knew would die and couldn't be replaced, and the multi-thousand-dollar lamps that looked nice but I simply couldn't know if they were quality... and ended up buying a $15 extension cord with five light sockets on it, bought some light bulbs to put in it, and wrapped the cord around the remains of the previous what-turned-out-to-be-proprietary track lighting. We decorate it for the season with various ribbon things to hide the cords. Because damn it, if it's all just going to fail anyhow, at least I knew I could replace the lights with whatever I wanted, and it cost me less than $100 all in. We've had that for, gosh, I think at least 10 years now, and I've probably cycled the lights at least twice, but that's probably still under $100 total... all because I simply can't trust the expensive stuff.
Are you at all worried that the message you are spreading here is "We are no longer confident in our own ability to secure your data?"
That's like saying machine guns didn't change warfare because we had guns before that.
The law has a concept of a "carrier" [1], and has the ability to judge whether or not the carrier in question is responsible for what it is carrying.
I'm not making a blanket statement that that means everything is a carrier, because a good chunk of the page I linked is devoted to endless legal nuances and I defer the details of the concept to those who know better. I'm just saying that the law has a well-established concept for this sort of situation, such that it is not the case that just because a third party is involved instantly all protections dissolve. If you really want to dig into the details, that's something an AI that hits the web and digests things would be pretty good at, as long as you're not planning on legal action based on that. Sometimes the hardest part of learning about something is just finding the term for it that lets you dig in.
Discussion: https://news.ycombinator.com/item?id=47779730
Loved that section about "meat shields". LLMs cannot be held accountable. Someone needs to be involved in decision making, with real stakes if those decisions are bad.
Re-reading Discworld books today demonstrates how timeless they are. Stories Terry wrote in the 1980s still feel like biting satire against the modern world today.
The books also get better as I get older - I read them first as a teenager and many of the deeper ideas about the human condition went straight over my head.
The way the cult leader in Guards! Guards! manipulates his followers, to give just one example.
If you don't know someone high up at Facebook, chances are you're stuck. Facebook has no real support staff left outside of things like ads that bring in actual revenue.
> All that said, in most orgs I've worked with, they were following agile processes over agile principles - effectively a waterfall with a scrum-master and dailies.
In my experience, they're all waterfall in scrum skin, except they also lose the one thing that was a strength of the old-school method: building up a large, well thought out, thoroughly checked spec up front.
So in the end, the "business process MBA grinder" reshapes any idea to fit leadership's needs - and so here, Agile became all about the things that make software people predictable cogs in the larger corporate planning machine. They got what they needed anyway, but we threw away the bits that were useful to us.
It may well show up in a reduced AC bill as well, due to increased reflectivity.
So, this is slightly off topic, but out of curiosity, what are NPUs good for right this very second? What software uses them? What would this NPU be able to run if it was in fact accessible?
This is an honest, neutral question, and it's specifically about what can concretely be done with them right now. Their theoretical use is clear to me. I'm explicitly asking only about their practical use, in the present time.
(One of the reasons I am asking is I am wondering if this is a classic case of the hardware running too far ahead of the actual needs and the result is hardware that badly mismatches the actual needs, e.g., an "NPU" that blazingly accelerates a 100 million parameter model because that was "large" when someone wrote the specs down, but is uselessly small in practice. Sometimes this sort of thing happens. However I'm still honestly interested just in what can be done with them right now.)
> Desktop UI culture shifted from “look at this crazy skin” to “work reliably and get out of my way.”
I miss the wobbly windows I had in Linux when we started playing with Compiz.
Or neko on my Sun machines.
As for weird-shaped windows, I think it is about ergonomics. A different shape requires more thinking to operate. Form should follow function, not the other way around - if the odd shape serves a purpose, then it makes sense. If it's just to show off, or to make the app look different, then it becomes a usability issue.
I remember that in the late 1990s Windows applications, particularly little weird ones like the app you would use to work a (flatbed) scanner, often tried hard to have unique themed appearances. The industry seemed to lose interest by 2005 or so. I got a job as a Silverlight programmer not long after that, which got me to learn WPF. WPF had facilities for theming that seemed capable and well thought out (it would have been easy, for instance, to turn pill buttons diagonal), but these hardly ever got used; I think the industry had moved on.
Lately I have had to run Office '98, which tries to take over your desktop with Clippy and other things, and it still tries to do so on Windows 11. The borderless windows from Office '98 don't quite look right anymore, but it all works.
That used to be true, but no, nowadays they print perfectly out of the box.
https://www.amazon.com/Visibility-Eco-Friendly-15-Minute-Saf...
>Specified and approved by the Bureau of Explosives and Underwriters Laboratories. No expiration date on road flares, the date shown on the flare is manufactured date. Orion flares will burn in all weather conditions, waxed Flare w/Plastic Cap. 15 Minute Burn Time — Non Perchlorate Formula
Thessaloniki had the same issue, and now there's a stop where you have walkways above the ruins.
Some photos of the "before" here:
https://www.thessalonikiguide.gr/metro-thessalonikis-mia-arx...
That's very interesting, thanks!
I absolutely love your project and I hope it will become a breakout success. It has all the right components for a computing environment that is not controlled.
Have you thought about RISC-V implementations of the kernel as well (iirc you're on ARM and on x64)?
> On February 28, 1974, Shafrazi spray-painted Picasso's 1937 painting Guernica with the words "KILL LIES ALL" in foot-high letters.
> Tony Shafrazian was born in Abadan, Iran, to Iranian Armenian parents
> In 2020, Shafrazi publicly supported Donald Trump for president.
If nothing else, the universe has a sense of humor.
That's why you need to evaluate things by their outcomes. It's the same as weightlifting: if you get injured, "you didn't have good technique".
If your thing sometimes harms people, for whatever reason, your thing isn't safe enough, or easy enough to understand how to do safely.
> because we were sure we needed drop shadows and geometry transforms for windows
As screens get larger, the amount of pixels you need to push to composite windows gets larger-squared. It makes sense to move the pixel pushing away from the CPU and more importantly away from CPU-RAM and on to a separate RAM bus.
The "single buffer with invalidation" model of Win16 (I cannot remember how it works in X) saves memory at the cost of more redraws. The composition model allows you to do things like drag window A over window B without forcing a repaint of window B every frame.
It also allows for better process isolation. I think in both Win16 and X11 you could just get a handle to the "root window" and draw wherever you wanted?
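A back-of-envelope sketch of that trade-off (the resolutions and window count here are my own illustrative assumptions, not anything from the thread): the invalidation model pays for roughly one screen-sized buffer, while a compositor pays for a buffer per window but never has to ask window B to repaint.

```python
# Sketch: why compositing trades memory for fewer redraws.
# Every window keeps its own full RGBA backing buffer.
def framebuffer_bytes(width, height, bytes_per_pixel=4):
    """Memory for one RGBA surface of the given size."""
    return width * height * bytes_per_pixel

screen = framebuffer_bytes(3840, 2160)    # single shared 4K buffer
windows = [(1920, 1080)] * 4              # four hypothetical HD windows
composited = screen + sum(framebuffer_bytes(w, h) for w, h in windows)

print(f"shared buffer (invalidation model): {screen / 2**20:.0f} MiB")
print(f"compositor (per-window buffers):    {composited / 2**20:.0f} MiB")
```

The per-window buffers roughly double the footprint in this toy case, which is the "larger-squared" pressure that makes a separate RAM bus attractive.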
For spammers.
They don't have one for regular people who want to do regular end-user computation.
WhatsApp doesn't support multi-device. You can't have it installed on two phones at once.
It is OK. I actually love looking at other people's work. Perhaps I will never follow it exactly, but once in a while I find gotchas I can steal and adapt to my own. Let it be, let people express themselves. If not for the veterans with years of experience, then for the people coming in recently, who should find these things worth reading up on and learning from.
I just hope they're the Responsible Cybermen.
Aren't they widely believed to be Russian? They've been running for long enough that they're almost certainly in a non-extradition jurisdiction and know to stay there.
Reminds me of repl.it, which perma-blocked my newly created account before I even had a chance to type in the e-mail verification code; in fact the notice about the account block arrived before the one-time e-mail verification code did.
I still wonder what I did wrong (support isn't responsive). But it's true that we're both safe from having a user/vendor relationship now.
No. It's code for the thickest, densest book on the subject that you're ever gonna not read, as it actually assumes you're experienced in the subject and goes into everything except intro level topics.
See e.g. Petzold, et al.
First of all, AI written crap: "That's when Stephenson will learn it's never been about the device. It's about what the device enables us to do."
Second, VR never panned out in line with the promises and investments still. Even the last big thing: "Based on reports as of late 2025/early 2026, Apple has significantly cut production of the original $3,499 Vision Pro due to poor sales, with reports suggesting production ceased early in 2025 to manage high inventory. While not officially "discontinued" in the sense of being pulled from sale, production has halted to focus on a new, lower-cost model and AI-focused smart glasses."
So not the best example to use.
Real World Haskell was published in 2008, followed by Real World OCaml in 2013.
Scala was introduced in 2004, and the first Programming in Scala book followed in 2008.
HN had plenty of PureScript and Elm.
FP was finally going to get its moment in the spotlight, and then mainstream languages got enough FP features to remain the go-to tooling.
Which is why Raspberry Pis are more valuable to me than an x86 NUC, even if the prices are similar.
There are no ARM NUCs at such prices, and even if there were the GNU/Linux support would be horrible.
some kind of genuine software engineering certification
That only gives those in power another way to push people into toeing the line. There's enough corporate authoritarianism these days as it is already. Give Stallman's "Right to Read" a read. His dystopia is exactly where we're going to be headed quickly if we keep demanding someone to "do something".
"The optimal amount of fraud is nonzero."
"Those who give up freedom for security deserve neither."
Oh, people are still using Enlightenment.
The last time I used it was back in the 1990s, before I settled into AfterStep and soon afterwards Window Maker.
That was for my use of GNU/Linux; on other systems it was CDE.
Apparently nothing big came out of Enlightenment and Tizen.
Thanks for the lengthy response.
My point was aimed more at the reasoning "V8 moved away from it, thus bad" that usually comes out of such remarks, when in reality it is much more nuanced than that.
As you nicely put it.
I fully agree with you; however, this is basically the fashion in big corporations:
Building business on top of SaaS products, iPaaS integrations, and serverless middleware.
If you think programming a GPU is hard, try to learn how to do a factorial on one of those quantum emulators.
Here is Microsoft's:
https://learn.microsoft.com/en-us/azure/quantum/qdk-main-ove...
Yeah I mean they're only providing the multimillion dollar product people came to the theater to see, who do they think they are wanting to get paid half the ticket price. I bet if movie tickets just cost $10 instead of $20 people would come to the theater and watch ads for 3 hours while gorging overpriced popcorn and sugary snacks.
When I was in my teens and twenties, 11pm to 2am were my workout hours. Consistently, productively and satisfyingly. I’ve since adapted it to early afternoon or late morning. But the idea of running yourself tired at the end of the day still carries unique appeal for me.
Like plenty of other things in an "everything is explicit" language whose goal is to be a safer C and nothing else.
The "module" system is another hack.
The entities holding the information here are literally police departments. The information itself is evidence, used in active criminal investigations. It's good to want things, though.
I think it's code for "the government will have to bail them out".
We need a law that says if you hold any data about a person, they must be notified when anyone accesses it, including law enforcement.
I used to work in criminal investigations. I understand how this might make investigation of real crime more difficult. But so does the fact that you need a warrant to enter someone's home, and yet we manage to investigate crime anyway.
Your data should be an extension of your home, even if it's held by another company. It should require a warrant and notification. You could even make the notification be 24 hours after the fact. But it should be required.
I worry that passkeys are going to confuse the heck out of less technically sophisticated users the moment they hit an edge case, and I bet they can find edge cases.
> I don't want to be the asshole who is making a shit experience for all of my neighbors, but at the same time, I pay for unlimited
What is it about the word “unlimited” that turns technology-minded people into lawyers? Anyone on HN knows that network pipes are inherently shared, somewhere. I’ve got a 10 gig Comcast fiber and I can’t download at 10 gig from Google Drive because there’s a maxed out pipe somewhere.
OpenCode, Pi, whatever Anthropic doesn't let you use with their subscription because they want to lock you in to their stuff.
Not that I need it for coding, but technically and graphically I like the idea of fonts where CJK characters get exactly double the width and it all tiles nicely.
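If I understand the tiling correctly, that double-width behavior keys off Unicode's East Asian Width property (UAX #11), which you can inspect from Python's stdlib; the sample characters below are just illustrative:

```python
import unicodedata

# 'W' (wide) characters occupy two terminal cells; 'Na' (narrow) occupy one.
# A monospace CJK font that tiles nicely gives 'W' glyphs exactly twice the
# advance width of 'Na' glyphs.
for ch in ["a", "漢", "カ"]:
    print(ch, unicodedata.east_asian_width(ch))
```

Terminal emulators typically use this same classification (via `wcwidth`-style lookups) to decide how many cells a character spans.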
I use opencode with claude models through a GitHub subscription. I've also used claude through Amazon Bedrock.
Both give you optionality because they support N models.
It's called the receiver: https://en.wikipedia.org/wiki/Receiver_(firearms)
We're going the other way: now any random vibe coded slop is the norm.
>Civilization is violent. The Roman Empire maintained it's economy through slavery.
So? Slavery was the baseline back then. The question is whether the Roman Empire was more peaceful/less violent than the alternative, not whether they had slavery or some degree of violence.
>The Catholic Church started the crusades.
After centuries of Arab expansion conquering Christian cities and populations, some of them over six centuries old, across the wider Middle East.
What purge?
I'm searching Google trying to figure out what you're talking about but not getting any meaningful results.
I think calling them "commits" is doing it a disservice, because they're not the same as git commits, and the differences confuse people coming from git. I'd say "jj changes are like git commits, except they're mutable, so you can freely move edits between them. They only become immutable when you push/share them with people."
It's a mouthful, but it's more accurate and may be less confusing.
First time I've seen this pattern in the "getting started" guide for a project:
claude "$(curl -sSf https://plainframework.com/start.md)"
https://plainframework.com/start.md
Looks like that usually runs:
uvx plain-start .
Which runs this: https://tools.simonwillison.net/zip-wheel-explorer?package=p...
Yes they are. I am. Many other people are too.
git was a great step forwards, but its conceptual model just doesn't map well to a lot of workflows, and some very simple things are very difficult or impossible with it. It was designed using a certain set of assumptions and primitives, and other assumptions and primitives turn out to be much more suitable for certain workflows.
I don't know if jj is the perfect answer, but it's a huge step forwards in many ways.
Beads is needlessly overengineered. It puts me off checking out Gas Town.
I have the same worry about being locked out.
So I back it up to a NAS. I bought a Synology NAS (back before they turned into an evil company) which includes a Cloud Sync app which will connect to your Google Drive and sync changes every hour. It's technically sync not backup, but because all deleted files go into a "Trash bin" directory that you can set to never empty, it effectively works as backup for deleted files too (though you can't recover older versions of a file that still exists). The really great feature is that it has the option to sync all files that are in Google Docs/Sheets/Slides format as converted to Word/Excel/PPT. And the great thing about the backup running on your NAS is that it doesn't depend on your computer being on or anything.
I know Synology's considered an evil company now because they seem to tie you to their own hard drives, but I don't know if there's anything else as easy to set up for reliably syncing consumer cloud files to a NAS. Hopefully there is, if anyone else knows?
And of course, you can similarly run a backup program on your computer to back up your local files to it, as it's just a network mount.
Quite a lot of stuff is on iPlayer. But as always, licensing is the killer.
(Not to mention reputational risk, which is why so many episodes of Top Of The Pops are hidden)
> moves the scales toward abiogenesis
Or the warm early universe hypothesis. In its early life, the entire universe was at a temperature that could sustain liquid water literally anywhere. The idea being, in this hypothesis, that life was literally everywhere and then went dormant.
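The arithmetic behind that hypothesis is simple enough to sketch. This assumes the standard CMB temperature scaling T(z) = T0 · (1 + z) with today's T0 = 2.725 K, and ignores pressure effects:

```python
# Sketch: at what redshift was the CMB itself warm enough for liquid water?
# Assumes T(z) = T0 * (1 + z); liquid-water range taken at ~1 atm.
T0 = 2.725                      # CMB temperature today, in kelvin
freeze, boil = 273.15, 373.15   # freezing and boiling points of water, K

z_cool = freeze / T0 - 1        # later bound: universe cools below freezing
z_hot = boil / T0 - 1           # earlier bound: universe still above boiling

print(f"liquid-water CMB temperatures for z ≈ {z_cool:.0f} to {z_hot:.0f}")
```

That works out to roughly z ≈ 99 to 136, i.e. a window a few million years after the Big Bang, which is the epoch the hypothesis leans on.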
The security side of OpenSSL improved significantly since Heartbleed, which was a galvanizing moment for the maintenance practices of the project. It doesn't hurt that OpenSSL is now one of the most actively researched software security targets on the Internet.
The software quality side of OpenSSL paradoxically probably regressed since Heartbleed: there's a rough consensus that the design of OpenSSL 3.0 was a major step backwards, not least for performance, and more than one large project (but most notably pyca/cryptography) is actively considering moving away from OpenSSL entirely as a result. Again: while security concerns might be an ancillary issue in those potential migrations, the core issue is just that OpenSSL sucks to work with now.
I encourage you to present that analogy to an actual court and see how far it gets you. It's very easy to find the statutory definition of a "data broker" under California law.
This is what I mean by the fruitlessness of these kinds of legal discussions on HN. What do you want me to argue, that you're wrong to want the law to work that way?
> don't understand how people are compelled to violence by a technology they barely understand
Altman has literally been preaching AI as an agent of impending doom since ChatGPT [1]. If you keep telling folks the thing you’re building might mean “lights out” for humanity, some people will take you seriously.
To be clear, this doesn’t excuse the idiot attempted murderer. But dialing up hysteria to scare up investors doesn’t come for free.
[1] https://fortune.com/2023/06/08/sam-altman-openai-chatgpt-wor...
Only if you're all playing the same game. Corruption usually happens because some players have higher priorities.
> The other big thing was making research actually persist across sessions. Most agents treat a single deliverable (a PDF, a spreadsheet) as the end goal. In investing that's day one.
This is a problem with pretty much everything beyond easy single-shot tasks. Even for day-to-day stuff: e.g., I was researching a new laptop to buy for my wife, and am now enlisting AI to help pick a good car. In both cases I run into a mismatch between what the non-coding AI tools offer and what is needed:
I need a persistent Excel sheet to evolve over multiple session of gathering data, cross-referencing with current needs, and updating as decisions are made, and as our own needs get better understood.
All AI tools want to do a single session with a deliverable at the end that they themselves cannot read back; or, if they can read it, they cannot work on it; at best, they can write a new version from scratch.
I think this may be a symptom of the mobile apps thinking that infects the industry: the best non-coding AI tools offered to people all behave like regular apps, thinking in sessions, prescribing a single workflow, and desperately preventing any form of user-controlled interoperability.
I miss when software philosophy put files ahead of apps, when applications were tools to work on documents, not tools that contain documents.
This is stupid thinking indulged in by westerners who were born in the lap of luxury. The market is incredibly moral. When my dad was born in a village in Bangladesh, 1 out of 4 kids didn't live past age 5. Thanks to market reforms and the resulting economic growth, child mortality in Bangladesh has plummeted. Bangladesh's under-5 mortality rate is better today than America's was at the time my dad was born.
If India and Bangladesh hadn’t fucked around with socialism for decades after independence, we could have reached the same point many years ago. Millions of children would have been saved. Talk about immorality.
Get an old Prusa MKIII and stick a Revo in there, then learn everything there is to know about 3D printing without spending a fortune or getting locked in. Once you have processed a couple of rolls of filament you'll be much wiser about your needs and that would be the moment to pull the trigger on a 'proper' printer.
The Bambu A1 is a very good printer (we have tens of them, and tens of Prusas as well), but the Bambu ecosystem is not ideal: they push really hard to get you to use their cloud connect, and the printers have cameras and send footage to servers in China once they're connected to the point of being usable. In contrast, there are many open source solutions that will connect a Prusa to your LAN and allow various degrees of remote management (OctoPrint, for instance).
Prusas are extremely hackable; I've adapted them to do all kinds of stuff they were never meant for (1x1x0.25 meters, for instance, or standard width and depth but 65 cm tall). Bambus are quite closed; in theory you could hack on their slicer, but it's infuriatingly bad compared to the alternatives.
Singapore's largest grocery chain is a co-op run by the country's labor unions, which are closely aligned with the government.
To go further there is a difference between “Zionism” and the way they go about it. I have no problem with the state of Israel per se, and I even think they have the right to defend themselves, but I think the way they treat the Palestinians is terrible.
"Hey, let's try something new!" without a plan for success is just a recipe for failure.
I honestly don't understand the desire for municipal grocery stores at all. Grocery stores famously operate on super slim margins, so it's not like they're raking in the dough. Many of them are run extremely well. In Texas, HEB is so beloved that a lot of people consider it far better at disaster recovery operations than the actual government.
I'm not against plans to better help people afford groceries, but somebody needs to at least explain how the plan is economically rationally viable, not just "let's try something new!"
In a nutshell, nodes enable arbitrary programming. This is one of the big success stories for visual programming. Nothing would stop you from doing all that in a text programming language but there's definitely an appeal to the graphical layout when you have modules getting input from half-a-dozen different sources and then outputting to just as many.
In Seattle, the proposal for a government grocery store included exemption from paying property taxes and rent.
Similar molecules have been found in meteorites for a long time, so it is not a surprise. There is no proof life started off-planet, but it is possible.
In a roundabout way this article captures well why I don't really like thinking in terms of "normal forms", especially as a numbered list like that. The key insights are really 1. Avoid redundancy and 2. This may involve synthesizing relationships that don't immediately obviously exist from a human perspective. Both of those can be expanded on at quite some length, but I never found much value in the supposedly-blessed intermediate points represented by the nominally numbered "forms". I don't find them useful either for thinking about the problem or for communicating about it.
Someone, somewhere writing down a list and that list being blessed with the imprimatur of Academic Approval (TM) doesn't mean it is actually useful... sometimes it just means that it made it easy to write multiple choice test questions. (e.g., "What does Layer 2 of the OSI network model represent? A: ... B: ... C: ... D: ..." to which the most appropriate real-world answer is "Who cares?")
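To make the "avoid redundancy" insight concrete, here is a tiny sketch (the table and values are mine, purely illustrative): the same fact repeated across rows is exactly what normalization removes, and the "synthesized relationship" is just the key you join back on.

```python
# A flat table where one fact (a supplier's city) lives in many rows,
# so updating it means touching every row -- the classic update anomaly.
orders_flat = [
    {"order": 1, "supplier": "Acme", "city": "Oslo"},
    {"order": 2, "supplier": "Acme", "city": "Oslo"},
    {"order": 3, "supplier": "Bolt", "city": "Turin"},
]

# Normalizing: store each fact once, keep only the key in the orders.
suppliers = {row["supplier"]: row["city"] for row in orders_flat}
orders = [{"order": r["order"], "supplier": r["supplier"]} for r in orders_flat]

# The relationship is rebuilt on demand by joining on the supplier key;
# changing Acme's city is now a single write in `suppliers`.
joined = [{**o, "city": suppliers[o["supplier"]]} for o in orders]
assert joined == orders_flat
```

Whether you call the result 2NF, 3NF, or BCNF matters less than seeing that each fact now has exactly one home.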
Yes, one way to think about jj in a sort of low-level way is that every jj command does the equivalent of that, every time.
(You can also set up watchman and have that happen on every file change...)
Smoking (even of tobacco) can generally be banned in the CC&Rs of properties (multifamily complexes is the case where this makes the most sense) and by the landlord in any rented property, multifamily or subject to CC&Rs or not.
Given the alleged recent extreme reduction in Claude Code usage limits (https://news.ycombinator.com/item?id=47739260), how do these more autonomous tools work within that constraint? Are they effectively only usable with a 20x Max plan?
EDIT: This comment is apparently [dead] and idk why.
> Somewhere around 2005-2007, when people were wondering if the Internet was done
Literally who wondered that? It drives me nuts when people start off an argument with an obvious strawman. I remember the 2005-2007 period very well, and I don't remember a single person, at least in tech, thinking the Internet was done. Maybe some ragebait articles were written about it, but being knee-deep in web tech at that time, I remember the general feeling was that it was pretty obvious there was tons to do. E.g. we didn't necessarily know what form mobile would take, but it was obvious to most folks that the tech was extremely immature and that it would have a huge impact on the Internet as it progressed. That's just one example - social media was still in its nascent stages then, so it was obvious there would be a ton of work around that as well.
Somewhere around 2005-2007, when people were wondering if the Internet was done, PG was fond of saying "It has decades to run. Social changes take longer than technical changes."
I think we're at a similar point with LLMs. The technical stuff is largely "done" - LLMs have closer to 10% than 10x headroom in how much they will technologically improve, we'll find ways to make them more efficient and burn fewer GPU cycles, the cost will come down as more entrants mature.
But the social changes are going to be vast. Expect huge amounts of AI slop and propaganda. Expect white-collar unemployment as execs realize that all their expensive employees can be replaced by an LLM, followed by white-collar business formation as customers realize that product quality went to shit when all the people were laid off. Expect the Internet as we loved it to disappear, if it hasn't already. Expect new products or networks to arise that are less open and so less vulnerable to the propagation of AI slop. Expect changes in the structure of governments. Mass media was a key element in the formation of the modern nation state; mass cheap fake media will likely lead to its fragmentation, as any old Joe with a ChatGPT account can put out mass quantities of bullshit. Probably expect war as people compete to own the discourse.
The GPC spec does not say "no cookies will be set" [1], and does not mention cookies at all. It merely provides a way for the user to indicate their preference that their information not be shared or tracked. The spec even says:
> In the absence of regulatory, legal, or other requirements, websites can interpret an expressed Global Privacy Control preference as they find most appropriate for the given person, particularly as considered in light of the person's privacy expectations, context, and cultural circumstances.
The CCPA [2] also never explicitly mentions cookies or forbids them from being set. The relevant passages about opting out on the sale of personal information are:
> a) A business shall provide two or more designated methods for submitting requests to opt-out, including an interactive form accessible via a clear and conspicuous link titled “Do Not Sell My Personal Information,” on the business’s website or mobile application. Other acceptable methods for submitting these requests include, but are not limited to, a toll-free phone number, a designated email address, a form submitted in person, a form submitted through the mail, and user-enabled global privacy controls, such as a browser plug-in or privacy setting, device setting, or other mechanism, that communicate or signal the consumer’s choice to opt-out of the sale of their personal information
How would you respond to their claim that you are fundamentally misunderstanding GPC, and that the spec and the law do not mean you never set cookies, they mean that you must honor the preferences expressed by the header in backend processes that involve tracking or sale of personal information?
[1] https://w3c.github.io/gpc/
[2] https://www.oag.ca.gov/sites/all/files/agweb/pdfs/privacy/oa...?
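For what it's worth, the server-side check the GPC spec implies is tiny: the request carries a `Sec-GPC: 1` header when the preference is set, and any other value (or absence) means no signal was expressed. A sketch (the function name is mine):

```python
def honors_gpc_opt_out(headers: dict) -> bool:
    """Return True if the request expressed a GPC do-not-sell/share
    preference. Per the spec, only the exact value "1" is a signal;
    anything else, including an absent header, is not."""
    return headers.get("Sec-GPC", "").strip() == "1"
```

What the site then does with that boolean (suppress third-party sharing, skip ad-tech calls, etc.) is exactly the part the spec leaves to regulation and interpretation, which is the crux of the disagreement here.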
Many software companies in the 80s were quiet about their software being bootlegged because it turned out to be great for building a critical mass of users of their software.
The interesting question to me at the moment is whether we are still at the bottom of an exponential takeoff or nearing the top of a sigmoid curve. You can find evidence for both. LLMs probably can't get another 10 times better. But then, almost literally at any minute, someone could come up with a new architecture that can be 10 times better with the same or fewer resources. LLMs strike me as still leaving a lot on the table.
If we're nearing the top of a sigmoid curve and are given 10-ish years at least to adapt, we probably can. Advancements in applying the AI will continue but we'll also grow a clearer understanding of what current AI can't do.
If we're still at the bottom of the curve and it doesn't slow down, then we're looking at the singularity. Which I would remind people in its original, and generally better, formulation is simply an observation that there comes a point where you can't predict past it at all. ("Rapture of the Nerds" is a very particular possible instance of the unpredictable future, it is not the concept of the "singularity" itself.) Who knows what will happen.
Kind of hilarious that our Treasury will be better defended against cyber threats than our DoD.
Dropping them like I accidentally picked up shit...
> The short of it is that there’s no money in photography, compared to videography.

Movies routinely have 8 or 9 digit budgets, with teams of hundreds of people who have to collaborate to make footage coming from dozens of different cameras look seamless and consistent.
Movies are not where BlackMagic makes their money. It's from the millions and millions of small videographers, news teams, ad teams, and content creators.
Same for photos.
That's more useful. A big question is how much is really turned off in a computer waiting for the wake-up packet. "The power to the Ethernet controller must be maintained at all times, allowing the Ethernet controller to scan all incoming packets for the Magic Packet frame". So the full network controller is still alive. There's not some tiny Magic Packet detector hardware running off a rechargeable coin cell or something, with the main power supply turned off. At least not in the original design.
A lot of sleep modes leave more running than you'd expect.
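For reference, the Magic Packet frame the controller scans for is simple enough to build by hand: 6 bytes of 0xFF followed by the target MAC address repeated 16 times, conventionally sent as UDP broadcast on port 9 (or 7). A minimal sender sketch (the MAC in any real use is obviously your own machine's):

```python
import socket

def build_magic_packet(mac: str) -> bytes:
    # 6 bytes of 0xFF, then the 6-byte MAC repeated 16 times: 102 bytes total.
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("expected a 6-byte MAC address")
    return b"\xff" * 6 + mac_bytes * 16

def send_magic_packet(mac: str,
                      broadcast: str = "255.255.255.255",
                      port: int = 9) -> None:
    # The payload can arrive in any frame the controller sees, which is
    # why UDP broadcast works: the sleeping machine has no IP to target.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(build_magic_packet(mac), (broadcast, port))
```

The fact that the payload is just a byte pattern, matchable anywhere in any incoming frame, is why the whole controller has to stay powered and scanning rather than some tiny dedicated detector.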