While this article focuses on ads, it's worth noting that sites have had ads for a long time, but it's their obnoxiousness and resource usage that's increased wildly over time. I wouldn't mind small sponsored links and (non-animated!) banners, but the moment I enable JS to read an article and it results in a flurry of shit flying all over the page and trying to get my attention, I leave promptly.
My family's first broadband internet connection, circa 2005, came with a monthly data quota of 400 MB.
The fundamental problem of journalism is that the economics no longer works out. Historically, the price of a copy of a newspaper barely covered the cost of printing; the rest of the cost was covered by advertising. And there was an awful lot of advertising: everything was advertised in newspapers. Facebook Marketplace and Craigslist were a section of the newspaper, as was whichever website you check for used cars or real estate listings. Journalism had to be subsidised by advertising, because most people aren't actually that interested in the news to pay the full cost of quality reporting; nowadays, the only newspapers that are thriving are those that aggressively target those who have an immediate financial interest in knowing what's going on: the Financial Times, Bloomberg, and so on.
The fact is that for most people, the news was interesting because it was new every day. Now that there is a more compelling flood of entertainment in television and the internet, news reporting is becoming a niche product.
The lengths that news websites go to to extract data from their readers to sell to data brokers are just a last-ditch attempt to remain profitable.
This is just the tip of the iceberg. Don't get me started on airline websites (looking at you, Air Canada), where the product owner, designers, and developers can't get a simple workflow straight without loading megabytes of useless JavaScript and interrupting the user journey multiple times.
Give me back the command line terminal like Amadeus, that would be perfect.
How can we go back to a Web where websites are designed to be used by the user and not for the shareholders?
You can't beat China Southern. They have the most dog-shit website I've ever seen. The flight was fine, but I gave up on online check-in after 3 attempts. Never mind the bloat:
- required text fields with wrong or missing labels. One field was labeled "ticket no.". It kept getting rejected. I randomly tried passport number instead. It worked.
- sometimes fields only have a placeholder that you can't fully read because the field isn't wide enough ("Please enter the correct...") and the placeholder disappears once you start typing.
- date picker is randomly in Chinese
- makes you go through multi step seat selection process only to tell you at the end that seat selection is not possible anymore.
- signed up with email; logged out and went back to the SAME login page; now sign up via phone number is required!?
> How can we go back to a Web where websites are designed to be used by the user and not for the shareholders?
Loudly oppose the trend-chasing devs who have been brainwashed into the "newer is better" mindset by Big Tech. I'm sure the shareholders would also like to reduce what they spend on server/bandwidth costs and on "development and maintenance".
Simple HTML forms can already make for a very usable and cheap site, yet a whole generation of developers have been fed propaganda about how they need to use JS for everything.
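To make that concrete, here's a minimal sketch of the kind of plain-HTML-form site being described, using only Python's standard-library `http.server` (the form fields, paths, and port are made up for illustration — this is a sketch, not a production setup):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs

# A plain HTML form: no JavaScript, no framework, a few hundred bytes.
PAGE = b"""<!doctype html><title>Newsletter</title>
<form method="post" action="/signup">
  <label>Email <input name="email" type="email" required></label>
  <button>Subscribe</button>
</form>"""

class FormHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(PAGE)

    def do_POST(self):
        # Browsers submit forms urlencoded by default; parse_qs does the work.
        length = int(self.headers.get("Content-Length", 0))
        fields = parse_qs(self.rfile.read(length).decode())
        email = fields.get("email", [""])[0]
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.end_headers()
        self.wfile.write(f"Thanks, {email}".encode())

# To serve: HTTPServer(("localhost", 8000), FormHandler).serve_forever()
```

The whole round trip — render form, submit, respond — needs zero client-side script, which is exactly the point being made above.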
This rubbish also exists disproportionately for recipe pages/cooking websites as well.
You have 20 ads scattered around, an autoplaying video of some random recipe/ad, 2-3 popups to subscribe or buy some affiliate product, then the author's life story, then a story ABOUT the recipe, before I'm able to see the actual recipe in a proper format.
It's second nature for me to open all these websites in reader mode at this point.
This is why people continue to lament Google Reader (and RSS in general): it was a way to read content on your own terms, without getting hijacked by ads.
Why on earth would you rely on Alphabet, an ad company, to read RSS? There are many other options that aren't made by an ad company.
Google Reader was never the answer. It's such a shame that people even here don't realize that relying on Google for that had interests at odds - and you weren't part of the equation at all.
Well, except for your data. You didn't give them enough data. So they shut up shop. Gmail though, amirite? :D
Yeah I wonder why gmail was not one of the shut down products /s
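For the record, the "read content on your own terms" part of RSS is small enough to sketch with nothing but the Python standard library (the feed URL below is a placeholder, and real feeds would need more defensive parsing):

```python
import urllib.request
import xml.etree.ElementTree as ET

def parse_rss(xml_text: str) -> list[dict]:
    """Extract title/link pairs from an RSS 2.0 feed document."""
    root = ET.fromstring(xml_text)
    return [
        {
            "title": item.findtext("title", default=""),
            "link": item.findtext("link", default=""),
        }
        for item in root.iter("item")  # <item> elements hold the entries
    ]

def fetch_feed(url: str) -> list[dict]:
    with urllib.request.urlopen(url) as resp:
        return parse_rss(resp.read().decode("utf-8", errors="replace"))

# e.g. fetch_feed("https://example.com/feed.xml")  # placeholder URL
```

No ads, no tracking scripts, no 49MB payload: just the headlines and links the publisher chose to syndicate.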
It's really hard to consider any kind of web dev as "engineering." Outcomes like this show that they don't have any particular care for constraints. It's throw-spaghetti-at-the-wall YOLO programming.
I think it's a GOOD thing, actually, because all these publications are dying anyway. Even if you filter out all the ad and surveillance trash, you're left with trash propaganda and brain-rot content. Why even make the effort of extracting the actual text from some "journalist" at these propaganda outlets? It's not even worth it.
If people tune out only because how horrible the sites are, good.
This site more or less practices what it preaches. `newsbanner.webp` is 87.1KB (downloaded and saved; the Network tab in Firefox may report a few times that and I don't know why); the total image size is less than a meg and then there's just 65.6KB of HTML and 15.5 of CSS.
And it works without JavaScript... but there does appear to be some tracking stuff. A deferred call out to Cloudflare, a hit counter I think? and some inline stuff at the bottom that defers some local CDN thing the old-fashioned way. Noscript catches all of this and I didn't feel like allowing it in order to weigh it.
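A rough sketch of this kind of page-weight audit using only Python's standard library (the tag/attribute handling is simplified — real pages need absolute-URL resolution, `srcset`, multi-valued `rel`, and so on):

```python
from html.parser import HTMLParser
import urllib.request

class AssetCollector(HTMLParser):
    """Collect URLs of images, scripts, and stylesheets from an HTML page."""
    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag in ("img", "script") and a.get("src"):
            self.assets.append(a["src"])
        elif tag == "link" and a.get("rel") == "stylesheet" and a.get("href"):
            self.assets.append(a["href"])

def list_assets(html: str) -> list[str]:
    parser = AssetCollector()
    parser.feed(html)
    return parser.assets

def asset_size(url: str) -> int:
    """Content-Length from a HEAD request; 0 if the server omits it."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return int(resp.headers.get("Content-Length") or 0)
```

Summing `asset_size` over `list_assets` gives roughly the tally described above, minus whatever JavaScript would have pulled in after load.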
Rule #1 is to always give your JS devs only Core 2 Quad CPUs + 16GB of RAM.
they won't be able to complain about low memory but their experience will be terrible every time they try to shove something horrible into the codebase
Only major media can get away with this kind of bloat. For a normal website, Google would never include you in the SERPs if your pages were even a fraction of that size.
Maybe I'm just getting old, but I've gotten tired of these "Journalists shouldn't try to make their living by finding profitable ads, they should just put in ads that look pretty but pay almost nothing and supplement their income by working at McDonalds" takes.
I'm pretty sure people would read more and click on more ads if they didn't have to endure waiting for 49 MB of crap and then navigating a pop-up obstacle course for each article.
This argument would be valid if journalism were actually journalism, instead of ripping off trending stories from HN and Reddit, rehashing them with sloppy AI, calling it a day, and burying 4 lines of text inside 400 ads.
I don't like the state of journalism either but you realize this is a vicious cycle, no? People not paying for news (by buying newspaper, or more importantly paying for classified ads) leading to low quality online reporting leading to people not wanting to pay for online news.
I never understand this type of comment. People don't pay for news, so newspapers (which, by the way, have paywalls) are forced to degrade their service. That seems strange to me. If I have a restaurant and people don't want to pay for my food, making even worse food with worse service doesn't seem like a good solution. If I write books and people don't buy them, writing worse books doesn't improve my sales. Why are journalists different? They sell a service for money like everyone else, but for some reason they have a special status, and it's totally understandable when they respond to bad sales with a worse product. And somehow it's our fault as customers: for some reason we should keep buying newspapers we don't think are worth it, to save them from themselves.
Using your analogy, if every restaurant in town had a problem where most people wanted to come in and get food for free (and it was an expectation in the industry) and people refused to go in and pay, everyone would be upset they could no longer go out to eat when there were none left. If nobody is interested in paying for their meal, you can't be shocked the ingredient and chef quality drops in turn.
Our developers once managed to rack up around 750MB per open website.
They put in a ticket with ops saying the server was slow and could we look at it. So we looked. Every single video on a page with a long video list pre-loaded a part of it. The only reason the site didn't run like shit for them is because the office had direct fiber to our datacenter a few blocks away.
We really shouldn't allow web developers more than 128kbit of connection speed; give them anything more and they'll just waste it on nonsense.
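Some back-of-envelope numbers make the point about why that 750MB never hurt in the office. A rough ideal-case transfer-time calculation (latency, TCP slow start, and parallel connections all ignored; the bandwidth tiers are illustrative):

```python
def transfer_seconds(size_bytes: float, bandwidth_bits_per_s: float) -> float:
    """Ideal transfer time: bytes * 8 bits/byte / link speed in bits/s."""
    return size_bytes * 8 / bandwidth_bits_per_s

MB = 1_000_000  # using decimal megabytes for round numbers

for label, bps in [("128 kbit/s", 128_000),
                   ("slow 3G (~400 kbit/s)", 400_000),
                   ("4G (~20 Mbit/s)", 20_000_000),
                   ("office fiber (1 Gbit/s)", 1_000_000_000)]:
    secs = transfer_seconds(750 * MB, bps)
    print(f"{label:>24}: 750 MB takes {secs:,.0f} s")
```

On the fiber link the whole mess moves in seconds, so nobody on the dev floor ever felt it; on the proposed 128kbit diet it would take the better part of a working day.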
PSA for those who aren’t aware: Chromium/Firefox-based browsers have a Network tab in the developer tools where you can dial down your bandwidth to simulate a slower 3G or 4G connection.
Combined with CPU throttling, it's a decent sanity check to see how well your site will perform on more modest setups.
I once spent around an hour optimizing a feature because it felt slow - turns out that the slower simulated connection had just stayed enabled after a restart (can’t remember if it was just the browser or the OS, but I previously needed it and then later just forgot to turn it off). Good times, useful feature though!
Same for fancy computers. Dev on a fast one if you like, but test things out on a Chromebook.
“Craptop duty”[1]. (Third time in three years I’m posting an essentially identical comment, hah.)
[1] https://css-tricks.com/test-your-product-on-a-crappy-laptop/
I now wonder if it'd be a good idea to move our end-to-end tests from a beefy 8-core, 32GB-RAM machine to a pretty slow VM and check which timeouts get triggered, since our app may well be unoptimized for slower environments...
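One low-tech way to make such a move survivable, sketched below: benchmark the host once at startup, then scale the timeouts that were tuned on the beefy machine. All names and the reference figure here are made up for illustration:

```python
import time

def cpu_benchmark(n: int = 200_000) -> float:
    """Time a fixed CPU-bound workload (seconds) on this host."""
    start = time.perf_counter()
    total = 0
    for i in range(n):
        total += i * i
    return time.perf_counter() - start

def scaled_timeout(base_timeout_s: float, reference_s: float,
                   measured_s: float, cap: float = 10.0) -> float:
    """Scale a test timeout by how much slower this host is than the
    reference machine the timeouts were tuned on. The factor never goes
    below 1.0 (fast hosts keep the original timeout) and is capped so a
    flaky benchmark can't produce runaway waits."""
    factor = max(1.0, min(measured_s / reference_s, cap))
    return base_timeout_s * factor

# Hypothetical usage: REFERENCE_S was measured once on the 8-core CI box.
# REFERENCE_S = 0.02
# timeout = scaled_timeout(30.0, REFERENCE_S, cpu_benchmark())
```

Timeouts that only trip on the slow VM are exactly the ones worth investigating, since they are the ones real users on modest hardware would hit.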
Music producers often have some shitty speakers known as grot boxes that they use to make sure their mix will sound as good as it can on consumer audio, not just on their extremely expensive studio monitors. Chromebooks are perfectly analogous. As a side note, today I learned that Grotbox is now an actual brand: https://grotbox.com
Based on the damage rate for company laptop screens, one can usually be sure anything high-end will be out of your own pocket. =3
If you want to see context aware pre-fetching done right go to mcmaster.com ...
There are good reasons to have a small cheap development staging server, as the rate-limited connection implicitly trains people what not to include. =3
I'm pretty damn sure those videos were put on the page because someone in marketing wanted them. I'm pretty sure then QA complained the videos loaded too slowly, so the preloading was added. Then, the upper management responsible for the mess shrugged their shoulders and let it ship.
You're not insightful for noticing a website is dog slow or that there is a ton of data being served (almost none of which is actually the code). Please stop blaming the devs. You're laundering blame. Almost no detail of a web site or app is ever up to the devs alone.
From the perspective of the devs, they expect that the infrastructure can handle what the business wanted. If you have a problem you really should punch up, not down.
> Please stop blaming the devs. You're laundering blame. Almost no detail of a web site or app is ever up to the devs alone.
If a bridge engineer is asked to build a bridge that would collapse under its own weight, they will refuse. Why should it be different for software engineers?
Because bridge engineers can be sued if the bridge kills people
It's a website and not a bridge. Based on the description given, it's not a critical website either. If it was, the requirements would have specified it must be built differently.
You're not even arguing with me BTW. You're arguing against the entire premise of running a business. Priorities are not going to necessarily be what you value most.
Sounds just like a "helpless" dev that shifts blame to anyone but themselves.
Do you have a suggestion how else to handle the situation I described?
There’s a magic word that can be used in scenarios like this: “No.”
Failing that, interpret the requirements.
Nobody can watch a bunch of videos at once that don’t even show up until you scroll! That’s a nonsense requirement and the dev’s failure to push back or redirect in a more viable direction is a sign of their incompetence, not that of the non-technical manager that saw YouTube’s interface and assumes that that’s normal and doable.
It is! You’d have to know about lazy loading and CDNs, but neither is black magic.
"Developers" here clearly refers to the entire organization responsible. The internal politics of the foo.com providers are not relevant to Foo users.
I agree except for your definition of "developers". I see this all the time and can't understand why the blame can't just be the business as a whole instead of singling out "developers". In fact, the only time I ever hear "developers" used that way it's a gamer without a job saying it.
The blame clearly lies with the contradictory requirements provided by the broader business too divorced from implementation details to know they're asking for something dumb. Developers do not decide those.
In general, how people communicate internally and with the public is important.
https://en.wikipedia.org/wiki/Conway's_law
Have a wonderful day =3
From the perspective of the devs, they have a responsibility to say when something literally won't fly anywhere, ever. Saying the business is responsible for every bad decision is a complete abrogation of your responsibilities.
Why don't you tell your boss or team something like that and see how well that flies.
The responsibility of the devs is to deliver what was asked. They can and probably do make notes of the results. So does QA. So do the other stakeholders. On their respective teams they get the same BS from everyone who isn't pleased with the outcome.
Ultimately things are on a deadline and the devs must meet requirements where the priority is not performance. It says nothing about their ability to write performant code. It says nothing about whether that performant code is even possible in a browser while meeting the approval of the dozens of people with their own agendas. It says everything about where you work.
Maybe everyone’s got a different situation, but when a different department tried to put ActiveX avatars all over their site, though it offended me from a UX perspective, I was able to get higher ups to reject it by pointing out that it would shut out 20% of their customers.
We always have discussions here about how you have to learn to talk to communicate your value to clients in a language they understand. Same goes for internal communications.
These days the NYT is in a race to the bottom. I no longer even bother to bypass ads let alone read the news stories because of its page bloat and other annoyances. It's just not worth the effort.
Surely news outlets like the NYT must realize that savvy web surfers like yours truly, when encountering "difficult" news sites—those behind paywalls and/or with megabytes of JavaScript bloat—will just go elsewhere or load pages without JavaScript.
We'll simply cut the headline from the offending website, paste it into a search engine, and find another site with the same or similar info but easier access.
I no longer think about it, as by now my actions are automatic. Rarely do I find an important story that's limited to a single website; generally dozens have the story, and thanks to syndication the alternative site one selects often has identical text and images.
My default browsing is with JavaScript defaulted to "off" and it's rare that I have to enable it (which I can do with just one click).
I never see ads on my Android phone or PC, and that includes YouTube. Disabling JavaScript on webpages nukes just about all ads; they just vanish, and any that slip through are trapped by other means. In short, ads are optional. (YouTube doesn't work sans JS, so just use NewPipe or PipePipe to bypass ads.)
Disabling JavaScript also makes pages blindingly fast as all that unnecessary crap isn't loaded. Also, sans JS it's much harder for websites to violate one's privacy and sell one's data.
Do I feel guilty about skimming off info in this manner? No, not the slightest bit. If these sites played fair then it'd be a different matter but they don't. As they act like sleazebags they deserve to be treated as such.
It’s hard to beat https://lite.cnn.com and https://text.npr.org (I imagine their own employees likely use these as well) or https://newsminimalist.com
https://lite.cnn.com seems to load 200KB of CSS
I’m honestly dumbfounded that these exist
In the past some sites had light versions, but I haven't come across one in over 10 years.
Makes me wonder if this isn’t just some rogue employee maintaining this without anyone else realizing it
It’s the light version, but ironically I would happily pay these ad networks $20 a month to just serve these lite pages and not track me. They don’t make anywhere close to that from me in a year.
> Surely news outlets like the NYT must realize that savvy web surfers like yours truly, when encountering "difficult" news sites—those behind paywalls and/or with megabytes of JavaScript bloat—will just go elsewhere or load pages without JavaScript.
Seems like a gross overestimation of how much facility people have with computers. But they don't want random article readers anyway; they want subscribers who use the app or whatever.
> Surely news outlets like the NYT must realize that savvy web surfers like yours truly, when encountering "difficult" news sites—those behind paywalls and/or with megabytes of JavaScript bloat—will just go elsewhere or load pages without JavaScript.
They know this. They also know that web surfers like you would never actually buy a subscription and you have an ad blocker running to deny any revenue generation opportunities.
Visitors like you are a tiny minority who were never going to contribute revenue anyway. You’re doing them a very tiny favor by staying away instead of incrementally increasing their hosting bills.
> They also know that web surfers like you would never actually buy a subscription
I subscribe, and yet they still bombard me with ads. Fuck that. One reason I don’t use apps is that I can’t block ads.
Do you think youtube will continue to make it possible to use alternate clients, or eventually go the way of e.g. Netflix with DRM so you're forced to use their client and watch ads?
If Google were just starting YouTube today, then DRM would likely be enforced through a dedicated app. The trouble for Google is that millions watch YouTube through web browsers, many of whom aren't even using a Google account, let alone subscribed to a particular channel. Viewership would drop dramatically.
Just a few days ago I watched the presenter of RobWords whinging about wanting more subscribers, noting that many more people watch his presentations than actually subscribe.
The other problem YouTube has is that, unlike Netflix et al. with their high-ranking commercial content, it hosts millions of small presenters who don't run advertising and/or just want to tell the world at large their particular stories. Enforced DRM would ruin that ecosystem altogether.
Big tech will slowly enforce "secure browsing" and "secure OS" in a way that will make it impossible to browse the web without a signed executable approved by them. DRM is just a temporary stopgap.
What does playing fair mean in this context? It would be one thing if you were a paid subscriber complaining that even paying sucks so you left, but it sounds like you’re not.
It is strange to hear these threats about avoiding websites from people who are not subscribers and also definitely using an ad blocker.
News sites aren’t publishing their content for the warm fuzzy feeling of seeing their visitor count go up. They’re running businesses. If you’re dead set on not paying and not seeing ads, it’s actually better for them that you don’t visit the site at all.
I'd like to answer that in detail, but it's impractical to do so here as it'd take pages. As a starting point, though: begin with them not violating users' privacy.
Another quick point: my observation is that the worse the ad problem, the lower the quality of the content. Cory Doctorow's "enshittification" encapsulates the problem in a nutshell.
If you have enough detail for a blog post I'd heartily encourage you to submit it.
You're right, it means nothing. But it cuts two ways. These sites are sending me bytes and I choose which bytes I visualize (via an ad blocker). Any expectation the website has about how I consume the content has no meaning and it's entirely their problem.
> Surely news outlets like the NYT must realize that savvy web surfers like yours truly when encountering "difficult" news sites—those behind firewalls and/or with megabytes of JavaScript bloat—will just go elsewhere or load pages without JavaScript.
No.
"Savvy" web surfers are a rounding error in global audience terms. The vast majority of web users, whether paying subscribers to a site like the NYT or not, have no idea what a megabyte is, nor what JavaScript is, nor why they might want to care about either. The only consideration is whether the site has content they want to consume and whether or not it loads. It's true that a double-digit percentage are using ad blockers, but they aren't doing this out of deep concerns about JavaScript complexity.
Do what you have to do, but no one at the NYT is losing any sleep over people like us.
I also use and like the comparison in units of Windows 95 installs (~40MB), which is also rather ironic in that Win95 was widely considered bloated when it was released.
While this article focuses on ads, it's worth noting that sites have had ads for a long time, but it's their obnoxiousness and resource usage that's increased wildly over time. I wouldn't mind small sponsored links and (non-animated!) banners, but the moment I enable JS to read an article and it results in a flurry of shit flying all over the page and trying to get my attention, I leave promptly.
My family's first broadband internet connection, circa 2005, came with a monthly data quota of 400 MB.
The fundamental problem of journalism is that the economics no longer works out. Historically, the price of a copy of a newspaper barely covered the cost of printing; the rest of the cost was covered by advertising. And there was an awful lot of advertising: everything was advertised in newspapers. Facebook Marketplace and Craigslist were a section of the newspaper, as was whichever website you check for used cars or real estate listings. Journalism had to be subsidised by advertising, because most people aren't actually that interested in the news to pay the full cost of quality reporting; nowadays, the only newspapers that are thriving are those that aggressively target those who have an immediate financial interest in knowing what's going on: the Financial Times, Bloomberg, and so on.
The fact is that for most people, the news was interesting because it was new every day. Now that there is a more compelling flood of entertainment in television and the internet, news reporting is becoming a niche product.
The lengths that news websites are going to to extract data from their readers to sell to data brokers is just a last-ditch attempt to remain profitable.
This is just the tip of the iceberg. Don't get me started on airline websites (looking at you, Air Canada), where the product owner, designers, and developers can't get a simple workflow straight without loading megabytes of useless JavaScript and interrupting the user journey multiple times. Give me back a command-line terminal like Amadeus; that would be perfect.
How can we go back to a Web where websites are designed to be used by the user and not for the shareholders?
> Don't get me started on airlines websites
You can't beat China Southern. They have the most dog-shit website I've ever seen. The flight was fine, but I gave up on online check-in after 3 attempts. Never mind the bloat:
- required text fields with wrong or missing labels. One field was labeled "ticket no.". It kept getting rejected. I randomly tried passport number instead. It worked.
- sometimes fields only have a placeholder that you can't fully read because the field isn't wide enough ("Please enter the correct...") and the placeholder disappears once you start typing.
- date picker is randomly in Chinese
- makes you go through multi step seat selection process only to tell you at the end that seat selection is not possible anymore.
- signed up with email; logged out and went back to the SAME login page; now sign up via phone number is required!?
> How can we go back to a Web where websites are designed to be used by the user and not for the shareholders?
Loudly oppose the trendchasing devs who have been brainwashed into the "newer is better" mindset by Big Tech. I'm sure the shareholders would want to reduce the amount they spend on server/bandwidth costs and doing "development and maintenance" too.
Simple HTML forms can already make for a very usable and cheap site, yet a whole generation of developers have been fed propaganda about how they need to use JS for everything.
> How can we go back to a Web where websites are designed to be used by the user and not for the shareholders?
Or for developers to pad their CV.
Sadly, I think the only answer is some other form of payment than ad clicks. I've no idea what that could be, though.
This rubbish is also disproportionately common on recipe/cooking websites.
You have 20 ads scattered around, an autoplaying video of some random recipe/ad, 2-3 popups asking you to subscribe or buy some affiliate product, then the author's life story, and then a story ABOUT the recipe, all before you can see the actual recipe in a proper format.
At this point it's second nature for me to open all these websites in reader mode.
Good sites do exist. It's just that they drown.
True, these ad heavy cooking sites also dabble extensively in SEOmaxxing their way to the top.
This is why people continue to lament Google Reader (and RSS in general): it was a way to read content on your own terms, without getting hijacked by ads.
Why lament it? I've been using Inoreader for over a decade after Google Reader went away. And I gladly pay for it year after year.
RSS and feed readers still exist! All hope is not lost.
People should stop lamenting Google Reader and start using RSS. There are numerous threads about it on HN, e.g., https://news.ycombinator.com/item?id=45459233
Why on earth do you have to rely on Alphabet, an ad company, to read RSS? There are many other options that are not made by an ad company.
Google Reader was never the answer. It's such a shame that people even here don't realize that relying on Google for that had interests at odds - and you weren't part of the equation at all.
Well, except for your data. You didn't give them enough data. So they shut down shop. Gmail though, ammirite? :D
Yeah I wonder why gmail was not one of the shut down products /s
Let's play a fun prediction: I ask HN readers what will be the page size of NYTimes.com in 10 years? Or 20 years?
Want to bet 100 MB? 1 GB? Is it unthinkable?
20 years ago, a 49 MB home page was unthinkable.
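For what it's worth, a back-of-the-envelope extrapolation (just a sketch; the ~1 MB figure for a home page 20 years ago is my own rough assumption, and there's no reason the growth rate must stay constant):

```python
# Extrapolate home-page size, assuming the same compound growth rate continues.
# Hypothetical baseline: ~1 MB twenty years ago, 49 MB today.
start_mb, end_mb, years = 1.0, 49.0, 20

rate = (end_mb / start_mb) ** (1 / years)  # ~1.21x per year
in_10y = end_mb * rate ** 10               # projected size in 10 years (MB)
in_20y = end_mb * rate ** 20               # projected size in 20 years (MB)

print(f"growth ~{rate:.3f}x/yr; in 10y ~{in_10y:.0f} MB; in 20y ~{in_20y / 1024:.1f} GB")
```

Under those assumptions, "100 MB in 10 years" is actually conservative, and "1 GB in 20 years" is an underestimate.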
It's really hard to consider any kind of web dev as "engineering." Outcomes like this show that they don't have any particular care for constraints. It's throw-spaghetti-at-the-wall YOLO programming.
it's still engineering, just for different constraints - cost & speed.
I think it's a GOOD thing, actually, because all these publications are dying anyway. And even if you filter out all the ad and surveillance trash, you are left with trash propaganda and brain-rot content. Why even make the effort of extracting the actual text from some "journalist" at these propaganda outlets? It's not even worth it.
If people tune out only because how horrible the sites are, good.
This site more or less practices what it preaches. `newsbanner.webp` is 87.1KB (downloaded and saved; the Network tab in Firefox may report a few times that and I don't know why); the total image size is less than a meg and then there's just 65.6KB of HTML and 15.5 of CSS.
And it works without JavaScript... but there does appear to be some tracking stuff. A deferred call out to Cloudflare, a hit counter I think? and some inline stuff at the bottom that defers some local CDN thing the old-fashioned way. Noscript catches all of this and I didn't feel like allowing it in order to weigh it.
rule #1 is to always give your JS devs only Core 2 Quad CPUs + 16 GB of RAM
they won't be able to complain about low memory but their experience will be terrible every time they try to shove something horrible into the codebase
Only major media can get away with this kind of bloat. For the normal website, Google would never include you in the SERPs even if your page is a fraction of that size.
I hate this trend of active distraction. Most blogs have a popup asking you to subscribe as soon as you start scrolling.
It’s as if everyone designed their website around the KPI of irritating your visitors and getting them to leave ASAP.
and the NYT web team was praised as one of the best in the world some (many?) years ago.
previously: nytlabs.com https://web.archive.org/web/20191025052129/http://nytlabs.co...
now: https://rd.nytimes.com
Some of them are good (formerly Richard Harris - Svelte[0]) some of them should stop podcasting.
[0]: https://svelte.dev/
49 MB web page? Try a 45 MB GraphQL response.
uBlock Origin helps mitigate things at least a little here.
Maybe I'm just getting old, but I've gotten tired of these "Journalists shouldn't try to make their living by finding profitable ads, they should just put in ads that look pretty but pay almost nothing and supplement their income by working at McDonalds" takes.
Well, I'm going to block the ads anyway (or just leave), so if they're trying to find profitable ads, they may need to revise their strategy.
“I’m going to either steal your work in a way you don’t consent to, or not consume it” isn’t really great. The alternative is paywalls
Steal? Their server gave me some HTML and it’s up to my user agent to present it however I want.
Much of their work consists of poorly sourced articles, sensationalism, disinformation, and bias to sway the audience.
I'm pretty sure people would read more and click on more ads if they didn't have to endure waiting for 49 MB of crap and then navigating a pop-up obstacle course for each article.
100,000 people clicking at $0.01 CPM is way worse for them than 10,000 people clicking at $2 CPM.
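To spell out the arithmetic (taking CPM at face value as revenue per 1,000 impressions; the figures are the hypothetical ones from the comment above):

```python
# CPM = revenue per 1,000 impressions, so a huge low-CPM audience
# can still earn far less than a small high-CPM one.
def revenue(impressions: int, cpm: float) -> float:
    return impressions / 1000 * cpm

mass = revenue(100_000, 0.01)   # 100k impressions at $0.01 CPM -> $1.00
niche = revenue(10_000, 2.00)   # 10k impressions at $2.00 CPM -> $20.00
print(f"${mass:.2f} vs ${niche:.2f}")  # prints "$1.00 vs $20.00"
```

So the smaller, higher-value audience earns 20x more in this toy example.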
This argument is valid if journalism was actually journalism instead of just ripping off trending stories from HN and Reddit and rehashing it with sloppy AI and calling it a day and putting in 4 lines of text buried inside 400 ads.
I don't like the state of journalism either but you realize this is a vicious cycle, no? People not paying for news (by buying newspaper, or more importantly paying for classified ads) leading to low quality online reporting leading to people not wanting to pay for online news.
I never understand this type of comment. People don't pay for news, so newspapers (which, by the way, have paywalls) are forced to degrade their service. It seems strange to me. If I have a restaurant and people don't want to pay for my food, making even worse food with worse service doesn't seem like a good solution. If I write books and people don't buy them, writing worse books doesn't make my sales better. Why are journalists different? They sell a service for money like everyone else, but for some reason they have a special status, and it's totally understandable that they respond to bad sales with a worse product. And somehow it's our fault as customers: for some reason we should keep buying newspapers, even when we don't think they're worth it, to save them from themselves.
Using your analogy: if every restaurant in town had a problem where most people expected to come in and get food for free (and it was an expectation across the industry), and people refused to go in and pay, everyone would be upset when there were no restaurants left to go to. If nobody is willing to pay for their meal, you can't be shocked when the ingredient and chef quality drop in turn.
In the case of the New York Times, they have subscriptions and many are willing to pay for their work - but their subscriptions are not ad-free.
49 MB or homelessness? There are surely other options.
If you can think of any, then congratulations! You've saved journalism!
You should probably tell someone so the knowledge doesn't die with you.
Solution, see my post. ;-)
> Journalists shouldn't try to make their living by finding profitable ads
I mean, they can absolutely try. That doesn't mean they should succeed.