GoogleGodHNComments

"Google Is Not God of the Web"

26-07-2021 12:58

> 22.82 Mbps will reliably download very complex web pages nearly instantaneously.

The author may be unaware of how ridiculously huge web pages have gotten. I just loaded Wired.com, scrolled down and let it sit for a few seconds. It downloaded 96.2 MB, requiring over 33 seconds on one of those average connections. On a pay-as-you-go data plan, it would have cost about a dollar just to load that one page. The front page has about 500 words of content. It also covered 100% of the content with ads, twice.

This is unsustainable. Web developers have utterly squandered all efficiency gains of the last 30 years, and then covered their grossly inefficient work with impossibly annoying, privacy-invading advertising. Google should be applauded if they make these wasteful developers suffer monetarily until they shape up. They've already stolen untold amounts of time and energy from us all.

> Web developers have utterly squandered all efficiency gains of the last 30 years, and

Loading Wired.com with uBlock and no JavaScript, the page comes in below 1.5 MB for me, with most functionality seemingly intact (in that the front page looks mostly normal and I can load articles that appear to have their text completely intact). The bulk of that seems to be fonts and images, which are probably unavoidable for a media site.

Some reasonable noscript whitelisting for Wired.com and a few others (out of 12 total that noscript shows me) gives a page size that's still under 5MB.

Looking at the full page with everything loaded and unblocked, the biggest offender here seems to be not web design, but an aggressively downloading autoplay video on the front page. Without that the page itself is - while not necessarily great - at least reasonably bearable.

Truth be told, I'd started this post intending to blame advertisers, and there is still some merit to that since even before the video kicks in the various third party scripts balloon the page size several times over from the minimal functional one that loads with everything blocked. But in this case, it does simply seem to be a wasteful media stream playing without regard to whether anyone wants it to or not.

With uBlock Origin it was 1.5 MB for me, without it 3.8 MB, on Firefox, coming from Germany. Both numbers are still pretty ridiculous for what's actually visible on the page.

Once you scroll, however, things get messy no matter what, because of the "Scott Adkins Answers Martial Arts Training Questions From Twitter" auto-play video they have right now. That ate away another 30 MB quickly, and the video wasn't even visible (I had scrolled past it).

I browse without 3rd-party and 1st-party scripts [0]. I wanted to praise my setup but it does not work well with Wired:

1.37 MB / 722.93 KB transferred, Finish: 6.57 s

versus uBlock default

8.62 MB / 4.49 MB transferred, Finish: 28.38 s

A clean setup (no blocking) mostly increases load time:

11.58 MB / 5.79 MB transferred Finish: 1.13 min

^ checked with "Disable Cache".

Not much content is delivered for such a big HTML file:

660.46 / 167.60 KB transferred

because it's mostly inline script:

    // Run in the browser DevTools console: counts the characters of inline <script> content.
    $$('script').map((s) => s.textContent).join('').length
    568681

And fonts.css is mostly inlined font data:

127.70 / 100.15 KB transferred

[0]

uBlock Origin:

    * * 1p-script block
    * * 3p block
    * * 3p-frame block
    * * 3p-script block
    * * inline-script block

or uMatrix:

    * * * block
    * * frame block
    * 1st-party css allow
    * 1st-party frame allow
    * 1st-party image allow

That's still a lot. The longest pages on my website download around 200kb of data. 1.5MB is a lot of data for text-based information.

Of course, some space should be left for enhancing the text with images, but the Wired front page doesn't have that many of those.

25 years ago I used to read Texas Monthly magazine (on paper) because it was some of the best journalism around. I finally unsubscribed when I had to flip through 16 pages of ads to get to the table of contents, and several ads per issue contained scratch-n-sniff perfume samples which meant that reading an article required you to smell like a bordello afterwards.

Wired is the digital equivalent of that now. There's some great journalism in there, but no journalism is good enough to put up with a hostile attack on one's sensibilities just to read it.

There isn't much difference with adblock on or off for this page for me (using Chrome on Windows). With adblock off, there are three obvious ads on the top right of the page, search results on the left. With adblock on, these are gone.

In both cases, the top of the search results is a map with a few shops listed, and under this are normal search results.

Repeating this on mobile, the ads are moved inline above the map. They take up about 50% of the screen. That is a bit intrusive. However, I have muscle memory that automatically scrolls down a bit after searching and to be honest, I would not have even noticed them if I wasn't looking.

I don't know where the claim that 100% of the content is covered by ads is coming from. Perhaps it's different in different regions?

I'm not trying to be an apologist for Google here, by the way. However, when it comes to intrusive ads there are far worse offenders (although perhaps not when it comes to behind the scenes data collection, which is a bigger issue).

You're right, by turning off my ad-blocker, my mobile screen was fully covered with ads.

What constantly worries me, however, is what happens when most visitors use ad-blockers. My guess is that a war would be declared against current web standards, under which users can still modify the DOM and remove unwanted content. Edit: reworded for clarity.

I used to get Wired in print in the late '90s, and i can tell you, Wired was the print equivalent of Wired.com now!

Pages and pages of ads before the contents, and then each article would usually be one or two beautiful full colour pages, before a 'continued on page 94' which took you to the remainder formatted as cramped commieblock-looking text at the back - which you got to by skipping over a few dozen more pages of ads.

Indeed. Ads are just noise that should be filtered out. To visit a website and be served 100 megabytes of noise along with less than 1 megabyte of actual information... It's an absurdly low signal-to-noise ratio, such an incredible waste.

We need to fix the web by blocking these things. No compromises. If a website can't exist without advertising, it should disappear. Eventually, only good websites will remain. Websites made by people who actually have something to say rather than websites made purely to attract audiences for advertisers.

There are two types of noise being served here:

- The advertising
- The tracking

The tracking is being used to:

- tell advertisers how well their advertising is working
- tell the site how well articles are working
- give unscrupulous sites the possibility of selling that data to others, which are probably advertisers but maybe other companies.

As for the ratio of advertisement to content, this https://www.editorandpublisher.com/news/higher-ad-to-edit-ra... regarding that ratio in newspapers assumes 60/40, where I believe the 40 is supposed to be content (although I find the wording not 100% clear).

Why should they have to "sustain themselves"? If an author wants to put their ideas out there, maybe they should pay for it themselves so they can have their own unmoderated space on the internet.

Authors that rely on advertising have an inherent conflict of interest: they simply won't write anything that offends the advertisers because they're afraid of losing their revenue. Sites like Reddit will nuke entire communities if they prove controversial enough not because they're offended by it but because it causes advertisers to refuse to associate with them. Activist groups can attack and censor anyone these days by putting pressure on advertisers and sponsorships and causing them to pull out.

Why should they have to not "sustain themselves?" If a reader wants to read what an author puts out, maybe they should be allowed to be subject to advertising so that the author doesn't have to pay for it.

With a few exceptions, I learn more from user comments on sites like this than I do from today's "journalism".

Turns out, people willing to spout random ideas on a topic are not in especially short supply and 99% of them are willing to do it for free. The best part is, these free users usually get right to the point.

Long form and investigative journalism need to be funded, but the kind of information I find in junk articles on the homepage of CNN or Fox is usually better hashed out (and much less biased) in the comments section than in the article itself.

In a sentence, most media doesn't have much value-add. Even less so if I have to click through 6 ads and be exposed to malware to see it.

> If a reader wants to read what an author puts out

Why would anyone want to read stuff like sponsored articles which are nothing but thinly veiled advertising? Articles that were pretty much written by PR firms? Why would anyone trust journalists with conflicts of interest? Social media "influencers"?

I want real information. Real thoughts from real people. Not some censored, manipulated corporate viewpoint created to maximize profits. People who actually have something to say go out of their way to tell as many people as possible about their ideas. They don't need to get paid for it. I'm not getting paid to post here.

> maybe they should be allowed to be subject to advertising so that the author doesn't have to pay for it.

Allowed by whom? The user is in control, not the author. It is the user who owns the computer that runs the browser. If any ad gets shown on the screen, it is because the user generously allowed it to happen. Most people do this out of pure good will only to end up being mercilessly abused for the sake of someone's business model. Nevertheless, it is a privilege which can be unilaterally revoked and there is next to nothing that websites can do to stop it. After content has left the server, the author is no longer in control.

Sometimes the best things aren't commercially self-sustaining. Blogs, paid for by the writers' day jobs. Professors' sites on .edu domains, paid for by research budgets.

As for professional journalism, the lack of conflicting interests caused by ads is essential for it to be considered "good", so no good journalism website should be clouded by advertising. Yes, that probably means subscriptions.

Why must websites sustain themselves financially?

It can literally cost you $30 a year to host a website.

Many people will spend more on coffee. Per month.

Is it though? Nowadays you have plenty of good free CMS-es which integrate directly with Netlify. Sure, it might be half a centimeter more complex than WordPress admin but it's still really easy to grasp for many non-tech people (checked).

But even if it were very complex -- which it isn't -- I still fail to see how that supports a model of ad-supported web hosting.

I think the main cost is producing the content. Researching, creating, and editing the actual words and images.

I hate capitalism. I'm fully an anti-capitalist.

But I'm also a realist. We live in a capitalist society and free content has come out of advertising since before the web. It's an annoying part of the present world but not the most annoying part imo. I don't know how people denouncing just advertising expect the publication of free information to work.

This society has created a vast plenty. I don't see why advertisers, the public and publishers couldn't reach a truce where a moderate amount of semi-relevant text ads gets shown to the reader in exchange. But everyone wants total control, wants to club all competitors to death, and that seems to be the way this world of plenty is ending.

It is hard to believe that 100 MB of payload is needed to show me the same amount of ads on a page. Merely optimising that, without even changing your ad volume/model, would go miles in establishing the trust that has been lost.

I agree, but somehow when I google for something, Wired and all these websites are the ones that appear in the first results, so what's the deal? Does Google not care whether your website is heavy, as long as that domain provides a lot of ad income?

I've made a website in pure HTML with just a little CSS and no JS, with really great content. Google doesn't take it into account. So I don't know if they are really pushing for a lighter web, or maybe, I don't know, it's because it is easier to convert these websites into AMP?

High network utilization doesn't have to mean an unresponsive website.

Google cares more about responsiveness. A site is responsive if you can start reading it quickly, regardless of how much network traffic is being used by ads, as long as the ads are loaded asynchronously after the primary content.

Penalizing for high total network traffic is short-sighted and would prevent most video hosts from ranking well in Google Search.

having worked specifically on website responsiveness improvement for a medium-size publisher, I can certainly attest that ads heavily influence performance, even if added asynchronously after the first paint, or after the site becomes interactive; we tried every trick in the book, and then some.

the main reason is that ads are coded by people that have no interest in performance. I've seen huge JS libraries loaded to display a static image. I've seen multiple versions of the same library loaded in the same ad. I've seen things that wouldn't fly anywhere else.

Why? Because the ad agencies are under a lot of time pressure to make things quickly, and there is NO penalty if the quality is terrible. So they take what worked last campaign, add new tags to satisfy the new customer, and ship it out. It displays? Perfect. It's huge and slow as hell? Who cares, it's not their website that gets slow as molasses.

Do you know how they decide whether someone is able to start reading quickly? Personally, I find asynchronously loading ads around the text so distracting that I basically can't focus on reading until it's done loading, so for me the whole site needs to load anyway. In the same vein, I would prefer Google down-rank all sites with anything but entirely static ads.

One of the performance checks they use is how quickly the body text renders into a readable format as a page loads. If your site has slow-loading web fonts AND you haven't specified a fallback font (e.g. serif or sans-serif), Google will penalise it.
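
A minimal sketch of the "fallback first" pattern that check rewards, using the standard CSS Font Loading API; the font name, class name, and the stylesheet wiring are hypothetical:

    // Render body text in a system fallback font immediately, then switch to the
    // web font once the browser reports it has loaded. "Some Web Font" and the
    // "webfont-loaded" class are made up; the stylesheet is assumed to default to
    // body { font-family: sans-serif } and only opt into the web font via the class.
    if ('fonts' in document) {
      document.fonts.load('1em "Some Web Font"').then(() => {
        document.documentElement.classList.add('webfont-loaded');
      });
    }

Declaring font-display: swap in the @font-face rule gets you similar behaviour declaratively, without any script.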

They decide by the Wired guys having a coffee with the Google guys or a chat in some party.

Google is people, after all. Corrupt as fuck. That's why plenty call it the gulag.

You're being downvoted but this is actually a useful suggestion. I use millionshort.com all the time when I want to strip out the low-value SEO crap that dominates search results.

"Google should be applauded if they make these wasteful developers suffer monetarily until they shape up."

As the browser author with the dominant market share, Google could start by not enabling these "wasteful developers". These ridiculously huge web pages can lock up a user's computer when the user's browser is a "modern" and "supported" one but not when the user agent is something simpler.

I can read wired.com really fast using various tcp clients to make the http requests and links to view the html. If user A reads the articles with Chrome and user B reads them using something simpler, how is user A advantaged over user B? All things being equal, if we quiz them on the readings, would user A score higher than user B?

The difference is the advertising, which almost always requires graphics - the more dazzling the better, and detailed tracking, which requires Javascript and the presence of other "modern" browser "features". There is a strong argument to be made that Google's browser caters more to the online ad industry (who wants to show ads and do tracking) than to users (who want to read wired.com quickly and efficiently).

Software developers have long been squandering users' resources, beginning with Microsoft Windows. Hardware manufacturers were Microsoft's first customers, and there was an incentive to get users to keep "upgrading" their hardware, i.e. buying new computers.

Web developers are simply following the tradition.

A user can get those 500 words of content in an instant with zero ads, using the right user agent, even on an "obsolete" device. However, there are zero incentives for the online ad industry-supported companies/organisations maintaining "modern" browsers, for the web developers writing code to run on them, or for hardware manufacturers to help the user do that.

The easiest way to change the "UX" for the web user is to change the user agent. Trying to get web developers to change what they design to only use a small fraction of what Chrome and the user's computer can do is far more difficult, if not outright impossible.

> Trying to get web developers to change what they design to only use a small fraction of what Chrome and the user's computer can do is far more difficult, if not outright impossible.

Not only is it improbable that web developers will change their malicious behaviour, it's also the user agent's fault for allowing it.

Why is there no connection speed detection in the browser? Why does the browser allow media playback by default? Why is there no mechanism that reflects the expectations of the user? Is the user expecting videos on the news website or just to read text and images?

I personally think that user agents are not really user agents anymore, as there's not even the idea of offering the user a choice like this.

And personally, I do not agree with the concept of trusting random websites on the internet - by default. Any website on the web should be distrusted by default, with the user having the choice on what to load and what to expect when navigating to that specific resource.

If I visit an i.imgur.com/somename.jpg, why am I redirected to an html page with megabytes of files just because the user agent accepts html then? Should this not be outright impossible?

But please take my comment with a grain of salt, I am building a web browser concept that allows filtering and upgrading the web (which is superhard to build) [1] and it's still a ton of work to get anything working in a stable manner.

[1] https://github.com/cookiengineer/stealth

All those things you mention are design choices.

Perhaps one of the impediments to the development of new user agents is a feeling that they must be complex and meet some "standard" of features. A standard that is nigh impossible to meet for the average programmer. On top of that, web developers demand the ability to create complex web pages full of executable code.

However, we have no proof that users would reject a cornucopia of different agents that did not all have the same set of features. User agents do not need to be designed to satisfy web developers. User agents can be designed to satisfy users.

They can be designed to do one thing well instead of do everything.

No user agent need be intended to "replace" any other, and certainly not to replace "modern" browsers. The intent is to create choice and user experimentation.

It is still possible to access the web with simple programs. It is not Gopher or Gemini, but it still can work in a relatively simple way. Web developers probably do not like that, but it remains true. The complexity of today's web sites is optional. It is a design choice. Made possible by... "modern" browsers.

Godspeed.

I find it ironic that you bring up Windows as an example, when the amount of data the parent comment mentioned (~100 MB) is enough for a full install of Windows 3.11 and Office 4.3... and will yield many times more enjoyment than the front page of Wired.

It is sad that Microsoft would never acknowledge that some users might want to run older software on newer computers. IMO, it is easier to see the performance improvements in new hardware when running older software than it is when running "the latest" software. I would have run 3.11 for many years on newer hardware. However the goal of the company and the message pushed to its software users was/is always "upgrade".^1 Today, it is "update".

1. Over time almost all user choice in "upgrading" has been removed. "Forced upgrades" is a thing.

You are not the market share Microsoft is aiming for. But you are not obliged to run Microsoft Windows either. Linux runs perfectly on old hardware.

- 42 MB RAM without a graphical system
- 64 MB RAM with a graphical system

You can run Windows applications on Wine, or Windows 3.11 in a virtual machine.

The netbook I bought in 2008 was underpowered for Windows XP but was perfect for Linux. I still have it around. With up-to-date Firefox and Chrome it feels slow, but in console mode it's snappy.

No need to even install, thanks to LiveUSB. Everything is here, countless people made it possible; would you use it?

"Linux runs perfectly on old hardware."

I prefer NetBSD. I do not need graphics. I make custom images that boot from USB or the network.

As for Windows, there was a time, in the 32-bit era and before the widespread availability of virtual machines and Windows 3.11 images, when users were compelled to upgrade hardware and Windows versions. It was not made easy for a non-technical user to buy new hardware and use 3.11 if the hardware came with a more recent Windows version pre-installed. Microsoft will not facilitate installing older Windows versions on newer hardware ("metal", not VM) and may actively discourage it. In contrast, I can easily install any version of NetBSD I want on new hardware. I am not compelled to install the most recent version. There is user choice.

How easy is it today to run Office in a VM on Linux?

> Microsoft will not facilitate installing older Windows versions on newer hardware

It usually works though nowadays, unless you go nuts and try to boot Windows XP or something. Are there any processors that flat-out can't run Windows 7 atm?

(Older versions of macOS, on the other hand, absolutely will not run on newer processors.)

If I'm trying to get this to work, is Windows 7 a good choice?

Have you ever successfully imaged Windows 7 from an older laptop and installed it on a newer computer?

I only need Office. I do not necessarily need the latest version, so long as documents are XML-based.

I have never imaged OSes; I'm sure it's a fine practice since lots of people do it, but it feels "unclean". I always do clean installs.

That said, I was able to pretty quickly install Windows 7 on a then-just-released Ryzen 3950X last October. I do remember there being one hitch, I think I had to slipstream in USB 3.0 drivers.

You can run many old operating systems, including esoteric ones, in a virtual machine. Modern computers incur very little overhead for virtualization.

This is not a solution to avoiding the resource consumption of running a "modern" OS on a "modern" computer. Your comment completely misses the point.

A lot of sites download new content continuously even after the page has been fully rendered! They swap out silent media ads one after another, in hopes that you minimized and ignored the page.

Even with adblocking it downloads 30 MB almost instantly, and while typing this reply it's now up to 48 MB. Oh wait, it's at 50 now. It keeps going up.

Looking at the requests it seems to be downloading a video from Cloudfront. And yes, in the middle of the page there is a video playing. I'm sure people with metered connections will love that.

That said, with adblocking at least the design looks clean enough. I'm willing to bet that this is what their designers see, and then another team adds all ad overlays on top of the existing design.

My browsing these days looks like

- JS off by default

- Web fonts off by default

- Media elements larger than 100KiB not loaded by default

uBlock Origin is the god of the internet.

Edit(s): formatting

> Google should be applauded if they make these wasteful developers suffer monetarily until they shape up.

So you want Google to use their dominant position to force webmasters into a new paradigm that probably(?) benefits Google more than today's status quo? And when people start yelling for antitrust provisions, will you still back Google?

I stopped visiting wired.com years ago, when, instinctively, I could just sense the bloat by the spike in CPU, and lag in pageload, for what was ostensibly plain text, plus a serif font and one header image.

As if I wouldn't notice that casual reading material was causing my cooling fans to spin up.

As if I were multi-tasking so hard, that it couldn't be discernible which dog in the room could take the blame for who just farted.

The wired.com website first showed hints of becoming unusable when their audience participation widget provided by Disqus, for reader/user comments, became this opaque blob of compressed/minified JavaScript. Then they started piling on video. This was back around 2009 or 2010. By 2013, I had mostly washed my hands of Conde Nast publications.

Whoops.

> Web developers have utterly squandered all efficiency gains of the last 30 years

It's true, whenever a company has pleaded with me to bring a site in with better performance, I have adamantly refused to do so.

When companies say, "guess what, Bryan, we are going to focus a sprint on just making sure everything downloads as fast as possible and we get rid of anything getting in the way of the best possible user experience," I have spent that sprint watching quirky animation, and sometimes turned that animation into a base64-encoded GIF and put it in a div with a z-index low enough that it would not be seen by people but would still have to be downloaded by the browser!

I do all these things of course, because it was decided by the Secret League of Obnoxious Web Developers (SLOWD) that we should do the utmost in our power to make the web slower for everyone.

OR - it could be that I have in fact asked project management at sites repeatedly to focus on performance (and accessibility, another thing that always gets ignored) and been told that nobody cares about that stuff.

I guess it's up to you to determine where the fault lies.

I prefer fast, non-bloated websites, but your example does remind me of a tangential one: video game load times. People complain, but they really only complain if the content is not great. If the content is great, no complaints. Case in point: any game by Valve has atrocious load times, even and including their latest, Half Life: Alyx. And yet Valve's titles are among the highest rated games. And so load time is rarely prioritized, because it clearly doesn't matter. What appears to matter is content people want. (T_T)

I think the context of what you're serving matters. In this case the game content is good enough that the performance probably doesn't matter.

The worst example of a refused performance improvement I can think of was in relation to improving a help site. It had generally bad reviews from users (it is very difficult to get good reviews for a help site, because if a user is coming to your help site they are already in a bad mood),

but obviously if you are on a site that you are mad about being on, and it then takes a long time to load all the data so you can try to figure out your problem, you are going to be steaming.

Project manager wouldn't prioritize the three performance improvement tickets I made with lots of cogent description of why it needed to be done. Somehow though, this is my fault.

Looks like you’re saying that developers can’t discuss priorities with management.

In some companies this might be true.

But in many, when a CTO or a senior dev demonstrates business value of a speedy website, it would get attention.

I’m sure project management doesn’t care about keeping packages up to date either. But most devs can successfully communicate that this is necessary and that the alternative is way worse.

Yes, when I was CTO I did a lot to make our website speedy.

When I was consulting I would often do performance analysis of the sites I consulted at, show how performance improvements could be made, put links to relevant studies on performance improvements and the effects on the bottom line with nice quotes, to have the task of cleaning everything up be put in the backlog and forgotten forever.

> I’m sure project management doesn’t care about keeping packages up to date either.

How many days or weeks does it take you to update a package? It generally takes me a minute, sometimes problems happen and I need to spend some hours but those are infrequent. If a package update is going to take too long it becomes something for project management to be aware of and sometimes it is not allowed to update a package.

But generally issues with site performance need handling over a longer period of time than updating a package, I would think that was evident to anyone that has ever updated a package or done a performance analysis of a site. The comparison between something that generally takes a minute and something that takes weeks seems insincere.

But I can make an example where an update was needed that would take a significant amount of time, the reason that the update was accepted was that it would fix certain bugs with the old package and it needed to be done. Either the package was updated or the bugs were allowed to remain or we could fix the bugs with longer time than it took to update the package. I think it is obvious how this is a different argument than the site performance thing, the package update is an argument that this way we will fix the problems that you the business have pointed out, the site performance thing is an argument that this is a problem we want you the business to acknowledge.

I once worked for a place where the business unit got us to deploy one of those crappy client side A/B testing scripts which blocks rendering. Conversion rates started to dip and the same people came back to us complaining about it. I was able to pull together plenty of evidence to show them what the issue was. All they cared about was if we could A/B test it using the aforementioned client side script. Some people just can't be reasoned with, so glad I got out of that place.

I don't think anyone is under the impression that web developers can't make excuses; we have plenty of evidence of that.

I think discussions like this are capable of serving as valuable reference material when we are engaged with project managers on this topic.

I don't think there is a paucity of research that shows that higher performing sites get more users and are generally more valuable for the site owners. The problem is not lack of reference material, the problem is that nobody considers the problem that important (generally) when presented with that reference material.

Now the thread is quite big so I have not read everything but I have skimmed through it all, and have not seen anything that would be a really useful argument for getting someone to consider maybe we should try to improve site performance. A bunch of people complaining about Google, Wired, and a few other things that they complain about is not as impressive as just the hits you get if you search "effects of site performance on user retention" (replace user retention with other useful metrics to get sources for the argument being written up)

What if my audience is people in the US on desktop devices? Does it make sense for me to build for data capped networks which aren't a thing for 99% of US desktop devices? Does it make sense that I now have to spend time and money optimizing my company site for an audience that I don't care about so we don't get down-ranked?

100% understand. I am not suggesting that this is true for everyone, and I agree that this is a good goal for many. My point was that my site targets an audience (corporate employees on their work computers). These policies ask us to spend a lot of time and money building out a site for an audience we don't have.

" Web developers have utterly squandered all efficiency gains"

It depends on how you measure efficiency.

Edit: I should add what I would have thought would be obvious: the bloat is related to advertising, which drives their bottom line, and I'm doubtful of any existing material alternatives to wired.com's already problematic ability to exist. Ergo, the bandwidth is spent on things that developers perceive to be inefficient, but other members of the team do not.

Web-designers and web-developers design and build the things that the people with the money ask them to. It's that simple.

You should build what they need - though a lot of the problem is that those specifying websites have very little idea about how the web works and what is needed.

Unfortunately this isn't the '60s at Sterling Cooper, where you can brainstorm an idea like "go to work on an egg" or "a Mars a day helps you work, rest and play".

In my experience, even if you start by "building what they need".

When you present your V1 "they" come back and say something along the lines of "can't you make it look nicer/cooler? You know add some wow factor!"

What they mean is: "I just took a pile of money from the client for this and you didn't make it look expensive enough"

The people that 'ask them to' are bound by an inexorable set of rules that appear in the system, for the most part. That system is observable and a little bit predictable; it's just more economic than technical, so it's worthwhile for devs to take one step out of their zone sometimes to see how that math works. I don't really think anyone at Wired actually wants a slow site or tons of crap ads.

(Disclaimer: previously worked at Google search)

I think some commenters are attributing to Google an ulterior motive, whether ill- or good-intentioned, separate from its core business. But in this case no such motivation is necessary.

Basically, Google wants its users to be satisfied - otherwise it will lose to, say, Bing. So it measures user satisfaction - e.g., if a user clicks on a Google result and hits the back button within three seconds, it's a strong signal that the user was not satisfied. And Google tries very hard to increase this "user satisfaction" (and other similar metrics), because not only does it help Google's business, but it also improves the service itself.

And, guess what? When a page takes fifteen seconds to load, lots of people hit the back (or close) button. Showing such a page gives the user a bad experience. Unless there are literally no alternatives, it makes sense for Google to "penalize" such a page.

Of course no metric is perfect, so it will occasionally backfire and penalize a great page that takes thirty seconds to load. But that's life.

Good answer... Out of interest (I understand if you can't answer, NDA etc.): taking your example of the back button (I assume it relates to the "dwell time" we keep reading about), my question is, does this feedback mechanism also include closing the website? I seldom use the back button (on desktop); I just open and close tabs.

I wasn't in the team that gathered those events, so I don't remember the details (and it's probably covered by NDA, as you say), but I guess Google would be gathering these data if there was a reasonable(?) way for them to get it.

How is this helping Google not lose to Bing, when the change would equally improve the Bing experience (a rising tide lifts all boats)?

Not all websites are going to improve their performance, even with Google's incentive. If Google ranks fast websites better than slow ones, that gives Google an advantage over any competitor that doesn't, as long as slow websites still exist.

Google already has market dominance; they lose nothing if things rise equally for both themselves and Bing.

You can easily instrument this server-side: a cookie and the referrer are enough to track a single-hop journey.
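
A minimal sketch of that kind of server-side instrumentation, assuming a bare Node.js HTTP server; the cookie name, log shape, and port are made up for illustration:

    // Log enough per request to reconstruct a single-hop journey server-side.
    const http = require('http');
    const crypto = require('crypto');

    http.createServer((req, res) => {
      // Reuse an existing visitor cookie, or mint a new one (the name is hypothetical).
      const cookies = Object.fromEntries(
        (req.headers.cookie || '')
          .split(';')
          .map((c) => c.trim().split('='))
          .filter((pair) => pair[0])
      );
      const visitorId = cookies.visitor_id || crypto.randomUUID();

      // The Referer header says which page (internal or external) the visitor came from.
      console.log(JSON.stringify({
        time: new Date().toISOString(),
        visitor: visitorId,
        path: req.url,
        referrer: req.headers.referer || null,
      }));

      res.setHeader('Set-Cookie', `visitor_id=${visitorId}; Path=/; HttpOnly`);
      res.end('ok');
    }).listen(8080);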

There are lots of questionable ways in which Google owns the web (AMP, reCaptcha harassing users without Google cookies, Chrome's "Fire And Motion" web standards strategy), but this one isn't one of them.

In the webdev community it's well known that good performance is very important for user satisfaction, and that's backed up by research. There are no ideal metrics, and unfortunately every single one of them has some dumb edge cases. You could endlessly bikeshed which metrics could be better, but this particular set is not unreasonable.

It makes sense for search engine to pick websites that not only have the relevant content, but also are able to actually get that content on screen without frustrating users with slow-loading crap that makes browser freeze and stutter.

Keep in mind that your high-end desktop web experience is in the minority. The web is now mostly mobile, and mobile is mostly low-end Android. That's a shit sandwich, and it desperately needs an intervention.

> it's well known that good performance is very important for user satisfaction

Even putting that aside, I'm not sure how them changing from one rank scoring system to another on their own search engine is them playing "God" for the web. The whole goal of Search is to rank items per some metric of "quality", and speed seems as good and fair as any to me.

> The whole goal of Search is to rank items per some metric of "quality", and speed seems as good and fair as any to me.

No matter what I search for, I'm not going to be pleased to have the results listed in order of page load speed. The goal of search is to rank items according to a metric that matches what the searcher is looking for, not a metric that reflects the search engine's internal preferences.

Page-load speed would be one metric among hundreds. But we can't assume that Google has lost sight of the ultimate metric - did the searcher find what he was looking for?

For any competitive search engine, its internal preferences are expected to be a reflection of its users' preferences and not a whim as you seem to be implying

What's infuriating to me about these types of "signals" to the search rankings is that they have little to do with the content that I'm searching for. Google will hide results that I might find useful because the webmaster hasn't kept up with whatever Google decided was today's best practices. How about ranking based on the best source for what I'm looking for?

Google's ability to surface useful results has been thoroughly defeated by SEO spammers. To a lesser extent, the same is true of other search engines (Bing, etc.) though Google is the foremost SEO spam target for obvious reasons. Given that state of things, there is some sense in promoting more user-friendly pages that are thus a bit less likely to be SEO spam.

> there is some sense in promoting more user-friendly pages that are thus a bit less likely to be SEO spam

If there's a metric that will improve ranking, the SEO people are all over it, and have more incentive and resources to optimize for it than normal publishers.

I wonder if Google could combat this by, for every X searches, swapping page 1 with, say, page 5? Or by giving users the option to jump straight to a given search result page by default?

That way the SEO-ignorant sites that actually have the info you want, but get pushed out of the way due to SEO spam, will have some chances at traffic.

I have never written a search engine, so this comment is worth about 1 kb.

For some queries I instinctively jump a few pages ahead because I know the first few pages are going to be absolutely filled with SEO spam. The remainder is still not free of spam, but has a higher chance of containing what I want to find.

Nah, this is just unimaginative. The issue is that it is really hard to admit one was wrong and start from scratch. Rules should be simple enough for mortals to read. If they keep adding to the existing formula there will be web-lawyers required.

If only one website has what I'm looking for, then definitely give me that site. If multiple sites have what I want, then prioritizing by which sites will give it to me fastest sounds like it both helps me with this search and helps with future searches (since it puts incentives towards making sites less slow).

In general, prioritizing speed highly helps small independent sites over large bloated sites: they typically have less JS, fewer round trips on the critical path, etc. Make your site simple enough (ex: https://danluu.com/) and it will automatically be fast.

(Disclosure: I work for Google, speaking only for myself)

Sounds like a solution that fits your needs. The topics you deal with probably fit that idea perfectly.

I would make it more generic. Have the user provide his system specs and give them the option to filter out what is or isn't reasonable for them.

I use: 1) a decent desktop, 2) a phone with reasonable specs, 3) a laptop with shit poor specs

a) 400 mbit cable, b) free wifi (crazy slow), c) my ISP provides wifi hotspots that are reasonably fast, d) a prepaid wireless plan where 10 euro equals 1 GB

The shortcomings of each combination are pretty obvious. The laptop (to pick just one) can't reasonably open a Google search result; the duck works just fine, FB Messenger works too, and it can download and play HD videos. Most significant, but not all that obvious: it has a QWERTY keyboard with which I can write substantial amounts of text. If the search results were tailored for this, I could see myself using forums and blogs (with comment sections) over prepaid tethering. Its webcam is unsuitably poor.

Edit: paying 5 cents to view a clean page instead of 1 euro of bandwidth to freeze my client all of a sudden seems a fantastic deal.

That's why we should ditch Google search for alternatives. So far I've been using DDG and am quite happy. Though it needs improvements in its search results, they may be the ones to listen to our needs, unlike Google, which is too big to care.

DDG just gives you Bing's results, while avoiding giving Microsoft information on you. This will necessarily produce worse results, but you might decide that this is worth the tradeoff.

DDG also gives Google results. If it grows in adoption it might get better than the two, is what I think. If not, some other search engine is welcome - something better than Google and Bing.

Source? Take any query and it will mostly be a carbon copy of the Bing results. I haven't seen any Google results on DDG yet. An engine that did both would certainly be desirable.

I was under the impression that google results are also used. I guess I was wrong about it. Does anyone know why google results are not aggregated as well? Technical or legal reason?

Likely to be monetary reasons. DDG is basically (not entirely accurate but close enough) in the business of reselling Bing Search API and monetizing it through Bing ad network revenue (+some affiliate revenue through Amazon and Ebay).

If you also added Google results, your input costs would essentially double. Also, if so much of your marketing is based on bashing Google, it would be harder to justify such a move from a branding perspective.

Forgive my ignorance, I really didn't know all of that. I only meant my criticism of Google for the reasons I stated: they have grown so much that they don't care about their users, and their search quality has been deteriorating quite rapidly. In part it's not their fault, as an army of marketers found their way to play the game and push their products up in results, but it's not only that. Google has been acting more and more like a corporate monopoly; they're not what they were when they started, for sure.

> today's best practices

Speed has always been a good practice for as long as the web has existed. The specific metric changes, but the goal is the same. It's just that "simple" metrics are very easy to game.

An example of this is how they went from "First Paint" to "First Contentful Paint" to "Largest Contentful Paint". They're all trying to get at the same concept, which is when the page loads, but each iteration gets more precise and accurate. Realistically, as a webdev, if your site loads fast, it shouldn't matter which metric is used.
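
For anyone curious what those metrics look like from inside the page, here is a small sketch using the standard PerformanceObserver API; the entry types are the ones defined by the paint-timing and largest-contentful-paint specs, and the logging is purely illustrative:

    // First Paint and First Contentful Paint arrive as 'paint' entries.
    new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        console.log(entry.name, Math.round(entry.startTime), 'ms');
      }
    }).observe({ type: 'paint', buffered: true });

    // Largest Contentful Paint candidates can keep updating as bigger elements render.
    new PerformanceObserver((list) => {
      const entries = list.getEntries();
      const latest = entries[entries.length - 1];
      console.log('LCP candidate at', Math.round(latest.startTime), 'ms:', latest.element);
    }).observe({ type: 'largest-contentful-paint', buffered: true });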

> Realistically, as a webdev, if your site loads fast, it shouldn't matter which metric is used.

I like this! Very true!

So glad you were able to articulate what I've been feeling for years. Google promotes good websites which has little to do with good content. It's a (debatable) signal of website quality, which is only tangentially related to content quality.

Back in 1998 or so people weren't just enthusiastic about the Google search engine because its results were good but because the search page was simple and fast.

Compare that to AltaVista or Yahoo, whose pages were larded with all sorts of irrelevant links and ads around the search results: slow to load and hard to visually navigate.

I still think the sparse pages are the best.

I worked at one of these companies during this era and yours is an often forgotten fact. It was a very long time before Google search result quality was any good. Their initial popularity was due almost entirely to page load speed, which they included on every page to highlight it.

The second was being able to serve a large index with parallelized querying, which was relatively easy for a newcomer company with no user base to engineer, and much harder for existing search engines trying to protect a revenue stream. People often don't really remember how late Google was to this business and how much of a difference that page speed indicator made.

The same exact argument could be made for Chrome. Google was extremely late to the browser game and most of the initial switch was on how quick it was in comparison to other browsers at the time.

I was replying to the parent comment above mine, saying that Google came late to the browser game just as they did with the search game, and that if part of what initially made Google appealing was that it was simple and faster than other search engines, the same was true when Chrome was introduced. Google explicitly marketed Chrome's speed as a selling point. For example: https://www.youtube.com/watch?v=nCgQDjiotG0

HotBot and Copernic.us are, I think, what people used. Lycos too. But what started search engines down this wayward path was Overture, which combined with DoubleClick proved to be too attractive a revenue model, and we’ve gone downhill since.

> There is a very reasonable argument for essential services like search engines and news websites to conform to/adopt standards like AMP, but for the rest of The Open Web, ingenuity and risktaking should be encouraged, not discouraged, for the true good of all Peoplekind.

Hadn't really considered this - because minimalist page size is often such a given - but, for instance, many amateurs often don't know yet how to crush their pngs and such.

> https://bilge.world/open

Cool - thanks for this!

(As an aside, it's great to see a continuation of topics like this - which is commenting on last week's article from Parimal Satyal. It makes this place seem more like a forum.)

Actually, I'd be interested in an alternative web where ingenuity and risk-taking would be utterly forbidden. Just HTML, and a very basic subset at that. No Javascript at all, no CSS.

So much of the web would be better and more universally usable without "modern" cruft.

Who decides what is and isn't cruft? You?

Is data journalism cruft? Are web applications? Is Google Office cruft? What about the web application my parents have been using to order groceries during the pandemic - that has loads of JS, and loads of CSS to make it readable to anyone over 40. Does that qualify as cruft?

Are dyslexia-friendly stylesheets cruft? Is Google Maps cruft? It's full of JavaScript. Are browser games cruft? I played QuakeJS last night and had a lot of fun with it. I was also using a WebXR 3D app the other day to preview a rental property remotely - is being able to socially distance myself cruft?

It's all cruft until you ask the people who use it.

That it works is table stakes. A web application your parents wouldn't be able to use to order groceries because it was so broken wouldn't even be discussed, it would be pulled off the Internet and replaced with something that works.

The criticism about cruft is one level up. Not about how to accomplish something, but how to do it in a way that isn't extremely wasteful of both computer resources and end-user's time.

Most of that doesn't belong on the alternative web I have in mind. It can keep living on the current one.

Usability features like dyslexia friendly fonts, large fonts, etc., belong in the browser, not on a web page. If anything, this would be easier on the alternative web.

The key idea would be that when you go to a URL on this alternative web, you know you're not going to get slammed with some cycle-sucking, RAM-sucking, virus-carrying, UI horror. More gopher-ish, but relatable to those who have used a web browser.

I can dream.

edit: Ha. Right now, this comment has 6 points, and my original above has -2 points. My illusions of HN rationality are thus reduced. sniff

> Who decides what is and isn't cruft? You?

Browser vendors do. Popups were cruft. Flash was cruft. Not all the time. Just almost all the time.

Ever used Reader View? There's the great de-cruftifier! It doesn't work with web apps, but it sure works great on content. Perhaps some day browsers will default to Reader View, and the web will become more pleasant.

> Actually, I'd be interested in an alternative web where ingenuity and risk-taking would be utterly forbidden. Just HTML, and a very basic subset at that. No Javascript at all, no CSS.

Definitely. A lot of the "ingenuity and risk-taking" are efforts to make the web something it was never intended to be (e.g. a binary(ish) application runtime/delivery platform), which has lots of downsides.

I was thinking about this too, though not quite as severe. I would also love to see something like this in the end, though. It sucks that as a mobile user I'm paying for analytics, unoptimized images, and poorly written code when just simple markup and text would do. With the speed at which modern networks run, server-side processing shouldn't be as taboo.

One reason for a minimal web would be the idea that I could one-day be living on the far end of a rather slow modem as my only Internet connection. Sucking ten tons of Javascript/images/adtech down a link like that would be pretty awful.

Didn't think of it before, but your "server-side" suggestion could minimize some of the pain of that, I guess. Low-bandwidth VNC on the client to a browser that's actually running in a DC somewhere. Maybe a VNC add-on to block/freeze rapidly updating squares (videos, gifs, etc).

Not great, exactly, but would ameliorate some of this.

Alternative Web? The big "selling" point of Mosaic and other early Web browsers was bringing diverse information sources together into one application! Gopher, WWW, WAIS, FTP, and probably others, all available in one program, and a GUI program at that, unless you were stuck using Lynx over SLIP or some other bizarre excrescence of the dial-up age.

The fact Web browsers won rather disproves your whole notion.

I guess. The fact that no human can plausibly consider writing their own web browser from scratch these days seems telling. Nor really can any real human seriously consider finding a bug in a web browser, tracking it down in the source code, and submitting a patch.

Web browsers have become satanic mega-behemoths of inscrutable code. (I'm too lazy to look--are there more lines in Chrome, deps included, or the Linux kernel?) They are the utter antipode of the Unix philosophy, and arguably an engineering abomination.

It exists right here and now. Disable JavaScript - almost everything works. You can disable CSS and images as well - just a few clicks in uBlock Origin or uMatrix [0].

People with vision loss browse the web. No need for a separate web.

[0]

uBlock Origin (advanced user)

    * * 1p-script block
    * * 3p block
    * * 3p-frame block
    * * 3p-script block
    * * image block
    * * inline-script block

uMatrix

    * * * block
    * * frame block

Totally appreciate what you're saying - some of the time I'm just in a basic text mood as well. (And am a fan of, for instance, legiblenews.com.)

Most of the time I'm a JavaScript maximalist, though. Crufty, dank and deplorable.

I think most people haven’t internalized that Google is no longer a search engine but an answering engine.

A search engine tries to find all sorts of relevant information related to your query. The more the merrier (it’s searching after all) and then sorted in a way that puts the relevant results first. An answering engine, in the other hand, tries to minimize the number of results. In an ideal world, it would only return one thing, which tells you exactly what you want.

One example of this change is the fact that it’s no longer useful to go beyond the first page or so of your results, because anything down that low is irrelevant as an answer and is probably discarded by Google anyway, which wasn’t the case when it was a search engine.

I’m not saying this is a bad thing. In fact, I suspect the majority of time the majority of people want an answer, and not a multitude of results. But I think this is what leads to google search changing in a way that does not meet many people’s expectations here.

It means google emphasizes stuff that gets people answers quickly. They parse text and reveal the answers on their page itself. And they are not very useful for exploring anymore.

...and the answers it gives are often not even right (or wrong).

> In an ideal world, it would only return one thing, which tells you exactly what you want.

In Google's ideal world, it would also only return one thing, which tells you exactly what they want to tell you.

I think this is true of any “answering engine” in an ideal world. In an ideal world, it would act as a function and return a single output for any input, because any particular question should ideally only have a single correct answer.

How would it decide which source to get this answer from?

Because let's face it, one of the biggest challenges with that idea isn't that there are tons of different valid answers to a question (though that could be the case for many political queries or ones related to art/media quality), but that hundreds or thousands of pages and sites contain the requested info, and search engines have to rank them in some sort of order.

> I’m not saying this is a bad thing. In fact, I suspect the majority of time the majority of people want an answer, and not a multitude of results.

True! To give a concrete example... LONG FORM RECIPE PAGES - they should be on SERP 100+. It's gotten a little bit better, but I still sometimes see a 2000-word article when I google something like "apple pie recipe" or "apple pie oven temperature".

This meme is tired. We get it, Google makes money from advertising.

Your comment is the same as "NYTimes is an advertising company hurr durr because they make money from advertising".

No, I don't think you do get it. Originally, Google's experiment, the reason they came into existence, was to organize the world's knowledge and sell advertising to people who were seeking it, on the idea that you must be interested in something if you're searching for it. That's like the New York Times.

Now, Google seems to be trying to learn the details of all the people, and they're selling access to that. Their mission seems to have flipped.

Well that's a very rude comment. You are also using false equivalence to halt any discourse on the matter. Doesn't seem very safe.

Both, actually. They figure that if you can get all your answers from Google itself, they get all of the ad dollars and don't have to share. That's why they try to get the implied question answered with pages served from them if possible.

One anecdote where their “Largest Contentful Paint” metric fails, and fixing degrades performance:

We have a large 300kb animated gif that takes up maybe 20% of the viewport above the fold. The gif demonstrates visually what our service does.

A couple months ago Webmaster Tools reported that page as being “slow” pointing to the large image download. So we decided to show the first frame of the gif as a (30kb) png file, and then swap in the gif 2 seconds after the page is fully loaded.

Except now the new “largest contentful paint” metric is failing on those pages because it includes the 2 second delay when the animated gif is swapped in. I guess technically they’re not wrong in how they’re calculating it.

In fewer words, Google doesn’t like anything being lazy loaded if it’s above the fold.

The metrics and how they’re calculated are questionable. We ended up optimizing for Google and removed the lazy load (ignoring that we think it’s a better UX to lazy load that specific gif).
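
For illustration only, here is a minimal sketch of the kind of deferred swap described above - not the commenter's actual code; the element id, asset path and the 2-second delay are assumptions:

    // Show a small PNG poster first, then swap in the heavy GIF
    // a couple of seconds after the page has finished loading.
    const DEMO_GIF = "/assets/demo.gif"; // hypothetical path

    window.addEventListener("load", () => {
      setTimeout(() => {
        const img = document.getElementById("demo") as HTMLImageElement | null;
        if (img) {
          img.src = DEMO_GIF; // the browser only fetches the GIF now
        }
      }, 2000);
    });

The catch, as described above, is that Largest Contentful Paint keys off the final, larger image, so the deliberate delay gets counted against the page.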

> We have a large 300kb animated gif that takes up maybe 20% of the viewport above the fold. The gif demonstrates visually what our service does.

You might be able to turn a 300kB GIF into a much smaller encoded video; as long as it doesn't have audio, you can autoplay it.
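
A rough sketch of that suggestion (the element id and file paths are placeholders, and the video itself would have to be produced separately by re-encoding the GIF as MP4 or WebM):

    // Swap a heavy GIF <img> for a silent, autoplaying, looping <video>.
    // Browsers generally only allow autoplay when the video is muted.
    function swapGifForVideo(imgId: string, videoSrc: string): void {
      const img = document.getElementById(imgId) as HTMLImageElement | null;
      if (!img) return;

      const video = document.createElement("video");
      video.src = videoSrc;      // the GIF re-encoded as MP4/WebM
      video.muted = true;        // required for autoplay in most browsers
      video.autoplay = true;
      video.loop = true;
      video.playsInline = true;  // avoid forced fullscreen on iOS
      video.width = img.width;   // keep the same layout box so nothing shifts
      video.height = img.height;

      img.replaceWith(video);
    }

    swapGifForVideo("demo", "/assets/demo.webm"); // hypothetical usage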

Interesting case scenarios:

1. A site has a stupid big gif that shows the logo and staff parties, basically adding nothing but sucking up bandwidth (not your case).

2. Your case, where you actually add value-content with the gif.

Now the speed metrics are just that: speed metrics, which they "report" in isolation from "content".

So now my question is: are Google's OTHER content signals good enough to overcome any penalty that might have been applied because of the speed?

That sounds like a nightmare. Is there any other way to delay the loading of the gif that doesn't negatively affect your score? Like a JS callback or similar.

As an aside, the other replies to your comment are very telling of the HN audience, many of whom aren't knowledgeable in the webdev field but offer "they did it in the 90's", "use bleeding-edge tech" and "everything you're doing is wrong". Also IFRAMEs, hahaha.

I will not stay on a site with auto-playing video or GIFs. I can't focus on anything with it there.

Or you can scroll down. It's presumably a hero image space that serves to explain their product in the fastest way. 300kb for a gif isn't that big for such an inefficient format.

It takes me some effort to focus on most things. I'm not going to invest the energy to decide if the assault on my senses adds some value.

As a web developer who has recently spent an ungodly amount of time trying to make my pages meet Google's impossible standards for qualifying as "fast" on mobile, I sympathize with the author's point. But I think he's missing the even bigger picture. Personal computing is mobile now. And even though the phones have as many megabytes, kilotonnes, little clowns or whatever the device greatness is measured in these days, browsing the web on them is still slow as hell. And I would seriously entertain the suggestion that it's all Apple's evil plot if every Android phone I ever used didn't suck donkey balls for browsing the web. Whatever the reasons for this are, what's at stake now is not the web's diversity but its relevance altogether. I'd rather live in a world where the web is needlessly optimized for performance than in the world of apps.

It doesn't help that ~every mobile device is 6+ cores while the web still largely pretends that there's only a single CPU core & that that core is getting faster. The web should have started adapting back in 2006 when this trend really became common reality, but it didn't.

So you're stuck with 1 shitty CPU core, and you're stuck sharing it with the browser & JS runtime (yes there's of course multithreaded aspects to the browser itself, but you're still sharing an awful lot of CPU time on your single thread with the browser). 1/6th to 1/8th the performance of the phone is the most you can achieve if you're lucky. That's a fucking terrible starting point, and nobody should be surprised the mobile web sucks ass as a result.
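
To be fair, the one escape hatch the web does offer for those extra cores is Web Workers; a minimal sketch (the worker file name and the toy workload are made up):

    // main.ts - push heavy work off the single main thread, which is
    // shared with layout, painting and the rest of the JS runtime.
    const worker = new Worker("heavy-work.js"); // hypothetical worker script

    worker.onmessage = (event: MessageEvent<number>) => {
      console.log("result from worker:", event.data);
    };

    worker.postMessage({ upTo: 50_000_000 });

    // heavy-work.js would contain something like:
    //   self.onmessage = (event) => {
    //     let sum = 0;
    //     for (let i = 0; i < event.data.upTo; i++) sum += i;
    //     self.postMessage(sum);
    //   };

Even then, anything that touches the DOM has to come back to the main thread, which is part of why the complaint above still holds.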

It's funny that Google is so large, that one way to grow their business is to improve the user experience of the internet as a whole.

> Google is so large, that one way to grow their business is to improve the user experience of the internet as a whole.

(where user = advertiser)

User == user

Advertiser == 0.6 * customer

Their revenues are increasingly diversified; Google receives far more dollars from me and my company via non-ad revenues, more than $1000 / month now and growing (re: gcloud)

User == user, in the sense of a drug user. Hooked up, can't say no, will suffer through almost anything, and is a disposable commodity from the supplier's POV.

If this were true chrome would ship with ad block and they would accept cold hard cash for their services.

I tried this, I used it, no big adoption, product got killed.

Largely people prefer the free perception of the internet and wouldn't pay the prices it would cost if direct payments were made.

I believe it's in the $x00s per year

> Largely people prefer the free perception of the internet

You have any reasoning behind this, or are you taking the people who killed the product at face value? If I were to guess, that product (I have no idea what you're referring to) was designed to fail, like YouTube Premium.

Google Chrome already ships with an ad blocker (it is in the "site settings"). It is, however, very weak.

I think Google would love to block ads more aggressively, but the conflict of interest is so obvious it is going to be raining lawsuits the instant they do that.

A more aggressive ad blocker would block Google's primary source of revenue. So no, they have no interest in shipping an ad blocker akin to others.

It really makes me wonder what the internet would be like if google had built microtransactions into a browser to support their sites instead, or as a complement to ads (e.g. an advertiser can pay for your time on the site in exchange for an ad, and google would only middleman the ad negotiation, not the transaction).

I think the surreptitious tracking is the real problem. I don't mind ads on podcasts, because their tracking is explicit and opt-in ("enter our show's promo code on the sponsor's product page.")

Likewise Google Contributor did not reduce tracking, which is why I never signed up.

Exactly the way it looks today, because most people aren't willing to pay for content.

NY times has 3.5M digital subscribers. Netflix has 120M. There are 4.5 billion internet users.

There is no way that anybody would see /amp as improving the user experience of the internet as a whole, other than a "product management by KPI" dumbed-down organization.

AMP is a huge, huge improvement over the sites that it was intended to help: local news sites that are unusable, tab-crashing, malware-infested swamps. I don't know what their problem is, but they are like drug addicts and AMP is their rehab. Without AMP these sites are simply useless.

I see amp as improving the user experience of the internet as a whole. Amp sites aren't amazing, but they are usually better than the alternative.

> web.dev is operating on some irritating assumptions:

> 1. Smaller assets are ideal.

> 2. Minimalistic design is necessary.

This doesn't sound right to me. Aren't the three new page metrics mostly targeting what happens when the page initially loads?

https://web.dev/vitals/

> Largest Contentful Paint (LCP): measures loading performance. To provide a good user experience, LCP should occur within 2.5 seconds of when the page first starts loading.

> First Input Delay (FID): measures interactivity. To provide a good user experience, pages should have a FID of less than 100 milliseconds.

> Cumulative Layout Shift (CLS): measures visual stability. To provide a good user experience, pages should maintain a CLS of less than 0.1.

The first two are about initial loading. For the last one, you can avoid layout shift by e.g. reserving space for images that are yet to load fully.

For example, it sounds like your page could load 100MB of images with the most complex design ever, and it would get a good score as long as the initial viewport of the page displays quickly, is interactive quickly, and doesn't jump around as it's loading.
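
For what it's worth, all three metrics can be observed in the field with the standard PerformanceObserver API; a rough sketch (the thresholds are the ones quoted above, and real-world measurement has more edge cases than this - Google's web-vitals library handles them):

    // Largest Contentful Paint: the last candidate entry before user input.
    new PerformanceObserver((list) => {
      const entries = list.getEntries();
      const last = entries[entries.length - 1];
      console.log("LCP candidate (ms):", last.startTime); // target: < 2500
    }).observe({ type: "largest-contentful-paint", buffered: true });

    // Cumulative Layout Shift: sum of shift scores not caused by user input.
    let cls = 0;
    new PerformanceObserver((list) => {
      for (const entry of list.getEntries() as any[]) {
        if (!entry.hadRecentInput) cls += entry.value;
      }
      console.log("CLS so far:", cls); // target: < 0.1
    }).observe({ type: "layout-shift", buffered: true });

The layout-shift half is also why reserving space for late-loading images (explicit width/height attributes, or CSS aspect-ratio) keeps CLS low even on an image-heavy page.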

They sound like reasonable metrics to me in terms of fighting bloat, but with flexibility for web developers (as opposed to AMP). Who gets to decide the metrics is another issue though.

If not Google, someone else must step in and set some standards. Either way, I don't see anything wrong; their platform, their rules. Don't like their rules? Don't use them. When fewer people use their products, they will listen to what customers want.

Having said that, what exactly do customers want? They want the best experience on whatever device they're on. This is 2020, there's so much that has happened since the 1990s. We can't simply keep using standards from the 1990s.

> 22.82 Mbps will reliably download very complex web pages nearly instantaneously.

The author needs to come down from his high horse and use the internet in developing countries. I was in India the other day for a client meeting and I was on one of the largest networks there. I had a subscription to NYT and I tried to load an article and whoa, it took 3 full minutes for the browser to load the article to the point where it was even barely readable. I'm not saying the network in India is slow; I'm saying that even with the best networks, when you're travelling your speeds will be in kbps. If we don't have strict standards, the sizes of these pages will only grow and grow.

Later that day, I loaded the same article from my desktop again. The site made a gazillion requests in the network tab to so many advertising vendors, and each of them had consistently sizeable assets. More than being offended at being sold out despite a paid subscription, I was offended by how ridiculously unoptimized sites like NYT are, despite it being a popular, large-scale online publisher.

I'm happy sites like NYT will be penalized if they don't provide their users a good experience.

Not sure what this article is arguing.

Sometimes you want to make a slow website that doesn't fit well on a phone screen?

Leaving aside the fact that you can of course do that, and that if I'm using a search engine on my phone I probably (usually?) don't want to look at your slow site that I have to horizontally scroll...

> Modern web design principles are very rarely directed at regular people looking to make a website on something they are interested in. Instead, the focus is on creating websites that perform well:

> Don't use too many colours. Write short, catchy headlines. Don't let content be too long. Optimise for SEO. Produce video content, attention span is decreasing. Have an obvious call to action. Push your newsletter. Keep important information above the fold. Don't make users think. Follow conventions.

All that's true to some extent if you're making a product on the web and you have a few seconds to hook a customer before they move on. If you're making a website for enthusiasts in some niche, though, content is your draw and you can worry less about some of these things.

> Sometimes you want to make a slow website that doesn't fit well on a phone screen?

Sometimes you should have the choice of making a slow web site that doesn't fit on the phone screen. A large corporation shouldn't dictate how you present your information.

Sometimes you want to put a document on the web without having to pay someone to run on Google's treadmill of changing standards and policies for the rest of your life.

I'll take a crappy-looking web site with good content over an SEO-pumped disaster that provided no information.

Craigslist, eBay, and dozens of others didn't get to be huge because of their good looks.

Sometimes you can choose to make a website whose sole purpose isn't to attract Google Search traffic, and that's OK! Not every website has to rank in Google Search to be useful.

but that isn't really true. google gets ~80% of the searches on the web. that means that someone who isn't conforming to google's frankly arbitrary ranking system is, to use strong language, being censored.

> but that isn't really true. google gets ~80% of the searches on the web.

Not every destination started as a web search.

Case in point, the list of links on this very site that we're all commenting on. This discussion didn't start with a google search. Google was never involved at all. Not everything on the web goes through google search. Nearly all searches go through Google, but there's a hell of a lot more to the web than searches.

As also evidenced by some of the largest & most visited sites driving a ton of non-search traffic - like reddit. And facebook. And twitter. And etc...

> google's frankly arbitrary ranking system

Oh, it's anything but arbitrary. It's totally strategic, which is why people pushing back against them being able to dictate web policies is exactly the right thing to do.

I now see people link these "amp" web links, where the domain you are linking to isn't the domain you are trying to reach.

That's one step away from somebody seeing

    paypal.ampstuff.com

and thinking it's ok and then getting ripped off.

No.

How does one do that when browsers have the search and address bars merged?

You content yourself with the users that type https:// before your site name and never make a typo?

I think we’re at the point with the internet where modern design is inversely proportional to good content. I find more stuff on half-forgotten .edu sites, old.reddit threads, and weird forums than anything “responsive” nowadays. Most of the nice looking sites are just regurgitating stuff from the older, rougher ones anyway.

The author seems to argue:

- Small size is not always better, especially when the wait may be worth it for the audience

- Minimalistic design is not always necessary, and in fact it's a good idea to take some risks in the other direction, if it could result in a desired outcome

- Google doesn't have an intrinsic right to dictate best practices. In fact, the idea of a single set of best practices runs counter to the spirit of the web, in which a website can be a creative or cultural experience which transcends convention.

- In addition, on that last item: The web _can be_ more than what it is now. So monolithic standards could easily get in the way.

- Google is going to hold an event and tell visitors how to conduct "modern web development." The author is not comfortable with Google's singular focus on Google's preferred standard and mode of web development being The Only Way, so they are going to attend and push back.

It's a good post. Even the headline by itself resonates in a lot of different ways, especially if you've spent a lot of time on the receiving end of Google's dev messaging.

Speaking more broadly, quite often the blanket design advice given to web enthusiasts and subjective-values-driven creatives comes from the economic side of the web. It starts from "don't annoy your users" and draws a (suspiciously) straight line from there directly to "Largest Contentful Paint". This represents a potentially huge, complicating energy drain for a project that may have a lot of other important design parameters to start with.

In another way, it changes the question. The web wasn't designed as a way to get your customers really specific information as quickly as possible. There's so much more to it than that.

A lot of HN users are researchers from the economics-first side of things though, not artists, so this may be a bit hard to understand. It's like hearing yet another friend tell you they've signed on for a liberal arts degree.

Such researcher-consumers (who incidentally write some amazing online reviews, but that's another story) are also well known for getting frustrated at the slightest delay or sales pitch. Art on the web can end up, therefore, being perceived as a broken web experience, a bad sales pitch for an ephemeral and vapid product, as opposed to something which inspires or changes viewpoints. "I can't even find the site map!"

The web doesn't even have to be users-first--quite often what's best is a series of standards and compromises that alternately put creators and users first at different points.

They’re arguing that the whole idea behind the web is that different people can try different things with different priorities and constraints.

The priorities shouldn’t be dictated by a single company.

I don’t fault Google for what they are doing (besides the AMP stuff). The problem the author is talking about is largely because google is the de facto way pretty much the entire world finds new stuff.

If website discovery was driven by blogs and the like, and people then added the stuff they liked to their RSS readers, then the author's issue would be resolved.

Unfortunately that world doesn’t exist anymore and Google’s preferences end up dictating what all websites prioritize.

Anyone who can force web developers to make more responsive, small, junk-free sites that focus on the user rather than ads will have my support. I don't see anyone else making the attempt to force a change. The author is mistaken in their views.

I thought this was going to be a reasonable article, but it's just another whine from a designer who wants to add another 6mb of pretty animation to a web page.

1 and 2 are totally wrong, and on 3, Google is moving away from AMP and letting normal pages rank.

I have wasted too many hours of my time on conference calls with people like this.

I keep telling people that search shouldn't be a monopoly but they just keep looking at me like I'm a crazy person

Google Search is in a horrendous state right now. Search results have been getting worse each year, with interesting information being buried 20 pages down.

I really hope they have plans to improve this or find an approach that works as a middle ground for generic SEO content.

I don't know if it's a combination of overreliance on algorithmic rankings + seo or what, but if I search for something like "Panasonic GH5" (something I searched recently), I want to see the top two ranked items as the Panasonic product page, followed by the Wikipedia page.

When I run that search on Google in Canada, there is a Panasonic link on the first page (third link), but it's for the US store listing, which has less info on it than the normal product page.

But as you say, search is in a horrendous state. So is video discovery on Youtube. It seems like my front page recommendations on Youtube have turned to garbage in the last 12 months.

Maybe the idea is to encourage Panasonic to pay Google to get their result on the first page.

Maybe ten or fifteen years in the future, paying to Google (or Facebook, or...) will be the only way to have your web page delivered to anyone.

I mean, why should Google display your page to anyone for free, if someone else is willing to pay to have their page displayed instead. It's not like people are going to switch en masse to something else, just because Google results become slightly less relevant. And when paying becomes the new normal (e.g. it will become normal to pay to have Hacker News on the first page for "Hacker News"), the results will return to be more relevant again.

For certain niches, this is already happening. Pretty much every web hosting company nowadays run ad campaigns for their own brand, because other companies kept advertising on those keywords, and the only way to put their own web site up top is to outbid the competitor.

I agree that Google search is horrendous but Youtube is actually manageable. If you want decent recommendations you need to curate your viewing history. If you watch some random video and find a lot of annoying related videos showing up in your recommendations then go into your history and delete that one video from the history. Now Youtube will "forget" you watched that video and stop recommending things related to it.

I use Youtube in a very specific way: I subscribe to a bunch of channels that I like and only watch what those people post. My recommendations tend to be very focused around things related to my interests. Occasionally I will look up a random video for whatever reason and I'm always careful to delete it from my history unless I actually want more recommendations related to it.

I agree and will go further. I love YouTube, the content and the concept. I watch more YouTube than any other streaming service (except for Space-Force on NetFlix - damn it's good !). Sure, not every video that is recommended will be what you like.. but it's quite a hard problem I think to recommend something in EVERY GRID POSITION that you will like.. Usually I'm willing to watch at least 2/5 to 4/5 videos being recommended AND I have "discovered" most of my fav channels I still follow to this day via their algorithms. But maybe it's just me :)

About a year ago, my Youtube front page feed was pretty good. Then it has slowly degraded since then. I went from Youtube being the service I used the most to watch stuff to becoming much less interested. Most of my recommendations don't even come from my subscriptions.

I like my recommendations not coming from my subscriptions. I check my subscriptions for new stuff often, so I want the recommendations to be other stuff.

But you're right, they outright suck. They are definitely dynamic, because I only have to show my son a single Roblox video to change them, but at the same time there is stuff that repeatedly shows up that I have no interest in. Even stuff that I've told YouTube I'm in no way interested in...

What I REALLY miss is a random category - for stuff that I would never otherwise see.

Youtube 12 months ago was less bad without me having to do anything.

Youtube can make me do work to improve my experience, but I do have another option: stop using Youtube's front page and just discover stuff via Reddit or search engine results.

I had a similar experience with Google News trickling up garbage news sources, and I spent way too much time blacklisting sources, and have basically given up on it.

They are relying too much on AI and "collective wisdom". The results you get now are what Google thinks you want based on what others want, not on what you are actually searching for. What you typed is now treated as a mere clue, not as a search key.

Depends; maybe Panasonic's enterprise SEO game is weak - you should rank your product pages on page one

From the perspective of a search engine's end user, Google should find those product pages and set them at the top of the results whether Panasonic's SEO is good or not, otherwise I'm going to switch to a search engine that can figure that stuff out.

The onus is on the search engine to produce good results, because they're the one providing the service. Hell, that's why I started using Google around 2000, they had the best first page search results.

> Google Search is in a horrendous state right now. Search results have been getting worse each year, with interesting information being buried 20 pages down.

I ran into this over the weekend.

I had a question about something I took a photo of in a small village, and I knew a priest I met there would have the answer.

I went to Google to find the parish web site, and it's simply not in Google. Not at all. Lots of sorta-kinda matches in places thousands of miles away, even when I specified the very unique village name, state, and ZIP Code.

So I dug through the pile of dross that comes home with me after I travel and found an old parish bulletin that had the web address printed on it. It turns out the church has a web site — a pretty nice one — but Google doesn't know about it. Or it's so heavily down-rated by Google that it didn't come up in the first seven or eight pages of results, no matter how I formed the search query.

So I went to the parish's web site to see if he has an e-mail address. The web site indicated that he is no longer the priest there. So I put his name into Google to see where he'd been moved. (I've been told by other priests that unless you're tied to a school or other institution, priests in America are shuffled around every four to six years.)

His name is fairly uncommon, so I expected Google to show me links to church bulletins or announcements about arrival or going away parties, or even church employee rosters, of which there are thousands on the web. But instead, it was page after page after page of SEO spam for "Complete phone number for $priest_name!" "Is $priest_name cheating on you? Find out!" "$priest_name in your city want to meet you!" The few I clicked through just in case had none of the promised information.

I gave up on Google, and phoned the parish office when it opened this morning. He retired six months ago. Here's his phone number.

Time wasted with the company that promised to take all of the world's information and make it available to everyone: 20-40 minutes.

Time spent getting the information the old fashioned way: 2 minutes.

Thanks for nothing, Google.

You are getting that because other people do not care about what you are searching for :( their AI is maximizing the chances of quickly finding what most people want, not what you want.

Most people want offers to sell fictional criminal records and address records that don't match what I searched for? I don't think the AI understands what the word "most" means.

unfortunately, those websites are probably more popular than the church's website in question (or at least, receive more clicks from google search)

That reminds me... It used to be fun to search about oneself, all kinds of weird references used appear.

Now there is just spam.

> Google Search is in a horrendous state right now.

I bet they simply optimized for the average person on the web, and we're outliers.

I have to append 'reddit' to a lot of queries now to find relevant information it seems. I was researching standing desks recently and was having a hard time getting through the cruft with regular google searches.

I would agree search seems to have gotten worse.

I wonder how much search abuse / manipulation they are dealing with given the current situation and upcoming elections?

This seems like a common claim here on HN. I would expect the creativity of the world's SEO hackers to outstrip any single team, so I expect you're right. But I struggle to find queries where interesting information isn't in the top handful of links.

[badgers in montana], [while loop python], [how to fix noisy refrigerator], [nude stallman] all do just fine. Are there veins of queries that are particularly polluted? Even [homeopathy covid] is pretty good.

> SEO hackers to outstrip any single team

SEO hackers are taking advantage of Google's biggest weakness: they are an advertising company that makes money on search ads.

If search were run by a nonprofit like Wikipedia or the Internet Archive, it could be made to filter out sites with ads on them. SEO hackers would have no way to get their foot in the door unless they drop all of their ads, removing their own revenue source. This would create a space on the internet for non-commercial activity to flourish without ads and all of the tracking garbage that comes with it. A form of cultural ad blocking that, in the long run, could be a lot more effective than the technical arms race of client-side ad blockers.

Searches on highly-specific/technical terms often fail for me. Also, for programming, I often get spammy sites instead of more "official" sources. For your example, the while loop in python, I would expect the official python resource to be first. I find this even worse in other languages.

I would prefer not to see W3Schools, Tutorialpoint or GeeksForGeeks to be so highly ranked. This problem is even worse when I search for ML stuff, with pages upon pages of Medium blogspam.

Recipes are a good example. Not only am I confident that Google isn't providing the most useful or authoritative results, it's easy to see that they have forced authors to morph their content into a terrible cookie cutter format. Scrolljacking, obnoxious ads, and pages upon pages of introductory text are the norm. At this point I often use image search and just guess at how they made it rather than try to slog through the results.

The search AI probably opted out because of your preferences... I often search error messages, but sometimes I don't find anything and I think to myself: - Am I the only person in the world who got this error!? Or I finally find a forum post and the answer is "just google it"

I tend to think search results could benefit from a little bit of curation.

For example, when I search anything HTML, CSS or Javascript related, W3Schools manages to be the top link in many of the cases.

While W3Schools is fine, I guess... I have to ask if they truly represent the best result for my search query.

Did you actually search “nude stallman”? I can’t bring myself to do it for obvious reasons but I am curious what you saw.

The worst recesses of hell, lurking in the back of your mind, suddenly brought to the forefront. The sort of thing you sleep to get away from.

Google Search is an index; it can make whatever decisions it likes. If we don't like it, we can use other indices, or even better, we can build more alternatives.

The web is an open network. Anyone can share content as well as indices. I know it's mostly impossible to beat google, but niche indices have their place to shine.

For example, here, HN is an index of hand-picked (mostly great) content, and there are multiple "unofficial" HN variants. See? The web is very diverse and free.

>5. Rally a group of developers and PR to bash Webkit as the only one not implementing those flawed API and name it as the new IE.

Google has largely made the internet a safer and more efficient system by pushing standards through their market dominance.

Is this a bad thing?

My biggest problem with Google is that in the 2010s they began flagging sites that had too many links as "link farms" - this discouraged bloggers from posting blogrolls and killed many directories. This really damaged the Open Web that the author is talking about. You saw a widespread abandonment of blogs during 2011-2015. (People stopped seeing traffic to their blogs and just assumed blogs had died due to lack of interest.)

In a way, yeah, they made the Web safer. And cleared it out almost completely in the process.

The disconnect between the expectations of impact and the real impact of big tech experiments on the populace is perpetually surprising.

What about the people who abuse Google search to promote bad things?

Should Google not adjust their algorithms to demote bad or harmful content? There are two sides of the coin.

Is it really much different from the security cat and mouse game?

It is a private platform, correct?

Would you prefer no search or what we had before Google?

What would search look like if they did nothing? What would the world look like over the last 20 years?

You can't really say "before Google" like they've been the same for twenty years. Google's search used to be really good. They would actually use what you typed in the search box to perform the search.

Then Google started adding more portal-like features and using NLP to perform searches on what they wished you had entered in the search field. They also started tailoring search results to your past search history and whatever profile they constructed about you.

Some people might want a natural language question answerer, others might want a search of only current events, but some definitely just want to search for content.

Google has catered to the former at the expense of the latter. They try to stuff all their different modes into a single text box and display them on a single results page. There's no segmentation to clearly different functional modes.

For me I'd love to see Google go back to actual web searches. Support Boolean operators and term quoting. Return me only search results without offering me shit they think I want to see.

I abandoned Google as my main search engine in favor of DuckDuckGo years ago. While not perfect they at least show me actual search results. If I want other modes of search like video or news I can click those tabs but they don't force that content on me.

Arguably other companies would have taken the burden, and it could have been for the better, who knows.

We might not see any obvious candidates for that, as Google was dominant and they had no market access.

Imagine a timeline where Palm ruled the phone market, and we’d be here asking rhetoric questions about what it would have been if Palm didn’t do it.

I think Google is dipping below the standard of AltaVista (adjusted for amount of content). If it was out there you could find it, but it might take clever search strings. But it was a tool that worked for the user, and the priority was helping people find what they want. Vs helping people find what advertisers want now.

If by "made the Internet a safer and more efficient system" you mean "a non-stop barrage of Pinterest results", sure.

There is always the trade-off between higher quality through what is essentially regulation and standards and squeezing out those who don't comply (which disadvantages new entrants).

With drugs, we weight that trade-off heavily towards higher quality over new entrants. The question is, is that an acceptable trade off on the web?

Reading this, I now think I should go put AMP on my personal blog. Before that, I needed an HTTPS cert. Before that, I needed to confirm I owned the site to Google.

All things which may make the web better, but it certainly increases the operational burden and may discourage others from bothering in the first place.

Push a static site to netlify for free and all of this is taken care of for you.

The burden is necessary given the proliferation of bad actors

I would not recommend AMP, but there are dozens of standards that are not part of what the public sees, low-level enough that they don't care to see them. These are the important things Google has caused the world to adopt as best practices. You don't have to have HTTPS, but then you are telling users you don't care about their security. This was largely pushed by green/red indicators in the URL bar.

> Push a static site to netlify for free and all of this is taken care of for you. The burden is necessary given the proliferation of bad actors

This heavily pushes the web toward centralization, which I see as a really bad thing, in many ways the cure is worse than the disease. It's happening in so many ways (have you tried standing up your own email server?) and this sort of thing drives it the hardest.

I don't think Google is all bad, in fact I appreciate a lot of what they have done for the world (you mention some of these things). But as with nearly everything in life, it's complicated. I worry a lot that we're headed towards an internet where the individual is at the mercy of organizations to allow them to have a voice.

> The burden is necessary given the proliferation of bad actors

But the bad actors don't really abuse technical standards. They abuse links - the only part of Google's list of criteria that has remained largely untouched over the last two decades and is still the dominant ranking factor. So much so that I'm confident that, if ranking factor relevance were a search result, the first page would just be the same result over and over: Links.

That's the reason for the centralization of power: those with more links will be ranked at the top, will have more visibility and, you guessed it, get more links. Guess what happens: they then start to either sell links or rent out subdomains and folders on their websites.

Google, as a company, does not care about bad actors. Some teams in Google do, but they are not the ones setting the policies.

But google isn't pushing, it's baiting. And what you get is that all websites look and act the same. A lot of creativity is falling through the cracks. And it's not just google. Why did we need og: tags and twitter:cards when meta tags were already there?

My answer is zero, but most people's answer is much larger. Having said that, how many Google services and systems are there for us?

None. And I would not trust a service today that depends on any Google service either.

I'll visit YouTube for free entertainment sometimes, but I hardly depend on it for anything, and I really wish more creators posted their content elsewhere, even at a price.

Nothing that they actually made themselves; after I (I'm not OP) stopped using Google search, it's just YouTube now.

What has Google actually created that's successful, in house, other than search & gmail?

nah buddy, google is a leech in a way the Romans weren't. Not even a comparison; google is not taxing you, it's stealing your content and your users

mhh___ types this while using chrome (open source) on android (open source), after reading their email in gmail and searching for the latest cat dancing videos on youtube. The sad reality

most of the folks who claim never to use google, or that google never makes anything available - almost always hot air.

I do remember hotmail and MSN and AOL which preceded gmail, gmail was really pretty great when it came out.

I think microsoft may oddly be coming back a bit after the Ballmer years. I'm not a huge fan of facebook taking over (yuch).

Chrome isn't open source, and a large part of android isn't either (google play services). Gmail is quickly becoming a walled garden itself (anyone who runs their own email service can tell you that). And YouTube is a hot mess.

You’re only considering the (debatable) positives (chrome sucks ass, folks). If you can’t engage in a good faith discussion why bother posting?

You must not remember the web before google and google funded firefox!! This whole "google has never done anything" meme just seems ridiculous on its face.

Google took on and basically CRUSHED some major market players and powers (Microsoft basically lost the entire web and handset markets from a monopoly position) as a result of google service offerings.

Check out how much money they pumped into firefox as well - that used to be a major part of their push vis a vis IE.

Regardless of how much google "sucks" - they seem to have at least some things that folks like.

Right, well given that they HAVE obviously done a lot of things, I would interpret them to mean “they haven’t done anything I’ve liked or wanted or expected”, which is certainly fine to express. I don’t see why this justifies brushing away google’s decline with “but their products have engagement”. It’s like arguing because people buy an iphone, the iphone is the smartphone people want—in reality it’s just necessary and mildly better in some subjective way than the competition.

Again, I think you don't realize how substantially better the iphone was compared to the competition (which then has worked to copy the iphone).

Interesting - I'd never heard of COSS.

I'd thought they were doomed (ie, no linux cloud offering expected, no multi-platform or open source offerings). Windows Mobile did crash and burn, but Azure actually has linux containers (most common type I think) and their other stuff is sometimes open source / linux friendly now - (SQL Server??) it's whiplash for those of us who were around for the EEE period.



"Google Is Not God of the Web"

26-07-2021 12:57


I agree with the general thrust of the article, but not a lot of the details. And then there's this:

> Our phones have as much RAM as my “studio” work desktop

This is unlikely to be true. From what I can find, the latest iPhone has 4GB of RAM, and Samsung is up to 8GB (and they are growing this stat fast to be sure), but no "studio" desktop made in the last five years is going to have that little.

> 22.82 Mbps will reliably download very complex web pages nearly instantaneously.

This is definitely not true. It is true that the download time is not large, but between DNS and TLS latency and the fact that most "complex" web pages are built of assets from dozens of different servers, your actual wait time for the assets can be quite long. But even if you discount that, the render time is probably longer than the download time. If your page is that complex, I hope it's very beautiful to look at.

And we are allowing it. From the dangerous tactic of allowing them to MITM all users via the AMP platform, to pushing their features only in their browser. A browser, which I may remind, was pushed to success by abusing their market position.

Had Google pulled this in the 90's they would have been attacked like Microsoft.

I see a lot of hate for SEO - sure, it can be used for crappy things, but so can a lot of business functions. I see a lot of complaints about ppl saying "oh, but this stupid page outranks company abc or the page they would expect." I am just thinking out loud... if we are willing to employ "specialist" lawyers, accountants, programmers etc. in our business, why do we balk at employing an SEO specialist as a specialist business function?

There are many good SEO companies (true, you have to do your homework, but that is true for other service providers as well).

The dream is of course to not have to use them and just rely on Google and its good nature and/or algorithms... But is that not like saying... oh, we will just rely on never getting sued and therefore never need a lawyer, because crimes are bad?

The web is optimized for Google Chrome, not the inverse. It's not surprising. Google is an advertising company with >90% search market share. There is little competition in the economy, only product differentiation.

Didn't expect something as silly as this, especially the assertion that page sizes don't matter and that, since internet speeds are going up, web people are free to use them.

I don't see internet costs going down per MiB. Not everyone is on an unmetered connection. And users expect the large amount of bandwidth to be utilized for Netflix or something like that.

And how do you say Google is being evil for doing this? Even if there was no search monopoly and imagine there were two competing search engines. I think both of them would factor page load times in search results for better user experience.

> The entire point of the web was to democratize and simplify publishing [...] But the iPhone's [...] shitbox performance means we're all sort of ready to throw that progress away.

A hilariously ignorant statement. Simplified and democratized publishing doesn't require one smidge of javascript, not one pointless tracker cookie, not a single facebook pixel, no hideous autoplaying videos, not even a jot of CSS. It needs nothing but HTML and text, rendered plenty fast by any computer made this side of 1995.

It would be nice if you could specify preferences. For instance, if you hate tracking and ads, you could tell google that and it would down-weight those results.

I thought I'd be in favor of whatever this article was arguing, but then it went in a weird direction by arguing against certain things that are just considered good practice, but are now bad because Google is saying them. Like, yes, I would consider it a good thing to have lighter websites. Unless your site is catering to some niche that needs to offer 1-to-1 perfect image quality, there's no reason not to compress your assets.

Casual users don't care about details in user experience; when you innovate on the casual experience, they will consider using a new search engine.

I recently began having a lot of problems with Chinese "searchbot" traffic for a website. Filtering it out, my load went down by two orders of magnitude. It made me wonder how much of the purpose of this sort of thing is SEO. And how much slower Google is making the web for everyone by ranking on it, and therefore encouraging the shenanigans.

Every website is different, but if a single bot is bringing your website to a crawl, you should probably start caching a few things and working on performance.
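
As a purely illustrative sketch (the framework and bot names are placeholders, and well-behaved crawlers are better handled with robots.txt plus caching), a crude user-agent filter of the kind described above might look like:

    import express from "express";

    const app = express();

    // Crude first line of defence against a crawler hammering the site.
    // Many bots lie about their user agent, so rate limiting and caching
    // are still needed behind this.
    const BLOCKED_AGENTS = [/SomeSearchBot/i, /AnotherAggressiveBot/i]; // hypothetical names

    app.use((req, res, next) => {
      const ua = req.get("user-agent") ?? "";
      if (BLOCKED_AGENTS.some((pattern) => pattern.test(ua))) {
        res.status(403).send("Crawling not permitted");
        return;
      }
      next();
    });

    app.listen(8080);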

I wish there was a way to index the web which was not so susceptible to marketers gaming the ranking system; it is clear Google has lost the battle there. Moreover, I don’t believe it is in Google’s best interest to surface the best results for me, but those that will generate them the most revenue.

Many inlinks are good for search ranking - people create fake sites and spam. Domain is important for search ranking - people pick a domain after the search keyword. Faster pages rank higher - people make fast websites. Quality content ranks higher - like that's ever going to happen.

God, how? Like, in the monotheistic sense or the polytheistic sense? If we're going poly, then Google is definitely one of the Gods of the web. Maybe even the top God, but there may be some competition there.

Google is the God of Control of the web. It thinks it knows what the people want, but it's just using its influence to make it look that way.

The only difference is that Google isn't even aware that's what it's doing.

My favorite Google war story is implementing a web application using Google's Polymer, only to watch their Googlebot choke on crawling the site. It took about a year for them to get their bot working.

The article is an unexpected take on what I would have assumed to be "the basics of web development" that would not be argued against. I would like to touch on some of the points here, as the arguments of the article were not very clear to me:

> The simple assumption that it is always better to have the smallest page possible – that images should be resized and compressed to hell and typography/other elements should be few in number.

I strongly agree with this statement overall, and the article doesn't seem to provide any counter-arguments against it. We need to serve the smallest page possible because the larger the page, the more resources it consumes; it is that simple. Every _unnecessary_ byte added to a page literally translates to more storage for the page, more processing power to prepare the page, more data being sent on the wire, and more data for the client side to interpret, and all of these add up to more energy being used to consume that page and more resources being wasted. If I can remove one byte from a page, this is for sure a win for everyone; one byte is one byte. Whether that saving is relevant considering the scale and the effort is a whole other discussion, and the claim was never "send the smallest page at all costs". Considering the time and effort, if there is a way to send a smaller page, then do it; it is no different than turning off the lamp in an empty room, just on a different scale.

> Instantaneous page loads should be priority over any other standards of measure for a web page – like interesting design, for instance.

I have never seen such a claim anywhere before; it needs citation. As a developer, I think the look and feel of the pages are as important as performance or efficiency, and the web on its own can be used as an art platform, which would make this whole point irrelevant. Again, the overall point is this: if you can offer the same thing with smaller pages with a reasonable amount of effort, do it.

> Minimalistic design is necessary.

I have never seen such a claim; it needs citation. As a user, I prefer cleaner design over fancier things, but this is neither a "rule" nor the industry standard. There is various research being done on this topic and I am no expert on it, but a joint study [1] by the University of Basel and Google/YouTube User Experience Research shows that users perceive cleaner designs as more beautiful, making the point that if user perception is a goal for the given webpage, then keeping things simple might actually make a difference there. Again, it depends on the use-case.

> 22.82 Mbps will reliably download very complex web pages nearly instantaneously.

This is a pain I am living with every day. I have a 4-year-old mobile phone with 6GB of RAM, and it takes at least 8-10 seconds for Medium to be usable over a ~100Mbps connection, combining fetching the page and rendering / interpreting it. This is exactly the point I was making above: if the page were smaller, it would have actually made a difference of seconds. The same device over the same connection at the same time opens bettermotherfuckingwebsite.com in under a second, so there is something to be seen there.

In addition to that, even if I had a 1 Gbps connection, a 1-byte waste is a waste, irrelevant of my connection speed. I am not talking about the effort of saving that byte, but it is important to acknowledge the waste.
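
As a rough sketch of how those bytes can actually be tallied on a loaded page, using the standard Resource Timing API (transferSize is the compressed over-the-wire size, and it reports zero for cached or opaque cross-origin resources):

    // Tally what the current page actually transferred, grouped by resource type.
    const totals = new Map<string, number>();

    const resources = performance.getEntriesByType("resource") as PerformanceResourceTiming[];
    for (const entry of resources) {
      const kind = entry.initiatorType || "other"; // img, script, css, fetch, ...
      totals.set(kind, (totals.get(kind) ?? 0) + entry.transferSize);
    }

    for (const [kind, bytes] of totals) {
      console.log(`${kind}: ${(bytes / 1024).toFixed(1)} kB`);
    }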

> Google has the right to dictate “Best Practices.”

This is a point that I agree with more than the other ones, but it seems to be a separate topic to me. The previous arguments were against the practices themselves, and this one is against the entity that is supplying those practices. Even though I agree with the majority of the points there, it would have been a more informative read if the claims and frustrations were stated with better point-by-point explanations and data to back them up. Google having huge power and a monopoly to push people to certain standards is a big problem, but it is not clear in the article whether the author is arguing against the practices or against Google itself.

Overall, I believe it would have been a more useful article if the points and claims against the given practices were backed by better alternatives and data. We all accept that more data means more processing power and more energy, therefore trying to minimize it is an important goal; if the author thinks it should not be, then I would be more interested in the answer to "why?" rather than a rant against long-standing practices.

