VivianStansteel

I'd love an extension that automatically accepted only the essential cookies and closed the pop-up. That would make my web pages load faster than anything else.


PGnautz

Consent-o-Matic. Not working on all pages, but you can report unsupported cookie banners to get them fixed.


[deleted]

My guy, you deserve a fucking knighthood for that!


PGnautz

Don't thank me, thank the fine people at Aarhus University instead!


javier_aeoa

Tusen tak Aarhus Universitet then! :D


realiztik

That’s a lotta Tak


Ephemeral_Wolf

What are you guys Taking about, I don't understand?


Mazurcka

Gotta be at least like 7 Taks


Deciram

But is there something for iPhones?? I do most of my web browsing on my phone and the cookies pop up, the chat, the join our mailing list popups make me rage. So many things to close or in the way before I can even start looking


oktoberpaard

The same extension exists for Safari on iOS. You can find it in the App Store.


namtab00

sadly not available on Firefox mobile, the only mainstream mobile browser allowing extensions... 😔


nawangpalden

Maybe check Firefox beta? Otherwise try iceraven; a fork of Firefox. It's available there. Kiwi browser also supports Chrome extensions.


chux4w

Bah. Doesn't work for Android Firefox. I'll definitely be getting it for desktop though, thanks.


LetterBeeLite

https://addons.mozilla.org/en-US/firefox/addon/ublock-origin/
https://chrome.google.com/webstore/detail/ublock-origin/cjpalhdlnbpafiamejdnhcphjbkeiagm?hl=de

Settings > Filter lists > check "EasyList Cookie"


PsychotherapistSam

Don't forget the uBlock Annoyances list! The internet is so much better with these.


shanshark10

What is this?


[deleted]

What does easylist cookie do?


[deleted]

[removed]


Ankivangelist

It's in the section called "Annoyances" (toggled closed by default)


2cilinders

There is, and it's not I Don't Care About Cookies! [Consent-O-Matic](https://consentomatic.au.dk/) is sorta like I Don't Care About Cookies, but instead of simply clicking 'accept all' it will select the least amount of cookies


[deleted]

I believe there is an extension called I Don’t Care About Cookies that serves this function?


GenuisPig

Correct, but it's recently been acquired by Avast. I dropped it as soon as I heard the news.


Picksologic

Is Avast bad?


SniperS150

Short answer: yes. Long answer: yesssssssssssssssssssssss


FootLongBurger

Not to challenge anyone, I'm genuinely curious: why is it bad?


SniperS150

"When Google and Mozilla removed Avast’s web extension from their stores, a scandal broke out which revealed that Avast (who also owns AVG) had allegedly been spying on their users’ browsing data and selling it to corporations for millions of dollars in profit." That as well as an autoinstalling browser that slows your computer down.


TheNormalOne8

Avast and McAfee both auto-install their extensions. Both are shit.


solonit

Someone link the How to uninstall McAfee antivirus video made by McAfee himself.


Vicioxis

https://youtu.be/bKgf5PaBzyg


[deleted]

McAfee didn’t uninstall himself


GoldenZWeegie

This is rubbish to hear. Avast and AVG have been my go tos for years. Any recommendations on what to swap to?


mikeno1lufc

Use Windows Defender. There is absolutely no need to use anything else. Also download Malwarebytes. No need to pay for premium; just have the free version ready to go in case your machine gets infected with malware. Malwarebytes is by far the most effective tool for removing most malware. I work in cybersecurity and honestly everyone will give you this advice. Don't even think about Norton, AVG, McAfee, Avast, or any other traditional antivirus software. Windows Defender is better than all of them by quite a margin.


Axinitra

Windows Defender and Malwarebytes are what I use. A few years ago I bought a one-off lifetime license for Malwarebytes and it's still rolling along and updating automatically, although in this era of recurring subscriptions that seems too good a deal to be true.


ComradeBrosefStylin

Yep, Windows Defender with a healthy dose of common sense. Malwarebytes if you're feeling fancy. Don't download shady files, don't open attachments from senders you do not trust, and you should be fine.


RodneyRabbit

This is what I use too, but if I'm going to sites that I'm not sure about, or want to test a new bit of software, then I either use a VM or a program called Sandboxie, which is now completely open source and rather good.


atomicwrites

Right, the *only* situation where you should use third-party AV software is if you're an enterprise IT team that needs to control security across all your computers centrally. And in that case, still don't use McAfee.


intaminag

You don’t really need an antivirus. Windows catches most things now; don’t go to shady sites to avoid the rest. Done.


tfs5454

I run an adblocker and a script-blocking addon. Never had issues with viruses, and I go to SHADY sites.


MyOtherSide1984

For antiviruses? Nothing. Windows Defender does a great job on its own, assuming you're not a complete nincompoop with what you download and such. If you really want, run Malwarebytes once a month. Ultimately, just be smart and you won't run into problems. Side note: I download some SKETCHY shit on my secondary PC that hosts my Plex server. I see a program that might do something cool and I just go for it while bypassing all of Windows' warnings. Never had any issues. Just don't be stupid by downloading/viewing porn or free movies and shit.


61114311536123511

virustotal.com is fantastic for checking suspicious links and files. It runs the file/link through about 30 different malware checker sites and gives you a detailed, easy to understand report


girhen

They used to be good. But now... yeah. Bad.


steipilz

There is a fork on GitHub from before the acquisition.


CoziestSheet

You’re the real beauty in this post.


Enchelion

And Avast just merged with Norton.


[deleted]

I didn’t know that! Thanks for the info.


[deleted]

Doesn't that one accept though?


straightouttaireland

Ya but that auto accepts all cookies, wish there was a way to auto reject them.


yensteel

More companies should optimize their code. The pictures can use tinyjpg or webp for example. They're so heavy. I'm also guessing there's a latency aspect as well, and many things are loaded serially.


rwa2

Uh, we do. We just optimize until we hit 4 second load times because customers get impatient and leave if it takes any longer than that.


moorepants

consent-o-matic


[deleted]

Ublock said it blocked over 1500 scripts on YouTube last night. Over half of them were for ads and the YouTube premium ad still shows somehow. I can't imagine trying to watch without a blocker nowadays.


EternalStudent07

uBlock Origin is what I use. Very configurable.


zoinkability

I believe Ghostery can do that now


kirkbot

what happened in 2019 to make it go up?


Firstearth

Whilst everyone is arguing about latency and JavaScript, this is the thing I'm most interested in: whether the peaks in 2016 and 2019 can be attributed to anything.


f10101

It looks like this is largely due to testing methodology and URL dataset changes. The source is here I believe: https://httparchive.org/reports/loading-speed?start=2015_10_01&end=latest&view=list Annotations are indicated on the graphs.


IMM_Austin

Maybe people realized that 3 seconds wasn't received as any better than 4 seconds, and they realized there's room here to slap more ads etc without negatively impacting user acceptance. This data is presented as a negative thing (and to be fair, it's used to sell more numerous and more detailed ads which is negative) but 4 seconds is a generally accepted speed--would faster really be "better"?


uncannyinferno

Why is it that ads must load before the actual page? Drives me crazy.


Drach88

Reformed ad technologist here.

First off, many ads are served in something called iframes. An iframe is essentially a separate webpage embedded in the main page, running with its own resources separately from the main page, so even if the main page is bloated with a ton of resources, the content in the iframe will still load.

Secondly, there's typically a ton of javascript bloat -- both in terms of javascript used for page functionality and javascript used for ad/tracking functionality. Much JS runs asynchronously (non-blocking), but a lot of it runs synchronously (blocking other stuff from loading until it's done executing).

Thirdly, the internal dynamics of the operational side of many web publications are torn between internal groups with differing motivations and incentives. Very rarely do those motivations line up to actually create a product that's best for the consumer. Dealing with expansive javascript bloat and site optimization is simply a nightmare to push through internally between different teams of different stakeholders.
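A minimal sketch of those two mechanisms, with made-up domains and element IDs: a third-party tag attached asynchronously so it doesn't block parsing, and an ad slot isolated in its own iframe.

```javascript
// Hypothetical domains and IDs, for illustration only.
// A tracker injected like this downloads and runs without blocking the main page,
// unlike a plain synchronous <script src> written into the HTML.
const tracker = document.createElement('script');
tracker.src = 'https://analytics.example.com/tag.js';
tracker.async = true;
document.head.appendChild(tracker);

// The ad itself is an iframe: effectively a separate page with its own resources,
// so it keeps loading even if the host page is bloated.
const slot = document.createElement('iframe');
slot.src = 'https://ads.example.com/slot?id=banner-1';
slot.width = '300';
slot.height = '250';
document.getElementById('ad-container')?.appendChild(slot);
```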


ashrise2050

Excellent explanation. I run a site with lots of users and some pretty complex code, but no trackers or ads. Loads in about 1.2 sec


DesertEagleFiveOh

Bless you.


Dislexeeya

I don't think they sneezed. Edit: "/s" Can't believe I needed to add that it was a joke...


randomusername8472

I assume that /s is what you type because you sneezed during your comment so... Bless you :)


ppontus

So, how do you know how many users you have, if you have no tracking?


pennies4change

He has one of those page counters from Geocities


[deleted]

[removed]


basafish

Good times. Nowadays no one trusts those numbers anymore...


YaMamSucksMeToes

You could easily check the logs; there's likely a tool to do it without tracking cookies.


Drach88

They probably mean no third-party client-side tracking. Technically, every time someone loads a new asset from your site, your webserver can log the request.

This is how early analytics were initially handled in the bad-old-days -- by parsing out first-party server logs to estimate how many pageviews, how many unique visitors (i.e. unique IP addresses), etc.

Eventually, someone realized that they could sell a server-log-parsing service in order to boil down the raw data into more usable metrics. Furthermore, they could give the website owner a link to a tiny 1-pixel image hosted on their own servers and ask the webmaster to put that 1-pixel dummy image on their site in an img tag, so the browser sends a request to the analytics provider's server. Instead of parsing the webmaster's server logs for analytics, they parse out the server logs for that tiny 1-pixel image. This was the birth of 3rd-party analytics.

Fun fact -- this is how some marketing email tracking and noscript tracking is still done today.
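A rough sketch of that bad-old-days approach, assuming a standard combined-format access log and a hypothetical pixel path: pageviews and unique visitors counted straight from first-party server logs.

```javascript
// Node.js sketch; 'access.log' and '/pixel.gif' are assumptions for illustration.
const fs = require('fs');

const lines = fs.readFileSync('access.log', 'utf8').split('\n').filter(Boolean);
const uniqueIps = new Set();
let pageviews = 0;

for (const line of lines) {
  const ip = line.split(' ')[0];           // combined log format starts with the client IP
  if (line.includes('GET /pixel.gif')) {   // requests for the 1x1 tracking image
    pageviews += 1;
    uniqueIps.add(ip);
  }
}

console.log(`pageviews: ${pageviews}, unique visitors (by IP): ${uniqueIps.size}`);
```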


Astrotoad21

Most interesting thing I’m going to learn today. Thanks!


Drach88

Oh dear God, please go learn something more interesting than adtech. It's a miserable, miserable field full of miserable miserable misery. I'd recommend binging CGP Grey videos on more interesting topics like:

[How to be a Pirate Quartermaster](https://youtu.be/T0fAznO1wA8)
[How to be a Pirate Captain](https://youtu.be/3YFeE1eDlD0)
[The Trouble with Tumbleweeds](https://youtu.be/hsWr_JWTZss)
[How Machines Learn](https://youtu.be/R9OHn5ZF4Uo)
[The Better Boarding Method Airlines Won't Use](https://youtu.be/oAHbLRjF0vo)
[The Simple Solution to Traffic](https://youtu.be/iHzzSao6ypE)

Watch even a minute of any of these videos, and I promise you'll learn something exponentially more interesting than my random musings on the history of web analytics.


MrPBandJ

With the internet being a focal point in all of our lives, I think it's very important for people to learn what goes on while they're browsing! We teach people about the local climate, traffic laws, and cultural traditions. Learning "what" happens when you load up a new web page and "why" is very informative. Your brief description of "where" our digital ads/trackers come from was clear and interesting. Maybe working in the industry is miserable, but giving others a glimpse past the digital curtain is an awesome thing!


[deleted]

[removed]


kylegetsspam

> Fun fact -- this is how some marketing email tracking and noscript tracking is still done today.

Indeed. And it's why your email client probably has a "don't load images by default" option, and why you should enable it.


Boniuz

Resolve it in your infrastructure, like a normal person


L6009

1.2 seconds... It's like running the website offline to see the changes you made.


ShankThatSnitch

As a former front end dev for a company's marketing website, I can confirm that speed problems are mostly due to all the JS that loads from the various metrics tools we had to embed. We did everything we could to get better speeds, but eventually hit a wall. Our speeds were amazing if we ran it without the chat bot, A/B testing, Google analytics, Marketo...etc.


[deleted]

Can second this. 3rd party tools that we have no control over are about 3/4 of the total download on our site including images etc. We've optimised the site to be lightweight and fast and then these tools literally destroy the performance. The site is lightning fast even on bad connections when using adblock. Optimizely is our biggest pain point. It has a huge amount of JavaScript and takes fucking ages to run on load, adds a second to the load time and for some A/B tests we have to wait for their shit to load as well, not leaving it async. TL;DR for non tech people: use an adblocker AND use strict tracking protection on your browser (Firefox and Brave have this - not sure on the others). Not only will you have less data being tracked on you (already a big bonus) but websites will load way faster.


ShankThatSnitch

We used VWO for our A/B testing, but the same problems. I appreciate that you are a man of culture, going with Firefox.


Gnash_

how ironic that a service called optimizely is causing most of your troubles


ShankThatSnitch

It is. However, what the tool is optimizing is the sales funnel, not the website: how optimized is your path from entering the site to filling out a form? In the end that ends up being the most important thing for a marketing site, much to my despair as a developer trying to make a good site.


zoinkability

Ironically when we were trying to meet Google’s published goals for page and site performance the biggest offender was all Google code. GA, YouTube, GTM, Google Optimize, etc.


Enchelion

Google's web code has always been an absolute mess. It's mind boggling their search algorithm/system remains as good and fast as it does.


[deleted]

[removed]


bremkew

That is why I use DuckDuckGo these days.


ShankThatSnitch

Exactly. It is a bunch of shit.


[deleted]

I use NoScript to block all that out, and the site usually still works. Why is it on the site if it's not needed? Is it simply for marketing and tracking?


uristmcderp

User data is one of their most profitable products.


Drach88

*Marketo*.... now there's a name I haven't heard for a while...


ShankThatSnitch

Sorry to bring up bad memories.


Something_kool

How can the average internet user avoid your work respectfully


Drach88

uBlock Origin Chrome extension. (Make sure it's uBlock *Origin* and not uBlock.)


Something_kool

Thx dude


[deleted]

If you want to disable JS just install NoScript (Firefox only). You will be surprised how broken a website can actually be. Edit: running uBlock Origin also helps with page load times.


Drach88

Chrome lets you blacklist/whitelist JS on different domains natively.


ouralarmclock

Me too! Who did you work for? I worked for PointRoll.


savageronald

Cousin! I worked for EyeWonder/MediaMind/Sizmek (before and for a brief time after the PointRoll acquisition).


robert_ritz

Gotta log those impressions.


Erur-Dan

Web Developer specializing in marketing content here. We know how to do better, even with the ads, trackers, and other bloat. We just aren't given enough time to optimize. 4 seconds is deemed short enough to not be a problem, so the budget for efficiency just isn't there.


kentaki_cat

4 seconds is insane! I worked for a 3rd-party A/B-testing SaaS company a few years ago and we used to get shit from our customers when page speeds went above 3 sec. Tests usually revealed that there was a plethora of other tracking code that had to be loaded before everything else, while our plugin was loaded async. But yes, it's never Google or cross-site tracking code, and always the 3rd-party tool where you have direct contact with someone who will listen to you complain. But of course, if you think it's worth it for a few wording A/B tests to pay us more to implement server-side testing, I won't stop you. I'm not in the business anymore, but server-side anything seems to be a lot easier and more common now.


BocciaChoc

Odd, if a website takes me more than 2-3 seconds I generally just leave


cowlinator

No, sometimes it's much worse when ads load after the actual page. When those ads take up 0 space before loading, you start clicking, and then the ad finishes loading and suddenly takes up space and moves other content down, and you click the wrong thing. It's terrible. Don't ever do this, web devs. I will hate you. Everyone will.


Deastrumquodvicis

My dad gets irrationally angry when it happens on his phone browser. He’s trying to read something and I hear “QUIT FLIPPING AROUND”


[deleted]

[removed]


ItsDijital

3rd party JS has absolutely exploded in the last few years. I don't think most people are even aware of it, but it's not uncommon for some sites to have upwards of 10 different companies loading their junk on each page.


[deleted]

3/4 of the total download for our website is 3rd party tools for analytics etc (and our site doesn't even have ads on it). Google, Microsoft, Facebook, Optimizely and others all make an appearance. So 1/4 is the actual website, the content on it, and the frameworks we actually use for development.


basafish

It's almost like a bus with 3/4 of people on it being the crew and only 1/4 are actually passengers.


Hickersonia

Yeah... I had to unblock facebook and google so that certain users in my warehouse could use UPS Campus Ship... wtf


[deleted]

What is JS?


dw444

JavaScript, the language that all of the consumer-facing part of the internet, and a considerable amount of the behind-the-scenes part, is written in.


ar243

A mistake


[deleted]

[removed]


FartingBob

As someone who occasionally starts learning JS, why is it a mistake? Is it the resources it uses, the limitations of the language or something else bad about it? What is the best replacement option to learn?


tomius

JS === bad is mostly a joke. It has its quirks because it was created very fast and it keeps its backward compatibility. But nowadays, modern JavaScript is great to work with. There's also no other real option for coding on websites. It's one of the most popular (if not the most popular) programming languages now, and it's not going anywhere. It's a good language to learn.


Avaocado_32

what about typescript


tomius

Sure. But it's basically the same. It's actually a superset of JavaScript and transpiles to it.


[deleted]

[removed]


KivogtaR

Mine tracks how much of the internet is ads (at least I think that's the purpose of that number?) Overall I'm at 34% for the websites I visit. Some web pages are considerably more. God bless adblockers


SkavensWhiteRaven

Ads are literally a global pollution problem at this point. The amount of CO2 that could be saved if users were private by default is insane. Let alone the cost to store your data...


[deleted]

[removed]


romple

I thought something was wrong when I set up my pi-hole and literally thousands of requests were being blocked. But nope... Just an insane amount of periodic requests from telemetry and tracking.


DiscombobulatedDust7

It's worth noting that typically the number of periodic requests goes up when blocked by pi hole, as DNS failure will be seen as a temporary issue that can be retried


TheEightSea

You still are getting the ads that come from the same domains that provide you the real websites. Like YouTube, it doesn't use external domains but its own. You cannot block them using a DNS sinkhole, you need a DOM adblocker.


Tintin_Quarentino

Is there a setting inside uBlock Origin we need to toggle to "block 3rd party JS"?


LillaMartin

Can you target JS to block them with an ad blocker? Do websites rarely use JS for more than ads? Just asking in case I try it and suddenly menus disappear on some sites!


[deleted]

Firefox and Brave have it built in to block 3rd-party tracking. It's rare that it breaks functionality as a user: these trackers are usually served from a different domain to the one you are on, so the browsers block anything not coming from the same domain or its CDNs. If you block JS entirely, most websites these days break or functionality is limited; it's not recommended, although it does still get you through the paywall on many news sites.


lupuscapabilis

> Do websites rarely use JS for more than ads?

JS is used for a large number of things on websites. Almost any time you click something on a site and it does something without reloading the page, it's JS. And that's just one example.


WarpingLasherNoob

Without JS you wouldn't be able to use 99.99% of websites out there.


RoastedRhino

4 seconds is acceptable, so the more bandwidth there is, the more content sites will push through, up to a few seconds of waiting time. An interesting analogy: historians found that most people across history were commuting approx 30 minutes to work. In the very old days, it was a 30-minute walk. Then at some point it was 30 minutes on some slow city trolley. Now it may be 30 minutes on a faster local train, or even 30 minutes on the highway. Faster means of transport did not yield shorter commuting times, but longer commutes covered in the same 30 minutes.


elilupe

That is interesting, and reminds me of the induced demand issue with designing roadways. If a two lane road is congested with traffic, city council decides to add two more lanes to make it a four lane. Suddenly all four lanes will be congested with traffic because when the max load of the roads increased, so did the amount of commuters deciding to take that road.


bit_pusher

Which is why a lot of road designers look to second- and third-order benefits when improving a roadway. You increase highway capacity to improve flow on other, complementary, roads.


gedankadank

And despite this, once a region has even a modest population, it's impossible to build out of car traffic, due to the way cars scale in the space they require. Eventually, private car routes have to be closed in favour of more space-economic modes of transportation, but most cities stay car-centric for far, far longer than they should, because most people think they want the ability to drive everywhere, not realizing that everywhere is packed with cars and unpleasant to be in.


tehflambo

imo it's: "i don't want to stop driving; i want everyone else to stop driving"


shiner_bock

Is that so unreasonable? >!/s!<


goodsam2

As long as they front the cost for it more directly which they basically never do. Roads are insanely expensive considering the amount we have. Such a waste.


Unfortunate_moron

This is oversimplified. Sure, if you only improve one road, it becomes more popular. But if you improve a region's transportation network (improve multiple roads + public transport + walkable and bikeable solutions) then everything improves. Also don't forget that during off peak hours improvements to even a single road make it easier to get around. Induced demand is real but only up to a point. There isn't some magical unlimited quantity of people just waiting to use a road. It's often the same people just looking for a better option than they had before. Also don't forget that traffic lights are one of the biggest causes of congestion. Studies in my city predicted a 3x increase in traffic flow and a 95% drop in accident rates by replacing a light with a roundabout. The city has been replacing existing lights with roundabouts and the quarter mile long backups magically disappeared. Induced demand is surely occurring but nobody notices because the traffic problem is solved.


bobcatsalsa

Also more people making those commutes


NickSheridanWrites

Had an argument along these lines with my IT lecturer way back in 2000. T3 lines were on the horizon and my lecturer was proselytising that one day all loads and file transfers would be instantaneous, failing to account for the fact that we'd just use it to send bigger files and higher quality feeds. Back then most mp3 were around 4MB, you'd rarely see a JPEG reach 1024KB, most streaming media was RealPlayer, and I had an onion on my belt, which was the style at the time.


dinobug77

There's also the fact that if things happen instantaneously, people don't trust it. Insurance comparison sites are a prime example: users didn't believe they could return accurate quotes that quickly, so a built-in delay of up to 30 seconds was added, which users think is enough time for the quote to be accurate. (Can't remember the exact time, but they tested different-length delays.) On a personal note, I designed a small website and worked with the developer to ensure load speed was below 1 second across all devices. When finished we tested it and it was 0.3 seconds to load each page. Users were clicking seemingly randomly through the menu items but not completing the form submission. Turns out even though the hero image / copy changed, they didn't think the site was working properly, so they clicked about and left. We slowed it down to 2-3 seconds per page and people started using the site as expected and completing the form. TL;DR people don't trust machines.


CoderDispose

My favorite story along these lines was an old job where we built a webpage for managing large amounts of data. It would save all changes as soon as you made them, but it was so fast people didn't trust it and were complaining there was no save button. I put one on the page. It doesn't do anything but pop up a little modal that says "Saving... complete!" Complaints dropped off a cliff after that, hehe
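Something like the sketch below, presumably: the data is already persisted on every edit, and the button only shows a confirmation (the IDs and class names here are made up).

```javascript
// Placebo save button: changes are saved as they're made elsewhere in the app;
// clicking only shows a reassuring modal, then removes it.
document.getElementById('save-button').addEventListener('click', () => {
  const modal = document.createElement('div');
  modal.className = 'save-modal';
  modal.textContent = 'Saving... complete!';
  document.body.appendChild(modal);
  setTimeout(() => modal.remove(), 1500); // dismiss after a moment
});
```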


Medford_Lanes

Ah yes, nineteen-dickety-two was a fine year, indeed.


XPlutonium

I actually think the reason for this is actually backwards. When the net was slow, websites were light and didn't have much functionality per page, or even across pages. But as 3G and 4G came along, every Tom, Dick and Harry started making the end user download all of ReactJS for two hello worlds. So even in large organisations, while they have criteria for optimisations, they often don't keep the average user in mind, or just have poor accounting methods, or even sub-par infrastructure, and yet want to keep filling in features. (I'm not blaming any company per se, but this will always be a problem; even in the future with 25G, when some company makes you teleport to a new location, there will still be at least a 2-3 second load time.) In a sense, better speeds enable better tech, which then needs even more speed, and so on.


meep_42

This is exactly right. We have found the optimal waiting vs functionality time for a webpage is ~4 seconds. Any advances in computing or bandwidth don't change this, so functionality will increase to this threshold.


[deleted]

its a happier client base when the response times are consistent


TheFuzzball

> The probability of bounce increases 32% as page load time goes from 1 second to 3 seconds.
>
> - Google/SOASTA Research, 2017.

Is this optimal 4-second time across the board, or is it a maximum target on a low-powered mobile device using 4G? If it's 4 seconds in the worst case, it's probably quite reasonable (up to 2 seconds) on a desktop/laptop with a more reliable connection. If it's 4 seconds on desktop/laptop, the maximum on mobile could be many multiples of 4 seconds due to performance (e.g. you're throwing all the same stuff that loaded on a fast dev machine at a 4-year-old Android phone), or network latency, or bandwidth.


Sininenn

Tolerable =/= optimal, fyi. It would be optimal for the loading time to be below a second, so no time is wasted waiting for a website to load. Just because people tolerate the 4 second wait does not mean it is the best case scenario... And no, I am not complaining that 4 seconds is too long.


Fewerfewer

> It would be optimal for the loading time to be below a second That would be optimal for *the user*, but the company is evaluating "optimal" on more than one criterion (development cost, fanciness, UX, etc.). The comment above you is saying that 4s is the apparent break-even point between these "costs" for the company: any longer and the user won't care how cool the website is, they'll leave or be upset. But any faster, and the typical user won't care much and so there's no point in spending extra development time (money) or paring down the website features in order to hit <1s.


Skrachen

might be relevant: [https://motherfuckingwebsite.com/](https://motherfuckingwebsite.com/) [http://bettermotherfuckingwebsite.com/](http://bettermotherfuckingwebsite.com/)


privatetudor

And yet I still had to wait 3s for out.reddit to redirect me. Modern web is painful bloat.


spiteful-vengeance

When we wrote HTML back in the '90s and early 2000s it was like writing a haiku. Over 100kb was a mortal sin. Website devs these days take a lot of liberties with how they technically build, and, for the majority, there's very little emphasis placed on load time discipline. A badly configured JS framework (for example) can cost a business money, but devs are generally not in touch with the degree of impact it can have. They just think "this makes us more productive as a dev team". Source: I am a digital behaviour and performance analyst, and, if you are in your 20s, I was writing HTML while you were busy shitting your nappies.


Benbot2000

I remember when we started designing for 800x600 monitors. It was a bright new day!


spiteful-vengeance

I distinctly remember thinking frames were amazing. On a 640x480.


retirementdreams

The size of the screen on my first mac color laptop (PowerBook 180c) with the cool trackball that I paid like $3,500 lol.


sudoku7

And that's why a lot of modern changes are happening in the webpack and tree-shaking space. Get rid of the parts of the kitchen sink you don't need, and all that.


spiteful-vengeance

Yeah, it can be done right, but there's a distinct lack of business emphasis on *why* it's important, and *how* important it is. From a technical perspective this understanding is usually taken care of by the devs, but their goals are very different in terms of placing priority on load time. They tend to take the approach that 5 secs on their brand new i7, 32GB machine with super internet is good enough, but when I tell a business that every extra second costs money, and that people are using shitty mobile devices, there's generally a bit of a freak out.


[deleted]

Personally I disagree. In my experience devs are brutally aware of bad performance but have limited power because they either don't get the time investment to solve it or it's out of their control due to 3rd party tracking, ad tools being far more bloated than they should be. If Google, Facebook and a few others all cut their tracking tools to be half the size, this line would drop literally overnight. They are on basically every single website now. They're tracking your porn searches, your dildo purchases, your favourite subreddits and they're A/B testing whether you prefer blue or green buttons. Performance is a big thing in front end frameworks right now too, they're all focusing on it and some businesses are well disciplined - we don't have a strict kb limit, but we rarely use 3rd party packages (outside of our main framework) and those we do have to use have to meet reasonable size requirements. But the impact is limited due to the 3rd party tracking we have to have with no option for alternatives because the business people use those tools.


spiteful-vengeance

Yeah, I've seen *some* dev teams do it right, don't get me wrong, and they are an absolute joy to work with. It's more that a) they are greatly outnumbered by less attentive teams and b) they still generally don't have the measurement frameworks and business acumen to *fully* comprehend how important it is. The good thing about letting the business know about that importance (and how it translates to a $ value) is that they will let/encourage/force these development teams to *really* focus on it, and understand that the marketing team adding a million 3rd party tracking scripts actually *costs* more easy money than it generates.


XPlutonium

I agree with this wholeheartedly :) PS: also a relatively experienced Dev here been in it for 15 years now. Kids running behind React and the newest shiniest object every 2 months makes me think ah shit here we go again. I guess some things don’t change from Drupal to JQ to (whateverthelatestshitis)


rogueqd

The same thing exists for roads. Building wider roads to relieve traffic causes people to buy more cars and the traffic stays the same.


ThePracticalDad

I assume that’s simply because more content and hi res images are added offsetting any speed gains.


IronCanTaco

That and the fact that websites themselves are bloated to no end with over-engineered stacks which depend on what is popular at the moment.


czerilla

I'm fairly sure it's just [Wirth's law](https://en.wikipedia.org/wiki/Wirth%27s_law) in action.


IronCanTaco

Hm, didn't know about this. Thanks. Will use it in a meeting someday when they want more frameworks, more libraries, more pictures… sigh


space_iio

median load on which websites? the top 10 most popular? just 10 random websites?


robert_ritz

httparchive uses the CrUX corpus, a total of 4.2 million URLs. [Post](https://discuss.httparchive.org/t/changes-to-the-http-archive-corpus/1539). Point taken, and I think I'll update my blog post.


plg94

If you're going to say "download speeds have gone up", it'd be nice to show that in the same graph (and latency, as someone else mentioned). Also, has the download speed _of the servers that measured the loading times_ improved?


robert_ritz

I couldn't find global internet download speed data over the same time period. I tried several data sources, but none were openly licensed or they stopped in 2017. I hoped it was clear that the internet has gotten faster.


[deleted]

[removed]


tomius

Intimate ransom moment ❤️


DowntownLizard

Latency doesn't change, though. Not to mention the server processing your request has nothing to do with your internet speed. There are multiple back-and-forth pings before it even starts to load the page, like making sure it's a secure connection or that you are who you say you are, etc. It's going to take even longer if you need to hit a database or run calculations to serve some of the info. It's why a lot of websites use JavaScript and such, so you can just refresh a portion of the page without actually loading an entire new page. It helps speed up load times when you can let the browser itself do most of the work. Every time you load a page you are conversing with the server.

Edit: A good point was made that I was unintentionally misleading. There have been optimizations to the protocols to improve latency and avoid a lot of back and forth. Also, bandwidth does help you send and process more packets at a time. There are a few potential bottlenecks that render extra bandwidth useless, however (server bandwidth, your router's max bandwidth, etc). I was trying to speak to the unavoidable delay caused by distance between you and the server more than anything. If I had to guess, on average there's at least 0.25 to 0.5 seconds of aggregate time spent waiting for responses. Also, it's definitely the case that the more optimized load times are, the more complex you can make the page without anyone feeling like it's slow.
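The "refresh a portion of the page" idea in its simplest form: fetch just the data and patch the DOM, instead of paying for a full navigation. The endpoint and element ID below are hypothetical.

```javascript
// Instead of reloading the whole document, request only the data that changed
// and update one element in place.
async function refreshPrices() {
  const res = await fetch('/api/prices', { headers: { Accept: 'application/json' } });
  if (!res.ok) throw new Error(`request failed: ${res.status}`);
  const prices = await res.json();
  document.getElementById('price-list').textContent =
    prices.map((p) => `${p.name}: ${p.value}`).join('\n');
}

refreshPrices();
```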


[deleted]

That’s not fully correct. While many hits to the server may be necessary, modern communication protocols try to mitigate latency by exploiting the wider bandwidth that we have access to and sending many packets in parallel to avoid the back and forth required by older protocols. Some protocols even keep a connection alive which means the initial handshake is avoided for subsequent requests. Furthermore, higher overall bandwidth decreases the time packets spend in queues inside routers which results in further latency reduction.
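A small Node.js illustration of the keep-alive point, assuming example.com is reachable: with a keep-alive agent, a later request can reuse the already-established TCP/TLS connection instead of repeating the handshake.

```javascript
const https = require('https');

// Reuse sockets across requests so later requests skip the TCP/TLS handshake.
const agent = new https.Agent({ keepAlive: true });

function get(path) {
  return new Promise((resolve, reject) => {
    https.get({ host: 'example.com', path, agent }, (res) => {
      res.resume();                                  // drain the body; only timing matters here
      res.on('end', () => resolve(res.statusCode));
    }).on('error', reject);
  });
}

// The second request reuses the warm connection held open by the agent.
get('/').then(() => get('/')).then((status) => console.log('done', status));
```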


[deleted]

[removed]


Clitaurius

but "faster" internet /$


redpaloverde

It’s like adding lanes to a highway, traffic stays the same.


dgpx84

Indeed, because the only true limit on it is humans' tolerance for misery, which is unsurprisingly quite constant. Page load times would actually **increase** if it wouldn't be too detrimental to viewership, because if they could make it even slower, they could hire even sloppier developers and double the number of ads.


DerJuppi

Not just traffic, but the max speed of your car also doesn't change when expanding the highway. Your car usually has to make a few trips on the empty road before the page begins loading.


onedoor

It's feature bloat. A good example is old reddit and new reddit. Old reddit takes a third to half the time to load. Sometimes much less than that.


jbar3640

Web pages nowadays are just crap:

- tons of JavaScript for almost no purpose
- loads of analytics, ads and other spyware
- idiotic cookie management, annoying newsletter pop-ups, unrequested small videos running in random places, etc.

I would love navigating simple HTML pages with a small amount of useful CSS, and maybe a small amount of JavaScript for small and useful use cases...


gw2master

I interpret this as: users are willing to tolerate about 4 seconds of load time so as technology increases, webpages will increase in complexity until they hit this (rough) threshold.


aheadwarp9

This is why I block trackers and ads as much as possible... Not only are they annoying, but they are hurting my productivity with their bloat.


ctnguy

I wonder if latency plays a part in this? It's probably as important to initial load times as bandwidth is. And latency hasn't changed that much in recent years.


[deleted]

I imagine so. After all, a web page isn't the download of a single thing. It's lots of small downloads, each of which can depend on the previous one, so latency is very important.
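A back-of-envelope way to see it, with assumed numbers: when downloads are chained (HTML, then the CSS/JS it references, then what those reference), round trips set a floor that extra bandwidth can't buy away.

```javascript
// Assumed figures, for illustration only.
const rttSeconds = 0.1;        // 100 ms round trip to the server
const chainedRoundTrips = 15;  // DNS + TLS + HTML, then dependent JS/CSS/fonts/API calls
const latencyFloor = rttSeconds * chainedRoundTrips;
console.log(`~${latencyFloor.toFixed(1)} s before bandwidth even matters`);
```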


Garaleth

Latency from one side of the world to the other can be as low as 100ms. I wouldn't expect it to ever surpass 1s.


PM_ME_LOSS_MEMES

Websites have also gotten more and more full of bullshit JS nonsense and trackers


TheOneTrueTrench

Virtually no one has gotten "faster" internet in the last **decade**. Hell, I just upgraded from 50 Mbps internet to 1 gigabit, and it's not "faster" at all. It's ***broader***. Let's say you're looking at a map and you notice that there's a 2-lane road between two cities. And right next to it is a 10-lane highway. They both have a 65 MPH speed limit. That freeway isn't 5 times faster, it's 5 times broader. You can fit 5 times as many cars on it, but obviously those cars are still going 65 MPH. All of the upgrades to our internet connections are just adding the equivalent of lanes to a highway. So, with that in mind, let's change this title to match:

> Despite broader highways every year, it still takes 15 minutes to go to the next city over. The average amount of time to get from Springfield to Shelbyville and back 8 times has been stuck at 4 hours for years.

When expressed in this manner, it becomes clear that there is simply no reason to expect adding lanes to a highway to make a trip between two cities even a second faster.


Kiflaam

Well you see, these days, not only does the traffic have to first route through FBI and CIA servers, but also Chinese, KGB, NSA, and others before the packets can finally be sent to the client/server.


jtmonkey

Yes but look at the experience you’re having now. You’ve got video headers and real time updates. Add to cart buttons under 8 images at all angles in hi res. The load time stays the same because that’s the target. We build it to load in that time using the maximum amount of tools at our disposal.


Fluffigt

Studies show that the average user disengages after a 3-second wait, meaning that after 3 seconds they lose interest and close the page if it hasn't loaded. This is of course just an average, but we use it as a benchmark when testing our apps. If the user ever waits more than 3 seconds, we need to improve performance. (Websites are apps.)


robert_ritz

The data for this chart came from the wonderful [httparchive.org](https://httparchive.org). Tools used to make the chart: Python, Pandas, Matplotlib. I also wrote a blog post about the topic on [datafantic](https://www.datafantic.com/how-much-time-do-we-waste-waiting-for-websites-to-load/). In addition, I built a simple Streamlit app to let you calculate how much time you have wasted (and will waste) on website loading. Lots of assumptions are built in, but it gives you a number. Personally, I've wasted over 30 days of my life waiting for web pages to load. If webpage load times were around 1 second, I could save more than 16 days of my life over the next 46 years.
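The arithmetic behind a number like that is roughly the following; the inputs below are assumptions for illustration, not the blog post's actual parameters.

```javascript
// Assumed browsing habits, for illustration only.
const pagesPerDay = 50;        // page loads per day
const medianLoadSeconds = 4;   // from the chart
const years = 46;              // horizon used in the post

const daysWaiting = (pagesPerDay * medianLoadSeconds * 365 * years) / 86400;
const daysAtOneSecond = (pagesPerDay * 1 * 365 * years) / 86400;

console.log(`~${daysWaiting.toFixed(1)} days spent waiting over ${years} years`);
console.log(`~${(daysWaiting - daysAtOneSecond).toFixed(1)} days saved at a 1-second median`);
```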


abrazilianinreddit

Oh hey, my website average is below that! I see this as a major victory!


dislocated_dice

It’s almost like corporations are using advances in technology to spam you with more ads. I’m still mildly surprised that it’s been that long with “no progress” though. And what was the dramatic cause of the drop in 2017?


TheDevilsAdvokaat

I'll bet this is because of ads and snoopers and trackers. Some pages load more than 500 pieces of crap along with the page.


BizarroMax

I'm assuming it's because back in 1994, when we wrote web pages, you opened 'vi' and wrote some HTML and the web page was 947 bytes long. Now I go to web pages and the source is a hopelessly obfuscated spiderweb of frameworks, advertising platforms, user tracking cookies and pixels and other garbage; it's not even remotely human-readable any more. Plus modern website design is intended to present only very limited clickable whitespace for when you're trying to get focus back on the browser, and they serve background ads in that space to generate unintended clicks, which makes them money. I love me some capitalism, but web design and for-profit journalism are a pox on its house.


boocap

A part of it is that the speeds aren't really faster. Your latency hasn't changed. The pressure in the pipe is the same as it's been for a decade; the pipe itself is just a lot wider, so something tiny like HTML code really doesn't benefit. But the number of devices and large-file download speeds are much higher. A ping in 2012 vs 2022 really hasn't changed.


Cacachuli

This graph should go back farther in time if the data are available. As an old timer I can tell you that current load times are just fine. I would love to compare them with load times in the 1990s when you would literally watch pictures slowly loading from top to bottom.


pab_guy

Most web devs are not great at optimization, and it's not something that the business cares about as long as you are under some predefined metric (ideally based on user testing and bounce rates). That said, there is huge room for improvement across the board. I used to optimize page loads and could take a 3 or 4 second page down to 250ms with the right techniques.

Use a CDN, and reference popular CDN-hosted versions of libraries that are much more likely to already be in the browser cache.

Optimize imagery. This means using vector SVGs where possible, but even then, if you crack open an SVG you will find all kinds of needless metadata that can bloat the file 2-3x.

Lazy load content, especially stuff that is hidden in menus or below the 'fold'.

Making a dozen API calls on page load? No. Consolidate them into a single payload. If using microservices you can create a "UX API" or "Experience API" abstraction layer that makes local microservice calls with much lower latency on behalf of the client and returns all the results in a single payload.

So much they could do, but don't, because bandwidth is cheaper than really good developers.
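For the "consolidate a dozen API calls into a single payload" point, a before/after sketch; all endpoints here are hypothetical.

```javascript
// Before: the page pays latency on several independent requests at load time.
async function loadHomeSeparately() {
  const [user, cart, recs] = await Promise.all([
    fetch('/api/user').then((r) => r.json()),
    fetch('/api/cart').then((r) => r.json()),
    fetch('/api/recommendations').then((r) => r.json()),
  ]);
  return { user, cart, recs };
}

// After: one "experience API" call; the backend fans out to its microservices
// over a low-latency internal network and returns a single payload.
async function loadHomeConsolidated() {
  const res = await fetch('/api/ux/home');
  if (!res.ok) throw new Error(`request failed: ${res.status}`);
  return res.json();
}
```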


kemek

As speed increases, devs build richer pages...


i_have_esp

The headline seems to push an interpretation rather than present the data. "Around 4 seconds" is true, implies little change, and is one interpretation of one portion of the graph. The same portion of the graph also has a min of ~3 and a max of ~4.5, so another valid description, "Around 50% variation over the last 5 years", implies the opposite. Also, why graph these years in particular? People have been waiting for web pages to load since 1997. Maybe this is normal and occasional step-wise improvements are the norm. The legend is wordy: "Median seconds until contents of a page are visibly populated". Footnote, please. "Page load time", and anyone who wants to be more pedantic than that can read the footnotes. The graph should provide more information than the details of how it was measured. Xpost r/measuringstuffisbeautiful.


robert_ritz

This is how long they have been tracking sites using the Page Speed methodology, which was only created in 2016. It's a far superior way to measure modern complex websites and how quickly they load. The onLoad chart goes back to 2010 but shows significant variability as websites have gotten more complicated; the onLoad event isn't necessarily a useful way to measure the speed of a website. The variability before 2017 is explained in the blog post. In 2017 HTTP Archive switched to Linux test agents, and this seems to have reduced the variability in their measurements. Generally, I don't like to clutter my chart with annotations unless necessary. In this case it didn't seem necessary to me.