What does Sup Forums think of web bloat?

For reference, the "2012 website" is the same size as Crime and Punishment.

Web bloat isn't a real issue when HTTP supports gzip. Also, WebSockets allow requests to be aggregated and bundled, eliminating the overhead of establishing and tearing down a new connection for every request.
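To put rough numbers on the gzip point, here's a minimal Node sketch using the built-in zlib module (the payload is made up; repetitive markup like this compresses extremely well, real pages somewhat less so). It also times the decompression, since a reply below worries about that cost.

const zlib = require('zlib');

// ~190 KB of repetitive, made-up markup standing in for a bloated page
const html = '<p>lorem ipsum dolor sit amet</p>'.repeat(6000);
const gzipped = zlib.gzipSync(html);
console.log(`raw: ${html.length} bytes, gzipped: ${gzipped.length} bytes`);

// and the CPU cost of undoing it, for the "unzipping erases the benefit" worry
const t0 = process.hrtime.bigint();
zlib.gunzipSync(gzipped);
const t1 = process.hrtime.bigint();
console.log(`gunzip took ${(t1 - t0) / 1000n} microseconds`);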

Isn't this like saying "my car's not falling apart; it's held together with duct tape"?

Besides, unzipping and then rendering the page will likely erase any benefit from having a smaller size sent through the series of tubes.

Or just not send all the stuff in the first place, web dev.

>Isn't this like saying "my car's not falling apart; it's held together with duct tape"?
Not really.

First of all, the infographic with "average web page" doesn't really say anything. What is that, "average web request"? How did they measure this? Average as in mean? What is the distribution? Normal?

Secondly, from 2010 to 2015 we have gone from 1 Gb/s being state of the art to 40 Gb/s, and 100 Gb/s are becoming common now too.

>Besides, unzipping and then rendering the page will likely erase any benefit from having a smaller size sent through the series of tubes.
No, because bandwidth is the bottleneck, not CPU or frame rates.

Your browser is supposed to handle client-side caching anyway.
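For what it's worth, "supposed to handle" still depends on the server asking for it. Here's a minimal Node sketch of the response headers that make the browser cache an asset (the script body, ETag value, and port are all made up):

const http = require('http');

http.createServer((req, res) => {
  res.writeHead(200, {
    'Content-Type': 'application/javascript',
    'Cache-Control': 'public, max-age=1209600', // cache for two weeks
    'ETag': '"abc123"' // made-up version tag so the client can revalidate cheaply
  });
  res.end('console.log("this script should only be fetched once");');
}).listen(8080);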

Is it possible to make a website so that when it's partially loaded everything is in the spot where it will be when fully loaded, rather than having shit jump all over the place as the rest of the site loads? I fucking hate when I'm about to click a link but then other stuff loads and the link keeps fucking jumping around...

>Is it possible to make a website so that when it's partially loaded everything is in the spot where it will be when fully loaded, rather than having shit jump all over the place as the rest of the site loads?
Yes. This is why img tags support height and width attributes.
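Concretely, something like this (the file name and dimensions are made up):

<!-- the browser can reserve a 728x90 box before a single image byte arrives,
     so the text around it never reflows -->
<img src="banner.jpg" width="728" height="90" alt="banner">

Leave the attributes off and the image's box has no size until the file loads, which is exactly the link-jumping effect described above.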

It's crazy how many professional corporate websites can't do this then. Every goddamn time the link jumps away and I click the wrong thing.

>Your browser is supposed to handle client-side caching anyway.

Which is only required because of all the stuff you're sending.

>40 Gb/s and 100 Gb/s are becoming common now in the city coffee shops I go to

Fixed.

Well, programmers are in general despicable people and half of them are lazy and/or incompetent.

>Which is only required because of all the stuff you're sending.
Why are you even browsing an IMAGE board if you don't like images on the web?

It's not about images on an image board. It's about 1080p HTML5 ads on a website where I go for text content (i.e. news websites).

Not with more shitty browser hacks because HTML is trash and always has been. The "tree" structure is downright lousy for rendering content in any sort of efficient way.

>It's about 1080p HTML5 ads on a website where I go for text content (i.e. news websites).
Just get adblock. The HTML5 ads are third-party anyway; usually news sites have no influence over them except placement. Just look at the requests: they go to third-party domains and often download agent code which is potentially harmful.

This is not a problem with modern web, this is a problem with news websites being retards.

Get an adblocker you homo

The thing is that for an imageboard, Sup Forums is incredibly bloated before you even get to the images. We've gotten to the point where images are not what's taking up the bandwidth any more; it's JavaScript and the other content it pulls in. A Wikipedia page has more JavaScript than all the other resources put together.

But your second sentence brings up another point in modern web design where a page can straight up not work at all without scripts. You'll just get a big page of white.

I don't want my cpu cycles spent unzipping your bloated shithole of a site

>I don't want my cpu cycles spent unzipping your bloated shithole of a site
So you would rather waste cpu cycles pulling data from your NIC into RAM instead?

true that

No I want competent people building websites

No, what you are ACTUALLY saying is that you want old-fashioned text-only web sites without any form of interactive content.

Just admit it you homo.

fuck you, macfaggot. Literally nobody thinks your "interactive content" adds anything. Just give me the content I want

Question: How much interactive content would you accept as "not too much" in a web page? and where would you add it?

Ignore him, he's a T-series ThinkPad user who still uses a Celeron processor.

most websites could stand to lose 80% of the "interactive" crap and still be 100% usable.

>if you don't want text-only websites you're a Mac user

Not even you believe that, how dumb are you?

>interactive content doesn't add anything
>posting on an interactive discussion forum with webm support for posting pictures of chinese cartoons and arguing with a stranger on the other side of the globe in real-time

>open up some Tumblr link
>browser freezes, entire experience is really unpleasant
>open up some shitty old website that hasn't had its layout updated since like 2005
>it's a treat to use despite looking like total shit

Man I hate the internet these days.

Newspapers are 100% usable too, but why would you want physical newspapers when you can read the latest news updated in real time online?

Nah, I've got a Haswell i7, no real problems loading the content. I just find most "interactive content" gets in the way.
Also, this shit is what keeps $200 Chromebooks from being super useful. If you manually whitelist scripts/XHRs etc., pretty much every site loads fine.

Because these days newspapers are easier and more pleasant to read.

Just enable JavaScript and images, you paranoid delusional. Are you some kind of masochist who blocks everything except text?

>40 Gb/s and 100 Gb/s are becoming common now too.
Dude I still get

also Raspberry Pis fall into that category too

This

What? No, I mean, with how websites are formatted these days, newspapers are generally a treat to read in comparison.

>implying 95% of "interactive content" isn't fucking trash bloat
I never said Sup Forums was bloated,

>But your second sentence brings up another point in modern web design where a page can straight up not work at all without scripts. You'll just get a big page of white.
Lol, whenever I load up a link and it's just white because NoScript has pwnt it, I just close the tab. Nothing of value was lost.

>went to top news story of new york times
>turn off all ad blocking and script blocking
>7MB
>84 JS requests at 5MB
>turn off ads and all script
>900KB of CSS
>turn off THAT
>133KB for just HTML and images
>but the actual news article is 1.5KB of text

TL;DR: kill yourself
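If anyone wants to reproduce a breakdown like the one above, here's a rough sketch to paste into the devtools console, using the standard Resource Timing API. Caveat: transferSize reports 0 for cache hits and for cross-origin resources that don't send Timing-Allow-Origin, so treat the totals as a lower bound.

const byType = {};
for (const r of performance.getEntriesByType('resource')) {
  // group transferred bytes by initiator (script, img, css, xmlhttprequest, ...)
  byType[r.initiatorType] = (byType[r.initiatorType] || 0) + r.transferSize;
}
for (const [type, bytes] of Object.entries(byType)) {
  console.log(`${type}: ${(bytes / 1024).toFixed(0)} KB`);
}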

Get some glasses then or enable universal design (virtually all modern websites have this option these days).

>new york times
>surprised the actual news article is minimal
kek

liberal cuck spotted

Will it stop the content being built around ads and having no character limit?

>enable universal design
how do I access this?

Don't be a coward. Attack the point I was making.

No, but "built around ads" isn't a problem with modern website design. This is a problem with funding.

You faggots are acting like invasive banners and ads and paid content is a problem of web design.

It's a browser setting, usually. Your browser will embed some x-meta-string thingy in HTTP requests.

/thread

You don't have a point. JavaScript and CSS should be cached if you don't deliberately restrict yourself. Ads are not a problem within the domain of web design; they're in the domain of news funding.

>No, but "built around ads" isn't a problem with modern website design. This is a problem with funding.

I'll remember that when I'm reading something hidden between 6 different next buttons.

>I'll remember that when I'm reading something hidden between 6 different next buttons.
That's your average clickbait website, would hardly count as a "news" article.

Read this: sciencealert.com/bad-news-study-finds-80-of-students-can-t-tell-the-difference-between-real-and-fake-news

>6 different next buttons.

Stop reading clickbait you retard

It's a real problem that won't get better any time soon, if ever.
Thanks to 'responsive design', botnet JS trackers, and ads, the size of websites will only increase. Just look at all these clusterfuck JS libraries. Who in his right mind would use that garbage? 'Modern' webdevs, of course.
For all I care we should go back to Web 1.0. For most sites it wouldn't matter.
Just install NoScript. If a site is broken, just don't visit it anymore. You're doing yourself a favor this way.

>This is a problem with funding.

Why not just not send so much shit down the line and cut costs drastically? Even if you only enable first-party scripts, the site's size still balloons dramatically.

> a problem of web design.

They didn't appear out of thin air.

>cached

From where? The website.

Which I just downloaded it all from.

Do you think the "cache" is something that's inherent to the browser?

>Just install NoScript. If a site is broken, just don't visit it anymore. You're doing yourself a favor this way.
This.

>It's a browser setting, usually. Your browser will embed some x-meta-string thingy in HTTP requests.
link me I can't find it

That's where a lot of news is these days.

But I was also thinking about the fact that legit news sites generally cut up a lot of news stories into multiple links so you'll see more ads.

I mean, sure, if big news rolls around, but you don't have to make a new link just to say "oh and btw, the murder might or might not have been influenced by video games".

>Why not just not send so much shit down the line and cut costs drastically
Are you seriously implying that shaving off a few bits here and there is anything compared to, you know, paying journalists and editors to do their job?

>From where? The website.
>Which I just downloaded it all from.
Are you deliberately acting stupid or are you genuinely retarded? Refresh the site and see the network traffic you moron.

>If a site is broken, just don't visit it anymore.
>not writing your own userscripts and stylesheets to make it work

>just don't visit 99% of websites, not a problem

Hell, just disabling the Referer header broke shit like there was no tomorrow for me.
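That tracks: plenty of sites do naive hotlink protection that rejects anything without the referrer they expect. A hypothetical sketch of the server-side check you're tripping (the domain and port are made up):

const http = require('http');

http.createServer((req, res) => {
  const ref = req.headers['referer'] || '';
  // reject requests that don't come from our own pages
  if (!ref.startsWith('https://example.com/')) {
    res.writeHead(403);
    res.end('hotlinking denied');
    return;
  }
  res.writeHead(200, { 'Content-Type': 'image/png' });
  res.end(); // image bytes would go here
}).listen(8080);

Strip the header in your browser and every site doing this serves you a 403 instead of the image.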

>>just don't visit 99% of websites, not a problem
Er, nah
Tbh if nothing shows up under NoScript, you don't want to be on the site.
A lot of stuff won't show up though, which is what you're going for. Now if you want some video player to load, you enable first-party scripts and/or scripts from the CDN hosting the video. It's not that difficult really and cuts down on a lot of shit.

I don't see the point when the content is basically the same in every result I get. Hell, even Reddit comment sections are usually more informative than traditional news sites.

99% of the web is pure shit.

Which browser do you have? Chrome might need some (Google-maintained) accessibility plugins, Firefox has some built in and needs plugins for others, while Safari has a broad range of such features.

Not nothing, but it does break sites a lot. At least the few times I've used it.

>99% of the web is pure shit
Yes. And then you want to add another layer where you can only use 1% of those 1% of websites that are kinda useful.

Chrome; tried a couple of extensions from the web store, none of them had options for that type of header

it's awful. The web is basically a giant malicious javascript resource-hogging spyware clusterfuck now.

All this effort just so they can serve you shitty ads, it's fucking pathetic. I'm amazed anyone makes money at it.

>Refresh the site and see the network traffic you moron.

You are directly avoiding the point. Clearing the cache provides a clean slate in order to measure the weight of the page. Whether I got it from the home page or that page doesn't matter; it's still 5MB of JavaScript.

But I'll humor you:
Homepage: 5.6MB (cleared cache, as if I were visiting for the first time)
Now that's all loaded into my cache, so the news page should be tiny.
News page (1.5KB of text, remember): 3.3MB (primed cache, giving you the benefit of the doubt)

Choke on shards of glass for defending this shit.

What's Sup Forums's excuse for not starting their own ad agency?

Websites with good content will usually be javascripped out the ass, or will be paywalled.

I'd prefer everything were paywalled and I'd just have to buy what I want, honestly.

My neighbour started one a few years ago and that was my first job ever, interning at 18.
I hope you ran some of my learned-on-the-fly JavaScript : ^ )

>You are directly avoiding the point.
The fucking point was request size, you dense illiterate fuck.

Caches demonstrably reduce request size. So does using agent code (JavaScript) to refresh content and do alignment stuff instead of using images.

If you have a problem with local storage for websites, then why the FUCK are you on Sup Forums? Sup Forums issues cache directives for every single fucking thumbnail to be stored for two weeks.
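The primed-cache round trip those directives set up looks roughly like this on the server side (a minimal sketch; the ETag value, body, and port are made up). On a repeat visit the client sends the tag back and gets a bodyless 304 instead of the full payload:

const http = require('http');

const ETAG = '"v1"'; // made-up version tag for the current content

http.createServer((req, res) => {
  if (req.headers['if-none-match'] === ETAG) {
    res.writeHead(304); // client's cached copy is still good, send no body
    res.end();
    return;
  }
  res.writeHead(200, { 'ETag': ETAG, 'Content-Type': 'text/html' });
  res.end('<p>full payload, sent only when the cache is cold or stale</p>');
}).listen(8080);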

So you're why my browser is constantly hogging 8GB of RAM...

No, the reason why is this: Google Chrome with 25 open tabs never uses more than a couple of hundred megs. Use a proper browser not written by Pajeets, user.

>complains about memory size
>but still doesn't want to waste CPU cycles on zipping and swapping out inactive data

>Google Chrome with 25 open tabs never uses more than a couple of hundred megs. Use a proper browser not written by Pajeets, user.
I am using a Chromium-based browser. The problem is it never frees cached shit from closed tabs. This is a big problem when virtually every page loads HD video content.

Or just don't cause the problem in the first place that requires all this hackery to keep it from being a total merry-go-round shitshow?

Why is this such an alien concept to you? Just don't cause the problem.

Chrome doesn't have this problem

See the posts above. Either stop using Sup Forums or at least admit that you are just reminiscing about the BBS era.

What better way to democratize the web than by enforcing a tired series of web standards that are constantly changed by Google and Microsoft? I mean, how useful is the DOM! XML syntax is so easy to understand, let's make everything conform to JSON.

The older I get the more bloat and complexity bothers me

Don't forget OAuth; it's got Open in the name so it's definitely Open.

I never understood how Sup Forums could have a hissy fit over a few extra KBs in HTTP requests when they insist on watching 4K 10-bit anime in a lossless (or, preferably, raw) format.

>text only web sites
Web sites were never text-only. The img tag was there from the start. What you perhaps mean are static pages, but I can't see why you'd dislike them so much. Not every site needs (nor should have) interactivity, and even then it doesn't have to come from multiple JS files from multiple domains.

>webm support
The difference is that if I browse on a piece of software that can't play the webms, I can still view the site. They have no inherent draw on my system's resources.

>pictures
We've covered this.

>real-time
Nope.

>reminiscing about the BBS era

Spoiler: Sup Forums is a BBS.

I'm thinking all webpages should have their content in a machine-readable format so Google can get richer. I mean, I love spending my time conforming to some random multinational's standard so that they can reduce operating costs. I think it's best for everyone that Google has the money to sit on the top industry talent.

False equivalence.

>Spoiler: Sup Forums is a BBS.
It's not. BBSes are distributed, Sup Forums is not.

>4k 10-bit anime
>HEVC too

That's a big meme

I downsample them to 480p or 720p and re-encode them to comfy 8-bit H.264

HEVC is a shit

BAHAHAHA
I cackled out loud over that one.

>BBSes are distributed

Only by implementation, not by design. Common BBS software is designed to run on a single piece of hardware at a specific address (once upon a time, a telephone number).

Why do we even have access to typing in the URL anymore? That should be considered a DMCA violation.

>Chrome doesn't have this problem
Go ahead and add up all that memory use for "Iron". I got 59%.

Complexity is a meme.

You can tie an absolute shit-heap of a knot and call it "complex", but that doesn't make it a good knot. The best implementation is usually the simplest. You know, the NASA pencil parable.

What's wrong with using a bit of your extra RAM to store Google state? Servers are expensive. I'm sure you agreed to it in the EULA.

Use 'free -h' as it's easier to read.

A bit? I have 1GB of free RAM on a machine with 16GB, and most of that is being used by my fucking browser. This shit is retarded.

It's not like you could afford a NIC capable of those speeds anyway.

2.5 Gb/s and 5 Gb/s Ethernet are coming, but 40 Gb/s and 100 Gb/s links are still reserved for data centers and ISP backbones.

Backpedalling much? You know what a BBS is; why are you wasting my time making me explain in detail how a web page differs from a BBS?

>Web sites were never text-only. The img tag was there from the start
I was talking about Gopher vs HTTP. Gopher was designed around interactive user actions, whereas HTTP was designed to just transfer HTML documents in simple request-reply messages.

>What you perhaps mean are static pages
No, I meant text-centric pages for which HTML was designed.
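For a sense of how simple that request-reply model is, here's the whole exchange done by hand over a raw socket (a sketch using Node's net module; example.com is just a placeholder host). One request goes out, one response comes back, and the server closes the connection:

const net = require('net');

const sock = net.connect(80, 'example.com', () => {
  // an entire HTTP/1.0 "session" is this one message
  sock.write('GET / HTTP/1.0\r\nHost: example.com\r\n\r\n');
});
sock.on('data', (chunk) => process.stdout.write(chunk));
sock.on('end', () => console.log('\n-- server closed the connection --'));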

Think about it this way: you have more RAM than Google had in their entire infrastructure at one point. It would be selfish if you didn't use it to help distribute AdWords load.

Who remembers the Internet before the WWW?

Chrome uses 417 MB on my system now.

The web should have never gone beyond this.
rise.cse.iitm.ac.in/shakti.html

I wonder how much of that is due to ads

I thought that's what he was saying.