HTML Neutrality

Net Neutrality is a big concern that a lot of anons have been talking about lately, and we've seen arguments from both sides of the issue.
One of these arguments is that losing Net Neutrality will cause websites to get throttled unless you pay extra to an ISP,
and even then, some sites may still get throttled.

This is a very good point, but allow me to bring up a separate issue that some autistic anons and people in tech
have raised in the past, one that might actually relate to this Net Neutrality concern.
That issue is website bloat. I've seen anons bring it up on a few occasions, complaining about the
inherent bloat created by Javascript and the related interpreted languages, libraries, and frameworks
used on the web. These bloated technologies can also very easily be used to spy on us, which is part of
the reason browser extensions such as NoScript and uMatrix exist.

The thing is, it's not just people on imageboards complaining about this. Many people in tech have brought up
this issue as well. Our lord and savior, Richard Stallman, includes a reference to an article on this topic
in his "How I do my computing" article, saying that he agrees with it and that it inspired the layout of his site.
stallman.org/stallman-computing.html
The article is by Olia Lialina, an author and co-founder of the Geocities Research Institute.
contemporary-home-computing.org/RUE/
In this article, Olia makes a case against Web 2.0. Although it does not directly relate to bloat, it does
bring up the idea that users are not creating their own experiences on the internet, but are merely following
a shaped experience predefined by the author, which hides the programmability and customizability of the system.

She also points out that old Web 1.0 maymays such as peeman.gif, despite their crudeness, offer more expressiveness
than modern sites provide: "because it is an expression of a dislike, when today there is only an opportunity to like".
It is my opinion that these shaped experiences also contribute to bloat.

Another person who has shared this sort of view is Maciej Ceglowski, who has released a talk/article
entitled "The Website Obesity Crisis".
idlewords.com/talks/website_obesity.htm
In it, he explains that websites have become needlessly bloated, with one of his opening points being that the page needed to display
a single tweet nowadays weighs more than a full-length Russian novel, and that even the sites from Facebook and Jewgle that are supposed
to be about reducing this bloat are extremely and unnecessarily bloated themselves. He also brings up "Chickenshit Minimalism": sites
that look minimal but are still overwhelmingly bloated due to Javascript shit.

This idea has been shared on Jewtube as well, with Bryan Lunduke creating "The World Wide Web Sucks", which expresses similar views to
the Ceglowski article.
youtube.com/watch?v=tefielQeHZY
He discusses the immense bloat of a browser attempting to load a common website such as CNN.com, comparing it to great software
achievements such as the Apollo 11 guidance computer or the original DOOM. He ends the talk by suggesting that the web should
return to HTML. Even Terry A. Davis brings up bloat, although referring to software rather than websites.
youtube.com/watch?v=Ihli_guFhkU

So what does all this mean for Net Neutrality? Well, if sites get throttled, these bloated designs will have a hard time loading
on a slow connection. Hell, some people already have trouble loading them even with Net Neutrality!
But the idea of reducing the bloat and returning to an older web can have an impact here. If websites are throttled, owners will
still want people to come visit their sites. So a possible solution for them is to get rid of the needless Javascript, simplify
things, and end up with a site that loads efficiently even on a Net Neutrality-less internet.

I say we use the loss of Net Neutrality as a means of killing off the bloat of the World Wide Web, and making it decentralized.
What are your thoughts?

bump

website bloat is a result of lazy developers
you can't kill lazy
that's the default

>Oh hai I needs a Javascripts function to do ajaxes!
>I can write 4 lines of code, or I can drag in kilobytes of JQuery.
>4 lines. That's a lot of work. I should import the library instead. Let them do the work!

And that ultimately degenerates into the great left-pad crisis of 2016.
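For reference, the "4 lines" in question would look roughly like this plain fetch() call, no jQuery needed (the URL and what you do with the data are made up for illustration):
// Plain-JS ajax: fetch the JSON, parse it, use it, log failures.
fetch('/api/posts.json')
  .then(function (response) { return response.json(); })
  .then(function (posts) { console.log(posts.length + ' posts loaded'); })
  .catch(function (err) { console.error('request failed', err); });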

websites won't be throttled
sites won't be affected because the majority of JS is loaded from cloud hosting on other domains
bloat and spyware are here to stay because they don't cost anything and can only generate money

I would like this, and I agree that the consumption-based nature of web 2.0 (rather than the expressive nature of web 1.0) isn't good. There's also the fact that the sheer demand on a site like Youtube (I wonder how many petabytes that is now...?) means that people will still need a Youtube in order to present themselves.

A text or HTML file will always work.

Right
But if trimming down the bloat becomes less of an "it's ideal, but we're too lazy to do it" and more of a "HOLY SHIT OUR BUSINESS IS GOING TO LITERALLY DIE IF WE DON'T MAN UP AND GET RID OF THIS SHIT", then it is more likely that we can kill the bloat.

Yeah all of this is assuming throttling is going to even happen.

I heard from Luke that MediaGoblin is gonna add features to be like a decentralized, federated youtube type thing. Worth keeping an eye on I guess.

>that lunduke video
I didn't like that video, because it had so much potential, but he ended up wasting so much time over-emphasizing meaningless points. He talked about the cumulative RAM usage of CNN on every visitor's computer as if that had any meaning, he compared browsers to the Apollo 11 computer as if a web browser (or any graphical program) could ever come close to using that few kilobytes of RAM, and he made some retarded RAM usage projection by fitting whatever random function he felt like to a graph, completely ignoring the fact that RAM sizes don't double as fast today as they did in the early 2000s.
When he ran his own experiment comparing old and current browsers, he used sites that some of the browsers couldn't even display, completely negating the point. He should have used the same page for all of them, one that all three browsers could render entirely, so we really could have seen that new browsers use much more RAM to display the exact same content.
He also made the argument that websites should support old browsers because people are forced to use old browsers in some edge cases, when supporting old browsers is pure cancer and shouldn't be done, ever.

Do note that there are some cases where web 2.0 technology is justified. This does not apply to 99% of the web, but we can actually build highly interactive and performant applications that work on almost any personal internet-capable device out there. The problem, of course, is that almost all of the sites out there pile on unjustified amounts of JS to serve what is effectively static content.

Like fucking clockwork. Monday rolls around and you paid shills clock in to shill on Sup Forums. Sage

To help combat this:
html5rocks.com/en/tutorials/webcomponents/customelements/
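For example, a bare custom element using the standard customElements API, no framework pulled in (the tag name and content here are just an illustration):
// Defines <user-card name="anon"></user-card> as a plain custom element.
class UserCard extends HTMLElement {
  connectedCallback() {
    // Render with plain DOM text instead of dragging in a framework.
    this.textContent = 'Hello, ' + (this.getAttribute('name') || 'anon');
  }
}
customElements.define('user-card', UserCard);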

Yeah, the only time I could see this bloat being justified is for actual legit web applications such as Google Docs, Office 365, etc., and even then, most of those are botnets.
Thankfully, I think LibreOffice was planning a sort of self-hosted google docs-type thing. I've also found draw.io, which is an Open Source Visio-type thing.

But outside of legitimate web applications, sites should not be using this shit. Even then, these applications should start considering that WebAssembly thing anyway.

>assuming throttling is going to even happen
no, the bloat is already mostly loaded from google/amazon cloud, you can throttle the domain but throttling that cloud is plain crazy

>stallman.org/stallman-computing.html
>I don't want to spend time comparing them
>but learning about them is not a priority for me and I don't have time
>but learning about them is low priority for me and I have other things to do
>I am too busy to do much programming
So what the fuck does he do?

>However, if I am visiting somewhere and the machines available nearby happen to contain non-free software, through no doing of mine, I don't refuse to touch them. I will use them briefly for tasks such as browsing. This limited usage doesn't give my assent to the software's license
It actually does.

>I do not post on Sup Forums. I have nothing against it, and I have occasionally answered questions for interviews for Sup Forums, but any posting there that says it is by me is by an impostor.
;_;

You missed when he actually came to Sup Forums and left right away because "all I saw were inane comments".

It's what any normal person would think. I only come here for the laughs. OP needs to choose his audience more carefully.

I think that's kind of a flaw with the modern web. There's very few places to have a thought out and longform conversation. OP could post this on UseNet, but he'd be preaching to the choir there.

>There's very few places to have a thought out and longform conversation
There are forums, but then those require registration. As a kid (14-15) I used to frequent forums quite a lot and liked the format. The only problem was the required registration, I guess so that you could send private messages to users among other things.

>I can write 4 lines of code
That's going to work only on Chrome. jQuery didn't happen for shits and giggles. The bloat comes from fucking webpack and ES6, which turn a simple 20-line script with a couple of node modules into a 2MB monster. jQuery is far from the current angular/react bloat.

>es6
>bloat
Only if you still support ancient browsers by compiling to ES5. Native ES6 is actually faster, because of explicit constants, classes, and arrow functions.

>if you still support ancient browsers by compiling to ES5.
No one is retarded enough to ship native es6.

I do, but with feature detection. All major browsers have fully supported ES6 for 2 years now IIRC.
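Roughly like this, for example (the bundle paths are just placeholders): probe for ES6 syntax support once, then load whichever build applies. The <script type="module"> / nomodule pair is another way to get the same split.
// Feature-detect ES6 by trying to parse ES6 syntax, then load the matching bundle.
function supportsES6() {
  try {
    new Function('class X {} const f = (a = 1) => a; let [b] = [1];');
    return true;
  } catch (e) {
    return false;
  }
}
var s = document.createElement('script');
s.src = supportsES6() ? '/js/app.es6.js' : '/js/app.es5.js';
document.head.appendChild(s);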

>I do
On your homepage? Sure thing, m8.

On a moderately complex web application. If you read the thread, people not writing optimised code is exactly what's being discussed here.

>we now have access to gbit level connections
>we should not even attempt to use them because of muh minimalism

Get to fuck minimalistfags. I bet you all browse Sup Forums with the native extension

Bump

>his shit internet can't handle a small download of jQuery

The bloat will remain, the sites will just split their content into multiple smaller pages. They can even make more ad money this way because more page loads = more chances to deliver ads.

The thing is, regardless of whether we CAN do this, we shouldn’t HAVE TO do this.
Most of this shit, in essence, is images and text. Why should images and text be this big?

We can rebuild the web to be unbloated. wiby.me is an example.

Is Stallman insane? He downloads a whole website only to read it locally.

>I do not post on Sup Forums. I have nothing against it, and I have occasionally answered questions for interviews for Sup Forums, but any posting there that says it is by me is by an impostor.

kek, Sup Forums got told hard

>There's very few places to have a thought out and longform conversation
You know, you used to be able to do this all the time on Sup Forums.

>because this is about internet speeds