why don't they make a chromium that isn't bloated shit?
Why is it so bloated to begin with?
Think about translating and sandboxing all of the JavaScript, HTML, and CSS; I don't know how I would do it.
The Chromium browser is basically a VM.
>translating and sandboxing all of the JavaScript, HTML, and CSS
it's still just document rendering, I don't see how that accounts for the bloat
why?
What about this?
aur.archlinux.org
Because that's how it can run JavaScript faster than any other browser.
The challenging part about making a browser isn't the HTML/CSS parsing, it's how to run the JavaScript fast.
>inb4 vivaldi shills try bringing up their botnet browser
Seriously?
Why do we need all this javascript in the first place, though? It can't be worth this bloat
Javascript. Is. Everywhere. A browser has to support it, unless it is targeted to RMS and the five people that consider him sane.
>A browser has to support it
Why? Honestly, I think it would be faster to add eyecandy to webpages just by resending and redrawing using the basic browser functionality. Javascript is a massive failure and I don't understand why it's so popular.
Because JavaScript is a massive success and it's impossible to not understand why it's so popular.
letting google track me and target ads at me?
anyone can imagine a better system than we have currently, the problem is getting anyone to use it
technically better has nothing to do with actual adoption
so why don't people get together and make a standard? Seems like they do it all the time...
>I think it would be faster to add eyecandy to webpages just by resending and redrawing using the basic browser functionality.
Are you really arguing that round trip time + the time it takes the server to compute the eye candy < the time it takes the client to compute it? Because that's stupid. It would also put more strain on the servers, so why would any website do this?
There's probably a middle ground where you're not sending back the entire web page but you're also not sending some bloated script to the client to interpret in a VM to try to get them to draw an animation when the browser itself is perfectly capable of that.
please get off my board
Some JS frontend frameworks do support "server-side rendering" where the bulk of the JS runs on the server and the client (browser) side just fetches and replaces parts of the page as needed. It's rarely necessary unless the actual JS is horrible garbage.
There is plenty of horrible JS garbage out there, though: I've seen some sites include 10+ MB of minified JS libraries and frameworks on every page and then use them so badly that the page's content is basically re-rendered on every click whether the event caused something or not. Even if the browser caches the files it still has to parse and run them on every new page.
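The "fetches and replaces parts of the page as needed" approach doesn't even need a framework. A minimal sketch of the idea, with the patch format and `applyPatches` helper being my own invention (no real framework's API): the server sends only the regions that changed, and the client swaps just those instead of re-rendering the whole page on every click.

```javascript
// The "page" is modeled as a plain object so this sketch runs anywhere;
// in a browser you'd look up each region with document.querySelector instead.
const page = {
  header: "<h1>My Site</h1>",
  content: "<p>Old article</p>",
  footer: "<p>(c) 2017</p>",
};

// Apply a list of hypothetical {region, html} patches in place.
// Regions the server didn't mention are left untouched.
function applyPatches(page, patches) {
  for (const { region, html } of patches) {
    if (region in page) page[region] = html;
  }
  return page;
}

// The server only sent the content region; header and footer survive as-is.
applyPatches(page, [{ region: "content", html: "<p>New article</p>" }]);
```

The point is that the work per click is proportional to what actually changed, not to the size of the page.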
>so why don't people get together and make a standard?
Anyone have that image where there are 20 Linux distributions, some guys decide to make a "standard" one, and suddenly there are 21 Linux distributions? Because that's why.
>Seems like they do it all the time...
Then again, you might know already, and I'm too autistic to recognize it in your post.
You know what I mean.
>server receives click
>server tells browser to redraw box 2 eight times over 0.5 seconds, lower border going from {x, x-2px, ...}
>browser just does basic refresh operations
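Taking the greentext above literally, here's what those server messages might look like. Everything in this sketch is hypothetical (no such protocol exists); it just makes "redraw eight times over 0.5 seconds" concrete: the server sends one small message, and the client expands it into a frame schedule it can replay with dumb refresh operations.

```javascript
// Hypothetical server message: animate box 2's lower border,
// moving -2px per frame, 8 frames over 500ms.
const msg = {
  op: "redraw",
  box: 2,
  property: "border-bottom-offset",
  frames: 8,
  durationMs: 500,
  startPx: 100, // x
  stepPx: -2,   // x, x-2px, x-4px, ...
};

// Client side: expand the message into [{atMs, offsetPx}, ...],
// a schedule a basic renderer could replay without running any script.
function expandFrames(msg) {
  const interval = msg.durationMs / msg.frames;
  const schedule = [];
  for (let i = 0; i < msg.frames; i++) {
    schedule.push({ atMs: i * interval, offsetPx: msg.startPx + i * msg.stepPx });
  }
  return schedule;
}

const schedule = expandFrames(msg);
// 8 frames, 62.5ms apart, offsets 100, 98, 96, ... down to 86.
```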
Maybe the solution to bad javascript programming is to remove the temptation?
>Linux distributions
There are tons of web standards though.
So you want all event handling to lag by 200+ ms and make the browsers DDoS the server?
How would reacting to mouse hover or dragging work? Just send an event per pixel or two traveled, like how the DOM API currently works? What if the page needs live validation of the user input? Should the text only appear in the UI half a second later, after the server sends back the new state of the input box?
Thin clients are fucking silly with so much computing power in every device and this would make the web worse than remote X11.
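The frame-budget arithmetic behind this objection is easy to check. The 100 ms round-trip time below is an assumed typical value (not a figure from the thread), but the conclusion holds for any RTT much larger than a frame:

```javascript
// Rough frame-budget arithmetic; the RTT is an assumed typical value.
const rttMs = 100;                // client -> server -> client
const fps = 60;
const frameBudgetMs = 1000 / fps; // ~16.7ms to produce each frame

// Whole frames that elapse while waiting for one server response:
// computed as rtt * fps / 1000 to stay exact in integer arithmetic.
const framesMissed = (rttMs * fps) / 1000; // 6 frames dropped per round trip
```

So even one server round trip per interaction eats six full frames of a 60 fps budget, before the server does any work at all.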
-lag is already very common online; there is no reason to believe this would make it worse
-it wouldn't ddos the server, it would just increase the number of requests by a modest factor
-mouse hover can be handled by signals from the browser that tell the server the cursor has entered a box
-I don't know what you think validation is or how you think it works
-we keep increasing computing power yet basic web browsing continues to get slower and buggier
>-it wouldn't ddos the server, it would just increase the number of requests by a modest factor
Try a factor of 100 or more. There are many moving parts in a non-trivial web application.
>-mouse hover can be handled by signals from the browser that tell the server the cursor has entered a box
Any box, or just specific boxes? HTML+CSS is crude enough you may need to wrap things in quite a few of them to get the layout you need without risking browser compatibility issues.
>-I don't know what you think validation is or how you think it works
I've seen plenty of code that blocks wrong characters from being typed at all (showing a tooltip to tell the user what they're doing wrong) by grabbing text input events in the framework's (Angular, React, etc.) hooks and only letting the element's value change if the input was acceptable. HTML5 has some limited support for typed input fields but it is never enough by itself, and browser support is lacking. This would lag like fuck if it had to go through the server.
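The pattern described here (only letting the value change if the keystroke is acceptable) boils down to a filter run on every input event. A minimal framework-free sketch, with a hypothetical digits-only rule standing in for whatever the real form needs:

```javascript
// Hypothetical validation rule for this sketch: digits only.
const isAcceptable = (ch) => /[0-9]/.test(ch);

// Simulate the framework hook: given the current value and a typed
// character, return the new value, silently dropping rejected input.
// In a real page this runs in an 'input'/'keydown' handler, and the
// rejection would also trigger the tooltip mentioned above.
function handleKeystroke(value, ch) {
  return isAcceptable(ch) ? value + ch : value;
}

let field = "";
for (const ch of "12a3b4") field = handleKeystroke(field, ch);
// field is now "1234": the letters never made it into the element.
```

Running this locally is instant; routing each keystroke through a server round trip is exactly where the lag complaint comes from.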
>-we keep increasing computing power yet basic web browsing continues to get slower and buggier
Not surprising. The complexity of web applications has grown so fast that retarded "developers" who can only be taught by example and can't teach themselves have not kept up at all, and are using their tools completely wrong. Most organizations are even slower learners than this and have basically no standards on JS code quality.
>Try a factor of 100 or more.
I don't think so.
>HTML+CSS is crude
Whichever box gets the mouseover flag.
>only letting the element's value change if the input was acceptable. HTML5 has some limited support for typed input fields but it is never enough by itself, and browser support is lacking. This would lag like fuck if it had to go through the server.
It wouldn't really, and in any event this edge case doesn't justify the clusterfuck that is javascript
>basically no standards on JS code quality.
if they can't be trusted with it why give it to them?
Show me how you'd implement something like google docs with pageloads on every keypress.
Getting people to stop using IE6 was hard enough. Making people switch to a completely new platform is basically impossible.
>Show me how you'd implement something like google docs with pageloads on every keypress.
You just think that's hard to do because you're used to all the javascript crap lagging all over the place, so you think everything must be very resource intensive and laggy.
The server is just sending a small document update which the browser is rendering. It's a whole different paradigm, very lightweight, and would be better than that shit nobody uses anyway.
maybe the chinese will do it?
The web standards require it. Browsers are bloated by definition.
They require loading masses of codecs, stuff like MIDI support, the unfinished Web Audio spatial sound spec, their own network stack, etc. You get the picture. It doesn't help that some resources can't be shared between multiple processes, so every Chromium tab loads its own resources.
Found the ideas guy
Html5 will solve everything.
...
It's making everything even more complicated.
The absurd standard is the reason why everything needs to be so bloated in the first place.
Found the cynical guy who doesn't mind condemning the literal universe to mediocrity
I almost want to become a webdev now so I can fix webdevving
Top kek
I thought it mostly just codified the stupid things browsers were already doing and then glued on some half-baked features, like videos with no codecs or streaming support.
The spec on how to interpret the tag soup markup alone is already like 100 pages long because of all the exceptions.
You can't really fix it. Removing features from the standard is even harder than introducing new ones.
Only features that are unanimously considered huge fuckups, like AppCache or non-standard things like SPDY get removed.
But if you're serious, work with the WHATWG.
At least we're now in the situation where they, a group of people who actually develop for the web (and web browsers), decide on specs. It used to be the W3C, where a bunch of turds who had no idea how to define a proper spec (one that's easy to use, performant, and still low-level enough to do stuff with) just dropped their turds and we had to accept them.
>But if you're serious, work with the WHATWG.
now that's nifty