Have to enable at least two or three scripts per website for it to resemble functionality

>have to enable at least two or three scripts per website for it to resemble functionality

why is this allowed?

If it requires too many scripts I just give up. Usually the content doesn't require much

What if we had a proxy that loads each page you want, deals with all the JS bullshit, and then screencasts you the website as an image map? Because it's image-mapped you could still interact with all the elements, but you wouldn't have to load any tracking shit; the proxy would take the hit. Does anyone understand? Pls understand
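Roughly what I mean, as a sketch (assuming the proxy runs Flask plus Selenium's old PhantomJS driver; render_page and the routes are made-up names, URL-encoding is skipped, and the /click handler that maps coordinates back onto elements is left out):
from flask import Flask, request, send_file
from selenium import webdriver
app = Flask(__name__)
def render_page(url, out='/tmp/page.png'):
    # All of the site's JS executes here on the proxy, never on the client.
    driver = webdriver.PhantomJS()
    driver.get(url)
    driver.save_screenshot(out)
    driver.quit()
    return out
@app.route('/browse')
def browse():
    url = request.args.get('url')
    # ismap makes the browser append the click coordinates (?x,y) to the link,
    # so the proxy can translate clicks back onto page elements without
    # sending a single line of the site's JS downstream.
    return '<a href="/click?url={0}"><img src="/shot?url={0}" ismap></a>'.format(url)
@app.route('/shot')
def shot():
    # Serve the rendered page back to the client as a plain PNG.
    return send_file(render_page(request.args.get('url')), mimetype='image/png')
The client only ever downloads an image; no tracking scripts, no fonts, nothing. The proxy eats all of it.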

Why aren't you using uMatrix?

Why no virtual machine? Just boot up Win 7 for that bullshit. If it's worth the time.

>be me
>running crouton on a chromebook with 2GB of ram
>shitty fucking websites designed by retards completely hang the machine on page load
>install umatrix and only allow the js I want
>everything is crispy

I use umatrix and I have the same problem
I pretty much have to run down the script and plugin columns to figure out what I need to enable.

>be retard
>buy Chromebook
Checks out

>be me
>get a fully functional machine for $200
>does everything I want
I'm actually a genius m8

Such a service couldn't possibly be free.

archive.is?

Yeah, but it's quite cumbersome to "surf the web" with. More like a Tor proxy, but where the relay deals with the scripts etc. The experience would have to feel transparent

Typically you only need to do that once, when you first install the extension; then it's fine.

I think I know what you're talking about, something like a cloud browser. Cloud operating systems are probably already under development, or at least an idea, at major companies (most likely Microsoft). You'll need a good Internet connection and it will basically let you use a powerful remote PC, removing the risk of local infection and expensive hardware failure. The problem with this is you'll still need a decent GPU and Internet connection to fetch and display the images at 60Hz+. Even if you ignore the fact that it will probably require over 20MB/s download speed (for 60Hz, unless the images are well compressed), it will definitely be a paid service and the risk of "botnet" and data theft will be even greater.
Using just the browser to accomplish this would probably be less intensive on your network, but we're still going to need a good, reliable, privacy-respecting host for these services. You're only going to be able to pick two of those three, most likely good and reliable but not privacy-respecting (for example, Microsoft). If Microsoft ever releases another OS it's likely going to be fully on the cloud and a paid service.
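Back-of-envelope check on that figure, assuming uncompressed 1080p frames at 60 fps (just arithmetic, not a measurement):
# Rough bandwidth estimate for streaming a rendered page as raw images.
width, height = 1920, 1080   # assume 1080p
bytes_per_pixel = 3          # 24-bit colour, no alpha
fps = 60
raw_mb_s = width * height * bytes_per_pixel * fps / 1e6  # ~373 MB/s uncompressed
compressed_mb_s = raw_mb_s / 20                           # ~19 MB/s at a 20:1 ratio
print(raw_mb_s, compressed_mb_s)
So 20MB/s corresponds to roughly a 20:1 compression ratio; anything less efficient and the requirement only goes up.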

Judging from this, you probably meant proxies/servers only processing the JavaScript. I don't think that would improve your browser's speed at all. What I had in mind was using a "proxy" computer to browse from your machine, just using the other PC's resources. Your point is, replace
>your machine
>website
>script process
>your machine
With
>your machine
>proxy machine
>website
>script process
>your machine
It doesn't seem beneficial unless you're running a very underpowered potato PC with a good network speed.

If I have to allow anything more than the main site, then that tells me it's a shitty site.

this tbqhfamalam

I've tried umatrix before but I couldn't hide elements like in ublock by just clicking that one tool and selecting them on screen
Is there a way to do that now?

Use ublock + umatrix
Enable all of the filter lists in umatrix
Enable all the ublock lists except for the ones enabled in umatrix.
Use ublock for element hiding.
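For the element hiding part, uBlock cosmetic filters are just domain##selector, one per line; a couple of made-up examples:
example.com##.sidebar-ad
example.com###cookie-banner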

Sounds possible with a local nginx proxy and some sort of request router

i'll start on the logo

Why would I use both like that?
Wouldn't it be easier to just use ublock and NoScript?

>Site doesn't load properly
>Open NoScript settings
>Allow google-analytics.com
>Allow ajax.googleapis.com
>Allow fbcdn.net
>Allow a fuckload of random *-js.com sites

There's too much dependency on third-party JS libraries nowadays. You don't even need to know how to program to be a web dev, really.

As much as I program in pure HTML myself, I feel that with the proper release of HTML5 there should be an incentive for web devs to learn how to actually develop instead of relying on outdated technology.

JavaScript needs to either improve or die, or a better third-party solution needs to come along. I mean (and I know it's a biased example), the FCC website runs fine without JS and looks and functions like any other "modern" website

It's time to stop living in 2010, web devs.

It would be easier to use ublock and umatrix.

Lots of sites and even vendors are completely functional without having to allow 15 js libraries just to add an item to your cart

The only excuse is laziness and greed

Where the hell do you have to enable Google or Facebook js? Other than fb and YouTube I don't think anything needs those enabled.

click disable

then click yes when it asks if you want to keep protections but stop blocking scripts

voila

uBlock and uMatrix are designed to work well together. They have some redundancy in features, but that's not really that big of a downside.

Pretty easy to do, you should look into Phantomjs

>allowed
Who do you think has authority over that? The internet police? Stop making retarded threads you faggot OP.

>check the list for what needs to be allowed
>the list is longer than my vertical screen space

Yeah, I'm out.

Glad I switched. It's a lot easier to use than I expected.

The only thing I wish umatrix had is some kind of user repo for site-specific settings.
For example, I had to play around with YouTube to get comments to work properly, but if there were a repo somewhere that already had all those settings in it, it would be much easier to use.

Try suggesting that to gorhill on GitHub. He's a cunt and will probably insult you, but he still usually implements good features when they're recommended.

pajeets and millennial web """developers""" have ruined the web with their overreliance on javascript

Why hasn't javascript been aborted and replaced yet?

Do you really think he'd implement something like that? Seems like more of a third party system than something that would be natively implemented into umatrix development.

I like when a site starts to load, and you can see the text and shit, then everything goes away because it's using some js to change a font or something.

>all content loaded through javascript

>visit The Verge
>all computing devices in a 2 mile radius reboot due to the amount of fucking cancerous bloat

Forget noscript and adblock, just block the malicious websites with your hosts file. It won't use browser resources and it completely blocks them from your PC when using ANY & ALL applications, not just your browser.
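For example, a few lines like these in /etc/hosts (C:\Windows\System32\drivers\etc\hosts on Windows) null-route some of the usual suspects for every application on the box:
0.0.0.0 doubleclick.net
0.0.0.0 www.google-analytics.com
0.0.0.0 connect.facebook.net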

>Cloud operating systems are probably already under development or at least an idea in major companies

Why not just use an existing OS and put it on Azure or something?

>it completely blocks them from your PC when using ANY & ALL applications, not just your browser
Thanks! Now I don't have to worry about SSH-ing into a malicious domain by mistake!

Why can't the umatrix dev get their head out of their ass and just implement that feature and get rid of the redundancy?

He already has a filter for uBlock that unbreaks websites, why wouldn't he include an option to include a community-run list that just makes things more convenient?

kek at you fucking retards

When I have a web project, I use a CDN when importing scripts instead of hosting the scripts on my server.
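For example, a typical page might pull jQuery straight off a public CDN, something like this (the usual cdnjs URL pattern, version picked arbitrarily):
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.1.1/jquery.min.js"></script>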

So you are actually blocking legit scripts from the developer's CDN and allowing shady scripts from the website itself.

>2016
>people still delete their posts

For $200 you could have gotten a used Thinkpad with better specs

Who the fuck says we're just blindly allowing things? Of course we keep everything blocked unless it's necessary.

libreJS is the only good choice, you know that right Cred Forums?

the problem is how people design modern websites, not how noscript functions, FYI

>NoScript completely breaks the majority of websites

More like NoWork

I had the exact same thought process before reading your comment. I love Cred Forums

I often blindly allow things desu

the only things I never allow are the ones with 'ad' in the name

I should probably get in the habit of at least 'temporarily allowing' everything

Just allow javascript globally and use its other protections. Use uMatrix to block what you really don't want.

It's bullshit and I hate how much garbage websites load. I use ublock and umatrix, which is set to deny everything except images and css. I only allow websites I commonly use and maybe a few times a week I will come across a website that needs 500 scripts to load an image or a video, at which point I just leave the website.
I used to use noscript, but I have found umatrix to be a lot better and easier to use thanks to its granularity.
I really wonder if it will get worse in the future, which it probably will.

Novel idea
Why don't we share our filter lists for popular websites, so that we may all benefit from a safer web?
(umatrix format)
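To get it started, here's roughly my baseline; uMatrix rules go one per line as source destination type action, and the example.com lines are just placeholders for per-site exceptions:
* * * block
* * css allow
* * image allow
* 1st-party * allow
example.com cdn.example.com script allow
example.com cdn.example.com xhr allow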