"If you're seeing this message, that means JavaScript has been disabled on your browser.

Please enable JavaScript to make this website work."
techinasia.com/

I'm so sick of websites refusing to even display text and images if I don't agree to run their proprietary Javascript on my computer. Isn't it time that browsers started treating requests to run Javascript like requests to use the mic or camera, and asked the user before allowing them? Ideally with crowdsourced info about what the scripts are, and what they do? In other words, make something like #NoScript a standard part of browsers.
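To make the permission idea concrete, here is a purely hypothetical sketch in plain JavaScript, modelled on the Permissions API that browsers already use for the mic and camera. The "script-execution" permission name does not exist today, and in practice the check would live in the browser or in an extension like #NoScript rather than in page script:

// Hypothetical sketch: per-site script permission modelled on the existing
// Permissions API (navigator.permissions.query({ name: "camera" }), etc.).
// There is no "script-execution" permission today; the name is made up.
async function checkScriptPermission() {
  try {
    const status = await navigator.permissions.query({ name: "script-execution" });
    if (status.state === "granted") {
      console.log("User has allowed this site to run scripts.");
    } else if (status.state === "prompt") {
      // The browser, not the page, would show the prompt, ideally with
      // crowdsourced info about what the site's scripts actually do.
      console.log("Browser will ask the user before any scripts run.");
    } else {
      console.log("Scripts blocked here; the site should serve its no-JS fallback.");
    }
  } catch (e) {
    // In current browsers this throws, because the permission name is unknown.
    console.log("No such permission yet:", e.message);
  }
}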

@strypey Be careful what you ask for: that would kill the peer web before it started. JavaScript on the client isn’t the enemy. Business logic on the server is the enemy.

@aral that's a fine distinction. As a developer, you're in a better position than I am to know. But you'll need to convince me. Because it seems to me that outsourcing processing work to the user's PC - via JS black boxes - is exactly how #SurveillanceCapitalism achieves massive scale, while claiming that #ThereIsNoAlternative to centralized server infrastructure. A myth they've propagated so effectively that even many developers have started believing it:
signal.org/blog/the-ecosystem-

@strypey @aral

Not to mention the fact that browser development is far more centralized than server-side computation!

We can count the browser vendors on the fingers of one hand!

#JavaScript (or #WASM) execution in the browser should at least be opt-in per #Web site.

The "Peer Web" shouldn't rely on such #spyware. It should teach people to install better tools.

People can learn. 😉

@strypey @aral There's no harm in keeping an eye on when websites want to run Javascript (it can be eye-opening). Mastodon runs Javascript - that's how new toots are delivered to the feeds.
Javascript is a tool which can be used to build well-designed sites but also malicious ones.

I regularly don't return to websites which give garbage messages about Javascript being required when I know it is not necessary to deliver their content.
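For context on "that's how new toots are delivered": Mastodon's web client talks to the instance's streaming API. A rough sketch of the kind of JavaScript involved (not Mastodon's own client code; the instance name and token are placeholders):

// Rough sketch: subscribe to a Mastodon instance's streaming API so new
// toots arrive without reloading the page. "example.social" and ACCESS_TOKEN
// are placeholders; the endpoint and message shape follow the documented API.
const ACCESS_TOKEN = "...";
const ws = new WebSocket(
  "wss://example.social/api/v1/streaming?stream=user&access_token=" + ACCESS_TOKEN
);

ws.onmessage = (msg) => {
  const { event, payload } = JSON.parse(msg.data);
  if (event === "update") {
    const status = JSON.parse(payload); // the new toot, as a status object
    console.log("New toot from", status.account.acct + ":", status.content);
  }
};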

@strypey
From my experience you can say the same about any language now, thanks to WASM. Any language can be compiled to a binary format beforehand, and you have absolutely no guarantee about its functionality. I guess one thing that could be done is cryptographically signing the client-side code & then verifying it before execution. OFC that has some payload overhead, but it could still be a good tradeoff to build trust?

We should also remember that the web is extremely large and you can't always guarantee that the program you run is secure, but the same can be said about any program or system ever created.

@aral
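A minimal sketch of the signing idea above, using the standard Web Crypto API: fetch the script and a detached signature, verify against a published key, and only then execute. The URLs, key format and choice of ECDSA are illustrative assumptions, not an existing browser mechanism (Subresource Integrity already covers hashes, though not signatures):

// Sketch only: verify a detached signature over fetched script text before
// running it. scriptUrl, sigUrl and publicKeyJwk are assumed inputs.
async function loadVerifiedScript(scriptUrl, sigUrl, publicKeyJwk) {
  const [code, sig] = await Promise.all([
    fetch(scriptUrl).then((r) => r.text()),
    fetch(sigUrl).then((r) => r.arrayBuffer()),
  ]);

  const key = await crypto.subtle.importKey(
    "jwk", publicKeyJwk,
    { name: "ECDSA", namedCurve: "P-256" },
    false, ["verify"]
  );

  const ok = await crypto.subtle.verify(
    { name: "ECDSA", hash: "SHA-256" },
    key, sig, new TextEncoder().encode(code)
  );

  if (!ok) throw new Error("Signature check failed; refusing to run " + scriptUrl);

  new Function(code)(); // only runs once the signature matches the published key
}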

@buoyantair @strypey @aral Personally I really like the look of Guix for addressing this issue, because then you'd have the source code for all the programs you run.

@alcinnz
One of the reasons the web is the way it is comes down to huge market influence. Look at it from a business perspective and you can see that a lot of the new things coming up, like AMP / PWA, are ways to make the web faster so that more people can use it & therefore increase profits.
@aral @strypey
@alcinnz @aral @strypey
The same tools can be used for greater good, but I don't think this whole "open web" thing is going to continue (in fact it's already a closed, controlled and monitored web)

@buoyantair @aral @strypey What some still call the "open web" is still there, it's just that those farm houses have been covered up by skyscrapers. And from my perspective those skyscrapers look awfully shaky.

I actually get very interested in the question of how to discover and search for those independent websites. There's a heap of great stuff if you can find it.

@buoyantair @aral @strypey Ways to make it faster without giving up their surveillance capitalism that is. If performance was truly their goal, they could do so much better!

@strypey

JavaScript isn't a black box, though. You can inspect all the code that's running in your browser.

Some JS is obfuscated, but it can be easily de-obfuscated. All browser-side JavaScript is effectively open source, even if it's not licensed as such.

If your concern is about privacy, it's not the JS running in your browser that should concern you. It's the data sent from the JavaScript to the server.

It would be reasonably simple to disable AJAX, preventing data from being sent to or received from the server, while allowing all other JavaScript so that interactivity still works.

@aral
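A rough sketch of what "disable AJAX but keep the rest of JavaScript" might look like, assuming a NoScript-style extension or user script that runs before page scripts. It is deliberately incomplete (injected <img> tags, form submissions and other channels can still leak data), so "reasonably simple" is doing some heavy lifting:

// Sketch: cut off the usual script-initiated network channels while leaving
// other JavaScript (DOM manipulation, local interactivity) untouched.
const refuse = (what) => {
  throw new Error("Script-initiated network access disabled (" + what + ")");
};

window.fetch = () => Promise.reject(new Error("fetch disabled"));
XMLHttpRequest.prototype.open = () => refuse("XMLHttpRequest");
navigator.sendBeacon = () => false;            // beacons silently report failure
window.WebSocket = function () { refuse("WebSocket"); };
window.EventSource = function () { refuse("EventSource"); };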

@danjones
> effectively open source, even if it's not licensed as such.

Please don't misuse the term #OpenSource this way. It has a widely accepted definition (opensource.org/osd), and unlicensed or proprietary JS does not fit that definition. If there is no existing term to describe code that is visible but not freely reusable, feel free to coin one :)

@danjones fair points. But privacy is only a subset of a much larger concern, which is about *control*. Putting aside the argument we could have over the "black box" part of my post, the fact remains that:

> outsourcing processing work to the user's PC - via JS - is exactly how #SurveillanceCapitalism achieves massive scale, while claiming that #ThereIsNoAlternative to centralized server infrastructure.
@aral

@danjones There are many possible strategies for redecentralizing, and resolving the *many* problems with JS, some of which are described here:
gnu.org/philosophy/javascript-

I agree with @alcinnz that moving interactive functions back into native apps, leaving the web as a platform for static pages that don't require (or use) JS, is a strategy worth exploring.
@aral

@strypey @danjones @alcinnz Maybe the FSF should worry more about its logo appearing next to Google’s as they sponsor the same events than some ridiculous and ill-informed stance against a programming language that spreads FUD about potential alternatives. Remember that an AGPLv3 licensed app specifically built for drones to send hellfire missiles to little children would get the FSF seal of approval. Free Software is just a component of ethical tech but doesn’t care about ethics of use cases.

@aral I share your concerns about open source events being sponsored by Google, as do FSF, but they can't control this. As for approving of child-killing drone software, that's FUD worthy of Microsoft. FSF have often spoken out about the use of freely-licensed code to do much less anti-social things than that:
fsf.org/blogs/rms/ubuntu-spywa

Perhaps you could respond to the concerns laid out in 'The Javascript Trap' with some substance, rather than resorting to whataboutism?
@danjones @alcinnz

@aral as for the claim that the FSF's criticisms of Javascript are a ...
> ridiculous and ill-informed stance against a programming language

I note that they're far from alone in seeing JS as a problem. Plenty of experienced engineers have serious problems with it too. A quick selection off the top of my head:
* soc.freedombone.net/objects/20
* hackernoon.com/the-javascript-
* onpon4.github.io/articles/kill
@danjones @alcinnz

@aral I'm aware of the holy wars that constantly rage for and against programming languages. But AFAIK JS is the only one that results in code being downloaded and run on the user's computer on-the-fly. As onPon's article points out, that makes proprietary JS code effectively impossible to replace at the user end with free code. These are not trivial issues, and implying that they are suggests a failure to understand the scope of the problem.
@danjones @alcinnz

@strypey @danjones @alcinnz Right and what happens exactly when you have automatic updates on and a native app gets updated? Now what happens when you’ve allowed say Apple to use bitcode? Instead of vilifying JS when some of us are trying to build systems using it, let’s understand that the real issue is business logic on the server, proprietary/closed source code, and lack of reproducible builds. Spreading FUD about in-browser JS could jeopardise what I’m working on with ar.al/2019/02/13/on-the-genera

@aral
> the real issue is business logic on the server, proprietary/closed source code, and lack of reproducible builds.

Sure, these are all problems, and I get that JS isn't the only vector for them. But the way it's deployed makes it particularly vulnerable to them. You've left out the major architectural weaknesses of JS (eg the security audit nightmare created by dependence on hundreds of third-party modules). As for Apple, the FSF criticize their practices harshly elsewhere, as I'm sure you know.

@aral
> could jeopardise what I’m working on

This is neither here nor there, but it does suggest you're getting too emotionally close to the issues to be totally objective in your analysis. For the discussion to continue productively, it's probably best to look at it from 50,000 ft, and purely from a user POV, pretending for the sake of argument that you have no skin in the game as a technology creator.

@strypey You bet I’m emotionally invested in it – I’m not sitting in an ivory tower perpetuating some bs notion of neutrality in the matter while enjoying my tenure. I’m building what I’m building because I care about the issues not the other way around :) (And I’d further argue that objectivity is impossible for any being with self interests – even base ones like a need for food or shelter. The best we can be is transparent about our biases and subjectivity.)

@aral make no mistake, I believe you really do care about protecting users from surveillance capitalism, as I do, which is why I regularly signal boost your stuff. All I'm saying is that folks who really like their hammers have a tendency to start seeing every problem as a nail. It's important not to let a sunk cost fallacy prevent you from considering other options that might also work, and corner you into interpreting any suggestion along those lines as a demand to ban hammers ;)

@aral @strypey

> objectivity is impossible

True but only because we are limited by our physical nature.

> for any being with self interests

Yet some people act independently of their self-interests.

I'm one, so I know they exist.

I'm not particularly good or wise, I'm just curious.

@aral, just like @strypey
I appreciate your work, but I think you are scared of something that isn't going to affect it.
Let's assume that one day #JavaScript and #WASM execution becomes opt-in in #Web #browsers: your application is not a web browser, so nothing would change.

This is not #FUD, but a real attack by the #Russian gov (mostly) on its own citizens: bugzilla.mozilla.org/show_bug.

#JS enabled this #surveillance, and if it was #Google doing the same we would have never known it.

@strypey @danjones @alcinnz Right, and as an experienced engineer (if 35 years of experience in programming counts for anything) who cares deeply about this problem, I’m telling you that a client-side-only JavaScript-based approach is essential to building a bridge from surveillance capitalism to a peerocracy. If you haven’t, please read what I wrote here, where I explain why JS in browser – if not linked to business logic on the server – is not the enemy: ar.al/2019/02/13/on-the-genera

@aral I finished reading this article today, after starting it yesterday, and reposted it so anyone following my account on the birdsite gets it, as well as those on the fediverse. It's a good overview. I didn't say JS is the enemy; I argued that it ought to be opt-in, so that:
a) users can protect themselves from the harms it is already known to do
b) web designers are incentivized to use it only when it's really needed, not to add trackers etc to what ought to be static pages

@aral if JS was opt-in, people would opt in when that made sense for their use case, or use a native app when that made sense. Much less bad JS would be written, because it couldn't be slid in under the door and run in someone's browser without their informed consent. It would be good for users, good for native apps, and good for JS.

@aral @strypey @danjones I thought we agreed the other day that these concerns are orthogonal. I'm not against your webapps, but auto-executing JS is a valid concern.

If you're afraid of lack of JS hurting you, I encourage you to write good <noscript> messages promoting the native apps.

P.S. As you should well know from Better there's plenty of Fear, Uncertainty, and Doubt to be had around what JS is doing.

P.P.S. I have read your blogpost.

@strypey Hmm, I couldn't read @bob's comment about how much trouble JavaScript has caused him until I allowed the Pleroma instance there to run JavaScript 😆 😭

@jamey case in point. Even a logged out user with JS turned off ought to be able to see text, images, and some level of layout sanity, using pure HTML/CSS. Maybe file that as a bug on the Pleroma issue tracker? Same with any other fediverse platform that doesn't have a sane fallback for non-JS visitors.
@bob

@strypey @aral @danjones @alcinnz I've been programming for 30 years and I think javascript is very bad. let's not forget it killed off better alternatives at the behest of Google, who embrace it because it powers their surveillance. you can't deploy javascript without tacitly approving of Google's (and other surveillance capitalists') use of it imo.

@walruslifestyle
> let's not forget it killed off better alternatives at the behest of Google

To what alternatives are you referring?

IIRC, when JavaScript was gaining steam, the closest thing to a serious contender was VBScript, which only worked in IE.

Also, Google didn't exist yet.

@strypey @aral @alcinnz

@danjones @strypey @aral @alcinnz Google's existed since 1996. at that time you could run Tk/Tcl scripts in Netscape navigator,with a plugin. you could run perl scripts too, which was big at the time because perl/CGI was the main way to make a web site dynamic back then. Java applets, which made an effort to be secure, were also starting to appear

@walruslifestyle

None of those alternatives worked without a browser plugin, unlike JavaScript.

Because of that limitation, I wouldn't consider them as serious contenders.

And Java applets were an awful user experience.

And Google certainly didn't have the clout to kill any of those off at the time.

@strypey @aral @alcinnz

@danjones
> None of those alternatives worked without a browser plugin, unlike JavaScript.

Sure, but that's only because there was no #W3C standard saying the dominant browser vendors ought to implement them in the browser, and no consensus among them on de-facto standards. Javascript is fine for prototyping, but before anything gets mainstream use, it ought to be standardized and put into all the browsers, not sent down the pipe indefinitely. It's amateurish.
@walruslifestyle @aral @alcinnz

@strypey @danjones @aral @alcinnz agree, and it's also dangerous because all it takes is a monopoly or duopoly among browser makers for the entire industry to consolidate around a technology that benefits their business interests. instead of having a healthy ecosystem of alternatives so that developers can choose what works best for them

@walruslifestyle @strypey @aral @danjones Just curious. What are you considering to be the better alternatives it killed off?

@alcinnz @strypey @aral @danjones minimally, code execution in the browser should be managed by an open plugin architecture that can be security audited and user controlled. that was starting to come into being when I was into web development 20-ish years ago, but died and was replaced with the javascript-as-ubiquitous-target state we're in today

@danjones @strypey @aral Keyboard interception (clipboard use prevention), and bad "while true" loops, and back button breaking, and auto-clicks on mouse over, and silent href alteration, code dehydration... none of that requires AJAX and still makes me distrust my browser.

@danjones @strypey @aral
> All browser-side JavaScript is effectively open source, even if it's not licensed as such.

erm.

> The “source code” for a work means the preferred form of the work for making modifications to it. “Object code” means any non-source form of a work.

gnu.org/licenses/gpl.html

@aral @strypey I disagree. Controlled code execution in either place is fine.
If people can know and control what is executed, that's fine, IMO.

JavaScript and browser infrastructure are heavy for lightweight clients. And ethics also has an accessibility angle... Hm...

@aral @strypey I don't begrudge you for using the expedient tools of today, but I must ask that you not stand in the way of developments that NEED to happen to ensure all the software we run is libre. So we can do better than whack-a-mole efforts like your Better (which I really appreciate, btw, thank you!).

Apple and Microsoft already learnt their lesson about auto-executing CD software, but that vulnerability only moved to The Web.

@aral @strypey I agree strongly with you that logic must be clientside.

I disagree that The Web is a good tool for that. It has its place, but we really need to be pushing for libre native apps following open standards.

And browsers should assist with that by pointing people towards the apps they need to use those standards securely. I've implemented that for Odysseus.

@alcinnz @strypey Agree. But we need untrusted relay nodes to guarantee findability and availability to match expectations of the levels possible in centralised systems if we want to build a bridge from here to there. We can shed those training wheels once there are enough nodes. But without them we won’t get adoption.

@aral @strypey Yes, we do need these untrusted relay nodes, but I do not see how they relate to the argument we are having. Why can't we be distributing native apps for each platform?

smashingmagazine.com/2012/06/m

@alcinnz @strypey I agree. But I don’t have the resources to do so. So I’m starting with an offline browser app first and then hopefully we’ll inspire others to build native apps. But there is no one size fits all; a plurality of approaches and initiatives can only make us stronger.

@aral @strypey And as I said at the start, I don't begrudge you for doing so.

@aral
Why should there be a "client" if the logic is entirely on the client side? We can just turn to P2P networking and destroy the idea of "servers"
@strypey @alcinnz

@VeintePesos
> We can just turn to P2P networking and destroy the idea of "servers"

I like this idea politically. But after 20 years of waiting, I've yet to see a single piece of network software that can work entirely #P2P, without supernodes of any kind for bootstrapping, relay, ID mapping etc. Can you think of one? #BitTorrent needs both trackers and search sites to be useful. Even #BlockChains need "miners", which are supernodes, not pure P2P.
@buoyantair @aral @alcinnz

Having said that, I agree that supernodes don't have to be as centralized as servers. The #fediverse takes this one step, by connecting standard servers. The next step might be to disaggregate server functions, with some specializing in authentication, or media storage, or search, and so on. But we can't get rid of the concept of servers entirely without giving every device a persistent IP address (#IP6), and creating a decentralized replacement for #DNS.
@VeintePesos @buoyantair @aral @alcinnz

@buoyantair I don't know enough about the network topology of #Scuttlebutt to be sure, but from what I've learnt so far, it seems unlikely to be able to work on a large scale without a much greater use of "pubs" and other such supernodes. I don't see that as a problem (see the second post). To me, it just means that server/client and P2P are two ends of a spectrum, rather than a hard dichotomy, and that successful future network tech will involve hybrids of the two.
@alcinnz @aral @VeintePesos
