@kate Looks like they're desperate for funds. It is alarming, and we should support ff somehow.
If ff fails, the web will remain Chromium-only (except maybe the caves of Gemini). That would leave Google alone to set all the rules. A somewhat doomsday scenario. So don't be too picky, and keep supporting ff. If their funding fails, we will all get ads in the Chromium address bar, or something worse.

@dudenas @kate As far as I'm concerned Google is already setting all the rules!

I wish Mozilla the best of luck in holding this back as long as possible, but unfortunately they find themselves having to make these compromises. I don't think they can do what it takes to save the web without losing what influence they still have...

@alcinnz @kate

Another thing - open source software, especially something as crucial as ff, can clearly be classified as a public good. It is weird how there is no political will to appreciate and fund it. At least in countries I see. Is it because there is no demand from voters? Maybe there is demand, but not yet articulated?

> It is weird how there is no political will to appreciate and fund it. At least in countries I see. Is it because there is no demand from voters? Maybe there is demand, but not yet articulated?

I tried to do my bit by emailing the tech spokespeople of NZ political parties a link to Nadia Eghbal's 'Roads and Bridges' report, with a bit of contextualizing comment about why it's important they read it.

fordfoundation.org/media/2976/

@dudenas @alcinnz @kate @onepict @humanetech

#NadiaEghbal #funding

@strypey @dudenas @alcinnz @kate @onepict @humanetech

Except that roads and bridges are not exactly virtual or electronic infrastructure, just to add to the confusion. Tell a politician about virtualization and they immediately think about money.

@strypey @dudenas @alcinnz @kate @onepict @humanetech
@aral
Perhaps the Common Browser proposal should open with the line: “The Common Browser Programme is not about money.”

@strypey @dudenas @alcinnz @kate @onepict @humanetech @aral

So far just the idea has been floated, but apparently the need for it will become real. I would be very interested in this, also as an antidote to those who claim that open source is automagically a commons, because most open source software has not been created by commoners (see en.wikipedia.org/wiki/Elinor_O ).

@gert @strypey @dudenas @alcinnz @kate @onepict @aral

Slightly OT: I saw that Drew DeVault started working on visurf, based on NetSurf, and intends to create an HTML + CSS framework specifically targeting smaller browsers as first-class citizens.

drewdevault.com/2021/09/11/vis

@humanetech @gert @strypey @dudenas @alcinnz @kate @onepict Sadly, without client-side (you know, the side YOU control) JavaScript, it can’t be used to implement small web sites (how are you going to ensure your keys are held only by you?) The problem is trusting servers. Client-side JS that you own and control, if you can verify the source, isn’t the problem, it’s actually the solution to protecting your privacy on the web.

@aral @humanetech @gert @strypey @dudenas @alcinnz @kate @onepict

The key point here is “if you can verify the source”. This is in practice impossible, and JS is executed as the page loads. We can’t expect people to inspect the source code of every page before rendering.

I don’t see why JS is needed to implement small sites. I only use it for warmedal.se/~wobbly/ and even then only for a nicer UX. It could as well have been an ordinary web form.
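
To illustrate the “nicer UX” point: a plain web form works with JS disabled, and a few optional lines merely avoid the full-page reload. A generic sketch of that pattern (the #guestbook id and endpoint are made up, not warmedal.se's actual code):

```javascript
// Progressive enhancement: the <form> posts normally without JS;
// this handler only runs when JS is available and smooths the submit.
const form = document.querySelector("#guestbook"); // hypothetical form
form?.addEventListener("submit", async (event) => {
  event.preventDefault(); // skip the full page reload
  await fetch(form.action, { method: "POST", body: new FormData(form) });
  form.reset(); // same result as the plain form, nicer UX
});
```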

@tinyrabbit @humanetech @gert @strypey @dudenas @alcinnz @kate @onepict It is not impossible, it’s just not possible within the confines of current browsers. Entirely possible via an extension or third-party app, etc.

We need it for Small Web (small-tech.org/research-and-de) because there’s no other way for you to own your own keys or ensure that your content is end-to-end encrypted.
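
For a rough picture of what “owning your own keys” means in practice, here is a minimal sketch using the standard WebCrypto API available in every major browser. It is illustrative only, not Small Web's actual code:

```javascript
// Sketch (run inside an async function): generate a keypair in the
// browser so the private key never leaves the client, then encrypt
// locally. Only ciphertext ever reaches the server.
const keyPair = await crypto.subtle.generateKey(
  {
    name: "RSA-OAEP",
    modulusLength: 2048,
    publicExponent: new Uint8Array([1, 0, 1]),
    hash: "SHA-256",
  },
  false, // not extractable: the private key cannot be exported
  ["encrypt", "decrypt"]
);

const ciphertext = await crypto.subtle.encrypt(
  { name: "RSA-OAEP" },
  keyPair.publicKey,
  new TextEncoder().encode("a private note")
);
```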

@aral
Hm, it may make sense as a long shot. But almost nobody can perform a security audit on their own. That means you need to trust some agent. As I think of it, a perfect model would be one where I could choose an agent I trust to verify content for me.

On the client side, probably most antivirus software claims to audit web content. But I admit, I usually consider them more annoying than most viruses.

@tinyrabbit @humanetech @gert @strypey @alcinnz @kate @onepict

@dudenas @tinyrabbit @humanetech @gert @strypey @alcinnz @kate @onepict Well, there’s verification and there’s verification. I’m not talking about a source code audit but at least verifying that the signature of the file matches what the organisation you trust says it should be. Beyond that, yes, a bigger issue is having trusted agents that actually perform things like source code audits.

@aral @dudenas @humanetech @gert @strypey @alcinnz @kate @onepict

So which companies should I trust? How do I decide? What about vanilla JS in <script> tags? Small unknown libs?

I don’t trust React, Angular or a dozen other equally bloated libs, no matter which CDN offers them. I don’t think we can build a trust system that can provide any security or trust in a meaningful definition of those words.

1/2

@aral @dudenas @humanetech @gert @strypey @alcinnz @kate @onepict

If you want trustworthy e2ee then NO JavaScript is the way to go. But in practice we pick a vendor we trust with it, like Signal or Telegram, or we might use GPG.

I really don’t see how the client has any meaningful control over a client side script other than deciding whether it should be executed or not.

@tinyrabbit @aral @dudenas @humanetech @strypey @alcinnz @kate @onepict

> I don’t think we can build a trust system that can provide any security or trust in a meaningful definition of those words.

Perhaps not, but communities can.

@tinyrabbit

You trust who you choose to trust.
That's basically what Free Software is about.
If you trust no one, you audit the source code yourself.

@aral @dudenas @humanetech @gert @strypey @alcinnz @kate @onepict

@Iutech @aral @dudenas @humanetech @gert @strypey @alcinnz @kate @onepict

I'm sorry, but this is pretty naive.

As I said, JS is executed in the browser before I've decided whether I trust it or not. Let's say I have a plugin where I allowlist sources. How do you suggest I keep that updated with all the hundreds of frameworks that pop up every day? And how do I deal with vanilla JS? Trust or not?

It makes more sense to block JS and only enable it on *sites* I trust. Not JS sources I trust.

@tinyrabbit

We need JS code signatures and signature verification (I believe that's what @aral was talking about).
Obviously that imposes constraints (each update to the JavaScript code of a website would require re-signing it), but it's not impossible, and these constraints are reasonable for security-minded people.

@dudenas @humanetech @gert @strypey @alcinnz @kate @onepict
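
Browsers already ship a weaker cousin of this idea: Subresource Integrity (SRI). It is hash verification rather than full code signing, but the browser does refuse to execute a script whose hash doesn't match. A minimal sketch of generating the hash with Node.js (the file name and CDN URL are hypothetical):

```javascript
// Compute a Subresource Integrity (SRI) hash for a script file.
// Note: SRI pins one exact file by hash; unlike a signature it
// doesn't prove who published the file.
import { createHash } from "node:crypto";
import { readFileSync } from "node:fs";

const digest = createHash("sha384")
  .update(readFileSync("angular.min.js"))
  .digest("base64");
console.log(`integrity="sha384-${digest}"`);

// Used in the page as:
// <script src="https://cdn.example.com/angular.min.js"
//         integrity="sha384-..." crossorigin="anonymous"></script>
```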

@Iutech @aral @dudenas @humanetech @gert @strypey @alcinnz @kate @onepict

But any JS library will allow devs to do whatever JS is capable of doing, which is a lot. There's no guarantee that evilcorp.com uses the latest version of React in an ethical and -- to me -- secure way even if the source and signature check out.

@Iutech @aral @dudenas @humanetech @gert @strypey @alcinnz @kate @onepict

Sorry, we may be talking about different things. Let's say I trust goodsite.org to run JS, and they import Angular. Then of course I want to be certain that the file imported is the actual Angular source that they intend to run, and not something malicious inserted in a supply chain attack.

I just realised that's probably what you mean, in which case we're in full agreement 😆

@Iutech @tinyrabbit @aral @dudenas @humanetech @gert @strypey @alcinnz @kate @onepict I think the ship has sailed with respect to trusted code. The only solution is to not trust any code, and just isolate everything. It's the kind of pragmatic approach taken by Qubes OS.

Even if there were a practical way to only run trusted code, that code could still have bugs which lead to security issues. Letting the code run in isolated containers neatly deals with this, at the expense of making inter-container communication more cumbersome (as any Qubes user will know).
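
For what it's worth, browsers expose a crude version of the same isolation idea. A sketch (the embedded URL is hypothetical):

```javascript
// Run untrusted content in a sandboxed iframe instead of trusting it.
// An empty sandbox attribute denies scripts, forms and same-origin
// access; capabilities are added back selectively.
const frame = document.createElement("iframe");
frame.setAttribute("sandbox", "allow-scripts"); // scripts run, but in an opaque origin
frame.src = "https://untrusted.example.com/widget.html";
document.body.appendChild(frame);

// Crossing the isolation boundary is explicit and message-based,
// much like the deliberately cumbersome inter-VM channels in Qubes.
frame.addEventListener("load", () => {
  frame.contentWindow.postMessage({ hello: "from the host page" }, "*");
});
```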

@loke @Iutech @tinyrabbit @aral @dudenas @humanetech @gert @strypey @kate @onepict

I used to take that "the ship has sailed" stance... Now I think there are additional reasons to see about walking it back...

@alcinnz @Iutech @tinyrabbit @aral @dudenas @humanetech @gert @strypey @kate @onepict I believe that we need some way of running code that is downloaded from a remote server.

From my perspective, this need exists. JS is of course a terrible way to deliver software, but regardless of technology, we still need to be able to run code from remote sources.

So yes, I agree that JS on web pages is bad, and we need a better way to deliver content. But the issue isn't security. Getting rid of JS would have a benefit for privacy though.

With regards to downloading code from the Internet: as ActiveX already showed us in the 90s, relying on trusted code simply does not work. We have to assume that anything you download is hostile, and isolation is the only solution I know of that actually works and can be used today.

@loke @Iutech @tinyrabbit @aral @dudenas @humanetech @gert @strypey @kate @onepict O.K., at least partially agreed!

(I quite like package managers as a primary mechanism...)

@alcinnz @Iutech @tinyrabbit @aral @dudenas @humanetech @gert @strypey @kate @onepict Nothing wrong with package managers. My issue with them is that while they provide structure to the deployment of software (especially things like NixOS, even though I'm not a fan of its actual implementation), very few of them even attempt to provide some form of isolation.

The only system that tries is Flatpak, and it's nowhere near perfect.

@gert What I'm actively doing in my own amateur browser dev is to drop JavaScript support, and have the browser recommend compatible apps from your package repositories that can open as-yet-unsupported links.

I've got multiple reasons to drop JS:
1) Far too much work
2) Security
3) UX control

@loke @Iutech @tinyrabbit @aral @dudenas @humanetech @strypey @kate @onepict

@alcinnz

I agree that you are eliminating a number of problems, but also important features. Of course you should go ahead and see how that would work.

@loke @Iutech @tinyrabbit @aral @dudenas @humanetech @strypey @kate @onepict

@gert Yup, that's true. And if others are interested in exploring with me, I'll be happy to co-design alternative means of achieving those important features whilst addressing my concerns! Which don't involve immediately running programs before I've had an opportunity to audit them.

@loke @Iutech @tinyrabbit @aral @dudenas @humanetech @strypey @kate @onepict

@aral @tinyrabbit @humanetech @strypey @dudenas @alcinnz @kate @onepict

Although many think SSH is also flawed, it needs neither a central nor a decentralised authority. (ducking under the table)
