Re: The messenger service is ready to use


From: carlo von lynX
Subject: Re: The messenger service is ready to use
Date: Sat, 20 Mar 2021 20:07:36 +0100
User-agent: Mutt/1.5.20 (2009-06-14)

On Sat, Mar 20, 2021 at 12:38:56PM +0100, Lukas Schmidt wrote:
> >>> Sure, you could argue that most centralized services probably just
> >>> display an option to delete or modify while keeping the old state
> >>> stored anyway. But I would like to actually provide this
> >>> functionality.
> >> I didn't understand your argument, since delete and edit operations
> >> are syntactically integrated into PSYC rather than plastered on top
> >> with custom message types. Therefore any PSYC message can contain
> >> modifications of the state, including edits and deletions, however
> >> suitable. It's one of the major differences from JSON and XML.
> > I'm not talking about partial removal or correction. I mean full
> > deletion. At least I didn't find anything about this in the paper
> > about PSYC from 2013.
> >
> > The deletion message as implemented will remove any evidence from the
> > stores of all peers. The permission for such deletions is ownership
> > of the selected message, verified by its cryptographic signature. The
> > deletion message can operate with a custom delay as well.
> 
> Auto-deletion and deletion by sender are problematic because they cannot
> be guaranteed or enforced in any way. I also find them problematic
> because, to be enforced even slightly, they go hand in hand with a loss
> of control over your device (similar to the DRM problem).

Yes, so in practice it comes down to practicalities: how practical is
it to modify your client not to support deletions? Will others receive
hints that make you look stupid? Will it be a lot more practical to just
use the client the way it is provided? In the case of Telegram you can
get a free version from f-droid.org, but not one that will ignore
sexting deletion requests. You have to go through the extra effort of
making one, and once it's done you can't even publish it without
looking like the bad, toxic-male kind of guy.

But there's an extra thing about PSYC here. Since the deletion operator
'-' is deeply rooted in the lower levels of the protocol, there isn't
even an easy way to distinguish a privacy-driven deletion from any other
kind of deletion, like removing last year's Christmas pictures to make
space for new ones, or simply removing last month's weather data, which
doesn't even reflect the actual weather that followed and is therefore
utterly useless. You can come up with better examples of legitimate
deletions of data that would otherwise only stuff up your drive with
outdated or redundant protocol snippets and blobs. That's because the
actual access to photos etc. in secushare would probably not be in the
form of files on a hard disk but rather through a virtual file system
which assembles the items on demand from the fragmented data store.
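
To make the deletion mechanism quoted above a bit more concrete - this
is only an illustrative sketch with invented names, not the actual
GNUnet messenger API - a peer would honour a deletion request only if
it is signed by the same key that signed the message being deleted, and
would apply it after the requested delay:

    /* Illustrative sketch only: invented structs and field names. */
    #include <stdbool.h>
    #include <stdint.h>
    #include <string.h>

    struct stored_message
    {
      uint8_t author_pubkey[32];  /* key that signed the original message */
      /* ... body, signature, ... */
    };

    struct delete_request
    {
      uint8_t target_hash[64];    /* hash of the message to delete */
      uint8_t sender_pubkey[32];  /* key that signed this request */
      uint64_t delay_us;          /* optional delay before it takes effect */
      /* ... signature over the request ... */
    };

    /* A peer honours the request only if the requester owns the target
     * message, i.e. both are signed by the same key; if so, removal is
     * scheduled after delay_us. */
    static bool
    may_delete (const struct stored_message *original,
                const struct delete_request *req)
    {
      return 0 == memcmp (original->author_pubkey,
                          req->sender_pubkey,
                          sizeof (original->author_pubkey));
    }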

> >>> decentralized I could query all friends of some contact without
> >>> them
> >>> knowing about it and track down when they meet anyone or befriend
> >>> them.
> >> No, you can't. Only the ones that were shown to you, and it would
> >> be visible through whom you are contacting them.
> > I didn't imply contacting them. Just being able to get contacts as
> > results in a search allows drawing conclusions from that if they come
> > from a social graph.
> >
> > Many people I know would strongly restrict such a publication of
> > contacts, which would end up making them very difficult to contact at
> > all. It's a good structure for social networking, but there are people
> > who tend to dislike social networking these days. Still, they need an
> > application to message someone.
> 
> It is a personal choice not to be easy to contact. It's the same way
> that I give my private phone number only to my close friends and am
> only reachable by them. The multiple egos in GNUnet allow this: to
> have more or less public ways of getting contacted that are more or
> less published, depending on the choice of the user. An option to ask
> for exclusion from social graphs may be sensible.

Well, it may make sense to allow for "ghosts". You know there is a person
but when you subscribe to her public channel it is completely empty. Yet
she might help you figure out which nodes to trust for routing.

> >>> So unless there is protection to prevent such an attack, I would
> >>> prefer a social graph on the application level rather than on the
> >>> service level.
> >> The graph is also needed in order to be able to trust servers that
> >> offer storage capacity, and for them to trust that you are a person
> >> worth storing data for. And similar structural jobs. Sure, you would
> >> still have encryption wrapped around that data, but it's still uncool
> >> if all the data resides on a company's cloud. The social graph offers
> >> an additional way of measuring the trustworthiness of GNUnet nodes.
> > Would it be possible to generate trust? In other words, does the number
> > of others trusting a specific peer make it trustworthy, or does every
> > user have to rate others with a chosen trust rating?
> >
> > Because the first option would still make the big companies the ones
> > considered trustworthy.

The big companies are not participating in secushare, by design and
by AGPL. It simply isn't legal to serve GNUnet nodes en masse. And even
if they were doing so, the nodes wouldn't really know much. They're
just moving encrypted padded blocks around.

Still the social graph provides an extra way to avoid rogue nodes.

> I think that trust should be rated internally and not publicly. You can
> internally choose how much you trust X as a source of information and
> as a source of the graph you get from X, and choose whether X should be
> forwarded in your own public graph or not.

Yes, but there are several kinds of trust - for the functioning of
secushare it is enough to know that a node belongs to a real person,
not a network of evil bots. You may not trust your greatest enemy a
lot, but they are still real people using real GNUnet nodes. So the
trust you personally put into specific people isn't the sort of
information needed to protect against Sybil attacks at the routing
level.
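
One way to picture that difference - purely an illustrative sketch, not
secushare's actual algorithm - is to treat routing-level trust as mere
graph proximity: is the node woven into my contact graph at all, within
a few hops, regardless of how much I like the person behind it?

    /* Illustrative only: "routing trust" as graph proximity, not a
     * personal rating.  The adjacency matrix and node ids are invented
     * for the example. */
    #include <stdbool.h>

    #define MAX_NODES 64

    /* Is `target` reachable from `self` in at most `limit` hops of the
     * contact graph `adj`?  Whether you personally like the person
     * behind the node plays no role here. */
    static bool
    within_hops (bool adj[MAX_NODES][MAX_NODES],
                 int self, int target, int limit)
    {
      int queue[MAX_NODES], dist[MAX_NODES];
      bool seen[MAX_NODES] = { false };
      int head = 0, tail = 0;

      queue[tail++] = self;
      seen[self] = true;
      dist[self] = 0;
      while (head < tail)
      {
        int v = queue[head++];
        if (v == target)
          return true;
        if (dist[v] == limit)
          continue;
        for (int w = 0; w < MAX_NODES; w++)
          if (adj[v][w] && ! seen[w])
          {
            seen[w] = true;
            dist[w] = dist[v] + 1;
            queue[tail++] = w;
          }
      }
      return false;
    }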

> >>>> Also, I'm not sure how well the use of DHT and GNS is protected
> >>>> from systematic attacks. Can I, by knowing someone's email address,
> >>>> publish a wrong pubkey in her name? Can I DoS a person out of
> >>>> existence?
> >>> Yes, you could probably publish a wrong connection if you know the
> >>> exact credentials, I assume. However, this is possible in every
> >>> other messenger as well, since most of them rely on server trust,
> >>> which basically means there's nothing preventing the server from
> >>> performing a PITM attack on its users. The only way to prevent this
> >>> is physical key exchange.

There's still a huge difference between one company or nation state
being able to steal your identity, and pretty much anyone being able
to...

> >> Whereas with the social graph, searching for Michelle would return a
> >> Michelle that is friends with Robert and a few other peeps, some
> >> other Michelle that is friends with other folks, and a fake Michelle
> >> whose only friend is the idiot who is desperately trying to fool you.
> >> I think this is a lot safer than using a DHT w/o social graph.
> > I wouldn't say it's unlikely that "the idiot" and Robert could be the
> > same person, because even trust is not equal to integrity.

Then you can tell that Robert is an idiot trying to fool you; why would
you continue dealing with him?

> > Assuming the only option in the current messenger service were
> > searching in the DHT for "Michelle", you would probably get something
> > like 10 to 1000 different entries... all called "Michelle". So people
> > clicking on the first entry assuming they can't be fooled are quite
> > enthusiastic, I guess.

That's why we don't do this. Would you in your messenger?

> > So the first option to add a contact is physical exchange (key pairs
> > get exchanged). The second option is adding a contact who is already a
> > member of a common group chat (the same key pairs will be used as in
> > the group chat). The third option is asking a contact/friend to
> > establish the connection. Then the last option is using a search via
> > the DHT, for example.
> Some kind of (optional) filtering is sensible. I think using
> trust/social graphs instead of the DHT would be the GNUnet way, though.

Thanks. By the way, nice to get to know you, Lukas!

> > The thing is, not everyone likes social networking. So some people
> > won't share contacts with anyone. However, that does not imply they
> > are strangers, distant, or unsocial.

Then contact their secretariat.

> > Also, some people only know others from communication via the internet.
> > They don't have any common friends connecting them with each other.
> > So those people could not interact at all?

secushare intentionally does not empower people to interact on the
basis of a common interest without any social control.
See http://my.pages.de/bandenbildung for the complete reasoning.
Luckily there are ways to implement social control for most legal
use cases, so this mostly applies to criminality.

> >> That answer has some flaws: it's not good if the protection against
> >> potential DoS works by avoiding socialising with people, and it is
> >> also not good if it stops working as soon as your perpetrator gets
> >> hold of your public key and can thus attack its DHT nodes.
> > Restricting certain people from contacting you is not the same as
> > avoiding socializing. You just select or filter the people you
> > socialize with. I know several people who completely avoid social
> > media because everyone can contact them, spam them with messages and
> > annoy them. So that's an issue even without considering any potential
> > DoS attacks.
> We have multiple egos for this kind of use case. Each ego can have a
> different level of public reachability.

Also I don't think any major social media platform still allows random
strangers to spam you. Soundcloud maybe. E-mail does! And now I even got
phishing texts via GSM!

> >>>> Should that assumption prove wrong, then we need approximations
> >>>> such as "whispering" over existing channels, sending irrelevant
> >>>> data and notifications to smartphones that can't even decipher
> >>>> them.
> >> You haven't argued why it wouldn't be possible to allocate
> >> all the channels one might want to have.
> > New channels for private conversations would use a random port. So
> > unless people can brute-force a 512-bit hash, I would think it's
> > unlikely that other people besides the ones invited show up in your
> > chat.

That was not what I asked (and of course channels shouldn't be
penetrable by strangers). I asked whether there is a scalability
limit to the number of channels one can have - whether setting up
multicast distribution channels is cheap or expensive.
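
(As an aside on the 512-bit figure quoted above - again only an
illustrative sketch, not the actual messenger code: a fresh channel
port would simply be 64 random bytes, so a stranger trying to guess it
faces a 2^512 search space.)

    /* Illustrative sketch: a "random port" as 64 random bytes (512 bits),
     * read from /dev/urandom.  Not the actual GNUnet messenger code. */
    #include <stdio.h>
    #include <stdlib.h>

    #define PORT_BYTES 64   /* 64 * 8 = 512 bits => 2^512 possible ports */

    static void
    make_random_port (unsigned char port[PORT_BYTES])
    {
      FILE *f = fopen ("/dev/urandom", "rb");

      if ((NULL == f) ||
          (PORT_BYTES != fread (port, 1, PORT_BYTES, f)))
      {
        fprintf (stderr, "no randomness available\n");
        exit (1);
      }
      fclose (f);
    }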

> Happy Saturday

+1

P.S. t3ss, I'm waiting in your chatroom…  ;)

-- 
  E-mail is public! Talk to me in private using encryption:
   //  http://loupsycedyglgamf.onion/LynX/
  //    irc://loupsycedyglgamf.onion:67/lynX
 //    https://psyced.org/LynX/


