@sorenpeter@darch.dk well edits can be detected with either approach really
Summary of Discussions (as best I can):
- @lyse@lyse.isobeef.org and @sorenpeter@darch.dk value simplicity. Both Lyse and Sorenpeter support location-based addressing.
- @falsifian@www.falsifian.org believes we should continue to develop ideas and extensions progressively over time like we’ve always done.
- @david@collantes.us @quark@ferengi.one and @bender@twtxt.net would like a better user experience, especially when threads break due to edits, deletions or feed location changes.
- @anth@a.9srv.net would like to see utf-8 mandated, and the threading model remain largely the same as it is today, which is primarily based on the convention of a Twt Subject anyway, Twt Hash(es) just make the threading “more precise”. Anth also states that format, client and server specification/recommendations should be kept separate.
- @movq@www.uninformativ.de @xuu sorry you two haven’t said too much really, so I’m not too sure?
Overall, the 22 votes we’ve had on the poll from the community (if you can call it a community?) have clearly shown that:
- We continue to support content-based addressing. (65/35)
- We think about formally supporting edits/deletes (60/40)
- We do not increase the use of cryptography (throwing things like authenticity and identity out the window) (70/30)
And overall the NPS (net promoter score) of “Would I recommend Twtxt to a friend” is a whopping 7/10 (which is crazy! 🤯)
Let’s have our monthly catch up soon™ (1hr) and discuss together. My own take on the direction we should take at this point is as follows:
- We continue to use hashing for the threading model.
- We think about changing this to SHA-256 for simplicity.
- We either adopt @anth@a.9srv.net’s UUID approach or @lyse@lyse.isobeef.org’s Dynamic URL approach.
- We continue to incrementally/progressively improve things over time as @falsifian@www.falsifian.org suggested.
- We think about mandating utf-8 as @anth@a.9srv.net suggests which makes things so much easier for everyone.
- We further discuss the merits/ideas of supporting formal Edit/Delete requests or other ways to better support this in some way.
@lyse@lyse.isobeef.org Got time now before you head off?
@xuu@txt.sour.is Oh geez! Is this anywhere near you?
@falsifian@www.falsifian.org Thank you! 🙏
If we want this though (or some of us do) I will probably have to make the hard decision here to just fork from Twtxt entirely and define a completely new spec. If we care about the UX we need a few properties (some of which we have, some of which we don’t have and some of which are “weak”):
- Authenticity
- Integrity
- Precision
- Versioning
The last one involves actually supporting the notion of “Edits” and “Deletes” IMO more formally. Without this it would be quite hard to support a strong/good UX. Another way to think about this is “Versioned Twts”.
I think the only legit way of preventing this kind of “spoofing attack” would be:
Digitally Sign Twts: Each Twt could be digitally signed using a private key associated with the UUID. The signature would be calculated over the concatenation of the UUID, timestamp, and content. The public key could be published along with the feed so anyone can verify the authenticity of the Twt by checking the signature. This approach ensures that only the true owner of the UUID (and the corresponding private key) can produce valid hashes.
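Just to make that concrete, here’s a rough sketch of what that could look like with an Ed25519 key and plain OpenSSL (nothing official, all file names made up; `$UUID`, `$TIMESTAMP` and `$CONTENT` are placeholders for the twt’s fields):

# Hypothetical sketch: sign the (uuid, timestamp, content) tuple with an Ed25519 key.
openssl genpkey -algorithm ED25519 -out feed_private.pem
openssl pkey -in feed_private.pem -pubout -out feed_public.pem   # publish this alongside the feed
printf '%s\n%s\n%s' "$UUID" "$TIMESTAMP" "$CONTENT" > twt.dat
openssl pkeyutl -sign -rawin -inkey feed_private.pem -in twt.dat -out twt.sig
# Anyone with the public key can then verify:
openssl pkeyutl -verify -rawin -pubin -inkey feed_public.pem -in twt.dat -sigfile twt.sig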
Which leads us to more Cryptography. Something which y’all voted against.
@bender@twtxt.net This is sadly where you need two things:
- A `/twtxt.txt.sig` (detached signature)
- Or a way to sign the `# uuid =` with a key that can be verified.
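(If we went the detached signature route, here’s a sketch of how that could already be done today with stock OpenSSH tooling; the `twtxt` namespace label is just made up:)

ssh-keygen -Y sign -f ~/.ssh/id_ed25519 -n twtxt twtxt.txt    # writes twtxt.txt.sig
ssh-keygen -Y verify -f allowed_signers -I you@example.com -n twtxt -s twtxt.txt.sig < twtxt.txt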
Hmmm and as I write this actually, I think this doesn’t work either, because you can still just copy it regardless. Hmmm @xuu help me out here? How do we prevent “spoofing”? 🤔
That page says “For the best experience your client should also support some of the Twtxt Extensions…” but it is clear you don’t need to. I would like it to stay that way, and publishing a big long spec and calling it “twtxt v2” feels like a departure from that. (I think the content of the document is valuable; I’m just carping about how it’s being presented.)
It’s for this reason I’d like to try changing the Twt Hash extension to use SHA-256 which is a far more common tool available pretty much everywhere. I think the effort involved in “precise threading” (using content addressing) becomes much easier to “author” (note that participating in an existing thread has always been trivial, just copy the Twt Subject in your Twt).
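Something like this works on pretty much any box today (just a sketch; it assumes the existing url + timestamp + content construction and only swaps the hash function, the exact encoding/truncation would still need to be nailed down):

printf '%s\n%s\n%s' "https://example.com/twtxt.txt" \
  "2024-09-28T11:19:25+10:00" "Hello World!" | sha256sum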
Again, I like this existing simplicity. (I would even argue you don’t need the metadata.)
I argue you do. It’s nice to have the `@nick@domain` a feed author prefers to be called by, rather than you just making shit™ up haha 😝
It’s also quite nice to have a visual representation of the feed too. `description` can be optional.
Without this, feeds are a bit too “bland” IMO.
@falsifian@www.falsifian.org Yeah I agree with this actually (introducing too many changes at once is often a bad idea):
but IMO that shouldn’t be done at the same time as introducing new untested ideas
@bender@twtxt.net Bahahahahahahaha 🤣
This is why we need “authenticity” 🤣 Yes if you copied my feed’s UUID, then you’d end up generating identical hashes to me if we posted at identical times with identical timestamps. Not good 😌
Also, was the dot after the timestamp intended?
No, sorry.
For example a v2 spec might just simply mandate the following as a starting point:
cat > ~/public_html/twtxt.txt <<EOF
# nick = $USER
# avatar = https://example.com/$USER.png
# description = Hi 👋 I'm Bob!
# uuid = 7E9BC039-4969-4296-9920-4BACDBA8ED5C
2024-09-28T11:19:25+10:00 Hello World!
EOF
And:
- Serve your file with `Content-Type: text/plain; charset=utf-8`
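(Easy to sanity-check too, e.g. with curl against your own URL:)

$ curl -sI https://example.com/twtxt.txt | grep -i '^content-type'
Content-Type: text/plain; charset=utf-8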
@falsifian@www.falsifian.org I don’t have a problem with continuing the way we have been for the past ~4 years, little extensions and improvements that we try along the way. That has worked quite well 💪 As a blind person myself, I can totally empathise with reading a full (lots of text) spec. Even if we decide to combine all the ideas into a fully fleshed-out v2 spec, it might be worthwhile having a cut-down version that is as simple as it can be and no less.
Having a big long “twtxt v2” document seems less inviting to people looking for something simple. (@prologic@twtxt.net you mentioned an anonymous comment “you’ve ruined twtxt” and while I don’t completely agree with that commenter’s sentiment, I would feel like twtxt had lost something if it moved away from having a super-simple core. Deprecating non-UTC times seems reasonable to me, though.)
See https://yarn.social (especially this section: https://yarn.social/#self-host) – It really doesn’t get much simpler than this 🤣
@falsifian@www.falsifian.org We’ve been doing this for years:
There are lots of great ideas here! Is there a benefit to putting them all into one document? Seems to me this could more easily be a bunch of separate efforts that can progress at their own pace:
@bender@twtxt.net I’m not following it, but someone on my pod is 🤣 And yes based on statistical evidence, I doubt you’ll see a reply either 🤣
@doesnm@doesnm.p.psf.lt The useragent tool now natively supports the Caddy (JSON) logfile format. 🥳
This is a 1-way feed by the looks 🤣 Maybe someone can figure out how to reach out to this person and see if they’re aware and interested in something a bit more “social” (albeit slow) 🤣
@falsifian@www.falsifian.org Sorry I didn’t make that super clear 🤦♂️ Be happy to see you there and some new folks 🙇♂️
This Facebook/Meta story on storing passwords in plain text is just wow 😮 – Like how da fuq does a company, or anyone for that matter in the business of software / technology even do this?! Like at least base64 encode the fuckers right?! (oh wait 🤦♂️)
`yarnd`, however:
@xuu If you have time, could you help me pinpoint this bug? 🐛
@lyse@lyse.isobeef.org It’s from 12pm to 4pm UTC so if you can make it at all, that’d be great 👍
@xuu Do you think we should just detect edits at the client-level then? 🤔
Probably the best idea I’ve heard/seen so far is @anth@a.9srv.net’s idea of a feed having a `# uuid =` (if present), otherwise just falling back to the URL you fetched it from, and dropping the idea of a feed `# url =` entirely.
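Client-side that’s basically just (totally hypothetical sketch, only to illustrate the fallback):

# Prefer a declared "# uuid =" field; otherwise fall back to the URL the feed was fetched from.
feed_id() {
  fetch_url="$1"; feed_file="$2"
  uuid=$(grep -m1 '^# uuid = ' "$feed_file" | cut -d '=' -f 2- | tr -d ' ')
  echo "${uuid:-$fetch_url}"
}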
@lyse@lyse.isobeef.org Yup you’re right, it’s a terrible idea 💡
Well the poll clearly shows:
- ~65/35 in favor of Content Addressing
- ~60/40 in favor of supporting Edit/Delete
- ~70/30 against more cryptography
And an NPS score of 7/10 🤣
@bender@twtxt.net Zero technical issues 🤣 I never claimed otherwise 😅
@bender@twtxt.net Yes but you’ve got me curious now 😅
Okay, co-founder of WordPress and CEO of Automattic.
What has the poor guy done? 🤣
@david@collantes.us Who’s Matt Mullenweg? 🤔
Like really tbh, it’s just a matter of abstracting out the “fetching” part of your client. There are zero issues with fetching Gopher/Gemini hosted feeds. They just lack any mechanisms for Discovery and Caching.
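i.e. something along these lines (hand-wavy sketch; `gemini-get` is a stand-in for whatever Gemini fetcher you’d actually plug in):

fetch_feed() {
  case "$1" in
    http://*|https://*) curl -s "$1" ;;
    gopher://*)         curl -s "$1" ;;       # curl speaks gopher when built with it
    gemini://*)         gemini-get "$1" ;;    # stand-in for a real Gemini client
    *) echo "unsupported scheme: $1" >&2; return 1 ;;
  esac
}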
@doesnm@doesnm.p.psf.lt Still haven’t received it. Did you send to james at mills dot io? 🤔
@movq@www.uninformativ.de I don’t think I intend to either tbh for `yarnd`. If there was any poorly worded “things”, it was merely pointing out lacking capabilities for caching and discovery.
@bender@twtxt.net Oh so what you’re saying is “we” (royal we) ruined Twtxt 🤣
`james` instead 🤣
@doesnm@doesnm.p.psf.lt Are you sure? Not seen the mail yet…
@aelaraji@aelaraji.com LOL 😂 Here’s one for you:
You can take IRC out of my cold 🥶 dead 😵 hands 🙌
@doesnm@doesnm.p.psf.lt Ooops you might want to re-send that to `james` instead 🤣
@aelaraji@aelaraji.com It sadly does not it seems. 🤣 Seems like the search engine has come across mentions of your feed via its other two protocols 🤣
$ inspect-db yarns.db | jq -r '.Value.URL' | grep 'aelaraji.com'
https://aelaraji.com/test_feed.txt
https://aelaraji.com/twtxt.txt
@doesnm@doesnm.p.psf.lt My Salty public key is:
kex1fhxntuc0av7q48hlfj970ve297dzzghn82wp5cahr9r92y8rlrqqtwp983
@doesnm@doesnm.p.psf.lt Do you have a sample Caddy log file you can supply? I’ll see if we can improve the tool 👌
@doesnm@doesnm.p.psf.lt For a sample access log? Which tool are you using?
@doesnm@doesnm.p.psf.lt I don’t think it does. I think it’s completely different to what you’re thinking.
@doesnm@doesnm.p.psf.lt Yeah just move your feed. It’s totally fine. Don’t worry about it.
@doesnm@doesnm.p.psf.lt I couldn’t find any references to this anywhere either.
@doesnm@doesnm.p.psf.lt Like now?
@doesnm@doesnm.p.psf.lt I have no idea to be honest 🤣 I’m actually not really sure how you can ruin something by improving it 🤦♂️
We:
- Drop `# url =` from the spec.
- We don’t adopt `# uuid =` – something @anth@a.9srv.net also mentioned (see below).
We instead use the `@nick@domain` to identify your feed in the first place and use that as the identity when calculating Twt hashes: `<id> + <timestamp> + <content>`. Now in an ideal world I also agree: use WebFinger for this and expect that for the most part you’ll be doing a WebFinger lookup of `@user@domain` to fetch someone’s feed in the first place.
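(For reference, a standard WebFinger (RFC 7033) lookup is just an HTTPS GET, e.g. with a made-up user:)

curl -s 'https://example.com/.well-known/webfinger?resource=acct:bob@example.com'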
The only problem with WebFinger is should this be mandated or a recommendation?