@johanbove@johanbove.info Way to go! What will you be reading? Got anything planned?
Cancelled Mastodon because the time spent on it could have been used for reading books instead and the level of interaction is not enough to keep me interested.
@prologic@twtxt.net I say we should find a way to support mentions with only url, no nick, as per the original spec.
- For `@<nick url>` we already got support.
- For `@<nick>` the posting client should expand it to `@<nick url>`; if not, then the reading client should just render it as `@nick` with no link.
- For `@<url>` the sending client should try to expand it to `@<nick url>`; if not, then the reading client should try to find or construct a nick based on:
  - Look in the twtxt.txt for a `nick =` field
  - Use the (sub)domain from the URL
  - Use the folder or file name from the URL
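A rough sketch of that fallback chain in Go; `guessNick` and the `feedMeta` map are made-up names just to illustrate the order of the lookups, not part of any existing client:

```go
package main

import (
	"fmt"
	"net/url"
	"path"
	"strings"
)

// guessNick derives a display nick for a bare @<url> mention:
// prefer a "nick =" value from the feed's own metadata, then fall
// back to the (sub)domain, then to the folder or file name from the
// URL. feedMeta stands in for whatever the client parsed out of the
// comments in the fetched twtxt.txt.
func guessNick(rawURL string, feedMeta map[string]string) string {
	if nick := feedMeta["nick"]; nick != "" {
		return nick
	}
	u, err := url.Parse(rawURL)
	if err != nil {
		return rawURL // give up, render the URL itself
	}
	if host := u.Hostname(); host != "" {
		return host // the (sub)domain, e.g. "darch.dk"
	}
	// Last resort: folder or file name from the URL path.
	return path.Base(strings.TrimSuffix(u.Path, "/twtxt.txt"))
}

func main() {
	fmt.Println(guessNick("https://darch.dk/twtxt.txt", nil)) // darch.dk
}
```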
I'm still making progress with the Emacs client. I'm proud to say that the code that is responsible for reading the feeds is almost finished, including: Twt Hash Extension, Twt Subject Extension, Multiline Extension and Metadata Extension. I'm fine-tuning some tests and will soon do the first buffer that displays the twts.
Thank you @bender@twtxt.net Much appreciated, and sorry you/anyone had to read that…
I'm usually comfortable keeping my hardship to myself, most especially AWAY from the internet; an act of kindness of sorts towards others, "Everyone's got their own problems to worry about" kind of thing.. But maaan am I starting to believe creating a twitter account would be a healthy decision 🤣🤦 Read nothin' out there, just a one-way echo chamber of sorts to let that shi_ out of my chest. It seems that's what everyone else's been using it for all this time.
A Bsky would be even better! I'd get to shi_ post and yap all I want, allll the way from terminal and never ever have to look back at it or whatever comes out of it. But I digress…
I FU_ing despise this… whatever this is. I wish I could just wake up in some sort of parallel universe where everything is just sunshine and rainbows; alas, life would be just as meaningless.
and sorry you had to read this if you did.
@kat@yarn.girlonthemoon.xyz i'm reading this and i already have a gts server that i could secure with this but i'm thinking it'd be best for most of my public sites https://ovelny.sh/blog/a-complete-guide-for-your-gotosocial-server/
@xuu@txt.sour.is ROFLMAO! 🤣 reading that, the Tech bro sounded in my mind like Cow from Cow and Chicken
should get back to reading doctorow's internet con, it's so good
@movq@www.uninformativ.de woah it's like a cheatsheet with explanations! java is kind of arcane magic sorcery to me so i'm having trouble understanding it but i have that with most programming languages. this is like so much easier to actually look at and read instead of my eyes glazing over lol
Want this API for Goryon, or just Goryon with support for plain twtxt.txt. I can't read the timeline with replies not visible and twts missing.
`nick = _@domain.tld` in the twtxt.txt?
What should the advantage of `nick = _` be, compared to just not defining a nick and letting the client use the domain as the handle?
What is not intuitive is that you put something in the nick field that is not to be taken literally. The special meaning of `_` is only clear if you read the documentation, compared to having something in nick that makes sense in the current context of the twtxt.txt.
Thanks @bender@twtxt.net for the feedback. I fixed and expanded the article. I'm sorry for my poor interaction. Furthermore, I'm reading and writing while programming a client in Emacs.
after thinking and researching about it, yep, I agree that WebFinger is a good idea.
For example reading here: https://bsky.social/about/blog/4-28-2023-domain-handle-tutorial
I wasn't considering some scenarios, like multiple accounts for a single domain (see "How can I set and manage multiple subdomain handles?" in the link above)
Did I write here already that the reason why I love Twtxt so much is that it works without having to compile or install anything extra? Just the bin applications that come with 95% of all operating systems and you're good to read and participate, given you have a domain name somewhere to host the twtxt.txt file.
You can select some text from a web page, right/command click, select Print… and choose To PDF to quickly save snippets for safe-keeping or further reading.
I think it's centralized shit that lies about decentralization. The whole network depends on two centralized things: plc.directory (DID storage?) and the network relay (bsky.network). You can host your own relay, but this requires TOO MUCH in resources (2TB storage and 32GB RAM, read more). Also I tried running a PDS and: 1. I can't register an account via the app, only via the CLI. 2. It leaked on a 2GB virtual machine and then got killed by the OOM killer after trying to register an account via the CLI.
@wbknl@twtxt.net are you still in Russia? It could be hard mailing anything to there these days. I read your "russia is eternally cold", and became curious. Patagonia is the only place I know in South America that has rounded mountains, though they can be anywhere. Originally from Chile, or Argentina? My curiosity doesn't need feeding, by the way. It's all good if it doesn't. :-)
@eapl.me@eapl.me here are my replies (somewhat similar to Lyse's and James')
Metadata in twts: Key=value is too complicated for non-hackers and hard to write by hand. So if there is a need, then we should just use #NSFW or the alt-text field in markdown image syntax, if something is NSFW.
IDs besides datetime: When you edit a twt you should preserve the datetime if location-based addressing is to have any advantage over content-based addressing. If you change the timestamp, then it's a new post, just like in any other blog CMS.
Caching: Yes, all good ideas, but that is more a task for the clients than for the serving of the twtxt.txt files.
Discovery: User-agent for discovery can become better. I'm working on a wrapper script in PHP, so you don't need to go to Apache's log files to see who fetches your feed. But for Gemini and Gopher you need to rely on something else. That could be using my webmentions-for-twtxt suggestion, or simply defining an email metadata field for letting a person know you follow their feed. Interesting read about why WebMentions might be a bad idea. Twtxt being much simpler than a full-featured IndieWeb site, a lot of those concerns do not apply here. But that's the issue with any open inbox: this is hard to solve without some form of (centralized or community) spam moderation.
Support more protocols besides http/s: Yes, why not, if we can make clients that merge or differentiate between the same feed served via multiple URLs.
Languages: If the need is big, then make a separate feed. I don't mind seeing stuff in other languages as long as it stays low volume. You've got translation tools if you need to know what's going on. And again, when there is a need for easier switching between posting to several feeds, it's about building clients with a UI that makes it easy; not something that should take up space in the format/protocol.
Emojis: I'm not sure what this is about. Do you want to use emojis as avatars in CLI clients, or is it just about rendering emojis?
@prologic@twtxt.net I'm not a yarnd user, so it doesn't matter a whole lot to me, but FWIW I'm not especially keen on changing how I format my twts to work around yarnd's quirks.
I wonder if this kind of postprocessing would fit better between composing (via yarnd's UI) and publishing. So, if a yarnd user types 1/4, it could get changed to ¼ in the twtxt.txt file for everyone to see, not just people reading through yarnd. But when I type 1/4, meaning first out of four, as a non-yarnd user, the meaning wouldn't get corrupted. I can always type ¼ directly if that's what I really intend.
(This twt might be easier to understand if you read it without any transformations :-P)
Anyway, again, I'm not a yarnd user, so do what you will, just know you might not be seeing exactly what I meant.
Inversion by Aric McBay was another random library pick. Like The Fall of Io, it's the most recent in a series, though I think this series is pretty loosely connected. In contrast, the villain in this book is simple and cartoonishly evil. The book presents a design for utopia which was interesting but a little cloying. I'm not sure if I'm supposed to want to live there, but I don't think I do. I enjoyed the book as easy reading, and might try the others in the series some time. (4/4)
I read Starter Villain by John Scalzi. Enjoyable, like his other books that I've read. Somewhat sillier. (3/4)
I'm enjoying Wesley Chu's Tao and Io series. Spies, action, ancient aliens. Some funny parts, some interesting world-building parts, some action-filled parts. I picked up The Fall of Io at random from a library a few weeks ago, and it turned out to be the last in a series of six (technically two series), so after finishing that I read the first and am partway through the second. Usually I try to read series in order, but this way is interesting. One thing I liked about The Fall of Io was that it followed many points of view with somewhat conflicting interests, some more evil than others, and I felt sympathy for most of them. (I was kind of hoping it would be about Jupiter's moon Io, but it wasn't, but I'm satisfied with what I ended up with.) (2/4)
Liking the distinction in Vivaldi between bookmark and something to add to your reading list.
Simplified twtxt - I want to suggest some dogmas or commandments for twtxt, from which we can work our way back to how to implement different features like replies/threads:
- It's a text file, so you must be able to write it by hand (i.e. no app logic) and read it by eye. If you edit a post you change the content, not the timestamp. Otherwise it will be considered a new post.
- The order of lines in a twtxt.txt must not hold any significance. The file is a container and each line an atomic piece of information. You should be able to run `sort` on a twtxt.txt and it should still work.
- The transport protocol should not matter, as long as the file served is the same. HTTP and HTTPS are preferred, so it is suggested that feeds served via Gopher or Gemini also be provided over HTTP(S).
Do we need more commandments?
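A tiny sketch of what the atomic-lines rule means for a client parser, assuming the usual `<RFC 3339 timestamp>TAB<content>` line format: the file order is ignored and the twts are ordered by their timestamps afterwards.

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

type twt struct {
	Timestamp string // RFC 3339, which sorts lexically when kept in UTC
	Content   string
}

// parseFeed treats every line as an atomic record, so running `sort`
// on the file (or receiving the lines in any order) changes nothing.
func parseFeed(raw string) []twt {
	var twts []twt
	for _, line := range strings.Split(raw, "\n") {
		line = strings.TrimSpace(line)
		if line == "" || strings.HasPrefix(line, "#") { // comments / metadata
			continue
		}
		parts := strings.SplitN(line, "\t", 2)
		if len(parts) != 2 {
			continue
		}
		twts = append(twts, twt{Timestamp: parts[0], Content: parts[1]})
	}
	sort.Slice(twts, func(i, j int) bool { return twts[i].Timestamp < twts[j].Timestamp })
	return twts
}

func main() {
	feed := "2024-09-15T12:06:27Z\tsecond post\n2024-09-14T08:00:00Z\tfirst post\n"
	for _, t := range parseFeed(feed) {
		fmt.Println(t.Timestamp, t.Content)
	}
}
```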
@doesnm@doesnm.p.psf.lt finally someone read my blogpost ;)
@prologic@twtxt.net currently? it wouldn't :D.
we would need to come up with a way of registering with multiple brokers that can i guess forward to a reader broker. something that will retry if needed. need to read into how simplex handles multi brokers
How do I read twts without a browser? I don't understand the context in reply messages.
Recent #fiction #scifi #reading:
The Memory Police by Yōko Ogawa. Lovely writing. Very understated; reminded me of Kazuo Ishiguro. Sort of like Nineteen Eighty-Four but not. (I first heard it recommended in comparison to that work.)
Subcutanean by Aaron Reed; https://subcutanean.textories.com/ . Every copy of the book is different, which is a cool idea. I read two of them (one from the library, actually not different from the other printed copies, and one personalized e-book). I don't read much horror so managed to be a little creeped out by it, which was fun.
The Wind from Nowhere, a 1962 novel by J. G. Ballard. A random pick from the sci-fi section; I think I picked it up because it made me imagine some weird 4-dimensional effect ("from nowhere" meaning not in a normal direction) but actually (spoiler) it was just about a lot of wind for no reason. The book was moderately entertaining but there was nothing special about it.
Currently reading Scale by Greg Egan and Inversion by Aric McBay.
(#2024-09-24T12:53:35Z) What does this screenshot show? The resolution is too low for reading the text…
(#2024-09-24T12:45:54Z) @prologic@twtxt.net I'm not really buying this one about readability. It's easy to recognize that this is a URL and a date, so you skim over it like you would with mentions and markdown links and images. If you are not supposed to read the raw file, then we might as well jam everything into JSON like Mastodon.
Agreed. But reading twtxt in raw form sounds… I can't do this.
Finally the pubnix is alive! That's what I'm missing? I'm only reading the twtxt.net timeline because twtxt-v2.sh works slowly for displaying the timeline…
@prologic@twtxt.net Thanks for writing that up!
I hope it can remain a living document (or sequence of draft revisions) for a good long time while we figure out how this stuff works in practice.
I am not sure how I feel about all this being done at once, vs. letting conventions arise.
For example, even today I could reply to twt abc1234 with "(#abc1234) Edit: …" and I think all you humans would understand it as an edit to (#abc1234). Maybe eventually it would become a common enough convention that clients would start to support it explicitly.
Similarly we could just start using 11-digit hashes. We should iron out whether it's sha256 or whatever, but there's no need to get all the other stuff right at the same time.
I have similar thoughts about how some users could try out location-based replies in a backward-compatible way (append the replyto: stuff after the legacy (#hash) style).
However I recognize that I'm not the one implementing this stuff, and it's less work to just have everything determined up front.
Misc comments (I haven't read the whole thing):
Did you mean to make hashes hexadecimal? You lose 11 bits that way compared to base32. I'd suggest gaining 11 bits with base64 instead.
"Clients MUST preserve the original hash": do you mean they MUST preserve the original twt?
Thanks for phrasing the bit about deletions so neutrally.
I don't like the MUST in "Clients MUST follow the chain of reply-to references…". If someone writes a client as a 40-line shell script that requires the user to piece together the threading themselves, IMO we shouldn't declare the client non-conforming just because they didn't get to all the bells and whistles.
Similarly I don't like the MUST for user agents. For one thing, you might want to fetch a feed without revealing your identity. Also, it raises the bar for a minimal implementation (I'm thinking again of the 40-line shell script).
For "who follows" lists: why must the long, random tokens be only valid for a limited time? Do you have a scenario in mind where they could leak?
Why can't feeds be served over HTTP/1.0? Again, thinking about simple software. I recently tried implementing HTTP/1.1 and it wasn't too bad, but 1.0 would have been slightly simpler.
Why get into the nitty-gritty about caching headers? This seems like generic advice for HTTP servers and clients.
I'm a little sad about other protocols being not recommended.
I don't know how I feel about including markdown. I don't mind too much that yarn users emit twts full of markdown, but I'm more of a plain text kind of person. Also it adds to the length. I wonder if putting it in a separate document would make more sense; that would also help with the length.
@prologic@twtxt.net I read it. I understand it. Hopefully a solution can be agreed upon that solves the editing issue, whilst maintaining the cryptographic hash.
Could someone knowledgeable reply with the steps a grandpa would take to calculate the hash of a twtxt from the CLI, using out-of-the-box tools? I swear I read about it somewhere, but can't find it.
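Not exactly a grandpa-friendly CLI one-liner, but for what it's worth, my understanding of the Twt Hash Extension is: blake2b-256 over the feed URL, the timestamp as it appears in the feed, and the content, joined by newlines; base32 without padding; lowercased; last 7 characters. A Go sketch of that, worth double-checking against the spec before relying on it:

```go
package main

import (
	"encoding/base32"
	"fmt"
	"strings"

	"golang.org/x/crypto/blake2b"
)

// twtHash sketches the Twt Hash Extension as I understand it; verify
// the exact payload and length against the spec.
func twtHash(feedURL, timestamp, content string) string {
	payload := feedURL + "\n" + timestamp + "\n" + content
	sum := blake2b.Sum256([]byte(payload))
	enc := base32.StdEncoding.WithPadding(base32.NoPadding).EncodeToString(sum[:])
	enc = strings.ToLower(enc)
	return enc[len(enc)-7:]
}

func main() {
	fmt.Println(twtHash("https://example.com/twtxt.txt",
		"2024-09-15T12:06:27Z", "Hello twtxt!"))
}
```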
More:
Subject: The [tag URI scheme](https://en.wikipedia.org/wiki/Tag_URI_scheme) looks interesting. I like that it human read- and writable. And since we already got the timestamp in the twtxt.txt it would be
somewhat trivial to parse. But there are still the issue with what the name/id should be... Maybe it doesn't have to bee that stick? Instead of using `tag:` as the prefix/protocol, it would more it clear
what we are talking about by using `in-reply-to:` (https://indieweb.org/in-reply-to) or `replyto:` similar to `mailto:` 1. `(reply:sorenpeter@darch.dk,2024-09-15T12:06:27Z)' 2.
`(in-reply-to:darch.dk/twtxt.txt,2024-09-15T12:06:27Z)' 2. `(replyto:http://darch.dk/twtxt.txt,2024-09-15T12:06:27Z)' I know it's longer that 7-11 characters, but it's self-explaining when looking at the
twtxt.txt in the raw, and the cases above can all be caught with this regex: `\([\w-]*reply[\w-]*\:` Is this something that would work?
Subject: The [tag URI scheme](https://en.wikipedia.org/wiki/Tag_URI_scheme) looks interesting. I like that it human read- and writable. And since we already got the timestamp in the twtxt.txt it would be
somewhat trivial to parse. But there are still the issue with what the name/id should be... Maybe it doesn't have to bee that stick? Instead of using `tag:` as the prefix/protocol, it would more it clear
what we are talking about by using `in-reply-to:` (https://indieweb.org/in-reply-to) or `replyto:` similar to `mailto:` 1. `(reply:sorenpeter@darch.dk,2024-09-15T12:06:27Z)` 2.
`(in-reply-to:darch.dk/twtxt.txt,2024-09-15T12:06:27Z)` 3. `(replyto:http://darch.dk/twtxt.txt,2024-09-15T12:06:27Z)` I know it's longer that 7-11 characters, but it's self-explaining when looking at the
twtxt.txt in the raw, and the cases above can all be caught with this regex: `\([\w-]*reply[\w-]*\:` Is this something that would work?
Notice the difference? Soren edited, and broke everything.
@mckinley@twtxt.net Thanks for the feedback.
- Yeah, I agree that nick should not be part of the syntax. Any valid URL to a twtxt.txt file should be enough and is more clear, so it is not confused with an email (one of the issues with WebFinger and fediverse handles).
- I think any valid URL would work, since we are not bound to look for exact matches. Accepting both http and https as well as gemini and gopher could all work, as long as the path to the twtxt.txt is the same.
- My idea is that you quote the timestamp as it is in the original twtxt.txt that you are referring to, so you can do it by simply copy/pasting. Also, what are the chances that the same human will make two different posts within the same second?!
Regarding the whole cryptographic keys for identity: to me it seems like an unnecessary layer of complexity. If you move to a new house or city you tell people that you moved; you can do the same in a twtxt.txt. Just post something like "I moved to this new URL, please follow me there!" I did that with my feeds at least twice, and you guys still seem to read my posts :)
The tag URI scheme looks interesting. I like that it is human read- and writable. And since we already got the timestamp in the twtxt.txt, it would be somewhat trivial to parse. But there is still the issue of what the name/id should be… Maybe it doesn't have to be that strict?
Instead of using `tag:` as the prefix/protocol, it would make it more clear what we are talking about by using `in-reply-to:` (https://indieweb.org/in-reply-to) or `replyto:`, similar to `mailto:`:
`(reply:sorenpeter@darch.dk,2024-09-15T12:06:27Z)`
`(in-reply-to:darch.dk/twtxt.txt,2024-09-15T12:06:27Z)`
`(replyto:http://darch.dk/twtxt.txt,2024-09-15T12:06:27Z)`
I know it's longer than 7-11 characters, but it's self-explaining when looking at the twtxt.txt in the raw, and the cases above can all be caught with this regex: `\([\w-]*reply[\w-]*\:`
Is this something that would work?
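For whatever it's worth, that regex does catch all three forms above; a quick check in Go (with the needless escape before the colon dropped, since RE2 treats `:` as a literal anyway):

```go
package main

import (
	"fmt"
	"regexp"
)

func main() {
	// Same pattern as above, minus the escape before ':'.
	re := regexp.MustCompile(`\([\w-]*reply[\w-]*:`)

	for _, example := range []string{
		"(reply:sorenpeter@darch.dk,2024-09-15T12:06:27Z)",
		"(in-reply-to:darch.dk/twtxt.txt,2024-09-15T12:06:27Z)",
		"(replyto:http://darch.dk/twtxt.txt,2024-09-15T12:06:27Z)",
	} {
		fmt.Println(re.MatchString(example), example) // true for all three
	}
}
```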
@prologic@twtxt.net Brute force. I just hashed a bunch of versions of both tweets until I found a collision.
I mostly just wanted an excuse to write the program. I don't know how I feel about actually using super-long hashes; could make the twts annoying to read if you prefer to view them untransformed.
They're in Section 6:
Receiver should adopt UDP GRO. (Something about saving CPU processing UDP packets; I'm a bit fuzzy about it.) And they have suggestions for making GRO more useful for QUIC.
Some other receiver-side suggestions: "sending delayed QUIC ACKs"; "using recvmsg to read multiple UDP packets in a single system call".
Use multiple threads when receiving large files.
@prologic@twtxt.net I believe you when you say registries as designed today do not crawl. But when I first read the spec, it conjured in my mind a search engine. Now I don't know how things work out in practice, but just based on reading, I don't see why it can't be an API for a crawling search engine. (In fact I don't see anything in the spec indicating registry servers shouldn't crawl.)
(I also noticed that https://twtxt.readthedocs.io/en/latest/user/registry.html recommends "The registries should sync each others user list by using the users endpoint". If I understood that right, registering with one should be enough to appear on others, even if they don't crawl.)
Does yarnd provide an API for finding twts? Is it similar?
@movq@www.uninformativ.de, that would be a nice addition. :-) I would also love the ability to hide/not show the hash when reading twtxts (after all, that's in the header of each "email"). Could that be added as a user-configurable toggle?
Morphotrophic by Greg Egan is built around an idea for how life on Earth could have worked out differently. It gets increasingly strange and interesting as the story progresses. My partner and I finished it last night and thoroughly enjoyed it. The beginning is free online: https://gregegan.net/MORPHOTROPHIC/00/MorphotrophicExcerpt.html #scifi #reading
My new lifesaver when putting together a new #vuejs site without the hassle of build systems: Vue3 Tiny Template, by the inimitable @b0rk@b0rk
Every time I start a Vue project, I get confused and waste 15 minutes reading the documentation and remembering how to set up Vue.
So this is a tiny template I made for myself so that I can avoid that next time. I don't use a build process; instead it uses the CDN version of Vue and a single HTML / JS file.
So updated. Seems to duplicate here in the UI. And what is this "Read More" on every twt now?
Picked up Altered Carbon again on Netflix. Really need to read those books.
Another minor inconvenience could have been avoided by reading the Arch Linux news feed before upgrading.
@bender@twtxt.net ha! Here goes his "poem":

A string of letters, a forgotten name,
An email crafted, a message to claim.
We hit send with a click, a hopeful sigh,
But a bounce-back arrives, a tear in our eye.

"Delivery failed," the message reads cold,
The address, it seems, is a story untold.
A ghost in the system, a memory's trace,
Lost in the void of cyberspace.
:-D
Thanks @prologic@twtxt.net, I also just managed to get my own version of webmentions working. Please have a read at Webmentions vs. Custom Mentions Spec for Twtxt/Yarn - HedgeDoc and User Lookup for Twtxt/Yarn - Webfinger or Decentralized Identifiers (DIDs) - HedgeDoc for how it sorta works
Did another write up on #webfinger and DIDs for twtxt/yarn that you can read and edit/comment in: User lookup for twtxt/yarn - Webfinger or Decentralized Identifiers (DIDs) - HedgeDoc
@sorenpeter@darch.dk this makes sense as a quote twt that references a direct URL. If we go back to how it developed on Twitter originally, it was `RT @nick: original text`;
because it contained the original text, the Twitter algorithm would boost that text into trending.
i like the format `(#hash) @<nick url> > "Quoted text"\nThen a comment`
as it preserves the human readable form and has the hash for linking to the yarn. The comment part could be optional for just boosting the twt.
The only issue i think i would have would be that the yarn could then become a mess of repeated quotes, unless the client knows to interpret them as multiple users having reposted/boosted the thread.
The format is also how the iPhone does reactions to SMS messages, with `+number liked: original SMS`.
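A minimal sketch of assembling that quote format; the function and field names are made up, and a real client would fold the comment into the twt per the Multiline Extension rather than a literal newline:

```go
package main

import "fmt"

// quoteTwt assembles "(#hash) @<nick url> > \"Quoted text\"" plus an
// optional comment; leaving the comment empty makes it a plain boost.
func quoteTwt(hash, nick, feedURL, quoted, comment string) string {
	s := fmt.Sprintf("(#%s) @<%s %s> > %q", hash, nick, feedURL, quoted)
	if comment != "" {
		s += "\n" + comment
	}
	return s
}

func main() {
	fmt.Println(quoteTwt("abcdefg", "sorenpeter", "https://darch.dk/twtxt.txt",
		"Quoted text", "Then a comment"))
}
```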
@lyse@lyse.isobeef.org I have read the white papers for MLS before. I have put a lot of thought into how to do it with salty/ratchet. It's a very good tech for ensuring multiple devices can be joined to an encrypted chat. But it is bloody complicated to implement.
Reading The School Of Life by Alain De Botton, Page 65
The best gift anyone could give me is the time to be able to finish reading some books.
@movq@www.uninformativ.de I lasted for a long time.. Not sure where or when it was "got". We had been having a cold go around with the kiddos for about a week when the wife started getting sicker than normal. Did a test and she was positive. We tested the rest of the fam and got nothing. Till about 2 days later, when myself and the others were positive. It largely hasn't been too bad, a little fever and stuffy noses.
But whatever it was that hit a few days ago was horrible. Like whatever switch in my head that goes to sleep mode was shut off. I would lay down and even though I felt sleepy, I couldn't actually go to sleep. The anxiety hit soon after and I was just awake with no relief. And it persisted that way for three nights. I got some meds from the clinic that seemed to finally get me to sleep.
Now the morning after I realized that for all that time a part of me was missing. I would close my eyes and it would just go dark. No imagination, no pictures, nothing. Normally I can visualize things as I read or think about stuff.. But for the last few days it was just nothing. Waking up to it was quite shocking.
Though it's just the first night.. I guess I'll have to see if it persists. 🤔
@johanbove@johanbove.info Sounds interesting. Is it only for reading, or also for posting?
Reading the backlog of my twtxt posts, and it's nice to be able to just delete some outdated things by editing a simple txt file.
@prologic@twtxt.net in the article they say they have a p99 of 15ms reading historical data. Which is pretty nuts… aside from having close to 900TB of SSD…
Read this interesting retro about Discord's migration path from Mongo to Cassandra to now ScyllaDB.
https://discord.com/blog/how-discord-stores-trillions-of-messages
@abucci@anthony.buc.ci read my new skibloreet about why social meets payments is the next level idea! For just §5 bitshlongs a month on my serfdomage site!
I don't remember where I first read this, but: A government's OODA loop is ~6 months long.
von Neumann and Morgenstern is a joy to read :-D
I've never gotten much out of a book review & don't understand why people read or write them.
guy who would rather read than meditate
@abucci@anthony.buc.ci i have an old copy of the 2005 version from university if you want to give it a read through. it's quite dry.
@prologic@twtxt.net What is the SMART reading for the disk?
An interesting read about testing code using nullable states instead of mocks.
https://www.jamesshore.com/v2/projects/testing-without-mocks/testing-without-mocks
tbh whenever someone is like "the existing arguments for agi xrisk were insufficient/unclear, here's my better version" the arguments read exactly the same to me as the existing ones.
Reading The Precipice & struck by the fact that it was released before GPT-3, just as COVID-19 was ramping up. Pretty insane
why is >80% of the most read literotica content incest related?
now is a good time to sit back and read
(cont.)
Just to give some context on some of the components around the code structure: I wrote this up around an earlier version of the aggregate code. This generic bit simplifies things by removing the need for CRUD functions for each aggregate.
Domain Objects
A domain object can be used as an aggregate by adding the event.AggregateRoot struct and finishing the implementation of event.Aggregate. The AggregateRoot implements logic for adding events after they are either Raised by a command or Appended by the eventstore Load or service ApplyFn methods. It also tracks the uncommitted events that are saved using the eventstore Save method.
type User struct {
Identity string `json:"identity"`
CreatedAt time.Time
event.AggregateRoot
}
// StreamID for the aggregate when stored or loaded from ES.
func (a *User) StreamID() string {
return "user-" + a.Identity
}
// ApplyEvent to the aggregate state.
func (a *User) ApplyEvent(lis ...event.Event) {
for _, e := range lis {
switch e := e.(type) {
case *UserCreated:
a.Identity = e.Identity
a.CreatedAt = e.EventMeta().CreatedDate
/* ... */
}
}
}
Events
Events are applied to the aggregate. They are defined by adding event.Meta and implementing the getter/setters for event.Event.
type UserCreated struct {
eventMeta event.Meta
Identity string
}
func (c *UserCreated) EventMeta() (m event.Meta) {
if c != nil {
m = c.eventMeta
}
return m
}
func (c *UserCreated) SetEventMeta(m event.Meta) {
if c != nil {
c.eventMeta = m
}
}
Reading Events from EventStore
With a domain object that implements event.Aggregate, the event store client can load events and apply them using the Load(ctx, agg) method.
// GetUser populates a user from the event store.
func (rw *User) GetUser(ctx context.Context, userID string) (*domain.User, error) {
user := &domain.User{Identity: userID}
err := rw.es.Load(ctx, user)
if err != nil {
// A missing stream just means the user has no events yet.
if errors.Is(err, eventstore.ErrStreamNotFound) {
return user, ErrNotFound
}
return user, err
}
return user, nil
}
OnX Commands
An OnX command will validate that the command can be performed on the current state of the domain object. If it can be applied, it raises the event using event.Raise(); otherwise it returns an error.
// OnCreate raises an UserCreated event to create the user.
// Note: The handler will check that the user does not already exist.
func (a *User) OnCreate(identity string) error {
event.Raise(a, &UserCreated{Identity: identity})
return nil
}
// OnScored will attempt to score a task.
// If the task is not in a Created state it will fail.
func (a *Task) OnScored(taskID string, score int64, attributes Attributes) error {
if a.State != TaskStateCreated {
return fmt.Errorf("task expected created, got %s", a.State)
}
event.Raise(a, &TaskScored{TaskID: taskID, Attributes: attributes, Score: score})
return nil
}
Crud Operations for OnX Commands
The following functions in the aggregate service can be used to perform creation and updating of aggregates. The Update function will ensure the aggregate exists, whereas Create is intended for non-existent aggregates. These can probably be combined into one function (see the sketch after the code below).
// Create is used when the stream does not yet exist.
func (rw *User) Create(
ctx context.Context,
identity string,
fn func(*domain.User) error,
) (*domain.User, error) {
session, err := rw.GetUser(ctx, identity)
if err != nil && !errors.Is(err, ErrNotFound) {
return nil, err
}
if err = fn(session); err != nil {
return nil, err
}
_, err = rw.es.Save(ctx, session)
return session, err
}
// Update is used when the stream already exists.
func (rw *User) Update(
ctx context.Context,
identity string,
fn func(*domain.User) error,
) (*domain.User, error) {
session, err := rw.GetUser(ctx, identity)
if err != nil {
return nil, err
}
if err = fn(session); err != nil {
return nil, err
}
_, err = rw.es.Save(ctx, session)
return session, err
}
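On the "these can probably be combined" note, a sketch of one way to fold them together; the mustExist flag is just one option and is not part of the original code:

```go
// CreateOrUpdate is a sketch of combining Create and Update: when
// mustExist is false a missing stream is tolerated (Create), when it
// is true any load error, including ErrNotFound, is fatal (Update).
func (rw *User) CreateOrUpdate(
	ctx context.Context,
	identity string,
	mustExist bool,
	fn func(*domain.User) error,
) (*domain.User, error) {
	session, err := rw.GetUser(ctx, identity)
	if err != nil && (mustExist || !errors.Is(err, ErrNotFound)) {
		return nil, err
	}
	if err = fn(session); err != nil {
		return nil, err
	}
	_, err = rw.es.Save(ctx, session)
	return session, err
}
```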
Hi, I am playing with making an event sourcing database. It's super alpha but I thought I would share since others are talking about databases and such.
It's super basic. Using tidwall/wal as the disk backing. The first use case I am playing with is an implementation of msgbus. I can post events to it and read them back in reverse order.
I plan to expand it to handle other event sourcing type things like aggregates and projections.
Find it here: sour-is/ev
@prologic@twtxt.net @movq@www.uninformativ.de @lyse@lyse.isobeef.org
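Not the actual sour-is/ev API, but a minimal sketch of the "post events, read them back in reverse order" idea directly on top of tidwall/wal (API from memory, so treat it as an approximation):

```go
package main

import (
	"fmt"
	"log"

	"github.com/tidwall/wal"
)

func main() {
	// Open (or create) a write-ahead log; use a fresh path for this demo.
	w, err := wal.Open("events.wal", nil)
	if err != nil {
		log.Fatal(err)
	}
	defer w.Close()

	// Append a few events; indexes are sequential and start at 1.
	for i, msg := range []string{"created", "updated", "deleted"} {
		if err := w.Write(uint64(i+1), []byte(msg)); err != nil {
			log.Fatal(err)
		}
	}

	// Read them back newest-first.
	first, _ := w.FirstIndex()
	last, _ := w.LastIndex()
	for i := last; i >= first && i > 0; i-- {
		data, err := w.Read(i)
		if err != nil {
			log.Fatal(err)
		}
		fmt.Println(i, string(data))
	}
}
```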
shoutout to the woman that broke my heart so that I now read papers titled stuff like "Multiverse-wide Cooperation via Correlated Decision Making"
HOW DO THOSE PEOPLE READ SO MUCH
me reading planecrash is unstable bc while reading it I become so enthusiastic about reading textbooks that I go off and do that instead
reading reddit thread about advice for women fighting men rn, and one of the advices is to yank an eye out, not considering that eyes are pretty hard to yank out
@prologic@twtxt.net Oh.. reading comprehension is strong today.. you went to US and now back.
"The problem with Marcus' argument is that the only alternative to statistical AI is spiritual AI" no what the fuck why would you say this did you even read the sequences i don't even know where to start with that
at least i'm up to a level where i'm reading a comment on an acx post and someone uses "consciousness" and i'm hella out there real quick
"Dodecahedron Assemblies was the publication I chose to read on my way to heaven": apparently Uriel has managed to gain short contact via the Gato agent.
"i'll stop reading at 1:30" this is your internal agents coordinating
to diversify its lingo, the rationality community should adopt the evidentials smṛti ("I read this in a blogpost") and śruti ("someone told this to me in a one-on-one").
let's not read the thielleaves
no, niplav, you won't get sucked into reading the heraldry wikipedia articles, even though "escutcheon" looks like a really good word to drop in a conversation.
i remembered i liked structural regular expressions, but re-reading the paper reminds me of how cool they are
#!/bin/sh
# Validate environment
if ! command -v msgbus > /dev/null; then
printf "missing msgbus command. Use: go install git.mills.io/prologic/msgbus/cmd/msgbus@latest"
exit 1
fi
if ! command -v salty > /dev/null; then
printf "missing salty command. Use: go install go.mills.io/salty/cmd/salty@latest"
exit 1
fi
if ! command -v salty-keygen > /dev/null; then
printf "missing salty-keygen command. Use: go install go.mills.io/salty/cmd/salty-keygen@latest"
exit 1
fi
if [ -z "$SALTY_IDENTITY" ]; then
export SALTY_IDENTITY="$HOME/.config/salty/$USER.key"
fi
get_user () {
user=$(grep user: "$SALTY_IDENTITY" | awk '{print $3}')
if [ -z "$user" ]; then
user="$USER"
fi
echo "$user"
}
stream () {
if [ -z "$SALTY_IDENTITY" ]; then
echo "SALTY_IDENTITY not set"
exit 2
fi
jq -r '.payload' | base64 -d | salty -i "$SALTY_IDENTITY" -d
}
lookup () {
if [ $# -lt 1 ]; then
printf "Usage: %s nick@domain\n" "$(basename "$0")"
exit 1
fi
user="$1"
nick="$(echo "$user" | awk -F@ '{ print $1 }')"
domain="$(echo "$user" | awk -F@ '{ print $2 }')"
curl -qsSL "https://$domain/.well-known/salty/${nick}.json"
}
readmsgs () {
topic="$1"
if [ -z "$topic" ]; then
topic=$(get_user)
fi
export SALTY_IDENTITY="$HOME/.config/salty/$topic.key"
if [ ! -f "$SALTY_IDENTITY" ]; then
echo "identity file missing for user $topic" >&2
exit 1
fi
msgbus sub "$topic" "$0"
}
sendmsg () {
if [ $# -lt 2 ]; then
printf "Usage: %s nick@domain.tld <message>\n" "$(basename "$0")"
exit 0
fi
if [ -z "$SALTY_IDENTITY" ]; then
echo "SALTY_IDENTITY not set"
exit 2
fi
user="$1"
message="$2"
salty_json="$(mktemp /tmp/salty.XXXXXX)"
lookup "$user" > "$salty_json"
endpoint="$(jq -r '.endpoint' < "$salty_json")"
topic="$(jq -r '.topic' < "$salty_json")"
key="$(jq -r '.key' < "$salty_json")"
rm "$salty_json"
message="[$(date +%FT%TZ)] <$(get_user)> $message"
echo "$message" \
| salty -i "$SALTY_IDENTITY" -r "$key" \
| msgbus -u "$endpoint" pub "$topic"
}
make_user () {
mkdir -p "$HOME/.config/salty"
if [ $# -lt 1 ]; then
user=$USER
else
user=$1
fi
identity_file="$HOME/.config/salty/$user.key"
if [ -f "$identity_file" ]; then
printf "user key exists!"
exit 1
fi
# Check for msgbus env.. probably can make it fallback to looking for a config file?
if [ -z "$MSGBUS_URI" ]; then
printf "missing MSGBUS_URI in environment"
exit 1
fi
salty-keygen -o "$identity_file"
echo "# user: $user" >> "$identity_file"
pubkey=$(grep key: "$identity_file" | awk '{print $4}')
cat <<- EOF
Create this file in your webserver well-known folder. https://hostname.tld/.well-known/salty/$user.json
{
"endpoint": "$MSGBUS_URI",
"topic": "$user",
"key": "$pubkey"
}
EOF
}
# check if streaming
if [ ! -t 1 ]; then
stream
exit 0
fi
# Show Help
if [ $# -lt 1 ]; then
printf "Commands: send read lookup"
exit 0
fi
CMD=$1
shift
case $CMD in
send)
sendmsg "$@"
;;
read)
readmsgs "$@"
;;
lookup)
lookup "$@"
;;
make-user)
make_user "$@"
;;
esac
I would HIGHLY recommend reading up on the Keybase architecture. They designed a device key system for real-time chat that is e2e secure. https://book.keybase.io/security
A property of EC keys is deriving new keys that can be determined to be "on curve". Bitcoin has some BIPs that derive single-use keys for every transaction connected to a wallet, and they can be derived as either public or private chains. https://qvault.io/security/bip-32-watch-only-wallets/
you are a successful person who has started recently reading lesswrong, i am a novel rodent control agent
the filter is not just having read marx, it is having read marx and still understanding him
@benk@kwiecien.us I am using jenny (we chatted a bit on IRC earlier today). I have been using it for over five months now, I think. It is truly a joy to use, especially because you can use the power of Mutt/NeoMutt to read your twts.
@lyse@lyse.isobeef.org I thought it was just me. It drives me nuts to try reading on that page. I guess I am no longer capable of looking at old CRT monitors without side effects.
@movq@www.uninformativ.de, is removing the hash from the body of the twt on the TODO? I read it, but I am unsure if it is there already, or not. Sorry if it is, and I failed to spot it!
@movq@www.uninformativ.de i believe the delete of any twt was a tech limitation with the retwt parser not knowing where in the file a twt came from. lextwt tracks the bytes in the file where a twt was read from, which could be used to delete a twt from the file.. in theory.
I think something has caused my feed to be in a bad state and it is now unparseable.
I can read this on jenny, but the twt isn't making it to my own pod. Something has gone really wrong, me thinks.
> If Subject contains the full twt, then you can skim over conversations just by reading those lines in mutt's index pager

Yes, I do the same, true.

> So I decided: Okay, let's have mutt do it.

And Mutt does it well. I agree it was/is a good idea.

> The subject lines are already "compressed"

I noticed, yes.

I am not sure why I asked to begin with; in retrospect, it was a silly request. Perhaps the OCD in me got triggered while viewing rich headers, on a specific twt, when I saw the huge subject line that is, otherwise, always hidden.
Anyway, don't mind me, move along.
@adi@f.adi.onl
Just like your highschool girlfriend in Afghanistan "doesn't need saving", right? I think it is a language issue you are having, as English isn't your mother tongue.
QAnon followers are cultist nuts. Some of them wanting out are finding that it is a hard thing to do (did you read the article?). Saying that "they don't need to escape" is a silly thing to say, at the very least. To me, it just doesn't make sense.
Third generation of AirPods. So, new AirPods. Read more at Apple.