@kat@yarn.girlonthemoon.xyz yes, both the newsletter and the podcast, from time to time.
@prologic@twtxt.net I was not expecting much, but since the list of restaurants near company buildings was hard-coded into it, I did expect it to at least copy the menu text from the websites into its database. Ironically, the only restaurant where it got something right is the only one where the website has the text as a transparent PNG that the AI has to convert to text.
I asked ChatGPT what it knows about Twtxt, and surprisingly it's rather accurate:
Twtxt is a minimalist, decentralized microblogging format introduced by John Downey in 2016. It uses plain text files served over HTTP, with no accounts, databases, or APIs.
In 2020, James Mills (@prologic@twtxt.net) launched Yarn.social, an extended, federated implementation with user discovery, threads, mentions, and a full web UI.
Both share the same .twtxt.txt format but differ in complexity and social features.
@bender@twtxt.net Exactly. I suspect it was because of sqlitebrowser also accessing the database in parallel to debug the original issue.
So far, I have not found the exact reason why some replies don't show up. When I do not filter for unread messages and show all, though, I actually see them. So, there's that.
I just noticed that my unread messages counter was off by quite a bit. It showed 8, but I only saw one unread message. Even after restarting my client, which recalculates the number of unread messages, it remained at eight. Weird. Looking in the database revealed that this is indeed correct.
Apparently, my query to build up the message tree must be incorrect. It somehow misses seven messages. They all are orphaned, maybe that's a clue. However, generating missing root messages (and thereby including the replies) typically works just fine. Hmm.
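Purely illustrative and not the actual tt schema: assuming a messages table with hash and parent_hash columns, a tree query anchored only on true roots silently drops orphaned replies, while also anchoring on rows whose parent is missing keeps them visible.

```python
import sqlite3

# Hypothetical schema, just to show the failure mode: 'ccc' replies to a
# parent that was never fetched, so it is an orphan.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE messages (hash TEXT PRIMARY KEY, parent_hash TEXT, text TEXT);
    INSERT INTO messages VALUES
        ('aaa', NULL,  'a root message'),
        ('bbb', 'aaa', 'a reply to aaa'),
        ('ccc', 'zzz', 'an orphaned reply, parent zzz is unknown');
""")

# Anchoring only on "parent_hash IS NULL" would miss 'ccc'; adding the
# "parent not present" case keeps orphans in the tree.
rows = con.execute("""
    WITH RECURSIVE tree(hash, parent_hash, text) AS (
        SELECT hash, parent_hash, text FROM messages
        WHERE parent_hash IS NULL
           OR parent_hash NOT IN (SELECT hash FROM messages)
        UNION ALL
        SELECT m.hash, m.parent_hash, m.text
        FROM messages m JOIN tree t ON m.parent_hash = t.hash
    )
    SELECT hash, text FROM tree
""").fetchall()
print(rows)   # all three messages, orphan included
```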
@movq@www.uninformativ.de json and database put together sounds terrifying. i must try jenny
jenny really isn't well equipped to handle edits of my own twts.
For example, in 2021, this change got introduced:
https://www.uninformativ.de/git/jenny/commit/6b5b25a542c2dd46c002ec5a422137275febc5a1.html
This means that jenny will always ignore my own edits unless I also manually edit its internal "json database". Annoying.
That change was requested by a user who had the habit of deleting twts or moving them to another mailbox or something. I think that person is long gone and I might revert that change.
@prologic@twtxt.net is it twice in the database, or simply rendered twice? If you manually expunge it, will it affect the yarn?
@xuu@txt.sour.is Wow, that's a giant graveyard. In my new database I have 16,428 messages as of now. Archive feed support is not yet available, so it's just the sum of all the 36 main feeds of my tt reimplementation that I already followed with the old Python tt. Previously, I just had a few feeds for testing purposes in my new config. While transferring, I "dropped" heaps of feeds that appeared to be inactive.
Thanks, @movq@www.uninformativ.de!
My backing SQLite database with indices is 8.7 MiB in size right now.
The twtxt cache is 7.6 MiB; it uses Python's pickle module. And next to it there is a 16.0 MiB second database with all the read statuses for the old tt. Wow, super inefficient. It shouldn't contain anything else, it's a giant, pickled {"$hash": {"read": True/False}, …}. What the heck, why is it so big?! O_o
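For poking at a pickle like that, a couple of lines are enough to see what it actually contains; the file name here is made up, and the shape is the {"$hash": {"read": bool}} mapping described above.

```python
import pickle

# Hypothetical path to the old tt read-status cache.
with open("tt_read_status.pickle", "rb") as fh:
    statuses = pickle.load(fh)

print(len(statuses), "entries")
print(sum(1 for v in statuses.values() if not v.get("read")), "unread")
```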
A collection of PostgreSQL patterns that you can use in other databases
https://mccue.dev/pages/3-11-25-life-altering-postgresql-patterns
#postgresql #databases
(Back in tt.) Well, it kinda worked. At least appending to the file. But my cache database got screwed up. I do not yet support replies, so the subject and root hash columns have not been set at all, resulting in a message that is just not shown at all. I gotta do something about that next. The good thing is, though, after simply fixing the two columns the message appeared on screen.
wahhh i wanna work towards my dream of offering pay as you can web hosting (static & dynamic) but i don't know how!!!!! i keep drifting towards hosting panels but i don't exactly have fresh linux servers for those nor do i like the level of access they require. so i'm like ok i can do the static site part with SFTP chroot jails and a front-end like filebrowser or something…. but then what about the dynamic sites!!!!!!! UGH
granted i doubt i'd get much interest in dynamic sites but i'd like to do this old school where i can offer people isolated mySQL databases or something for some project (i'm thinking PHP based fanlistings), which means i could do it the old school way of… people ask me to run it and i do it for them. but i kind of want to let people have access to be able to do it themselves just short of giving them SSH access which isn't happening
@andros@twtxt.andros.dev If something fits in a CSV file, it typically doesn't require a database. I agree with that. Depending on the application, more complicated queries might benefit from a database, though. I don't know awk very well, but I could imagine that grep, sed and cut reach their CSV processing limits rather quickly when you have to deal with escaped (multiline) fields.
I only very rarely have to deal with CSV files or databases in my day to day life. Maybe these classic Unix tools offer some tricks I'm not aware of. When I have some more complicated CSV input, I generally reach for Python.
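As a small illustration of where line-oriented tools struggle (the example data is made up): a quoted CSV field may contain commas, doubled quotes and even newlines, which Python's csv module parses correctly while a plain split on commas or newlines would not.

```python
import csv
import io

# One record whose quoted field contains a comma, escaped quotes and a newline.
data = 'id,comment\n1,"He said ""hi, there""\nand left."\n2,plain\n'

for row in csv.reader(io.StringIO(data)):
    print(row)
# ['id', 'comment']
# ['1', 'He said "hi, there"\nand left.']
# ['2', 'plain']
```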
pls elaborate on a "p2p database", "all story" and "Registries".
My first thought goes to something like secure-scuttlebutt, where syncing data through clients is painful and too slow compared to downloading a text file.
Also, I'd like twtxt to avoid becoming an ActivityPub. It works well, but it uses too many resources, IMO.
https://kingant.net/2025/02/mastodon-the-cost-of-running-my-own-server/
I'm in favour of being able to self-host your Web client (like you'd do with a WordPress; twtxt is microblogging, at the end of the day) instead of federated instances, so my first thought is that Registries have many disadvantages, the first one being that someone has to keep them active.
What does the #twtxt community think about having a p2p database to store all history? This will be managed by Registries.
@prologic@twtxt.net We often turn to a database when we can use a plain text file, such as a CSV. With sed or awk, you can run simple queries without using a database.
Did I get the context right?
The other day, after a discussion online, we came to the conclusion that using awk+sed+tr could replace much of the development that requires a database. However, using SQLite to get SQL syntax isn't a bad idea either. What do you think?
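One way that idea can look in practice (file name and columns are invented): load the CSV into an in-memory SQLite database once, and the "query" part becomes plain SQL instead of an awk/sed pipeline.

```python
import csv
import sqlite3

# Hypothetical people.csv with name,city,age columns.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE people (name TEXT, city TEXT, age INTEGER)")

with open("people.csv", newline="") as fh:
    rows = [(r["name"], r["city"], int(r["age"])) for r in csv.DictReader(fh)]
con.executemany("INSERT INTO people VALUES (?, ?, ?)", rows)

# Roughly what an awk filter plus sort would do, but as one SQL statement.
for name, age in con.execute(
        "SELECT name, age FROM people WHERE city = ? ORDER BY age DESC",
        ("Valencia",)):
    print(name, age)
```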
I'm continuing my tt rewrite in Go and quickly implemented a stack widget for tview. The builtin Pages is similar but way too complicated for my use case. I would have to specify a mandatory name and some additional options for each page. Also, it allows me to randomly jump around between pages using names, but only gives me direct access to the first page, not the last one. Weird. I don't wanna remember names. All I really need is a classic stack. You open a new fullscreen dialog and maybe another one on top of that. Closing the uppermost brings you back to the previous one and so on.
The very first dialog I added is viewing the raw message text. Unlike in @arne@uplegger.eu's TwtxtReader, I'm not able to include the original timestamp, though. I don't have it in its original form in the database. :-/
Next up is a URL view.
I think it is not easy to implement; you need a database. Timeline is an elegant solution: read and sort.
FINALLY!! Got #Caddy server up and running and got rid of the nginx proxy manager and MySQL database containers 🥳🥳🥳
What is clean architecture? That's a good question.
Think of it as a pattern for organizing code around good decisions: isolating technologies (you can change the web framework or database without breaking the business logic), easy testing (you only test interfaces and use cases), sharing code between frameworks (entities and use cases), scalability, modularity and standardized naming. Clean architecture is not perfect: it has a learning curve and adds some abstraction around each technology. You may even run into pushback from your colleagues.
I have a good article on this topic.
https://programadorwebvalencia.com/implementando-arquitectura-limpia-en-python/
#python
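A very small sketch of that separation (all names invented): the entity and use case only know about an abstract repository, so the web framework or database behind it can be swapped without touching business logic, and tests can use an in-memory fake.

```python
from dataclasses import dataclass
from typing import Optional, Protocol

@dataclass
class Article:                      # entity: pure data, no framework imports
    slug: str
    title: str

class ArticleRepository(Protocol):  # boundary the use case depends on
    def by_slug(self, slug: str) -> Optional[Article]: ...

class GetArticle:                   # use case: business rules only
    def __init__(self, repo: ArticleRepository) -> None:
        self.repo = repo

    def __call__(self, slug: str) -> Article:
        article = self.repo.by_slug(slug)
        if article is None:
            raise LookupError(slug)
        return article

class InMemoryArticles:             # test double; a SQL adapter would have the same shape
    def __init__(self, items: dict) -> None:
        self.items = items
    def by_slug(self, slug: str) -> Optional[Article]:
        return self.items.get(slug)

print(GetArticle(InMemoryArticles({"hola": Article("hola", "Hola")}))("hola"))
```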
been playing with making fun scripts using charm CLI's gum library :P
one that gets lyrics from an open lyrics database's API and accepts input for artist & song names: https://asciinema.org/a/697860
and one that uses a user-provided last.fm API key to pull what's currently playing or what last played on your account :) https://asciinema.org/a/697874
@prologic@twtxt.net that "little database that could" is simply amazing, isn't it? I run Conduwuit (nevermind, this one is RocksDB), and GoToSocial using it as a backend, no issues. And, of course, sqlite is the database of choice for a lot of things under iOS.
I demand full 9-digit nanosecond timestamps and the full TZ identifier as documented in the tz 2024b database! I need to know if there was a change in daylight savings as per the locality in question as of the provided date.
BTW this code doesn't incorporate existing twts into jenny's database. It's best used starting from scratch. I've been testing it using a custom XDG_CACHE_HOME and XDG_CONFIG_HOME to avoid messing with my "real" jenny data.
I wrote some code to try out non-hash reply subjects formatted as (replyto ), while keeping the ability to use the existing hash style.
I don't think we need to decide all at once. If clients add support for a new method then people can use it if they like. The downside of course is that this costs developer time, so I decided to invest a few hours of my own time into a proof of concept.
With apologies to @movq@www.uninformativ.de for corrupting jenny's beautiful code. I don't write this expecting you to incorporate the patch, because it does complicate things and might not be a direction you want to go in. But if you like any part of this approach feel free to use bits of it; I release the patch under jenny's current LICENCE.
Supporting both kinds of reply in jenny was complicated because each email can only have one Message-Id, and because it's possible the target twt will not be seen until after the twt referencing it. The following patch uses an sqlite database to keep track of known (url, timestamp) pairs, as well as a separate table of (url, timestamp) pairs that haven't been seen yet but are wanted. When one of those "wanted" twts is finally seen, the mail file gets rewritten to include the appropriate In-Reply-To header.
Patch based on jenny commit 73a5ea81.
https://www.falsifian.org/a/oDtr/patch0.txt
Not implemented:
- Composing twts using the (replyto …) format.
- Probably other important things I'm forgetting.
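The patch itself is in the link above; as a rough sketch of the bookkeeping it describes (table, column and file names invented here): one table records (url, timestamp) pairs already seen and the mail file they produced, another records pairs that are wanted but not yet seen, so waiting mail files can be rewritten once the target finally shows up.

```python
import sqlite3

con = sqlite3.connect("replyto.db")   # hypothetical database file
con.executescript("""
    CREATE TABLE IF NOT EXISTS seen (
        url TEXT NOT NULL, timestamp TEXT NOT NULL, mail_file TEXT NOT NULL,
        PRIMARY KEY (url, timestamp)
    );
    CREATE TABLE IF NOT EXISTS wanted (
        url TEXT NOT NULL, timestamp TEXT NOT NULL, reply_mail_file TEXT NOT NULL,
        PRIMARY KEY (url, timestamp, reply_mail_file)
    );
""")

def twt_seen(url: str, timestamp: str, mail_file: str) -> list:
    """Record a newly seen twt; return mail files of replies that were
    waiting for it, so their In-Reply-To header can now be rewritten."""
    con.execute("INSERT OR IGNORE INTO seen VALUES (?, ?, ?)",
                (url, timestamp, mail_file))
    waiting = [r[0] for r in con.execute(
        "SELECT reply_mail_file FROM wanted WHERE url = ? AND timestamp = ?",
        (url, timestamp))]
    con.execute("DELETE FROM wanted WHERE url = ? AND timestamp = ?",
                (url, timestamp))
    con.commit()
    return waiting
```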
Thinking of building a simple "Things our kids say" database form, using Node, Express and SQLite3. Going beyond simple text files.
From my small experience in writing an event database, I am inclined to agree with this.
If youāre looking for a cool p2p database system have a look at www.earthstar-project.org
Hi, I am playing with making an event sourcing database. It's super alpha, but I thought I would share since others are talking about databases and such.
It's super basic. Using tidwall/wal as the disk backing. The first use case I am playing with is an implementation of msgbus. I can post events to it and read them back in reverse order.
I plan to expand it to handle other event sourcing type things like aggregates and projections.
Find it here: sour-is/ev
@prologic@twtxt.net @movq@www.uninformativ.de @lyse@lyse.isobeef.org
The complexity is a feature. It means standards can be replaced with products that let providers get their cut. It means putting data into the slowest, most expensive database in terms of cost and environmental impact.
The lospec palette list is a database of palettes for pixel art: [[https://lospec.com/palette-list]] #links #pixelart #color
huh. it seems that dumping + gzipping a SQLite database can sometimes have better compression than gzipping the SQLite database directly. cool. #sqlite
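That is easy to check on any database file (paths here are made up): gzip the raw file, then gzip the SQL text produced by Connection.iterdump(), which is roughly what `.dump` in the sqlite3 shell emits, and compare sizes.

```python
import gzip
import os
import sqlite3

DB = "feed.db"   # hypothetical existing SQLite file

# 1) gzip the database file as-is
with open(DB, "rb") as fh, gzip.open(DB + ".gz", "wb") as out:
    out.write(fh.read())

# 2) gzip the SQL text dump of the same database
con = sqlite3.connect(DB)
with gzip.open("feed.dump.sql.gz", "wt") as out:
    for line in con.iterdump():
        out.write(line + "\n")

print(os.path.getsize(DB + ".gz"), "vs", os.path.getsize("feed.dump.sql.gz"))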
here is the script I use to convert my twtxt feed into a SQLite database: !twtxt_sqlite
a unique thing I do with my twtxt feed is convert it to a SQLite database. This, combined with the Janet + SQLite scripting abilities available in SQLite, could provide interesting metrics and insights over time.
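The actual script is the !twtxt_sqlite one referenced above; as a rough illustration of the idea only (not that script, feed URL invented), a twtxt feed is just tab-separated timestamp and text lines, so it maps onto a two-column table:

```python
import sqlite3
import urllib.request

FEED_URL = "https://example.com/twtxt.txt"   # hypothetical feed

con = sqlite3.connect("twtxt.db")
con.execute("CREATE TABLE IF NOT EXISTS twts (ts TEXT PRIMARY KEY, text TEXT)")

with urllib.request.urlopen(FEED_URL) as resp:
    for line in resp.read().decode("utf-8").splitlines():
        if not line or line.startswith("#"):
            continue                          # skip comments and metadata
        ts, _, text = line.partition("\t")
        con.execute("INSERT OR REPLACE INTO twts VALUES (?, ?)", (ts, text))
con.commit()

# Example metric over time: number of twts per month.
for month, n in con.execute(
        "SELECT substr(ts, 1, 7) AS month, COUNT(*) FROM twts "
        "GROUP BY substr(ts, 1, 7) ORDER BY month"):
    print(month, n)
```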
in particular, twtxt provides timestamps. weewiki doesn't really track the passage of time. it only wants to be a key/value database with org markup.
I love it. I have a program that needs to process about half a million records, which will take 3 days. The database that all those records are supposed to go to is acting up after I've just done 140K records.
While certainly not a solution to everything, I find I'm using temporary SQLite databases a bunch to solve problems with a few lines of SQL and less than 50 lines of code (to insert data into the SQLite DB) instead of several hundred lines of code and a bunch of arrays.
@sdk@codevoid.de I have to admit that's true. While I don't call myself an expert, I almost always wore several hats at places I've worked. Programmer, Server Admin, Network Admin, Cable Puller, Telephone Admin, PBX installer, Database Admin, etc