I guess it was just @david@collantes.us and I today, see y’all next time 😅
@doesnm@doesnm.p.psf.lt You don’t generally call go build main.go or whatever. You generally call go build . or go build ./cmd/foo/..., because you need to tell the compiler to build a whole package or a bunch of sub-packages + main. go run main.go only works for the simplest case.
Starting the call: https://meet.mills.io/call/Yarn.social
Come join us!
Very nice presentation! 👏
@sorenpeter@darch.dk I’m there! Just in time I think. Can’t comment though; it wants me to sign up, which I won’t.
@aelaraji@aelaraji.com I knew you’d end up choosing OpenGist 🤣
@aelaraji@aelaraji.com Yeah I’ve been busily refactoring code today to use yt-dlp
under the hood 👌
Same here:
$ youtubedr download 'https://www.youtube.com/watch?v=YpiK1FMy2Mg'
2024/11/23 09:01:12 download to directory .
time=2024-11-23T09:01:12.946+10:00 level=INFO msg="Downloading video" id=YpiK1FMy2Mg quality=medium mimeType="video/mp4; codecs=\"av01.0.01M.08\""
chunk at offset 0 has invalid size: expected=10485760 actual=0
What I’m seeing is some kind of detection going on and the CDN servers responding with 0 bytes.
Wow! Just Wow! 😮
Discovered this whilst trying to debug why my Youtube frontend no longer works:
$ youtube-dl 'https://www.youtube.com/watch?v=YpiK1FMy2Mg'
[youtube] YpiK1FMy2Mg: Downloading webpage
WARNING: unable to extract uploader id; please report this issue on https://yt-dl.org/bug . Make sure you are using the latest version; see https://yt-dl.org/update on how to update. Be sure to call youtube-dl with the --verbose flag and include its complete output.
ERROR: unable to download video data: HTTP Error 403: Forbidden
@sorenpeter@darch.dk Post us a link to the livestream as you’re about to go on? 🙏
@sorenpeter@darch.dk Cool! 😎
@bender@twtxt.net I no longer do, no. But I do run https://gist.mills.io/
@ both look pretty good and delicious to me 😀
@bender@twtxt.net Can’t say I have sorry 😔
@bender@twtxt.net Fair enough 😀
\u2028
from the description. Also, could the description field be changed to a textarea instead? Preferably to one that will “understand” new lines and convert them to \u2028 automatically?
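For context, twtxt feeds keep a multi-line twt on a single physical feed line by using U+2028 (LINE SEPARATOR) in place of ordinary newlines. A minimal sketch of the conversion such a textarea would need, using only POSIX tools (the awk one-liner is an illustration, not yarnd’s actual code):

```shell
# Replace each \n between lines with U+2028 (UTF-8 bytes E2 80 A8,
# written as octal escapes \342\200\250) so the text fits on one line.
text='first line
second line'
single="$(printf '%s' "$text" | awk 'NR>1 {printf "\342\200\250"} {printf "%s", $0}')"
printf '%s\n' "$single" | wc -l   # the twt now occupies a single line
```

awk emits the U+2028 bytes before every record after the first, so the feed line stays intact while clients can still render the breaks.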
@bender@twtxt.net Got it! 👌
@bender@twtxt.net Perhaps it might be better to describe a “Pod” or “Yarn Pod” as a Web Application or Desktop/Mobile App that provides a good user experience and a decentralised set of capabilities for following and interacting with one or more Twtxt feeds? 🤔 By that definition, even Jenny would fit that bill 😉
The only reason that Yarn was ever referred to as a pod was because it supported multiple users.
@aelaraji@aelaraji.com Nice! 👌
@aelaraji@aelaraji.com Evening! 👋 What’s up? 🤔
vim
more often, just because it honestly does run better on my machines. The mode-based UX still hasn't grown on me, but I'm getting used to it.
@gallowsgryph@prismdragon.net i’ve been an exclusive vim power user for some 30+ years now I think 🤣
@movq@www.uninformativ.de It’s been raining here non-stop for the past two or three days too 😱
vim
more often, just because it honestly does run better on my machines. The mode-based UX still hasn't grown on me, but I'm getting used to it.
@gallowsgryph@prismdragon.net I even use vim inside VSCode these days 🤣
@cuaxolotl@sunshinegardens.org What do you mean by this?
eugen and his interlocutors have had immense power with which to challenge twitter but their racial and cultural and ideological insularity prevented them from using it
Can you share examples? 🤔
@bender@twtxt.net Wrong! 💯 It’s my fault 🤣
@bender@twtxt.net Thanks for this! Do you remember the numerous times I’ve pointed out the nuances between distributed networks and decentralized networks? With Bluesky it’s even worse: the way they are operating, building, and maintaining their service is closer to a centralized service than anything remotely resembling what we would consider “decentralized”.
@bender@twtxt.net Same 😅
yarnd automatically rotates at the configured maximum fetch size.
By the way, here’s a copy of the Email I sent to my Federal MP (Elizabeth Watson-Brown):
@doesnm@doesnm.p.psf.lt I think it’s worth looking at the Web Browser Timeline to really understand the history and derivation of web browsers over time. I agree that building a Web Browser is complicated and hard, but that’s only because of the expectations we place on web browsers today and the enormous set of features they now carry. Ultimately we’re still talking about one of the most powerful and simplest protocols ever invented, the Hypertext Transfer Protocol, and Hypermedia Systems.
But that’s not what I meant when I said “The web is seemingly garbage these days”.
@movq@www.uninformativ.de No worries 🤗
@movq@www.uninformativ.de Were you going to add Jenny here? https://twtxt.dev/clients.html
I still wanna know whether you’ll get your pizzas on time 🤣
@rrraksamam@twtxt.net The specs seem fine for a low end laptop. 👌
Ya know, like how yarn is stable 🤣
@bender@twtxt.net True, but it’s also likely pretty stable 😅
@bender@twtxt.net I can see that 😅 I’m thinking about buying two for the Mills DC to use as CI/CD build machines 🤣
Maybe 🤔
Test
Okay that bug is squished (was my bug, not bluge’s 🤣)
Far out, the new Mac Mini is actually cheaper than one from several years ago 😱
I will promote the feature then, as well as webringer and search (soon™), after which we can probably cut a “big ass” release 🤣 (well overdue 🤦♂️)
@bender@twtxt.net All good 👍
@xuu Which ones in particular do you think could be applied in a truly decentralised context of say Twtxt/Yarn? Hmm? 🧐
Fuck! 🤦♂️ I keep finding bugs in bluge 😢
@bender@twtxt.net If it’s to be removed, what should replace it? 🤔 It’s been enabled as a feature for a while now on my pod, and I do use it occasionally.
@bender@twtxt.net Fair
One of the things I’m going to work on next (maybe today, we’ll see how much time there’s left in the day) is being able to load up old conversations (fallen off the cache) like this one.
This pod is now using the index for archived twts instead of the old (naive) disk-based index that results in millions of files over a long time 🤣