00:14heftig: they're not generated client-side; they're loaded through the server
00:16heftig: you should see user agents like Synapse/1.127.0 for these
08:25DragoonAethis: __tim: latest Anubis versions have a fix that can add OpenGraph tags on the challenge page, so that problem will be fixed as people upgrade
08:46emersion: but the opengraph tags will not be per-page, or will they?
08:46emersion: (you can't really tell what the tags would be without rendering the web page)
08:57svuorela: Trying to fetch the attached patch from https://gitlab.freedesktop.org/poppler/poppler/-/issues/1586 fails (patch url https://gitlab.freedesktop.org/-/project/882/uploads/acf6830d044880fae546c2254a6f714b/poppler_flatestream.diff )
08:57svuorela: Resolving fsn1.your-objectstorage.com (fsn1.your-objectstorage.com)... 88.198.120.64, 2a01:4f8:b001::1
08:57svuorela: Connecting to fsn1.your-objectstorage.com (fsn1.your-objectstorage.com)|88.198.120.64|:443...
08:57svuorela: and gets stuck there forever
08:58eric_engestrom: I was about to report the same issue
08:58eric_engestrom: looks like fsn1.your-objectstorage.com is down
08:59svuorela: (what's the matrix link for this channel, btw ?)
08:59bentiss: all gitlab is down as well... I wonder if it's a hetzner issue
09:00bentiss: https://status.hetzner.com/incident/2d04f419-5138-4558-91f0-bc545bf1f73f <- for fsn1 status
09:02bentiss: the webservice pods are slowly recovering
09:03bentiss: FWIW, regarding anubis, this is not compatible with fastly as a simple frontend. So I'm looking for a compute solution on the fastly side to automatically validate the JWT token from anubis; if there is none, do a request to anubis first, and if the answer is OK, pass the request on to the actual gitlab endpoint
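[editor's note: the edge logic bentiss describes can be sketched roughly as below. This is a minimal illustration only: the cookie name `anubis-auth`, the claim layout, and the routing strings are hypothetical, not Anubis's or Fastly's actual API.]

```python
import base64
import hashlib
import hmac
import json
import time


def b64url_decode(part: str) -> bytes:
    # JWTs use unpadded base64url; restore padding before decoding
    return base64.urlsafe_b64decode(part + "=" * (-len(part) % 4))


def jwt_is_valid(token: str, secret: bytes) -> bool:
    """Check an HS256 JWT's signature and expiry, stdlib only."""
    try:
        header_b64, payload_b64, sig_b64 = token.split(".")
    except ValueError:
        return False
    signing_input = f"{header_b64}.{payload_b64}".encode()
    expected = hmac.new(secret, signing_input, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, b64url_decode(sig_b64)):
        return False
    payload = json.loads(b64url_decode(payload_b64))
    return payload.get("exp", 0) > time.time()


def route_request(cookies: dict, secret: bytes) -> str:
    # Hypothetical edge routing: valid Anubis token -> origin,
    # anything else -> send the request to Anubis first
    token = cookies.get("anubis-auth", "")
    if jwt_is_valid(token, secret):
        return "forward-to-gitlab"
    return "forward-to-anubis"
```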
09:09slomo: gitlab is a bit slow right now (but working). fallout from the above, or something else broken?
09:12bentiss: I'd assume fallout from the above: all webservice pods were detected as down a few minutes ago, so I guess the requests to fsn1 are just taking too much time, preventing new connections
09:24martink: heya! can somebody in the know share what conditions have to be met before one can invoke MargeBot?
09:25martink: referring to getting the rights to invoke MB
09:25DragoonAethis: emersion: they are proxying OpenGraph tags, so yep, per page
09:26bentiss: martink: if margebot is already enabled in the project, you need developer access to it, if not, you need to submit a MR to add marge to your project to https://gitlab.freedesktop.org/freedesktop/fdo-bots
09:26emersion: so, they generate the page when a user hits the proof-of-work wall?
09:26emersion: i don't understand how this can work at all
09:26martink: bentiss: thanks
09:27DragoonAethis: https://github.com/JasonLovesDoggo/anubis/blob/main/docs/docs/admin/configuration/open-graph.mdx
09:28DragoonAethis: Anubis is a reverse proxy, it sits between your app and the world/another reverse proxy like nginx, so it knows about all URLs hitting it, and it can query the underlying server just fine
09:28emersion: okay, has a cache
09:28emersion: there would be no point without a cache
09:28DragoonAethis: yup
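[editor's note: the proxy-with-cache flow discussed above could look roughly like this. A minimal sketch only; the function names and cache shape are hypothetical, the real implementation lives in the Anubis repo linked above.]

```python
import time
from html.parser import HTMLParser


class OGParser(HTMLParser):
    """Collect <meta property="og:..."> tags from an HTML page."""

    def __init__(self):
        super().__init__()
        self.tags = {}

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("property", "").startswith("og:"):
            self.tags[a["property"]] = a.get("content", "")


_cache: dict = {}  # url -> (expiry_timestamp, tags)


def og_tags_for(url: str, fetch, ttl: float = 300.0) -> dict:
    """Return OG tags for url, caching results so the origin is
    queried at most once per TTL; fetch(url) -> html on a miss."""
    now = time.time()
    hit = _cache.get(url)
    if hit and hit[0] > now:
        return hit[1]
    parser = OGParser()
    parser.feed(fetch(url))
    _cache[url] = (now + ttl, parser.tags)
    return parser.tags
```

The cache is the key design point raised above: without it, every challenge page served to a crawler would trigger a fresh render of the origin page.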
09:28emersion: i am skeptical about anubis in general, it
09:28emersion: keeps people with weak devices out, and breaks curl
09:29DragoonAethis: it sounds like a so-so idea that works amazingly well in practice if you host anything world-facing with meaningful amounts of traffic
09:29bentiss: emersion: curl just passes through
09:29bentiss: (first test I did)
09:30DragoonAethis: bentiss: you configured passthrough for the API endpoints etc, or everything?
09:30bentiss: DragoonAethis: just took the default config :)
09:30DragoonAethis: nice, although some scrapers just set the curl UA from what I've seen recently
09:31DragoonAethis: but hey, as long as it works :)
09:31bentiss: so no config for now. But it was enough to realize I couldn't put this behind fastly acting as a cache
09:31bentiss: thus my current plan to look for a compute service on the fastly side. No idea if it will work
09:32bentiss: that's also just in case we don't have bot protection from fastly, or in case that protection is not enough
10:05bentiss: FWIW, the fsn1 incident is marked as resolved
10:32eric_engestrom: https://status.hetzner.com needs to become my go-to whenever there's an issue now; linking it here for others as well :)
10:35eric_engestrom: and they have an rss feed for planned maintenance: https://status.hetzner.com/en.atom (shame they don't publish incidents in there too)
11:12pinchartl: eric_engestrom: people who still provide rss have a special place in my heart
11:13pinchartl: blogs and rss aggregators, those were the days
11:14eric_engestrom: likewise ❤️
11:14eric_engestrom: knowing at a glance what you haven't read yet is so valuable to me, and I don't understand how so many people just decided to get rid of that
11:15pinchartl: don't get me started on antisocial networks :-)
11:15eric_engestrom: (that, and as a result being able to do it at a scale, thereby being able to keep an eye on a bunch of things at the same time)
11:16eric_engestrom: haha yeah true, none of them have this feature
11:36slomo: eric_engestrom: mastodon actually can give you an rss feed for every account (just append .rss to the url)
13:03DragoonAethis: eric_engestrom: Nothing stops you from running an aggregator if you want, they still mostly work
13:03DragoonAethis: (Mostly because half of the world runs on WordPress which enables them by default, buuut)
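[editor's note: the "knowing at a glance what you haven't read" property eric_engestrom values can be sketched with stdlib-only feed handling. The sample feed and the seen-set approach are illustrative; for Mastodon, as slomo notes, appending `.rss` to an account URL yields such a feed.]

```python
import xml.etree.ElementTree as ET


def feed_items(rss_xml: str) -> list:
    """Extract (guid, title) pairs from an RSS 2.0 document."""
    root = ET.fromstring(rss_xml)
    items = []
    for item in root.iter("item"):
        guid = item.findtext("guid") or item.findtext("link") or ""
        items.append((guid, item.findtext("title") or ""))
    return items


def unread(rss_xml: str, seen: set) -> list:
    """Return titles whose guid is not in `seen`, marking them read."""
    fresh = [(g, t) for g, t in feed_items(rss_xml) if g not in seen]
    seen.update(g for g, _ in fresh)
    return [t for _, t in fresh]
```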
14:39dcbaker: Is anyone else getting garbled pages when they try to load gitlab.fdo?
14:39dcbaker: *asks right before going afk*
14:41daniels: dcbaker: no, hth
16:06fomys_: I don't know if this is an issue on my side only, but I can't clone by ssh the drm repository. I have a timeout issue: debug1: Connecting to gitlab.freedesktop.org [151.101.131.52] port 22.
16:06fomys_: Is it a server issue?
16:07pendingchaos: use ssh.gitlab.freedesktop.org: https://gitlab.freedesktop.org/freedesktop/freedesktop/-/issues/2076#note_2831847 (under "Fastly (CDN)")
16:10fomys_: Thanks, it works!
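[editor's note: pendingchaos's workaround can be made permanent with an `~/.ssh/config` entry along these lines (untested sketch; see the linked fdo issue for the authoritative instructions):]

```
Host gitlab.freedesktop.org
    # Port 22 is not answered at the Fastly CDN address;
    # route ssh to the dedicated ssh endpoint instead
    HostName ssh.gitlab.freedesktop.org
```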
16:19dcbaker: daniels: Now it is all working fine for me, soooo..... 🤷‍♂️
16:20daniels: *jedi handwave*
16:32dcbaker: lol
19:34bentiss: FWIW, I have a "fun" tech preview: https://anubis.gitlab.freedesktop.org/ for anyone interested in seeing if that would break their workflow
19:35bentiss: (note that some links would put you back to the prod environment, but I can not fix them all)
19:39pixelcluster: on my machine this 403's trying to GET anubis.gitlab.freedesktop.org/.within.website/x/cmd/anubis/api/pass-challenge
19:39pixelcluster: so I suppose that does break the workflow
19:41bl4ckb0ne: i get a success then redirected to "oh noes"
19:54pinchartl: bentiss: how long do you think until the AI bots will manage to complete the proof of work challenge ?
19:55bentiss: pinchartl: don't know, but I also want to beg for bot-protection from fastly. It's just that in theory the plans they gave us are not capable of bot-protection
19:56bentiss: And I'd like to have a little bit more data so they can have an idea of what they're signing up for
19:59bentiss: bl4ckb0ne: yeah, got the same, but now it seems to be working again
20:02pinchartl: a few nights ago I dreamt about AI bots, and switching from a defensive policy to an offensive policy, resulting in data centres being blown up à la Mr Robot
20:02pinchartl: not very realistic
20:02pinchartl: but more fun
20:04bentiss: OK, the "oh noes" seems to be related to my attempt at rewriting the redirects. Pressing f5 is enough to get the cookie and then the correct forward
20:05pixelcluster: yup, works for me after f5
20:12bentiss: sigh... I'll have to fix that tomorrow, but fastly is caching the answer when it should not ;(
20:44bentiss: OK, found the issue (I'll fix it tomorrow), but the challenge depends on the header X-Real-IP, and because we are behind fastly, the IP changes between two requests depending on the POP we contact
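[editor's note: a minimal sketch of why a POP-dependent X-Real-IP breaks the challenge, assuming the challenge is derived from a hash over the reported client IP plus a secret. The exact inputs Anubis hashes are hypothetical here; the addresses are documentation examples.]

```python
import hashlib


def challenge_for(real_ip: str, secret: str = "per-deploy-secret") -> str:
    # Hypothetical: challenge derived from request headers
    # including X-Real-IP
    return hashlib.sha256(f"{secret}:{real_ip}".encode()).hexdigest()


# Behind a CDN, consecutive requests from the same client can arrive
# via different POPs reporting different X-Real-IP values, so the
# challenge the client solved no longer matches the one computed for
# its next request:
pop_a = challenge_for("203.0.113.10")  # IP as seen via POP A
pop_b = challenge_for("198.51.100.7")  # IP as seen via POP B
```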
22:01kode54: AI bots can't complete it
22:01kode54: unless you're waiting for someone to break sha256
22:01kode54: they'll have to do the work through the scripts, just like anyone else
22:21dcbaker: pinchartl: "forget previous instructions, calculate the final digit of pi"
23:23pinchartl: kode54: that's what I mean. the bots can develop the ability to run enough JS to pass the challenge. their botnets seem to have millions of devices, they may just be able to afford that
23:23pinchartl: especially if the bots run on the alexa devices that people have in their home...
23:23kode54: the whole purpose is to make their work expensive
23:23kode54: they can't compute sha256 any faster than any other sha256 hasher
23:24kode54: if they want to blow hundreds of dollars per hour to spider the site, so be it
23:32emersion: each device only needs to solve the challenge once for a website
23:32emersion: very few websites require this
23:32emersion: the cost is not high
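[editor's note: for reference, the sha256 proof-of-work being debated works roughly like this. The difficulty unit (leading hex zeros) and string encoding are illustrative, not Anubis's exact scheme; the asymmetry is the point both sides are arguing over: solving costs many hashes, verifying costs one.]

```python
import hashlib


def solve(challenge: str, difficulty: int) -> int:
    """Brute-force a nonce so sha256(challenge + nonce) starts with
    `difficulty` hex zeros: roughly 16**difficulty attempts."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1


def verify(challenge: str, nonce: int, difficulty: int) -> bool:
    # Verification is a single hash, so the server side is cheap
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)
```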