• brewery@feddit.uk · 4 hours ago

    Another option to reduce (but not eliminate) this traffic is a country limit. In Cloudflare you can set a manual security rule to do this; there are self-hosted options too, but they’re harder to set up. What makes sense depends on which country you’re in and where your users are based. My website is a business one, so I only allow my own country (if I’m on holiday I might temporarily open that country to check it’s working, though usually I just use a paid VPN back to my country, so there’s no need). You can also block specific countries instead: most of my blocked requests are from the USA, China, Russia, etc.
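    For illustration, a Cloudflare WAF custom rule along these lines implements the allow-one-country version (the country code is just an example, and older versions of the rules engine used ip.geoip.country rather than ip.src.country):

        Expression: (ip.src.country ne "GB")
        Action:     Block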

  • potatopotato@sh.itjust.works · 18 hours ago

    Currently Anubis seems to be the standard for slowing down scrapers

    https://github.com/TecharoHQ/anubis

    There are also various poison and tarpit systems which will serve scrapers infinite garbage text or data designed to aggressively corrupt the models they’re training. Basically you can be as aggressive as you want. Your site will get scraped and incorporated into someone’s model at the end of the day, but you can slow them down and make it hurt.
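    To illustrate the tarpit idea, here is a minimal toy sketch of my own (not code from any of these projects): it streams endless filler text very slowly, tying up the scraper’s connection for as long as it will stay:

        # toy tarpit: hold the connection open and trickle out junk forever
        import random
        import time
        from http.server import BaseHTTPRequestHandler, HTTPServer

        WORDS = ["lorem", "ipsum", "dolor", "sit", "amet", "consectetur"]

        class Tarpit(BaseHTTPRequestHandler):
            def do_GET(self):
                self.send_response(200)
                self.send_header("Content-Type", "text/plain")
                self.end_headers()
                try:
                    while True:  # never finish the response
                        line = " ".join(random.choices(WORDS, k=12)) + "\n"
                        self.wfile.write(line.encode())
                        self.wfile.flush()
                        time.sleep(5)  # slow drip: waste the scraper's time
                except (BrokenPipeError, ConnectionResetError):
                    pass  # scraper gave up

        HTTPServer(("", 8080), Tarpit).serve_forever()

    Real tarpits generate more plausible-looking text (often Markov-chain output) so crawlers keep following and ingesting it.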

    • David J. Atkinson@c.im · 17 hours ago

      @potatopotato @selfhosted Black Ice exists. Software is hand-to-hand combat. The most #cyberpunk sentence I’ve read today:

      “There are also various poison and tarpit systems which will serve scrapers infinite garbage text or data designed to aggressively corrupt the models they’re training.”

  • Auth@lemmy.world · 18 hours ago

    You could put your website behind a Cloudflare anti-bot check. But realistically, your website is public-facing and these bots are scraping the public web; they will eventually get the data from your website.

  • Nephalis@discuss.tchncs.de · 16 hours ago

    Isn’t fail2ban a possibility too? I created a filter for ChatGPT and some others, and it seems to be working. My Radicale server is my only freely accessible service, but it comes with a small web GUI, so the bots showed up. I have no idea what fraction of your site a bot gets each time it shows up, but if I remember correctly the ban happens within about 300 ms, so it can’t be that much information…

    With maxretry set to 1, it bans on the first matching request.
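    Roughly what such a setup can look like (the filter name, log path, and bot list are illustrative, and this assumes an nginx combined-format access log and bots that still send honest User-Agent headers):

        # /etc/fail2ban/filter.d/ai-bots.conf
        [Definition]
        failregex = ^<HOST> .*"[^"]*(GPTBot|ChatGPT-User|ClaudeBot|Bytespider)[^"]*"$

        # /etc/fail2ban/jail.local
        [ai-bots]
        enabled  = true
        port     = http,https
        filter   = ai-bots
        logpath  = /var/log/nginx/access.log
        maxretry = 1
        bantime  = 86400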

    • JustTesting@lemmy.hogru.ch · 5 hours ago

      A big issue is that this works for bots that announce themselves as such, but there are lots that pretend to be regular users, with fake user agents and IPs drawn from a random pool, each IP sending only something like 1–3 requests/day but many thousands of requests overall. In my experience a lot of them come from Huawei and Tencent cloud ASNs.
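      Against those, one blunt option is dropping an offending ASN’s entire prefix list at the firewall instead of chasing individual IPs; a sketch with ipset (the prefix is an example only; pull the current ranges for the ASN from BGP/WHOIS data):

          ipset create cloud-scrapers hash:net
          ipset add cloud-scrapers 101.32.0.0/16   # example Tencent Cloud range, verify before use
          iptables -I INPUT -m set --match-set cloud-scrapers src -j DROP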

  • irmadlad@lemmy.world · 16 hours ago

    I’m wondering if you could run CrowdSec on the server and manually block the offenders if they are not already in the community blocklists.
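    If so, a manual ban is a one-liner with CrowdSec’s cscli (the IP and duration are placeholders):

        cscli decisions add --ip 203.0.113.7 --duration 48h --reason "AI scraper"

    The community blocklists should then already cover the repeat offenders other users have reported.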