The public instances that give the best results also seem to get throttled by the upstream search engines pretty often, to the point of near uselessness.

Thinking of hosting my own, but the maintenance seems pretty involved according to the docs.

What’s your experience been like?

Edit: all right y’all, thanks for the feedback. I’m going to spin up an instance.

  • ohshit604@sh.itjust.works · 8 points · edited 6 days ago

    I host my own SearXNG via Docker Compose, reverse proxied through Traefik, with a few security headers added and access restricted to my country to help prevent abuse.
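
    For reference, a stripped-down sketch of what that looks like in compose (the domain, network name and entrypoint are placeholders from my setup, and the per-country blocking isn’t shown):

      # docker-compose.yml (sketch, adjust to your own Traefik setup)
      services:
        searxng:
          image: searxng/searxng:latest
          container_name: searxng
          restart: unless-stopped
          environment:
            - SEARXNG_BASE_URL=https://searx.yourdomain.tld/
          volumes:
            - ./searxng:/etc/searxng            # settings.yml lives here
          networks:
            - proxy                             # shared network with Traefik
          labels:
            - traefik.enable=true
            - traefik.http.routers.searxng.rule=Host(`searx.yourdomain.tld`)
            - traefik.http.routers.searxng.entrypoints=websecure
            - traefik.http.routers.searxng.tls=true
            # a couple of security headers via a Traefik middleware
            - traefik.http.middlewares.searxng-headers.headers.stsSeconds=31536000
            - traefik.http.middlewares.searxng-headers.headers.frameDeny=true
            - traefik.http.middlewares.searxng-headers.headers.contentTypeNosniff=true
            - traefik.http.routers.searxng.middlewares=searxng-headers
      networks:
        proxy:
          external: true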

    I use it daily; the only real complaint I have is that it occasionally doesn’t search when I type in the browser’s address bar. What I mean is I’ll type a search query and, instead of redirecting to the results (searx.yourdomain.tld/search?q=test), it’ll just redirect to the homepage of my SearXNG instance (searx.yourdomain.tld), forcing me to retype my query. Annoying, but not the end of the world.

    • irmadlad@lemmy.world · 2 points · 6 days ago

      searx.yourdomain.tld/search?q=test

      Do you have the %s in the search string (searx.yourdomain.tld/search?q=%s)? For instance, in Firefox, when I added my SearX instance as a search engine, I had to include the %s to get it to search properly.

      • ohshit604@sh.itjust.works · 4 points · 6 days ago

        Yup, it works 90% of the time. Happens on all devices, so I suspect Searx is just running into an error of some sort. Too lazy to investigate.

  • deleted@lemmy.world · 5 points · 7 days ago

    I wish we had a free-to-use, open-source, privacy-respecting search engine that does its own crawling and indexing and doesn’t rely on other search engines.

    Maybe we could use all the self-hosted instances to do the crawling and consolidate the results.

  • liliumstar@lemmy.dbzer0.com · 2 points · 8 days ago

    I host it and use it as my default search on all devices. Bare-metal deployment. The maintenance is pretty low; I just run the instance update script from time to time.

    Results have been worse lately; I think it needs some tuning with regard to engine weights and which engines are enabled.
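
    If it helps anyone, that tuning mostly lives in settings.yml; roughly along these lines (a sketch, the engine names and numbers are just an illustration):

      # /etc/searxng/settings.yml (sketch, merged over the defaults)
      use_default_settings: true
      engines:
        - name: duckduckgo
          weight: 2            # boost this engine in the ranking
        - name: qwant
          weight: 1.5
        - name: mojeek
          timeout: 6.0         # give slower engines a bit more time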

  • fccview@lemmy.world · 17 points · 7 days ago

    Hey, I know you got a ton of replies, but yeah, I’ve been using SearXNG with a custom theme I made, and it’s basically identical to Google (including the feeling lucky part lol).

    I’ve used it for months and it’s awesome; I haven’t missed Google at all.

    The amazing thing about it is that with an instance of Meilisearch I was able to index all my media libraries/book libraries/game libraries, and searching for !home <query> actually searches within my home lab, which is a huge win for me.
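
    In case anyone wants to copy the idea, the engine entry in settings.yml looks roughly like this (a sketch from memory; the URL, index name and shortcut are mine, so double-check the option names against the bundled meilisearch engine on your version):

      # settings.yml excerpt (sketch), this is what gives you the !home bang
      engines:
        - name: homelab
          engine: meilisearch                  # bundled SearXNG engine
          shortcut: home                       # !home <query>
          base_url: http://meilisearch:7700    # my Meilisearch container
          index: homelab                       # index holding media/books/games
          enable_http: true                    # plain HTTP inside the lab
          disabled: false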

    Hope this helps give you an idea of how powerful this can be <3

    • irmadlad@lemmy.world · 4 points · 6 days ago

      The amazing thing about it is that with an instance of Meilisearch I was able to index all my media libraries/book libraries/game libraries, and searching for !home <query> actually searches within my home lab, which is a huge win for me.

      I’m intrigued. I’ve always wanted to point my search engine at my ebook library and be able to search it for data. Scrape my library, as it were. I’ve also wanted to change the SearXNG logo, to personalize it.

      • fccview@lemmy.world · 2 points · 4 days ago

        Nice!

        The only drawback of heavily customising it is that it’s no longer compatible with the latest versions. Unfortunately they’ve restructured their codebase, and I frankly don’t have the time to redo all my hard work, so I’ve been running a very old (but extremely stable) version of it lol

  • isgleas@lemmy.ml · 4 points · 8 days ago

    I have it set up on each of my laptops, so I have it available at all times with no need to expose it from my home setup.

    I automatically start the container on my laptop and add it to my browser’s search engines as the default. Pretty simple.
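
    For anyone curious, the local-only compose file can be tiny (a sketch, binding to localhost so nothing is exposed):

      # docker-compose.yml on the laptop (sketch)
      services:
        searxng:
          image: searxng/searxng:latest
          restart: unless-stopped
          ports:
            - "127.0.0.1:8080:8080"   # localhost only
          volumes:
            - ./searxng:/etc/searxng

    The browser search engine entry is then just http://localhost:8080/search?q=%s.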

  • マリウス@lemmy.sdf.org · 2 points · 7 days ago

    I have been hosting multiple SearXNG instances, the newest one (https://マリウス.com/be-your-own-privacy-respecting-google-bing-brave/) being a private instance for my own community channel, and it has been relatively smooth sailing.

    Some niche engines, e.g. Mojeek, seem to be notoriously slow, but that might also depend very much on the VPS your requests are coming from.

    If, however, big engines like Bing or Google are blocking/throttling you, it might be due to your IP/subnet reputation and it might be worth switching your host.

    Alternatively, you could overengineer a setup in which you round-robin route your SearXNG requests through a number of simultaneously running WireGuard tunnels from a VPN provider to obscure your traffic.
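
    SearXNG’s outgoing proxy settings can do the rotation for you if each tunnel exposes a SOCKS endpoint; something like this (a sketch, assuming WireGuard/gluetun-style containers exposing SOCKS5 on those ports, and check the outgoing.proxies docs for the exact rotation behaviour):

      # settings.yml excerpt (sketch)
      outgoing:
        proxies:
          all://:
            - socks5h://127.0.0.1:9050   # tunnel 1
            - socks5h://127.0.0.1:9051   # tunnel 2
            - socks5h://127.0.0.1:9052   # tunnel 3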

    However, in my experience, most VPN providers suffer heavily from Cloudflare and reCAPTCHA blocks, hence YMMV.

  • root@lemmy.world · 31 points · 8 days ago

    Self-hosting it; the maintenance is painless, but the results have been pretty terrible lately.

      • root@lemmy.world · 13 points · 8 days ago

        Recently I’ll often get results that aren’t at all related to what I searched for. I also get a lot of timeouts from the upstream search engines, and sometimes I get results that are in Chinese for some reason

        • fizzle@quokk.au · 11 points · 7 days ago

          There’s an open issue about this on GitHub: the remote API is only recognising the first word of your query.

          This has been bugging me too.

          The timeouts are because the engines are presenting captchas. There’s a workaround whereby you use your instance as a proxy, navigate to that remote engine, and do the captcha.

          These two issues are a real pain in the ass, so while I do presently have a SearXNG instance, I’ve been using Qwant for the last few weeks because I’m just over it.

        • Engywook@lemmy.zip · 8 points · 7 days ago

          IIRC these are related to Bing misbehaving. There should be an open issue about that. Try deactivating it in preferences as a workaround.
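
          If you’d rather do it instance-wide than per-browser, the same thing can go into settings.yml (sketch):

            # settings.yml excerpt (sketch)
            engines:
              - name: bing
                disabled: true   # skip Bing until the upstream issue is fixed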

      • root@lemmy.world · 3 points · 8 days ago

        Lol, thanks. I think it’s still worth it if you want more ownership over your searches, but just expect some rough edges

  • sj_zero@lotide.fbxl.net · 4 points · 6 days ago

    I’ve been running my own; it’s mostly automated now. I started a YaCy instance as well, so not only am I aggregating the bigger websites, I’m also including the sites I crawled myself and the other sites available on YaCy through its huge P2P search functionality. In this little way, I’m trying to make sure my search isn’t totally dominated by corporate search.
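
    For anyone wanting to try the same, SearXNG ships a YaCy engine that can point at a local peer; roughly like this (a sketch, I’m not certain of every option name, so check searx/engines/yacy.py on your version):

      # settings.yml excerpt (sketch)
      engines:
        - name: yacy
          engine: yacy
          shortcut: ya
          base_url: http://localhost:8090   # local YaCy peer
          disabled: false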

    Tbh, YaCy is 1000x harder to keep running than SearXNG.

  • thagoat@lemmy.dbzer0.com · 4 points · 8 days ago

    I have it installed as a Docker container on a server on my home network, use it as my default search on my home machines, and access it on mobile through WireGuard.

  • Suzune@ani.social · 13 points · 7 days ago

    Yes, I self-host it. It’s pretty easy. All you need to know is that you occasionally need to merge your config with the upstream original as it gets updated.

    If you know how to use nvim diff mode, it’s trivial.