Like the title says, I’m new to the self-hosting world. 😀 While I was researching, I found that many people dissuaded me from self-hosting an email server: it’s just too complicated and hard to manage. What other services do you think we should leave to the providers already available on the market, and why? 🙂 Thank you!

  • fourstepper@alien.topB · 10 months ago

    I try to NOT self-host any service that:

    1. Is open-source and provides a hosted solution
    2. Doesn’t have some technical or price barrier (like a media/photo library where the hosted version charges per GB) to using that hosted version
      • If that’s the case, I donate to the project I’m using instead
    • Simplixt@alien.topB · 10 months ago

      Do you have an example?

      “Open source + hosted” always involves trust: you can only inspect the GitHub repository, you can’t verify that the hosted application is actually running that same code.

      The only exception: it’s an end-to-end encrypted (E2EE) solution and everything else happens client-side (example: Bitwarden).

  • rgnissen202@alien.topB · 10 months ago

    I’d say backups. At the very least, they shouldn’t be local-only. I follow the rule of threes: two local copies and one off-site with Backblaze. Yeah, it ties up a not-insignificant amount of disk space I could use for other things, but dammit, I’m not losing my wedding photos, important system configurations, etc.
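
    (A rough sketch of what the off-site leg can look like, using restic with a Backblaze B2 bucket driven from Python; the tool choice, bucket name, credentials and paths are placeholders, not what the commenter actually runs.)

    ```python
    # Sketch of the off-site copy in a "two local, one remote" backup scheme,
    # using restic's Backblaze B2 backend via subprocess. Bucket, credentials
    # and paths are hypothetical placeholders.
    import os
    import subprocess

    env = {
        **os.environ,
        "B2_ACCOUNT_ID": "your-b2-key-id",        # placeholder: B2 application key ID
        "B2_ACCOUNT_KEY": "your-b2-app-key",      # placeholder: B2 application key
        "RESTIC_PASSWORD": "your-repo-password",  # better: RESTIC_PASSWORD_FILE
    }

    # Back up the photo directory to a B2-backed restic repository.
    subprocess.run(
        ["restic", "-r", "b2:my-backup-bucket:home-server", "backup", "/srv/photos"],
        env=env,
        check=True,
    )
    ```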

  • No-Needleworker-9890@alien.topB · 10 months ago

    Passwords:
    -> You want immediate access to them, even if your house burns down.

    Notes:
    -> You want to be able to read the documentation on how to fix your self-hosted services, even when those services are down.

    Public reverse proxy:
    -> A reverse proxy is only as safe as the applications behind it. And NO, most self-hosted applications are not hardened and have not had security audits.
    (A reverse proxy combined with a forward-authentication proxy is a different story.)
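
    (To illustrate the forward-authentication pattern mentioned above: the reverse proxy asks a small auth service about every incoming request and only forwards it to the backend on a 2xx response. The sketch below is hypothetical, using Flask; the route, cookie name and port are assumptions, not any specific product’s API.)

    ```python
    # Minimal sketch of a forward-auth endpoint (hypothetical). A reverse proxy
    # (e.g. Traefik/Caddy/nginx auth_request) calls GET /auth for every incoming
    # request; 200 means "let it through", 401 means "block / send to login".
    from flask import Flask, request

    app = Flask(__name__)
    VALID_TOKENS = {"example-session-token"}  # placeholder; use a real session store

    @app.get("/auth")
    def auth():
        token = request.cookies.get("session", "")
        if token in VALID_TOKENS:
            return "", 200   # proxy forwards the original request to the backend
        return "", 401       # proxy blocks the request (or redirects to a login page)

    if __name__ == "__main__":
        app.run(port=9091)   # port is arbitrary
    ```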

  • shrugal@lemm.ee · edited · 10 months ago

    To the people saying email: look into using an external SMTP server as a relay. Your domain most likely comes with at least one email account with SMTP access. You can use that as a relay to send personal/business emails from your server over the provider’s reputable IP addresses.
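
    (A minimal sketch of sending through such a relay with Python’s standard smtplib; the host, port and credentials are placeholders for whatever your provider gives you.)

    ```python
    # Send mail through a provider's authenticated SMTP relay instead of
    # delivering directly from your own IP. Host/credentials are placeholders.
    import smtplib
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "me@example.com"
    msg["To"] = "friend@example.org"
    msg["Subject"] = "Sent via my provider's relay"
    msg.set_content("Hello from my self-hosted box, relayed over a reputable IP.")

    with smtplib.SMTP("smtp.example.com", 587) as smtp:  # provider's submission port
        smtp.starttls()                                   # upgrade to TLS
        smtp.login("me@example.com", "app-password")      # relay credentials
        smtp.send_message(msg)
    ```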

  • Vogete@alien.topB · 10 months ago

    A password manager because if anything goes wrong, you’ll be completely screwed.

    What you SHOULD absolutely self host though is a password manager, so you can be in control of your most sensitive data.

    Regarding email, I think everyone should absolutely self-host it, but it’s less and less viable in this Google/Microsoft duopoly world. The reason people advise against it really comes down to the lack of real competition, and the two tech giants dictating how we violate every RFC possible.

    • pogky_thunder@alien.topB · 10 months ago

      A password manager because if anything goes wrong, you’ll be completely screwed.

      What you SHOULD absolutely self host though is a password manager, so you can be in control of your most sensitive data.

      Wot?

  • GolemancerVekk@alien.topB · 10 months ago

    Don’t self-host email (SMTP) or public DNS. They’re hard to set up properly, hard to maintain, easy to compromise, and compromised servers end up being used in internet attacks.

    Don’t expose anything directly to the internet unless you’re willing to constantly monitor vulnerability announcements, update to new releases as soon as they come out, monitor the container for intrusions and shenanigans, and accept the risk that constant updates will break something. If you must expose a service, use a VPN (Tailscale is very easy to set up and use).

    Don’t self-host anything with important data that takes uber-geek skills to maintain and access. Ask yourself: if you were to die suddenly, how screwed would your non-tech-savvy family be, who can’t tell a Linux server from a hot plate? Would they be able to keep functioning (calendar, photos, documents etc.) without constant maintenance? Could they still retrieve their files (docs, pics) with only basic computing skills? Could they migrate somewhere else when the server goes down?

    • Zoenboen@alien.topB · 10 months ago

      I’m running Ollama (which runs Llama 2 locally on my Mac). I hosted an LLM for a site that generated the next line of a story, with no issues.

      There’s no reason to shy away from running an LLM at home if you can. People should; the source is out there for a reason.
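
      (A rough sketch of that kind of “next line of the story” call, using Ollama’s local /api/generate endpoint; it assumes Ollama is listening on its default port 11434 with a llama2 model pulled, and the prompt is just an example.)

      ```python
      # Ask a local Ollama instance for the next line of a story.
      # Assumes Ollama runs on its default port (11434) with a llama2 model pulled.
      import json
      import urllib.request

      payload = json.dumps({
          "model": "llama2",
          "prompt": "Continue this story with one more line: The server room hummed softly.",
          "stream": False,  # single JSON response instead of a stream
      }).encode()

      req = urllib.request.Request(
          "http://localhost:11434/api/generate",
          data=payload,
          headers={"Content-Type": "application/json"},
      )
      with urllib.request.urlopen(req) as resp:
          print(json.loads(resp.read())["response"])
      ```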

  • bulletproofkoala@alien.topB · 10 months ago

    Okay, I understand that email hosting is bad for SENDING email, but what about only RECEIVING email? Isn’t it a good idea to keep my stuff private? I rarely send personal emails and would like to avoid my data being used for marketing purposes. Is it bad to have SMTP/IMAP open on a dynamic IP address? Just asking your opinion.

  • dgibbons0@alien.topB · 10 months ago

    Personally I don’t think it’s worth hosting recursive DNS resolvers. Most of the options with ad blocking are single points of failure, and when one breaks, the household acceptance factor is just too low.

    • Vogete@alien.topB · 10 months ago

      Just… set up two Raspberry Pis with Pi-hole instead of one? Chances are your router can take a fallback DNS server. Sure, you have to update the rules in both places, but honestly it’s not a big deal, and you now have redundancy.

      I’m running 2 PowerDNS recursors and authoritative servers, plus 2 Pi-holes (long story why so many), and none of them have failed on me so far. When I took one of them offline, I didn’t notice anything because the other took over. And if anything REALLY fails, I’ll just switch my router back to Cloudflare, Google, or Quad9 temporarily, and at least internet access will be restored so people can keep browsing.

      Pi-hole also has an API and a Home Assistant integration, so you can create an ad-block toggle switch for others in case it blocks something and they need immediate access. Not ideal, but it’s a workable stopgap.

      This is really something that’s super easy to self-host and to mitigate if something goes wrong, especially since that commercial router is already a single point of failure for most households.
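
      (A sketch of the kind of toggle mentioned above, using the classic Pi-hole v5 HTTP API; the host and API token are placeholders, and newer Pi-hole releases expose a different API, so check your version’s docs.)

      ```python
      # Sketch of a "pause ad blocking for 5 minutes" toggle via Pi-hole's
      # classic (v5) HTTP API. Host and API token are placeholders.
      import urllib.parse
      import urllib.request

      PIHOLE_HOST = "http://pi.hole"   # placeholder address of your Pi-hole
      API_TOKEN = "your-api-token"     # Settings -> API in the web UI

      def pause_blocking(seconds: int = 300) -> None:
          query = urllib.parse.urlencode({"disable": seconds, "auth": API_TOKEN})
          url = f"{PIHOLE_HOST}/admin/api.php?{query}"
          with urllib.request.urlopen(url) as resp:
              print(resp.read().decode())  # e.g. {"status": "disabled"}

      pause_blocking()
      ```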

    • vkapadia@alien.topB · 10 months ago

      What is wrong with that? Don’t they still need correct credentials to connect?

      • Korlus@alien.topB · 10 months ago

        The service itself is insecure, so you need to hide it behind a more secure setup if you want to expose it to the internet. It’s been a long while since I tried, but I have some foggy memories of an RDP gateway that would encapsulate the connection in an SSL tunnel and forward it to the remote machine, rather than exposing RDP itself to the internet.

        Definitely do your research on how to do it securely before you just set it up and open it to the wild.
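
        (One common way to avoid exposing RDP directly, sketched below, is to tunnel it over SSH and keep port 3389 closed to the internet; the hostnames, user and addresses are placeholders, not what the commenter used.)

        ```python
        # Sketch: forward local port 3389 over SSH to a desktop on the home LAN,
        # so RDP is never exposed to the internet directly. Hostnames, user and
        # addresses are placeholders.
        import subprocess

        subprocess.run([
            "ssh",
            "-N",                              # no remote command, just forward ports
            "-L", "3389:192.168.1.50:3389",    # local 3389 -> desktop's RDP port via the tunnel
            "user@home-gateway.example.com",   # the only host reachable from outside
        ], check=True)
        # Then point your RDP client at localhost:3389.
        ```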

          • Korlus@alien.topB · 10 months ago

            Oh sure, VPN is definitely the preferred way if you already have the infrastructure in place. My experience with the front-end RDP server was years ago as the sysadmin for a company. My experience is likely very out of date, and was very corporate-focused, rather than for an enthusiast.

            Nowadays I try not to touch Windows, and haven’t used RDP in years.

    • FlockSystem@alien.topB · 10 months ago

      What do you mean by “clearly”? Open RDP without password protection?

      I often use RDP to access my Windows 10 desktop.

    • HashtagMOMD@alien.topB · 10 months ago

      I have a load balancer on my network that exposes a single port on my home network. The load balancer is connected through Cloudflare and the connection is encrypted on both sides. Is that okay?

  • timawesomeness@alien.topB · 10 months ago

    Internet-accessible authoritative DNS nameserver(s) (unless you have a completely static public IP).

  • SwingingTheLamp@midwest.social · 10 months ago

    In my opinion, cloud storage for (zero-knowledge) backups. Your backup strategy should include a diversity of physical locations. I had a house fire a few years ago. Luckily, my data drives survived, but if they hadn’t, my cloud backup would’ve been invaluable.

    • KN4MKB@alien.topB · 10 months ago

      Meh, I’ve been doing it for 5 years now with minimal issues. I had one issue where my domain was flagged as malicious, but it was solved in a few days and a few emails to security vendors.

      I think it’s important that those who can, and are educated enough to keep it running properly, do host their own. Hosting your own email should be encouraged if you’re capable, because it helps reduce the monopoly and keeps a little bit of power with those who want to retain email privacy.

      • rad2018@alien.topB · 10 months ago

        I agree with KN4MKB. I’ve been hosting my own mail server for decades. Not one issue. I use that in lieu of a mail service provider (Google immediately comes to mind), because their EULA will tell you that, since you’re using their service on their servers, anything goes. Read the fine print on Gmail and you’ll see. 😉

    • HecateRaven@alien.topB · 10 months ago

      I’ve been doing it on a bare-metal server I rent for 10 years now without issues, with SPF, DMARC, DKIM and everything set up from scratch (no Docker bloat).
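
      (If you’re setting those records up yourself, here’s a quick sketch for sanity-checking them with dnspython; the domain and DKIM selector are placeholders for your own values.)

      ```python
      # Quick sanity check of a domain's SPF/DMARC/DKIM TXT records
      # using dnspython (pip install dnspython). Domain and selector are placeholders.
      import dns.resolver

      DOMAIN = "example.com"
      DKIM_SELECTOR = "mail"  # whatever selector your signer is configured with

      def txt_records(name: str) -> list[str]:
          try:
              return [r.to_text() for r in dns.resolver.resolve(name, "TXT")]
          except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
              return []

      print("SPF:  ", [r for r in txt_records(DOMAIN) if "v=spf1" in r])
      print("DMARC:", txt_records(f"_dmarc.{DOMAIN}"))
      print("DKIM: ", txt_records(f"{DKIM_SELECTOR}._domainkey.{DOMAIN}"))
      ```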

        • Drwankingstein@alien.topB · 10 months ago

          Docker is horrid for duplication. Unless you use a filesystem with good deduplication, Docker can hurt your storage a lot, and even then it often just doesn’t work due to already-deduplicated extent stuff.

          • TBT_TBT@alien.topB · 10 months ago

            WTF? You obviously don’t understand Docker at all.

            - Docker and Docker images provide the absolute minimum environment necessary to run an application. Containers don’t have reserved resources, so only what is really used is used. A VM has a lot more overhead, as a whole computer plus a complete OS are emulated.

            - There is not much to deduplicate because there is no redundant storage going on. Only the bare OS essentials plus the app are stored. There are base OS images (e.g. Alpine Linux) that are under 10 MB in size.

            - If containers themselves are “big”, you are doing Docker wrong and storing data inside the container instead of externally in volumes or on the host filesystem. With the next container pull, that data would be lost.

            - No idea what “often just doesn’t work due to already-deduplicated extent stuff” is supposed to mean. That doesn’t even make sense.

            • Drwankingstein@alien.topB · 10 months ago

              As someone who is wasting gigabytes upon gigabytes of storage on Docker, get out of here with your stupid bullshit.

              Docker layers aren’t shared properly across containers if the images aren’t set up right, which is the case for a significant number of images.

              The entire point of Docker is that you can easily deploy those specific images, even the ones that aren’t set up right. You can’t just say, “Well, don’t use those images, do something else and build your own,” because that completely defeats the point of Docker.

              I had nearly 80 gigabytes of duplicated garbage across my home system alone after setting up things like surveillance, my NAS, Nitter, etc.

              Don’t come here telling me I don’t understand Docker when you’re the one who has no idea.
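
              (For anyone who would rather measure this than argue about it: a rough sketch using the Docker SDK for Python that counts how many image layers on the host are shared between images versus used by only one. Treat the numbers as an estimate, not a benchmark.)

              ```python
              # Rough sketch: how much layer sharing is actually happening on this host?
              # Uses the Docker SDK for Python (pip install docker); assumes the Docker
              # daemon socket is accessible.
              from collections import Counter

              import docker

              client = docker.from_env()
              layer_counts = Counter()

              for image in client.images.list():
                  for layer in image.attrs.get("RootFS", {}).get("Layers", []):
                      layer_counts[layer] += 1

              shared = sum(1 for n in layer_counts.values() if n > 1)
              unique = sum(1 for n in layer_counts.values() if n == 1)
              print(f"{shared} layers shared between images, {unique} used by only one image")
              ```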