It feels like the new update is slower than before. Has anyone else noticed this?

Could be temporary, or something that will be fixed in an update.

But I even get timeouts loading pages, and that never ever happened before.

  • mrmanager@lemmy.todayM · 7 months ago

    This should be much better now. It was caused by a new database parameter introduced in the new update:

    synchronous_commit=off
    

    It caused the CPU to frequently run at 100%, leading to timeouts and very slow performance.

    We removed the setting and this seems to have made the server able to breathe again.
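
    For anyone following along, this is roughly how a change like that can be checked and reverted from psql (assuming direct access to the PostgreSQL instance behind Lemmy; if the parameter was set in postgresql.conf or in the container config instead, it would be changed there):

        -- Show the current value of the parameter
        SHOW synchronous_commit;

        -- Put it back to the safer default and reload the config,
        -- no full PostgreSQL restart needed
        ALTER SYSTEM SET synchronous_commit = on;
        SELECT pg_reload_conf();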

      • mrmanager@lemmy.todayM · 7 months ago

        Thanks a lot, it made me look at this thread right away.

        I posted above, but I think there are still bugs in 0.19.1 that were supposed to be fixed in this version…

      • mrmanager@lemmy.todayM · 7 months ago

        Yeah, I like it too, it's awesome!

        I had to disable it temporarily though, since it was hammering the server and we are not quite sure why.

        Will post an update as soon as we know what's going on… :)

  • mrmanager@lemmy.todayM · 7 months ago

    I tried a quick restart of Lemmy to see if posts and comments federate again after the restart…

    It seems to have made comments federate, but it's worrying that federation can stop working in this version. I have posted on AskLemmy and will see if anyone else has similar issues.

  • MacN'Cheezus@lemmy.today · 7 months ago

    Performance does seem to have improved somewhat now, but is anyone else having problems with federation or is it just me?

    Some of my posts and comments in remote communities just don’t seem to be showing up anywhere else but here. I checked the modlog on the remote instance to make sure I’m not banned there, but I’m not. Post is still not showing up there though.

      • MacN'Cheezus@lemmy.today · 7 months ago

        Well, interestingly enough, all of my posts and comments from the last 24 hours DID just show up on those remote instances.

        Not sure if you made any changes to the configuration, but if you did, it seems to be working.

        • mrmanager@lemmy.todayM · 7 months ago

          I did a restart of the Lemmy software just now, so I guess that made federation start working again. But I'm worried now that it will stop again at any time. Will post a question in AskLemmy and see if anyone else has similar issues.

          • MacN'Cheezus@lemmy.today · 7 months ago

            Ah, so perhaps there was a backlog in the database and the server wasn’t processing it for some reason?

            But yes I agree, the 0.19 release seems to have quite a few issues. Other instances have seen similar problems ever since the upgrade.

            • mrmanager@lemmy.todayM · 7 months ago

              Yeah, I think the way outgoing federation works in 0.19 has been rewritten to use a queue. With that design no comments or posts should ever be lost, but there seems to be a bug that sometimes causes the software to stop processing the queue.
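
              Conceptually, the new design is something like the sketch below: every outgoing activity goes into a persistent log, and each remote instance keeps a cursor into that log, so a stalled sender loses nothing, it just stops advancing the cursor until the process is kicked again. (Table and column names here are illustrative only, not necessarily Lemmy's actual schema.)

                  -- Illustrative sketch only, not Lemmy's real schema.
                  -- Every outgoing activity is appended to a persistent log...
                  CREATE TABLE outgoing_activity (
                      id      bigserial PRIMARY KEY,
                      payload jsonb NOT NULL
                  );

                  -- ...and each remote instance records how far it has gotten.
                  CREATE TABLE instance_send_state (
                      instance_domain    text PRIMARY KEY,
                      last_successful_id bigint NOT NULL DEFAULT 0
                  );

                  -- When the sender stalls, the per-instance backlog is simply:
                  SELECT s.instance_domain,
                         (SELECT max(id) FROM outgoing_activity) - s.last_successful_id AS backlog
                  FROM instance_send_state s
                  ORDER BY backlog DESC;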

              • MacN'Cheezus@lemmy.today · 7 months ago

                Well, as long as a restart can get it to work again that’s at least something, but let’s hope the devs manage to find a more permanent fix.

                • tal@lemmy.today · 7 months ago

                  Back in the era of FidoNet and federated BBSes in the 1980s, it used to be that a number of BBS forums would only forward messages at night to save on telephone costs. I suppose a nightly restart to push out messages would be a bit of a blast from the past.

    • tal@lemmy.today · 7 months ago

      I don't think that any of my comments on remote communities from the past 22 or so hours prior to this comment (which I assume is roughly when the update happened) are visible on other instances.

      Ditto for posts.

      I can see material via lemmy.today that remote users are putting on remote communities, though.

      • MacN'Cheezus@lemmy.today · 7 months ago

        Yeah, same here. I just looked at my profile on the other instance, and nothing I posted there in the last 24 hours is showing up.

        • tal@lemmy.today · 7 months ago

          I can narrow down the time a bit, since I was listening to some audio and transcribing/summarizing it in a comment series, and things broke right in the middle.

          My last comment to make it out to another instance looks to be this one, starting with “Bergman: Michael, I want to turn to you”:

          https://lemmy.today/comment/4228981

          Visible on the remote instance as this:

          https://kbin.social/m/Ukraine_UA/t/716867/Aid-to-Ukraine-and-the-Future-of-the-War-with#entry-comment-4244549

          The first to fail to propagate – to that same instance – was the response to that, starting with “Bergman: Michael, I think there’s another”

          It's also not just the kbin.social instance; other instances like sh.itjust.works are affected as well, which rules out it being a kbin.social problem.

          EDIT: Both the sh.itjust.works instance federation list and the lemmy.today instance federation list show the other instance as being federated:

          https://lemmy.today/instances

          https://sh.itjust.works/instances

          EDIT2: I don’t think that there’s much else beyond what I’ve written above that I can do from a troubleshooting standpoint as a non-admin user – but if there is, feel free to let me know, @mrmanager.

          • mrmanager@lemmy.todayM · 7 months ago

            I can see the comment on kbin now, after the restart of the Lemmy software.

            It really seems like outgoing federation is buggy in this version. :/

            • tal@lemmy.today · 7 months ago

              Thanks!

              A warning, though… after your restart, I commented in that other lemmy.ml thread about the 0.19 federation problems and linked to this thread (given that you identified an important data point), and that comment doesn't appear to have propagated. That would be a comment added to the queue with no restart after it was made. Now, I only made the comment 5 minutes ago, so maybe I'm just being excessively impatient, but…

              The local view of the thread:

              https://lemmy.today/comment/4245923

              The remote view of the thread:

              https://lemmy.ml/post/9624005?scrollToComments=true

              My comment text:

              We were just discussing some potentially-0.19.1-related federation problem that lemmy.today users were experiencing after the update; that’s how I ran across this thread.

              https://lemmy.today/post/4382768

              The admin there, @mrmanager@lemmy.today, restarted the instance again some hours later to attempt to resolve the problem, and it looked like federation started working at that point.

              That might be worth consideration if any other instances are seeing problems with posts/comments/votes not propagating.

              • mrmanager@lemmy.todayM · 7 months ago

                Indeed, it hasn't federated. So the restart of Lemmy polls the queue, and then it stops working again. :/

              • mrmanager@lemmy.todayM · 7 months ago

                I did another restart and your comment shows up in the thread.

                So it seems to be some bug that makes it stop federating after it has polled the queue once.

                • tal@lemmy.today · 7 months ago

                  Hmmm.

                  A couple thoughts:

                  • As I commented above, this doesn’t appear to be impacting every 0.19.1 instance, so there may be something on lemmy.today that is tickling it (or it and some other instances).

                  • If you decide that you want to move back to 0.18.x, I have no idea whether Lemmy's PostgreSQL databases support rolling back while continuing to use the current data, or whether there were any schema changes in the move to 0.19.x or whatever (a quick way to check which migrations have run is sketched after this list).

                  • Something that also just occurred to me: I don't know what kind of backup system, if any, you have rigged up, but normally a backup system backing up a server that runs a database needs to be aware of the database, so that it can get an atomic snapshot. If you have something that just backs up files nightly, it may not have valid, atomic snapshots of the PostgreSQL databases. If you do attempt a rollback, you might want to bring all of the services down, and only while they are down back up the PostgreSQL database. That way, if the rollback fails, it's at least possible to get back to a valid copy of the current 0.19.1 state as it is at this moment.

                    If all that’s old hat and you’ve spent a bunch of time thinking about it, apologies. Just didn’t want a failed rollback to wind up in a huge mess, wiping out lemmy.today’s data.
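
                  For what it's worth, one quick way to see whether the 0.19.x upgrade ran schema migrations (which would make a rollback non-trivial) is to look at the migrations table. This assumes Lemmy's standard Diesel setup, which records applied migrations in a __diesel_schema_migrations table; if it's named differently there, adjust accordingly:

                      -- List the most recently applied schema migrations;
                      -- anything with a recent run_on timestamp came in with the upgrade
                      SELECT version, run_on
                      FROM __diesel_schema_migrations
                      ORDER BY run_on DESC
                      LIMIT 20;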

          • MacN'Cheezus@lemmy.today · 7 months ago

            Yeah, for me it is Lemmy.world and Lemmy.ml. The latter is also running on 0.19.1, and the former is still on 0.18.5, so the remote server software or version does not seem to be a factor here.

            • tal@lemmy.today · 7 months ago

              Oh, that’s a good point. We can see server versions elsewhere.

              Here’s a comment from @nutonic@lemmy.ml, one of the lemmy devs, to a remote community on lemmy.world:

              https://lemmy.ml/comment/6801426

              https://lemmy.world/comment/6173591

              That comment appears to have propagated.

              While I cannot say for certain that at that point lemmy.ml had already updated to 0.19.1, it was after lemmy.today had, so it seems plausible.

              So whatever the problem is, I would lean towards guessing that it does not affect all 0.19.1 instances.

              EDIT: Other users are talking about potential federation problems in this thread, where votes appear not to be making it out:

              https://lemmy.ml/post/9624005?scrollToComments=true

              But as someone there points out, the first two users there posted comments from 0.19.1 instances (lemm.ee and sopuli.xyz), and their comments did make it out.

              • MacN'Cheezus@lemmy.today · 7 months ago

                Well, as I just said here, all my comments and posts from the last 24 hours JUST started showing up on those other instances, so perhaps the problem is fixed now.

                Maybe there was some sort of database backlog or something; let's see what /u/mrmanager says.

                • mrmanager@lemmy.todayM · 7 months ago

                  I think it was the Lemmy software stopping federation for some reason, and after I did a restart of Lemmy, it federated everything in the queue right away. But it's worrying that this can happen, and I assume there is still a bug hiding somewhere in the software.

  • 1984@lemmy.todayOP · 7 months ago

    Yes, it's much, much better now. Actually a bit faster than before, even… :)