Slashzero
- 3 Posts
- 19 Comments
Slashzero@lemmy.world (OP) to Selfhosted@lemmy.world • Lemmy instance broken after upgrading to 0.18.0 • 1 point • 2 years ago
And yes… I see sync issues as well. I’ve responded to some comments multiple times because my original response wasn’t visible until later.
Slashzero@lemmy.world (OP) to Selfhosted@lemmy.world • Lemmy instance broken after upgrading to 0.18.0 • 1 point • 2 years ago
I also added the proxy network to both lemmy-ui and pictrs, but that did not resolve the issue.
I had to manually remove the site’s icon from the database before the site would start working again.
I’m going to try removing the proxy network and instead use pictrs:8080 on the internal network, as that feels more secure.
Slashzero@lemmy.world (OP) to Selfhosted@lemmy.world • Lemmy instance broken after upgrading to 0.18.0 • 3 points • 2 years ago
Note: this seems to be database-related; something got royally messed up after the upgrade.
After trying all sorts of network hacks and updates, I eventually decided to back up my Postgres container and nuke it.
With a fresh Postgres DB running alongside 0.18.0, my self-hosted site is back online. Of course, my local post history and all my subs are gone… but at least my site is operational again.
I’d advise anyone self-hosting not to upgrade to 0.18.0 yet.
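For anyone in the same spot, the back-up-and-recreate step above can be sketched roughly like this. Container, user, and volume names here are assumptions; adjust them to match your own docker-compose setup:

```shell
# Dump the whole cluster before nuking anything
# (container name "postgres" and user "lemmy" are assumptions).
docker exec -t postgres pg_dumpall -U lemmy > lemmy_backup.sql

# Stop the stack and remove the old data volume
# (volume name is an assumption; check `docker volume ls`).
docker compose down
docker volume rm lemmy_postgres_data

# Bring everything back up; Lemmy 0.18.0 will initialize a fresh database.
docker compose up -d
```

Keep the dump around even if you don’t plan to restore it; it at least preserves the option of recovering subscriptions later.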
Slashzero@lemmy.world to Fediverse@lemmy.world • Lemmy's total users exceeds 740k today, up from 540k yesterday • English • 1 point • 2 years ago
That sounds correct to me.
Slashzero@lemmy.world to Fediverse@lemmy.world • Lemmy's total users exceeds 740k today, up from 540k yesterday • English • 3 points • 2 years ago
I wonder if these are real users or if someone wrote a script to register users via the Lemmy API… 🤔
Slashzero@lemmy.world to Fediverse@lemmy.world • Lemmy's total users exceeds 740k today, up from 540k yesterday • English • 4 points • 2 years ago
I self host! Very nice having an instance all to myself.
Slashzero@lemmy.world to Lemmy.World Announcements@lemmy.world • Workaround for the performance issue with posting in large communities • English • 4 points • 2 years ago
Yes. Absolutely does happen on other instances that have thousands of users.
Slashzero@lemmy.world to Lemmy.World Announcements@lemmy.world • Workaround for the performance issue with posting in large communities • English • 4 points • 2 years ago
That actually sounds like something I would have enjoyed. I joined Reddit around the time it started taking over, I think.
Slashzero@lemmy.world to Lemmy.World Announcements@lemmy.world • Workaround for the performance issue with posting in large communities • English • 3 points • 2 years ago
That’s pretty neat! I’ve honestly never seen it mentioned on Reddit before, so I got a bit excited to see someone mention it here, admittedly maybe too excited.
Slashzero@lemmy.world to Lemmy.World Announcements@lemmy.world • Workaround for the performance issue with posting in large communities • English • 2 points • 2 years ago
I really hope someone is doing some level of performance testing on those changes to make sure they actually fix the performance issues.
Slashzero@lemmy.world to Lemmy.World Announcements@lemmy.world • Workaround for the performance issue with posting in large communities • English • 12 points • 2 years ago
Have you tried enabling the slow query logs @ruud@lemmy.world? I went through that exercise yesterday to try to find the root cause, but my instance doesn’t have enough load to reproduce the conditions, and my day job prevents me from devoting much time to writing a load test to simulate the load.
I did see several queries taking longer than 500ms (up to 2000ms) but they did not appear related to saving posts or comments.
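For reference, a minimal way to turn on the slow query log in Postgres, assuming superuser access via psql (the 500 ms threshold just mirrors the number above; tune to taste):

```shell
# Log every statement slower than 500 ms.
psql -U lemmy -c "ALTER SYSTEM SET log_min_duration_statement = '500ms';"

# Apply the setting without restarting Postgres.
psql -U lemmy -c "SELECT pg_reload_conf();"

# Slow statements then show up in the Postgres log, e.g.:
#   LOG:  duration: 2031.412 ms  statement: SELECT ...
```

`ALTER SYSTEM` writes to `postgresql.auto.conf`, so the setting survives restarts; drop it again with `ALTER SYSTEM RESET log_min_duration_statement;` once you’re done investigating.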
Slashzero@lemmy.world to Lemmy.World Announcements@lemmy.world • Workaround for the performance issue with posting in large communities • English • 112 points • 2 years ago
Oh, Big-O notation? I never thought I’d see someone else mention big O notation out in the wild!
:high-five:
Slashzero@lemmy.world to Lemmy.World Announcements@lemmy.world • Lemmy.world starting guide • English • 2 points • 2 years ago
I did see it, thanks. I’m hoping to find some time to contribute this week.
Slashzero@lemmy.world to Critical Role@lemmy.world • Welcome to the Critical Role community • English • 7 points • 2 years ago
Welcome here, and thanks for creating the CR community.
Is it Thursday yet?
Slashzero@lemmy.world to Lemmy.World Announcements@lemmy.world • Lemmy.world starting guide • English • 0 points • 2 years ago
I suppose I could, but I’ve honestly spent the majority of today on Lemmy answering “support” questions for people lol… Maybe I can try to take a look tomorrow. 🤷
Actually, saving edits on lemmy.ml is also slow, about 4-5 seconds. It’s probably a combination of user load and non-optimized queries.
Slashzero@lemmy.world to Lemmy.World Announcements@lemmy.world • Lemmy.world starting guide • English • 0 points • 2 years ago
It would be funny if it were missing an index and doing a full table scan for some odd reason…
I focus on application performance in my day-to-day work, and missing indexes, greedy unoptimized queries, etc., are the root of a lot of issues. Hopefully you can get to the bottom of it.
Quick note: I’m not seeing a big delay (10+ seconds) when posting or saving on lemmy.ml, or my own instance.
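If anyone wants to check for the full-table-scan scenario themselves, Postgres keeps per-table scan counters. The table and query in the second command are hypothetical examples, not known Lemmy hot spots:

```shell
# Tables with many sequential scans and few index scans are
# candidates for a missing index.
psql -U lemmy -c "SELECT relname, seq_scan, idx_scan
                  FROM pg_stat_user_tables
                  ORDER BY seq_scan DESC
                  LIMIT 10;"

# EXPLAIN on a suspect query shows whether the planner actually
# chooses a Seq Scan (example query only; adapt to a real one).
psql -U lemmy -c "EXPLAIN SELECT * FROM comment WHERE post_id = 42;"
```

If a frequently filtered column shows up under a Seq Scan, a `CREATE INDEX` on that column is the usual fix.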
Slashzero@lemmy.world to Lemmy.World Announcements@lemmy.world • Installed Lemmy 0.17.4-rc1 • English • 0 points • 2 years ago
The footer still shows the old versions, by the way. It does feel snappier than earlier, for sure.
I’m setting up my own instance to mess around with as well. I’ve got it running via Docker. The SSL setup was a pain and not well documented. Also, pictrs is giving me issues, complaining about DB access.
The redirect is tricky and works in some browsers, but not others.
Edit: save took about 10 seconds, which is better than 20-30 seconds!
Edit 2: the footer shows the updated versions, now.
I checked this, and my lemmy.hjson file already has the host for pictrs set to http://pictrs:80.
The only thing that has worked so far is manually clearing my site’s image icon directly in the database.
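A minimal sketch of that manual fix, assuming the stock Lemmy schema of that era (icon stored on the `site` table) and psql access inside the Postgres container; verify the table and column names against your own schema before running anything:

```shell
# Clear the site icon directly in the database
# (container name "postgres", user "lemmy", and the site.icon
# column are assumptions based on a typical Lemmy Docker setup).
docker exec -it postgres psql -U lemmy -c "UPDATE site SET icon = NULL;"
```

Restart the lemmy container afterwards so it picks up the cleared value rather than a cached one.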