Hi folks! I just came up with something; no idea if it's good, but you be the judge:

TL;DR: Could we make a browser extension or something similar that gives us a button to copy an entire comment chain of crucial, niche advice into a Lemmy post, or is that no longer possible since the API went away?

I often google things for work and hobbies: code snippets, log entries, ways to make my insane Docker setups work. For example, that's how I got a Lemmy instance working in Docker.

Very often, I end up on Reddit. If the post or comment in question is helpful, I'd like to upvote it or ask a follow-up question. For that, I still need my Reddit account.

But the "praise the helpful comment" part I could also do here. Some people link to the comment/post, which also drives traffic back to Reddit (which I don't like).

So I'd probably just copy the post (and/or the comment chain in question) into a new post on Lemmy.

It would go a little like this:

  • User1: “Post describing a topic”
  • User2: Helpful comment nr 1
  • User1: Follow up question
  • User2: Helpful comment nr 2

For that, I'd need some kind of automation. Since the API is gone, I don't know if that is possible, but an option to "copy the entire comment chain up to this comment" would definitely be awesome.
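
As a rough sketch of what such a button would have to do, assuming Reddit's public ".json" endpoints still answer without the paid API (the user agent, error handling, and traversal below are simplified placeholders, not a finished tool):

```python
# Minimal sketch of a "copy the comment chain up to this comment" helper.
# Assumes Reddit's public ".json" endpoints still answer without the paid API;
# the User-Agent string is a placeholder and the traversal is simplified
# (it only follows the first child at each level of the context-expanded reply).
import requests

def fetch_chain(comment_permalink: str) -> list[dict]:
    """Return the post plus the ancestor chain down to the linked comment.

    comment_permalink is the path part, e.g. "/r/sub/comments/abc123/title/def456".
    """
    permalink = comment_permalink.rstrip("/")
    target_id = permalink.split("/")[-1]          # comment id at the end of the permalink
    resp = requests.get(
        "https://www.reddit.com" + permalink + ".json",
        params={"context": 8},                    # ask for up to 8 parent levels
        headers={"User-Agent": "chain-copier/0.1 (personal use)"},
        timeout=10,
    )
    resp.raise_for_status()
    post_listing, comment_listing = resp.json()

    post = post_listing["data"]["children"][0]["data"]
    chain = [{"author": post["author"], "body": post.get("selftext") or post["title"]}]

    node = comment_listing["data"]["children"][0]["data"]
    while True:
        chain.append({"author": node["author"], "body": node["body"]})
        replies = node.get("replies")             # empty string when there are none
        if node["id"] == target_id or not replies or not replies["data"]["children"]:
            break
        node = replies["data"]["children"][0]["data"]
    return chain
```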

Feel free to tell me otherwise.

Edit: In case this isn't obvious: it would accumulate the most helpful stuff from Reddit over here without it being blindly crossposted by bots, it would push Lemmy up in Google results (because it's niche!), and it's most likely not a copyright issue, because it's so little material that it should fly under the radar.

  • RBG · 1 year ago

    I sympathise with your disclaimer down there, but you and I both know that if someone automates this for users, someone else will add it to a bot and overdo it. It's the way of the world.

    • HauiOP · 1 year ago

      Thank you for your kind words! I'm always the idealist. :)

  • Lvxferre@lemmy.ml · 1 year ago

    I'm no coder, but I think this could work. I picture something like this:

    1. Create a Lemmy comm. Let’s call it c/redditimports.
    2. The only user allowed to post in c/redditimports is a bot; let's call it u/Ribbit (Reddit Import Bot, Busy Importing Threads). However, everyone can comment in it.
    3. There’s always one pinned thread in c/redditimports. In that thread, humans can comment which Reddit threads+comment chains they want to import from Reddit to Lemmy. They do it by pinging u/Ribbit, then providing links to 2+ comments in the same thread.
    4. When u/Ribbit is pinged, it checks whether the ping includes Reddit comment links. If they're valid, u/Ribbit creates a new post in c/redditimports and fills it with content from the Reddit post, the directly linked Reddit comments, and their parents, grandparents, etc., recursively.

    Now here's the hard part: someone coding the bot. Preferably in a way that Reddit can't tell apart from an actual human being, as Reddit Inc. and Greedy Pigboy will certainly not like it.
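
    Not a full bot, but a hedged sketch of the request-parsing half of step 4: pull the Reddit comment links out of a ping and check that they all point at the same thread. The function name and link pattern are illustrative only.

```python
# Hedged sketch of step 4's request parsing: pull Reddit comment links out of
# the body of a comment that pinged u/Ribbit and check that they all point at
# the same thread. The link pattern and names are illustrative only.
import re

COMMENT_LINK = re.compile(
    r"https?://(?:www\.|old\.)?reddit\.com"
    r"/r/(?P<sub>\w+)/comments/(?P<thread>\w+)/[^/\s]*/(?P<comment>\w+)"
)

def extract_request(comment_body: str):
    """Return (thread_id, [comment_ids]) or None if the request is invalid."""
    matches = list(COMMENT_LINK.finditer(comment_body))
    if len(matches) < 2:                      # step 3 asks for links to 2+ comments
        return None
    thread_ids = {m.group("thread") for m in matches}
    if len(thread_ids) != 1:                  # every link must belong to the same thread
        return None
    return thread_ids.pop(), [m.group("comment") for m in matches]
```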


    Also, don't worry about lowering the dependency on Reddit. Bots there will do it for us; those left over from the migration are already complaining about it.

    • HauiOP · 1 year ago

      Thank you for laying this out. Maybe someone can make something out of it. :)

    • GregorGizeh@lemmy.zip · 1 year ago

      We already have something extremely similar: a bot instance that reposts content from subs that users request. In reality it's incredibly useless and it really spams the shit out of my feed. It was the first user I blocked.

      There is not much use in third-hand content that suggests a discussion while it isn't being discussed here. Now don't get me wrong, I'd love for Reddit to up and die, but realistically it isn't happening yet, and it won't happen by artificially creating zombie repost content on Lemmy.

      I'd rather we had more content creators to provide valuable OC.

      • Lvxferre@lemmy.ml · 1 year ago

        A key difference is that the bot you're talking about mirrors whole subreddits, regardless of whether people interact with it, while my suggestion is a bot that copies specific threads upon user request. As such, the latter would generate considerably less noise, little enough to keep it contained to a single comm.

        "There is not much use in third-hand content that suggests a discussion while it isn't being discussed here."

        OP proposed a use - building a knowledge base here that could attract other users.

        "I'd rather we had more content creators to provide valuable OC."

        I agree with you, but I don't think we need to choose between one and the other.

  • ninjan@lemmy.mildgrim.com · 1 year ago

    This could of course be done with a browser extension like "webscraper"; with some tweaking you could likely get the data you want without much issue. That could then be packaged into a new extension made specifically for scraping a Reddit post you find and pushing it to Lemmy. The main downside I see is that all the comments would come from the same user, so you'd likely want to set up a dedicated account for those posts, like "reddit-scraper" or some such.
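
    The "push it to Lemmy" half is fairly small. Below is a hedged sketch assuming Lemmy's v3 HTTP API; field names and the auth mechanism have shifted between Lemmy versions, so treat it as a shape rather than a spec, and the instance URL, account, and community are placeholders.

```python
# Hedged sketch of pushing a scraped thread to a Lemmy instance, assuming the
# v3 HTTP API (0.19-style bearer auth; older versions expected an "auth" field
# in the request body instead). Instance, account, and community are placeholders.
import requests

INSTANCE = "https://lemmy.example"   # placeholder instance URL

def post_to_lemmy(title: str, body_markdown: str, community: str,
                  user: str, password: str) -> dict:
    # Log in with the dedicated scraper account and grab a JWT.
    jwt = requests.post(
        f"{INSTANCE}/api/v3/user/login",
        json={"username_or_email": user, "password": password},
        timeout=10,
    ).json()["jwt"]
    headers = {"Authorization": f"Bearer {jwt}"}

    # Resolve the community's numeric id from its name.
    community_id = requests.get(
        f"{INSTANCE}/api/v3/community",
        params={"name": community},
        headers=headers,
        timeout=10,
    ).json()["community_view"]["community"]["id"]

    # Create one post that holds the copied chain as Markdown.
    return requests.post(
        f"{INSTANCE}/api/v3/post",
        json={"name": title, "body": body_markdown, "community_id": community_id},
        headers=headers,
        timeout=10,
    ).json()
```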

    If I didn't have two toddlers and a 7-year-old, I'd love to take a stab at it. But what little free time I have isn't enough to make any proper progress on something like this.

    • HauiOP · 1 year ago

      I agree completely. I would argue that we don't need the comments at all; a post laying out the initial comment chain would be enough. Like I said, "User1" and so on, but with the real usernames.

      The reason I'd like to do it this way (no separate comments, but preserving the original posters' names) is twofold: it isn't pretentious (the way reposting everything under one account would be), and it honors the original posters.
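
      To illustrate that format, here's a small sketch that turns a chain like the one from the earlier fetch_chain() sketch (a list of {"author", "body"} entries, a structure assumed here) into a single Markdown post body, with each reply nested one quote level deeper; the layout itself is just one option.

```python
# Small sketch of the single-post format described above: the whole chain in one
# post body, with the original usernames preserved. Assumes a chain shaped like
# the earlier fetch_chain() sketch, i.e. a list of {"author", "body"} dicts.
def chain_to_markdown(chain: list[dict], source_url: str) -> str:
    parts = [f"Imported from [this Reddit thread]({source_url}):", ""]
    for depth, entry in enumerate(chain):
        quote = "> " * depth                  # nest each reply one quote level deeper
        body = "\n".join(quote + line for line in entry["body"].splitlines())
        parts.append(f"{quote}**u/{entry['author']}** wrote:")
        parts.append(body)
        parts.append("")
    return "\n".join(parts)
```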

  • density@kbin.social · 1 year ago

    I haven't checked on its status, but you should check out the Archive Team's Reddit project.

    I made a post a few months ago with a description. Suffice to say, Archive Team is your best hope for mass scraping.

    For personal use, you can use a web clipper extension (if you want to convert to Markdown) or one that archives complete pages you visit as local files. If you're also willing to install software on your computer, there are tools that will archive your complete history and bookmarks, so you can go back and grab whatever URLs you have previously visited (assuming you saved them).

    As to cloning the discussion to the Fediverse, there were a few projects that tried to achieve this, but as far as I'm aware none really got off the ground. Have a search on github.com, gitlab.com, and codeberg.org; most of them were based on one of those sites.