Exciting news for whom? Only the site owner is excited that a free resource now requires a subscription.

“Yay! Now I have to pay another subscription! I’m so excited! Let’s celebrate with them!” - nobody

  • Honytawk@lemmy.zip · 7 months ago

    Subtitles are like 5 KB text files; why even limit their downloads in any way?

    • jayandp@sh.itjust.works · 7 months ago

      The overhead isn’t the storage but the request. Processing a request takes CPU time, which can get expensive when people set up a media server and request subtitles for dozens of movies and shows. Every episode of a TV show is a separate request, and that adds up fast when you scale it to thousands of users.
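
      As a rough illustration of how fast that scales, here’s a back-of-the-envelope sketch in Python (every number is invented, not from the thread):

      ```python
      # Invented numbers, purely to show how per-episode requests multiply.
      users = 1_000             # media-server owners hitting the API
      shows_per_user = 30       # shows in a typical library
      episodes_per_show = 40    # one subtitle request per episode

      requests = users * shows_per_user * episodes_per_show
      print(f"{requests:,} requests")  # 1,200,000 from a single library scan each
      ```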

    • thisNotMyName@lemmy.world · 7 months ago

      A typical (full-subtitle) .srt file for a movie is more like 100-200 KB - still not much, but 5 is a little off.

      • dan@upvote.au · 7 months ago (edited)

        If it’s all text, it’d compress quite well, especially since there are likely lots of repeated words. Not to 5 KB of course, but I wouldn’t be surprised if it hit at least a 3x compression ratio with zstd.
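
        For anyone who wants to sanity-check that, a minimal sketch assuming the third-party `zstandard` package and a placeholder `movie.srt` path:

        ```python
        # Compress a local .srt with zstd and print the ratio.
        # "movie.srt" is a placeholder; requires `pip install zstandard`.
        import zstandard

        with open("movie.srt", "rb") as f:
            raw = f.read()

        compressed = zstandard.ZstdCompressor(level=3).compress(raw)
        print(f"{len(raw):,} -> {len(compressed):,} bytes "
              f"({len(raw) / len(compressed):.1f}x)")
        ```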

    • ricecake@sh.itjust.works · 7 months ago

      If they’re storing them in something like Amazon S3, there is a cost (extremely low, but not free) associated with retrieving data, regardless of size.

      Even if they were an entirely free service, it’d make sense to put hard rate limits on unauthenticated users and more generous rate limits on authenticated ones.

      Leaving out rate limits is a good way to discover that you have users who will use your API real dumb.

      Their pricing model seems fucked, but that’s aside from the rate limits.
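
      A token bucket is the usual shape for those tiered limits. A minimal sketch (the rates are invented, not the site’s actual tiers):

      ```python
      # Token-bucket rate limiter: small budget for anonymous callers,
      # larger one for authenticated users. Limits are made-up examples.
      import time

      class TokenBucket:
          def __init__(self, rate_per_sec: float, burst: int):
              self.rate = rate_per_sec       # refill rate, tokens/second
              self.capacity = burst          # maximum burst size
              self.tokens = float(burst)
              self.last = time.monotonic()

          def allow(self) -> bool:
              now = time.monotonic()
              self.tokens = min(self.capacity,
                                self.tokens + (now - self.last) * self.rate)
              self.last = now
              if self.tokens >= 1:
                  self.tokens -= 1
                  return True
              return False

      anonymous = TokenBucket(rate_per_sec=0.1, burst=5)       # ~6 req/min
      authenticated = TokenBucket(rate_per_sec=2.0, burst=50)  # ~120 req/min
      ```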

      • Scrubbles@poptalk.scrubbles.tech · 7 months ago

        Agreed, they could have done this much more gracefully. Same as the Reddit API: average user? Who cares. Sending millions of requests? Okay, we’re going to clamp down pretty hard on you.

        • ricecake@sh.itjust.works · 7 months ago

          Oh, I’m pretty sure it’s close to trivial. $0.0004 per thousand requests is $400 per billion, or $0.40 per million.
          That’s as close to insignificant as you can get while still paying attention to it. Caching solutions are probably going to end up costing you more in the long run. An HA setup that can handle a billion requests a year is going to cost you at least $100 a month, and still provide less availability than S3.

          You don’t want unmetered access, but their pricing is unlikely to be based on access rates, and more likely on salary costs and other infrastructure costs, like indexing and search.
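
          Taking the quoted rate at face value (it’s from this comment, not a current price sheet), the arithmetic checks out:

          ```python
          # $0.0004 per 1,000 GET requests, as quoted above.
          per_request = 0.0004 / 1_000
          print(f"per million: ${per_request * 1e6:.2f}")  # $0.40
          print(f"per billion: ${per_request * 1e9:.2f}")  # $400.00
          ```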

    • azertyfun@sh.itjust.works · 7 months ago

      Subtitles are like an hour’s worth of content; why even download more than 10 a day?

      They could make it 20 and it wouldn’t change much, I guess; 10 does seem a bit low. But if they make it 1000/day (which you could argue is “no heavier than one JPEG”), they’ll have Kodi addons or whatever attempting to auto-download an entire library’s worth of subtitles. It’s not about the throughput, it’s about the processing time of establishing connections, negotiating ciphers, processing a request, hitting a search indexer, etc. All those small costs add up if thousands of users are downloading hundreds of files every day without giving anything back.

      • SendMePhotos@lemmy.world · 7 months ago

        Just start downloading them and using them to create a new platform. Bam!

        (I am saying this with 100% ignorance)

    • Appoxo@lemmy.dbzer0.com · 7 months ago

      Electricity ain’t exactly free, even if the data they store is minuscule. Servers will pull >300 W whether you store 10 GB or 2000 GB.

          • LufyCZ@lemmy.world · 7 months ago

            You know what?

            If you gave me a data dump and a Docker image, I’d host it, for free.

            Insane, I know.

        • Appoxo@lemmy.dbzer0.com · 7 months ago (edited)

          If a server costs X, the number of free users is Y, and VIP income is Z, then you need to reach an equilibrium where you make enough money to sustain the infrastructure and have a reserve in case it goes belly up.
          Aka: if 10k users are free and the income from VIP or ads is Z, then you have to limit the capabilities of the free users to sustain the platform, which in turn can stay (to some amount) free because the VIPs pay for it.
          Means: limit API calls.
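
          Put as a toy break-even model (every figure below is invented for illustration):

          ```python
          # Toy break-even model for the free/VIP split described above.
          monthly_cost = 500.0   # X: infrastructure + salaries, $/month
          free_users = 10_000    # Y: free-tier users
          vip_price = 5.0        # $ per VIP per month

          vips_needed = monthly_cost / vip_price
          print(f"{vips_needed:.0f} VIPs cover the bill "
                f"({vips_needed / free_users:.1%} of free users converting)")
          ```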