• Sparkega@lemmy.world · 9 months ago

            Eliminates a malicious threat vector. Gives you peace of mind to charge your devices without worrying that whatever you connect to is going to interact with your device.

          • eyeon@lemmy.world · 9 months ago

            It’s also safer when charging from untrusted chargers, though you can still get an adapter that blocks the data pins, or just bring your own wall charger/battery when traveling.

        • flashgnash@lemm.ee · 9 months ago

          I think power-only cables should exist because they are significantly cheaper, but the standard should enforce some kind of marking to differentiate them.

      • Zamundaaa · 9 months ago

        They are, afaik, not allowed to exist with USB-C. And except for a very few very sketchy manufacturers, it’s luckily also not a thing in practice.

  • JCreazy@midwest.social · 9 months ago

    It’s not the cables that are the issue; it’s the manufacturers that don’t design their products to the USB-C specification, so they don’t charge via a C-to-C cable like they should.

  • Wilzax@lemmy.world · 9 months ago

    I just buy Anker for everything and as a result most of my cables do what I expect them to.

    • the_weez@midwest.social · 9 months ago

      It does look kinda sketchy for a full 3.0+ data cable. Could be one of those stupid charging only cables.

  • Zink@pawb.social · 9 months ago

    Either it’s USB to wall or USB-C to USB-C with all the features. I don’t own anything else, out of pure fear.

    • Jordan_U@lemmy.ml · 9 months ago

      Do you throw away all your cables when new features are added?

      Or only when you start to own a device that uses one of those new features?

      • randombullet@programming.dev · 9 months ago

        The latter.

        I have 2 types of cables: USB 2.0 100W PD-capable cables and USB 4 cables.

        The USB 4 cables are too heavy to travel with, so I have many more USB 2.0 PD cables.

        I also have a sprinkling of USB 3.0-compliant A-to-C cables.

        I throw away all the included USB 2.0 A-to-B cables to avoid mixups.

      • Zink@pawb.social · 9 months ago

        The latter. New features don’t matter till I get a new device with them.

  • vsis@feddit.cl · 9 months ago

    I’m too lazy to label them. So, I usually keep the PDs connected to the charger and that’s it.

    But if it becomes a problem I’ll probably use a wire labeler.

    It would be nice if they came labeled from the factory, though.

    • InputZero@lemmy.ml · 9 months ago

      Coloured electrical tape is way cheaper than wire labels if you only have a few to mark.

  • ilinamorato@lemmy.world · 9 months ago

    I honestly have never had this issue with USB-C. A and Micro, absolutely. Mini, every day from like 2004-2007. But never C.

    Have I just been buying all the right cables without knowing it?

    • BorgDrone@lemmy.one · 9 months ago

      Doesn’t matter much. Is it a 7.5W cable, or maybe a 15W, 45W, 60W, 100W, 140W or 240W cable? Does it support USB 2.0 (480Mbps), 3.1 Gen 1 (5Gbps), or 3.1 Gen 2 (10Gbps)? Does it support Thunderbolt, and if so, the 20Gbit or 40Gbit version? Does it support DisplayPort alt mode?

      You can’t tell any of this from looking at the cable. It’s a terrible mess.
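
      You can't read any of this off the cable, but the negotiated result is at least visible from software. A minimal sketch, assuming a Linux machine with usbutils installed (not something from the thread): `lsusb -t` prints the link speed each attached device actually negotiated, so a 10Gbps-capable device stuck at 480M through a particular cable points at a USB 2.0-only wire.

```shell
# Print the USB topology with the negotiated link speed per device.
# 480M = USB 2.0, 5000M = 5Gbps, 10000M = 10Gbps.
# Swapping cables and re-running shows what each cable actually carries.
lsusb -t 2>/dev/null || echo "lsusb not available"
```

      This only tells you about the data lines; power capability still has to be tested separately against a PD charger.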

      • PassingThrough@lemmy.world · 9 months ago

        The worst part is, I could accept that as a generational flaw. The newer ones get better, the old ones lying around do less. OK, that’s the beast of progress.

        But no. They still make cables today that do power only. They still make cables that do everything except video. Why? Save a couple of cents. Make dollars off multiple product lines, etc. Money.

        What could have been the cable to end all cables… just continued to make USB a laughingstock of confusion.

        Don’t even get me started on the device-side implementations…

        • BorgDrone@lemmy.one · 9 months ago

          That’s because it doesn’t make sense to use a €40 cable when a €1 cable would do.

          • shinratdr@lemmy.ca · 9 months ago

            Agreed, but not requiring labeling or some sort of method to identify was a real fuckup on their part.

            My problem isn’t the existence of different tiers of cable, it’s that there is literally no way to know if the cable you’re using supports something until you try it.

            • BorgDrone@lemmy.one · 9 months ago

              Agreed, but not requiring labeling or some sort of method to identify was a real fuckup on their part.

              Yeah, we used to have that. It was great. They even made it so you couldn’t fit the wrong cable in a port. They did that by having different connectors for different cables.

              • shinratdr@lemmy.ca · 9 months ago

                Yeah, but with modern thin-and-light mobile devices, that’s a bad solution. Then you need multiple holes to serve multiple purposes, which impacts waterproofing and requires extra space & hardware.

                One port to rule them all makes sense. But it should have had a way to identify a cable’s capabilities at a glance. I still prefer having one cable that can charge all my devices, even if the trade-off is some confusing situations when it comes to cable capability.

            • Imgonnatrythis@sh.itjust.works · 9 months ago

              Is this true? Is there no app to test these by plugging them into your phone? No chip in them that encodes a spec sheet?
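
              There actually is such a chip for higher-spec cables: anything rated for 5A (and full-featured USB 3.x cables) must carry an "e-marker" that answers a USB Power Delivery Discover Identity request. Phones don’t generally surface it, but recent Linux kernels expose whatever the port controller read under the typec sysfs class. A minimal sketch, assuming a kernel and hardware with typec class support (exact paths vary):

```shell
# Dump e-marker identity data for any attached USB-C cables.
# /sys/class/typec only exists with typec class support; the
# identity attributes stay empty for unmarked (cheap) cables.
if [ -d /sys/class/typec ]; then
  for cable in /sys/class/typec/port*-cable; do
    [ -e "$cable" ] || continue
    echo "== $cable =="
    cat "$cable"/identity/* 2>/dev/null
  done
else
  echo "no typec class on this kernel"
fi
```

              So the information often exists electrically; the gap is that nothing requires it to be printed on the cable where a human can see it.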

          • PassingThrough@lemmy.world · 9 months ago

            I feel the only place for a €1 cable is the USB-A to C cables you get with things for 5V charging. That’s it. And it’s very obvious what the limits on those are from the A plug end.

            Anything that wants to be USB-C on both ends should be fully compatible with a marked spec, not indistinguishable from a 5V junk wire or freely cherry-picking what they feel like paying for.

            Simply marking on the cable itself what generation it’s good for would be a nice start, but the real issue is the cherry-picking. The generation numbers don’t cover a wire that does maximum everything except video. Or a proprietary setup that does power transfer in excess of spec (Dell, Raspberry Pi 5). But they all have the same ends and the same lack of demarcation, leading to the confusion.

            • BorgDrone@lemmy.one · 9 months ago

              I feel the only place for a €1 cable is met by those USB-A to C cables that you get with things for 5V charging. That’s it.

              But those are useless to me, as my MacBook doesn’t have any USB-A ports anymore, and since the PC world usually follows a few years later, they will basically disappear in the near future.

              Anything that wants to be USB-C on both ends should be fully compatible with a marked spec, not indistinguishable from a 5V junk wire or freely cherry picking what they feel like paying for.

              But there is no single spec; there are lots of optional parts. Options that also come with limitations. Anything above the bare minimum needs an identification chip in the connector so the computer can determine its capabilities. That adds cost. A 240W cable is necessarily thicker, and thus less flexible, than a 7.5W cable. A passive cable that supports Thunderbolt 3 or 4 cannot be longer than 2 meters; above that it needs to be an active cable, which is a lot more expensive.

              So if you make it mandatory for all cables to support all features, that means that if you want a 5-meter charging cable so you can use your phone on the couch while it charges, you have to spend over €400 on a cable. Or you could not make it mandatory, and have 5m cables that do not support 20Gbit for €10.

              • PassingThrough@lemmy.world · 9 months ago

                I’ll take a compromise where “3.1” is etched into each head end, and I can trust that “3.1” means something, and start with that.

                The real crux of the issue is that there is no way to identify the capability of a port or cable without trying it, and even if labeled, there is/was too much freedom to simply deviate and escape the spec.

                I grabbed a cable from my box to use with my docking station. Short length, hefty girth, firm head ends; it certainly felt like a featured video/data/dock cable… it did not work. It did work with my numpad/USB-A port bus thing though, so it had some data ability (I did not test whether it was 2.0 or 3.0). The cable that DID work with my docking station was actually a much thinner, weaker-feeling one from a portable monitor I also had. So you can’t even judge by wiring density.

                And now we have companies using the port to deviate from the spec completely, like the Raspberry Pi 5 technically using USB-C, but at a power level unsupported by the spec. Or my video glasses that use USB-C connections all over, with a proprietary design that ensures only their products work together.

                Universal appearance, non-universal function, universal confusion.

                I hate it. At least with HDMI, RCA, 3.5mm, Micro-USB… I could readily identify what a port and plug were good for, and 99 times out of 100 the unknown-origin random wires I had in a box worked just fine.

                • marcos@lemmy.world · 9 months ago

                  I can trust that “3.1” means something

                  This is also a problem. That 3.1 is the same as some 3.2 Gen X (for some X that I don’t remember), which is the same as some number in the original standard.

                  It would certainly be better than no marking at all, but no, that 3.1 doesn’t have a clear meaning.

        • TrickDacy@lemmy.world · 9 months ago

          In practice I’ve only had these types of issues with a couple of (shitty) devices. Maybe one or two cables. Otherwise it just works.

    • shinratdr@lemmy.ca · 9 months ago

      This is absolutely not a “US under-regulation thing”; that makes no sense. What “regulation” would dictate what a connector carries over its cable? That would be compliance with the spec, and the spec only defines the connector.

      USB-C can carry USB 2.0, 3.0, 3.1, 3.2, 4.0, PD, DisplayPort, wattages from 5W to 100W, and Thunderbolt 4. No one cable could be required to carry all of those, or all cables would be $50/ft.

      Just because you’ve never encountered a USB-C power-only cable doesn’t mean they don’t exist in your country. They’re made by the bucketload in China, and you’ll encounter one soon enough.

    • Imgonnatrythis@sh.itjust.works · 9 months ago

      How long can that last? It’s only a matter of time before the Gideons start leaving them in hotel rooms, and then before you know it, USB power-only cables are out and about like an invasive species.

      • PassingThrough@lemmy.world · 9 months ago

        Actually, that leads me to another point:

        Once upon a time, the concept behind a universal USB-C connector was that we could do exactly that.

        Laptop? Phone? Camera? America? Germany? Japan? Power? Connect to the TV? Internet?

        It wouldn’t matter anymore. USB-C to cover it all. Voltage high for the laptop, low for the camera, all available just the same in every country, universal. So yes, fill the airports and hotels with them. Use them for power and to play videos on the TV. Because we weren’t supposed to have to question the voltage or abilities of the ports and cables in use.

        Did/will that future materialize?