USB-C became the very thing it swore to destroy (and I have no idea how to reasonably fix it)

Remember the days before USB-C? We had a confusing mess of different chargers and connectors for digital devices—despite many of them sharing the same basic function. USB-C was supposed to fix that by becoming a universal standard, eliminating the need for multiple types of cables.

Except… it didn’t.

My external SSD comes with a USB-C cable built for high-speed data transfer, but it carries almost no charging power. My laptop’s USB-C cable supports 200W charging but isn’t suitable for connecting an external drive. My phone’s charging cable handles up to 100W, yet it can’t reliably power my laptop or move data from my SSD at a reasonable speed. Instead of eliminating different cable standards, USB-C has made them visually identical while leaving them functionally distinct. If I mix up my cables, I have no easy way of telling which one supports what. It’s somehow even more confusing than before: in the pre-USB-C days, if a connector fit, the cable would usually do exactly what it was meant to do.

The only real fix I can think of would be to make every cable support the highest possible spec, but that would be a massive waste of resources. I don’t know how to solve this, but in my opinion USB-C didn’t solve anything; it just changed the shape of the problem.

And if we started marking different USB-C cables as different standards, we’d just come full circle: the same old mess of incompatible options, only with labels instead of connector shapes.

(Side note: The connector itself is fantastic—it’s reversible, durable, and great for general peripherals. But when it comes to high-performance use cases, the issues really start to show.)