Yeah, I’ve given up trying to know all the libraries in my projects. I feel like the added development speed and code quality is just so good that forgoing it to avoid the risk of a supply chain attack is basically not an option.
I do try to primarily use libraries from the Rust team or from more widely known devs (and hope that they also do that), but most projects worth doing will need one or two specialty libraries where all bets and bus factors are off…
You think your code is higher quality with more dependencies? All you’re doing is offloading complexity to a separate project.
If you make a program that does “something worth doing”, but you need some specialty library to actually do it (which you didn’t implement yourself), then sorry, but it wasn’t you who did it.
Yes, offloading complexity to a separate project which has already invested more time into code quality than I could possibly justify.
As for your second point, I don’t care who solved the problem. If you care, I hope you’re smelting your own sand to build your own CPU and assembly language. But I’m obviously also not solving the exact same problem as the library already solved.
Why are you looking for conflict?
If you want to build something from scratch, you first have to invent the universe :) (paraphrased from Carl Sagan)
My problem was with the first line of your comment:
Yeah, I’ve given up trying to know all the libraries in my projects.
This leads me to assume that you don’t actually know that those dependencies are as well maintained as you claim.
Obviously dependencies are important and make sense to use in many cases, but using trivial dependencies to speed up development isn’t good.
As for your second point, I don’t care who solved the problem. If you care, I hope you’re smelting your own sand to build your own CPU and assembly language. But I’m obviously also not solving the exact same problem as the library already solved.
I was just saying it isn’t you who solved the problem in that case, really, as the hard work was done for you. Honestly though, it was pointless and rude so I apologise.
Apology taken.
This leads me to assume that you don’t actually know that those dependencies are as well maintained as you claim.
Well, I can’t guarantee that none of them are buggy, unmaintained etc… But that’s why I prefixed that sentence with “I feel”.
On average, it seems to me like the code quality is a good bit higher than I’m able to produce under money/time constraints. In particular, even the worst libraries tend not to be as bad as they may be in many other languages, because Rust’s strict type system + compiler enforces quite a bit of correctness on its own.
Well, and the good libraries are just obsessed with correctness and performance, so they drag code quality upwards, even if they introduce a mild risk of a transitive dependency being a dud…
This assumes that I could implement something as well as the maintainers of the library I use. I agree that something trivial should be implemented on your own, but if there is special knowledge required (the obvious example is cryptography, but also something like HTTP requests) I’d rather rely on a widely used library than on my own code that I now have to maintain and check for security issues, instead of just updating the dependency version whenever a CVE is published.
Also, if there is a client from an API provider for my language, why shouldn’t I use it instead of rolling my own?
Another example is a framework like React or Angular or Svelte, which brings along a whole lot of dependencies. Sure, I could not use something like that and write everything from scratch.
But where is the value of all that code to customers? If I want to roll my own HTTP server up from the sockets, I can do that as a play project. But not using libraries for a real world project to solve business needs is a bit of an odd take.
Anyways, that’s enough of a rant. Have fun in the replies. 😎
Oh, I forgot one thing:
sorry, but it wasn’t you who did it.
This sounds like you want to prove something. That you can do it better than the maintainers of the library. That you can solve hard problems on your own instead of relying on other people.
That’s all great and sometimes it’s good to do hard things on your own and make sure you could do it just in case. But it’s not always necessary to do everything yourself and learn every lesson yourself. It’s a valid way to build on knowledge and work of others to achieve your goals.
Holy shit this. I’ve observed a lot of competent devs go through that phase, trying to be clever and come up with what inevitably ends up being pale imitations of existing established solutions. Yes, we do avoid pulling in dependencies when we can avoid it, but this reeks of “Real Programmers Don’t Use Pascal”, without the tongue in cheek tone lol
Welcome to modern framework development!
- C# has NuGet
- any Node.js-based framework (React, Vue, Angular, …) has npm
- Python …
All of the above are chock-full of dependencies upon dependencies, and webdev stacks are the worst of them. They make it VERY hard to build software that requires any security-related certification because of the dependency hell…
I swear to god, all those frameworks look so badly designed once you look at the dependency hell… Yet I will write C and C# code every day haha
Who can we blame though? If we need something as simple as sed, yes, go ahead and have a great security scan report. Web development has enough complexity to make a dog puke, so naturally you can’t practically write every line of code by yourself. The choices are either to trust that those package maintainers will maintain their software regularly, or build no web application.
Don’t threaten me with a good time.
I mean, to some degree I believe you are right. I myself manage a .NET library to parse barcodes. However, webdev has layers upon layers upon layers of dependencies. The advantage is that even my cat could make a website. The downside is it will be horribly inefficient because of those layers of dependencies. 90% of what they bring is stuff you don’t need and that just gets in the way. Or you do use it, but because you’re going through all those layers, it’s fucking slow.
This applies to desktop dev too, but less severely than webdev. For most webdev packages I just question why something was created, and most of the time I can only conclude it’s because of some hack job or something missing. So they take a huge library and use only part of it for something. It’s just… ugh.
I am a developer/lead who likes to make things as small and efficient as possible, and that just makes me die a little inside every time :p
Ez, feature-bloat the project so all those dependencies are actively used
The real LPT is always in the comments
At least there’s tree shaking. Not everything is getting in the final build
Fuck that. It’s awesome! I want lazy-initialized globals? There’s a crate for that. I want code to shorten my builder pattern? I import that. I need a typed concurrent work-stealing queue? No problem.
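To be concrete, the lazy-global case looks roughly like this. I’m assuming the crate in question is once_cell (or lazy_static), since those are the usual picks, and the names in the snippet are made up for illustration:

```rust
use once_cell::sync::Lazy; // the "lazy initialized globals" crate
use std::collections::HashMap;

// Lazily initialized global: the closure runs exactly once, on first access,
// and the resulting map is shared read-only afterwards.
static COUNTRY_NAMES: Lazy<HashMap<&'static str, &'static str>> =
    Lazy::new(|| HashMap::from([("DE", "Germany"), ("FR", "France")]));

fn main() {
    println!("{}", COUNTRY_NAMES["DE"]); // prints "Germany"
}
```

One tiny, well-known dependency and the boilerplate is gone.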
I look at a C project. Everywhere custom macros to do the most basic shit. I want to parse XML in C? Better use a SAX parser and put all the data into globals. Cryptography? Better implement that ourselves… Using a library would be too much of a hassle.
I mean, it’s awesome until it isn’t.
NPM is already on the “isn’t” side of it. Especially with all the malware going around. Who has time to read the code of the dependencies of the dependencies of their dependencies? For every single version. It’s just not possible…
I guess the main concern with this is security. You’re literally running code you don’t even know about on your machine, probably next to personal files or your company’s code base.
A simple HTTP call to publish all your private code wouldn’t be hard to sneak into a 6th-level dependency.
So, to expound on this a little…
There’s a password manager I use, but the CLI tooling sucks. Thankfully, there’s a third party CLI tool in a language I know fairly well, and because I’m a little paranoid, I reviewed the code. Then I reviewed the code of the libraries it imported. And then the code of the libraries of the libraries it imported. Thankfully, that was as far as it went, and I was mainly looking for any code that made network calls… it was manageable, just barely.
And I made some improvements and submitted PRs, only some of which were accepted, but I used them so I maintained a fork. Which was lucky, because a few months later upstream changed their parseargs library to a framework, and the dependencies exploded. 6 layers deep, and dozens of new dependencies - utterly unauditable without massive effort. I caught it only because of the rebase from upstream. I abandoned the rebase and now maintain a hard fork, of which I’m the only user AFAIK.
The moral of the story is that introducing dependencies under the guise of “reuse” is a poisoned fruit, a Trojan Horse. It sounds good, but isn’t worth it in the long run. The Go team got it right with their proverb: a little copying is better than a little dependency.
Honestly, I don’t like the Go way. If they are going to have that philosophy, at least they should have provided a strong core with high level functions and generics. From the start. Not 5 years later.
I’ve never used Rust, but this definitely reminds me of my days running Slackware on my computers.
Oh, hey, I’d like to run this new package. Great. I’ll need this dependency…and that one…and the one over there…
I know it now has dependency management, but I just couldn’t do it any more. I was tired of worrying about what was going to break. I started with Slackware in the 3.x days, too.
I switched my server to Debian, and I feel like I never have to worry about it any more. Laptop and desktop are both Kubuntu, but they’re going to go to Debian at some point in the near future.
cough NodeJS cough
I find it especially weird that it’s almost always labeled as something special if it’s written in Rust, even though as the end user the only difference I know I’ll notice is the compile time: it usually takes around 10-20 times longer than if it were written in C, with 500 dependencies being pulled and recompiled every time. Which means that if tests fail even though the app works fine (and I’ve had that happen twice with Rust), it takes three tries or so until I manage to fully remove the test section from the PKGBUILD, resulting in an hour lost just for installing something that could’ve taken 5 minutes.
> Decide to create a very basic GUI app in Rust, as everyone is saying it’s a great language for it
> First compilation takes over 15 minutes to download and compile 100 libraries
> Debug files take up 2GB of storage
> Output binary file comes out massive for no reason
> “Yeah you’re supposed to write a few lines to optimize for size in your release profile”
> Compiling now takes 30 minutes instead
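For context, the “few lines to optimize for size” are typically a release-profile block along these lines (a sketch; the exact values vary by project):

```toml
# Cargo.toml -- typical size-focused release profile
[profile.release]
opt-level = "z"     # optimize for size instead of speed
lto = true          # link-time optimization across all crates
codegen-units = 1   # better optimization, slower compilation
strip = true        # strip symbols from the binary
panic = "abort"     # drop the unwinding machinery
```

Which is also exactly why the compile gets slower: lto and codegen-units = 1 deliberately trade build time for binary size.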
Reimplements in C
Compiles in 5 Minutes (you accidentally did it on the RPi Zero W, on a PC it’s done in 30 secs)
Reimplements in C
Segmentation fault (core dumped)
Reimplements in C
Segmentation fault (core dumped)
change code so it no longer segfaults
still is UB, has arbitrary code execution vulnerability
everybody dies
But you died faster, that’s not to be underestimated
You seem to be a rather specific user, if the compile time is something you notice, let alone the only thing…
It’s just weirdly noticeable when one Rust program with ~150 lines of code, designed to connect to a specific device and send commands according to the intensity of music, takes longer to compile than updating a typical Arch testing setup after a month without maintenance, including the (non-Rust) AUR packages.
Well, I’m not here to claim that Rust’s compile times aren’t comparatively long, especially for non-incremental builds. It’s a trade-off that was chosen so that you don’t need a runtime environment, while also not being as simplistic and footgun-ridden as C.
What I’m saying is that this trade-off was chosen and continues to be popular, because the vast majority of users will never notice (nor will programmers really, as they have incremental builds).
Maybe you can download the fully built package from somewhere? Maybe Arch can package it in the proper repos?
mf conveniently forgetting about incremental compilation
Every time I see a project decide to use rust I groan knowing my build/packaging time is about to skyrocket. Case in point, the Python cryptography project.
And given cryptography’s importance in the Python ecosystem, what used to be an easy pip install of a package is now almost always going to include an enormous and horribly slow Rust build environment.
Seeing a Rust library just makes me sad now 😭
I seem to recall when the switch was made it took me about a week to figure out how to get it to work on OpenBSD, because the Rust build step failed there (for a reason I can’t remember now).
Yeeap. My FreeBSD box has such pain with ’em, because unfortunately *BSD isn’t covered by Python’s precompiled wheels. So one is almost always building from source.
Now every time I pip install something there’s a high likelihood I’m going to end up having to install the Rust toolchain and burn so much time on building libraries. I get why the project made the switch, but man does it hurt being downstream of it.
My only dependency is libstdc++
Fight me
stdc is just bloat, I implement the functions and do the syscalls myself if I need them
C is bloat, that’s why everyone should use Asm
Pff, I prefer good ol’ flip switches, like on my trusty Altair.
Noob. I prefer to use a screwdriver to poke around the CPU and memory lanes
I use butterflies
I swear there is an XKCD for that
Always. https://xkcd.com/378/
libc++ here.
Bring it.
GPL wins.
Dependencies. Not even once.
Sounds like JavaScript
Laravel moment
I often find myself wishing Cargo had a feature that would warn me if different dependencies I used tried to pull in both openssl and rustls. Happened way too many times.
You could use cargo-deny for that: https://embarkstudios.github.io/cargo-deny/checks/bans/index.html#use-case---denying-specific-crates
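A minimal sketch of the bans config that would catch it, assuming you’ve standardized on rustls and want anything pulling in OpenSSL to fail the check (see the linked page for the exact, current schema):

```toml
# deny.toml -- checked with `cargo deny check bans`
[bans]
multiple-versions = "warn"   # also flag duplicate versions of the same crate
deny = [
    # We use rustls, so any transitive dependency on OpenSSL should fail.
    { name = "openssl" },
    { name = "openssl-sys" },
]
```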
You’d need to remember to run it, though, either in CI/CD or as a pre-commit hook. Personally, I like to just have a script which also runs unit tests and Clippy, so that it’s useful enough that I run it myself.
Grep cargo.lock on pre-commit?
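Something like that would probably work; here’s a rough sketch of a .git/hooks/pre-commit, assuming you only care about the openssl/rustls pair (Cargo.lock lists every crate as a `name = "…"` line):

```sh
#!/bin/sh
# Rough pre-commit sketch: refuse the commit if both TLS stacks are in the tree.
if grep -q '^name = "openssl-sys"' Cargo.lock && grep -q '^name = "rustls"' Cargo.lock; then
    echo "error: both openssl and rustls are in the dependency tree" >&2
    exit 1
fi
```

Crude compared to cargo-deny, but it needs no extra tooling.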