The commenters below are confusing two things - Rust binaries can be dynamically linked, but because Rust doesn’t have a stable ABI you can’t do this across compiler versions the way you would with C. So in practice, everything is statically linked.
Rust's stable ABI is the C ABI. So you absolutely can dynamically link a Rust-written binary and/or a Rust-written shared library, but the interface has to be pure C. (This also gives you free FFI to most other programming languages.) You can use lightweight statically-linked wrappers to convert between Rust and C interfaces on either side and preserve some practical safety.
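As a minimal sketch of what exporting a C interface from Rust looks like (the function name here is made up for illustration):

```rust
// `extern "C"` fixes the calling convention to the platform C ABI,
// and `#[no_mangle]` keeps the symbol name stable, so a C compiler
// (or a different Rust compiler version) can link against this.
// Built with `crate-type = ["cdylib"]`, it becomes an ordinary
// shared library.
#[no_mangle]
pub extern "C" fn add_u32(a: u32, b: u32) -> u32 {
    a.wrapping_add(b)
}

fn main() {
    // Calling it directly from Rust, just to show it behaves normally.
    assert_eq!(add_u32(2, 3), 5);
    println!("ok");
}
```

In a real crate this would live in `lib.rs` without a `main`, with the `cdylib` crate type set in `Cargo.toml`.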
> but the interface has to be pure C. (This also gives you free FFI to most other programming languages.)
Easy, not free. In many languages, extra work is needed to provide a C interface. Strings may have to be converted to zero-terminated byte arrays, memory that can be garbage collected may have to be pinned, structs may have to be converted to C struct layout, etc.
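A sketch of that cost in Rust (the C-side function is hypothetical and defined locally so the example is self-contained):

```rust
use std::ffi::{CStr, CString};
use std::os::raw::c_char;

// Stand-in for a C-ABI function that expects a NUL-terminated string.
// In a real program this would be declared in an `extern "C"` block
// and resolved by the linker.
#[no_mangle]
pub extern "C" fn message_len(msg: *const c_char) -> usize {
    // Safety: the caller must pass a valid NUL-terminated pointer.
    unsafe { CStr::from_ptr(msg) }.to_bytes().len()
}

fn main() {
    // A Rust `String` is not NUL-terminated, so it must be copied into
    // a `CString` before crossing the boundary. That's the "not free"
    // part: an allocation plus a scan for interior NUL bytes.
    let s = String::from("hello");
    let c = CString::new(s).expect("string contained an interior NUL");
    let n = message_len(c.as_ptr());
    println!("len = {}", n);
}
```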
A culture issue: in the C++ world of the Apple and Microsoft ecosystems, shipping binary C++ libraries is a common business, even if it is compiler-version dependent.
This is why Apple made such a big point of having a better ABI approach on Swift, after their experience with C++ and Objective-C.
While on the Microsoft side, you will notice that Victor Ciura's talks at Rust conferences all list the ABI as one of the key points Microsoft is dealing with in the context of Rust adoption.
COM is actually good though. Or if you want another object system, you can go with GObject, which works fine with Rust, C++, Python, JavaScript, and tons of other things.
Plenty do not, especially on Apple and Microsoft platforms, because their dynamic linkers and C++ compilers have always favoured approaches beyond bare-bones UNIX support.
Static linking doesn't produce smaller binaries. You are literally copying a library's code into your executable, rather than simply referencing its symbols and letting the dynamic linker figure out how to map them at runtime.
The sum size of one dynamic binary plus its dynamic libraries may be larger than a single statically linked binary, but whether that holds for more static binaries (2, 3, or hundreds) depends on how much of those libraries' surface area your application uses. It's relatively common for builds to go to great lengths to build certain large libraries as shared objects, with the executables linking them via a location-relative RPATH (using the $ORIGIN feature), to avoid that size bloat being repeated over a large set of binaries.
Static linking does produce smaller binaries when you bundle dependencies. You're conflating two things - static vs dynamic linking, and bundled vs shared dependencies.
They are often conflated because you can't have shared dependencies with static linking, and bundling dynamically linked libraries is uncommon in FOSS Linux software. It's very common on Windows or with commercial software on Linux though.
You know how the page cache works? Static linking makes it not work. So 3000 processes won't share the same pages for the libc but will have to load it 3000 times.
I wonder what happens in the minds of people who just flatly contradict reality. Are they expecting others to go "OK, I guess you must be correct and the universe is wrong"? Are they just trying to devalue the entire concept of truth?
[In case anybody is confused by your utterance, yes of course this works in Rust]
That would have been a good post if you'd stopped at the first paragraph.
Your second paragraph is either a meaningless observation on the difference between static and dynamic linking or also incorrect. Not sure what your intent was.
Go may or may not do that on Linux, depending on what you import. If you call things from `os/user`, for example, you'll get a dynamically linked binary unless you build with `-tags osusergo`. A similar case exists for `net`.
Kind of off-topic. But yeah it's a good idea for operating systems to guarantee the provision of very commonly used libraries (libc for example) so that they can be shared.
Mac does this, and Windows pretty much does it too. There was an attempt to do this on Linux with the Linux Standard Base, but it never really worked and they gave up years ago. So on Linux if you want a truly portable application you can pretty much only rely on the system providing very old versions of glibc.
It's hardly a fair comparison with old linux distros when osx certainly will not run anything old… remember they dropped rosetta, rosetta2, 32bit support, opengl… (list continues).
And I don't think you can expect windows xp to run binaries for windows 11 either.
So I don't understand why you think this is perfectly reasonable to expect on linux, when no other OS has ever supported it.
Static linking produces huge binaries. It lets you do LTO, but the amount of optimisation you can actually do is limited by your RAM. Static linking also means every binary has to be rebuilt whenever one of its libraries changes.
You don't need LTO to trim static binaries (though LTO will do it): `-ffunction-sections -fdata-sections` in the compiler flags, combined with `--gc-sections` (or equivalent) in the linker flags, will do it.
This way you can get small binaries with readable assembly.
For what it's worth, a friend of mine is a lawyer in a well-known hedge fund and he gets access to their funds too (funds that would not otherwise be accessible without making a substantially larger investment I believe).
This is false and represents a poor understanding of how these models work - they do learn abstract concepts, and no, you can't trivially get training images out.
Copyright is about copying. It is not about observing. Reading a copyrighted book isn't infringement. Writing out copies of it is, even if you don't use a computer or anything.
I don't think, given that we don't even have a particularly full understanding of how human conceptual logic functions, that we can claim AIs are using it either. It's only "abstracting" in the sense that it has labeled one million objects in its training set with the word "tree," and it fuses many of those images together to form a general picture dependent on specific parameters made to limit its set (oak tree, winter tree, etc.)
But that is different from me or you using the word tree, which is just a signifier among signifiers, it stands for nothing but a negation of the very thing it points to in a certain set of symbolic relations. Humans communicate in the order of symbolic structures, our minds function much more like LLMs, creating multitudinous pattern relations. What you call "abstract concepts" are of a secondary order imposed to create rigorous exactitude overtop the riddled mess that we call the human psyche.
On the other hand, Facebook went from about 4,000 employees to over 70,000 in the same time period, so given how much Apple's business has grown in the last decade (when they released the iPhone 5) it seems pretty impressive that they grew only ~2x.
How is that different with LLMs versus badly-written human generated content? Most clickbait/SEO articles are as poorly researched as they come, and shouldn't be assumed to be accurate anyway.
> Anybody who lost their sense of taste or smell took months to recover that. Loss of smell or taste is almost 50% for Covid prior to Omicron. Even with Omicron, it's about 20%.
And yet, in more anecdata, everyone I know who had taste issues recovered them in much less than a month with no long-term impacts.
Though couldn’t a teenager (older than 14, but still) understand that deferring payment for something is useful in the context of a part-time job? I would only get paid on certain days, and knowing roughly what I’d get at the end of the month after tips and hours, I could budget against that number rather than what I had in the moment.
Then the teenager might ask, sure, that's credit like a bar tab or whatever and credit is convenient for all kinds of reasons, it's just, why do you need a whole other company to get involved and issue a specialized card to do that? Why doesn't the bank just say "we'll let you overdraft up to $4000 without any fuss but you'll have to pay interest on the negative balance"?
Because when your credit card is scammed, it's the credit card company's money that has been stolen, so they want to get it back. When it's your debit card being scammed, it's your money being stolen, and any help getting it back is just a cost center for the bank, so they don't really invest in that side of their business the way credit card companies do.
So there's a bunch of existing intermediaries, which is bad, and to fix that we're going to add the exact same set of intermediaries, but on top of bitcoin, which means it's good?