It's like /agdg/ except without videogames. Well there wasn't any videogames to begin with, but now there may be other software.
See also /agdg/ at >>>/v/ for videogames.
What are you working on?
There isn't really a screenshot for it, but I just finished improving my mouse input system.
The program now only handles mouse events in two cases: either the mouse is doing something on top of the window, or you click-and-dragged out from the window. It was more annoying to do than it sounds, because Windows only fires input events while the mouse is over the window, so you can't drag out from it (which makes sliders and such annoying if they're near the window edge). But if you query the global mouse position manually, it carries no information about whether some other window is obscuring yours. Global mouse position is also in monitor coordinates rather than window coordinates like the mouse move events, so you have to convert it. You also need to detect when a button is released while the mouse is away from the window (which, again, Windows doesn't fire an event for), otherwise you get the age-old issue where the button stays stuck down until you click on the window again.
I need this because next I'm going to work on a program that has custom controls for moving/scaling the window, and it needs to mostly sit idle so I don't want any unnecessary mouse events causing it to run logic and use CPU.
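For reference, the stuck-button part boils down to a tiny state machine. A rough sketch in portable C++ (the Win32 plumbing like GetCursorPos and WindowFromPoint is omitted; the types and the polling approach here are my own assumptions, not the actual program):

```cpp
#include <cassert>

struct Point { int x, y; };
struct Rect  { int left, top, right, bottom; };

// Convert a monitor-relative position to window-relative coordinates,
// mirroring what a mouse-move event would have reported.
inline Point screen_to_window(Point global, const Rect& win) {
    return { global.x - win.left, global.y - win.top };
}

inline bool inside(const Rect& r, Point p) {
    return p.x >= r.left && p.x < r.right && p.y >= r.top && p.y < r.bottom;
}

// Tracks one mouse button across polls so that a release happening while
// the cursor is outside the window still produces a "release" event
// instead of leaving the button stuck down.
struct ButtonTracker {
    bool was_down = false;

    // Returns true when a release event should be synthesized this poll.
    bool poll(bool physically_down, Point global, const Rect& win) {
        bool release = was_down && !physically_down && !inside(win, global);
        was_down = physically_down;
        return release;
    }
};
```

During a drag you would poll the global cursor and button state through this each tick; everything else stays event-driven so the program can sit idle.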
>fixing up emulators & controller tools to not be shit
mostly minor changes but these minor changes were really starting to annoy me
Got the window UI figured out, now I just need to actually put something in it. This is a remake of an older program, an image viewer of sorts.
what's this? an imgui clone? I haven't seen the previous thread
PSA on toilets:
>an imgui clone?
I'm not sure what you mean by that, imgui is an idea about how to manage a GUI. If you mean Dear Imgui then no, I wouldn't call what I'm doing a UI system at all, I needed a bunch of buttons so I just made an array of buttons. The UI is just means to an end rather than a project in itself.
Very cool Anon.
Working through the graphics chapters in Stroustrup's PPP2, and working on creating a wrapper library for FLTK that eliminates the need to write C-style code to use it. The author of the library claims it's 'C++' but plainly it's 'C with classes' at best. Hardly surprising given its heritage goes all the way back to SGI & the NeXT.
Anyway, when I'm done we should have a very simple to use, very portable, very high performance GUI/windowing system that will run smoothly on even tiny microcontrollers if needed.
This is an interesting area to learn and touches on a lot of different disciplines, as Bjarne correctly points out. It will be a good learning exercise for me, and will teach me a lot about doing efficient library design using C++.
why a wrapper though? wouldn't it be easier to just fork FLTK 1.x and replace raw pointers with RAII pointers, const char* with string_view, char* with string, and function pointers with std::function<...>?
>wouldn't it be easier
No, heh it definitely wouldn't be easier. FLTK is a big-ass library (unsurprising given it's basically a full-featured Windowing/GUI system over 30 years in development). So, I guess my reasons are mainly these two.
A) I'd like to have this done before Summer's end, not five years from now.
B) I'd like anons to simply install the FLTK dependency from their own repos, then my code will simply rely on that.
Taken together, I think the wrapper idea is cleaner & less error prone. Besides, I've already succeeded at this approach for the cURL library, and it works fine that way.
But you're right about using modern C++ programming idioms Anon, they really are a good idea in general.
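To illustrate the wrapper idea: you bolt RAII onto the C-style API from the outside with a deleter, no fork needed. A minimal sketch -- the fl_widget_* functions here are stand-ins I invented for illustration, not real FLTK calls:

```cpp
#include <cassert>
#include <memory>

// Hypothetical C-style API of the kind being wrapped: an opaque handle
// plus create/destroy functions (live_widgets just lets us observe them).
int live_widgets = 0;
struct fl_widget { };
fl_widget* fl_widget_create()           { ++live_widgets; return new fl_widget; }
void       fl_widget_destroy(fl_widget* w) { --live_widgets; delete w; }

// The wrapper approach: keep the C library as an external dependency and
// add ownership from outside via a custom deleter.
struct WidgetDeleter {
    void operator()(fl_widget* w) const { fl_widget_destroy(w); }
};
using Widget = std::unique_ptr<fl_widget, WidgetDeleter>;

Widget make_widget() { return Widget(fl_widget_create()); }
```

Because the deleter is stateless, the wrapper is the same size as the raw handle, and the handle is destroyed on every exit path without touching the library's source.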
What is the difference between using modern C++ features and free()ing memory on my own?
I understand it seems like automatic but what's wrong with explicit memory management and C idioms?
>What is the difference between using modern C++ features and free()ing memory on my own?
Well, the first and most obvious one is, well, you Anon.
>and free()ing memory on my own
Relying on humans to correctly and consistently manage resources properly has obviously been an utter failure. The computer does some things much better than we can, and RAII is one extremely good example of that.
And btw, RAII isn't 'modern' in that sense. It's been a part of C++ from the very, very beginning. Certainly by the ARM in 1990, and in practice since Cfront in '85.
Right, RAII isn't modern. And I know it's similar to auto-closing fridge doors. Then again, shouldn't people close their fridge doors on habit? It's just lazy or stupid not to anyways.
Well it's not quite that simple Anon. Computing systems are prone to exceptional conditions. And men (there are no women in programming, really) aren't all that clever at the foresight required to anticipate all those possibilities.
Even if you free resources dutifully in the recommend fashion, exceptional states can often render those steps useless. RAII basically always does the right thing, and it relies on very little else to do so. Certainly not the men involved, for instance.
not him, but:
first, have in mind that C++'s RAII unique_ptr has no memory overhead unless you use a stateful custom deleter (which you almost never do), and won't affect execution time as long as you use -O1/2/3 (which inline away the additional function calls on creation and destruction).
there are many inconveniences with manual memory de-allocation; just to name a few that come to my mind:
0. let X be an array of structs Y with arrays Z inside. do I really want to write de-allocation functions for all those types? it's going to be exactly the same for all of them.
1. suppose I have a function and I want to insert "early returns" (a common pattern when using error codes); without RAII, I would have to free all my resources before every return statement.
2. I can't make "ownership" explicit with raw pointers syntax, so I'm going to have to document which pointers I'm supposed to free (FLTK does this).
3. many algorithms require ref-count'd pointers. C++ has had a standard ref-count implementation (shared_ptr, and Boost's before it) for decades, so you don't have to roll your own
so here are the two main reasons why I use RAII:
>have to write less code for exactly the same functionality
>0 (zero) overhead
just give it a try anon, it stops feeling wrong after the first three or four times. besides, if you are on GNU, you can use gcc -S -O2 <your file> and check the generated x86 by yourself.
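to make point 1 concrete, here's a minimal sketch (the functions and names are made up for illustration):

```cpp
#include <cassert>
#include <cstdlib>
#include <memory>

// Manual version: every early return has to remember the free().
bool validate_manual(int size, int bad_at) {
    int* buf = (int*)std::malloc(size * sizeof(int));
    if (!buf) return false;
    for (int i = 0; i < size; i++) {
        if (i == bad_at) { std::free(buf); return false; }  // easy to forget
        buf[i] = i;
    }
    std::free(buf);
    return true;
}

// RAII version: the unique_ptr frees the buffer on *every* return path,
// including ones added later and ones taken by exceptions.
bool validate_raii(int size, int bad_at) {
    auto buf = std::make_unique<int[]>(size);
    for (int i = 0; i < size; i++) {
        if (i == bad_at) return false;  // nothing extra to remember
        buf[i] = i;
    }
    return true;
}
```

same observable behavior, but one of these can never leak no matter how many returns you add.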
I see. By using those features, the foresight can be shared and inherited. It makes sense to use them if and only if the programmer knows what they are doing. Blindly following practices just like a pajeet is bad, especially when the execution is not explicit.
I will. Still prefer C with class for now, will try adding in RAII to the mix.
I should just nuke this thread from orbit before another anon contracts this pajeet aids.
i don't care if the pajeets can do it or if it melts their brain all the more reason to call them retards.
memory management is so easy it doesn't even factor to the difficulty of writing a program usually.
and you are not supposed to use malloc and free regularly inside functions for like whatever.
they're methods you should rarely use. instead of using raii, learn to use your brain.
all these problems and more can be solved if you just learned assembly
>it's fine if I procrastinate and play videogames a bit
>suddenly it's 4 days later
Where does time go
Continuing this, you can now actually view images on it. I don't have a smooth image scaling function yet though, so the image looks a bit messy when zoomed out.
Continuing with some UX. The image gets centered to the window when you open it, and also scaled down if it's too big to fit the window. The bottom left button now also works, it toggles between fitting the image into the window, and filling the window with the image.
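For reference, fit vs fill comes down to taking the min or the max of the two axis ratios. A sketch (my own naming, not the actual program's code):

```cpp
#include <algorithm>
#include <cassert>

// Scale factor that fits the whole image inside the window (letterboxed).
double fit_scale(double win_w, double win_h, double img_w, double img_h) {
    return std::min(win_w / img_w, win_h / img_h);
}

// Scale factor that fills the window completely (image may be cropped).
double fill_scale(double win_w, double win_h, double img_w, double img_h) {
    return std::max(win_w / img_w, win_h / img_h);
}
```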
>factor to the difficulty of writing a program usually
>use malloc and free regularly inside functions for like whatever
nobody said that. why can't you refute a simple point and have to embarrass yourself?
not him but, assembly isn't hard, it's probably the most simple and easy to learn programming language. most comp-sci students learn x86_64 and mips64 in their first year.
the only good reason to mention asm (x86 in particular) would be to point out the overhead of an additional PLT symbol lookup (1 jmp, ~2 movs, 1 ret) the first time an object is deleted, but only IF the call destination address was not resolved at link time (unlikely).
>all these problems and more can be solved if you just learned assembly
But I did learn assembly Anon. I even did it before C and then C++. C++ brings so many benefits to the table, with so little associated overhead (usually none), that to me it's an absolute no-brainer to use the language for any resource-constrained project as long as a toolchain exists for the platform. Since today that includes practically every platform it's an easy choice IMO. Some things are better done in Bash or even Python. But for practically everything else C++ produces the singular best results -- for size, runtime perf, extensibility & maintainability alike.
Perhaps in the future some like Golang (if you can stomach the poz of the community itself) may displace C++ for systems programming, perhaps not. Regardless, neither Assembler nor C ever stand a chance of doing so.
Added more ways to open images. You can now drag images in from a web browser, copy&paste bitmaps from a browser or image editor, and also copy transparent images from Krita and Gimp. Anything that puts a regular bitmap into the clipboard should work, and I think programs like Krita always do that as fallback. I don't know if there's a commonly accepted clipboard format for images with transparency, Gimp and Krita both put a png file into the clipboard, but they use a different ID and name ("image/png" from Krita and "PNG" from Gimp). Not sure what other programs like Photoshop do.
Strangely you can't copypaste stuff from Krita into Gimp. You can copy from MS Paint into Gimp so Gimp does recognize the default bitmap. Both I and MS Paint can successfully read the bitmap data from all 3 programs, so I wonder why Gimp can't read it when it comes from Krita.
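For illustration, the format-picking logic can just be a preference list. A sketch -- the fallback name below is the Win32 bitmap format, and real clipboard APIs hand you numeric IDs rather than strings, so treat this as pseudocode-with-types:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Given the format names a clipboard offer advertises, pick the richest one
// we can decode. "image/png" (Krita) and "PNG" (Gimp) keep transparency; a
// plain bitmap ("CF_DIB" on Windows) is the common-denominator fallback.
std::string pick_clipboard_format(const std::vector<std::string>& offered) {
    for (const char* want : {"image/png", "PNG", "CF_DIB"})
        for (const auto& have : offered)
            if (have == want) return have;
    return "";  // nothing we understand
}
```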
>neither Assembler nor C ever stand a chance of doing so.
C is already the de-facto systems programming language.
Smooth scaling now works. It scales in a separate thread when the zoom level changes, in the meantime a quick preview is shown.
This is the first time I'm using threading for any actual purpose, and I already hate it. I can only imagine what a mess it would be to add threading to a videogame and managing a million mutexes for everything. I wish you could use a bracket scope syntax for them, so instead of this:
lock_mutex(cia_killer_mutex);
// process CIA niggers
unlock_mutex(cia_killer_mutex);
You could just do something like this:
mutex_scope(cia_killer_mutex) {
    // process CIA niggers
}
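For what it's worth, C++ already has exactly this bracket-scope behavior via lock guards -- the destructor unlocks however the scope is left:

```cpp
#include <cassert>
#include <mutex>

std::mutex m;
int shared_counter = 0;

void bump() {
    std::lock_guard<std::mutex> lock(m);  // locks here
    ++shared_counter;
}                                         // unlocks here, even on early
                                          // return or exception

// C++17 also has std::scoped_lock, which takes several mutexes at once
// and locks them in a deadlock-avoiding order.
```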
I found this book on multi-threaded shared-memory programming to be quite interesting. You can check it out: https://cdn.kernel.org/pub/linux/kernel/people/paulmck/perfbook/perfbook.html
I have better things to do with my time but thanks, I'll save it just in case.
>I have better things to do with my time
really, like what
Programming. Reading that much for what will probably amount to a handful of neat tricks isn't worth stopping programming for. How much would I learn that I wouldn't learn by doing multithreaded programming for the same amount of time anyway? Rhetorical question.
Skim and skip. You can probably learn quite a bit of tricks that are not easily searchable.
>I can't learn to program properly I have to shit on the street
Pajeet, you should be as far away from computers as possible.
Sounds like needle in a haystack. I mean I'm sure there's useful stuff there, but nothing I'm going to do for a long time warrants a >500 page book.
How about you post your program instead of whining about what other people do.
Holding Shift now preserves window aspect ratio, and holding Ctrl will scale the image along with the window. Clicking and dragging with right/middle mouse will scale the image linearly.
The point under mouse will now stay in place when zooming.
The bottom right button now works, it scales the window to the image size.
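For the curious, keeping the point under the cursor fixed while zooming is one line of algebra per axis. A sketch (the names and the offset convention are my own, not the actual program's):

```cpp
#include <cassert>

// The image point under the cursor is (cursor - offset) / zoom, where
// offset is the window position of the image's origin. Solving for the
// new offset that keeps that same image point under the cursor after the
// zoom change gives:
double zoom_offset(double cursor, double offset,
                   double old_zoom, double new_zoom) {
    return cursor - (cursor - offset) * (new_zoom / old_zoom);
}
```

You apply it once for x and once for y whenever the zoom level changes.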
Nigger rigged my old bitmap text system to show zoom level, and instructions at startup. The instructions text isn't perfectly centered because it includes spaces in the row width, but I'll probably write a whole new bitmap font system sometime.
UI is now visible when dragging stuff even if mouse leaves the window.
Sprites are embedded into the file so the executable is completely standalone. There is a file though, which you can edit to customize the icons, but if the file isn't found then the embedded data is used as fallback.
I think this program is now complete as long as I'm not forgetting anything. Well, I still need to figure out how to prevent stb_image_resize from fucking up the alpha channel when it does a smooth rescale.
There was something I was supposed to add after all: buttons to flip the image.
>prevent stb_image_resize from fucking up the alpha channel
Never mind I'm retarded, I wasn't enabling the alpha channel on the scaled down image, so my own drawing function was ignoring alpha. I didn't realize because it's usually enabled automatically when I load an image file.
Anyway it's done, here it is in case anyone is curious:
It is meant to be a tool that artists can use to view a reference image while they draw.
Nice work lad.
I looked around but I couldn't find the source code Anon?
Photoshop watch out.
The reason I worked on >>1535 is because I thought it would be a good warmup to what I really want to work on: a graphics editor. I've been working on the UI and design of this program for ages for fun, I'm very happy with how it all looks so now I just have to worry about actually making it.
> Some things are better to do in Bash
No, shell sucks ass, unless it is absolutely 100% trivial.
> with so little overhead related (usually no overhead)
OOP adds memory overhead (every polymorphic object carries a vtable pointer). And C++'s convenient features (a good example is vectors, which are actually truly convenient) do things behind your back (by design), which is an issue when programming systems software or embedded systems.
> Golang (if you can stomach the poz of the community itself) may displace C++ for systems programming,
Golang will never displace or replace C nor C++. Golang is not good for systems programming. Golang is good for creating tools and, in general, Go is a less harmful python replacement.
> C++ produces the singular best results for maintainability
Not unless you and your team follows a strict style guide that tells which features you can use. C++ has a ton of features and C++ is way more complex language than C.
> C is already the de-facto systems programming language.
Either C or "C-style" C++ (without OOP or anything that uses OOP or anything that has a big overhead in general). C-style C++ can be better than C because the C++ compiler does more checks and you get some extra features.
Good Job Anon!
>shell sucks ass
Do you mean POSIX shell sucks or the concept of shell sucks? Why?
C++ can help create much more maintainable code than C ever can. And as with much of the rest of your positions in this post, you have the point about strict guidelines exactly backwards: the only way you can ever hope to have maintainable large C projects is by them. Well-written C++ naturally tends to enforce the designer's intentions well, by design. It's much much easier to create a steaming pile with a good C developer's code than with a good C++ developer's.
>C-style C++ can be better than C because the C++ compiler does more checks and you get some extra features.
It's true that C++ compilers are better than C ones, but 'C-style C++' is neither, and sort of an abomination. C developers may like that shit, but certainly no professional C++ developers do.
I'm not him, but both suck and I'm angry. Angry about shell.
POSIX shell is completely useless for any serious programming, see http://www.etalabs.net/sh_tricks.html for a good rundown on some of the myriad ways it sucks. The other extension shells aren't much better, sh is just the most outrageous of them all. Writing a shell program that handles all inputs correctly is straight up impossible because there is so much hidden and retarded "helpful" action that you can't possibly keep all of it in your head.
Almost everything has to be prefixed by some gargantuan BE_RETARDED=NO --dont-do-the-retarded-thing --no-really-stop-being-retarded song and dance, with no way to get rid of it once and for all. I doubt you knew all of the pitfalls in that link, and before you misunderstand this as a challenge to try and learn them so you can be 1337 Unix Greybeard, let me say that memorizing that shit is a colossal waste of your time and runs counter to the point of a computer, which is to automate work, not create it. The reason you see all these shitty to mediocre "scripting languages" pop up again and again is that the shell sucks.
The concept of a shell as found in Unix sucks because what you get is a really retarded programming environment where everything has to be built from global variables (called files), byte streams and fuckhuge variadic functions (called programs) that take opaque byte arrays (called arguments) and return a small integer (called return code). Anything beyond that must be reimplemented by every single program and thus ends up wildly inconsistent.
What you're doing in the shell is really programming - the environment and language are just so thoroughly crippled and useless that you don't even recognize it as programming half of the time. The classic example of a system that does this right is Genera, but really any proper programming environment would be an improvement, even if it's just the javashit REPL of your browser. When you hear windowsfags rave about Powershell, that's because it fixes some of these issues. I don't know how many of them though, wild horses couldn't drag me to Microsoft land anymore.
> Do you mean POSIX shell sucks
Yes and so do all POSIX shell look-a-likes.
> you have the point about strict guidelines exactly backwards
Let me reiterate: because C++ has so many features and so much more complexity in the language, you simply cannot use all of the features. If you are working with a team, you should pick the allowed features to prevent your codebase from becoming a mess.
>no professional C++ developers do.
Like I said, C++ is not suitable for embedded systems or resource (especially memory) constrained environments/software (Unless, you use "C-style" C++)
>by design. It's much much easier to create a steaming pile with a good C developer's code than with a good C++ developer's.
Define what you mean by "steaming pile" exactly. are you complaining about Cstrings again? (I never claimed that C has good support for strings) or are you complaining about the C type system (which I never praised). Don't tell me you are actually comparing a legacy codebase (that can date back decades) to a modern one. Finally, if you are complaining about muh header files, then you must realize that modules were introduced to C++ just recently.
> I don't know how many of them though, wild horses couldn't drag me to Microsoft land anymore.
FYI, PowerShell also works on GNU/Linux: https://github.com/PowerShell/PowerShell (but I have never used it on Linux)
>OOP adds a memory overhead
OOP != C++. you can do OOP with C. actually, most big C projects do, in an inconsistent and inefficient way (see the linux kernel or any other C project with more than 20K LOC).
>vectors which are actually truly convenient
>do things behind your back (by design)
>which is an issue when programming systems software or embedded systems
if you don't know the difference between stack and heap you shouldn't be near embedded systems anyway. literally not a language problem.
>Not unless you and your team follows a strict style guide that tells which features you can use.
guidelines are just common sense, they only exist because companies keep hiring uneducated retards to drive wages down.
>what is const
>what is constexpr
definitely not C
have you ever programmed an embedded device? people don't use C because they want, most of the time they do because there are no C++ (or any other language) compilers for those ISAs, and companies are too greedy to spend money developing a C++ compiler just for constexpr and consteval.
internet 1337 h4x0rs praising handwritten vtables are mentally ill
>It's true that C++ compilers are better than C ones, but 'C-style C++' is neither, and sort of an abomination. C developers may like that shit, but certainly no professional C++ developers do.
Mike Acton, former Engine Director at Insomniac Games, and current Principal Engineer at Unity, in charge of the complete overhaul they're doing of their backend to try and make their performance not dogshit. Well known advocate of Data-Oriented Design; one paraphrased quote from him is "we only hire C++ programmers because they're easy to find, then teach them to write C in C++ instead"
Kill this anon, he is refusing to accept C++ into his heart and soul. This is unacceptable. Not using [latest C++ features] and believing in the C++ committee is the highest form of sin. C++ is perfection, C++ is god. Programming anything without using at least 7 C++ features in it is bad coding practice.
honestly, we can agree to disagree if you don't want to discuss this any further since this kind of discussion is not very productive.
> OOP != C++.
true, but OOP is one of the biggest features of C++.
>vectors which are actually truly convenient
>do things behind your back (by design)
the reason why vectors are convenient is that they do the allocation/growing behind your back. vectors use more memory than C arrays because vectors allocate more memory than what is actually needed.
> literally not a language problem.
yes it is. if you use those language features or stdlib functions/types you have less precise control over how the memory is being used. memory is usually the most precious resource in embedded systems and adding more memory is going to increase the cost (per unit) of the final product.
I want to make a jigsaw puzzle engine for browsers, what's the best graphical interface to use for this? Was looking into WebGL but would like a second/third/nth opinion.
You literally have no other options besides just drawing shit on a 2D canvas directly, which is just going to be slower.
>the reason why vectors are convenient is that they do the allocation/growing behind your back
doing something that you explicitly asked for != doing something behind your back. all the documentation is pretty straightforward about memory allocation (which is mostly common sense, you shouldn't need documentation to know how std::vector works)
>vectors use more memory than C arrays
because they're different data structs that serve completely different purposes. if you don't need dynamic allocation, you should use std::array which is equivalent to C arrays but with more compile-time information and without the pitfalls (decay to pointers, etc).
>if you use those language features
not a language feature. constexpr is a language feature
>stdlib functions/types you have less precise control over how the memory is being used
then don't use them? do you really need someone to explain that to you? that you have to choose the right tools and not be a complete retard?
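the memory point is easy to demonstrate, btw: push_back over-allocates by design, while std::array is exactly its elements. a quick sketch:

```cpp
#include <array>
#include <cassert>
#include <cstddef>
#include <vector>

// std::array has no header and no spare capacity: it is its elements
// (true on every mainstream implementation, asserted here at compile time).
static_assert(sizeof(std::array<int, 8>) == 8 * sizeof(int),
              "std::array carries zero overhead");

// std::vector grows geometrically, so capacity() generally exceeds size()
// after a run of push_backs -- a deliberate time/space trade-off.
std::size_t capacity_after(std::size_t n) {
    std::vector<int> v;
    for (std::size_t i = 0; i < n; ++i) v.push_back(int(i));
    return v.capacity();
}
```

if the spare capacity matters (embedded), reserve() the exact size up front or use std::array.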
namespace and std, not malloc and free ok. praise bjarne
Keke. There's a recent video interview done with him. This lockdown kikery is obviously really wearing on him over the past year. I'm a bit concerned over his health now tbh.
BTW, you really seem to know what you're doing around the computer. Have you ever taught programming Anon? I have a thread intended to help complete newcomers get up to speed with C++ on our board, but so far no one's participated in it. I'm wondering if you have any tips on how to go about it properly?
I'm quite aware of Mike's positions, and saw his presentation as the closing keynote for CppCon14. Mike's kind of a pandering ass, and laughably his only real agenda that day was to spew out verbal clickbait over his disgust with the audience he was being paid to serve. Maybe warranted, maybe not. But disingenuous of him at best.
But he is exactly right about keeping the cache line occupied as a performance key. The C++ Committee -- indeed all the systems programming world -- was quite aware of how the prefetcher works long before Acton took the stage that day.
>actually, most big C projects do, in an inconsistent and inefficient way (see the linux kernel or any other C project with more than 20K LOC).
This. Double-nigger this. If it weren't for the fact we have to actually wade through these steaming piles of ad-hoc chicanery on a regular basis, these manchildren rants would be pretty laughable.
NIHS is basically a form of mental illness. One the rabid C crowd suffers from deplorably.
>it's just more verbose and specific C without a standard library
I thought it was supposed to be super hard and impossible to make programs with. Now I want to make a programming language that's just macros for assembly code and some syntax features.
Check out HLA or LLVM.
Very interesting but I don't know what it has to do with programming.
HLA as in High Level Assembly (check out The Art of Assembly Language, 2nd edition by Randall Hyde) If you don't like HLA, check out the 1st edition of The Art of Assembly Language instead.
I dont know shit about computers let alone languages that make things happen, but can anyone help me understand the whole ordeal behind people exploiting terry davis image/work on cuckchan and holy c itself?
What made holy c different from the other c stuff and languages/engines you guys use?
I honestly just want to understand how fucked in the head you have to be to exploit a dead person to sell ugly clothing merchandise and the retards buying them
The fuck are you talking about? How are people "exploiting" terry? What's going on?
> https://web.archive.org/web/20170204193807/http://www.templeos.org/Wb/Doc/Welcome.html (important)
> https://web.archive.org/web/20170204193809/http://www.templeos.org/Wb/Doc/Charter.html (important)
* Oracle in with <F7> for words or <SHIFT-F7> for passages. See tongues.
* x86_64, ring-0-only, single-address-map (identity), multitasking kernel with multicore support.
* Master/Slave MultiCore
* Free, public domain, 100% open source.
* 64-bit compiler/assembler for HolyC. Truly compiles, doesn't interpret. Just-in-Time and Ahead-of-Time compilation. With JIT, no need for object or exe files.
* No 32-bit krufty code.
* 640x480 16 color VGA graphics.
* Keyboard & Mouse support.
* ATA PIO Hard drives, support for FAT32 and RedSea file systems with file compression.
* ATAPI PIO CD/DVD support with ISO9660 file system. Can make bootable ISO9660 ISO files so you can roll-your-own distro's.
* Partitioning tool, installer, boot loaders for CD/DVD and hard disk.
* Editor/Browser for a new Document Format. Source files and the command line window can have graphics, links, icons, trees, colors, super/sub scripts, margins. Everything is seamless through-out the tool chain. No need for separate resource files.
* 8-bit ASCII, not just 7-bit. Supported in entire tool chain. <CTRL-ALT-a>
* Graphics in source code, no resource files, graphic sprite editor. <CTRL-r>
* 64-bit pointers. All memory, even more than 4Gig, can be directly accessed by all tasks on all cores at all times.
* Ring-0-only. Highest CPU privileged mode at all times. No off-limits insts. No time lost changing modes or address maps. Switches tasks in half a microsecond.
* 2D/3D graphics library
* Real-time fancy differential-equation solver for physics engines, to use in games. (Adaptive step-size Runge-Kutta, interpolated for real-time.)
* Auto-completion, jump-to-source tool called AutoComplete with Dictionary.
* Window Manager. Pan scrn with <CTRL-LeftDrag>. Zoom scrn on mouse cursor with <CTRL-ALT-z>/<CTRL-ALT-SHIFT-Z>.
* File Manager, <CTRL-d>.
* Code profiler, merge, diff utils.
* PC Speaker support with many hymns.
* Music composing tool.
* Many games, demos and documentation.
* All source code included. Only compiles with the included TempleOS compiler and assembler.
> I dont know shit about computers
Read K&R, Learn HolyC. Install TempleOS. Entertain Mr. God.
> What made holy c different from the other c stuff
Terry A. Davis improved the C language (Basically, Terry added some features and changed it so it's easier to use as shell language for TempleOS)
They are probably still selling TOS t-shirts and mugs. Terry said it's ok.
If you pick your programming language based on how small the statically linked hello world is you can't go wrong.
All the mystification of assembly as some impossible god-level shit pajeets made really gets to one's head, makes me wonder what else they managed to sneak into my thoughts.
t. programs 8086 assembly with JWASM on DOS.
I'm not sure if Ken Thompson was held at gunpoint while he worked on Go or if age got to him, but when he made C he understood what programming meant. He wrote a language that simultaneously provides only abstractions that cleanly and efficiently map into assembly, and provides all the useful abstractions a programmer might want.
This is the definition of a good programming language that modern languages just don't understand. Modern languages are about enabling ignorance and hiding the machine away like it's evil.
C on the other hand shows you that after the machine has what it wants, you can have what you want too, but only if a skilled language designer makes that his goal.
>unironically posting the trannyme book
Terry would call your shit out for being a FUCKING NIGGER
>pajeets made assembler
Why does the BO allow this diversity-hire tier glownigger shit here? Why hasn't he run them over yet?
>I can't read
calm down, wigger. it's just a joke.
hapas are a joke. degenerate faggots are not, except as udderly cow fodder.
>If you pick your programming language based on how small the statically linked hello world is you can't go wrong.
If everything you write is on that level, sure. I don't particularly care how my compiler optimizes trivial programs because that's not what I spend most of my time on.
>He wrote a language that simultaneously provides only abstractions that cleanly and efficiently map into assembly, and provides all the useful abstractions a programmer might want.
Except for overflow flags, proper strings and arrays, advanced control flow, useful macros, precise control over data layout... If you're looking for nonsense that silently sneaked into your thoughts, look no further than the deification that C went through. C was a product of substantial poverty (take a look at the machine Unix was made for) and perfectly appropriate at the time, but it's a joke of a language outside of that context. It wasn't even particularly efficient originally, all that came through later work and often clashed with the original design - that's especially visible if you compare K&R with the shameless pilpul over undefined behavior of today.
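Take the overflow flags: the hardware sets one on every add, but C never exposed it, and compilers only got around to it decades later via intrinsics. A sketch using the GCC/Clang builtin (a compiler extension, not ISO C or C++; MSVC spells it differently):

```cpp
#include <cassert>
#include <cstdint>

// Checked addition: reports wraparound instead of silently (or, for signed
// types, undefinedly) overflowing. __builtin_add_overflow compiles down to
// an add plus a branch on the CPU's overflow/carry flag.
bool checked_add(std::int32_t a, std::int32_t b, std::int32_t* out) {
    return !__builtin_add_overflow(a, b, out);
}
```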
>If you pick your programming language based on how small the statically linked hello world is you can't go wrong.
<small automatically == better
That depends. If you are on a tiny, tiny microcontroller or sensor where size is of utmost concern then sure, that's correct. However, most of us (and the rest of the world) care more about wallclock perf. In that case, increasing the memory footprint of the image in exchange for much, much higher performance is the proper course. Either C or C++ can allow you to make these kinds of tradeoffs, but few other languages can. Assembler doesn't count b/c I don't want to spend six months of my life managing to pull off something I can do instead in two seconds with -O3
As with all engineering, there are always tradeoffs to consider, with different ramifications. The only absolute guarantee we have going in is The 2nd Law of Thermodynamics. Beyond that, it's a progression of exploration & creativity.
Don't presume Anon, test.
>but when [Thompson] made C