Emacs unfortunately uses Emacs lisp, not common lisp or scheme.
One important thing to realize is that different dialects of English have slightly different grammars.
One place where different dialects differ is around negation. Some dialects, like Appalachian English or West Texas English, exhibit ‘negative concord’, where parts of a sentence must agree in negation. For example, “Nobody ain’t doin’ nothin’ wrong”.
One of the most important things in understanding a sentence is figuring out the dialect of its speaker. The same goes for sentences with ambiguous terminology like “he ate biscuits” - were they cookies, or something that looked like a scone? Rules are always contextual, based on the variety of the language being spoken.
English definitely has rules.
It’s why you can’t say something like “girl the will boy the paid” to mean “the boy is paying the girl” and have people understand you.
Less vs fewer, though, isn’t really a rule. It’s more an 18th century style guideline some people took too seriously.
No.
There are two types of grammar rules. There are the real grammar rules, which you intuitively learn as a kid and don’t have to be explicitly taught.
For example, any native English speaker can tell you that there’s something off about “the iron great purple old big ball” and that it should really be “the great big old purple iron ball”, even though many aren’t even aware that English has an adjective precedence rule.
Then there’s the fake rules like “ain’t ain’t a real word”, ‘don’t split infinitives’ or “no double negatives”. Those are trumped-up preferences, often with a classist or racist origin.
The idea of a Pokémon clone isn’t protectable, but existing Pokémon are.
You can make a Pokémon clone with entirely novel monsters, but if a judge thinks they look too much like an existing Pokémon they’re gonna have a problem.
Yeah, projects also exist in the real world and practical considerations matter.
The legacy C/C++ code base might slowly and strategically have components refactored into Rust, or you might leave it.
The C/C++ team might be interested in trying Rust, but have to code urgent projects in C/C++.
In the same way that if you have a perfectly good felling axe and someone just invented the chainsaw, you’re better off felling that tree with your axe than going into town, buying a chainsaw and figuring out how to use it. The axe isn’t really the right tool for the job anymore, but it still works.
C is not how a computer truly works.
If you want to know how computers work, learn assembly and circuit design. You can learn C without ever thinking about registers, register allocation, the program counter, etc.
Although you can learn assembly without ever learning about e.g. branch prediction. There’s tons of levels of abstraction in computers, and many of the lower level ones try to pretend you’ve still got a computer from the 80s even though CPUs are a lot more complex than they used to be.
As an aside, I’ve anecdotally heard of some schools teaching Rust instead of C as a systems language in courses. Rust has a different model than C, but will still teach you about static memory vs the stack vs the heap, pointers, etc.
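To make that concrete, here’s a small sketch (my own, not from any particular course) of the memory regions such a Rust systems course would cover - static memory, the stack, and the heap - plus references, which are pointers the borrow checker validates at compile time:

```rust
// Static memory: baked into the binary, lives for the whole program.
static GREETING: &str = "hello";

fn main() {
    // Fixed-size array: lives on the stack.
    let on_stack: [u8; 4] = [1, 2, 3, 4];

    // Vec/Box allocate on the heap; the Box is freed when it goes out of scope.
    let on_heap: Box<[u8]> = vec![1, 2, 3, 4].into_boxed_slice();

    // A reference is a pointer, but unlike C, its validity is checked
    // at compile time by the borrow checker.
    let first: &u8 = &on_stack[0];

    println!("{GREETING}: stack={:?} heap={:?} first={}", on_stack, on_heap, first);
}
```

Different model than C, as I said, but all the same concepts show up.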
Honestly, if I had to write some systems software, I’d be way more confident in any Rust code I wrote than C/C++ code. Nasal demons scare me.
Right tool for the job, sure, but that evolves over time.
Like, years back carpenters didn’t have access to table saws that didn’t have safety features that prevent you from cutting off your fingers by stopping the blade as soon as it touches them. Now we do. Are old table saws still the “right tool for the job”, or are they just a dangerous version of a modern tool that results in needless accidents?
Is C still the right tool for the job in places where Rust is a good option?
Keep in mind, though, AI progress is often more like punctuated equilibrium.
Each new approach gets you much further, and polishing each approach gets you slight improvements until the next approach comes along. Improvements to chatgpt might plateau until the next big breakthrough architecture. Or maybe not.
MicroSD cards are better here. They’re about 250 mg each; a pigeon can carry about 75 g. That’s 300 microSD cards, ignoring the weight of the SD card enclosure.
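A back-of-the-envelope check of those numbers. The 250 mg card weight and 75 g payload are from above; the 1 TB per card and 1-hour flight are made-up assumptions just to show the resulting bandwidth:

```rust
// Integer division is fine here: 75 g = 75,000 mg, 250 mg per card.
fn cards_per_pigeon(payload_mg: u64, card_mg: u64) -> u64 {
    payload_mg / card_mg
}

fn main() {
    let cards = cards_per_pigeon(75_000, 250);
    println!("{cards} cards"); // 300, matching the comment

    // Hypothetical: 300 cards x 1 TB each, delivered over a 1-hour flight.
    let terabytes = cards; // assumes 1 TB per card
    let gbit_per_s = terabytes * 8 * 1000 / 3600; // TB -> gigabits, over 3600 s
    println!("~{gbit_per_s} Gbit/s");
}
```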
It’s a real quote, from the 80s, published in a networking textbook.
It’s amusing, but it’s always been a serious and occasionally practical observation.
IPoAC is a joke about printing actual IP packets, sending them by pigeon, then scanning them.
You do the whole usual TCP SYN/ACK thing, but with pigeons.
It’s not the same as ‘sneakernet, but strapping microsd cards to a pigeon’. It’s way, way sillier.
Levelized cost averages the fixed costs over the lifetime of the generating asset.
They’re generally comparing utility scale installations, not home rooftop solar, though.
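A minimal sketch of what “levelized cost” means in practice: discounted lifetime costs divided by discounted lifetime generation. The cost and output figures below are invented for illustration, not real solar numbers:

```rust
// LCOE = sum(cost_t / (1+r)^t) / sum(energy_t / (1+r)^t)
fn lcoe(costs: &[f64], energy_mwh: &[f64], discount_rate: f64) -> f64 {
    let discounted = |v: &[f64]| -> f64 {
        v.iter()
            .enumerate()
            .map(|(t, x)| x / (1.0 + discount_rate).powi(t as i32))
            .sum()
    };
    discounted(costs) / discounted(energy_mwh)
}

fn main() {
    // Year 0 carries the capital cost; later years are O&M only.
    let costs = [1_000_000.0, 20_000.0, 20_000.0, 20_000.0];
    let energy = [0.0, 4_000.0, 4_000.0, 4_000.0]; // MWh generated per year
    println!("${:.0}/MWh", lcoe(&costs, &energy, 0.05));
}
```

The point of the discounting is that upfront capital and far-future output aren’t worth the same per dollar/MWh, which is why you can’t just divide sticker price by nameplate output.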
I’ve never actually used PHP.
Does it still have random Hebrew in error messages like ‘unexpected T_PAAMAYIM_NEKUDOTAYIM’?
Prescriptivism is mostly just an unprincipled mishmash of shibboleths someone pulled out of their rear end hundreds of years ago, classism, and knee-jerk reactions against language change.
For example - why do people distinguish less vs fewer to refer to countable vs uncountable nouns? Because someone wrote in 1770 that they thought that distinction was elegant, despite not actually reflecting the way English at the time was spoken.
Why is ain’t “not a word”? Because it originated in the speech of poor people, and was used less commonly by rich people. People roll their eyes at new business-speak because it comes from rich, powerful people, but look down their nose at language innovations from poor hillbillies and other disfavored groups.
And you can find writings from old prescriptivists complaining about literally every change in the language, such as hating the new ambiguous use of singular ‘you’ when ‘thou’ was perfectly good and unambiguous, or hating phrases like ‘very pleased’.
The old joke is that C++ is an octopus made by nailing legs to a dog.
So it should probably be a rifle-chaku made by connecting two Garands with a chain.
C# vs Java is also really weird since C# started out as basically a Java clone.
Reddit very much depends on the subreddits you subscribe to.
Browsing /r/askhistorians or /r/programming isn’t really the same experience as r/memes or whatever. Not logging in to reddit makes it way worse since you only see the popular low-effort threads instead of better niche content.
Would you really rather see “</Foo>” than “)”? There’s a reason why most popular languages use “}” rather than “end if” or “fi”. The added verbosity doesn’t actually help people read your code more than e.g. indentation or editors with paren matching or rainbow parens.
Python virtual environments feel really archaic. It’s by far the worst user experience I’ve had with any kind of modern build system.
Even a decade ago in Haskell, you only had to type “cabal sandbox init” once, rather than “source virtualenv/bin/activate.sh” every time you cd to the project dir.
I’m not really a python guy, but having to start touching a python project at work was a really unpleasant surprise.
Emacs is a bunch older than common lisp.
One of its more idiosyncratic design decisions was using dynamic scope, rather than lexical scope. They did add in per-file lexical scope, though.
It also just doesn’t implement a lot of common lisp’s standard library.