There is no reason to gensym all of your concepts like this. It is different just for the purpose of being different: apparently you can't sell people on a "revolutionary technology" without appearing to be extremely different.
Nock is also not a good virtual machine. Recognizing blessed sequences of bytecode and replacing them with opaque blobs of code is not a valid approach to optimization. No one can actually run a pure Nock VM, so what is the point of having Nock in the first place?
Someone else on HN gave the best summary of Urbit I've seen yet: an elaborate cup and ball game, meant to give the impression of innovation and technical excellence.
> We should note that in Nock and Hoon, 0 (pronounced "yes") is true, and 1 ("no") is false. Why? It's fresh, it's different, it's new. And it's annoying. And it keeps you on your toes. And it's also just intuitively right.
Focus on the intuitively right.
I think it makes sense for 0 to be yes/true.
Think of Unix exit codes, and Go's error returns.
No news is good (yes/true) news.
edit: as to the "annoying" that's just Yarvin being cheeky.
I really don't think it is too serious.
Zero is good for "absence of failure". It's a "yes" in the sense that "this worked". Whereas any one of the myriad non-zero codes is "no, it didn't work".
If we have a situation in which there are many numeric codes, exactly one of which means "success", it's probably best if that one is assigned zero.
Punning means that we have two types, and we somehow interchange them; use an object of type A, as if it were of type B. We thereby leave behind the type system and take responsibility for that being correct.
If a language doesn't have a Boolean type, and some other type serves for indicating truth/falsehood, then that is a representational technique distinct from punning.
A language with no Boolean type can be statically typed. It just means that the conditional operators work with some non-Boolean, like integer, yet according to well-understood rules. For instance if we have some "if expr foo bar" such that either foo or bar is evaluated based on whether or not expr is zero. This would be statically well-typed if expr has integer type and foo and bar have the same type.
Of course, we can't do pattern-matching whereby a value is classified as Boolean or integer in separate cases. (Unless we use the language's type construction ability to define a Boolean type which wraps integer, or whatever).
I don't want to quibble over terminology, so fine: Hoon doesn't pun integers and booleans, it conflates them. Whatever you call it, it's still a horrible hack, and IMHO if you put in your language you surrender your snark privileges when comparing your language to Lisp.
Common Lisp conflates all types with Boolean. NIL, the sole instance of the NULL type in Common Lisp, is false, and an instance of every other type is true. (Of course, this situation is quite good.)
I think that's debatable. Javascript, for example, does something similar, but it takes the empty string as false, along with zero, NaN, null, undefined, and a few other things. (Not that I am necessarily holding up Javascript as an example of good language design. My point here is just that there is no consensus on how (or whether -- cf. Scheme) to conflate booleans with other types.)
The salient differences between CL's boolean conflation and Hoon's are:
1. CL's conflation design is justifiable in terms of how it simplifies coding recursion over a list, which is the single most important Lisp coding idiom.
2. Hoon tries to justify using zero as true by saying that non-zero integers can carry information about the nature of a failure. (At least I've heard some people try to justify it this way. I'm not sure if Curtis himself does.) But now you are no longer conflating integers with booleans, you are conflating integers with multiple enum types, one for every possible set of failure types. Flouting convention is one thing. Flouting convention in order to enable conflation of integers and enums undermines Hoon's claim of being strongly statically typed (at least in any interesting way).
CL's conflation also means that we can use NIL to indicate the absence of an object (in the usual situation in which NIL isn't a valid value). We can test for this absence glibly, using a conditional. This is at least as useful as the benefits in list recursion.
The Javascript design means that we cannot test for the absence of an integer in this manner, because a negative result could mean that a zero is present.
It could be handy from time to time. I propose a NAUGHT function for Lisp which returns true for empty sequences, empty hashes, zero (integer, real or complex) and any other nothing you can think of.
Sure, but if you want to make that argument then you have to deal with the fact that there is no false value which indicates the "absence of an object" in the case where the value being tested is a list. Having multiple false values really can be a feature. (They just shouldn't be integers!)
I'm sorry, but this is... madness. Referring to return codes, which are completely arbitrary in nature, as being intuitive, and thus 0 as true being intuitive, feels like a complete logic breakdown.
Yes, if this language existed in a complete vacuum, where no other languages existed and nobody had learned anything else first, then sure, 0 as true and 1 as false would be fine, I guess. Still no more or less intuitive, but one could agree to use it. But in a world where every other programming language uses 0 as false (or some equivalent), and where the chance that someone would use "nock" as his one and only programming language is very much 0%, this feels like utter bullshit.
Also, the versioning scheme based on Kelvin temperature? Seriously?
And then look at the Hoon syntax. No keywords, just ASCII symbols. Just look at it.
> Referring to return codes, which are completely arbitrary in nature, as being intuitive and thus, 0 as true being intuitive, feels like a complete logic breakdown.
Does the opposite approach make sense though? Generally, something being "true" means a program can happily move along. Something being "false" generally requires more introspection on why exactly it's false, i.e. error handling, exception handling, with the resulting changes in execution flow.
It makes sense because it's de facto standard, and intuition is experience, and intuitive is what we're used to. I'm not sure how it can be explained more basically.
That's not an argument that scales well, as then we'd have a single programming language, anything else being un-intuitive. Every single non-obscure language that exists today had to go against the (previously acquired) intuition, introduce new concepts and then work its way up to the point where new concepts are the (newly acquired) intuition.
I think there is sound theoretical basis to pronounce inverting the traditional true/false bit identification a mistake. Yarvin as much as admits this in his Lambda Conference recording on YouTube.
It's OK to set yourself apart. But for some reason this feels very app.net in a way. No disrespect to the projects or creators, but the scope and ambitions were rather big.
App.net at least had a couple clear use cases in mind (twitter clone, notifications/pub/sub service), even if they weren't unique/compelling enough to sell people on the service.
AFAICT, this platform has a broad scope without any kind of clear vision as to what they want to do with it. Maybe I haven't found the right part of their website yet?
https://urbit.org/docs/hoon/advanced/