https://github.com/TazeTSchnitzel/Firth/commit/7b9bf0b4c090e...
Thanks for the idea! :)
-1234
which are distinguished by whitespace: the unary operator applied to 1234 versus a true -1234 constant. You'd have to actually run the language's parser in order to do the transformation without changing strings, and even then it'd only work if the parser output kept track of the original line and column so that you could know where to make the change.
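To make that concrete, here's a toy sketch in Python (not anything Firth actually ships): in a whitespace-separated, Forth-like language the lexer alone settles the question, because "-1234" is one token and "- 1234" is two.

```python
def tokenize(src):
    # In a whitespace-separated language, "-1234" is a single token
    # (a negative literal), while "- 1234" is two tokens: the minus
    # word followed by the positive literal 1234.
    return src.split()

def classify(tok):
    # Hypothetical classifier: minus alone is an operator; anything
    # that parses as an integer is a literal; the rest are words.
    if tok == "-":
        return ("op", "-")
    try:
        return ("int", int(tok))
    except ValueError:
        return ("word", tok)

# tokenize("- 1234") -> ["-", "1234"]   (operator, then literal)
# tokenize("-1234")  -> ["-1234"]       (one negative literal)
```

Any source-level rewriting tool that doesn't at least lex this way can't tell the two cases apart.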
This sort of text transformation is something I've long wished my text editor did. At a previous job the standard was three spaces of indentation, regardless of the language.
Emacs does this in some cases: specifically for camelCasedWords (http://www.masteringemacs.org/article/making-camelcase-reada...) and for the word 'lambda' which can be displayed as a symbol. There are many other modes which "overlay" some text over how it looked originally.
And I agree with you that hyphens are more readable. They're also good for adding extra semantic meaning - https://news.ycombinator.com/item?id=3978992
VAR_$(PATH) := whatever
where PATH could be path/to/foo-parser.o, say.Tab is a statically-typed, functional, type-inferred language that occupies a niche between bash and python.
It's also not Turing-complete but can compute almost everything you could ever think of.
(I wish more languages aimed for Turing-incompleteness -- unsurprisingly, it turns out Turing-incomplete languages have big benefits for performance and resource management.)
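One core trick behind many Turing-incomplete ("total") languages can be sketched in a few lines of Python (an illustration of the idea, not how Tab is implemented): if the only loop construct fixes its iteration count before the body runs, every program provably terminates, yet primitive-recursive functions remain expressible.

```python
def bounded_fold(n, f, acc):
    # The iteration count n is evaluated once, up front; the loop body
    # cannot extend it, so evaluation always halts.
    for _ in range(int(n)):
        acc = f(acc)
    return acc

def factorial(n):
    # Primitive recursion: thread an (accumulator, counter) pair
    # through exactly n bounded steps.
    return bounded_fold(n, lambda s: (s[0] * s[1], s[1] + 1), (1, 1))[0]
```

Because termination is guaranteed by construction, an implementation is free to make aggressive scheduling and resource decisions that a general-recursive language can't.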
Well... that's just, like, your opinion, man.
Seriously though. In Elixir, for example, much of the language itself is implemented via its own macros, which demonstrates a certain nice extensibility. If Elixir followed this same pattern, it would get really annoying really quickly, as even simple if statements would require a leading slash.
Also, I preferred "unf" ;)
But I can see just "knowing" at a glance if it's a macro or not.
I think the answer would basically be determined by how much of the language itself uses its own macro system AND what type of macro system it actually is. If it's significant, having special syntax would just look weird.
Why do you need to know at a glance whether it's a macro, when what actually matters is how it works (what it will expand to or do)?
While I do believe in limiting stuff for the sake of simplicity, this notation will actually discourage developers from using the macro system fully, simply because someone wants an enforced distinction between macros and other constructs in the code.
Google Cache Text-Only:
http://webcache.googleusercontent.com/search?q=cache:cOp3ebJ...
"Tulip is still in active development, and I could use a whole lot of help, both filling in the design gaps here and actually churning out the implementation"
https://github.com/jneen/tulip/blob/master/tulip/libedit.py
(RPython is the "static" subset of Python used to bootstrap PyPy)
That is incredibly disingenuous. The only way a Haskell or ML program could be as colossally unsafe as a unityped language program is if the programmer used only one giant sum type for the entire program, and most functions in the program were non-total with respect to that type.
"Unityping" provides no static type safety. It is isomorphic to, and usually a euphemism for, the lack of any static type system.
I would say this is more like Go than anything, though it seems to lack methods (and interfaces) and adds a functional syntax.
You're going to run into issues when extending polymorphism of built-in functions to user-defined types: imagine trying to sort values of an 'unknown' type with no way to compare them, short of modifying the sort method to be explicitly aware of the new type.
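The conventional fix (sketched here in Python, not how Tab actually handles it) is to invert the dependency: the sort routine never learns about new types; instead, each user-defined type opts into a comparison protocol that sort relies on.

```python
# sort() only needs an ordering, not knowledge of every type. A
# user-defined type supplies the ordering itself via __lt__.
class Version:
    def __init__(self, major, minor):
        self.major, self.minor = major, minor

    def __lt__(self, other):
        # Opt-in comparison protocol: compare component-wise.
        return (self.major, self.minor) < (other.major, other.minor)

versions = [Version(1, 4), Version(1, 2), Version(0, 9)]
versions.sort()   # works without sort() knowing what a Version is
```

This is essentially typeclasses/interfaces by another name: the built-in stays closed, the set of sortable types stays open.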
Are fifth graders critiquing programming languages now? Seriously, who makes that association and then feels the need to comment on it?