It's true that languages are abstractions, but not all abstractions are useful.
member this.foo(x, y) = ...
Again, "this" is just an identifier here, and doesn't have any special meaning.It's more elegant firstly because it follow use, and secondly because it means that "def foo(x)" has the same meaning both inside and outside of a class declaration - it's just a function, and there's nothing special about its first argument. As it is, we need stuff like @staticmethod and @classmethod. It's especially annoying when you have class attributes that happen to reference a function, because conversion from functions to methods is a runtime thing - so you have to remember to wrap those in staticmethod() as well.
You would need those even with your suggestion (except you could drop @staticmethod if you add another layer of magic so that methods declared without a leading identifier were assumed static; you'd still need @classmethod or some equivalent mechanism to distinguish which non-static methods were class vs instance methods).
No it doesn't. Currently `def foo(self): pass` is called as `instance.foo()`. You're suggesting that `def self.foo(): pass` would be called as `instance.foo()`, except now it looks like self and instance are syntactically related in ways that they aren't.
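To make the current rules concrete (a minimal sketch with an invented `Greeter` class, not from the thread): the receiver is just the first parameter, `self` is pure convention, and the caller never passes it explicitly.

```python
class Greeter:
    # "self" is only a convention - any identifier works as the first
    # parameter, because binding is positional, not tied to the name.
    def greet(this, name):
        return "hi " + name

g = Greeter()
# the instance is passed implicitly as the first argument...
assert g.greet("bob") == "hi bob"
# ...which is sugar for the explicit call through the class:
assert Greeter.greet(g, "bob") == "hi bob"
```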
> Again, "this" is just an identifier here, and doesn't have any special meaning.
But the grammar is no longer LL(1), and you have weird conditionally valid syntax, like `.` in a function name is valid only in a class block.
> "def foo(x)" has the same meaning both inside and outside of a class declaration
This is a stretch, especially since you're now optimizing for the uncommon case. Staticmethods are rare compared to instance methods (I'll go further and claim that static methods are an antipattern in python: modules are valid namespaces, so you can stick a function in a module, couple it to a class also defined in that module, and nothing bad will happen. Banning staticmethods entirely wouldn't reduce expressiveness). Aligning staticmethods with functions, instead of aligning instance methods with functions (as python does currently), encourages you to do the wrong thing.
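A sketch of the module-as-namespace alternative being described (module and names invented for illustration):

```python
# mymodule.py (hypothetical) - a plain module-level function coupled to a
# class defined in the same module, instead of a @staticmethod on the class.
def default_greeting():
    return "hello"

class Greeter:
    def greet(self):
        # the module-level function is directly reachable; no wrapper needed
        return default_greeting()

assert Greeter().greet() == "hello"
```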
> classmethod
Your changes don't affect classmethod at all, if anything they'd make classmethod more of a special case. How do you signal that `self.foo()` takes `self` as the class instead of the instance?
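For reference, this is what @classmethod currently does with the receiver (a minimal sketch with invented names): the first argument is the class the method was looked up on, which the proposed `self.foo()` spelling has no way to express.

```python
class Base:
    @classmethod
    def create(cls):
        # cls is the class the method was looked up on, not an instance
        return cls()

class Derived(Base):
    pass

# the receiver follows the lookup, so subclasses get subclass instances:
assert type(Base.create()) is Base
assert type(Derived.create()) is Derived
```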
> It's especially annoying when you have class attributes that happen to reference a function, because conversion from functions to methods is a runtime thing - so you have to remember to wrap those in staticmethod() as well.
What do you mean? Like
class Foo:
    @staticmethod
    def func():
        return 1
    a = func()  # calling the staticmethod object directly only works in Python 3.10+
I'll say again: staticmethods are an antipattern in python:

def func():
    return 1

class Foo:
    a = func()
works just as well, better in fact. Modules are great namespaces. Classes are more than namespaces, and if all you need is a namespace, you shouldn't use a class.

> because conversion from functions to methods is a runtime thing
I'd also quibble with this: it's a binding thing.
class Foo:
    def foo(self):
        return 1
    a = foo(None)
will work, and if you check, type(Foo.foo) is still just `function` - it's only when you access `foo` through an instance of Foo that the function is bound, and the result of that binding is a method object. This was different in python2, where Foo.foo and instance.foo were both "instancemethod" objects, but in python3, Foo.foo is a plain old function, and instance.foo is a method.

Specifically this means that if you can get your hands on the `method` constructor (like with `type(instance.foo)`), you can then do silly things like
class A:
    def foo(self): pass

instance = A()

def f(self):
    return 5

MethodType = type(instance.foo)
instance.f = MethodType(f, instance)  # bind the free function f to this instance
assert instance.f() == 5
and this will work - you'll have bound the function to the instance. Of course, if you stick an attribute on `instance` (or `A`) and reference `self.attribute` in the function, that will still work too. (This also lets you do things like bind a given function to a different instance of the class, but that's because the method constructor is essentially just partial() with some bookkeeping for class information.)

But also, there's no reason to make those legal only inside classes. All it needs to do is make "def foo.bar" produce a different type of function, one that has the method-descriptor-producing behavior that is currently implemented directly on regular functions.
As far as less vs more common case - I think it's more important to optimize for obviousness and consistency. If "def foo" is a function, it should always be a function, and functions should behave the same in all contexts. They currently don't - given class C and its instance I, C.f is not the same object as I.f, and only one of those two is what "def" actually produced.
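That difference is easy to check directly (a sketch using the stdlib `types` module):

```python
import types

class C:
    def f(self):
        return 1

i = C()
assert type(C.f) is types.FunctionType   # plain function in Python 3
assert type(i.f) is types.MethodType     # bound method, created on attribute access
assert C.f is not i.f                    # not the same object
assert i.f.__func__ is C.f               # the method wraps what "def" produced
# the binding itself is the descriptor protocol on function objects:
assert C.f.__get__(i, C)() == 1
```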
What I meant by function references inside classes is this:
class Foo:
    pass
Foo.bar = lambda: 123
foo = Foo()
print(foo.bar())
This blows up with "TypeError: <lambda>() takes 0 positional arguments but 1 was given", because the lambda is of type "function", and it gets the magic treatment when it's read as a member of the instance. So you have to do this:

Foo.bar = staticmethod(lambda: 123)
and even then this is only possible when you know that the value is going to end up as a class attribute. Sometimes you do not - you pass a value to some public function somewhere, and it ends up stashed away as a class attribute internally. And it all works great, until you pass a value that just happens to be another function or lambda.

On the other hand, this only applies to objects of type "function", not to all callables. So e.g. this is okay:
Foo.bar = functools.partial(lambda x: x, 123)
because what partial() returns is not a function. Conversely, this means that you can't use partial() to define methods, which can be downright annoying at times. Suppose you have:

class Foo:
    def frob(self, x, y): ...
and you want to define some helper methods for preset combinations of x and y. You'd think this would work:

class Foo:
    def frob(self, x, y): ...
    frob_xyzzy = functools.partial(frob, x=1, y=2)
    frob_whammo = functools.partial(frob, x=3, y=4)
except it doesn't - while frob_xyzzy() and frob_whammo() both have the explicit "self" argument, they aren't "proper" functions, and thus that argument doesn't get treated as the implicit receiver:

foo = Foo()
foo.frob(x=0, y=0) # okay
foo.frob_xyzzy() # TypeError: frob() missing 1 required positional argument: 'self'
foo.frob_whammo(foo) # okay!
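For what it's worth, the stdlib already papers over exactly this case: functools.partialmethod produces an object that participates in method binding, where plain partial does not (a sketch, reusing the hypothetical Foo from above):

```python
import functools

class Foo:
    def frob(self, x, y):
        return (x, y)
    # partialmethod, unlike partial, takes part in method binding:
    frob_xyzzy = functools.partialmethod(frob, x=1, y=2)

foo = Foo()
assert foo.frob_xyzzy() == (1, 2)  # self is passed implicitly, as expected
```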
Which is to say, this is all a mess of special cases. You can argue that none of it is really observable in the "common case" - the problem is that, as software grows more complex, the uncommon cases become common enough that you have to deal with them regularly, and then those inconsistencies add even more complexity into the mix - just when you already thought you had your plate full.