But also, there's no reason to make those legal only inside classes. All it would need to do is make "def foo.bar" produce a different type of function, one that has the method-descriptor behavior (producing bound methods on attribute access) that is currently implemented directly on regular functions.
As far as the less vs. more common case goes - I think it's more important to optimize for obviousness and consistency. If "def foo" produces a function, it should always be a function, and functions should behave the same in all contexts. They currently don't: given class C and its instance I, C.f is not the same object as I.f, and only one of those two is what "def" actually produced.
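A quick way to see the asymmetry (using a lowercase i for the instance here):

```python
class C:
    def f(self):
        return "hi"

i = C()

# C.f is the plain function object that "def" created...
print(type(C.f).__name__)   # function
# ...but i.f is a bound method wrapping it, created on attribute access
print(type(i.f).__name__)   # method
print(C.f is i.f)           # False
print(i.f.__func__ is C.f)  # True
```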
What I meant by function references inside classes is this:
class Foo:
    pass

Foo.bar = lambda: 123
foo = Foo()
print(foo.bar())
This blows up with "TypeError: <lambda>() takes 0 positional arguments but 1 was given", because the lambda is of type "function", and so it gets the magic treatment when it's read as a member of the instance. So you have to do this:

Foo.bar = staticmethod(lambda: 123)
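For completeness, the workaround spelled out as a runnable sketch:

```python
class Foo:
    pass

# staticmethod suppresses the function-to-method binding,
# so the lambda is returned unchanged on attribute access
Foo.bar = staticmethod(lambda: 123)

foo = Foo()
print(foo.bar())  # 123
```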
and even then this is only possible when you know that the value is going to end up as a class attribute. Sometimes you don't - you pass a value to some public function somewhere, and it ends up stashed away as a class attribute internally. And it all works great, until you pass a value that just happens to be another function or lambda. On the other hand, this only applies to objects of type "function", not to all callables. So e.g. this is okay:
Foo.bar = functools.partial(lambda x: x, 123)
because what partial() returns is not a function. Conversely, this means that you can't use partial() to define methods, which can be downright annoying at times. Suppose you have:

class Foo:
    def frob(self, x, y): ...
and you want to define some helper methods for preset combinations of x and y. You'd think this would work:

class Foo:
    def frob(self, x, y): ...

    frob_xyzzy = functools.partial(frob, x=1, y=2)
    frob_whammo = functools.partial(frob, x=3, y=4)
except it doesn't - while frob_xyzzy() and frob_whammo() both take the explicit "self" argument, they aren't "proper" functions, and thus that argument doesn't get treated as the implicit receiver:

foo = Foo()
foo.frob(x=0, y=0)    # okay
foo.frob_xyzzy()      # TypeError: frob() missing 1 required positional argument: 'self'
foo.frob_whammo(foo)  # okay!
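(For what it's worth, the stdlib's answer to this particular case is functools.partialmethod, which does participate in the descriptor protocol - a sketch, with frob returning its arguments so the binding is visible:)

```python
import functools

class Foo:
    def frob(self, x, y):
        return (x, y)

    # partialmethod implements __get__, so self is bound as usual
    frob_xyzzy = functools.partialmethod(frob, x=1, y=2)
    frob_whammo = functools.partialmethod(frob, x=3, y=4)

foo = Foo()
print(foo.frob_xyzzy())   # (1, 2)
print(foo.frob_whammo())  # (3, 4)
```

But that only papers over this one spot: you have to know, at the point of definition, to reach for a different type than the partial() you'd use everywhere else.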
Which is to say, this is all a mess of special cases. You can argue that none of it is really observable in the "common case" - the problem is that, as software grows more complex, the uncommon cases become common enough that you have to deal with them regularly, and then those inconsistencies add even more complexity into the mix - just when you already thought you had your plate full.