var has not grown on me. At least not yet.
The main rationale, or at least one of the rationales for it, was to increase readability.
For me it actually decreases readability, and that's the main reason it hasn't grown on me. When I see it, I have less confidence that I understand the code.
1.
Oracle provides some examples that are supposed to sell you on it:
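Quoting roughly from memory of JEP 286:

    var list = new ArrayList<String>();   // inferred as ArrayList<String>
    var stream = list.stream();           // inferred as Stream<String>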
As opposed to:
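    // the explicit-type version of the same code
    List<String> list = new ArrayList<String>();
    Stream<String> stream = list.stream();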
But now imagine you need to pass those variables to some other code that expects specific types as arguments. I personally feel I'd have to check what is actually inferred by var and whether that is what I actually need. Sure, you can later remove the explicit type to shorten the line once the details are settled, but to me that is not more readable in any sense.
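For example (render is just a made-up consumer to illustrate the point):

    import java.util.ArrayList;
    import java.util.Collection;

    public class RenderExample {
        // A hypothetical consumer that expects a specific type.
        static void render(Collection<String> items) {
            items.forEach(System.out::println);
        }

        public static void main(String[] args) {
            var list = new ArrayList<String>();   // I have to know this is an ArrayList<String>...
            list.add("hello");
            render(list);                         // ...to be sure it fits Collection<String>
        }
    }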
2.
Now, here is another example I experimented with:
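Something along these lines (a simplified sketch of the idea, not the exact code):

    var list = new ArrayList<>();   // compiles, but the inferred type is ArrayList<Object>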
It is kind of a de facto standard, but what you really want here is:
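That is, something like:

    var list = new ArrayList<String>();   // inferred as ArrayList<String>, which is what I meant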
Now, if I define the following methods, both compile and work, which is a bit puzzling; it looks like some sort of dynamic inference:
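A simplified sketch of the kind of thing that surprised me, assuming the diamond version above, where the element type comes out as Object:

    import java.util.ArrayList;
    import java.util.List;

    public class VarDiamond {
        // Both definitions compile, and both calls in main are accepted,
        // because 'list' is inferred as ArrayList<Object>.
        static void addString(List<Object> items) { items.add("text"); }
        static void addNumber(List<Object> items) { items.add(42); }

        public static void main(String[] args) {
            var list = new ArrayList<>();   // ArrayList<Object>, not ArrayList<String>
            addString(list);
            addNumber(list);
            System.out.println(list);       // [text, 42] -- feels almost dynamically typed
        }
    }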
I did not actually expect it to compile. Before I found out that both compile, I didn't know for sure which one would compile either.
I'm not sure whether to put that down to lower readability or simply to my own lack of understanding.
3.
Another one that just came to mind to try (it compiles):
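For instance, something in this spirit that does compile, since var is a reserved type name rather than a keyword:

    var var = "var";   // compiles: 'var' is not a keyword, so it can still be used as an identifier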
4.
Kotlin and Scala also have type inference, so it is not specifically Java's var that bugs me. I simply don't feel the need for auto-inference: I like types to be spelt out so that no ambiguities are left, and otherwise I just find the code harder to parse in my head.
On a slightly worse note, you can't use var consistently throughout the code. It can't be an instance variable and it can't be a parameter (understandably); it is only for local variables, so for me that makes the code look inconsistent.
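Roughly, the restriction looks like this (a small made-up class just to illustrate):

    public class Scope {
        // var message = "hi";        // does not compile: var is not allowed for fields

        // void greet(var name) { }   // does not compile: var is not allowed for parameters

        void greet(String name) {
            var greeting = "Hello, " + name;   // fine: local variable with an initializer
            System.out.println(greeting);
        }
    }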
I'm not against this change; I'm just trying to find a way to get used to it.
Do you have some other perspective I could look at as to why this feature might be liked by the wider community?