I think that "OOP" is an incredibly overloaded term which makes it difficult to speak about intelligibly or usefully at this point.
Are we talking about using classes at all? Are we arguing about Monoliths vs [Micro]services?
I don't really think about "OOP" very often. I also don't think about microservices. What some people seem to be talking about when they say they use "OOP" seems strange and foreign to me, and I agree we shouldn't do it like that. But what _other_ people mean by "OOP" when they say they don't use it seems entirely reasonable and sane to me.
Using classes hasn't been a part of the definition of OOP since the Treaty of Orlando. Pre-ECMAScript-2015 JS is a mainstream OOP language that doesn't have classes, just prototypes. (Arguably ECMAScript 2015 classes aren't really classes either.)
It sounds funny to say that JavaScript classes aren't classes, but if you don't believe me, just ask JavaScript in your browser console (Ctrl-Shift-K in Firefox):
I think in terms of language features and patterns which actually mean something. OOP doesn't really mean anything to me, given that it doesn't seem to mean anything consistent in the industry.
Of course I work with classes, inheritance, interfaces, overloading, whatever quite frequently. Sometimes, I eschew their usage because the situation doesn't call for it or because I am working in something which also eschews such things.
What I don't do is care about "OOP" as a concept in and of itself.
Well… I don't understand how you can read a confused and muddled article by someone who doesn't want to know the difference between Java™ and one of its notable choices among the many dimensions of language design, and not wish to be a little more enlightened as to the difference between hiring an OOP monkey or a VMware jockey to smash some bits about. The article is like a poster child for taking an hour to learn what your profession is about.
Respectfully, it's not clear to me what you're saying. You're clearly displeased with both the author of the article and with me, but beyond that, I'm not sure what your thesis is.
We give features of a profession names so we can refer to them independently and have reasonable discussions that don't confuse people as to which traits we think something that is neither Java nor Python (and so need not match either of them on every dimension) should have.
For example, I hate Java because of OOP, but strong typing can make a lot of what's bad in a language tolerable. Does the writer of the article agree with me? They don't even seem to know whether they do.
In a comment about nailing the jargon, it's kinda ironic that you probably meant static typing rather than praising Java for its mediocre strong typing.
Technically maybe yes, but that would lump me in with some of my colleagues who like explicit static types. I like type-inference systems but loathe type confusion.
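For what it's worth, the strong-vs-static distinction you two are circling fits in one console snippet. JavaScript is both dynamically and weakly typed, so it coerces across type boundaries where a strongly typed language (static or not) would raise an error:

```javascript
// Weak typing: implicit coercion silently crosses type boundaries.
console.log("1" + 1);  // "11" (the number is coerced to a string)
console.log("2" * 2);  // 4    (the string is coerced to a number)

// A strongly typed dynamic language (e.g. Python) raises a TypeError
// on "1" + 1 instead; a statically typed one rejects it at compile
// time. Strong/weak and static/dynamic are independent axes.
```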