Is it because it would also depend on what that person's definition of "scaling" is?
Yes. I like that way of approaching the problem, by the way.
What is your definition or idea of what scaling Bitcoin should actually look like?
Scaling is directly related to compression. If you can use the same resources to achieve more goals, then that thing is "scalable". So, if the size of the block is 1 MB, and your "scaling" is just "let's increase it to 4 MB", then it is not scaling anymore. It is just pure, linear growth. You increase the numbers four times, so you can now handle 4x more traffic. But that is not scaling. Not at all.
Scaling is about resources. If you can handle 16x more traffic with only 4x bigger blocks, then that is somewhat scalable. But we can go even further: if you can handle 100x more traffic, or even 1000x more traffic, with only 4x bigger blocks, then the scalability is even better.
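To make that resource framing concrete, one way to measure it (my own illustration, not anything from the protocol) is the ratio of throughput gained to resources spent: a ratio of 1 is pure linear growth, and anything above 1 is actual scaling. A minimal Python sketch with made-up numbers:

[code]
def scalability_ratio(throughput_multiplier: float, resource_multiplier: float) -> float:
    """Throughput gained per unit of extra resources spent."""
    return throughput_multiplier / resource_multiplier

print(scalability_ratio(4, 4))       # 1.0   -> 4x blocks, 4x traffic: pure linear growth
print(scalability_ratio(16, 4))      # 4.0   -> 16x traffic on 4x blocks: somewhat scalable
print(scalability_ratio(1000, 4))    # 250.0 -> 1000x traffic on 4x blocks: much better
[/code]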
Also, scaling is directly related to intelligence. You can read more about the Hutter Prize, which was advertised by Garlo Nicon some time ago:
http://prize.hutter1.net/
You, ser, are correct! In the context of Bitcoin, real, actual scaling should be defined as "increasing network throughput without sacrificing decentralization". That's probably where the problem is, because different people on BitcoinTalk have different definitions of "scaling". To many, it merely means increasing the block size.
I want to know everyone's definition/idea about scaling.
I will add a bonus question: how do you measure whether your model is scalable or not? Write it as a function in big-O notation, or in any form you like. I support "constant-based scaling", meaning O(1) scaling: leaving current resources as they are and improving the algorithms built on top of them, for example through commitments.
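As a rough illustration of what I mean by O(1) scaling through commitments, here is a toy Python sketch of a hash-based batch commitment (purely hypothetical, not any specific Bitcoin proposal): no matter how many transactions you batch off-chain, the data you would have to commit on-chain stays a constant 32 bytes.

[code]
import hashlib

def commit(batch: list) -> bytes:
    """Fold a whole batch of off-chain transactions into one 32-byte commitment."""
    digest = hashlib.sha256()
    for tx in batch:
        digest.update(hashlib.sha256(tx).digest())
    return digest.digest()  # always 32 bytes, regardless of batch size

small_batch = [b"tx-%d" % i for i in range(10)]
huge_batch = [b"tx-%d" % i for i in range(100_000)]

print(len(commit(small_batch)))  # 32
print(len(commit(huge_batch)))   # 32 -> on-chain footprint stays O(1) in the batch size
[/code]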
If that question is just for me, then you're asking the wrong person. Although I would like to read the posts of smarter users on the forum, to learn and gain some insight.