In our country, this is the most common tradition or belief: while we're young, we are taught to study hard, finish school, and get a stable job. Once we have a stable job, we will start earning and improving our lives, making consistent progress until we end up rich and able to enjoy the best things in life.
But I don't think reality lines up with that belief. Instead of working to earn and get rich, why do the majority now end up struggling in life? Honestly, what I'm seeing right now is that we work to survive, not to make money and get rich in the process. Could this simply be the reality of life? Are we just blinded by fantasies? Or is it that working alone is never enough to build wealth, and we need various investments to achieve success and become wealthy in the long run?
I know different opinions and ideas may arise, but I'll be more than willing to read and reflect on each of them.