I read somewhere that "doing a great job that you like is more important than the money earned". I don't think that's entirely true. Why do the job in the first place if you don't want to earn from it? Would you prefer to work all through the month and receive only a "thank you" from the company because you like the job? Can you go home with that thank-you and use it to take care of your family? C'mon, don't deceive yourself.