College exists to educate, not to guarantee that you earn more money. Sure, the qualifications enable you to get a higher-paying job because you'll be qualified for it, but that's only because without the college education you wouldn't know what you are doing.
Employers prefer qualified employees who know what they are doing, and of course, depending on the field, college graduates look better than the average high school graduate.
So no, college does not earn you more money, but it certainly does enable you to earn more money.