Depends on where you live.
If the job vacancies require a degree, then a university degree is pretty clearly important.
If you live in a country that values experience over degrees and other theoretical credentials, a university degree isn't that important anymore.
Many people I've met say the same thing to me: "the most important thing about school is that you can make friends and build relationships, and they will help you in the future." So I think university matters more for the social benefits than for what you learn, since you can learn that on the job.