They say it's important to keep your teeth healthy, and I get that, but sometimes I feel like these dentists are really just about making money at our expense.
Thoughts?
Then don't go to them and let your teeth rot. Do you expect them to give you treatment for free or something?