I agree with what OP is saying. Yes, this pandemic teaches us something about health. How can we be healthy if we don't care for our bodies? We all forget that health is the main thing in our lives, because without good health, we cannot do anything we want. Just look at a hospital, where so many people seek medicine to get well again. Covid-19 gives us an important lesson: we need to take care of our health and maintain a healthy body.