According to the dictionary, religion is, among other things, "... something one believes in and follows devotedly; a point or matter of ethics or conscience."
If you want to change the meaning of religion, go ahead. But if you want to be honest about it, explain your definition.
If you mean organized religions specifically, say so. Otherwise, others will default to the dictionary definition, since that is what is generally accepted when no special meaning has been stated.
