You still don't get it. It's not about the values that a religion brings, but simply the facts that I think should be taught. When did it start? What do they believe? Where do they pray? What is the cultural significance?
Treat it like a history class. Maybe if everyone were taught properly about religions, there might be slightly less bigotry.
Are you from the US? Have you heard of how southern schools (Texas, I believe) basically influenced textbooks into becoming conservative literature that was then sold throughout the country? You really want public schools to teach anything religious, and do you really think it could be done without bias? I went to school in southern Missouri for a bit and listened to a teacher attempt to tell us how there were two different kinds of black people. Public school is a joke.