Yes, the Bible should be taught in our schools, because understanding the Bible is necessary if we are to truly understand our own culture and how it came to be. The Bible has influenced every part of Western culture, from our art, music, and history to our sense of fairness, charity, and business.
"Some of you had big dreams, but you had some disappointments, and now you think it's over. But God is saying, 'Begin again.'"
— Joel Osteen