Monday, May 11, 2009

Americans Increasingly Change Faiths

Research by the Pew Research Center shows that Americans increasingly leave the faith in which they were raised, either switching to another faith or abandoning religion altogether. The Pew study also includes an interactive section on the survey results.