Is it odd that reading the Bible has made me even more critical of Christianity?
I've taken on the task of teaching myself about the world's major religions. I'm trying really hard to learn as much background and historical context as I can. I expected to better understand Christianity by reading the basis of its belief system...but it's only made me even more frustrated with it. Has anyone else had this experience?