There has been a notable shift in Hollywood: films that once merely mocked Christianity now increasingly portray it as a force of evil.