One of the most deeply held beliefs of humanity is that we are the absolute masters of the earth. According to this belief, nature in general and animals in particular exist only to serve us as we see fit. Can we kill animals for food, fun, or profit, or simply use them to advance our own interests? Do they exist only to satisfy our needs and whims? Do we owe any duty to animals, and if so, why? Does Christianity have anything to say on this topic? If we look at the beliefs and practices of Christian churches, do we find them to be based on the Bible or on cultural traditions? And what exactly does the Bible teach us about the treatment of animals?