One can interpret this to mean that man can do whatever he wants with the animals on earth, that God put them here as our playthings and as food.
True. And once upon a time, powerful men used violence and repression to impose the opinion that they were endowed by their Creator with the right to have dominion over other, lesser men, to treat those beneath them in any way they chose, to own other human beings as property, and to demean women solely as sex objects or as incubators for their male heirs. These powerful men maintained that the principles of patriarchy and "Divine Right" were thoroughly justified by Biblical passages.
As time went by, we humans came to understand that these beliefs were not in line with Divine Truth. Among the first nations on earth to embrace this understanding was the United States.
Would it be altogether surprising, then, to find that the US is taking a leadership role in dispelling the notion that humans have a "Divine Right" to treat other animals as they please, a notion based on a single verse in the Bible?
US Christians are adopting a new paradigm