I receive a lot of review books, but I have never once lied about a book just because I got a free copy of it. However, some authors seem to feel that if they send you a copy of their book for free, you should give it a positive review.
Do you think reviewers are obligated to put up a good review of a book, even if they don’t like it? Have we come to a point where reviewers *need* to put up disclaimers to (hopefully) save themselves from being harassed by unhappy authors who get negative reviews?
I don’t think that I should be obligated to give a positive book review. If the book is awful, it’s awful and I’m going to say that it’s awful. If I don’t honestly state my opinion and the basis for it, I will lose credibility with my readers. I should hope that people don’t have to put up disclaimers simply for stating their own opinion.