Reviews redux

Last month, I took you on a tour of Fakespot, the online service that spots “fake” reviews. Many of you provided feedback, and I am happy to report that the majority of you found your reviews to be in good shape. Quite a few of you did not, however, and we agree that seems unjust. What to do? Here is another site that can help you evaluate reviews: ReviewMeta. Click on the button below to test it.
This site provides a lot more detail in its feedback. Give it a try and let me know how it compares to Fakespot for your reviews.
So why all this hubbub about reviews? If you are a new author, you may have missed the back issues where I discussed the importance of accumulating 50+ reviews on Amazon, as well as reviews on other platforms. Once a book reaches 50+ reviews on Amazon, they start cross-promoting it. I suspect Amazon also runs an algorithm that checks the “quality” of the reviews to make sure they aren’t just a bunch of friends and family.
In closing, I’ve learned that Library Journal now deals directly with NetGalley for reviews. Rather than sending print copies to LJ reviewers, they require a NetGalley token! More on this in future issues.

The importance of high-quality reviews

It is so important to accumulate reviews for your books. As I have mentioned several times, we tend to use NetGalley for our fiction and Cision for our nonfiction to attract review opportunities. While we are not always happy with the quality of the reviews from NetGalley, we find that a good number of the readers do come through with something, eventually.
Of course, there are also the Goodreads and Amazon reviews that inevitably happen as the book is sold and read. Some of you also benefit from our opportunities with Publishers Weekly, Kirkus, or Library Journal when we use those channels — usually only when there is enough advance time to do so.
Despite all of these channels for reviews, we have found that the number one determinant of sales (besides category) is Amazon reviews: having at least some, and ideally 50 or more. Because Amazon is the largest bookseller, its reviews count the most; the traditional reviewers now have far less pull.
Since our last newsletter, we became aware of a rating service for Amazon reviews, and we were very surprised by the results from this platform. We ran several examples and share two below:
Sherry Knowlton’s Dead of Autumn has 75 Amazon reviews, well beyond our target. I chose Sherry’s book for that reason, not to pick on Sherry; in fact, every one of our books that we tried scored very low. It received an “F” letter grade for quality. Here’s why:
Analysis overview
Our engine has analyzed and discovered that 37.3% of the reviews are reliable.
This product had a total of 75 reviews on Aug 18 2018.
Interesting tidbit: the most used word by reviewers is book.
How are reviewers describing this item?
great, dead, down, next and first.
Our engine has profiled the reviewer patterns and has determined that there is high deception involved.
One of my favorite novels, Howard Frank Mosher’s The Fall of the Year, has only 18 reviews. It scored an “A,” however:
Analysis overview
Our engine has discovered that over 90% of the reviews are high quality.
This product had a total of 18 reviews on Aug 19 2018.
Our engine has profiled the reviewer patterns and has determined that there is minimal deception involved.
Our engine has determined that the review content quality is high and informative.
Interesting tidbit: the most used word by reviewers is book.
How are reviewers describing this item?
great, frank, wonderful, every and first.
I could go on and on with examples, but I wanted to make you aware of this site. Check your book(s) by copying their Amazon URLs into the analyze box. I have put the link below; just click on FAKESPOT. Let me know your results. This is something we need to analyze further. Obviously, if these kinds of algorithms are going to be used more and more, it is VERY important that we score average or better.