The good news is that Liliputing appears to have recovered much (but not all) of its search engine traffic. The bad news is that I have no idea if any of the hundreds of changes I made to the site had any impact at all, or if I just wasted an awful lot of time.
For the past few years Liliputing has been one of the top resources on the web for news and information about netbooks, tablets, and other affordable mobile computers. But a few months ago search engine traffic started falling off a cliff, and the drop seemed to coincide with Google’s launch of a new search algorithm code-named Panda.
While Panda was designed to help separate high-quality websites from low-quality ones, there are plenty of examples of false positives. I could be biased, but I’m pretty sure Liliputing was among them. The site is full of original articles, including dozens of the most detailed computer reviews you’ll find anywhere on the web. We never copy and paste content from other sites, and in fact, I’ve avoided even posting press releases or block quotes longer than a few words.
Anyway, I’ve been documenting the steps I’ve taken to try to convince Google that Liliputing is a “high quality” website — and in fact, some of those changes really have improved the user experience, so I’m glad I made them. For instance, there’s no reason you should be redirected to a new URL every time you click on an image in a gallery.
A week ago I started to notice that search engine referrals were close to what they had been a few months ago. I was cautiously optimistic, but figured I’d wait a few days before declaring victory.
The last few changes I’d made to Liliputing really seemed like they might have made a difference. I reduced the number of links in the sidebar, header, and other areas of the site, so that overall there were far fewer links on each page. I used Google’s new tool for specifying how URL parameters should be handled, to cut down on pages that were essentially being indexed twice. And I identified every post on the site with fewer than 100 words and either added a “noindex” attribute or updated the post with new information or more details.
That last change was one I’d been putting off because it just seemed so daunting for a website with more than 7,000 posts. As it turns out, I only had to edit a few hundred articles — and when I did, I realized that some weren’t just light on information — they were also outdated.
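That audit step, by the way, is easy to script rather than eyeball. Here’s a minimal sketch in Python, assuming you’ve already pulled your posts into (title, body) pairs; the `find_thin_posts` helper is my own illustration, not anything WordPress or Google provides:

```python
def find_thin_posts(posts, min_words=100):
    """Return titles of posts whose body falls under the word-count threshold.

    Flagged posts are candidates for a "noindex" tag or a rewrite.
    """
    return [title for title, body in posts if len(body.split()) < min_words]

# Toy data standing in for a real post export
posts = [
    ("Asus Eee PC 901 review", "word " * 450),           # long, detailed review
    ("New netbook spotted", "short launch blurb here"),  # thin post
]
print(find_thin_posts(posts))  # → ['New netbook spotted']
```

With a real export you’d feed in stripped post text rather than raw HTML, but the idea is the same: let the script produce the short list, then decide post by post whether to expand or noindex.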
That little exercise has changed the way I think about blogging. While Liliputing is still largely a news site, every time I write a new article I try to imagine what someone would think if they found this post 3 years from now. Would it still offer useful information?
Because search engines don’t surface a series of posts in chronological order the way they were written, you have to assume that it’s possible someone will find your article with absolutely no context. That means you should give them enough information in each post to fully understand the topic — or at least links to follow to get more information.
This doesn’t mean you need to post a full review each and every time you mention a tablet or netbook. But it does mean it’s not enough to just mention the model number and assume everyone knows what you’re talking about.
Anyway, when traffic started to pick up almost immediately after I made those recent changes, I figured I’d finally cracked the code. After all, Google’s only real advice for dealing with Panda so far had come down to: write better content, and get rid of low-quality content.
Today I found out that Google rolled out a Panda 2.3 update at pretty much the exact time that Liliputing was regaining its search engine traffic.
So while I’m glad I made many of the changes I did, it’s possible web traffic would have recovered anyway.
It’s also possible that Liliputing only started to recover because I made those changes and Google rolled out a Panda update that recognized them.
From what I understand, Google pushes out Panda updates manually. So even if I made dramatic changes to my site in early July, just days after the Panda 2.2 update rolled out in June, Google may not have noticed until Panda 2.3 was ready to go.
In other words… I wish I could tell you I’ve cracked the Panda code, but I’m not sure that I have. All I know is that Liliputing is doing better than it was a week or two ago. The biggest thing I’ve learned is that we need to continue building an audience of loyal readers, by broadening the scope of coverage and using social media and other tools, so that the site gets enough traffic to keep running even when search traffic is on the low side.
Liliputing still hasn’t had a full recovery. I’ll be honest: I never monitored my traffic stats closely enough before Panda to know how much of my traffic came from Google and how much came from other sources. But overall traffic is a bit lower than it used to be. That could be from a decline in Google search. It could be because I’ve essentially told Google not to even bother looking at hundreds of pages on my site. Or it could just be that traffic tends to dip in the summer anyway.
Overall, something’s changed, and it would be nice to know whether I had anything to do with it.
As I watched my visits and page views climb over the past week, I imagined I’d write a post this weekend declaring victory and explaining how I managed to show Google that my site was a high quality site (which isn’t the same thing as gaming the system or tricking Panda… because Liliputing really does have thousands of pages of high quality, well-researched, original articles).
Instead, all I can say is that I’m happy things are improving, and that if you’re lucky and/or have worked hard, maybe you’ll see the same thing on your site.
Update: So it occurs to me that there’s a framework under which this all makes sense: Google is intentionally trying to kill SEO (Search Engine Optimization).
That would explain why there have been so few stories about publishers making changes to their sites and then recovering after they were hit by Panda. It would also explain why most recovery stories that we do hear seem to happen at a moment when Google is rolling out a major algorithm change: because it’s not us… it’s Google.
In other words, Google doesn’t want webmasters to make specific changes to their sites in order to make their content easier for search engines to discover. It’s Google’s job to find the best content, no matter how your web pages are formatted. Instead, Google wants publishers to focus on creating good content. Period.
Which is all fine and dandy. But when Google changes its algorithm in a way that suddenly penalizes publishers who have been cranking out good content for years, the only rational response is to try to figure out why they’ve been penalized and take specific steps to get back into Google’s good graces.
I never paid much attention to SEO before Panda. I figured if I continued to publish content that was of value to my readers, Google would find it and direct even more readers to it. Once Panda hit, I suddenly had to start looking for SEO best practices.
If you’d asked me a few months ago whether I would be OK with Google rolling out an update that specifically makes it tougher for people to game the system using SEO practices, I would have said sure. But if you’d told me that the way Google was going to do this was to make dramatic changes aimed at separating high-quality sites from low-quality ones, and that there was a decent chance some websites would be caught in the seventh circle of Hell by accident for up to 5 months, I probably would have felt differently.
Great report man, glad you're recovering! Panda 2.3 is my bet, but I'm sure your site only became better due to having to make all these changes and will benefit in the long run.
Yep… although honestly, I probably removed more content from Google's Index than I needed to, and deleted a number of pages from my site altogether.
It's possible if I'd done absolutely nothing and hadn't spent the last few months trying to guess what Google wants, traffic would be even higher right now.
Still, it's good to see that Google is specifically looking for ways to avoid false positives when identifying low quality sites.
At UMPCPortal I've seen no changes that I can attribute to Google. In general, traffic is down although on a per-post basis, I'm seeing much the same as I had before.
The same applies to Carrypad.
My only conclusion is that breaking news and controversial posts, along with the usual reviews and how-tos, are the best-value posts in terms of business. My problem is that it's becoming more difficult to break news, and I'm not the type that creates controversial posts just for the traffic.
Having been temporarily booted from Google News I also recently discovered that being first (or early) with a story isn't worth much without Google's help. You need to find stories nobody else is likely to find or confirm on their own and then get larger sites to link to you.
Even that isn't worth what it once was, because most sites will summarize your story so thoroughly that few readers will follow the link back to your site (although if you create a YouTube video, at least they might embed it).
All of which is to say, we're at Google's mercy – or we need to build rabid fanbases somehow.
I sometimes regret that my big idea for mobiputing was to launch a platform agnostic blog about mobile apps. It's much easier to get readers to identify with your brand when you're writing about the device or operating system they already use and love.
The summarization of articles is becoming a real issue. I totally agree with your thoughts on that. Any ideas on how to limit it?
Nope – especially because if they simply said "hey, there's a cool article at carrypad, go check it out," then they would probably be the ones penalized by Google for running low-quality, aggregated content.
Overall, I think what we need to do is create such strong identities for our sites that once people interested in a particular topic find us through Google, a link from another site, or any other means, they feel compelled to bookmark, subscribe, or take other action to keep coming back.
I finally got around to setting up Facebook and Twitter feeds for my sites a few months ago, realizing they're the new RSS. Every day I get a few new followers/fans. It's slow going, and it will be months before I top 1,000 on either network at this point – the process of gaining my first thousand RSS subscribers in 2008 was much faster. But at least the number is going up instead of down, which means the content still strikes a chord.
Perhaps broader… or narrower content would attract even more people. Watching newer niche sites like Android Police and new broad-topic sites like The Verge is instructive, although The Verge is clearly playing with a loaded deck since its all-star writing team brings a lot of existing fans to the table.
I checked Analytics yesterday and didn't find any kind of traffic drop.
@Brad,
Thanks for the top tips. I wondered if you'd be able to give an update following the Panda 2.4 and 2.5 updates please? The last one is the one that got me!
Mark