Websites get most of their visitors through Google, so it makes sense to optimize for it. But the visitors themselves should have the best experience possible… and as it turns out, both look at pages in similar ways.
I recently described how I want to improve my art page – and the first tweaks seem to have helped. With this article I’m rolling out more changes, and I’ll explain why I think they will help both Google and visitors.
My blog creates duplicate content: articles appear under different links. They live under their own title, but are also listed under their tags (like the “Art Website” tag in this case). My front page used to show full-text articles – and one change I’m releasing today is that it now only shows previews.
It would be great to hear from you readers whether the change is an improvement – so far I can only rely on introspection. I never found the long pages all that practical: if I want to show a post or artwork to someone, I always have to scroll… scroll… scroll. A handful of articles with sometimes 15 pictures each makes loading slow – on mobile even rendering can become sluggish. And the overview suffers.
Duplicate content is especially tricky for Google. Google gives each link a sort of power meter called “PageRank”. Each link from outside pages adds to its power. Once a page has collected enough power, it shows up at the top of Google’s results.
And here comes the issue: until now I had my full articles on the front page, and again in the listings per topic (e.g. my tag collections). If anyone links to these instead of the actual article, the ranking power of the article is split. Both versions might then show up on page two of Google – whereas the page would gather far more views if just one of them made the first results (users rarely click beyond the third result, let alone through to the second result page).
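To make the splitting effect concrete, here is a toy sketch – a heavily simplified PageRank on a hypothetical link graph, not Google’s actual algorithm. The page names and the five outside links are made up for illustration.

```python
# Toy PageRank sketch (illustration only, not Google's real algorithm).
# Five hypothetical outside pages either all link to one canonical
# article, or split their links between the article and a duplicate
# tag page that shows the same content.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new[t] += share  # each outgoing link passes an equal share
        rank = new
    return rank

# Scenario 1: all five outside links point at the canonical article.
focused = pagerank({f"ext{i}": ["/article"] for i in range(5)})

# Scenario 2: the same five links are split between the article and
# a duplicate tag page.
split = pagerank({"ext0": ["/article"], "ext1": ["/article"],
                  "ext2": ["/tag"], "ext3": ["/tag"], "ext4": ["/tag"]})

print(focused["/article"] > split["/article"])  # → True
```

With all links focused on one URL, the article ends up with roughly twice the score it gets in the split scenario – which is the whole argument for showing each article in full under only one link.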
Here is a comparison of what I had before and how it looks now. I list all articles just as previews, with little text. All tags behave the same way – which means hundreds of pages change at once (I have roughly 200 tags).
The tags are actually now my major search-traffic driver – above all the David Hockney collection, as you can see here in the web analytics.
That means I’m taking a risk with the change – but I mitigate it by programming an exception for the top tags. David Hockney will still appear on two pages, but that’s much better than having all content duplicated across five tags.
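The exception could look something like this sketch – the tag slugs, field names, and function are assumptions for illustration, not my blog engine’s real API:

```python
# Hypothetical sketch of the top-tag exception. Most tag archives show
# previews only; a small whitelist of search-traffic tags keeps full text.
FULL_ARCHIVE_TAGS = {"david-hockney"}  # assumed slug for the top tag

def render_in_listing(article, tag):
    """Return what a tag archive shows for one article."""
    if tag in FULL_ARCHIVE_TAGS:
        return article["full_text"]   # exception: full article
    return article["preview"]         # default: preview only

post = {"full_text": "…the whole article…", "preview": "…a short teaser…"}
print(render_in_listing(post, "david-hockney"))  # full text
print(render_in_listing(post, "oil-paintings"))  # preview only
```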
Judging exactly how these changes affect traffic will be tough. Maybe the specific articles will start to rise as the tags fall as traffic drivers. We’ll see how it balances out – or whether it backfires.
You want to provide good links, both to outside pages and to your own content. But less is more.
I had a tag cloud in the sidebar for a long time, plus archives – meaning every page linked to all the content I have. I thought it was practical: users interested in one topic could just click the “Oil Paintings” tag, for example. Yet in my stats I never saw anyone do that – and I mean never, quite literally. I think it was simply too many links to even read through. Or they were placed too close to unrelated topics.
Either way – that cloud is long gone. What I still had was a list of tags at the top of each article. Again, no one ever clicked those. Maybe because they sat at the top, where people hadn’t even started reading the article yet? I’ve now moved them down to the end, and removed them from the front page entirely.
Another good example of Google mimicking users: I have the impression it doesn’t care much for navigation links. Again, the oil-painting tag sat right at the top of every page, but Google never ranked it highly. Instead, David Hockney was the most popular link – which wasn’t in the navigation at all.
A huge resource on the Google side of the equation is Northcutt’s list of ranking factors. It explains, with solid evidence, how link power is passed on. Having a lot of links, for example, weakens each one, as the power is divided between them. So my reduction should help. Here’s what they say about Page Authority:
Distribution of Page Authority
Typically, pages that are linked site-wide are given a large boost, pages linked from them get a lesser boost, and so forth. A similar effect is often seen from pages linked from the homepage, because this is commonly the most-linked page on most websites. Creating a site architecture to maximize this factor is commonly known as PageRank Sculpting.
So site-wide links are powerful, but I should spend that power with care. For now I’ve left only a handful of essential links. Later I’ll optimize them further, so that specific, searchable content shows up – like my ink tutorials.
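The division of link power the quote describes comes down to simple arithmetic. A minimal sketch, assuming the simplified model in which a page splits a damped share of its authority evenly across its outgoing links – the 200 roughly matches my old tag count, the 8 is a hypothetical trimmed navigation:

```python
# Simplified model: each outgoing link passes an equal, damped share
# of the page's authority. The damping factor 0.85 is the classic
# PageRank default, used here purely for illustration.
def link_value(page_rank, num_links, damping=0.85):
    return damping * page_rank / num_links

# A sidebar with ~200 tag links vs. a trimmed navigation of 8 links:
print(link_value(1.0, 200))  # each link passes 0.00425
print(link_value(1.0, 8))    # each link passes 0.10625
```

Under this model, cutting the navigation from 200 links to 8 makes each remaining link carry about 25 times more weight.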
Moving links lower on the actual page also makes clear to Google that they’re less important.
By that I mean pages where visitors are stuck with no exit – mostly for lack of links, or of interest in the links on offer.
“Bounce rate” is a statistic that can hint at this: it measures visitors who leave without viewing a second page. As you can see below, that’s the case for 75% of my visitors. By itself that’s not a bad number – it really depends on the site. If users find everything they want on the front page, that’s fine. But I would prefer that they browse around more – and find more content like the piece they came for.
One issue was the long article listings described above. That’s fixed now – and the change in the stat will be interesting to watch. Another problem could be the images in my gallery: once you click for the big version, you end up on a bare image page, and the only way back is the browser’s back button. Users avoid extra actions and load times, though – it gets annoying especially with many images. So I’ve added a lightbox mode: images open in the same window, with only the background darkened. Flipping through more images is a breeze, and going back is a quick click outside the image – just as users know it from Facebook.
For the Google bot this is not an issue – it simply crawls all links. But Google might track my bounce rate too: when users come from the search results and instantly bounce back to Google, that hints that I don’t have good content. I don’t suspect this has a strong effect, but the improvements might help a bit.
While I was at it, I made some cosmetic changes. I modernized the fonts everywhere, and changed the links so that they blend more into the text. Research shows that colorful links are no problem, and might even help comprehension¹ – so I would be open to changing that back. If you have any opinions or input, I’d be happy to hear them. The same goes for any other trouble – I still expect a bug or two from all these changes.
What’s to come
Pinpointing the impact of the changes will be tough, and a lot of factors will muddy the picture:
- Google often needs a handful of months to re-crawl content. The effects will arrive one page at a time.
- It’s a lot of changes at the same time. I’ll just have to judge the overall effect.
- I’ll continue making improvements (suggestions are welcome); I hope the stats give enough hints about what helped.
- Content is still king! If I create good art and articles, that will do more for the site than any optimization.
Either way, I’ll keep updating and sharing my statistics.