Beantin

James Royal-Lawson

5 reasons your web presence misses the mark

DDB Stockholm have been behind some excellent creative work, including some fantastic viral videos (Who hasn’t seen the piano stairs?). Their new website is a step forward showing that they understand that your website is just one part of your web presence – one part of your distributed website that exists across multiple platforms and services.

But it falls short of ticking all the boxes. Here are 5 examples of where they have missed a trick…

1. Flash-based

A similar result could have been achieved (with improved performance and less CPU drain) using other technologies (e.g. HTML, CSS & Ajax). For a recap of this bugbear of mine, see this post on why Flash-based sites suck.
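
By way of a sketch (hypothetical markup, nothing to do with DDB's actual design), the kind of hover effect agencies often reach for Flash to build can be done with a few lines of HTML and CSS:

  <!-- Hypothetical example: a "Flash-style" hover effect in plain HTML/CSS -->
  <style>
    .promo {
      opacity: 0.7;
      -webkit-transition: opacity 0.3s ease; /* current browsers need the prefix */
      transition: opacity 0.3s ease;
    }
    .promo:hover {
      opacity: 1; /* fades in on hover - no plugin, no CPU-hungry player */
    }
  </style>
  <img class="promo" src="campaign.jpg" alt="Campaign artwork">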

2. Loading time

We may not be using 56 kbps modems anymore, but loading times are just as important as ever.
It's not only humans that get bored whilst waiting for pages to load (and we get bored very quickly); search engines get bored too. Pages that are slow to respond and slow to load will be penalised.
Just the index.html file on ddb.se is 148 KB. The entire start page (non-Flash version) is 1231 KB (1057 KB of which is the various images used).
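
One quick win for a start page this heavy is to at least serve the text content compressed (a general sketch; I haven't inspected DDB's server configuration). With Apache's mod_deflate, for example:

  # Sketch: gzip text responses with Apache's mod_deflate
  # (the 1057 KB of images need optimising separately - compression won't help them)
  AddOutputFilterByType DEFLATE text/html text/css application/javascript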

3. Accessibility

OK, accessibility is perhaps not top of mind amongst DDB's target audiences, but making a website accessible isn't an optional extra.
It should be standard practice for everyone producing websites. Granted, a non-Flash version of
the site exists, which of course is a Good Thing, but accessibility doesn't stop at providing alternative content when Flash is disabled.

4. Sitemap.xml

An easy win. All the major search engines love eating up sitemaps. Combine a sitemap.xml with robots.txt and you’ve made it so much easier for your content to be indexed. A valid and correctly linked RSS feed is an important part of the package, but it’s not a sitemap.
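
A minimal sketch of the pairing (example.com stands in for your own domain):

  # robots.txt - tell every crawler where the sitemap lives
  User-agent: *
  Sitemap: http://www.example.com/sitemap.xml

and the sitemap itself, a valid one-URL skeleton:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://www.example.com/</loc>
      <changefreq>weekly</changefreq>
    </url>
  </urlset>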

5. Microformats

rel="me". This is just as important for companies as for individuals in order to consolidate and confirm official identities across multiple sites and platforms. By cross-referencing your various pages, you help join the dots for search engines (and visitors). Other microformats are of increasing importance: geo-tags, contact details, product information. The sooner you make use of them, the quicker you'll have the data indexed.
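
A minimal sketch of what this markup looks like (the profile URLs are placeholders):

  <!-- rel="me": these profiles belong to the same identity as this page -->
  <a href="http://twitter.com/example" rel="me">Twitter</a>
  <a href="http://www.flickr.com/photos/example" rel="me">Flickr</a>

  <!-- the geo microformat marking up coordinates (Stockholm, in this case) -->
  <span class="geo">
    <span class="latitude">59.3293</span>,
    <span class="longitude">18.0686</span>
  </span>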

What is required…

These things are not overly complicated, new, or expensive, and none of them is unavoidable. What is required is a web project manager with a good broad knowledge of how the web really works, plus a quality web master/web manager. A website manager isn't a code-monkey or a copywriter, but someone who understands your web strategy, your target audiences, and the Internet – and who can make sure your web presence keeps on ticking all the boxes long after launch.

10 important consequences of personal search results

Today, Simon Sundén published a post in Swedish about Google making personalised search results the standard. Personalisation is no longer limited to those who are logged in. This has some pretty large consequences for search results and keyword rankings. The inclusion of real-time results and the upcoming Caffeine update are important, but personalised results as standard is a major change rather than just a significant tweak.

In simple terms, it means that you're less likely to get the same search results for the same search terms in different browsers, or on different computers (and in different countries, logged in to Google or not). SERPs suddenly got a whole lot more complicated.

Here's a brief translation of the 10 consequences Simon described in detail on his blog, each followed by a short description:

1. Lots of companies think that they rank highest, but in reality they aren’t seen at all

It's going to have to be explained many times over, but searching for yourself is no longer empirical evidence that you're poorly ranked, or well ranked.

2. Google me! or, well, don’t

Search terms that were universally guaranteed to rank high won't be quite so guaranteed anymore.

3. Optimise titles and descriptions!

Conversion from SERP impressions to clicks is even more important, as clicks on results are now part of the ranking equation. Titles and descriptions are what get shown.
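
A minimal sketch of the two elements in question (the wording is invented for illustration):

  <head>
    <!-- the title: the clickable headline in the search result -->
    <title>Piano stairs - getting commuters to skip the escalator</title>
    <!-- the meta description: often used as the snippet underneath -->
    <meta name="description"
          content="How a staircase turned into a piano persuaded commuters to take the stairs instead of the escalator.">
  </head>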

4. More important to optimise for all search types

Optimise for all search result types: blended search means image, news, real-time and video results. Everything beyond the classic organic search result.

5. Tougher times for ranking tools

Keyword ranking tools suddenly became a lot less accurate. They still have a role to play as an indicator, but less so as a firm statistic.

6. Wide content scope is beneficial

As Google values well-visited sites, it's even more important that you take good care of your visitors (and of searchers, so that they become visitors!).

7. Statistics even more important

It's going to be more important to collect statistics about exactly which results page visitors have come from via search engines: the 1st, 2nd, or 3rd?
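
As a sketch of the idea (assuming the referring URL carries Google's start offset parameter, and that results come 10 to a page):

  <script>
    // Rough sketch: which results page did this visitor click through from?
    // Google's referrer includes "start" (page 2 = start=10, page 3 = start=20...)
    var ref = document.referrer;
    if (/google\./.test(ref)) {
      var match = ref.match(/[?&]start=(\d+)/);
      var offset = match ? parseInt(match[1], 10) : 0;
      var serpPage = Math.floor(offset / 10) + 1; // assumes 10 results per page
      // pass serpPage to your statistics package as a custom variable
    }
  </script>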

8. Google webmaster reports become important… again

The “Top search queries” report becomes more relevant the more personalised results become. It’s the new ranking tool.

9. Bye, bye “Don’t be Evil” Google!

Google has taken yet another step towards being all-conquering and all-important. It will be harder and harder for them to be perceived as not doing evil.

10. Bye, bye paid-rankings?

Various companies provide paid services promising top 3 or top 10 results on Google. What happens to that business model now?

Google is shaping our Internet. With almost every announcement they have the power to change not only our lives but also our businesses.

Keywords meta tag is not completely useless

Despite Google confirming that they do not use the keywords meta tag for ranking purposes, the keywords meta tag is not completely without its uses.

When you optimise the content of a page for a given set of keywords, that is an investment you have made, and the resultant set of keywords is an asset. The keywords meta tag is a simple way of safeguarding that asset over time.

More often than not, web pages are optimised when they are initially published. Subsequent edits and updates are unlikely to include a recheck against the original set of keywords, especially given the web publishing models many organisations use. This means that the original investment has been at best devalued and at worst wasted.

My recommendation is to keep entering the keywords into the keyword meta tag that the page has been optimised for – not because Google cares about it – but so that editors can be educated to always check their content changes against that list, and to update the list when the page has been re-optimised.
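
A minimal sketch (the keyword list is invented for illustration):

  <!-- ignored by Google for ranking, but a record of what this page
       was optimised for - editors: check your changes against this list -->
  <meta name="keywords" content="web presence, distributed website, web strategy">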

By utilising the tag in this way, we increase the return on the original investment that was made in optimising, and increase the chance that future edits will also be on target and keep the original keywords in mind.

10 reasons why all-Flash websites usually suck

Last week, Simon Sundén published a post in Swedish called 10 anledningar varför sajter helt byggda i Flash oftast suger (10 Reasons why sites completely built in Flash usually suck).

He's right. They do. There are numerous reasons why all-Flash sites fail to be a good choice; Simon had a problem getting it down to as few as 10. In all my years I've yet to be presented with a convincing argument for a site to be totally Flash-based.

Here is a brief translation of Simon’s 10 reasons. Some of the 10 points do have workarounds or solutions, but that’s like treating an illness you could have avoided catching in the first place.

  1. Long loading-times
  2. Accessibility
  3. Mobile compatibility
  4. Problems with visitor statistics
  5. Browser & mouse functions (CTRL-F, right-click, etc)
  6. No unique URLs – forget “caring is sharing”
  7. Back and Forward buttons don’t work
  8. Copy and Paste
  9. Indexing by search-engines
  10. Internal links and SEO

Flash objects have their uses, but put quite simply, a wholly Flash-based site just doesn't live up to the web standards we should be following to make the browsing experience more enjoyable and successful.

Twitter chat: is this social media or black-hat SEO?

JesperAstrom: @beantin – yes. 🙂 please write your perspective in a blog post and perhaps we could reach a conclusion?

beantin: @JesperAstrom yep, so we get back to a white-hat, black-hat SEO discussion rather than one about social media

JesperAstrom: @beantin – yes. But how does it effect the serps?

beantin: @JesperAstrom Not 100% sure either. Heart (and head?) says interesting content travels naturally within social media world.

JesperAstrom: @beantin – I c your point. And I am unsure. It is border-line. I know. But is it unethical you think? I weigh relvance vs. technical imp.

beantin: @JesperAstrom Don’t get me wrong Jesper, it’s a great technical post! I enjoyed it. It was the ethical side that made me uneasy.

JesperAstrom: @beantin – It says – releasing news THROUGH social media. It is – as I say in the post – NOT social medias best usage 🙂

beantin: Reading: “Releasing News through Social Media” by @JesperAstrom bit.ly/TpLNW Gut reaction: is this social media or black-hat SEO?
