
Sunday, 20 March 2016

Will linkbuilding for SEO become meaningless?

Recently I have read about RankBrain's Judgment Day.

Although it is stated in other words, one of the statements in this article is:

Soon external signals (e.g., links) will not be used for relevance of the organic search results.

This is nonsense. Recent research shows that links are still relevant. Even Gary Illyes states that RankBrain is something different than what is suggested in the article above.
Lemme try one last time: Rankbrain lets us understand queries better. No affect on crawling nor indexing or replace anything in ranking 
— Gary Illyes (@methode) March 18, 2016
I think it is better to believe the latter.

Hans

--
ZOMDir.com is a dynamic directory and a wiki
Everyone is able to add a link in 10 seconds
To learn more view this Slideshare presentation

Monday, 29 February 2016

Linkbuilding from directories done right

Since the rise of Google, linkbuilding has been extremely important. The reason?

Google was the first search engine that used the number of links to a webpage as an indicator of the relevance of that page.


Links are still relevant

Update, 24 March: it was in the news that content and links are the number one and two ranking factors (or vice versa). See: Google's top 3 ranking factors.

Nowadays linking is still an important indicator. A recent study by Backlinko makes clear that ...
Backlinks remain an extremely important Google ranking factor. We found the number of domains linking to a page correlated with rankings more than any other factor.

Another indication that the original PageRank algorithm is still a huge factor comes from this accidental SEO test from Moz. They concluded that ...

What our data suggests is that, on average, there’s a 15% traffic loss following a 301 redirect; but any individual redirect could be much better, or much worse.

What’s particularly fascinating about this number, 15%, is that it is exactly the amount of PageRank loss Google described in the original PageRank paper.
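That 15% matches the damping factor d = 0.85 from the original PageRank formula: roughly speaking, only a fraction d of a page's rank is passed on through its links, so every extra hop loses 1 - d = 15%. Below is a minimal sketch of that arithmetic in Python; the single extra hop and the starting value of 1.0 are illustrative, not data from the Moz test.

# Minimal sketch: how the damping factor d = 0.85 from the original PageRank
# formula translates into a 15% loss per extra hop (such as a 301 redirect).
# The starting value is illustrative only.
DAMPING = 0.85  # d in PR(A) = (1 - d) + d * sum(PR(T) / C(T))

rank_before = 1.0
rank_after = DAMPING * rank_before  # one extra hop via the redirect

loss = 1.0 - rank_after / rank_before
print('Loss per hop: %.0f%%' % (loss * 100))  # prints: Loss per hop: 15%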

How to get links the wrong way?

Since it is clear that a lot of links help your website rank higher in the search results, people created "fly-by-night" directories and "Private Blog Networks" to create links themselves.

It is clear that Google doesn't like these tactics because they corrupt the search results. So Google declared war on link schemes.


Content marketing

Therefore a lot of people believe that content marketing is the answer. Write helpful, attractive content that helps users, and people will come, link to you, and you will earn a high position in the search results.

That doesn't mean that a link from a directory is useless. If the directory has a page with links that belong to the same category (which could be considered helpful, attractive content) and your link also belongs to this category, then it is a perfect fit.

Create helpful links

If your link is the only link in a category, then it is either a very detailed category or it is suspicious.

A single link on a page could be treated by Google as a link scheme, which can negatively impact a site's ranking in search results.

The latter is also the case when your link belongs in another category.

So when you add a link to a directory, make sure that it is in the correct category and that there are other relevant links in that category.

Good luck with linkbuilding,
Hans

By the way, almost as important as links to your site are the links from your site. A study by Reboot Online Marketing showed that outgoing links have positive effects. So feel free to link to me ;-)

--
ZOMDir.com is a dynamic directory and a wiki
Everyone is able to add a link in 10 seconds
To learn more view this Slideshare presentation

Monday, 15 February 2016

Linkbuilding is hard, creating a meaningful directory is harder

Linkbuilding is hard; however, building a link directory is harder. The two main reasons are:
  1. Collecting relevant links related to a subject (or location) can't be done in a split second;
  2. There is a negative sentiment regarding link directories.




Collecting relevant links is hard

Collecting relevant links is like content curation. You're not only hunting for relevant links for the subject you are working on, you also have to add these links to the corresponding page of the link directory.

Imagine that you want to create a page about broken link checkers. Often you will start with Google. When I searched for broken link checker on Google.com, I got the following results in 0.33 seconds:



The top 10 links I found* are:

  1. http://www.brokenlinkcheck.com/
  2. https://wordpress.org/plugins/broken-link-checker/
  3. http://www.deadlinkchecker.com/
  4. https://validator.w3.org/checklink
  5. http://www.powermapper.com/products/sortsite/checks/link-checker/
  6. http://www.iwebtool.com/broken_link_checker
  7. http://smallseotools.com/websites-broken-link-checker/
  8. https://wummel.github.io/linkchecker/
  9. http://www.screamingfrog.co.uk/broken-link-checker/
  10. https://linktiger.com/



There is more than the first page

There are more pages about broken link checkers. For example, I missed the still famous broken link checker Xenu's Link Sleuth.

I also missed ZOMDir's Broken Links at a Glance in the top 10, but hey, that's a relatively new link checker. What else to expect :-)

However, Google indicates that more than 1 billion pages were found. That is a little bit too much to add to a link directory.

So you have to curate which broken link checkers, or better said which webpages related to broken link checkers, should be listed. That makes a huge difference.

There are also a lot of round-up articles about broken link checkers, and so on. Do you want to add these articles too? Remember, taking a decision takes time...


In depth analysis

Content curation is nothing without an in-depth analysis. So you have to take a look at every site you want to add and decide what to do with it. During this process you have to answer questions like:
  • Do you want to add broken link checkers which check only one page? 
  • Do you want to add online broken link checkers which need a captcha? 
  • Do you add all broken link checkers you find or do you select only the best working link checkers?
Based on your decisions you will have a list of relevant websites related to broken link checkers.

Almost there

The last step in the process is adding the links to the directory. This should be as easy as possible for the curator. I don't know how other directories work, but on average I'm able to add a link per minute.

So creating a list of 15 links takes me roughly a quarter of an hour.

Believe me, creating a relevant list of broken link checkers takes hours instead of seconds.


Negative sentiment regarding link directories


The second reason why creating a link directory is hard is the negative sentiment regarding link directories.


For years there has been a negative sentiment regarding link directories. Deservedly so, because there are a lot of low-quality link directories.

Lots of people bought a PHP script and created a link directory for ABC link exchanges to get a high position in the search engines. Even worse, they linked unrelated sites, so links regarding completely different subjects were on one page. This made such pages completely useless for normal users.

No wonder that Google stated in 2012 that they would reward high-quality sites with an algorithm now known as Penguin.

As a result, a lot of marketers and search engine optimisation specialists started to think that almost all link directories are a no-go area.


A turning tide?

I think the tide is turning at this moment. A recent study by Backlinko makes clear that ...
Backlinks remain an extremely important Google ranking factor. We found the number of domains linking to a page correlated with rankings more than any other factor.

As Search Engine Watch wrote in 2013, probably the best thing you can do is use your common sense.

When you look at a directory (or any other link source for that matter) you have to ask yourself, “does it make sense that this link should pass weight to my site?” If you can honestly say “yes” to this then it’s likely a good link.

Good luck with linking,
Hans

* Google results aren't always the same for everyone, so you might find other results.


--
ZOMDir.com is a dynamic directory and a wiki
Everyone is able to add a link in 10 seconds

To learn more view this Slideshare presentation














Monday, 4 January 2016

Again, the directory Yahoo!

In November 2014 I wrote about the directory Yahoo! One suggestion on Twitter was to copy the whole directory. After I wrote that blog post, I tried to copy the Yahoo! directory with wget. That seems simple, but it isn't. For several weeks my computer was working day and night to fetch the directory. Due to an automatic Windows update, I wasn't able to fetch the whole directory. It's a pity, but I think I have more than enough data.

After the fetch process I tried to analyse the links. However, that was hard because the structure of the HTML code wasn't always the same, and I made the mistake of converting all filenames to lowercase. That wasn't smart, because it left me with a lot of broken links. So I tried to parse the pages with the YahooAnalyser (some Python code I wrote to analyse the content of the old Yahoo! directory); however, the end result was ... nothing, besides the finding that this approach was far too complex.

Recently I decided to take another approach. Instead of analysing the complex HTML code, I deleted this complex code so that in the end I got simple lists of links, and I converted all uppercase characters to lowercase. After that it was possible to use a broken link checker like Xenu's Link Sleuth to analyse the fetched copy of the Yahoo! directory.
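A minimal sketch of that "reduce each fetched page to a plain, lowercased list of links" step is shown below, using only Python's standard library; the folder name is made up, and the real processing of the Yahoo! copy may have looked different.

# Minimal sketch: strip fetched directory pages down to a plain list of
# lowercased external links. Uses only the Python standard library; the
# folder name 'yahoo-copy' is made up for this example.
import glob
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href attribute of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == 'a':
            for name, value in attrs:
                if name == 'href' and value and value.startswith('http'):
                    self.links.append(value.lower())

unique_links = set()
for path in glob.glob('yahoo-copy/**/*.html', recursive=True):
    parser = LinkExtractor()
    with open(path, encoding='utf-8', errors='ignore') as page:
        parser.feed(page.read())
    unique_links.update(parser.links)

print('%d unique external links' % len(unique_links))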

My findings are:

  • I was able to fetch 56,359 folders (I assume that there are roughly 60,000 folders);
  • I was able to fetch 568,744 external links, so the average number of links per folder is 10;
  • Of these 568,744 fetched links, 92,746 were duplicates, so 84% of the links were unique;
  • Of the 475,998 unique links, 365,751 got the status 200 OK. That's 77% OK and 23% not OK;
  • The top reasons a link is broken are (see the sketch after this list):
    • No such host (9%);
    • No connection (8%);
    • Not found (2%).
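The "no such host / no connection / not found" split is the kind of classification a broken link checker performs for every URL. Below is a rough, illustrative sketch of such a check in Python; the actual analysis in this post was done with Xenu's Link Sleuth, not with this code.

# Rough sketch of the kind of check a broken link checker performs: request
# a URL and classify the outcome. Illustrative only; the real analysis here
# was done with Xenu's Link Sleuth.
import socket
import urllib.error
import urllib.request

def classify(url, timeout=10):
    """Return a short status label for one URL."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return 'OK' if response.status == 200 else 'HTTP %d' % response.status
    except urllib.error.HTTPError as error:
        return 'Not found' if error.code == 404 else 'HTTP %d' % error.code
    except urllib.error.URLError as error:
        if isinstance(error.reason, socket.gaierror):
            return 'No such host'
        return 'No connection'

print(classify('http://www.zomdir.com/'))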

A year ago my estimate was that there are 55,000 to 75,000 categories, so that's roughly the same. However, I overestimated the number of links. First I thought there were 1,000,000 to 3,000,000 links mentioned on Yahoo! Nowadays I believe that there were roughly 500,000 unique links.

The percentage of 23% broken links isn't entirely fair, because a year passed between my fetch action and this analysis. However, in my opinion a broken link percentage of 20% or higher isn't acceptable. So I repeat last year's conclusion: "It is a pity, but it is logical that the Yahoo! directory is shut down".

Hans
--
ZOMDir.com is a dynamic directory and a wiki
Everyone is able to add a link in 10 seconds
To learn more view this Slideshare presentation






Saturday, 20 June 2015

A bug in Safari?

Apple has been more or less cheating about the JavaScript* screen size of the iPhone 4s for a long time.

You might check this by using Browsersize at a Glance on an iPhone. This simple webpage will give you results like this:

Safari on iPhone 4s, iOS 7.1.2


Chrome on iPhone 4s, iOS 8.3
The browser size returned isn't the real size of the iPhone 4s. According to the iPhone 4s Tech Specs, the iPhone 4s has a screen of 640 (width) by 960 (height) pixels. That Apple was "cheating" about the browser size was never a problem for me. On the contrary, I understood that Apple pretended the screen size is lower than the specs for backwards compatibility. After all, previous iPhone models have a screen of 320 (width) by 480 (height) pixels.

However, now I wonder why Apple states that the iPhone 4s has more pixels than the specs say. Today I got this result and I'm puzzled.


Safari on iPhone 4s, iOS 8.3

Why do I now get the default width? The only thing I have done is update iOS, so I wonder what I have done wrong that the default width is shown. Perhaps this is a bug.

I know this is a technical issue, so it is good to know that I have read Apple's Configuring the Viewport and that I always use this code

<meta name="viewport" content="width=device-width" />

in the head section of the webpage. 

Thanks,
Hans

By the way, I also wonder why there is a difference between Google's Chrome and Safari. I suppose they are based on different versions of WebKit.

* The screen size is based on the JavaScript innerWidth property and the corresponding innerHeight property.


--
ZOMDir.com is a dynamic directory and a wiki
Everyone is able to add a link in 10 seconds
To learn more view this Slideshare presentation

Sunday, 17 May 2015

The power of the Zero Width Space

Recently I discovered that besides the Non-breaking Space, there is also a Zero Width Space.

A Zero Width Space works similarly to the Soft Hyphen. Instead of adding a visible hyphen where the string is broken across lines, the Zero Width Space adds virtually nothing.

The Soft Hyphen is useful when you want to control where normal words might be broken across lines. 

Use the Zero Width Space when you don't want an extra visible hyphen in your string. Since I discovered the Zero Width Space, I use it in URLs. This way, long URLs are shown perfectly fine on small screens too.


Zero Width Space in URLs

In URLs I add a Zero Width Space (a small code sketch follows the example below):

  • After each ampersand '&';
  • After each slash '/';
  • After each equals sign '=';
  • After each underscore '_';
  • After each plus sign '+';
  • Before each percent sign '%';
  • Before each period '.'
This way a URL might be shown as follows:

http://example
.zomdir.com/
this_is_a_fake_
directory/and-
this-one-is-fake-
too/index.php?
inputfromuser=
wow%20this%20
is%20a%20handy
%20code
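Here is a minimal sketch of these insertion rules in Python; the function name is just for illustration and the example URL is the fake one shown above.

# Minimal sketch: insert Zero Width Spaces into a URL following the rules
# above, so browsers can wrap the URL on small screens. The function name
# is illustrative only.
ZWSP = '\u200b'  # &#8203; in HTML

def breakable_url(url):
    """Add a ZWSP after &, /, =, _ and +, and before % and '.'."""
    parts = []
    for character in url:
        if character in '%.':       # break before these characters
            parts.append(ZWSP)
        parts.append(character)
        if character in '&/=_+':    # break after these characters
            parts.append(ZWSP)
    return ''.join(parts)

url = ('http://example.zomdir.com/this_is_a_fake_directory/'
       'and-this-one-is-fake-too/index.php?'
       'inputfromuser=wow%20this%20is%20a%20handy%20code')
print(breakable_url(url))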
 
In my opinion, the Zero Width Space is so useful that I wonder how I missed it before. Anyway, from now on I'm a fan of the Zero Width Space.

Hans

By the way, the code for a Zero Width Space in HTML is: &#8203;

--
ZOMDir.com is a dynamic directory and a wiki
Everyone is able to add a link in 10 seconds

To learn more view this Slideshare presentation




Monday, 2 March 2015

Memcache to the rescue

ZOMDir was built on Google's App Engine. One of the reasons to choose this platform was its excellent performance. Google's servers are really fast.

However, a fast platform is not enough. The software architecture should also be optimised for speed.

Google's recommendations 

Luckily Google has some suggestions for good performance; one of them is using sharding counters for frequently updated values.

I have followed these recommendations where possible, although I over-optimised the sharding counters a bit. Let me explain.

Why use sharding counters

Writing to Google's datastore takes much longer than reading from it, and an update of any single entity or entity group is only possible about five times per second.

Therefore, the solution Google suggests relies on the fact that reads from the App Engine datastore are extremely fast and cheap. The way to reduce the contention is to build a sharded counter: break the counter up into N different counters. When you want to increment the counter, you pick one of the shards at random and increment it. When you want to know the total count, you read all of the counter shards and sum up their individual counts. The more shards you have, the higher the throughput you will have for increments on your counter.
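A minimal sketch of such a sharded counter on App Engine is shown below, following the pattern from Google's own sharding counters example; the model name, counter name and helper functions are illustrative, not ZOMDir's actual code.

# Minimal sketch of a sharded counter on App Engine, based on the pattern
# from Google's sharding counters example. The model name, counter name and
# helpers are illustrative, not ZOMDir's actual code.
import random

from google.appengine.ext import ndb

NUM_SHARDS = 40

class LinkCounterShard(ndb.Model):
    """One shard of the counter; the real total is the sum of all shards."""
    count = ndb.IntegerProperty(default=0)

@ndb.transactional
def increment(counter_name='total_links'):
    """Pick a random shard and increment it, so writes don't contend."""
    index = random.randint(0, NUM_SHARDS - 1)
    shard_id = '%s-%d' % (counter_name, index)
    shard = LinkCounterShard.get_by_id(shard_id)
    if shard is None:
        shard = LinkCounterShard(id=shard_id)
    shard.count += 1
    shard.put()

def get_count(counter_name='total_links'):
    """Read all shards and sum their individual counts."""
    keys = [ndb.Key(LinkCounterShard, '%s-%d' % (counter_name, index))
            for index in range(NUM_SHARDS)]
    return sum(shard.count for shard in ndb.get_multi(keys) if shard is not None)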

200 new links per second?

How many shards you should use depends on the number of updates you expect per second. I have chosen 40 shards for the statistics, so ZOMDir should be able to handle circa 200 new links per second without contention. Nice, isn't it?

However, there is a disadvantage: collecting all 40 shards takes time, more time than I expected. The reason is probably that I didn't code smartly enough to save resources :-(

Memcache to the rescue

Reducing the number of shards would ruin the statistics, so to overcome this I fell back on memcache.

To improve the performance I decided to save the total number of links in memcache too. This way I was able to improve the performance by a factor of 2, roughly from 1500 ms to 750 ms.
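A minimal sketch of that memcache fallback is shown below; it reuses the LinkCounterShard sketch above, and the cache key and timeout are made up for illustration.

# Minimal sketch: keep the summed total in memcache so most reads skip the
# 40 datastore shards. Reuses get_count() and increment() from the sketch
# above; the cache key and timeout are illustrative, not ZOMDir's values.
from google.appengine.api import memcache

CACHE_KEY = 'total_links'
CACHE_SECONDS = 60

def get_count_cached():
    """Return the cached total; fall back to summing the shards."""
    total = memcache.get(CACHE_KEY)
    if total is None:
        total = get_count()  # slow path: read all 40 shards
        memcache.add(CACHE_KEY, total, CACHE_SECONDS)
    return total

def increment_cached():
    """Increment a random shard and keep the cached total in sync."""
    increment()
    # incr() only works if the key exists, unless initial_value is given.
    memcache.incr(CACHE_KEY, initial_value=0)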

That's what I call a nice improvement :-)
Hans

--
ZOMDir.com is a dynamic directory and a wiki
Everyone is able to add a link in 10 seconds

To learn more view this Slideshare presentation

N.B. Here are some pictures from the test results, based on the English page of hotels in Amsterdam.

Before I kept the total number of links in memcache: an endless list of shards to collect
The results after keeping the total number of links in memcache

The results according to tools.pingdom.com

The results according to GTMetrix