Posted by Paddy_Moogan
I used to spend a lot of my time ploughing through Google to find potential link targets for client websites. I still do this a lot, but over the last 12-18 months I've changed my approach to become a bit more efficient: my first port of call when looking for link targets is now lists.
Lists are awesome for link building because someone else has already done some of the hard work for you. If you can find good quality, curated lists of websites, then you can be reasonably sure that you have found sites that are worth getting links from. You still want to run your own analysis and due diligence, but the end output will probably contain a higher proportion of quality sites than pulling lists straight from Google SERPs would have given you.
I like to put link building techniques into processes which makes them easier to follow and easier to scale if you need to automate parts of it. The process I use can be broken down into the following -
- Find your lists
- Scrape together your master list
- Filter and prioritise
I'll go through each of these in more detail to explain the steps for each one.
1. Find your Lists
There are multiple ways of doing this and there are probably more places to find them than you think. To make things a bit clearer, let's think about these "types" of lists; they are roughly in order of personal preference -
- Curated lists found on other websites
- Top x type lists found on other websites
- Public Twitter lists
- Good quality directories
There are more, but just these ones alone will give you enough link targets to keep you busy for a long time!
Curated lists found on other websites
For me, these are the best types of lists to find. They are hand curated, which means that the quality of the list should be pretty good; usually they have also been curated by a subject matter expert who knows the sites in their industry well. I always try and find these first because they tend to give me the highest quality sites – albeit in low quantities.
My main method for finding these sites actually starts off very simple; it will be something like -
It really does start off that simple.
This can give me lots to get started with: just this search alone gave me a list of seven blogs curated by The Times, a list of ten law blogs from Cision and a list of employment law blogs curated by a professional lawyer. This was only taking a few from page 1 of the search results too; there were many, many more.
You can of course vary the types of searches you do, then you can go a bit deeper with some advanced search queries. Here is a quick example -
Here are a few more that you can use -
- "list of * blogs" "uk law"
- "list of * websites" "uk law"
- list of uk law blogs inurl:resources
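If you work across several niches, query patterns like these are easy to generate in bulk rather than typing each one out. A minimal sketch (the templates mirror the patterns above; the keyword phrases are just examples):

```python
# Build advanced Google search queries from templates and keyword phrases.
# The templates mirror the query patterns above; swap in your own niche terms.
TEMPLATES = [
    '"list of * blogs" "{kw}"',
    '"list of * websites" "{kw}"',
    'list of {kw} blogs inurl:resources',
]

def build_queries(keywords):
    """Return one search query per (keyword, template) pair."""
    return [t.format(kw=kw) for kw in keywords for t in TEMPLATES]

for query in build_queries(["uk law", "employment law"]):
    print(query)
```

You can then work through the generated queries by hand, or feed them into whatever prospecting workflow you already use.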
When I've found very high quality sites, I also tend to keep a note of them in my Chrome Bookmarks so I can easily find them again in the future -
You could use anything you want to do this really, for example you could use something like Evernote and attach the list to a project. At Distilled we also keep a record of link targets in Buzzstream which means they are easy to find again in the future for follow up or other projects.
Top x type lists found on other websites
This is a personal favourite. The quality can sometimes be a little dubious, but overall you will find pretty good quality websites. The great thing about these is that they tend to be refreshed every year, so lots of these lists exist.
Similar to curated lists, finding these can start off with something simple -
These returned lots of great results, one of which I'd class as a curated list from The Telegraph.
You can of course go a bit more advanced again and use some of the following -
- top * food blogs
- best * food blogs
- best food blogs inurl:links
There are so many of these it's unreal. I'd like to think they will slow down as people get bored, but quite frankly, they still work well for things like link bait!
Public Twitter lists
You need to do a bit of extra work to pull the websites out of these ones, but it's worth the extra effort and much of this can be automated. You can actually find these a number of ways. One of my favourites is this advanced query in Google -
Once you have found a list, you want to click on the "following" tab which brings up the people on the list rather than their tweets -
This brings you to a URL such as this one, which (if you wanted!) you could scrape the usernames of the list members from. You could then scrape their profiles using something like Google Docs with XPath and pull out their websites if they have them.
To help you with this task, I knocked together a very quick and dirty scraper in Google Docs which you can get a copy of here. It's not that robust but will do the job on up to 50 usernames quickly and easily. Make sure you are logged into your Google Account then click "Make a Copy" from the File menu.
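The username-scraping step can also be sketched outside Google Docs. Below is a minimal Python example that pulls profile-style links out of a page and turns them into profile URLs. The sample HTML is a simplified stand-in – real Twitter markup differs, so you would adjust the href filter to match the live page:

```python
from html.parser import HTMLParser

# Simplified stand-in for a Twitter list-members page; the real markup
# differs, so adapt the href filter below to whatever the live page uses.
SAMPLE_HTML = """
<div class="member"><a href="/paddymoogan">Paddy</a></div>
<div class="member"><a href="/distilled">Distilled</a></div>
<div class="member"><a href="/about">About</a></div>
"""

class MemberLinkParser(HTMLParser):
    """Collect hrefs that look like /username profile links."""
    def __init__(self):
        super().__init__()
        self.usernames = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        # Crude filter: a single path segment, skipping known non-profile pages.
        if href.startswith("/") and href.count("/") == 1 \
                and href[1:] not in {"about", "search", "login"}:
            self.usernames.append(href[1:])

parser = MemberLinkParser()
parser.feed(SAMPLE_HTML)
profile_urls = ["https://twitter.com/" + u for u in parser.usernames]
print(profile_urls)
```

From there you could fetch each profile and pull out the website field, which is essentially what the Google Docs scraper below does for you.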
Another easy way is to use something like Listorious which has a nice clean interface and pulls back Twitter lists based on your keywords -
As I said, there are loads of ways to find these lists. I tend to find that just searching for my keywords in Twitter search and finding influential people often leads me to lists of some sort.
Good quality directories
I had to mention the D word at some point :). To be honest, I don't use this one that much, but it can be useful in some very niche industries where there may not be that many lists created by individuals.
Luckily, it's super easy and quick to find a list of sites in some of the better directories. My first port of call is usually Dmoz. I try to ignore the fact that many categories these days are controlled by SEOs! You can find some really good niche sites on Dmoz; for example, did you know there was an association of tea drinkers with a directory of member blogs? I found this via Dmoz!
There are a few other directories out there which are decent quality, but you'll probably find you get a lot more commercial and lower quality sites in the lists and will have to do some filtering later.
A few more directories I'd take a look at -
Please don't dismiss the Yahoo! Directory without taking a look; there are some decent sites on there, for example this list of photography blogs, which is quite extensive.
Right, so now we have a number of places we can find lists. Next up is to collate these into our own master list so we can filter and sort to find our final list of link targets.
2. Scrape together your master list
Hopefully I don't lose too many of you now, I'm not going to ask you to build your own tools to do this. Whilst we are big believers in hacking together your own tools for various SEO jobs, there are some very simple plugins that can do this job for you. Having said that, if you can build your own scraper to pull in link targets – go for it!
Here are a couple of ways of pulling link targets from a page very quickly without building a tool.
Multi-links for Firefox
I love this plugin. It's a very handy tool for loads of stuff. In this case, we're going to use it to pull the URLs of the link targets in our lists into a spreadsheet.
Let's use one of the examples I linked to above; say you wanted to pull the URLs of the websites in this list on The Telegraph -
Using Multi-links, you simply right click just above the first link, then drag the cursor down so all the links you want are within the red lines -
Keep holding the right mouse button, then press the left button and select "Copy URLs" -
You can then just paste them into your spreadsheet. Much quicker than going to each site and copying the URL!
Extra Tip – this tool also allows you to open the highlighted links in new tabs, which can be very useful for link prospecting. Be careful though: if you don't want this, make sure you keep hold of the right mouse button, because releasing it opens all the tabs at once. Opening loads of tabs in Firefox at the same time can slow it down a lot – I've done it with over a hundred links before and crashed it.
If you are looking for something similar that works in Chrome, take a look at Linkclump.
Scrape similar plugin for Chrome
This nice little plugin takes away a lot of the hassle of learning things like XPath and allows you to grab elements of a page which are similar to each other – such as links.
Let's use the same example as above and scrape the links from The Telegraph again. I simply right click on one of the links in the list and click on "Scrape Similar". I get the following list, which looks accurate to me -
I can then export to Google Docs in the bottom right corner, easy!
For a more detailed look at using this plugin for link building, go and read Justin's post, which gives a great step by step guide.
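However you scrape them, it's worth normalising the URLs before they go into the master list so the same site doesn't appear twice under slightly different addresses. A quick sketch of the idea (the example URLs are made up):

```python
from urllib.parse import urlparse

# URLs scraped from several lists; duplicates and deep links are common.
scraped = [
    "http://example-law-blog.co.uk/top-posts/",
    "http://example-law-blog.co.uk/",
    "http://www.another-blog.com/about",
]

def root_domain(url):
    """Normalise a URL to its bare host, dropping a leading 'www.'."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

# Deduplicate while preserving first-seen order.
seen, master = set(), []
for url in scraped:
    dom = root_domain(url)
    if dom not in seen:
        seen.add(dom)
        master.append(dom)
print(master)
```

This gives you one row per site in the master list, which makes the filtering step below much cleaner.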
3. Filter and prioritise
At this point we should have a spreadsheet of link targets, gathered pretty quickly using some simple search queries and a few plugins or tools. Now we need to filter, sort and prioritise. Chances are that you have ended up with a pretty big list of potential link targets, so you need some way of knowing where to start to get a good return.
I know some of you will prefer to use Excel at this point, some will prefer Google Docs. So I'm going to cover both and show you how to quickly prioritise your link targets using some simple tools.
Filtering using Excel
This became a lot easier recently with the discovery of these SEO tools for Excel by Niels Bosma. There are loads of features for onpage SEO analysis which you should definitely take a look at. For the purposes of this post, I've taken the following screenshot of the link building metrics you can get -
For those of you who have never seen it before, here is a very quick snapshot of just a few of the other elements you can gather with the plugin -
Richard Baxter did a great post on the plugin and all the different features you can use, it's well worth a read over on SEOgadget.
Once you've gathered these metrics, you can start to filter and sort by whichever metrics matter to you. I tend to look for large numbers of social shares and then look at PageRank as a rough indicator. Whilst PageRank isn't perfect, it's still useful for filtering large sets of data.
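That filter-then-sort step is the same whatever tool you use. A minimal sketch with made-up metrics (the sites, numbers and thresholds here are hypothetical – in practice the values come from the plugin):

```python
# Hypothetical metrics for a handful of prospects; in practice these
# come from SEO tools for Excel or an API, not hard-coded values.
targets = [
    {"site": "blog-a.com", "shares": 320, "pagerank": 4},
    {"site": "blog-b.com", "shares": 15,  "pagerank": 6},
    {"site": "blog-c.com", "shares": 980, "pagerank": 3},
    {"site": "blog-d.com", "shares": 5,   "pagerank": 1},
]

# Drop clearly weak prospects, then sort by shares first, PageRank second.
MIN_SHARES, MIN_PR = 10, 2
shortlist = [
    t for t in targets
    if t["shares"] >= MIN_SHARES and t["pagerank"] >= MIN_PR
]
shortlist.sort(key=lambda t: (t["shares"], t["pagerank"]), reverse=True)
for t in shortlist:
    print(t["site"], t["shares"], t["pagerank"])
```

The thresholds are a judgement call; the point is just to cut the obvious dead weight before you start working down the list.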
Filtering using Google Docs
As much as I love SEO tools for Excel, I think I still prefer using Google Docs for gathering metrics on the fly. There are loads of posts for hacking together Google Docs to do various things (all with free downloads of the tools themselves), so I'd advise you go and take a look at those.
For this type of work, I tend to use a Google Doc which looks like this -
You can add loads more in here if you wanted, but I want to keep it simple for now. I like being able to pull Domain Authority straight into my spreadsheet, which you can do with a free SEOmoz API key.
Again, once you've got all your metrics, you can filter, sort and prioritise based on your key metrics. Then it's a case of starting to contact each site and giving them a good reason to link to you!
Final Important Step
Go and do something with this list! Importantly, ask yourself the following -
What reason can I give these people to link to me?
Why should these people give a sh*t about my site?
If you can answer these questions confidently, you have your hook ready to go and do your outreach.
That's about it for this post, I'd love to hear your feedback and comments below or feel free to tweet me if you have any questions or feedback.