Much Improved Success Rates

Yesterday we posted about the increased performance issues we've seen since we started adding new sites, and whilst investigating that problem we also fixed a duplication error that had been around for a while.

We have also taken the first steps towards alleviating the performance issues, and that has had a major impact: success rates are much improved today! There are still some other performance tweaks we are going to implement, and over the longer term we are also looking at modifying our server infrastructure so that it can scale better as we add new customers or new sites to the system.

We’ll continue to keep you posted on the progress but in the meantime we won’t add any new sites until we can be sure the performance is where it should be.

Duplication Error Partial Fix

As well as working on the high rate of server errors that we have been experiencing since we started to add a lot more social bookmarking sites, we have also been working on another, related issue.

The submission process works in several steps: it logs into the website, submits the job data, handles a confirmation step on some sites, and then checks for success and pulls out the URL of the created bookmark where available.
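In pseudocode terms, each submission runs through something like the following sketch (all names here are illustrative, not our actual code):

```python
# Illustrative sketch of the submission steps described above.
# The site/session objects and their methods are hypothetical.

def submit_bookmark(site, job, account):
    session = site.login(account)                 # step 1: log in
    session.submit(job.url, job.title, job.text)  # step 2: submit the job data
    if site.needs_confirmation:
        session.confirm()                         # step 3: optional confirmation
    result = session.check_success()              # step 4: check for success
    return result.bookmark_url                    # step 5: extract the created URL
```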

The server timeouts described earlier can happen at any point in this process. Sometimes the job has been successfully submitted and the timeout occurs on the final step, where we are checking for success. When that happens the job is marked as failed even though the submission actually succeeded. Our internal retry system then kicks in: it will retry a submission that failed due to a server error several times over a 24-48 hour period, and sites with higher PR are given a higher number of retries.
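As a rough sketch of that retry policy (the formula and numbers below are invented for illustration; only the 24-48 hour window and the PR weighting come from the description above):

```python
from datetime import datetime, timedelta

def schedule_retries(failed_at: datetime, page_rank: int) -> list[datetime]:
    # Higher-PR sites get more retries; this formula is made up for the example.
    attempts = 2 + page_rank
    # Retries are spread over roughly 24-48 hours.
    window = timedelta(hours=24 if page_rank < 4 else 48)
    step = window / attempts
    return [failed_at + step * i for i in range(1, attempts + 1)]
```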

The problem is that when the job is retried, the bookmark is already there (because it was successfully submitted the first time), so the retry fails with the error “Duplicate URL”! This is an oversight on our part and a bug that has been in the system since the beginning, but we didn’t notice it before because the number of failures due to duplication errors was very small.

But since we started adding more sites and experiencing general performance issues, more and more submissions have been failing due to timeouts, which in turn raised the number of duplication errors. A true duplication error should be very rare, so as these numbers increased it raised a red flag.

So we have now investigated the issue and fixed it – partially. If a duplication error is found, the submission process will now skip ahead to the final step and look for the posted bookmark URL. However, this only works if the account being used for the retry is the same account that made the original, successful submission, because we log into that user’s account, look at the submission history and pull out the bookmark URL that was created.
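Conceptually, the partial fix looks something like this (a hypothetical sketch, not our production code):

```python
# On a "Duplicate URL" error, skip to the final step and try to recover
# the bookmark URL from the account's own submission history.

def handle_duplicate_error(site, job, retry_account):
    session = site.login(retry_account)
    for entry in session.submission_history():
        if entry.submitted_url == job.url:
            return entry.bookmark_url  # this account made the original submission
    return None  # a different account made it; the URL field stays blank
```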

We use a pool of accounts for submission and rotate through them randomly for all jobs, so a different account may well be used on a subsequent retry, in which case the URL cannot be extracted.

The proper fix is to keep track of which account is used for every submission and, in the case of a failure, to use the same account when performing any retries. However, this is not a trivial fix and it cannot be applied retroactively.

What all of this means is that from now on you will see fewer failures due to duplication errors (they should now be very rare), but until we implement the full fix that tracks the account, reporting of the created URL will not work in all cases, so that field may be blank.

I’ll post again when the next phase of the fix has been implemented and also on the progress of the underlying server timeout errors.

New Social Bookmarking Sites & Performance Issues

Since we removed the Article & RSS submission tools we have been hard at work bringing more social bookmarking sites into the system. Last week we hit 70 sites. However, this seems to have also had an adverse effect on overall system performance.

We monitor the success rates of all sites and track failures carefully, and we have noticed a high number of server errors across the board since we began to increase the number of sites. At first we thought the issue was simply that some of the sites were poor performers in general – something we can’t know until we put them into the live system and test them with large volumes of submissions.

So we have been removing those with very high failure rates, but this seems to be only part of the problem. These server errors are almost all time-outs or empty replies, which amount to the same thing – the site we are trying to submit to is simply not responding. However, failure rates are increasing across all sites in our system, even those that usually have 100% success rates, which suggests an issue with our own server and not just with individual sites.

Right now we’re not sure exactly what the problem is. I don’t think it is related to high volume: a couple of years ago, when we had an active free service, our engine processed around 8 times more jobs per day than it does now. We have the server tech guys looking into it and we are also digging through the log files to try to figure out what is going on.

When a job fails, the credits spent on it are refunded, so if you look at your credit activity log (from the members page) you are likely to see lots of refunds there. Also, if a submission to a site fails due to a server error, that site becomes available for submission again, so you will also see the ‘SA’ (Submissions Available) column increasing as jobs are processed.
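In other words, the failure handling boils down to something like this simplified sketch (illustrative names only):

```python
# Simplified sketch of what happens on a server-error failure.

def on_server_error(member, job, site):
    member.refund_credits(site.credit_cost)  # appears in the credit activity log
    job.reopen_site(site)                    # site becomes selectable again,
                                             # so the job's 'SA' count goes up
```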

You can go into any job with an SA above 0 and re-select sites for submission to retry them. We also have an internal retry system which will automatically retry a submission a number of times before it fails permanently, but this happens over a short period – around 24 hours. You may select these sites for resubmission at any point, so please do make use of that feature to get more working submissions. In the meantime we shall keep working on the issue and keep you posted!

Article & RSS Submission Tools Now Removed

The last few remaining submissions for RSS feed & Article jobs finished over the weekend and today we have shut down both tools and removed the data associated with them. Thank you very much to everyone who used them over the past few years!

Our focus now is the Social Bookmark submitter, which has always been our most popular tool. As of today we’re up to 59 sites and we’re continuing to push that number. It’s actually been quite a while since we even hit 50 bookmark sites, so we’re pleased with that. Being able to concentrate on that one tool has helped, and we have also been developing some internal tools which have given us access to fresh new sites on a daily basis, so we should always have new sites to add as long as they are being built.

Don’t forget that every time we add new sites, those sites are available to all of your existing jobs – you just click the job ID number to edit it and any new sites that have been added since your last submission, along with sites you didn’t select or sites that had previously failed, are now available for submission. Select them just as you would for a new job and re-submit the job to these new sites.

In the jobs list there is a column marked ‘SA’ which is ‘Submissions Available’. This shows how many sites are available for submission for each job. By re-submitting your existing jobs to new sites as they get added to the system you can have a steady stream of fresh links coming in to your content.

The Article & RSS Feed Tools Are Now Read Only

Last week we blogged about the plan to phase out the Article and RSS feed submission tools. The first phase has been implemented today. The tools are still there, but the buttons to submit new jobs have been removed, along with the resubmission section of the job details screen.

You may still view the progress of your jobs and edit them, but you cannot submit them to new sites or submit new jobs.

In the meantime we’re continuing to add lots of new bookmark sites. We’re almost at 50 now. Don’t forget to check the ‘SA’ column of your bookmark jobs as this tells you how many new submissions are available. Then just click the job ID, select the sites and resubmit the job to the new sites.

Article & RSS Submission Tools To Be Phased Out Next Month

A little while ago we blogged about allowing our Article & RSS submitter tools to wind down naturally, stating that we would continue to monitor submissions and remove any sites that stopped working, but would not add any new sites.

The time has now come where we feel the current offering does not provide enough benefit to continue. Currently we have 14 RSS directories and only 10 Article directories. A few of you still use these tools, but usage is now so low that it is time to remove them.

Phasing Out In Several Steps

What we will do is remove the tools in several steps:

Step 1: Disallow new submissions

We will begin simply by removing the functionality to add new jobs to these tools. The actual tools pages will still remain in the system so you can continue to check on the progress of your existing jobs. At this time we will also remove mention of the tools from our sales page, training pages and so on.

We will do this next week, on Thursday 27th February 2014, so if you wish to use those tools, do so within the next few days.

Step 2: Wait for current jobs to finish

We will monitor existing jobs for these tools and will wait until every last submission has been processed before taking any further action. Due to our drip-feed functionality we expect this to take several weeks. Once all the jobs have been processed we will notify members so that you can take one last look at your jobs before we take down the tools entirely.

Step 3: Remove the tools altogether

After all jobs have been completed and members have had enough time to check over them (probably about a week) we will take the tools down entirely. The information about past jobs will no longer be available from our site and we will remove all references to them.

What’s Next?

Going forwards we are going to be focusing heavily on the social bookmarking tool, which has always been the most popular. We have recently written some back-end tools that allow us to incorporate new sites into the system a little more quickly than before, so we’re really hoping to increase the number of sites available.

So far this week we have added 8 new sites, though sadly we have had to remove some as well. It is an ongoing effort to keep the sites list fresh!

Social Bookmarking Site Networks Slapped by Google

Google has been very busy updating its algorithm over the last year or so, and many of the updates have hit certain social bookmarking sites. Note the operative word here – ‘certain’ sites, not all of them! There are most definitely some social bookmark sites that stand the test of time and enjoy high page rank and lots of traffic year after year.

IMAutomator lets you submit your site to many of these high-profile social bookmarking sites – Delicious, Bibsonomy, Diigo, Bitly, FolkD and many others – in just a few moments. One of our goals is to ensure that our list of supported sites is closely monitored and kept fresh. To that end we run a daily PageRank report which checks the PR of every site in the system.

Over the weekend just gone we noticed that around 8 sites in the system all dropped to 0 at the same time; they got Google slapped!
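Conceptually, that daily report is just a comparison of each site's stored PR against a fresh lookup – something like this sketch (the PR lookup is a hypothetical stand-in; there was never an official public PageRank API, so you would plug in whatever data source you have):

```python
# Sketch of a daily PR check that flags sites dropping to PR0.
# Not our production report; names and structure are illustrative.

def fetch_pagerank(domain: str) -> int:
    raise NotImplementedError  # stand-in: plug in your PR data source here

def daily_pr_report(known_pr: dict[str, int]) -> list[str]:
    """known_pr maps domain -> yesterday's PR; returns newly slapped domains."""
    slapped = []
    for domain, old_pr in known_pr.items():
        new_pr = fetch_pagerank(domain)
        if old_pr > 0 and new_pr == 0:
            slapped.append(domain)  # dropped to 0 overnight
        known_pr[domain] = new_pr
    return slapped
```

Eight domains appearing in a single day's report was the red flag here.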

What was also significant is that all of these sites appeared to belong to some kind of network. They linked to each other, which is in fact how we found them in the first place: we found one site in the network and then discovered the others through it.

It’s easy to put up a social bookmarking site

Templates such as Pligg make it incredibly quick and easy to put up a social bookmarking site. Add to that the ease with which you can buy expired or soon-to-expire domains with PR for just a few dollars, and it becomes very easy for someone to build a social bookmark network: buy up a few domains, put Pligg or a similar template on them, and inter-link them to keep the PR juice flowing.

We even tried such an experiment ourselves a couple of years ago, and we found that this network idea for social bookmark sites was NOT a good one. Within a few weeks, all the sites in the network had their PageRank removed – they ended up at either 0 or unranked. Since the purpose of bulk social bookmarking is to get backlinks, a bunch of PR0 backlinks isn’t going to do you much good! We removed that network.

Poor quality social bookmarking networks are short lived


However, because of this low barrier to entry, new networks of social bookmark sites are popping up all the time. On the surface this seems like a good thing, as there are always fresh new sites for us to add to the system, but in the long term it’s only good if those networks survive the Google algorithm changes. When a site drops to PR0 we deem it too low value to be of benefit (under our credit system a PR0 site would cost nothing to submit to anyway), and we remove such sites from the IMAutomator system pretty quickly 🙂

But there’s also another negative point that may not be apparent at first glance. Imagine you are the owner of one of these networks. You’re not creating it for fun; you’ll be monetizing that network in some way. Indeed, we often find such sites plastered with ads, some in particularly intrusive ways – full-page ads that cover the whole screen, tons of pop-ups, etc.

For us, this is not actually a problem because once a site is plugged into the IMAutomator system, members never see those ads – our submission engine sees them, and our engine doesn’t care 🙂 However, coming back to the owners of these bookmark networks… if you have a large (or small) network of sites and overnight a shift in the Google algorithm drops all of those sites to 0, what is going to happen? They will almost certainly stop getting the traffic they used to.

There are thousands of social bookmark sites to submit to, and whether you submit manually, use an automated submission service such as ours, or pay for a manual submission service, it is going to cost you time, money or both. Anyone who uses social bookmarking as part of their link building strategy wants the best value for that time and money, so all those PR0 sites drop to the bottom of the priority list.

Fewer submissions to the sites means less traffic for the site owners, which translates to fewer eyeballs on ads and ultimately less income. What will happen to that bookmark site network that just got slapped by Google? The site owner will undoubtedly take those sites down before long and replace them with something else.

High PR links provide the best long term value

So if you are a link builder and you have put your time and money into building links on those sites, only for them to disappear just a few months later, your investment is not as solid. What can you do about it? To our IMAutomator members, we suggest making sure you bookmark your best content on the best sites. These are obvious – they’re the ones with the highest page rank that stay in the system month after month.

We still suggest diversifying your link profile with a wide range of sites, so by all means throw some of the more ‘throwaway’ sites into the mix – just be aware that those links may not stick around as long as the high PR ones.

IMAutomator works on a credit system where the credit cost for a site is equal to its PageRank, which means a PR7 site costs 7 credits to submit to whereas a PR2 site costs only 2 credits. It can be tempting to avoid the high PR sites and select just the lower PR ones, as you’ll get a lot more individual backlinks for your money, but if you care about the quality and longevity of those backlinks then it’s worth spending those few extra credits on your best content.
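As a quick worked example of the credit maths (the site mix here is made up):

```python
# Credit cost equals PageRank: submitting one job to a PR7, a PR4 and
# two PR2 sites would cost 7 + 4 + 2 + 2 = 15 credits.
site_prs = [7, 4, 2, 2]
print(sum(site_prs))  # -> 15
```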

What’s the future of social bookmarking?


My personal opinion, from looking at what Google has been doing over the last few updates, is that they will continue to penalize the kinds of cookie-cutter sites that are put up in minutes and never developed in any way. What makes the big sites like Digg, Delicious and Reddit so popular? These are high quality sites; they do far more than just slap up a bunch of links. They monitor and filter submissions to keep the content high quality, they encourage user interaction, and so on. In other words, they are building the kind of site that people want to use, and this is exactly the kind of site that Google is pushing to the top of its rankings.

On the other hand, those who use templates such as Pligg and do nothing other than leave the site as-is may find their sites losing PR over time. That’s my prediction, anyway. I’m not bashing Pligg as a platform – I think it’s great. But as with any website, it should be used as a base, a building block, and not as the whole package.

If you’re a blog owner, you’d most likely use WordPress to build your blog. But would you just install the base software, keep the standard template and then just leave it alone? No, not if you cared about your blog! You’d install a custom theme that matches your content, you’d install some plugins that are useful for you and your visitors and you’d keep on top of comment spam to ensure that your blog remains high quality.

I think in this day and age, if somebody wants to create a new social bookmarking site that is going to hold its ranking with Google, then they need to do more than just upload a cookie-cutter template. It may even be that Google ends up targeting Pligg itself as a platform! I certainly hope that’s not the case, because we’ve developed our software to work seamlessly with Pligg sites, so we can add them to our system pretty quickly.

As always, time will tell!

Link Building with Misspellings

A little while ago I came across a blog post which is targeted at big brands but presents a technique for building links via misspellings. I put it on my to-do list and set it to one side; IMAutomator doesn’t really qualify as a ‘big brand’.

Link Building with Misspellings Technique

The basic technique can be broken down into a few steps (a rough sketch of the first two appears after the list):

  1. Take your brand name and use a tool such as the SEO Book typo tool to identify potential misspellings
  2. Construct the .com URL out of all of those misspelled words
  3. Use a backlink checker to find backlinks to those domains
  4. Identify possible link opportunities and perform outreach
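Here's a rough sketch of steps 1 and 2 – a simplified typo generator in the spirit of the SEO Book tool (the three rules below are only a small subset of real typo patterns):

```python
# Generate common typo variants of a brand name and turn them into
# candidate .com domains. Simplified for illustration.

def typo_variants(brand: str) -> set[str]:
    variants = set()
    for i in range(len(brand)):
        variants.add(brand[:i] + brand[i + 1:])        # missing letter
        if i < len(brand) - 1:                         # swapped adjacent letters
            variants.add(brand[:i] + brand[i + 1] + brand[i] + brand[i + 2:])
        variants.add(brand[:i] + brand[i] + brand[i:]) # doubled letter
    variants.discard(brand)
    return variants

candidate_domains = sorted(v + ".com" for v in typo_variants("imautomator"))
print(len(candidate_domains), candidate_domains[:5])
```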

I didn’t actually do step 1. What happened is that yesterday I was looking at the IMAutomator analytics over a long period of time and spotted some misspellings which had sent me fairly decent amounts of traffic. When I had a closer look, I found a lot more of them.

I exported them all and ended up with 223 misspelled versions of ‘imautomator’! And here I am thinking that I’d created a catchy, easy-to-spell brand name! Oh well 🙂 Actually I have to admit, I mis-type it myself sometimes!

Should you optimize for misspelled words?

Years ago, some SEOs would build targeted pages around their top misspelled words in a bid to get more traffic. It’s not really a good idea: if a potential customer were to land on one of these pages, it wouldn’t give a great impression to see your own brand name misspelled on your own website!

But these days it’s entirely unnecessary anyway. Google is getting far more intelligent and will automatically show you search results for the word it thinks you actually intended to type. Indeed, I went through a handful of my misspelled words and plugged them into Google, and sure enough it said “Showing results for imautomator” and showed this website in the top spot. This makes sense, as these keywords are already sending traffic.

So back to the link building…

The purpose of this technique is to build backlinks using these misspellings, so I decided to put it to the test and see how well it worked. The second step is to put a .com on the end of each word. Many of my keywords had spaces in them, such as ‘im automator’ and my absolute favourite, ‘i am automator’!

Incidentally, in case you have ever wondered, yes IMAutomator is all one word, and the IM stands for Internet Marketing.

So I removed all the words with spaces and other invalid characters, trimming the list down to 156 URLs. Next I used Majestic’s bulk backlink checker to see which of these had any backlinks.
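Incidentally, that clean-up step is easy to script – a quick sketch (the regex is a simplified version of the real domain-label rules):

```python
import re

# Keep only strings that are valid domain labels: letters, digits and
# hyphens, not starting or ending with a hyphen. No spaces allowed.
VALID_LABEL = re.compile(r"^[a-z0-9]([a-z0-9-]*[a-z0-9])?$")

def usable_domains(misspellings: list[str]) -> list[str]:
    return [m + ".com" for m in (s.lower() for s in misspellings)
            if VALID_LABEL.match(m)]

# e.g. 'im automator' and 'i am automator' are dropped here, which is
# how 223 misspellings shrank to 156 candidate URLs.
```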


Using Majestic SEO to find domains with backlinks

The first one that popped up was http://imautomation.com. It looks like this used to be a site that did something very similar to IMAutomator but is now dead. This has potential, because any live site linking to that company could potentially link to mine, as we offer pretty much the same thing. It had 68 backlinks according to Majestic and Ahrefs, but as I only have a free plan on both, I looked at the handful of backlinks they showed and then used a crappy little backlink checker for the rest. It was a bit more effort but showed the data I needed.


Using Ahrefs to look at the actual backlinks pointing to the domain

This tool doesn’t filter out internal links, and it also shows entries for whois sites, ranking sites and all that garbage. However, it did show the PR of the linking page, which was handy. I only looked at pages of PR2 or higher, as anything less isn’t worth the time. I found a couple of live resource pages and contacted the webmasters to ask them to change the links. While I was at it, I also ran a broken link checker to identify some broken links on their pages for them, as a nice gesture.

The next domain of interest was http://iautomator.com/, which redirected to some guy claiming to be an Internet Marketing expert. However, I couldn’t find any decent backlinks for that one.

Unfortunately that was it; none of the other domains showed any backlinks.

Conclusion: Is Link Building with Misspellings Worth It?

This whole process took maybe a couple of hours, and I ended up with only one solid link opportunity on which to perform outreach – and of course there’s no guarantee that the webmaster will even get the email, let alone act on it. So, for me, it was not worth the effort.

However, I would suggest that you at least have a look in your analytics for misspellings. If you see some that are sending a lot of traffic, and yours is the kind of site that attracts lots of links, then it’s very possible that some of those links point to a mis-typed URL that should be pointing at your site – and then it might well be worth doing the rest of the steps.

Remove Dead Links From IMAutomator (Free Broken Link Checker Tool)

Many IMAutomator members have submitted thousands of bookmarks to the system over the years, and many of the older links may point to sites that no longer exist. These just clog up your account (and our database!) and get in the way when you’re trawling through your existing jobs to re-submit them for additional backlinks.

Remove dead links in bulk

There are two features which help you remove dead links in bulk, plus a neat little Chrome plugin called Check My Links. First, if you know you have many bookmarks for a particular website which no longer exists, you can use the job search functionality to find them all at once.

Second, there is the bulk job deletion feature, which lets you click all the jobs you want to select and then click a button to delete them. Of course, please do be careful with this feature!

The Chrome plugin quickly analyzes all the links on a page and looks for broken ones. Links which are OK are highlighted in green and broken links are highlighted in red, so it’s easy to see at a glance which jobs are pointing to dead links.

So all you need to do is use the search to pull up jobs from a site that is gone, click the ‘Check My Links’ button, click the checkmark next to any job that has a red broken link, and then delete all of those jobs. This only takes a few moments, and you can easily remove dead links from IMAutomator in bulk.
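If you'd rather script the link check than use the browser extension, the core of it is only a few lines – a minimal sketch using Python's requests library:

```python
import requests  # third-party: pip install requests

def is_dead(url: str) -> bool:
    """Treat HTTP errors, timeouts and connection failures as dead links."""
    try:
        resp = requests.head(url, allow_redirects=True, timeout=10)
        if resp.status_code == 405:  # some servers don't accept HEAD
            resp = requests.get(url, timeout=10)
        return resp.status_code >= 400
    except requests.RequestException:
        return True

urls = ["http://example.com/", "http://this-site-is-gone.example/"]
print([u for u in urls if is_dead(u)])
```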

SolveMedia CAPTCHA Support Added

You may have noticed that over the last few months a new type of CAPTCHA, from a company called SolveMedia, has been popping up all over the web. These CAPTCHAs vary – sometimes you just type in the words you see (pretty standard), but other times you need to solve a puzzle, identify a brand, or do other strange things.

Many social bookmarking sites have started using this new style of CAPTCHA on their submission pages, and until now we were unable to support it and had to put those sites to one side. Today we got SolveMedia support working, which means the range of social bookmark sites we can incorporate into IMAutomator has just grown.

We’ve put 8 new sites on trial today – all PR3 & PR4 – and as usual we shall monitor their performance over the next few days and add them to the live system if they work well. We have quite a large list of sites to look at, so hopefully we can grow our list of social bookmarking sites over the coming months.

I know I say that a lot, and the list often hovers around 30 sites, but now that we’re winding down the Article & RSS submitter tools, more time can be dedicated to the social bookmark submitter. As always, we never know how long a new site will last, so time will tell how these new ones work out!
