I just wanted to write a quick post about what we’re working on at the moment. After the launch of Release 2 we ran into a couple of problems. The first was that the engine couldn’t keep up with the increased workload (which seemed odd, as we’d already rewritten it to run around 60 times faster than the old one!). The second, also a result of the increased number of daily submissions, was that our server was being blocked by many of the sites we were submitting to.
These problems have obviously taken priority over everything else. The speed issue has been resolved: we found a bottleneck (a few, actually!) that was preventing the engine from running at full capacity. With those fixed, it now runs exceedingly fast and processes thousands of submissions concurrently.
The second problem has been trickier to resolve and is still ongoing. We’ve started routing our submissions through a variety of proxy servers, which spreads the load across multiple IPs, and since there’s no real limit to how many proxies we can use, that solves the initial blocking problem. However, proxies are not as reliable as a direct server connection: many submissions now fail with a timeout or some other server error caused by the proxy rather than the site itself, and that’s what we’re now working to resolve.
We already had a retry mechanism in place: if a submission fails due to a server error, it is retried a number of times before a permanent failure is recorded. Some sites are just a little flaky, so this tends to smooth over temporary glitches. We’ve now increased the number of retries (which in turn gives the engine more work to do, which is why we’ve also been working hard to speed it up!), but some sites still fail for a good percentage of submissions, which is not what we want.
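The retry logic described above boils down to something like the following sketch. The retry count, delay and exception type are assumptions for illustration; the post doesn’t give the real figures.

```python
import time

MAX_RETRIES = 5          # assumed count; the real number isn't stated
RETRY_DELAY_SECONDS = 2  # assumed fixed pause between attempts

class ServerError(Exception):
    """A transient failure (timeout, 5xx, proxy error) worth retrying."""

def submit_with_retries(submit, max_retries=MAX_RETRIES,
                        delay=RETRY_DELAY_SECONDS):
    """Call submit() up to max_retries times; a permanent failure is
    recorded only after every attempt has raised a ServerError."""
    for attempt in range(1, max_retries + 1):
        try:
            return submit()
        except ServerError:
            if attempt == max_retries:
                raise  # all attempts exhausted: permanent failure
            time.sleep(delay)
```

Raising the retry count only changes `max_retries`, but each extra attempt is another full submission the engine has to process, which is exactly the extra workload mentioned above.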
The goal of IMAutomator is to create backlinks for our members. More sites = more backlinks. Higher success rate = more backlinks. As we’ve only recently launched a new Release, we want to work on these core aspects before introducing any new functionality, so we’re splitting our time between improving success rates, cleaning up any bugs we find and adding new sites to the system. We’re not planning to start work on any new features or submission tools until we’re happy that our existing tools are solid.
What we’ve also noticed is that many of these recorded failures are not actually failures at all! We determine success by examining the output returned from the submission process for some kind of success identifier. This varies for each site, but usually there is either a success message or we see the details of the bookmark that was created. Since we started using proxies, sometimes the submission works but we get a timeout while the result is being returned, so our success check records a failure when in fact the submission went through!
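The check described above amounts to scanning the returned page for a per-site marker, roughly like the sketch below. The site name and marker strings are made up for illustration; the real identifiers vary from site to site. The second assertion-style comment shows exactly how the false negative arises: a proxy timeout yields no output, so the check reports failure even though the bookmark was created.

```python
# Hypothetical per-site success identifiers -- the real ones differ per site.
SUCCESS_MARKERS = {
    "example-bookmark-site": [
        "Your bookmark has been added",  # a success message, or
        "/bookmark/",                    # details of the created bookmark
    ],
}

def submission_succeeded(site: str, output: str) -> bool:
    """Return True if the page returned by the submission contains
    any of the site's known success identifiers."""
    return any(marker in output for marker in SUCCESS_MARKERS.get(site, []))
```

If the proxy times out mid-response, `output` is empty or truncated, no marker is found, and a failure is recorded regardless of what actually happened on the site.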
False negatives are better than false positives, but of course this is not ideal! Now that we’ve reworked the engine to operate a little differently, our next step is to try a new method of determining success that doesn’t rely on this output. We’ll be working on that next week. If it works, it may mean that not only can we determine success more accurately, but we may even be able to start reporting the actual URLs of the bookmarks created! That’s the hope, anyway…