Tuesday, March 27, 2012

IMified - Please don't shut down!

This morning I received the following email from IMified informing me that the service intends to shut down due to lack of profit and what seems like continual problems with the IM companies. The real shame is that IMified is an awesome service with absolutely no alternative - please do correct me if I am wrong, I'd love to know.

For those of you who are not aware of IMified, it is a service which allows application developers to create 'bots' - essentially IM users on networks like AOL, Google Chat, MSN and XMPP - and provides a single consistent API for asynchronous communication with a client. This service is absolutely brilliant for notification services (for example, an online web page tracker and notification service). Maybe IMified could have a future as an open source, cloud-based service. Either way, I really hope that we can save IMified.

Here is the email for your information:


Dear IMified Customer,

The purpose of this email is to alert you to a change in the IMified service.

Over the last several months, it has become apparent that running an IM bot hosting service is something that cannot be done profitably. When we created IMified several years ago, the popular IM networks had programs for running bots - bypassing limits in contact lists and message volume, for example. This made it feasible to run a large scale IM bot. However, over the last year, these programs have started to disappear. Our contacts at IM providers have left the companies or been reassigned. It's become clear that this is a business that public IM providers no longer want to be in.
We've tried to keep IM support operational despite the fact that not only have we had no support from IM networks, but we have also had to fight the networks actively stopping and shutting down IM bots. Continuing to do this requires a significant amount of resources, and does not bring in enough revenue to justify the battle. So we're shutting down the service.
The logical question that many of you will have next is "what can I do instead?" Unfortunately, we're not aware of any viable IM bot hosting services. Those that we knew of have all gone away.

One alternative is to host one yourself. Building a basic XMPP (Jabber) bot isn't hard. You just need to sign up with an XMPP host, someone like Gmail or will work for low-traffic bots. There are numerous tutorials and frameworks for building XMPP bots in your language of choice. A discussion thread on Quora has some other ideas.

Once you have an XMPP bot up, you can connect it to other IM networks using XMPP Gateways (also known as Transports). These sign on to an IM network for you, and relay messages to and from an XMPP account. There's not much in the way of public gateways available, so to go this route, you may need to host your XMPP server and your gateways yourself. Spectrum and Kraken are two popular gateways, and Kraken is available as an easy-to-install plugin for Openfire, an open source XMPP server.

If you have concerns or questions about the shutdown, please email

- The IMified Team
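As the email suggests, the hard part of an IM bot is the network plumbing, which an XMPP library or framework handles for you; the part you actually write is usually just a message handler. Here is a minimal, library-agnostic sketch of that handler logic in plain Python - the command names and reply strings are purely illustrative, not anything IMified or any particular framework defines:

```python
# Minimal IM-bot command dispatcher: the piece an XMPP framework would
# call for each incoming chat message. Command names and replies here
# are illustrative only.

def handle_message(sender, body, commands):
    """Route an incoming IM message to a command handler and return the reply."""
    parts = body.strip().split(None, 1)
    if not parts:
        return "Say 'help' for a list of commands."
    name = parts[0].lower()
    arg = parts[1] if len(parts) > 1 else ""
    handler = commands.get(name)
    if handler is None:
        return "Unknown command '%s'. Say 'help' for a list." % name
    return handler(sender, arg)

# Example command table for a notification-style bot.
COMMANDS = {
    "help": lambda sender, arg: "Commands: help, echo, subscribe",
    "echo": lambda sender, arg: arg or "(nothing to echo)",
    "subscribe": lambda sender, arg: "%s subscribed to %s" % (sender, arg),
}
```

A real bot would register something like `handle_message` as the incoming-message callback of whatever XMPP library it uses, and the gateway/transport setup described in the email takes care of reaching the other IM networks.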

Wednesday, March 21, 2012

Software development and the dangers of the word "Done"

People like using the word "done", and with good reason - if something is "done" it means we no longer have to worry about it and can get on with the next task; it's a decisive and positive word. But this word is usually misleading, and in the real world software development tasks are very rarely "done".

To say that a software development task is "done" tends to send a message to the project manager and client that the specific user story or feature will never need any revision and is completely free from errors. This is almost certainly not the case. It is a safe assumption that new code contains errors, even if you haven't found them yet, and you can be pretty sure that when the new feature is demonstrated to the client, or as soon as real people start to use the system, changes will be required.

And at the engineering level, well organized code will generally adhere to the DRY (don't repeat yourself) principle. This means that engineers will try to 'abstract' code, functions and structures to be as general as possible so that they can be reused as often as possible. It follows that although a feature may appear unchanged from one version to the next, a lot of the underlying code may well have changed; and although unit testing can 'positively' test and confirm consistent behaviour, a good dose of manual testing will do no harm.

But this article is not about code organization or testing techniques; it is about why we should try not to use the word "done". When we say this word, we probably really mean: "done to the point where another task is of higher priority". If this definition of "done" is generally accepted, then project managers and clients can continue as usual, but should not be alarmed when something marked as "done" contains bugs, needs changing or requires rework.

The Internet - and what it could be

The Internet is still basically a mass of insular websites that do not interact with each other. Websites are still primarily designed to be read and browsed by a human, whether from a PC, mobile or tablet device. Most websites are still just like books in a library, which do not communicate and share information with one another.

Websites have got prettier and include more media, like music, videos and animations, but essentially websites have not changed, and the Internet as a whole has not significantly changed for many years.

But the Internet has amazing potential: behind all of these insular websites are computers capable of complex data processing and high-speed communications, yet for the most part this power is used just to serve up pre-defined static content.

I think it is now time to build the REAL "web 2.0". For me, this means websites that communicate and share information automatically with other websites and services in (pseudo) real-time. The end goal is to turn the Internet into something really powerful - a single, massively parallel computer. The hard part has already been done for us: technologies like HTTP, DNS, TCP/IP and JSON are defined and well supported by tried and tested implementations - all the building blocks of a new Internet are in place.

But to create this new Internet, we must create a mechanism such that all data currently formatted and published for humans (the majority) is also available to other web services. Consider my own personal website:

If you visit this website from a standard web browser, you will be presented with a mixture of graphics, text and hyperlinks organised to be human readable and formatted to be attractive (like any other website). Unfortunately, this type of content is completely unsuitable for a third party web service to make use of, let alone use in a meaningful way. So, let's consider the following two proposed URLs:

Now, a third party web service can read the contents of 'services.json' and learn that the site has a service called 'contact_me', a programmatic version of the contact form, which allows the service to submit a contact request. The services.json file will also detail how the 'contact_me' service should be called, what type of data it returns and maybe some human readable text to explain what the service does and how it should be used. We could also implement a 'subscribeToService_xxx' service, which means that before being allowed to use the 'contact_me' service, one must first subscribe. The rules and details of how these services work are completely up to the website owner.
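To make this concrete, here is one possible shape for 'services.json' and how a third-party client might discover what the site offers. Every field name below is my own invention - as the post says, the schema is entirely up to the website owner:

```python
import json

# A hypothetical services.json as a site might publish it.
# All field names here are illustrative, not a defined standard.
SERVICES_JSON = """
{
  "services": {
    "contact_me": {
      "description": "Programmatic version of the contact form",
      "method": "POST",
      "url": "/services/contact_me",
      "returns": "application/json"
    },
    "subscribeToService_contact_me": {
      "description": "Subscribe before calling contact_me",
      "method": "POST",
      "url": "/services/subscribe/contact_me",
      "returns": "application/json"
    }
  }
}
"""

def discover_services(raw):
    """Parse a services.json document and list the advertised service names."""
    doc = json.loads(raw)
    return sorted(doc.get("services", {}))

print(discover_services(SERVICES_JSON))
# → ['contact_me', 'subscribeToService_contact_me']
```

In practice the client would fetch this document over plain HTTP from a well-known path, then read each service's entry to learn how to call it.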

Our third party web service could also decide to read the contents of 'data.json', which contains semantically structured data. This data might contain the details of my portfolio, my curriculum vitae, my contact details, links to social profile pages and any of the text found on the human readable website. It could also detail daily contractor rates, availability, location and phone number, which a third party recruitment service might find useful when searching, for example, for an available web developer in the Madrid area with jQuery experience. Again, the contents of 'data.json' are completely up to the website owner.
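A recruitment service crawling such 'data.json' files could then match developers against a search with a few lines of code. The profile fields below are hypothetical (the post leaves the schema to the site owner); the Madrid/jQuery search mirrors the example above:

```python
import json

# A hypothetical data.json published by a developer's personal site.
# Field names are illustrative only.
DATA_JSON = """
{
  "name": "Example Developer",
  "location": "Madrid",
  "daily_rate_eur": 400,
  "available": true,
  "skills": ["jQuery", "JavaScript", "PHP"]
}
"""

def matches(profile, location, skill):
    """Check whether a published developer profile fits a recruiter's search."""
    return (profile.get("available", False)
            and profile.get("location") == location
            and skill in profile.get("skills", []))

profile = json.loads(DATA_JSON)
print(matches(profile, "Madrid", "jQuery"))  # → True
```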

Online shops could employ the same standard to publish special offers, and could provide services to allow third party applications to check an order status or product stock levels.

A news website could publish their latest headlines in 'data.json' and maybe provide a search service in 'services.json' which could return a list of news articles that fulfill the search criteria.
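The news site's search service could be little more than a filter over the headlines it already publishes in 'data.json'. A sketch, with made-up headlines standing in for real published data:

```python
# Hypothetical headlines as a news site might publish them in data.json.
HEADLINES = [
    "Internet traffic doubles again",
    "New XMPP gateway released",
    "Local library digitises archive",
]

def search(headlines, query):
    """Return the headlines containing the query, case-insensitively."""
    q = query.lower()
    return [h for h in headlines if q in h.lower()]

print(search(HEADLINES, "xmpp"))  # → ['New XMPP gateway released']
```

The 'services.json' entry for this would describe the search endpoint, the query parameter it accepts, and that it returns a JSON list of matching articles.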

But the fun really starts when we combine these services. We could have language translation services, hotel reservation services, transport reservation systems, services that interact with social networks, and services to access and query government and official data. All of this opens up the possibility of creating sophisticated 'problem specific' services - for example, a service that knows how to plan an entire wedding and which other services it needs to locate available venues, check guests' calendars, send and receive invitations, book a chauffeur driven car, obtain estimates for food, manage gift lists and so on.

It would be really great to move in this direction, but it will take the support of lots of people. However, as a stepping stone, website owners can start by using LiveDirectory - a service which provides the functionality to publish and manage an arbitrary set of data and allows third party applications to subscribe to this data, which means that LiveDirectory will send change notifications to all subscribers when the data is updated. For example, I have implemented the equivalent of the 'data.json' file as a LiveDirectory profile:

The above LiveDirectory profile details information about me and allows anybody to 'subscribe' to any part of that data structure. Another example is the jQuery profile:

Where services can subscribe to be notified whenever a new version of jQuery is released, which is specifically located here:
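I won't reproduce LiveDirectory's actual wire format here, but conceptually a subscriber just receives a change notification and updates its local copy of the subscribed data. The payload shape below is purely hypothetical:

```python
import json

def on_change_notification(raw, local_state):
    """Apply a change notification to a local copy of the subscribed data.
    The {"path": ..., "value": ...} payload shape is an assumption for
    illustration, not LiveDirectory's real format."""
    note = json.loads(raw)
    local_state[note["path"]] = note["value"]
    return note["path"], note["value"]

# A subscriber learning that a new jQuery version was published.
state = {}
notification = '{"path": "jquery/latest_version", "value": "1.7.2"}'
print(on_change_notification(notification, state))
# → ('jquery/latest_version', '1.7.2')
```

The point is that the subscriber never polls: LiveDirectory pushes the change, and the subscriber's state stays current in (pseudo) real-time.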

The service aims to provide a simple but very powerful API-based data management, retrieval and subscription system to help the Internet take one step toward becoming a more useful tool for everyone.