Thursday, 3 July 2014

3 predictions on the impact of the Payroll-2-Pension PAPDIS data standard

Yesterday (2nd July 2014) a significant press release was issued by ‘Pensions BIB’ announcing that they were creating a free data standard (named the PAPDIS standard) to help simplify the transfer of data between Payroll systems and Pension systems. It’s significant because if this data standard is adopted it will be used to transfer every UK employee’s personal information. Including yours.
I had the privilege of being invited as a guest to the last ‘Pensions BIB’ meeting in London, also attended by the Pensions Regulator, The Department for Work and Pensions, The Chartered Institute of Payroll Professionals, the British Computer Society, the Business Application Software Developers group, and other influential representatives of the Payroll and Pensions industry. That gave me an opportunity to thank that group for their efforts so far to create a data standard for what will become an essential business process for every UK company. Now that the press release has been issued I can use this blog to thank them publicly.
But I’d also like to take this opportunity to make three predictions about what a data standard will do to the auto-enrolment market and its constituents.
  1. My first prediction is that it will firmly cement Payroll as the IT system in which auto-enrolment assessment and pension calculations are done. At the moment there are several places where assessment can be done: in Payroll, in independent Middleware, or in the Pension Provider’s systems. The reason for this is that it has not always been possible to assess staff in Payroll (because Payroll software may or may not have that capability). But PAPDIS assumes that Payroll will do assessment. That’s good news because it’s the most efficient place to do assessment in terms of business process. If assessment is done anywhere other than Payroll then Payroll needs to stop and wait for a data feed back from the assessment system containing individual employee Pension deductions. That’s horribly inefficient and just would not work for many SMEs in this country.
  2. My second prediction is that not every Pension Provider will adopt PAPDIS (even if every Payroll system does). That will create a schism in the Pensions world: those Providers that can take a PAPDIS file directly from Payroll and those which cannot. The reason I think some Pensions schemes will not adopt PAPDIS is because they cannot afford (in terms of time or money) to change the way they accept Pension data from 3rd parties.
  3. My third prediction is based on the first two being correct. I predict the role of independent AE middleware will become clearer and better defined as a result of PAPDIS. AE Middleware will be able to take a PAPDIS feed and then do the things that Payroll software is not willing to do. For example, issue employee communications, provide a web-login for employees to monitor their pension payments & other benefits, and perhaps gather in more Employee/Employer data and be able to output a Pensions data file to Pension Providers which are not PAPDIS compliant.
The bottom line is that the creation of a data standard that will be readily adopted by all Payroll software providers is good news for everyone: Payroll practitioners; Pension companies; and Flexible Benefit Middleware companies. It immediately simplifies the ongoing auto-enrolment business process but it also clarifies “who should do what” for auto-enrolment. Something that until now has not always been clear.
So what does PAPDIS mean for my company: SystemSync?
In the short term: we’re still helping our customers work in a standard-less environment in which every payroll export format differs from the next. We will continue to do this until PAPDIS is adopted by the entire Payroll software community.
In the medium term there’s a job for SystemSync to help with the rapid adoption of PAPDIS. SystemSync can take a PAPDIS file and convert it to a Pensions contribution or AE Middleware input file format. That means SystemSync can help cut the cost and the time for anyone wishing to work with the PAPDIS format. And that helps with the adoption of this nascent standard.
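To illustrate the kind of translation involved, here’s a minimal Python sketch. The column names below are hypothetical placeholders, not the real PAPDIS schema (which defines its own field list); the point is the mechanical re-mapping from one column layout to another.

```python
import csv
import io

# Hypothetical PAPDIS-style column names mapped to hypothetical
# provider-specific column names. Illustrative only.
FIELD_MAP = {
    "EmployeeID": "MemberRef",
    "Surname": "LastName",
    "Forename": "FirstName",
    "PensionableEarnings": "Earnings",
    "EmployeeContribution": "EeConts",
    "EmployerContribution": "ErConts",
}

def convert(papdis_csv: str) -> str:
    """Re-map a PAPDIS-style CSV into a provider-specific column layout."""
    reader = csv.DictReader(io.StringIO(papdis_csv))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=list(FIELD_MAP.values()))
    writer.writeheader()
    for row in reader:
        writer.writerow({dst: row[src] for src, dst in FIELD_MAP.items()})
    return out.getvalue()
```

A real conversion also has to handle validation, date formats and scheme-specific rules, but the shape of the job is this simple.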
In the long term the presence of a standard will help SystemSync build seamless end-to-end data integrations, because it will mean we don’t need to waste time working out what data should be available, what that data means and how to map that data. But the other thing we desperately need (to build seamless data integrations) is APIs. APIs exposed from Payroll and APIs exposed by Pension companies or Middleware. Once those APIs are in place, integration-platforms-as-a-service like SystemSync can remove the dreaded CSV file from the data transfer process and make everything faster and more secure.

Thursday, 17 April 2014

If your SaaS product does not have an API then it’s not valuable

My colleagues at SystemSync Solutions Ltd and I have spent a lot of time recently looking at Cloud SaaS Payroll systems in the UK. During that time we’ve also migrated our traditional accounting package (which used to sit on a server in our office) into the cloud. Our findings and experiences from these activities have led us to understand that if a SaaS product does not have an API then it is no better than a traditional desktop software product.

Here’s why:

Firstly, there’s no financial advantage to using a cloud SaaS tool. Our old desktop accounting package used to cost me a monthly fee to license. The SaaS version is approximately the same monthly fee. 
Conclusion: in this case SaaS gives no significant financial advantage to the customer

Secondly, there’s only a small productivity advantage to using a cloud SaaS tool. It’s true that SaaS can be accessed anywhere (such as from home), but my in-office desktop accounting package was also accessible from home via VPN. It’s also true that cloud SaaS does not require me to install new software versions. But to be honest the job of installing new versions of desktop software is trivial and reduced to a few clicks, and most desktop software packages do a great job of updating themselves in the background. It has to be said that one clear advantage of using a SaaS accounting package is that my accountant can access my account on a regular basis to keep my books in order. That’s good for them, but irrelevant for me.
Conclusion: in this case SaaS gives some, but not significant, productivity advantages to the customer.

Thirdly, when it comes to security there’s no real advantage to using a cloud SaaS tool. Personally I think my company’s financial software is now marginally safer than when it was stored in my office. When I moved my accounting package into the cloud I made a choice to trust my company accounts to a very large established business which I believed could be relied on. At the same time I removed from my own business any responsibility for local security/back-up/access/password management on the server containing my accounts package. But the advantage is small. Security of my desktop accounting software was already pretty good: I know who has access to the room it’s in; I know who has access to the machine it sits on; I know it cannot be accessed without 2 sets of usernames/passwords which only I know. Plus in the back of my mind I know that SaaS companies store all their customers’ data in one place, meaning if a hacker did get through then they would be able to steal thousands if not millions of companies’ financial data in one go.
Conclusion: there’s no real security advantage to using Cloud SaaS.

So if there’s no clear advantage on price, security or productivity for customers of cloud SaaS then what’s left? The answer is data integration. Enabled by SaaS platform APIs.

Using Cloud SaaS tools which have APIs means I can “add on” additional functionality to my SaaS accounting platform. I can link it to other SaaS products I use like time sheets, e-invoicing platforms, e-commerce markets and so on. This helps me reduce my manual data entry tasks. And that gives me real justifiable productivity gains and the possibility to scale my business without getting slowed down by admin.
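What that “add on” integration looks like in practice can be sketched in a few lines of Python. The two services and their payload shapes here are entirely hypothetical (every real SaaS API defines its own), but the pattern is always the same: fetch records from one API, reshape them, push them to another, with no manual re-keying in between.

```python
# Hypothetical payloads: entries as a timesheet API might return them,
# reshaped into line items an accounting API might accept.

def timesheet_to_invoice_lines(entries, hourly_rate):
    """Roll up timesheet entries into one invoice line per project."""
    hours_by_project = {}
    for entry in entries:
        hours_by_project[entry["project"]] = (
            hours_by_project.get(entry["project"], 0.0) + entry["hours"]
        )
    return [
        {
            "description": f"Consulting: {project}",
            "quantity": hours,
            "unit_price": hourly_rate,
            "amount": round(hours * hourly_rate, 2),
        }
        for project, hours in sorted(hours_by_project.items())
    ]
```

Without APIs on both ends, that reshaping step is a person copying numbers between two browser tabs.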

And that leads me back to my opening sentence of this blog: if a SaaS product does not have an API then it’s not valuable to the customer, because it’s not helping them move their data around their business. Instead it’s keeping their data siloed and isolated, which is a situation no better than the old desktop software equivalent. The big incumbent software houses want to move you (their customer) into the cloud so they can stop supporting multiple desktop software versions. I understand why they’d want to do that. But if those same companies don’t also provide an API into the SaaS product then their customers’ data is just as “locked in” as it is in desktop software.

The UK Payroll market is dominated by legacy desktop payroll software products, but there is a new breed of emerging Cloud Payroll products. To date I have only encountered one which has an open API: PaySuite. That’s amazing. It’s amazing because almost every other software market sector is dominated by SaaS products with APIs which are chaining together to provide seamless data integrations. That seamless exchange of data is eradicating mundane office tasks, meaning that businesses which adopt Cloud SaaS tools overnight become more productive and efficient than rival businesses which do not. That means SaaS with APIs gives a competitive advantage to the customers who use them. SaaS without APIs does not give any advantage.

Right now my policy on buying cloud SaaS products is “if it doesn’t have an API then I’m not interested”. 

Monday, 4 November 2013

Why we're sunsetting @datownia

This month (November 2013) we made a decision to withdraw @datownia from general availability, meaning that new companies cannot sign up. But we're still continuing to service the businesses which currently use datownia.

There are 3 reasons for this decision. Here they are, in no particular order:

  1. Focus (of our attention and our limited resources) now needs to be on SystemSync.
  2. We could not find the right marketing strategy for datownia.
  3. A self-service API is too complicated for non-technical business executives to operate. 
Datownia was a software product built to answer a single question: could API creation and management be reduced to a level of simplicity that meant a non-technical person could operate it? After running Datownia for over a year we felt the answer was "no". There are some aspects of API creation that require a fundamental understanding of data, security and subscription management that most business executives do not have.

I'm not going to say datownia was "ahead of its time". I don't think it was. It's just that the subject matter (i.e. the creation and operation of APIs) is technical, complex, and ultimately has no room for error, because APIs are critical to the smooth operation of apps and data interchange between businesses. To paraphrase Clemenceau's "war is too important to be left to the Generals": in a way, APIs are too important to be left to the business to operate.

But we did learn some very important lessons from Datownia which we are now applying to great success in the SystemSync product:
  1. There are massive financial savings to be made in the world today by increasing the efficiency of on-boarding data into a single large organisation from multiple (hundreds, thousands) of small companies using API technology. Call centres, paper forms, emailed Excel docs can all be replaced by data APIs.
  2. Small companies can and will work with Excel files held in Cloud files stores (without complication or snobbery or over-zealous concerns about security)
  3. A platform that can remotely "write" an Excel template into Dropbox, Google Drive, SkyDrive or Box, and then make the contents of that Excel document available as a federated JSON API, and do this securely and at scale across thousands of accounts, will create real business value in multiple markets. This is the mission statement of SystemSync.
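At its core, that third lesson is a small transformation: take tabular rows out of a spreadsheet and republish them as JSON. Here's a minimal Python sketch of just that transformation. The workbook parsing is stubbed out as a plain list of rows; a real implementation would load the file from the cloud store with a spreadsheet library (such as openpyxl) and serve the JSON behind authenticated endpoints.

```python
import json

def rows_to_json(rows):
    """Treat the first row as a header and emit the remaining rows as
    JSON records, the way a spreadsheet-backed API presents its data."""
    header, *body = rows
    records = [dict(zip(header, row)) for row in body]
    return json.dumps(records)

# Rows as they might come out of an Excel sheet (stubbed for illustration).
sheet = [
    ["email", "access_level"],
    ["", "admin"],
    ["", "viewer"],
]
```

Everything else (the sync with the cloud file store, security, scale) is engineering around that one idea.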
Datownia, and the spirit of simple data interchange between businesses, lives on within SystemSync.

Friday, 31 May 2013

Managing Enterprise Identities in the Cloud.. using an Excel sheet

Last month I travelled to Munich, Germany, to demonstrate a visionary concept at a prestigious IT trade show: the European Identity and Cloud 2013 conference organised by KuppingerCole.

Specifically, we were invited by Craig Burton (Distinguished Analyst at KuppingerCole), who had spotted an opportunity. Craig’s vision was that datownia’s ability to create Cloud-hosted APIs from spreadsheets could be combined with Microsoft’s Azure Active Directory and Graph Store services to drive federated single-sign-on to a mobile web app, using an Excel sheet (as the employee register) and social networks (as the identity verification service).

If you are in the identity management business, and you know about KuppingerCole’s “Computing Troika”, then you will know this concept is a big deal: the frictionless management of employee identities at the epicentre of the three seismic shifts which are impacting businesses today: cloud computing, mobile computing and social computing.

For the rest of us it means “making it very very easy to administer secure employee access to a cloud-hosted Enterprise mobile app using their Facebook/Google/Yahoo login”. 

So why is this so important?

Metaphorically speaking, “Enterprise IT” is under incredible tension right now because it’s being stretched by unstoppable forces (cloud, mobile and social computing). One of the biggest points of tension is “identity management”, which is a subset of “security”. ‘Security’ is the single biggest obstacle thrown up by any and every IT team when their business teams want to use their own devices for work, access amazing features in new cloud apps, or just work remotely.

Identity management is the answer to the question “Who in my organisation (or from other organisations) is allowed to access something (like a cloud SaaS product), and when they get access to that something then what can they do with it (what level of authority do they have)?” The big boys (the global corporates) can afford to spend $250K on an Enterprise Identity Management solution. They can also afford to employ staff to operate that system. The trouble is that 96% of the world’s businesses are SMEs. Those SMEs are exactly the type of businesses which are rapidly adopting cloud, mobile and social technologies. They are exactly the type of business which now has an identity management problem.

Identity management in the cloud/mobile/social world is a real headache for every business. It’s a headache because it’s technically difficult to sort out and it’s difficult to administrate. Identity Management is not a core expertise of most businesses. Yet as soon as an Enterprise ‘extends’ itself beyond its office premises (and the safety of its physical networks, routers and firewalls) Identity Management becomes an issue. Solve the Identity Management problem and you are a long way down the road of helping every business in the world operate securely. And that’s exactly what we did (guided by Craig at KuppingerCole and enabled by Microsoft’s Azure Active Directory and Graph Store services).

Our demo is very simple. It has 2 actors: 
  1. a company identity management “administrator” (could be an HR person, or a CEO) and 
  2. someone else (could be an employee or a business partner). 
The administrator wants to grant the employee access to password-protected content in a cloud mobile app. In our example the cloud mobile app is actually a “conference agenda app” for the EIC 13 conference. The administrator facilitates access to the app by adding the employee’s name, email address and the chosen access level to an Excel sheet stored on their computer. The employee then opens a browser, surfs to the mobile app’s URL and logs in with the same email address. The identity of the employee is verified by a single sign-on via a social network (either Google Gmail, Facebook, or Yahoo). Once authenticated the user can access the cloud mobile web app, but only to the correct degree of authorisation (as specified by membership of the appropriate user group as detailed in the Excel document). End of demo.

High level architecture of WAAD Manager, showing the integration between a cloud app (EIC conference app), social networks (Gmail, Facebook), Azure Active Directory/Graph Store & user data stored in an Excel doc
Behind the scenes things are a little more sophisticated, but not much. Let’s walk through it step by step.
  1. The Excel sheet is turned into a cloud-hosted API by datownia
  2. A simple app called “data pump” synchronises the contents of the Excel sheet with Azure Active Directory (for Identity Management) and Azure Graph Store (to store the content to be displayed in the mobile web app)
  3. The employee logs into the web-app, which instantly polls Azure Active Directory to make sure it recognises the employee’s email address and then re-directs the employee to their chosen social network. For example Gmail.
  4. The employee logs into Gmail, and thereafter is automatically returned to the mobile web-app which recognises the employee as an authenticated bonafide user. 
  5. The mobile web-app only displays content or functions which are appropriate to the access level of the user (as stored in Azure Active Directory).
In this way we have made the secure identity management of employees in the cloud as simple as typing data in Excel: something which is within the capability of any business no matter how small. 

The credit for this concept belongs entirely to Craig Burton who spotted the opportunity to combine datownia’s data sharing APIs with Microsoft’s Azure services. It was his vision and his encouragement that brought it all together on stage at EIC 13.

We’re also indebted to Kim Cameron (Chief Identity Architect at Microsoft) for spending half of his keynote speech at EIC talking about this demo and telling the audience of CIOs it was the shape of things to come. And finally to James Baker, Technical Product Manager at Microsoft, and one of the brains behind “Graph Store”, for supporting our dev team as they put the demo together.

Tech tips for working with Azure Graph Store

In April 2013 Release Mobile built a demo app using Windows Azure Graph Store. We put this tech blog together to share some tech tips on using Graph Store.

The project objectives:

We built the Windows Azure Active Directory "Manager" app in collaboration with KuppingerCole and Microsoft to demonstrate how single-sign-on access to a mobile web app (in this case a replica of the EIC13 conference app) can be governed, via Azure Active Directory, by information held in an Excel doc. The user identities are authenticated by social networks such as Facebook, Google and Yahoo.

The project had three aims:
1) Present a vision of the future in which access to Enterprise Cloud apps is securely managed by single-sign-on authentication facilitated by Social Networks.
2) Present a very low-friction method for Enterprises to manage their employee identities, using Azure Active Directory, Excel and the API-as-a-Service app
3) Demonstrate how Azure Graph Store is a multi-function schema-less data management service that can be used to both extend Active Directory and to hold App data.


What is Windows Azure Graph Store?

A graph store is a named data store containing data tuples. These are identified with a compound key (_Item1 & _Item2) and have a value property capable of holding complex data types.
The Windows Azure Graph Store is an extension of the Windows Azure Active Directory Graph.
The tuples are formatted in JSON.
It is managed via a RESTful API.


In order to start using a Windows Azure Graph Store, you will need an Azure subscription and an Active Directory Tenant.
More details on pre-requisites can be found here:

How to access a Graph Store

Consider this URL:
This is an example of  a URI to a tuple with item identifiers “id1” and “id2”, within the “myGraphStore” named graph store of my active directory tenant named “”
The resulting object might be of the format:
{
  "_Item1" : "first item",
  "_Item2" : "second item",
  "AttributeName" : AnyPrimitiveValue,
  "ComplexAttribute" : {
    "AttributeName" : "value",
    "AttributeName2" : "value2"
  },
  "AttributeCollection" : [
    {
      "Attribute1" : 100,
      "Attribute2" : true
    },
    {
      "Attribute1" : 200,
      "Attribute2" : false
    }
  ]
}

HTTP operations to manage data:
  Create: POST
  Update: MERGE
  Replace: PUT
Posting a new tuple to a non-existent graph store will automatically create the store and tuple.
Permissions on a graph store are managed via a permissions document which can be found at {tenant}/{graphstore}/$permissions
The permissions document specifies the name of claims that must be present in a bearer token obtained from Windows Azure Access Control Service (ACS). For information about obtaining access tokens from ACS, see the following link; 
If anonymous access is permitted to a graph store via the permissions document for a given operation (AnonymousRead or AnonymousWrite are true), the REST request need not include an access token.

Extending Active Directory

Each AD Tenant has a specially named graph store called “graphextension” which is used to add properties to AD entities.
For example, to add a property named “programme” to an AD User entity, you can post the following:
{
  "_Item1" : "users",
  "_Item2" : "programme",
  "OwningTenant" : "",
  "ValueFormat" : "{graphstore}/targetGraphName/{id}/programme"
}

Then, calling:
  ('')/ /programme
will return a URL. When that URL is called, the joe@ tuple is returned, containing the extended data.

Querying the graph store

A data tuple is identified by a tuple id.  It is possible to use wildcards (*) in place of item1 or item2 in order to return a concatenated result set.

Thanks to Ovais Oozeer at Release Mobile Ltd for providing the tech tips for this blog.

Monday, 18 February 2013

Why I'm changing the @datownia business model

We launched @datownia last year after rapidly developing it for 9 months, working closely with several small companies to iterate through product (and customer) development cycles. Our objective was to create a self-service API creation platform that any company could operate without the need for an API expert. 

This month we're going to change the model and pricing structure of datownia: taking it from a low cost, self-serve, pay-as-you-go model to a relatively higher price point with further fees for setting up and operating an API-as-a-Service for our customers.

This blog explains the 2 reasons behind this change.

The first reason is that it was always necessary to prepare data for an API, but the required data skills didn't exist in our target market. Our clients needed access to a skill set they don't have themselves.

The companies we worked with to trial datownia didn't ever really understand data, data structures and APIs. There's nothing wrong with that because that's the market we wanted to help bring API technology to. But it did mean our team had to help them structure, clean and present their data so that other Developers would be able to use it (via our APIs). 

In our new model we're offering data preparation and API creation services to our customers as part of an "API Creation" service. Then we offer "API management" services as part of our API-as-a-Service strategy. 

The second reason is that we couldn't devise an effective marketing and sales strategy to sell APIs to SMEs based on a self-service model.

Our self-serve pricing offering started with a free trial for 3 months, then pricing tiers starting as low as £16/month to create and run simple data APIs. Our challenge was to market our API creation platform to SMEs, who (a) didn't understand APIs and (b) didn't understand that they needed APIs. We believed that Developers would advocate datownia to their clients and in so doing delegate down to their clients the job of maintaining the data. The trouble is that this gave us a very difficult market space to operate in for a small company. Did we market to Developers? Or to their clients? Or both? And how much could we really afford to spend on marketing when our product's price was so low?

By restructuring our pricing and our services we have clearly defined (a) what we do for our customers and (b) who our customers are.
(a) We build and run APIs-as-a-Service
(b) For companies who don't need the time and expense of doing it themselves

A footnote on Eric Ries' "Lean Start-up".
In early 2012 I'd read and been influenced by Eric Ries' "Lean Start-up". Yet even though I followed many of the recommendations in that book I ended up building something that our customers didn't exactly want. They never really wanted to create and operate their own API. Instead they wanted to solve business problems (which I thought could be solved by APIs).  Looking back at my early "customer development" meetings I can see that when I asked my prospective pilot partners the question "wouldn't it be great if you could open up your data and connect it to apps without spending tens of thousands of pounds and months of work" I kept on hearing the answer "Yes". What I failed to realise was that they really meant "Yes, but only if you did that for me so I don't have to".

Saturday, 6 August 2011

How I used YouTube (and a Python) to explain something uninteresting about IT

I wanted to find an easy way to explain to a client what could happen if you introduce change too quickly into a complex business environment.

More specifically, the complex business environment was functioning normally but with obvious inefficiencies. The point I was trying to make was that even a positive change that clearly makes sense and which will bring immediate benefits can still have associated risks. Especially if the changes are introduced to the existing business process without comprehensive unit and end-to-end testing.

The vehicle I selected to deliver this message was Terry Gilliam's 1977 movie Jabberwocky, starring Michael Palin. Specifically one scene which takes place in a Blacksmith's shop, which starts 4 mins 0.5 seconds into this YouTube clip:

Result: the client got the point, and then forwarded the email around the office. There could be something in this… IT management via YouTube clips.