Recently, as part of an ecommerce product catalogue project at work, one of the data requirements was that products carry UNSPSC codes for both internal and customer spend analysis.
What are UNSPSC codes?
There is plenty of detailed information on the website http://www.unspsc.org, but to paraphrase their homepage summary:
The United Nations Standard Products and Services Code® (UNSPSC®) provides an open, global, multi-sector standard for efficient, accurate classification of products and services.
The UNSPSC offers a single global classification system that can be used for:
- Company-wide visibility of spend analysis
- Cost-effective procurement optimization
- Full exploitation of electronic commerce capabilities
How do they work?
The codes are composed of five pairs of numeric digits, each pair representing a hierarchical level:

SS: Segment
FF: Family
CC: Class
PP: Commodity
BB: Business Function
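To make the pair-wise structure concrete, here is a minimal sketch (in Python, purely for illustration; the sample code and the helper function are my own, not part of the official UNSPSC tooling) that splits a code into its hierarchy levels:

```python
# Split an UNSPSC code into its two-digit hierarchical pairs.
# The sample code below is illustrative only, not taken from a
# licensed code set.
LEVELS = ["Segment", "Family", "Class", "Commodity", "Business Function"]

def split_unspsc(code: str) -> dict:
    """Return a mapping of hierarchy level -> two-digit pair."""
    if len(code) not in (8, 10) or not code.isdigit():
        raise ValueError("expected an 8- or 10-digit numeric UNSPSC code")
    pairs = [code[i:i + 2] for i in range(0, len(code), 2)]
    return dict(zip(LEVELS, pairs))

print(split_unspsc("43211508"))
# {'Segment': '43', 'Family': '21', 'Class': '15', 'Commodity': '08'}
```

Note that the fifth pair (Business Function) only appears on ten-digit codes; most codes in everyday use are eight digits.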
The full explanation can be found on the UNSPSC website.
Finding a solution
My first port of call was to see what was already out there and how I could find these codes. There wasn't much, either because no one had heard of them or because those who had were not interested in using them. After a bit of hunting I came across a web service from www.webservicex.net which I could use to query and search the codes.
The first thing I noticed was that there wasn't any version number associated with the UNSPSC codes. There are audit trails to track code changes between released versions, but it was apparent that not knowing which version of the codes we were applying to the products could cause real headaches later on.
The next thing our users made us aware of was performance. They complained that the UNSPSC lookup page regularly took a long time to load and sometimes didn't load at all. To be fair to the completely free service hosted by webservicex.net, it may have been our link causing the problems; nevertheless, it highlighted our dependence on a third party which, quite naturally, had no SLA or support.
The web service was a band-aid solution that only put off the inevitable: a proper API for the codes, running from a data source on our own network.
If at first you don’t succeed
The first thing you find out about UNSPSC codes, when you decide you want to make a data source from them, is that in their raw form they are quite hard to come by. The website will offer you the latest version as a PDF, and there is an online search facility on their site, but that is about as far as you get for free.
To get the codes in raw format (Excel documents) and in multiple versions, you are required to become a member of the organisation, which for a single corporate user carries an annual fee of $300; alternatively, you can buy the code sets at $50 per version.
So unfortunately I cannot include the codes with this API; I suspect if I did I would be hearing from them in less than friendly terms. I will provide some test data, but please note it will not be real UNSPSC codes.
So with that in mind, my money paid and the codes in Excel files on my computer, off I went to build an API for these suckers.
The meandering model
First off, I decided it would be a great idea to just query the Excel file directly; it appealed to my 'keep it simple, stupid' sensibilities. Around that time I was also getting familiar with LINQ, and my mind was made up when I found a LINQ to Excel provider that looked like it could do the job.
So I set about creating the queries to yank the data out of the spreadsheet, added a few methods and, voila, I had a simple library that returned the UNSPSC data. It worked, but it was pretty horrible: it had locking issues, and when it came to rolling it out on our Windows Server 2003 x64 box I discovered it was completely unsupported (there is no x64 Microsoft.Jet.OLEDB.4.0 driver for Excel!).
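For readers who want the general shape of that first attempt: the original used C# with a LINQ to Excel provider, but the same idea can be sketched in Python against a CSV export of the sheet. The column names and sample rows here are my own assumptions for illustration, not the layout of the licensed spreadsheets:

```python
import csv
import io

# Hypothetical column layout and sample rows; the real licensed
# spreadsheets may differ, and these are not real UNSPSC codes.
SAMPLE = """Code,Title
11000000,Sample segment
11100000,Sample family
"""

def load_codes(stream) -> list[dict]:
    """Read code/title rows from a CSV export of the code-set sheet."""
    return list(csv.DictReader(stream))

def search(rows: list[dict], term: str) -> list[dict]:
    """Case-insensitive title search, analogous to the original LINQ query."""
    term = term.lower()
    return [r for r in rows if term in r["Title"].lower()]

rows = load_codes(io.StringIO(SAMPLE))
print(search(rows, "family"))
# [{'Code': '11100000', 'Title': 'Sample family'}]
```

The weakness is the same as in the original: the library is welded to one file format and one driver, which is exactly what bit me on the x64 server.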
Third time lucky
Once bitten, twice shy: I started on a repository pattern, so that if my data provider changed once more I would not be back to square one. In fact, since I could find so little out there, I decided to have a proper bash at it.
I’ll cover the overall project structure and some of the code in Part 2 (coming soon).