Pleased as I was to reach my holiday destination (Cartagena in Colombia) yesterday, I was almost as excited to receive the news that self-service loans – and more – are at last available to library users in the UK.
SOLUS are, like me, based in Scotland and, like me, eager to find ways to exploit the full potential of RFID – and the new app does just that. Borrowers in RFID-equipped libraries will be able both to issue items at the shelf and clear security in a single operation. Those still using barcodes will be able to use their devices' cameras to issue items but will still have to deal separately with whatever security system (if any) is in use.
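
For anyone wondering how "issue and clear security in a single operation" hangs together in practice, here is a minimal sketch of the sequence such an app has to perform. The function names and objects are entirely hypothetical – this is not SOLUS's code – but the shape of the flow is the point: the LMS still makes the circulation decision, and only then is the tag's security data rewritten.

```python
# Hypothetical sketch of a mobile self-issue flow: read the tag, ask the LMS
# to record the loan, then rewrite the tag's security data - one user action,
# two systems updated. All names here are illustrative only.

def self_issue(nfc_reader, lms_client, patron_id):
    tag = nfc_reader.read_tag()          # RFID/NFC read at the shelf
    item_id = tag.item_identifier        # primary item ID held on the tag

    # The circulation decision still belongs to the LMS (via SIP2, LCF, etc.)
    response = lms_client.checkout(patron_id=patron_id, item_id=item_id)

    if response.ok:
        # Only now is the security data on the tag cleared, so the item
        # will pass the gates without alarming.
        nfc_reader.write_security(tag, secured=False)
        return f"Issued {item_id}, due {response.due_date}"
    return f"Refused: {response.message}"
```

Barcode-only libraries can manage the first two steps with a camera scan, but the security step has to be handled by whatever separate system is in place – which is exactly the distinction the announcement draws.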

Returns can also be handled using mobile devices.

Dovetailing with the launch of “self-service” within the Library App, SOLUS has also announced the Q1 launch of “SOLUS Pay”, its mobile payment solution, which will allow users to make payments from within the App.

Although initially aimed at library charges, SOLUS Pay has no payment limits, so both library services and their wider parent organisations will be able to take payments for other service charges through their Library App.

Full press release available here.

Yesterday I was in Birmingham at the offices of Capita Library Services, our hosts for a day of coding and discussion. My job, as chair of BIC's various Library Communication Framework (LCF) committees, was to kick off the first LCF "Plugfest", where developers from different library system suppliers spent the day writing and testing applications using the new framework launched last November.

Plugfest

The Plugfest is an important part of the process of developing more interoperable systems as it offers developers the opportunity to verify that the applications they are writing work in practice. It also ensures that the team of Technical Editors charged with maintaining the framework are made aware of new requirements and any problem areas. Plugfests will be an essential and frequent part of the development process as more and more library applications adopt the framework. Yesterday's event was attended by representatives from the world of RFID – 2CQR, Bibliotheca+3M and P.V. Supa; from LMS providers Capita, Civica, Innovative and Infor (late apologies were received from Axiell); and from third-party suppliers Lorensbergs and Insight Media.

Unusually for such a highly competitive market, everyone attending had already signed up to share freely the fruits of their labours. This spirit of co-operation appears to be almost unique to the UK: colleagues in Australia and North America frequently express disbelief when I tell them that competitors in the UK library market actually work together to try to find ways to improve both the user and staff experience of library automation. "You'd be lucky to get them in the same room here!" is one popular response. Certainly there are plenty of examples of companies meeting to discuss new standards and best practice – America's National Information Standards Organization (NISO) has been discussing a successor to the SIP protocol for more than three years now – but it seems to be unusual for competitors to share code, provide hardware and develop best practice together as they do in the UK.

Perhaps that's why the authors behind the other big interoperability event of the day – the publication of the long-awaited, ACE-funded SCL initiative on creating a single digital presence for England's public libraries – ignored invitations to discuss LCF during their lengthy investigation of the UK library systems market.

Now, of course, I'd be the first to acknowledge that the BiblioCommons report concerns itself with much wider issues than the existing systems infrastructure, but a significant part of its recommendations appears to suggest that the only way forward is for them to write new code creating a BiblioCommons software layer on top of the various existing LMS systems, pending the migration of everyone to a new, purpose-built BiblioCommons LMS at some future date. One might argue that the same result could be achieved more cheaply by awarding a contract to a single supplier now and cutting out the highly risky intermediate stage recommended by BiblioCommons. But then that is what they do for a living.

Nonetheless, ignoring the significant work already being done in this area seems, at best, something of an oversight.

I'll be writing a full review of the BiblioCommons report on my other blog in the near future, as its findings and recommendations go way beyond the relatively simple aim of establishing a common framework for interoperability, but the irony of the juxtaposition of these two events was irresistible!

Meanwhile, back in the real world, this first Plugfest was a great success and the LCF Project is now well and truly under way. New functionality – that is both interoperable between disparate systems and which can readily be migrated without impact between suppliers – is no longer a system integrator’s dream but a developer’s work in progress.

25 November 2015

A lot of people have been asking me what I think about the recent merger between Bibliotheca and 3M.

It’s an impossible question to answer easily and is usually prompted by a variety of concerns. How will customer service be affected? Will products (like the two e-book offers) be merged as well? How will the RFID market now develop? What’s happening to the staff? (Just a selection of the emails in my inbox recently.)

Obviously all of these questions will be answered in the fullness of time, and by those actually making the decisions, not by some opinionated individual with an axe to grind. But there are some 'big picture' changes that I think are likely to happen as a consequence, and today I'd like to focus on just one of them – the future of SIP (the Standard Interchange Protocol).

There can be very few librarians – certainly very few in the UK or North America – who haven't heard of SIP. It's part of the development history of library systems – as Z39.50 was for discovery systems – and like Z39.50 it has played a significant role in developing interoperability between some very differently designed library systems.

But it has also been holding back the development of RFID solutions from the very beginning. To understand why, we need to remind ourselves why SIP was developed in the first place.

SIP was the result of 3M's efforts to standardise communication between their early (non-RFID) self-service machines and the library management systems to which they connected. As such it was concerned only with establishing the status of items being presented for loan. When 3M introduced RFID they did so to combine circulation and security in a single operation, using data instead of magnetism to manage security. They did not, however, attempt to change the functionality that SIP could deliver.

And that’s pretty much how things have stayed for 25 years or so. SIP drives the circulation transaction, RFID handles the security.

RFID is actually a pretty expensive way to manage such a simple process but it works, looks modern and librarians have been in love with it for years. Until now.

Suddenly new pressures acting on the library market are changing the way we think about RFID.

The first of these is of course financial. Buying in self-service is still a very popular response from local authorities seeking to cut their costs. Often this goes hand-in-hand with staff cuts – the machines do the work of lending and returning stock and volunteers can do the re-shelving. The big appeal of this approach is the transfer of recurring costs (the staff) to capital expenditure (the machines). There's always money around for 'invest to save' projects, far less for employing professional staff to provide a "comprehensive" service (however the government of the day chooses to interpret that). Looking ahead it's difficult to see past the likelihood of more councils short-sightedly spending more money on self-service machines to keep fewer and fewer libraries open.

Librarians are becoming disenchanted with self-service – it costs them their jobs.

But councils are likely to consider more RFID (or rather self-service) as the best way of supporting the government’s agenda by cutting costs. Just maybe they might pause to consider whether spending all that money on book lending machines is really delivering an adequate return on their RFID investment? Of course to do that they would need the expert advice of their librarians to tell them how the technology could deliver a more efficient and effective service at lower cost. If there are any left to ask…

Pressure also comes, perhaps surprisingly, from the suppliers.

Since 2011, and the domestic market's wholesale adoption of the UK data model, RFID suppliers have begun to realise the wider potential of the technology for delivering new products and services. Having a single data model has enabled their developers to plan to deliver new functionality against a single, known tag standard. (It has also enabled some librarians to change suppliers without having to re-tag or re-program all their existing stock.)

And with so much investment in self-service over the past ten years suppliers are beginning to run out of opportunities to sell new systems. They need to find new ways to use RFID to deliver new services and solutions.

Another problem for librarians is that suppliers are rapidly running out of librarians to whom they can sell these new solutions – so they are talking directly to councils. That has tended to shift the emphasis for service development away from improving the library service toward expanding the range of council services that can be delivered in the library building.

Librarians tend to regard RFID with suspicion because it doesn’t deliver a better library service.

And that’s at least in part because of SIP. (Remember SIP? This is a blog about SIP)*

Because SIP was originally designed to help 3M sell more self-service circulation machines, it has proved very resistant to being adapted to deliver much else. When 3M donated the protocol to NISO two years ago they originally hoped that the NISO 'imprimatur' on their newly developed version 3.0 would rekindle a flagging US market. Sadly for them, this strategy appears to have failed thus far. SIP 3.0 is still nowhere to be seen (although rumours of its death may be exaggerated).

In the meantime Bibliotheca have effectively taken over 3M’s library business – and in the process become the largest RFID supplier in the USA.

Now, like 3M's, Bibliotheca's systems still have to rely on SIP to manage much of the communication between them and the LMS platforms that still handle the decision-making process. But unlike 3M, Bibliotheca have been among the most enthusiastic supporters of removing the limitations of SIP since they first arrived on the library market. One of the ways they plan to do this is by using BIC's Library Communication Framework to develop new functionality for their RFID installations. I'm sure they have others.

The future of SIP looks very insecure right now. Even if NISO do eventually publish a new version of the protocol, it is unlikely to move the functionality of library systems (or RFID) forward by a single byte. With 3M in the process of leaving the library stage, SIP's greatest advocate has gone. Fortunately what remains is an opportunity (using LCF) rather than a void. Will Bibliotheca use its undoubtedly strong global influence to change the way we use RFID through LCF? I really hope so – for their sake as well as the market's.

To really gain true value from RFID we need several things:

A common data framework that is open to all (LCF);

A major supplier dedicated to using that framework (we have several – including Bibliotheca);

An informed library workforce that understands how the technology works so that they, and not the suppliers, drive the demand for development (not sure about that)

and,

Buyers that have the wit and wisdom to make informed decisions (well 2.5 out of 3 isn’t bad).

So what do I think about Bibliotheca’s take-over of 3M? Well it gives me more hope for the future development of our library services than the alternatives.

*with acknowledgement to Arlo Guthrie – the line is adapted from Alice's Restaurant.

22nd October saw another important milestone reached for the Library Communication Framework (LCF), with its official launch taking place at the somewhat unlikely venue of the "Poetry Café" in the heart of London's Covent Garden.

The cosy atmosphere did, however, encourage conversation – one of the aims of Book Industry Communication (BIC) "Breakfasts" – and everyone I spoke to appeared to have enjoyed the experience.

The three main presenters – Catherine Cooke from Tri-Borough Libraries and Archives, Anthony Whitford of Capita and myself – explained the genesis of the project, its purpose, governance and future development as well as offering advice on what steps librarians and suppliers should take if they want to participate. All the presentation slides are available here.

The heavily over-subscribed event was attended by many of the leading suppliers in the library sector – 2CQR, Axiell Ltd, Bibliotheca, Capita, Civica, D-Tech International Ltd, Ex Libris UK, Infor, Innovative Interfaces, Insight Media Internet Limited, Lorensbergs Ltd, Nielsen Book, ProQuest Bowker, PTFS Europe, SOLUS UK Ltd.; representatives from key library organisations – CILIP, Libraries Taskforce,  DCMS, The British Library, and even librarians – from Buckinghamshire Library Service, Enfield Library and Museum Service, GLL, Tri-Borough Libraries and Archives.

Many of the suppliers present – and some who were unable to attend – had already pledged their support for the framework by signing up for membership of the recently established LCF Consortium (full list here). Consortium members agree to work together to promote the adoption of the framework for the development of better interoperability between library management (LMS) and third-party systems. To the casual reader this might sound like a public relations exercise, but it's much more than that. Contributors work together in an entirely open environment, with deliberations and decisions open to public examination and comment. Three Technical Editors – one from JISC, one from the supplier market and a third from the standards arena – are responsible for growing and maintaining the framework on a day-to-day basis, while their decisions are reviewed monthly by a Technical Committee which I chair on behalf of BIC.

Governance rests with BIC, who undertake to manage the development of the framework on behalf of the library community. The LCF "Charter" (to be issued before the end of the year) will, among other requirements, bind members to share their contributions to the framework with other users.

The consortium is funded by its supplier members, separately from other BIC activities, and BIC membership is not a prerequisite for joining.

There is a lot more information about LCF available on the web and elsewhere on this blog. Follow @BIC_LCF to keep up to date on developments.

Consolidation continues in the library automation sector as Bibliotheca this afternoon announced their acquisition of 3M's global library business.

Rumours of a sale had been circulating for some months, with China's Invengo – specialists in RFID – widely tipped to win the race to seal the deal.

The new company becomes easily the largest supplier of library self-service and security products in the western hemisphere, and by combining 3M's already established Cloud Library with Bibliotheca's recently announced Opus product, the enlarged company is likely to provide stiff competition in the e-lending sector for current market leader OverDrive – which announced its latest software product to the UK market only this morning.

Two separate deals – one for North America and another for the rest of the world – have been signed with Bibliotheca acquiring both staff and assets at 3M’s headquarters in Minneapolis.

One consequence of especial interest to my colleagues at Book Industry Communication (BIC) will be the potential international boost that this gives to their recently launched Library Communication Framework (LCF). Bibliotheca has always been one of the keenest supporters of LCF from the project's inception and I am assured that this is set to continue.

More details to follow.

Today sees the official launch of the Library Communication Framework (LCF). Originally conceived as a replacement for 3M's Standard Interchange Protocol (SIP), the framework has been several years in the making and has, through the active involvement of suppliers and librarians working together, grown from a simple updating of the protocols for running RFID self-service into a significant contribution to interoperability across a range of products and services.

Exactly why LCF was developed has been the subject of many papers and reports over the period. The more enthusiastic reader will find a succinct (if somewhat dated) explanation in the BIC archive.

Although I had first proposed back in 2010 that a replacement for SIP was long overdue, it was in fact my colleague Francis Cave who first suggested that a "framework" would offer a more flexible approach for the industry in general. The history of these early discussions and meetings up to the original launch of what was then called "BLCF" (the "B" standing for BIC) can be found here.

Renamed "LCF" (in response to a request from American colleagues, who feared the "B" might be taken by some to stand for "British"), the LCF working party – which it has been my privilege to chair – has expanded in both membership and scope since 2012, and over the last 18 months has seen the establishment of a regulatory mechanism to ensure that the framework remains current and avoids the problems – inherent in SIP – of allowing developers to add new values and functions almost at will. BIC – an independent organisation – will maintain and develop the framework for the benefit of all.

Most heartening – for me – is the number of both RFID and LMS suppliers that have already signed up to the LCF "Charter" – a statement of intent to comply with, promote and of course use the framework to develop better interoperability between systems. The astute librarian will want to scan the list of LCF supporters carefully and perhaps question why some suppliers haven't wanted to support the aims of this entirely open framework.

Developing better interoperability and ultimately more closely integrated systems has been the dream of librarians for many years. There have been many attempts to solve the myriad problems of multiple formats, different architectures and a lamentable lack of industry standards. Most have sunk without trace. Libraries have responded to these disappointments in a variety of ways – single LMS procurements, moves to Open Source solutions and potentially even API-heavy middleware that adds significant cost without commensurately improving interoperability. The industry badly needs to put its house in order. The framework provides a starting point for realising that dream.

The framework is officially launched today and the press release can be downloaded here. A BIC Breakfast meeting in London on the 22nd October will provide an early opportunity for librarians and others to find out more about the framework, ask questions about its use and, most importantly, discover how making it a mandatory requirement in future system procurements will ensure the best return on investment for cash-strapped libraries. I and two of my fellow LCF working party members – Catherine Cooke (Tri-Borough Libraries) and Anthony Whitford (Capita) – will be speaking – details here.

Note: Please don't confuse the Library Communication Framework with purchasing frameworks (such as those brokered by organisations like ESPO).

This is a data framework developed by members of the library profession working with their suppliers to improve interoperability. Purchasing frameworks essentially facilitate hardware purchase at discounted rates.

24 August 2015

A new header to celebrate new beginnings!

The wonderful city of Edinburgh. I can’t imagine why it has taken me so long to come and live in the capital city of this magnificent country.

This has been a year of considerable change for both me and for the profession with which I have been so closely linked (despite never having been a member!) for most of my life.

For me the changes have been mostly positive. Certainly moving to Edinburgh feels like being granted a new lease of life. People still value their public library service here. A recent news report shows that, even with austerity, libraries in the capital are welcoming increasing numbers of visitors through their doors. I sense a greater feeling of optimism here than seems to exist south of the border – and for the thirty years I've been working with Scottish libraries that has almost always been the case.

But there’s no room for complacency here. Librarians here face the same challenges as their English counterparts. But the SNP is actively fighting to save libraries in Scotland and, like Plaid Cymru with its vision for the future of the service in Wales, has an altogether more positive and less fragmented view of their future than anything being brokered by English political parties.

So a mark in the ‘plus’ column for moving to Scotland!

In other news…

Despite having been mired in RFID for the greater part of the last six years I have retained an active interest in library systems. Since 2013 I have had the pleasure of being asked to advise on library management system (LMS) projects in both Scotland and Ireland – the latter culminating in the drafting of the specification and much of the business case for their national LMS procurement. Quite a refreshing change from ISO standards and self-service kiosk design! So much so that I recently closed down my company – Library RFID Ltd – and am now in the process of rebuilding that website to more accurately reflect my current activities.

Whilst I am still keenly interested in RFID, I have become increasingly frustrated by the lack of ambition among both suppliers and consumers in developing the potential of the technology – particularly for mobile applications – and am consciously changing the emphasis of both this blog and my website in an effort to change this state of affairs.

Inspired by meetings earlier in the year with smartphone and tablet app developers in both the commercial sector and the universities, where new uses for NFC and RFID in libraries are already being planned, I think it's definitely time to start building the foundations of the next generation of more mobile and agile library systems – a view that I hope will resonate with both Kathy Settle, CEO of the recently created Libraries Taskforce, and the Arts Council's Brian Ashley, both of whom were kind enough to find the time to meet with me over the summer.

I don't believe anyone wants to recreate the kind of proprietary, fragmented solutions that prevented the RFID library market in the UK from functioning effectively for much of the first ten years or so of its development. The SCL's effort to create a single digital platform – assisted by consultants from Canadian application provider BiblioCommons – is, I think, a clear recognition of the lack of any kind of cohesion in existing library system provision. The Welsh and Irish decisions to implement a single-supplier solution represent – in my opinion – another example of what happens when consumer patience runs out. I hope both succeed in improving matters but I am more than a little concerned that both solutions are fighting the last war rather than anticipating the future.

In my view there are no ‘silver bullet’ solutions and very few tech-savvy library systems specialists left in the public sector to assess the efficacy of the solutions being proposed – but I would say that, wouldn’t I?

In an effort to do something positive I've recently agreed to work with Ken Chad and some other equally talented library people (not necessarily librarians, Steven!) in building a new model for library systems procurement to replace the existing core specification originally created by former colleague Juliet Leeves back in the 1980s. Events elsewhere have unexpectedly given me some time to spare over the coming months.

Another ‘plus’ for library technology development!

For me there was one sad moment in all this excitement. In March Book Industry Communication (BIC) decided to make changes to the structure of its committees and working groups to better reflect the wishes of its membership, which has ultimately ended my fifteen-year relationship with the Library Committee. I hope we achieved some useful outcomes during my time – not least the adoption of a national data model for RFID use.

I am, for the moment, still actively engaged in promoting the Library Communication Framework – a project I initially proposed back in 2011 – but only until a librarian can be found to take over. I wish them and BIC well – and thanks for all your support.

Just one small tick in the minus box then.

Onward and upward… (Edinburgh streets offer no alternative!)

I’ve been talking to a lot of librarians recently.

I'm currently on the road in the UK spreading the word about new issues in RFID – privacy, the impact of Near Field Communication (NFC) and something that regular readers of the blog will already know a good deal about – the Library Communication Framework (LCF).

Since the beginning of the year I’ve been speaking to heads of service in academic and public libraries beginning in Glasgow and now working my way south via Wallsend, Beverley and Preston. I was also very pleased to be invited to run a CPD session for academic librarians in the South East during March.

One of the many things I’ve learned along the way is that the procurement guide that Mark Hughes and I wrote for the National Acquisitions Group and Book Industry Communication  back in 2011 is still being widely used by librarians seeking to buy or extend their RFID solutions.

Flattering though this is, it is also somewhat alarming! There have been many changes since 2011 – most of them flagged up on this blog – which were not addressed in the original guide. Anyone still using it, particularly anyone issuing it without amending the sample questions to reflect local circumstances and/or requirements, is unlikely to be taking full advantage of the new services, standards and benefits that have appeared over the last four years – and risks making expensive mistakes.

Realising how dated the guide had become, I withdrew it from all my sites last year. My plan is to produce an updated version for publication next year, but in the meantime there is one particular innovation that I really think ought to be included in any RFP you may be planning – RFID or otherwise.

I refer of course to my pet project – the Library Communication Framework (LCF).

The framework was developed over two years by suppliers from both the RFID and LMS (ILS) markets working together with librarians and assisted by consultants from Book Industry Communication (BIC). A great deal of information has already been published about LCF both in print and on the web. Last year I wrote an article for CILIP's Access journal explaining why it was needed and what it is, and there is a more detailed explanation on the BIC site.

Put simply, it is nothing less than an attempt to create a more interoperable environment for library applications. Using it, RFID system can speak unto RFID system – and both can speak to the Library Management System. It's not an API or a web service (although both are supported); it's simply a set of standard data elements and values that can be implemented in whatever way best suits developers. The LCF is completely open and supplier-independent, and the whole process is managed by BIC on behalf of the library community.
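
To make "standard data elements and values" a little more concrete, here is an illustrative sketch of what a loan request might look like when expressed that way. The element names below are invented for the example – the real identifiers and permitted values are defined in the LCF documentation on the BIC site – but the point stands regardless of transport:

```python
# Illustrative only: a loan request expressed as named data elements and values.
# The framework standardises the data, not the plumbing - the same elements
# could travel as JSON over a web service, as XML, or inside a vendor API call.
# These element names are invented for the example, not LCF's actual identifiers.

import json

loan_request = {
    "entity": "loan",                       # the kind of record being created
    "patron_identifier": "21000012345",
    "item_identifier": "31110112345678",
    "transaction_date": "2015-10-22T09:30:00Z",
}

payload = json.dumps(loan_request)          # one possible serialisation of many
```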

The framework will grow as new functionality is added and as new application providers come on stream. A management team and website are already in place to make it simple for developers to add new elements and data as required, but – unlike SIP – quality controls will ensure that it maintains its integrity as it develops.

All of this is discussed in detail elsewhere, so today I just want to suggest a few additional questions that you should consider adding to any tender or RFP you might be about to issue – whether based on the original guide or not.

Many suppliers are already using LCF to develop new functionality, so it's worth establishing whether the one you are inviting to sell you a solution is among them. I would consider asking:

  1. How is your commitment to the Library Communication Framework demonstrated in your future product development plans/roadmap?
  2. What specific functionality are you achieving today via LCF?

Functionality developed using LCF can be readily transferred to any other supplier that supports it, while functionality that has been developed specifically for a particular combination of RFID and LMS (ILS) suppliers is less likely to survive if you change either. So if you're buying a new RFID system, consider asking:

  1. Which functions of your system have been implemented or made possible using integration methods that are unique to your current LMS (ILS)–RFID supplier combination (i.e. using APIs and/or customised code rather than defined open standards such as SIP2/NCIP/LCF)?
  2. What specific functionality will be lost if we choose to change our LMS (ILS) in future?
  3. What services and costs might we have to budget for in the event we choose to change our LMS (ILS) in the future?

Obviously it would be sensible to ask essentially the same questions of any potential LMS (ILS) provider.

The whole area of interoperability has been a bugbear for librarians and providers alike for many years now. LCF seeks to put this right by presenting developers with the choice of using a more open means of implementing their solutions. In the UK all the major RFID suppliers now support both data standards and the LCF.

Readers from outside the UK – or those with systems that were installed pre-2011 – might therefore consider asking a couple more questions:

  1. Are there any proprietary elements of your solution that might prevent another supplier from interoperating with solutions provided by your company?
  2. Please provide details of sites where your solutions work alongside other RFID applications/systems in the same library.

Hopefully anybody currently struggling with a procurement using an RFP will find this helpful. If, however, you're buying through one of the many framework agreements out there, I can only wish you 'good luck', since – so far as I am aware – none of these issues is addressed by any of them.

Over the past two weeks I’ve been talking to quite a few UK librarians about RFID issues. A few had misconceptions about some aspects of the technology and suggested that it might be helpful if I posted about them here. So here goes…

  1. RFID self-service doesn’t use sensitisers.

Many libraries invested in electromagnetic (EM) security systems long before RFID appeared. These usually relied on a thin strip of metal (often called "tattle tape") hidden in the book's spine.

When items were borrowed, a "de-sensitiser" reversed the polarity of these strips, allowing them to pass the security gates – which are set to sound an alarm when they detect sensitised items. The de-sensitiser was powered by high voltages and transferred energy in much the same way as an electric toothbrush charger does.

When self-service units first appeared in libraries they included these same de-sensitisers, together with barcode readers, to allow borrowers to issue their own items.

RFID self-service looks almost exactly like its EM counterpart but works in a completely different way. Instead of changing the polarity of a strip of metal, RFID depends entirely on data.

Library RFID tags usually comprise an aerial and a tiny chip stuck to a label. The data resides on the chip, while the aerial transmits data values to and from other devices via a scanner/receiver. Security is managed by writing specific values to an area of memory on the chip.

No high voltages. No magnetism. No sensitising or de-sensitising.

Because data is used to carry out the security function it is important for libraries to know what data is being written – and how. This is one reason why data standards are so important in RFID installations. RFID scanners using the same frequency – in another library, for example – constantly scan for tags, and since not everyone uses the same values to set or clear security data, false alarms can and do occur.
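
As a concrete illustration of "security as data": many implementations that follow ISO 28560 conventions use the tag's AFI (Application Family Identifier) byte for this purpose, with 0xC2 commonly cited for "in the library/secured" and 0x07 for "on loan". The values and the reader API in this sketch are illustrative rather than any vendor's actual SDK, but they show how little is involved – and why mismatched values between libraries cause the false alarms mentioned above:

```python
# Sketch of data-driven security on an ISO 15693 / ISO 28560-style library tag.
# A single byte on the chip does the job the magnetised strip used to do.
# Values and the reader API are illustrative, not a particular vendor's SDK.

AFI_ON_SHELF = 0xC2   # commonly cited value: item secured / in the library
AFI_ON_LOAN = 0x07    # commonly cited value: item checked out

def set_security(reader, tag_uid, secured):
    """Change the security state by rewriting one data value on the tag."""
    reader.write_afi(tag_uid, AFI_ON_SHELF if secured else AFI_ON_LOAN)

def gate_should_alarm(afi_read_at_gate):
    """Gates simply look for the 'secured' value - no magnetism involved."""
    return afi_read_at_gate == AFI_ON_SHELF
```

A gate programmed to expect different values will either miss secured items or alarm on legitimately issued ones, which is exactly why a shared data standard matters.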

  2. It's not the RFID system that makes the decisions.

This is a perennial topic. Every year I run a survey of RFID use in libraries around the world and one of the most common complaints I receive is that suppliers of RFID systems are very poor at responding to development requests.

Whilst many of these complaints are fully justified, a significant number ask for changes that could only be made by the library management system (LMS, aka ILS) supplier.

All RFID solutions in use in UK libraries depend on a connection to the LMS. It is the LMS that continues to hold all the information – loan policies, borrowing limits, locations etc. All the information required by the RFID system – to display items on loan or fines owed, or even to determine whether an item may be borrowed – is carried between the LMS and the RFID device by a message of some sort. This may be a web service, an API or some other proprietary means, but most often it will be 3M's "SIP".

The Standard Interchange Protocol has been developed over many years to allow communication between an LMS and self-service systems (some of them RFID). It was designed primarily to support circulation and has been in use for almost 30 years.
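
For anyone who has never seen SIP "on the wire", it is just terse, delimited text exchanged over a socket. The sketch below builds an illustrative SIP2 checkout request (message 11) and reads back the response (message 12); the exact field layout varies between SIP versions and vendor implementations, so treat it as a flavour of the protocol rather than a reference:

```python
# Illustrative SIP2 checkout exchange. The self-service unit sends message 11;
# the LMS replies with message 12 and makes all of the real decisions
# (loan policy, blocks, due date). Field layout varies by version and vendor.

import socket

def build_checkout(patron_id, item_id, when, institution="MAIN"):
    # "11" = checkout request; 'Y' renewals permitted, 'N' no block;
    # two 18-character timestamps; then pipe-delimited fields:
    # AO institution id, AA patron id, AB item id, AC terminal password.
    return f"11YN{when}{when}AO{institution}|AA{patron_id}|AB{item_id}|AC|\r"

def checkout(host, port, patron_id, item_id, when):
    with socket.create_connection((host, port)) as conn:
        conn.sendall(build_checkout(patron_id, item_id, when).encode("ascii"))
        reply = conn.recv(4096).decode("ascii")
    # A typical reply begins "121..." (ok, renewal, media and desensitize flags)
    # followed by fields such as AA/AB echoed back, AJ (title) and AH (due date).
    return reply

# e.g. checkout("lms.example.org", 6001, "21000012345", "31110112345678",
#               "20151125    103000")
```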

So RFID suppliers seeking to extend functionality for their clients are often restricted by their dependence on this protocol. Many have sought to improve matters by forming partnerships with specific LMS companies but of course the solutions they develop in this way are by definition bilateral in nature (i.e. they only work for products developed by the two partners).

The UK industry is trying to improve matters by developing an alternative to SIP called the “Library Communication Framework” (LCF).

  3. Adopting standards doesn't usually require re-tagging stock.

I frequently see messages on the RFID lists (particularly in the USA) from librarians explaining why they have decided not to use standards. One of the reasons given for not doing so is the cost.

There are of course still some costs incurred in switching over to a data standard, but one of those most often cited – the cost of tag replacement – is usually unnecessary. As I mentioned before, a tag comprises a chip, an aerial and a sticky label, and it's the chip that matters here. Most chips are manufactured by the same supplier – NXP – but even if yours aren't, there is every chance that they can be converted without having to be replaced.

In the early days of library RFID suppliers used many different manufacturers for their tags, and some of these products were discontinued, leaving library clients with no alternative but to replace existing tags altogether. That all changed in 2011, and it's a simple enough matter to ask your supplier whether it's possible to make the switch. Anyone wanting to future-proof their implementation should seriously consider doing so.

Some companies already offer hardware that will automatically convert tags to the UK standard as they are borrowed, and since all UK suppliers have undertaken to support both their own and the UK data standard, there should be no need to swap tags.
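
Conceptually, "convert as you circulate" is very simple: read whatever is on the chip, decode it using the legacy layout, re-encode the same data according to the standard (ISO 28560-2 underpins the UK data model) and write it back. The outline below is purely hypothetical – the reader object and the three helper callables stand in for whatever a supplier's own tooling actually provides:

```python
# Hypothetical outline of on-the-fly tag conversion at the point of circulation:
# same chip, same item data, simply rewritten into the standard layout.
# 'reader', 'is_standard', 'decode_legacy' and 'encode_standard' are stand-ins
# for supplier-specific tooling, not a real SDK.

def convert_if_needed(reader, tag, is_standard, decode_legacy, encode_standard):
    raw = reader.read_user_memory(tag)        # bytes currently on the chip

    if is_standard(raw):                      # already in the UK data model
        return False

    item = decode_legacy(raw)                 # recover item ID, library ID, etc.
    reader.write_user_memory(tag, encode_standard(item))
    return True                               # converted during this issue/return
```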

  4. You don't have to buy everything from the same supplier.

Before RFID suppliers agreed to support the UK data standard they each decided what data to use and critically where and how to store it on the chip. This often varied from site to site as some librarians mandated data elements they wanted to store.

This state of affairs rapidly created an inflexible market in which libraries had no choice but to buy all their RFID supplies from the same company.

Framework agreements – very popular with the public sector – have tended to perpetuate this practice and most procurements are still based on buying from a single supplier.

Academic libraries have proved more adventurous than their public sector counterparts in asking suppliers to support existing systems – usually by writing bespoke software to read another supplier's data model – but this has become unwieldy for suppliers and libraries alike, so most new installations use the standard, making it easier to mix and match hardware from different suppliers as well as giving librarians the freedom to buy any new products that support it.

On Wednesday NXP made a seemingly routine product announcement about their new RFID chip designed especially for libraries – the reassuringly geeky sounding ICODE SLIX 2.

The press release doesn’t say very much about the reasons for the chip’s development, rather it concentrates on the improvements it will bring to library users of RFID technology. The more technically minded can download the full specification of the chip here.

The poor benighted librarian reading this announcement – which has been reproduced by the excellent Marshall Breeding on his website – will, however, probably be simultaneously confused and reassured. After all, just about all the major players (in the UK RFID market at least) have made supportive and excited noises in the announcement about the significance of the new product – and I know from my annual surveys that librarians trust their suppliers above almost everybody else in the market (apart from other librarians) when it comes to RFID.

So why this post?

Well, you can call me a sceptic (people do, you know) but I take very little at face value, and there are some threads running through this announcement that raise questions in my mind. Coupled with other information I received last week, I'm beginning to wonder whether we're about to see a realignment in the library automation world the like of which we haven't seen since the birth of the Internet.

Let’s look at what the statement says and try and figure out what’s going on here.

After the usual “it’s all going to be so much better” messages we are told that,

“The SLIX 2 is fully compatible with existing ICODE library systems, ensuring that over 5000 public and university libraries already using ICODE SLIX and ICODE SLI based labels can migrate and benefit from the latest technology without difficulties”.

Which is good news for the 5000 (where ARE all these libraries, and how are they being counted, I wonder?) but there will be many more libraries out there NOT using the ICODE family of products for whom the migration won't be so straightforward. Unlike most RFID users, libraries tend not to replace their RFID tags – and their "product" lifecycles are significantly longer than in retail, for example.

So the chances are that many libraries will still be using tags that predate the existence of the ICODE family of products altogether. An obvious point, I know – but there are some librarians who will take this statement to mean that everything's fine. When it may not be.

The next point that caught my eye was,

“In addition to improved scanning and reading capabilities the new SLIX 2 introduces near field communication (NFC) technology to enhance library services.”

Now THAT's a really interesting way to present information that already applies to ANY RFID tag operating at 13.56 MHz (and that's ALL of them in the UK, by the way). Regular readers will be aware that the potential for NFC devices (like smartphones and tablets) to be used to alter or delete data on RFID tags is something of an obsession of this author's. It's been possible for years now; what's different is the recent surge in the number of NFC devices on the market. To me this sounds like spin – the implication that NFC has been "added" suggests that it hasn't been possible before. But it has. For ages now.
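
To illustrate why none of this is new: library tags are ISO 15693 (NFC-V) devices, and framing a standard "Write Single Block" command for one requires nothing proprietary at all. The sketch below only constructs the request bytes – actually transmitting them needs an NFC-V capable reader or phone, and an unprotected tag – but the command set itself has been public for years:

```python
# Building a standard ISO 15693 "Write Single Block" request frame.
# This only constructs the bytes; sending them requires an NFC-V capable device.
# The point is that rewriting a block on an unprotected library tag uses
# commands that have been openly documented for years.

def write_single_block_frame(uid, block_number, data):
    assert len(uid) == 8 and len(data) == 4   # ICODE-style chips use 4-byte blocks
    flags = 0x22        # high data rate + addressed mode
    command = 0x21      # ISO 15693 "Write Single Block"
    return bytes([flags, command]) + bytes(uid) + bytes([block_number]) + data

# e.g. overwrite block 3 of a tag (placeholder UID of eight zero bytes):
frame = write_single_block_frame(bytes(8), 3, b"\x00\x00\x00\x00")
```

None of which requires the new chip – which is precisely the point.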

What the data sheet will also tell you is that NXP are introducing a number of new features on this chip that will enable suppliers to password protect, and even kill, tags. Librarians should consider two aspects of this news.

Firstly, that this protection will only be available on the new tags; and secondly, that password protection may not be the answer to the problem, because of the way in which libraries actually work (something frequently misunderstood by RFID suppliers and manufacturers alike). Integration with an LMS might indeed be made more difficult if RFID suppliers start to manage additional aspects of the circulation process – and that's one of the reasons for my opening, somewhat hyperbolic(?), remarks about change.

The last part of the announcement to which I want to draw your attention is this one,

“The new chips offer additional memory space to store dedicated URLs without compromising the library management memory areas. The URLs will point to internet spaces that contain additional information related to the book or media.

Sophisticated content, such as movie trailers, author bios, book reviews, and much more, becomes automatically accessible through NFC-enabled mobile devices as they tap marked areas on the books. “

Sound familiar?

Again regular readers (and those who have attended any of my conference presentations in the last few years) will be aware that I have long advocated the use of physical stock as a discovery tool for other resources. Examples of this obvious benefit already exist in libraries as far apart as Australia and Norway. By linking with a discovery system – or even an OPAC – library users can already enjoy the benefits of using books, DVDs etc. to discover author interviews or live performances for example (it’s already documented on this blog).

But the difference here is that the URLs that make this possible will be stored on the chip rather than in a remote database – which, in the light of the recommendations on user privacy in the EU's mandate to standards bodies, M436 (discussed at length on this blog and elsewhere), may be almost culpably reckless. The mandate isn't only concerned with the data present on tags but also with what might be inferred from it. Someone carrying an item with a URL on it could easily be inadvertently advertising a personal or commercial interest to anyone equipped with the right (and probably free) software on their smartphone.

So what to make of all this?

To me it all sounds as though the RFID suppliers have run out of existing products to sell to their traditional library market and have decided to take on the LMS companies for their circulation business. It's not a surprising development – the potential has existed for many years now; what was missing was a chip that could support the additional features that would make an entirely RFID-based circulation solution possible. Until now.

Of course this is not an overnight process. First librarians will need to buy the new chips that make the reinvention of circulation possible.

I wonder what they cost?