
Open Technology Standards to Accelerate Interoperability

Steve Taylor, Head of Technology and Architecture


As we become immersed in an era described as the “internet of things,” we risk taking for granted something that has for decades shackled data into silos and stifled how it is used and adopted across the enterprise: the way we move information around. Moving information across the enterprise has gone from a “nice to have” to a necessity, and it is this need for greater interoperability that has driven a shift in how we approach data integration.

The technology landscape has changed significantly in recent years; open source is now the norm rather than the exception. New standards and patterns have emerged that are accessible to far more users and applications, allowing information to move deeper and wider across firms and driving up data quality and operational efficiency.

One common challenge remains: how do I ensure interoperability between my commercial off-the-shelf, custom and legacy applications? A decade ago the strategic direction was a standardized ETL function within a shared utility, but all too often those projects overran, failed to deliver, or required costly “last mile” integration using point solutions and technologies. This was often due to a lack of standards or of understanding of the business domain. Read More…
