The Difficulties of Tool Integration and How to Overcome Them
There's a dirty secret about software tool integration: it's difficult to get right.
The use of multiple platforms in complex software development environments creates de facto silos. Unfortunately, tools are often only integrated into the platform where they will be used. This limits visibility and traceability throughout the rest of the development chain.
These limitations create bottlenecks that fragment the entire development chain, making it inefficient and slow.
This article examines the key challenges of software tool integration and how to address them effectively. We'll go into detail about the technical analysis a tool requires before integration begins, the importance of domain understanding, the challenges artifacts pose, API difficulties, and how endpoint complexity can degrade software performance.
Before integration begins, it is important to identify the differences between the new tool and the tools already in the development chain. A deep understanding of how the tool is used - and what the tool actually does - is essential for determining whether it is the right tool for the job.
Consider these questions before integrating any new tool: Will the tool make a substantial enough contribution to a project to justify the time and resources spent on integration? How difficult will it be to learn how to use?
If it takes an extended period of developer training to use the tool, it will most likely fail to be adopted. The best tools are ones that provide the necessary functionality along with a learning curve that allows most developers to master it in a few weeks or less.
As technical analysis is performed, a deep awareness of how the new tool fits into the software chain begins to develop, speeding up integration.
Tools that are designed to integrate provide relatively easy means by which to connect APIs, services, and databases. Unfortunately, most development tools are not designed to be integrated with each other, giving rise to the numerous challenges that are a result of the inherent differences between those tools. Consequently, conflicts tend to be domain-specific (unique to the business and its software development teams).
As Betty Zakheim has put it, "In a cruel twist of fate, enterprise IT teams tasked with creating the organization's custom software and integrating off-the-shelf applications often must use fractured, unintegrated tools." Resolving these conflicts takes expertise and skill with the tools already being used in the software development chain.
This deep level of understanding must also include detailed knowledge of the object models, APIs, and interoperability.
Each development platform has its own object models, as do the tools used in each platform. These object models are almost always different, with unique artifacts and attributes.
Most of the work performed during integration involves finding ways to resolve these differences so that, for example, a Granny Smith apple is recognized as the same object by all tools in the software chain.
It's just data, right? Yes, but data that lives in different contexts as it travels through the software chain.
Let's take that Granny Smith apple: in one tool, it has a light green skin, crisp texture, and tart flavor, while in another, it's a dataset that breaks the color of the skin down into the percentages of its component colors, the density of the flesh, and the amount of sugar per cubic centimeter.
Same attributes, different context.
To get it right, one must possess a detailed understanding of the new tool's attributes and their context, as well as of the software chain's existing attributes and their contexts. Resolving these differences leads to consistent attributes that can be trusted during development.
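The Granny Smith example above can be sketched in code. This is a minimal, hypothetical illustration - the tool names, field names, and canonical schema are all invented - of mapping two tools' different representations of the same object onto one shared model:

```python
# Hypothetical sketch: reconciling two tools' representations of the same
# artifact into one canonical object model. Tool names, field names, and
# the canonical schema are invented for illustration.

def to_canonical_apple(record: dict, source: str) -> dict:
    """Map a tool-specific apple record onto a shared canonical schema."""
    if source == "tool_a":  # qualitative attributes
        return {
            "variety": record["name"],
            "skin_color": record["skin"],      # e.g. "light green"
            "texture": record["texture"],      # e.g. "crisp"
            "flavor": record["flavor"],        # e.g. "tart"
        }
    if source == "tool_b":  # quantitative attributes, same object
        return {
            "variety": record["cultivar"],
            "skin_color": record["rgb_percentages"],
            "texture": record["flesh_density_g_cm3"],
            "flavor": record["sugar_g_cm3"],
        }
    raise ValueError(f"unknown source tool: {source}")

apple = to_canonical_apple(
    {"name": "Granny Smith", "skin": "light green",
     "texture": "crisp", "flavor": "tart"},
    "tool_a",
)
```

Once every tool's records pass through a mapping like this, downstream tools can treat the apple as one object rather than two unrelated datasets.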
Artifacts, in theory, are supposed to keep the development process transparent while providing opportunities for inspection and adaptation. Artifacts exist in relationship to one another, providing context.
These relationships form the basis of transparency and traceability. For example, there is a relationship shared between a requirement, its tests, and any defects that result from those tests.
Mirroring these relationships throughout all of the tools used in the software development chain is critically important for maintaining the context of the work. In the absence of that context, any attempt at fixing the defect will be a waste of time.
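The requirement-test-defect relationship described above can be sketched as linked records, so that context travels with the artifact. This is a hypothetical illustration - the classes and IDs are invented, not any particular tool's model:

```python
# Hypothetical sketch of mirroring artifact relationships: a defect keeps a
# link back to its test, and the test back to its requirement, so the chain
# of context survives when artifacts are synchronized across tools.

from dataclasses import dataclass

@dataclass
class Requirement:
    id: str
    title: str

@dataclass
class Test:
    id: str
    requirement: Requirement

@dataclass
class Defect:
    id: str
    test: Test

    def trace(self) -> list:
        """Walk the relationship chain back to the originating requirement."""
        return [self.id, self.test.id, self.test.requirement.id]

req = Requirement("REQ-1", "Login must lock after 3 failed attempts")
tst = Test("TST-7", req)
bug = Defect("DEF-42", tst)
```

If a tool in the chain stores only the defect and drops the links, `trace()` has nothing to walk - which is exactly the lost context that makes fixing the defect a waste of time.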
APIs: The Good, The Bad, and The Ugly
APIs help developers by enforcing business logic, governance rules, and regulatory compliance. They are almost always the best way to access endpoint capabilities and their artifacts, because they provide safeguards.
When directly accessing a database, there are no safeguards to prevent a catastrophic mistake.
APIs come with a catch, however: most APIs are created by vendors for their own tiered architecture, not for the convenience of third-party integration.
Since most APIs are created for the vendor's own purposes, their documentation rarely comes within a parsec of best practices. Important details such as data structures, error handling, and edge cases are poorly documented, if at all, leaving developers to figure everything out through time-consuming trial and error.
To make matters worse, vendors have a penchant for updating their tools with little or no warning, and APIs often change in ways that break integrations. When integrating a new tool, be proactive. Persuade the vendor to give you a reasonable heads-up before deploying an update, along with documentation that clearly describes what changed. At the very least, you will be able to give your development team leads fair warning about forthcoming updates so they can plan accordingly.
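Since you can't always count on that heads-up, the integration layer itself should be defensive. This hypothetical sketch - the version strings, payload shape, and field names are assumptions, not any real vendor's API - checks the reported API version before syncing and tolerates renamed fields instead of crashing:

```python
# Hypothetical sketch of a defensive integration layer: halt the sync when
# the vendor reports an untested API version, and tolerate missing or
# renamed fields rather than crashing on an unannounced change.
# Version strings and field names are invented for illustration.

SUPPORTED_VERSIONS = {"2.3", "2.4"}

def sync_artifact(payload: dict) -> dict:
    version = payload.get("api_version", "unknown")
    if version not in SUPPORTED_VERSIONS:
        raise RuntimeError(
            f"vendor API version {version!r} is untested; halting sync "
            "rather than writing possibly-misinterpreted data")
    # .get() with defaults tolerates fields a vendor update may have renamed
    return {
        "id": payload.get("id"),
        "status": payload.get("status", "unknown"),
        "owner": payload.get("assignee") or payload.get("owner", "unassigned"),
    }
```

Failing loudly on an unknown version is a design choice: a halted sync is annoying, but silently misinterpreting a changed payload corrupts data across the whole chain.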
Endpoint Connections = Exponential Complexity
Connecting two endpoints can be difficult, but once you have it figured out, it's not so bad...or so we like to think.
Unfortunately, connecting a third endpoint takes us down a path quite similar to the first, complete with all the time-wasting technical analysis and trial-and-error.
If you complete a third connection and still have your sanity, stop while you can. The complexity compounds with every endpoint you add, making each new connection a nightmare that won't end. Conflict resolution and mirroring at this point become virtually impossible.
Even if you do succeed, the entire system will be substantially slowed by the API calls to each of the multiple connections per endpoint. Entire projects have been scrapped as a direct result of trying to integrate too many connections to a single endpoint.
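One way to see why each new endpoint is so much harder than the last: with point-to-point integration, every new endpoint must be connected to every existing one, so the number of connections grows as n(n-1)/2. A quick sketch:

```python
# Point-to-point integration: every endpoint pairs with every other,
# so connections grow as n * (n - 1) / 2.

def point_to_point_connections(endpoints: int) -> int:
    return endpoints * (endpoints - 1) // 2

for n in (2, 3, 5, 10):
    print(n, "endpoints ->", point_to_point_connections(n), "connections")
# 2 -> 1, 3 -> 3, 5 -> 10, 10 -> 45
```

Two endpoints need one connection; ten need forty-five - and each one carries its own API calls, conflict resolution, and mirroring burden. This is why hub-and-spoke integration platforms exist: a central hub reduces the count to one connection per endpoint.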
Seamless, successful software tool integration depends upon a domain understanding that is deep and detailed. Additionally, conflicting attributes must be resolved to provide consistent object models throughout the software chain.
Artifacts in the new tool must adapt to the development environment in which they will be used, providing a consistent level of transparency and traceability from within the tool.
In the end, software tool integration management belongs in the hands of skilled experts who possess a detailed knowledge of every aspect of the software chain.