The Good, Bad and Ugly of Technology Service Integration

We are rolling out a project next week that is really one of our crowning achievements as a small company.  The project involves replacing an outdated and cumbersome Access database that runs a sales and loan process for our medium-sized client with an end-to-end solution based on QuickBase.  What is most exciting is that we are integrating three external data sources as part of the process: web-based application capture, credit reports and electronic signatures.

Our recent adoption of the Talend ETL technology to run our integrations has opened the door for us to tackle this project at an accelerated pace and at a reduced cost to our client.  Even with the new tools, the human and partnering elements of the integration process make a great study in the good, bad and ugly of system and service integration.

Web Application Capture

This integration was the easiest of the three but still brings to light some interesting challenges.  Early in the process we identified that, over time, our client had posted a number of application capture websites for their loan product.  The websites were hosted with different providers that had different data policies.  It turned out that only one of the sites was hosted by a provider that would allow access to the site database.  This kind of database restriction is quite common with low-cost providers.

In the end we were able to work with the client to consolidate hosting providers and set up the domains on a single hosting service that allowed database access via ODBC.  We were then easily able to connect and extract the data using Talend ETL tools.
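
To give a sense of what that extraction looks like, here is a minimal sketch in Python using pyodbc; the DSN, table and column names are hypothetical stand-ins, and the production job is built in Talend rather than hand-written code.

    import pyodbc
    from datetime import datetime, timedelta

    # Only pull applications captured since the last run (illustrative window).
    last_run = datetime.utcnow() - timedelta(days=1)

    # "loan_site" is a placeholder ODBC DSN pointing at the hosted site database.
    conn = pyodbc.connect("DSN=loan_site;UID=readonly;PWD=example")
    cursor = conn.cursor()
    cursor.execute(
        "SELECT applicant_name, email, requested_amount, submitted_at "
        "FROM applications WHERE submitted_at > ?",
        last_run,
    )
    new_applications = cursor.fetchall()
    conn.close()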

Credit Reports

As part of the loan approval process, our client checks credit on all applications.  They use the Equifax credit reporting service and currently pull each credit report individually over the web.  This process can take hours of staff time per day and also delays how quickly agents can follow up and contact loan applicants.  Using the Talend ETL tools we were able to automate the credit report retrieval and push the data to QuickBase for review.  Low credit scores can be automatically filtered out and the remaining applications reviewed and assigned to sales within minutes of the client submission.
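
The routing logic itself is simple.  The sketch below is illustrative only, with a made-up cutoff score and field names; the real rules live in the QuickBase application the client reviews.

    # Hypothetical cutoff used for illustration, not the client's actual policy.
    MIN_SCORE = 620

    def route_application(app):
        """Flag low credit scores for filtering; everything else goes to sales."""
        if app["credit_score"] < MIN_SCORE:
            app["status"] = "Filtered - low score"
        else:
            app["status"] = "Assigned to sales"
        return app

    applications = [
        {"applicant": "Jane Doe", "credit_score": 705},
        {"applicant": "John Roe", "credit_score": 540},
    ]
    routed = [route_application(a) for a in applications]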

Integrating with the Equifax credit reporting service is not without its challenges.  Administratively, Equifax is very bureaucratic about granting access due to security concerns.  This is not without good reason but needs to be planned for.  The API documentation is not published publicly, so it can only be obtained once the system-to-system (STS) service is negotiated with Equifax sales.  Once we gained access to the API we also had to undergo an on-site security review of our development facilities, and our client had to work with Equifax to gain approval and access codes for the integrated service.

The technical integration with Equifax also has a number of complexities.  The most well-documented method posts fixed-length data over HTTP.  Setting up fixed-length data is quite cumbersome, so we pursued the less well-documented XML method.  Fortunately Equifax employs a full-time staff person from IBM to oversee their integration services.  We were able to get specific instruction on several minor nuances of the XML integration that are not documented, and thus succeeded in connecting to the service and accessing the needed data.
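
Stripped of the Equifax-specific details (which are available only under agreement), the general pattern is an XML payload posted over HTTPS.  The sketch below uses Python's requests library with placeholder endpoint, credentials and element names; it is not the actual Equifax STS schema.

    import requests

    # Placeholder payload; the real request schema is documented only under agreement.
    request_xml = """<?xml version="1.0" encoding="UTF-8"?>
    <CreditRequest>
      <Member number="EXAMPLE" securityCode="EXAMPLE"/>
      <Subject lastName="Doe" firstName="Jane"/>
    </CreditRequest>"""

    response = requests.post(
        "https://gateway.example.invalid/sts",  # placeholder endpoint
        data=request_xml.encode("utf-8"),
        headers={"Content-Type": "text/xml"},
        timeout=30,
    )
    response.raise_for_status()
    report_xml = response.text  # parsed downstream and pushed into QuickBase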

Electronic Signatures

Especially in a loan processing environment, collecting signatures electronically can save tons of time, avoid errors and eliminate overnight document shipping costs.  What’s exciting is the ability not just to capture electronic signatures but to dynamically populate the agreements from QuickBase data and bring data entered by the signers back into the QuickBase application.

Our client initially selected one of the larger electronic document services.  After struggling for weeks with a poorly documented API and a service department that admitted they were not even sure how their own API worked, we succeeded in structuring the correct calls to connect to the service.  Then, just before going live, the service informed our client that they would have to pay a $2,500 fee to use the service with integrated data.  As a result we worked with the client to find Agreement Express, an exciting newer player in the electronic signature space.

Yet again, the integration with Agreement Express has not been without complexity.  Their HAPI API is designed primarily for website-based document creation rather than system-to-system data exchange.  Fortunately the Agreement Express technologists have a firm grasp of their technology and have been able to quickly work with us to develop a tailored API that fits our client’s needs and will likely benefit other clients in the future.

As we move to the rollout of this exciting project, it has been valuable to reflect back on the challenges above.  One of the most important takeaways for system and service integrations is that the underlying technologies, the clarity and completeness of documentation, and the service provider’s technical staff all play a significant role in how well an integration project succeeds.


Simple Data Entity Modeling

On yet another application review with a prospective client, the issue of potentially troublesome data entity relationships came up.  A number of our clients are existing QuickBase users who have made an initial attempt to design their applications and find that some assistance is needed.  The first thing we look at to understand their application requirements is the data entity relationships.

This is done in the context of both their existing application and their general business requirements, because applications are often not designed to correctly support the structure of the business information.

One of the most common mistakes we see is when application designers build inflexible structures into their applications because of a lack of understanding of relational data.  During the client call mentioned above we saw that they had set up a table of Projects and then wanted to capture monthly budget data.  To do this the application designer created twelve fields, one for each month, on the project form to capture the budget information.  For spreadsheet abusers this kind of flat structure may seem natural.  However, in a database environment the goal is to structure related data to support reporting, visibility and analysis of information.

Data Entity Modeling 101

Whenever we start development of a new client application our first step is to model the data entity relationships.  The key to this is asking detailed questions about how information in the organization is structured, used and viewed.  A simple line of questioning can help flesh out data entity relationships.

1. Start with the element of data that is at the core of the process being evaluated.  In most cases this should be relatively easy to identify; for example, when discussing a Project Management application the core data element is typically a Project.  CRM or Sales Force Automation applications can be more challenging but typically have an entity such as an Opportunity, Case or Lead as the core.

KEY QUESTIONS

  • What is the main function of the application?
  • What do you call the form that you currently fill out as part of the process?

2. Identify primary relationships to the core data entity (see Example A).  Almost all business processes involve data that has some kind of hierarchy, so it should be expected that a core data entity will have one or more sub-entities.  A sub-entity is a child relationship where the core is the parent and the sub-entity has one or more pieces of information related to the parent core entity.  There are many examples of this: Projects almost always have a list of associated Tasks, Quotes have Lines or Quote Items, and Sales Orders have Order Lines.

KEY QUESTIONS

  • What other information is captured as part of the primary function?
  • Is additional information singular to the core entity or might there be more than one piece of information of the same type?
  • Is information entered on a separate form linked or referenced to the core entity?

3. Identify auxiliary entity relationships to the core and primary data entities (see Example B).  We often refer to this auxiliary information as metadata because it is really information that helps qualify another data entity.  Auxiliary information is best defined as information that helps define, assign or organize a core or primary data entity.  Often information that is contained in a drop-down field is converted to an auxiliary data entity to allow for easier management of the list.  Another very common auxiliary data entity is a Staff or Resource table used to assign a Task or a Project Manager.  A simple sketch pulling the three steps together follows the questions below.

KEY QUESTIONS

  • Is the field on the primary or core entity a drop-down that needs to be selected from a dynamically changing list?
  • Is there additional information that needs to be passed to the primary or core entity from a selection other than just the selection value?
  • Is there categorical or list data that is likely to be used in more than one place in the application?
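
To pull the three steps together, here is a simple sketch of a core Project entity with a Task sub-entity and a Staff auxiliary entity, expressed as SQLite tables purely for illustration; in QuickBase the same shape is built with related tables.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    -- Auxiliary entity: a managed list used to assign work.
    CREATE TABLE staff (
        staff_id INTEGER PRIMARY KEY,
        name     TEXT NOT NULL
    );
    -- Core entity: the Project, with a Project Manager assigned from Staff.
    CREATE TABLE projects (
        project_id INTEGER PRIMARY KEY,
        name       TEXT NOT NULL,
        manager_id INTEGER REFERENCES staff(staff_id)
    );
    -- Primary sub-entity: each Project has one or more Tasks.
    CREATE TABLE tasks (
        task_id     INTEGER PRIMARY KEY,
        project_id  INTEGER NOT NULL REFERENCES projects(project_id),
        assigned_to INTEGER REFERENCES staff(staff_id),
        description TEXT NOT NULL
    );
    """)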

The Benefits of Proper Data Entity Modeling

Circling back to the example above, where the budget data was set up as a flat structure with a field for each month, there are a number of considerations to point out.

1. It would be confusing to interpret a project budget that spans the end of the year because there would be data points in the ending and beginning months but no indication of which year each month belongs to.

2. No project budget could ever be more than a year long (there are only 12 fields to capture data).

3. Reporting trend and aggregate data would be very difficult because there is no easy way to identify the budget year and the budget amounts are spread across 12 different fields.

The ideal solution would be to set up a Budget entity related to the Project and capture each month as a single entry with the month, year and amount.  This would allow QuickBase, and really just about any database, to report the budget information much more effectively.
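
A minimal sketch of that structure, again using SQLite purely for illustration: one Budget row per project, month and year replaces the twelve monthly fields, and trend or aggregate reporting becomes a straightforward query.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE projects (
        project_id INTEGER PRIMARY KEY,
        name       TEXT NOT NULL
    );
    -- One row per project, month and year, instead of twelve fields per project.
    CREATE TABLE budgets (
        budget_id  INTEGER PRIMARY KEY,
        project_id INTEGER NOT NULL REFERENCES projects(project_id),
        month      INTEGER NOT NULL,  -- 1 through 12
        year       INTEGER NOT NULL,
        amount     REAL NOT NULL
    );
    """)

    # Aggregate reporting is a simple query rather than arithmetic
    # across twelve separate monthly fields.
    totals_by_year = conn.execute("""
        SELECT p.name, b.year, SUM(b.amount)
        FROM budgets b JOIN projects p ON p.project_id = b.project_id
        GROUP BY p.name, b.year
    """).fetchall()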

This small example is just one of many such data entity challenges we have seen.  In some cases the limitations caused by inefficient structures are not realized until substantial data and process development has occurred, and correcting them can be a major project.  Therefore it is critically important when designing any database application to go through the data entity mapping exercise as an early step.


PaaS & ETL in the Application Ecosystem

Our conversation continues about the optimal mix of technology to support business process.  In our last TechWise blog, “Living IT, QuickBase Leads The Way”, we discussed the concept of Living IT and why organizations should plan for changing and dynamic technology.  To elaborate further on this topic we want to introduce some ideas around what we refer to as the Application Ecosystem and how technologies such as PaaS and ETL fit in.

Core Systems

The Application Ecosystem of an organization is a broad way to refer to the various technology tools that the organization uses.  This applies to government, for-profit and non-profit organizations.  At the center of the Application Ecosystem is what we refer to as Core Systems.  These tools are typically associated with basic organizational functions required for accounting and transactional purposes.  For smaller businesses, tools like QuickBooks or PeachTree are the frequent choice, while larger organizations have mostly deployed ERP systems like Oracle ERP or SAP.

While accounting and ERP systems have expanded to include broader functionality, few if any organizations are able to function with a single technology to manage business processes.  This is because ERP and accounting systems are designed for best practices, with transaction management as the primary focus and process management flexibility given limited attention.  This means that organizations are pushed to find technologies that complement and extend core systems.  We refer to the multitude of applications that support defined and ad hoc organizational processes outside the core systems as the Extension Layer.

The Extension Layer 

There are two basic types of applications in the Extension Layer: Point Solutions and Situational Applications.  Point Solutions are specialized, typically best-of-breed applications that solve a specific and well-defined need.  Image management, warehouse management and CRM are areas often targeted for Point Solutions.  The other type of application in the Extension Layer is often referred to as Situational Applications.  These are applications that solve more unique or possibly temporary organizational needs where no viable Point Solution exists.  Often these types of applications are managed inefficiently using spreadsheets or simple databases.

Application Connections 

The final but very important element of the Application Ecosystem is the body of interconnections between applications.  These connections or integrations may be between applications in the Extension Layer or with Core Systems.  In many cases organizations lack the technical capability to effectively integrate applications, so information is moved between applications through manual, human processes.  Only when extension applications reach a significant size and value are they integrated in an automated way with other applications.

PaaS & ETL Enable Flexibility in Extension Layer Application Creation and Interconnectivity

Platform-as-a-Service (PaaS) tools such as QuickBase and Wolf Frameworks provide organizations with tool sets to rapidly build and deploy Extension Layer applications that extend core accounting and ERP systems or provide effective departmental or workgroup functionality.  PaaS is frequently utilized for Situational Applications but is also quickly becoming a strong choice for CRM, sales force automation and other areas often relegated to Point Solutions.

One of the main benefits of PaaS as part of an organization’s Extension Layer is the ability to quickly and easily interconnect data and processes between applications built on the same PaaS technology.  For example, QuickBase allows you to create cross-application relationships as a simple and easy way to share information between applications.

A technology that is rapidly becoming mainstream is Extract-Transform-Load (ETL) tooling.  Open source technologies like Talend ETL allow rapid, low-cost development of integrations.  This means that organizations can choose to automatically tie together applications, processes and information that previously could not have been efficiently integrated.