Realities of Solution Change Management

When I was coming out of business school, somebody told me, “If you design a solution correctly, everybody can be a winner.”  It was up to the solution designer to capture every user’s requests, wants, and needs and to piece them together into a solution that creates high value for all of them.  Soon after, I entered the real world of system implementation and realized that this advice was purely, 100%, unequivocally wrong.

In the real world of solution design and implementation, executive management, middle management, and the rank and file often have very different and competing goals.  Executive managers typically value reporting and metrics. They want to capture every piece of data that might possibly be used for some type of analytical reporting at some point in the future. Executives tend to drive toward analytically powerful but overly complex systems.

On the other hand, rank-and-file employees typically value efficiency, ease of use, and collaboration.  They want a tool that actually speeds up their job and helps keep them on the same page with other team members. Rank-and-file employees tend to drive toward automated, efficient systems with very little analytical power or accountability. Middle managers are usually trapped somewhere in between, with the unenviable task of reconciling one group to the other.

So what is a QuickBase developer to do?  Far be it from me to offer a solution to what is probably the most difficult problem in change management; however, the following observations have been helpful to me.

Step 1: Accept Reality

First, I have found that after a new system is put in place, users can be grouped into one of three categories:

  • Winners: Users who gain high value from a system.
  • Losers: Users whose jobs will temporarily be made more difficult by the new system.
  • Sorta’ Winners/Sorta’ Losers: Users who will gain some value from the system, but will also have some difficulty adopting it.

It is impossible to design a system where at least one user group does not face challenges. The sooner we accept this reality, the sooner we can design strategies to manage the effects of change on that group. This is especially evident in sales automation systems, most of which are geared more towards executive management and middle management.  It is usually an uphill battle to explain to a sales rep, who is used to capturing sales quickly on a paper form and letting an admin deal with organizing the data, that they gain value from now having to log on to an online system they don’t yet understand in order to enter their leads. Or even worse, they now need to fill out a paper sheet on the road and re-enter it into the fancy new sales system. But what about the value they perceive from having a place to manage their contacts and follow-ups? My experience is that most sales reps already have sophisticated and effective ways of managing this for themselves.

So let’s face it: management, not the sales rep, is the winner in most sales systems.  If you were to build a sales system geared towards sales reps, management would most likely have to make major tradeoffs in the type of reporting they see.  Still not convinced?  A major goal of sales systems is pipeline reporting. What value does a sales rep get from having to log in to the system and update their “Expected Close Dates” every week? In this instance, I posit they get limited value.  Let’s be honest with ourselves that each group experiences solution changes very differently, so we can plan how to maximize value to the organization as a whole.

Step 2: Choose Your Winners

Your next step is to choose which user group will be your winners.  In some cases, this will already be chosen for you.  If you are hired by executive management and they are providing your requirements, you have no choice but to gear the system towards this user group. In other cases, it will be less clear who your winners should be.

Who is the optimal user group to build a system around? I don’t think there is a single right answer, as this is largely determined by the organizational situation.  My personal view is that no matter which user group you target as the Winners, it is often optimal to convince executive management to accept the role of Sorta’ Winners/Sorta’ Losers.  This creates an environment where management accepts trade-offs, resulting in a more balanced system.

Step 3:  Maximize Value to the Sorta’ Winners/Sorta’ Losers

Next, you should try to create as much value for the Sorta’ Winners/Sorta’ Losers as possible.  The goal, of course, is to win as many of these users over to your side as possible.  It is in the battle for the hearts and minds of this user group that the true success of the system is decided. With QuickBase this is easier, because the platform was designed to be simple yet powerful for end users and it also enables them to create their own solutions.

Step 4: Plan for Your Losers

Best case scenario, these users will grumble about the system, but comply. More often, they will vocally complain about the system, but comply. Worst case scenario, they will actively sabotage the implementation of the new system.  The absolute worst thing you can do with this group is ignore them or refuse to accept that they exist, though this often seems the path of least resistance. Their anger will grow and possibly spread to other user groups. I have found it helpful to listen to their issues and actively communicate the benefits of the system to the organization as a whole. Sometimes this works, sometimes it doesn’t, but at least they feel listened to. I also try to make concessions when possible and keep them in the communication loop. I have had varied results with this, but it always turns out better when I acknowledge them.

What do you think? How do you handle change management, especially user groups who will lose from your system?

Also, if you are interested in this sort of stuff, you should read Our Iceberg is Melting by John Kotter, whose ideas I borrowed freely.

About James Cosman

James is a Solution Architect at MCF Technology Solutions, a leading provider of cloud computing services (most notably QuickBase). His core competency lies in translating stated business needs into tangible, value-creating applications. His MCF team is based in the great city of Houston, Texas, and he holds an MBA from Rice University. Go Owls!


The Good, Bad and Ugly of Technology Service Integration

We are rolling out a project next week that is truly one of our crowning achievements as a small company.  The project involves replacing an outdated and cumbersome Access database that runs a sales and loan process for our medium-sized client with an end-to-end solution based on QuickBase.  What is most exciting is that we are integrating three external data sources as part of the process: web-based application capture, credit reports, and electronic signatures.

Our recent adoption of the Talend ETL technology to run our integrations has opened the door for us to tackle this project at an accelerated pace and at a reduced cost to our client.  Even with the new tools, the human and partnering elements of the integration process make a great study in the good, bad, and ugly of system and service integration.

Web Application Capture

This integration was the easiest of the three but still brings to light some interesting challenges.  Early in the process we identified that, over time, our client had posted a number of application capture websites for their loan product.  The websites were hosted with different providers that had different data policies.  It turned out that only one of the sites was hosted by a provider that would allow access to the site database.  This kind of restriction can be quite common with low-cost providers.

In the end, we were able to work with the client to consolidate hosting providers and set up the domains on a single hosting service that allowed database access via ODBC.  We were then able to easily connect and extract the data using the Talend ETL tools.
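
For those curious what that extraction looks like outside of the Talend canvas, here is a minimal Python sketch of the same idea using pyodbc; the DSN, credentials, table, and column names are hypothetical placeholders for the client’s actual site database.

```python
import pyodbc

# Hypothetical ODBC DSN pointing at the consolidated hosting provider's database.
CONNECTION_STRING = "DSN=loan_site;UID=readonly_user;PWD=example_password"

def extract_new_applications(since_id):
    """Pull web application-capture records newer than the last id we loaded."""
    conn = pyodbc.connect(CONNECTION_STRING)
    try:
        cursor = conn.cursor()
        cursor.execute(
            "SELECT id, applicant_name, email, amount_requested, submitted_at "
            "FROM loan_applications WHERE id > ?",
            since_id,
        )
        columns = [col[0] for col in cursor.description]
        return [dict(zip(columns, row)) for row in cursor.fetchall()]
    finally:
        conn.close()

if __name__ == "__main__":
    for application in extract_new_applications(since_id=0):
        print(application)
```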

Credit Reports

As part of the loan approval process, our client checks credit on all applications.  They use the Equifax credit reporting service and currently pull each credit report individually over the web.  This process can take hours per day for the staff and also delays the speed at which agents can follow up and contact loan applicants.  Using the Talend ETL tools, we were able to automate the credit report retrieval and push the data to QuickBase for review.  Low credit scores can automatically be filtered out, and the remaining applications can be reviewed and assigned to sales within minutes of the client’s submission.
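
The filtering and load step is conceptually simple: skip any application below the score cutoff and write the rest into QuickBase for sales review. The rough Python sketch below assumes QuickBase’s classic XML API (API_AddRecord); the realm URL, table dbid, user token, field IDs, and the 620 cutoff are all hypothetical placeholders, and the production version runs as a Talend job rather than a script.

```python
import requests

QB_REALM = "https://mycompany.quickbase.com"   # hypothetical realm
APPLICATIONS_DBID = "bxxxxxxxx"                # hypothetical table dbid
USER_TOKEN = "b1234_abcd_xyz"                  # hypothetical user token
MIN_SCORE = 620                                # example cutoff for auto-filtering

def push_application(applicant_name, credit_score):
    """Add one application record via the classic XML API (API_AddRecord)."""
    body = (
        "<qdbapi>"
        f"<usertoken>{USER_TOKEN}</usertoken>"
        f"<field fid='6'>{applicant_name}</field>"   # fid 6: applicant name (assumed)
        f"<field fid='7'>{credit_score}</field>"     # fid 7: credit score (assumed)
        "</qdbapi>"
    )
    response = requests.post(
        f"{QB_REALM}/db/{APPLICATIONS_DBID}",
        data=body,
        headers={"Content-Type": "application/xml",
                 "QUICKBASE-ACTION": "API_AddRecord"},
    )
    response.raise_for_status()

def load_applications(applications):
    """Filter out low scores, then push the rest for sales follow-up."""
    for app in applications:
        if app["credit_score"] < MIN_SCORE:
            continue  # automatically filtered out of the sales queue
        push_application(app["applicant_name"], app["credit_score"])
```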

Integrating with the Equifax credit reporting service is not without its challenges.  Administratively, Equifax is very bureaucratic about granting access due to security concerns.  This is not without good reason, but it needs to be planned for.  The API documentation is not published for the public, so it can only be accessed once the system-to-system (STS) service is negotiated with Equifax sales.  After we gained access to the API, we also had to undergo an on-site security review of our development facilities, and our client had to work with Equifax to gain approval and access codes for the integrated service.

The technical integration with Equifax also has a number of complexities.  The most well-documented method posts fixed-length data over HTTP.  Setting up fixed-length data is quite cumbersome, so we pursued the less well-documented XML method.  Fortunately, Equifax employs a full-time staff person from IBM to oversee their integration services.  We were able to get specific instruction on several minor, undocumented nuances of the XML integration and thus succeeded in connecting and accessing the needed data.
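
Because the Equifax STS documentation is distributed privately, the snippet below is only a generic illustration of the XML-over-HTTP pattern: build an XML inquiry, POST it over HTTPS, and parse the score out of the XML response. The endpoint, element names, and credential fields shown here are invented for illustration and do not reflect the actual Equifax schema.

```python
import requests
import xml.etree.ElementTree as ET

# Illustrative only: the real endpoint and schema are provided privately by the bureau.
SERVICE_URL = "https://example-credit-service.invalid/sts/xml"

def request_credit_score(first_name, last_name, ssn):
    """POST an XML credit inquiry and parse the score from the XML response."""
    request_xml = f"""<creditInquiry>
  <credentials memberNumber="EXAMPLE" securityCode="EXAMPLE"/>
  <subject>
    <firstName>{first_name}</firstName>
    <lastName>{last_name}</lastName>
    <ssn>{ssn}</ssn>
  </subject>
</creditInquiry>"""
    response = requests.post(
        SERVICE_URL,
        data=request_xml.encode("utf-8"),
        headers={"Content-Type": "text/xml"},
        timeout=30,
    )
    response.raise_for_status()
    root = ET.fromstring(response.content)
    score_element = root.find(".//score")   # element name is illustrative
    return int(score_element.text) if score_element is not None else None
```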

Electronic Signatures

Especially in a loan processing environment, collecting signatures electronically can save tons of time, avoid errors, and eliminate overnight document shipping costs.  What’s exciting is the ability not just to get electronic signatures but to dynamically populate the agreements from QuickBase data and also collect data entered by the signers back into the QuickBase application.

Our client initially selected one of the larger electronic document services.  After struggling for weeks with a poorly documented API and a service department that admitted they were not even sure how their own API worked, we succeeded in structuring the correct calls to connect to the service.  Then, just before going live, the service informed our client that they would have to pay a $2500 fee to use the service with integrated data.  As a result, we worked with the client to find Agreement Express, an exciting newer player in the electronic signature space.

Yet again, the integration with Agreement Express has not been without complexity.  Their HAPI API is designed primarily for website-based document creation, not system-to-system data exchange. Fortunately, the Agreement Express technologists have a firm grasp of their technology and have been able to quickly work with us to develop a tailored API that fits our client’s needs and will likely benefit other clients in the future.

As we move to the rollout of this exciting project, it has been valuable to reflect back on the challenges above.  One of the most important takeaways from system-to-service integrations is that the underlying technologies, the clarity and completeness of the documentation, and the service provider’s technical staff all play a significant role in how well an integration project succeeds.


Simple Data Entity Modeling

On yet another application review with a prospective client, the issue of potentially troublesome data entity relationships came up.  A number of our clients are existing QuickBase users who have made an initial attempt at designing their applications and find that some assistance is needed.  The first thing we look at to understand their application requirements is the data entity relationships.

This is done in the context of both their existing application and their general business requirements, because applications are often not designed to support the structure of the business information.

One of the most common mistakes we see is application designers building inflexible structures into their applications because of a lack of understanding of relational data.  During the client call mentioned above, we saw that they had set up a table of Projects and then wanted to capture monthly budget data.  To do this, the application designer created twelve fields, one for each month, on the project form to capture the budget information.  For spreadsheet abusers, this kind of flat structure may seem natural.  However, in a database environment the goal is to structure related data to support reporting, visibility, and analysis of information.

Data Entity Modeling 101

Whenever we start development of a new client application, our first step is to model the data entity relationships.  The key to this is asking detailed questions about how information in the organization is structured, used, and viewed.  A simple line of questioning can help flesh out the data entity relationships.

1. Start with the element of data that is at the core of the process being evaluated.  In most cases this should be relatively easy to identify; for example, when discussing a Project Management application the core data element is typically a Project.  CRM or Sales Force Automation applications can be more challenging but typically have an entity such as an Opportunity, Case, or Lead as the core.

KEY QUESTIONS

  • What is the main function of the application?
  • What do you call the form that you currently fill out as part of the process?

2. Identify primary relationships to the core data entity (see Example A).  Almost all business processes involve data that has some kind of hierarchy, so it should be expected that a core data entity will have one or more sub-entities.  A sub-entity is the child in a relationship where the core is the parent, and it carries one or more pieces of information related to the parent core entity.  There are many examples of this: Projects almost always have a list of associated Tasks, Quotes have Lines or Quote Items, and Sales Orders have Order Lines.

KEY QUESTIONS

  • What other information is captured as part of the primary function?
  • Is additional information singular to the core entity or might there be more than one piece of information of the same type?
  • Is information entered on a separate form linked or referenced to the core entity?

3. Identify auxiliary entity relationships to the core and primary data entities (see Example B, and the sketch following the questions below, which pulls all three steps together).  We often refer to this auxiliary information as metadata because it is really information that helps qualify another data entity.  Auxiliary information is best defined as information that helps define, assign, or organize a core or primary data entity.  Often, information contained in a drop-down field is converted to an auxiliary data entity to allow for easier management of the list.  Another very common auxiliary data entity is a Staff or Resource table used to assign a Task or a Project Manager.

KEY QUESTIONS

  • Is the field on the primary or core entity a drop-down that needs to be selected from a dynamically changing list?
  • Is there additional information that needs to be passed to the primary or core entity from a selection other than just the selection value?
  • Is there categorical or list data that is likely to be used in more than one place in the application?
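
To pull the three steps together, here is a small, hypothetical sketch of a Project Management model expressed as Python classes purely for illustration (a QuickBase build would express the same structure as related tables): Project is the core entity, Task is a primary sub-entity, and Staff is an auxiliary entity used for assignment.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Staff:
    """Auxiliary entity: a managed list used to assign people to work items."""
    staff_id: int
    name: str
    role: str

@dataclass
class Task:
    """Primary sub-entity: many Tasks relate to one parent Project."""
    task_id: int
    project_id: int                       # reference back to the core entity
    description: str
    assigned_to: Optional[int] = None     # reference to Staff (auxiliary entity)

@dataclass
class Project:
    """Core entity at the center of the process."""
    project_id: int
    name: str
    manager_id: Optional[int] = None      # reference to Staff (auxiliary entity)
    tasks: List[Task] = field(default_factory=list)
```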

The Benefits of Proper Data Entity Modeling

Circling back to the example above, where the budget data was set up as a flat structure with a field for each month, there are a number of considerations to point out.

1. It would be confusing to interpret a project budget that spans the end of the year, because there would be data points in the ending months and beginning months but no indication of which year each month belongs to.

2. No project budget could ever be more than a year long (there are only twelve fields to capture data).

3. Reporting trend and aggregate data would be very difficult, both because the budget year cannot easily be identified and because the budget amounts are spread across twelve different fields.

The ideal solution would be to set up a Budget entity related to the Project and capture each month as a single entry with the month, year, and amount.  This would allow QuickBase, and really just about any database, to report the budget information much more effectively.
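
As a rough illustration of the difference, the normalized structure below stores one Budget entry per month as a child of the Project, which makes year-end spans unambiguous and annual roll-ups trivial; the field names and figures are invented for the example.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class BudgetEntry:
    """One child record per month, related to its parent Project."""
    project_id: int
    year: int
    month: int        # 1-12
    amount: float

def annual_totals(entries):
    """Aggregate budget amounts by project and year -- awkward to do when each
    month lives in its own flat field on the Project record."""
    totals = defaultdict(float)
    for entry in entries:
        totals[(entry.project_id, entry.year)] += entry.amount
    return dict(totals)

# A budget that spans a year-end is unambiguous in this structure.
entries = [
    BudgetEntry(project_id=1, year=2010, month=11, amount=5000.0),
    BudgetEntry(project_id=1, year=2010, month=12, amount=5000.0),
    BudgetEntry(project_id=1, year=2011, month=1, amount=7500.0),
]
print(annual_totals(entries))   # {(1, 2010): 10000.0, (1, 2011): 7500.0}
```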

This small example is just one of many such data entity challenges we have seen.  In some cases, the limitations caused by inefficient structures are not realized until substantial data and process development has occurred, and correcting them can be a major project.  It is therefore critically important when designing any database application to go through the data entity modeling exercise as an early step.


PaaS & ETL in the Application Ecosystem

Our conversation continues about the optimal mix of technology to support business processes.  In our last TechWise blog, “Living IT, QuickBase Leads The Way”, we discussed the concept of Living IT and why organizations should plan for changing and dynamic technology.  To elaborate further on this topic, we want to introduce some ideas around what we refer to as the Application Ecosystem and how technologies such as PaaS and ETL fit in.

Core Systems

The Application Ecosystem of an organization is a broad way to refer to the various technology tools the organization uses.  This applies to government, for-profit, and non-profit organizations.  At the center of the Application Ecosystem is what we refer to as Core Systems.  These tools are typically associated with the basic organizational functions required for accounting and transactional purposes.  For smaller businesses, tools like QuickBooks or PeachTree are the frequent choice, while larger organizations have mostly deployed ERP systems like Oracle ERP or SAP.

While accounting and ERP systems have expanded to include broader functionality, few if any organizations are able to function with a single technology to manage business processes.  This is because ERP and accounting systems are designed for best practices, with transaction management as the primary focus and process management flexibility given limited attention.  This means that organizations are pushed to find technologies that complement and extend core systems.  We refer to the multitude of applications that support defined and ad hoc organizational processes outside the core systems as the Extension Layer.

The Extension Layer 

There are two basic types of applications in the Extension Layer: Point Solutions and Situational Applications.  Point Solutions are specialized, typically best-of-breed applications that solve a specific and usually well-defined need.  Image management, warehouse management, and CRM are areas that are often targeted for point solutions.  The other type of application in the Extension Layer is the Situational Application.  These are applications that solve more unique or possibly temporary organizational needs where no viable Point Solution exists.  Often these types of needs are managed inefficiently using spreadsheets or simple databases.

Application Connections 

The final but very important element of the Application Ecosystem is the body of interconnections between applications.  These connections, or integrations, may be between applications in the Extension Layer or with Core Systems.  In many cases organizations lack the technical capability to effectively integrate applications, so information is moved between applications through manual, human-driven processes.  Only when extension applications reach a significant size and value are they integrated in an automated way with other applications.

PaaS & ETL Enable Flexibility in Extension Layer Application Creation and Interconnectivity

Platform-as-a-Service (PaaS) tools such as QuickBase and Wolf Frameworks provide organizations with tool sets to rapidly build and deploy Extension Layer applications that extend core accounting and ERP systems or provide effective departmental or workgroup functionality.  PaaS is frequently utilized for Situational Applications but is also quickly becoming a strong choice for CRM, sales force automation, and other areas often relegated to Point Solutions.

One of the main benefits of PaaS as part of an organization’s Extension Layer is the ability to quickly and easily interconnect data and processes between applications designed on the same PaaS technology.  For example, QuickBase allows you to create cross-application relationships between applications as a simple and easy way to share information.

Another class of technology that is rapidly becoming mainstream is Extract-Transform-Load (ETL) tools.  Open source technologies like Talend ETL are allowing rapid, low-cost development of integrations.  This means that organizations can choose to automatically tie together applications, processes, and information that previously could not have been efficiently integrated.


QuickBase: Why RAM Matters

I first stumbled onto QuickBase more than five years ago as a Sourcing Manager at American Greetings. I was looking for a better way to collaborate with the Product Managers than sending a barrage of cost requests to my buying team. My first app, the RFQ Manager, was an instant hit. In fact, the Product Manager I tested it on liked it so much that I did not even need to ask the other PMs to start using the tool; they were asking me where to sign up. It wasn’t long before I was touting QuickBase as “the best thing since sliced bread,” but it would take five long years, hundreds of developed applications, and a deep search of the PaaS marketplace before I really understood what makes QuickBase special.

With so much time behind the wheel of QuickBase, the almost instant access and reportability of entire datasets linked across multiple applications, all updated virtually in real time, seemed like something to be taken for granted. Last winter, our MCF leadership team spent several days with QuickBase and talked a lot about what’s under the hood, namely the use of a RAM-based, in-memory database. At the time, however, I did not fully grasp the significance until I explored other PaaS and SaaS offerings and realized that few if any were delivering the kind of data experience that QuickBase could provide.

In-memory databases (IMDBs), also called main-memory databases (MMDBs), store information in RAM, whereas traditional databases store data on disk. In-memory storage is not persistent on its own, meaning that when the power is turned off the data is lost unless it has been written to disk, but accessing the information is much faster and requires fewer computations. It’s basically equivalent to the RAM on your PC, where information is held for running applications that need fast access to the data. This model is often used in applications where access speed is essential, such as 9-1-1 response systems and telecommunications.

What this means for QuickBase end users is that the accessibility of the data and the computational capability are optimized for speed. To translate this into more familiar terms, imagine a totally different setup for your PC where, instead of 1 or 2 GB of RAM and a huge hard drive, you use your hard drive only as backup and your PC with 200 GB of RAM is never turned off. Every file and application is almost instantly accessible; there’s no booting up and no time spent loading applications, because everything is always loaded in memory.

With all this in mind, it is clear why QuickBase provides world-class collaboration, access, and reporting for work groups, teams, smaller businesses, project managers, and more. When choosing where and how to deploy QuickBase, it is important to consider the use case and underlying data needs, as some requirements are not ideal for an in-memory database due to dataset size and call for larger, cheaper disk-based storage solutions. Fortunately, integrating with QuickBase is straightforward, allowing it to serve as the application for working data, with transactional or rarely needed data archived or purged.


A story of the benefits of platform openness in a Web 2.0 world

Much of the value of software is determined by how completely an application solves a particular user problem and meets a user need. The Web 2.0 revolution is evidence that no application can fully solve every nuance for every customer within an area of requirement.
This has long been evidenced by the proliferation of bolt-ons, add-ons, extensions, etc. designed for traditional software. In the Web 2.0 world, adding on to and extending software is an assumed, critical element of an application’s value. Software providers that fail to provide ample hooks into their applications will inevitably fail to maximize their value to customers.

As long-time developers and integrators of the Intuit QuickBase platform, we have become intimately familiar with the QuickBase API. The simplicity of the QuickBase API means that, in almost five years of use, we have rarely needed more than a few basic request types to command substantial interactivity with the application. In addition to being open to external calls, the QuickBase API can be used inside the application to create simple and complex customizations. What’s great about having such a simple but powerful API model is that it allows developers and reasonably savvy end users to solve a myriad of information problems that could not possibly have been anticipated by the platform designer.
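
As an example of what one of those basic request types looks like, here is a rough Python sketch of querying records with the classic XML API (API_DoQuery). The realm URL, table dbid, user token, and field IDs are hypothetical placeholders, and a production integration would add error handling and paging.

```python
import requests
import xml.etree.ElementTree as ET

QB_REALM = "https://mycompany.quickbase.com"   # hypothetical realm
TABLE_DBID = "bxxxxxxxx"                       # hypothetical table dbid
USER_TOKEN = "b1234_abcd_xyz"                  # hypothetical user token

def query_records(query, clist):
    """Run API_DoQuery and return each record as a list of field values."""
    body = (
        "<qdbapi>"
        f"<usertoken>{USER_TOKEN}</usertoken>"
        f"<query>{query}</query>"      # e.g. a filter on an assumed field id
        f"<clist>{clist}</clist>"      # period-separated field IDs to return
        "</qdbapi>"
    )
    response = requests.post(
        f"{QB_REALM}/db/{TABLE_DBID}",
        data=body,
        headers={"Content-Type": "application/xml",
                 "QUICKBASE-ACTION": "API_DoQuery"},
    )
    response.raise_for_status()
    root = ET.fromstring(response.content)
    # Each <record> element contains one child element per returned field.
    return [[f.text for f in record] for record in root.iter("record")]
```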

One of the greatest leaps in our business came recently with our move to open source ETL as a method for interacting with and controlling QuickBase applications. Because QuickBase does not have its own logic processing tools, it has become essential for us to tie in processing capability to best support our clients’ value. The Talend Integration Suite allows us to provide this type of processing capability, with advanced Java components for executing business processes and integration. Once again, one of our key decision points for selecting Talend to provide ETL services in connection with QuickBase was platform openness. Talend’s Java and Perl components are all user-configurable but also fairly easy to customize, which allows for expansion and extension of the toolkit whenever needed.

For a recent project with ENSAT, there was a need to provide EOHS and ISO compliance. To deliver on the need, we developed asynchronous routines that would run periodically throughout the day, check for certain information updates, and propagate the changes to the compliance requirements, ensuring that staff at a major manufacturing facility had the proper training for their assigned work. QuickBase was already being used successfully to capture training and staff information, so it seemed logical to extend QuickBase to handle the EOHS and ISO compliance as well. To execute the data processing required for the compliance determinations, a Talend job was created to pull QuickBase data, generate compliance records, and load them back for reporting. With just hours of assistance from Talend services, and a lot of brain power applied to writing the compliance data rules, we were able to set up the necessary Java componentry for an effective QuickBase integration and go live inside of a week.

As we evolve our practice at MCF Tech to support the future of cloud-based, Web 2.0 technology, it is apparent that the key is not to find silver-bullet technologies that aim to solve all problems, but rather to choose those that have the humility to know their limitations. Important questions to ask about any platform or application are whether there is an API, WSDL, or other integration method and whether customization is encouraged. It is surprising how many traditional and even cloud-based applications fail in the area of openness.