Seed Edition: Finally, light at the end of the tunnel for subscription billing!

A recurring theme in our business practice has been the search for a viable subscription billing service.  Several times a year we find ourselves diving into the minefield of payment gateways, processors and third-party services to support a new product offering of our own or a client project.  In our most recent foray, I started out by telling our client that “handling recurring payments is at best a daunting subject, and in truth is most likely an utter quagmire.”  Fortunately, we were invited to participate in the early beta of the Seed Edition offering and found light at the end of the tunnel.

The basic concept of subscription billing is that you need to get a credit card processed on a regular basis … sounds simple, right?  Unfortunately, it’s never that easy.  The payment processing world has layer upon layer of companies and services trying to get a small slice of every transaction, and a wide range of options, none of which are easily understood.  Over the years, we’ve evaluated and tested dozens of possible solutions.  Rather than name names or share horror stories, I thought it might help to provide a general overview of the world of subscription payments, which should convey the complexity of finding the right solution.

Credit Card Processing

There are many ways to get credit cards processed, including services designed mostly for web-based transactions, but pretty much every bank and financial institution these days will happily process cards for a fee.  Every processor has a different fee structure and service offering.  Some processors, such as Authorize.net, won’t sell direct, so you have to go through a processing reseller.  Also, not all of them support recurring payments and, even if they do, it’s likely a limited offering with no management features.

Payment Gateways

Most credit card processors provide a payment gateway that allows you to process credit cards online or through an API if you want to post transactions programmatically.  The main purpose of the gateway is to securely pass credit card information to the processor.  Not all gateways are alike.  Some are basically just websites that you can use in your office or home instead of a credit card machine.  Others can be embedded in a website or receive a form post, but may end up with a look and feel different from your product site.
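
To make the API option concrete, here is a minimal Python sketch of a programmatic gateway post.  The endpoint and field names are hypothetical stand-ins; every gateway defines its own, so consult your gateway’s API documentation.

    # A minimal sketch of posting a transaction to a gateway API.  The
    # endpoint and field names below are hypothetical; every gateway
    # defines its own.
    import requests

    GATEWAY_URL = "https://gateway.example.com/transact"  # hypothetical endpoint

    def charge_card(card_number, expiration, amount):
        """Securely pass card details through the gateway to the processor."""
        response = requests.post(GATEWAY_URL, data={
            "api_login": "YOUR_LOGIN",       # credentials issued by the gateway
            "api_key": "YOUR_KEY",
            "type": "AUTH_CAPTURE",          # authorize and capture in one step
            "card_num": card_number,
            "exp_date": expiration,          # e.g. "0126"
            "amount": "%.2f" % amount,
        }, timeout=30)
        response.raise_for_status()
        return response.text                 # gateways return a result code to parse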

Recurring Payments

Some processing services offer a gateway with a recurring billing service that allows you to store a client credit card with a monthly amount and automate the processing.  Others just allow storage of the credit card, and you then have to submit the transaction manually each period.  There are a number of services that will securely store credit card information and automatically send the bill to the processor on a regular basis … which is where Seed Edition comes in.

It’s very possible, when designing a web application or shopping cart, to store credit cards, navigate the maze of processing and gateway services and then set up scheduled processes to run credit cards every month.  Unfortunately, this is expensive, often unreliable and beyond the technical means of many smaller businesses.
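
For a sense of what that do-it-yourself approach entails, here is a rough Python sketch of a scheduled billing job.  The data structure is illustrative; a real implementation also has to handle PCI-compliant card storage, retries and failure notifications, which is exactly where the cost and unreliability creep in.

    # Rough sketch of a home-grown recurring billing job, run once per day
    # by a scheduler such as cron.  Field names are illustrative only.
    import datetime

    def run_daily_billing(subscriptions, charge_card):
        """Charge every active subscription due today; collect failures."""
        today = datetime.date.today()
        failures = []
        for sub in subscriptions:
            if sub["active"] and sub["bill_day"] == today.day:
                try:
                    charge_card(sub["card_token"], sub["expiration"], sub["amount"])
                except Exception as exc:  # declined cards, gateway outages, etc.
                    failures.append((sub["customer_id"], exc))
        return failures  # failures need follow-up: retries, emails, cancellation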

Seed Edition makes the entire process extremely simple.  PayPal and Authorize.net gateways are supported so you simply get the appropriate account and enter credentials in the Seed Edition setup interface.  It’s then an easy process to create a subscription model, add clients with regular payment patterns and begin billing on a regular basis.  For more advanced use cases, Seed Edition provides a robust API to automate account and subscription creation, and to generate payment inquiries.  The user interface is simple for billing staff and powerful for reporting and management activities.
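
To give a flavor of what API-driven automation can look like, here is a short Python sketch.  To be clear, the endpoint and field names below are hypothetical placeholders, not the actual Seed Edition API; the real calls are defined in the Seed Edition API documentation.

    # Hypothetical illustration only -- not the actual Seed Edition API.
    import requests

    API_BASE = "https://api.example-billing.com/v1"  # placeholder base URL

    def create_subscription(api_key, customer_id, plan_id):
        """Sketch of automating subscription creation through an HTTP API."""
        response = requests.post(
            API_BASE + "/subscriptions",
            auth=(api_key, ""),                       # placeholder auth scheme
            data={"customer": customer_id, "plan": plan_id},
            timeout=30,
        )
        response.raise_for_status()
        return response.json()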


Realities of Solution Change Management

When I was coming out of business school, somebody told me, “If you design a solution correctly, everybody can be a winner.”  It was up to the solution designer to capture every user’s requests, wants and needs and to piece them together into a solution that creates high value for all of them.  Soon after, I entered the real world of system implementation and realized that this advice was purely, 100%, unequivocally wrong.

In the real world of solution design and implementation, executive management, middle management, and the rank and file often have very different and competing goals.  Executive managers typically value reporting and metrics.  They want to capture every piece of data that might possibly be used for some type of analytical reporting at some point in the future.  Executives tend to push development towards analytically powerful but overly complex systems.

On the other hand, rank-and-file employees typically value efficiency, ease of use, and collaboration.  They want a tool that actually speeds up their job and helps keep them on the same page with other team members.  Rank-and-file employees tend to push towards automated, efficient systems with very little analytical power and accountability.  Middle managers are usually trapped somewhere in between, with the unenviable task of reconciling one group to the other.

So what is a QuickBase developer to do?  Far be it from me to offer a solution to probably the most difficult problem in change management; however, the following observations have been helpful to me.

Step 1: Accept Reality

First, I have found that after a new system is put in place, users can be grouped into one of three categories:

  • Winners: Users who gain high value from a system.
  • Losers: Users whose jobs will temporarily be made more difficult by the new system.
  • Sorta’ Winners/Sorta’ Losers: Users who will gain some value from the system, but will also have some difficulty adopting it.

It is impossible to design a system where at least one user group does not face challenges.  The sooner we accept this reality, the sooner we can design strategies to manage the effects of change on that group.  This is very evident in sales automation systems.  Most systems on the market are geared more towards executive management and middle management.  It is usually an uphill battle to convince a sales rep, who is used to capturing sales quickly on a paper form and letting an admin deal with organizing the data, that they gain value from having to log on to an online system they don’t yet understand in order to enter their leads.  Or even worse, now they need to fill out a paper sheet on the road and re-enter it into the fancy new sales system.  But what about the value they perceive from having a place to manage their contacts and follow-ups?  My experience is that most sales reps already have sophisticated and effective ways of managing this for themselves.

So let’s face it, management are the winners in most sales systems, not the sales reps.  If you were to build a sales system geared towards sales reps, management would most likely have to make major trade-offs in the type of reporting they see.  Still not convinced?  A major goal of sales systems is pipeline reporting.  What value does a sales rep get from having to log in to the system and update their “Expected Close Dates” every week?  In this instance, I posit they get limited value.  Let’s be honest with ourselves that each group experiences solution changes very differently, so we can plan how to maximize overall value to the organization as a whole.

Step 2: Choose Your Winners

Your next step is to choose which user group will be your winners.  In some cases, this will already be chosen for you.  If you are hired by executive management and they are providing your requirements, you have no choice but to gear the system towards this user group.  In other cases, it will be less clear who your winners should be.

Who is the optimal user group to build a system around?  I don’t think there is a single right answer to this, as it is largely determined by the organizational situation.  My personal view is that no matter which user group you target as the Winners, it is often optimal to convince executive management to accept the role of Sorta’ Winners/Sorta’ Losers.  This creates an environment where management accepts trade-offs, resulting in a more balanced system.

Step 3:  Maximize Value to the Sorta’ Winners/Sorta’ Losers

Next, you should try to create as much value for the Sorta’ Winners/Sorta’ Losers as possible.  The goal, of course, is to win as many of these users over to your side as possible.  It is in the battle for the hearts and minds of this user group that the true success of the system is decided.  With QuickBase this is easier, because it was designed to be simple yet powerful for end users, and it enables them to create their own solutions.

Step 4: Plan for your Losers

Best case scenario, these users will grumble about the system, but comply.  More often, they will vocally complain about the system, but comply.  Worst case scenario, they will actively sabotage the implementation of the new system.  The absolute worst thing you can do with this group is ignore them or refuse to accept that they exist, though this often seems the path of least resistance.  Their anger will grow and possibly spread to other user groups.  I have found it helpful to listen to their issues and actively communicate the benefits of the system to the organization as a whole.  Sometimes this works, sometimes it doesn’t, but at least they feel listened to.  I also try to make concessions when possible and keep them in the communication loop.  I have had varied results with this, but it always turns out better when I acknowledge them.

What do you think? How do you handle change management, especially user groups who will lose from your system?

Also, if you are interested in this sort of stuff, you should read Our Iceberg is Melting by John Kotter, whose ideas I borrowed freely.

About James Cosman

James is a Solution Architect at MCF Technology Solutions, a leading provider of cloud computing services (most notably QuickBase).  His core competency lies in translating stated business needs into tangible, value-creating applications.  His MCF team is based in the great city of Houston, Texas, and he holds an MBA from Rice University.  Go Owls!


The Good, Bad and Ugly of Technology Service Integration

We are rolling out a project next week that is truly one of our crowning achievements as a small company.  The project involves replacing an outdated and cumbersome Access database that runs a sales and loan process for our medium-sized client with an end-to-end solution based on QuickBase.  What is most exciting is that we are integrating three external data sources as part of the process: web-based application capture, credit reports and electronic signatures.

Our recent adoption of the Talend ETL technology to run our integrations has opened the door for us to tackle this project at an accelerated pace and a reduced cost to our client.  Even with the new tools, the human and partnering elements of the integration process make a great study in the good, bad and ugly of system and service integration.

Web Application Capture

This integration was the easiest of the three but still brought to light some interesting challenges.  Early in the process we identified that, over time, our client had posted a number of application capture websites for their loan product.  The websites were hosted with different providers that had different data policies.  It turned out that only one of the sites was hosted by a provider that would allow access to the site database.  This kind of database restriction is quite common with low-cost providers.

In the end we were able to work with the client to consolidate hosting providers and set up the domains on a single hosting service that allowed database access via ODBC.  We were then easily able to connect and extract the data using Talend ETL tools.
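
For illustration, the extraction step amounts to something like the following Python sketch using pyodbc; the DSN, credentials and table names are placeholders rather than the client’s actual schema (the production job runs in Talend).

    # Illustrative ODBC extraction; DSN and schema names are placeholders.
    import pyodbc

    last_run = "2010-01-01 00:00:00"  # normally persisted between runs

    conn = pyodbc.connect("DSN=loan_site;UID=report_user;PWD=secret")
    cursor = conn.cursor()
    # Pull the applications captured since the last run.
    cursor.execute(
        "SELECT applicant_name, email, amount_requested, submitted_at "
        "FROM loan_applications WHERE submitted_at > ?",
        last_run,
    )
    rows = cursor.fetchall()  # hand these off to the transform/load steps
    conn.close()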

Credit Reports

As part of the loan approval process, our client checks credit on all applications.  They use the Equifax credit reporting service and currently pull each credit report individually over the web.  This process can take hours per day for the staff and also delays the speed at which agents can follow up and contact loan applicants.  Using the Talend ETL tools we were able to automate the credit report retrieval and push the data to QuickBase for review.  Low credit scores can automatically be filtered out, and the remaining applications reviewed and assigned to sales within minutes of the client submission.

Integrating with the Equifax credit reporting service is not without its challenges.  Administratively, Equifax is very bureaucratic about granting access due to security concerns.  This is not without good reason, but it needs to be planned for.  The API documentation is not published publicly, so it can only be accessed once the system-to-system (STS) service is negotiated with Equifax sales.  Once we gained access to the API, we also had to undergo an on-site security review of our development facilities, and our client had to work with Equifax to gain approval and access codes for the integrated service.

The technical integration with Equifax also has a number of complexities.  The best-documented method posts fixed-length data over HTTP.  Setting up fixed-length data is quite cumbersome, so we pursued the less well-documented XML method.  Fortunately, Equifax employs a full-time staff person from IBM to oversee their integration services.  We were able to get specific instruction on several minor, undocumented nuances of the XML integration and thus succeeded in connecting and accessing the needed data.
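
The general shape of the system-to-system XML exchange looks something like the Python sketch below.  The endpoint and XML structure shown here are placeholders; the actual Equifax formats are only available once STS access has been negotiated.

    # Generic sketch of posting an XML credit request over HTTP.  The URL
    # and XML layout are placeholders, not Equifax's actual formats.
    import requests

    CREDIT_URL = "https://sts.example-bureau.com/request"  # placeholder endpoint

    request_xml = """<?xml version="1.0"?>
    <credit-request>
      <applicant>
        <name>Jane Doe</name>
        <ssn>000-00-0000</ssn>
      </applicant>
    </credit-request>"""

    response = requests.post(
        CREDIT_URL,
        data=request_xml,
        headers={"Content-Type": "text/xml"},
        timeout=60,
    )
    response.raise_for_status()
    report_xml = response.text  # parse scores from the reply and push to QuickBase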

Electronic Signatures

Especially in a loan processing environment, collecting signatures electronically can save tons of time, avoid errors and eliminate overnight document shipping costs.  What’s exciting is the ability not just to get electronic signatures, but to dynamically populate the agreements from QuickBase data and also return data entered by the signers to the QuickBase application.

Our client initially selected one of the larger electronic document services.  After struggling for weeks with a poorly documented API and a service department that admitted they were not even sure how their own API worked, we succeeded in structuring the correct calls to connect to the service.  Then, just before going live, the service informed our client that they would have to pay a $2,500 fee to use the service with integrated data.  As a result, we worked with the client to find Agreement Express, a newer and exciting player in the electronic signature space.

Yet again, the integration with Agreement Express has not been without complexity.  Their HAPI API is designed primarily for website-based document creation, not system-to-system data exchange.  Fortunately, the Agreement Express technologists have a firm command and understanding of their technology and have been able to quickly work with us to develop a tailored API that fits our client’s needs and will likely benefit other clients in the future.

As we move to the roll-out of this exciting project, it has been valuable to reflect on the challenges above.  One of the most important takeaways for system-to-service integrations is that the underlying technologies, the clarity and completeness of the documentation, and the service provider’s technical staff all play a significant role in how well an integration project succeeds.


Simple Data Entity Modeling

On yet another application review with a prospective client, the issue of potentially troublesome data entity relationships came up.  A number of our clients are existing QuickBase users who have made an initial attempt to design their applications and find that some assistance is needed.  The first thing we look at to understand their application requirements is the data entity relationships.

This is done in the context of both their existing application and their general business requirements, because applications often are not designed correctly to support the structure of the business information.

One of the most common mistakes we see is application designers building inflexible structures into their applications because of a lack of understanding of relational data.  During the client call mentioned above, we saw that they had set up a table of Projects and then wanted to capture monthly budget data.  To do this, the application designer had created twelve fields, one for each month, on the project form to capture the budget information.  For spreadsheet abusers this kind of flat structure may seem natural.  However, in a database environment the goal is to structure related data to support reporting, visibility and analysis of information.

Data Entity Modeling 101

Whenever we start development of a new client application our first step is to model the data entity relationships.  The key to this is asking detailed questions about how information in the organization is structured, used and viewed.  A simple line of questioning can help flesh out data entity relationships.

1. Start with the element of data that is at the core of the process being evaluated.  In most cases this should be relatively easy to identify; for example, when discussing a Project Management application, the core data element is typically a Project.  CRM or Sales Force Automation applications can be more challenging, but typically have an entity such as an Opportunity, Case or Lead as the core.

KEY QUESTIONS

  • What is the main function of the application?
  • What do you call the form that you currently fill out as part of the process?

2. Identify primary relationships to the core data entity (see Example A).  Almost all business processes involve data that has some kind of hierarchy, so it should be expected that a core data entity will have one or more sub-entities.  A sub-entity is a child relationship where the core is the parent and the sub-entity has one or more pieces of information related to the parent core entity.  There are many examples of this: Projects almost always have a list of associated Tasks, Quotes have Lines or Quote Items, and Sales Orders have Order Lines.

KEY QUESTIONS

  • What other information is captured as part of the primary function?
  • Is additional information singular to the core entity or might there be more than one piece of information of the same type?
  • Is information entered on a separate form linked or referenced to the core entity?

3. Identify auxiliary entity relationships to the core and primary data entities (see Example B).  We often refer to this auxiliary information as metadata because it is really information that helps qualify another data entity.  Auxiliary information is best defined as information that helps define, assign or organize a core or primary data entity.  Often, information that is contained in a drop-down field is converted to an auxiliary data entity to allow for easier management of the list.  Another very common auxiliary data entity is a Staff or Resource table used to assign a Task or a Project Manager (see the sketch after the questions below).

KEY QUESTIONS

  • Is the field on the primary or core entity a drop down that needs to be selected from a dynamically changing list?
  • Is there additional information that needs to be passed to the primary or core entity from a selection other than just the selection value?
  • Is there categorical or list data that is likely to be used in more than one place in the application?
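
To make the three layers concrete, here is a rough sketch of how the entities discussed above might be modeled, using Python dataclasses for illustration; the field names are examples only.

    # Staff is an auxiliary entity, Project is the core entity, and Task is
    # a primary sub-entity holding a reference back to its parent Project.
    from dataclasses import dataclass

    @dataclass
    class Staff:                 # auxiliary: qualifies/assigns other entities
        staff_id: int
        name: str

    @dataclass
    class Project:               # core entity at the center of the process
        project_id: int
        name: str
        manager_id: int          # reference to an auxiliary Staff record

    @dataclass
    class Task:                  # primary sub-entity: many Tasks per Project
        task_id: int
        project_id: int          # child side of the parent/child relationship
        description: str
        assigned_to: int         # another auxiliary Staff reference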

The Benefits of Proper Data Entity Modeling

Circling back to the example above, where the budget data was set up as a flat structure with a field for each month, there are a number of considerations to point out.

1. It would be confusing to interpret a project budget that spans the end of the year, because there would be data points in the ending and beginning months but no indication of which year the months belong to.

2. No project budget could ever be more than a year long (there are only 12 fields to capture data).

3. Reporting trend and aggregate data would be very difficult, due to the inability to easily identify the budget year and the scattering of budget amounts across 12 different fields.

The ideal solution would be to set up a Budget entity related to the Project and capture each month as a single entry with the month, year and amount.  This would allow QuickBase, or really just about any database, to report the budget information much more effectively.
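
As a quick illustration of why the restructured data is easier to work with, here is a short Python sketch: one Budget record per month, with year totals falling out of a simple grouping instead of a sum across twelve columns.  The figures are made up.

    # One Budget record per month, related to a Project by project_id.
    from collections import defaultdict

    budgets = [
        {"project_id": 1, "year": 2009, "month": 11, "amount": 5000},
        {"project_id": 1, "year": 2009, "month": 12, "amount": 5000},
        {"project_id": 1, "year": 2010, "month": 1,  "amount": 7500},
    ]

    # A budget spanning year-end is no longer ambiguous, and aggregating
    # by year is a simple grouping.
    totals = defaultdict(int)
    for b in budgets:
        totals[(b["project_id"], b["year"])] += b["amount"]

    print(dict(totals))  # {(1, 2009): 10000, (1, 2010): 7500}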

This small example is just one of many such data entity challenges we have seen.  In some cases the limitations caused by inefficient structures are not realized until substantial data and process development has occurred, and correcting them can be a major project.  It is therefore critically important when designing any database application to go through the data entity mapping exercise as an early step.


PaaS & ETL in the Application Ecosystem

Our conversation continues about the optimal mix of technology to support business process.  In our last TechWise blog, “Living IT, QuickBase Leads The Way”, we discussed the concept of Living IT and why organizations should plan for changing and dynamic technology.  To elaborate further on this topic, we want to introduce some ideas around what we refer to as the Application Ecosystem and how technologies such as PaaS and ETL fit in.

Core Systems

The Application Ecosystem of an organization is a broad way to refer to the various technology tools that the organization uses.  This applies to government, for-profit and non-profit organizations.  At the center of the Application Ecosystem is what we refer to as Core Systems.  These tools are typically associated with basic organizational functions required for accounting and transactional purposes.  For smaller businesses, tools like QuickBooks or PeachTree are the frequent choice, while larger organizations have mostly deployed ERP systems like Oracle ERP or SAP.

While accounting and ERP systems have expanded to include broader functionality, few if any organizations are able to function with a single technology to manage business processes.  This is because ERP and accounting systems are designed around best practices, with transaction management as the primary focus and process management flexibility given limited attention.  This means that organizations are pushed to find technologies that complement and extend core systems.  We refer to the multitude of applications that support defined and ad hoc organizational processes outside the core systems as the Extension Layer.

The Extension Layer 

There are two basic types of applications in the Extension Layer: Point Solutions and Situational Applications.  Point Solutions are specialized, typically best-of-breed applications that solve a specific and well-defined need.  Image management, warehouse management and CRM are areas that are often targeted by point solutions.  The other type of application in the Extension Layer is often referred to as Situational Applications.  These are applications that solve more unique or possibly temporary organizational needs where no viable Point Solution exists.  Often these types of applications are managed inefficiently using spreadsheets or simple databases.

Application Connections 

The final but very important element of the Application Ecosystem is the body of interconnections between applications.  These connections, or integrations, may be between applications in the Extension Layer or with Core Systems.  In many cases organizations lack the technical capability to effectively integrate applications, so information is moved between applications through manual, human processes.  Only when extension applications reach a significant size and value are they integrated with other applications in an automated way.

PaaS & ETL Enable Flexibility in Extension Layer Application Creation and Interconnectivity

Platform-as-a-Service (PaaS) tools such as QuickBase and Wolf Frameworks provide organizations with tool sets to rapidly build and deploy Extension Layer applications that extend core accounting and ERP systems or provide effective departmental or workgroup functionality.  PaaS is frequently utilized for Situational Applications, but is also quickly becoming a strong choice for CRM, sales force automation and other areas often relegated to Point Solutions.

One of the main benefits of PaaS as part of an organization’s Extension Layer is the ability to quickly and easily interconnect data and processes between applications designed on the same PaaS technology.  For example, QuickBase lets you create cross-application relationships as a simple and easy way to share information between applications.

A technology that is rapidly becoming mainstream is Extract-Transform-Load (ETL) tooling.  Open source technologies like Talend ETL allow rapid, low-cost development of integrations.  This means that organizations can choose to automatically tie together applications, processes and information that previously could not have been efficiently integrated.
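
The extract-transform-load pattern itself is simple; Talend builds jobs like this graphically and generates the code, but the underlying flow boils down to three steps, sketched below in Python with made-up data and rules.

    # The ETL pattern in miniature: extract from a source, clean the data,
    # load it into a destination.  Data and rules here are illustrative.
    def extract():
        # In practice this reads from a database, file or API.
        return [(1, " acme corp ", "150.00"), (2, "Widget Co", None)]

    def transform(rows):
        # Normalize names, cast amounts, and drop incomplete records.
        return [
            {"id": r[0], "name": r[1].strip().title(), "amount": float(r[2])}
            for r in rows
            if r[2] is not None
        ]

    def load(records):
        # In practice this posts to the destination system's API.
        for record in records:
            print("loading", record)

    load(transform(extract()))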


“Living IT”: QuickBase Leads The Way

It seems there is a continual push and pull in the technology world between centralizing and consolidating technology and distributing and diversifying it.  We’ve often asked ourselves whether our clients would be better served by a single technology to support their needs.

This line of questioning typically leads to evaluating the trade-off between a single multi-purpose technology that solves many problems, but none of them well, and a number of “best-of-breed” solutions.

In reality, organizations seem to answer this perennial question with decisions that continually shift the balance.  These decisions, and the resulting daily interactions with IT systems, are what we refer to as “Living IT”.  Acknowledging that technology within an organization is a dynamic entity allows for a whole new approach to solving business process and technology challenges.  A number of important factors are defining “Living IT” and its place in the 21st century organization.

1. The gap between management and technology is shrinking

As the Cloud evolves and the explosion of advanced collaboration and business process tools continues, technology that previously relied on IT staff to deploy and support is becoming accessible directly to information and process managers.  Platform-as-a-Service (PaaS) technologies such as QuickBase are a perfect example of tools that are narrowing this gap.  While QuickBase has the ability to handle advanced technological problems, the codeless development environment is used by managers with limited technical skills in tens of thousands of companies to deploy collaborative information gathering and process applications.

2. The costs and risks of change are shrinking as the benefits are growing

The 21st century company has to adapt quickly.  This means adding capacity and scaling quickly, but also being agile enough to shrink and shift directions.  In the “Living IT” environment, companies need tools that are flexible, scalable and in many cases expendable.  QuickBase is again an ideal model for supporting change at a low cost.  The QuickBase subscription model is user-based, allowing organizations to develop as many platform applications as needed, with costs scaling only with users.  As an organization’s needs change, applications can be retired without concern and new ones rapidly developed and deployed.

3. Interconnectivity and security, two sides of the same coin, are the main drivers of 21st century technology

Ubiquitous access to the internet has opened the doors to a new wave of collaboration between people and between technologies.  This new power of communication has also brought with it very real organizational concerns about the security and accessibility of information.  QuickBase provides powerful tools to take advantage of collaboration while also managing the associated risks.

  • Domain/application user model: QuickBase users are registered with the QuickBase domain or with a private domain for enterprise accounts.  Application managers can then invite users registered in their domain into applications to grant access.  This simple but powerful model allows for easy collaboration with users inside and outside the organization without compromising security.
  • Role-based security: within an application, access is defined for each user by designated roles.  Roles can be easily configured for simple access rules or more complex data-driven models.  This enables applications to effectively create different user and data experiences tailored for each application user.
  • Secure and capable API: QuickBase supports communication to and from external applications via an HTTP API model.  The API enables interaction with all application tables for query, record add/update and import actions, as well as more advanced interactivity.  All API interaction requires an authenticated session and, for increased security, applications can enforce API token requirements (see the sketch below).
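
As a rough sketch of the authenticated-session flow, the Python snippet below obtains a ticket and then queries a table with it.  The realm, table ID, credentials and query are placeholders; consult the QuickBase HTTP API guide for the exact parameters and calls.

    # Sketch of the HTTP API flow: authenticate for a ticket, then query.
    # Realm, table ID and query values are placeholders.
    import requests

    REALM = "https://yourcompany.quickbase.com"  # placeholder realm

    # Step 1: authenticate to obtain a session ticket.
    auth = requests.get(REALM + "/db/main", params={
        "a": "API_Authenticate",
        "username": "user@example.com",
        "password": "secret",
    }, timeout=30)
    ticket = auth.text.split("<ticket>")[1].split("</ticket>")[0]

    # Step 2: query a table, passing the ticket (and an application token
    # where the application enforces one).
    records = requests.get(REALM + "/db/abcd1234", params={  # placeholder table
        "a": "API_DoQuery",
        "ticket": ticket,
        "apptoken": "YOUR_APP_TOKEN",
        "query": "{'3'.GT.'0'}",  # field 3 (Record ID#) greater than zero
    }, timeout=30)
    print(records.text)           # XML records to parse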

The most important consideration in creating a true “Living IT” environment is to embrace, not fear, technology change.  Even very traditional technologies like accounting systems have to flex and change as tax and other financial policies change.  Process and collaboration tools need to be even more flexible to support organizational evolution.  QuickBase is an ideal tool for supporting the “Living IT” environment due to its ability to enable development of secure, collaborative process and information management applications.


QuickBase: Why RAM Matters

I first stumbled onto QuickBase more than five years ago as a Sourcing Manager at American Greetings.  I was looking for a better way to collaborate with the Product Managers than sending a barrage of cost requests to my buying team.  My first app, the RFQ Manager, was an instant hit.  In fact, the Product Manager I tested it on liked it so much that I did not even need to ask the other PMs to start using the tool; they were asking me where to sign up.  It wasn’t long before I was touting QuickBase as “the best thing since sliced bread,” but it would take five long years, hundreds of developed applications and a deep search of the PaaS marketplace before I really understood what makes QuickBase special.

With so much time behind the wheel of QuickBase, the almost instant access and reportability of entire datasets, linked across multiple applications and updated virtually in real time, seemed like something to take for granted.  Last winter, our MCF leadership team spent several days with QuickBase and talked a lot about what’s under the hood, namely the use of a RAM/in-memory database.  At the time, however, I did not fully grasp the significance until I explored other PaaS and SaaS offerings and realized that few, if any, were delivering the kind of data experience that QuickBase could provide.

In-memory databases (IMDBs), also called main-memory databases (MMDBs), store information in RAM, whereas traditional databases store data on disk.  IMDB storage is not persistent, meaning that when the power is turned off the data is lost, but accessing the information is much faster and requires fewer computations.  It’s basically equivalent to the RAM in your PC, where information is held for running applications that need fast access to the data.  This model is often used in applications where access speed is essential, such as 9-1-1 response systems and telecommunications.

What this means for QuickBase end users is that the accessibility of the data and the computational capability are optimized for speed.  To translate this into more familiar terms, imagine a totally different setup for your PC: instead of 1 or 2 GB of RAM and a huge hard drive, you use your hard drive only for backup, and your PC with 200 GB of RAM is never turned off.  Every file and application is almost instantly accessible; there is no booting up and no time spent loading applications, because everything is always in memory.

With all this in mind, it is clear why QuickBase provides world-class collaboration, access and reporting for work groups, teams, smaller businesses, project managers and more.  When choosing where and how to deploy QuickBase, it is important to consider the use case and underlying data needs, as some requirements are not ideal for an IMDB due to dataset size and call for larger, cheaper data storage solutions.  Fortunately, integrating with QuickBase is straightforward, allowing it to serve as the application for working data, with transactional or rarely used data archived or purged.