Monday, July 20, 2015

Why Business Leaders are Clueless about Data Integration

When someone says that the data integration process is automated, I suggest asking questions to clarify what they mean by automated. You'll probably conclude that the process is anything but automated, let alone reliable, scalable, secure, or configurable.

To some, automation implies efficiency and reliability but still allows manual steps, so long as they are performed quickly and easily. Others assume that if the process can be completed without IT's direct involvement, then it is automated. Still others don't care whether it's automated but are angered when the process breaks or fails to scale as more data is piped in. Lastly, there is an assumption that a daily process running on one volume of data will magically scale when existing data has to be reprocessed for a change in business logic or storage schema.

As hard as it is to modify software, modifying a semi-automated (in other words, partially manual) data integration process can be even more daunting, even if the steps are documented. For example, fixing data quality issues and addressing boundary conditions tend to be undocumented steps performed by subject matter experts.
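Those undocumented fixes are exactly the steps worth capturing in the integration itself. As a minimal sketch, assuming hypothetical field names and thresholds (they are not from any specific pipeline), data-quality rules can be expressed as small, documented checks that run inside the flow rather than living in a subject matter expert's head:

```python
# Minimal sketch: codifying data-quality rules that are often applied manually.
# Field names and thresholds are hypothetical examples, not a specific pipeline.

def non_empty(value):
    """Reject blank or missing identifiers."""
    return value is not None and str(value).strip() != ""

def in_range(low, high):
    """Reject boundary-condition values outside an agreed range."""
    return lambda value: low <= value <= high

RULES = {
    "customer_id": non_empty,
    "order_amount": in_range(0, 1_000_000),
}

def validate(record):
    """Return the list of fields in a record that violate a rule."""
    return [field for field, rule in RULES.items()
            if field in record and not rule(record[field])]

bad = validate({"customer_id": " ", "order_amount": 25.0})
print(bad)  # ['customer_id']
```

Records that fail a rule can then route to a review queue, which turns an undocumented manual rescue into a repeatable, testable step.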

Business Leaders are Clueless About Data Integration


So now you want to fix data integration. Take out the manual steps. Make the process more nimble, agile, reliable, etc. Why is it so hard to get business leaders on board with the investment needed in data integration?

Data processing and data integration technologies like ETL, Hadoop, Spark, Pig, Hive, and IFTTT are difficult enough for technologists to fully understand and apply to the right data issues, and the jargon just frustrates business leaders. Many are clueless about the technologies (other than the over-hyped term Big Data) and are surprised by the need to adopt them and invest the time to build expertise in them. "Information technology" and processing data have been around a long time, so there is an underlying assumption, even with Big Data, that integration is cheap and easy.

Unless you are doing very basic, point-A-to-point-B plumbing, data integration becomes quite complex as new data sources are added, logic is changed, and new applications are developed. Data integration may start off simple, but over time legacy data flows become difficult to understand, extend, or modify.

The complexity slows down IT, and if data and analytics are strategic to the business, it frustrates business leaders that they can't just add a new data source, modify calculations, improve data validation, or strategically change the downstream analytics.

So whatever technologies IT selects, and however they are presented to the business, they come across as plumbing. All that investment just to have a stable data flow? Analytics, visualization, applications, and for the most part anything useful done with the data will cost extra?

Data Integration = Core Big Data Infrastructure


The simple answer is that data integration is a key foundational technology capability if data is strategic to the business. It's not just a technology; it is a competency. Unfortunately, ranked against other data technologies like data warehousing (in RDBMS, NoSQL, etc.) and delivery (including analytics, data visualization, mobile application development, etc.), data integration capabilities are often a distant third in priority. So it's no surprise that many processes are not fully automated and that business leaders don't "get" the importance of this capability.

Knowing you have a problem is the first step to solving it... More in my next post!


continue reading "Why Business Leaders are Clueless about Data Integration"

Tuesday, July 14, 2015

Fixing your IT Legacy Before the Next Bridge Collapse

Last week we saw two significant, public system outages, one belonging to NYSE because of a "configuration issue" and the other to United Airlines because of a "router issue". This week, the news is on OPM's data breach affecting 21 million people and its lack of investment and follow-through on resolving security issues identified by auditors.

My first thought on seeing these failures is that CIOs have little hope of achieving 99.9%+ uptime, fifteen-minute-or-less recovery times, and secure perimeters when the world's greatest organizations suffer from what look like basic IT 101 issues. Though I am certain the issues are far more complex in reality, and fixing them is a significant challenge for their IT staff, I wonder how many times a CIO has walked into a board room with their head tucked in because of similar issues. A critical operational outage caused by some preventable infrastructure, operational, or software issue?

But my second thought turns to our roads, bridges, highways, and infrastructure. The bottom line: the US would have to invest $3.6 trillion to bring it all up to snuff by 2020. That's some technical debt, and we're lucky that infrastructure-related catastrophes aren't reported on a frequent basis. But what you may not see in the numbers, and what people feel on a daily basis, is the negative impact poor infrastructure has on growth. Stuck in traffic because a highway needs expansion? Subway delays because of a switching issue? How about that high-speed rail that would be faster, cheaper, safer, and more environmentally friendly?

Over the past five years, businesses have been smarter about investing in innovation and digital capabilities. We've been enabling analytics and data-driven decisions, adjusting to a user-driven mobile workforce, and ensuring we can personalize and improve customer experiences.

But have we attacked the IT legacy and technical debt with the same enthusiasm, let alone internal celebration? Are you still running Windows Server 2003 even though Microsoft will end extended support for it this week? Will you upgrade that .Net application on V3.5 or that Java one on V7? Will you connect and secure your data sources, or continue to run them as inefficient data silos?

Lots of work to do, but before you spend hours reviewing, debating, and assigning blame when you hit your next outage, I suggest spending a greater amount of energy, funding, focus, and celebration on where you will attack your technical legacy.
continue reading "Fixing your IT Legacy Before the Next Bridge Collapse"

Monday, July 06, 2015

DevQOps - Giving QA a Seat at the DevOps and Digital Transformation Table

I was surprised by the lackluster response to my last post, Why a QA Practice Is Critical to Long Term Success. After all, if more organizations are looking to achieve continuous delivery, why are we not talking about the critical role QA plays in a DevOps transformation?

If more organizations are investing in self-service BI programs, how are these new dashboards being tested before they are used in critical decision making? How are organizations validating new SaaS applications before they are deployed to users? Deploying new marketing tools? What steps are being taken to ensure KPI changes are as expected and no new issues are introduced?

Might I suggest that we give QA a seat at the table.


Perhaps we should call it DevQOps


Let's review some of the key QA functions:

  • QA should automate regression tests so that agile development and iterative releases can be performed efficiently and reliably (a minimal example is sketched after this list)

  • QA should be validating performance of the application especially when response time is critical to success and where usage growth is expected

  • QA should help minimize the work required in user acceptance testing, since users are often ill-equipped to test applications through multiple flows, data inputs, and boundary conditions

  • QA should lead efforts to validate security and perform other code validations

  • QA should ensure the user experience is optimized for different devices and browsers

  • QA should help manage business risks by itemizing and prioritizing them and developing action plans to mitigate them
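To make the first point concrete, here is a minimal, hypothetical sketch of an automated regression test using Python's built-in unittest module; the calculate_discount function and its business rules are invented for illustration, not taken from any real application:

```python
# Minimal regression-test sketch using Python's built-in unittest.
# calculate_discount and its rules are hypothetical, for illustration only.
import unittest

def calculate_discount(order_total, is_loyalty_member):
    """Business rule under test: 10% off orders over 100, plus 5% for loyalty members."""
    discount = 0.10 if order_total > 100 else 0.0
    if is_loyalty_member:
        discount += 0.05
    return round(order_total * discount, 2)

class DiscountRegressionTests(unittest.TestCase):
    def test_small_order_no_discount(self):
        self.assertEqual(calculate_discount(50, False), 0.0)

    def test_large_order_gets_ten_percent(self):
        self.assertEqual(calculate_discount(200, False), 20.0)

    def test_loyalty_member_gets_extra_five_percent(self):
        self.assertEqual(calculate_discount(200, True), 30.0)

if __name__ == "__main__":
    unittest.main()
```

Run on every build, a suite like this catches the "added one feature, broke another" problem before users do.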

 

Why Give QA a Seat at the Table?


Nothing in this list of responsibilities is new, and QA has been performing some or all of these roles in many application development practices for some time. If you're already investing in DevOps, you can't legitimately claim or achieve continuous delivery without factoring in some of these practices. My issue is that these needs are not given equal billing in a DevOps transformation, and QA seems to be the middle child being squeezed out by its more aggressive Dev and Ops siblings.

However, what is new is that many organizations are building customer-facing applications and proprietary workflows as part of their digital transformation programs. QA is often an afterthought in the budgeting process, and maturing QA is a distant third versus Dev and Ops practices.

The mindset among many business managers is that QA is still "the thing you do after the application is developed", that developers should be writing bug-free code, and that whatever validation is required can be performed by the expected business users. They are more likely to invest in additional developer resources if they believe it will gain them functionality or speed to market, or in systems engineers if it will gain them reliability or help reduce costs. The CIO and IT leaders in these programs are then more likely to invest their efforts in maturing development and operational practices without learning quality assurance tools, practices, and governance.

No QA = Legacy Application?


Whether you're moving to DevOps or investing in digital transformation, I suggest that leaving out or underinvesting in QA is a mistake. Ever hear of anyone going back and easily adding a QA practice to a legacy application? It's hard and often more expensive to do after the fact, so it's less likely to be done.

In fact, I would argue that a successful application without QA is the definition of a legacy application.

continue reading "DevQOps - Giving QA a Seat at the DevOps and Digital Transformation Table"

Monday, June 22, 2015

The Most Important and Often Underinvested IT Function

If you're practicing agile development without QA team members, a reasonably defined testing process, and sufficient criteria to help define "done", then at some point your development process will go off the cliff of complexity.

That's right. No QA, off the cliff you go without a parachute.


The size of your development team, the number of technologies in the development stack, and the business criticality of the applications are all indicators of how critical quality is. They shape how much runway IT has before quality issues drive material business risk that can easily overwhelm the IT development and operations teams.

Even if you're a solo citizen developer on a low-code platform, at some point you're going to make application changes where having defined test plans helps avoid issues that might impact users, customers, data quality, security, or performance. In larger scale applications, performance testing is not something you can do on demand, because the first load test almost always fails, and valid testing practices need to be implemented as part of the agile development process.
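As a minimal illustration of treating performance testing as a routine step rather than a one-off event, a small smoke test can run with every build. The endpoint, concurrency level, and response-time budget below are placeholders, and real programs would typically use purpose-built load-testing tools, but the principle holds:

```python
# Minimal performance smoke-test sketch using only the standard library.
# The URL, concurrency, and threshold are hypothetical placeholders.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET_URL = "https://example.com/health"   # placeholder endpoint
CONCURRENT_USERS = 10
MAX_AVG_SECONDS = 1.0

def timed_request(_):
    start = time.monotonic()
    with urllib.request.urlopen(TARGET_URL, timeout=5) as response:
        response.read()
    return time.monotonic() - start

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        durations = list(pool.map(timed_request, range(CONCURRENT_USERS)))
    average = sum(durations) / len(durations)
    print(f"average response time: {average:.3f}s")
    # Fail the build if the average creeps past the agreed budget.
    assert average <= MAX_AVG_SECONDS, "performance budget exceeded"
```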

Why a QA Practice Is Critical to Long Term Success


Let's look at some basic concepts that point to why QA is so critical to businesses that rely on technology:
  • More functionality, more testing - As applications are developed with more features and capabilities, more functions need testing. Every agile sprint increases the testing landscape, and if the discipline to define test cases and establish regression tests isn't developed in parallel, the backlog to develop them becomes too long and hard to execute afterward. If a development team is adding functions and then breaking others, it's likely because there isn't regression testing in place to catch these issues.

  • Architecture complexity - Applications are more complex, involving multi-tier architectures, multiple databases, transactions spanning multiple APIs, and computing performed on multi-zone cloud environments. If you aren't building unit performance tests along the way, then identifying a bottleneck when it emerges can be a lengthy, painful process.

  • Data and Analytics Testing - The application is working, but the data is wrong. Many applications today are data driven, enabling their users - internal or external - to make better data-driven decisions. Application testing isn't sufficient, because if the data is wrong, the entire value to the end user is compromised. But testing data, validating calculations, understanding boundary conditions, and ensuring aggregations or statistical calculations are valid is extremely hard without a defined test strategy (a minimal reconciliation check is sketched after this list).

  • Security Testing - A failed application security test may be costly to fix. Worse is a security failure in production that could have been avoided with security practices entrenched in the development practice. If you're not testing for security issues, then it's unlikely the development team is applying basic security design principles to the applications it develops.

  • Multiple devices, browsers - Finally, today's user experiences span phones, tablets, laptops, and IoT devices with different browsers and plugins. Ensuring that user interfaces function and that the user experience is optimized across all these modalities has never been easy; however, customers have minimal patience for mediocre or clumsy experiences.
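For the data and analytics bullet above, a defined test strategy can start with simple reconciliation checks. This is a minimal sketch; the order values, the dashboard figure, and the tolerance are hypothetical:

```python
# Minimal data-validation sketch: assert that a dashboard aggregate ties out
# against the detail records. Record values and tolerance are hypothetical.
detail_orders = [120.00, 75.50, 240.25, 99.99]   # source-of-truth rows
dashboard_total = 535.74                          # figure shown to users

def totals_reconcile(details, reported_total, tolerance=0.01):
    """Return True when the reported aggregate matches the detail sum."""
    return abs(sum(details) - reported_total) <= tolerance

assert totals_reconcile(detail_orders, dashboard_total), \
    "dashboard total does not match detail records"
print("aggregation check passed")
```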

 

Why Do Many Organizations Underinvest in QA?


The workflow may be broken. The data may be wrong. The user interface may look broken on an Android phone. The performance may be degrading over time. There may be a security hole just waiting for someone to exploit it. There is likely loss of revenue if there is an outage.

With so many things that can go wrong, why do many enterprises and the CIOs that lead them underinvest in QA?

Next post.

continue reading "The Most Important and Often Underinvested IT Function"

Monday, June 15, 2015

5 Key Practices - What IT Takes to be a Citizen Developer on Low Code Platforms?

Do the IT department and the applications development team have to own and lead all application development efforts?

Over my last couple of posts, I've been exploring how Citizen Developers - developers using lightweight, often low-code development platforms - can help their business functions automate tasks, analyze data, develop knowledge repositories, and connect data across applications. As a CIO, I've developed citizen development programs and believe this may be the next big thing in application development, but it takes some discipline and defined practices to avoid the potential perils of citizen-developed applications.

So in this post, I'll share some practices that will make these programs successful. Keep in mind that what you elect to implement from these practices will likely depend on the goals and strategy of the program and the technical capabilities of the platforms selected. For example, using integration SaaS platforms like Zapier, IFTTT, or itDuzzit to connect SaaS data platforms may not need a lot of rigor, but self-service BI programs need lifecycle practices and IT-provided data management services. It also helps to understand technical bounds, so low-code platforms like QuickBase, Tableau, and QlikSense may be in scope for citizen development, while others offering more technical capabilities, like Force.com, still require IT ownership or participation.

What Practices Are Needed To Enable Citizen Developers?


There are several ingredients needed for citizen development to flourish.
  • CIO sponsorship of these programs is critical; otherwise IT will view the platform and applications developed by citizen developers as rogue IT. There is also a greater risk of citizen-developed data silos or other unsupportable applications. CIOs that are not on board may communicate that these programs are contributing to enterprise data landfills or creating new data governance challenges. The last thing organizations need is a divide between IT and business functions that want more technology and are willing to roll up their sleeves to get it. I should also point out that organizations selling low-code platforms should invest more effort selling to and partnering with the CIO!

  • Formally define and communicate a strategy, a list of example opportunities, and target platforms so that the organization can align on a selected approach and priorities. What you don't want is every department pursuing different platforms, the program succeeding only with a handful of early-adopting departments, or citizen developers creating applications that add more complexity than value.

  • Establish the development practices and lifecycle for applications developed in these programs. Is there a review committee to determine priorities and the appropriateness of the platform? How are versions maintained? What naming conventions and documentation are minimally required? What UX/UI standards will be leveraged in these programs? How are applications tested? What security is required and reviewed? How are these applications monitored, and how is end-user support provided when required? When and how does the organization recognize that it has outgrown an application and a more scalable solution is required? Who decides, and how, when venturing into coding practices, data integration, or API usage is permissible? (A minimal sketch of tracking this lifecycle metadata follows this list.)

  • Who are the acknowledged Citizen Developers? What skills are required to be part of the program? How do you get their managers engaged in and supporting the program? What training, mentorship, and rewards are provided to successful developers? How are best practices for developing applications shared with other developers?

  • How are you marketing, promoting, and providing access to successfully developed applications? Where is the directory to find these applications? How are access roles and other permissions defined so there is consistency? Who signs off on new user access? Who is solicited on priorities for enhancements? Who takes responsibility for rolling out and communicating changes to end users? 
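As referenced above, one lightweight way to answer several of these lifecycle and directory questions is to keep a simple registry of citizen-developed applications. This is only a sketch; the fields, statuses, and example values are hypothetical, not a prescribed standard:

```python
# Minimal sketch of a citizen-application registry entry.
# Field names and values are hypothetical examples, not a prescribed standard.
from dataclasses import dataclass, field

@dataclass
class CitizenAppRecord:
    name: str
    business_owner: str
    platform: str                          # e.g. QuickBase, Zapier, Tableau
    data_sources: list = field(default_factory=list)
    review_status: str = "pending"         # pending / approved / retired
    security_reviewed: bool = False
    support_contact: str = ""

registry = [
    CitizenAppRecord(
        name="Vendor Onboarding Tracker",
        business_owner="Operations",
        platform="QuickBase",
        data_sources=["vendor master"],
        review_status="approved",
        security_reviewed=True,
        support_contact="ops-apps@example.com",
    ),
]

# A simple query IT and the business can both run: what is still unreviewed?
print([app.name for app in registry if not app.security_reviewed])
```

A record like this makes ownership, platform, data sources, and review status visible to both IT and the business without adding heavy process.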

This doesn't need to be overwhelming or hard, and it doesn't need to be defined all up front. It also doesn't require all the rigor of a software development lifecycle. This is just a sample of considerations and what to think about as a citizen development program is kicked off or matured.



continue reading "5 Key Practices - What IT Takes to be a Citizen Developer on Low Code Platforms?"

Monday, June 08, 2015

The Perils of Citizen Developers

Mile-wide spreadsheets. String parsers that break when unrecognized characters are passed to them. Rapid-fire queries that kill database performance. Forms that permit SQL injection. Hundreds or thousands of undocumented, disparate applications that don't share data or design patterns. User experiences that have no consistency.

Is that what goes through your mind when you hear the words Citizen Developers?

This is some of the feedback I received after my last post, 4 Reasons Why Citizen Developers May Be The Next Big Thing in Application Development. In general, many developers scoff at the idea of giving "corp cube dwellers" this capability, which leads to "desktop crapware". Citizen Development will "continue klunking along". My favorite comment: "If I had a penny for every time I have read development with no coding required." See the comments to my last post on Reddit.
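To make one peril from the opening list concrete: "forms that permit SQL injection" usually come from building queries by string concatenation. Here is a minimal sketch using Python's sqlite3 module with an invented table, showing the unsafe pattern next to the parameterized one:

```python
# Minimal sketch of the SQL injection peril and its fix, using sqlite3.
# The table and data are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT, email TEXT)")
conn.execute("INSERT INTO customers VALUES ('Alice', 'alice@example.com')")

user_input = "x' OR '1'='1"   # a malicious form value

# Unsafe: concatenating user input lets the attacker rewrite the query.
unsafe_sql = "SELECT * FROM customers WHERE name = '" + user_input + "'"
print(conn.execute(unsafe_sql).fetchall())   # returns every row

# Safe: parameterized queries keep the input as data, not SQL.
safe_rows = conn.execute(
    "SELECT * FROM customers WHERE name = ?", (user_input,)
).fetchall()
print(safe_rows)                              # returns no rows
```

The parameterized version is no harder to write, which is exactly the kind of guidance a governed citizen development program can bake in.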

Should Developers Scoff at Apps Developed by Business Users?


Software development and engineering is a craft, a skill, and a discipline. It's not mastered in a training course, a degree, or by being a programmer on a simple application. Developing robust, scalable, secure applications that can be extended and maintained often takes a team of multi-disciplined professionals working collaboratively, leveraging various tools and aligning to architectural principles and design patterns to create "software magic".

These same developers are often called in to rescue bad applications, to investigate a systems issue caused by a poorly designed application, or to take the blame for a process failure because a hacked-together app from a rogue developer isn't stable.

But these same developers also know there is too much work for them in the enterprise. They scoff at the idea of hiring mediocre developers to handle the demand, or at business users working around IT to meet their need for apps or reports. Many will shake their heads when looking at the work of spreadsheet jockeys or the legacy of data silos.

And the reality is, these developers would prefer building new, innovative applications where their work can have business impact. They want to work on the latest technologies. They are needed to help solve Big Data challenges, to help migrate more applications to the cloud, or to build customer facing mobile applications.

Citizen Developer Development Practices

I like the term Citizen Developer. Not everyone is a citizen. Citizens are governed and have accepted values and practices. They are given a set of tools to work with and instructions on how to use them. I may be qualified to go to Home Depot and get the tools, materials, and guidance to install a new light fixture safely and properly. On the other hand, I'm not permitted to upgrade the circuit leading to my home without a licensed electrician and the proper work permits.

So my next post will provide some governance, practices and specifics on What IT takes to be a Citizen Developer.

continue reading "The Perils of Citizen Developers"

Monday, June 01, 2015

4 Reasons Why Citizen Developers May Be The Next Big Thing in Application Development

I must be behind the times because last week was the first time I heard the term "Citizen Developer", a term Gartner defines as a user who creates new business applications for consumption by others using development and runtime environments sanctioned by corporate IT. They are referring to a business user and not a software developer in IT, and are careful to distinguish this from Rogue-IT which occurs when users select technologies not sanctioned by the IT department.

I have been a strong proponent of citizen development, though I've been using other terms for it on this blog. Self-service BI is a form of citizen development, and a strong practice is a key ingredient in becoming a data-driven organization. I also wrote The Best Line of Code is the One That You Didn't Have to Write!, referring to PaaS platforms that enable application development with little or no coding required. I heard the term at last week's conference for Intuit QuickBase, one of the best platforms for citizen development of database-driven web and mobile applications.

Why Citizen Development?


Gartner predicted that Citizen Developed applications would be 25% of new applications developed by 2014. I suspect the percentage today is lower than predicted, so here are a few reasons why this trend may take off in the next few years -

  • Understaffed IT departments with greater business demand for technology services - I've never seen an IT department with adequate resources and funding to tackle all the demands of the enterprise or organization. We spend significant effort to prioritize, identify business value, formulate risk scores, and calculate ROI to ensure we tackle the most critical opportunities, knowing that we can't manage every need and request equally and simultaneously. So one reason for citizen development is to handle technology needs - automating workflow, developing knowledge repositories, constructing reporting dashboards, or processing data - in domains that are difficult for IT to service. Very often these are operational groups, finance, and marketing, which have significant technology opportunities but are too often ranked lower in business priority.

  • DIY by SMEs vs. IT's ability to capture requirements and execute solutions - It's also no secret that IT often fails to understand and translate business requirements adequately. When deep subject matter expertise is required to facilitate a relatively easy-to-implement technology solution, it can be a lot more efficient for a citizen developer working in the organization to develop it. Examples include data-driven dashboards, department-specific task management, lightweight CRM, or no-code content management systems (CMS).

  • No/low-code solutions should be easier and cheaper to maintain - The reality is that custom-developed applications can be expensive, and many organizations underfund their ongoing support. Typically, the application development team is asked to move on and build new applications, leaving limited resources to enhance or upgrade older ones. In addition, the rate of adding new applications is often faster than the rate at which legacy applications are retired. Citizen-developed applications are largely low- or zero-code configurations, often deployed on SaaS or PaaS platforms. These applications should require less maintenance and are more likely to be enhanced if they are mission critical to the citizen developer's organization.

  • Emergence of tech-savvy business users and functions - Finally, I believe there are technically capable individuals entering organizations in non-IT functions who are interested in taking on more technical responsibilities. They include data scientists working in analytical functions, data stewards working in operational functions, sales operations managers, and digital marketers. Given the right platforms, practices, and governance, these tech-savvy users can become citizen developers.

So What's The Catch?


Needless to say, citizen development can easily become the next generation of IT nightmare. A good quote from Citizen Developers Will Ruin Software:

Citizen developers are only concerned with their immediate environment, looking at the problem that they are trying to solve so they can do their job, rather than seeing it in the context of the wider IT ecosystem.

I'm not sure I believe this has to be the case. Like any other technology, citizen development requires IT leadership to provide governance, practices, and guidance for the platforms citizen developers use. More in my next post!
continue reading "4 Reasons Why Citizen Developers May Be The Next Big Thing in Application Development"
