Five ways to align digital transformation with data-driven practices

Align Digital Transformation and Data Driven Organization
Let's examine how you can align two transformation programs to achieve complementary and reinforcing results. Your digital transformation program has a number of initiatives that enable new markets, develop new products, and target an overall improvement in customer experience. You may have a separate program aimed at enabling the data-driven organization, one that enables citizen data scientists, leverages big data technologies, and drives leaders and managers to use data in their decision making.

To align these programs, you have to consider how to enable data-driven decision making around the digital transformation program. Here are a few ways to do this:

  1. Ensure digital transformation programs are grounded in key performance indicators (KPIs) that serve as early indicators of whether programs are achieving desired results. In addition to any financial metrics, consider KPIs in customer experience, employee engagement, and operational efficiency that can be early indicators of financial success (a minimal tracking sketch follows this list).

  2. Develop a data gathering and sharing strategy, especially for new products and applications. I want to see metrics on marketing activities, product usage, and system performance to help guide decisions on priorities and roadmaps.

  3. Leverage data to update customer segments and user personas so that the digital strategy can evolve as new capabilities are deployed. Strategic activities should not be considered one-time events and should be updated with new data and insights.

  4. Engage a growing number of employees with data and insights when embarking on new initiatives. Transformation programs often begin with a short list of initiatives and participants that should evolve over time into a larger scope with broader participation. When bringing on new participants, align them early to both digital and data-driven thinking by leveraging insights captured from successful initiatives.

  5. Review enterprise systems, especially the CRM and marketing automation tools, for new data sources and workflow changes that enable a digital business.
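
To make the first point concrete, here is a minimal Python sketch of tracking non-financial KPIs against baselines and targets. The KPI names, values, and the 50% threshold are all illustrative assumptions, not prescriptions:

```python
# Minimal sketch: non-financial KPIs as early indicators of program results.
# All names, values, and thresholds below are hypothetical.
from dataclasses import dataclass

@dataclass
class KPI:
    name: str
    category: str   # e.g. "customer experience", "operational efficiency"
    baseline: float
    current: float
    target: float

    def progress(self) -> float:
        """Fraction of the way from baseline to target (can exceed 1.0)."""
        return (self.current - self.baseline) / (self.target - self.baseline)

kpis = [
    KPI("NPS", "customer experience", baseline=22, current=31, target=40),
    KPI("Order cycle time (days)", "operational efficiency", baseline=9, current=7, target=5),
]

for k in kpis:
    flag = "on track" if k.progress() >= 0.5 else "at risk"
    print(f"{k.name}: {k.progress():.0%} toward target ({flag})")
```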

Don't leave these as afterthoughts. Be explicit when working with initiative and team leaders to ensure these data-driven practices are incorporated in their programs.


continue reading "Five ways to align digital transformation with data-driven practices"

500+ Conferences for Technology, Digital, and Data Leaders

I've been slowly growing the number of events in my dashboard. I started with about 200 and now the list has grown to over 500 conferences for technologists.

Insights on Conferences for CIO, CTO, and CDO


  • Top technology topics are Big Data (42 conferences), IoT (19), and Artificial Intelligence (14).

  • 50%+ of the conferences directly target CIOs and CISOs.

  • 30%+ of the conferences are in the top three locations - New York (65), San Francisco (59), and Chicago (41).

  • 50%+ of the conferences occur between March and June. Top months are June (98), May (73), March (69), and April (54).

  • About 10% of the conferences are industry specific, with Finance and Healthcare the top two industries.

Click here to use the dashboard to find conferences that interest you.



continue reading "500+ Conferences for Technology, Digital, and Data Leaders"

5 Reasons to Estimate Agile Development with Story Points

I have a few posts on agile estimation. Here's one on why estimating is important, another on how to run a one-week agile planning sprint, and a more detailed one on how to conduct an agile sizing meeting. Does agile poker help? Yes, if that's how the team wants to self-organize and come to consensus on an estimate.

I am often confronted with the question of whether to estimate in hours or in story points. The debate over whether to estimate at all, and what measure to use, is extensive, so I'm going to share some reasons why I always encourage software and application development teams to estimate with story points.

Why Using Story Points Drives More Reliable Estimates and Consistent Velocity


  1. Estimating with points is easier for developers and aligns with how they interpret requirements and develop solutions. When reading the story and acceptance criteria, developers typically ask themselves a few questions. Do I understand the requirement? Do I believe it's important and understand why it was prioritized? Most importantly, do I have an idea of how to implement it, and is the solution similar to other things I have already implemented? Developers will assign higher story points to vague requirements or unknown implementations, accounting for both the effort required to develop a solution and the potential complexities in the implementation.

  2. Estimating high story points drives more questions and dialogue. Since story points express both effort and complexity, a higher estimate will often draw questions and dialogue on how to simplify the requirement or the implementation. Ask the developer why the estimate is high. If it was estimated in hours, you're more likely to get a list of implementation and refactoring steps that is hard to interpret for product owners and technical leaders who lack a deep understanding of the application architecture and code. If the estimate is in story points, more questions can be asked as to whether the developer is interpreting the requirement correctly, what makes the implementation complex, and whether there are alternative solutions that are easier to implement.

  3. It's easier to normalize story points across developers of different skills. An advanced developer may estimate that a story is only a couple of hours to implement, while an inexperienced one is more likely to estimate greater effort to account for the learning curve and mistakes. Now let's say teams are using the Fibonacci series to standardize sizes and craft definitions of what three vs. five vs. thirteen story points mean. Maybe a story size of three means that the implementation requires a single change to the user interface without any changes to the business logic or data model. When you define it that way, you're more likely to get both the advanced and inexperienced developer to estimate the same or a similar number of points for the story.

    So what accounts for the inexperienced developer's added effort to complete this story? You'll see it in the team's commitment and the stories assigned to the novice developer versus the advanced one. The novice developer is more likely to commit to fewer stories (and fewer total points) than the advanced developer. As the novice developer gains more experience and knowledge of the application architecture, you're likely to see a higher commitment.

    Here is a great post by Mike Cohn that elaborates on why estimating with story points helps teams with different skill and experience levels. 

  4. For those looking to capture development costs, measuring actual hours provides an easier-to-implement and more accurate solution. Most agile tools allow developers to log their work in hours, so if required, at the end of a sprint you can get a full cost accounting. See time tracking in Jira, Rally, and VersionOne. What's more interesting is that you'll have better data correlating estimated story points to actual hours, so you can ask questions about the variance (a sketch follows this list). For example, a high-point story with low effort implies a complex story or an overestimated one.

  5. Estimating and committing to story points more often leads to a consistent velocity. Development teams will consider not only the total points of prioritized stories but also the mix of them. For example, they may commit to three 5-point stories and one 13-point story for a total of 28 points, but may not commit to six 3-point stories and two 5-point ones, even though they add up to the same 28 points. When committing, developers take many other variables into account beyond size, complexity, and effort hours, and are more likely to commit to a blend of story sizes that fits the skills and expertise of the team. The added context in the decision making often leads to a more consistent velocity.
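
To illustrate the fourth point, here's a minimal Python sketch that compares estimated story points against actual logged hours and flags large variances. The story data and the 1.5x/0.5x variance bands are made up for illustration, and statistics.correlation requires Python 3.10+:

```python
# Minimal sketch: correlate estimated story points with actual logged hours
# to spot over- or under-estimated stories. Data below is invented.
import statistics

# (story_points, actual_hours) pulled from your agile tool's time tracking
stories = [(3, 5), (5, 9), (8, 4), (3, 4), (13, 30), (5, 16)]

points = [p for p, _ in stories]
hours = [h for _, h in stories]
avg_hours_per_point = sum(hours) / sum(points)  # ~1.8 in this sample

for p, h in stories:
    expected = p * avg_hours_per_point
    if h > expected * 1.5:
        note = "possibly under-estimated or hit hidden complexity"
    elif h < expected * 0.5:
        note = "possibly over-estimated"
    else:
        note = "within the normal band"
    print(f"{p}-pt story took {h}h (expected ~{expected:.0f}h): {note}")

# Pearson correlation between points and hours (Python 3.10+)
print(f"points-to-hours correlation: {statistics.correlation(points, hours):.2f}")
```
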
Want to read more? Here are some more points on using story points. Also, remember that estimation is hard, and what leaders should focus on is getting the culture, practices, and requirements right so that teams can deliver well-designed, well-performing applications.

continue reading "5 Reasons to Estimate Agile Development with Story Points"

2017 Events for CIO, CTO, Chief Digital Officers, and Chief Data Officers

I was doing some research last week on 2017 events for CIO, CTO, Chief Digital Officers, Chief Data Officers, and IT leaders. I found articles on CIO.com and TechTarget with a number of events listed and then went looking at other institutions (Gartner, Forrester, CDM Media, Evanta, Argyle, HMG Strategy, O'Reilly and others) for their lists.

I focused on events in the United States (an MVP of sorts) and aggregated almost 250 events targeting CIOs and other data, digital, and technology leaders.

It took some time to put together what I think is a reasonable, but not comprehensive, list. I then used Trifacta to merge and cleanse the data, added a dimension on "topics," and developed a Tableau dashboard for review (an illustrative code sketch of this wrangling follows below).
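
Trifacta is a visual data wrangling tool, but for readers who prefer code, here is a roughly equivalent pandas sketch. The file names, column names, and topic keywords are all hypothetical:

```python
# Illustrative pandas equivalent of the merge/cleanse/tag workflow above.
import pandas as pd

# Event lists pulled from different sources (hypothetical files and columns)
cio_com = pd.read_csv("cio_com_events.csv")        # name, city, date
techtarget = pd.read_csv("techtarget_events.csv")  # same columns

# Merge, then cleanse: trim whitespace and drop duplicate listings
events = pd.concat([cio_com, techtarget], ignore_index=True)
events["name"] = events["name"].str.strip()
events = events.drop_duplicates(subset=["name", "city"])

# Add a "topic" dimension from keywords in the event name
def tag_topic(name: str) -> str:
    lowered = name.lower()
    for keyword, topic in [("data", "Big Data"), ("digital", "Digital Transformation"),
                           ("security", "Security"), ("cloud", "DevOps")]:
        if keyword in lowered:
            return topic
    return "Technology"

events["topic"] = events["name"].apply(tag_topic)
events.to_csv("events_for_tableau.csv", index=False)  # data source for the dashboard
```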

Here is the CIO 2017 Events Dashboard. To use the dashboard, click on the map for your location and use the bar charts to drill down by topic, timing, and sponsor. In the grid, click on the Event Name to get to the event's website.


2017 CIO Events
Click the image to see 250+ 2017 events for CIO, CTO, Chief Digital Officers and Chief Data Officers

Insights into Event Topics


  • Chief Digital Officers should look at events under Digital Transformation, Digital Marketing, Customer Experience and Innovation.
  • Chief Data Officers should look at events under Big Data which also includes events on analytics, data science, and data management.
  • Emerging topics include blockchain, AI, IoT, wearables, and innovation.
  • DevOps includes cloud conferences. All other IT operations such as data centers and service desk are covered under Operations. Security is a separate topic.
  • Events from leading technology vendors are under the topic Technology.
  • There are separate topics for enterprise architecture, software development, and mobile.



continue reading "2017 Events for CIO, CTO, Chief Digital Officers, and Chief Data Officers"

How to Select a Data Visualization Platform for Citizen Data Scientists

Over the last few years, I've been telling readers, colleagues, stakeholders, and clients about the importance of establishing a data-driven organization as part of a digital transformation. To compete today, organizations need to be smarter and faster to strategically target market segments, develop new products, improve customer experiences, and automate operations. Organizations can't get there by shooting from the hip; they have to educate and empower a larger number of managers to leverage data in their decision making.

Is establishing a data science team sufficient? Skilled data scientists should be used for the most important and complex data analytics an organization requires, but this only provides part of the answer. First, there is a data science skill shortage, so it's unlikely most organizations have enough data scientists to perform all the analytics. Many organizations simply can't afford data scientists or don't have the cachet to recruit them, and enabling citizen development programs is one way CIOs can address the technical skill gap. To be successful, most organizations need to consider training and outfitting "citizen" data scientists who can develop analytics and mentor colleagues to use them in both strategic and tactical decision making.

Kicking off Citizen Data Science Programs


I've already blogged on how to kick off a citizen data science program. Read this post to see how to find early adopters for the program, get buy-in to support it, and start developing standards and practices. I've also shared what services citizen data scientists need to be successful and how to assign data roles and responsibilities between data and technology teams. I have also suggested best practices for developing dashboards and laid out an agile process for finding value in dark data.

But I haven't spoken about technologies and platforms for citizen data scientists. Selecting platforms is a very important consideration for making the program successful in both the short and the long term. Keep in mind that all organizations already have some tools for business analysts to process data, including Excel and other legacy BI platforms. Organizations should look beyond these tools if they are serious about citizen programs. From my post on data governance challenges around Microsoft Excel: "The issue is that Excel always made it too easy for business users to create splintered and derivative data sets." This is in addition to the very long list of Excel horror stories aggregated by the European Spreadsheet Interest Group. The other issue CIOs fear is that empowering citizens will lead to a proliferation of single-purpose reports and dashboards, similar to what many organizations implemented in their legacy BI solutions.


Like this post? Sign up for the newsletter or share this post with colleagues.



Data Visualization Tools Selection Criteria


So if you're going to outfit citizen data scientists, you have to consider some traditional business requirements and some newer, "big data"-driven ones in order to pick tools that are appropriate for the size, scale, skill, and complexity of the organization, the underlying data, and the analytics required.

For starters, since becoming a data-driven organization is key to successful digital transformation programs, consider reading six critical strategies for selecting breakthrough digital transformation platforms and my other post on six digital criteria to evaluate superior technology. These posts highlight a number of generic considerations when selecting technologies, such as (i) align on vision, strategic opportunity, and short-term needs, (ii) use experts to define solution sets, (iii) perform detailed reviews of the user experiences, (iv) evaluate documentation and the health of the tool's ecosystem (data, integration, and developers), and (v) consider the organizational impact of the tool.

When selecting data visualization tools for citizen data scientists, a number of more specific criteria emerge based on the people, data, analytics, and other constraints.

1. People, Skills, and Organizational Impact


These criteria require you to understand the needs of three types of users: (i) citizen data scientists, who will be the primary developers of dashboards and analytics; (ii) data scientists, quants, and statisticians, who may also use the tool but may have additional integration requirements; and (iii) end users of the completed dashboards and analytics.


  • Number of citizen data scientists - More training and governance will be required for larger teams.
  • Skill levels of the citizen data scientists - If skills are low, less sophisticated tools with easy user experiences will yield faster results. Tools that rely heavily on programming models may be difficult for novice groups.
  • Organization also has skilled data scientists, quants, or statisticians - Decide whether they are in scope and, if so, consider their integration needs. Advanced data scientists may prefer data visualization tools with programming models that offer more flexibility and integration capabilities.
  • Number of departments that will leverage completed dashboards - More departments imply disparate use cases. Consider tools that have programming models or mechanisms that enable reusing visuals.
  • Number of users that will access completed dashboards - For large audiences, the user experience of the dashboards and visuals should be a top criterion.

Bottom line: These criteria should help you decide whether you need a simple, easy tool for a small, less sophisticated group or a more comprehensive tool aimed at higher-skilled developers and greater organizational needs.

2. Data Management Considerations


You can't complete data discovery work, perform analytics, or create dashboards without some consideration of the underlying data sources and their complexities.

  • Number of data sources - Quantity is important, but more important is whether citizen data scientists will be incorporating new data sources on a regular basis.
  • Big data considerations? - Are you handling larger volumes, higher velocity, or a greater variety of data sources and types?
  • Real-time data? - Does your organization require processing data in real time?
  • Data quality, transformation, or master data considerations? - Are you connecting to relatively clean data sources, or do you expect significant data processing and preparation will be needed? If so, you may need a data preparation tool such as those offered by Informatica, Talend, Alteryx, or Trifacta.
  • Enterprise data sources? - Most organizations will look to secure and automate data connections from enterprise data sources.
  • SaaS data sources? - Many SaaS providers have APIs for pulling data. Review whether the data visualization tool offers a direct connection to your SaaS platforms or if one is available through platforms such as IFTTT or Zapier.
  • IoT data sources? - Sensors often produce a large volume and velocity of data. You'll likely need data storage and stream processing technologies to handle IoT sources before connecting a data visualization tool.
  • Confidential data and privacy considerations? - Will you need mechanisms to secure data and manage entitlements? If so, review the security capabilities of the data visualization tool and consider adding tools that mask and encrypt data elements.

Bottom line: These criteria all speak to whether you require additional data integration, preparation, processing, or management tools in addition to any data visualization tools. Many data visualization tools come with some data preparation capabilities, and some market themselves as end-to-end data management tools. A few will try to sell you on the concept that all you need is them, with no other databases, ETL, or data integration tools, because they come with all the required capabilities.

So these criteria should help you flesh out whether that is a realistic proposition. More data sources, bigger data, more enterprise sources, and more complex data preparation work are all indicators that you will likely need additional data management tools. On the other hand, if you're working with relatively few, less complex data sources, then evaluate the data preparation capabilities of the data visualization tools and see if they are "good" and "easy" enough.

3. Constraints


Before getting to the heart of the analysis, you'll want to consider other selection constraints.

  • Legacy tools - Does your scope include phasing out any legacy BI or reporting tools? If so, consider which dashboards, reports, or analyses are in scope for conversion and where there is flexibility to modify output formats.
  • Business model - In addition to overall cost, consider how the vendor prices with usage and whether that will create higher-than-expected costs as usage increases. This is a very important criterion for customer-facing analytics, especially if customers will receive access to the data visualization tool.
  • Costs and budget - Pricing models may box smaller organizations out of the more sophisticated tools. Can you afford it?
  • Regulation - Regulations may impose requirements on how and where data is stored and accessed. They may also require auditing, analytics lifecycle, documentation, and other data governance capabilities.
  • Hosting options - SaaS? Cloud? Data center? What options are available, and what are your organization's requirements?

Bottom line: Technology selections need to consider financial, legal, logistical and other constraints. It's best practice to identify these up front to help limit the scope of the review.

4. Data Visualization and Analytic Capabilities


You'll spend most of your time evaluating data visualization tools based on their visualization capabilities, ease of use, and sophistication of analysis.


  • Chart types available - Every visualization tool comes with a toolkit of chart types. All will have bar charts, pie charts, data tables, etc., but some will include geo mapping, heat maps, node graphs, and other more sophisticated visuals. What's required versus nice to have?
  • One-time or ongoing analysis - If you're conducting mostly one-time discovery work, consider how easy it is to use the "out of the box" analytics and review the tool's storytelling capabilities. (Some good examples of storytelling are here and here.)
  • Internal or customer facing - If you intend to develop customer-facing analytics, this has implications for the type of delivery expected (direct access versus PDF outputs, for example), whether there are style or branding considerations for the final product, security considerations (how to enable data entitlements), and performance considerations (speed becomes more critical).
  • Analytics needs - Aggregations? Trends? Modeling? Machine learning? You'll want to consider not only whether the tool has the capability, but how easy it is to use and whether you'll need to integrate with programming environments such as R or Python to implement these algorithms (a short sketch follows below).
  • Visual configuration needs? - It's one thing to have the desired chart types, but you should also consider how easy they are to configure and the overall configuration capabilities. If you're building customer-facing visuals, reviewing the visual configuration capabilities is important to ensure the output meets minimal customer expectations.
  • Reusability and standards? - If you plan to develop a large number of dashboards or analyses, consider how to reuse and standardize elements such as dashboard layouts, chart configurations, calculations, expressions, and other programmed elements.

Bottom line: These criteria address the core capabilities of the tools and separate less sophisticated needs from those requiring more flexibility and analytical capability. You'll want to invest considerable effort investigating these capabilities, but be prepared to make compromises. Most tools can't be all things to all people, though many will try to sell you on the idea that they can handle all your requirements. The best way to evaluate these tools is to run proofs of concept.
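
As a small example of the R/Python integration noted in the "Analytics needs" item above, here's a minimal sketch that pre-computes a trend line in Python so a visualization tool can simply plot it. The data file and column names are hypothetical:

```python
# Minimal sketch: pre-compute analytics in Python when a visualization
# tool's built-ins fall short. File and column names are hypothetical.
import numpy as np
import pandas as pd

df = pd.read_csv("monthly_signups.csv")  # columns: month_index, signups

# Fit a linear trend the dashboard can plot alongside the raw series
slope, intercept = np.polyfit(df["month_index"], df["signups"], deg=1)
df["trend"] = slope * df["month_index"] + intercept

df.to_csv("signups_with_trend.csv", index=False)  # feed this to the viz tool
print(f"signups changing by ~{slope:.1f} per month")
```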

Data Visualization Tool Selection Process


The figure below provides some guidelines on the selection process.




In summary:

  • Define a tool selection committee and have them propose a charter - Keep this team small, but empower them to make decisions to avoid stakeholder conflicts.

  • Use primary selection criteria to short-list the tool set - There are a large number of data visualization tools in the market today, so use the criteria from people/organization, data management, and constraints to help narrow down the list (a minimal scoring sketch follows this list).

  • Commission proofs of concept to evaluate the visualizations and analytics - This is better than doing a paper evaluation. Have a small group of your proposed citizen data scientists use the short-listed tools against some of the short-term needs and evaluate the output, effort, performance, and end-user satisfaction.
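
To make the short-listing step concrete, here's a minimal weighted-scoring sketch in Python. The tools, criteria, weights, and scores are all invented for illustration, and a real committee would tailor them:

```python
# Minimal sketch: weighted scoring to narrow a long list of data
# visualization tools to a short list. All values below are invented.
criteria_weights = {"ease_of_use": 0.3, "data_connectivity": 0.25,
                    "analytics_depth": 0.25, "cost_fit": 0.2}

# 1-5 scores from the selection committee's initial research
tool_scores = {
    "Tool A": {"ease_of_use": 5, "data_connectivity": 3, "analytics_depth": 3, "cost_fit": 4},
    "Tool B": {"ease_of_use": 3, "data_connectivity": 5, "analytics_depth": 5, "cost_fit": 3},
    "Tool C": {"ease_of_use": 2, "data_connectivity": 2, "analytics_depth": 4, "cost_fit": 5},
}

ranked = sorted(
    ((sum(scores[c] * w for c, w in criteria_weights.items()), name)
     for name, scores in tool_scores.items()),
    reverse=True,
)
shortlist = [name for total, name in ranked[:2]]  # carry top 2 into proofs of concept
for total, name in ranked:
    print(f"{name}: {total:.2f}")
print("shortlist:", shortlist)
```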

The Gotchas in Selecting Data Visualization Tools


I promise you that selecting a data visualization tool isn't as easy as I just laid out, and there are a number of significant "gotchas" that can steer you in the wrong direction. I'll share these first in my newsletter, so please consider signing up!

continue reading "How to Select a Data Visualization Platform for Citizen Data Scientists"

Hot Twitter Hashtags for CIO [DataViz]

We're tweeting about key technologies like blockchain, artificial intelligence, IoT, big data, and analytics. Transformation topics such as digital transformation, agile, social business, and smart cities are also at the top of our tweets. Click here to use the data visualization on CIO Influencer Hashtags.

CIO Influencers Hashtag Data Viz
Click on the image to use the dashboard

continue reading "Hot Twitter Hashtags for CIO [DataViz]"