White paper

Evaluation guide: How to choose the right modern BI & analytics platform


About this guide

This evaluation guide aims to support IT organisations as they evaluate and select a modern BI & analytics platform suitable for a broad, enterprise-wide deployment.

The transition to a self-service-based, modern BI model requires IT to adopt a collaborative approach that includes the business in all aspects of the overall programme (see Redefining the role of IT in a modern BI world). This guide focuses on the platform evaluation and selection aspect of a modern BI programme. It is intended for IT to use collaboratively with business users and analysts as they assess each platform’s ability to execute on the modern analytics workflow and address the diverse needs of users across the organisation.

The modern analytics workflow is a cycle of inter-related capabilities.

IT enables the modern analytics workflow, but it is primarily driven by business users and analysts throughout the organisation, and its successful implementation requires collaboration and participation from all roles. In order to select a modern BI & analytics platform that can be adopted and widely deployed, organisations should consider the following set of foundational core attributes throughout the evaluation process, covered in detail in the “Core platform attributes to consider” section below:

• Platform integration & accessibility
• Ease of use
• User enablement
• Deployment flexibility
• Pricing and packaging

Intended audience

This guide assumes the following core role types will be represented and available to participate in applicable aspects of the evaluation:

  • IT/BI professional – performs all initial setup tasks including software installation, user provisioning, access rights, governance oversight and some development tasks (content and data source).
  • Content creator – performs most of the content creation tasks including data preparation, free-form exploration, content promotion and data validation.
  • Information consumer – primarily accesses and interacts with curated content and trusted data sources.

Throughout the guide, a primary role will be identified for each stage within the analytics workflow as this is the lead role for that specific stage of the evaluation. However, it is imperative that every stage of the evaluation includes participation and input from all of the above role types to ensure all needs and concerns are addressed through the process.

It should also be noted that for some organisations, the same person may serve multiple roles so it would not be uncommon for a single person to evaluate a platform from more than one perspective. Ultimately, the modern approach to business analytics will evolve to the point where it will no longer be possible (or necessary) to differentiate between an enabler, a producer, or a consumer of analytics within an organisation.

Prerequisites for evaluation

In order to conduct a comprehensive evaluation of a modern analytics platform, the following tasks should be completed prior to kicking off the evaluation process.

  • Desktop/server/cloud software licences needed for evaluation
  • Professional services/implementation partner engagement (if applicable)
  • Role identification and evaluation assignments:
    • IT/BI professional
    • Content creator
    • Information consumer
  • Access to cloud data source(s) and on-premises data source(s)
  • Initial environment setup
  • Initial user provisioning and security
  • Availability of mobile devices (phones and tablets: iOS, Android, other)

Modern BI & analytics platform evaluation

Core platform attributes to consider

This guide primarily focuses on evaluating specific inter-related capabilities that are important when selecting a modern BI & analytics platform. However, it is critical that the evaluation team considers the following list of non-technical core attributes that are essential to the successful implementation and execution of the modern analytical workflow in an organisation. These attributes should factor heavily into the ultimate decision as they collectively serve as the glue that holds together the individual capabilities of the workflow and are foundational in nature.

Platform integration & accessibility

  • Can all of the steps of the modern analytics workflow be executed seamlessly within the platform without the need to move between modules/products in a disconnected manner?
  • Can all of the steps of the modern analytics workflow be executed without IT involvement or specialised skills?

Ease of use

  • Is it easy for BI platform administrators to install, configure and manage the platform?
  • Is it easy for content creators to prepare data and curate data sources without upfront or ongoing assistance from IT?
  • Is it easy for content creators to author content and access the analytical capabilities of the platform without upfront or ongoing assistance from IT?
  • Is it easy for non-technical content consumers to find, view and interact with available analytical content?
  • Is it easy for non-technical content consumers to ask deeper questions autonomously and customise existing published content to suit their specific needs?

User enablement

  • Is role-specific training available and accessible to all users?
  • Are there self-paced tutorials and/or online webinars that users can access?
  • Is it easy for users to search and find answers to product-specific questions?
  • Is there a robust and active user community accessible to share and learn best practices, tips & tricks, etc.?
  • What is the platform vendor’s reputation for resolving technical support issues?
  • Are professional services (through vendor or partners) readily available?
  • What is the platform vendor’s reputation for ensuring customer success and ongoing engagement with customers?

Deployment flexibility

  • Does the platform offer flexible deployment options (e.g. SaaS, public/private cloud deployment, on-premises, etc.)?
  • Does the platform offer flexible data storage options (e.g. in-DB vs. platform storage (in-memory))?
  • Does the platform support hybrid connectivity of on-premises and cloud data sources?
  • Is the platform scalable to accommodate increasing data volumes and additional users over time?
  • Can the platform easily scale up and out depending on the needs of the organisation?

Pricing and packaging

  • Is the product packaging easy to understand?
  • Are the available licensing options clear and transparent?
  • Is the pricing model for the platform easy to understand?
  • Is the pricing model for the platform flexible and scalable?

Access and view

As organisations begin the transition from a traditional, IT-driven, top-down approach to one based on self-service, it is often advantageous for IT (or a centralised BI team) to develop an initial set of trusted data sources and analytical content. Business users can then access and use this content as a starting point for their analysis. Over time, as users are encouraged to ask and answer their own questions as part of the modern analytics workflow, the domain of available trusted content will grow organically, giving users access to a wider range of analytical content for self-service. For the purposes of this section, the origin of the content available to end users is disregarded; the evaluation criteria for getting content into a governed state are addressed in the “Promote & govern” section.

The evaluation criteria for this section will first be addressed from the perspective of the IT/BI professional who is ultimately responsible for the administration of the centralised environment where analytical content is stored and maintained, and data sources are administered and monitored.

Evaluation criteria:

IT/BI professionals should be able to:

  • Define and update underlying data refreshes and monitor status.
  • Choose where underlying data used for analysis should be stored and how it should be accessed.
  • Extend the platform to include partner-provided capabilities.
  • Monitor and audit usage of available content and perform impact analysis.
  • Diagnose and tune performance-related issues.

Evaluation considerations:

  • Can the refresh schedule be set and managed independently for each item stored centrally in the analytical content repository?
  • Can a specific person or role be set up to be notified of issues/failures in the data refresh process?
  • Can queries originating from the analytics platform be pushed down to the underlying database where the data resides?
  • Can data be ingested into the analytics platform’s in-memory/columnar storage for performance optimisation?
  • Can on-premises data be accessed live from the analytics platform when deployed in the cloud?
  • Can the platform be extended via APIs/SDKs to include supplemental analytical capabilities not natively delivered by the platform?
  • Can usage of specific data sources and available analytical content be tracked and audited by an administrator?
  • Can an administrator perform an impact analysis to determine the scope and severity of a proposed change to downstream content and processes?
  • Does the platform offer utilities to an administrator to identify, diagnose and resolve performance-related issues?
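Many of the administrator questions above (independent refresh schedules per repository item, failure notifications routed to a named role) can be prototyped before any vendor tooling is in place. The sketch below models that behaviour with an in-memory stand-in; `RefreshMonitor` and every name in it are hypothetical, not a real product API.

```python
from dataclasses import dataclass, field


@dataclass
class RefreshMonitor:
    # Hypothetical stand-in for a platform's admin interface.
    # Per-item refresh schedules, e.g. {"sales_extract": "daily 02:00"}.
    schedules: dict = field(default_factory=dict)
    # role -> list of failure messages delivered to it
    inbox: dict = field(default_factory=dict)

    def set_schedule(self, item: str, schedule: str) -> None:
        """Each repository item gets its own independent schedule."""
        self.schedules[item] = schedule

    def record_refresh(self, item: str, ok: bool,
                       notify_role: str = "bi_admin") -> None:
        """On failure, route an alert to the designated role."""
        if not ok:
            self.inbox.setdefault(notify_role, []).append(
                f"refresh failed for {item!r} "
                f"(schedule: {self.schedules.get(item)})"
            )


monitor = RefreshMonitor()
monitor.set_schedule("sales_extract", "daily 02:00")
monitor.set_schedule("hr_extract", "hourly")
monitor.record_refresh("sales_extract", ok=True)
monitor.record_refresh("hr_extract", ok=False)
print(monitor.inbox["bi_admin"])
```

When evaluating a real platform, the question is whether these per-item schedules and role-based failure alerts are first-class, configurable features rather than something an administrator must script externally.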

The second perspective in this section to consider is that of the information consumer who drives the specific usage requirements and parameters that the IT/BI professional is responsible for successfully delivering.

Evaluation criteria:

Information consumers should be able to:

  • Search the repository for existing content based on a keyword or topic.
  • Define alerts and notification preferences if a metric/KPI exceeds a threshold or is triggered by a specific condition.
  • Subscribe to relevant content and set update/notification preferences.
  • Access and view analytical content on any preferred form factor.

Evaluation considerations:

  • Can a user perform a search to find and view available content, already created by another user, that may assist in answering a business question?
  • Can a user easily determine whether analytical content and/or data sources have been certified and should be considered trusted?
  • Can a user access and view field-level metadata to understand the underlying details of a particular data element?
  • Can a user define data-driven or static thresholds to indicate when a notification should be triggered?
  • Can a user specify how and where relevant alerts and notifications are delivered?
  • Can a user subscribe to specific content and set notification preferences following updates or other events impacting content subscriptions?
  • Can a user search for and access analytical content on any device (phone, tablet, laptop, etc.)?
  • Can a user access and download analytical content via a mobile device for offline viewing?
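The alert questions above distinguish static thresholds from data-driven ones. The sketch below illustrates one common data-driven definition (a band of k standard deviations around the recent mean); the function name and the default choice of k are assumptions for illustration only.

```python
from statistics import mean, stdev


def should_alert(history, current, static_threshold=None, k=2.0):
    """Return True when a KPI breaches either a static threshold or a
    data-driven one (mean + k standard deviations of recent history)."""
    if static_threshold is not None and current > static_threshold:
        return True
    if len(history) >= 2:  # stdev needs at least two observations
        dynamic = mean(history) + k * stdev(history)
        return current > dynamic
    return False


daily_sales = [100, 104, 98, 101, 97]
print(should_alert(daily_sales, 150))                         # True (data-driven breach)
print(should_alert(daily_sales, 102))                         # False (within normal band)
print(should_alert(daily_sales, 102, static_threshold=100))   # True (static breach)
```

In an evaluation, the point of comparison is whether consumers can define both kinds of threshold themselves, without IT, and choose how the resulting notification is delivered.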


Interact

The interact phase is an extension of the initial access-and-view phase of the analytics workflow. It enables information consumers to perform guided analysis of available content within boundaries set by the content publisher. The following considerations should be the focus of the evaluation for this section from the perspective of the information consumer:

Evaluation criteria:

Information consumers should be able to:

  • Change the scope of analysis through direct interaction with the visual interface.
  • Use controls provided by the content author to increase analytical depth.
  • Use search capabilities to interact with available content.
  • Interact with content on any preferred form factor.

Evaluation considerations:

  • Can a user control the scope of analysis interactively through native capabilities of the platform? The following questions should be evaluated to determine the extent to which this is addressed from directly within the flow of visual interaction:
    • Can a user drill up and drill down using predefined or custom-built hierarchies?
    • Can a user focus their analysis on a specific data point or set of data points identified through the visual interaction process?
    • Can a user exclude a specific data point or a set of data points identified through the visual interaction process?
    • Can the user interact with parameters to change the analytical view or perform what-if analysis/scenario modelling?
    • Can the user interact with visible filter controls to change the scope of analysis?
  • Can a user search on keywords to drive filters and change the scope of analysis?
  • Can a user interact with available analytical content through natural language queries?
  • Can a user perform the same level of interaction on any device of different form factors?
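The keep-only, exclude and filter questions above all describe the same underlying operation: narrowing the row set behind a view. A minimal Python sketch of that operation, with invented field names and data:

```python
def change_scope(rows, key, keep=None, exclude=None):
    """Mimic 'keep only' and 'exclude' actions on selected data points,
    the way a visible filter control narrows the rows behind a view."""
    if keep is not None:
        rows = [r for r in rows if r[key] in keep]
    if exclude is not None:
        rows = [r for r in rows if r[key] not in exclude]
    return rows


sales = [{"region": "EMEA", "amount": 5},
         {"region": "APAC", "amount": 3},
         {"region": "Americas", "amount": 8}]

print(change_scope(sales, "region", keep={"EMEA", "APAC"}))  # 2 rows kept
print(change_scope(sales, "region", exclude={"APAC"}))       # APAC removed
```

What distinguishes a modern platform is that consumers perform these actions directly on marks in the visualisation, in the flow of interaction, rather than through anything resembling the code above.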

Analyse and discover

This phase of the modern analytics workflow spans a broad spectrum of user needs, and it is imperative that the platform addresses these needs seamlessly. This phase is of particular significance in the workflow, as it differentiates data-visualisation tools used to build charts from rich visual-analysis tools that use visualisations as the primary metaphor for analysis. As users interact with dashboards and generate new questions, they will inevitably encounter barriers as they reach the limits of the guided experience offered by existing dashboards. When this occurs, users require a self-driven, autonomous framework for asking and answering the new questions that have emerged. Users of all skill levels must be able to “visualise as they analyse” and access the analytics capabilities of the platform while in the flow of analysis, without having to move to a different module or product in the suite.

The concepts of platform integration and ease of use are covered in greater detail in the “Core platform attributes to consider” section of this guide, but they are most critical to consider here. The transition from the “interact” phase to “analyse & discover” is often where the analytics workflow is disrupted, due to a lack of overall continuity across the platform components needed to ask the next level of questions.

The first scenario to consider is from the perspective of an information consumer who has generated new questions that cannot be addressed by any available dashboards. The following considerations should be the focus of the evaluation for this scenario:

Evaluation criteria:

Information consumers should be able to:

  • Access the trusted data source that serves as the source of a dashboard to autonomously launch a deeper contextual analysis.
  • Search the repository of trusted data sources to identify curated data sets that are available to augment the analysis.
  • Enhance the data model of trusted sources to customise for their specific needs.

Evaluation considerations:

  • Can a user, from within a production dashboard, launch a new analysis using the data set(s) that the dashboard sources from? This should allow for self-service exploration and analysis of all data elements contained within the data source without the need to access a separate product or module within the platform.
  • Can a user browse or search the repository of available production data sources available for analysis and launch a new analysis from a selected data source? The success criteria are the same as the previous step, with the only difference being that the analysis starts from a data source, not an existing dashboard.
  • Can a user, once connected to a trusted data source, modify and augment the existing data model within the flow of analysis and content creation? This should be done in the context of the analysis and not in a separate product or module within the platform. Each of the following questions should be addressed:
    • Can a user enrich the existing data model to create new dimensions and measures needed for analysis?
    • Can a user combine and group related data points into a new field in the data model to streamline analysis?
    • Can a user isolate specific data points of interest and save dynamically within the data model for further analysis?
    • Can a user modify the data model and create custom drill paths and hierarchies to align with their analytical needs?
    • Can a user interactively correct data issues that surface during the analysis process? This would include NULL value handling and renaming/replacing values globally for consistency.
  • Assess the breadth and depth of assistive analytics capabilities available within the product to augment the analytics workflow where appropriate using the following questions:
    • Is the user presented with recommended best-fit visualisations throughout the discovery process based on the chosen analysis path?
    • Are advanced analytical capabilities accessible to the user to enrich analysis without necessarily having to understand or access the underlying models or algorithms used within the product?
    • Can a user access the underlying statistical detail used for advanced analytics if needed, to share with more advanced users who may request it for further analysis and validation?
    • Is field-level metadata accessible throughout the analysis process and can it be updated and augmented by the user as appropriate?
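One of the in-flow modelling actions listed above is combining and grouping related data points into a new field. The sketch below derives such a group as a new dimension without modifying the source rows; the group mappings and field names are invented for illustration.

```python
# Hypothetical ad hoc group: map countries into a coarser region dimension.
REGION_GROUPS = {
    "UK": "EMEA", "France": "EMEA", "Germany": "EMEA",
    "US": "Americas", "Canada": "Americas",
}


def add_grouped_field(rows, source_field, new_field, groups, default="Other"):
    """Derive a new dimension on each row, leaving the source rows untouched
    (the trusted data model is extended virtually, not rewritten)."""
    return [dict(r, **{new_field: groups.get(r[source_field], default)})
            for r in rows]


orders = [{"country": "UK", "sales": 10}, {"country": "Japan", "sales": 7}]
enriched = add_grouped_field(orders, "country", "region_group", REGION_GROUPS)
print([r["region_group"] for r in enriched])  # ['EMEA', 'Other']
```

The evaluation question is whether this kind of grouping happens interactively, inside the analysis, rather than requiring a separate modelling tool or a change to the underlying source.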

The second scenario to consider is from the perspective of a content creator who has new questions that cannot be addressed by any available dashboards or any trusted data sources within the environment. The following considerations should be the focus of the evaluation for this scenario:

Evaluation criteria:

Content creators should be able to:

  • Ingest and model data that is not yet in a trusted state to explore and discover new insights.
  • Combine trusted and untrusted data to create new data sources for analysis.
  • Use existing and newly created data sources to build new analytical content to share and promote.
  • Modify existing analytical content based on new findings resulting from discovery and exploration.
  • Create a guided analytics experience to facilitate broader use by information consumers.

Evaluation considerations:

  • Can a user connect to data sources that are not currently being governed centrally?
  • Does the platform offer broad connectivity options to include structured and unstructured data sources for ingestion and analysis?
  • Can a content creator perform all of the analysis and discovery tasks included in the information consumer section against new, untrusted data sources?
  • Can a user virtually extend a trusted data source without modifying the underlying data structure or load process?
  • Can a user build new analytical content using a new data source or one that is a hybrid of trusted and untrusted data?
  • Can a user create alternative versions of governed content for sharing and track the lineage of changes over time?
  • Can a user redirect the underlying data connection of governed analytical content to use a newly created or enhanced source with no downstream impact?
  • Can a user build programmatic controls into analytical content to facilitate interaction and provide a guided experience to a broad audience of information consumers?
  • Can a user create and save style sheets or design themes to apply in the creation of other content?


Share

The approach to sharing content has evolved. Within traditional BI platforms, sharing meant delivering static printed or exported reports to an inbox or a user’s desk. Within the modern analytics approach, sharing also includes collaboration and the social interactions we have grown accustomed to in all of our business tools. This transition is driven by the simple fact that information is outdated as soon as a report is printed or exported, which does not align with the needs of today’s consumers, who expect the latest information. Some aspects of content sharing involve making information broadly available to users, while others entail collaboration as a core component of the analysis process. Both scenarios are included in the evaluation criteria for this section.

The push model of making information accessible to a broad range of users will be addressed first. This is more reminiscent of the traditional approach; however, modern platforms should also enable organisations to make information broadly accessible to a wide range of internal and external users. Many of these tasks fall within the domain of the IT/BI professional, and the following criteria should be evaluated from that perspective.

Evaluation criteria:

IT/BI professionals should be able to:

  • Deliver content on any form factor used throughout the organisation.
  • Embed analytical content for broader access and contextual use.
  • Provide for external access and consumption.

Evaluation considerations:

  • Can analytical content be rendered in any form factor that may be used across the organisation to access the environment? This would include tablets, phones, laptops, large displays, etc.
  • Can analytical content be embedded in an organisation’s web portals and applications that users access as part of their normal business processes?
  • Can analytical content be shared to external consumers who are outside the corporate firewall?
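Embedding analytical content in a portal typically amounts to generating an iframe (or JavaScript component) whose URL carries the content identifier and some proof of identity. The sketch below builds such a snippet; the URL pattern and token scheme are purely hypothetical and do not represent any vendor’s actual embedding API.

```python
import hashlib
import hmac


def embed_snippet(server, dashboard_id, user, secret):
    """Build a hypothetical signed-iframe embed snippet. The path layout,
    query parameters and token derivation are invented for illustration."""
    token = hmac.new(secret.encode(),
                     f"{dashboard_id}:{user}".encode(),
                     hashlib.sha256).hexdigest()[:16]
    src = f"https://{server}/embed/{dashboard_id}?user={user}&token={token}"
    return f'<iframe src="{src}" width="100%" height="600"></iframe>'


html = embed_snippet("bi.example.com", "sales-overview", "jdoe", "s3cret")
print(html)
```

When evaluating a platform’s embedding support, check that identity and row-level security carry through to embedded views, and that tokens or sessions can be issued for external users outside the corporate firewall.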

The second scenario is that of true collaboration, where both trusted and untrusted content is discussed, reviewed and validated at the peer-to-peer, workgroup or enterprise level. This collaboration should be an integral step in the process of deriving new insights and an input into the governance process. The primary participant in this scenario is the content creator, which should be the perspective of evaluation for the following criteria.

Evaluation criteria:

Content creators should be able to:

  • Collaborate with others on the development and validation of analytical content.
  • Annotate and discuss findings in a social media-style conversation.
  • Follow specific content types or content authors.
  • Provide quality ratings for specific analytical content.
  • Create storyboards to share findings and insights.
  • Add descriptive narratives to augment and enhance visual content.

Evaluation considerations:

  • Can users across the organisation collaborate on shared content in real time to discuss and elaborate on findings?
  • Can users annotate and provide comments directly within the content using any form factor?
  • Can users follow a conversation via a timeline to track lineage of the conversation and view a snapshot of the content being discussed as it looked when a comment was added?
  • Can users follow specific users within an organisation and receive updates and notifications of their activity?
  • Can users follow and track specific topics or types of content, and receive updates and notifications when new content meeting the criteria is published?
  • Can users rate content, either through a rating system or by using social media-style “likes”?
  • Can users create stories to represent a logical sequence of findings for the purpose of walking another user through the analytics journey?
  • Can users integrate descriptive narratives to augment the visual content within an analysis, either manually or automatically through platform capabilities?

Promote and govern

There are various approaches to governance. Every organisation will find itself at a different point on the spectrum, ranging from an IT-led, highly governed and controlled environment to one with little to no controls, with many organisations landing somewhere in between. Oftentimes, even within a single organisation, governance requirements will vary depending on users’ needs in a given area as well as on the data itself.

When choosing a modern analytics platform, flexibility is important in order to meet those varying needs of the business and to ensure that you can adjust your governance model as you scale. An organisation may choose to ease the transition from traditional to modern by initially using the modern platform in a traditional manner, then gradually expanding the range of capabilities accessible to users through self-service. It is equally important to evaluate a platform’s distinct capabilities in the separate but related areas of data governance and analytics governance (as depicted below), to ensure that the platform affords enough flexibility to put the most appropriate governance model in place and adjust it over time as needed.

For most modern analytics use cases, a self-service-driven organic approach to governance will lead to greater adoption, deeper insights and improved business outcomes. As such, this is the approach that should be considered primary for the purpose of this evaluation.

In this approach, a subset of content creators, referred to as information stewards in this guide, are primarily responsible for defining and navigating the overall governance process. The sections below will consider aspects of both data governance as well as analytics governance from the perspective of the content creators as well as the IT/BI professional.

Data governance

The task of defining and ensuring compliance with an organisation’s governance framework is a core responsibility of the content creator and as such, the following data governance-related items should be considered from that perspective:

Evaluation criteria:

Content creators should be able to:

  • Define, manage and update data models used for analysis (data source management).
  • Autonomously define, update and expose field-level metadata to users (metadata management).
  • Centrally capture and expose data-cleansing and enrichment rules applied to published data models (data enrichment and data quality).
  • Monitor and track usage metrics for centrally-defined data models (monitoring & management).

Evaluation considerations:

  • Can an information steward publish a data model to the system-of-record environment for broader use across the organisation?
  • Can a published data model be augmented with validated user-defined fields through a promotion process?
  • Can an information steward visibly mark trusted data models with a watermark?
  • Can a published data model be virtually extended with additional sources/data elements without impacting downstream content and/or users?
  • Can an impact assessment be conducted prior to any data model changes?
  • Can a content creator add and update descriptive metadata to dimensions and measures within a published data model?
  • Can business rules and data transformations used to create and populate published data models be exposed to end users?
  • Can data model changes be tracked and audited, and reverted if needed?
  • Can an information steward access utilisation statistics and access platform capabilities to identify data-model attribute redundancy, inconsistency, non-use, etc.?

Administration and enablement of the entire governance process is largely the responsibility of the IT/BI professional, and as such, the following data governance-related items should be considered from that perspective:

Evaluation criteria:

IT/BI professionals should be able to:

  • Define security parameters and access controls to published data models (data security).
  • Monitor and audit usage to ensure compliance and appropriate use of data assets (monitoring & management).
  • Create new data models as needed to enforce consistency across departments and information stewards (data source management).
  • Comply with the organisation’s overarching data strategy (data source management).

Evaluation considerations:

  • Can security be inherited from source systems where applicable?
  • Can an administrator allow/deny access at the user/group level for each data source?
  • Can access rights be defined at the row level to allow a user to access a subset of data for each data source?
  • Can an administrator define specific roles and privileges for each user in the system to control who can create, edit and promote shared data sources?
  • Can system-wide usage be tracked and analysed by an administrator?
  • Can an administrator access a system-wide view of the environment to identify redundancies and inconsistencies across data models being managed by individual information stewards?
  • Can an administrator create a new data source and seamlessly switch downstream users and analytical content to reference it in place of an existing source?
  • Can an administrator decide the most appropriate storage strategy for data required by the analytics platform based on an organisation’s reference architecture?
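Row-level access (the third question above) is commonly expressed as per-user entitlement filters applied to every query. A deny-by-default sketch, with invented users, fields and data:

```python
# Hypothetical entitlements: each user maps to the column predicates
# that every query they run is filtered by.
ENTITLEMENTS = {
    "asmith": {"region": {"EMEA"}},
    "bjones": {"region": {"EMEA", "Americas"}},
}


def apply_row_level_security(user, rows):
    """Return only the rows the user's entitlement filters allow.
    Unknown users get nothing (deny by default)."""
    rules = ENTITLEMENTS.get(user)
    if rules is None:
        return []
    return [r for r in rows
            if all(r.get(col) in allowed for col, allowed in rules.items())]


data = [{"region": "EMEA", "sales": 5},
        {"region": "Americas", "sales": 8},
        {"region": "APAC", "sales": 3}]
print(len(apply_row_level_security("asmith", data)))  # 1
print(len(apply_row_level_security("bjones", data)))  # 2
```

In the evaluation, confirm that such filters are defined once at the data-source level and enforced automatically for all downstream content, rather than re-implemented per dashboard.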

Analytics governance

The responsibility of defining and ensuring compliance with an organisation’s governance framework is a core responsibility of the content creator, and as such, the following analytics governance-related items should be considered from that perspective:

Evaluation criteria:

Content creators should be able to:

  • Access platform capabilities to assist with validation and accuracy verification of user-generated analytical content (content validation).
  • Promote validated analytical content to a centralised, trusted environment as determined by a governance process (content promotion).
  • Certify content as trusted and delineate from untrusted content in the same environment (content certification).
  • Monitor and audit usage of published content and track usage of untrusted content (content usage monitoring).

Evaluation considerations:

  • Can an information steward access and reference benchmark data stored in the platform to validate the accuracy of content being evaluated for promotion?
  • Can user-developed content be promoted to a shared environment for broader consumption?
  • During the promotion process, can underlying data sources be redirected to reference trusted data models already published?
  • Can watermarks be applied to published analytical content to indicate that it has been certified and can be trusted?
  • Can an information steward access and analyse usage metrics of published content, both trusted and untrusted, to ensure appropriate use?

Administration and enablement of the entire governance process is largely the responsibility of the IT/BI professional, and as such, the following analytics governance-related items should be considered from that perspective:

Evaluation criteria:

IT/BI professionals should be able to:

  • Create and maintain an environment for storing and organising published content (content management).
  • Secure analytical content and grant users appropriate levels of access based on content type, sensitivity, business need, etc. (security, permissions & access controls).
  • Monitor broad usage patterns across organisational business units (content usage monitoring).

Evaluation considerations:

  • Can the environment be customised to suit the needs and preferences of the organisation related to content organisation and overall management?
  • Can the IT/BI professional enable access to the platform’s content through the organisation’s portals to leverage existing content management investments?
  • Can security be applied at a granular level to allow/deny users access to specific analytical content?
  • Can security defined at the data-model level be enforced for all downstream analytical content automatically?
  • Can usage patterns and consumption preferences be tracked and analysed to provide an administrator with an overall assessment of the environment and how it is being used?

The shift from traditional BI platforms to modern analytics platforms is one that is necessary to truly realise the impact data can have on an organisation. Modern analytics platforms bring together self-service and governance to empower the entire organisation with trusted data to gain insights into the business. These platforms should be evaluated through a different lens, as they break the mould of traditional, IT-run BI platforms.

Tableau, a proven leader in the modern analytics space, enables organisations to explore trusted data in a secure and scalable environment. It gives people access to intuitive visual analytics, interactive dashboards and limitless ad hoc analyses that reveal hidden opportunities and eureka moments alike. It also provides the security, governance and management you require to confidently integrate Tableau into your business, on-premises or in the cloud, and deliver the power of true self-service analytics at scale.

About the author:

Josh Parenteau

Market Intelligence Director, Tableau