As enterprises embracing artificial intelligence move from initial pilots and trial projects through deployment and into production at scale, many are realizing the critical importance of agile and responsive data processes. These processes are often supported by tools and platforms that facilitate data management and improve trust in the data used for AI and analytics.
This has led to increased attention on the role of data operations, which ISG Research defines as the application of agile development, DevOps and lean manufacturing by data engineering professionals in support of data production. It encompasses the development, testing, deployment and orchestration of data integration and processing pipelines, along with improved data quality and validity via data monitoring and observability.
DataOps has been part of the lexicon of the data market for almost a decade. It takes inspiration from DevOps, which describes a set of tools, practices and a philosophy used to support the continuous delivery of software applications in the face of constant change.
Interest in DataOps is growing. ISG Research asserts that through 2026, more than one-half of enterprises will adopt agile and collaborative DataOps practices to facilitate responsiveness, avoid repetitive tasks and deliver measurable data reliability improvements. A variety of products, practices and processes enable DataOps, including products that support agile and continuous delivery of analytics and AI as well as continuous, measurable improvement.
An emphasis on agility, collaboration and automation separates DataOps from traditional approaches to data management, which typically relied on batch-based, manual and rigid tools and practices. However, this distinction between DataOps and traditional data management tools is clearer in theory than in practice. Providers of traditional data management have, in recent years, incorporated capabilities that make products more automated, collaborative and agile, and there is no industry-wide consensus on the level of agility, collaboration and automation a product must provide to be considered part of the DataOps category.
This has led to some traditional data management providers adopting a broader definition of DataOps that describes the combination of people, processes and technology needed to automate the delivery of data to users in an enterprise and enable collaboration to facilitate data-driven decisions. This definition is broad enough that it could be interpreted to encompass all products and services that address data management and data governance, including many traditional batch-based, manual products that do not support agile and continuous delivery and continuous, measurable improvement.
A narrower definition of DataOps focuses on the practical application of agile development, DevOps and lean manufacturing to the tasks and skills employed by data engineering professionals in support of data analytics development and operations. This definition emphasizes specific capabilities, such as continuous delivery of analytic insight, process simplification, code generation, automation to avoid repeated errors and reduce repetitive tasks, the incorporation of stakeholder feedback, and measurable improvement in the efficient generation of insight from data. As such, the narrow definition of DataOps provides a set of criteria for agile and collaborative practices against which products and services can be measured.
ISG Research’s perspective, based on our interaction with the software provider and user communities, aligns with the narrow definition. While traditional data management and data governance are complementary, our DataOps coverage focuses specifically on the delivery of agile business intelligence and AI through the automation and orchestration of data processing pipelines, incorporating improved data reliability and integrity via data monitoring and observability.
To be more specific, we believe that DataOps products and services provide the following functionality: agile and collaborative data operations; the development, testing and deployment of data and analytics pipelines; data orchestration; data observability; and the delivery of data products. These are the key criteria we used to assess DataOps products and services as part of this Buyer’s Guide.
This research comprises parallel evaluations of products addressing each of the four core areas of functionality: data pipelines, data orchestration, data observability and data products. Additionally, we evaluated all products in all categories in relation to their support for agile and collaborative practices.
The development, testing and deployment of data pipelines is a fundamental accelerator of data-driven strategies, enabling enterprises to extract data generated by operational applications used to run the business and transport it into the analytic data platforms used to analyze operations. ISG Research defines data pipelines as the systems used to transport, process and deliver data produced by operational data platforms and applications into analytic data platforms and applications for consumption. Healthy data pipelines are necessary to ensure data is ingested, processed and loaded in the sequence required to generate BI and AI.
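To illustrate the concept, the following minimal Python sketch shows the three stages a simple batch pipeline performs: extracting records from an operational source, applying a transformation, and loading the result into an analytic store. The data source, field names and target are hypothetical placeholders rather than references to any product evaluated in this Buyers Guide.

```python
# Minimal batch pipeline sketch: extract -> transform -> load.
# The source, field names and target are illustrative placeholders only.
from dataclasses import dataclass
from typing import Iterable

@dataclass
class Order:
    order_id: str
    amount: float
    currency: str

def extract(operational_rows: Iterable[dict]) -> list[Order]:
    """Pull raw records from an operational application's export."""
    return [Order(r["order_id"], float(r["amount"]), r["currency"])
            for r in operational_rows]

def transform(orders: list[Order]) -> list[dict]:
    """Shape records for consumption by the analytic platform."""
    return [{"order_id": o.order_id, "amount_usd": o.amount, "currency": "USD"}
            for o in orders if o.currency == "USD"]

def load(rows: list[dict], target: list[dict]) -> None:
    """Append processed rows to the analytic store (a list stands in here)."""
    target.extend(rows)

if __name__ == "__main__":
    warehouse: list[dict] = []
    raw = [{"order_id": "1001", "amount": "25.00", "currency": "USD"}]
    load(transform(extract(raw)), warehouse)
    print(warehouse)
```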
Given the increasing complexity of evolving data sources and requirements, it is essential to automate and coordinate the creation, scheduling and monitoring of data pipelines as part of a DataOps approach to data management. This is the realm of data orchestration, which ISG Research defines as providing the capabilities to automate and accelerate the flow of data to support operational and analytics initiatives and drive business value via the monitoring and management of data pipelines and associated workflows. ISG Research asserts that by 2027, more than one-half of enterprises will adopt data orchestration technologies to automate and coordinate data workflows and increase efficiency and agility in data and analytics projects.
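To make the idea concrete, here is a deliberately simplified Python sketch of what an orchestrator does at its core: it resolves task dependencies into an execution order, runs each task and retries on failure. Production orchestration platforms layer scheduling, monitoring and alerting on top of this; the task names and retry counts below are illustrative assumptions, not the behavior of any evaluated product.

```python
# Simplified orchestration sketch: dependency-ordered execution with retries.
# Real orchestration tools add scheduling, monitoring and alerting;
# the tasks and retry policy here are illustrative only.
from graphlib import TopologicalSorter
from typing import Callable

def run_with_retries(name: str, task: Callable[[], None], retries: int = 2) -> None:
    for attempt in range(1, retries + 2):
        try:
            task()
            print(f"{name}: succeeded on attempt {attempt}")
            return
        except Exception as exc:
            print(f"{name}: attempt {attempt} failed ({exc})")
    raise RuntimeError(f"{name}: exhausted retries")

def orchestrate(tasks: dict[str, Callable[[], None]], deps: dict[str, set[str]]) -> None:
    """Execute tasks in an order that respects their dependencies."""
    for name in TopologicalSorter(deps).static_order():
        run_with_retries(name, tasks[name])

if __name__ == "__main__":
    tasks = {
        "extract": lambda: print("extracting"),
        "transform": lambda: print("transforming"),
        "load": lambda: print("loading"),
    }
    deps = {"transform": {"extract"}, "load": {"transform"}}
    orchestrate(tasks, deps)
```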
Maintaining data quality and trust is a perennial data management challenge, often preventing enterprises from operating at the speed of business. In addition to automating and coordinating the creation, scheduling and monitoring of data pipelines via data orchestration, it is also critical to monitor the quality and reliability of the data flowing through those pipelines. This is achieved using data observability, which ISG Research defines as providing the capabilities for monitoring the quality and reliability of data used for analytics and governance projects, as well as the reliability and health of the overall data environment. The metrics generated by data observability also form a critical component of the development and sharing of data products, providing the information by which data consumers can gauge whether a data product meets their requirements across attributes including validity, uniqueness, timeliness, consistency, completeness and accuracy.
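A hedged sketch of how a few of these observability metrics might be computed over a batch of records follows. The field names and freshness window are hypothetical, and production data observability tools track many more signals (freshness, schema drift, volume anomalies) continuously rather than per batch.

```python
# Sketch of basic data quality metrics of the kind data observability
# tools report: completeness, uniqueness and timeliness.
# Field names and the freshness threshold are illustrative assumptions.
from datetime import datetime, timedelta, timezone

def completeness(rows: list[dict], field: str) -> float:
    """Share of rows in which the field is present and non-null."""
    return sum(1 for r in rows if r.get(field) is not None) / len(rows)

def uniqueness(rows: list[dict], field: str) -> float:
    """Share of distinct values for a field that should be a key."""
    values = [r.get(field) for r in rows if r.get(field) is not None]
    return len(set(values)) / len(values)

def timeliness(rows: list[dict], field: str, max_age: timedelta) -> float:
    """Share of rows updated within the allowed freshness window."""
    now = datetime.now(timezone.utc)
    return sum(1 for r in rows if now - r[field] <= max_age) / len(rows)

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    batch = [
        {"order_id": "1001", "updated_at": now - timedelta(hours=1)},
        {"order_id": "1001", "updated_at": now - timedelta(days=3)},
        {"order_id": None, "updated_at": now},
    ]
    print("completeness:", completeness(batch, "order_id"))
    print("uniqueness:", uniqueness(batch, "order_id"))
    print("timeliness:", timeliness(batch, "updated_at", timedelta(days=1)))
```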
ISG Research defines data products as the outcome of data initiatives developed with product thinking and delivered as reusable assets that can be discovered and consumed by others on a self-service basis, along with associated data contracts and feedback options. Key capabilities for platforms that enable the development of data products include a dedicated interface for the development and classification of data products and data contracts, as well as a dedicated interface for their self-service discovery and consumption. Data product platforms should also enable consumers of data products to provide feedback, comments and ratings and to request improvements or new products, and enable data owners to monitor data product usage and performance metrics and to view and manage requests for data product modifications and new data products.
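As a simplified illustration of what a data contract accompanying such a data product might capture, consider the sketch below. The attribute names, thresholds and methods are hypothetical and do not follow any standard contract schema or evaluated product.

```python
# Sketch of a data contract attached to a data product, carrying the
# quality expectations consumers can check against observability metrics.
# Fields, thresholds and methods are illustrative, not a standard schema.
from dataclasses import dataclass, field

@dataclass
class DataContract:
    product_name: str
    owner: str
    schema: dict[str, str]                 # column name -> type
    freshness_sla_hours: int               # maximum acceptable data age
    quality_thresholds: dict[str, float]   # metric -> minimum acceptable value

@dataclass
class DataProduct:
    contract: DataContract
    feedback: list[str] = field(default_factory=list)

    def record_feedback(self, comment: str) -> None:
        """Consumers can leave comments or improvement requests."""
        self.feedback.append(comment)

    def meets_contract(self, observed: dict[str, float]) -> bool:
        """Compare observed quality metrics against contracted thresholds."""
        return all(observed.get(metric, 0.0) >= threshold
                   for metric, threshold in self.contract.quality_thresholds.items())

if __name__ == "__main__":
    contract = DataContract(
        product_name="orders_daily",
        owner="data-engineering",
        schema={"order_id": "string", "amount_usd": "decimal"},
        freshness_sla_hours=24,
        quality_thresholds={"completeness": 0.99, "uniqueness": 1.0},
    )
    product = DataProduct(contract)
    print(product.meets_contract({"completeness": 0.995, "uniqueness": 1.0}))
    product.record_feedback("Please add a currency column.")
```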
As always, however, software products are only one aspect of delivering on the promise of DataOps. New approaches to people, processes and information are also required to deliver agile and collaborative development, testing and deployment of data and analytics workloads, as well as data operations. To improve the value generated from analytics and data initiatives, enterprises need to adopt processes and methodologies that support rapid innovation and experimentation, automation, collaboration, measurement and monitoring, and high data quality.
The ISG Buyers Guide™ for DataOps evaluates software providers and products to address data pipeline development, testing and deployment, data pipeline orchestration, data pipeline observability, and data products. Providers with products that address at least two elements—data pipelines, data orchestration or data observability—were deemed to provide a superset of functionality to address DataOps overall. This approach is designed to maintain consistency with the 2023 DataOps Buyers Guide and reflects the relative immaturity of data product platforms.
This research evaluates the following software providers that offer products to address key elements of DataOps as we define it: Alteryx, AWS, Astronomer, BMC, Dagster Labs, Databricks, DataKitchen, DataOps.live, dbt Labs, Google, Hitachi, IBM, Informatica, Infoworks, K2View, Keboola, Matillion, Microsoft, Nexla, Prefect, Qlik, Rivery, SAP, Y42 and Zoho.
For over two decades, ISG Research has conducted market research in a spectrum of areas across business applications, tools and technologies. We have designed the Buyers Guide to provide a balanced perspective of software providers and products that is rooted in an understanding of the business requirements in any enterprise. Utilization of our research methodology and decades of experience enables our Buyers Guide to be an effective method to assess and select software providers and products. The findings of this research contribute to our comprehensive approach to rating software providers, which is modeled on the assessments an enterprise would itself complete.
The ISG Buyers Guide™ for DataOps is the distillation of over a year of market and product research efforts. It is an assessment of how well software providers’ offerings address enterprises’ requirements for DataOps software. The index is structured to support a request for information (RFI) that could be used in the request for proposal (RFP) process by incorporating all criteria needed to evaluate, select, utilize and maintain relationships with software providers. An effective product and customer experience with a provider can ensure the best long-term relationship and value achieved from a resource and financial investment.
In this Buyers Guide, ISG Research evaluates the software in seven key categories that are weighted to reflect buyers’ needs based on our expertise and research. Five relate to the product experience: Adaptability, Capability, Manageability, Reliability and Usability. In addition, we consider two customer experience categories: Validation and Total Cost of Ownership/Return on Investment (TCO/ROI). To assess functionality, one of the components of Capability, we applied the ISG Research Value Index methodology and blueprint, which links the personas and processes for DataOps to an enterprise’s requirements.
The structure of the research reflects our understanding that the effective evaluation of software providers and products involves far more than just examining product features, potential revenue or customers generated from a provider’s marketing and sales efforts. We believe it is important to take a comprehensive, research-based approach, since making the wrong choice of DataOps technology can raise the total cost of ownership, lower the return on investment and hamper an enterprise’s ability to reach its full performance potential. In addition, this approach can reduce the project’s development and deployment time and eliminate the risk of relying on a short list of software providers that does not represent a best fit for your enterprise.
ISG Research believes that an objective review of software providers and products is a critical business strategy for the adoption and implementation of DataOps software and applications. An enterprise’s review should include a thorough analysis of both what is possible and what is relevant. We urge enterprises to do a thorough job of evaluating DataOps systems and tools and offer this Buyers Guide as both the results of our in-depth analysis of these providers and as an evaluation methodology.
We recommend using the Buyers Guide to assess and evaluate new or existing software providers for your enterprise. The market research can be used as an evaluation framework to establish a formal request for information from providers on products and customer experience and will shorten the cycle time when creating an RFI. The steps listed below provide a process that can facilitate the best possible outcomes.
All of the products we evaluated are feature-rich, but not all the capabilities offered by a software provider are equally valuable to all types of workers or support everything needed to manage products on a continuous basis. Moreover, the existence of too many capabilities may be a negative factor for an enterprise if it introduces unnecessary complexity. Nonetheless, you may decide that a larger number of features in the product is a plus, especially if some of them match your enterprise’s established practices or support an initiative that is driving the purchase of new software.
Factors beyond features and functions or software provider assessments may become deciding factors. For example, an enterprise may face budget constraints such that the TCO evaluation can tip the balance to one provider or another. This is where the Value Index methodology and the appropriate category weighting can be applied to determine the best fit of software providers and products to your specific needs.
The research finds Informatica atop the list, followed by Microsoft and IBM. Providers that place in the top three of a category earn the designation of Leader. Informatica has done so in five categories; Databricks and Microsoft in three; Google and SAP in two; and Alteryx, AWS, DataOps.live, IBM, Keboola and Qlik in one category.
The overall representation of the research below places the rating of the Product Experience and Customer Experience on the x and y axes, respectively, to provide a visual representation and classification of the software providers. Providers whose Product Experience has a higher aggregate weighted performance across the five product categories place farther to the right, while the performance and weighting of the two Customer Experience categories determine placement on the vertical axis. In short, software providers that place closer to the upper right of this chart performed better than those closer to the lower left.
The research places software providers into one of four overall categories: Assurance, Exemplary, Merit or Innovative. This representation classifies providers’ overall weighted performance.
Exemplary: The categorization and placement of software providers in Exemplary (upper right) represent those that performed the best in meeting the overall Product and Customer Experience requirements. The providers rated Exemplary are: Alteryx, AWS, Databricks, Google, IBM, Informatica, Matillion, Microsoft, Qlik and SAP.
Innovative: The categorization and placement of software providers in Innovative (lower right) represent those that performed the best in meeting the overall Product Experience requirements but did not achieve the highest levels of requirements in Customer Experience. The providers rated Innovative are: DataOps.live, K2View and Keboola.
Assurance: The categorization and placement of software providers in Assurance (upper left) represent those that achieved the highest levels in the overall Customer Experience requirements but did not achieve the highest levels of Product Experience. The providers rated Assurance are: BMC, Hitachi and Rivery.
Merit: The categorization of software providers in Merit (lower left) represents those that did not exceed the median of performance in Customer or Product Experience or surpass the threshold for the other three categories. The providers rated Merit are: Astronomer, Dagster Labs, DataKitchen, dbt Labs, Infoworks, Nexla, Prefect, Y42 and Zoho.
We warn that close placement of providers should not be taken to imply that the packages evaluated are functionally identical or equally well suited for use by every enterprise or for a specific process. Although there is a high degree of commonality in how enterprises handle DataOps, there are many idiosyncrasies and differences in how they perform these functions that can make one software provider’s offering a better fit than another’s for a particular enterprise’s needs.
We advise enterprises to assess and evaluate software providers based on organizational requirements and use this research as a supplement to internal evaluation of a provider and products.
The process of researching products to address an enterprise’s needs should be comprehensive. Our Value Index methodology examines Product Experience and how it aligns with an enterprise’s life cycle of onboarding, configuration, operations, usage and maintenance. Too often, software providers are evaluated not on the entirety of the product but on market execution and vision of the future, an approach that is flawed because it reflects how the provider operates rather than an enterprise’s requirements. As more software providers orient to a complete product experience, evaluations will become more robust.
The research results in Product Experience are ranked at 80%, or four-fifths, of the overall rating using the specific underlying weighted category performance. Importance was placed on the categories as follows: Usability (15%), Capability (20%), Reliability (15%), Adaptability (15%) and Manageability (15%). This weighting impacted the resulting overall ratings in this research. Informatica, Microsoft and IBM were designated Product Experience Leaders.
This category assesses the degree to which products and technology can be adapted to an enterprise’s specifications via configurability and customization while still maintaining the integrity of integration across workers, devices, business processes, applications and data. Adaptability is also related to the ability to readily integrate with other internal and external systems—for example, integrating data and information securely across processes and systems—and to support bidirectional data flows for synchronization and migration. It also examines the investment by the software provider in resources and improvements.
The research weights Adaptability at 15% of the overall rating. Keboola, Informatica and Databricks are the Leaders in this category. While not a Leader, BMC was also found to meet a broad range of enterprise adaptability requirements.
Adaptability is an essential evaluation metric as it determines the flexibility and interconnectivity of the software provider’s product related to enterprise requirements. It also enables enterprise software to operate across the variety of platforms and cloud computing environments that exist today and in the future.
Software providers that evaluated well in the Adaptability category understand the criticality of preparing and using information to optimize business execution. These providers meet the specific customization and integration support requirements in these areas, enabling enterprises to process data across business processes, workflows and applications as they operate.
Manageability is evaluated by how well products can be managed both technologically and by the business, and how well they can be governed, secured, licensed and supported under a service level agreement (SLA). Also important are the flexibility of the privacy and security provisions built into the technology with respect to user identity, role and access, how effective that security is, the extent to which it supports auditing and compliance, and what licensing or subscription options are available from the software provider. It also examines the investment by the provider in resources and improvements.
The research weights Manageability at 15% of the overall rating. Informatica, AWS and Databricks are the Leaders in this category.
Manageability is an essential evaluation metric to indicate whether the software provider’s product can be administrated and supported throughout its life cycle in the enterprise. It also ensures the overall efficiency, compliance and security of the enterprise software.
A software provider’s performance in the evaluation criteria is especially critical when examining business and technology administration. Providers that did not perform well had challenges with the effectiveness of systems and processes for the management and escalation of breaches. The significance of information security cannot be overstated, as the insights and knowledge of an enterprise reside in its data. Simplifying manageability is increasingly critical and should be a priority in all software provider evaluations.
For DataOps processes to operate efficiently and for workers to engage the applications, the software on which they run must reliably deliver the necessary performance and scalability using the existing architecture operating across the enterprise and cloud computing environments. The criteria include depth in the performance and scalability of a software provider’s products and architecture, including the metrics to ensure operations and configurability across data, users, instances, activities and tasks. It also examines the investment by the provider in resources and improvements.
The research weights Reliability at 15% of the overall rating. Google, Informatica and SAP are the Leaders in this category, providing the highest level of confidence for reliable operation 24 hours a day.
Reliability is an essential evaluation metric as it indicates the product’s ability to perform and scale to the defined enterprise requirements and how well it supports the continuous processing required for business continuity and operational resilience today and into the future.
Evaluating the performance and scalability readiness of software is not always easy as it depends on the type of product information and the volume at which the data is being updated and used by processes and systems. Software providers that did not perform well in this category were not able to provide this level of information at any depth, even though it is necessary to establish the confidence required for provider selection.
A strong customer relationship with a software provider is essential to the actual success of the products and technology. The advancement of the Customer Experience across the entire life cycle an enterprise has with its software provider is critical for ensuring satisfaction in working with that provider. Technology providers that have chief customer officers are more likely to invest in the customer relationship and focus more on customers’ success. These leaders also need to take responsibility for ensuring this commitment is made abundantly clear on the website and in the buying process and customer journey.
The research results in Customer Experience are ranked at 20%, or one-fifth, of the overall rating using the specific underlying weighted category performance as it relates to the framework of commitment and value in the software provider-customer relationship. The two evaluation categories are Validation (10%) and TCO/ROI (10%), which are weighted to represent their importance to the overall research.
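For readers who want to see how these weightings combine, the short Python sketch below rolls hypothetical category scores up into an overall rating using the weights stated in this research (Capability 20%; Usability, Reliability, Adaptability and Manageability 15% each; Validation and TCO/ROI 10% each). The category scores in the example are invented purely for illustration and do not represent any provider’s actual results.

```python
# Illustration of how weighted category scores roll up into an overall
# rating. The weights are those stated in this Buyers Guide; the
# category scores are invented purely for the example.
WEIGHTS = {
    "Capability": 0.20,
    "Usability": 0.15,
    "Reliability": 0.15,
    "Adaptability": 0.15,
    "Manageability": 0.15,
    "Validation": 0.10,
    "TCO/ROI": 0.10,
}

def overall_rating(scores: dict[str, float]) -> float:
    """Weighted sum of category scores (each expressed as a percentage)."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
    return sum(WEIGHTS[category] * scores[category] for category in WEIGHTS)

if __name__ == "__main__":
    example_scores = {
        "Capability": 70.0, "Usability": 80.0, "Reliability": 90.0,
        "Adaptability": 75.0, "Manageability": 78.0,
        "Validation": 85.0, "TCO/ROI": 82.0,
    }
    print(f"Overall rating: {overall_rating(example_scores):.1f}%")
```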
The software providers that evaluated the highest overall in the aggregated and weighted Customer Experience categories are Databricks, Microsoft and SAP. These category leaders best communicate commitment and dedication to customer needs. While not Leaders, Informatica and BMC were also found to meet a broad range of enterprise customer experience requirements.
Software providers that did not perform well in this category were unable to provide sufficient customer case studies to demonstrate success or to articulate their commitment to customer experience and an enterprise’s journey. The selection of a software provider means a continuous investment by the enterprise, so a holistic evaluation must include an examination of how a provider supports its customers’ experience.
Company and Product Profile
Informatica Intelligent Data Management Cloud, version October 2024, released October 2024
“Informatica brings your data and AI to life by empowering your business to realize the transformative power of your most critical assets. AI requires data velocity, volume, variety and veracity for companies to succeed with correct, unbiased insights. Informatica’s Intelligent Data Management Cloud unlocks the power of your AI with holistic, trusted and governed data. Powered by AI, our Intelligent Data Management Cloud (IDMC) delivers what you need to achieve better business outcomes — all in one place.” – Informatica
Summary
Our analysis classified Informatica as Exemplary, receiving an overall grade of B++ with a 77.3% performance. Informatica's best grouped results came in Customer Experience with an 85.4% performance and an A- grade, due in part to its A- in Validation. In Product Experience, Informatica received a B++ grade with a 75.0% performance due to its 93.0% performance in Reliability. Informatica was designated an overall Leader as well as a Leader in Product Experience, Adaptability, Manageability, Reliability, Usability and Validation.
Challenges
Informatica's B++ in Product Experience was impacted by its B- in Capability, where it could enhance its data observability functionality in relation to the resolution and prevention of reliability issues. Customer Experience was impacted by its A- in TCO/ROI, where it could improve on the tools provided to buyers to calculate TCO.
Strengths
Informatica performed best in Customer Experience with an A- grade, notably receiving an A- in Validation due to its support services and the viability of the software. Informatica received a B++ grade in Product Experience, with an A in Reliability due to its strong technology architecture and operations.
For inclusion in the ISG Buyers Guide™ for DataOps in 2024, a software provider must be in good standing financially and ethically, have at least $10 million in annual or projected revenue verified using independent sources, sell products and provide support on at least two continents, and have at least 50 employees. The principal source of the relevant business unit’s revenue must be software-related, and there must have been at least one major software release in the past 12 months.
The software provider must offer a product or products that support agile and collaborative data operations and are marketed as addressing at least one of the following functional areas, which are mapped into Buyers Guide capability criteria: a DataOps tool or platform; a data pipeline development, deployment and testing tool or platform; a data orchestration tool or platform; a data observability tool or platform; or a data products tool or platform.
The Data Operations Buyers Guide consists of five parallel evaluations focused on data pipelines, data orchestration, data observability and data products as well as an overall evaluation related to data operations.
Data Operations focuses on the application of agile development, DevOps and lean manufacturing by data engineering professionals in support of data production. It encompasses the development, testing, deployment and orchestration of data integration and processing pipelines, along with improved data quality and validity via data monitoring and observability, and the development and consumption of data products.
To be included in this Buyers Guide requires functionality that addresses the following sections of the capabilities document: data pipelines, data orchestration, data observability and data products.
This approach is designed to maintain consistency with the 2023 DataOps Buyers Guide and reflects the relative immaturity of data product platforms.
The research is designed to be independent of the specifics of software provider packaging and pricing. To represent the real-world environment in which businesses operate, we include providers that offer suites or packages of products that may include relevant individual modules or applications. If a software provider is actively marketing, selling and developing a product for the general market and it is reflected on the provider’s website that the product is within the scope of the research, that provider is automatically evaluated for inclusion.
All software providers that offer relevant DataOps products and meet the inclusion requirements were invited to participate in the evaluation process at no cost to them.
Software providers that meet our inclusion criteria but did not completely participate in our Buyers Guide were assessed solely on publicly available information. As this could have a significant impact on classification and ratings, we recommend additional scrutiny when evaluating those providers.
Provider | Product Names | Version | Release
--- | --- | --- | ---
Alteryx | Analytics Cloud | N/A | October 2024
Alteryx | Connect | 2024.2 | October 2024
Astronomer | Astro | N/A | October 2024
AWS | AWS Glue | N/A | September 2024
AWS | Amazon Managed Workflows for Apache Airflow | N/A | September 2024
AWS | Amazon DataZone | N/A | September 2024
BMC | Control-M | 9.0.21.300 | October 2024
Dagster Labs | Dagster+ | 1.8.12 | October 2024
Databricks | Data Intelligence Platform | N/A | October 2024
DataKitchen | DataOps TestGen | 2.15.3 | October 2024
DataKitchen | DataOps Automation | 1.2.9 | February 2024
DataKitchen | DataOps Observability | 2.2.1 | August 2024
DataOps.live | DataOps.live | October 2024 | October 2024
dbt Labs | dbt | October 2024 | October 2024
Google | Cloud Data Fusion | N/A | October 2024
Google | Cloud Dataflow | N/A | September 2024
Hitachi | Pentaho Data Integration | 10.2 | September 2024
IBM | Cloud Pak for Data | 5.0 | September 2024
IBM | Data Observability by Databand | N/A | September 2024
Informatica | Intelligent Data Management Cloud | October 2024 | October 2024
Infoworks | Infoworks | 6.1.0 | September 2024
K2view | Data Product Platform | 8.1.1 | October 2024
Keboola | Keboola | N/A | November 2024
Matillion | Data Productivity Cloud | N/A | October 2024
Microsoft | Fabric | October 2024 | October 2024
Microsoft | Purview | October 2024 | October 2024
Nexla | Nexla | N/A | October 2024
Prefect | Prefect Cloud | 3.0 | September 2024
Qlik | Talend Cloud | N/A | October 2024
Qlik | Talend Cloud Data Inventory | R2024-09 | September 2024
Rivery | Rivery | October 2024 | October 2024
SAP | Data Intelligence Cloud | N/A | April 2024
SAP | Datasphere | 2024.20 | September 2024
Y42 | Y42 | N/A | July 2024
Zoho | DataPrep | 2.0 | October 2024
We did not include software providers that, as a result of our research and analysis, did not satisfy the criteria for inclusion in this Buyers Guide. These are listed below as “Providers of Promise.”
Provider | Product | Annual Revenue >$10M | Operates on 2 Continents | At Least 50 Employees | GA Product/Documentation
--- | --- | --- | --- | --- | ---
Ascend | Data Automation Cloud | No | Yes | No | Yes
RightData | DataFactory, DataTrust, DataMarket | Yes | Yes | Yes | No
Saturam | Qualdo, Piperr | No | Yes | Yes | No
Appendix: Value Index Methodology
To prepare this Buyers Guide, we utilize our Value Index methodology that draws on our more than two decades of market research, which includes benchmarking and advising thousands of enterprises. Our continuous market research provides the context of the real needs of buyers, complemented by our research on software providers, knowledge of the market and subject matter expertise in this area.
The following guidelines were presented to potential participants that met our inclusion criteria:
To ensure the accuracy of the information we collect and ensure that the Buyers Guide reflects the concerns of a well-crafted RFI, we require participating software providers to provide evaluation information across all seven categories. ISG Research then validates the information, first independently through our knowledge base of software providers, product information and extensive web-based research, and then through consultation.
After validation, we grade and aggregate each software provider to determine performance in each evaluation category. Then, through weighted analytics, the ratings in the product and customer experience categories and the overall ranking are assigned. If a provider submitted more than one product for evaluation, we assessed the additional product(s) using our Capability and other evaluation categories.
We have made every effort to encompass the overall requirements that best meet an enterprise’s needs today and into the future. Even so, there may be aspects of the software provider that we did not cover but affect which products best fit your particular requirements. Therefore, while this research is complete as it stands, utilizing it in your organizational context is critical to ensure that products deliver the highest level of support for your requirements.
ISG Software Research provides expert market insights on vertical industries, business, AI and IT through comprehensive consulting, advisory and research services with world-class industry analysts and client experience. Our ISG Buyers Guides offer comprehensive ratings and insights into technology providers and products. Explore our research at research.isg-one.com.
ISG Research provides subscription research, advisory consulting and executive event services focused on market trends and disruptive technologies driving change in business computing. ISG Research delivers guidance that helps businesses accelerate growth and create more value. For more information about ISG Research subscriptions, please email contact@isg-one.com.
ISG (Information Services Group) (Nasdaq: III) is a leading global technology research and advisory firm. A trusted business partner to more than 900 clients, including more than 75 of the world’s top 100 enterprises, ISG is committed to helping corporations, public sector organizations, and service and technology providers achieve operational excellence and faster growth. The firm specializes in digital transformation services, including AI and automation, cloud and data analytics; sourcing advisory; managed governance and risk services; network carrier services; strategy and operations design; change management; market intelligence and technology research and analysis. Founded in 2006 and based in Stamford, Conn., ISG employs 1,600 digital-ready professionals operating in more than 20 countries—a global team known for its innovative thinking, market influence, deep industry and technology expertise, and world-class research and analytical capabilities based on the industry’s most comprehensive marketplace data.
For more information, visit isg-one.com.