Making the Case for
Legacy Data in
Modern Data Analytics
Arianna Valentini | Product Marketing Manager
Housekeeping
Webcast Audio
• Today’s webcast audio is streamed through your computer
speakers
• If you need technical assistance with the web interface or audio,
please reach out to us using the Q&A box
Questions Welcome
• Submit your questions at any time during the presentation using
the Q&A box
Recording and slides
• This webcast is being recorded. You will receive an email
following the webcast with a link to the recording
Unites and integrates
data from across the
enterprise, making data
available for a variety
of projects and business
needs in one place
Examples of modern analytics platforms
What Can You Do with Modern Analytics
Platforms?
• Centralized BI and analytics
• Data discovery
• Data democratization with governance
• Next-gen projects – AI and ML
What are the benefits of a modern analytics platform?
• Visibility into all data
• Sets course for real-time pipelines
• Limits skills gaps
• Removes data silos
Reality is not so simple
• Silos of multi-structured data
• Legacy IT infrastructure
• Data archives
• Employees
Value that Data from
Legacy Systems Brings
• Holds important transaction data
• Most core business applications run on legacy systems
• High volumes of data
What happens to legacy data sources?
• Ignore data sources for inclusion
• Build homegrown solutions
• Rely on existing investments
Best practices for legacy data integration
Best practices
1. Break down the legacy data silo
2. Rethink your current approach
3. Build real-time pipelines
4. Limit skills gap & costs
1. Break down the legacy data silo
Shipping Company requires real-time delivery
status
Top level mandate driven by customer demands to:
1. Integrate customer and shipment information that resides on multiple
systems of record
2. Improve integration of mainframe systems with analytics platform
3. Replicate changes of mainframe data to larger business in real-time
Challenge: Mainframe data not readable for downstream tracking dashboards
Precisely makes mainframe data readable
in Snowflake for real-time tracking
Solution
• Connect (ETL + CDC)
• Snowflake
Results
• Power business user and customer dashboards with
the latest shipment information
• Report shipment information in ways that give
business competitive edge
• Integrate and replicate hundreds of z/OS Db2
tables to Snowflake
• All data is integrated and readable across platforms
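The replication pattern in this case, landing CDC changes from z/OS Db2 and applying them to Snowflake tables, can be sketched as generated MERGE SQL. This is a minimal, hypothetical illustration: the table names, the `_STAGE` staging convention, and the `OP` change-type column are assumptions for the example, not Precisely Connect's actual mechanics.

```python
# Hypothetical sketch: applying staged CDC change records to a Snowflake
# target via MERGE. Table/column names and the 'OP' (I/U/D) convention
# are invented for illustration.

def build_merge_sql(target_table: str, key: str, columns: list[str]) -> str:
    """Build a MERGE that upserts staged CDC rows into the target table.

    Assumes changes were landed in a staging table named <target>_STAGE,
    with an OP column of 'I'/'U'/'D' describing each change.
    """
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in columns)
    cols = ", ".join(columns)
    vals = ", ".join(f"s.{c}" for c in columns)
    return (
        f"MERGE INTO {target_table} t "
        f"USING {target_table}_STAGE s ON t.{key} = s.{key} "
        f"WHEN MATCHED AND s.OP = 'D' THEN DELETE "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED AND s.OP <> 'D' THEN INSERT ({cols}) VALUES ({vals})"
    )

sql = build_merge_sql("SHIPMENTS", "SHIPMENT_ID",
                      ["SHIPMENT_ID", "STATUS", "UPDATED_AT"])
print(sql)
```

Running one MERGE per micro-batch of staged changes is what keeps hundreds of replicated tables readable and current on the target side.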
Lessons learned for enacting this best practice
• Have a clear idea of your data delivery SLAs: is data required in near-real-time, once an hour, or
only once a day?
• Based on your SLA, select the right data extraction mechanism for your data sources of
interest
• Remember that distributed cloud architectures promise agility but may not readily integrate
with existing infrastructure
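The SLA-to-mechanism lesson above can be made concrete with a small decision helper. This is an illustrative sketch, not a Precisely API; the thresholds are assumptions chosen to match the near-real-time / hourly / daily tiers mentioned above.

```python
# Illustrative helper (not a vendor API): pick an extraction mechanism
# from a data-delivery SLA. Thresholds are assumed for the example.

def extraction_mechanism(sla_seconds: int) -> str:
    """Map a freshness SLA to a data extraction approach."""
    if sla_seconds <= 60:           # near-real-time -> log-based CDC
        return "log-based CDC"
    if sla_seconds <= 3600:         # hourly -> incremental micro-batch pulls
        return "incremental micro-batch"
    return "scheduled bulk ETL"     # daily or slower -> full/bulk extracts

print(extraction_mechanism(30))     # near-real-time requirement
```

The point of encoding the decision this way is that the SLA, not the tooling you happen to own, drives the choice of mechanism.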
2. Rethink your current approach
Creating an enterprise claims hub required
quickly adding new targets
Strategic decision to use data to:
1. Improve the claims experience for end customers
2. Identify patterns in claims to alert the business to unexpectedly severe claims
3. Automate the fast-tracking of low dollar claims without the need for an adjuster
Challenge: Current methods of integrating mainframe data could not meet high-volume processing requirements
Precisely and Databricks help to create a high-
performance data hub
Solution
• Connect (ETL)
• Databricks
Results
• No downtime or rework for implementing a new
approach to legacy source integration
• Ability to meet requirements of high-volume
processing for data hub
• Faster time to close claims and improved customer
experiences
Lessons learned for enacting this best practice
• Clearly define the goals of your modernization efforts: are you trying to save costs, improve
performance, or something else?
• Choose data integration solutions that allow you to easily expand for new use cases
• New requirements for modern data platforms may break current data integration architectures
• Select a tool that solves your integration problems across the hybrid landscape, from datacenter
to public cloud
3. Build real-time pipelines
Financial Services Company needs to build a
real-time AML process
Top level mandate driven by regulatory demands to:
1. Have consolidated, clean, verified data for all analytics and reporting
2. Provide alerts to any suspicious activity in real-time
3. Integrate mainframe data into analytics while also maintaining an unmodified
copy of the mainframe data
Challenge: Disparate systems and slow updates to mainframe data caused
major process delays in meeting AML monitoring requirements
Precisely and Cloudera enable AML with
timely delivery
Solution
• Connect (ETL + CDC)
• Trillium
• Cloudera
Results
• High performance AML results
• Faster time to value
• Data lake is trusted source
• Data feeding critical machine learning-based
fraud detection
Looking forward…
• Expanding to additional Customer Engagement
solutions and applications
Lessons learned for enacting this best practice
• Select solutions that guarantee data delivery and have reliable transfer of information
• Ensure that your change detection mechanism has a lightweight, negligible impact on your
production systems, to minimize business disruption
• Assess how your overall cloud strategy can support real-time data delivery by selecting
technologies that can handle data in motion
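Handling data in motion for AML means evaluating each transaction as it arrives rather than after a batch load. A minimal sketch of that shape, with field names and the dollar threshold invented purely for illustration (real AML rules are far richer):

```python
# Minimal sketch of a real-time AML-style filter over a stream of
# transaction events. Field names and the threshold are assumptions
# for illustration only.

from typing import Iterable, Iterator

def suspicious(events: Iterable[dict], threshold: float = 10_000.0) -> Iterator[dict]:
    """Yield an alert for each transaction at or above the threshold,
    as events arrive (generator, so it works on unbounded streams)."""
    for event in events:
        if event["amount"] >= threshold:
            yield {"alert": "review", "tx_id": event["tx_id"],
                   "amount": event["amount"]}

stream = [{"tx_id": 1, "amount": 250.0}, {"tx_id": 2, "amount": 12_500.0}]
alerts = list(suspicious(stream))
print(alerts)
```

Because the filter is a generator over the event stream, alert latency is bounded by data delivery, which is exactly why guaranteed, low-impact change capture matters upstream.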
• DI solutions with native integrations to modern analytics platforms help to speed results
4. Limit skills gap & costs
Credit Union looks to enable a data hub for
all lines of business
Top level mandate to open up data across the organization:
1. Improve customer banking experiences
2. Provide transparency of data to lines of business for analytics and BI
3. Enable AI/ML use cases with richer legacy data sets
Challenge: Core banking functions run on the mainframe, but a lack of in-house skills
incurred high development costs and made it difficult to scale
By the numbers… the cost of legacy data
• $95 per hour × 40-hour work week = $3,800 cost per week per programmer
• 2 programmers = $7,600 cost per week
• 6-month average project time (~26 weeks) = $197,600 cost per project
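The arithmetic behind the slide's figures works out as follows (taking the 6-month project as roughly 26 weeks):

```python
# Checking the slide's cost arithmetic: two programmers at $95/hour,
# 40-hour weeks, over a 6-month (~26-week) project.

rate_per_hour = 95
hours_per_week = 40
programmers = 2
weeks = 26  # ~6 months

cost_per_week = rate_per_hour * hours_per_week * programmers   # 7,600
cost_per_project = cost_per_week * weeks                       # 197,600
print(cost_per_week, cost_per_project)  # 7600 197600
```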
Connect’s ETL helps to lower costs and solve the
skills gap
Solution
• Connect (ETL)
Results
• Reduced development costs
• Leverages existing in-house skills
• Delivers all enterprise data for distribution across a
proprietary analytics platform
Lessons learned for enacting this best practice
• “Homegrown” is not always free
• Look for solutions that help you leverage the existing skills you have in house with minimal
retraining
• While solutions can ease a skills burden, good documentation is also critical to decrease risk
and facilitate knowledge sharing
What you can do in the next 90 days…
• Assess how you are using mainframe and IBM i data today
• Look at ways in which you can leverage data from legacy systems to maximize impact
• Keep both best practices and lessons learned in mind when developing your approach!
• Remember Precisely is here to be your partner in innovation!
Questions?
 
Predictive Powerhouse: Elevating AI Accuracy and Relevance with Third-Party Data
Predictive Powerhouse: Elevating AI Accuracy and Relevance with Third-Party DataPredictive Powerhouse: Elevating AI Accuracy and Relevance with Third-Party Data
Predictive Powerhouse: Elevating AI Accuracy and Relevance with Third-Party Data
 
Digital Banking in the Cloud: How Citizens Bank Unlocked Their Mainframe
Digital Banking in the Cloud: How Citizens Bank Unlocked Their MainframeDigital Banking in the Cloud: How Citizens Bank Unlocked Their Mainframe
Digital Banking in the Cloud: How Citizens Bank Unlocked Their Mainframe
 
信頼できるデータでESGイニシアチブを成功に導く方法.pdf How to drive success with ESG initiatives with...
信頼できるデータでESGイニシアチブを成功に導く方法.pdf How to drive success with ESG initiatives with...信頼できるデータでESGイニシアチブを成功に導く方法.pdf How to drive success with ESG initiatives with...
信頼できるデータでESGイニシアチブを成功に導く方法.pdf How to drive success with ESG initiatives with...
 
AI-Ready Data - The Key to Transforming Projects into Production.pptx
AI-Ready Data - The Key to Transforming Projects into Production.pptxAI-Ready Data - The Key to Transforming Projects into Production.pptx
AI-Ready Data - The Key to Transforming Projects into Production.pptx
 
Building a Multi-Layered Defense for Your IBM i Security
Building a Multi-Layered Defense for Your IBM i SecurityBuilding a Multi-Layered Defense for Your IBM i Security
Building a Multi-Layered Defense for Your IBM i Security
 
Optimierte Daten und Prozesse mit KI / ML + SAP Fiori.pdf
Optimierte Daten und Prozesse mit KI / ML + SAP Fiori.pdfOptimierte Daten und Prozesse mit KI / ML + SAP Fiori.pdf
Optimierte Daten und Prozesse mit KI / ML + SAP Fiori.pdf
 
Chaining, Looping, and Long Text for Script Development and Automation.pdf
Chaining, Looping, and Long Text for Script Development and Automation.pdfChaining, Looping, and Long Text for Script Development and Automation.pdf
Chaining, Looping, and Long Text for Script Development and Automation.pdf
 
Revolutionizing SAP® Processes with Automation and Artificial Intelligence
Revolutionizing SAP® Processes with Automation and Artificial IntelligenceRevolutionizing SAP® Processes with Automation and Artificial Intelligence
Revolutionizing SAP® Processes with Automation and Artificial Intelligence
 
Navigating the Cloud: Best Practices for Successful Migration
Navigating the Cloud: Best Practices for Successful MigrationNavigating the Cloud: Best Practices for Successful Migration
Navigating the Cloud: Best Practices for Successful Migration
 
Unlocking the Power of Your IBM i and Z Security Data with Google Chronicle
Unlocking the Power of Your IBM i and Z Security Data with Google ChronicleUnlocking the Power of Your IBM i and Z Security Data with Google Chronicle
Unlocking the Power of Your IBM i and Z Security Data with Google Chronicle
 
How to Build Data Governance Programs That Last - A Business-First Approach.pdf
How to Build Data Governance Programs That Last - A Business-First Approach.pdfHow to Build Data Governance Programs That Last - A Business-First Approach.pdf
How to Build Data Governance Programs That Last - A Business-First Approach.pdf
 
Zukuntssichere SAP Prozesse dank automatisierter Massendaten
Zukuntssichere SAP Prozesse dank automatisierter MassendatenZukuntssichere SAP Prozesse dank automatisierter Massendaten
Zukuntssichere SAP Prozesse dank automatisierter Massendaten
 
Unlocking the Potential of the Cloud for IBM Power Systems
Unlocking the Potential of the Cloud for IBM Power SystemsUnlocking the Potential of the Cloud for IBM Power Systems
Unlocking the Potential of the Cloud for IBM Power Systems
 
Crucial Considerations for AI-ready Data.pdf
Crucial Considerations for AI-ready Data.pdfCrucial Considerations for AI-ready Data.pdf
Crucial Considerations for AI-ready Data.pdf
 
Hyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdf
Hyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdfHyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdf
Hyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdf
 

Recently uploaded

Poznań ACE event - 19.06.2024 Team 24 Wrapup slidedeck
Poznań ACE event - 19.06.2024 Team 24 Wrapup slidedeckPoznań ACE event - 19.06.2024 Team 24 Wrapup slidedeck
Poznań ACE event - 19.06.2024 Team 24 Wrapup slidedeck
FilipTomaszewski5
 
Automation Student Developers Session 3: Introduction to UI Automation
Automation Student Developers Session 3: Introduction to UI AutomationAutomation Student Developers Session 3: Introduction to UI Automation
Automation Student Developers Session 3: Introduction to UI Automation
UiPathCommunity
 
Containers & AI - Beauty and the Beast!?!
Containers & AI - Beauty and the Beast!?!Containers & AI - Beauty and the Beast!?!
Containers & AI - Beauty and the Beast!?!
Tobias Schneck
 
QA or the Highway - Component Testing: Bridging the gap between frontend appl...
QA or the Highway - Component Testing: Bridging the gap between frontend appl...QA or the Highway - Component Testing: Bridging the gap between frontend appl...
QA or the Highway - Component Testing: Bridging the gap between frontend appl...
zjhamm304
 
QR Secure: A Hybrid Approach Using Machine Learning and Security Validation F...
QR Secure: A Hybrid Approach Using Machine Learning and Security Validation F...QR Secure: A Hybrid Approach Using Machine Learning and Security Validation F...
QR Secure: A Hybrid Approach Using Machine Learning and Security Validation F...
AlexanderRichford
 
Call Girls Chandigarh🔥7023059433🔥Agency Profile Escorts in Chandigarh Availab...
Call Girls Chandigarh🔥7023059433🔥Agency Profile Escorts in Chandigarh Availab...Call Girls Chandigarh🔥7023059433🔥Agency Profile Escorts in Chandigarh Availab...
Call Girls Chandigarh🔥7023059433🔥Agency Profile Escorts in Chandigarh Availab...
manji sharman06
 
Introducing BoxLang : A new JVM language for productivity and modularity!
Introducing BoxLang : A new JVM language for productivity and modularity!Introducing BoxLang : A new JVM language for productivity and modularity!
Introducing BoxLang : A new JVM language for productivity and modularity!
Ortus Solutions, Corp
 
Communications Mining Series - Zero to Hero - Session 2
Communications Mining Series - Zero to Hero - Session 2Communications Mining Series - Zero to Hero - Session 2
Communications Mining Series - Zero to Hero - Session 2
DianaGray10
 
Real-Time Persisted Events at Supercell
Real-Time Persisted Events at  SupercellReal-Time Persisted Events at  Supercell
Real-Time Persisted Events at Supercell
ScyllaDB
 
Mutation Testing for Task-Oriented Chatbots
Mutation Testing for Task-Oriented ChatbotsMutation Testing for Task-Oriented Chatbots
Mutation Testing for Task-Oriented Chatbots
Pablo Gómez Abajo
 
MongoDB to ScyllaDB: Technical Comparison and the Path to Success
MongoDB to ScyllaDB: Technical Comparison and the Path to SuccessMongoDB to ScyllaDB: Technical Comparison and the Path to Success
MongoDB to ScyllaDB: Technical Comparison and the Path to Success
ScyllaDB
 
An Introduction to All Data Enterprise Integration
An Introduction to All Data Enterprise IntegrationAn Introduction to All Data Enterprise Integration
An Introduction to All Data Enterprise Integration
Safe Software
 
CTO Insights: Steering a High-Stakes Database Migration
CTO Insights: Steering a High-Stakes Database MigrationCTO Insights: Steering a High-Stakes Database Migration
CTO Insights: Steering a High-Stakes Database Migration
ScyllaDB
 
CNSCon 2024 Lightning Talk: Don’t Make Me Impersonate My Identity
CNSCon 2024 Lightning Talk: Don’t Make Me Impersonate My IdentityCNSCon 2024 Lightning Talk: Don’t Make Me Impersonate My Identity
CNSCon 2024 Lightning Talk: Don’t Make Me Impersonate My Identity
Cynthia Thomas
 
ScyllaDB Real-Time Event Processing with CDC
ScyllaDB Real-Time Event Processing with CDCScyllaDB Real-Time Event Processing with CDC
ScyllaDB Real-Time Event Processing with CDC
ScyllaDB
 
Discover the Unseen: Tailored Recommendation of Unwatched Content
Discover the Unseen: Tailored Recommendation of Unwatched ContentDiscover the Unseen: Tailored Recommendation of Unwatched Content
Discover the Unseen: Tailored Recommendation of Unwatched Content
ScyllaDB
 
LF Energy Webinar: Carbon Data Specifications: Mechanisms to Improve Data Acc...
LF Energy Webinar: Carbon Data Specifications: Mechanisms to Improve Data Acc...LF Energy Webinar: Carbon Data Specifications: Mechanisms to Improve Data Acc...
LF Energy Webinar: Carbon Data Specifications: Mechanisms to Improve Data Acc...
DanBrown980551
 
Fuxnet [EN] .pdf
Fuxnet [EN]                                   .pdfFuxnet [EN]                                   .pdf
Fuxnet [EN] .pdf
Overkill Security
 
APJC Introduction to ThousandEyes Webinar
APJC Introduction to ThousandEyes WebinarAPJC Introduction to ThousandEyes Webinar
APJC Introduction to ThousandEyes Webinar
ThousandEyes
 
Call Girls Kochi 💯Call Us 🔝 7426014248 🔝 Independent Kochi Escorts Service Av...
Call Girls Kochi 💯Call Us 🔝 7426014248 🔝 Independent Kochi Escorts Service Av...Call Girls Kochi 💯Call Us 🔝 7426014248 🔝 Independent Kochi Escorts Service Av...
Call Girls Kochi 💯Call Us 🔝 7426014248 🔝 Independent Kochi Escorts Service Av...
dipikamodels1
 

Recently uploaded (20)

Poznań ACE event - 19.06.2024 Team 24 Wrapup slidedeck
Poznań ACE event - 19.06.2024 Team 24 Wrapup slidedeckPoznań ACE event - 19.06.2024 Team 24 Wrapup slidedeck
Poznań ACE event - 19.06.2024 Team 24 Wrapup slidedeck
 
Automation Student Developers Session 3: Introduction to UI Automation
Automation Student Developers Session 3: Introduction to UI AutomationAutomation Student Developers Session 3: Introduction to UI Automation
Automation Student Developers Session 3: Introduction to UI Automation
 
Containers & AI - Beauty and the Beast!?!
Containers & AI - Beauty and the Beast!?!Containers & AI - Beauty and the Beast!?!
Containers & AI - Beauty and the Beast!?!
 
QA or the Highway - Component Testing: Bridging the gap between frontend appl...
QA or the Highway - Component Testing: Bridging the gap between frontend appl...QA or the Highway - Component Testing: Bridging the gap between frontend appl...
QA or the Highway - Component Testing: Bridging the gap between frontend appl...
 
QR Secure: A Hybrid Approach Using Machine Learning and Security Validation F...
QR Secure: A Hybrid Approach Using Machine Learning and Security Validation F...QR Secure: A Hybrid Approach Using Machine Learning and Security Validation F...
QR Secure: A Hybrid Approach Using Machine Learning and Security Validation F...
 
Call Girls Chandigarh🔥7023059433🔥Agency Profile Escorts in Chandigarh Availab...
Call Girls Chandigarh🔥7023059433🔥Agency Profile Escorts in Chandigarh Availab...Call Girls Chandigarh🔥7023059433🔥Agency Profile Escorts in Chandigarh Availab...
Call Girls Chandigarh🔥7023059433🔥Agency Profile Escorts in Chandigarh Availab...
 
Introducing BoxLang : A new JVM language for productivity and modularity!
Introducing BoxLang : A new JVM language for productivity and modularity!Introducing BoxLang : A new JVM language for productivity and modularity!
Introducing BoxLang : A new JVM language for productivity and modularity!
 
Communications Mining Series - Zero to Hero - Session 2
Communications Mining Series - Zero to Hero - Session 2Communications Mining Series - Zero to Hero - Session 2
Communications Mining Series - Zero to Hero - Session 2
 
Real-Time Persisted Events at Supercell
Real-Time Persisted Events at  SupercellReal-Time Persisted Events at  Supercell
Real-Time Persisted Events at Supercell
 
Mutation Testing for Task-Oriented Chatbots
Mutation Testing for Task-Oriented ChatbotsMutation Testing for Task-Oriented Chatbots
Mutation Testing for Task-Oriented Chatbots
 
MongoDB to ScyllaDB: Technical Comparison and the Path to Success
MongoDB to ScyllaDB: Technical Comparison and the Path to SuccessMongoDB to ScyllaDB: Technical Comparison and the Path to Success
MongoDB to ScyllaDB: Technical Comparison and the Path to Success
 
An Introduction to All Data Enterprise Integration
An Introduction to All Data Enterprise IntegrationAn Introduction to All Data Enterprise Integration
An Introduction to All Data Enterprise Integration
 
CTO Insights: Steering a High-Stakes Database Migration
CTO Insights: Steering a High-Stakes Database MigrationCTO Insights: Steering a High-Stakes Database Migration
CTO Insights: Steering a High-Stakes Database Migration
 
CNSCon 2024 Lightning Talk: Don’t Make Me Impersonate My Identity
CNSCon 2024 Lightning Talk: Don’t Make Me Impersonate My IdentityCNSCon 2024 Lightning Talk: Don’t Make Me Impersonate My Identity
CNSCon 2024 Lightning Talk: Don’t Make Me Impersonate My Identity
 
ScyllaDB Real-Time Event Processing with CDC
ScyllaDB Real-Time Event Processing with CDCScyllaDB Real-Time Event Processing with CDC
ScyllaDB Real-Time Event Processing with CDC
 
Discover the Unseen: Tailored Recommendation of Unwatched Content
Discover the Unseen: Tailored Recommendation of Unwatched ContentDiscover the Unseen: Tailored Recommendation of Unwatched Content
Discover the Unseen: Tailored Recommendation of Unwatched Content
 
LF Energy Webinar: Carbon Data Specifications: Mechanisms to Improve Data Acc...
LF Energy Webinar: Carbon Data Specifications: Mechanisms to Improve Data Acc...LF Energy Webinar: Carbon Data Specifications: Mechanisms to Improve Data Acc...
LF Energy Webinar: Carbon Data Specifications: Mechanisms to Improve Data Acc...
 
Fuxnet [EN] .pdf
Fuxnet [EN]                                   .pdfFuxnet [EN]                                   .pdf
Fuxnet [EN] .pdf
 
APJC Introduction to ThousandEyes Webinar
APJC Introduction to ThousandEyes WebinarAPJC Introduction to ThousandEyes Webinar
APJC Introduction to ThousandEyes Webinar
 
Call Girls Kochi 💯Call Us 🔝 7426014248 🔝 Independent Kochi Escorts Service Av...
Call Girls Kochi 💯Call Us 🔝 7426014248 🔝 Independent Kochi Escorts Service Av...Call Girls Kochi 💯Call Us 🔝 7426014248 🔝 Independent Kochi Escorts Service Av...
Call Girls Kochi 💯Call Us 🔝 7426014248 🔝 Independent Kochi Escorts Service Av...
 

Making the Case for Legacy Data in Modern Data Analytics Platforms

  • 1. Making the Case for Legacy Data in Modern Data Analytics Arianna Valentini | Product Marketing Manager
  • 2. Housekeeping Webcast Audio • Today’s webcast audio is streamed through your computer speakers • If you need technical assistance with the web interface or audio, please reach out to us using the Q&A box Questions Welcome • Submit your questions at any time during the presentation using the Q&A box Recording and slides • This webcast is being recorded. You will receive an email following the webcast with a link to the recording
  • 3. Unites and integrates data from across the enterprise, making data available for a variety of projects and business needs in one place
  • 4. Examples of modern analytics platforms
  • 5. What Can You Do with Modern Analytics Platforms? Centralized BI and analytics Data discovery Data democratization with governance Next-gen projects – AI and ML
  • 6. What are the benefits of a modern analytics platform? Visibility into all data Sets course for real-time pipelines Limits skills gaps Removes data silos
  • 7. Reality is not so simple Silos of multi-structured data Legacy IT infrastructure Data archives Employees
  • 8. Value that Data from Legacy Systems Brings • Holds important transaction data • Most core business applications running on legacy systems • High volumes of data
  • 9. What happens to legacy data sources? Ignore data sources for inclusion Homegrown solutions Rely on existing investments
  • 10. Best practices for legacy data integration
  • 11. Best practices 1. Break down the legacy data silo 2. Rethink your current approach 3. Build real-time pipelines 4. Limit skills gap & costs
  • 12. 1. Break down the legacy data silo
  • 13. Shipping Company requires real-time delivery status Top-level mandate driven by customer demands to: 1. Integrate customer and shipment information that resides on multiple systems of record 2. Improve integration of mainframe systems with the analytics platform 3. Replicate changes of mainframe data to the larger business in real time Challenge: Mainframe data not readable for downstream tracking dashboards
  • 14. Precisely makes mainframe data readable in Snowflake for real-time tracking Solution • Connect (ETL + CDC) • Snowflake Results • Power business user and customer dashboards with the latest shipment information • Report shipment information in ways that give the business a competitive edge • Integrate and replicate hundreds of z/OS Db2 tables to Snowflake • All data is integrated and readable across platforms
  • 15. Lessons learned for enacting this best practice • Have a clear idea of your data delivery SLAs: is data required in near-real-time, once an hour, or only once a day? • Based on your SLA, select the right data extraction mechanism for your data sources of interest • Remember that distributed cloud architectures promise agility but may not readily integrate with existing infrastructure
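The first two lessons above can be sketched as a simple decision rule. This is an illustrative helper, not from the deck; the staleness thresholds and mechanism names are assumptions chosen for the sketch:

```python
# Illustrative mapping from a data-freshness SLA to a common extraction
# mechanism. The thresholds below are assumptions, not vendor guidance.

def extraction_mechanism(max_staleness_minutes: int) -> str:
    """Pick an extraction approach for a given freshness SLA."""
    if max_staleness_minutes < 15:
        # Changes must arrive almost as soon as they commit on the source
        return "log-based CDC (stream changes as they commit)"
    if max_staleness_minutes < 24 * 60:
        # Hourly-ish freshness: replicate captured changes on a schedule
        return "micro-batch CDC (replicate changes on a schedule)"
    # Daily freshness: full or incremental bulk loads are usually enough
    return "bulk batch ETL (full or incremental nightly loads)"

extraction_mechanism(5)     # log-based CDC
extraction_mechanism(120)   # micro-batch CDC
extraction_mechanism(1440)  # bulk batch ETL
```

For the shipping customer in this case study, the speaker notes say updating dashboards every few hours was good enough, so a scheduled change-replication approach would have satisfied the SLA without streaming everything.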
  • 16. 2. Rethink your current approach
  • 17. Creating an enterprise claims hub required quickly adding new targets Strategic decision to use data to: 1. Improve the claims experience for end customers 2. Identify patterns in claims to alert the business to unexpected severe claims 3. Automate the fast-tracking of low-dollar claims without the need for an adjuster Challenge: Current methods of integrating mainframe data required a separate, siloed ingestion process, causing lost time and incomplete analytics
  • 18. Precisely and Databricks help to create a high-performance data hub Solution • Connect (ETL) • Databricks Results • No downtime or rework for implementing a new approach to legacy source integration • Ability to meet requirements of high-volume processing for the data hub • Faster time to close claims and improved customer experiences
  • 19. Lessons learned for enacting this best practice • Clearly define the goals of your modernization efforts: are you trying to save costs, improve performance, or something else? • Choose data integration solutions that allow you to easily expand for new use cases • New requirements for modern data platforms may break current data integration architectures • Select a tool that solves your integration problems across the hybrid landscape, from datacenter to public cloud
  • 20. 3. Build real-time pipelines
  • 21. Financial Services Company needs to build a real-time AML process Top-level mandate driven by regulatory demands to: 1. Have consolidated, clean, verified data for all analytics and reporting 2. Provide alerts to any suspicious activity in real time 3. Integrate mainframe data to analytics but also maintain an unmodified copy of mainframe data Challenge: Disparate systems and slow time to update mainframe data caused major process delays in meeting AML monitoring requirements
  • 22. Precisely and Cloudera enable AML with timely delivery Solution • Connect (ETL + CDC) • Trillium • Cloudera Results • High performance AML results • Faster time to value • Data lake is trusted source • Data feeding critical machine learning-based fraud detection Looking forward… • Expanding to additional Customer Engagement solutions and applications
  • 23. Lessons learned for enacting this best practice • Select solutions that guarantee data delivery and have reliable transfer of information • Ensure that your change detection mechanism has a lightweight, negligible impact on your production systems, to minimize business disruption • Assess how your overall cloud strategy can support real-time data delivery by selecting technologies that can handle data in motion • Data integration (DI) solutions with native integrations to modern analytics platforms help to speed results
  • 24. 4. Limit skills gap & costs
  • 25. Credit Union looks to enable a data hub for all lines of business Top-level mandate to open up data across the organization: 1. Improve customer banking experiences 2. Provide transparency of data to lines of business for analytics and BI 3. Enable AI/ML use cases with richer legacy data sets Challenge: Core banking functions run on the mainframe, but a lack of skills in house incurred high development costs and made it difficult to scale
  • 26. By the numbers…the cost of legacy data $95 per hour × 40-hour work week = $3,800 cost per week per programmer 2 programmers = $7,600 cost per week 6 months (26 weeks) average project time = $197,600 cost per project
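The figures on this slide follow from straightforward arithmetic (the 26-week figure for a six-month project comes from the speaker notes):

```python
# Back-of-the-envelope cost model for an in-house legacy data project.
# Rates and staffing come from the slide; 26 weeks per 6 months is the
# assumption stated in the speaker notes.
hourly_rate = 95        # $ per programmer-hour
hours_per_week = 40
programmers = 2
weeks = 26              # ~6 months

weekly_cost_per_programmer = hourly_rate * hours_per_week       # 3,800
weekly_cost_total = weekly_cost_per_programmer * programmers    # 7,600
project_cost = weekly_cost_total * weeks                        # 197,600

print(f"${project_cost:,} per project")  # $197,600 per project
```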
  • 27. Connect’s ETL helps to lower costs and solve the skills gap Solution • Connect (ETL) Results • Reduced development costs • Leverage existing skills in house • Delivers all enterprise data for distribution across a proprietary analytics platform
  • 28. Lessons learned for enacting this best practice • “Homegrown” is not always free • Look for solutions that help you leverage the existing skills you have in house with minimal retraining • While solutions can ease a skills burden, good documentation is also critical to decrease risk and facilitate knowledge sharing
  • 29. What you can do in the next 90 days… • Assess how you are currently using mainframe and IBM i data today • Look at ways in which you can leverage data from legacy systems to maximize impact • Keep both best practices and lessons learned in mind when developing your approach! • Remember Precisely is here to be your partner in innovation!

Editor's Notes

  1. All of these platforms are providing a way to approach the streamlining and unification of data for analytics-based projects such as AI, machine learning, and business insights.
  2. Centralized business insights – central management of business insights helps to shift insights from one-offs in isolation to a shared, governed view. Data discovery – business end-users can work with large data sets and get answers to the questions they are asking. Data discovery is helping the enterprise lose some of the bulk when it comes to running analytics. Data democratization – enables more users to have autonomy with data, but without the risk of exposing sensitive data in a way that could violate regulations or internal best practices
  3. Visibility into all data – it provides views that make data look simpler and more unified than it actually is in today's complex, multiplatform data environments. Sets course for real-time pipelines – the modern hub regularly instantiates data sets quickly on the fly. It may also handle terabyte-scale bulk data movement. Either way, a modern data hub requires modern pipelining for speed, scale, and on-demand processing. Limits skills gaps – the IT world is full of old-fashioned data hubs that are homegrown or consultant-built; modern platforms support advanced forms of orchestration, pipelining, governance, and semantics, all integrated in a unified tool. Removes data silos – again, this is accomplished without consolidating silos. Think of the data views, semantic layers, orchestration, and data pipelines just discussed. All these create threads that weave together into a data fabric, which is a logical data architecture for all enterprise data that can impose functional structure over hybrid chaos
  4. When it comes to building up unified analytics platforms, there is a level of complexity that exists across an enterprise: silos of multi-structured data that are difficult to integrate (ERP, CRM, mainframes, RDBMS, files, logs, cloud data sources), heterogeneous legacy IT infrastructure (EDWs, data lakes, marts, servers, storage, archives and more), and thousands (maybe more) of employees with lots of inaccessible information
  5. Your traditional systems – including mainframes, IBM i servers & data warehouses – adapt and deliver increasing value with each new technology wave Even with the growth of next-gen technologies, legacy systems (i.e. mainframes and IBM i) still play an important role within many businesses. More than 70% of Fortune 500 enterprises continue to use mainframes for their most crucial business functions. Mainframes often hold critical information – from credit card transactions to internal reports. Most large enterprises have made major investments in mainframe data environments over a period of many years and will not be leaving these investments anytime soon. It is estimated that 2.5 billion transactions are run per day, per mainframe across the world. This high volume of data is one that organizations cannot choose to ignore or neglect. Additionally, mainframes often have no peer when it comes to the volume of transactions they can handle and cost-effectiveness. As a result, these environments contain the data that organizations run on, and in turn, power the strategic big data initiatives driving the business forward – machine learning, AI and predictive analytics. Business insights, artificial intelligence and machine learning efforts are only as good as the data that is being fed in and out of them. Leaving mainframe data out of the equation when building strategic initiatives risks omitting critical information that could greatly influence business outcomes. Specifically, neglecting mainframe data from strategic initiatives results in: • The value of an organization’s big data investments being diminished • Analytics that are not accurate or complete • Large, rich enterprise datasets that never even get analyzed
  6. From speaking with Precisely customers and prospects, we have found these 3 things happen when it comes to approaching legacy data
  7. So how do we get around these and make a true enterprise data hub? Let’s take a look
  8. Break down legacy data silos – removing the barriers that come with accessing and integrating data from legacy data stores: mainframe, IBM i and more. Rethink – sometimes you might already be doing something with legacy data; you have the access, but the needs of the organization may be changing, causing you to think about how you might implement a new solution in line with, or to replace, the existing one. Real-time – data is only as good as how quickly it is delivered; to achieve this you need a way to build real-time delivery of changes in legacy systems to your analytics platform. One of the biggest hindrances to unified analytics hubs can be the lack of skills, or the costs, associated with accessing legacy data
  9. This company wants to vastly improve its tracking and package visibility. They feel that they need to offer customers more visibility into the movement of goods. Pushing the status of goods to customer dashboards will give them the ability to provide more real-time location and updates to transit time and delivery. This concept is familiar to consumer shipping, we know when and where our package is in real-time, not so much when it comes to freight. To accomplish this, they needed data from disparate data sources, including DB2/z and SQL Server. They connect their legacy sources and their target Snowflake.
  10. Understand what real-time means to your business; for this customer, updating dashboards every few hours was good enough for customers to get what they need. Based on what you need, your choice of data extraction mechanism might be different: do you want to do real-time CDC or batch delivery, and what works best to meet the SLA? In the case of Snowflake, it cannot natively read mainframe data, so you need to be prepared for such a roadblock
  11. Repeatable
  12. An American insurance company wanted to take a variety of data from across their organization to build an enterprise-wide claims data lake. The purpose of the claims data lake was to receive data from across the lines of business and improve analysis of customer activity, historical data, and richer analytics. In its ideal scenario, the claims data would help identification of patterns in claims to alert the business to unexpected severe claims or to automate the fast-tracking of low dollar claims without the need for an adjuster. Data funneling into the hub would include information from core systems such as actuary, call center, claims, and billing different departments. Most of this data existed on mainframes. Mainframe data file formats included EBCDIC-encoded VSAM data with binary and packed data types mapped by multiple complex copybooks. When it came time to integrate all these data sources, the insurance company struggled to get data from the mainframe to its data lake. Getting mainframe data into the data lake meant that they had to spin up an entirely separate process for data ingestion. As a result, the insurance company had a siloed process that caused lost time, delayed delivery, and incomplete claims analytics.
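To make the mainframe format problem above concrete, here is a minimal, hypothetical sketch of the kind of conversion an ETL tool performs on such records: decoding EBCDIC text with Python's built-in cp037 codec and unpacking a COMP-3 (packed-decimal) field. The record layout is invented for illustration, not taken from the company's copybooks:

```python
# Sketch of decoding an EBCDIC VSAM-style record with a packed-decimal field.
# The field layout below is hypothetical, not from an actual copybook.

def unpack_comp3(raw: bytes, scale: int = 0) -> float:
    """Decode an IBM packed-decimal (COMP-3) field.

    Each byte holds two decimal digits; the low nibble of the final
    byte is the sign (0xD = negative, 0xC/0xF = positive).
    """
    digits = []
    for i, b in enumerate(raw):
        digits.append(b >> 4)
        if i < len(raw) - 1:           # last low nibble is the sign, not a digit
            digits.append(b & 0x0F)
    sign = -1 if (raw[-1] & 0x0F) == 0xD else 1
    return sign * int("".join(map(str, digits))) / (10 ** scale)

# Hypothetical 10-byte record: CHAR(7) name + COMP-3 amount PIC S9(3)V99.
record = b"\xC1\xC3\xD4\xC5\x40\x40\x40" + b"\x12\x34\x5C"
name = record[:7].decode("cp037").rstrip()   # EBCDIC text -> "ACME"
amount = unpack_comp3(record[7:], scale=2)   # packed decimal -> 123.45
```

Real copybooks describe many such fields, often with REDEFINES and OCCURS clauses on top, which is why hand-rolling this conversion per source is error-prone at scale.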
  13. Once mainframe data ingest was complete, the insurance company then needed to modernize its ETL processes to scale within Databricks. The insurance company had been using Precisely Connect with Spark on Azure HDInsights for ETL transformation on its claims data hub data and determined a need to move these existing workflows into Databricks. However, the insurance company did not want to perform any rework to their data integration workflows, especially as many had complex data transformations upon the mainframe data. Using Precisely Connect, the insurance company built ETL processes that took a design-once, deploy anywhere approach, and as a result, had no rework or redesigns required to migrate the Azure HDInsights pipelines to run on Databricks. Data migration from Hive on HDInsights to Delta Lake was achieved via JDBC connectivity and the Precisely Connect high-performance integration engine to sufficiently parallelize the data load. Furthermore, Precisely Connect was able to produce the high-performance, self-tuning sorts, joins, aggregation, merges, and look-ups required for the organization to get the data they needed in the right way. Precisely Connect’s ability to run natively in the Databricks run-time also ensured they were able to optimize the data integration workflow for the high-volume requirements of the claims data hub.
  14. Meet AML transaction monitoring and Financial Conduct Authority (FCA) compliance. Challenges: Data volume too large, diversely scattered to analyze; disparate data sources – mainframe, RDBMS, cloud, etc.; maximize the value/ROI of the data lake. Requirements: Consolidated, clean, verified data for all analytics and reporting. MUST have complete, detailed data lineage from origin to end point. MUST be secure: Kerberos and LDAP integration required. Need unmodified copy of mainframe data stored on Hadoop for backup, archive
  15. Connect to create a “Golden Record” on Hadoop for compliance archiving. Trillium for cluster-native data verification, enrichment, and demanding multi-field entity resolution on the Spark framework. Cloudera provides full end-to-end lineage from all sources, through transformations, to data landing. Benefits: Ensure Anti-Money Laundering regulatory compliance is met through a financial crimes data lake – high-performance results at massive scale. Achieve fast time to value with flexible deployment and ease of use. Ensure the data lake is a trusted source of data feeding critical machine learning-based fraud detection. Expanding use to additional Customer Engagement solutions and applications.
  16. Ensure you are not using triggers for change detection, and make sure you’re using CDC solutions that use zIIP processors on the mainframe to lessen the MIPS load
  17. Db2 and VSAM files needed to be accessed for AI/ML use cases. The current solution they had for DI was complex and not dynamic. Connect helped to extract the COBOL program on the mainframe, making it scalable for big data platforms
  18. Decided to attempt doing the work in house with contractors. Prior to Connect, each project required 1–2 programmers @ $95 per hour, hired for 6–8 months; rough cost savings is $104K per project – could not quantify the overhead related to systems. Assuming 26 weeks in a 6-month period