Rajesh S has over 3 years of experience in developing ETL applications using IBM Datastage. He has extensive experience designing and developing Datastage jobs to extract, transform and load data from various sources such as Oracle and Teradata databases into data warehouses. Some of his key skills include Datastage, Unix scripting, Oracle, Teradata and working on projects in the healthcare and retail domains.
Meetup: Big Data NLP with HPCC Systems® - A Development Ride from Spray to TH... (HPCC Systems)
HPCC (High Performance Computing Cluster) Systems from LexisNexis is an open-source, massively parallel-processing computing platform that solves Big Data problems. In this talk, attendees will be given an overview of HPCC Systems and see a demonstration of its use to parse data from free-form and semi-structured text, combining automated text extraction with human intervention. The code elements and massively parallel processing principles involved in accomplishing these tasks will be thoroughly discussed.
Shipra Jaiswal has over 6 years of experience in data warehousing and business intelligence solutions using tools like Informatica and Teradata. She has worked on ETL projects in various domains including healthcare, banking, e-commerce, and aviation. Her responsibilities have included requirements gathering, data modeling, mapping design, development, testing, implementation, and support.
Sivakumar has over 9 years of experience in data warehousing and ETL development using tools like Informatica and Teradata. He has extensive experience designing and developing ETL processes, performing testing, and collaborating with other teams on data migration projects for clients in various industries.
The document contains the resume of Naveen Reddy Tamma which summarizes his work experience and qualifications. He has over 7 years of experience working as an Associate at Cognizant Technology Solutions on various projects involving Informatica ETL development, data quality, and reporting. He holds a B.Tech in Computer Science and has experience with technologies like Informatica, Teradata, Oracle, and Cognos.
This document provides a summary of Rajesh Dheeti's professional experience and qualifications. It summarizes his 4+ years of experience developing ETL processes using Informatica PowerCenter to extract, transform, and load data from sources like Oracle and Teradata. It also lists 5 projects he has worked on involving building ETL mappings and workflows to load data into data warehouses.
Mukhtar Ahmed has over 8 years of experience in data warehousing and ETL projects. He has designed, developed, deployed, and supported large-scale ETL processes involving source data exceeding 100 terabytes. He specializes in IBM InfoSphere DataStage and Teradata utilities, and has worked in multiple industries including healthcare, banking, and insurance.
This resume is for Basu K S, an SAP BODS ETL Developer with over 3 years of experience developing and maintaining data warehouses and performing data migration. He has extensive experience using tools like SAP BODS, Information Steward, SQL Server, and writing SQL stored procedures. Some of his responsibilities include providing ETL designs, developing ETL jobs, performing data cleansing, testing, and supporting production loads. He is currently working at Utopia and has previously worked at Mindtree.
Arun has over 9 years of experience in IT with a focus on data warehousing and ETL projects. He has experience working with various tools like Informatica, Greenplum, and Teradata. Some of the key projects he has worked on include projects for Novo Nordisk involving building a data warehouse and ETL processes to load data from various sources, and projects for Inttra involving designing an ETL framework, data loading, and report generation.
This document contains a professional profile for Jeevananthan R, including his contact details, educational background, work experience, skills, and strengths. He has over 4 years of experience in data warehousing using tools like Informatica, Oracle, and Unix shell scripting. Currently working as an ETL Developer/Module Lead at Cognizant Technologies, he has previously worked on projects involving data extraction, transformation, and loading for clients like Smith & Nephew and CVS Caremark.
The document provides a summary of an ETL developer's skills and experience. It includes 3+ years of experience developing ETL processes in IBM InfoSphere Datastage 9.1. Specific experience includes developing Datastage jobs using various stages, debugging, performance tuning, implementing slowly changing dimensions, and working with databases like Oracle, SQL Server and Netezza. Project experience is provided for three projects involving reverse mortgage data warehousing, risk data warehousing, and an order tracking application. Responsibilities included developing ETL processes, testing, and supporting production environments.
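Slowly changing dimensions, mentioned above, are a standard warehousing technique: rather than overwriting a changed attribute, the current dimension row is expired and a new version is inserted so history is preserved. A minimal in-memory Type 2 sketch (hypothetical record layout; tools like DataStage implement this with dedicated stages):

```python
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    """Apply a Type 2 slowly changing dimension update.

    dimension: list of dicts with keys key, attr, valid_from, valid_to, current
    incoming:  dict with keys key, attr (the latest source value)
    """
    today = today or date.today()
    for row in dimension:
        if row["key"] == incoming["key"] and row["current"]:
            if row["attr"] == incoming["attr"]:
                return dimension  # no change: keep the current version
            # expire the old version instead of overwriting it
            row["valid_to"] = today
            row["current"] = False
            break
    # insert the new version as the current row, preserving history
    dimension.append({
        "key": incoming["key"],
        "attr": incoming["attr"],
        "valid_from": today,
        "valid_to": None,
        "current": True,
    })
    return dimension
```

In a real warehouse the same logic runs as a lookup-plus-update against the dimension table, keyed on the business key and the current-row flag.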
ETL tools extract data from various sources, transform it for reporting and analysis, cleanse errors, and load it into a data warehouse. They save time and money compared to manual coding by automating this process. Popular open-source ETL tools include Pentaho Kettle and Talend, while Informatica is a leading commercial tool. A comparison found that Pentaho Kettle uses a graphical interface and standalone engine, has a large user community, and includes data quality features, while Talend generates code to run ETL jobs.
This resume summarizes Sanjaykumar Mane's qualifications and experience. He has over 15 years of experience in database engineering. His skills include Oracle 11g, PL/SQL, SQL*Loader, UNIX, data modeling, and ETL tools like Pentaho and Oracle Data Integrator (ODI). He has worked as a technical lead and project leader on various projects involving data migration, report generation, and database design. His most recent experience is as a technical lead for CITI, where he worked on MemSQL and ODI proof of concepts.
Aden Bahdon has over 15 years of experience as an Oracle developer and database administrator, specializing in designing and implementing data warehouse solutions. He has extensive experience working on projects for clients such as IBM Canada, Bell Canada, and the Department of National Defence, where he developed databases, ETL processes, and reports. His skills include Oracle, SQL, PL/SQL, Java, DataStage, MicroStrategy, and he has experience in all phases of the software development lifecycle.
Ramachandran has over 9 years of experience as a Senior Informatica Developer. He has expertise in data warehousing, ETL development and implementation using Informatica PowerCenter. Some of his key skills include dimensional modeling, mapping complex transformations, tuning Informatica workflows, and working with databases such as Oracle, SQL Server and Teradata. He has worked on multiple projects in the healthcare domain for clients in the US and India.
J.M. Hoffman has over 35 years of experience in programming and systems analysis. He has extensive skills in languages like SQL, Visual Basic, C++, and COBOL. He has worked on projects involving medical billing systems, point of sale systems, and web development. His background includes roles as a senior programmer, systems analyst, and database engineer on various projects.
Shrikantha DM is a senior software engineer with over 3.5 years of experience developing applications using Oracle PL/SQL. He has skills in application design, development, testing and implementation using Oracle databases, PL/SQL and SQL. His experience includes developing applications for tasks such as URL filtering, managing IP addresses and domain servers, and organizing meetings using wireless networks.
Eric Stone is an IT specialist with over 25 years of experience in application development, database design, and programming. He has extensive experience with Oracle, SQL Server, data warehousing, and business intelligence tools. His background includes developing applications, performing Y2K testing, and both client/server and network support. He is skilled in Oracle, SQL Server, Informatica, Visual Basic, Java, C++ and Clipper.
This document is a curriculum vitae for Gervano Polycarp Domingos Fernandes summarizing his professional experience and qualifications. He has over 5 years of experience in IT, particularly in data warehousing and integration using tools like Informatica PowerCenter and Oracle databases. His most recent roles include working as an ETL developer on telecom data warehouse projects in Africa and developing Oracle database applications for telecom clients in the UK. He holds a B.E. in ETC from Goa University and has experience in technologies including SQL, shell scripting, PHP, and UNIX.
Pedro P. Gatica Jr. provides his contact information and an extensive professional profile summarizing his experience and skills as an IT Lead Software Developer/Integrator with over 25 years of experience in data warehousing, ETL development, and project management. He has expertise in technologies like DataStage, Informatica, DB2, Oracle, and SQL Server. His experience includes roles at USAA from 1997 to 2014 where he led many strategic data integration projects and established batch and real-time ETL environments.
Eric Stone has over 25 years of experience as an IT specialist and programmer. He has extensive experience developing data warehouses and ETL processes using tools like Informatica and Oracle. His background includes application development, database design, and data warehousing. He is currently developing extracts and ETL for a company's Oracle-based enterprise data warehouse containing blood banking data from centers nationwide.
The document is a 20 page comparison of ETL tools. It includes an introduction, descriptions of 4 ETL tools (Pentaho Kettle, Talend, Informatica PowerCenter, Inaplex Inaport), and a section comparing the tools on various criteria such as cost, ease of use, speed and data quality. The comparison chart suggests Informatica PowerCenter is the fastest and most full-featured tool while open source options like Pentaho Kettle and Talend offer lower costs but require more manual configuration.
ETL (Extract, Transform, Load) is a process that allows companies to consolidate data from multiple sources into a single target data store, such as a data warehouse. It involves extracting data from heterogeneous sources, transforming it to fit operational needs, and loading it into the target data store. ETL tools automate this process, allowing companies to access and analyze consolidated data for critical business decisions. Popular ETL tools include IBM InfoSphere DataStage, Informatica, and Oracle Warehouse Builder.
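The three-stage process described above can be sketched in a few lines of Python. This is an illustrative toy pipeline, not any particular tool's API: the file name, column names, and target table are hypothetical, with SQLite standing in for the warehouse.

```python
import csv
import sqlite3

def extract(path):
    """Extract: read raw records from a CSV source file."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    """Transform: normalize fields and cleanse records that fail validation."""
    cleaned = []
    for row in rows:
        name = row.get("name", "").strip().title()
        try:
            amount = float(row.get("amount", ""))
        except ValueError:
            continue  # cleanse: skip rows with unparseable amounts
        cleaned.append((name, amount))
    return cleaned

def load(records, conn):
    """Load: insert transformed records into the target warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales_fact (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales_fact VALUES (?, ?)", records)
    conn.commit()
```

Commercial ETL tools wrap exactly this shape in graphical mappings, adding connectors, scheduling, restartability, and lineage on top.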
Syed Babar H Rizvi has over 12 years of experience as a software developer and team lead working on projects in healthcare, telecom and manufacturing. He currently works at Parametric Technology Corporation as a module lead. He has extensive experience with ETL tools like Informatica and databases like Oracle. He also has knowledge of big data technologies such as Hadoop, Spark and machine learning techniques.
This document provides a summary of the job applicant's experience including 7 years of experience working with Oracle databases and developing applications using PL/SQL. The applicant has experience developing database objects, queries, procedures, packages and triggers for various clients across several projects of varying sizes. Their education includes a B.Tech in IT and skills include SQL, PL/SQL, Oracle and MS SQL Server databases.
The document summarizes Rensselaer's process for selecting an Extraction, Transformation and Loading (ETL) tool for their data warehouse project. A selection committee evaluated tools from Ascential, Cognos, and Informatica. They determined that Cognos' tool was too limited. The Ascential and Informatica tools were comparable, but Informatica was selected because it was deemed more mature and stable, was simpler to use while maintaining flexibility, and had lower overall costs given Rensselaer's source and target systems. Reference checks were also conducted with existing Ascential and Informatica customers.
Ratna Rao Yamani has over 9 years of experience in IT and 7 years of experience with data warehousing technologies like Informatica Power Center and Informatica MDM. They have extensive experience developing ETL code, working with databases like Oracle and DB2, and performing tasks like requirements gathering, design documentation, testing, and performance tuning for various projects involving data integration and data warehousing.
This document summarizes key Turkish ports and their container terminal operations. It includes maps of the Mediterranean and Turkey locating major ports. The largest port is in Istanbul, with several privately operated terminals on the European side handling over 1.4 million TEU annually. Other key container ports discussed are in Izmir, Mersin, Gemlik and Aliaga.
This document profiles 9 young, successful marketers in India. It provides brief biographies on each marketer, including their career paths, accomplishments, and responsibilities in their current roles at companies like P&G, Cadbury, Coca-Cola, and Asian Paints. The marketers range in age from their early 30s to late 30s and work in categories like fabric care, chocolate, flavors, and paints. They demonstrate traits like passion, risk-taking, and a focus on innovation to drive their brands' growth.
Mani Sagar is a Senior ETL Developer and Lead with over 8 years of experience in designing, developing, and maintaining large enterprise applications. He has expert knowledge of ETL technologies like Informatica and of data management processes including data migration, profiling, quality, security, and warehousing. He has led teams of up to 8 developers and delivered projects on time for clients across various industries.
Lokesh Reddy has over 10 years of experience as an analyst programmer at Accenture, USA. He has expertise in DataStage, Teradata, Oracle, SQL Server, and MSBI tools. Some of his roles include designing and developing DataStage jobs, writing SQL scripts, testing, and promoting code between environments. He has experience leading teams and working on projects for clients such as FTB, RBS Citizens Bank, MillerCoors, Aon, Jemena, and MHRA.
The document provides a technical summary and experience profile of Nootan Sharma. It summarizes his 8 years of experience in data warehousing and business intelligence projects. It details his expertise in tools like Informatica PowerCenter, Oracle, SQL Server and data quality management. It also lists his past work experience with companies like Capgemini, Birlasoft and Infogain on various BI and data warehousing projects for clients in different sectors.
- The document contains the resume of Abdul Mohammed, an ETL developer with 8 years of experience using Informatica for data warehousing projects.
- He has expertise in requirements gathering, data extraction from various sources, transforming the data using Informatica tools, and loading the data into target databases.
- His most recent role was as an ETL/SR Informatica Lead from 2015-present where he worked on building a data warehouse for a pharmaceutical company using Informatica to extract data from Oracle and flat files.
Shivaprasada Kodoth is seeking a position as an ETL Lead/Architect with experience in data warehousing and ETL. He has over 8 years of experience in data warehousing and Informatica design and development. He is proficient in technologies like Oracle, Teradata, SQL, and PL/SQL. His key projects include developing ETL mappings and workflows for integrating various systems at Boehringer Ingelheim and UBS. He is looking for opportunities in Bangalore, Mangalore, Cochin, Europe, the USA, Australia, or Singapore.
The document provides a professional summary and work experience for Job A Easow. It summarizes his skills in ETL using DataStage, data warehousing concepts, SQL, and databases like Oracle and Teradata. It outlines two roles: first as a DataStage Developer for Harvard Pilgrim HealthCare, where he developed ETL jobs and scripts for loading data into a data warehouse, and second as a ServiceNow Developer for the University of Texas at Tyler, where he customized, developed, and maintained their ServiceNow tool. The document demonstrates his experience in ETL, data warehousing, databases, scripting languages, and configuration of tools like DataStage, Teradata, and ServiceNow.
Akshay Shaha is a technical lead with 5 years of experience in data warehousing and business intelligence projects. He has expertise in Teradata, Informatica, and SQL. Shaha currently works as a consultant for a large healthcare client at Deloitte, where he leads a team developing software to analyze and report on healthcare provider costs and services. Previously he has worked on other data-focused projects in healthcare and banking. Shaha holds a Bachelor's degree in Information Technology.
Kumari Anuradha is seeking a role utilizing her 3 years of experience as a software developer/designer in data warehousing and business intelligence. She has expertise in ETL programming using Informatica PowerCenter and in loading data into data warehouses like Teradata. Her experience includes projects for clients like PepsiCo and Cisco Systems, involving requirements analysis, design, development, testing, and support of ETL processes that extract data from various sources and load it into data warehouses. She has skills in SQL, databases like Oracle, scripting, and tools like Informatica, Toad, and Tableau.
Shrey Kumar has over 4 years of experience as a Software Engineer working on various data integration projects. He has extensive knowledge of tools like ODI, OBIEE, JIRA, and SQL Developer, and of programming languages like Java and PL/SQL. His responsibilities have included requirement analysis, documentation, coding, testing, resolving issues, managing data loads, and implementing change requests. He currently works as a Senior Software Engineer at KPIT Technologies on a data integration project for Praxair that involves loading data from various sources into an Oracle data warehouse.
Pradeepa Dharmappa is an Oracle Certified Associate with over 5 years of experience in data warehousing using tools like Informatica PowerCenter, PL/SQL, and SAP BODS. She has worked as a Software Engineer at HCL Technologies and as a Senior Software Engineer at InterCall, where her responsibilities included ETL development, data integration, and supporting data applications. She is proficient in technologies like Oracle, SQL Server, UNIX, and has experience working with large datasets and complex data transformations.
Sreekanth has over 5 years of experience in data warehousing and ETL development using IBM DataStage. He has worked on projects in the telecom and automotive industries, extracting data from various sources and loading it into data warehouses. His responsibilities included designing and developing ETL jobs, testing, troubleshooting, performance tuning, and providing production support. He is proficient in DataStage, Oracle, Teradata, UNIX, and scheduling tools like Autosys.
Ganesh Kamble is a technology professional with over 1.5 years of experience in business intelligence development. He has experience developing reports, dashboards, and universes in BusinessObjects 4.0 and migrating projects from BI 3.1 to 4.0. He has also worked on ETL processes using Informatica and developed macros and procedures using Visual Basic for Applications. Kamble aims to contribute his skills in business intelligence and gain further experience and knowledge.
This document is a curriculum vitae for Rajeswari Pothala. It outlines her professional experience working for Tata Consultancy Services for over 6 years leading teams of up to 8 members on data warehousing and ETL development projects. It also lists her educational qualifications including a B.Tech in Electronics and Communication Engineering. Key projects outlined include work on the TCS Trimatrix EDW project and several projects for Aviva involving data integration, mappings development, and module lead responsibilities.
This document contains the resume of Shraddha Verma, a Data Warehouse Architect with over 10 years of experience in designing and developing ETL applications for data warehouses. She has extensive experience with tools like DataStage, Informatica, and Teradata utilities. She has worked on projects in various domains for clients like United Health Group, Sapient, and Tata Consultancy Services. Her skills include ETL design, data quality, project management, and people management. She is looking for a role as a techno-functional consultant in the data warehousing/BI domain.
Pradeepa Dharmappa is seeking a job with career growth and financial growth. She has over 5 years of experience in data warehousing using tools like Informatica and SQL. She has worked as a Software Engineer at HCL Technologies and as a Senior Software Engineer at InterCall. Her experience includes ETL development, data modeling, performance tuning, and working with databases like Oracle, SQL Server, and Informix. She has a bachelor's degree in computer science with over 80% marks.
This document contains a summary of Raj Ganesh Subramanian's work experience and qualifications. He has over 5 years of experience in data warehousing, ETL development, and database management. He has extensive experience with Informatica PowerCenter and has worked on projects for clients such as GE Transportation, GE Aviation, and IGATE Technologies. He has expertise in Oracle, SQL Server, Informatica, Unix scripting, and reporting tools such as Spotfire, Tableau, and Cognos.
This document is a curriculum vitae for Venkat ramanarsaiah Bathem. It outlines his professional experience as a software engineer with over 4 years of experience working with the Informatica ETL tool on data warehousing projects. It also lists his educational background, including an MCA from NIT Raipur, technical skills including SQL, Java, C/C++, and certifications in software testing and Informatica. The CV describes 5 projects he has worked on for clients in banking, financial services and healthcare.
This document contains the resume of Anil Kumar Andra. It summarizes his 5 years of experience as an ETL Developer in the IT industry and 3 years of experience in non-IT work. It lists his technical skills including experience with IBM Datastage ETL tool, SQL, DB2, and relational databases. It also provides details of two projects he worked on, one for Bharti Airtel and Vodafone migrating and transforming telecom data, and another for Shell migrating sample test data. It describes his responsibilities of designing and developing ETL jobs to load large volumes of data into data warehouses. Finally, it briefly outlines his non-IT experience maintaining electrical equipment as a supervisor for HS
This document contains a professional summary and work experience for Tarun Medimi. He has over 2.7 years of experience as a Teradata developer delivering reporting solutions for Apple Inc. His roles have included developing ETL interfaces, stored procedures, and aggregate tables in Teradata and Oracle databases. He has worked on projects involving customer satisfaction dashboards, staffing attainment reporting, and performance management dashboards. His responsibilities have included requirements gathering, design, development, testing, implementation, and support.
Kallesha has over 4 years of experience as an Informatica/PLSQL developer. She has extensive experience developing mappings in Informatica to extract, transform and load data from various sources into data warehouses. She has worked on projects in various domains including storage, sales, banking, and finance. Kallesha is proficient in technologies like Informatica, Pentaho, Hive, HBase, Pig, Oracle, Teradata, and Shell scripting.
Rajesh S
Senior Software Engineer
Mobile: +91-7204021290
Email: RajeshMesham@hotmail.com
Summary:
Around 3.6 years of experience in the IT industry with a strong focus on
developing ETL applications, including around 3.4 years of relevant experience
in data warehousing tools such as IBM DataStage 8.7 and 9.1 (DataStage
Designer, DataStage Manager, DataStage Director), Teradata utilities (TPump,
MultiLoad, BTEQ scripts), Unix shell scripting, and the replication tool
InfoSphere CDC v10.2. Strong design fundamentals with expertise in developing
ETL applications using DataStage Enterprise Edition.
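The Teradata utilities named above are typically driven from shell wrappers. As a hedged illustration only (the logon string, database, and table names below are invented placeholders, not values from any project on this resume), a minimal BTEQ control script might look like:

```shell
#!/bin/sh
# Illustrative sketch: build the BTEQ control script that a shell wrapper
# would pipe into the `bteq` client. All identifiers are placeholders.
BTEQ_SCRIPT=$(cat <<'EOF'
.LOGON tdprod/etl_user,etl_password;
SELECT COUNT(*) FROM edw_stage.sales_load;   -- sanity-check the staged rows
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
)
echo "$BTEQ_SCRIPT"
# On a machine with the Teradata Tools and Utilities installed, the actual
# invocation would be:  echo "$BTEQ_SCRIPT" | bteq
```

The `.IF ERRORCODE` check is the usual way such scripts surface SQL failures to the calling scheduler (e.g. Autosys) through the shell exit code.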
DATASTAGE EXPERTISE:
• Worked extensively in data warehousing development projects in the
healthcare and retail domains.
• Designed and developed ETL jobs in DataStage v8.7 and v9.1.
• Wrote UNIX shell scripts.
• Worked with Oracle and Teradata databases.
• Roles and responsibilities mainly included requirements gathering, preparing
high-level and low-level design documents, development, code review,
end-to-end project implementation, finalizing scheduling requirements, and
developing user stories in an agile methodology.
• Sound understanding of data warehousing concepts.
• Extensively worked with stages such as Transformer, Sequential File, Join,
Lookup, Sort, Aggregator, Copy, Dataset, Merge, Oracle, Remove Duplicates,
and DB2.
• Good experience in the ETL process and data warehouse project tasks such as
data extraction, cleansing, transformation, and loading into the target
warehouse database.
• Experienced in troubleshooting DataStage jobs, fixing bugs, and addressing
production issues such as performance tuning and enhancements.
• Highly involved in developing DataStage parallel jobs.
EDUCATION
♦ B.Tech in Information Technology from LMEC, Anna University, Madurai
(Tamil Nadu, India), June 2012
♦ 12th – S.H.S.S – Tamil Nadu State Board (Tamil Nadu, India), April 2008
♦ 10th – S.H.S.S – Tamil Nadu State Board (Tamil Nadu, India), May 2006
CAREER SUMMARY
• Currently working with Ness Technologies as Senior Software Engineer,
from Dec 2015 to date.
• Worked with AIG Data Services as Service Analyst, from April 2015 to
Dec 2015.
• Worked with Dell Services as IT Dev Programmer, from Jan 2013 to
April 2015.
SKILLS & EXPERTISE
Core Skills:
Operating Systems : Windows 9x, 2000, XP, and Unix
Databases         : Oracle 11g, Teradata
Language          : SQL
OLAP Tools        : Business Objects
DW Tools          : DataStage 9.1 and 8.7 Enterprise Edition (PX jobs)
Replication Tool  : InfoSphere CDC v10.2
Scheduling Tools  : Autosys, Crontab (Robot)
Other Skills:
Domain Knowledge  : Healthcare (Payer segment), Insurance, Retail, and Sports
ACHIEVEMENTS / TRAINING / CERTIFICATIONS
• Received the On-the-Spot award twice for delivering code components on
time.
• Received appreciation from the client for production bug fixes.
• Involved end to end in the DataStage and DB2 migration from a Linux
machine to an AIX server.
• Completed the ITIL Foundation certification from APMG International.
PROJECT SUMMARY

Company    : Ness Technologies (Bangalore)
Client     : MWW (Mark's Work Wearhouse)
Project    : Concur – ETL Integration
Technology : DataStage 9.1, Unix scripting, Oracle 11g, replication tool
             InfoSphere CDC v10.2
Role       : Sr. ETL Developer
Duration   : 6 months, ongoing

Company    : AIG (Bangalore)
Client     : AIG Insurance
Project    : Global Technical Pricing Reporting
Technology : DataStage 8.7, Unix scripting, Oracle, DB2, Autosys
Role       : ETL Developer / PSUPP
Duration   : 8 months

Company    : Dell Services (Bangalore)
Client     : Harvard Pilgrim Health Care
Project    : HPHC EDW
Technology : DataStage 8.7, Unix scripting, Oracle, Teradata, Autosys
Role       : ETL Developer
Duration   : 24 months
PROJECT #1
Title       : Concur Integration
Client      : Mark's Work Wearhouse
Environment : DataStage 9.1, Oracle 11g, Unix shell scripting, replication
              tool InfoSphere CDC v10.2
Duration    : December 2015 to present
Description
Mark's is one of the leading retailers in Canada and has acquired FGL. FGL and
Mark's currently operate independent distribution networks across Canada. In
addition, Mark's uses manual systems in its two distribution centers, which
are operating near capacity, leading to higher costs and less flexibility in
meeting business demands during peak periods. The main objective of the
project is to create a centralized distribution center, the Calgary
Distribution Center, common to both Mark's and FGL.
Roles and Responsibilities
• Wrote a shell script to FTP files from the remote machine to the ETL server.
• Generated key pairs for file encryption/decryption on the Linux server.
• Invoked the gpg command-line utility for file encryption/decryption in the
DataStage sequencer.
• Designed and developed the ETL process as per user stories.
• Responsible for delivery of the ETL process and code.
• Established and enforced best practices.
• Fixed defects raised by the QA team during SIT.
• Wrote scripts for all jobs designed and developed.
• Prepared unit test plans and cases.
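The fetch-and-decrypt steps above can be sketched roughly as follows. The host name, paths, and file names are illustrative assumptions, not the project's actual values:

```shell
#!/bin/sh
# Rough sketch of the transfer-and-decrypt flow: pull an encrypted feed onto
# the ETL server, then decrypt it with gpg. remote.example.com and all paths
# are hypothetical placeholders.

REMOTE_HOST="remote.example.com"
REMOTE_FILE="/outbound/concur_feed.dat.gpg"
LANDING_DIR="/etl/landing"

# Pull the encrypted feed from the remote machine onto the ETL server.
fetch_feed() {
    sftp "$REMOTE_HOST" <<EOF
get $REMOTE_FILE $LANDING_DIR/
EOF
}

# Decrypt with the key pair generated on the Linux server; a DataStage
# sequencer can call this through an Execute Command activity.
decrypt_feed() {
    gpg --batch --yes \
        --output "$LANDING_DIR/concur_feed.dat" \
        --decrypt "$LANDING_DIR/concur_feed.dat.gpg"
}
```

In practice the sequencer would chain the two functions and check the return code of each before moving to the load step.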
PROJECT #2
Title       : Global Technical Pricing and Reporting (GTP)
Client      : American International Group
Environment : DataStage 8.7, DB2, shell scripting
Duration    : April 2015 to December 2015
Description:
A tactical solution to provide reporting functions for global strategic
product pricing. Financial data is extracted from the Duck Creek Rater
application to provide benchmark pricing, report systematically on the
information used for pricing, and enhance portfolio management. This data is
used by Actuarial and Underwriting business units at regional and global
levels.
Roles and Responsibilities
The detailed set of responsibilities includes:
• Implementing service requests using DataStage, Teradata, and Unix scripts.
• Working knowledge of the various subject areas used in the project, such as
Customer, Claims, Product, and Provider.
• Involved in data analysis.
• Involved in DWH and EDW production support.
• Involved in building DataStage jobs and sequences as per the ETL
specifications sent from onsite.
• Responsible for creating SQL validation scripts to validate the loaded
data; once the DataStage job is executed, these test scripts are run against
the databases to record the test results.
• Involved in inspections/reviews as required, maintaining proper review
documents for every task.
• Involved in the dev flow of completed work.
• Involved in taking backups of related work in Harvest.
• Involved in analyzing new service requests (SRs) and deciding the approach.
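A post-load SQL validation of the kind described above often reduces to a row-count comparison between source and target. This is a minimal sketch; the table names in the comments are invented for illustration, and real scripts would compare checksums and key columns as well:

```shell
#!/bin/sh
# Sketch of a post-load validation: compare source and target row counts
# after a DataStage job finishes. All table names below are hypothetical.

# validate_counts SRC TGT -> prints PASS/FAIL; nonzero exit on mismatch.
validate_counts() {
    if [ "$1" -eq "$2" ]; then
        echo "PASS: row counts match ($1)"
    else
        echo "FAIL: source=$1 target=$2"
        return 1
    fi
}

# In production the two counts would come from the database client, e.g.:
#   SRC=$(db2 -x "SELECT COUNT(*) FROM stage.pricing_feed")
#   TGT=$(db2 -x "SELECT COUNT(*) FROM edw.pricing_fact")
validate_counts 1200 1200   # prints "PASS: row counts match (1200)"
```

The nonzero exit on mismatch lets the scheduler halt downstream jobs and record the result automatically.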
PROJECT #3
Title       : HPHC EDW
Client      : Harvard Pilgrim Health Care
Environment : DataStage 8.7, Oracle, Teradata, SSH Tectia, Autosys
Duration    : Jan 2013 to April 2015
Description:
Harvard Pilgrim Health Care (HPHC) is the oldest nonprofit health plan in New
England. HPHC provides health care coverage in Massachusetts and Maine. Perot
Systems assists HPHC in maintaining the EDI transaction programs. The scope of
the EDI project is to develop, maintain, and provide production support for
the software programs that accept, validate, map, and route HPHC incoming and
outgoing EDI transactions. Development and maintenance (updating the existing
programs) comprise requirement gathering, analysis, design, coding, testing,
and implementation.
Roles and Responsibilities
• Participate in business and reporting requirements discussions with
business users/analysts, data modelers, and MicroStrategy developers, and
dissect technical requirements.
• Write Unix shell scripts and SQL queries for data extraction, data
analysis, and reporting based on the requirements.
• Collaborate with business users to design test plans and test cases for
User Acceptance Testing (UAT).
• Develop, optimize, and maintain DataStage jobs to support the data
warehousing and business intelligence activities of the EDW ETL group.
• Develop DataStage jobs to load external vendor medical claims audit data
into the EDW, enabling business users to make manual adjustments to invalid
audits and thereby realize savings.
• Lead the effort to build DataStage parallel jobs with Runtime Column
Propagation to load the 2010 census data into the EDW, enabling business
users to perform BI and decision-making.
• Work on emergency service requests to analyze problem incidents, debug SQL
scripts and DataStage jobs, and implement fixes.
• Work with business analysts to build, enhance, and maintain interactive EDW
audit reports in Oracle Application Express 3.x.
• Build credibility, establish rapport, and maintain communication with
stakeholders at multiple levels of the organization.
• Create detailed deployment tasks and change tickets in Remedy to facilitate
application deployments across environments.
• Exposure to HIPAA compliance requirements and HL7 standards.