Vasudevan Venkatraman has over 11 years of experience in the IT industry, including 7+ years with Oracle PL/SQL and data warehousing and 3+ years in performance consulting and applications database administration. He has experience designing and developing applications using Oracle PL/SQL, Hadoop, and big data technologies. He currently works as an Assistant Consultant at TCS, focusing on data warehousing projects using Oracle and Hadoop.
Resume_Triveni_Bigdata_Hadoop Professional - Triveni Patro
Triveni Patro is currently working as a Hadoop admin at Tata Consultancy Services in India with over 4 years of experience in IT development, administration, implementation, and 24/7 support of Hortonworks Hadoop distribution. Some key responsibilities include supporting production clusters of 400+ nodes and troubleshooting Hadoop cluster issues. Previous experience includes working as a Hadoop developer on projects for clients like Comcast and automotive companies, developing MapReduce, Pig, and Hive scripts for data processing and report generation.
Pankaj Resume for Hadoop, Java, J2EE - Outside World - Pankaj Kumar
Pankaj Kumar is seeking a challenging position utilizing his 7.9 years of experience in big data technologies like Hadoop, Java, and machine learning. He has deep expertise in technologies such as MapReduce, HDFS, Pig, Hive, HBase, MongoDB, and Spark. His experience includes successfully developing and delivering big data analytics solutions for healthcare, telecom, and other industries.
Romy Khetan is a senior software engineer with over 3 years of experience in big data technologies like Elasticsearch, MongoDB, Hadoop, Spark, and Java. She has worked on multiple projects involving sentiment analysis, vertical search, and identifying relationships across social media data. Her roles have included backend development, designing plugins, APIs, and interfaces between applications and services. She is proficient in technologies such as Scala, Redis, RabbitMQ, and graph databases.
This document contains the resume of Hassan Qureshi. He has over 9 years of experience as a Hadoop Lead Developer with expertise in technologies like Hadoop, HDFS, Hive, Pig and HBase. Currently he works as the technical lead of a data engineering team developing insights from data. He has extensive hands-on experience installing, configuring and maintaining Hadoop clusters in different environments.
Suresh Yadav is seeking a challenging position to improve his skills. He has a B.Tech in E.C.E from Malla Reddy Engineering College with 67% and expertise in Java, Hadoop, and Ubuntu. His one year of work experience was as a Technical Support Engineer at Krish Technologies, an IT services company. He completed an academic project on a traffic density analyzer and signal system using GSM technology. Suresh has strengths in learning new things, a positive attitude, and self-confidence.
Senior Systems Engineer at Infosys with 2.4 years of experience in Big Data and Hadoop - Abinash Bindhani
Abinash Bindhani is seeking a position as a Hadoop developer where he can utilize over 2 years of experience with Hadoop and Java technologies. He currently works as a senior systems engineer at Infosys where he has gained experience migrating data from Oracle to Hadoop platforms and collecting/analyzing log data using tools like Flume, Pig, and Hive. His technical skills include MapReduce, HBase, HDFS, Java, Spring, MySQL, and Apache Tomcat. He has expertise in Hadoop architecture, cluster concepts, and each phase of the software development life cycle.
Abhinav is an ETL Hadoop developer with over 3 years of experience working with technologies like Hadoop, Informatica Power Centre, PL/SQL, and Unix. He has 1.5 years of experience on big data projects using Hadoop and technologies like Hive, Pig, HDFS, and Sqoop. He is proficient in Informatica for ETL development, data modeling, and data integration. Abhinav has also worked on projects involving data migration, requirement analysis, automation, and testing. He is looking for a role that offers learning opportunities while allowing him to utilize his skills.
Praveen Reddy Gajjala has over 2 years of experience as a Hadoop Developer at Wipro in Hyderabad. He has extensive skills in technologies like Java, Hadoop, Pig, Hive, HBase, and Sqoop. As part of a project for Sears, he helped analyze clickstream data from their websites and apps using Hadoop to improve the customer experience.
This document contains Anil Kumar's resume. It summarizes his contact information, professional experience working with Hadoop and related technologies like MapReduce, Pig, and Hive. It also lists his technical skills and qualifications, including being a MapR certified Hadoop Professional. His work experience includes developing MapReduce algorithms, installing and configuring MapR Hadoop clusters, and working on projects for clients like Pfizer and American Express involving data analytics using Hadoop, Spark, and Hive.
Suresh Yadav is seeking a challenging position to improve his skills. He has a degree in Electronics and Communication Engineering from Malla Reddy Engineering College. He has technical skills in Hadoop, Java, SQL, HBase, Eclipse, Linux, and Windows. He has experience developing MapReduce programs and migrating data to HDFS. For a POC, he analyzed Twitter data using Hadoop, Hive, and Pig. He previously worked as a Technical Support Engineer at Krish Technologies, where he troubleshot computers and built marketing products using VPN technology.
Jayaram_Parida - Big Data Architect and Technical Scrum Master - Jayaram Parida
Jayaram Parida has over 19 years of experience in IT, including 3 years as a Big Data Technical Solution Architect. He has extensive skills in technologies like Hadoop, HDFS, HBase, Hive, MapReduce, Kafka, Storm, YARN, Pig, Python, and data analytics tools. He has experience architecting and developing big data solutions for clients in various industries. His roles have included designing Hadoop infrastructures, developing real-time analytics platforms, and creating visualizations and reports.
This document contains a summary of Renuga Veeraragavan's work experience and qualifications. It outlines 7 years of experience in IT with expertise in areas like Hadoop, Java, SQL, and web technologies. Specific roles are highlighted including current role as Hadoop Developer at Lowe's where responsibilities include data analysis, Hive queries, and HBase. Previous roles include Senior Java UI Developer at TD Bank and Accenture developing web applications. Educational background includes a B.E. in IT from Avinashilingam University.
This resume summarizes Arbind Kumar Jha's experience working with big data technologies like Hadoop, Hive, Pig, and HBase. He has over 12 years of IT experience, including 1.5 years working with Hadoop. He currently works as a Technical Architect Lead at HCL Technologies, where he architects, designs, and develops solutions involving big data, NoSQL, Hadoop, and BIRT. His technical skills include programming languages like Java, databases like Oracle and SQL Server, and big data tools like Hadoop, Hive, Pig, Cassandra, and Flume.
Kishore Babu has over 7 years of experience in data analytics, business analytics, and project management. He is currently an Associate Business Analyst at GlobalLogic Technologies working on projects for Google. He leads a team that manages project dashboards, performs cost analysis, and automates reports using tools like Hive, Google SQL, and Dremel. Previously he has held roles as a Senior Lead and Lead at GlobalLogic where he mentored teammates and ensured project metrics were achieved.
Kumaresan Kaliappan has over 14 years of experience as a software consultant specializing in middleware applications using BPM, SOA and J2EE architectures. He has hands-on experience in ecommerce, financial services, and manufacturing verticals. Currently he works as a Senior Programmer Analyst at CSC developing applications for Chrysler including a web RFQ system and tooling applications.
This document contains the resume of Bharath Kumar Rapolu, which summarizes his professional experience working with big data technologies like Hadoop, HDFS, MapReduce, Apache Pig, Hive, Sqoop, HBase and Oozie. It lists his 1.5 years of experience in Hadoop and skills in setting up Hadoop clusters, writing Pig and Hive scripts, importing/exporting data with Sqoop, and scheduling jobs with Oozie. It also provides details of his 4+ years of experience in application development using PL/SQL and his work on projects involving data processing with Hadoop and reporting with SQL.
Pradeepa Dharmappa is seeking a job offering career and financial growth. She has over 5 years of experience in data warehousing using tools like Informatica and SQL. She has worked as a Software Engineer at HCL Technologies and as a Senior Software Engineer at InterCall. Her experience includes ETL development, data modeling, performance tuning, and working with databases like Oracle, SQL Server, and Informix. She has a bachelor's degree in computer science with over 80% marks.
Madhu Kopparapu has over 16 years of experience in software development and management, specializing in e-commerce applications. He has led teams in designing and implementing many commercial software products, most recently as a technical manager at Sprint Nextel and RJM Technologies. Kopparapu has a proven track record of delivering projects on-time and on-budget through the use of agile methodologies and a focus on customer expectations.
Soundarya Reddy has over 7 years of experience as a Java developer. She has extensive experience designing and developing web applications using technologies like Java, J2EE, Spring, Hibernate, and web services. She is proficient in all phases of the development lifecycle and has worked on projects for clients like IHG and the CDC. Her most recent role is as a Java developer for Intersect Group where she works on their application for IHG.
Chandan Das is a developer/designer with over 7 years of experience in IT implementation projects using technologies like Teradata, Oracle, Hadoop, Pig, Hive, and Sqoop. He has extensive experience in data warehousing, ETL, and database administration. His career objective is to obtain a productive role in an IT organization where he can implement his expertise in developing complex projects efficiently and meeting expectations. He provides details of his professional experience, technical skills, key achievements and completed projects.
Scott Allen Williams Résumé - Senior Java Software Developer - Agile Technolo... - Scott Williams
ATTN: Currently seeking employment in the Austin, TX area as of 6-10-2014. Please ignore any older copies of my résumé you may find on SlideShare; they were posted without my permission and are out of date.
Scott Allen Williams
512-277-4254
This document is a resume for Chandrakant Pandey summarizing his experience in software development using technologies like Java, J2EE, Spring Framework and developing web services with SOAP and REST. He has over 6 years of experience working on projects using agile and waterfall methodologies. Currently he works as a Senior Java Developer at Accenture where he has developed applications and web services to automate processes around vehicle details and claims notifications.
Pradeepa Dharmappa is an Oracle Certified Associate with over 5 years of experience in data warehousing using tools like Informatica PowerCenter, PL/SQL, and SAP BODS. She has worked as a Software Engineer at HCL Technologies and as a Senior Software Engineer at InterCall, where her responsibilities included ETL development, data integration, and supporting data applications. She is proficient in technologies like Oracle, SQL Server, UNIX, and has experience working with large datasets and complex data transformations.
Borja González is a Big Data Architect with over 5 years of experience in information technologies. He has expertise in Splunk, Big Data, cloud solutions, and other technologies. Currently, he leads a team using Splunk to analyze large volumes of data and create dashboards to monitor business metrics for a major natural gas company. Previously, he deployed a Hadoop cluster for a bank and worked as a software architect for Telefonica Movistar developing their website.
Shubham Goswami is a Java developer with over 6 years of experience developing software in Java. He has extensive experience designing user interfaces and custom components in Java. He is currently working as a software developer at Socket Magnate Pvt. Ltd., where he has helped develop their ERP software system called Accountsdeck.com since 2012. He has expertise in technologies like Java, Netbeans, Eclipse, Jasper Reports, and SVN. He has an MCA in Computer Science.
Rajeshwari K A - 9+ years as Java Developer and Team Lead - Rajeshwari KA
Rajeshwari has over 9 years of experience as a Java developer, team lead, and project lead. She currently works as a Principal Application Developer at EMC Corporation, where she is responsible for gathering requirements, designing, developing, testing, and deploying applications using technologies like Java, Spring, Hibernate, and Oracle. Some of the key projects she has worked on include an automation platform for EMC infrastructure services, an investment banking portal, and a risk analysis system for a bank.
This document provides a summary of a candidate's skills and experience working with large data sets and Hadoop technologies. The candidate has 1 year of experience developing, implementing, and deploying big data solutions using Hadoop technologies like HDFS, Hive, Pig, and Sqoop in Linux and Windows environments. They have extensive experience developing UDFs, UDAFs, and UDTFs in Hive and Pig scripts for ETL processes. Additionally, the candidate has knowledge of Spark, Scala, Java/J2EE development, Linux, and databases. Their most recent role involved writing Pig and Hive queries, UDFs, and shell scripts to process and report on large data sets using Hadoop, Pig, Hive, and Cassandra.
Hamsa Balaji has over 20 years of experience in database administration, application development, and business intelligence. He is an Oracle Certified PL/SQL Developer and Database Administrator with extensive experience developing data warehouses using Oracle, SQL Server, DB2, and ETL tools like Informatica and DataStage. He currently works as a Systems Engineer at Wells Fargo where he leads the development and implementation of complex data warehouse schemas and reporting solutions.
Alok Singh is seeking challenging assignments in Business Intelligence/Data warehousing. He has nearly 7 years of experience in BI/DW, ETL, data integration, and data warehousing solution design. He is proficient in SQL, ETL tools like Informatica and SSIS, and visualization tools like QlikView and Tableau. He has experience designing and developing ETL solutions, requirements gathering, and data analysis. His past roles include positions at Technologia, Subex, and Reliance Communications where he worked on projects involving Teradata, Oracle, billing systems, and fraud detection. He has a bachelor's degree in electronics and telecommunications.
This document provides a summary of an IT professional's skills and experience. It includes contact information, locations, a summary of Oracle database and Informatica experience, lists of technical skills and tools, training and education background, and descriptions of previous roles involving data warehousing, ETL development, and Oracle development. Previous roles included senior positions at Symphony EYC, Barclays Investment Bank, and DeCare Systems Ireland developing ETL processes and Oracle databases.
IT professional with 9 years of data warehousing experience in ETL design and development. Excellent experience in requirement gathering, designing, developing, documenting, and testing ETL jobs and mappings as parallel jobs in DataStage to populate tables in data warehouses and data marts.
Ganesh Kamble has over 2 years of experience in data integration and business intelligence development using tools like Informatica, Oracle, SQL Server, and Business Objects. He has extensive experience in requirements gathering, ETL development, report design, and data warehousing. His most recent projects involved migrating a system from BI 3.1 to 4.0, loading historical data using Informatica, and building a data warehouse for a new company branch using a separate ETL system.
Ganesh Kamble is a technology professional with over 1.5 years of experience in business intelligence development. He has experience developing reports, dashboards, and universes in BusinessObjects 4.0 and migrating projects from BI 3.1 to 4.0. He has also worked on ETL processes using Informatica and developed macros and procedures using Visual Basic for Applications. Kamble aims to contribute his skills in business intelligence and gain further experience and knowledge.
Sumalatha Kalugotla is seeking a position as a Business Analyst. She has over 13 years of experience as a Business Analyst and Java developer. She has extensive experience gathering requirements, documenting specifications, and testing software projects in various domains including healthcare, banking, and retail. She is proficient in Agile methodologies, Oracle databases, and tools like UML, Visio, and Jira.
The document provides a summary of Gary Thompson's skills and experience as a Business Intelligence Professional. It highlights his expertise with Microsoft technologies including SQL Server and his experience developing ETL processes, data warehouses, OLAP cubes, and reports. It also lists his relevant work history manipulating and reporting on data from various sources to support business decisions.
This document provides a summary of Sunil Yeshwanth Madam's skills and experience. He has over 4 years of experience in IT, including 2.3 years as an SAP HANA Consultant. His technical skills include data modeling in HANA Studio, migrating HANA content between environments, and administering HANA databases. He has also designed Business Objects universes and reports connecting to data sources like SQL Server, Oracle, and SAP BW.
This resume summarizes Parthiban Ranganathan's experience in IT with a focus on data warehousing, business intelligence, and big data. He has over 7 years of experience working on healthcare and manufacturing data warehouse projects. His technical skills include Teradata, Oracle, SQL Server, DB2, Sybase, Informatica, SSIS, SSAS, and Cognos. He has experience as an ETL developer, data modeler, and BI developer. His most recent role was as a technical lead for an Anthem healthcare data mart project involving ETL, data integration, and business intelligence.
This document is a resume for S. Jayachandran seeking an IT role. It summarizes his professional experience including 9.8 years as an Oracle Certified Professional DBA and SAP BASIS administrator. Currently an Assistant Manager at DELPHI-TVS Diesel Systems, his responsibilities include managing IT teams, implementing SAP systems, and administering Oracle databases. He is proficient in technologies like SAP, Oracle, Linux and Windows servers.
Kalpana Rai is an Information Technology Professional with over 6 years of experience in Oracle Technology, specializing in development, maintenance, and support of applications for finance and telecom clients. She has a Bachelor's Degree in Computer Science and is a permanent resident of Canada. Her technical skills include Oracle, PL/SQL, SQL, SAS, and she has experience designing databases, writing queries, and migrating data between systems like Teradata and Oracle. She is currently working as an Oracle and ETL Developer at Rogers Communications in Toronto.
Subrat Kumar Panigrahi has over 6 years of experience as a Microsoft Business Intelligence developer. He has expertise in SQL Server 2005/2008, SSIS, SSRS, SSAS, and Microstrategy. Some of his responsibilities include designing ETL solutions, developing reports and dashboards, and query optimization. He has worked on projects for companies like Tesco and Wells Fargo.
Pradeep Kumar Pandey has over 10 years of experience as a data/systems integration specialist and ETL expert. He has extensive experience designing and implementing data warehouses using tools like IBM DataStage, Informatica, Oracle OBIEE, and Oracle OBIA. He has led teams and taken on roles such as developer, technical lead, and team lead. Pradeep has worked on projects across various industries including telecom, financial services, HR, and retail.
This document contains the resume of Abhinav Kaushik, an IT professional with over 8 years of experience in Oracle PL/SQL development. He has extensive skills in SQL, PL/SQL, database design, performance tuning and software development. Currently he works as a Senior Software Engineer at Trigyn Technologies where he has worked on projects like Fundpro, a package for investment management.
Abhijit Singh – 10 years in Oracle PL/SQL, Shell Scripting, Unix, Data Warehousing
The document provides details about Abhijit Kumar Singh's professional experience and skills. He has over 10 years of experience in data warehousing, ETL, Oracle PL/SQL development, and project management. Currently he works as an Associate at Deutsche Bank Group in Pune, India. Some of the major projects he has worked on include the DART data warehouse project at Deutsche Bank and data migration projects for Telefonica O2 and Novartis.
This resume is for Gary A. Thompson, a Business Intelligence Professional with over 15 years of experience in SQL Server and Microsoft technologies. He has extensive experience developing reports, databases, and data warehouses to support business intelligence and analytics needs. His skills include SQL Server, Integration Services, Analysis Services, and Reporting Services.
Pratik Dey is an IT professional with over 4 years of experience in data warehousing and ETL development. He has strong skills in Informatica PowerCenter and experience loading data from various sources into Teradata. Currently he works as an ETL Data Specialist for Thomson Reuters implementing their Connect Data Warehouse. Previously he worked on projects in healthcare and banking to develop ETL processes and load data into data warehouses.
Vishwas Vijaykumar Pande is seeking assignments in software development with an organization of repute. He has over 5 years of experience in software development, risk management, project execution, ETL, and data analytics. Currently working as a senior associate software engineer at Synechron Technologies in Pune on data conversion projects involving insurance claims, transactions, policies, and the claim lifecycle. He has expertise in Oracle, PL/SQL, ETL tools, and Hadoop technologies.
VASUDEVAN VENKATRAMAN
Bellandur Outer Ring Road, Bangalore – 560103. India.
Voice : +91-98809 38525 Email : vasudevan.venkatraman@gmail.com
PROFILE SUMMARY
Vasudevan Venkatraman has been working in the Information Technology industry for the past 11+
years, which includes 7+ years in Oracle PL/SQL and data warehousing, 3+ years in performance consulting
and applications DBA work, and 2 years in big data technologies. He designs and
develops applications using Oracle PL/SQL and Hadoop, and has rich experience in understanding,
analyzing, and implementing business process requirements.
PROFESSIONAL SKILLS
Good knowledge of the Hadoop framework, architecture, and big data concepts.
Worked on data warehousing projects using Oracle and Hadoop.
Experienced in creating databases, tables, and views using HiveQL, Impala, and Pig.
Very good knowledge of Oracle memory architecture and data warehouse architecture.
In-depth knowledge of constructing triggers, packages, collections, functions, procedures, etc.
Worked on data loading using SQL*Loader, Data Pump, external tables, and Sqoop.
Worked on materialized views, partitioning, bucketing, parallel execution, and job scheduling.
Exposure to ASM, RAC, and disk and file storage systems.
Created and monitored different tablespaces such as user, undo, and temporary tablespaces.
Performance tuning using AWR, EXPLAIN PLAN, TKPROF, and Autotrace.
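As an illustration of the Hive partitioning and bucketing listed above, a minimal sketch (table and column names are hypothetical, not taken from any project deliverable):

```sql
-- Hypothetical sales table, partitioned by load date and bucketed by customer id.
-- Partition pruning limits scans to the requested dates; bucketing by customer_id
-- speeds up joins and sampling on that key.
CREATE TABLE sales (
  order_id     BIGINT,
  customer_id  BIGINT,
  amount       DECIMAL(12,2)
)
PARTITIONED BY (load_date STRING)
CLUSTERED BY (customer_id) INTO 32 BUCKETS
STORED AS ORC;

-- Only the load_date = '2015-02-01' partition is scanned.
SELECT SUM(amount) FROM sales WHERE load_date = '2015-02-01';
```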
SKILL SET
Software Development : SQL, PL/SQL, Core Java
Performance Tools : AWR, ASH, TKPROF, Autotrace, Explain Plan, iostat, vmstat, topas
Big Data/Hadoop : HDFS, MapReduce, Hive, Pig, HBase, and Sqoop
RDBMS : Oracle 11g, HP Neoview
BIDW : Business Intelligence, Data Warehousing, and ETL Concepts
Domain : CPG – Retail, Banking
EDUCATION
Degree | Institute/University | Duration
Master of Computer Applications Madurai Kamaraj University 1999 – 2002
Bachelor of Science Gandhigram Rural University 1995 – 1998
Assistant Consultant
Feb 2015 – Present, TCS
Offshore Tech Lead for:
1. UK-based Investment Bank
• Working on a data warehousing application.
• Implemented proofs of concept on the Hadoop stack and different big data analytic tools,
including migration from an existing database (Oracle) to Hadoop.
• Loaded and transformed large sets of structured, semi-structured, and unstructured data using
Hadoop ecosystem components.
• Experience working with different data sources such as flat files, XML files, and databases.
• Worked on partitions and bucketing in Hive to optimize performance.
• Preprocessed data sets using Pig.
• Extracted data between Oracle and HDFS using Sqoop.
• Implemented Oracle SCD (slowly changing dimension) techniques for new requirements.
• Tuned batch processes and time/CPU-intensive SQL queries.
• Developed automated unit test stubs using utPLSQL to test transformations through
continuous integration.
• Worked on explain plans, Oracle hints, and creation of new indexes to improve the performance
of SQL statements.
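A minimal sketch of the SQL tuning workflow described above, using standard Oracle tooling (the table and index names are hypothetical): inspect the optimizer's plan, then trial a candidate index via a hint before committing to it.

```sql
-- Capture and display the optimizer's plan for a slow query
-- (hypothetical TRADES table).
EXPLAIN PLAN FOR
  SELECT trade_id, amount
  FROM   trades
  WHERE  trade_date = DATE '2015-02-01';

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- Force a candidate index with a hint to compare plans and timings
-- before creating/keeping it permanently.
SELECT /*+ INDEX(t trades_dt_ix) */ trade_id, amount
FROM   trades t
WHERE  trade_date = DATE '2015-02-01';
```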
Technology Lead
Jan 2011 – Jul 2014, Infosys Limited
PL/SQL Developer for:
1. Auchan Retail, France
Responsibilities included:
• Discussions with the Auchan IT team to clarify requirements.
• Performed source system analysis (SSA) to identify the source data that needed to be moved
into the target tables.
• Conducted performance reviews of queries.
Performance Consultant for:
1. Mifel Bank, Mexico
2. National Commercial Bank, Jamaica
All the mentioned projects were onsite projects for performance monitoring and analysis of the
banks' production servers. Responsibilities included:
• Preliminary discussions with the bank, preparing the Statement of Work (SOW) and project
plan.
• Validating the OS (AIX 6.3), application (Finacle), and database (Oracle 11g) level parameters
based on the current load profile of the bank's systems.
• Performance tuning and resolution of specific issues raised by the bank.
Table Partitioning activity for:
1. National Commercial Bank, Jamaica
2. Bank of Baroda, Mumbai
Both of the mentioned projects were onsite partitioning projects. Responsibilities included:
• Preliminary discussions with the bank, preparing the Statement of Work (SOW) and project
plan.
• Arriving at a strategy/methodology to partition tables and related indexes.
• Involved in table redesign, implementing partitioned tables and partitioned indexes
to make the database faster and easier to maintain.
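The table redesign described above can be sketched as an Oracle range-partitioned table with a local index. This is an illustrative example only; the names, columns, and partition boundaries are invented, not the banks' actual schemas:

```sql
-- Hypothetical transaction table, range-partitioned by date so old data can
-- be archived or dropped per partition, and queries prune to relevant ranges.
CREATE TABLE txn_history (
  txn_id    NUMBER,
  acct_no   NUMBER,
  txn_date  DATE,
  amount    NUMBER(14,2)
)
PARTITION BY RANGE (txn_date) (
  PARTITION p2010q1 VALUES LESS THAN (DATE '2010-04-01'),
  PARTITION p2010q2 VALUES LESS THAN (DATE '2010-07-01'),
  PARTITION pmax    VALUES LESS THAN (MAXVALUE)
);

-- Local index: each index partition covers exactly one table partition,
-- keeping per-partition maintenance (drop/exchange) fast.
CREATE INDEX txn_acct_ix ON txn_history (acct_no) LOCAL;
```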
Senior Software Engineer, Hewlett Packard
Mar 2008 – Jan 2011
Implementation of an Enterprise-Wide Data Warehouse for Bank of Baroda
Bank of Baroda embarked on a significant business transformation aiming to enhance
efficiency, productivity, and competitiveness by adopting the latest business processes and
technology; toward this end, it implemented its Technology-Enabled Business and IT
Strategy Project.
As part of the transformation process, the Bank required a powerful and comprehensive
performance, risk management, and executive information system to enable decision
support and profitability evaluation across multiple dimensions. In addition, the Bank was looking
at solution components to enable budgeting and planning and market and liquidity risk
measurement, while paving the way for an objective decision-making process.
This required the implementation of an enterprise-wide data warehouse with a proven data
model that could consolidate data across the Bank for strategic and tactical decision-making,
along with advanced analytical engines enabling micro-level customer, account, and
aggregate analysis.
• Worked on the logical and physical design of the data warehouse application.
• Worked on advanced Oracle ETL concepts.
• Loaded and processed bulk data using SQL*Loader.
• SQL and PL/SQL tuning.
• Worked on large sets of data.
• Used bulk collections for better performance and easy retrieval of data, reducing context
switching between the SQL and PL/SQL engines.
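The bulk-collection technique mentioned above can be sketched as follows (the staging table and batch size are illustrative assumptions, not project specifics):

```sql
-- Fetch in batches with BULK COLLECT and write back with FORALL, so the
-- PL/SQL engine crosses into the SQL engine once per batch instead of per row.
DECLARE
  TYPE t_ids IS TABLE OF staging.id%TYPE;
  l_ids t_ids;
  CURSOR c IS SELECT id FROM staging WHERE processed = 'N';
BEGIN
  OPEN c;
  LOOP
    FETCH c BULK COLLECT INTO l_ids LIMIT 1000;  -- one context switch per 1000 rows
    EXIT WHEN l_ids.COUNT = 0;
    FORALL i IN 1 .. l_ids.COUNT                  -- single bulk-bound DML call
      UPDATE staging SET processed = 'Y' WHERE id = l_ids(i);
    COMMIT;
  END LOOP;
  CLOSE c;
END;
/
```

The LIMIT clause bounds PGA memory per batch; without it, a very large result set would be materialized in one collection.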
Migration from Oracle to HP Neoview for Canon, Singapore
The Canon migration project involved replacing the existing Oracle data warehouse with the HP
Neoview system. It consisted of migrating the Oracle procedures, the respective DDLs, other core
database objects, the BO reports, and the data load to the Neoview environment, and
redesigning/developing Neoview Java stored procedures equivalent to the functionality provided in
the existing Oracle stored procedures.
• Redesigned/developed Neoview Java stored procedures and applications.
• Performance tuning of the SQL written in the Java stored procedures.
• Reviewed other core database conversions.
• Interacted with users to arrive at better solutions.
• Supported users during the UAT phase.
Graphical Interfaces in the Supplier Performance Management Portal
The AL SPM graphical enhancement project was executed for Ashok Leyland and implemented
graphical interfaces to analyze captured supplier metrics data and enable the AL sourcing team
to make the right business decisions. The graphical interface included Pareto charts to analyze the
top 20% of supplier spends against overall supplier business spends, spend performance analysis to
compare spend vs. supplier performance, and linear trend analysis to understand the future trend
of supplier metrics based on past data.
To support this, the database was designed to hold business data in de-normalized
form for better performance, and the business logic to render the graphs was written in SQL
stored procedures. The supplier data available from the AL ERP database was parsed and
analyzed, and performance data were calculated. This involved writing a few SQL functions to
calculate the cumulative spend value and cumulative spend percentage for generation of Pareto
charts.
• Performed database design related to changes in the existing Portal.
• Performed PL/SQL coding in the application.
• Deployed the application and provided support during testing.
• Prepared unit and system integration testing documents and performed the testing.
Software Engineer, Fidelity Investments
Jan 2007 – Nov 2007
Portfolio Engine – Data Maintenance for Bloomberg
The Portfolio Engine (PE) was built to replace the rules-based Trade Review Engine (TRE). TRE was
limited to instruments contained in specific targets, whereas PE performs portfolio optimization
for its Private Portfolio Services (PPS) accounts based on the most appropriate selections. The SAI
data maintenance functionality was created to support the Portfolio Engine requirements.
• Analyzed the proposed web front-end screen design.
• Performed database design related to changes in the web front-end screen (using Java).
• Performed PL/SQL coding in the application.
• Deployed the application and provided support during testing.
• Prepared and performed unit and system integration testing.
Associate IT-Consultant, ITC Infotech
May 2005 – Sep 2006
Support Consultant for the Custom Data Warehouse Product V3 for British American Tobacco.
The aim of V3 was to deliver to BAT a maintainable, integrated, scalable Customer Relationship
Management (CRM) solution supporting best-practice processes in distribution, trade
marketing, and account management. V3 not only supports the field activities associated with
field sales representatives, it provides extensive support for the operational planning
process that underpins these activities. The V3 Common Functional Base provides the central
elements required by any market implementing V3, irrespective of whether it is a
trade-marketing- or distribution-focused market.
• Worked on the nightly refresh mechanism to populate data into the data warehouse.
• Worked on triggers and materialized views to refresh data.
• Involved in development as well as support.
• Worked on more than 5 release versions of the product.
• Fixed issues raised by the client within the stipulated SLA.
• Extensively used bulk collection in PL/SQL objects to improve performance.
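The materialized-view refresh pattern mentioned above might look like the following sketch. The base table, columns, and MV name are invented for illustration; the actual V3 objects are not documented here:

```sql
-- A materialized view log on the base table enables FAST (incremental) refresh;
-- COUNT(*) and COUNT(expr) are required for fast-refreshable aggregate MVs.
CREATE MATERIALIZED VIEW LOG ON sales_fact
  WITH ROWID, SEQUENCE (region_id, amount)
  INCLUDING NEW VALUES;

CREATE MATERIALIZED VIEW mv_region_sales
  REFRESH FAST ON DEMAND
AS
  SELECT region_id,
         SUM(amount)   AS total_amount,
         COUNT(amount) AS cnt_amount,
         COUNT(*)      AS cnt
  FROM   sales_fact
  GROUP  BY region_id;

-- Invoked from the nightly refresh job ('F' = fast/incremental refresh).
BEGIN
  DBMS_MVIEW.REFRESH('MV_REGION_SALES', method => 'F');
END;
/
```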
Trainee Programmer, Cherrysoft Technologies
May 2004 – Feb 2005
Developed an ERP package for SPIC Pharmaceuticals, Chennai.
Solara is an integrated materials management package addressing the needs of a manufacturing
process in a pharmaceutical company. All the major functions, from product requests to
material dispatch and from internal transactions to accounting, are taken care of in this package.
Seamless integration with the Internet is possible with this package.
• Involved in coding and support for the system.
• Prepared unit test cases and executed them for the system.
• Coordinated with the onsite team.