Featured
Sanjeeva Reddy Nagar, Hyderabad, India - 500038.
Details verified of Ramu J
Identity
Education
Know how UrbanPro verifies Tutor details
Identity is verified based on matching the details uploaded by the Tutor with government databases.
Telugu
Tamil
Hindi
English
Dravidian University, 2009
Master of Computer Applications (M.C.A.)
Sanjeeva Reddy Nagar, Hyderabad, India - 500038
ID Verified
Phone Verified
Email Verified
Facebook Verified
Class Location
Online (video chat via Skype, Google Hangouts, etc.)
Student's Home
Tutor's Home
Years of Experience in Big Data Training
15
Big Data Technology
Hadoop
Teaching Experience in detail in Big Data Training
Overall, 14 years of experience in Big Data / BI, and a GCP-certified Google Cloud Professional Cloud Architect with 3 years of cloud experience. Highlights:
- Extensive experience in IT data-analytics projects; hands-on migration of on-premise ETLs to Google Cloud Platform (GCP) using cloud-native tools such as BigQuery, Dataflow, Data Fusion, Cloud Functions, Pub/Sub, Composer (Airflow) and Google Cloud Storage.
- Good experience in data migration from on-premise MS SQL Server to Snowflake on Azure; completed the Azure DP-900 certification.
- Knowledge of ETL and data-integration tools such as Informatica, Ab Initio and IBM DataStage; experience with a data-validation automation tool and other test-automation tools.
- Business Intelligence testing of reports built with Tableau, Power BI and Cognos.
- Management tools including Azure DevOps, Jira, ALM and VSTS; experienced in preparing test strategies, test plans and test estimates; worked in both Agile and Waterfall models.
- Expertise in analyzing and reviewing business, functional and high-level technical requirements, and in designing detailed technical components for complex applications using high-level architecture, design patterns and reusable code.
- Design and architecture of Big Data platforms and cloud technologies: building secure infrastructures for multi-site data centers and protecting large volumes of data; good experience with multi-cluster architectures (on-premises to cloud, and cloud to cloud).
- Strong experience delivering Big Data projects with open-source technologies such as Hadoop, PySpark, Sqoop, Hive, HBase, Kafka, Oozie, BigQuery and GCS.
- Infrastructure domains: public Google Cloud Platform, and operating systems including UNIX and Windows. Extensive experience implementing DevOps methodologies on cloud platforms, including hands-on design and creation of CI/CD pipelines with Jenkins, Git and GitHub.
- Project-management skills covering initiating, planning, executing, monitoring and controlling.
- Familiar with data architecture: data-ingestion pipeline design, Hadoop information architecture, data modeling and data mining, machine learning and advanced data processing; experienced in optimizing ETL workflows.
- Good experience with Hadoop data-warehousing tools such as Hive, including extracting data from the cluster via the PySpark JDBC API.
- Skilled in writing code for intermediate-to-complex modules following development standards, and in planning and conducting code reviews that ensure standards compliance and systems interoperability.
- Hands-on experience with Job Tracker, Task Tracker, NameNode, DataNode, Resource Manager, Node Manager, Application Master, YARN and MapReduce concepts.
- Managing the Hive data warehouse: creating tables, distributing data through partitioning and bucketing, and writing and optimizing HiveQL queries.
- Extensive expertise in extracting and loading data to Oracle, MS SQL Server, Teradata, flat files and XML files using Talend.
- Developing XSD and XSLT, and preparing XSD-compliant XML files to parse XML data into flat files for processing in HDFS.
- Good experience working with SerDes and file formats such as Avro and Parquet.
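The profile above mentions partitioning data by a column (as in Hive) and extract/load ETL work. As a minimal, hypothetical sketch of that partition-by-column idea, using only plain Python and sqlite3 as stand-ins for Hive/Spark (none of the names below come from the tutor's actual projects):

```python
import csv
import io
import sqlite3

# Hypothetical sketch: extract rows from a CSV source, then load them into
# one table per partition value, mimicking Hive-style partitioning by a column.
# sqlite3 stands in for the warehouse; the data is invented for illustration.

SOURCE_CSV = """order_id,country,amount
1,IN,250
2,US,900
3,IN,120
"""

def extract(csv_text):
    """Parse CSV text into a list of row dicts (the 'extract' step)."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def load_partitioned(rows, conn, partition_col="country"):
    """Load each partition's rows into its own table, e.g. orders_IN, orders_US."""
    created = set()
    for row in rows:
        table = f"orders_{row[partition_col]}"
        if table not in created:
            conn.execute(f"CREATE TABLE {table} (order_id INTEGER, amount REAL)")
            created.add(table)
        conn.execute(
            f"INSERT INTO {table} VALUES (?, ?)",
            (int(row["order_id"]), float(row["amount"])),
        )
    return sorted(created)

conn = sqlite3.connect(":memory:")
tables = load_partitioned(extract(SOURCE_CSV), conn)
print(tables)  # one partition table per country value
```

In Hive the same effect comes from `PARTITIONED BY (country STRING)` on a single table, so queries filtering on the partition column only scan the matching partition's files.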
Class Location
Online (video chat via Skype, Google Hangouts, etc.)
Student's Home
Tutor's Home
Years of Experience in Google Cloud Platform
15
Teaching Experience in detail in Google Cloud Platform
Same overall experience summary as given under Big Data Training above.
Class Location
Online (video chat via Skype, Google Hangouts, etc.)
Student's Home
Tutor's Home
Years of Experience in Python Training classes
15
Course Duration provided
1-3 months
Seeker background catered to
Educational Institution, Individual, Corporate company
Certification provided
Yes
Python applications taught
PySpark
Teaching Experience in detail in Python Training classes
Same overall experience summary as given under Big Data Training above.
Class Location
Online (video chat via Skype, Google Hangouts, etc.)
Student's Home
Tutor's Home
Years of Experience in Microsoft Azure Training
15
Azure Certification offered
Azure Certified Data Engineer
Teaching Experience in detail in Microsoft Azure Training
Same overall experience summary as given under Big Data Training above.
4.4 out of 5 (5 reviews)
Shivakumar
"As I have been taking the course for the last one and a half months, I have learned so many new things, like Hive, Pig and Sqoop. I got an idea of all these things, and he will be teaching us Oozie and Spark as well. It was a very good experience. He has explained everything very clearly and cleared all doubts regularly."
Gayathri K S
"Good trainer for beginners: he tries hard for the students, clarifies doubts then and there, and his classes are easy to follow."
Sayyed Iqbal Faheem
"The training was good. I feel there should be a two-day revision so that we get to know all the things."
Rajesh
"No one can teach Big Data concepts like Ramu Sir. He is excellent. I attended many training institutes to learn Hadoop, but I was satisfied only with Ramu Sir's teaching. Ramu has great patience: if we don't understand a topic, he gives very good examples to make us understand. If anyone wants to learn Hadoop, I would confidently say attend Ramu Sir's classes without a second opinion."
1. Which classes do you teach?
I teach Big Data, Google Cloud Platform, Microsoft Azure Training and Python Training Classes.
2. Do you provide a demo class?
Yes, I provide a free demo class.
3. How many years of experience do you have?
I have been teaching for 15 years.
Answered on 05/01/2016 Learn IT Courses/Big Data
Certified
The Certified badge indicates that the Tutor has received a good amount of positive feedback from Students.