I have worked in both administrative and management capacities for 5 years. I am highly proficient in Hadoop-related tools and familiar with the internet and its applications. I have strong communication skills, provide 24x7 support, and manage email and reporting. I have worked with many clients abroad and can converse with them fluently whenever required. I look forward to working with you!

Processing large data sets and assisting with hardware architecture.
• Capable of planning and estimating cluster capacity and creating roadmaps for Hadoop cluster deployment.
• Setting up, installing, configuring, maintaining and monitoring HDFS, YARN, Flume, Sqoop, Pig, Hive and Oozie.
• Installing, configuring and managing Hadoop distributions – Hortonworks (for POC).
• Performing cluster tuning, cluster monitoring and troubleshooting.
• Reporting failed or stuck jobs.
• Troubleshooting and debugging Hadoop ecosystem runtime issues.
• Experience in commissioning and decommissioning nodes, Trash configuration and node balancing.
• User management.
• Evaluating Hadoop infrastructure requirements and designing/deploying solutions (high availability, big data clusters, etc.).
• High availability for NameNode and YARN.
• Backups, snapshots and recovery from node failure.
• Installing the various components and daemons of the Hadoop ecosystem.
• Importing/exporting data to/from RDBMS.
• Running SQL queries on Hive as per requirement.
• Loading data from S3 to HDFS and vice versa; assisting in the creation of data pipelines.
• Performing distributed copy (DistCp) between clusters.
• Creating Amazon EMR clusters using the AWS Management Console.
• Configuring XML files such as core-site.xml, hdfs-site.xml, mapred-site.xml and yarn-site.xml.
• Windows OS, AWS Cloud and Linux OS.
• Hadoop security.

Specialist in deploying Hadoop in the cloud (e.g. AWS and Google Cloud).
• Experienced in Cloudera installation, configuration and deployment on Ubuntu Server and CentOS Server.
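The XML configuration work listed above can be illustrated with a minimal hdfs-site.xml fragment; the property values below (replication factor, NameNode metadata path) are illustrative assumptions rather than settings from any particular cluster:

```xml
<!-- hdfs-site.xml: a minimal illustrative sketch; paths and values are assumptions -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value><!-- default block replication factor -->
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/data/hadoop/namenode</value><!-- illustrative local directory for NameNode metadata -->
  </property>
</configuration>
```

Equivalent properties in core-site.xml (e.g. fs.defaultFS), mapred-site.xml and yarn-site.xml are configured in the same key/value style.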
• Experienced in cluster planning, architecting with the team, installation, configuration and deployment.
• My role involves Hadoop cluster administration, cluster monitoring, metadata backup, ecosystem monitoring, YARN job monitoring and administration.
• I load data into the cluster from dynamically generated files using Flume, from RDBMS using Sqoop, and from the local file system.
• Followed best practices for preparing and maintaining Apache Hadoop in production.
• Troubleshooting, diagnosing, tuning and resolving Hadoop issues.
• Worked with the internals of MapReduce and HDFS, and took part in building the Hadoop architecture.
• Played a crucial role, and provided necessary support, in creating POCs for Hadoop deployment decisions.
• Configured security for the Hadoop cluster: Kerberos, ACLs, SSL and Sentry.
• Managing Hadoop services by selecting the appropriate repository.
• Working experience with AWS – EC2, S3 and EMR.
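The data-loading and cross-cluster copy tasks above can be sketched with the standard Sqoop and DistCp command lines; every hostname, database, bucket and path below is a placeholder assumption, and the commands require a running cluster:

```shell
# Import an RDBMS table into HDFS with Sqoop (hostnames/paths are placeholders)
sqoop import \
  --connect jdbc:mysql://db-host/salesdb \
  --username etl_user --password-file /user/etl/.db-password \
  --table orders \
  --target-dir /data/orders \
  -m 4

# Distributed copy between two clusters (DistCp)
hadoop distcp hdfs://cluster-a:8020/data/orders hdfs://cluster-b:8020/data/orders

# Load data from S3 into HDFS (and reverse the arguments for the other direction)
hadoop distcp s3a://example-bucket/orders hdfs:///data/orders
```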