Offering around 6.5 years of IT experience. Strong acumen in designing and developing frameworks for Auditing, SCD2, Cleanup, Archival, and Purging. Skilled in Data Architecture and ETL. Adept at designing data pipelines using Delta Lake and optimizing data flows. Proficient in Java, PySpark, Talend, Hive, and Sqoop, with knowledge of the Hadoop ecosystem.