- Build data pipeline frameworks to automate high-volume and real-time data delivery for Hadoop and a streaming data hub
- Transform complex analytical models into scalable, production-ready solutions
- Develop applications from the ground up using a modern technology stack such as Scala, Spark, NoSQL, and Postgres (see the illustrative sketch after this list)
- Strong knowledge of HDFS, Hive, Pig, Sqoop, Oozie, ZooKeeper, YARN, and Impala
- Worked on multiple projects involving traditional ETL technologies such as Informatica and SSIS (SQL Server Integration Services)
- Delivered data insights using BI tools such as SSRS (SQL Server Reporting Services) and Power BI
- Build robust systems with an eye toward long-term maintenance and support of the application
- Leverage reusable code modules to solve problems across the team and organization
- Utilize a working knowledge of multiple development languages
- Drive cross-team design and development through technical leadership and mentoring
- Understand complex multi-tier, multi-platform systems
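Below is a minimal, hypothetical sketch of the kind of Spark/Scala batch pipeline work described above; the input path, schema fields, and output table name are illustrative placeholders rather than details from any specific project.

```scala
// Illustrative sketch only: a small Spark batch job in Scala that reads raw
// events from HDFS, aggregates them, and publishes a Hive table for BI tools.
// All paths, column names, and table names below are hypothetical placeholders.
import org.apache.spark.sql.{SparkSession, functions => F}

object DailyEventRollup {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-event-rollup")
      .enableHiveSupport() // allows reading and writing Hive-managed tables
      .getOrCreate()

    // Read raw JSON events landed on HDFS by an upstream ingestion job
    val raw = spark.read.json("hdfs:///data/raw/events/dt=2024-01-01")

    // Aggregate event counts per user and event type
    val rollup = raw
      .groupBy(F.col("user_id"), F.col("event_type"))
      .agg(F.count(F.lit(1)).as("event_count"))

    // Persist the result as a Parquet-backed Hive table for downstream reporting
    rollup.write
      .mode("overwrite")
      .format("parquet")
      .saveAsTable("analytics.daily_event_rollup")

    spark.stop()
  }
}
```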