Responsibilities:
- Interact with stakeholders to understand and implement new requirements
- ETL package development & deployment for all new change requests
- Bring best practices to the table to enhance data warehouse efficiency
- Bug Fixing and Maintenance activities for existing ETL packages
- Promote the data warehouse to a Big Data environment
- Build processes and methodologies to implement new solutions
- Contribute to and help create a motivating work environment within the team
- Mentor other team members and promote their professional growth
Requirements:
- Bachelor’s degree in Computer Science, Statistics, or an equivalent field
- Minimum of 4 years of experience building ETL pipelines
- Experience with the Microsoft BI stack (SSIS, SSRS, SSAS); experience with other ETL tools such as Informatica and Talend is a plus
- Ability to write complex SQL queries, stored procedures, functions, and triggers
- Experience with optimization techniques such as adding indexes, caching, and lookups
- Experience with Big Data tools such as Hadoop, Hive, and Spark is a plus
- Experience working with all common data sources: databases, CSV, Excel, JSON, XML, APIs, and others
- Experience with command-line scripting for automating jobs and managing services
- Experience deploying workloads in production environments, troubleshooting defects, and improving performance
- Good communication skills and the ability to interact with customers