Job Description
In this role, you will be responsible for building out key components of data ingestion and transformation, using a combination of strong Python and SQL development skills as well as Big Data processing technologies. Responsibilities for this role include:
Implement data engineering best practices to build extraction, loading, and transformation logic that lands large datasets in the data warehouse
Independently learn new technologies, prototype, and propose software designs and solutions
Handle end-to-end development, including coding, testing, and debugging during each cycle
Develop automated tests for multiple scopes (Unit, System, Integration, Regression)
Mentor new and junior developers
Identify and recommend appropriate continuous improvement opportunities
Job Requirements for this role are as follows:
Knowledge of the data management ecosystem, including concepts such as ETL and data integration
At least 4 years of experience in application development
At least 3 years of experience in big data technologies
5+ years of experience in application development using Python, SQL, or Java
2+ years of experience with a public cloud (Microsoft Azure, AWS, Google Cloud)
2+ years of experience with Agile engineering practices
About the team: We are an analytics team building out pipelines that measure and analyze key internal components across Manulife.