CA EmploymentAlert | Sr. Data Engineer
Job Title : Sr. Data Engineer


Company : GalaxE.Solutions


Location : Toronto, Ontario


Created : 2025-01-03


Job Type : Full Time


Job Description

What you will do:

- Implement methods to improve data reliability and quality.
- Deploy programmable, easily customizable infrastructure that embraces change and automates the deployment process to reduce risk.
- Analyze all platform-level changes, monitor application impact, and provide appropriate technical solutions to resolve all issues quickly and efficiently.
- Actively manage and escalate risk and customer-impacting issues to management as part of the day-to-day role.
- Achieve product commitments (and influence others to do the same) through informal leadership and highly developed communication skills; contribute to or lead technology communities.
- Use automation, system tools, open-source solutions, observability, and 'security first' principles in daily work.
- Contribute to team agile ceremonies, lead demos and presentations, and help new engineers learn established norms.
- Initiate high-level solution design approaches and guide the team to achieve key software delivery capabilities through automation, coded enterprise infrastructure, and observability.
- Participate in internal speaking and advocacy events.
- Support research activities to adopt new technology solutions and develop new capabilities.
- Oversee and work independently on a variety of complex initiatives end-to-end, primarily short- to medium-term issues (6-12 months) requiring specialist knowledge and/or the integration of cross-functional processes within your own area of expertise.
- Provide thought leadership and/or industry knowledge specific to (but not limited to) your own area of expertise.
- Provide guidance to and influence multiple teams across a broad spectrum of initiatives and functional domains.

Required:

- 5+ years of relevant experience (HDFS/Spark/YARN/Dremio/Kafka/Azure PaaS).
- Knowledge of at least one programming language (e.g., Java, Python, R).
- Knowledge of Databricks (Spark with Scala or PySpark).
- Hands-on experience with ETL/ELT processes and SQL database design.
- Proven ability to design, develop, and secure data pipelines.
- Strong numerical and analytical skills.
- Experience with DevOps and CI/CD pipelines.
- Experience in IaaS data platform and capability enablement: compute via Spark; storage via HDFS and ADLS; orchestration via Airflow, Dremio, etc.
- Experience in streaming data ingestion and processing using Kafka.
- Experience with PaaS offerings from Azure/GCP (ADF, Databricks, etc.).
- Familiarity with data catalog and metadata management tools (e.g., Unity Catalog, Collibra).
- Knowledge of data modeling, data quality, and governance practices.
- Ability to lead complex projects and manage competing priorities with a high level of technical acumen and strong communication skills.
- Familiarity with data access control (RBAC/ABAC).

Education

Undergraduate Degree or Technical Certificate.