Job Title: Data Modeling Architect

Job Description

Requirements:
- 10+ years of experience implementing large data and analytics platforms.
- Data Platform: Experience working in data warehouse, data lake, and Spark lakehouse environments to support large-scale analytic applications.
- Data Modeling: Extensive experience designing star and snowflake schema models; proficient in logistics industry operations.
- ETL: Understanding of medallion architecture and ETL/ELT processes.
- SQL: Expert in SQL with advanced programming abilities, including Spark SQL.
- Reporting: Familiarity with Power BI, Tableau, and Cognos reporting.
- Programming Languages: Basic experience with scripting languages such as shell scripting, Python, or PySpark.

Responsibilities:
- Perform data mapping discovery across disparate source systems, including mainframe, Oracle, SQL, cloud, and others.
- Develop and implement a medallion data architecture in Microsoft Fabric to support analytics platform and data processing requirements.
- Ensure efficient data transformations; optimize data processing and ELT queries.
- Implement robust data validation, cleansing, and standardization to ensure data integrity and data quality.
- Contribute to data governance policies, procedures, and tools.
- Demonstrate attention to detail and strong problem-solving skills; work independently and collaborate with cross-functional stakeholders.

Minimum Skills and Qualification Requirements:
- A four-year educational qualification.
- Ability to work outside of normal business hours to complete implementations and resolve issues.

Working Conditions:
- Office environment.