
Fabric Engineer

💰 $110,000 - $165,000

Data Engineering · Cloud & Infrastructure · Microsoft Azure

🎯 Role Definition

As a Fabric Engineer, you will be a cornerstone of our data transformation journey. You are a hands-on technical expert responsible for architecting, building, and optimizing end-to-end data solutions within the Microsoft Fabric ecosystem. Your work will directly empower business intelligence, advanced analytics, and data-driven decision-making across the entire organization. You will leverage the full potential of OneLake, Lakehouse/Warehouse architecture, and integrated toolsets to create a unified, scalable, and secure data platform. This role requires a blend of data engineering prowess, architectural vision, and a collaborative spirit to translate complex business needs into high-performance data products.

📈 Career Progression

Typical Career Path

Entry Point From:

  • Azure Data Engineer
  • BI Developer / Power BI Specialist
  • Data Analyst with strong technical skills

Advancement To:

  • Senior or Lead Fabric Engineer
  • Data Architect / Cloud Solutions Architect
  • Data Engineering Manager

Lateral Moves:

  • MLOps Engineer
  • Data Governance Specialist

Core Responsibilities

Primary Functions

  • Design, implement, and manage end-to-end data pipelines within Microsoft Fabric, from ingestion and transformation to serving and visualization.
  • Develop scalable and performant ETL/ELT processes using Fabric Data Factory, Dataflows Gen2, and Synapse Data Engineering notebooks (PySpark/Spark SQL).
  • Architect and maintain our unified data platform using the Lakehouse and Data Warehouse paradigms on top of OneLake.
  • Implement and enforce a medallion architecture (Bronze, Silver, Gold layers) to ensure data quality, governance, and progressive refinement of data assets.
  • Build and optimize data models within Fabric Warehouse and Power BI datasets, leveraging features such as Direct Lake mode to query OneLake data directly with low latency, without scheduled imports.
  • Collaborate with data analysts and business intelligence teams to define requirements and deliver high-quality, certified datasets for reporting and analytics.
  • Monitor, troubleshoot, and optimize the performance, reliability, and cost-effectiveness of data pipelines and workloads running in Fabric.
  • Implement robust data governance, security policies, and access control mechanisms within the Fabric environment to protect sensitive information.
  • Develop solutions for real-time data ingestion and analytics using Fabric Eventstream and KQL (Kusto Query Language).
  • Migrate existing data solutions from legacy platforms (e.g., on-premises SQL, Azure Synapse, other cloud data warehouses) to Microsoft Fabric.
  • Champion and implement CI/CD best practices for Fabric artifacts, using Azure DevOps and Git for version control, automated testing, and deployment.
  • Author and maintain comprehensive technical documentation, including data lineage maps, architectural diagrams, and development standards.
  • Partner with data scientists to operationalize machine learning models, integrating them into production data pipelines and applications within Fabric.
  • Manage and administer Microsoft Fabric workspaces, capacities, and tenant-level settings to ensure optimal configuration and resource utilization.
  • Establish and promote data engineering best practices, patterns, and standards across the technical teams to foster a culture of quality and consistency.
  • Conduct code reviews for peers to ensure adherence to standards, performance, and maintainability of all developed solutions.
  • Integrate a wide variety of internal and external data sources, including relational databases, APIs, streaming platforms, and flat files, into the OneLake foundation.
  • Automate data management tasks and infrastructure deployment using Infrastructure as Code (IaC) principles where applicable.
  • Evaluate and recommend new tools, technologies, and features within the rapidly evolving Microsoft Fabric and broader Azure ecosystem.
  • Provide expert-level support for production data issues, performing root cause analysis and implementing long-term solutions to prevent recurrence.
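To make the medallion responsibility above concrete, here is a minimal sketch of a bronze-to-silver promotion: deduplicating on a business key and dropping invalid rows. In Fabric this logic would typically run as PySpark over Delta tables in a Lakehouse notebook; plain Python is used here only to show the shape, and all record and field names (`order_id`, `ingested_at`) are illustrative.

```python
# Illustrative bronze -> silver promotion: drop invalid rows, then
# deduplicate on the business key, keeping the most recently ingested row.
# In Microsoft Fabric this would normally be PySpark over Lakehouse tables.

def promote_to_silver(bronze_rows):
    """Clean raw 'bronze' records into a refined 'silver' set.

    - drops rows missing the business key ('order_id', hypothetical)
    - deduplicates on that key, keeping the latest by 'ingested_at'
    """
    valid = [r for r in bronze_rows if r.get("order_id") is not None]
    latest = {}
    for row in valid:
        key = row["order_id"]
        if key not in latest or row["ingested_at"] > latest[key]["ingested_at"]:
            latest[key] = row
    return sorted(latest.values(), key=lambda r: r["order_id"])

bronze = [
    {"order_id": 1, "amount": 10.0, "ingested_at": "2024-01-01"},
    {"order_id": 1, "amount": 12.0, "ingested_at": "2024-01-02"},  # later duplicate
    {"order_id": None, "amount": 5.0, "ingested_at": "2024-01-01"},  # invalid row
    {"order_id": 2, "amount": 7.5, "ingested_at": "2024-01-01"},
]

silver = promote_to_silver(bronze)
print(silver)  # two rows: order 1 at its latest amount, and order 2
```

The same pattern extends to a gold layer, where the cleaned silver rows would be aggregated into business-facing dimensional models.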

Secondary Functions

  • Support ad-hoc data requests and exploratory data analysis.
  • Contribute to the organization's data strategy and roadmap.
  • Collaborate with business units to translate data needs into engineering requirements.
  • Participate in sprint planning and agile ceremonies within the data engineering team.

Required Skills & Competencies

Hard Skills (Technical)

  • Deep, hands-on expertise with the Microsoft Fabric platform, including Lakehouse, Warehouse, OneLake, Data Factory, and Power BI integration.
  • Strong proficiency in Python and PySpark for complex data transformation and processing in Fabric notebooks.
  • Advanced SQL skills for data manipulation, querying, and optimization across different engines (e.g., T-SQL, Spark SQL).
  • Proven experience with Azure data services such as Azure Data Lake Storage (ADLS Gen2), Azure Functions, and Azure DevOps.
  • Solid understanding of data modeling principles, including dimensional modeling (Kimball) and normalized forms for modern data warehousing.
  • Practical experience implementing and managing CI/CD pipelines for data solutions using Git and Azure DevOps (or similar tools).
  • Knowledge of data governance and security best practices within a cloud data platform.
  • Expertise in Power BI, including DAX, data modeling, and performance optimization, especially with Direct Lake mode.
  • Familiarity with Kusto Query Language (KQL) for analyzing real-time and log data.
  • Experience with migrating data workloads from on-premises or other cloud platforms to Azure.
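The "advanced SQL across different engines" expectation above often comes down to patterns like windowed deduplication. The sketch below uses SQLite from the Python standard library purely as a self-contained stand-in for Fabric's T-SQL and Spark SQL engines; the table and column names are hypothetical.

```python
# Stand-in demonstration of a common pattern a Fabric Engineer writes in
# T-SQL or Spark SQL: keep only the most recent row per key using
# ROW_NUMBER(). SQLite is used here only because it is self-contained.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INT, amount REAL, ingested_at TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 10.0, "2024-01-01"), (1, 12.0, "2024-01-02"), (2, 7.5, "2024-01-01")],
)

# Rank rows per order_id by recency, then keep rank 1.
rows = conn.execute("""
    SELECT order_id, amount
    FROM (
        SELECT order_id, amount,
               ROW_NUMBER() OVER (
                   PARTITION BY order_id ORDER BY ingested_at DESC
               ) AS rn
        FROM orders
    )
    WHERE rn = 1
    ORDER BY order_id
""").fetchall()
print(rows)  # each order_id once, with its latest amount
```

The identical query runs unchanged on T-SQL and Spark SQL, which is why window functions are a useful litmus test for cross-engine SQL fluency.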

Soft Skills

  • Exceptional analytical and problem-solving abilities, with a talent for deconstructing complex issues.
  • Strong communication and interpersonal skills, capable of explaining technical concepts to non-technical stakeholders.
  • A collaborative mindset with a proven ability to work effectively in a team-oriented, Agile environment.
  • Self-motivated and proactive, with a passion for continuous learning and staying current with emerging data technologies.
  • Excellent organizational and time-management skills, able to manage multiple projects and priorities simultaneously.

Education & Experience

Educational Background

Minimum Education:

  • Bachelor's Degree in a quantitative or technical field.

Preferred Education:

  • Master's Degree or relevant Microsoft certifications (e.g., DP-600, DP-203).

Relevant Fields of Study:

  • Computer Science
  • Information Systems
  • Engineering
  • Statistics

Experience Requirements

Typical Experience Range: 3-7 years in a data engineering, BI development, or related role.

Preferred: Demonstrated experience with large-scale data projects on the Azure platform, with at least 1-2 years of hands-on experience specifically with Microsoft Fabric components in a development or production environment.