Key Responsibilities and Required Skills for a Facts Engineer
💰 $120,000 - $180,000
🎯 Role Definition
A Facts Engineer is the architect of our data's meaning and reliability. This role sits at the critical intersection of data engineering, data analysis, and business intelligence. You are not just building pipelines; you are curating the certified, canonical datasets—the "facts"—that the entire company uses for reporting, analysis, and strategic decision-making.

Your primary mission is to create a robust, well-documented, and highly performant semantic layer. This ensures that when two people in different departments ask the same question (e.g., "What is our monthly active user count?"), they get the same answer. You are a steward of data quality, a champion of self-service analytics, and a key partner to business leaders, helping them trust and leverage their data to its fullest potential.
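The "same question, same answer" goal can be sketched as a single certified metric definition that every consumer imports, rather than each team re-deriving the metric in its own query. A minimal Python illustration (the `events` table shape and column names are invented for this example):

```python
import pandas as pd

def monthly_active_users(events: pd.DataFrame) -> pd.Series:
    """Certified MAU definition: distinct user_ids with at least one
    event in a calendar month. Reports import this one function
    instead of each rewriting the logic."""
    return (
        events
        .assign(month=events["event_ts"].dt.to_period("M"))
        .groupby("month")["user_id"]
        .nunique()
    )

# Two January events from one user still count as one active user.
events = pd.DataFrame({
    "user_id": ["a", "a", "b"],
    "event_ts": pd.to_datetime(["2024-01-03", "2024-01-20", "2024-02-11"]),
})
mau = monthly_active_users(events)
```

In practice this definition would live in the semantic layer (e.g., as a dbt metric) rather than in application code, but the principle is the same: one canonical definition, many consumers.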
📈 Career Progression
Typical Career Path
Entry Point From:
- Data Analyst
- Business Intelligence (BI) Developer
- Software Engineer (with a data focus)
Advancement To:
- Senior / Staff Facts Engineer
- Data Architect
- Manager, Analytics Engineering or Data Engineering
Lateral Moves:
- Data Scientist
- Data Product Manager
- Data Governance Specialist
Core Responsibilities
Primary Functions
- Design, implement, and own the company's central semantic layer, creating a definitive single source of truth for all critical business metrics and dimensions.
- Develop and maintain complex, scalable data models within our cloud data warehouse (e.g., Snowflake, BigQuery) optimized for analytical querying and BI tool performance.
- Write, test, and deploy high-quality, production-grade SQL and Python code for data transformation and modeling, primarily leveraging dbt (data build tool).
- Collaborate extensively with data analysts, data scientists, and business stakeholders to deeply understand their data requirements and translate them into robust, reliable data products.
- Establish and enforce data modeling best practices, governance processes, and documentation standards to ensure data quality, consistency, and trustworthiness across the organization.
- Own the entire lifecycle of key business metrics, from initial definition and logical modeling in partnership with business units to implementation in code and final certification.
- Architect and manage curated data marts and consumption-layer datasets tailored for specific business domains like marketing, finance, sales, and product analytics.
- Develop and maintain comprehensive data dictionaries, business glossaries, and lineage graphs, making data definitions accessible and understandable for both technical and non-technical users.
- Implement and manage automated data quality testing frameworks (e.g., dbt tests, Great Expectations) to proactively identify, alert on, and resolve data integrity issues.
- Optimize and refactor existing data models and SQL queries to improve performance, reduce data latency, and manage computational costs effectively.
- Act as a primary subject matter expert on the company's core data assets, guiding analysts and stakeholders on how to effectively query and interpret data.
- Integrate and model data from a diverse set of new and existing sources, including third-party APIs, streaming platforms, and operational databases, into the central warehouse.
- Champion a data-driven culture by empowering self-service analytics through well-structured, intuitive data models and seamless BI tool integration (e.g., Looker, Tableau, Power BI).
- Define and manage the ontology and taxonomy of our core business concepts, ensuring consistent naming conventions and logical relationships are maintained in the semantic layer.
- Troubleshoot and debug complex data issues across the stack, performing root cause analysis and implementing sustainable long-term solutions.
- Manage dependencies and orchestrate complex data transformation workflows using tools like Airflow, Dagster, or Prefect.
- Conduct rigorous peer code reviews to ensure adherence to coding standards, performance best practices, and overall code quality within the data team.
- Partner with the platform engineering team to influence the design of raw data ingestion patterns to better support downstream modeling and analytical use cases.
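As one concrete illustration of the automated data-quality testing mentioned above, dbt's generic `unique` and `not_null` tests can be approximated in plain Python. This is a simplified sketch; the table and column names (`fact_orders`, `order_id`, `customer_id`) are hypothetical:

```python
import pandas as pd

def check_unique(df: pd.DataFrame, column: str) -> list[str]:
    """dbt-style 'unique' test: flag duplicated key values."""
    dupes = df[column][df[column].duplicated()].unique()
    return [f"{column} duplicated: {v}" for v in dupes]

def check_not_null(df: pd.DataFrame, column: str) -> list[str]:
    """dbt-style 'not_null' test: flag missing values."""
    n_missing = int(df[column].isna().sum())
    return [f"{column} has {n_missing} null(s)"] if n_missing else []

# Hypothetical certified fact table with one integrity issue of each kind.
fact_orders = pd.DataFrame({
    "order_id": [1, 2, 2, 4],
    "customer_id": ["c1", None, "c3", "c4"],
})
failures = (
    check_unique(fact_orders, "order_id")
    + check_not_null(fact_orders, "customer_id")
)
```

In a production setup these checks would run automatically in CI or on a schedule (via dbt tests or Great Expectations), alerting the team before bad data reaches a dashboard.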
Secondary Functions
- Support ad-hoc data requests and exploratory data analysis to answer pressing business questions.
- Contribute to the organization's data strategy and technology roadmap by evaluating new tools and methodologies.
- Collaborate with business units to translate ambiguous data needs into concrete engineering requirements and project plans.
- Participate in sprint planning, retrospectives, and other agile ceremonies within the data engineering team.
- Create training materials and conduct workshops to onboard and educate users on the data platform and certified datasets.
- Monitor and manage the health and performance of our data transformation jobs and BI platform.
Required Skills & Competencies
Hard Skills (Technical)
- Expert-level proficiency in SQL and advanced data modeling techniques (e.g., dimensional modeling, Kimball/Inmon methodologies, Data Vault).
- Extensive hands-on experience with modern data stack tools, especially dbt (data build tool) for data transformation and modeling.
- Strong programming skills in Python, particularly with libraries like Pandas, for data manipulation and scripting.
- Deep experience working with a major cloud data warehouse such as Snowflake, Google BigQuery, or Amazon Redshift.
- Proven experience building or maintaining a semantic layer using tools like LookML, the dbt Semantic Layer, Cube.js, or AtScale.
- Proficiency with data orchestration and workflow management tools like Airflow, Dagster, or Prefect.
- Strong understanding of data governance principles, data quality frameworks, and metadata management.
- Expertise in version control systems (especially Git) and CI/CD best practices for data projects.
- Experience with business intelligence platforms like Looker, Tableau, or Power BI from a data modeling perspective.
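The dimensional-modeling skills listed above come down to separating facts (measurements) from dimensions (descriptive context) and joining them at query time. A toy star-schema sketch in Python, with invented table and column names, just to show the shape:

```python
import pandas as pd

# Dimension table: one row per date, carrying descriptive attributes.
dim_date = pd.DataFrame({
    "date_key": [20240101, 20240102],
    "month": ["2024-01", "2024-01"],
})

# Fact table: one row per order, referencing the dimension by key.
fact_orders = pd.DataFrame({
    "order_id": [1, 2, 3],
    "date_key": [20240101, 20240101, 20240102],
    "revenue": [100.0, 50.0, 75.0],
})

# Analytical query: revenue by month via a star-schema join.
revenue_by_month = (
    fact_orders
    .merge(dim_date, on="date_key")
    .groupby("month")["revenue"]
    .sum()
)
```

In the warehouse this same pattern is expressed in SQL (a fact table joined to conformed dimensions), which is what keeps BI-tool queries fast and metric definitions consistent.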
Soft Skills
- Exceptional communication and interpersonal skills, with a proven ability to explain complex technical concepts to non-technical stakeholders.
- A highly collaborative mindset with experience working effectively in agile, cross-functional teams.
- Strong analytical thinking and creative problem-solving abilities with meticulous attention to detail.
- Excellent business acumen and a strategic mindset, with the ability to connect data initiatives directly to business value and outcomes.
- A sense of ownership and accountability, with a proactive approach to improving data systems and processes.
Education & Experience
Educational Background
Minimum Education:
- Bachelor's Degree in a relevant quantitative or technical field.
Preferred Education:
- Master's Degree in a relevant quantitative or technical field.
Relevant Fields of Study:
- Computer Science
- Information Systems
- Statistics
- Engineering
- Economics or another quantitative field
Experience Requirements
Typical Experience Range:
- 3-7+ years of professional experience in data engineering, analytics engineering, or business intelligence development.
Preferred:
- Significant hands-on experience in a role primarily focused on data modeling, dbt, SQL, and building certified data solutions for analytics and self-service BI.
- Demonstrated experience designing and building a semantic layer or metrics store from the ground up.