Analytics Engineer - Audio 

Location: London-based, hybrid (typically 1–2 days per week in the office)

Reports to: Senior Analytics Engineer, Data Enablement (Data & Decision Sciences) 

About the Bauer Media Audio Data & Decision Sciences Team

 

The Data and Decision Sciences (DDS) team is at the core of Bauer Media Audio with a mission to leverage data as a strategic enabler across nine European markets. DDS provides trusted, actionable insights and robust data solutions that empower business growth, enhance audience engagement, and drive operational efficiency. 

 

The team operates as a collaborative, cross-functional unit that bridges the gap between data and business strategy, combining central capabilities with local expertise across markets. DDS treats data as an integral business partner rather than a support function, working closely with stakeholders to deliver impactful outcomes across commercial revenue, digital audiences, and consumer competitions. 

 

Role Summary 

 

As an Analytics Engineer in the Data Enablement team, you will turn raw and curated warehouse layers into well-modelled, business-ready data marts and a semantic layer across Bauer Media Audio’s three pillars: consumer competitions, digital audiences, and commercial revenue. You will focus on Kimball-style dimensional modelling, dbt-based transformations, and semantic layer definitions that enable reliable, consistent reporting and analytics in tools such as Looker, Power BI, and Tableau. 

 

This is a hands-on engineering role that is also business-aware: you will work closely with product, data science, and business stakeholders to understand how data will be used, while collaborating day to day with Data Engineers, DevOps, and other Analytics Engineers to modernise and standardise the data estate on Snowflake. 

 

Core responsibilities 

 

Data modelling and transformation 

  • Design, develop, and optimise dimensional data models and marts in dbt, following Kimball principles and good warehouse design practices. 

  • Build and maintain dbt pipelines that transform data from raw and curated layers into gold-layer marts that are easy to use, performant, and well-tested. 

  • Work with Data Engineers to understand and shape upstream data structures, ensuring the hand-off from ingestion to modelling is clear, efficient, and well-documented. 

  • Implement and maintain data quality checks in dbt (for example, tests and dbt-expectations/Great Expectations) to ensure models are accurate, reliable, and trustworthy for downstream users. 

 

Semantic layer and BI consumption 

  • Help define and maintain a semantic layer that provides consistent metrics, dimensions, and business logic across Looker, Power BI, Tableau, and other analytics tools. 

  • Translate technical schemas into clear, reusable business concepts (for example, customer, campaign, order, session) with well-defined metrics and calculations. 

  • Maintain and evolve documentation such as data dictionaries, ERDs, and semantic definitions so that analysts, data scientists, and business teams can self-serve with confidence. 

 

Modernisation and migration 

  • Contribute to the migration and modernisation of existing Redshift and BigQuery warehouses into Snowflake-based marts and models, including refactoring legacy stored-procedure-based logic into dbt. 

  • Work on short- to medium-term projects (for example, over 6–9 months) to repoint dbt projects to Snowflake and align legacy stacks with new modelling and semantic standards.

  • Collaborate with Data Engineers and DevOps to ensure modernised workloads are observable, cost-aware, and aligned with platform standards. 

 

Quality, observability, and operations 

  • Contribute to a documentation-first culture, keeping models, tests, decisions, and semantic definitions up to date in Confluence and repositories. 

  • Use and extend data quality and observability tooling, including dbt tests, Great Expectations/dbt-expectations, and MONACO (DDS’s internal observability platform) to monitor model health, SLAs, and data issues. 

  • Participate in day-to-day operations of the analytics data stack, including monitoring, incident follow-up, and iterative improvements to pipelines and models. 

 

Collaboration and ways of working 

  • Work as part of cross-functional squads that include Data Engineers, DevOps, Data Scientists, Product Managers, and business stakeholders across the three pillars. 

  • Take part fully in agile ceremonies (for example, sprint planning, stand-ups, retrospectives), managing work through Jira and keeping documentation current in Confluence. 

  • Use GitHub effectively in a collaborative engineering environment, including branching, pull requests, and participating actively in code reviews. 

  • Support and, where appropriate, mentor more junior team members, sharing good practices around dbt, modelling, testing, and semantic layer design. 

 

Requirements 

 

Technical must-haves

  • Strong SQL skills with hands-on experience in at least one of Redshift or BigQuery (ideally both), and practical experience working with Snowflake or another modern cloud data warehouse. 

  • Proven experience building and maintaining dbt projects, including models, tests, and environments for production workloads. 

  • Solid understanding of dimensional modelling and Kimball-style data marts, including facts, dimensions, and slowly changing dimensions. 

  • Experience working with or designing semantic layers or metric layers, and an ability to express business logic cleanly in data models. 

  • Practical experience with Python for data and analytics engineering tasks (for example, tooling, small utilities, or integration work). 

  • Experience working in agile teams, using tools such as Jira for ticketing and Confluence for documentation. 

  • Proficiency with Git-based workflows (GitHub branches, pull requests, and code reviews) in a collaborative engineering environment. 

  • Experience working in a cloud environment, ideally AWS, and familiarity with common data and analytics services. 

  • Comfort working with BI and analytics tools such as Looker and Power BI, including understanding how model and semantic changes affect dashboards and reports. 

 

Technical nice-to-haves

  • Experience with Snowflake as a primary cloud data warehouse, beyond initial exposure. 

  • Exposure to Airflow for orchestrating data and analytics workflows. 

  • Experience with Great Expectations and/or dbt-expectations for data quality. 

  • Familiarity with Tableau and an interest in supporting a multi-tool BI environment. 

  • Awareness of semantic layer tooling and approaches (for example, dbt Semantic Layer or similar technologies). 

  • Exposure to infrastructure-as-code concepts and tools (for example, Terraform), and CI/CD practices for data and analytics projects. 

  • Experience in media, digital audiences, marketing/CRM, or subscription businesses is a plus but not a requirement. 

 

Behavioural / ways-of-working must-haves

  • Clear, confident communicator who can work with product, data, and business stakeholders to understand how data will be used and reflect that in models and metrics. 

  • Demonstrated ownership of outcomes, not just tasks—seeing models and pipelines through from design to stable, reliable operation in production. 

  • Strong documentation-first mindset, ensuring models, tests, semantics, and decisions are captured and maintained so others can understand and build on the work. 

  • Collaborative team member who participates actively in sprints, estimates work realistically, and contributes thoughtfully in code reviews and design discussions. 

  • Comfortable working across pillars and functions (Data Enablement, DDS, product, and business teams), rather than in a narrow, ticket-only silo. 

 

Personal and behavioural traits 

  • Communicative and engaging: explains technical and modelling topics clearly, asks good questions, and is comfortable talking with both technical teammates and business stakeholders. 

  • Outcome-focused: cares about the impact of models and marts on decision-making and user experience, not just completing stories. 

  • Practically curious: looks for better ways to model, test, and document data, and brings concrete improvement ideas into the team. 

  • Quality-minded: pays close attention to detail in code, tests, metrics, and documentation, treating data issues as something to be prevented as well as fixed. 

  • Well-organised: manages work effectively within sprints, keeps tickets and documentation up to date, and follows through on commitments. 

  • Collaborative and supportive: contributes to a positive team culture, participates constructively in code reviews, and is willing to support and mentor others. 

  • Adaptable: comfortable working in a changing environment, learning new tools (such as Snowflake, semantic layer approaches, and MONACO), and adjusting to evolving priorities. 

 

About Bauer Media Group


We are a media business focused on creating content that matters to millions of people across Europe. Our offering extends from print and online publishing to audio broadcasting and entertainment, alongside investments in other media-related sectors. With more than 500 million copies sold each year, we are one of Europe's largest publishers. From women's and celebrity magazines to TV listings to food and special-interest titles, we own some of the most popular publishing brands in Germany, the UK, Poland and France – both digital and print. But that's not all: reaching over 61 million listeners weekly, we also operate over 150 radio and podcast brands in nine countries, spanning the UK, Ireland, Poland, Slovakia, Denmark, Sweden, Finland, Norway and Portugal. Family-owned in the fifth generation, Bauer Media focuses on the long term, with a consumer-first mindset that guides us across our diverse portfolio. Our workforce of 12,000 shares a common purpose: to deliver content and services that enrich people's everyday lives.

 

What’s in it for you

  • You’ll have 28 days’ holiday, bank holidays, and 2 volunteer days to use.
  • Your development matters, so access to our internal training provider, Bauer Academy, is a huge win.
  • We have enhanced Maternity/Adoption, Paternity and Shared Parental Leave Pay.
  • You’ll have the opportunity for flexible working.
  • And much more! Find the full details of our benefits here.

 

We are an international employer and equal opportunities are important to us. That's why we welcome everyone in their uniqueness, regardless of, for example, religion, gender, skin colour, or disability.

 

We are committed to ensuring our recruitment process is inclusive and accessible to all. If you have a disability or a long-term health condition and need us to make any reasonable adjustments or do anything differently during any stage of the recruitment process, please let us know by emailing careershub@bauermedia.co.uk.

 

We are actively recruiting for this position, so the job advert may close earlier than expected.

 

If you have any feedback regarding our UK recruitment process, please email careershub@bauermedia.co.uk – we would love to hear from you.

Date Job Posting Last Updated: 23 Dec 2025

Location: London, GB, NW1 2PL

Req ID: 3322