Full time | Auckland


This is a key role supporting our fast-growing consultancy capability, created to deliver outstanding solutions and transformational change in some of New Zealand’s top organisations. You’ll work in multi-disciplinary environments, wrangling data through thorough investigation, innovative design and efficient builds of BI solutions that deliver real impact, enabling our clients to make informed changes to their businesses and grow through enhanced performance.

Team Structure

Joining a team of highly talented data specialists from a range of backgrounds and experience levels, you will form part of an energised, autonomous team in growth mode, with the freedom to ‘think outside the box’.

The team are all certified in the key technologies we support, and we provide ongoing training and the opportunity to grow and mature across a range of technologies, capabilities and industries.

Role Responsibilities

  • Enable the business to use data to inform day-to-day decisions, leveraging key technologies such as Snowflake, dbt and Tableau, alongside other database technologies.
  • Build the resilient, secure, high-performance infrastructure required for optimal ETL/ELT processes from a wide variety of data sources, using the latest technologies.
  • Acquire, ingest and process complex data sets from multiple sources and systems, and assemble them into Big Data platforms.
  • Identify, design and implement internal process improvements: automating manual processes, optimising data delivery, and redesigning infrastructure for greater scalability.
  • Contribute to building analytics tools that utilize the data to provide actionable insights into the planning system performance.
  • Work with stakeholders to understand, assess and model their data landscape; obtain data extracts and define secure data exchange approaches; resolve data-related technical issues; and support their data infrastructure needs.
  • Maintain our Information Security standards on each client engagement.
  • Collaborate with our data scientists to map data fields to hypotheses and curate, wrangle, and prepare data for use in their advanced analytical models.
  • Define the technology stack to be provided by our infrastructure team.
  • Build modular pipelines to construct features and modelling tables.


Experience

You will have experience with:

  • ETL/ELT toolsets such as SSIS, DataStage, SAS DI Studio, Informatica, Talend or SAP Data Services, ELT tools like Matillion and Fivetran, and governance tools like Collibra, used to load data lakes/warehouses – the willingness and ability to learn new tools is a must.
  • The design and implementation of reusable, metadata-driven ingestion patterns to populate production-strength platforms.
  • Developing jobs that load complex dimensional models (Data Vault experience is a bonus).
  • Designing and building complex and robust self-healing workflows and schedules.
  • Shell scripting in a Linux environment.
  • Using relational databases such as SQL Server, Snowflake, Oracle, DB2, MySQL or PostgreSQL, writing advanced SQL, and a strong understanding of software development in languages such as Python, Scala, Java, T-SQL or PL/SQL.
  • Connecting to web resources or APIs using REST, SOAP or OAuth.
  • DevOps experience with AWS Lambda or similar skill sets.
  • Working with data in other forms, such as XML, JSON, or CSV.
  • Cloud-based or cloud-native data technologies on Azure or AWS and a good understanding of modern authentication options.
  • Building enterprise data analytics platforms, data warehouses and data lakes, and creating data pipelines.
  • Big Data technologies and our partner ecosystem software: DataRobot, Tableau, Alteryx, Snowflake, Collibra, ThoughtSpot, Fivetran and dbt.
  • Building end-to-end data pipelines using on-premises or cloud-based data platforms.
  • Exposure to the main players in Cloud and cloud-native data warehouses (AWS/Redshift, GCP/BigQuery, Azure, Snowflake).
  • Integration/middleware, including APIs, enterprise service buses, MuleSoft and TIBCO.
  • Both agile and traditional software delivery life cycles.

Skills & Attributes

  • Excellent written and oral communication skills, with an ability to convey complex ideas clearly and concisely to non-technical business clients.
  • The ability to translate functional requirements into technical specifications.
  • Skills to develop new systems and provide end-user training.
  • High attention to detail and the ability to analyse data.
  • Complex problem-solving skills, combining technology with the ability to work alongside business subject-matter experts in a confident, professional manner.
  • Ability to multi-task in a fast-paced agile environment working to tight deadlines.
  • Strong customer service skills to both internal and external clients.

Selection Criteria

We are looking for a natural problem solver: a client-focused individual with a growth mindset. You must be curious, ambitious and on track to become an expert in your field. You must enjoy a collaborative approach to work and love learning new technologies.

Only people with the right to live and work in New Zealand may apply for this role. For this position, you must be available for an interview in Auckland.


If you have the required skills and experience, feel you’d be a good fit with our team and would like more information on this role, please send your resume to careers@firn.nz