Data Engineer

As a Data Engineer with 2 years of experience, you will design, implement, and manage data solutions using Snowflake, Microsoft Azure, AWS, Databricks, dbt, and more. Your role involves creating and maintaining data pipelines, storage solutions, data processing, and data integration to enable data-driven decision-making within the organisation.

What You Will Do:

  1. Deploy and optimise ETL/ELT workflows to efficiently extract, transform, and load data from multiple sources into data warehouses and analytics platforms.
  2. Deploy and manage data lakes, data warehouses, and data marts, ensuring structured and unstructured data is efficiently stored, processed, and accessible.
  3. Work with cloud platforms such as AWS, Azure, or GCP, leveraging tools such as Snowflake, Databricks, Redshift, and BigQuery to handle large-scale data processing.
  4. Design, maintain, and optimise SQL and NoSQL databases, ensuring high performance, scalability, and availability for analytics and operational use.
  5. Implement data validation, cleansing, and governance frameworks to maintain accuracy, consistency, security, and compliance with GDPR, HIPAA, and other regulations.
  6. Optimise query performance, indexing, partitioning, and caching while automating data workflows through CI/CD, Infrastructure as Code (IaC), and monitoring solutions.
  7. Work closely with the BI team to ensure data is well-structured and optimised for reporting and dashboards in Power BI, Tableau, Looker, and Qlik.
  8. Partner with Data Scientists, Analysts, and Business Teams to enable data-driven decision-making, advanced analytics, and self-service BI capabilities.
  9. Translate architectures and designs into engineered solutions that host proof-of-concepts, prototypes, and non-production and production workloads.
  10. Collaborate with designers and product managers to understand user requirements and translate them into technical solutions.
  11. Help define options papers for solutions and platforms.
  12. Fulfil all legislative and company policy requirements.
  13. Be actively involved in workplace health and safety.

About You:

Qualifications and Skills:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • 2+ years of experience in data engineering, ETL processes, and building data pipelines.
  • Strong knowledge of data modeling, database design, and data warehousing.
  • Hands-on experience with Azure Data Services (e.g., Azure Data Factory, Azure Synapse Analytics, Azure SQL Database, Azure Blob Storage).
  • Hands-on experience in working with Snowflake for data warehousing, data sharing, and cloud-based data architecture.
  • Experience with AWS Data Services (e.g., Redshift, S3, Glue) would be a plus.
  • Proficiency in building ETL/ELT pipelines using tools like AWS Glue, Azure Data Factory, and dbt for data transformation and integration.
  • Experience with CI/CD pipelines for automating data workflows.
  • Strong experience in automating data workflows and orchestrating data pipelines in cloud environments.
  • Experience implementing data governance strategies and security protocols in cloud environments, including encryption, access control, and auditing in Snowflake, AWS, and Azure.
  • Experience with real-time data streaming and processing using Azure Stream Analytics or Snowflake’s Snowpipe for automated data loading.
  • Proficiency in SQL, Python, Scala, or other programming languages for data processing.
  • Ability to integrate data from multiple sources, both structured and unstructured.
  • Strong understanding of data governance, data quality, and security best practices.
  • Strong ability to work with cross-functional teams, including engineers, analysts, and business leaders.
  • Excellent verbal and written communication skills to explain technical concepts clearly.
  • Passion for emerging technologies, data architecture advancements, and innovation in the field.
  • A pragmatic thinker who aligns with our values, avoiding over-engineering and reviewing the current state before proposing solutions.
  • Versatility in learning new technologies, staying technology-agnostic based on demand.
  • An always-learning attitude.
  • Growth mindset, positive attitude, and strong interest in solving client challenges, adapting to a changing work environment, and dealing with new issues.
  • Certifications from Snowflake, Databricks or Fabric would be a plus.
  • Must be based in New Zealand.

 

Here’s a Taste of What’s on Offer at dataengine:

  • Opportunity to solve real-world business problems.
  • Collaborative and supportive work environment.
  • Competitive salary.
  • Be at the forefront of data science innovation in a dynamic startup environment.
  • Work on a variety of challenging and impactful projects across different industries.
  • Opportunity to learn and grow your skillset with the latest technologies.
  • Continuous learning and development opportunities.
  • Weekly fresh fruit and snacks, monthly industry-led lunches, and active staff activities.
  • Be part of a company that is shaping the future of data-driven solutions.


If you are a passionate and talented data geek who is excited about making a real impact, we encourage you to apply!


About dataengine:

dataengine, a New Zealand-based leader in data solutions, empowers businesses to unlock the hidden value within their data and achieve true transformation. We connect people, processes, and technology for optimised efficiency and sustainable impact. Founded in 2018, we’ve grown rapidly thanks to our exceptional team and commitment to delivering practical, impactful results. We believe in building the foundation right, starting with strategic planning and robust architecture.

What sets us apart?

  • Experienced & Pragmatic: Our team boasts extensive expertise, focusing on tangible outcomes, not just presentations.
  • Small Wins, Big Impact: We deliver incremental value drops, ensuring our customers see results quickly and continuously.
  • Holistic Enablement: We develop with our customers in their environment, empowering our team to own and optimise solutions.


Leading the way in Generative AI:
We’ve developed a unique standalone AI engine, providing secure and accessible tools for our customers’ AI journey. From use case brainstorming to model deployment and tuning, we offer comprehensive support throughout the entire lifecycle.

Beyond AI: Our services range from advisory and resourcing to full-scale platform design and build, catering to any data-driven business need.

Our Values:

  • Nimble – We move fast but think big
  • Quality – We do it right the first time
  • Co-Creation – We work together to solve precise problems
  • We are All Ears – We listen and then do
  • Progressive Thinkers – We are creators and doers
  • Pragmatic – We don’t over-engineer