Intern - Data Engineering

  • On-site
  • Colombo, Western Province, Sri Lanka
  • Engineering

Job description

Who are we?

Headquartered in Perth, Australia, with offices globally, including in Colombo, Sri Lanka, Qoria is an ASX-listed global leader in child digital safety technology and services. We are a purpose-driven business, operating under the ‘Linewize’ brand in North America and Asia Pacific, the ‘Smoothwall’ brand in the UK, the ‘Octopus BI’ brand in Sri Lanka, and the ‘Qoria’ brand in EMEA. Our solutions are utilised by schools, school districts, and parental communities to protect children from seeing harmful content online, identify children at risk based on their digital behaviours, and help teachers maintain focus and safe learning in the digital classroom. 30,000 schools and 7 million parents depend on our solutions to keep 25 million children safe in 180 countries around the world.

What’s the opportunity?

We are looking for a driven and curious Data Engineering Intern with strong SQL knowledge to join our growing data team. In this role, you will contribute to building and optimizing data pipelines, warehouses, and dashboards across cloud and on-prem environments. You will be exposed to cloud-native tools such as GCP BigQuery, Cloud Spanner, and Dataflow, and gain real-world experience across data engineering and analytics use cases.

This role is ideal for someone early in their career who wants to grow technically while supporting business-critical data initiatives.

Octopus BI, a part of Qoria, is committed to delivering cutting-edge data analytics and integration solutions that drive informed decision-making.

Job requirements

Key Responsibilities

  • Develop SQL queries and scripts for transforming, joining, and aggregating data (see the sketch after this list).

  • Assist in building and maintaining scalable ETL/ELT pipelines across cloud and on-prem sources.

  • Support development and optimization of cloud data warehouses and databases (e.g., BigQuery, Cloud Spanner, Bigtable).

  • Automate data processing pipelines using tools like Google Cloud Dataflow and Cloud Composer/Workflows.

  • Collaborate with team members to ensure data integrity, accuracy, and security during processing.

  • Perform exploratory data analysis (EDA) to identify trends, patterns, and anomalies.

  • Assist in building reports and dashboards using tools like Power BI or Looker.

  • Work with IT and DevOps to solve integration and access issues.

  • Participate in sprint planning, standups, code reviews, and QA processes.

  • Apply statistical and analytical techniques under supervision to support reporting use cases.

  • Keep up to date with modern data engineering tools and best practices.
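
To give a flavour of the day-to-day SQL work in this role, below is a minimal sketch of a transform-and-aggregate query run through the BigQuery Python client. The project, dataset, table, and column names (my-project.analytics.events, user_id, event_timestamp) are illustrative placeholders, not a real Qoria schema.

    from google.cloud import bigquery

    # Minimal sketch: aggregate daily event counts per user over the last week.
    # All project/table/column names below are hypothetical placeholders.
    client = bigquery.Client()

    sql = """
        SELECT
            user_id,
            DATE(event_timestamp) AS event_date,
            COUNT(*) AS events
        FROM `my-project.analytics.events`
        WHERE event_timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
        GROUP BY user_id, event_date
        ORDER BY event_date, user_id
    """

    # client.query() submits the job; .result() blocks until it completes
    # and returns an iterator of Row objects addressable by column name.
    for row in client.query(sql).result():
        print(row.user_id, row.event_date, row.events)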

Must-Have

  • A Bachelor's degree in Computer Science, Data Science, or a related technical field.

  • Strong SQL skills with some hands-on experience in writing optimized queries.

  • Understanding of data pipelines, ETL/ELT workflows, and data modeling concepts.

  • Basic programming knowledge in Python, Go, or Java.

Nice-to-Have

  • Familiarity with GCP services like BigQuery, Cloud Spanner, and Dataflow.

  • Exposure to data validation, data quality frameworks, or monitoring systems.

  • Awareness of statistical methods and their use in reporting and dashboards.

  • Familiarity (academic or internship) with at least one BI tool (e.g., Power BI, Looker, Tableau).

  • Experience handling structured and semi-structured data (CSV, JSON, Parquet, etc.); see the sketch after this list.
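
For candidates wondering what handling structured and semi-structured data looks like in practice, here is a minimal sketch using pandas. The file names are hypothetical placeholders, and reading Parquet assumes the pyarrow (or fastparquet) engine is installed.

    import pandas as pd

    # Hypothetical input files; each format is common in data engineering work.
    df_csv = pd.read_csv("events.csv")                 # structured, delimited text
    df_json = pd.read_json("events.json", lines=True)  # semi-structured, one JSON object per line
    df_parquet = pd.read_parquet("events.parquet")     # columnar binary format

    # Quick exploratory checks: shape, column types, and missing values.
    for name, df in [("csv", df_csv), ("json", df_json), ("parquet", df_parquet)]:
        print(name, df.shape)
        print(df.dtypes)
        print(df.isna().sum())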

Career Growth Path

This internship is designed to evolve into a Data Engineer or Data Scientist role, with increasing exposure to advanced analytics, machine learning workflows, and platform architecture.

If you are passionate about data and looking for an opportunity to work on impactful projects, apply now!

