
Data Engineer
- On-site, Hybrid
- Colombo, Western Province, Sri Lanka
- Engineering
Job description
Want to deliver tech with purpose, with people who care?
Join us in our mission to create solutions that help keep children safe online.
Who are we?
Headquartered in Perth, Australia, with offices around the world including Colombo, Sri Lanka, Qoria is an ASX-listed global leader in child digital safety technology and services. We are a purpose-driven business, operating under the ‘Linewize’ brand in North America and Asia Pacific, the ‘Smoothwall’ brand in the UK, the ‘Octopus BI’ brand in Sri Lanka and the ‘Qoria’ brand in EMEA. Our solutions are utilised by schools, school districts, and parental communities to protect children from seeing harmful content online, identify children at risk based on their digital behaviours, and help teachers maintain focus and safe learning in the digital classroom. 30,000 schools and 7 million parents depend on our solutions to keep 25 million children safe in 180 countries around the world.
What’s the opportunity?
We are seeking a highly motivated and skilled Data Engineer with strong expertise in SQL, Java, Python, and cloud-based data platforms to join our growing data team. In this role, you will be responsible for designing, developing, and optimizing data pipelines, warehouses, and analytics solutions on GCP and big data platforms. You will collaborate closely with BI teams to support data modeling, reporting, and analytics initiatives while gaining hands-on experience with advanced data engineering technologies.
This role is ideal for professionals with prior experience in data engineering who are looking to solve complex analytical problems and work with large-scale data systems.
Job requirements
Key Responsibilities
Design, develop, and optimize SQL queries and scripts for data transformation, aggregation, and integration.
Build, maintain, and enhance scalable ETL/ELT pipelines on Big Data platforms.
Support development and optimization of relational and NoSQL databases, data warehouses, and data lakes.
Work with cloud-native GCP services, including BigQuery, Cloud Spanner, Workflows, and Dataflow, to enable high-performance data processing.
Apply version control and collaborative development practices using tools like Git.
Implement workflow orchestration and support data modeling efforts with the BI team.
Collaborate with cross-functional teams to ensure data quality, integrity, and security.
Solve complex analytical problems leveraging SQL, data engineering tools, and programming expertise with Java and Python.
Participate in sprint planning, code reviews, and quality assurance processes.
Stay current with modern data engineering tools, machine learning workflows, and best practices.
Requirements
Bachelor’s degree in Computer Science, Data Science, or a related technical field.
Strong experience in SQL with hands-on expertise in writing optimized queries.
Proficiency in Java and Python for data engineering and analytics tasks.
Hands-on experience with cloud platforms, particularly GCP.
Experience in OLTP and OLAP databases.
Knowledge of ETL/ELT development, workflow orchestration, and data modeling.
Familiarity with version control tools (e.g., Git).
Experience solving complex analytical problems using data engineering tools.
Experience with Apache data engineering tools (e.g., Apache Beam, Spark, Kafka, Airflow); experience with Apache Beam is an added advantage.
Exposure to machine learning frameworks and related programming tools.
Familiarity with statistical methods for data analysis and reporting.
Career Growth Path
This role is designed to evolve into a Senior Data Engineer, Analytics Engineer, or Data Scientist position, offering increasing responsibility in designing scalable data solutions, advanced analytics, and machine learning workflows.