Quest Alliance
At Quest Alliance, we transform learning ecosystems through education technology, capacity building, and collaboration to build 21st-century skills for learners and facilitators.
We are a not-for-profit trust that focuses on research-led innovation and advocacy in the field of teaching and learning. We engage with educators, civil society, government institutions and corporate organisations to demonstrate and enable scalable and replicable solutions in school education and vocational training.
At Quest Alliance, you will get the opportunity to apply your skills and contribute to addressing issues around quality education and skills training. The organisation gives you the space to learn and grow in a fun and engaging environment. We have an eclectic group of people working at Quest drawn from diverse disciplines including Education, Technology, Design, Youth Development, and Business.
About the Role:
We are currently seeking a data coordinator who can help strengthen the culture of data-driven decision making within the organisation by creating user-friendly, easy-to-comprehend dashboards and visualisations of different types of data. The role may also require you to learn new tools and technologies quickly, and you should have in-depth knowledge of visualisation platforms such as Tableau and Power BI, database management, and basic programming and scripting skills. You will help build efficient, stable data pipelines using ETL processes that can be easily maintained in the future. You should have expertise in the design, creation, management, and business use of large datasets.
You will build ETL pipelines to ingest data from heterogeneous sources into our system. You should have excellent business and communication skills, and be able to work with program owners to understand their data requirements and help them make data-related decisions using your ETL knowledge and experience.
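The sketch below gives a flavour of this kind of ETL work: a small, self-contained Python example that extracts a hypothetical survey export, cleans it, and loads it into a local SQLite table. The file contents, column names, and database are illustrative assumptions only, not Quest Alliance's actual pipelines, which use Talend and serve dashboards in Superset.

```python
# Minimal extract-transform-load sketch (illustrative only; column names,
# sample data, and the SQLite target are hypothetical assumptions).
import sqlite3
from io import StringIO

import pandas as pd

# --- Extract: in practice the data comes from heterogeneous sources
# (survey exports, APIs, spreadsheets); here we fake a small CSV export.
raw_csv = StringIO(
    "learner_id,centre,score,submitted_on\n"
    "101,Bengaluru,78,2024-01-15\n"
    "102,Patna,,2024-01-16\n"
    "102,Patna,,2024-01-16\n"
)
df = pd.read_csv(raw_csv)

# --- Transform: clean and reshape the data for end users.
df = df.drop_duplicates()                        # drop duplicate submissions
df["score"] = df["score"].fillna(0).astype(int)  # handle missing scores
df["submitted_on"] = pd.to_datetime(df["submitted_on"])

# --- Load: write the cleaned table into a database that a BI tool
# (for example Superset) could query.
conn = sqlite3.connect("quest_demo.db")
df.to_sql("learner_scores", conn, if_exists="replace", index=False)
print(conn.execute("SELECT COUNT(*) FROM learner_scores").fetchone())
conn.close()
```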
Responsibilities
- Gather requirements and business-process knowledge in order to transform the data to meet the needs of end users
- Design, implement, and continuously expand data pipelines by performing extraction, transformation, and loading activities
- Build data visualisations on Superset as per analytics requirements
- Maintain and improve existing processes
- Ensure that the data architecture is scalable and maintainable
- Work with the program, data and technology teams in designing and delivering correct, high-quality data
- Investigate data to identify potential issues within ETL pipelines, notify end users, and propose adequate solutions (see the data-quality sketch after this list)
- Prepare documentation for further reference
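As a rough illustration of the data-investigation responsibility above, the snippet below runs a few basic quality checks on a batch of records before it reaches a dashboard. The column names and thresholds are hypothetical assumptions rather than the organisation's actual checks.

```python
# Rough sketch of data-quality checks on a pipeline output
# (DataFrame columns and thresholds are hypothetical assumptions).
import pandas as pd


def find_issues(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable issues found in a batch of records."""
    issues = []
    if df["learner_id"].isna().any():
        issues.append("missing learner_id values")
    if df.duplicated(subset=["learner_id", "submitted_on"]).any():
        issues.append("duplicate submissions for the same learner and date")
    if (~df["score"].between(0, 100)).any():
        issues.append("scores outside the 0-100 range")
    return issues


batch = pd.DataFrame(
    {
        "learner_id": [101, 102, 102, None],
        "submitted_on": ["2024-01-15", "2024-01-16", "2024-01-16", "2024-01-17"],
        "score": [78, 64, 64, 120],
    }
)

for issue in find_issues(batch):
    # In a real pipeline this step would notify end users (for example by
    # email or chat); here we simply print the findings.
    print("Data issue:", issue)
```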
Requirements
- Bachelor’s degree in Computer Science, Engineering, Data Science or related disciplines
- Experience and knowledge of visualisation/dashboarding using tools like Apache Superset, Tableau or Power BI; prior experience with Superset would be an advantage
- SQL knowledge (query performance tuning, index maintenance, etc.) as well as an understanding of database structure (a short query-tuning sketch follows this list)
- Prior experience of working with data collection tools (including Google and KoBo) and of building ETL pipelines via Talend
- Advanced knowledge of Python or R
- Knowledge of data modelling principles
- Organisational skills: time management and planning
- Expert-level knowledge of and experience with Talend
- Knowledge of various SQL/NoSQL data storage mechanisms and Big Data technologies
- High attention to detail
- Passionate about complex data structures and problem solving
- Ability to pick up new data tools and concepts quickly
- Good written and verbal communication skills
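To make the SQL expectation concrete, the short sketch below shows the kind of query-performance tuning and index maintenance referred to in the list above. It uses SQLite purely for illustration; the table, data, and index names are hypothetical.

```python
# Illustrative sketch of index-based query tuning using SQLite
# (the table, sample data, and index are hypothetical examples).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE attendance (learner_id INTEGER, centre TEXT, attended_on TEXT)"
)
conn.executemany(
    "INSERT INTO attendance VALUES (?, ?, ?)",
    [(i % 500, "Bengaluru", "2024-01-15") for i in range(5000)],
)

query = "SELECT COUNT(*) FROM attendance WHERE learner_id = 42"

# Without an index, SQLite scans the whole table for this query.
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

# Adding an index lets the same query use an index search instead.
conn.execute("CREATE INDEX idx_attendance_learner ON attendance(learner_id)")
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

print(conn.execute(query).fetchone())
conn.close()
```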
We are looking for people who:
- Are passionate about working to solve challenges in the education and employability domain
- Are flexible, self-motivated, enthusiastic, and energetic team players
To apply for this job, please visit questalliance.zohorecruit.com.