Job Description
Gojo & Company, Inc
Established in 2014, Gojo & Company, Inc (Gojo) is a Japan-based microfinance holding company with seven group companies in India, Sri Lanka, Myanmar, and Cambodia. Gojo's mission is to extend financial access to everyone in the world as a "Private Sector World Bank". (http://gojo.co/)
Company information for candidates: https://speakerdeck.com/gojo/material-for-hiring
Job Description
Gojo is building next-generation digital financial infrastructure and accessibility platforms for the less privileged. Data is going to be the key differentiator in what we do. We are looking for a highly skilled Data Engineer to create our Data Management Platform from scratch. You have strong data engineering skills combined with the ability to clearly communicate complex ideas to everyone. Another major responsibility will be supporting our AI/ML engineers in creating better algorithms by providing feature-rich data. You will build and deliver data services and solutions to support our strategic technical initiatives.
As an early member of our engineering team, you will play a key role in building the future of financial inclusion by solving some of the most complex problems you have ever faced. Your work will directly contribute to saving and improving less privileged people's lives.
Job Type: Full Time Remote Engineering Position
Job location: India (preferred Bangalore)
Joining date: within next 30-45 days
Responsibilities:
- Design and build Gojo’s data warehouse and data lake in the cloud, and build reliable, smart ways to ingest data into both
- Create and maintain optimal data pipeline architectures across various channels, including remote partner locations
- Ensure accuracy and timeliness of data in real time, and implement monitoring tools to detect data issues
- Design schemas and ETL processes, collaborating closely with analytical teams across the business
- Collaborate closely with engineering and business units to manage data that drives cross-channel, cross-partner targeting and personalisation
- Resolve data quality issues, including root cause analysis, while managing internal and external vendors
- Ensure data security, including data protection and compliance for customer data, and incorporate emerging technologies to enhance the existing data warehouse
- Stay current on data localization and privacy laws, and know how to comply with data localization rules
Requirements:
- Degree in Computer Science, Mathematics, Computer Engineering, or a relevant discipline
- At least 4 years of hands-on experience in data engineering
- Working experience across a broad range of data platforms, including legacy data infrastructure and modern cloud data platforms
- Demonstrable experience writing and maintaining ETL pipelines
- Familiarity with data warehouse modelling, such as dimensional data modelling and schema design
- Deep knowledge of and working experience with the AWS cloud is a must, especially data lake and warehouse development using Redshift, S3 as a data lake, and AWS Glue jobs
- Experience in languages such as Python, Java, C++, or Go, along with knowledge of big data tools such as Hadoop and Spark
- Knowledge of data visualization tools is great to have
- Knowledge of Continuous Integration and Delivery systems (e.g. Jenkins) and their setup
- Experience with Git
Preferred Skills & Experience:
- AWS Redshift data warehouse design and implementation
- ETL pipelines and AWS Glue based big data processing
- Experience in distributed data lake on cloud
Work location: India (Bangalore)
- This is a remote engineering position. You will work from home and should be ready for occasional domestic and international travel.
- We are planning to open a co-working space in the near future, located where most of our engineers are based.