July 23, 2020
About Tavant Technologies
Tavant is an IT solutions and services provider recognized globally for its innovative solutions. We have been able to deliver game-changing results for our customers by combining our industry experience with cutting-edge technologies. We specialize in building custom technology solutions and providing end-to-end services across domains such as capital markets, consumer lending, manufacturing, media and entertainment, gaming, and retail. We pride ourselves on our traditions of innovation and process excellence, coupled with high employee and customer satisfaction levels.
The client is one of the world's foremost credit information services companies, offering a wide range of business tools to clients across the globe. They provide data and analytical tools used to manage credit risk, prevent fraud, target marketing offers, and automate decision-making. Headquartered in Dublin, the company was founded in 1980 and employs more than 17,000 people in 37 countries.
Named one of the most innovative companies by Forbes magazine, this company believes in the true potential of data in transforming lives. If you want to be a part of something that enriches the lives of people, businesses, and society at large, look no further!
Please visit www.tavant.com for more information.
Responsibilities / Skills
• 4+ years of experience in designing and developing enterprise-level software solutions.
• Strong programming skills in Scala/Java/Python and Spark.
• Strong experience with SQL and relational databases.
• Experience with large-volume data processing and big data tools such as Apache Spark and Presto.
• Experience with Amazon cloud computing infrastructure (AWS RDS, DynamoDB, Redshift, EMR, etc.).
• Familiarity with Hadoop, MapReduce, and the big data ecosystem.
The impact you will have
• Build scalable, efficient, and high-performance pipelines/workflows capable of processing large amounts of batch and real-time data.
• Build out our data service architecture to support customer-facing application use cases.
• Use big data technologies such as a data lake on AWS S3, EMR, Spark, Glue, Redshift Spectrum, and related technologies to store, move, and query data.
• Follow coding best practices: unit testing, design/code reviews, code coverage, documentation, etc.
• Work effectively as part of an agile team.
Offered Pay Amount: Rs.
Number of openings: 1
Qualification: Experience in designing and developing enterprise-level software solutions; strong programming skills in Scala/Java/Python and Spark; strong experience with SQL and relational databases.