I am currently working with a large global business with a portfolio of major investments across the Middle East and Europe. They have recently established a technology arm based in Belarus and are looking for an experienced Data Engineer to join their team.
As a Big Data Developer on their team, you will help build one of the largest Big Data systems in the world, collaborating with Product Managers, Architects, and Software Engineers to research, design, and improve core Big Data analytics functionality in the areas of ingestion, enrichment, search, and analytics.
- You are passionate about applying software engineering skills to solve real-life Big Data analytics problems.
- You can write code to process billions of data records daily from heterogeneous sources.
- You are an ETL Champion, deploying solutions to work on clusters with hundreds of nodes, storing petabytes of data.
- You are creative, translating user stories into innovative solutions that provide an excellent client experience and align with the architectural roadmap.
- You are detail-oriented, maintaining thorough technical designs that account for security, scalability, maintainability, and performance.
- You transform ambiguity into clarity.
- You are a team player, participating in all phases of the development process - analysis, design, construction, testing, and implementation - within Agile development lifecycles.
- You enjoy collaborating in a diverse, multicultural environment spanning multiple geographic locations.
- You have stellar communication skills, conveying and receiving information in a clear, credible, and consistent manner.
Qualification and Skills Requirements:
- Bachelor's Degree or higher with a minimum of 5 years of experience working with Big Data-related technologies; a Master's Degree with a minimum of 2 years of experience is preferred.
- Experience in Java and Scala with OOD/OOP design skills, proficiency in Linux environments, and the ability to write unit tests and testable code.
- Hands-on experience building Big Data applications.
- Strong experience with Spark Core and Spark Streaming, Flink, Storm, or Apache Beam.
- Strong experience with Kafka.
- Strong experience with NoSQL databases such as Elasticsearch, Solr, HBase, and Cassandra.
- Strong experience working with Cloudera, Hortonworks, or similar platforms.
- Experience in using Spring framework.
- Experience debugging issues, resolving performance problems, and coordinating with external teams.
- Proven experience working with a variety of data formats, such as Parquet, Avro, XML, and JSON, as well as columnar databases and dataset/DataFrame concepts.
Kindly apply if you are interested in hearing more.