Playtika is seeking an awesome DevOps Big Data engineer to work in our world-class team. This role will contribute to the creation of an environment and culture which will help us develop the next generation of social games in our Minsk office.
Job responsibilities require the candidate to think critically in support of the business while keeping team and company priorities in mind.
The candidate will be responsible for handling support requests from our Big Data team as well as from offices around the globe.
Responsibilities will include:
- Designing aggregates, scheduled data jobs, and highly scalable data processes, as well as related data warehouse development work, for our Big Data systems: Vertica, MySQL, and Redis
- Troubleshooting Rundeck jobs, bash scripts, and SQL queries
- Managing access control and resource pooling on our various data stores
- Monitoring and responding to various system and business alerts
- Working on HDFS solutions with the Dev team
- Solving complex problems innovatively using a variety of tools and technologies
Required skills:
- A commitment to quality and operational excellence in addition to speed of delivery
- High degree of proficiency with Linux
- Advanced skills in SQL and some scripting languages
- Excellent communication skills, including the ability to identify and communicate data-driven insights
- Hands-on experience in the development and maintenance of large-scale database systems with high availability, scalability, and performance
- Understanding of ETL and ELT best practices for processing multi-terabyte data volumes
- Work experience with data pipelines in high-throughput environments
- Coding skills in Bash or Python
Additional beneficial skills:
- Experience with Avro, ORC, Hive, Pig, Hadoop, Kafka, Storm, Vertica, MySQL, or Redis
- Coding skills in a JVM language