RubyPlay is hiring a Data Engineer to help us scale our data infrastructure and tools and provide insights to optimize performance.
What will you be responsible for?
- Take a leading role in all parts of the lifecycle of our Data product, including requirement analysis, technical architecture design, development, testing, and deployment of the platform
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a variety of internal and external data sources
- Build analytics tools that utilize the data pipeline to provide actionable insights into operational efficiency and key business performance metrics.
- Provide database expertise to our Platform product development: benchmarking systems, analyzing bottlenecks, and proposing solutions to scale our gaming system
- Work with stakeholders including the Product, Engineering and Commercial teams to assist with data-related technical challenges and support their data infrastructure needs.
What do we expect from our perfect candidate?
- Bachelor’s in computer science, data engineering, or related field
- At least 3 years of experience in application development
- At least 1 year of experience with big data technologies
- 5+ years of experience in ETL development
- 2+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud) – GCP is a big advantage!
- 2+ years of experience with API integration/Development
- Advanced working SQL knowledge and experience working with various types of databases
- 2+ years of data warehousing experience (BigQuery, Redshift or Snowflake)
- 1+ years of experience working with Apache Airflow – advantage
- 2+ years of experience with at least one scripting language such as R or Python – advantage
- English – professional working proficiency
- Experience with Business intelligence tools such as Tableau or Power BI
Nice to have:
- Experience coding and automating processes using Python or R.
- Experience using big data technologies (Hadoop, Hive, HBase, Spark, EMR, etc.)
- Experience with Vertica DB
- Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field
Benefits of working with us:
- Challenging tasks to feel productive
- Opportunity to attend paid conferences, training and workshops to keep your mind fresh
- Free English classes with native speakers to support your growth
- Medical insurance to feel safe
- 20 paid vacation days / 7 paid sick days to feel cared for
- Friendly working environment to feel excited