WHAT WE DO
The Heart of the Dawn Games Analytics team provides insights and actionable results to a wide variety of stakeholders across the organization in support of their decision making.
We partner with departments across the company, leveraging analytics to measure and improve the success and health of our games.
We collaborate as a distributed team to develop innovative data pipelines, data products, data models, reports, analyses, and machine learning applications.
The Analytics Architecture vertical within the Analytics team is tasked with building out high impact data models and internal tooling to support the wider Analytics team.
RESPONSIBILITIES
Partner with Data Scientists and business stakeholders to understand analytical needs and translate them into robust data infrastructure and workflows for the global Analytics team to leverage for deriving insights. Collaborate with the Data Engineering team to design data models and data pipelines.
Design, develop and maintain scalable data pipelines. Monitor performance and optimize data systems for high performance and reliability.
Collaborate with the Data Integrity team to build a Data Quality Framework that ensures production data meets SLAs for key stakeholders and business processes.
Perform timely Root Cause Analysis to troubleshoot data-related issues; assist in implementation of code and process fixes.
Provide thought leadership and collaborate with other team members to scale our architecture for tomorrow's needs. Contribute to establishing best practices, including code standards, version control, documentation, testing, and review processes.
Develop and support CI/CD processes using Terraform, GitHub, TeamCity, Octopus, etc.
Define and implement monitoring and alerting policies for data solutions.
REQUIREMENTS
5+ years of experience in analytics engineering or data engineering using Python and PySpark.
Proven track record in building, monitoring, and optimizing large-scale data systems and analytics infrastructure.
Experience working in Databricks & Azure environment (or other cloud equivalents).
Experience working with pipeline scheduling tools such as Airflow and Astronomer.
Experience working with CI/CD tools such as TeamCity, Terraform, GitHub, and Octopus.
Bachelor’s degree or equivalent in an engineering or technical field such as Computer Science, Mathematics, or Statistics, or an equivalently strong quantitative and software background.
5+ years of hands-on experience writing and optimizing highly efficient, advanced SQL queries (including analytical functions).
Ability to push the frontier of technology and freely pursue better alternatives.
PLUSES
Please note that these are desirable skills and are not required to apply for the position.
Knowledge of software coding practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations.
Streaming or near-real-time (NRT) modeling experience.
Passion for technical mentorship.