Big Data Engineer (up to 22,000 RON net per month)

About us:

Skywind Group is a privately held entertainment technology company and iGaming solutions provider.

Since our establishment in 2012, we have grown steadily alongside our partners and customers, and we are proud to see their progress supported by our products.

We shape the future of our industry with world-class games, back-office and content management systems, hosting solutions, infrastructure, a proprietary deep learning engine, and a revolutionary suite of user retention products.

Join us today and become part of our rapidly growing team of more than 600 professionals across 8 R&D centers in Romania, Ukraine, Belarus, Cyprus, the Philippines, Australia and more.

In Romania, one of our best-known, award-winning products is Princess Casino.

Role summary: Create and maintain optimal data pipeline architecture.

Role responsibilities:

• Assemble large, complex data sets that meet functional / non-functional business requirements.
• Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
• Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Snowflake and Python (a minimal ETL sketch follows this list).
• Build Tableau/Power BI reports when needed.
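
To give a flavour of the day-to-day work, below is a minimal, hypothetical ETL sketch in Python: it extracts rows from a Postgres source and loads them into a Snowflake staging table. All connection parameters, table and column names are placeholders, not our actual systems.

```python
# Minimal ETL sketch: extract from Postgres, load into Snowflake.
# All credentials, table and column names are placeholders.
import psycopg2
import snowflake.connector


def run_daily_load() -> None:
    # Extract: pull yesterday's events from the source Postgres database.
    src = psycopg2.connect(host="pg-host", dbname="games", user="etl", password="***")
    with src, src.cursor() as cur:
        cur.execute(
            "SELECT player_id, stake, placed_at FROM bets "
            "WHERE placed_at >= current_date - 1"
        )
        rows = cur.fetchall()

    # Load: insert the extracted rows into a Snowflake staging table.
    snow = snowflake.connector.connect(
        account="my_account", user="etl", password="***",
        warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
    )
    cur = snow.cursor()
    cur.executemany(
        "INSERT INTO bets_stg (player_id, stake, placed_at) VALUES (%s, %s, %s)",
        rows,
    )
    cur.close()
    snow.close()


if __name__ == "__main__":
    run_daily_load()
```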

Role requirements:

Advanced SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of databases such as Postgres and Snowflake.

• Experience building and optimizing ‘big data’ data pipelines, architectures and data sets using Airflow (a minimal DAG sketch follows this list).
• Strong analytical skills for working with unstructured datasets.
• Experience building processes supporting data transformation, data structures, metadata, dependency and workload management.
• A successful history of manipulating, processing and extracting value from large, disconnected datasets.
• Working knowledge of message queuing, stream processing and highly scalable ‘big data’ data stores (e.g., Kafka).
• Strong project management and organizational skills.
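
As a rough illustration of the Airflow experience described above, a minimal daily DAG might look like the sketch below; the DAG id, task id and callable are hypothetical placeholders, not one of our actual pipelines.

```python
# Illustrative Airflow 2.x DAG: one daily transformation task.
# The DAG id, task id and callable are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def transform_bets() -> None:
    """Placeholder transformation step, e.g. aggregate raw bets into daily facts."""
    pass


with DAG(
    dag_id="daily_bets_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="transform_bets", python_callable=transform_bets)
```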

Candidates should also have experience using the following software/tools:

• Experience with big data tools: Hadoop, Spark, Kafka, etc.
• Experience with relational SQL and NoSQL databases, including Postgres, Snowflake and Cassandra.
• Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
• Experience with stream-processing systems: Storm, Spark Streaming, etc. (a minimal Structured Streaming sketch follows this list).
• Experience with object-oriented/functional scripting languages: Python.
• Experience with reporting tools such as Tableau and Power BI is an advantage.
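
On the stream-processing side, a minimal Spark Structured Streaming job consuming a Kafka topic could look like the sketch below; the broker address and topic name are placeholders, and the spark-sql-kafka package needs to be supplied at submit time.

```python
# Sketch of a Spark Structured Streaming job consuming a Kafka topic.
# The broker address and topic name are placeholders; run with the
# spark-sql-kafka-0-10 package on the classpath.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("bets-stream").getOrCreate()

# Read the raw event stream from Kafka.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "bets")
    .load()
)

# Kafka values arrive as bytes; cast to string and print batches to the console.
query = (
    events.selectExpr("CAST(value AS STRING) AS value")
    .writeStream
    .format("console")
    .start()
)
query.awaitTermination()
```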

Benefits:

• Competitive salary & professional growth as we grow day by day!

• Educational opportunities such as training, certifications and participation in professional conferences, as continuous learning is one of the values we strongly believe in.

• Meal vouchers, private insurance, performance bonus plans, remote work, annual visits to some of our other HQs around the world, a modern and central office near “Romana Plaza”, and 22 to 24 annual leave days for mid and senior professionals to recharge and spend time with loved ones.

• A warm and friendly attitude towards new colleagues, to make them feel welcomed and integrated.

• The opportunity to work in a product company and see the results of your work.

Contact:

Thanks!
