
DATA ENGINEER (Data Science Enablement)

Job details

  • Company name

    Raiffeisen Bank Zrt.

  • Place of work

    Hybrid, Budapest

  • Other benefits

    • Cafeteria
    • Commuting support
    • Health insurance
  • Working hours, type of employment

    • Full-time
    • Standard work schedule
  • Required technologies

    • Cloud, AWS, SQL, pandas, Python, Analytics, Agile, Scrum
  • Requirements

    • Intermediate English
    • 1-3 years of experience
    • College degree

Job description

    • Participate in the lifecycle of data science projects, including the design and development of data processing and monitoring pipelines
    • Work with state-of-the-art cloud infrastructure (AWS, Databricks)
    • Assemble large, complex data sets to meet functional / non-functional business requirements
    • Develop, maintain, and optimize ELT and ETL pipelines (incl. incident investigation and writing postmortems)
    • Continuously support internal consumers (data analysts, data scientists) in adopting data engineering best practices and automating development pipelines
    • Find and adopt best practices, share your knowledge with the team, and constantly learn from others
    What we offer:
    • You'll work both locally and in an international analytics team at a leading bank
    • Be part of a large international Data Team community that regularly shares learnings, best practices, and use cases
    • Data science courses provided by the group
    • Flexible Home Office opportunity (up to 80%)
    • You can choose from 16 types of Cafeteria benefits: SZÉP card, extra days off, commuting support, health insurance, tickets to cultural and sports events, and more
    • We provide housing loan support and an employee account package
    • A modern working environment, dining rooms and cafes await you
    • As part of a well-being program, we pay attention to the physical and mental health of our colleagues
    What we expect:
    • Structured and conceptual mindset coupled with a strong quantitative and analytical problem-solving attitude.
    • Professional experience in designing and developing production-ready data applications and pipelines in a cloud ecosystem.
    • Software engineering excellence: understanding of SDLC, Unit / Integration tests, Data Lake architecture.
    • Knowledge of PySpark and SQL (DDL, analytical functions, sub-queries, performance optimization principles).
    • Experience working within agile (scrum) methodology.
    • Fluent English, spoken and written.
    Will be a plus:
    • BSc in Computer Science, Informatics, Software Engineering or related major
    • Solid knowledge of ML principles and frameworks, analytical libraries (e.g., pandas, NumPy, scikit-learn etc.)
    • Knowledge of DBT for ETL pipelines
    • Good understanding of data warehousing principles, MDM, and data models (LDM/PDM)
    • Experience with other programming languages beyond Python
    • Experience in developing CI/CD/CT pipelines (we're using GitHub Actions)
  • Canteen
  • Cafe
  • Medical assistance
  • Corporate events
  • Bicycle storage
  • Parking
  • Play corner