Job details

Company name
High Tech Engineering Center Kft.
Location
Nationwide coverage

Working hours, type of employment
- Employee status
- Standard working schedule
Required technologies
- Security, Access, Cloud, Azure, BI, Tableau, Power BI, AWS, Apache, ISO, Databases, GDPR
Requirements
- No language skills required
- No experience required
- Secondary school education
Job description
Responsibilities
Design and propose changes to current data systems, including migration paths and modernization strategies for legacy architectures
Define and evolve the target architecture for data platforms, covering ingestion, storage, processing, and consumption layers
Design and implement both batch and streaming/event-driven architectures to support diverse data processing requirements
Evaluate and recommend technology stacks across Azure, AWS, GCP, and on-premises environments for hybrid and multi-cloud deployments
Design scalable, high-throughput data pipelines capable of handling large data volumes with low latency
Architect and deliver reporting systems using Power BI, Apache Superset, Grafana, or Tableau
Translate business and product requirements into technical architecture, data models, and implementation blueprints
Lead PoC design, implementation, and validation including architecture documentation
Assess AI/ML readiness including data availability, quality, governance, and infrastructure constraints
Ensure alignment with security, compliance, and regulatory standards including GDPR, SOC 2, ISO 27001, HIPAA, and PCI-DSS across all deployment environments
Collaborate closely with Product Managers, Engineers, and Domain Experts
Provide technical guidance during the transition from architecture to implementation
Requirements
Strong experience in designing robust data systems from scratch, including greenfield architecture definition, technology selection, and end-to-end implementation planning
Experience in modernization of data legacy systems
Hands-on experience with high-throughput data systems and performance optimization at scale
Multi-cloud experience across Azure, AWS, and GCP, including hybrid and on-premises deployments
Experience with AI/ML production systems
Experience implementing Lakehouse architecture with Apache Iceberg
Experience with modern data platforms such as Databricks and Snowflake
Solid understanding of batch architectures (ETL/ELT, data warehouses) and streaming architectures (Kafka, Flink or equivalent)
Experience designing reporting systems with BI tools (Power BI, Superset, Grafana)
Strong knowledge of relational databases, columnar stores, object storage, data catalogs, and orchestration tools (Airflow, dbt)
Familiarity with compliance frameworks and regulatory requirements: GDPR, SOC 2 Type II, ISO 27001, HIPAA, PCI-DSS, and data residency requirements
Experience implementing data governance controls, audit logging, access management, and encryption standards aligned to compliance mandates
Ability to bridge business and technical stakeholders and communicate architecture decisions clearly
Nice-to-have
Experience in public safety, critical infrastructure, or mission-critical systems
Exposure to real-time data processing and event-driven architectures
How to apply
You can submit your application on the company's website, which you can access by clicking the "Apply on company page" button.
Job field(s)