Job details
Company name
High Tech Engineering Center Kft.
Work location
Nationwide coverage
Working hours, type of employment
- Full-time
- Standard work schedule
Required technologies
- Machine Learning, Databases
Requirements
- No language skills required
- 5-10 years of experience
- University degree
Job description
Responsibilities
- Building cutting-edge multilingual transcription systems
- Delivering accurate, consistent audio and text transcription across diverse media assets
- Enabling automatic language and locale detection in real-world scenarios
- Working with state-of-the-art LLMs and transformer-based architectures
- Architecting scalable AI solutions
- Fine-tuning high-impact models
Requirements
- 5+ years of industry experience
- 2+ years implementing LLM or NLP systems
- Bachelor's degree (Master's preferred) in Computer Science or Electrical Engineering
- Hands-on experience with fine-tuning transformer models for text-to-text and audio-to-text tasks
- Experience with multilingual NLP
- Proven track record of building ML/AI system architectures, RAG, and multi-agent systems
- Deep understanding of transformer models (Hugging Face, SentenceTransformers)
- Experience with efficient fine-tuning of large language models (LLMs) using LoRA and QLoRA
- Knowledge of model optimization techniques (quantization, pruning, distillation)
- Experience with multi-modal dataset preparation and quality improvement
- Multilingual NLP and cross-lingual transfer learning
- Familiarity with language-specific text processing (tokenization, normalization)
- Deployment of resource-intensive models and scalable AI systems
- Monitoring model performance in production and detecting drift
- Distributed training (DDP, FSDP, mixed precision)
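To make the LoRA requirement above concrete, here is a minimal, dependency-free sketch of the low-rank adaptation idea: instead of updating a full weight matrix W, a small trainable delta A @ B is learned, and the effective weight becomes W + (alpha / r) * A @ B. All matrices are plain lists of lists for illustration only; actual fine-tuning would use a library such as Hugging Face's peft, and the shapes and values below are hypothetical.

```python
def matmul(a, b):
    """Multiply two matrices represented as lists of rows."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

def lora_forward(x, W, A, B, alpha=1.0, r=1):
    """Forward pass through a LoRA-adapted linear layer.

    x: input rows (n x d_in), W: frozen weights (d_in x d_out),
    A: trainable down-projection (d_in x r), B: trainable up-projection (r x d_out).
    Computes x @ W + (alpha / r) * (x @ A) @ B without materializing W + delta.
    """
    base = matmul(x, W)                 # frozen pretrained path
    low_rank = matmul(matmul(x, A), B)  # trainable low-rank path
    scale = alpha / r
    return [[bv + scale * lv for bv, lv in zip(brow, lrow)]
            for brow, lrow in zip(base, low_rank)]

# With B initialized to zeros (as in LoRA), the adapted layer reproduces
# the frozen layer exactly; training then moves only A and B.
```

The design point this illustrates: the trainable parameter count scales with r * (d_in + d_out) rather than d_in * d_out, which is what makes LoRA and QLoRA practical for large models.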
Nice-to-have
- Hands-on experience with open-source LLMs (Gemma, DeepSeek, Llama, GPT-OSS)
- Building RAG systems and multi-agent systems, and utilizing LLM APIs and vector databases (Pinecone, ChromaDB, FAISS, Milvus)
- Knowledge of LLM orchestration frameworks: LangChain, LangGraph, LlamaIndex
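The retrieval step at the heart of the RAG systems mentioned above can be sketched without any vector database: documents and the query are represented as embedding vectors, and the top-k documents by cosine similarity are returned as context for the LLM. The vectors below are hypothetical stand-ins; a real system would produce them with an embedding model and store them in a store such as FAISS or ChromaDB.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def retrieve(query_vec, doc_vecs, k=2):
    """Return indices of the k documents most similar to the query."""
    ranked = sorted(range(len(doc_vecs)),
                    key=lambda i: cosine(query_vec, doc_vecs[i]),
                    reverse=True)
    return ranked[:k]
```

Dedicated vector databases implement the same ranking with approximate nearest-neighbor indexes so it stays fast over millions of documents.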
How to apply
You can submit your application on the company's website, which you can access by clicking the "Apply on company page" button.