Location: Hà Nội
Salary:
Industry: IT - Hardware / Networking, IT - Software
Application deadline:
Level: Staff
• Design and implement enterprise-grade data platforms, including data lakes, data warehouses, and real-time streaming systems.
• Build and manage data pipelines for ingestion, transformation, and integration from multiple internal and external sources.
• Optimize platform performance, reliability, and scalability for large data volumes and high concurrency.
• Define data architecture blueprints, standards, and reference implementations.
• Ensure data quality, consistency, lineage, and metadata management across the platform.
• Implement role-based access, encryption, and data protection to comply with regulatory standards (e.g., GDPR, local banking regulations).
• Collaborate with data governance and security teams to embed policies into the platform.
• Enable self-service analytics, BI, and AI/ML model deployment on the platform.
• Support data scientists, analysts, and business teams with high-quality, curated datasets.
• Integrate real-time data streaming and event-driven architectures for advanced use cases (fraud detection, credit scoring, personalized offers).
• Work with enterprise architects, solution architects, and data analysts to align platform capabilities with business priorities.
• Partner with software engineers, DevOps, and cloud engineers to deliver data-driven applications.
• Collaborate with product owners and business stakeholders to enable data monetization and innovation initiatives.
• Evaluate and implement new technologies, frameworks, and tools (e.g., lakehouse, data mesh, serverless data processing).
• Drive automation in data workflows, monitoring, and operations.
• Continuously improve performance, cost efficiency, and developer experience on the data platform.
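The pipeline duties above (ingestion, transformation, quality checks on curated datasets) can be sketched as a minimal ingest → transform → quality-check flow. This is an illustrative sketch only; the record fields, function names, and rules are assumptions, not part of the actual platform described in this posting.

```python
"""Minimal sketch of a data pipeline: ingest -> transform -> quality check.

All names (column names, required fields) are hypothetical examples.
"""

def ingest(raw_rows):
    # Ingestion step: parse CSV-like rows into dict records.
    header, *rows = raw_rows
    cols = header.split(",")
    return [dict(zip(cols, row.split(","))) for row in rows]

def transform(records):
    # Transformation step: normalise types on each record.
    out = []
    for r in records:
        r = dict(r)
        r["amount"] = float(r["amount"])  # hypothetical numeric field
        out.append(r)
    return out

def quality_check(records, required=("id", "amount")):
    # Data-quality step: drop records missing required fields.
    return [
        r for r in records
        if all(r.get(f) not in (None, "") for f in required)
    ]

raw = ["id,amount", "1,10.5", "2,3.0"]
curated = quality_check(transform(ingest(raw)))
```

In a real platform each step would typically be a distributed job (e.g. Spark or Flink), but the ingest/transform/validate structure is the same.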
• Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
• 5+ years of experience in data engineering or platform engineering roles.
• Proven expertise in building and managing enterprise data platforms (data lakes, warehouses, or lakehouses).
• Strong hands-on experience with cloud platforms (AWS, Azure, GCP) and big data technologies (Hadoop, Spark, Kafka, Flink).
• Proficiency in SQL and programming languages (Python, Java, Scala).
• Experience with modern data warehousing solutions (Snowflake, BigQuery, Redshift, Databricks).
• Knowledge of data governance, security, and regulatory compliance in financial services.
• Familiarity with CI/CD, DevOps, and infrastructure-as-code (Terraform, Kubernetes).
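To illustrate the SQL proficiency the requirements list asks for, here is a warehouse-style aggregation using Python's built-in sqlite3 module. The table and column names are hypothetical; in practice the same query would run on Snowflake, BigQuery, or Redshift.

```python
import sqlite3

# Hypothetical fact table; schema and data are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (customer_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?)",
    [(1, 100.0), (1, 50.0), (2, 75.0)],
)

# Warehouse-style aggregation: total spend per customer.
totals = dict(
    conn.execute(
        "SELECT customer_id, SUM(amount) FROM transactions GROUP BY customer_id"
    ).fetchall()
)
conn.close()
```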