Data Engineer BLOX

Date:  Mar 6, 2026
Division:  Central Services (40000003)
Location:  Penang, MY

Requisition ID:  2112

 

To complement our team, we are looking for a customer- and team-oriented individual for the business area Group Functions at the Penang site as a

 

Data Engineer BLOX

 

For over 75 years, we have been following our curiosity. It drives us and has made Comet a leading Swiss technology company worldwide. Curiosity and the spirit of research have allowed us to become and remain innovative. For a better and sustainable world.

 

We develop and produce innovative high-tech components and systems based on X-ray and radio-frequency technology. Our developments make an important contribution to safer, more efficient and more sustainable production, mobility and communication. 

 

In Group Functions, we support the market-oriented divisions with central services. We work together today on the solutions for tomorrow - in close cooperation with our customers and always with high quality standards.

 

What you will do:

  • Data Architecture & Pipeline Development: Design, build, and maintain ETL/ELT processes using Databricks (Spark/PySpark) and Python. Integrate data from various sources (APIs, MQTT, Streams) and refactor legacy code into production-grade Python packages.
  • Infrastructure & DevOps: Collaborate with the IaC team to manage infrastructure and provide technical support for decentralized, high-availability global systems. Utilize Kubernetes for orchestration and maintain data tools (Elasticsearch, InfluxDB, RabbitMQ).
  • Software Engineering & Best Practices: Develop web-based APIs, implement robust CI/CD pipelines (Test Automation, Linting), and write clean, documented code consistent with internal Developer Guidelines. Proactively monitor system performance and troubleshoot bottlenecks.
  • Collaboration & Enablement: Act as the technical enabler for Data Analysts, ensuring data accessibility for statistical analysis. Document architectures on Jira/Confluence to facilitate knowledge sharing.
  • Other duties as assigned

 

What you bring:

  • Bachelor's degree in Computer Science, Software Engineering, Data Engineering, or a related field. Fluency in spoken and written English.
  • Strong software development experience in Python with proven expertise in building and managing pipelines in Databricks (Spark/PySpark).
  • Solid knowledge of SQL, database modeling, and API development/integration.
  • Experience with Kubernetes (K8s) for containerized environments and Infrastructure as Code (IaC) tools like Terraform or Ansible is highly desirable.
  • Familiarity with message brokers (RabbitMQ, MQTT, Kafka) and time-series databases (InfluxDB) is a strong advantage.


Why join us? Break new ground with us:

  • Work on innovative solutions for global technical challenges
  • Benefit from flexible working options and hybrid working
  • Take part in Comet’s success through profit sharing
  • Work with international colleagues and grow as a team
  • Enjoy our diverse educational and career opportunities
  • Access a wide range of benefits including allowances for meals, transport, tolls, mobile, and car loan interest
  • Benefit from Flexi Benefits for health checks, dental and optical care, plus comprehensive medical insurance coverage

We are curious about you and look forward to receiving your complete online application. We will be happy to answer any questions you may have.

For this vacancy, we only consider direct applications. Submissions from recruitment agencies will not be considered.

 

Comet Technologies Malaysia Sdn Bhd
PMT 761 Persiaran Cassia Selatan 3
Taman Perindustrian Batu Kawan
14110 Bandar Cassia
Penang, Malaysia
comet.tech/careers