Magnecomp Precision Technology Public Company Limited
About the Company
WE ARE THE POWER TEAM.
We are a TDK subsidiary and a leading independent suspension manufacturer specializing in designing, developing, and manufacturing suspensions for the hard disk drive industry. With an IP-rich portfolio, multiple Asian manufacturing sites, and the distinction of being the only suspension manufacturer with design centers in Thailand, we provide the widest variety of high-quality, cost-effective suspension solutions in the industry.
A TDK Group company
“We have a non-discrimination policy in our recruitment process and offer opportunities to candidates with disabilities who are suitable for the job.”
If you want to join an international company, please click “Apply Now” to submit your application and indicate the location you are interested in.
The Employment Manager
Magnecomp Precision Technology Public Company Limited
We are seeking a highly capable Senior Data & Analytics Integration Lead to join our team in supporting the development of factory-scale analytics systems, focusing on Root Cause Analysis (RCA), traceability, and business intelligence dashboards. This role will act as the technical liaison between our internal team and external development partners, ensuring data pipelines, real-time alerting, and analytics tools are robust, maintainable, and aligned with strategic goals.
You will contribute to the design, validation, and extension of analytics infrastructure—including Grafana dashboards, ETL pipelines, and data models—while also guiding junior team members and helping bridge technical gaps in vendor-delivered solutions.
RESPONSIBILITIES:
Collaborate with external development teams to validate data pipelines, analytics outputs, and dashboard functionality.
Support the deployment and extension of:
Star schema and Lakehouse data models for factory analytics and traceability.
Data ingestion pipelines from industrial machines (CSV, SECS/GEM, logs) using Airflow.
Real-time alerting systems and data quality monitoring frameworks.
Grafana dashboards with business metrics, RCA visualizations, and filterable views.
Develop or enhance microservices, Python scripts, and visualizations directly when required.
Contribute to chatbot and LLM integrations to support self-service analytics navigation and RCA explanation.
Align cross-department technical requirements with analytics capabilities, especially in support of KPIV–KPOV mapping and defect traceability.
Provide architectural insight and mentor junior engineers on system understanding and good practices.
Ensure maintainability and documentation for eventual in-house ownership of the platform.
Requirements
Education Required:
Bachelor’s or Master’s degree in Computer Science, Data Engineering, Analytics, or related field (or equivalent experience)
Skills Required:
Strong experience with Grafana, Airflow, and SQL-based analytics.
Proficiency in Python for ETL, data transformation, and microservice integration.
Knowledge of star schema, Lakehouse architecture, and data mart design.
Familiarity with Kubernetes, Docker, and Git-based DevOps pipelines.
Understanding of real-time data ingestion from edge sources (CSV, logs, SECS/GEM).
Exposure to AI/ML workflows, LLM-based chatbots, and tools like Ollama or Kedro is a plus.
Strong coordination and communication skills for vendor and cross-functional collaboration.
Proven ability to guide and upskill junior technical staff.
Strong systems thinking and troubleshooting ability.
Experience:
6+ years of experience in data engineering, analytics systems, or platform integration.
Experience working with external development partners, data teams, and business users.