
Singtel | Nxera
Senior Data & Integration Architect
Internship / 2 Years
Information Technology
Posted 06 Aug, 2025
Job Description
- Design, build and implement enterprise-wide data and API integration frameworks to support AI/ML platforms across hybrid cloud and on-premises environments
- Work with system owners and data domain leads to design and deliver scalable end-to-end data flows across operational, analytical, and AI systems
- Define and develop secure, reusable API interfaces (REST, GraphQL, event-driven) and data interfaces (batch or streaming) that enable seamless interoperability between internal systems and AI services
- Oversee and evaluate new data integration approaches and pipeline designs to ensure efficient, secure, and scalable data flow between data sources and AI platforms.
- Collaborate with Security and Data Governance teams to ensure integration designs align with compliance, privacy, and policy requirements (e.g., PDPA, data classification)
- Design and enable data access strategies for LLMs and agent-based workflows, ensuring context-rich, real-time connectivity to distributed enterprise systems
- Implement and maintain integration middleware and tooling (e.g., Kafka, Azure ML/Foundry, Databricks) to support data orchestration, synchronization, and reliability
- Contribute integration expertise to data and AI experimentation, PoCs, and platform upgrades, ensuring architectural consistency and production-readiness
- Define and enforce data and integration design standards, focusing on scalability, resilience, observability, and system decoupling
- Work closely with business units, IT, and Networks to align integration plans with enterprise priorities and ensure successful data exchange across functional boundaries
Job Requirements
- Bachelor’s in Computer Science, Engineering, Data, AI/ML, or related field.
- At least 3 years of experience in data architecture and system/API integration engineering.
- Demonstrated experience in designing integration flows for large-scale, real-time systems across cloud and legacy environments.
- Experience in designing and implementing data integration frameworks across hybrid cloud and on-premises environments, including building scalable and secure data pipelines for AI/ML platforms.
- Proficient in data integration design, with solid knowledge of data pipelines, data lakes, data warehouses, and data lakehouse architectures.
- Good knowledge of modern data orchestration and middleware tools such as Apache Kafka, Azure Data Factory, Databricks, Airflow, and experience in managing data flow between operational, analytical, and AI environments.
- Working knowledge of data security, data protection and data quality management including implementation of encryption, RBAC, masking, and alignment with regulatory frameworks such as PDPA and internal data classification policies.
- Proven experience integrating data systems with AI/ML workflows, including model training, serving, monitoring, and enabling context-aware access for LLMs and agent-based automation.
- Effective collaboration skills to work across data, platform, machine learning engineering, and API integration teams, with a clear communication style to bridge business and technical stakeholders.
- Strong stakeholder management skills with internal (IT, Networks, business) and external (suppliers, government) parties.
- Strong technical writing and presentation skills, with the ability to communicate complex concepts clearly to both technical and non-technical stakeholders.
- Proactive and fast learner with a strong drive to stay current on emerging technologies and industry trends.