hackajob is partnering with DXC Technology to fill this position. Create a profile to be automatically considered for this role—and others that match your experience.
AI Data Architect
Job Description
DXC Technology is a Fortune 500 global IT services leader, ranked at 152.
Our more than 130,000 people in 70-plus countries are entrusted by our customers
to deliver what matters most. We use the power of technology to deliver
mission-critical IT services that transform global businesses. We deliver excellence
for our customers, colleagues and communities around the world.
Accelerate your career and reimagine the possibilities with DXC!
We inspire and take care of our people. Work in a culture that encourages innovation
and where brilliant people embrace change and seize opportunities to advance their
careers and amplify customer success. Leverage technology skills and deep industry
knowledge to help clients. Work on transformation programs that modernize
operations and drive innovation across our customers' entire IT estate using the
latest technologies in cloud, applications, security, IT outsourcing, business process
outsourcing and modern workplace.
At DXC Technology, we believe strong connections and community are key to our
success. Our work model prioritizes in-person collaboration while offering flexibility to
support wellbeing, productivity, individual work styles, and life circumstances. We’re
committed to fostering an inclusive environment where everyone can thrive.
AI Data Architect – AI Data Foundations (Industry-Agnostic)
Job Summary
We are seeking an AI Data Architect to design and implement AI-ready data
foundations. You will define data models, storage patterns, pipelines, and
governance frameworks that support machine learning and analytics. Working
closely with data engineers and ML engineers, you will ensure data is
well-structured, accessible, secure, and scalable for AI use cases.
Key Responsibilities
- Design and maintain data models and schemas (relational, graph, time-series) aligned to AI and business needs.
- Define scalable data storage architectures across databases, data lakes, and warehouses in cloud environments.
- Build and oversee ETL/ELT pipelines to integrate batch and streaming data from multiple sources.
- Establish data governance, quality, security, privacy, and compliance standards.
- Implement metadata management and data lineage for transparency and traceability.
- Collaborate with data, ML, and analytics teams to deliver AI-ready datasets and features.
- Monitor, optimise, and scale data platforms for performance, cost efficiency, and growth.
Required Qualifications
- Bachelor’s degree in Computer Science, Information Systems, or related field (or equivalent experience).
- 3–5 years of experience in data architecture, data engineering, or similar roles.
- Strong data modelling and database design skills (SQL, relational, and NoSQL databases).
- Hands-on experience building and managing data pipelines (ETL/ELT) using Python, SQL, or related tools.
- Experience with cloud-based data platforms and modern data architectures.
- Solid understanding of data governance, quality management, security, and privacy.
- Strong communication skills and ability to collaborate across technical teams.
Preferred Experience
- Experience supporting AI/ML or advanced analytics projects.
- Familiarity with data catalogues, governance, and lineage tools (e.g. Collibra, Alation, Atlas).
- Exposure to ontologies, semantic modelling, or knowledge graphs.
- Experience with modern architectures (lakehouse, data mesh, streaming/event-driven systems).
- Relevant cloud or data architecture certifications.