Job Description
Key Responsibilities
• Lead the design, development, and deployment of a production-grade multi-agent platform using LangGraph, including stateful workflows, persistence, and Human-in-the-Loop checkpoints.
• Architect and deliver the Intelligence Layer integrating agent orchestration, enterprise knowledge graphs, and trusted data platforms.
• Design and implement the Ontology-to-Schema pipeline, mapping OWL/SKOS enterprise ontologies into TigerGraph models optimized for agentic reasoning.
• Develop and optimize high-performance GSQL queries, distributed graph algorithms, and multigraph memory structures.
• Own trusted data materialization in Snowflake using dbt, including automated Data Quality rules and Trust Score computation.
• Build and maintain agent capabilities for Internal Data Operations such as metadata harvesting, governance automation, and data quality remediation, as well as external-facing intelligence use cases.
• Follow a spec-first development approach, defining agent behavior, tool contracts, and data schemas prior to implementation.
• Support and enhance AI-optimized CI/CD pipelines to ensure reliable, observable, and model-agnostic deployments.
• Act as a senior technical resource, removing blockers, enforcing architectural standards, and ensuring long-term system stability.
Required Technical Skills and Experience
• 10+ years of senior software and data engineering experience delivering enterprise-scale production systems.
• 2+ years of hands-on agentic orchestration experience with LangGraph (including StateGraph, persistence, and command execution) and LangChain.
• Production experience with multiple LLMs including Anthropic Claude, OpenAI GPT, and Llama, and familiarity with Model Context Protocol.
• 5+ years of TigerGraph and GSQL experience, including distributed graph algorithms and query performance optimization.
• Experience modeling enterprise knowledge using OWL, SKOS, or RDF and mapping ontologies to Labeled Property Graph schemas.
• 5+ years of analytics engineering experience with dbt and Snowflake, including automated data quality monitoring and trust pipelines.
• Advanced Python skills including asynchronous programming, FastAPI, and Pydantic, with strong SQL expertise.
• Demonstrated experience building automation for data governance, metadata management, and policy enforcement in enterprise environments.
Experience: 8-10 Years.
The expected compensation for this role ranges from $80,000 to $158,000.
Final compensation will depend on various factors, including your geographical location, minimum wage obligations, skills, and relevant experience. Based on the position, the role is also eligible for Wipro's standard benefits, including a full range of medical and dental benefit options, disability insurance, paid time off (inclusive of sick leave), and other paid and unpaid leave options.
Applicants are advised that employment in some roles may be conditioned on successful completion of a post-offer drug screening, subject to applicable state law.
Wipro provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. Applications from veterans and people with disabilities are explicitly welcome.
Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention.