Data Quality Engineer
Barclays
Join the Cards Platform team as a Data Quality Engineer where you will be responsible for ensuring the reliability and integrity of data across high-v...
As a Director-level Enterprise Architect, you will define and implement the end-to-end architecture for Cognizant's AI Training Data Service (AITDS) p...
As a Senior Director-level Enterprise Data Architect, you will define and implement the comprehensive architecture for Cognizant's AI Training Data Se...
As a Lead Data Scientist, you will design and implement advanced machine learning and predictive analytics solutions to address complex business probl...
As a Senior Lead Data Scientist, you will design and deploy advanced machine learning models and predictive analytics solutions to address complex bus...
As an Associate Director specializing in Data Science and Modeling, you will lead the design and deployment of advanced predictive models and machine ...
This position involves leading big data initiatives to manage and process large-scale distributed datasets. You will design and implement data process...
The Senior Data Engineer will design and build a scalable enterprise data warehouse and robust data pipelines to support analytics across Sales, Marke...
This role focuses on designing, building, and operating large-scale distributed systems that power Salesforce’s big data infrastructure. You will de...
Lead the design and implementation of large-scale big data solutions. Manage complex datasets, build and optimize data processing pipelines, and ensur...
Highly motivated Big Data developer role requiring hands-on Hadoop, Spark, PySpark, Scala, and SQL experience. Build next-generation real-time financial dat...
Responsible for designing and maintaining scalable data pipelines, building data platforms, and optimizing ETL workflows using Python, Spark, Hadoop, ...