Company: Mastercard
Location: Pune, MH, India
Career Level: Mid-Senior Level
Industries: Banking, Insurance, Financial Services

Description

Our Purpose

Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary

Senior Big Data Test Engineer

Mastercard is the global technology company behind the world's fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. Services within Mastercard are responsible for acquiring, engaging, and retaining customers by managing fraud and risk, enhancing cybersecurity, and improving the digital payments experience.

As a Senior Test Engineer in Data Collection Engineering (Data and Analytics), you will play a vital role in testing high-performance data pipelines which load into the Mastercard Data Warehouse. You will work within a rapidly growing organization, collaborating closely with experienced engineers to solve challenging problems and ensure business users can derive critical insights from our data.

Role
In this role, you will lead quality assurance efforts by developing automated frameworks and validating complex data transformations. Your responsibilities will include:

Automation & Framework Development: Design, support, and maintain automated test frameworks and tools using Python, Java, or Shell scripting to bring efficiency to test cycles and reduce manual toil.
Big Data Validation: Conduct thorough testing of data pipelines, ETL processes (batch and real-time processing), and data transformations using Oracle and Big Data technologies (Spark, Hive) on Hadoop or object storage environments (AWS).
Data Quality Assurance: Apply Data Warehouse/Data Lake methodologies to validate the accuracy, completeness, and performance of data storage and retrieval systems.
Process & Standards: Contribute to the improvement of Quality Assurance/Quality Control processes; actively identify opportunities to enhance standards, enforce engineering principles, and participate in code reviews and design discussions.
Collaboration with Project Team: Work in an Agile/Scrum environment with cross-functional teams—including software developers, data engineers, and data analysts—to ensure comprehensive testing coverage and adherence to scheduled due dates.
Collaboration with Platform Support: Coordinate and collaborate with the DBA, SRE, and Operations teams to address any environment-related issues and ensure testing integrity without compromising project deadlines.
Incident Management: Perform root-cause analysis, document and track software defects, and manage production incidents independently to ensure timely resolution.
Innovation: Leverage new technologies and perform Proofs of Concept (POC) to explore the best solutions for testing increasingly large-scale datasets.
All About You / Experience
Technical Expertise

Experience: Bachelor's or Master's degree in Computer Science or Engineering with 3 to 5+ years of testing experience in Data Warehouse/Big Data projects.
Automation Skills: Strong programming skills in Python, Java, or Shell scripting with a proven track record of building and enhancing test automation frameworks.
Big Data Stack: Strong hands-on experience with Big Data technologies such as Apache Spark, Hive, and Impala, and experience working with large-scale distributed data processing frameworks.
Data Format Proficiency: Deep experience parsing, validating, and handling various structured and semi-structured file types including JSON, Parquet, Fixed Width, CSV, and Avro.
Database & SQL: Expert-level knowledge of SQL and strong experience working with databases like Oracle and Netezza.
ETL Proficiency: Deep understanding of ETL (Extract, Transform, Load) design, scheduling, and orchestration tools. Proven experience in End-to-End testing, Test Environment planning, and Performance testing.
ETL Tool (Good to Have): Experience with Apache NiFi is a strong advantage.
Professional Attributes

Analytical Mindset: Strong analytical skills required for data analysis, debugging, and the defect management process.
Agile Practitioner: Solid understanding of Agile/Scrum methodologies, user stories, acceptance criteria, and sprint cycles.
Communication: Excellent verbal and written communication skills, with the ability to build relationships and collaborate effectively in a matrix-based, geographically distributed team.
Autonomy: Ability to be high-energy, detail-oriented, and proactive. You function well under pressure in an independent environment with a high degree of self-motivation to drive results.
Learner: A quick learner with the ability to adapt to new technologies and apply best practices to validate data models.

Corporate Security Responsibility


All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:

  • Abide by Mastercard's security policies and practices;

  • Ensure the confidentiality and integrity of the information being accessed;

  • Report any suspected information security violation or breach; and

  • Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.



