Worldpay Job - 31156044 | CareerArc
Company: Worldpay
Location: Cincinnati, OH
Career Level: Mid-Senior Level
Industries: Computer Software


The Company

Worldpay is a global payments leader powering international commerce with deep fintech expertise and a shared passion for our customers. Whether in-store, online, or on a mobile device, we process over 40 billion transactions annually and offer more than 300 payment methods supporting 126 currencies across 146 countries. It's the perfect place for exceptional people to take their careers to the next level.

The Opportunity

As Worldpay grows, we must be proactive in ensuring we stay ahead of the curve with truly cutting-edge products and services. The payments industry is evolving rapidly with the implementation of new technologies. By hiring the best minds, who understand that new features and capabilities are just as important as security, Worldpay will continue to beat the competition with our offerings. Whether it's developing new products and services or building the best infrastructure out there, Worldpay is dedicated to tech.

The Senior Technology Engineer of Hadoop/Big Data provides ongoing support of critical data warehousing and reporting systems.

A World of Opportunity

We're turbo-charging our industry by nurturing the fintech experts needed to help our customers prosper. We don't try to ride the winds of change. We create them. We're proud to be shaping the future of payments by supporting the growth and development of our colleagues. We provide opportunities to learn and the flexibility needed to get the job done. We strive to hire the best and to create a climate where curiosity is king. So, wherever you join us around the world, we'll empower you to fulfill your potential. If this is the kind of career experience you're looking for, we invite you to apply today.

The Day-to-Day Responsibilities:

Work directly with an onshore/offshore IT support team responsible for the operational performance of critical financial and corporate systems in the Big Data/Hadoop ecosystem (Hive, MapReduce, Sqoop, Pig, Spark, Hue, R or other NoSQL technologies)

Serve as the Level 2/3 responsibility and LOB point of contact for daily data load processes.

Continuously provide SME knowledge to promote the operational health and monitoring of the production environments of corporate systems

Demonstrate the ability to meet or exceed organizational KPI metrics for MTTR, Problem, Incident and Service Request ticketing, and quality assurance.

Support production escalations and manage issues through resolution to stabilize operational environment(s).

Perform L2/L3 analysis and engage directly with IT organizations (including Development, Infrastructure and Database teams) to provide root cause analysis, recommendations and resolution planning.


Bachelor's degree plus a minimum of 3 years of experience in IT support

Proven results using analytics to drive business decisions, ideally in a Hadoop/Data Warehousing role

Proven ability to troubleshoot application issues in a complex production environment

Passion for recognizing and solving problems to deliver actionable business insights.

Excellent communication skills, both verbal and written, and team skills necessary to work well with onshore and offshore team members.

Experience with multiple system environments, including Windows, Linux and UNIX.

Detailed and proven experience with:

Hadoop file system

Hadoop ecosystem applications such as R, Hue, Oozie, Pig, etc.

Data Warehousing concepts and ETL experience

12 months of experience with Big Data (Hive, MapReduce, Sqoop, Pig, Spark or other NoSQL technologies)

UNIX shell scripting


Experience with building stream-processing systems, using solutions such as Storm or Spark-Streaming

Experience with Spark, Hadoop, MapReduce, HDFS

Experience with NoSQL databases, such as HBase, Cassandra, MongoDB

Knowledge of various ETL techniques and frameworks, such as Flume, Sqoop

Experience with various messaging systems, such as Kafka or RabbitMQ

Experience with Cloudera/MapR/Hortonworks

Knowledge of workflow/schedulers like Oozie

Proficient knowledge of back-end programming, specifically Java, JavaScript, Node.js and OOAD

Proficient knowledge of database structures, theories, principles, and practices

Analytical and problem-solving skills, applied to the Big Data domain

Proficiency in multi-threading and concurrency concepts

Proficiency in ITIL and Agile/SAFe is desirable
