At DXC we use the power of technology to deliver mission-critical IT services that our customers need to modernize operations and drive innovation across their entire IT estate. We provide services across the Enterprise Technology Stack for business process outsourcing, analytics and engineering, applications, security, cloud, IT outsourcing, and modern workplace.
Our DXC Application services team works with our customers as a trusted partner to help simplify, modernize, and accelerate the mission-critical applications that support business agility and growth. Customers work with us to take advantage of the latest digital platforms with both custom and packaged applications, ensure resiliency, and launch new products and enter new markets with minimal disruption.
About this role:
The DXC team is seeking a highly skilled and analytical Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, building, and maintaining the data pipelines that enable our organization to make informed, data-driven decisions. You will work closely with our data scientists and analysts to understand their data needs and ensure that the data infrastructure is optimized for fast and accurate data processing. You will also work with the IT team to ensure that the data pipelines are secure and compliant with company policies.
What You’ll Do:
Demonstrates ownership and teamwork. Leads data product design, code development, production deployment, and production support efforts.
Coordinates data product development efforts with other Data Engineers in the squad and helps the Product Owner with work assignments. Represents other Data Engineers and provides updates on overall development efforts.
Represents Data product changes in CAB meetings.
Leads resolution and implementation of all production defects and changes
Reviews the code developed by other Data Engineers and ensures data engineering best practices are followed
Actively supports Data products production runs, validates the data products for completeness & accuracy, and coordinates with Production support teams
Ensures implementation of data and IT controls in Data products
Actively involved in the creation of data products including development, testing, deployment, and production support.
Required to perform the following activities hands-on:
Explore and understand data sets.
Build data pipelines, and integrate and schedule ETL jobs using a CI/CD framework.
Implement required data transformation in the data lake.
Visualize the data set; determine whether the data set has enough information to answer the data product requirements.
Work with IT support to create and visualize the data and data products on the data lake.
Configure the required security and data masking for a data set.
Support testing of data acquisition, data set correlation, and/or product development; investigate and resolve data and interface issues.
Work with IT to harden and productionize data products and business procedures.
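As an illustrative sketch of the masking and transformation duties above (the field names and masking rules here are hypothetical; on the data lake this logic would typically be expressed as PySpark column expressions or UDFs rather than plain Python):

```python
import hashlib

def mask_account_number(value: str) -> str:
    """Mask all but the last four characters of an account number."""
    if len(value) <= 4:
        return "*" * len(value)
    return "*" * (len(value) - 4) + value[-4:]

def pseudonymize_email(value: str) -> str:
    """Replace an email with a stable SHA-256 pseudonym (first 12 hex chars),
    so the same address always maps to the same token."""
    return hashlib.sha256(value.lower().encode("utf-8")).hexdigest()[:12]

# Hypothetical source record with sensitive fields
record = {"account": "1234567890", "email": "Jane.Doe@example.com"}
masked = {
    "account": mask_account_number(record["account"]),
    "email": pseudonymize_email(record["email"]),
}
print(masked["account"])  # prints ******7890
```

Because the pseudonym is deterministic, masked data sets can still be joined and correlated on the tokenized column without exposing the underlying value.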
Who you are:
Candidates for this role must have:
Good knowledge of architecture, design patterns, source-to-target mappings, ETL architecture in the Hadoop space, data modeling techniques, and performance tuning in the Hadoop environment.
Experience with big data tools: Hadoop (CDH), Spark, Kafka, PySpark, Python, Impala, Hive, Hue, HDFS, and related tools
Experience in Azure DevOps and CI/CD tools such as Maven, Git/Stash, Jenkins, Docker, etc.
Experience with the Tableau data visualization tool is a plus.
Advanced level knowledge and experience with SQL queries including performance optimization techniques
Working experience with shell scripts, Oozie workflows, and scheduling tools (Stonebranch or CA7).
Able to analyze large volumes of data and understand patterns, data quality issues, and design solutions to manage volume.
Experience building applications on cloud platforms such as Azure and leveraging native services is a plus.
Experience working in an Agile environment
Ability to modularize the project components, architect the complete project and provide solution architecture.
Excellent written, verbal, and interpersonal skills are a must, as there will be significant collaboration with the business and IT.
Experience with collaborative development workflows (e.g., Microsoft DevOps Tools).
In addition to the basic qualifications noted above, we find that individuals who are successful in this role have:
Master’s or Bachelor’s degree in one of the following: Engineering, Computer Science, Statistical Analytics, Data Science, or Actuarial Science.
Big Data/Hadoop Technical Lead with extensive experience on the Hadoop platform and related Big Data tools/technologies.
10+ years of experience in implementing and managing high-performance scalable enterprise applications in the Financial Services industry.
Experience in developing data products for finance organizations, especially in the Insurance domain. This includes:
5+ years of experience in Big Data Technologies
10+ years of experience in the Data Warehouse, Business Intelligence, and analytics area
3+ years of experience in the Hadoop ecosystem
2+ years of experience in Spark
1+ years of experience with Microsoft Azure and/or AWS
Any exposure to mainframes as a data source is preferred.
Joining DXC connects you to brilliant people who embrace change and seize opportunities to advance their careers and amplify customer success. At DXC we support each other and work as a team — globally and locally. Our achievements demonstrate how we deliver excellence for our customers and colleagues. You will be joining a team that works to create a culture of learning, diversity, and inclusion and are dedicated to strong ethics and corporate citizenship.
At DXC we put our people first. In managing COVID-19, our actions are focused on the health, safety, and well-being of our colleagues and their families and our approach is to encourage and support masking, testing, and vaccination. With our Virtual-First strategy, the majority of our workforce now works remotely and will continue to do so. We recognize that requirements and availability around masking, testing, and vaccination vary by location, and we continue to monitor and conform with government regulations and customer requirements specific to each location.