Verizon is a leading provider of technology, communications, information and entertainment products, transforming the way we connect across the globe. We’re a diverse network of people driven by our ambition and united in our shared purpose to shape a better future. Here, we have the ability to learn and grow at the speed of technology, and the space to create within every role. Together, we are moving the world forward – and you can too. Dream it. Build it. Do it here.
What you’ll be doing...
As a Manager of Data Engineering in the Artificial Intelligence and Data Organization (AI&D), you will drive activities including data engineering, data operations automation, and data frameworks and platforms to improve the efficiency, customer experience and profitability of the company.
You will analyze marketing, customer experience and digital operations environments to build data pipelines and transform data into actionable intelligence. You will turn raw data into usable data pipelines and build data tools and products for automation and easy data accessibility. At Verizon, we are on a journey to industrialize our data science and AI capabilities. Very simply, this means that AI will fuel all decisions and business processes across the company. With our leadership in bringing the 5G network nationwide, the opportunity for AI will only grow exponentially, from enabling billions of predictions to possibly trillions of automated, real-time predictions.
Managing a real-time streaming data and model insights platform that combines customer journey and Verizon internal data.
Building a high-quality real-time streaming and data engineering team that continues to drive customer experience across our systems of engagement.
Designing, building, and launching new data engineering solutions and data pipelines based on customer interactions with Verizon digital interfaces.
Building cross-channel customer journey analytics using clickstream and other customer interaction data.
Building real-time data products that can be used for feature engineering and scoring in ML models.
Ensuring a clear focus on reusability, scalability and optimization in every architecture and implementation delivered by your team.
Driving data quality across all data pipelines and related business areas.
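To make the responsibilities above concrete, here is a minimal, illustrative sketch of the kind of real-time feature a streaming pipeline might produce: a tumbling-window click count per customer over clickstream events. The event schema (`timestamp, customer_id, page`) and function name are hypothetical; a production pipeline would run this logic on a streaming platform such as Flink or Kafka Streams rather than in plain Python.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Aggregate per-customer click counts into fixed (tumbling) windows.

    `events` is an iterable of (timestamp_seconds, customer_id, page)
    tuples -- a hypothetical clickstream schema used only for illustration.
    Returns {window_start: {customer_id: click_count}}.
    """
    windows = defaultdict(lambda: defaultdict(int))
    for ts, customer, _page in events:
        # Assign each event to the window containing its timestamp.
        window_start = ts - (ts % window_seconds)
        windows[window_start][customer] += 1
    return {w: dict(counts) for w, counts in windows.items()}

clicks = [
    (0, "cust-1", "/home"),
    (10, "cust-1", "/plans"),
    (65, "cust-2", "/support"),
]
print(tumbling_window_counts(clicks))
# {0: {'cust-1': 2}, 60: {'cust-2': 1}}
```

Windowed aggregates like this are a common building block for the feature-engineering and model-scoring products described above.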
What we’re looking for...
You’ll need to have:
Bachelor’s degree in Computer Science or four or more years of work experience.
Six or more years of relevant work experience.
Two or more years of experience in stakeholder and vendor management.
Two or more years of experience in high-performance, web-scale and real-time response systems.
Experience working with cross-functional teams and projects in the space of real-time streaming data analytics.
Experience leveraging NoSQL databases and low-latency caches like Redis.
Experience designing, building, and deploying production-level real-time event processing data pipelines using streaming platforms (IBM Streams, Flink, Kafka, etc.) and programming in Java/Scala/Python.
Experience with cross-team collaboration, interpersonal skills/relationship building, and management.
Experience leveraging and managing CI/CD toolchain products like Jira, GitLab, Artifactory and Jenkins.
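The low-latency caching requirement above typically means serving precomputed features with Redis-style set-with-expiry semantics. The following is a minimal in-process stand-in for that pattern, for illustration only; in production, redis-py's `r.set(key, value, ex=ttl)` and `r.get(key)` against a Redis cluster would back this, and the key naming is a hypothetical convention.

```python
import time

class TTLCache:
    """In-process stand-in for the Redis set-with-expiry / get pattern
    (redis-py: r.set(key, value, ex=ttl) / r.get(key))."""

    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def set(self, key, value, ex):
        # Store the value with an absolute expiry deadline, `ex` seconds out.
        self._store[key] = (value, time.monotonic() + ex)

    def get(self, key):
        item = self._store.get(key)
        if item is None:
            return None
        value, expires_at = item
        if time.monotonic() >= expires_at:
            # Lazily evict expired entries on read.
            del self._store[key]
            return None
        return value

cache = TTLCache()
cache.set("features:cust-1", {"clicks_last_hour": 12}, ex=300)
print(cache.get("features:cust-1"))
# {'clicks_last_hour': 12}
```

TTL-based eviction keeps feature reads fast while bounding staleness, which is why caches like Redis sit between streaming pipelines and model-scoring services.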
Even better if you have one or more of the following:
A Master’s degree in Computer Science, Information Systems or a related technical discipline.
Strong analytical ability to quickly debug application problems and provide short- and long-term solutions.
A cloud Big Data analytics certification.
Knowledge of telecom architecture.
Ability to communicate effectively through presentation, interpersonal, verbal and written skills.