When you join Verizon

Verizon is a leading provider of technology, communications, information and entertainment products, transforming the way we connect across the globe. We’re a diverse network of people driven by our ambition and united in our shared purpose to shape a better future. Here, we have the ability to learn and grow at the speed of technology, and the space to create within every role. Together, we are moving the world forward – and you can too. Dream it. Build it. Do it here.

What you’ll be doing...

As part of the Artificial Intelligence and Data Organization (AI&D), you will drive activities spanning data engineering, data operations automation, and data frameworks and platforms to improve the efficiency, customer experience, and profitability of the company. You will analyze marketing, customer experience, and digital operations environments to build data pipelines and transform data into actionable intelligence. You will build pipelines that turn raw data into usable form, along with data tools and products that automate effort and make data easily accessible.
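
To make the day-to-day concrete, here is a minimal sketch of the kind of raw-to-curated pipeline this role builds, written in PySpark (one of the Scala/Python options named below). The paths, column names, and cleansing rules are illustrative assumptions, not an actual Verizon pipeline.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("raw_to_curated").getOrCreate()

    # Hypothetical raw customer-interaction events landed as JSON.
    raw = spark.read.json("hdfs:///data/raw/interactions/")

    curated = (
        raw
        .filter(F.col("customer_id").isNotNull())      # drop unusable records
        .withColumn("event_ts", F.to_timestamp("event_time"))
        .withColumn("event_date", F.to_date("event_ts"))
        .dropDuplicates(["customer_id", "event_ts"])   # keep reruns idempotent
    )

    # Partitioned Parquet keeps the curated layer cheap for downstream analysts to query.
    (curated.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("hdfs:///data/curated/interactions/"))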

At Verizon, we are on a multi-year journey to industrialize our data science and AI capabilities. Very simply, this means that AI will fuel all decisions and business processes across the company. With our leadership in bringing 5G nationwide, the opportunity for AI will only grow as we move from enabling billions of predictions to potentially trillions of automated, real-time predictions. In this role, you'll:

  • Architect, design, and develop enterprise and real-time data platforms that enable a combination of third-party and Verizon internal data to generate insights.
  • Gather requirements, assess gaps, and build roadmaps and architectures to help the analytics-driven organization achieve its goals.
  • Partner with development teams working on vertical applications that use the platform core services.
  • Develop platform core services that make vertical application development faster and consistent with the overall design of the entire ecosystem.
  • Work closely with Data Analysts to ensure data quality and availability for analytical modeling.
  • Drive the overall test data management strategy and platform architecture.
  • Collaborate in cross-functional teams to source new data, develop schema requirements, and maintain metadata.
  • Identify ways to improve data reliability, efficiency and quality.
  • Use data to discover tasks that can be automated.
  • Support the foundational cloud data requirements.

What we’re looking for...

You’ll need to have:

  • Bachelor’s degree or four or more years of work experience.
  • Six or more years of relevant work experience.
  • Four or more years of relevant work experience with public cloud data lakes, data warehouses, and analytics services.
  • Four or more years of relevant database experience with Teradata SQL and NoSQL.
  • Experience designing, building, and deploying production-level data pipelines using tools from the Hadoop stack (HDFS, Hive, Spark, HBase, Kafka, Oozie, etc.) and programming in Scala/Python (a minimal streaming example follows this list).
  • Experience with streaming technologies such as Spark, IBM Streams, Flink, etc.
  • Experience with messaging technologies such as Kafka, Pulsar, RabbitMQ, etc.
  • Experience creating monitoring, logging, and tracing for data platforms using open-source technologies.
  • Experience leveraging and managing CI/CD toolchain products such as Jira, Stash, Git, Bitbucket, Artifactory, and Jenkins in data engineering projects.
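
As a reference point for the pipeline, streaming, and messaging requirements above, the sketch below shows a minimal Spark Structured Streaming job that consumes a Kafka topic. The broker address, topic name, and event schema are hypothetical, and running it requires the spark-sql-kafka connector package on the classpath.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import StructType, StringType, DoubleType

    spark = SparkSession.builder.appName("kafka_stream_demo").getOrCreate()

    # Hypothetical schema for usage events published to Kafka.
    schema = (StructType()
              .add("device_id", StringType())
              .add("metric", StringType())
              .add("value", DoubleType()))

    # Read from a Kafka topic; broker and topic names are placeholders.
    events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "broker:9092")
              .option("subscribe", "usage-events")
              .load()
              .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
              .select("e.*"))

    # Write the parsed stream to the console; the checkpoint keeps the job fault-tolerant.
    query = (events.writeStream
             .format("console")
             .option("checkpointLocation", "/tmp/chk/usage-events")
             .start())
    query.awaitTermination()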

Even better if you have one or more of the following:

  • Ten or more years of relevant work experience in the big data space with technologies such as Hadoop, Spark, Hive, Kafka, Oozie, ELK, Ranger, Atlas, Presto, etc.
  • Big Data Analytics Certification in Public Cloud (GCP or AWS).
  • Knowledge of telecom architecture.
  • Two or more years of experience with high-performance, web-scale, real-time response systems.
  • Experience with cloud technologies such as GCP, Docker, and Kubernetes, and with data engineering migration programs from on-premises to cloud big data platforms.
  • Ability to communicate effectively, with strong presentation, interpersonal, verbal, and written skills.
  • Proven track record of driving a development team and delivering products on time using Agile.
  • Familiarity with fault-tolerant deployments, DevOps, and canary and A/B testing deployment strategies.
  • Knowledge of AI/ML and associated platforms.
  • Dashboard development experience in Tableau, Qlik, and/or Looker.
  • Experience with digital twin and simulation platforms.