What you’ll be doing...

This role involves working in a complex, multi-functional, Agile team environment with other data scientists, engineers, and analysts. The DevOps Engineer is involved in many aspects of a customer engagement, from collaborating with other team members and customers to ensuring that delivery of VTAC feed packages is handled smoothly.

In this role, you'll be responsible for:

  • Design effective monitoring and alerting (for conditions such as application errors or high memory usage) and log aggregation approaches (to quickly access logs for troubleshooting, or to generate reports for trend analysis); a minimal sketch follows this list.
  • Write code and scripts to automate and configure services, using tools and languages such as Python, Bash, Git, and service APIs; a second sketch follows this list.
  • Maintain and manage all virtual and physical servers.
  • Document and diagram deployment-specific aspects of architectures and environments.
  • Work to ensure system and data security is maintained at a high standard, ensuring that the confidentiality, integrity, and availability of Navigating Cancer's applications are not compromised.
  • Provide proactive event monitoring for targeted threats and malicious activity using security tools including, but not limited to, Splunk, ThreatConnect, RiskVision, and RiskIQ.
  • Continually review and recommend improvements to operational processes and procedures.
  • Implement and manage standard tools and services for automated testing in each environment.
  • Solve ongoing issues with operating the cluster.
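
For illustration, here is a minimal Python sketch of the monitoring and alerting work described in the first item: it checks host memory usage and scans an application log for errors, posting an alert when a threshold is crossed. The threshold, log path, and webhook URL are hypothetical placeholders, not part of any actual Verizon tooling.

```python
# Minimal monitoring sketch: alert on high memory usage or application errors.
# The threshold, log path, and webhook URL below are hypothetical examples.
import json
import urllib.request

MEMORY_ALERT_THRESHOLD = 0.90          # alert above 90% memory in use
LOG_PATH = "/var/log/app/app.log"      # hypothetical application log
ALERT_WEBHOOK = "https://alerts.example.com/hook"  # hypothetical endpoint

def memory_usage_fraction() -> float:
    """Read /proc/meminfo (Linux) and return the fraction of memory in use."""
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            info[key] = int(value.split()[0])  # values are reported in kB
    return 1.0 - info["MemAvailable"] / info["MemTotal"]

def recent_error_lines(path: str, limit: int = 5) -> list[str]:
    """Return up to `limit` of the most recent lines containing 'ERROR'."""
    with open(path) as f:
        errors = [line.rstrip() for line in f if "ERROR" in line]
    return errors[-limit:]

def send_alert(message: str) -> None:
    """POST a JSON alert payload to the (hypothetical) webhook."""
    body = json.dumps({"alert": message}).encode()
    req = urllib.request.Request(
        ALERT_WEBHOOK, data=body, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req)

if __name__ == "__main__":
    usage = memory_usage_fraction()
    if usage > MEMORY_ALERT_THRESHOLD:
        send_alert(f"High memory usage: {usage:.0%}")
    for line in recent_error_lines(LOG_PATH):
        send_alert(f"Application error: {line}")
```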
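
Likewise, a short sketch of the kind of service-configuration automation the second item describes, driving a REST API from Python's standard library. The endpoint, token variable, service name, and payload are all hypothetical assumptions, not a real API.

```python
# Sketch of automating service configuration through a REST API.
# The endpoint, token, service name, and payload are hypothetical placeholders.
import json
import os
import urllib.request

API_BASE = "https://config.example.com/api/v1"   # hypothetical service API
TOKEN = os.environ.get("CONFIG_API_TOKEN", "")   # credentials from environment

def update_service_config(service: str, settings: dict) -> dict:
    """PUT new settings for a service and return the API's JSON response."""
    req = urllib.request.Request(
        f"{API_BASE}/services/{service}/config",
        data=json.dumps(settings).encode(),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Example: raise the log level on a hypothetical 'feed-processor' service.
    result = update_service_config("feed-processor", {"log_level": "DEBUG"})
    print(result)
```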

What we’re looking for...

You’ll need to have:

  • Bachelor’s degree or four or more years of work experience.
  • Four or more years of relevant work experience.
  • Six or more years of experience developing with Hadoop and Spark (see the sketch after this list).
  • Strong understanding of APIs.
  • Experience with data visualization tools such as SSRS, Power BI, and Tableau.
  • Experience working with different query languages (e.g., PL/SQL, T-SQL).
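
To give a flavor of the Hadoop and Spark development called for above, here is a minimal PySpark sketch that reads a feed from HDFS and aggregates it per source. The HDFS path and column names are hypothetical examples.

```python
# Minimal PySpark sketch: read records from HDFS and aggregate them per source.
# The HDFS path and column names are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("feed-aggregation").getOrCreate()

# Read a (hypothetical) CSV feed from HDFS with a header row.
feeds = spark.read.csv("hdfs:///data/feeds/*.csv", header=True, inferSchema=True)

# Count records and sum a numeric column per source.
summary = (
    feeds.groupBy("source")
         .agg(F.count("*").alias("records"), F.sum("bytes").alias("total_bytes"))
         .orderBy(F.desc("records"))
)

summary.show()
spark.stop()
```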

Even better if you have:

  • A degree in Computer Science or related field.
  • Strong experience in any database technology.
  • Python development experience.
  • Proficient understanding of distributed computing principles.
  • Understanding of big data technologies and the surrounding ecosystem, such as Hadoop, Spark, and MapReduce.
  • Experience with integration of data from multiple data sources.
  • Understanding of and experience working with cloud infrastructure services such as Azure, Amazon Web Services, and Google Cloud (Azure preferred).
  • Experience working with code repositories and continuous integration tools (e.g., Git, Jenkins).
  • Understanding of development and project methodologies.

#ProfessionalServices; 22CyberVES; 22CyberOPS

When you join Verizon...

You’ll have the power to go beyond – doing the work that’s transforming how people, businesses and things connect with each other. Not only do we provide the fastest and most reliable network for our customers, but we were first to 5G - a quantum leap in connectivity. Our connected solutions are making communities stronger and enabling energy efficiency. Here, you’ll have the ability to make an impact and create positive change. Whether you think in code, words, pictures or numbers, join our team of the best and brightest. We offer great pay, amazing benefits, and the opportunity to learn and grow in every role. Together we’ll go far.

Equal Employment Opportunity

We're proud to be an equal opportunity employer and celebrate our employees' differences, including race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, and Veteran status. Different makes us better.