Move the World Forward Together

When you join Verizon

Verizon is a leading provider of technology, communications, information and entertainment products, transforming the way we connect across the globe. We’re a diverse network of people driven by our ambition and united in our shared purpose to shape a better future. Here, we have the ability to learn and grow at the speed of technology, and the space to create within every role. Together, we are moving the world forward – and you can too. Dream it. Build it. Do it here.

What you’ll be doing...

  • Troubleshooting and diagnosing user and system issues in business applications.
  • Modifying data or making recommendations on data modifications in live and mission-critical systems.
  • Executing data loads and/or monitoring automated processes that load data into systems (see the sketch after this list).
  • Working with internal and external users.
  • Enforcing incident management and escalation procedures for production systems.
  • Applying deep knowledge of role-based access management for authorization within the big data ecosystem, at scale.
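
For a sense of the day-to-day, here is a minimal sketch of the kind of data-load monitoring this role involves. The log path, log format, and alerting step are all hypothetical, for illustration only; real monitoring would hook into the team's actual incident-management tooling.

```python
# A minimal sketch of data-load monitoring. Names (LOAD_LOG_PATH, the log
# format) are hypothetical, for illustration only.
import re
from pathlib import Path

LOAD_LOG_PATH = Path("/var/log/dataloads/load_status.log")  # hypothetical path

# Assumed log line format: "<timestamp> <job_name> <SUCCESS|FAILURE> rows=<n>"
LINE_RE = re.compile(
    r"^(?P<ts>\S+)\s+(?P<job>\S+)\s+(?P<status>SUCCESS|FAILURE)\s+rows=(?P<rows>\d+)$"
)

def failed_loads(log_path: Path) -> list[dict]:
    """Return a parsed record for every failed load in the log."""
    failures = []
    for line in log_path.read_text().splitlines():
        match = LINE_RE.match(line)
        if match and match.group("status") == "FAILURE":
            failures.append(match.groupdict())
    return failures

if __name__ == "__main__":
    for record in failed_loads(LOAD_LOG_PATH):
        # In production this would feed the incident-management process
        # rather than just printing.
        print(f"ALERT: load {record['job']} failed at {record['ts']}")
```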

What we’re looking for...

You are curious about new technologies and the possibilities they create. You enjoy the challenge of supporting applications while exploring ways to improve the technology. You are driven and motivated, with good communication and support skills. You're a sought-after team member who thrives in a dynamic work environment. You have a thirst for working in cloud and Docker environments, and the drive to change the status quo.
You'll need to have:

  • Bachelor's degree or four or more years of work experience.
  • Four or more years of relevant work experience in Teradata Administration.
  • Experience writing complex SQL queries (see the sketch after this list).
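
As an illustration of what "complex SQL" can mean here, the sketch below uses a window function to pick the latest load run per job. sqlite3 serves as a self-contained stand-in; on Teradata the same pattern is often expressed with QUALIFY. The table and column names are hypothetical.

```python
# Window-function example: latest run per job. sqlite3 stands in for
# Teradata so the snippet runs anywhere; names are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE load_runs (job_name TEXT, run_ts TEXT, row_count INTEGER);
    INSERT INTO load_runs VALUES
        ('billing',  '2023-01-01 02:00', 10500),
        ('billing',  '2023-01-02 02:00', 10710),
        ('ordering', '2023-01-02 03:00',  4200);
""")

latest_runs = con.execute("""
    SELECT job_name, run_ts, row_count
    FROM (
        SELECT job_name, run_ts, row_count,
               ROW_NUMBER() OVER (PARTITION BY job_name
                                  ORDER BY run_ts DESC) AS rn
        FROM load_runs
    )
    WHERE rn = 1
""").fetchall()

print(latest_runs)  # one (job_name, run_ts, row_count) row per job
```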

Even better if you have one or more of the following:

  • Good knowledge of shell programming, Python, Perl, JSON, or other scripting or programming languages.
  • Proficiency in Linux shell, sed, and awk scripting (a Python equivalent of this style of text processing is sketched after this list).
  • Experience installing, configuring, and supporting big data ecosystem components.
  • Knowledge of Storm, Spark, Kafka, and HBase.
  • Good knowledge of HFiles, region server management, HBase memory allocation, column families, and coprocessors.
  • Kerberos and Knox integration experience.
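
As a small illustration of the sed/awk-style text processing mentioned above, here is a Python equivalent that tallies status codes from a hypothetical access log; the log format and field positions are assumptions made for the example.

```python
# awk-like field extraction in Python: count HTTP status codes in lines
# such as '10.0.0.1 - - [01/Jan/2023] "GET /api" 200 512' (format assumed).
import re
import sys
from collections import Counter

STATUS_RE = re.compile(r'"\S+ \S+" (\d{3}) \d+')

def status_counts(lines):
    """Tally the status-code field across all matching lines."""
    counts = Counter()
    for line in lines:
        match = STATUS_RE.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts

if __name__ == "__main__":
    # Usage: python status_counts.py < access.log
    for status, count in sorted(status_counts(sys.stdin).items()):
        print(f"{status}\t{count}")
```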