Staff DevOps Engineer - Hadoop - Big Data - Federal - 3rd Shift (Nights)
Location: California City
Posted on: November 20, 2022
At ServiceNow, our technology makes the world work for everyone,
and our people make it possible. We move fast because the world
can't wait, and we innovate in ways no one else can for our
customers and communities. By joining ServiceNow, you are part of
an ambitious team of change makers who have a restless curiosity
and a drive for ingenuity. We know that your best work happens when
you live your best life and share your unique talents, so we do
everything we can to make that possible. We dream big together,
supporting each other to make our individual and collective dreams
come true. The future is ours, and it starts with you.
With more than 7,400 customers, we serve approximately 80% of the
Fortune 500, and we're proud to be one of FORTUNE's 100 Best
Companies to Work For and one of the World's Most Admired Companies.
Learn more on the Life at Now blog and hear from our employees about
their experiences working at ServiceNow.
Unsure if you meet all the qualifications of a job description but
are deeply excited about the role? We still encourage you to apply!
At ServiceNow, we are committed to creating an inclusive
environment where all voices are heard, valued, and respected. We
welcome all candidates, including individuals from non-traditional,
varied backgrounds who might not come from a typical path
connected to this role. We believe skills and experience are
transferable, and the desire to dream big makes for great
candidates.
Please Note: This position will include supporting our US Federal
customers.
This position requires passing a ServiceNow background screening,
USFedPASS (US Federal Personnel Authorization Screening Standards).
This includes a credit check, a criminal/misdemeanor check, and a
drug test. Any employment is contingent upon passing the
screening. Due to Federal requirements, only US citizens or US
naturalized citizens will be considered.
As a Staff DevOps Engineer on our Federal Team you will help
deliver 24x7 support for our Government Cloud infrastructure.
The Federal Big Data Team has 3 shifts that provide 24x7 production
support for our Big Data cloud infrastructure.
Below are some highlights.
- 4-day work week (either Sunday to Wednesday or Wednesday to
Saturday)
- No on-call rotation
- Shift Bonuses for 2nd and 3rd shifts
- This is a Night Shift position with work hours from 11 pm to 9
am Pacific Time
The Big Data team plays a critical and strategic role in ensuring
that ServiceNow can exceed the availability and performance SLAs of
customer instances powered by the ServiceNow Platform, deployed
across the ServiceNow cloud and Azure cloud. Our mission is to
deliver state-of-the-art monitoring, analytics, and actionable
business insights by employing new tools, Big Data systems, an
Enterprise Data Lake, AI, and Machine Learning methodologies that
improve efficiencies across a variety of functions in the company:
Cloud Operations, Customer Support, Product Usage Analytics, and
Product Upsell Opportunities, enabling a significant impact on both
top-line and bottom-line growth. The Big Data team is responsible
for:
- Collecting, storing, and providing real-time access to large
amounts of data
- Providing real-time analytic tools and reporting capabilities for
various functions, including:
- Monitoring, alerting, and troubleshooting
- Machine Learning, anomaly detection, and prediction of P1
incidents
- Capacity planning
- Data analytics and deriving Actionable Business Insights
What you get to do in this role:
- Deploy, monitor, maintain, and support Big Data infrastructure
and applications in ServiceNow Cloud and Azure environments.
- Architect and drive end-to-end Big Data deployment automation,
from vision through delivery: automate the Big Data foundational
modules (Cloudera CDP), prerequisite components, and applications,
leveraging Ansible, Puppet, Terraform, Jenkins, Docker, and
Kubernetes to deliver deployment automation across all ServiceNow
environments.
- Automate Continuous Integration / Continuous Deployment (CI/CD)
data pipelines for applications leveraging tools such as Jenkins,
Ansible, and Docker.
- Performance tuning and troubleshooting of various Hadoop
components and other data analytics tools in the environment: HDFS,
YARN, Hive, HBase, Spark, Kafka, RabbitMQ, Impala, Kudu, Redis,
Hue, Kerberos, Tableau, Grafana, MariaDB, and Prometheus.
- Provide production support to resolve critical Big Data pipeline
and application issues and mitigate or minimize any impact on Big
Data applications. Collaborate closely with Site Reliability
Engineering (SRE), Customer Support (CS), Development, QA, and
Systems Engineering teams to replicate complex issues, leveraging
broad experience with UI, SQL, full-stack, and Big Data
technologies.
- Enforce data governance policies in Commercial and Regulated Big
Data environments.
To be successful in this role you have:
- 6+ years of overall experience, with at least 4+ years as a Big
Data DevOps / Deployment Engineer
- Demonstrated expert-level experience in delivering end-to-end
deployment automation leveraging Puppet, Ansible, Terraform,
Jenkins, Docker, Kubernetes, or similar technologies.
- Deep understanding of the Hadoop/Big Data ecosystem. Good
knowledge of querying and analyzing large amounts of data on Hadoop
HDFS using Hive and Spark Streaming, and of working with systems
such as HDFS, YARN, Hive, HBase, Spark, Kafka, RabbitMQ, Impala,
Kudu, Redis, Hue, Tableau, Grafana, MariaDB, and Prometheus.
- Experience securing the Hadoop stack with Sentry, Ranger, and
LDAP
- Experience supporting CI/CD pipelines on Cloudera in native
cloud and Azure/AWS environments
- Good knowledge of Perl, Python, Bash, Groovy, and Java.
- In-depth knowledge of Linux internals (CentOS 7.x) and shell
scripting
- Ability to learn quickly in a fast-paced, dynamic team
environment
ServiceNow is an Equal Employment Opportunity Employer. All
qualified applicants will receive consideration for employment
without regard to race, color, creed, religion, sex, sexual
orientation, national origin or nationality, ancestry, age,
disability, gender identity or expression, marital status, veteran
status or any other category protected by law.
At ServiceNow, we lead with flexibility and trust in our
distributed world of work. Click here to learn about our work
personas: flexible, remote and required-in-office.
All new employees hired in the United States are required to be
fully vaccinated against COVID-19, subject to such exceptions as
required by law. If hired, you will be required to submit proof of
full vaccination or have an approved accommodation, by your start
date. Visit our Candidate FAQ page to learn more.
If you require a reasonable accommodation to complete any part of
the application process, or are limited in the ability or unable to
access or use this online application process and need an
alternative method for applying, you may contact us at
email@example.com for assistance.
For positions requiring access to technical data subject to export
control regulations, including Export Administration Regulations
(EAR), ServiceNow may have to obtain export licensing approval from
the U.S. Government for certain individuals. All employment is
contingent upon ServiceNow obtaining any export license or other
approval that may be required by the U.S. Government.
Please Note: Fraudulent job postings/job scams are increasingly
common. Click here to learn what to watch out for and how to
protect yourself. All genuine ServiceNow job postings can be found
through the ServiceNow Careers site .