Hortonworks Administrator

Location: Herndon, VA, United States
Date Posted: 11-06-2018
Job Title: Hortonworks Administrator
Location: US-VA-Herndon
Contract-to-Hire Position
 
Overview:
This position supports the General Services Administration (GSA) CAMEO program. The Karsun team is responsible for the development, maintenance, and enhancement (DM&E) and operation of selected GSA Federal Acquisition IT Systems. All employees must be able to pass a Federal Suitability Check for a position of public trust. The GSA CAMEO program supports the operations of multiple business applications, as well as the development of new applications across different technologies. The Karsun software development team is responsible for the software design and implementation of web applications supporting multiple Business Lines within GSA. Successful candidates are modern web development specialists experienced in translating business requirements into software architecture. In addition to strong software development skills, ideal candidates have demonstrated experience working on an Agile Scrum team. Position location is in Herndon, VA.
 
Within this program, the Karsun team is also responsible for the operations, maintenance, and modernization of the middleware and database environment. Successful candidates have previous Hortonworks or similar product experience.
 
A successful candidate should be able to administer a Big Data framework such as the Hortonworks platform; lead and document scope, tasks, risks, dependencies, and mitigation plans; create and update support documentation and review work products; and establish and continuously enhance SOPs for all aspects of platform operations for a federal customer at GSA.
 
Responsibilities:
  • Administer the Hortonworks Data Platform (HDP)
  • Work with business and technology client representatives to gather functional and technical requirements
  • Analyze requirements and provide leading practices in the design of the solution
  • Create data models
  • Install and configure HDP tools to meet business needs
  • Participate in client-facing meetings and document key decisions and action items
  • Serve as an SME on HDP and data governance
  • Keep informed of the latest technology trends and innovations, especially in the areas of data integration, master data management, data management platforms, digital asset management, and web content management
 
Qualifications:
Required Skills:
  • 5+ years of experience ingesting data into Hadoop from a variety of sources such as ERP, CRM, NoSQL, and transactional data systems
  • 2+ years of hands-on experience with the Hortonworks Data Platform - Hadoop v2, Apache NiFi/Hortonworks DataFlow, Spark ecosystem, MapReduce
  • Experience monitoring performance and implementing infrastructure changes needed to meet performance goals
  • 5+ years of experience building and operationalizing Hadoop-based data lakes
  • 5+ years with Big Data querying tools such as Pig, Hive, and Drill
  • 2+ years of experience building stream-processing systems using solutions such as Storm, Spark Streaming, or HDF
  • Experience in implementing Big Data ML toolkits, such as Amazon ML, SparkML, or H2O
  • Knowledge of ETL tools for Hadoop such as Pentaho Data Integration, Apache NiFi, and SAP Data Services
  • Experience with code, build, and deployment tools such as Git, SVN, Maven, and Jenkins
  • Ability to work through ambiguity and maintain focus on delivery with minimal supervision
  • Ability to deliver under challenging deadlines and constraints
Desired Skills:
  • Excellent communication skills with experience in business requirements definition and creating clear documentation
  • Experience in streaming analytics and data integration is highly desirable
  • Experience with various messaging systems, such as Kafka
  • Experience with ETL tools such as Apache NiFi and Pentaho PDI is desirable
  • Basic demonstrable Linux skills, including the ability to write simple shell scripts
  • Python scripting skills are highly advantageous
  • Self-starter with substantial motivation and ability to learn new tools
  • Experience with the Hadoop ecosystem and related analytics tools such as Spark, Zeppelin, and Hue is highly desirable
  • Experience with data management: ETL, MapReduce, HDF, StreamSets
  • A methodical approach to problem solving and strong hands-on coding skills are expected
  • Extensive knowledge of the SDLC with working experience in Agile methodologies
  • 1+ years of working experience with AWS
Qualifications (Education/Experience):
  • Bachelor's degree in Computer Science or related discipline
  • Must be able to obtain a Public Trust (Moderate level) 