Overview

Duration: 6-month contract-to-hire
Location: Detroit, MI (onsite)
**W2 candidates only
No OPT/CPT candidates or third parties

Looking for a Cloud Big Data Infrastructure Architect
**Need a Platform Architect, not a Data or Analytics Architect. Knowledge of and experience with the tools and technologies, and how they work together in a Hadoop ecosystem on the cloud, is critical
The Cloud Big Data Infrastructure Architect must have:
-Strong technical knowledge of JIRA, Hadoop, Hive, Impala, Hue, HBase, Scala, GitHub, and Jenkins
-Hadoop and Spark experience doesn't have to be hands-on, but the candidate should be able to come up with their own design patterns (development, data pipelines, DevOps, CI/CD)
-Infrastructure, firewall, networking, and AWS experience
-Must have experience with Hadoop on AWS
-Must have Cloudera Stack experience
Description:
Design Hadoop deployment architectures (with features such as high availability, scalability, process isolation, load-balancing, workload scheduling, etc.).
Publish and enforce Hadoop best practices, configuration recommendations, usage design/patterns, and cookbooks to developer community.
Engineer process automation integrations.
Perform security and compliance assessment for all Hadoop products.
Contribute to application deployment framework (requirements gathering, project planning, etc.).
Evaluate capacity for onboarding new applications into a large-scale Hadoop cluster.
Provide Hadoop subject-matter expertise and Level-3 technical support for troubleshooting.

Qualifications:
8+ years overall IT experience.
2+ years of experience with Big Data solutions and techniques.
2+ years Hadoop application infrastructure engineering and development methodology background.
Experience with Cloudera distribution (CDH) and Cloudera Manager is preferred.
Experienced in Cloudera, particularly Hue, Impala, HDFS, Hive, HBase, Oozie, Spark, and YARN (very much in this order).
Experience installing, troubleshooting, and tuning the Hadoop ecosystem.
Experience with cloud (Azure/AWS) solutions.
Experience with full Hadoop SDLC deployments with associated administration and maintenance functions.
Experience developing Hadoop integrations for data ingestion, data mapping and data processing capabilities.
Experience with designing application solutions that make use of enterprise infrastructure components such as storage, load-balancers, 3-DNS, LAN/WAN, and DNS.
Experience with concepts such as high-availability, redundant system design, disaster recovery and seamless failover.
Overall knowledge of Big Data technology trends, Big Data vendors and products.
Good interpersonal skills and excellent communication skills – written and spoken English.
Able to work on client projects in cross-functional teams.
Good team player, interested in sharing knowledge and cross-training other team members, with an interest in learning new technologies and products.
