Tuesday, December 11, 2018

[USAJobsClub] Cognizant Recruitment Drive for HCSC Big Data Needs in Chicago, IL and Richardson, TX. Passport information needed for consultants with less than 10 years' experience

We are planning a recruitment drive on 12/15/18 for Health Care Service Corporation (HCSC). It will be held at two locations: Chicago, IL and Richardson, TX. Below are the JDs for the various roles for this client. Could you please start lining up profiles?
Please find the updated JDs and rates for the Developer, Sr. Developer, Dev Lead, and Architect roles at HCSC.

BigData Developer ($60/hr)
Job Purpose: This position is responsible for developing, integrating, testing, and maintaining existing and new applications; it requires proficiency in one or more programming languages and familiarity with one or more development methodologies / delivery models.
Required Job Qualifications: 
•    Must-have qualifications:
o    Bachelor's degree and 4 years of Information Technology experience, OR technical certification and/or college courses and 6 years of Information Technology experience, OR 8 years of Information Technology experience.
o    Hands-on experience developing and maintaining software solutions on a Hadoop cluster.
o    Strong experience with UNIX shell scripting, Sqoop, Eclipse, HCatalog, Pig scripts, HiveQL, and UDFs.
o    Hands-on experience with Spark and Scala (a short Spark-on-Hive sketch follows this job description).
o    Experience with Kafka.
o    Familiarity with Hadoop best practices, troubleshooting, and performance tuning.
o    Experience with change management / DevOps tools (GitHub, Jenkins, etc.).
o    Familiarity with SDLC methodologies (Agile / Scrum / iterative development).
•    Nice-to-have qualifications:
o    Working experience developing MapReduce programs that run on a Hadoop cluster, using Java/Python.
o    Experience with NoSQL databases such as HBase, MongoDB, or Cassandra.
o    Experience using Talend with Hadoop technologies.
o    Working experience with data warehousing and Business Intelligence systems.
o    Business requirements management and systems change / configuration management; familiarity with JIRA.
o    Experience with ZENA (or any other job-scheduling tool).
o    Healthcare experience.
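As a rough illustration of the Spark, Scala, and Hive work this role involves, here is a minimal Spark-on-Hive sketch in Scala. The table and column names (claims_raw, member_id, paid_amount, claims_totals) are hypothetical placeholders, not an actual HCSC schema.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Minimal sketch: read a Hive table, aggregate it, and write the result
// back to Hive. All table and column names are hypothetical.
object ClaimsRollup {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ClaimsRollup")
      .enableHiveSupport()   // let Spark SQL read and write Hive tables
      .getOrCreate()

    val claims = spark.table("claims_raw")

    val totals = claims
      .filter(col("paid_amount") > 0)
      .groupBy(col("member_id"))
      .agg(sum("paid_amount").as("total_paid"))

    totals.write.mode("overwrite").saveAsTable("claims_totals")
    spark.stop()
  }
}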
BigData Sr. Developer ($70/hr)
Job Purpose: This position is responsible for developing, integrating, testing, and maintaining existing and new applications; it requires proficiency in one or more programming languages and familiarity with one or more development methodologies / delivery models.
Required Job Qualifications: 
•    Must-have qualifications:
o    Bachelor's degree and 5 years of Information Technology experience, OR technical certification and/or college courses and 7 years of Information Technology experience, OR 9 years of Information Technology experience.
o    Extensive hands-on experience designing, developing, and maintaining software solutions on a Hadoop cluster.
o    Strong experience with UNIX shell scripting, Sqoop, Eclipse, HCatalog, Pig scripts, HiveQL, and UDFs (design and development).
o    Hands-on experience with Spark and Scala (design and development).
o    Experience with Kafka (a streaming-ingestion sketch follows this job description).
o    Good knowledge of Hadoop best practices, troubleshooting, and performance tuning; demonstrates broad knowledge of technical solutions, design patterns, and code for medium/complex applications deployed in Hadoop production.
o    Experience with change management / DevOps tools (GitHub, Jenkins, etc.).
o    Participates in design reviews, code reviews, unit testing, and integration testing.
o    Familiarity with SDLC methodologies (Agile / Scrum / iterative development).
•    Nice-to-have qualifications:
o    Working experience developing MapReduce programs that run on a Hadoop cluster, using Java/Python.
o    Experience with NoSQL databases such as HBase, MongoDB, or Cassandra.
o    Experience using Talend with Hadoop technologies.
o    Working experience with data warehousing and Business Intelligence systems.
o    Business requirements management and systems change / configuration management; familiarity with JIRA.
o    Experience with ZENA (or any other job-scheduling tool).
o    Healthcare experience.
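Since this role pairs Kafka with Spark, here is a minimal Structured Streaming sketch in Scala that consumes a Kafka topic and lands the records on HDFS as Parquet. The broker address, topic name, and paths are hypothetical placeholders, and the spark-sql-kafka connector must be on the classpath.

import org.apache.spark.sql.SparkSession

// Minimal sketch: stream a Kafka topic into HDFS as Parquet files.
// Broker, topic, and paths are hypothetical placeholders.
object KafkaIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("KafkaIngest")
      .getOrCreate()

    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092")
      .option("subscribe", "member-events")
      .load()
      .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")

    val query = events.writeStream
      .format("parquet")
      .option("path", "hdfs:///data/landing/member_events")
      .option("checkpointLocation", "hdfs:///chk/member_events")
      .start()

    query.awaitTermination()
  }
}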
BigData Lead ($80/hr)
Job Purpose: This position is responsible for developing, integrating, testing, and maintaining existing and new applications; it requires proficiency in one or more programming languages and familiarity with one or more development methodologies / delivery models.
Required Job Qualifications: 
•    Must-have qualifications:
o    10-15 years of IT experience, with at least 3-5 years of Big Data project experience.
o    Strong technical knowledge of the Hadoop ecosystem.
o    Extensive hands-on experience designing, developing, and maintaining software solutions on a Hadoop cluster.
o    Strong experience with UNIX shell scripting, Sqoop, Eclipse, HCatalog, Pig scripts, HiveQL, and UDFs (design and development; a short UDF sketch follows this job description).
o    Hands-on experience with Spark and Scala (design and development).
o    Experience with Kafka.
o    Demonstrates strong knowledge of Hadoop best practices, troubleshooting, and performance tuning; demonstrates broad knowledge of technical solutions, design patterns, and code for medium/complex applications deployed in Hadoop production.
o    Experience with change management / DevOps tools (GitHub, Jenkins, etc.).
o    Participates in design reviews, code reviews, unit testing, and integration testing.
o    Experience with SDLC methodologies (Agile / Scrum / iterative development).
o    Must assume ownership of and accountability for the assigned deliverables, both individually and for a small (3-5 member) team.
o    Mentors/leads a team and provides technical guidance.
o    Good communication skills; will need to communicate with client IT PMs, client leads, and architects.
•    Nice-to-have qualifications:
o    Working experience developing MapReduce programs that run on a Hadoop cluster, using Java/Python.
o    Experience with NoSQL databases such as HBase, MongoDB, or Cassandra.
o    Experience using Talend with Hadoop technologies.
o    Working experience with data warehousing and Business Intelligence systems.
o    Business requirements management and systems change / configuration management; familiarity with JIRA.
o    Experience with ZENA (or any other job-scheduling tool).
o    Healthcare experience.
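Because the lead role reviews UDF design and development, here is a minimal Scala sketch of registering a Spark SQL UDF and calling it from a HiveQL query. The masking rule, function name, and table/column names are hypothetical.

import org.apache.spark.sql.SparkSession

// Minimal sketch: register a Scala function as a Spark SQL UDF and use
// it from a HiveQL query. mask_id, member_id, and claims_raw are
// hypothetical names.
object MaskUdfDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("MaskUdfDemo")
      .enableHiveSupport()
      .getOrCreate()

    // Keep the last four characters of an identifier, mask the rest.
    spark.udf.register("mask_id", (id: String) =>
      if (id == null || id.length <= 4) id
      else "*" * (id.length - 4) + id.takeRight(4))

    spark.sql("SELECT mask_id(member_id) AS masked_id FROM claims_raw")
      .show(5)

    spark.stop()
  }
}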

BigData Architect (Application Architect at HCSC) ($90/hr)
Job Purpose: This position is responsible for ensuring that project-specific designs align with the application architecture roadmap; supporting project and/or product teams in functional and technical design activities; defining the interaction between application packages, databases, and middleware systems to optimize product functional coverage; and acting as subject matter expert (SME) for products that require integration into the application.
Required Job Qualifications: 
•    Must-have qualifications:
o    15+ years of IT experience, with at least 5-7 years of Big Data project experience.
o    Strong technical knowledge across the Hadoop ecosystem (HDFS, Pig, MapReduce, Spark, HBase, Hive, HCatalog, Phoenix, Flume, Atlas, Ranger).
o    Knowledge of Spark and Scala: architecture, design, troubleshooting, and performance tuning.
o    Knowledge of Kafka and/or other streaming technologies.
o    Experience with NoSQL databases such as HBase, MongoDB, or Cassandra.
o    Experience architecting data lakes and ingestion and consumption frameworks (an ingestion sketch follows this job description).
o    Should demonstrate technical thought leadership; should be ready to conduct POCs, either personally or by engaging developers, for pilot/evaluation work.
o    Ability to clearly articulate technical complexity to a variety of audiences, including PMs and other architects (Enterprise, Solution, or Data Architects), and to explain and justify architecture and technology choices.
o    Demonstrates strong knowledge of Hadoop best practices, frameworks, troubleshooting, and performance tuning; demonstrates broad knowledge of technical solutions, design patterns, and code for medium/complex applications deployed in Hadoop production.
o    Experience with change management / DevOps tools (GitHub, Jenkins, etc.).
o    Participates in design reviews, code reviews, unit testing, and integration testing.
o    Experience with SDLC methodologies (Agile / Scrum / iterative development).
•    Nice-to-have qualifications:
o    Working experience with traditional data warehousing and Business Intelligence systems.
o    Business requirements management and systems change / configuration management; familiarity with JIRA.
o    Strongly preferred: healthcare experience.
o    Strongly preferred: PMP, PMI-ACP, CSM, or TOGAF certification.
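To illustrate the data lake ingestion and consumption framing this role calls for, here is a minimal Scala sketch of a two-zone lake pattern: raw CSV lands as-is, then a curated, partitioned Parquet layer is published for consumers. Paths, schema, and the partition column are hypothetical.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Minimal sketch of a raw-to-curated data lake flow. All paths and
// column names (claim_id, load_date) are hypothetical.
object LakeIngestion {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("LakeIngestion")
      .getOrCreate()

    // Raw zone: source systems drop CSV files here.
    val raw = spark.read
      .option("header", "true")
      .csv("hdfs:///lake/raw/claims/")

    // Curated zone: de-duplicated and partitioned for consumers.
    val curated = raw
      .dropDuplicates("claim_id")
      .withColumn("load_date", current_date())

    curated.write
      .mode("append")
      .partitionBy("load_date")
      .parquet("hdfs:///lake/curated/claims/")

    spark.stop()
  }
}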
 

Thanks & Regards,
Nagesh Bade
4229 Lafayette Center Dr., Suite #1625, Chantilly, VA 20151
Tel: 703-831-8282 Ext. 202, Direct: 213-262-5570

