Capgemini - Pune, Maharashtra

Short Description: Hadoop Developer, 6 to 9 years, Pune

Qualifications: Bachelor's/Master's degree

Job Responsibilities:
- 4+ years of Big Data development experience
- Hands-on experience designing and implementing Unix/Linux scripting to perform data ingestion and ETL on a Big Data platform
- Excellent understanding/knowledge of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, high availability, MapReduce, Spark RDDs/programming, Hive, Pig, Kafka, and Flume
- Provide hands-on leadership for the design and development of ETL data flows using Big Data ecosystem tools and technologies
- Lead analysis, architecture, design, and development of data warehouse and business intelligence solutions
- Work independently, or as part of a team, to design and develop Big Data solutions...
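Among the skills listed above is an understanding of MapReduce. As a minimal conceptual sketch only (plain Python, not the Hadoop API; the sample input lines are made up for illustration), the map/shuffle/reduce phases of a word count might look like this:

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in the input
    for line in lines:
        for word in line.lower().split():
            yield (word, 1)

def shuffle(pairs):
    # Shuffle: group all values by key, as the framework does between phases
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: aggregate the grouped values per key
    return {key: sum(values) for key, values in grouped.items()}

# Hypothetical sample input, standing in for files on HDFS
lines = ["big data big", "data platform"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)  # {'big': 2, 'data': 2, 'platform': 1}
```

In a real Hadoop or Spark job the shuffle is performed by the framework across the cluster; this sketch only shows the shape of the computation.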
source http://jobviewtrack.com/en-in/job-1e1c41654c000001575464061e0402081545066a21124e44525d2f200a1842184f130d136f5f5618425e/4f82dd3559d1d1e4675c95ee8a5b510e.html?affid=f584d43114bf1954a48e3ec6be21b6ec