Capgemini - Pune, Maharashtra

Short Description: Hadoop Developer - 6 to 9 years - Pune

Qualifications: Bachelor's/Master's degree

Job Responsibilities:
- 4+ years of Big Data development experience.
- Hands-on experience designing and implementing Unix/Linux scripts to perform data ingestion and ETL on a Big Data platform.
- Excellent understanding/knowledge of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, high availability, MapReduce, Spark RDDs/programming, Hive, Pig, Kafka, and Flume.
- Provide hands-on leadership for the design and development of ETL data flows using Big Data ecosystem tools and technologies.
- Lead analysis, architecture, design, and development of data warehouse and business intelligence solutions.
- Work independently, or as part of a team, to design and develop Big Data solutions...
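The MapReduce model named in the requirements above can be illustrated with a minimal word count in pure Python. This is a conceptual sketch of the map, shuffle, and reduce phases only (no Hadoop installation or Hadoop API involved; all function names here are illustrative):

```python
from collections import defaultdict

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in every input line."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    """Shuffle: group all emitted counts by key (the word)."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the grouped counts for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data on hadoop", "spark and hadoop"]
counts = reduce_phase(shuffle_phase(map_phase(lines)))
print(counts["hadoop"])  # 2
```

In real Hadoop MapReduce or Spark the shuffle is distributed across cluster nodes, but the per-key grouping logic is the same idea.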
source http://jobviewtrack.com/en-in/job-1e1c41654c000001575464061e0402081545066a21124e44525d2f200a1842184f130d136f5f5618425e/4f82dd3559d1d1e4675c95ee8a5b510e.html?affid=f584d43114bf1954a48e3ec6be21b6ec