Job: Big Data Engineer – Hadoop
Application Development Opportunities, Information Management Opportunities
Long Term Contract
IBISKA has a requirement for an Architect or Engineer to deliver a clustered computing environment for our Government client.
The role involves designing and building large-scale data analytics platforms that handle the complexities of ingesting, storing, and manipulating massive volumes of data in real time.
Skills and Experience Required
- 5+ years of experience in software development/engineering, including requirements analysis, design, and implementation;
- 5+ years of experience developing software in high-level languages such as Java, C, or C++;
- Experience with distributed, scalable Big Data programming models and technologies such as Hadoop, Hive, and Pig;
- Experience with Hadoop Distributed File System (HDFS);
- Experience developing solutions integrating and extending FOSS/COTS products;
- Experience deploying applications in a cloud environment;
- Understanding of cloud scalability;
- Hadoop/Cloud Developer Certification is an asset;
- Experience designing and developing automated analytic software, techniques, and algorithms;
- Experience developing and deploying data-driven analytics in heterogeneous environments.
Apply For This Position Online
Copyright 2013 IBISKA Telecom Inc.