FIK E-Commerce Pvt Ltd develops products based on Data Science, AI, and Blockchain technologies. We work on projects as a vendor for development, implementation, and support operations. We also provide consulting and training services for corporates and others.

Our major aim is to empower people with emerging technologies for a smart future.

We are building products to achieve our goals:

1. Digital Rural

2. Save the Farmer

3. Hunger-Free Nation

with cutting-edge technologies (Big Data Analytics, Artificial Intelligence, IoT, Blockchain, VR/AR/MR, and more).


Domain: Software Development / IT

Designation: Trainee Software Engineer (Hadoop/Java)

Eligibility Criteria:

  • B.E/B.Tech (any discipline) 2017 graduates with a minimum 60% aggregate throughout academics and no backlogs need only apply.

CTC per Annum: INR 1,80,000 - INR 2,10,000 (based on performance during the interview process and academics)

Job Description - Hadoop:

  • He/She must demonstrate analytical and problem-solving skills, particularly those that apply to a Big Data environment.
  • He/She must have a deep understanding of and related experience with the Hadoop stack (internals, HBase, Hive, Pig, Spark, and MapReduce), along with a deep understanding of schedulers, workload management, availability, scalability, and distributed data platforms.
  • The candidate must also have extensive experience building and designing large-scale distributed applications, with expert knowledge in developing and debugging in Java/J2EE.

Job Description - JAVA:

  • We are looking for an energetic individual to join us in building our next generation of web and mobile applications, someone who will strive to meet the needs of our growing business. This individual will play a key role in front-end and back-end web development.
  • The candidate will be working on cutting edge technologies right from the training period.
  • Desire and willingness to work in a collaborative, innovative, flexible and team-oriented environment.

Bond Period: 2 years

Skills Required - Hadoop:

  • Hands-on experience working with Hadoop distribution platforms such as Hortonworks and Cloudera

  • Full knowledge of the Hadoop architecture and HDFS is a must

  • Knowledge of OOP languages such as Java to write MapReduce jobs

  • Experience with MapReduce, HDFS, Pig, Hive, and Sqoop

  • Expertise in concepts like Apache Spark and Scala programming

  • Knowledge of installing, configuring, maintaining and securing Hadoop

  • Good command of Linux and expertise in algorithms and data structures

  • Experience with server infrastructure and networking technologies (Web, SaaS, distributed systems, TCP/IP, HTTP)

  • Good analytical, troubleshooting, and debugging skills

  • Understanding of software development life cycle (SDLC) methodologies such as Agile and Waterfall.

Skills Required - JAVA:

  • Expertise in Core and Advanced JavaScript
  • Expertise in ECMAScript 6 (ES6)
  • Working knowledge of TypeScript/Angular 2.0
  • Expertise in jQuery and UI Bootstrap
  • Expertise in AJAX
  • Hands-on with JavaScript, Node.js, SQL, HTML5, CSS3, and responsive web design
  • Working knowledge of testing tools like Karma, Protractor, Jasmine, etc.
  • Familiarity with package manager tools (Node Package Manager) and task runners like Grunt/Gulp
  • Well-grounded in Waterfall and Agile methodologies
  • Java 1.8; Struts, Spring, and Hibernate frameworks; SOAP and REST; Spring Boot microservices

Selection Process:

1st round: Written test 

2nd round: Technical interview

3rd round: HR interview

Additional Information:

Please carry your updated resume, 2 recent passport-size photographs, and a government-issued photo ID proof.

Placement drive updates:

Number of students applied: 330

Number of students attended: 21

Number of students who cleared the written test & technical interview conducted on Thursday, 12th July 2018: 03

Number of students selected: 0