
Debashis RESUME-EXP




Debashis Mishra

+91-8095923661 | E-Mail: [email protected]

Professional Summary:

4+ years of overall IT experience in Application Development in Java and Big Data Hadoop.

2 years of exclusive experience in Hadoop and its components, such as HDFS, MapReduce, Apache Pig, Hive, Sqoop, HBase and Oozie.

Extensive experience in setting up Hadoop clusters.

Good working knowledge of MapReduce and Apache Pig.

Involved in writing Pig scripts to reduce job execution time.

Executed projects using Java/J2EE technologies such as Core Java, Servlets, JSP, Struts and Spring.

Designed, ran and deployed software performance tests on developed applications to correct errors, isolate areas for improvement and support general debugging.

Well experienced in designing and developing both server-side and client-side applications.

Strong, extensive experience programming with Struts and Spring; experienced in developing Web Applications and Web Services using J2EE technologies.

Excellent communication, interpersonal and analytical skills, and a strong ability to perform as part of a team.

Exceptional ability to learn new concepts; hard-working and enthusiastic.

Knowledge of Flume, NoSQL and Impala.

Professional Experience:

Currently working as a Systems Engineer at CGI Inc., Bangalore, India, since April 2012.

Worked as a Senior Technical Support Analyst at Mphasis Ltd, Bangalore, from July 2010 to April 2012.

Qualifications:

Completed BTech (Information Technology) from Silicon Institute of Technology under BPUT, Bhubaneswar.

Certifications:

Sun Certified Java Programmer 1.6.

Technical Skills:

Languages: Java, JavaScript, HTML, XML, XSD, XSL, Web Services, MapReduce, Pig, Sqoop, Hive, HBase

J2EE Technologies: JSP, Servlets, JDBC and EJB


Application/Web Servers: IBM WebSphere Application Server 7.0, WebLogic and Tomcat

Frameworks: Spring 3, Hadoop

Scripting Language: JavaScript

Java IDEs: Eclipse

Databases: MS Access, SQL (DDL, DML, DCL)

Design Skills: J2EE design patterns, Object Oriented Analysis and Design (OOAD), UML

Operating Systems: Windows 7, Windows XP, Unix and Linux

Version Control / Tracking Tools: SVN

Ticketing Tools: BMC Remedy

Project Details:

PROJECT #1:

Project Name : MECY's – Web Intelligence

Client : Target, Minneapolis, Minnesota, USA.

Environment : Hadoop 2.0, Apache Pig, Hive, Sqoop, Java, UNIX, MySQL

Duration : November 2013 till date

Role : Hadoop Developer

Description:

This project is about rehosting Target's existing application onto the Hadoop platform. Previously, Target used a MySQL database to store information about its competitor retailers (the crawled web data). Initially, Target tracked only four competitor retailers, such as Amazon.com and walmart.com.

As the number of competitor retailers grew, the volume of data generated by web crawling increased massively and could no longer be accommodated in a MySQL-style data store. For this reason, Target moved to Hadoop, which can handle massive amounts of data across its cluster nodes and satisfy the scaling needs of Target's business operations.
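A MySQL-to-HDFS migration like the one described is typically driven by Sqoop imports. As an illustrative sketch only (the connection details, table name and paths below are hypothetical examples, not taken from the project), the import invocation a wrapper script would run can be composed like this:

```python
# Sketch: compose a `sqoop import` command for pulling one MySQL table
# into HDFS. All hosts, table names and paths are hypothetical examples.

def sqoop_import_cmd(table, target_dir, host="db.example.com", db="crawl",
                     user="hduser", mappers=4):
    """Build the argv list for importing a MySQL table into HDFS."""
    return [
        "sqoop", "import",
        "--connect", f"jdbc:mysql://{host}/{db}",
        "--username", user,
        "--table", table,
        "--target-dir", target_dir,
        "--num-mappers", str(mappers),
        "--fields-terminated-by", "\t",  # tab-separated output is easy for Pig to load
    ]

cmd = sqoop_import_cmd("retailer_prices", "/data/crawl/retailer_prices")
print(" ".join(cmd))
```

Composing the command as a list (rather than one shell string) keeps each flag-value pair explicit and avoids quoting problems when the table list grows.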

Roles and Responsibilities:

Moved all crawl data flat files generated from various retailers to HDFS for further processing.

Wrote Apache Pig scripts to process the HDFS data.

Created Hive tables to store the processed results in tabular format.

Developed Sqoop scripts to enable interaction between Pig and the MySQL database.

Involved in requirements gathering, design, development and testing.

Wrote script files for processing data and loading it into HDFS.

Wrote CLI commands using HDFS.

Developed UNIX shell scripts for creating reports from Hive data.

Fully involved in the requirement-analysis phase.

Analyzed the requirements for setting up a cluster.

Created two different users (hduser for performing HDFS operations and mapred user for performing MapReduce operations only).

Ensured NFS is configured for the NameNode.


Set up passwordless SSH for Hadoop.

Set up cron jobs to delete Hadoop logs, old local job files and cluster temp files.

Set up Hive with MySQL as a remote metastore.

Moved all log/text files generated by various products into an HDFS location.

Wrote MapReduce code that takes log files as input, parses them and structures them in tabular format to facilitate effective querying of the log data.

Created an external Hive table on top of the parsed data.
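The log-parsing MapReduce step above can be sketched as a Hadoop Streaming-style mapper. This is an illustration only: the project's actual parser was written in Java, and the log format shown here (timestamp, level, component, key=value metric) is a hypothetical example, since the resume does not describe the real log layout.

```python
import sys

def parse_log_line(line):
    """Parse one hypothetical log line of the form
    '2014-01-15 10:32:01 INFO checkout latency=120ms'
    into a tab-separated row suitable for an external Hive table."""
    parts = line.strip().split()
    if len(parts) < 5:
        return None  # skip malformed lines
    date, time, level, component, metric = parts[:5]
    key, _, value = metric.partition("=")
    return "\t".join([date, time, level, component, key, value])

def main(stdin=sys.stdin, stdout=sys.stdout):
    # Hadoop Streaming feeds each input split to this script on stdin;
    # every emitted line becomes one row the Hive table can query.
    for line in stdin:
        row = parse_log_line(line)
        if row is not None:
            stdout.write(row + "\n")

if __name__ == "__main__":
    main()
```

Emitting tab-separated rows matches the usual default field delimiter of a Hive external table over text files, so the parsed output can be queried without further transformation.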

PROJECT #2:

Process : Bell

Client : Bell Canada, Bell Mobility, Bell Aliant, Bell TV

Environment : Java 1.6

Duration : April 2012 to October 2013

Role : Application Support

Description: Bell had various applications used extensively by Bell end users. We worked as Level 2 support for the applications used by the client base and onsite developers.

Roles and Responsibilities:

Worked as L2 support; responsible for requirements analysis, interacting with the client base and onsite developers to validate understanding, and actively involved as a team member in the development of the module.

Developed the front end using Java Swing (UI design, handling of user-generated events, and incorporation of the underlying functionality as per the laid-down specifications), and incorporated business logic in SQL code.

Involved in integration testing and deployment of the module as part of the overall application.

Thoroughly checked for errors occurring in the application as and when reported.

Handled customer notification, workflow coordination and follow-up to maintain service-level agreements.

PROJECT #3:

Project Name : Internal Attendance Management System

Client : Bank of America

Environment : Java 1.6, JSF 2.0, Spring 3, JavaScript, MS Access

Duration : June 2010 to April 2012

Role : Java Developer

Description:

In the Bank of America project, attendance and period-of-service records are analyzed; any discrepancy is recorded and actions are taken accordingly. Developed the Work Request Estimation and Allocation Tool, a large standalone web application built in Java with JSF 2.0, Spring 3 and MS Access as the database, now live in the DotCom unit. Validation was done using JavaScript. The application saves the project leads around 10 to 15 hours per week.

Kudos:


Achieved the highest CSI (Client Satisfaction Index) consecutively.

Received the Value Appreciation Certificate twice for maximizing the team's performance.

Responsibilities:

Responsible for requirements analysis, interacting with the client to validate understanding, and actively involved as a team member in the development of the tool.

Developed the enterprise application by integrating the JSF and Spring frameworks.

Validation was done using JavaScript and HTML on the front end.

Involved in unit testing and peer testing.