
DEEPAK KAUSHIK

E-Mail: [email protected]

Mobile: +91 9034558955

PROFESSIONAL SNAPSHOT

Big Data engineer working with Hadoop and Spark, adapting to the evolving data landscape and designing solutions for managing very large volumes of structured and unstructured data.

Before moving to Big Data, I worked in the data warehousing domain, starting as an ETL developer and later taking on functional responsibilities as well. Since it is all about data, I decided to take the Big Data route.

Summary:-

Working on the Big Data ecosystem, Hadoop and Spark. Setting up a Data Analytics Platform, based on a Hadoop cluster, that is primarily responsible for organization-wide reporting.

Good understanding of Hadoop architecture and HDFS concepts.

Hands-on experience with Spark, RDD concepts and Spark SQL (a brief illustrative sketch follows this summary).

Experienced with Hive for data retrieval and processing.

Over 5 years of IT experience (e-commerce, life insurance and HR BPO tech services) focused on data warehousing, including the design and development of data warehouses and ETL code.

Working experience in creating physical data models using CA ERwin and Sybase PowerDesigner.

Extensive experience in data integration across databases such as Oracle, MySQL, MS SQL Server and DB2, using ETL tools such as Informatica PowerCenter and Pentaho Data Integration (Kettle).

Exposure to SAP BusinessObjects (BO XI) reporting technology.

Hands-on experience in performance tuning of large data warehouses.

Good command of writing shell scripts, troubleshooting UNIX processes, writing complex SQL queries and performance tuning of ETL design and code.
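As an illustration of the Spark, RDD and Hive work summarised above, here is a minimal sketch in Scala; the HDFS paths and the database and table names (sales.orders, sales.daily_sales_summary) are hypothetical placeholders, not taken from any of the projects listed below.

import org.apache.spark.sql.SparkSession

object SalesSummary {
  def main(args: Array[String]): Unit = {
    // Spark session with Hive support so existing Hive tables can be queried.
    val spark = SparkSession.builder()
      .appName("SalesSummary")
      .enableHiveSupport()
      .getOrCreate()

    // RDD example: parse raw order lines of the form "orderId,category,amount"
    // from HDFS and total the amount per category.
    val rawOrders = spark.sparkContext.textFile("/data/raw/orders")   // hypothetical path
    val amountByCategory = rawOrders
      .map(_.split(","))
      .filter(_.length == 3)
      .map(fields => (fields(1), fields(2).toDouble))
      .reduceByKey(_ + _)
    amountByCategory.saveAsTextFile("/data/derived/amount_by_category")  // hypothetical path

    // Spark SQL example: aggregate a Hive table for reporting.
    val dailySales = spark.sql(
      """SELECT order_date, category, SUM(amount) AS total_amount
        |FROM sales.orders
        |GROUP BY order_date, category""".stripMargin)
    dailySales.write.mode("overwrite").saveAsTable("sales.daily_sales_summary")

    spark.stop()
  }
}

The same aggregation could equally be run directly in Hive; the sketch simply shows the RDD and Spark SQL styles side by side.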

PROFESSIONAL EXPERIENCE

Company: Jabong.com, Gurgaon Sep 2013 – Present

Designation: Software Development Engineer II

Company: Sun Life Financial Services, Gurgaon Nov 2012 – Sep 2013

Designation: Data warehouse Developer

Company: Aon Hewitt, Gurgaon Jul 2010 – Nov 2012

Designation: Software Engineer

EDUCATION DETAILS

Year   Degree   Percentage   University/Board   Place
2010   B.Tech   71%          M.D.U. Rohtak      Sonipat
2006   12th     73%          CBSE Board         Sonipat
2004   10th     85%          CBSE Board         Sonipat

TECHNICAL SKILLS


Big Data Tech.: Apache Hadoop, HBase, Hive, Pig, Spark, Spark SQL, Scala
Databases: Oracle 11g/10g, IBM Netezza, MongoDB, MySQL, DB2, SQL Server
Operating Systems: Linux, Windows
ETL & Reporting Tools: Informatica PowerCenter, Pentaho Data Integration, SAS Data Integration, SAP BusinessObjects, Tableau

CAREER CONTOUR

Company Name: Jabong.com, Gurgaon
Duration: Sep 2013 – Present
Designation: Software Development Engineer II

Contribution

Working on setting up a data platform for analytics, envisioned as the single data house on the Big Data stack for all reporting needs of the organization.

Developing Spark code for data acquisition from multiple data sources and integrating the data into the Hadoop cluster (see the sketch after this list).

Writing logic for campaign structuring and for major reports such as the Inventory Report (consumed by the ERP system) and the Sales and Stock Reports (consumed by the BI and MI teams).

Interacting with various business teams to understand their reporting and data needs, which are served from the data platform.

Designed a new DWH from scratch for Jabong, integrating data from a wide variety of sources: front end (website data), ERP, CRM and Google Analytics.

Developed the DWH using SAS Data Integration Studio as the ETL tool and Netezza as the database.

Participated in the implementation of the complete SAS technology stack, including Base SAS, SAS Data Integration Studio, SAS Visual Analytics and SAS e-Miner.

Writing complex data warehousing queries for performing ELT in order to obtain maximum performance from Netezza.

Writing complex analytic queries to meet the reporting needs of the organization.

Performing POCs on a variety of ETL tools such as Informatica, Pentaho Data Integration and SAS Data Integration, and on a variety of databases such as Actian Vectorwise (columnar DB), IBM Netezza (appliance), Oracle Exadata (appliance) and InfiniDB (columnar DB), to choose the solution best suited to the organization. Performance tuning of the database and existing ETL code.

Resource and task allocation.

Understanding the e-commerce business and designing ETL logic based on the business process and requirements.

Meeting with business stakeholders to understand the business logic, creating technical specification documents and developing ETL code based on the business requirements.

Extracting data from various data sources, transforming it as per the business logic and loading it into the DWH.

Preparing test cases and performing unit testing of the developed ETL code.

Deploying code to UAT environments for user acceptance testing.

Analysing and fixing production issues.

Preparing reports consumed by various analytics teams such as CRM, Operations Intelligence and Marketing Intelligence, as well as by higher management.
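A minimal sketch, in Scala, of the kind of Spark-based data acquisition and report logic described in the bullets above; the JDBC connection string, credentials and table names (staging.orders, reports.stock_report) are hypothetical placeholders rather than the actual Jabong schema.

import org.apache.spark.sql.SparkSession

object OrdersIngestAndReport {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("OrdersIngestAndReport")
      .enableHiveSupport()
      .getOrCreate()

    // Acquire data from an upstream relational source over JDBC
    // (hypothetical MySQL source; the driver must be on the classpath).
    val orders = spark.read
      .format("jdbc")
      .option("url", "jdbc:mysql://source-db:3306/shop")               // placeholder URL
      .option("dbtable", "orders")
      .option("user", "etl_user")                                       // placeholder user
      .option("password", sys.env.getOrElse("SOURCE_DB_PASSWORD", ""))
      .load()

    // Land the raw extract on the Hadoop cluster as a Hive staging table.
    orders.write.mode("overwrite").saveAsTable("staging.orders")

    // Report logic: a stock-report style aggregation for downstream consumers (ERP, BI, MI).
    val stockReport = spark.sql(
      """SELECT sku, warehouse, SUM(quantity) AS units_on_hand
        |FROM staging.orders
        |GROUP BY sku, warehouse""".stripMargin)
    stockReport.write.mode("overwrite").saveAsTable("reports.stock_report")

    spark.stop()
  }
}

One reasonable arrangement, assumed here, is to run each source as its own acquisition job and schedule the report queries once the staging tables are refreshed.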

Company Name: Sun Life Financial Services, Gurgaon
Duration: Nov 2012 – Sep 2013


Designation: Data Warehouse Developer

Contribution

Part of a 10-member Data Quality and Integration team.

Acting as a data warehouse developer for ETL development, with direct interaction with the client and the business.

Effort estimation and timeline setting.

Understanding and solving business level problems.

Creating high-level and low-level designs (HLD and LLD) from the BRD.

Mapping business level requirement/solution to technical specifications.

Developing business intelligence routines (ETLs) using Informatica PowerCenter, leveraging the policy information provided by the client to meet the analytic and management reporting needs of business executives.

Creating and executing test plans and test phases.

Performance tuning of existing ETL code.

Preparing all client-ready documentation required for change approval and migration of the final code.

Automating and stabilizing operational processes.

Simultaneously working for multiple business units like Reinsurance, Individual, Group Benefits and Group Retirement Services.

Company Name: Aon Hewitt, Gurgaon
Duration: Jul 2010 – Nov 2012
Designation: Software Engineer

Contribution

Worked as an individual contributor on a large project, “YTR Activity Reporting”, and implemented it end to end successfully.

Developing end-to-end ETL for Payroll and WLM Administration as per their requirements.

Creating mappings and workflows to load client data into XML-formatted flat files and then loading these files into a DB2 database.

Creating shell scripts for workflow administration.

Unit testing of developed ETL.

Migration of Developed ETL code across environments.

Creating shell scripts for pre-validation of data, thereby avoiding runtime errors while executing workflows.

PERSONAL DETAILS

Date of Birth: 16 April 1989
Passport No. & Status: L3346736, valid till 2023
PAN Card No.: BTCPK9013R
Permanent Address: 55/10, Janta Colony, Ganaur, Sonipat, Haryana - 131101

(Deepak Kaushik)