Satish Kumar

  • Sr. BI / Data Analyst / Data Modeler
  • Austin, TX
  • Member Since Jun 11, 2023


Satish Kumar

SUMMARY

  • Overall 13+ years of experience in Data Warehouse (ETL) / BI development and architecture, data modeling techniques, and data visualization
  • Strong experience in pre-sales, the onsite-offshore model, project estimation and budgeting, capacity planning, business consulting, process consulting, data management consulting, presentations, defining reference solutions and frameworks, product selection, and long/short-term technology roadmaps
  • 8+ years of experience in multidimensional (OLAP) data modeling, including star schemas, snowflake schemas, normalized and de-normalized models, and handling slowly changing dimensions/attributes
  • Expertise in entity-relationship modeling, conceptual, logical, and physical data modeling, hierarchies, relationships, metadata, data lineage, and reference data
  • 8+ years of experience in Microsoft BI and SAP BI, ETL solution design, and analytics reporting
  • Expertise in project design, estimation, implementation, upgrades, migrations, client interaction, maintenance, and team management
  • Extensive experience in SQL Server 2016/2014/2012/2008 R2 performance tuning, load balancing, backups, database design, upgrades, and migrations
  • Expertise in ETL tools: Microsoft SQL Server Integration Services (SSIS), Analysis Services (SSAS), SAP BODS (Data Services), Informatica PowerCenter, IDQ, and ICS
  • Extensive experience in BI tools: SAP BusinessObjects, SSRS, SAP Lumira, MicroStrategy 9.2, Cognos 10.2 report suite, Tableau, Qlik Sense, and Sisense
  • Expertise in BusinessObjects configuration, administration, server clustering, load balancing, universe design, reporting, dashboards, Tomcat settings, the CMS repository, and Lumira stories
  • Worked on SSIS integration packages, SAP BODS, and Informatica PowerCenter for data movement and ETL/ELT processes to populate the data warehouse from various source systems
  • Built enterprise data warehouses using the Ralph Kimball dimensional technique; worked with data modeling tools including Erwin, ER/Studio, and Oracle SQL Developer Data Modeler
  • Extensively applied performance optimization best practices to keep databases healthy and performing well, including table partitioning, buffer sizing, threading, and indexing strategy (a minimal partitioning/indexing sketch follows this list)
  • Performed data analysis, business requirements gathering, and gap analysis, and created logical/physical data models for enterprise data warehouse applications
  • Extensive knowledge of stand-alone and clustered Tableau Server installations, including concepts such as tabcmd, tabadmin, and creating users, sites, and projects
  • Exposure to big data and cloud technologies: Hadoop, AWS, Amazon Redshift, AWS SNS, AWS SQS, AWS Lambda, MongoDB, and Python
  • Exposure to project management activities: resource optimization, task estimation, and costing for fixed and time-and-material engagements; experienced in the SDLC and Agile Scrum, including short-term goals, iterative development, and daily stand-ups
  • Sound knowledge of the Insurance, Healthcare, Life Sciences, and Telecom domains
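
The partitioning and indexing work mentioned above can be illustrated with a minimal T-SQL sketch; the partition function, table, and column names below are hypothetical and serve only as an example of aligning a large fact table and its index to a date-based partition scheme.

    -- Hypothetical monthly partitioning of a large fact table (illustrative only).
    CREATE PARTITION FUNCTION pf_OrderDate (date)
        AS RANGE RIGHT FOR VALUES ('2017-01-01', '2017-02-01', '2017-03-01');

    CREATE PARTITION SCHEME ps_OrderDate
        AS PARTITION pf_OrderDate ALL TO ([PRIMARY]);

    CREATE TABLE dbo.FactSales
    (
        SalesKey    bigint IDENTITY(1,1) NOT NULL,
        OrderDate   date   NOT NULL,
        CustomerKey int    NOT NULL,
        SalesAmount decimal(18,2) NOT NULL,
        CONSTRAINT PK_FactSales PRIMARY KEY (SalesKey, OrderDate)
    ) ON ps_OrderDate (OrderDate);

    -- Nonclustered index aligned to the same partition scheme for common lookups.
    CREATE NONCLUSTERED INDEX IX_FactSales_Customer
        ON dbo.FactSales (CustomerKey, OrderDate)
        ON ps_OrderDate (OrderDate);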

·         Data Modeling: Over 7 years of dimensional data modeling experience (Erwin and Oracle SQL Developer Data Modeler). Sound knowledge of dimensional modeling, OLTP models, canonical integration models, EAV, associative modeling, the Ralph Kimball and Bill Inmon methodologies, star/snowflake schemas, data marts, facts and dimensions, logical and physical data modeling, MDM data modeling, unstructured data modeling, metadata, and reference data process models; ETL and reporting framework modeling; database management in SQL Server, Oracle, and DB2
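
A minimal star-schema sketch in T-SQL for the modeling experience above, assuming a hypothetical sales subject area; the Type 2 customer dimension keeps history through effective dates and a current-row flag.

    -- Hypothetical Type 2 (slowly changing) customer dimension.
    CREATE TABLE dbo.DimCustomer
    (
        CustomerKey   int IDENTITY(1,1) NOT NULL PRIMARY KEY, -- surrogate key
        CustomerID    varchar(20)  NOT NULL,                  -- natural/business key
        CustomerName  varchar(100) NOT NULL,
        City          varchar(50)  NULL,
        EffectiveFrom date NOT NULL,
        EffectiveTo   date NULL,                              -- NULL = still open
        IsCurrent     bit  NOT NULL DEFAULT 1
    );

    -- Fact table referencing the dimension by surrogate key (star schema).
    CREATE TABLE dbo.FactOrder
    (
        OrderKey     bigint IDENTITY(1,1) NOT NULL PRIMARY KEY,
        OrderDateKey int NOT NULL,                            -- points to a date dimension
        CustomerKey  int NOT NULL REFERENCES dbo.DimCustomer (CustomerKey),
        Quantity     int NOT NULL,
        Amount       decimal(18,2) NOT NULL
    );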

·         Data Management: Over 5 years on the Microsoft BI platform (SSIS, SSRS, SSAS), 4+ years of experience in SAP BODS (BusinessObjects Data Services), and 3+ years of data warehousing experience using Informatica PowerCenter 8.6 through 9.5 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), PowerConnect, and PowerExchange; good knowledge of Pentaho and Talend

·         Big Data: Good knowledge of Hadoop, Hive, Spark, NoSQL, Pig, and related Apache big data frameworks and standards

·         Architecture: Over 10 years of architectural experience as a data integration architect, ETL architect, and data architect; integration of complex legacy systems into the warehouse; data migration architectural design; analytics over data; defined various data integration patterns and frameworks with an innovative approach; expert knowledge of integrating new-generation PAS systems with legacy systems

·         Business Intelligence: 11 years of business intelligence experience using SAP BusinessObjects (7+ years), Cognos (3 years), MicroStrategy 9.2 (3 years), and SSRS

·         Analytics: 5 years of experience in multidimensional data analytics, including SSAS with XMLA queries (5 years), SAP Lumira (2 years), Tableau Desktop 9.3 and Tableau Server 9.3 (3 years), QlikView/Qlik Sense (1.5 years), Sisense (1 year), and PowerPivot (3 years)

·         Databases: 12+ years of experience using Oracle 11g (7 years), DB2 (6 years), MS SQL Server 2016 and earlier (10+ years), Amazon RDS, and MongoDB (NoSQL)

·         Programming Languages: SQL (10+ years), T-SQL (5+ years), PL/SQL (4+ years), C# (1+ years), JSON (1 year), VB (3+ years), Unix shell scripting (4+ years), Python (1 year), R (1 year)

·         Cloud Integration: Informatica Cloud, Hadoop, AWS, Amazon Redshift, AWS SNS, AWS SQS, AWS Lambda, MongoDB, and Python

·         Data Tools: Informatica IDQ, Informatica MDM, Information Steward, DQS

·         Version Control Tools: Microsoft VSS, GitHub, TortoiseSVN

·         Scheduling Tools: Control-M, AutoSys

·         Project Management Tools: JIRA, Microsoft Project, HPSM, Agile Scrum

·         ERP: Oracle E-Business Suite R12 (PA)

TRAINING & CERTIFICATION 

·         Successfully completed Project Management Professional (PMP) training (PMBOK 5th Edition)

·         Certified Scrum Master (CSM), Scrum Alliance – Agile Scrum projects

·         Oracle Certified Associate (OCA), Oracle Corporation – Database Administrator

·         Certificate course “Big Data Internship Program – Foundation” granted by Big Data Trunk & Udemy

·         Certificate course “Data Science A-Z™: Real-Life Data Science Exercises” by Big Data Trunk & Udemy

·         Certificate of Completion in “Data Science 101” by Big Data University

·         Certificate of Completion in Training & Assessment of the DevOps course, CSC India Skill platform

·         Certificate of Completion of “SQL*LIMS” v4.0.16 & 5.0.1 training by Merck Inc

EDUCATION     

·         Master of Computer Applications (MCA), Kumaun University, Nainital, India

·         Master's degree in Mathematics (M.Sc.), CCS University, Meerut, India

·         Bachelor of Science (B.Sc.) from CCS University, Meerut, India

 

EXPERIENCE

Client: Dell Inc., USA

Role: Sr. BI / Data Analyst / Data Modeler                                                                                                      05/17 – Present

  • Working as a Sr. DWH-BI Developer / Data Modeler / Data Engineer on a retail-based project.
  • Participated in business and specification requirements-gathering sessions and converted them into design specifications, ranging from high-level designs to application-level designs and (cross-)application interface agreement specifications.
  • Defined the scope of the project based on the gathered business requirements, including documentation of constraints, assumptions, business impacts, project risks, and scope exclusions.
  • Prepared the ETL design document covering the database structure, change data capture, error handling, and restart and refresh strategies (an illustrative incremental-load sketch follows this list).
  • Used MS SSIS to extract, transform, and load data into the DWH target system; planned the use of SAP BusinessObjects for traditional reporting and Tableau for business KPI visualization.
  • Expertise in CMC and SAP BO administration activities: access, server properties, SIA, scheduling, publishing documents, and managing the BO server, Tomcat server, and BO services; excellent understanding of the CMS audit DB tables.
  • Created rich dashboards using Tableau and prepared user stories for compelling dashboards that deliver actionable insights.
  • Analyzed the source data and handled it efficiently by adjusting data types.
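
The change data capture, error handling, and restart strategy mentioned above could be illustrated by a pattern along the following lines; this is a simplified T-SQL sketch, and the watermark table, staging/target names, and columns are hypothetical stand-ins for the actual SSIS package logic.

    -- Hypothetical incremental load: pull only rows changed since the last
    -- successful run, and advance the watermark only when the load commits.
    BEGIN TRY
        BEGIN TRANSACTION;

        DECLARE @LastLoaded datetime2 =
            (SELECT LastLoadedAt FROM etl.LoadWatermark WHERE TableName = 'SalesOrder');

        INSERT INTO dwh.FactSalesOrder (OrderID, CustomerKey, Amount, ModifiedAt)
        SELECT s.OrderID, d.CustomerKey, s.Amount, s.ModifiedAt
        FROM   stg.SalesOrder s
        JOIN   dwh.DimCustomer d ON d.CustomerID = s.CustomerID AND d.IsCurrent = 1
        WHERE  s.ModifiedAt > @LastLoaded;        -- change data capture by timestamp

        UPDATE etl.LoadWatermark
        SET    LastLoadedAt = SYSDATETIME()
        WHERE  TableName = 'SalesOrder';

        COMMIT TRANSACTION;                       -- restart-safe: watermark moves only on success
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
        INSERT INTO etl.LoadErrorLog (TableName, ErrorMessage, LoggedAt)
        VALUES ('SalesOrder', ERROR_MESSAGE(), SYSDATETIME());
        THROW;                                    -- surface the failure to the scheduler
    END CATCH;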

Environment: SQL Server 2016, SSIS, SSAS, Power Pivot, SAP Business Objects 4.2, Tableau Desktop 9.3, Tableau Server 9.3, JIRA

 

Company: Computer Sciences Corporation

Client: KenyaRe & TOARe
Role: Sr. DWH-BI / Data Modeler / Data Architect                                                                                11/16 – 05/17

  • Worked in multiple roles (Sr. DWH-BI, Data Modeler, Data Architect) on SICS (P&C, Life, Ceded insurance product) business analytics projects used by 150+ reinsurance customers globally.
  • Defined, created, and managed business specifications, technical specifications, and other project documentation for application development; performed and documented gap analysis.
  • Created both logical and physical database designs and ran the reports.
  • Used advanced data techniques in data modeling, integration, visualization, and database design and implementation; expertise in BusinessObjects installation, upgrade, migration, and implementation for insurance customers.
  • Used calculated fields extensively for trend and extended-price logic across the different cost types, used set functions manually, and created advanced grouping functions.
  • Found efficient ways to build tables and graphs that were visually easy to understand while maintaining the accuracy of the core information.
  • Worked on the MS SSIS (ETL) solution to extract, transform, and load data into the DWH target system; expert in writing complex SQL queries, functions, and stored procedures for specific tasks.
  • Expertise in BO universes (design, development, and modification in UDT and IDT) and BO reports (design, creation, troubleshooting, and enhancement).
  • Expertise in SAP BO administration activities: access, server properties, SIA, scheduling, publishing documents, and managing the BO server, Tomcat server, and BO services; excellent understanding of the CMS audit DB tables.
  • Created various prompts, conditions, and filters to improve report performance; created various reports using detailed and summary filters.
  • Created Tableau scorecards and dashboards using stacked bars, bar graphs, scatter plots, and geographical maps.
  • Worked on motion charts, bubble charts, and drill-down analysis using Tableau Desktop; created and modified data sources.
  • Monitored and improved ETL, reporting, and database performance; expertise in defining data types, optimal normalization, referential integrity, triggers, partitioning design, indexing methods, and data security procedures.
  • Worked in Agile Scrum, including short-term goals, iterative development, and daily stand-ups.

Environment: IBM DB2, SQL Server 2016, SSIS, SSAS, Power Pivot, T-SQL, Oracle 11g, SAP BO XI 3.1/4.2, Tableau Desktop 9.3, Tableau Server 9.3, JIRA

 

Company: Computer Sciences Corporation, Blythewood, SC, USA

Client: P&C Clients - SwissRe, Farm Bureau, Florida Peninsula & OMAG USA

Role: Sr. BI Developer\BI Architect\ Data Modeler                                                                        07/13 – 11/16

  • Worked in various roles (Sr. BI Developer, BI Architect, Data Modeler) on POINT IN/J (P&C insurance product) business analytics projects used by 170+ US insurance customers.
  • Defined, created, and managed business specifications, technical specifications, and other project documentation for application development; performed and documented gap analysis.
  • Created both logical and physical database designs and ran the reports; used advanced data techniques in data modeling, access, integration, visualization, and database design and implementation.
  • Expertise in BusinessObjects installation, upgrade, migration, and implementation for insurance customers; expertise in MS SSIS (ETL) solutions to extract, transform, and load data into the DWH target system.
  • Expertise in BO universes (design, development, and modification in UDT and IDT) and BO reports (design, creation, troubleshooting, and enhancement).
  • SAP BO administration activities: access, server properties, SIA, scheduling, publishing documents, and managing the BO server, Apache Tomcat server, and BO services; excellent understanding of the CMS audit DB tables.
  • Created various prompts, conditions, and filters to improve report performance; created various reports using detailed and summary filters.
  • Created Tableau scorecards and dashboards using stacked bars, bar graphs, scatter plots, and geographical maps.
  • Worked on motion charts, bubble charts, and drill-down analysis using Tableau Desktop; created and modified data sources.
  • Expert at writing complex SQL queries, functions, and stored procedures for specific tasks, and at defining data types, optimal normalization, referential integrity, triggers, partitioning design, indexing methods, and data security procedures (a stored-procedure sketch follows this role's bullets).
  • Delivered simple and complex end-to-end data solutions on time, on budget, and with a high level of quality.
  • Monitored and improved ETL performance, troubleshot complex production support issues, and optimized database and operational reporting environments.
  • Performed parallel/production testing to ensure the new system performs correctly in a production environment and interfaces correctly with other production systems.
  • Worked in Agile Scrum, including short-term goals, iterative development, and daily stand-ups; interacted with clients and business partners on project issues and queries; followed and enforced industry and organizational standards and best practices.
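
As a sketch of the stored-procedure work referenced above: a simple T-SQL reporting procedure with parameter validation. The schema, table, and column names are hypothetical, chosen only to illustrate the pattern.

    -- Hypothetical reporting procedure: written premium by line of business
    -- for a given accounting period, with basic parameter validation.
    CREATE PROCEDURE dbo.usp_GetWrittenPremiumByLOB
        @PeriodStart date,
        @PeriodEnd   date
    AS
    BEGIN
        SET NOCOUNT ON;

        IF @PeriodEnd < @PeriodStart
        BEGIN
            RAISERROR('Period end must not precede period start.', 16, 1);
            RETURN;
        END;

        SELECT  lob.LineOfBusiness,
                SUM(p.WrittenPremium) AS TotalWrittenPremium
        FROM    dwh.FactPremium p
        JOIN    dwh.DimLineOfBusiness lob ON lob.LOBKey = p.LOBKey
        WHERE   p.AccountingDate >= @PeriodStart
            AND p.AccountingDate <  DATEADD(day, 1, @PeriodEnd)
        GROUP BY lob.LineOfBusiness
        ORDER BY TotalWrittenPremium DESC;
    END;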

Environment: IBM DB2, SQL Server 2016, MS-SSIS, SSAS, T-SQL, .NET, CA Erwin, SAP BO XI 3.1/4.2, SAP Dashboard, Tableau Desktop 9.3, Tableau Server 9.3, JIRA

 

Company: Computer Sciences Corporation, Bloomington, IL, USA

Client: State Farm Insurance, Bloomington, IL, USA
Role: Technical Architect-BI \ Project Lead\ Data Modeler                                                         04/12 – 06/13

  • Worked in a Project Lead cum Technical Architect role on the State Farm Asset Management project.
  • Defined, created, and managed business specifications, technical specifications, and other project documentation for application development; performed and documented gap analysis.
  • Created both logical and physical database designs and ran the reports.
  • Developed a team of data analytics and visualization engineers and converted data into actionable insights using descriptive and predictive modeling techniques.
  • Worked on SAP BusinessObjects and Cognos report design and development.
  • Worked on the Informatica (ETL) solution to extract data from various sources, transform it, and load it into the target system (data warehouse and data marts).
  • Expert at writing complex SQL queries, functions, and stored procedures, and at defining data types, optimal normalization, referential integrity, triggers, partitioning design, indexing methods, and data security procedures.
  • Worked on T-SQL package solutions to extract, transform, and load data into DWH/DM systems.
  • Expertise in BO universes: designed and developed new universes, modified and enhanced existing universes in UDT, and resolved loops, traps, and cardinality issues.
  • Expertise in reports: designed and created reports and handled changes/enhancements to existing BO and Cognos reports.
  • Worked on IBM Cognos Report Studio and Cognos Framework Manager.
  • SAP BO administration activities: access, server properties, SIA, scheduling, publishing documents, and managing the BO server, Tomcat server, and BO services; excellent understanding of the CMS audit DB tables.
  • Monitored and improved the performance of ETL and operational reporting environments.
  • Handled a team of resources at the client location in Bloomington, USA.
  • Interacted with clients and business partners on project issues and queries.

Environment: SQL Server 2016, T-SQL, .NET, Oracle PL/SQL, Oracle SQL Developer Data Modeler, Informatica 9, IDQ, BO XI 3.1 SP5, Dashboard, Cognos 10.2

 

Company: Computer Sciences Corporation, Tokyo, Japan

Client: SAFIC – SAISON Auto & Fire Insurance Company, Japan

Role: Team Leader (Technology)\ Data Modeler\BI Architect                                                                                     07/10 – 03/12

  • Played multiple roles (Team Leader (Technology), Data Modeler, BI Architect, Technical Architect) on SAFIC POLISY/J (P&C insurance) business analytics projects.
  • Defined, created, and managed business specifications, technical specifications, and other project documentation for application development; performed and documented gap analysis.
  • Created both logical and physical database designs and ran the reports.
  • Expertise in designing, implementing, and supporting end-to-end data warehouse, data mart, and ETL development that provides structured and timely access to large datasets.
  • Developed a team of data analytics and visualization engineers and converted data into actionable insights using descriptive and predictive modeling techniques.
  • Worked on SAP BusinessObjects and SAP BODI installation in SAFIC environments.
  • Worked with the SAP BODI (ETL) tool to extract data from the POLISY/J (Japan) system, transform it, and load it into the target system (data warehouse and data marts).
  • Expert at writing complex SQL queries, functions, and stored procedures for specific tasks.
  • Expertise in BO universes (designed and developed new universes, handled modifications and enhancements) and BO reports (designed and created new reports, troubleshot report issues).
  • Experienced in SAP BO administration activities: server properties, SIA, scheduling, publishing documents, and managing the BO server, Tomcat server, and BO services; excellent understanding of the CMS audit DB tables.
  • Interacted with clients and business partners on project issues and queries.

Environment: SQL Server 2008 R2, Oracle 10g, SQL, SAP BODI, SAP BO XI 3.1, Dashboard, CA Erwin, Tomcat 5/7, QlikView

 

Company: Computer Sciences Corporation, Noida, India

Client: TDC- Tele-Denmark Communications, Denmark

Role: Senior Software Engineer (G30)                                                                                                               10/08 – 06/10


  • Played a Sr. BI Developer role in the TDC (Tele Denmark Communications) project.
  • Worked on ARTEMIS reports, major/minor bug fixes, performance improvements, and production support.
  • Experienced in data acquisition from Oracle E-Business Suite R12 (PA module) using ETL job scripts.
  • Expert at writing complex SQL queries, functions, and stored procedures for specific tasks.
  • Worked on the INCA application, Oracle Spatial DB, and MapInfo to generate geographical maps.
  • Exposure to project registration, project tracking, and development and maintenance of HLD and DLD documents.

Environment: TCL, Oracle 10g, PL/SQL scripts, Oracle- EBS R12 (PA Module), ARTEMIS, Oracle Spatial, MapInfo.

Company: Computer Sciences Corporation, Chennai, India

Client: Thomson Healthcare (Truven Healthcare)

Role: Senior Software Engineer (G30)                                                                                                            11/07 – 10/08

  • Worked in an ETL/BI Developer role on Thomson Reuters (Truven Healthcare) business analytics projects.
  • Experienced in MSTR report development and objects (attributes, facts, metrics, filters, prompts), dashboards, and cubes.
  • Experienced in designing and setting up new users, roles, privileges, and data and application security.
  • Extensively worked on data modeling concepts: Type 2 dimensions, factless facts, conformed dimensions and facts, etc.
  • Experienced in the quarterly release development process and live production support.
  • Worked on Informatica 8.6 as an ETL tool to build the data warehouse and data marts.
  • Exposure to MapInfo for deploying additional ZIP codes into the system so they appear in MSTR reports.
  • Experienced with report issues, enhancements, the ETL operational environment, and DB optimization.
  • Expert at writing complex SQL queries, functions, and stored procedures for specific tasks.
  • Interacted with clients and business partners on project issues and queries.

Environment: SQL Server 2008 R2, Oracle 9i, UNIX scripting, Oracle SQL Developer Data Modeler, Informatica PowerCenter 8.6, IDQ, ICS, MicroStrategy 9, Onyx, MapInfo

 

Company: Cognizant Technology Solutions, Pune, India

Client: Merck Inc, NJ, USA

Role: Programmer Analyst                                                                                                                                                    05/06 – 11/07

  • Performed the role of Programmer Analyst / BI Developer at Cognizant Technology Solutions in Pune.
  • Gained exposure to a pharma client that used the SQL*LIMS system for NG-LIMS and Stability-LIMS products; captured sample lifecycle phase data to design the DSS system.
  • Worked with 24 LIMS sites across the globe to capture NG-LIMS and Stability-LIMS sample data and load it into the data warehouse and DSS system by setting up data stages on the UNIX server.
  • Worked on UNIX shell scripting, crontab job scheduling, MSTR and Cognos administration, Framework Manager, and user access.
  • Expertise in MSTR and Cognos report development and production support.
  • Monitored processes running on the Stability DSS UNIX server and kept them running.
  • Worked on troubleshooting Cognos and MSTR report issues and modifying or enhancing existing MSTR reports.

Environment: UNIX shell script, Oracle 9i, SQL*LIMS, Pro*C, Oracle PL/SQL, Cognos 8.2, Cognos Framework Manager, MicroStrategy 9

 

 

Company: CMC (Active Computers Payroll), New Delhi, India

Client: NDPL, Delhi

Role: Software Engineer                                                                                                                                       12/05 – 05/06

  • Performed a Software Engineer / Developer role at the NDPL location in Delhi.
  • Worked in the data center to monitor and maintain processes in the server room.
  • Worked on Crystal Reports design and development, unit testing, system testing, UAT, and production deployment.
  • Worked on tables, views, indexes, stored procedures, triggers, Crystal Reports, Oracle PL/SQL, and bug fixing.
  • Tested Crystal Reports, debugged application issues, and fixed them.

Environment: Oracle 9i, Oracle PL/SQL, TOAD, UNIX Commands, Unix Shell Script, Crystal Report.

 

 

System and Software Solutions, New Delhi, India

Software Engineer                                                                                                                                                    07/04 – 11/05

  • Worked as a software programmer creating functions in VB modules.
  • Worked on Crystal Reports design and development, unit testing, system testing, UAT, and production deployment.
  • Involved in coding and testing Windows-based applications.
  • Created test cases, executed them, and worked on user manuals.
  • Worked on tables, views, indexes, stored procedures, and triggers; debugged and fixed issues.

Environment: Oracle PL/SQL, Crystal Report, Oracle 8i, VB6.0, Windows NT