
Harshendu Desai
- Sr. Data Architect / Modeler
- Bay Area, CA
- Member Since May 05, 2023
(Available for Bay Area, California Only)
SUMMARY
Extensive professional experience in business and data analysis, data modeling, metadata management, statistical analysis and machine learning across a variety of distributed relational databases, combining technical expertise with business knowledge.
COMPUTING SKILLS
· Requirements & Analysis – Translate business requirements into functional design documents and detailed technical specifications covering technical flows, ER and RDF diagrams, detailed report logic, measurements and layouts. Document database objects and error handling while ensuring the technical designs meet the business/functional requirements specified in the Functional Design. Perform database performance and map-and-gap analysis, documented in Excel, PowerPoint and MS Word.
· Architecture – Data Quality, Data Integration, Master Data Management (CDH, CDI, PMI), Business Data Quality (BDQ) and Business Data Object (BDO), Business Intelligence (BI), Resource Description Framework (RDF), and the Zachman, TOGAF and BOST architecture frameworks.
· Modeling – Forward engineering (OLTP, 3NF) and reverse engineering (ERD) for traditional databases, data marts and data warehouses (star schema), with data dictionaries, using CA ERwin, ER/Studio, IBM System Architect, Oracle Designer, Sybase PowerDesigner (SAP PowerDesigner) and MS Visio.
· Design & Development – Database-driven client/server applications, particularly Internet-facing and custom client/server RDBMS solutions, using JScript, VBScript, shell scripting, SAS, SPSS, R and Python.
· Migration & Integration – Moving data and applications from one database vendor platform to another (e.g., Oracle to MS SQL Server and vice versa) using vendor/platform-specific tools. Data integration using PL/SQL, SSIS, and ETL and cleansing tools for SQL/DS and DB2 mainframe, Sybase, Oracle, MS SQL Server and Greenplum (PostgreSQL) databases.
· Servicing – Writing and maintaining Structured Query Language (SQL) and stored procedures (PL/SQL) on SQL/DS and DB2 mainframe, Sybase, Oracle and MS SQL Server relational databases. Also writing SPARQL against Resource Description Framework (RDF) / Oracle RDF stores and NoSQL platforms such as HBase, Hive, GemFire and Greenplum (PostgreSQL).
· Reporting – BI metrics, ad hoc and analytical reporting and dashboards using Brio, Business Objects and Tableau reporting tools.
EXPERIENCE
2016 Applied Materials – California
· Designing and integrating various data sources for the supply chain industry.
· Logical and physical data modeling of the spare parts subject area using the MS Visio modeling tool.
· Map & Gap Analysis of SAP SD and MM modules for Quote to Cash (QTC) processes.
· Planned the migration of SAP reports to an MS SQL Server 2012 APS (MPP) database.
· Trained and guided junior staff on Quote to Cash business processes and Business Data Objects (BDO).
2015 GE Transportation – San Ramon, California
· Designing and integrating various data sources for the transportation industry.
· Logical and physical data modeling of the shipment and locomotive subject areas.
· Designed, built and operationalized big data analytics using NoSQL databases such as HBase, Hive, GemFire and Greenplum (PostgreSQL).
· Built various consumption models related to the locomotive and transportation subject areas using ERwin and MS Visio modeling tools.
· Designed and modeled machine parameter reading (locomotive sensor) data for Reliability Availability Services (RAS), and automated and operationalized the machine learning process for a logistic regression model and a text similarity model (recommendation engine), respectively.
· Map & Gap Analysis of various SORs flowing through the Greenplum (PostgreSQL) data lake.
· Created Data Dictionary for Shipment, Locomotive and Machine Parameter / Sensor Reading data.
· Trained and guided junior staff on locomotive controllers and machine fault parameter occurrence data and flows.
2014 Wells Fargo Bank Corporation – Fremont, California
· Designing and integrating various data sources for conforming wholesale banking data.
· Developed normalized logical and physical database models to design an OLTP system for Reference and Balance data conformance using the ER/Studio modeling tool.
· Conducted design reviews with business analysts and content developers.
· Map & Gap Analysis of various SORs flowing through wholesale banking DataMart Hub.
· Created Data Dictionary for Customer, Contract, Credit, Deposit and Resources Subject Areas.
2011-2013 CISCO Corporation – San Jose, California
· Defining the Zachman Architecture Framework for end-to-end BI services using the IBM System Architect modeling tool.
· Defining the BOST Architecture Framework, specifically for the Systems and Technology subject area.
· Designing and integrating a Resource Description Framework (RDF) data model using Oracle Spatial RDF database technology.
· Drawing RDF data model and data transformation / data flow diagrams using the Visio tool.
· Migrating and converting various sources of relational data (SQL) into RDF (NoSQL) format and documenting test cases.
· To-be architecture, Data Dictionary and Map & Gap Analysis for the Resource Skills, Service Supply Chain, Customer, Contract, Services and Technical Support subject areas.
2011 Enterprise Corporations – California
· Database migration and conversion from MS SQL Server 2005 to Oracle 11g at Rovi Corp.
· Designing and Integrating Rovi’s to-be BI architecture using Oracle PL/SQL.
· Logical and Physical Data Model for Risk Evaluation & Mitigation Strategies (REMS) drug at McKesson Corporation using Oracle Designer modeling tool.
· Reverse Engineered the Regional Strategy Model (RSM) to extract business rules from SAS datasets at Blue Shield.
2010 Apple Inc – Cupertino, California
· Data mining and as-is analysis of Apple contract manufacturing Mac and Indigo subject areas.
· Reverse Engineered Mac and Indigo data warehouse schemas and built inferred relationships for the as-is architecture using the ERwin tool.
· Map & Gap Analysis of Business Processes and Data for the Mac and Indigo Bill of Materials (BOM) subject area.
· Responsible for standardization of data quality processes, data profiling and data governance strategies for systems and applications.
· Created Data Dictionary for Mac and Indigo subject area.
2010 Xtime Inc – Redwood City, California
· Data mining and as-is analysis of Xtime Automobile Dealership Management, Repair Order, Appointment and Customer subject areas.
· Responsible for documenting as-is ETL processes of Xtime BI Metrics and recommending to-be ETL processes.
· Designing, Packaging and Integrating Xtime to-be BI architecture using Oracle PL/SQL.
· Responsible for demonstrating BI metrics performance improvement using Tableau BI reporting tool.
· Created Data Dictionary for the Repair Order (RO), appointment taker and agent subject areas.
2007-2010 Delta Dental Corporation – San Francisco, California
· Data mining and as-is analysis of Delta Dental member, provider and claim subject areas.
· Reverse Engineering third party Oracle database schema and building inferred relationships for to-be architecture using Erwin tool.
· Responsible for drawing as-is and to-be Data Flow diagrams by applying the Gane & Sarson methodology and using the MS Visio tool.
· Responsible for assessing mainframe data quality and reporting.
· Map & Gap Analysis of Business Processes and Data for provider and claim subject area using SAP Business Object tool.
· Responsible for standardization of data quality processes, data profiling and data governance strategies for MDM program.
· Developing data cleansing and ETL via SAP Business Objects Data Quality (BO/DQ) Software.
· Responsible for developing the Metadata Model and administering the Delta metadata repository using the ASG-ROCHADE tool.
· Customized the ASG-ROCHADE meta model using Java and the Eclipse developer tool.
· End-to-End data cleansing Architecture for Language Assistance Program (LAP) member survey data.
· Created Data Dictionary for Claim, Member and Provider subject areas.
· Responsible for writing Business Requirement, Functional specs and Detail design documents for Address standardization and conversion projects as a part of MDM program.
2004 Charles Schwab Corporation – San Francisco, California
· Re-designing and rectifying the Schwab Security and Fixed Income transactional databases.
· Documenting the current PL/SQL and standardizing it according to Schwab standards.
· Reverse engineering the physical database schema of MS SQL Server 2000.
· Analyzing business logic and mapping the Database schema with the requirement.
2005-2007 CISCO Corporation – San Jose, California
· Data mining and as-is analysis of Oracle 11i ERP Modules in CISCO environment.
· Reverse Engineered Order Entry, OE/OMT and IB ERP modules for the to-be architecture as a Single Source of Truth (SSOT).
· Responsible for drawing Data Transformation / Data Flow diagrams using Visio as well as IBM System Architect (Telelogic) tools.
· Managed and published as-is and to-be Business Processes, Logical & Physical data models, System Architecture, and Interface and Infrastructure Architecture for Oracle Customer Data Management and Oracle Trading Community Architecture (TCA).
· Map & Gap Analysis of Business Processes and Data using SAP Business Object tools for Order and Customer subject area.
· Generating Reporting and Metrics in SAP Business Objects (RAMBO).
· Created Data Dictionary for Order, Customer, Shipment and Service Contract Subject Areas.
· Responsible for writing Business Requirement, Functional specs and Detail design documents for QTC and CA BI Metrics projects.
EDUCATION
· M.S. in Statistics, California State University, Hayward, USA
· M.S. in Business Statistics, Bombay University, India
· B.S. in Business Statistics, Bombay University, India