Analytics Methodology


The data science workflow is a non-linear, iterative process that involves asking questions, getting data, exploring data, modelling data, and communicating results. Covering the full process requires many skills and tools. The process begins by asking questions, which guide the flow of activities through the data processing and analysis stack. The skills required include thinking both inside and outside the box, thinking critically and scientifically, and talking to experts in the domain; the experience of the data scientist also plays a large role. Formulating hypotheses, the answers to which solve known problems or unearth unknown solutions, in turn drives business value. Iterating on the questions, playing with what-if scenarios and doing background research help a data scientist become more familiar with the dataset.

Sourcing the data and making it usable is a laborious and cumbersome aspect of data science. It entails web scraping, data cleaning, querying databases and a good deal of programming, not only to handle data acquisition at volume, variety and velocity, but also to standardize, normalize and ensure the quality of the data. Matching and joining several disjoint datasets or data files is itself a challenging task.
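As a simple illustration of that matching and joining step, the minimal sketch below (Python with pandas) standardizes a join key across two hypothetical extracts before merging them; the file names and column names are assumptions for illustration only, not part of any specific engagement.

    import pandas as pd

    # Hypothetical extracts; file and column names are illustrative only.
    customers = pd.read_csv("customers.csv")   # e.g. columns: cust_id, name, region
    orders = pd.read_csv("orders.csv")         # e.g. columns: CustomerID, amount

    # Standardize the join keys so the two sources agree on type and formatting.
    customers["cust_id"] = customers["cust_id"].astype(str).str.strip().str.upper()
    orders["cust_id"] = orders["CustomerID"].astype(str).str.strip().str.upper()

    # A left join keeps every customer and flags those with no matching orders.
    merged = customers.merge(orders.drop(columns=["CustomerID"]),
                             on="cust_id", how="left", indicator=True)
    unmatched = merged[merged["_merge"] == "left_only"]
    print(f"{len(unmatched)} customers have no matching orders")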

Exploring the data to understand it, develop hypotheses, identify patterns and spot anomalies or outliers is essential to deciding which type of analysis will answer the business questions. Performing such analysis by implementing various statistics-based algorithms and tools on a distributed, parallel architecture calls for efficient distributed-programming motifs.
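A minimal sketch of the kind of exploratory check described above, flagging outliers in a numeric column with the interquartile-range rule; the column name and values are invented for illustration:

    import pandas as pd

    # Illustrative data; in practice this would come from the sourced dataset.
    df = pd.DataFrame({"daily_sales": [120, 135, 128, 131, 950, 127, 133, 4, 129]})

    # Interquartile-range (IQR) rule: values far outside the middle 50% are flagged.
    q1, q3 = df["daily_sales"].quantile([0.25, 0.75])
    iqr = q3 - q1
    lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    outliers = df[(df["daily_sales"] < lower) | (df["daily_sales"] > upper)]

    print(df["daily_sales"].describe())  # quick profile of the distribution
    print(outliers)                      # rows worth a closer look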

Communicating the insights gathered from the analysis takes the form of simple stories, visualizations and dashboards (the data product) that a non-data-scientist can understand and build a conversation around.



The increase in data sources and complexity has meant an increased emphasis on data collection, acquisition and processing techniques. We collect data from both primary and secondary sources, and we have the ability to process large datasets using open-source distributed and parallel frameworks.


The data is cleansed, normalized, standardized and transformed. Data review and profiling techniques are embedded to validate and compensate for data deficiencies or missing data, and to identify all relevant resources needed to produce an analytical outcome.
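A minimal sketch of this cleansing step, assuming a small pandas DataFrame with a hypothetical numeric column: it profiles missing values, imputes them with the median, then standardizes (z-score) and normalizes (min-max) the column.

    import pandas as pd

    # Illustrative frame with a gap; column names are assumptions for this sketch.
    df = pd.DataFrame({"revenue": [100.0, 250.0, None, 175.0, 300.0]})

    # Profile: how much is missing, and what does the distribution look like?
    print(df["revenue"].isna().sum(), "missing values")
    print(df["revenue"].describe())

    # Compensate for missing data with a simple median imputation.
    df["revenue"] = df["revenue"].fillna(df["revenue"].median())

    # Standardize (zero mean, unit variance) and normalize (scale to [0, 1]).
    df["revenue_std"] = (df["revenue"] - df["revenue"].mean()) / df["revenue"].std()
    df["revenue_norm"] = (df["revenue"] - df["revenue"].min()) / (
        df["revenue"].max() - df["revenue"].min())
    print(df)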


We use triangulation and other techniques to assess the right tools and techniques for the problem at hand. We also look for precedents among the use cases and techniques reported in academia and published work, and we acquire the data necessary to achieve the analytical outcome.


We have a team of people who specialize in state-of-the-art data visualization tools. The rendering is delivered in various communication formats through channels such as mobile and web.

Service Offerings



PoC and Pre-Pilots

We define future-state vision and cloud service models for migration to the cloud.

  • Demonstrate the potential value of big data investments through a proof of concept.


Strategy Roadmap

The roadmapping involves taking inventory of the available data: how it is structured, what value it offers and how to validate it for use, so that you can prioritize your asset utilization. It documents the opportunities big data offers you and how they align to your strategic priorities, helping you build the business case. This service is aimed at showing how to make better, more informed business decisions in real time, using sophisticated analysis of multiple data sources. The expected outcome is a business that becomes more responsive to market changes through real-time data analysis.

We determine which business and IT assets are most suited for cloud migration and identify business‑relevant cloud services.

  • DW Rationalization/Simplification
  • Portfolio analysis (Systems, reports, ETLs)
  • High-level Project Planning
  • Scoping for unstructured requirements
  • Requirements, Prioritization and Estimation

Data Engineering

Data kept in spreadsheets and log files, information embedded in photos, videos, tweets and emails, streams of data from connected devices, sensors and machines: we have been surrounded by vast quantities of data for a long time. Competitive and market pressures have turned the information locked in this multi-structured data into a new competitive opportunity. Our Data Engineering offering is geared towards achieving this end.

  • Data Integration, ETL, Data Munging and Transformation
  • Unstructured/Structured Data Processing
  • Modeling Data -- Using Standard Techniques
  • Develop MapReduce code, transformations and custom code (see the sketch below)
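As an indicative example of the MapReduce development mentioned above, here is a minimal Hadoop Streaming-style sketch in Python: a mapper that emits tab-separated key/value pairs and a reducer that sums values per key. The delimiter and field positions are assumptions for illustration, not a prescribed format.

    import sys

    def mapper():
        # Emit "key<TAB>value" pairs; here column 0 is the key, column 2 a numeric value.
        for line in sys.stdin:
            fields = line.rstrip("\n").split(",")
            if len(fields) > 2:
                print(f"{fields[0]}\t{fields[2]}")

    def reducer():
        # Input arrives sorted by key (Hadoop's shuffle), so totals can be streamed.
        current_key, total = None, 0.0
        for line in sys.stdin:
            key, value = line.rstrip("\n").split("\t", 1)
            if key != current_key and current_key is not None:
                print(f"{current_key}\t{total}")
                total = 0.0
            current_key = key
            total += float(value)
        if current_key is not None:
            print(f"{current_key}\t{total}")

    if __name__ == "__main__":
        # Run as: script.py map  (mapper phase)  or  script.py reduce  (reducer phase)
        mapper() if sys.argv[1] == "map" else reducer()

With Hadoop Streaming, the same script would typically be supplied to both the -mapper and -reducer options, invoked with the appropriate argument for each phase.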


Data Integration Services

  • Search & Document Indexing
  • Hadoop/NoSQL Development and Maintenance
  • Data warehouse augmentation
  • Planning and Setting up Clusters and NoSQL
  • Performance Monitoring
  • Capacity Planning
  • Configuration Management (Dev/Test/Prod)
  • Administration functions for NoSQL/Hadoop
  • Job Workflow control and Break-fix

We bring a comprehensive, quantitative approach and innovation to our end-to-end Advisory and Consulting Services, which are envisioned to help customers turn their data assets into information and transform that information into business insights, thus making data resources a source of sustainable competitive advantage. We understand that changing the mindset of stakeholders requires various agents; critical among them are knowledge dissemination, demonstration and creation of value, and learning from case studies and best practices. Our advocacy initiatives achieve this through extensive community engagement with product vendors, IT service providers, institutions and industry bodies.

Analytics Strategy:

We provide a time-tested analytics envisioning approach to help customers create a roadmap for their needs, keeping in mind business needs, technology fit and implementation feasibility.

Business Case Building:

Our eclectic valuation approaches, such as a Return on Investment (ROI) calculator, a comprehensive tool that eases cost-benefit analysis, help in quickly building a future-proof business case for any substantial investment in analytics infrastructure and technologies. Scoping for unstructured requirements is where our consultants revel, bringing considerable value to customer analytics roadmaps and making breakthrough outcomes achievable.

  • Business Discovery
  • Business Case development
  • Business Alignment

Performance Engineering:

Application success and business success are inextricably intertwined in the digital systems of the current generation of technologies. In big data analytics, as systems become more responsive, the earlier batch-oriented systems are being pushed hard towards becoming low-latency systems. This entails not only improving the performance of individual architectural components but also tuning the whole lineage of systems to meet the SLAs the business demands.

Analytics Modeling and Implementation:

In an increasingly quantitative, digital world, everything is, or will be, recorded or counted. You need to find value in that data, spotting the seemingly unrelated patterns in data, mining and refining it, to create insights with commercial value. Business Analytics is the solution that turns the promise of big data into a source of practical, continuing insights to improve organizational performance.

  • Analytics (Machine Learning, Statistical Programming, Text Mining), as sketched below.
  • Algorithm development
  • Reports/Visualizations
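As an indicative example of the text-mining item above, the following minimal sketch (assuming scikit-learn is available; the toy documents and labels are invented purely for illustration) trains a small TF-IDF plus logistic-regression classifier:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Toy corpus and labels, invented for this sketch only.
    docs = [
        "shipment delayed and customer unhappy",
        "package arrived late, refund requested",
        "great service, fast delivery",
        "very satisfied with the quick response",
        "order lost in transit, filing complaint",
        "excellent support team, thank you",
    ]
    labels = ["negative", "negative", "positive", "positive", "negative", "positive"]

    # TF-IDF features feed a simple linear classifier.
    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(docs, labels)

    print(model.predict(["delivery was fast and support was great"]))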

Solution Architecture and Design:

We utilize our accelerators and pre-built components in our solution designs. We approach a complex analytical problem by understanding the business imperatives and algorithmic feasibility, the model performance goals, and a multitude of factors such as the robustness, performance and scalability of the solution.

  • Architecture, Design and Performance
  • Solution Architecture and Design
  • Performance assessments and improvement roadmap
  • Technology Architecture (Loosely coupled systems, Data Appliance, Cloud?)
  • Code assessment
  • Artifact Reuse
  • Recommend Tool Best practices and Standards

Technology Architecture and Selection:

Technology selection is a vital process a company must employ to acquire the modern information technology that will drive business-process and business-performance improvement. Selecting and implementing a new technology is one of the most disruptive undertakings for an enterprise, and its success hinges on matching the right technology to the achievement of critical business goals. We specialize in the methodology and tools necessary to properly assess candidate technologies, leveraging independent market research to ensure that the technology selection is unbiased and long-lasting.


We also provide an Emerging Technology Advisory, considering whether open-source or commercial software is appropriate for a given solution, and recommending whether a particular piece of software should be built or bought, be it packaged software, industrial models or custom software.

CoE Services are aimed at IT service providers whose big data practices are either yet to be set up or in nascent stages. We provide the extensive array of core services these organizations require to grow their business with their existing clientele. The following outcomes are delivered:


  • System Performance Improvement
  • Reducing the Cost Of Quality
  • Improved Team Skills/productivity


Technical Reviews and Health Checks:

  • Evaluation, Assessment and Diagnostics
  • Current Landscape Assessment
  • Performance Diagnostic (DB, ETL, OLAP)
  • Enterprise Application/ Portfolio Assessment


Proficiency Development:

  • Competency measurement
  • Training needs analysis
  • Lateral cross-training
  • Greenfield training
  • Enabling certifications


Sourcing Support:

  • Interview guides
  • Interviewer training and empanelment
  • Support Recruitment / Referral drives
  • Weekend support
  • Interview bandwidth support


Expert On Demand:

  • Estimation support and Reviews
  • Project health-check
  • Technical Reviews
    • Architecture assessment
    • Code assessment
    • Tool best practices and Standards
  • Data Quality assessments
  • Performance assessments and improvement roadmap
    • Performance Diagnostic (DB, ETL, OLAP)
  • Portfolio analysis (systems, reports, ETLs)
  • Design workshops and latest techniques


Virtual CoE Labs:

  • Sandboxes for Analytics - to build competencies
  • Support Solution accelerators that complement BI products
  • Support Idea Incubation labs and develop PoCs
  • Understand and drive adoption of new technology/tools
  • Create BI point solutions for industry/domain specific areas
  • Improve our Solutioning capability
  • Develop tool evaluations and help create benchmarks