WA26 Data Structures and Types

WA26 Day31 JSS -DM303 : Web Analytics-BL01M10-Day31-22Jan21

Data Connectivity: Offline and Online Data

Data Structures

A data structure is a way to store and organize data in a computer so that it can be used efficiently. It can be defined as a group of data elements that provides an efficient way of storing and organizing data. Some examples of data structures are arrays, linked lists, stacks, and queues.

Data structures allow information to be stored and used effectively. In particular, they:

  1. Provide a means of managing large datasets, such as databases or internet indexing services.

  2. Are necessary for the design of efficient algorithms.

  3. Allow safe storage of information on a computer.

  4. Allow data to be used and processed by a software system.
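Two of the structures named above, the stack (last-in, first-out) and the queue (first-in, first-out), can be sketched in a few lines of Python. The page names and job names below are made up for illustration:

```python
from collections import deque

# Stack: last-in, first-out (LIFO) - e.g. browser "back" history
stack = []
stack.append("page1")    # push
stack.append("page2")
top = stack.pop()        # pop returns the most recent item: "page2"

# Queue: first-in, first-out (FIFO) - e.g. jobs waiting to run
queue = deque()
queue.append("job1")     # enqueue
queue.append("job2")
first = queue.popleft()  # dequeue returns the oldest item: "job1"
```

A Python list gives an efficient stack, while `collections.deque` is used for the queue because removing from the front of a plain list is slow for large data.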

Two Types of Data

There are two general types of data, quantitative and qualitative, and both are equally important.

What is Qualitative Analysis?

Qualitative analysis uses subjective judgment to analyze a company's value or prospects based on non-quantifiable information, such as management expertise, industry cycles, strength of research and development, and labor relations.

Qualitative observations are made when you use your senses (sight, smell, touch, taste, and hearing) to observe the results. Quantitative observations are made with instruments such as rulers, balances, graduated cylinders, beakers, and thermometers; these results are measurable.

Qualitative Analysis

Qualitative analysis is the analysis of qualitative data, such as text data from interview transcripts. The emphasis in qualitative analysis is "sense making", or understanding a phenomenon, rather than predicting or explaining it.

The hair colors of players on a football team, the color of cars in a parking lot, the letter grades of students in a classroom, the types of coins in a jar, and the shape of candies in a variety pack are all examples of qualitative data so long as a particular number is not assigned to any of these descriptions.
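Because qualitative data consists of labels rather than numbers, the appropriate summaries are counts and the most common category (the mode), not means or sums. A minimal sketch, using a made-up list of car colors like the parking-lot example above:

```python
from collections import Counter

# Qualitative (categorical) data: labels, not measurements
car_colors = ["red", "blue", "red", "white", "blue", "red"]

# For categorical data we summarise with frequencies and the mode
counts = Counter(car_colors)          # frequency of each category
mode = counts.most_common(1)[0][0]    # the most frequent category
```

Here `counts` holds the frequency of each color and `mode` the most common one; averaging "red" and "blue" would be meaningless, which is exactly what distinguishes qualitative from quantitative data.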

Beta Testing/Analysis

Beta is a measure of the volatility, or systematic risk, of a security or portfolio compared to the market as a whole. Beta is used in the capital asset pricing model (CAPM), which describes the relationship between systematic risk and expected return for assets (usually stocks). In analytics, beta analysis is now also used to evaluate the various possible scenarios of new online campaigns.

Higher-beta investments tend to be more volatile and, therefore, riskier, but provide the potential for higher returns. Lower-beta investments pose less risk but generally offer lower returns.

What is Big Data?

Big data is the amount of data that will not practically fit into a standard (relational) database for analysis and processing, caused by the huge volumes of information created by human and machine-generated processes.

It refers to diverse data sets that include structured, semi-structured, and unstructured data, from different sources and in different volumes, from terabytes to zettabytes. These data sets are so large and diverse that it is difficult, if not impossible, for traditional relational databases to capture, manage, and process them with low latency.

Structured, unstructured, and semi-structured data: the distinction is whether the data fits (or can be made to fit) a defined structure, or whether it can only partly be categorized and placed in a structure, or not at all.

Machine Data: Machine data is the digital output created by the systems, technologies, and infrastructure powering modern businesses. "Machine data includes data from areas as varied as application programming interfaces (APIs), security endpoints, message queues, change events, cloud applications, call detail records and sensor data from industrial systems," said Davies. "Yet machine data is valuable because it contains a definitive, real-time record of all the activity and behavior of customers, users, transactions, applications, servers, networks and mobile devices."

What are the 4 types of analytics?

Depending on the stage of the workflow and the requirement of data analysis, there are four main kinds of analytics – descriptive, diagnostic, predictive and prescriptive.

Cycle of Predictive Analysis

Sense - Evaluate - Respond - Sense

Predictive Analytics

Predictive analytics is a statistical method that utilizes algorithms and machine learning to identify trends in data and predict future behaviors. It can take both past and current data and offer predictions of what could happen in the future.

Predictive analytics is used to determine customer responses or purchases, as well as to promote cross-sell opportunities. Predictive models help businesses attract, retain, and grow their most profitable customers, and improve operations: many companies use predictive models to forecast inventory and manage resources.
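The simplest predictive model of the kind described above is a least-squares trend line fitted to past data and extrapolated one period ahead. A minimal sketch, using hypothetical monthly visitor counts (the numbers are invented for illustration):

```python
# Fit y = a + b*x by ordinary least squares on past data,
# then extrapolate one period ahead.
months = [1, 2, 3, 4, 5]              # past periods
visitors = [100, 120, 145, 160, 185]  # hypothetical visitor counts

n = len(months)
mx = sum(months) / n
my = sum(visitors) / n

# Slope b and intercept a of the least-squares line
b = sum((x - mx) * (y - my) for x, y in zip(months, visitors)) \
    / sum((x - mx) ** 2 for x in months)
a = my - b * mx

forecast_month_6 = a + b * 6  # the prediction for the next period
```

For this sample series the fitted trend grows by 21 visitors per month, so the month-6 forecast is 205. Real predictive models are far more elaborate, but the principle, learning a pattern from past data and projecting it forward, is the same.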

Predictive analytics: Four prerequisites of an effective strategy

  1. Appropriate sources of data. One of the most fundamental points to consider is whether the available data is capable of providing an answer to every question the organisation has.

  2. Data cleanliness and usefulness.

  3. Automation and machine learning.

  4. Meeting business objectives.

What type of data analytics has the most value?

Prescriptive – This type of analysis reveals what actions should be taken. It is the most valuable kind of analysis and usually results in rules and recommendations for next steps. Predictive – An analysis of likely scenarios of what might happen; the deliverables are usually a predictive forecast.
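The "rules and recommendations" that prescriptive analytics produces can be sketched as a simple decision rule layered on top of a predictive output. The function name and the churn-probability thresholds below are hypothetical assumptions, chosen only to illustrate the idea:

```python
# Prescriptive analytics: turn a prediction into a recommended action.
# Thresholds are illustrative assumptions, not a real policy.
def recommend_action(predicted_churn_probability: float) -> str:
    if predicted_churn_probability >= 0.7:
        return "offer retention discount"
    elif predicted_churn_probability >= 0.4:
        return "send re-engagement email"
    return "no action"

action = recommend_action(0.75)  # a high-risk customer gets the discount
```

The predictive step supplies the probability; the prescriptive step converts it into the next action, which is why prescriptive analysis sits at the top of the value scale described above.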

Gartner Analytic Ascendancy Model

(The model charts the four analytics stages, descriptive, diagnostic, predictive, and prescriptive, with both value and difficulty increasing at each stage.)

Lakshminarasimman V Rao | 303 WEB Analytics |Digital Marketing| Study notes | Study Material | MBA | Corporate Neeti Consulting | Mysuru

All data above is a combination of material from the Internet and responses received from class students and interaction; the purpose of this document is research and education only.
