
# Introduction to Data Structure and Algorithm

In today’s world, data structures and algorithms are among the most important parts of computing; without them, most processes for storing and retrieving data would become complex and messy. They specify the optimal way to manipulate data and control the flow of a program, because the data held by any institute, organisation, or company is vital. This data can take different types and forms, depending on the needs and objectives of the organisation or individual.

Some important terms:

Instructions – Instructions are commands written in a high-level language (HLL, a human-readable language built from letters and numbers). These commands are written by programmers, and nowadays even AIs (such as ChatGPT) are capable of writing them. A collection of such orders or commands is called instructions.

Program – A collection of instructions is called a program.

Software – A set of programs is called software.

The term Data Structure is composed of two words, data and structure:

Data – the raw facts and figures that are processed to extract the relevant (necessary) information we want to work with.

Structure – the form in which data is stored so that we can easily modify, maintain, store, and update it for future use.

Algorithm – An algorithm is a process or set of well-defined instructions typically used to solve a particular class of problems or perform a specific type of computation. In simpler terms, it is a set of operations performed step by step to complete a task.

In other words, it is a blueprint of what to do and how to do it, with feasibility and clear objectives in mind.
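As an illustration, here is a minimal Python sketch (not from the article; the function name is invented) of an algorithm written as well-defined, step-by-step instructions: finding the largest value in a list.

```python
def find_max(numbers):
    """Return the largest value by checking each element exactly once."""
    largest = numbers[0]           # step 1: assume the first element is the largest
    for value in numbers[1:]:      # step 2: compare every remaining element
        if value > largest:        # step 3: keep whichever of the two is bigger
            largest = value
    return largest                 # step 4: after all comparisons, report the result

print(find_max([3, 7, 2, 9, 4]))   # prints 9
```

Each line maps to one step of the written procedure, which is exactly what makes it an algorithm rather than an informal description.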

Complexity – The term complexity refers to the effectiveness and efficiency of a program. There are two types of complexity:

1. Time Complexity – How much time is required to run the whole program. The less time it takes, the better the program.
2. Space Complexity – How much space the program takes in RAM (Random Access Memory) to execute. Sometimes extra space is needed during execution; this extra space is called Auxiliary Space.
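To make the auxiliary-space distinction concrete, here is a Python sketch (the function names are illustrative, not from the article): one reversal uses only O(1) auxiliary space, the other builds a whole new list and so uses O(n) auxiliary space.

```python
def reverse_in_place(items):
    """Reverse a list using O(1) auxiliary space: just two index variables."""
    left, right = 0, len(items) - 1
    while left < right:
        items[left], items[right] = items[right], items[left]  # swap the two ends
        left += 1
        right -= 1
    return items

def reverse_with_copy(items):
    """Reverse by building a new list: the result list is O(n) auxiliary space."""
    result = []                              # this extra list is the auxiliary space
    for i in range(len(items) - 1, -1, -1):  # walk the input backwards
        result.append(items[i])
    return result
```

Both produce the same answer; they differ only in how much extra memory they need beyond the input itself.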

Time complexity counts the time taken to execute every single line of code. Declaring and assigning a variable/identifier takes constant time, written Big-Oh(1) or O(1). A loop takes N steps when it runs N times, and nested loops take multiples of N (N * N, N * N * N, and so on); we try to avoid cases where the time complexity grows too large. In big projects time efficiency matters, because every second counts and a slow algorithm makes the whole application slower.
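The growth rates above can be sketched in Python (a toy example; the function names are invented for illustration):

```python
def constant_time(values):
    """O(1): a single assignment, independent of input size."""
    first = values[0]
    return first

def linear_time(values):
    """O(N): the loop body runs once per element."""
    total = 0
    for v in values:
        total += v
    return total

def quadratic_time(values):
    """O(N * N): the inner loop runs N times for each of the N outer iterations."""
    pairs = 0
    for a in values:
        for b in values:
            pairs += 1
    return pairs

print(quadratic_time([1, 2, 3]))  # 3 elements -> 3 * 3 = 9 iterations
```

Doubling the input size doubles the work for `linear_time` but quadruples it for `quadratic_time`, which is why nested loops are avoided where possible.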

Example – if you searched for something on Google and it took even ten seconds, you would close the page and blame your internet speed. Google Mail has millions of users, yet logging in, exchanging data, and sending mail is fast and efficient, because it is built on well-chosen data structures and algorithms that keep it zippy.

We now know that both complexities, time and space, are measured with respect to their input parameters. But the time required to execute code depends on several factors:

• The number of operations performed in the program.
• The speed and capacity of the device.
• The speed of data transfer or connectivity of the machine that executes the program.

So, to determine which algorithm is more efficient independently of these aspects, we use Asymptotic Notation.

Asymptotic Notation – It is a mathematical tool that calculates the required time in terms of input size and does not require the execution of the given code.

It neglects the system-dependent constants and is related to only the number of modular operations being performed in the whole program. The following 3 asymptotic notations are mostly used to represent the time complexity of algorithms:

• Big-O Notation (Ο) – Big-O notation specifically describes the worst-case scenario.
• Omega Notation (Ω) – Omega(Ω) notation specifically describes the best-case scenario.
• Theta Notation (θ) – Theta(θ) notation bounds a function from both above and below (a tight bound); it is often used to describe the average-case complexity of an algorithm.

The most used notation in the analysis of a code is the Big O Notation which gives an upper bound of the running time of the code (or the amount of memory used in terms of input size).
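As a rough illustration of a worst-case (Big-O) bound, consider a linear search in Python (a sketch, not from the article): in the worst case the target is absent and all N elements are compared.

```python
def linear_search(values, target):
    """Return (index, comparisons made). Worst case O(N): every element is checked."""
    comparisons = 0
    for i, v in enumerate(values):
        comparisons += 1
        if v == target:
            return i, comparisons   # best case Omega(1): target found immediately
    return -1, comparisons          # worst case O(N): target was never found

print(linear_search([4, 8, 15, 16], 99))  # (-1, 4): all 4 elements were compared
```

Big-O reports the upper bound (4 comparisons here), even though a lucky input finishes after just one.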

To learn about complexity analysis in detail, you can refer to our complete set of articles on the Analysis of Algorithms.