Hello people of the internet! Today you're going to learn a little about algorithms.


So let's start from the basics...

What exactly is an algorithm?

Well, informally, it is a computational procedure that takes in some values as input and produces some values as output. It's basically a sequence of steps a computer follows, transforming input data into output data.

Algorithms can also be considered tools for solving well-specified computational problems. The statement of the problem specifies, in general terms, the desired input/output (I/O) relationship, and the algorithm describes a specific procedure for achieving that relationship.

For example, let's consider the age-old **Sorting Problem.** The input is a bunch of numbers given in random order, and the job of the algorithm is to SORT these numbers into either ascending or descending order.

In this case the problem hands the algorithm a random bunch of numbers as input and asks it to weave its magic and spew those numbers back out in proper ascending or descending order.
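As a concrete sketch of that I/O relationship (using Python here purely for illustration, with made-up input numbers):

```python
# Input: a sequence of numbers in arbitrary order.
numbers = [31, 41, 59, 26, 41, 58]

# Output: the same numbers, rearranged into ascending order.
ascending = sorted(numbers)
print(ascending)   # [26, 31, 41, 41, 58, 59]

# ...or into descending order.
descending = sorted(numbers, reverse=True)
print(descending)  # [59, 58, 41, 41, 31, 26]
```

Any procedure that reliably turns the first list into the second is a solution to the sorting problem; the rest of this article is about *how* different procedures get there.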

This, of course, is on the simpler side of the topic, but there are many famous implementations of algorithms.

But why are algorithms used?

When we are developing applications, what technologies do we normally prioritize over others?

- Devs prioritize GUIs and focus on their components
- Manufacturers focus on better fabrication technologies
- Some programmers might even focus on implementing object-oriented systems in their software life cycle

So why implement algorithms in the first place?

- The main reason algorithms matter is that applications themselves require them (explicitly or not) for certain features. For example, a web-based travel service will require loads of hardware, refined GUIs and so on...

- But it would also require features such as finding the shortest routes, or sorting based on user reviews and other criteria. These are best implemented using standard algorithms (or something better, if you can think of something).

With the ever-increasing capacities of computers, we now have to solve larger problems than ever before. Good algorithms get the job done quicker and more efficiently, and they work much better than plain brute-force methods.

So let's go over sorting algorithms.

Again, these algorithms take input data and return it in numerically ascending order or alphabetical order. For simplicity's sake we will look at numerical examples only.

There are many sorting algorithms, and each is fundamentally different in the way it works. Normally an algorithm is chosen depending on the given data, as some algorithms suit a certain organization of data better than others (this will be explained later).

Three commonly used sorting algorithms are:

1) Merge Sort

2) Quick Sort

3) Bubble Sort

#### Merge Sort:

Conceptually, a merge sort works as follows:

- Divide the unsorted list into *n* sublists, each containing one element (a list of one element is considered sorted).
- Repeatedly merge sublists to produce new sorted sublists until there is only one sublist remaining. This will be the sorted list.
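The two steps above can be sketched in Python. (This is just one way to write it; the `merge` helper name is my own, and the version below builds new lists rather than merging in place.)

```python
def merge_sort(a):
    # A list of 0 or 1 elements is already sorted.
    if len(a) <= 1:
        return a
    # Divide: split the list in half and sort each half recursively.
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Merge the two sorted halves into one sorted list.
    return merge(left, right)

def merge(left, right):
    merged, i, j = [], 0, 0
    # Repeatedly take the smaller front element of the two sublists.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:   # <= keeps equal elements in their original order
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # One sublist is exhausted; append whatever remains of the other.
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

For example, `merge_sort([38, 27, 43, 3, 9, 82, 10])` returns `[3, 9, 10, 27, 38, 43, 82]`.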

#### Quick Sort:

Conceptually, a quick sort works as follows:

- Pick an element, called a **pivot**, from the array.
- Reorder the array so that all elements with values less than the pivot come before the pivot, while all elements with values greater than the pivot come after it (equal values can go either way). After this partitioning, the pivot is in its final position. This is called the **partition** operation.
- Recursively apply the above steps to the sub-array of elements with smaller values and separately to the sub-array of elements with greater values.
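These steps can be sketched in Python too. (Note the hedges: the classic algorithm partitions in place, whereas this version builds new lists for clarity, and picking the middle element as the pivot is just one possible choice.)

```python
def quick_sort(a):
    # Base case: 0 or 1 elements are already sorted.
    if len(a) <= 1:
        return a
    # Pick a pivot (here, simply the middle element).
    pivot = a[len(a) // 2]
    # Partition: elements smaller than, equal to, and greater than the pivot.
    smaller = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    # Recursively sort the two sub-arrays and glue the pieces back together;
    # the pivot (and its duplicates) sit in their final position in the middle.
    return quick_sort(smaller) + equal + quick_sort(greater)
```

For example, `quick_sort([3, 6, 1, 8, 2, 8])` returns `[1, 2, 3, 6, 8, 8]`.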

#### Bubble Sort:

Conceptually, a bubble sort works as follows:

- Compare the first two elements and swap them if they are out of order.
- Then compare the 2nd and 3rd elements and swap them if needed.
- Repeat these comparisons until the end of the array is reached; that completes one pass.
- Repeat whole passes over the array until no more swaps are needed, i.e. the elements are fully sorted.
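A minimal Python sketch of those passes (one possible variant; this one stops as soon as a full pass makes no swaps):

```python
def bubble_sort(a):
    a = list(a)  # work on a copy so the caller's list is untouched
    n = len(a)
    swapped = True
    # Repeat whole passes until a pass makes no swaps (fully sorted).
    while swapped:
        swapped = False
        # One pass: compare each adjacent pair and swap if out of order.
        for i in range(n - 1):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
    return a
```

For example, `bubble_sort([5, 1, 4, 2, 8])` returns `[1, 2, 4, 5, 8]`. Each pass "bubbles" the largest remaining element to the end, which is where the algorithm gets its name.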

So these algorithms all do the same job, but in different ways. One thing that differentiates them is efficiency.

So how do we determine the efficiency of an algorithm? This is called the analysis of run-time complexity (the words are fancy, but the concept is simple).

We normally consider three main notations for the time taken by an algorithm (to sort, in this case):

1) Big Ω (Omega) notation - This gives a lower bound: the least amount of time an algorithm can take to complete. For example, the best case of quick sort is Ω(*n* log *n*), where *n* is the number of elements. This means the minimum number of steps taken by quick sort is proportional to the number of elements times the log (base 2) of that number, multiplied by a constant *c*, the time taken to perform each step.

2) Big O (Oh...) notation - This gives an upper bound: the maximum time an algorithm can take. For sorting algorithms this worst case often occurs when the elements start out badly arranged (for quick sort with a naive pivot choice, an already-sorted or reverse-sorted input). This is the most common notation and is considered the standard way of describing algorithms, because it guarantees the algorithm will always finish within the given time frame. In the case of quick sort the worst case is O(*n*²), meaning the time taken is the time per step times the number of elements squared.

3) Big Θ (Theta) notation - This gives a tight bound: the running time is pinned down from above and below by the same formula. The average case of quick sort is Θ(*n* log *n*) (it's called quick sort for a reason: its average case matches its best case, which is why it is considered one of the most efficient sorts in practice).

One more property worth knowing about, separate from speed, is whether a sort is "stable". A stable sort keeps elements with equal keys in the same relative order they had in the input; merge sort is stable, while typical in-place implementations of quick sort are not. Checking whether an algorithm is stable can be tedious, but the mechanic itself is simple to explain.
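To get a feel for how differently *n* log *n* and *n*² grow, here is a quick numerical comparison (taking the constant *c* as 1, so the numbers are step counts, not seconds):

```python
import math

# Compare the two growth rates at a few input sizes.
for n in [10, 1_000, 1_000_000]:
    n_log_n = n * math.log2(n)
    n_squared = n ** 2
    print(f"n={n:>9,}  n*log2(n)={n_log_n:>14,.0f}  n^2={n_squared:>17,}")
```

At a million elements, *n* log₂ *n* is roughly 20 million steps, while *n*² is a trillion, which is why the gap between these bounds matters so much in practice.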

So let's say we are sorting a bunch of cards by rank. If we take just the numbers into account, both arrangements in the classic example are considered sorted. But if we also care about the suits, and the 5 of hearts came before the 5 of spades in the input, a 'stable' sort guarantees the 5 of hearts still comes before the 5 of spades in the output. A stable sort may take somewhat longer, but it gives this desired result and preserves other properties of the data as well.
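The card example can be made concrete. Python's built-in `sorted` happens to be a stable sort, so sorting by rank alone preserves the original order of cards with equal ranks:

```python
# Each card is a (rank, suit) pair; we sort by rank only.
cards = [(7, "spades"), (5, "hearts"), (2, "clubs"), (5, "spades")]

by_rank = sorted(cards, key=lambda card: card[0])
print(by_rank)
# [(2, 'clubs'), (5, 'hearts'), (5, 'spades'), (7, 'spades')]
# The 5 of hearts stays ahead of the 5 of spades because
# that was their relative order in the input.
```

An unstable sort would be free to emit the two fives in either order, since it only looks at the ranks.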

The above algorithms were only explained conceptually. Now I want to take it one step further and show how algorithms are normally expressed by computer scientists: pseudocode, a high-level informal description of an algorithm that follows basic programming-language conventions.

Here's the pseudocode for insertion sort:

```
for i = 1 to length(A) - 1
    x = A[i]
    j = i
    while j > 0 and A[j-1] > x
        A[j] = A[j-1]
        j = j - 1
    end while
    A[j] = x
end for
```

This time, instead of explaining the sort conceptually, I will go through the pseudocode line by line.

**for** i = 1 **to** length(A) - 1

This statement starts a loop that repeats the statements below once for each element of the array after the first (a single element is already sorted, so we begin at index 1).

x = A[i]

j = i

This gives a variable x the value of the ith element (the current element to be inserted) and gives another variable j the value of i, for the conditional loop that follows.

while j > 0 and A[j-1] > x

A[j] = A[j-1]

j = j - 1

end while

A[j] = x

end for

Here's where most of the fun happens. The loop takes the current element x and checks whether the element before it is bigger, then the one before that, and so on, back toward the first element. Each iteration shifts a larger previous element one position to the right; when the loop stops, x is placed into the gap that opened up (A[j] = x).

Then the whole process repeats with the next number in the array as the current element.
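The pseudocode above translates almost line for line into Python (shown here just as an illustration; like the pseudocode, it sorts the list in place):

```python
def insertion_sort(A):
    for i in range(1, len(A)):        # for i = 1 to length(A) - 1
        x = A[i]                      # the element to insert
        j = i
        # Shift larger elements one slot to the right...
        while j > 0 and A[j - 1] > x:
            A[j] = A[j - 1]
            j = j - 1
        # ...then drop x into the gap that opened up.
        A[j] = x
    return A
```

For example, `insertion_sort([5, 2, 4, 6, 1, 3])` returns `[1, 2, 3, 4, 5, 6]`.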

This marks the end of the article, and hopefully you have learned a little bit about algorithms. If anybody has any doubts or wants to know more, feel free to IM me on my G+ account given below.

References:

1) http://en.wikipedia.org/wiki/Sorting_algorithm

2) http://www.sorting-algorithms.com/

3) https://www.youtube.com/watch?v=SJwEwA5gOkM

4) Introduction to Algorithms, 3rd Edition

Attributions

- "Insertion-sort-example-300px" by Swfung8 - Own work. Licensed under CC BY-SA 3.0 via Wikimedia Commons - http://commons.wikimedia.org/wiki/File:Insertion-sort-example-300px.gif#/media/File:Insertion-sort-example-300px.gif
- "Sorting stability playing cards" by User:Dcoetzee, User:WDGraham - Own work, based on File:Cards-2-Heart.svg, File:Cards-7-Spade.svg, File:Cards-5-Spade.svg, File:Cards-5-Heart.svg. Licensed under CC0 via Wikimedia Commons - http://commons.wikimedia.org/wiki/File:Sorting_stability_playing_cards.svg#/media/File:Sorting_stability_playing_cards.svg
- "Bubble-sort-example-300px" by Swfung8 - Own work. Licensed under CC BY-SA 3.0 via Wikimedia Commons - http://commons.wikimedia.org/wiki/File:Bubble-sort-example-300px.gif#/media/File:Bubble-sort-example-300px.gif
- "Quick-sort-example-300px" by Swfung8 - Own work. Licensed under CC BY-SA 3.0 via Wikimedia Commons - http://commons.wikimedia.org/wiki/File:Bubble-sort-example-300px.gif#/media/File:Bubble-sort-example-300px.gif
- "Merge-sort-example-300px" by Swfung8 - Own work. Licensed under CC BY-SA 3.0 via Wikimedia Commons - http://commons.wikimedia.org/wiki/File:Bubble-sort-example-300px.gif#/media/File:Bubble-sort-example-300px.gif
