# Complexity or Efficiency of an Algorithm in a Data Structure

Data structures are a fundamental part of programming in any language. In this article, I explain the efficiency, or complexity, of an algorithm operating on a data structure.

### Complexity or Efficiency of an Algorithm in a Data Structure

The complexity of an algorithm is a function of the size of its input data, and refers to the time the algorithm takes to process that data, the space it uses while processing it, or both.

By expressing these measures mathematically as functions, we can determine how efficient one algorithm is compared to another. Analyzing algorithms is a complex task in computer science, because any comparison of algorithms requires criteria that tell us how efficient each algorithm is. Suppose an algorithm M operates on input data of size n. The size of the input data is the first criterion for judging any algorithm.

The number of steps the algorithm performs while processing the data is the second criterion for judging its efficiency.

How efficient an algorithm is therefore depends on the size of its input data and the number of steps it performs while processing that data. The time and space an algorithm uses to produce a solution together indicate its efficiency. Suppose that M is an algorithm and n is the size of its input data. For a searching or sorting algorithm, the number of operations performed on the data indicates the algorithm's running time.

Suppose a file contains 1000 records and we want to find the record with a particular name. The algorithm may need up to 1000 comparisons to reach the desired record. The time taken to perform these comparisons is the algorithm's time, and the memory used while performing them is the algorithm's space. The complexity of an algorithm M is expressed as a function f(n) of the size n of its input data. For example, if the file contains only 1 record, searching it for a name has low complexity; searching a file of 100 records with the same algorithm has higher complexity. If the desired record is not in the file at all, the complexity of the search is at its maximum. If the desired record turns out to be in the middle of the file, the algorithm exhibits its average case.
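The record-search example above can be sketched as a simple linear search that counts its comparisons. This is an illustrative sketch; `linear_search` and the sample `records` list are hypothetical names, not taken from the article:

```python
def linear_search(records, target):
    """Scan the list front to back, counting comparisons as we go."""
    comparisons = 0
    for index, record in enumerate(records):
        comparisons += 1
        if record == target:
            return index, comparisons  # found: position and cost so far
    return -1, comparisons             # not found: cost is len(records)

# A hypothetical file of 1000 records, each identified by a name.
records = [f"record-{i}" for i in range(1000)]

# Best case: the target is the very first record, so 1 comparison suffices.
print(linear_search(records, "record-0"))    # (0, 1)

# A target near the end costs close to 1000 comparisons.
print(linear_search(records, "record-999"))  # (999, 1000)
```

The comparison count grows with the position of the target, which is exactly why the size n of the input data bounds the cost of the search.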

If the desired record is not found even after searching the complete file, the algorithm exhibits its worst case. In the worst case, a search algorithm must perform as many comparisons as there are records in the file; for a file of 100 records, it may have to perform 100 comparisons. Mathematically, we can express this as follows:

C(n) = n

where C(n) is the number of comparisons and n is the size of the input data. Thus the worst-case complexity of a linear search algorithm is C(n) = n.

For the average case, each of the n possible locations is equally likely, so the probability of the data item being at any given location is 1/n. That is, if a data structure holds n data items, the target may be at any location from 1 through n, and finding it at location k costs k comparisons. The expected number of comparisons is therefore:

C(n) = 1 × 1/n + 2 × 1/n + ... + n × 1/n

= (1 + 2 + ... + n) × 1/n

= n(n + 1)/2 × 1/n

= (n + 1)/2

This probability equation shows that, on average, the algorithm finds the data item after approximately n/2 comparisons, where n is the total number of data items in the list.
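The (n + 1)/2 result can be checked empirically by averaging the search cost over every possible target position. This sketch reuses a hypothetical comparison-counting search; the names are illustrative:

```python
def comparisons_to_find(items, target):
    """Number of comparisons a linear search makes before finding target."""
    count = 0
    for item in items:
        count += 1
        if item == target:
            return count
    return count

n = 101
items = list(range(n))

# Each position is equally likely (probability 1/n), so averaging the
# cost over all n possible targets should give exactly (n + 1) / 2.
total = sum(comparisons_to_find(items, target) for target in items)
average = total / n

print(average)      # 51.0
print((n + 1) / 2)  # 51.0
```

The empirical average matches the derived formula exactly, because the sum of costs 1 + 2 + ... + n equals n(n + 1)/2.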