  • Category: SQL Server

    Performance issues when fetching data

    Hi All,

    In my project, a specific SQL table contains 6 million records, and it is affecting application performance when fetching records from the database. What is the best way to overcome this issue? Please share your suggestions.

    Best Regards,
    Simiyon A
  • #767237
    Solving performance issues in a database is an art. If you have a specific table that holds a lot of records, you have to analyse the query that fetches them, and create indexes based on that query.
    In SQL Server we have two kinds of index: clustered and nonclustered.

    Consider the query SELECT Name, age FROM emp WHERE EmployeeID = 500. When this query is executed, the query optimizer evaluates each available method for retrieving the data and selects the most efficient one. That may be a table scan, or a scan of one or more indexes if they exist.

    So create indexes based on how you fetch the data.
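    For the sample query above, a nonclustered index on the filter column that also covers the selected columns would let the optimizer seek instead of scanning the whole table. A minimal sketch, assuming the emp table from the example (the index name is made up):

    ```sql
    -- Nonclustered index on the WHERE column; INCLUDE covers the SELECT list
    -- so the query can be answered from the index alone, without key lookups.
    CREATE NONCLUSTERED INDEX IX_emp_EmployeeID
    ON dbo.emp (EmployeeID)
    INCLUDE (Name, age);
    ```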

    By Nathan
    Direction is more important than speed

  • #767240

    You have to follow optimization techniques to improve the performance of data fetching.

    Optimization techniques in database side:

    1) Index each table appropriately
    2) Define proper relationships between tables
    3) When joining, join the master table first, then the child tables, and apply all filter conditions while fetching the data
    4) Select only the columns you need rather than fetching all columns
    5) Try to avoid aggregate functions and grouping clauses where possible
    6) Try to avoid temporary tables and table variables, which occupy memory; use CTEs instead

    If you follow all of these, you can improve the performance. There are more options beyond the above, but these are the basic things to keep in mind when considering performance.
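    Points 3), 4) and 6) above can be sketched together in one query. All table and column names here are made up for illustration:

    ```sql
    -- CTE instead of a temp table; only the needed columns; filter applied
    -- while fetching rather than after.
    WITH RecentOrders AS
    (
        SELECT o.OrderID, o.CustomerID, o.OrderDate
        FROM dbo.Orders AS o              -- master table first
        WHERE o.OrderDate >= '2016-01-01'
    )
    SELECT r.OrderID, c.CustomerName      -- selected columns, not SELECT *
    FROM RecentOrders AS r
    INNER JOIN dbo.Customers AS c         -- then the child/lookup table
        ON c.CustomerID = r.CustomerID;
    ```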

    Give respect to your work, Instead of trying to impress your boss.


  • #767241
    First of all, check that your database tables have proper indexing.
    After that, review your SQL queries/stored procedures and try to optimize them, e.g. whether they contain LIKE clauses or unwanted joins, and whether the WHERE condition is placed on numeric columns or not.
    Also look at the query execution plan for your SQL queries.
    Then, in the application code, check which data structures you use to fetch the data, e.g. use a DataReader instead of a DataTable, etc.
    Finally, check that your database transactions and connections are properly closed when the operations are done.
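    To follow the execution-plan advice above, SQL Server can also report I/O and timing statistics for each statement. A minimal sketch, reusing the emp query from an earlier reply:

    ```sql
    -- Show logical reads and CPU/elapsed time in the Messages tab (run in
    -- SSMS; the actual execution plan can be toggled there with Ctrl+M).
    SET STATISTICS IO ON;
    SET STATISTICS TIME ON;

    SELECT Name, age FROM emp WHERE EmployeeID = 500;

    SET STATISTICS IO OFF;
    SET STATISTICS TIME OFF;
    ```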

  • #767248
    Thank you all. Useful information.

  • #767251
    There are a lot of factors affecting your SQL performance when fetching records from the database. The first thing is your index and your primary key: your primary key should always be a numeric datatype, not a string (varchar), and when you search/fetch data it should be fetched by a numeric column.
    There are some workarounds that you can check:
    - Create a temporary table on the subset (rows and columns) of data you are interested in. The temporary table should be much smaller than the original source table.
    - Use CTEs a lot with ad-hoc queries -- they help a lot with building (and testing) a query piece by piece.
    - You can create views or indexed views on the subset of data you are interested in and run queries on the view -- it should contain only the interesting subset of data, much smaller than the whole table.
    - Running a star query (SELECT * FROM) on a big table is not a good thing.
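    The indexed-view idea can be sketched like this. All object names here are hypothetical; note that indexed views require SCHEMABINDING and a unique clustered index:

    ```sql
    -- A schema-bound view over only the subset of interest...
    CREATE VIEW dbo.ActiveOrders
    WITH SCHEMABINDING
    AS
    SELECT OrderID, CustomerID, OrderDate
    FROM dbo.Orders
    WHERE Status = 1;
    GO
    -- ...materialized by a unique clustered index, so queries against the
    -- view read the much smaller indexed subset instead of the whole table.
    CREATE UNIQUE CLUSTERED INDEX IX_ActiveOrders
    ON dbo.ActiveOrders (OrderID);
    ```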

    Editor, DotNetSpider MVM
    Microsoft MVP 2014 [ASP.NET/IIS]

  • #767588
    Send me the actual execution plan of your query; I will let you know what you need to change.
    Many Thanks
    Tejinder Singh Barnala
    /*I have the simplest tastes. I am always satisfied with the best*/

  • #767637
    Hi Simiyon,
    Considering the huge amount of data in that particular table, even if you apply the tricks mentioned above you may get some improvement, but not an extreme one.

    I have faced the same issue with a table holding a huge amount of data; even after applying everything I knew and reading the execution plans, the problem remained.

    To resolve such cases, I strongly suggest archiving the old data that has not been used for years and is not needed currently, or, if that data is required, fetching it in chunks based on some criteria.
    This approach makes multiple calls to the database, but it surely resolved the issue.
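    Fetching in chunks can be done with OFFSET/FETCH (SQL Server 2012 and later). The table and column names below are made up for illustration:

    ```sql
    -- Fetch one chunk at a time, ordered by a stable key;
    -- increment @ChunkNumber on each call until no rows come back.
    DECLARE @ChunkSize INT = 10000;
    DECLARE @ChunkNumber INT = 1;

    SELECT OrderID, CustomerID, OrderDate
    FROM dbo.Orders
    ORDER BY OrderID
    OFFSET (@ChunkNumber - 1) * @ChunkSize ROWS
    FETCH NEXT @ChunkSize ROWS ONLY;
    ```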

  • #767654
    For example, suppose your table structure is this:

    CREATE TABLE TWW
    (
        iD INT,
        Name VARCHAR(50),   -- the Name and DOB columns are inferred from the inserts below
        DOB DATE
    )

    INSERT INTO TWW VALUES(1,'aa','10-05-1989')
    INSERT INTO TWW VALUES(1,'aa1','10-05-1989')
    INSERT INTO TWW VALUES(1,'aa','05-05-1989')
    INSERT INTO TWW VALUES(1,'aa','05-05-1989')

    INSERT INTO TWW VALUES(1,'aa','08-07-1989')
    INSERT INTO TWW VALUES(1,'aa','06-07-1989')

    Customize your query along the following lines, and also use paging in this situation; then you can fetch records quickly.

    1. Use a stored procedure for fetching records
    2. Set up indexes on the fields
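    A sketch combining both points for the TWW example above (the Name and DOB column names are assumptions, since the original post leaves the table definition incomplete):

    ```sql
    -- Stored procedure that returns one page of TWW rows
    CREATE PROCEDURE dbo.GetTwwPage
        @PageNumber INT,
        @PageSize   INT
    AS
    BEGIN
        SET NOCOUNT ON;
        SELECT iD, Name, DOB
        FROM dbo.TWW
        ORDER BY iD              -- an index on iD supports this ORDER BY
        OFFSET (@PageNumber - 1) * @PageSize ROWS
        FETCH NEXT @PageSize ROWS ONLY;
    END
    ```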



    Name : Dotnet Developer-2015

    'Not by might nor by power, but by my Spirit,' says the LORD Almighty.

  • #767671
    Hi Simiyon,
    If you have a single table containing 6 million records, then you can just use a clustered index on the table and it will make retrieval faster, because indexes work like a binary search, which improves performance when retrieving the data.
    If you are joining a few tables and then filtering the records, you probably need to check the join conditions. The best way is to check the query execution plan, find out which join is taking more time, and then work on that join to optimize the time taken to get the results.
    Hope it will be helpful to you.
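    Creating the clustered index is a one-liner. A sketch, assuming the emp table from an earlier reply and that no clustered index or primary key exists on it yet:

    ```sql
    -- Rows are physically ordered by EmployeeID, so equality and range
    -- lookups on it become index seeks instead of full table scans.
    CREATE CLUSTERED INDEX IX_emp_EmployeeID
    ON dbo.emp (EmployeeID);
    ```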

    Pawan Awasthi(DNS MVM)
    +91 8123489140 (whatsApp), +60 14365 1476(Malaysia)
