Database Research & Development


SQL Server: Use sp_estimate_data_compression_savings for checking estimate object size and saving space


In this post, I am sharing the use of the sp_estimate_data_compression_savings system stored procedure, which we can use in SQL Server to check an object's current size and the space we could expect to save by compressing it.

Many times, I have found that people use the CHAR data type for fixed-length storage, but they do not always store data of the correct length in the CHAR column.

CHAR columns are just one example; there are many situations where we should check a table's actual size and the space that compression could save.

Once we have the actual size and the expected savings, we can apply the appropriate compression setting to our objects. If a table has multiple indexes, we can compare the space used by each index and decide which indexes are worth compressing.

We can use sp_estimate_data_compression_savings for clustered indexes, non-clustered indexes, and heap tables. We can evaluate an object for both ROW and PAGE compression.
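To evaluate one index at a time, we first need its index_id, which we can look up in sys.indexes. A minimal sketch (using the demo table created below; index_id 0 is a heap, 1 is the clustered index, values above 1 are non-clustered indexes):

```sql
-- List the indexes on the table to find the @index_id values:
SELECT index_id, name, type_desc
FROM sys.indexes
WHERE object_id = OBJECT_ID('dbo.tbl_DumpData');

-- Estimate PAGE compression savings for the clustered index only:
EXEC sp_estimate_data_compression_savings
    @schema_name = 'dbo',
    @object_name = 'tbl_DumpData',
    @index_id = 1,
    @partition_number = NULL,
    @data_compression = 'PAGE';
```

Passing @index_id = NULL instead returns one estimate row per index, which is handy for comparing all indexes in a single call.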

Check the small demonstration below:

Create a sample table:

CREATE TABLE tbl_DumpData
(
    ID INT
    ,RandomNumber BIGINT
    ,CONSTRAINT pk_tbl_DumpData_ID PRIMARY KEY(ID)
)
GO

Insert dummy data:

;WITH CTE AS
(
    SELECT 1 ID
    UNION ALL
    SELECT ID + 1
    FROM CTE
    WHERE ID + 1 <= 1000000
)
INSERT INTO tbl_DumpData(ID, RandomNumber)
SELECT
    ID
    ,CONVERT(INT, CONVERT(VARBINARY(4), NEWID(), 1)) AS RandomNumber
FROM CTE
OPTION (MAXRECURSION 0)
GO

Check the inserted records:

SELECT *
FROM tbl_DumpData

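Before estimating the compression savings, we can also capture the table's current size as a baseline with sp_spaceused (sizes in the output are reported in KB):

```sql
-- Current reserved / data / index sizes of the demo table (in KB):
EXEC sp_spaceused 'dbo.tbl_DumpData';
```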
Check space for ROW:

EXEC sp_estimate_data_compression_savings
    @schema_name = 'dbo',
    @object_name = 'tbl_DumpData',
    @index_id = NULL,
    @partition_number = NULL,
    @data_compression = 'ROW'

Result: Check the difference between size_with_current_compression_setting and size_with_requested_compression_setting.

After applying data compression, the object size can shrink to approximately size_with_requested_compression_setting.

[Image: SQL Data Compression – ROW compression estimate result]
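If the ROW estimate looks worthwhile, we can apply the compression with a table rebuild. A minimal sketch (note that the rebuild itself takes time and log space on a large table):

```sql
-- Apply ROW compression to the table (rebuilds the clustered index):
ALTER TABLE dbo.tbl_DumpData
REBUILD WITH (DATA_COMPRESSION = ROW);
```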

Check space for PAGE:

EXEC sp_estimate_data_compression_savings
    @schema_name = 'dbo',
    @object_name = 'tbl_DumpData',
    @index_id = NULL,
    @partition_number = NULL,
    @data_compression = 'PAGE'

Result: Check the difference between size_with_current_compression_setting and size_with_requested_compression_setting.

After applying data compression, the object size can shrink to approximately size_with_requested_compression_setting.

[Image: SQL Data Compression – PAGE compression estimate result]
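Similarly, compression can be applied per index with ALTER INDEX, which is useful when only some indexes justify the CPU cost of PAGE compression. A sketch, using the primary-key index created above:

```sql
-- Apply PAGE compression to one index only:
ALTER INDEX pk_tbl_DumpData_ID
ON dbo.tbl_DumpData
REBUILD WITH (DATA_COMPRESSION = PAGE);
```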

Sep 14, 2017 – Anvesh Patel
© 2015 – 2019 All rights reserved. Database Research & Development (dbrnd.com)