
Sunday, August 6, 2023

SQL Server Editions: Balancing Availability, Downtime, and Cost

Choosing between SQL Server editions requires a careful balance of high availability benefits against cost implications. For applications that rely on multiple databases, this decision is paramount.

Availability and Downtime:

  • SQL Server Enterprise Edition: This edition supports Always On Availability Groups, allowing multiple databases to be included in a single group. In the event of a failure, all databases in the group fail over together, ensuring application consistency. This is pivotal for applications that depend on multiple databases.

  • SQL Server Standard Edition: The Standard edition supports only Basic Availability Groups, which are limited to a single database per group. If one database fails over, there is no guarantee that other related databases will fail over with it, potentially causing inconsistencies in applications that rely on multiple databases.

For applications that require multiple databases to fail over together so the application behaves consistently, the Enterprise edition emerges as the preferred choice.
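As a rough illustration (the group, database, and replica names below are hypothetical, not from the original post), an Always On Availability Group on Enterprise edition can enroll several databases so they fail over as a unit, while a Basic Availability Group on Standard edition is created WITH (BASIC) and may contain only one database:

-- Hypothetical example, run on the primary replica (Enterprise edition).
-- All three databases join one group and fail over together.
CREATE AVAILABILITY GROUP [AppAG]
FOR DATABASE [OrdersDB], [BillingDB], [AuditDB]
REPLICA ON
    N'SQLNODE1' WITH (
        ENDPOINT_URL = N'TCP://SQLNODE1.example.local:5022',
        AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
        FAILOVER_MODE = AUTOMATIC),
    N'SQLNODE2' WITH (
        ENDPOINT_URL = N'TCP://SQLNODE2.example.local:5022',
        AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
        FAILOVER_MODE = AUTOMATIC);

-- On Standard edition the closest equivalent is CREATE AVAILABILITY GROUP ... WITH (BASIC),
-- whose FOR DATABASE clause may list only a single database.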

Cost Comparison:

  • SQL Server Enterprise Edition: For a given configuration (for example, 4 vCPUs and 16 GB RAM), the licensing cost is noticeably higher, but it includes the advanced availability features described above.

  • SQL Server Standard Edition: The same configuration is more budget-friendly but lacks features such as multi-database Availability Groups.

While the Enterprise edition offers advanced features, it comes at a higher cost. If budget constraints are significant and the application can tolerate some downtime, the Standard edition, combined with workarounds like Log Shipping, becomes a viable option.

Log Shipping:

  • This method involves periodic backups of the transaction log from the primary database, which are then restored on a secondary database.

  • If the primary database fails, the secondary database must be manually activated, which can be time-consuming, especially for large databases.
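For illustration only (the database name and backup path are hypothetical), manually activating a log-shipped secondary typically means applying the remaining transaction log backups and then recovering the database:

-- Apply the last transaction log backup, keeping the database in a restoring state.
RESTORE LOG [AppDB]
FROM DISK = N'\\backupshare\AppDB\AppDB_latest.trn'
WITH NORECOVERY;

-- Bring the secondary online so it can act as the new primary.
RESTORE DATABASE [AppDB] WITH RECOVERY;

The database is unavailable until the final RESTORE ... WITH RECOVERY completes, which is why the recovery time objective (RTO) grows with database size.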

Variables to Consider in Decision-Making:

  1. Number of Databases: For applications that rely on multiple databases, the Enterprise edition is more suitable.

  2. Availability Requirements: If high availability is paramount, the Enterprise edition is the recommended choice.

  3. Future Scaling: Anticipating growth in user base or data volume? The Enterprise edition, with its advanced features, is a better long-term investment.

  4. Cost: While the Standard edition is more budget-friendly, it might require workarounds that could impact RTO.

Conclusion:

For applications that rely on multiple databases and require high availability, the SQL Server Enterprise edition is a clear frontrunner. However, it's essential to balance this need against cost constraints and evaluate the potential implications of workarounds in the Standard edition. Making an informed decision now can save time and money and ensure consistent application performance in the long run.

Saturday, November 6, 2021

Slowness in long-running load tests

I would like to share another interesting database issue.

The performance testing team shared the report below, where they observed slowness after about 1.5 hours of a long-running test, something they did not notice in shorter, hour-long tests.


I figured out that the issue was with the database. As the snapshot from AppDynamics below shows, RESOURCE_SEMAPHORE wait stats in SQL Server increased during the middle of the load test.





I gathered a few additional statistics from Grafana to understand the root cause and found that the application slowed down as SQL Server memory reached its limit (84 GB) and RESOURCE_SEMAPHORE wait events were triggered, as shown in the image below.
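Alongside the dashboards, memory-grant pressure can also be confirmed directly in SQL Server. The following check is a sketch added here for completeness, not part of the original investigation:

-- Pending and granted memory on the resource semaphores (values in KB).
SELECT resource_semaphore_id ,
       waiter_count ,
       grantee_count ,
       granted_memory_kb ,
       available_memory_kb
FROM sys.dm_exec_query_resource_semaphores;

-- Cumulative RESOURCE_SEMAPHORE waits since the last restart.
SELECT wait_type , waiting_tasks_count , wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type = N'RESOURCE_SEMAPHORE';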





The next step was to determine which query was causing the issue, so I ran the following DMV query in SQL Server.

-- Currently executing user requests with their statement text and query plan, ordered by CPU time.
SELECT der.session_id ,
       DB_NAME(der.database_id) AS database_name ,
       deqp.query_plan ,
       SUBSTRING(dest.text, ( der.statement_start_offset / 2 ) + 1,
                 ( ( CASE WHEN der.statement_end_offset = -1
                          THEN DATALENGTH(dest.text)
                          ELSE der.statement_end_offset
                     END - der.statement_start_offset ) / 2 ) + 1)
           AS [statement executing] ,
       der.cpu_time
       --der.granted_query_memory
       --der.wait_time
       --der.total_elapsed_time
       --der.reads
FROM sys.dm_exec_requests der
INNER JOIN sys.dm_exec_sessions des
        ON des.session_id = der.session_id
CROSS APPLY sys.dm_exec_sql_text(der.sql_handle) dest
CROSS APPLY sys.dm_exec_query_plan(der.plan_handle) deqp
WHERE des.is_user_process = 1
  AND der.session_id <> @@SPID
ORDER BY der.cpu_time DESC;

-- Alternative sort orders, depending on what you are chasing:
-- ORDER BY der.granted_query_memory DESC;
-- ORDER BY der.wait_time DESC;
-- ORDER BY der.total_elapsed_time DESC;
-- ORDER BY der.reads DESC;

 

The output of the DMV gave us the culprit: a query requesting around 16 GB of server memory (its memory grant) for execution. The query needed that memory to store temporary data while sorting and joining rows.

Note: The amount of workspace memory a query uses is called a memory grant. A memory grant is calculated during query compilation; when execution is about to start, that amount is requested and, depending on the available memory, granted to the query.
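As a related sketch (again, not part of the original investigation), sys.dm_exec_query_memory_grants shows the requested and granted workspace memory for each executing query, which makes a query asking for a 16 GB grant easy to spot:

-- Requested vs. granted workspace memory (KB) per running query;
-- a large requested_memory_kb combined with a long wait_time_ms indicates memory-grant pressure.
SELECT session_id ,
       requested_memory_kb ,
       granted_memory_kb ,
       ideal_memory_kb ,
       wait_time_ms
FROM sys.dm_exec_query_memory_grants
ORDER BY requested_memory_kb DESC;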

Solution: Tune the SQL query to reduce its memory usage. The development team delivered a fix within a day, and that solved the problem.




