Newsletter Subject

Performance Tuning Using Extended Events: Part 2 (2019-12-13)

From

sqlservercentral.com

Email Address

subscriptions@sqlservercentral.com

Sent On

Fri, Dec 13, 2019 12:43 PM

Email Preheader Text

SQLServerCentral Newsletter for December 13, 2019 Problems displaying this newsletter? Featured

SQLServerCentral Newsletter for December 13, 2019
Problems displaying this newsletter? View online.

Featured Contents
- Performance Tuning Using Extended Events: Part 2
- Deploying Data and Schema Together with SQL Compare or SQL Change Automation
- Azure Text Analytics and Power BI
- From the SQL Server Central Blogs - Take the 2020 State of Database DevOps survey
- From the SQL Server Central Blogs - Woman in STEM award 2019!

Question of the Day
- Moving a Default Schema

Redgate SQL Monitor

The Voice of the DBA

A New Recovery Tool for Your Toolbelt

This editorial was originally published on 15 Nov, 2016. It is being republished as Steve is traveling today.

Those of us who administer databases and are responsible for ensuring the integrity and availability of data know that being able to restore systems is a core skill. In the SQL Server world, this should mean more than just being able to run RESTORE DATABASE Sales FROM DISK = 'Sales.bak'. Many of us would easily recognize that we should be able to use the WITH MOVE option. We should also understand when to use (and not use) NORECOVERY and STANDBY. Perhaps more importantly, we need to know that the default option is RECOVERY.

If we delve further, perhaps we could debate and discuss when a tail log backup is needed. Many might mention how to programmatically use FILELISTONLY and HEADERONLY to generate scripts. A few of you would note that restoring databases means restoring logs as well, and that an understanding of the STOPAT option is important. Perhaps learning the ins and outs of restoring a filegroup is important in your environment. There are many nuances and options with restoring a database that a system administrator could learn.
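To put those options side by side, here is a minimal sketch; the backup paths, logical file names, and point-in-time value are hypothetical placeholders rather than anything referenced in the editorial.

-- Inspect the files inside a backup before restoring it
RESTORE FILELISTONLY FROM DISK = N'C:\Backups\Sales.bak';

-- Restore the full backup to new file locations (WITH MOVE) and leave the
-- database ready to accept further log restores (NORECOVERY)
RESTORE DATABASE Sales
FROM DISK = N'C:\Backups\Sales.bak'
WITH MOVE 'Sales'     TO N'D:\Data\Sales.mdf',
     MOVE 'Sales_log' TO N'E:\Logs\Sales_log.ldf',
     NORECOVERY;

-- Restore a log backup only up to a point in time, then bring the database
-- online; RECOVERY is the default when nothing is specified
RESTORE LOG Sales
FROM DISK = N'C:\Backups\Sales_log1.trn'
WITH STOPAT = '2016-11-15T09:30:00',
     RECOVERY;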
Apart from performing a restore, a careful DBA might think that we should try to avoid restoring databases if possible. Perhaps a DR system using Log Shipping can be used to recover data more quickly than a database restore. Maybe periodic snapshots are valuable in recovering from those "whoops" mistakes. As our data sizes grow larger, we may need to develop more creative ways of recovering from user mistakes.

Recently I saw that the SQLCAT team had written a post on using Temporal Tables to recover from an "oops" mistake. If you haven't looked at these structures, they are new in SQL Server 2016. Temporal tables, or system-versioned tables, allow a user to view a row as of a certain timestamp, which can be especially useful for DML mistakes. Have you ever updated every row of a table to the same value because of a poorly written query? I have. A temporal table would have been handy in this case. Perhaps even more useful are the cases where a table is receiving regular inserts and updates, which can be very difficult to recover from with backup files.

Certainly the majority of us don't yet have temporal tables, and we might not even set up the option on many tables even if we were running SQL Server 2016. Over time, however, some of us will use these system-versioned objects, and perhaps we will run into a situation where recovery using temporal queries is more efficient than restoring an entire database. If that were to happen, wouldn't you want to have some skills in this area?
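Here is a minimal sketch of the kind of temporal-table recovery the editorial has in mind: reading a table as of a point just before a bad update and using that data to repair it. The table, columns, and timestamps are hypothetical, invented for the example.

-- A hypothetical system-versioned (temporal) table
CREATE TABLE dbo.Customer
(
    CustomerID  INT           NOT NULL PRIMARY KEY,
    CreditLimit DECIMAL(10,2) NOT NULL,
    ValidFrom   DATETIME2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo     DATETIME2 GENERATED ALWAYS AS ROW END NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.CustomerHistory));

-- After a bad UPDATE hits every row, view the table as it looked just
-- before the mistake...
SELECT CustomerID, CreditLimit
FROM dbo.Customer
FOR SYSTEM_TIME AS OF '2016-11-15T08:59:00';

-- ...and use those historical values to put the current rows back
UPDATE c
SET    c.CreditLimit = h.CreditLimit
FROM   dbo.Customer AS c
JOIN   dbo.Customer FOR SYSTEM_TIME AS OF '2016-11-15T08:59:00' AS h
       ON h.CustomerID = c.CustomerID;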
As SQL Server continues to evolve, many of the techniques and skills we've built across time will continue to work. There will be new ways of accomplishing work, based on features and enhancements that we ought to consider, especially when they might be more efficient than our past techniques. As you work with SQL Server, I hope you keep learning about the "new" ways that you might accomplish the tasks for which you've developed solutions in the past. Sometimes the old way might prove to be a better choice, but you won't know that unless you learn, practice, and measure the effects of a new method.

Steve Jones - SSC Editor
Join the debate, and respond to today's editorial on the forums.

Redgate SQL Source Control

Featured Contents

Performance Tuning Using Extended Events: Part 2
bdavey from SQLServerCentral.com
Part 2 of identifying performance tuning opportunities using Extended Events. Learn about shredding XML.

Deploying Data and Schema Together with SQL Compare or SQL Change Automation
Additional Articles from Redgate
You want to use SQL Compare or SQL Change Automation (SCA) to create or update a database, and at the same time ensure that its data is as you expect. You want to avoid running any additional PowerShell scripting every time you do it, and you want to keep everything in source control, including the data. You just want to keep everything simple. Phil Factor demonstrates how it's done, by generating MERGE scripts from a stored procedure.

Azure Text Analytics and Power BI
Additional Articles from MSSQLTips.com
In this tip we look at how to analyze textual data to gain insight from freeform text using Azure Text Analytics and Power BI.

From the SQL Server Central Blogs - Take the 2020 State of Database DevOps survey
Steve Jones - SSC Editor from The Voice of the DBA
It's still 2019, but we're looking forward to 2020 and wondering how your world has changed since our survey last year. There are a number of questions and this...

From the SQL Server Central Blogs - Woman in STEM award 2019!
GRE (Gethyn Ellis) from The SQL DBA in the UK
gethynellis.com is proud to sponsor the Woman in STEM award at the Woman's Business Club conference 2019! The Women in STEM award recognises the outstanding contributions of women in...

Question of the Day

Today's question (by Steve Jones - SSC Editor):

Moving a Default Schema
I have a user, JoeDev, that has a default schema of dbo. I want to move this to the [dev] schema. What code should I run?
Think you know the answer? Click here and find out if you are right.

Yesterday's Question of the Day (by Steve Jones - SSC Editor)

The Mangled Outer Join
I have these two tables:

CREATE TABLE Student
(
    StudentKey INT IDENTITY(1,1) NOT NULL CONSTRAINT StudentPK PRIMARY KEY,
    StudentName VARCHAR(10)
)
GO
INSERT dbo.Student (StudentName)
VALUES ('Delaney'), ('Kyle'), ('Kendall')
GO
CREATE TABLE Schedule
(
    ScheduleKey INT IDENTITY(1,1) NOT NULL CONSTRAINT SchedulePK PRIMARY KEY,
    CourseName VARCHAR(50),
    StudentKey INT
)
GO
INSERT dbo.Schedule (CourseName, StudentKey)
VALUES ('Math', 1), ('Math', 2), ('English', 2)

I want to get a list of the students that aren't in any courses. I run this code:

SELECT s2.StudentKey
     , s2.StudentName
     , s.CourseName
FROM dbo.Student AS s2
LEFT OUTER JOIN dbo.Schedule s
    ON s.StudentKey = s2.StudentKey
   AND s.StudentKey IS NULL

What should I do differently?

Answer: Move the IS NULL check to the WHERE clause

Explanation: The problem with this query is that the ON clause determines which rows are joined to each other. In this case no Schedule row ever satisfies the join, because StudentKey is never NULL in that table; all of those rows have data. The result is that every row from the left table (Student) comes back with no matching row from Schedule, so the Schedule columns are NULL for every student. Moving the IS NULL check into the WHERE clause instead filters the joined result, keeping only the students with no matching course. Ref: Understanding Outer Joins in SQL.
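To make the accepted answer concrete, here is the same query with the IS NULL check moved into the WHERE clause; with the sample data above it returns only Kendall, the one student with no course rows.

-- The ON clause now only matches students to their courses; the WHERE
-- clause filters the joined result down to students with no match
SELECT s2.StudentKey
     , s2.StudentName
FROM dbo.Student AS s2
LEFT OUTER JOIN dbo.Schedule AS s
    ON s.StudentKey = s2.StudentKey
WHERE s.StudentKey IS NULL;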
Discuss this question and answer on the forums.

Database Pros Who Need Your Help

Here's a few of the new posts today on the forums. To see more, visit the forums.

SQL Server 2017 - Administration
- Always on in standard and enterprise edition - Hello everyone! I would be glad to know: what is the difference in Always On between Standard and Enterprise edition? And is there also something that changed in 2012 and above? Thanks!
- Migrating DB encrypted by master key - I'm migrating DBs from a 2008 R2 instance onto a 2017 instance. I have one DB that is encrypted by the master key; how do I migrate this successfully across to a new instance?
- Log Shipping - What has changed - We currently have a situation with a vendor where they will only give us access to our data via 5 minute log ship files that we import into our read only db. This is fine for most queries, but we need a way to get details about what has changed in each period in the […]

SQL Server 2017 - Development
- Insert in batches using select coalesce option - Hi, I am trying to insert data from multiple tables into one table using Insert in batches and with select coalesce. I am getting an ambiguous column (id) error. Can someone please help me to resolve the issue. Thank you. ERROR MESSAGE: Msg 209, Level 16, State 1, Line 26 Ambiguous column name 'id'. Line:26 - […]
- Fastest Way to Calculate Total Number of records inserted per day - Hi, I need to show the number of records per day in the dashboard of the web application; if the user reloads the page the count should automatically be updated. Initially we didn't have any problem, but as the records grow into the millions the query execution becomes slow. The following is the query […]
- Fastest way to query Millions of records with Pagination and Total Record Count - Hi, we have a table with a huge volume of data inserted per day; it contains millions of rows. Currently we have 2 million records in staging, but once we hit production the data will easily be more than 10 - 20 million rows. Previously we had a pagination query that took more […]

SQL Server 2016 - Administration
- changing db owner with always on replication (to fix DB properties window) - SQL Server 2016 Standard edition, with primary databases and one other server set up as a secondary failover with an Always On availability group. There were some issues over the weekend with an application that accesses the databases on this server; unbeknownst to me, they decided just to reboot the database server, which seemed to clear things […]
- What's the best way to keep a record of all records inserted via an SP call? - Hey guys, so I may be overthinking this, but basically I have a bunch of stored procedures which I want to begin keeping historical track of. I want to know which records were updated by which SP and when, and I'm just looking for the best way to do this. Is there any built in […]
- Impact value about missing indexes - Is there a relation between Impact from sys.dm_db_missing_index_group_stats and Impact from the cached plan? Thanks for all.
- I can't Uninstall SQL Server 2016 - Hi, I have SQL Server 2016 installed on my machine but I can't uninstall it; it doesn't appear in my Control Panel. I looked again using CMD wmic product get name and it also doesn't appear there. But I can see SQL Server in the services and in SQL Server Configuration Manager. I also tried using CMD […]

SQL Server 2016 - Development and T-SQL
- BCP utility - Hi All, I am trying to export some data from a SQL Server database to a CSV file. As I need to do this for around hundreds of records, I am using a cursor to loop through and create dynamic file names to write individual files. This seems to be working ok. Some of the […]

Administration - SQL Server 2014
- Trouble connecting in single user mode - I have a SQL Server which I need to get admin access to. Logged on as administrator to the server, I attempt to start it in single user (-m) mode. However, in this case, instead of dropping into single user mode (as on other servers in this project), I get a screen full of continuous login failures […]

Development - SQL Server 2014
- How to get information about an AG group using powershell - Hi Experts, does anyone have a powershell script which displays the information below? For a given AG listener name or AG group name, I need to get the listener name, AG name, replicas, and replica role description. Can we get this information using pure powershell which reads registry values, or do we need to execute a […]

SQL 2012 - General
- SQL Monitoring - Hi All, I have a SQL 2012 instance on a VM server which I am testing with the Red Gate monitoring tool installed on my machine, and my machine is set as the base monitor for testing purposes. The problem I am facing is that it only monitors while I am logged on to the machine. If I […]

Strategies and Ideas
- Tracking History in Fact Table - Hi, I was hoping somebody would be able to offer advice in relation to tracking history in a fact table, in particular whether #2 below would be a viable option? I would have thought that #1 would be bad for performance given the volume of records created through history, so maybe not best practice. If […]

RSS Feed | Twitter

This email has been sent to {EMAIL}. To be removed from this list, please click here. If you have any problems leaving the list, please contact webmaster@sqlservercentral.com. This newsletter was sent to you because you signed up at SQLServerCentral.com.

©2019 Redgate Software Ltd, Newnham House, Cambridge Business Park, Cambridge, CB4 0WZ, United Kingdom. All rights reserved.
webmaster@sqlservercentral.com

Email Content Statistics

Subject Line Length

Data shows that subject lines with 6 to 10 words generate a 21 percent higher open rate.

Number of Words

The more words in the content, the more time the user will need to spend reading. Get straight to the point with catchy short phrases and interesting photos and graphics.

Number of Images

More images or large images might cause the email to load slower. Aim for a balance of words and images.

Time to Read

Longer reading time requires more attention and patience from users. Aim for short phrases and catchy keywords.

Spam Score

Spam score is determined by a large number of checks performed on the content of the email. For the best delivery results, it is advised to lower your spam score as much as possible.

Flesch reading score

Flesch reading score measures how complex a text is. The lower the score, the more difficult the text is to read. The Flesch readability score uses the average length of your sentences (measured by the number of words) and the average number of syllables per word in an equation to calculate the reading ease. Text with a very high Flesch reading ease score (about 100) is straightforward and easy to read, with short sentences and no words of more than two syllables. Usually, a reading ease score of 60-70 is considered acceptable/normal for web copy.
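For reference, the commonly cited Flesch Reading Ease formula behind this score is:

Reading Ease = 206.835 - 1.015 x (total words / total sentences) - 84.6 x (total syllables / total words)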

Technologies

What powers this email? Every email we receive is parsed to determine the sending ESP and any additional email technologies used.


Copyright © 2019–2024 SimilarMail.