SQL Server 2016 … Is it a Game Changer?

With every new release of SQL Server, Microsoft claims that it is a game changer. With SQL Server 2016, I think that claim might actually be true, and Microsoft has plenty of marketing material promoting the release.

Beyond the marketing material, there is another reason to be excited about this version of SQL Server: it was developed following a different model, with many of its features running in Azure before shipping in the on-premises product.

Personally, out of all the features that have been included, the following items stand out to me as reasons why you may want to adopt this release sooner rather than later.

Query Store

From MSDN, “The SQL Server Query Store feature provides you with insight on query plan choice and performance. It simplifies performance troubleshooting by helping you quickly find performance differences caused by query plan changes. Query Store automatically captures a history of queries, plans, and runtime statistics, and retains these for your review. It separates data by time windows so you can see database usage patterns and understand when query plan changes happened on the server.” (https://msdn.microsoft.com/en-us/library/dn817826.aspx)

I have seen this demoed a few times and have played with it a little in some of the release candidates, and it seems like something that will make DBAs’ and consultants’ lives a lot easier.
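If you want to try it yourself, Query Store is enabled per database rather than per server. A minimal sketch (the database name here is just a placeholder):

-- Enable Query Store for a database and allow it to capture new data
ALTER DATABASE [MyDatabase] SET QUERY_STORE = ON;
ALTER DATABASE [MyDatabase] SET QUERY_STORE (OPERATION_MODE = READ_WRITE);

Once it is enabled, the built-in Query Store reports appear under the database node in Management Studio.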

Reporting Services

Finally, it seems like Reporting Services was given some attention in this release of SQL Server! And the changes are NOT trivial. There are some fundamental enhancements that I feel make Reporting Services a true competitor in this space again. I have been testing the release candidates, and I am very excited to get this into production. There are plenty of useful posts out there that will help you get excited about these enhancements.

Well, I have to say that I am probably as excited about this release as Microsoft is!


SQL Intersection 2016 (Orlando FL)

For my training this year, I went to SQL Intersection (https://devintersection.com/) in Orlando, Florida from April 18-22, 2016. This was a new experience for me, since I normally attend the SQL PASS conference in Seattle, Washington (http://www.sqlpass.org/summit/2016/Welcome.aspx). There were some things I really liked and others not so much. For example, I really liked the weather (sunny and in the 80s), the high quality of the speakers, and the smaller conference size. However, because it is a smaller conference, there were fewer sessions to choose from, fewer after-hours activities, and a large portion of the attendees were developers rather than SQL professionals, so the networking was different.

Pre-Conference Session

  • Performance Troubleshooting Using Waits and Latches (Paul Randal) – I really enjoyed this session, and he provided some great advice, such as: don’t always assume that the symptom is the root cause of the problem, and don’t do knee-jerk performance troubleshooting. Other items covered included how to start performance troubleshooting using waits, components of the scheduler, latches, spinlocks, various solutions to wait problems, and some real-world examples. (http://www.sqlskills.com/blogs/paul/category/wait-stats/)

Sessions – Tuesday

  • Keynote: Microsoft Cloud (Scott Guthrie) – Microsoft’s cloud services seemed to be an underlying theme of this conference. The speaker presented some amazing things regarding different companies’ experiences using Azure and how the newly acquired company, Xamarin (https://www.xamarin.com/), can be used to help with the development and testing processes for mobile devices.
  • SQL Server 2016 New Feature Overview (Tim Chapman) – This session provided a high-level overview of many of the new features of SQL Server 2016: maintenance plans, backup and restore, extended events, the new cardinality estimator, memory grant hints, multiple tempdb files on install, new ALTER DATABASE options, new T-SQL and DMVs, column compression, in-memory optimizations and functions, live query monitoring, Query Store, temporal tables, JSON, row-level security, dynamic data masking, load balancing for Availability Groups, stretch databases, backups to Azure, and Always Encrypted.
  • SQL Server Tuning When You Can’t Fix the Queries (Brent Ozar) – I really liked this energetic presentation. The speaker suggested using batch requests per second and wait time per second as the key indicators for performance, and he advocated using his own stored procedure, sp_askBrent (https://www.brentozar.com/askbrent/), to troubleshoot slow performance. He also suggested a simple approach: find out from the business or management what you can and cannot change in order to improve performance. Afterwards, he reviewed some common wait types and their resolutions. His basic plan of action was to tune queries, tune indexes, add more memory, and make storage faster.
  • Corruption Survival Techniques (Paul Randal) – I always feel like a session on corruption is a must for every DBA, and I don’t think anyone does it better than this speaker. He emphasized that corruption will always happen, and we need to know what we are doing or we could make it worse. He also stressed that we must perform root cause analysis or the corruption may recur. To catch corruption early, we should have alerts set for I/O errors (823, 824, 825) and run DBCC CHECKDB regularly. Finally, he emphasized the need for a disaster recovery run book. (http://www.sqlskills.com/blogs/paul/category/corruption/)
  • Keynote: Do You Know Data in the Microsoft Cloud? (Bob Ward) – This was a good session that explored many of the technical details of Microsoft’s cloud services, including Infrastructure as a Service (IaaS), SQL Database, SQL Data Warehouse, DocumentDB, HDInsight, Data Lake analytics, and Data Factory.

Sessions – Wednesday

  • Keynote: Go to There and Back Again (Douglas Crockford) – This presentation focused on the development of programming languages and what the speaker thought the next big ideas would be. He is probably best known for popularizing JSON. (https://en.wikipedia.org/wiki/Douglas_Crockford)
  • New Features with SQL Server 2016 Reporting Services (David Pless) – I was really excited about this session and I think the speaker covered many of the features well. The Reporting Services team has a nice blog that covers many of the things he did: https://blogs.msdn.microsoft.com/sqlrsteamblog/.
  • Integrating Azure into Your Data Platform (Grant Fritchey) – I feel like this session presented a balanced view of using Azure. He made the case for who may be good candidates for Azure, such as greenfield projects, small shops, and global companies with distributed systems. He then talked through using Active Directory, virtual machines, and setting up networking, monitoring, and other basics.
  • DevOps with Team Foundation Services and Azure (Karl Rissland) – This was a well-developed presentation that focused on the people, the process, and the tools needed to make DevOps work. He mentioned some of the cultural impediments between IT groups (e.g. UI, integration, data, infrastructure) and how each focuses on a different priority, ranging from change to stability. The technologies he used in his demos were PowerShell, PowerShell DSC, Azure ARM templates, VSTS, and Azure.

Sessions – Thursday

  • Keynote: A Day in the Life of the Data Scientist (Buck Woody) – This might have been one of my favorite presentations, as he broke down the job of a data scientist into a few parts. (https://buckwoody.wordpress.com/)
  • Inside the SQL Server Query Store (Bob Ward) – This session featured some great technical content that made me excited to migrate to SQL Server 2016. I feel like the Query Store is a game changer, and it was explored in some detail. Some of the subjects touched upon were: query execution over time, reporting of completed and failed queries, storage of compiled queries, plans, and statistics, the Query Store data model (including the plan store and runtime stats), compilation and execution statistics, UTC dates, and maintenance.
  • Scripting in SSIS (Tim Mitchell) – This session highlighted creating custom code to extend the capabilities of SSIS using SSIS expressions, the script task, the script component, and PowerShell. (https://www.timmitchell.net/)
  • Keynote: SQL Intersection Closing Panel & RunAs Radio Recording – Basically, we were the audience for the radio show RunAs Radio (http://runasradio.com/).

Post-Conference Session

  • Leveraging SQL Server in the Cloud (Jeremiah Peschka and Brent Ozar) – This workshop compared Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) and some of the pros and cons of each. Also, we went through some exercises to help us think through our decisions of how to leverage the cloud given certain situations and features that would be required. Again, it was stressed that a hybrid approach may work best.

So, I did enjoy my first experience with the SQL Intersection conference. Although it was a little different from what I was used to, it provided a high level of training in a nice location.


SQL Saturday in Chicago (2016)

This past Saturday I attended SQL Saturday in Chicago (http://www.sqlsaturday.com/484/). It was a nice event and there were plenty of opportunities to network, get product demonstrations, and get some FREE SQL Server training.

Sessions I attended:

    • What’s New in SQL Server 2016 (Ross LoForte) – This was a fast-moving presentation that gave a nice overview of what is new in SQL Server 2016. It included visualizations and demonstrations of Query Store (which works like a flight recorder), live query statistics, temporal tables (retaining previous copies of data), JSON features, Always Encrypted and row-level security, dynamic data masking of sensitive data, enhancements to AlwaysOn Availability Groups (load balancing, and Standard Edition support to replace deprecated mirroring), PolyBase, advanced analytics (R scripts), mobile apps (Datazen), new features in SSRS (parameter placement), stretch databases, and backups to Azure. Free ebook: https://blogs.msdn.microsoft.com/microsoft_press/2015/12/22/free-ebook-introducing-microsoft-sql-server-2016-mission-critical-applications-deeper-insights-hyperscale-cloud-preview-edition/
    • Scripting out SQL Server for Documentation and Disaster Recovery (George Walkey) – This presentation focused on the presenter’s PowerShell scripts that he uses to document his SQL Servers. They seem like they could also be useful for disaster recovery, server migrations, upgrades, and auditing. He walked through some of the scripts, but it seemed that you would need to download and examine them to get a better understanding of what is happening (https://github.com/gwalkey/SQLTranscriptase).
    • Effective Data Visualization: The Ideas of Edward Tufte (David Giard) – The speaker explained, in simple terms, some data visualization ideas from Edward Tufte (https://en.wikipedia.org/wiki/Edward_Tufte), and the title of the session really fit the presentation. Some of the key takeaways were: visualizing data can be much better than looking at raw data for larger data sets, how some visualizations lie or hide data (the lie factor), the data-ink ratio, and data density.
    • Big Data in the Modern Data Warehouse (Fred Bliss – Aptitive) – This session was presented over lunch and gave a nice introductory explanation to Big Data. One of the keys I noted from the speaker was to use Big Data as a data source to your data warehouse, once you determine the metrics that you need (http://www.aptitive.com/).
    • Reporting Services 2016 Solutions: The New Awesome (Paul Turley) – The speaker did a nice job presenting the hybrid approach that Microsoft is taking, focusing on both on-premises and cloud usage. He showed some new features of SSRS, including the new portal, parameter placement, KPIs, and mobile reports, as well as Microsoft’s BI roadmap (https://sqlserverbiblog.wordpress.com/).

Overall, I had a great experience and look forward to future events like this.


SQL Saturday in Chicago – 2016

There is a popular saying, included in many songs, that “the best things in life are free”.  And really, that is one of the things I like most about the SQL Server community: there are always opportunities to take advantage of free training.  That being said, here are some upcoming events that I am personally looking forward to:

  1. SQL Saturday in Chicago – March 5, 2016 (http://www.sqlsaturday.com/484/eventhome.aspx).  There seems to be a nice variety of sessions and many vendors will be attending.  I mention the vendors because sometimes a small purchase of a product can save you hours of time.
  2. SQL PASS BA Webinar – Thursday, February 25, 2016 (http://passbaconference.com/2016/Learn/WebinarMarathon.aspx).  There are only a few sessions that will be presented, but they should provide a taste of what will be presented at the PASS Business Analytics Conference (http://passbaconference.com/2016/Home.aspx).

Look for Server “Ouches” – By Collecting SQL Server Logs (Part 2)

In my last post, I detailed how you could look for server “ouches” by collecting the server’s event logs using PowerShell (https://sqlbadboy.wordpress.com/2015/12/07/look-for-server-ouches-by-collecting-server-logs/). However, at times you may want more detailed information about SQL Server itself, which needs to be obtained from the SQL Server logs.

And although I normally show code in PowerShell, this time I will simply show the T-SQL code. I am assuming it will be easy for someone to embed this T-SQL code in a larger PowerShell script or even an SSIS package for an automated collection.

How to read a SQL Server’s logs?

The key to reading the SQL Server logs will be using the undocumented stored procedure, xp_readerrorlog. This extended stored procedure has the following parameters:

  • Parameter 1 – Used to get a specified error log file. SQL Server keeps a number of historical error log files, so you can use this parameter if you want to read an older error log file. The default is 0, which is the current error log file.
  • Parameter 2 – Used to specify if you want to read from the SQL Server error log (default, value = 1) or SQL Agent log (value = 2).
  • Parameter 3 – Used to specify a search string to filter the results. This must be passed as a Unicode string, e.g. N'search string'.
  • Parameter 4 – Used to specify an additional search parameter.
  • Parameter 5 – Used to filter out rows older than this datetime value.
  • Parameter 6 – Used to filter out rows newer than this datetime value.
  • Parameter 7 – Used to specify a sort order for the results. Valid values are N'ASC' or N'DESC'.
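Putting the parameters together, here is a sketch that searches the current error log for backup-related messages, sorted newest first (the search string is just an illustration):

-- Parameters: log 0 (current), type 1 (SQL Server), search string, no second
-- search string, no date filters, sort descending
EXEC master.dbo.xp_readerrorlog 0, 1, N'backup', NULL, NULL, NULL, N'DESC'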

Another helpful undocumented stored procedure is xp_enumerrorlogs. This extended stored procedure lists all the error logs with their last change date. The default for the number of logs is 6, but it can be configured to use up to 99.
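Running it with no parameters lists the SQL Server error log files:

-- List the available error log files (archive number, last change date, size)
EXEC master.dbo.xp_enumerrorlogs

This is handy for deciding which archive number to pass as the first parameter of xp_readerrorlog.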

Here is a sample of code that will look for the last 3 days’ worth of entries in the error log.


-- Read entries from the current SQL Server error log for the last 3 days
DECLARE @StartDatetime DATETIME

SELECT @StartDatetime = GETDATE() - 3

EXEC master.dbo.xp_readerrorlog 0, 1, NULL, NULL, @StartDatetime

***NOTE: If you recycle the error logs daily, you will have to read multiple log files to obtain the entries for the last 3 days, as in our example. The error log can be recycled using sp_cycle_errorlog (https://msdn.microsoft.com/en-us/library/ms182512.aspx). It is recommended to recycle the error logs on a periodic basis or based on size so that they are easier to work with.
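In that case, a simple loop over the most recent log files does the job. A sketch, assuming the three most recent files cover the window you need:

-- Read error logs 0 through 2, applying the same 3-day date filter to each
DECLARE @Log INT = 0
DECLARE @StartDatetime DATETIME = GETDATE() - 3

WHILE @Log <= 2
BEGIN
    EXEC master.dbo.xp_readerrorlog @Log, 1, NULL, NULL, @StartDatetime
    SET @Log = @Log + 1
END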

 So, here is a simple way to read the SQL Server error logs and look for “ouches”.


Look for Server “Ouches” – By Collecting Server Logs

At some point, most parents have asked their child to “show me where it hurts”. Perhaps a parent asks this question because their child has just yelled “ouch” or maybe the child seems to not be himself (i.e. lethargic). And the reason parents do this is because they want to help their child get better and catch any problem as early as possible.

In a similar way, we need to know when our servers are yelling “ouch” or don’t seem to be themselves. We can do this by examining the event logs, which basically is like the server talking to us, and by baselining the performance of our server.

Now, the easiest way to do both of these things is if we have some sort of monitoring software. Typically, monitoring software will examine server logs and baseline servers, and will alert us to any issues. This is the ideal situation …

However, what if we don’t have any monitoring software, or we don’t have it on all of our servers? Well, then we may have to build our own process. In this post, I am going to examine how to look for “ouches” in our servers’ event logs.

How to read a server’s event logs?

First, I will acknowledge that you could go to each server and examine the event logs manually, but that seems like a horrible idea to me. Although I have heard of some DBAs doing that, the problem is that this method is not scalable. If you have one or two servers, it could work; if you have hundreds of servers, it never will.

So, that is where automation comes in and using PowerShell is my tool of choice. I am going to demonstrate some ways to read a server’s event logs.

Method 1: Use Get-EventLog (https://technet.microsoft.com/en-us/library/hh849834.aspx)

# $server is the target computer name; $logdate is the earliest event time to include
Get-EventLog -ComputerName $server -LogName "Application" -EntryType Error,Warning -After $logdate

Method 2: Use Get-WinEvent (https://technet.microsoft.com/en-us/library/hh849682.aspx)

# Level 1 = Critical, 2 = Error, 3 = Warning
Get-WinEvent -FilterHashtable @{LogName='Application'; Level=1,2,3; StartTime=$logdate} -ComputerName $server

 As you can see, it is pretty simple to read a server’s event logs. Now, all you have to do is read the data into a table and report on it daily. This will also help you to keep a history of the “ouches”, in case you need it. But, there are some things to keep in mind:

  • For the best performance, filter the number of records returned by the command by using the EntryType parameter with Get-EventLog and/or the Level parameter with Get-WinEvent. This performs much better than piping the full output to a Where-Object filter, which reads all the records first. In my deployment, since I am looking for “ouches”, I only care about messages that are critical, error, or warning.
  • Get-EventLog is the older command and can’t read some of the newer event logs introduced with Windows Vista and later, so it is recommended to use Get-WinEvent on newer operating systems.
  • Get-WinEvent also lists event logs and event log providers. You can get events from selected logs or from logs generated by selected event providers. And, you can combine events from multiple sources in a single command. Get-WinEvent allows you to filter events by using XPath queries, structured XML queries, and simplified hash-table queries.
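To give an idea of how this could come together, here is a sketch that collects the last day’s critical, error, and warning events from a list of servers and writes them to a CSV file (the server names and output path are placeholders):

# Collect recent errors/warnings from each server in the list
$servers = 'SERVER01', 'SERVER02'
$logdate = (Get-Date).AddDays(-1)

$results = foreach ($server in $servers) {
    Get-WinEvent -FilterHashtable @{LogName='Application'; Level=1,2,3; StartTime=$logdate} -ComputerName $server -ErrorAction SilentlyContinue |
        Select-Object @{n='Server';e={$server}}, TimeCreated, LevelDisplayName, ProviderName, Message
}

# Persist the results for daily reporting and history
$results | Export-Csv -Path 'C:\Temp\ServerOuches.csv' -NoTypeInformation

From there, it is a small step to load the CSV (or the objects directly) into a SQL Server table for reporting and history.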


So, although there are multiple ways to look for “ouches” on a server, I always recommend a monitoring software solution or an automated process that can easily be built with PowerShell.


SQL PASS 2015 – Watch Portions Online

I just wanted to give a reminder that portions of last week’s PASS Summit 2015 can be watched online (http://www.sqlpass.org/summit/2015/Live.aspx).  Personally, I found the Day 2 keynote on the Internet of Things (IoT, https://en.wikipedia.org/wiki/Internet_of_Things) very interesting.  It is one of those buzzwords we are hearing everywhere now, and it is good for us as data professionals to know how that term fits in and how it will affect us.
