With every new year, many of us decide on goals that we want to accomplish. At times, these goals may include reviewing our compensation and/or setting a compensation target. So, how do we know what our compensation should be?
Here are a few resources that may help:
Why is it good to review your compensation periodically?
- To make sure you are being paid fairly in line with the market
- To set future goals and career paths as you explore compensation in your field
As data professionals, we all have heard a lot about Power BI. However, some may not be familiar with what exactly it is. Therefore, I am going to use this post as an introduction to Power BI.
What is Power BI?
Microsoft’s basic slogan for Power BI is … Any Data, Anywhere, Any Time
So, what does that really mean?
- Power BI can connect to a variety of data sources, including cloud applications and data stores, as well as company data that resides on premises.
- Power BI can be used on any type of device (e.g., computer, tablet, phone).
- Power BI can be used whether you are in the office or on the road.
Power BI is a full-scale business intelligence (BI) solution tool set. That means you can use the tool to extract data from source systems, transform it into the desired look, and load the data for reports. Essentially, Power BI allows you to create a “data model” for reporting. Also, when it comes to reporting and analytics, Power BI provides interactive visual reports, live streaming dashboards, and embedded reports and dashboards (web pages, PowerPoint, etc.).
Additionally, when people refer to Power BI, they may be actually speaking about one of multiple applications:
- Power BI Desktop – This desktop self-service analysis tool is used to author reports.
- Power BI Service – This cloud service handles delivery and collaboration, and enables mobility.
- Power BI Mobile – This mobile application works on major mobile platforms (iOS, Android, Windows 10) and also integrates with Reporting Services (SSRS).
Best resources for Power BI:
As professionals, we all know how important training and networking can be. And that is why many of us like to go to conferences. These events may help us in the following ways:
- Gain guidance on a particular strategy or direction for a certain technology
- Receive technical training on implementing a specific piece of technology
- Network with others facing challenges similar to our organization's
Just recently, I attended the Gartner Data & Analytics Summit in Grapevine, Texas. This was my first time attending this event, and I found it to be quite beneficial. The event blended real-world problems with the data and analytics technology that could solve them. Particularly interesting were the keynotes, which focused on topics such as business opportunities that use data and technology, technology in the entertainment world, how leaders need to view the abundance of data, the way that frustration can lead to creativity, and the blending of technology and people today. Although this conference did not provide the technical training that I had been used to at other conferences, it did provide the strategy and direction my organization needed.
As for other conferences, here are some that I think would be beneficial this year:
- Gartner Data & Analytics – March 6-9, 2017 (Grapevine, TX)
- Enterprise Data World – April 2-7, 2017 (Atlanta, GA)
- DEV Intersection – May 21-24, 2017 (Orlando, FL)
- Microsoft Data Insights Summit – June 12-13, 2017 (Seattle, WA)
- Microsoft Ignite – September 25-29, 2017 (Orlando, FL)
- PASS Summit – October 31-November 3, 2017 (Seattle, WA)
Migrations are simply a part of a database administrator's life. And as we start a new year, we will probably have a few migrations to do in the coming months. So, rather than shy away from them or have that sickening feeling of dread about them, we should endeavor to become good at doing them, which will make our lives easier. However, as we all know, there are many ways to migrate things, and the same method may not be the most useful for everyone in every situation.
One useful tool that I use to help with migrations is PowerShell. Now, I am not suggesting that you write PowerShell scripts from scratch for everything that you might need to migrate, because that would become a daunting task. And really, there is no need to "reinvent the wheel," because some SQL Server community members have helped us with this. Here are some examples of ways to handle migrations:
- https://github.com/gwalkey/SQLTranscriptase – George Walkey has created a tool called SQLTranscriptase to document your SQL Server using PowerShell. In addition to having your SQL Server documented, which can be tremendously useful in a disaster recovery scenario, these scripts can also be used for migrations and managing different development environments.
- https://dbatools.io – Chrissy LeMaire has led the development of a PowerShell module that can help you migrate an entire SQL Server instance with one command.
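As a quick sketch, a full instance migration with dbatools can look something like this. The server names and share path below are hypothetical placeholders for your environment, and parameter names may vary between module versions, so check the dbatools documentation for your installed version:

```powershell
# Install the dbatools module from the PowerShell Gallery (one time)
Install-Module -Name dbatools

# Migrate databases, logins, jobs, and other objects from one instance
# to another. 'SQL01', 'SQL02', and the share path are placeholders.
Start-DbaMigration -Source 'SQL01' -Destination 'SQL02' `
    -BackupRestore -SharedPath '\\fileserver\migration'
```

As always, try a command like this against test instances first and review what it plans to copy before pointing it at production.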
Now, I will admit that I have used the SQLTranscriptase tools much more than I have used dbatools. In fact, I have only used dbatools in a testing environment, whereas I have used SQLTranscriptase in a production environment for migrating things to new servers.
*** CAUTION: I always caution that running code from the internet can be dangerous; it should be run in a test environment first. This will allow you to understand exactly what is going to happen before running it in a production environment.
So, whether you decide to use these PowerShell-based tools or they just give you some ideas, they should make your life much easier when it comes to migrations.
As we all know, user access is one of the tasks that DBAs are responsible for in their environment. And if we are following best practices and want to make our lives easier, we should be handling user access with domain security groups, rather than individual domain user accounts or SQL logins. However, from time to time, we will need to know what users are in a specific domain security group. How can we do that?
- xp_logininfo (http://msdn.microsoft.com/en-us/library/ms190369.aspx) – This extended stored procedure can be used with the "members" option to return the next level of users in a group. However, if there are multiple nested groups, this will not return the next-level groups, only users.
EXEC sys.xp_logininfo @acctname = 'DOMAIN\GroupName', @option = 'members'
- PowerShell – A directory search against Active Directory can also return the members of a group. Here is a cleaned-up version of the script (replace [GroupName] with the group you are looking for):
$strFilter = "(&(objectCategory=Group)(Name=[GroupName]))"
$objDomain = New-Object System.DirectoryServices.DirectoryEntry
$objSearcher = New-Object System.DirectoryServices.DirectorySearcher
$objSearcher.SearchRoot = $objDomain
$objSearcher.PageSize = 1000
$objSearcher.Filter = $strFilter
$objSearcher.SearchScope = "Subtree"
$colResults = $objSearcher.FindAll()
# List the members of each matching group
foreach ($objResult in $colResults)
{
    $objItem = $objResult.GetDirectoryEntry()
    $objItem.member
}
So, as always, there are many ways that things can be done, and I have briefly listed two different ways that a person can find out who is in a domain security group.
With every new release of SQL Server, Microsoft claims that it is a game changer. However, in some ways I think it might be true regarding SQL Server 2016. Here are some of the things Microsoft is saying to promote it:
Although the marketing material is nice, there is another reason to be excited about this version of SQL Server, which was developed following a different model:
Personally, out of all the features that have been included, the following items stand out to me as reasons why you may want to adopt this release sooner, rather than later.
From MSDN, “The SQL Server Query Store feature provides you with insight on query plan choice and performance. It simplifies performance troubleshooting by helping you quickly find performance differences caused by query plan changes. Query Store automatically captures a history of queries, plans, and runtime statistics, and retains these for your review. It separates data by time windows so you can see database usage patterns and understand when query plan changes happened on the server.” (https://msdn.microsoft.com/en-us/library/dn817826.aspx)
I have seen this demoed a few times and have played with it a little in some of the release candidates, and it seems to be something that will make DBAs' and consultants' lives a lot easier. Here is some additional information regarding these items:
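To give a feel for it, enabling and querying the Query Store can be sketched like this (the database name is a hypothetical placeholder):

```sql
-- Enable the Query Store on a database (SalesDB is a placeholder name)
ALTER DATABASE SalesDB SET QUERY_STORE = ON;
ALTER DATABASE SalesDB SET QUERY_STORE (OPERATION_MODE = READ_WRITE);

-- Review the longest-running queries captured by the Query Store
SELECT TOP (10)
       qt.query_sql_text,
       rs.count_executions,
       rs.avg_duration
FROM sys.query_store_query_text AS qt
JOIN sys.query_store_query AS q
    ON qt.query_text_id = q.query_text_id
JOIN sys.query_store_plan AS p
    ON q.query_id = p.query_id
JOIN sys.query_store_runtime_stats AS rs
    ON p.plan_id = rs.plan_id
ORDER BY rs.avg_duration DESC;
```

The built-in Query Store reports in Management Studio surface the same data visually, so the catalog views above are mainly useful for scripting your own checks.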
Finally, it seems like reporting services was given some attention in this release of SQL Server! And the changes are NOT trivial. There are some fundamental enhancements that I feel make reporting services a true competitor in this space again. I have been testing out the release candidates and I am very excited to get this into production. Here are some useful posts that will help you get excited about these enhancements:
Well, I have to say that I am probably as excited about this release as Microsoft is!
Link to SQL Server 2016 download:
For my training this year, I went to SQL Intersection (https://devintersection.com/) in Orlando, Florida from April 18-22, 2016. This was a new experience for me since I normally attend the SQL PASS conference in Seattle, Washington (http://www.sqlpass.org/summit/2016/Welcome.aspx). And there were some things I really liked and others not so much. For example, I really liked the weather (sunny and in the 80s), the high quality of speakers, and the smaller conference size. However, because it was a smaller conference, there were fewer sessions to choose from and fewer after-hours activities, and a large portion of the attendees were developers rather than SQL professionals specifically, so the networking was different.
- Performance Troubleshooting Using Waits and Latches (Paul Randal) – I really enjoyed this session, and he provided some great advice, such as don't always assume that the symptom equals the root cause of the problem and don't do knee-jerk performance troubleshooting. Other items that were covered included: how to start your performance troubleshooting using waits, components of the scheduler, latches, spinlocks, various solutions to wait problems, and some real-world examples. (http://www.sqlskills.com/blogs/paul/category/wait-stats/)
Sessions – Tuesday
- Keynote: Microsoft Cloud (Scott Guthrie) – Microsoft’s cloud services seemed to be an underlying theme of this conference. The speaker presented some amazing things regarding different companies’ experiences using Azure and how the newly acquired company, Xamarin (https://www.xamarin.com/), can be used to help with the development and testing processes for mobile devices.
- SQL Server 2016 New Feature Overview (Tim Chapman) – This session provided a high-level overview of many of the new features of SQL Server 2016: maintenance plans, backup and restore extended events, new cardinality estimator, memory grant hints, multiple tempdb files on install, new ALTER DATABASE options, new T-SQL and DMVs, column compression, in-memory optimizations and functions, live query monitoring, Query Store, temporal tables, JSON, row-level security, dynamic data masking, load balancing for Availability Groups, stretch databases, backups to Azure, and Always Encrypted.
- SQL Server Tuning When You Can’t Fix the Queries (Brent Ozar) – I really liked this energetic presentation. The speaker mentioned that you might choose to use batch requests per second and wait time per second as the key indicators for performance. He advocated using his own stored procedure, sp_askBrent (https://www.brentozar.com/askbrent/) to troubleshoot slow performance. Additionally, he suggested an easy approach to understand from the business or management what you can change and what you cannot change in order to improve performance. Afterwards, he reviewed some common wait types and their resolutions. His basic plan of action was to: tune queries, tune indexes, add more memory, and make storage faster.
- Corruption Survival Techniques (Paul Randal) – I always feel like taking a session on corruption is a must for every DBA, and I don't think anyone does it better than this speaker. He emphasized that corruption will always happen and we need to know what we are doing or else we could make it worse. Additionally, he mentioned that we must perform root cause analysis or else it may re-occur. In order to find it early, we should have IO alerts set (823, 824, 825) and CHECKDB running regularly. Also, he emphasized the need to have a disaster recovery run book. (http://www.sqlskills.com/blogs/paul/category/corruption/)
- Keynote: Do You Know Data in the Microsoft Cloud? (Bob Ward) – This was a good session that explored many of the technical details of Microsoft’s cloud services, including Infrastructure as a Service (IaaS), SQL Database, SQL Data Warehouse, DocumentDB, HDInsight, Data Lake analytics, and Data Factory.
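The IO alerts mentioned in the corruption session above can be set up with SQL Server Agent alerts; a minimal sketch follows (the alert names are illustrative, and you would normally also attach an operator notification):

```sql
USE msdb;
GO

-- Create an alert for each severe I/O error number (names are illustrative)
EXEC dbo.sp_add_alert
     @name = N'Error 823 - Hard I/O error',
     @message_id = 823,
     @severity = 0,
     @include_event_description_in = 1;

EXEC dbo.sp_add_alert
     @name = N'Error 824 - Logical consistency I/O error',
     @message_id = 824,
     @severity = 0,
     @include_event_description_in = 1;

EXEC dbo.sp_add_alert
     @name = N'Error 825 - Read-retry required',
     @message_id = 825,
     @severity = 0,
     @include_event_description_in = 1;
```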
Sessions – Wednesday
- Keynote: Go to There and Back Again (Douglas Crockford) – This presentation focused on the development of programming languages and what he thought the next big ideas would be. This speaker is probably best known for popularizing JSON. (https://en.wikipedia.org/wiki/Douglas_Crockford)
- New Features with SQL Server 2016 Reporting Services (David Pless) – I was really excited about this session and I think the speaker covered many of the features well. The Reporting Services team has a nice blog that covers many of the things he did: https://blogs.msdn.microsoft.com/sqlrsteamblog/.
- Integrating Azure into Your Data Platform (Grant Fritchey) – I feel like this session was a balanced view of using Azure. He made the case for some who may be good candidates for using Azure, such as “greenfield” stuff, small shops, and global companies with distributed systems. He then talked through using active directory, virtual machines, and setting up network, monitoring, and other basics.
- DevOps with Team Foundation Services and Azure (Karl Rissland) – This was a well-developed presentation that focused on the people, the process, and the tools to make DevOps work. He mentioned some of the cultural impediments between IT groups (i.e. UI, integration, data, infrastructure) and how each focuses on a different purpose from change to stability. The technologies he used in his demos were: PowerShell, PowerShell DSC, Azure ARM templates, VSTS, and Azure.
Sessions – Thursday
- Keynote: A Day in the Life of the Data Scientist (Buck Woody) – This might have been one of my favorite presentations, as he broke down the job of a data scientist into a few parts. (https://buckwoody.wordpress.com/)
- Inside the SQL Server Query Store (Bob Ward) – This session featured some great technical content that made me excited to migrate to SQL Server 2016. I feel like the Query Store is a game changer and it was explored in some detail. Some of the subjects that were touched upon were: query execution over time, reporting of completed and failed queries, storing of compiled queries and plans and stats, the query store data model – including plan store and runtime stats, compilation and execution statistics, UTC dates, and maintenance.
- Scripting in SSIS (Tim Mitchell) – This session highlighted the creating of custom code to extend the capabilities of SSIS, using SSIS expressions, script task, script component, and PowerShell. (https://www.timmitchell.net/)
- Keynote: SQL Intersection Closing Panel & RunAs Radio Recording – Basically, we were the audience for a recording of the radio show RunAs Radio (http://runasradio.com/).
- Leveraging SQL Server in the Cloud (Jeremiah Peschka and Brent Ozar) – This workshop compared Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) and some of the pros and cons of each. Also, we went through some exercises to help us think through our decisions of how to leverage the cloud given certain situations and features that would be required. Again, it was stressed that a hybrid approach may work best.
So, I did enjoy my first experience with the SQL Intersection conference. Although it was a little different than what I was used to, it provided a high level of training and was in a nice location.