Monday 2 June 2014

Exam 70-461: Querying Microsoft SQL Server 2012

Resources to help when sitting the 70-461: Querying Microsoft SQL Server 2012 Certification Exam.

This is the overview page for the exam. I strongly suggest going over the Skills Measured section and making sure you have a good understanding of each item, including all the different options for Triggers, XML, Indexed Views etc.
Exam 70-461 Querying Microsoft SQL Server 2012

The Training Kit book is a great reference resource. In the front it highlights the skills measured and which chapters cover them. One word of warning: the practice test that comes with the book is far easier than the actual exam. Use it as a guide to how the questions will be asked.
Amazon - Training-Kit-Exam-70-461-Microsoft

Microsoft 70-461 Test Preparation - This is a playlist of 10 videos which will give you a good overview. The videos don't cover everything but they are a great start. Even if you are not planning on taking the exam, it's a good set of videos to watch to improve your knowledge.
Playlist of 10 Videos on YouTube

MCSA Certification Prep | Exam 461: Querying Microsoft SQL Server 2012 - This is from Microsoft to give you an overview and tips on what to study
Microsoft Exam Preparation Guide

What's New in SQL Server 2012 (Part 2 of 13) - New Transact-SQL Enhancements - New features of SQL Server 2012, which will of course be asked about during the exam
New Features of SQL 2012 - Querying

The Microsoft Virtual Academy has some good videos as well. 
Querying Microsoft SQL Server 2012 Databases Jump Start

Microsoft Certification Offers - there can be some great discounts or double shot offers that you can take advantage of

My tip: if you are unsure of an answer, rule out what is not right. I found I could easily eliminate at least half of the options by looking for obviously wrong answers. For example, if the question asks for columns A, B and C, make sure each option actually returns those columns. Something as simple as that can narrow down the options, and if you still don't know, at least you will have a higher chance when guessing.

Sunday 27 April 2014

A Fan Asks Mike Rowe For Life Advice… His Response Is Truly Brilliant.

Some great advice from Mike Rowe from Dirty Jobs.
At the same time, I do think people should aim high; they just shouldn't cloud their ambitions with unrealistic criteria that must accompany them.

http://www.lifebuzz.com/mike-rowe/?utm_content=buffera2714&utm_medium=social&utm_source=facebook.com&utm_campaign=buffer

Saturday 29 March 2014

A Gamification view of the future

I did a Gamification course on Coursera a while back and this video was referenced during it. While not the greatest topic, it is a great view of what the future may hold, and really what is shown doesn't seem so far-fetched. Until the last few seconds, perhaps.

Believe in Gamification! [A Futuristic Short Film…: http://youtu.be/ziHCvpikLh8

Microsoft in 2019

I do enjoy seeing these future vision clips. They give inspiration and ideas for today.

Microsoft in 2019 [HQ]: http://youtu.be/RWxqSEMXWuw

Tuesday 18 March 2014

The world is one big dataset

This is a TED talk; seeing things like this really does inspire you.

Dan Berkenstock: The world is one big dataset. No…: http://youtu.be/7pVPmmwSeJQ

Sunday 9 March 2014

Search SSRS Datasets in SQL

The XML for SQL Server Reporting Services reports is stored in the ReportServer database on the server. It's possible to pull the XML out and then use the contents to find which reports are using which tables etc. This can be very handy when there is a schema change and logic in reports may need to be updated, but you don't know which reports need to be looked at.

The below will place the XML into a table so you can use it to filter what you are after.

--Create a table to hold each report's definition XML
CREATE TABLE [dbo].[ReportContents](
 [CatalogItemID] [uniqueidentifier] NOT NULL,
 [Type] [int] NOT NULL,
 [Path] [nvarchar](425) NOT NULL,
 [Name] [nvarchar](425) NOT NULL,
 [ItemContent] [xml] NULL
) ON [PRIMARY]

--Pull the report definitions out of the ReportServer catalog,
--converting the binary Content column into XML
INSERT INTO ReportContents
 (CatalogItemID, [Type], [Path], [Name], ItemContent)
SELECT
 ItemID
 , [Type]
 , [Path]
 , [Name]
 , CAST(CONVERT(VARCHAR(MAX), CONVERT(VARBINARY(MAX), [Content])) AS XML) AS [ItemContent]
FROM ReportServer.dbo.Catalog
WHERE [Type] <> 1 --Folder
  AND [Type] <> 3 --Image
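
A quick way to confirm the load worked is a simple check against the new table, for example:

--Confirm the report definitions were captured and the XML is queryable
SELECT TOP 10 [Name], [Path], ItemContent
FROM ReportContents
ORDER BY [Path]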


The below queries the table you created; adjusting the value of the @SQLToCheckFor variable changes which text to look for within the SQL datasets of the SSRS XML. Note this is set up for SQL Server 2008 report definitions; from memory it's just a change of the namespace URL from 2008 to 2005 etc. (a sketch of that change follows the query).

--Check all the reports to find a bit of SQL

DECLARE @SQLToCheckFor varchar(500) = 'FACTMemberBalance' --Enter the bit of SQL you want to search for here

select MydataSets.CatalogItemID, MydataSets.ReportName , MyDataSets.DataSetName,
MydataSets.CommandText, MydataSets.ReportPath
from
 (
  select
   CatalogItemID,
   [Name] As ReportName,
   [path] as ReportPath
   ,nref.value('@Name', 'nvarchar(255)' ) As DataSetName
   ,nref.query('.') As DataSetXML
   ,nref.value('declare namespace p1="http://schemas.microsoft.com/sqlserver/reporting/2008/01/reportdefinition";
     (./p1:Query[1]/p1:CommandText)[1]', 'varchar(max)') As CommandText
   ,nref.value('declare namespace p3="http://schemas.microsoft.com/SQLServer/reporting/reportdesigner";
      declare namespace p1="http://schemas.microsoft.com/sqlserver/reporting/2008/01/reportdefinition";
      declare default element namespace "http://schemas.microsoft.com/AnalysisServices/QueryDefinition";
     (./p1:Query/p3:MdxQuery/QueryDefinition/QuerySpecification/From)[1]', 'varchar(max)') As Cube   
  from ReportContents
  cross apply ItemContent.nodes('declare default element namespace "http://schemas.microsoft.com/sqlserver/reporting/2008/01/reportdefinition";
            declare namespace p1="http://schemas.microsoft.com/SQLServer/reporting/reportdesigner";
  //DataSet') as R(nref)
--A filter can be added here to limit what reports you are looking through
 )MyDataSets
where CommandText like '%' + @SQLToCheckFor + '%'
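
If your reports were built against the 2005 schema instead, the shredding is the same; only the report definition namespace changes. Below is a minimal sketch of pulling the dataset command text with the 2005 namespace (the URL is the standard one for 2005-era RDL, but confirm it against the xmlns attribute at the top of one of your own report files):

--Same idea for reports using the 2005 report definition schema
--(namespace URL assumed; check the xmlns attribute in your RDL files)
SELECT
 CatalogItemID,
 [Name] AS ReportName,
 nref.value('@Name', 'nvarchar(255)') AS DataSetName,
 nref.value('declare namespace p1="http://schemas.microsoft.com/sqlserver/reporting/2005/01/reportdefinition";
  (./p1:Query[1]/p1:CommandText)[1]', 'varchar(max)') AS CommandText
FROM ReportContents
CROSS APPLY ItemContent.nodes('declare default element namespace "http://schemas.microsoft.com/sqlserver/reporting/2005/01/reportdefinition";
 //DataSet') AS R(nref)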



How Target Figured Out A Teen Girl Was Pregnant Before Her Father Did

A favorite article of mine, brought to my attention by a friend. It's pretty impressive what Target was able to do, but at the same time you have to wonder how far organisations should go to understand their customers.

How Target Figured Out A Teen Girl Was Pregnant Before Her Father Did

Saturday 8 March 2014

Telstra fields 40,000 government data requests

That seems pretty high to me. I understand there are times when it is needed, in emergencies etc., but I would like to know a bit more about the rest.

Tuesday 4 March 2014

Westpac using big data to woo customers with offers made to measure

An interesting article on how Westpac is using its data to target communications to its customers.
Nothing really new, but it is a bit of a milestone a lot of businesses would love to get to.

The below comment caught my eye and made me wonder what their definition of Big Data is, and why they would have more than one data warehouse. Maybe it's just the initial extracts from the web that are the "Big Data" component, which is then transformed into structured data in a warehouse?

"Our data sources are growing very fast and customer interactions are growing very fast," Ms Ganschow said. "We know who you are paying. We know where you are shopping and what you are buying. There is a lot of data pouring into our data warehouses."


Saturday 1 March 2014

Microsoft Power BI

Interesting stuff, but like most of these tools you really need to use them first before you can make a call on them.

http://www.techrepublic.com/article/microsoft-power-bi-brings-big-data-to-the-little-guys/

SQLCLR

I found this a useful article on SQL Server Central.
SQLCLR is a great way of integrating SQL Server with other applications and/or creating better ways of achieving your results.
The Level 2 page is very in-depth; I wish I had had access to such information years ago when I first started playing with SQLCLR.
My favorite SQLCLR I created triggered the SSRS web service so we could schedule and run reports with variables set in a table. Very handy when you need hundreds or thousands of reports scheduled to run only after certain events complete.

http://www.sqlservercentral.com/stairway/105855/
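
If you want to have a play with SQLCLR yourself, the T-SQL side of registering an assembly is fairly small; the .NET code is compiled separately into a DLL. Here is a minimal sketch, where the assembly path, assembly name, class and method names are placeholders for illustration only:

--Enable CLR integration on the instance (requires sysadmin)
EXEC sp_configure 'clr enabled', 1
RECONFIGURE
GO

--Register the compiled .NET assembly (path and name are placeholders)
--EXTERNAL_ACCESS is needed for calls outside SQL Server such as a web service,
--which also means the assembly must be signed or the database marked TRUSTWORTHY
CREATE ASSEMBLY ReportHelpers
FROM 'C:\CLR\ReportHelpers.dll'
WITH PERMISSION_SET = EXTERNAL_ACCESS
GO

--Expose a static method from the assembly as a normal stored procedure
CREATE PROCEDURE dbo.RunReport
 @ReportPath nvarchar(400)
AS EXTERNAL NAME ReportHelpers.[ReportHelpers.StoredProcedures].RunReport
GO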

Friday 4 October 2013

All Sci-Fi Spaceships Known to Man

In theory "All Sci-Fi Spaceships Known to Man", it is missing the ship from Space Balls though. I guess it wouldnt of fit on the scale.

All Sci-Fi Spaceships Known to Man

Saturday 4 May 2013

How to run your best 5k

Two reasons for this one. Firstly, I'm loving data visualisations at the moment; they are a great way to put across a message or visualise the data. This one is not about the data but about conveying the message of how to run 5 kilometres.
The other reason is that I run, and seeing tips like this is very helpful. Maybe it can help me reduce the number of injuries I get.

Shooting Hoops: Making $$$

I must admit I like this one a bit more because of my love of basketball and sports in general, but still, the amount of information put into it is great.
There is so much great data on basketball; I love it.
If only I could work with sports data. An excuse to do what I have been doing since I was a boy as a job; I think that would be my dream job.


Thursday 25 April 2013

In 60 Seconds

I'm really impressed at how well people are visualizing data these days.
Below is another great example; it makes you want to read each fact.
I have to find a way to improve how I do the same. If I can get anywhere near this level I'll be very happy.


Row Counts for each table on Database from Metadata

I'm sure a lot of people do this already, but for those of you who don't, there is an easy way to find the row counts of all of your tables without having to wait for COUNT(*) commands.

SQL Server saves this information for you.

Just run the below on each database you want to check, OR you can use the sp_foreachDB approach I mentioned in a previous post (Run SQL On Each Database) to run it on multiple databases. If I were to do that I would pump the data into a temp table each time and then return the results from it; there is a sketch of that approach after the query below.
SELECT
 DB_NAME() AS DatabaseName,
 st.Name,
 sc.Name AS [Schema],
 SUM(CASE WHEN (p.index_id < 2) AND (a.type = 1) THEN p.rows
     ELSE 0 END) AS [Rows],
 st.Modify_Date
FROM sys.partitions p
INNER JOIN sys.allocation_units a ON p.partition_id = a.container_id
INNER JOIN sys.tables st ON st.object_id = p.Object_ID
INNER JOIN sys.schemas sc ON st.schema_id = sc.schema_id
GROUP BY st.Name, sc.Name, st.Modify_Date
ORDER BY [Rows] DESC, sc.Name, st.Name
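
For the multi-database version, below is a minimal sketch of the temp table approach using the undocumented sp_MSforeachdb that ships with SQL Server (the '?' token is replaced with each database name); adjust it to whichever run-on-each-database procedure you prefer:

--Collect per-table row counts from every database into one temp table
CREATE TABLE #RowCounts (
 DatabaseName sysname,
 TableName sysname,
 SchemaName sysname,
 [Rows] bigint,
 Modify_Date datetime
)

EXEC sp_MSforeachdb '
USE [?];
INSERT INTO #RowCounts (DatabaseName, TableName, SchemaName, [Rows], Modify_Date)
SELECT
 DB_NAME(),
 st.Name,
 sc.Name,
 SUM(CASE WHEN (p.index_id < 2) AND (a.type = 1) THEN p.rows ELSE 0 END),
 st.Modify_Date
FROM sys.partitions p
INNER JOIN sys.allocation_units a ON p.partition_id = a.container_id
INNER JOIN sys.tables st ON st.object_id = p.Object_ID
INNER JOIN sys.schemas sc ON st.schema_id = sc.schema_id
GROUP BY st.Name, sc.Name, st.Modify_Date;
'

SELECT * FROM #RowCounts ORDER BY [Rows] DESC

DROP TABLE #RowCounts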

Saturday 13 April 2013

Big Data analysis allows businesses and governments to mine your personal details

A good article on what some organisations are doing in the way of analytics. Whenever you provide any information to someone, or interact in a way where they know who you are (even if it is just with a credit card), they can track you.

Big Data analysis allows businesses and governments to mine your personal details


Sunday 7 April 2013

SSMS Tools Pack

If you haven't already got SSMS Tools Pack I highly recommend it.

I use it a lot for the SQL Snippets, which allow shortcuts; for example, stf<ENTER> will return Select Top 100 FROM in the query window. All the snippets are customisable and you can create your own.

Running custom scripts from the Object Explorer is great too. I have scripts I want to run at a server level, so I just click on the server in Object Explorer and choose the script I want to run.

There are a lot of other really useful tools too; it just makes life that little bit easier.

Sourcing Data from Active Directory



Something I've found useful in the past is being able to source user details from Active Directory.
It's nice being able to link the user names from your systems up to the actual names of the users when providing data.
It is also handy when you want to find which users have certain security access, for example who has access to Database X.

/*
--Check if Ad Hoc Distributed Queries is Visible/Turned on
sp_configure

--Make it visible if it isn't
sp_configure 'show advanced options', 1
reconfigure

--Turn it on
sp_configure 'Ad Hoc Distributed Queries', 1
reconfigure
*/

--Return all Active Directory users
--Replace ABC.DEF with your Active Directory server name

SELECT *
FROM OPENROWSET('ADSDSOObject', 'adsdatasource;',
 'SELECT Title, Department, Mail, DisplayName, Sn, GivenName, Cn
  FROM ''LDAP://ABC.DEF''
  WHERE objectClass = ''User'' AND objectClass<>''computer'' ')

--Find all Active Directory groups
--Replace ABC.DEF with your Active Directory server name
SELECT *
 , substring(AdsPath, charindex('CN=', AdsPath), 300) AS GroupName
 , substring(AdsPath, 0, charindex('CN=', AdsPath) - 1) AS Domain
FROM OPENROWSET('ADSDSOObject', 'adsdatasource;',
 'SELECT AdsPath, name FROM ''LDAP://ABC.DEF'' WHERE objectCategory=''Group'' ')

--Find users from a given Active Directory group
--Replace 'YourADGroup' with the name of the group you are after
--Replace ABC.DEF with your Active Directory server name
--Also again in the DC=ABC,DC=DEF section
SELECT *
FROM OPENROWSET('ADSDSOObject', 'adsdatasource;',
 'SELECT Cn FROM ''LDAP://ABC.DEF'' WHERE memberOf=''CN=YourADGroup,OU=Security,OU=Groups,DC=ABC,DC=DEF'' ')
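
To take this a step further and answer the "who has access to Database X" question, one approach is to pull the group members into a temp table and compare them against the Windows logins on the instance. This is just a sketch: it assumes the sAMAccountName attribute is available from your directory, that logins are stored in the usual DOMAIN\account format, and that ABC.DEF / YourADGroup are the same placeholders as above.

--Pull the members of an AD group into a temp table (placeholder names as above)
SELECT ad.*
INTO #GroupMembers
FROM OPENROWSET('ADSDSOObject', 'adsdatasource;',
 'SELECT Cn, DisplayName, Mail, sAMAccountName
  FROM ''LDAP://ABC.DEF''
  WHERE memberOf=''CN=YourADGroup,OU=Security,OU=Groups,DC=ABC,DC=DEF'' ') AS ad

--Match the group members against the Windows logins on the instance
--(assumes login names follow the DOMAIN\account format, with ABC as the domain)
SELECT sp.name AS LoginName, gm.DisplayName, gm.Mail
FROM sys.server_principals sp
INNER JOIN #GroupMembers gm
 ON sp.name = 'ABC\' + gm.sAMAccountName
WHERE sp.type = 'U' --Windows logins only

DROP TABLE #GroupMembers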