A while ago I wrote a post on licensing SQL Server in a virtual world, and there are some changes you need to be aware of with the release of SQL Server 2008 R2, as the current site for SQL Server 2008 R2 doesn’t have this detail.
Very simply SQL Server licensing has been aligned to Windows Server licensing:
This alignment also applies to CPU support and memory as the table below shows..
Two other things to note..
The following are only available in the higher editions:

- PowerPivot for SharePoint
- Master Data Services
- Unicode compression (which apparently is really useful if you’re running SAP)
- the Enterprise features introduced in SQL Server 2005/2008, e.g. Resource Governor, partitioning, more than two nodes in your cluster, etc.

As ever there is an exception to every rule, and in SQL Server 2008 R2 Standard edition, data and backup compression are included, except for the Unicode compression mentioned above.
The editions page of the SQL Server 2008 R2 site currently has only an overview of this, but it should be your primary resource for understanding which edition is right for your business and how to license it.
I do think this is all pretty straightforward and reduces confusion compared with the customisation approach, where you buy the basic engine (much as you would buy a car) but every little add-on, such as compression or clustering, is an extra cost. As ever, comments are welcome.
PowerPivot is essentially a way of making an Analysis Services cube using Excel as the design tool. When you use the PowerPivot for Excel add-in there is no back-end dependency, but if you want to save a PowerPivot to SharePoint 2010, there has to be a special installation of SQL Server 2008 R2 Analysis Services associated with the SharePoint farm.
This integrated installation might seem similar to the way Reporting Services can also be configured to work with SharePoint; however, when Analysis Services is installed in this way it can’t be used as a repository for traditional cubes. So you can’t use Management Studio to manage this new environment; it all has to be done through SharePoint. Nor can you deploy (build or restore) ‘normal’ cubes to this instance from scripts or the BI Dev Studio.
However a good test that your PowerPivot environment is running properly is to post a PowerPivot to SharePoint ..
Now open the PowerPivot you just saved
and copy its URL..
Now open SQL Server Management Studio and connect to an Analysis Services database..
and paste in the URL of the PowerPivot..
and the PowerPivot looks like any other cube. However, the advice here is look but don’t touch. So apart from testing, the only other thing I can think of that you might want to do is to script out the data source view or schema. The use case for this would be to quick-start the design of a traditional Analysis Services database, because the PowerPivot has moved from being a tactical solution to a strategic one.
Another thing to note about the connection string is that it can also be used in Reporting Services, or anything else that can consume Analysis Services data, to make the PowerPivot a data source, e.g. you can write and run a report against a PowerPivot..
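For reference, the connection string simply points the Analysis Services OLE DB provider at the workbook’s SharePoint URL; the server and gallery names below are made up for illustration:

```
Provider=MSOLAP;Data Source=http://sharepoint/PowerPivotGallery/SalesAnalysis.xlsx
```

Anything that accepts an Analysis Services connection string, such as a Reporting Services data source, should then treat the PowerPivot like any other cube.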
I have a short video on sharing PowerPivot data here, and there is a complete site dedicated to PowerPivot at http://PowerPivot.com
Why are people in this country so glued to the imperial system of measurement? Is it easier to use than metric (SI) units? e.g.
SI units are designed to work together from the ground up and use a common set of prefixes to denote scale e.g. kilo, micro, centi etc., so calculations involving multiple units (like force and acceleration) are much easier as are day to day areas and volumes.
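To make that concrete, here’s a quick Python sketch (purely illustrative) of the same force calculation in both systems; in SI the units compose directly, while the imperial version needs a conversion constant:

```python
# Force on a 2 kg mass accelerating at 3 m/s^2: in SI, F = m * a directly.
mass_kg = 2.0
accel_ms2 = 3.0
force_n = mass_kg * accel_ms2  # 6.0 newtons, no conversion factor needed

# In imperial units, pounds-mass and pounds-force differ by g_c = 32.174,
# so the same calculation picks up an extra constant:
mass_lbm = mass_kg * 2.20462       # kilograms -> pounds-mass
accel_fts2 = accel_ms2 * 3.28084   # m/s^2 -> ft/s^2
g_c = 32.174                       # lbm·ft / (lbf·s^2)
force_lbf = mass_lbm * accel_fts2 / g_c
```

The SI version is one multiplication; the imperial one needs three conversion factors to get the same answer.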
The same sort of odd inertia seems to exist in isolated pockets, as I found out at the conference I was at last week; only this time it is a reluctance to adopt the new Office ribbon, and it’s almost as if the people I was talking to were fixed in the idea that they didn’t like it and were then looking for reasons not to.
When Office 2007 was developed something had to give. The menu structures were getting so long that navigating them became more and more difficult, to the point where 80% of the feature requests for Excel 2007 were for features already in the product that users couldn’t easily find. So the team decided to attack the problem in several ways:
Office 2010 builds on this, but also harks back to earlier versions of Office in that you get the File button back, which now brings up Backstage, where you can:
One other thing I couldn’t field at the event was how you could get toolbars to float as in Office 2003. You can’t do that any more, but you do have right-click, so if I select something in Office 2010, e.g. a textbox in PowerPoint, and right-click
I can not only set its properties, I can also use the mini-toolbar (next to the orange arrow) to directly format what I am working on.
Microsoft isn’t giving up on the ribbon and is actually expanding its use to OneNote, Outlook, Project, Visio and SharePoint 2010, but it has made changes to make it even easier to use. You could take my word for it, or you could download the current beta and try it yourself.
My parting shot is that at the BETT 2010 event I had about twenty questions about the release date of Office 2010 (some time before June 2010), so clearly some people out there like it. Anyway that’s done, so now I am off for a quick 0.56826125 litre!
I am a pretty good cook (not as good as our chef in residence Marc) but I don’t practice enough so I wanted to get back up to speed to help out while my wife works on her OU degree and to explain PowerPivot.
I have posted this video on YouTube..
..and you can click on the yellow buttons on it to view the screencasts I have made on each topic (there’s a back button on each screencast as well).
Hopefully you’ll find this useful and as ever comments are welcome.
In my recent post on the license changes in SQL Server 2008 R2, I deliberately didn’t cover the way the new features would be distributed across the various editions as at the time of writing the decisions simply hadn’t been made.
One of these new features is StreamInsight (the feature formerly known as Complex Event Processing) and I haven’t posted too much about it to date because I didn’t see it being a mainstream tool.
Before I go into that, what exactly is StreamInsight?
Imagine you’ve just woken up this morning and all these annoying inputs are entering your brain:
Your body needs to react to all of this in the right way and often very quickly rather than simply remembering that this has all happened. If your body was running StreamInsight then you could set up a few rules to start your day:
So StreamInsight is a near-real-time monitoring and control framework able to respond to thousands of events a second. It does this by running in memory, with the optional capability to write out what’s happened to a database, i.e. SQL Server.
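StreamInsight itself is a .NET framework, but the core idea of a standing query over a stream of timestamped events can be sketched in a few lines of Python (this is purely illustrative and has nothing to do with the real StreamInsight API):

```python
from collections import deque
from datetime import datetime, timedelta

def sliding_window_alert(events, threshold, window=timedelta(seconds=5)):
    """Yield an alert whenever more than `threshold` events arrive
    within a sliding time window - a toy 'standing query'."""
    recent = deque()
    for timestamp, payload in events:
        recent.append(timestamp)
        # drop events that have fallen out of the window
        while recent and timestamp - recent[0] > window:
            recent.popleft()
        if len(recent) > threshold:
            yield (timestamp, f"{len(recent)} events in {window.seconds}s")

# A burst of simulated sensor readings, one every half second
start = datetime(2010, 1, 1)
stream = [(start + timedelta(seconds=0.5 * i), "reading") for i in range(20)]
alerts = list(sliding_window_alert(stream, threshold=8))
```

The rule fires as soon as the event rate crosses the threshold, while the events themselves are never persisted unless you choose to write them out.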
It will have several applications, from traditional manufacturing or environmental process monitoring and control, to biomedical applications and even stock market analysis.
StreamInsight is a development framework, so there is nothing to see until you start to program in it, and one of the other reasons I haven’t mentioned much about it before is that it was originally only going to be in the new DataCenter edition of SQL Server.
However it will be included in a limited form in Standard edition, the limitation being a restriction on throughput rather than capability, so code written for one edition will work on them all. This restriction will typically result in Standard and Enterprise editions being limited to about 5,000 events a second with greater than 5-second latency, while DataCenter edition will be able to handle over triple this amount at far lower latency.
If that sounds interesting, there’s a whole site on StreamInsight here, there are some sample apps on the StreamInsight blog here, and you can download the November CTP of it here.
If I was writing a bill of rights for the internet my main focus would be to make the web a safe and fun place for everyone, but that’s not going to happen, so instead I spend some of my spare time trying to help children be safe online (a sort of digital Green Cross man). Because this entails meeting the children, I have had to get the relevant criminal records checks done via the Child Exploitation and Online Protection (CEOP) Centre and receive their excellent training.
It’s also important to educate parents about the threats the internet can pose. While we have some really good guidance on how to lock down a child’s PC and Xbox to make them safer, increasingly children will be surfing from a phone, and of course there are other connected gaming platforms like the Wii and PlayStation, so the focus must be on awareness and process rather than some magic bit of technology.
So there is Safer Internet Day on 9th Feb, when volunteers from O2, Visa and Microsoft will be descending on schools in the UK to spread the message, not only to children but to parents as well. However we can’t be everywhere, so there is also an opportunity for the parents’ education session to be streamed to schools throughout the day; if your school wants to get in on this please contact me and I’ll set it up.
The people we employ to run the country have decided to make a lot of the data they collect on our behalf freely available at Data.gov.uk. This data by itself is of limited use; however, when combined with in-house data in a business or academic institution and plotted onto a map, it becomes quite powerful.
And that power can be good or bad. If you are trying to sell your house online and I notice that car theft and muggings in your area are twice the national average, that might influence my decision to purchase. On the other hand, this lets the buyer know more about the area they are moving into, e.g. the quality of schools, healthcare etc.
Fortunately there’s no personal data on there, but the data is in quite a lot of different formats, and the site is really just a collection of URLs to the various agencies and NGOs that hold the source data, some of which want to give you a PDF document!
These data sets typically don’t have geospatial coordinates in them, but there are some reference data sets which do, along with keys to such things as local authority boundaries. So to get your own solution working you’ll have to find out which one to use to join to the data sets you want to plot and compare.
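The join itself is simple once you’ve found the shared key; here’s a rough Python sketch (the data set, column names and figures are all invented for illustration) of attaching coordinates from a reference data set to a statistics data set via a local-authority code:

```python
# Hypothetical crime-statistics data set, keyed by local-authority code
crime_stats = [
    {"la_code": "E09000001", "thefts_per_1000": 12.4},
    {"la_code": "E09000002", "thefts_per_1000": 7.9},
]

# Hypothetical reference data set carrying names and coordinates
boundaries = {
    "E09000001": {"name": "City of London", "lat": 51.515, "lon": -0.091},
    "E09000002": {"name": "Barking and Dagenham", "lat": 51.546, "lon": 0.129},
}

# Join on the shared key so each statistic picks up a plottable location
plottable = [
    {**row, **boundaries[row["la_code"]]}
    for row in crime_stats
    if row["la_code"] in boundaries
]
```

Each row in `plottable` now has both the statistic and a latitude/longitude, so it can go straight onto a map.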
It is a start and there is a forum capability where you can post questions and make suggestions, and there’s also a developer section to support and encourage applications around the data.
All I need now is a quiet moment to explore it in a bit more details and have a go!
One of the key problems in BI is getting your data together and cleaning it, and one of the best tools for doing this at scale is Integration Services in SQL Server. However, Integration Services is too complicated for the average business user, yet these users still need something to load and prepare data. They are used to doing this in Excel, and so that is exactly what they use in PowerPivot for Excel.
There are two parts to the process, extraction and transformation. PowerPivot provides simple wizards to get data from a variety of sources, in fact from anything that can be connected to. New sources include SQL Azure and data feeds in Reporting Services, and I have a short screencast on this here (on TechNet Edge).
Having grabbed all the data, each table ends up in its own tab. The most important step is to create relationships between them where they are not automatically picked up from the sources by PowerPivot. The data in each one can be extended with extra columns (but NOT rows), using special Excel-like formulas (Data Analysis eXpressions, or DAX) to do lookups across the tabs. I have another screencast on this here.
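The lookup idea is much like a VLOOKUP that follows a relationship. Here’s a rough Python sketch of what such a calculated column does (the table and column names are invented, and this is only an analogy for the DAX behaviour, not DAX itself):

```python
# Two "tabs": a products table and an orders table, related on product_id
products = {
    "P1": {"name": "Widget", "unit_price": 2.5},
    "P2": {"name": "Gadget", "unit_price": 4.0},
}
orders = [
    {"product_id": "P1", "quantity": 10},
    {"product_id": "P2", "quantity": 3},
]

# A calculated column: follow the relationship to the related row
# and derive a new value for every existing row (columns, not rows)
for order in orders:
    related = products[order["product_id"]]
    order["line_total"] = order["quantity"] * related["unit_price"]
```

Every order row gains a `line_total` column computed from the related products row, which is the shape of a cross-tab lookup in a calculated column.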
If you want to know more, there are also various demos, and videos on www.powerpivot.com
The most important thing about PowerPivot is the ability to share a user’s analysis via SharePoint, so that other users can slice and dice the data from within a browser. As I’ve said before, this uses SQL Server Analysis Services behind the scenes, albeit as part of a special install option. I’ve also said that PowerPivots on SharePoint are nowhere near as scalable as conventional Analysis Services, so usage has to be carefully monitored to make best use of the facility. This monitoring is built into the PowerPivot functionality in SharePoint 2010 via a dedicated dashboard.
The other clever thing about this installation is the ability to schedule refreshes of the PowerPivots from their source data thus keeping them up to date without any intervention.
I have another screen cast of it on TechNet Edge here.
I have spent the last four days on the Microsoft stand at a huge education show (BETT 2010), fielding technical questions on Windows 7 and Office 2010. I take my hat off to the IT Pros working in schools, as they have to cope with all kinds of applications and hardware, and usually there are only one or two of them per school.
But don’t take my word for it here are some of the more unusual conversations I had:
In future these guys will be opening up their networks to pupils’ laptops, and of course there is the 20% turnover in every school every year as pupils from the senior year leave and new ones arrive in the junior year.
Fortunately there are quite a few things to make life a little easier for them, probably the most important being live@edu, which is free Microsoft-hosted e-mail and collaboration for pupils and students, so that the IT staff in a school only have to worry about e-mail for the staff.
Other resources for schools in the UK are on the Microsoft education portal, and the education team’s blog including part 1 and part 2 of ICT Money Saving tips, which was a standing room only talk on the Microsoft stand at BETT.
Storing and manipulating spatial data in SQL Server 2008 is fairly straightforward, thanks to its new data types, methods and indexing. However, presenting this data is more of a challenge; showing your users a grid of latitudes and longitudes isn’t really going to help, so you need to put your data on a map. The obvious choice for this is Bing Maps, but how can you do this quickly and easily?
To find out how, why not come along to this event at our offices in Reading on 21 January..
If you can’t make it and you are still interested then you could do worse than buy “Beginning Spatial with SQL Server 2008” published by Apress.
I like technology but it’s what you do with it that matters. Having said that if you don’t know what a piece of technology is capable of be it your home cinema system or SQL Server then you might not be making best use of it.
One piece of technology from Microsoft that is little understood except by those already using it is BizTalk and I often get asked what the difference is between BizTalk and SQL Server integration services. The easiest way to differentiate them is to understand that:
On the one hand a data warehouse gives us the ability to measure what’s happening in a business, while on the other BizTalk gives us the ability to quickly create and modify business processes. The interesting bit is that having found out what’s not working in your business using business intelligence, you might want to change the way you do business. The flip side is: how do you know whether your processes are optimised if you don’t have a mechanism to measure them? So in my opinion, combining business intelligence technologies with a powerful process management suite like BizTalk is an important part of having an efficient and agile business.
Given that, even though the economy appears to be slowly recovering in some areas, it is still very hard for some businesses to survive, I think this stuff is quite important.
If any of this sounds like it might be of interest then we have an event organised around this on 9th Feb at our offices in Reading. There are a variety of speakers including a customer, partners and Microsoft experts and I get to do the business intelligence slot. To see the full agenda and book a place..