Thursday, November 12, 2009

Tech-Ed Berlin 2009: Day 4, Thursday 12th of November

Don’t know how we did it, but we arrived on time, so there was no need to hurry. The S-Bahn to Tech-Ed was packed to the brim. Somehow these trains should carry a sign like this: ‘This train car can hold 200 people OR 600 IT staff attending Tech-Ed’. Seriously, when the train entered the station for Tech-Ed it was flooded with IT staff from all over the world. Never before has the density of laptops and smartphones per person on that platform been so high!

Upgrading to SQL Server 2008 Done Right
The first session covered how to upgrade to SQL Server 2008 the right way. The speaker was Dandy Weyn, Technical Development Manager. When the upgrade to SQL Server 2008 isn’t prepared properly, one is likely to end up like this:

Step by step the available upgrade scenarios and their caveats were discussed. Some known compatibility issues were also brought to our attention. When upgrading, one must not forget to look beyond the SQL Server itself as well. For instance, cross-database dependencies, linked servers and extended stored procedures might make the upgrade more complex.

Thankfully, Microsoft has delivered good tooling AND documentation to aid in the upgrade process. One of these tools is the SQL 2008 Upgrade Advisor. It contains another tool, the ‘Upgrade Advisor Analysis Wizard’. With it one can select the DBs to be upgraded and check whether there are issues to be fixed prior to the upgrade. In conjunction with this tool the SQL 2008 Profiler is run as well, so much useful information is collected. When the ‘Upgrade Advisor Analysis Wizard’ has finished, the results are shown and can be examined in full detail:

In one demo an upgrade from a SQL 2000 DB to SQL 2008 was shown. This upgrade went wrong. With the above-mentioned tooling the causes of the failure were identified and resolved, after which the upgrade went fine. Afterwards the Server Objects (e.g. linked servers) and Agent Jobs were taken care of as well.

Another demo was about upgrading from SQL 2005 to SQL 2008, using the same approach. One very important thing to do is to update the statistics of the upgraded DB; otherwise performance won’t be optimal.
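As a sketch of that last step: for each upgraded database you would raise the compatibility level and refresh the statistics. The database names below are hypothetical; `sp_updatestats` and compatibility level 100 (SQL 2008) are real T-SQL, the little generator around them is just mine for illustration.

```python
# Sketch: generate the post-upgrade maintenance commands per upgraded database.
# Database names are hypothetical examples.
upgraded_dbs = ["SalesDB", "HRDB"]

def post_upgrade_script(db: str) -> str:
    # Raising the compatibility level to 100 enables the SQL 2008 engine
    # behaviour; sp_updatestats then refreshes the statistics so the new
    # optimizer doesn't build plans against stale data.
    return (
        f"USE [{db}];\n"
        f"ALTER DATABASE [{db}] SET COMPATIBILITY_LEVEL = 100;\n"
        f"EXEC sp_updatestats;"
    )

for db in upgraded_dbs:
    print(post_upgrade_script(db))
```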

Shall I LTI or ZTI?
The second session I attended was all about Windows 7 deployment with MDOP and SCCM. Good demos were given. The session covered many topics, some of them being:

Resolving application compatibility issues
The Application Compatibility Toolkit was discussed and briefly demonstrated. Certainly a worthwhile tool for organizations migrating to Windows 7. A nice touch is the UI of this tool: it has the look & feel of the System Center products like SCOM! Other tooling was demonstrated as well, like the Standard User Analyzer and the Microsoft Application Compatibility Database. These tools do not only show what the issues are but also deliver solutions to them. This is great, since it helps companies in their quest to resolve all compatibility issues with the applications they are using.

Upgrading to Windows 7 and how to go about it?
Besides SCCM there are other tools available as well, like the Microsoft Deployment Toolkit (MDT), aka Light Touch Installation (LTI), since it still needs input from the administrator per system to be upgraded, whereas SCCM offers Zero Touch Installation (ZTI). Here all the needed operations are centralized: not only the upgrade process itself, but also the monitoring, the reporting and the repairs (when needed). No input on a per-client basis is needed anymore.

The earlier-mentioned Application Compatibility Toolkit integrates with SCCM through the ‘Application Compatibility Toolkit Connector’, so its features are leveraged within SCCM. Also, the code used for MDT is used for SCCM and vice versa. Among the reports a cool one is to be found, the Windows 7 Upgrade Assessment report, which can be run against a collection within SCCM.

When an organization runs SCCM and wants to upgrade to Windows 7, SCCM will be a very good aid in making the upgrade run as smoothly as possible.

All you ever wanted to ask about SCOM
In this ‘Birds-of-a-Feather’ session Pete Zerger and Rory McCaw answered questions from the audience about all kinds of SCOM issues. Simon Skinner was also present, and between the three of them all questions raised by the audience were answered very well. Again, the knowledge AND experience these people have is awesome!

X-plat and Agents that won’t work
An interactive session where speaker Barry Shilmover answered questions from the audience. He also demonstrated some issues which might occur when discovering and installing the Agent on a UNIX/Linux system.

During this session Barry explained that the Discovery process of UNIX/Linux systems in SCOM R2 is the MOST complex operation, mainly because it is purely data-driven. The MP delivers all the knowledge here; the Discovery process itself has no knowledge at all about the systems it is discovering. So when support for another UNIX/Linux system is added, Microsoft ships an MP and an Agent; there is no need to alter SCOM R2 itself. Also good to know is that by default the Discovery process will scan for 25(!) different systems.

DNS is essential in this process, since a certificate is created on the fly as well, for which the correct FQDN is required. This is the FQDN as the UNIX/Linux system knows it. The number one cause of troublesome Agent installations is that the name resolved by the RMS doesn’t match the name as the UNIX/Linux system sees it.
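That number one check boils down to comparing two strings: the FQDN the RMS resolves via its DNS versus the FQDN the host reports about itself (e.g. via `hostname -f` on the box). The host names below are made-up examples; the comparison itself is the whole trick.

```python
# Sketch of the #1 troubleshooting check for a failing UNIX/Linux agent push.
# Both names are hypothetical; in practice the first comes from the RMS's
# DNS lookup and the second from `hostname -f` on the UNIX/Linux system.
rms_resolved = "lnx01.contoso.com"
host_reported = "lnx01.europe.contoso.com"

def fqdn_matches(rms_name: str, host_name: str) -> bool:
    # DNS names are case-insensitive, so compare accordingly.
    return rms_name.lower() == host_name.lower()

if fqdn_matches(rms_resolved, host_reported):
    print("FQDN match: certificate signing should succeed")
else:
    print("FQDN mismatch: fix DNS (or the host's own FQDN) before pushing the agent")
```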

Whoever thought they would see this on a UNIX/Linux system?

Since UNIX/Linux systems often use different passwords per system, Microsoft has also revised the security model within SCOM R2, aka Run As Accounts and Run As Profiles. The distribution method (secure/less secure) comes into play here as well. For Windows systems this is not very important, since the accounts (and passwords) reside in AD. The Windows servers being monitored are only sent the SID of the related account, not the account name nor the password itself.

For UNIX/Linux systems this is different. Also, the way the Run As Account is targeted in the Run As Profile is very important. Here one can make a really granular selection of which UNIX/Linux computers the credentials are targeted at, thus enabling SCOM R2 to use multiple passwords for different UNIX/Linux systems.

Audit Collection Services (ACS) for UNIX/Linux is coming up. Microsoft is still testing it, but it is expected to come out this year! For this to work, new MPs must be imported, as well as specific ACS Forwarders. Microsoft is also talking with companies like Bridgeways and Novell in order to extend it.

This was a very good session and I learned a lot from it.

Even though I will attend more sessions today, I won’t find time to add them to this blog posting. Therefore I am publishing it already.

Again I must say it has been a very interesting day. Many new things learned and seen. It is amazing to see how much effort Microsoft puts into the development of all its products. Whether it is Windows 7, SQL, SCCM, SCSM, SCOM, X-plat monitoring AND ACS for it as well (!!!!), the drive and roadmaps are very impressive. Good to be part of it. :)
