Wednesday, December 12, 2007

Walkthrough Connecting to Excel 2007 with SSIS

I'll be straight with you, I'm an SSIS newb. I've got the book, a couple of packages under my belt and that's it. The other day I had the distinct pleasure of having to connect to an Office 12 (read 2007) Excel file as a data source.

Needless to say I was pretty disappointed to find out that the Excel Connection Manager was not up to the task. The Excel Connection Manager data source can only handle files from 2003 and earlier. As an alternative you need to download and then use the Microsoft Office 12.0 Access Database Engine OLE DB Provider to get at Office 12 files.

While this introduces a deployment dependency, rest assured that you do not need to install Office 2007 to run this provider; you only need the provider installed on the machine that's running Integration Services.

So without further ado here's how you use the provider to access an Excel 2007 data source from the Business Intelligence Studio.

1) Install the Office 12.0 Access Database Engine OLE DB Provider (see above).
2) Open the Business Intelligence Studio and start a new Integration Services Project.
3) Drag a Data Flow task on to the designer surface and double click on it.
4) Drag an OLE DB Data source onto the designer surface and double click on it.
5) Click 'New' to create a new Connection.
6) From the Connection Manager choose Microsoft Office 12.0 Access Database Engine OLE DB Provider.
7) Click on the "Data Links" button.
8) Enter the location of the file in the "Data Source" field.

9) Under the "Advanced" tab select the permissions you want on the file.
10) Under the "All" tab click "Edit Value" for "Extended Properties" and enter the text "Excel 12.0;HDR=YES".
11) That's it, click "Test Connection" and find out how you did.

It's also of note that you can query not only sheets as if they were tables, but also ranges of cells. The syntax to do this looks like the following:

To query a worksheet (Sheet1):

FROM `Sheet1$`

To query a range of cells (A2:C10):

FROM `Sheet1$A2:C10`
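Wrapped in full statements (a sketch; the sheet name and cell range come from the examples above), the queries you'd send through the provider look like:

```sql
-- Whole worksheet; with HDR=YES the first row is treated as column headers
SELECT * FROM `Sheet1$`

-- Just the cell range A2:C10
SELECT * FROM `Sheet1$A2:C10`
```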
Well that's it, happy coding.


Tuesday, December 11, 2007

Virtual PC and Virtual Server: Doing Differencing Disks

[Updated Feb 26, 2008]
I work at a contracting/consulting firm. I kid you not when I say that what I'm working on sometimes changes every week for months at a time. I mean it. Two weeks ago was InfoPath. Before that was Reporting Services. Last week was SSIS. This week is nHibernate. Next week I'm porting an ASP.NET application to SharePoint. You get the picture.

I cannot begin to describe how dirty my dev environment gets. Take a bucketful of open source offerings, throw in a lot of MS tech stack CTPs and Beta 2s, and then stir in a service pack or two, and you get a really dirty development environment. Typically I'll scrub my box at least twice a year, but as I'm getting older I find I'm becoming more and more impatient when it comes to sitting through installs.

A couple of weeks ago I stumbled upon this wonderful article from Andrew Connell. Essentially it's a HOW TO for leveraging Virtual PC/Virtual Server images to help your development world. I won't go into the details that are so elegantly laid out in the article but let me assure you it's a dynamite read. It was so good that I mustered up the courage to give it a try.

The Pain

I have to tell you, creating all the base images (I decided to create 4 of them) was PAINFUL. It took what seemed forever to install, defragment, precompact, compress 4 machine bases with different assortments of WinXP, Win2k3, MOSS, VS2005 SP1 and SQL Server 2005 SP2. Essentially I threw away an entire weekend getting things up and running.

The biggest problem I had was my poor notebook disk. When you start creating 12 GB (or bigger) files your disk gets ridiculously fragmented. Factor in a 5400 RPM notebook drive (I decided to do this on my laptop for some reason) and what you get is a SLOW SLOW disk. Eventually the installs and service packs grind to a halt if you don't stay on top of the defragmenting.

I only wish I had known of contig, a utility hosted on TechNet that lets you defragment individual files/directories without needing to defrag the entire disk! Let me tell you, this little utility is a godsend for this type of task. If you have any intention of going through with this (and it's worth considering) be sure to download this utility.
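Usage is dead simple; a sketch (the VHD path here is hypothetical; -a analyzes fragmentation without changing anything, -v makes the defrag verbose):

```
contig -a "D:\VMs\Win2k3-Base.vhd"
contig -v "D:\VMs\Win2k3-Base.vhd"
```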

The Pain Summary is as follows:
-Long installs, you'll lose some of your life that you'll never get back. Ideally you'll come out ahead in the long run though.
-My Virtual PC images still don't pick up my second display at work. You can still have multiple displays, just not multiple displays INSIDE the virtual machine. This is irritating to me because I've grown very accustomed to having SQL Management Studio on one monitor and Visual Studio on the other.
-You'll probably have to buy an external drive, especially if you're doing this on a laptop. All that disk contention on one 5400 RPM drive is just a bad recipe for grouchiness. I'd recommend a FireWire one if you can get one; it's less CPU intensive to access.
-It's also probably worth your time/money to get at least 3 GB of memory if you're going to run very memory heavy virtual machines. But hey, memory's cheaper than bad humor, consider yourself lucky.

The Gain

I can now provision a machine in no time at all! And I don't just mean a new development machine either, you can take the same images that you would use with Virtual PC and use them with Virtual Server (you will need to run newsid every time you make a copy of a machine and want to run multiple copies at the same time). You can even run either the base images or differencing disks on Virtual Server. Developing off these base images we can now:
-Get a fresh development environment in a matter of minutes
-Continue working on another machine if my machine dies (as long as the external HD which holds the virtual machines is good)
-Make a copy of any development environment and give it to another developer
-Provision test/QA environments whenever we need a virgin machine
-Roll back any work we've done on the computer within a session (Undo Disks)
-Save the state of the machine at any time and continue later (you could be in the midst of debugging an app, save the state, put the machine down and carry on next week like nothing happened)

Trust me, this makes for a FEARLESS development environment; no longer do I grimace before applying some patch or installing some 3rd party utility. Given that Virtual PC 2007 and Virtual Server 2005 R2 are both free, if your firm has some good hardware lying around you should definitely give it some thought if you want your hardware to stretch a little further. It could really benefit both you and your team. Also, only one person really needs to set up the base images. After that you can copy them around, run NewSid on them and you're off and running!

Last but not least, I recently acquired both an external USB 2.0 and a FireWire 400 drive. I thought it might be educational to post the results of some HD Tach runs here. So here's my version of the USB 2.0 vs FireWire 400 test. Both drives actually had the exact same data on them, so it's a reasonably fair comparison.

USB 2.0 (Western Digital 250 GB 5400 rpm drive)

FireWire 400 (Verbatim 250 GB SmartDisk)

Internal MacBook Pro Drive (120 GB 4200 RPM)

External eSATA (WD Raptor 120 GB 10,000 RPM)
As you can see (click on the images) the USB 2.0 has better burst speeds, but the FireWire is a LOT less CPU intensive. They both boast about the same average read times. My advice: go with the FireWire unless you've got a bunch of cores in your machine that you don't want to have idle :-).

[Updated Feb 26, 2008] A friend sent HD Tach scores for an external WD Raptor connected via eSATA; as you can see it blows the other stats out of the water. Albeit it's a little less portable than its slower-spinning brethren.

Ok, that's it folks this one got a little long winded.

Happy dev-ing,

Monday, November 26, 2007

Visual Studio SP 1 Error on Windows 2003

I was building up a workstation the other day for SharePoint and one of the last steps was patching Visual Studio up to Service Pack 1. It's a sizable patch (over 150 MB on disk) and takes a while to unpack, never mind run.

What was driving me crazy though was that the installer kept dying and throwing out this error message:

"Error 1718. File was rejected by digital signature policy"
I was pretty sure that the patch was not corrupt; it had successfully installed on a Windows XP machine not 20 minutes before. After poking around I stumbled upon this KB article which fixed the issue. As it turns out there's a bug in the Windows Installer for Windows 2003. When trying to verify the signature of a file, it needs (depending on the size of the file) a fairly large piece of contiguous virtual memory to load the file into. For files that are pretty big this can be a problem. What ends up happening is that Windows often can't come up with the contiguous memory, so the validation step fails and it throws the above error.

The KB article above includes a hot fix for Windows 2003.


Saturday, November 17, 2007

Common SQL Server 2005 Diagram Support Errors

Here's a scenario, you're trying to create a database diagram and you're presented with the following error:

Database diagram support objects cannot be installed because this database does not have a valid owner. To continue, first use the Files page of the Database Properties dialog box or the ALTER AUTHORIZATION statement to set the database owner to a valid login, then add the database diagram support objects.

There are two common causes for this problem.

1) The database doesn't have a proper owner. This often happens if you're restoring the database from another SQL Server instance. To check the owner:
  • Right click on the database and click on Properties. On the Files page, make sure the owner property is set to a valid login. If you're lazy you can always set it to sa.

  • You can also do this T-SQL if you want to script it. The syntax looks like:
    ALTER AUTHORIZATION ON DATABASE::[databaseName] TO [owner]
2) Another trick you might try if the database does have a valid owner is setting the compatibility level to SQL Server 2005 (90). To change the compatibility level:
  • Right click on the database and click on Properties. On the Options page, set the Compatibility Level to SQL Server 2005 (90).
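If you'd rather script both fixes, the T-SQL looks something like the following (a sketch; the database name is hypothetical, and sa is just the lazy choice of owner mentioned above):

```sql
-- Fix 1: give the database a valid owner
ALTER AUTHORIZATION ON DATABASE::[MyRestoredDb] TO [sa];

-- Fix 2: bump the compatibility level to SQL Server 2005 (90)
EXEC sp_dbcmptlevel 'MyRestoredDb', 90;
```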

Well that's it folks. Hope it saves you a minute or two.


Friday, November 16, 2007

Safari 3 - Finally a REAL browser

I'm not going to lie to you. I've hated Safari for quite some time. Shortly after I first opened it I realized that browsing the web with Safari was like looking at the world while hung over. Not only were the most mundane of activities painful, everything rendered looked miserable and queasy.

When I became a web developer and started trying to deploy web applications that worked for Mac users, I really started to hate Safari. It seemed like NOTHING worked. Common JavaScript calls that would work great in IE and Firefox would fail miserably in Safari. The browser that Apple claimed was the "World's Best Browser" could barely render a .jpeg, never mind a decent Web 2.0 site.

Last night I unwittingly downloaded an update for OS X Tiger which included (among other things) Safari 3.0. To my surprise the browser actually works quite well. I was actually blown away when I found out that all the WYSIWYG editors that used to fail miserably in Safari 2 work wonderfully in Safari 3.0. Even the text editor that I'm using to write this blog normally fails terribly in Safari, but no more! Two other text editors that I'm fans of, TinyMCE and the FCKeditor also both work extremely well with Safari 3.0.

What surprised me even more after I started digging around for material for this blog entry was that Safari has actually had extremely strong CSS support since version 2.0. I found this page which has actually been tracking Acid 2 Test performance for major browsers for quite some time. Imagine my surprise when I found out that both IE and Firefox STILL don't pass the Acid 2 test.

That being said, the Acid 2 test isn't a measure of everything that's important for a good browser, but as a developer I can't help but shake my head at the test renders from both IE 7 and Firefox 2.0. One day I'd like to be able to write CSS without being paranoid as to how it's going to render in other browsers.

If you have any interest, this is how the Acid 2 Test currently renders in IE 7, Firefox 2.0, Opera 9 and Safari 3.0.

IE 7 (renders terribly)

FireFox 2.04 (almost there)

Opera 9 (renders correctly)

Safari 3 (renders correctly)

It's worth mentioning that this blog entry was written entirely in Safari 3.0. I can only hope that Safari 3.0 gets rolled out quickly and enjoys tremendous Mac user adoption. Kudos to Apple for finally releasing a half-decent browser with OS X.

Thursday, November 15, 2007

SEO with Google Webmaster Tools

Many people are probably already aware of the Google Webmaster Tools, but since I'm sometimes not the last to find out, I thought I'd spend a second speaking to what a great help these tools can be.

SEO is always an interesting topic, mostly because there's so much hearsay and urban myth. Here's an example.

True or False: "Your key word density should not exceed 15%".
True or False: "Participating in link exchanges could actually hurt your site".
True or False: "Parameterized URLs (like those containing query strings) are not crawled by the Google Bot".

There's a HUGE number of these; some of them are easy to find the answer to, while others often get bogged down in opinion stated as truth. In an effort to quell confusion surrounding how SEO with Google works, Google offers Webmaster Central, which tries to answer a lot of these questions. A lot of content in the form of blogs, discussion groups and a collection of SEO tools makes both answering these questions and actually improving the crawl-ability of your site possible.

One of the most interesting tools to me has been the Webmaster Tools pages which allow you to analyze through the eyes of Google what your site looks like. I've often found that one of the most frustrating things about SEO is that it's extremely difficult to get feedback from your efforts. How do you know that the changes you've made are actually helping!?

The Webmaster tools don't solve this in entire, but they do tell you important information like:

  • Were there any errors when Google crawled your website last?
  • What are some key search terms that bring up your site?
  • A breakdown of internal/external and site links.
Given that this is a blog, I'm often curious as to what people actually need to type into a search engine to find this site. As it turns out, right now they need to tickle Google pretty hard for this scarce blog to show up. But if you're curious as to which terms actually bring up this site (at time of writing), you're in luck! The Webmaster Tools actually show you specific search terms that bring up your site when searching with Google (see below).

If you expand the image above you can see specific search terms that bring up this site and the rank that this site will appear in. Pretty cool huh? This is one of many features available for free with the Webmaster Tools. If you have a site that you're even mildly interested in promoting, you should definitely check it out.

I'm also messing around with Google Analytics (I know it's been around forever but I haven't really played with it yet). I'll do another post if it's as impressive as I think it will be.


Tuesday, November 13, 2007

Uninstalling OFFICESERVERS Sql Express Instance from MOSS, WSS etc...

It seems to happen quite often that I install MOSS on some machine, want to remove it, but am left with some slightly unsightly leftovers.

One such leftover is the OFFICESERVERS SQL Express named instance that comes with a 'Basic' install of WSS.
To rid yourself of this garish fallen soldier, do the following.

1) Start->Run->regedit

2) Navigate to hkey_local_machine\software\Microsoft\Windows\CurrentVersion\Uninstall

3) Find the key which has a display name of OFFICESERVERS (a search from this key down is usually a good idea).
4) Once you find the key in question (it's usually a GUID), copy the text out of the 'UninstallString' key and paste it into a Start->Run dialog box. (e.g. Start->Run-> 'MsiExec.exe /I{6DEF11C0-35FF-4160-A543-FDD336C4DAE5}' would be the command garnered from the image below).

5) Let the installer run and when prompted choose to remove the Sql Express instance.

That's it, your box is now slightly cleaner.

Good riddance.

Take care,

Friday, November 2, 2007

RichText WYSIWIG Editor for SharePoint (Safari Friendly)

It's a common problem for anyone trying to roll out SharePoint to a community that has Apple users. SharePoint is only semi functional for Safari users. If you ask "the Soft" himself he'll tell you that almost any browser from OS 9/10 is a "Level 2" browser and will have limited support for interacting with Lists and Administrative context menus.

Probably the most difficult pill to swallow though is that the stock HTML editor barely works at all from Safari. Even if the editor did work well, it's still a pretty terrible user experience, even for users who are running browsers that are supported. Common tasks like uploading documents and images and managing them from a text entry screen are not supported even from IE or Firefox.

Enter the RadEditor Lite. The RadEditor Lite (for SharePoint) is free and adds some hot features to stock SharePoint blogs/wikis, especially for users who want to include a little media with their blogs, wikis, etc...

Here's a Screen Shot of the stock editor.

Trying to add an image to a Wiki with the stock editor:

And this is what it looks like with the RadEditor Lite:

Browsing Images with RadEditor:

Uploading Image with the RadEditor:

You can create folders to store uploaded images, but it's of note that all the content that you create lives under the default Wiki SharePoint list. This probably makes sense for some implementations, but a lot of instances have all their content stored in a central document repository. I went through the .chm and was unable to find any documentation that made it easy to point uploaded content to another list (like a document repository or some other kind of content repository).

Either way it's a step in the right direction and it's always hard to criticize free add-ons too strenuously.

Take it for a spin and let me know what you think.


Thursday, November 1, 2007

Backing up Sql Server 2005 with SMO

We recently decided that we needed a low maintenance backup strategy for our OLTP databases. The thing with working in contracting/consulting is that it seems you're working on a new database every other week. One of the clients we're working with right now has over 16 databases that we work on semi-regularly.

What makes things worse is that it seems we end up creating another one every couple of weeks. Talk about a maintenance headache!

Normally for database backups I would recommend a SQL Server Maintenance Plan, but we wanted to do more than what SQL Server would generate for us via an SSIS package. We also didn't want to be changing the package every week as we drop and create more databases.

We wanted to be able to do the following:
1) Back up ALL non-system databases (except those in an "exclusion list")
2) Zip up all backups that we create (because you can get GREAT compression on .bak files)
3) Delete old backups after a certain time interval; say, after accumulating 10 backups, delete the oldest.
4) Send out summary emails so that we know the jobs are still running.
5) Be able to change these settings w/out knowing any T-SQL, C#, etc...

Because of this semi-tall order we actually ended up writing some C#. At first I couldn't help but groan; I felt we were overly complicating the issue and creating an unnecessary maintenance headache (who wants to maintain more code?).

After I spent some time on the Code Project though I felt a little better. SMO (SQL Server Management Objects) provides a WONDERFUL API in terms of manipulating SQL Server. This rich interface provides everything you might ever want to do to your data store through the Management Studio.

Some of the code ended up looking a little like this:

// Assumes: using Microsoft.SqlServer.Management.Smo;
// `server` is a connected SMO Server, `database` is one of its Databases,
// and `backupPath` is the full path of the .bak file to write.
Backup backup = new Backup();
backup.Database = database.Name;
backup.Action = BackupActionType.Database;      // full database backup
backup.Devices.Add(new BackupDeviceItem(backupPath, DeviceType.File));
backup.Initialize = true;                       // overwrite any existing backup sets on the device
backup.Checksum = true;
backup.ContinueAfterError = false;
backup.Incremental = false;                     // full, not differential
backup.LogTruncation = BackupTruncateLogType.Truncate;
backup.PercentComplete += new PercentCompleteEventHandler(backup_PercentComplete);
backup.Complete += new ServerMessageEventHandler(backup_Complete);
backup.SqlBackup(server);                       // actually run the backup

It's ridiculously simple. Combined with the popular Zip library SharpZipLib we were able to do pretty much everything we wanted to our databases. As of right now our deployed utility runs as a SqlAgent job and backs up all databases (not in an exclusion list) before zipping them up. It then deletes the Nth oldest backup so that we don't have to clean up (we get a say, a 6 week revolving set).
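The driver loop around that backup code is equally terse. Here's a sketch (the exclusion list, server name and backup folder are all made up for illustration; the zip/prune steps are elided):

```csharp
using System;
using System.IO;
using Microsoft.SqlServer.Management.Smo;

Server server = new Server("localhost");
string[] exclusions = { "SomeScratchDb", "SomeVendorDb" };

foreach (Database database in server.Databases)
{
    // Skip system databases (master, model, msdb, tempdb) and anything excluded.
    if (database.IsSystemObject || Array.IndexOf(exclusions, database.Name) >= 0)
        continue;

    string backupPath = Path.Combine(@"E:\Backups", database.Name + ".bak");
    // ... run the Backup shown above, zip the .bak, prune the oldest archives ...
}
```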

One last thing: so that we could back up new databases without having to add any grants or new logins, we created a single login DBBackupOperator and put it in the server role 'sysadmin'. I know what you're thinking... "Tyler that's bloody CRAZY!", especially if you're gonna log in from some 3rd party app.

We decided that to save ourselves from living too dangerously we would hash at runtime the password that we store in the app.config. That is, the password we store in configuration files is NOT the password used to connect to the data store. Instead we hash the crap out of it and then use that shared secret as the password to connect.
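The idea can be sketched in a few lines. This is an illustration, not our production code: the config key name and the embedded salt are made up, and SHA-256 stands in for whatever "hash the crap out of it" means in your shop:

```csharp
using System;
using System.Configuration;
using System.Security.Cryptography;
using System.Text;

static string DeriveDbPassword()
{
    // The value stored in app.config is NOT the real password...
    string stored = ConfigurationManager.AppSettings["DBBackupOperatorKey"];

    // ...the real password is a hash of it plus a salt compiled into the app,
    // so a leaked config file alone isn't enough to log in.
    using (SHA256 sha = SHA256.Create())
    {
        byte[] digest = sha.ComputeHash(Encoding.UTF8.GetBytes(stored + "some-embedded-salt"));
        return Convert.ToBase64String(digest);
    }
}
```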

Well that's it, I strongly encourage you to check out SMO the next time you're playing with SQL Server 2005.


Wednesday, October 24, 2007

Reporting Services: An internal error occurred on the report server. See the error log for more details.

We recently migrated reporting services from one machine to another. After performing a fairly textbook migration (we migrated the databases and encryption keys, applied service packs, etc...), everything SEEMED to be working. But we were getting plagued with one intermittent error that was extremely weird to troubleshoot.

When opening up a report (or drilling in, paging etc...) we would intermittently get a:

An internal error occurred on the report server. See the error log for more details.

Nothing would show up in the Event Log.

If you dig into the reporting services log files (C:\Program Files\Microsoft SQL Server\MSSQL.3\Reporting Services\LogFiles...) you eventually find some text that looks like this:

w3wp!processing!1!10/24/2007-15:17:56:: a ASSERT: Assertion failed! Call stack: Microsoft.ReportingServices.ReportProcessing.ReportProcessing.UpdateTimingInformation(RunningJobContext jobContext, Int64 totalDurationMs, Int64 reportDataRetrievalDurationMs, Int64 reportRenderingDurationMs) Microsoft.ReportingServices.ReportProcessing.ReportProcessing.RenderSnapshot(IRenderingExtension renderer, CreateReportChunk createChunkCallback, RenderingContext rc, GetResource getResourceCallback) Microsoft.ReportingServices.ReportProcessing.ReportProcessing.RenderSnapshot(CreateReportChunk createChunkCallback, RenderingContext rc, GetResource getResourceCallback) Microsoft.ReportingServices.Library.RenderSnapshotAction.Render()

So what happened? The old machine was running on Intel hardware; the new one was running on AMD. Specifically an AMD x64 proc that boasts AMD's Cool'n'Quiet technology. Essentially this messes with software like Reporting Services that tries to get at the system time through the QueryPerformanceCounter() function.

Thanks to this KB article and this post we got it resolved (read 3 hours later)!

God bless the internet.

Sunday, September 30, 2007

SPWeb.Groups vs. SPWeb.SiteGroups

Here's a weird one that maybe someone will help me out with some day (you know, once people actually start subscribing to/reading this blog). I'm working with the SharePoint object model and I want to get at all the groups within a given site. So we write some code that looks a little like this:

using (SPSite topLevelSite = new SPSite(ConfigurationManager.AppSettings["ForumSharepointInstanceUrl"]))
using (SPWeb rootWeb = topLevelSite.OpenWeb())
{
    foreach (SPGroup group in rootWeb.Groups)
    {
        // go do something clever...
    }
}

If you've run code that's similar and you've found that certain groups are missing from the SPWeb.Groups collection, don't feel like you're alone. As it turns out (at least from what I can discern) SPWeb.Groups returns only the groups within the web that have at least one permission set. SPWeb.SiteGroups however will return ALL groups. I have no idea why. I also couldn't find anything of use about the topic on MSDN. I've also found blog entries from people who couldn't find specific SPUser objects when digging in the SPWeb.Users vs SPWeb.SiteUsers collections. Their complaint was similar: the SPWeb.Users collection would only contain users who had already logged into the site.

Summing it up: Use SPWeb.SiteGroups, SPWeb.SiteUsers etc... and ask someone who has some real SharePoint chops.

Still Confused,

Saturday, September 29, 2007

AllowUnsafeUpdates, a Sharepoint Object Model gotcha

The other day I was trying to programmatically add a group to SharePoint via the object model and a confusing exception kept getting thrown. This same exception can happen whenever you try to add an SPUser, SPWeb, SPGroup etc... to an existing collection via the SharePoint object model.

I would run some code that looked a little like:

SPMember owner = rootWeb.Users["SomeUserName"];
SPUser user = rootWeb.Users["SomeUserName"];
rootWeb.SiteGroups.Add(groupName, user, null, description);

And the following exception would get thrown:

System.Runtime.InteropServices.COMException: The security validation for this page is invalid. Click Back in your Web browser, refresh the page, and try your operation again.

After digging around I discovered that you need to set SPWeb.AllowUnsafeUpdates = true. I have yet to find out the full implications of this setting, although the MSDN states that:

"Setting this property to true opens security risks, potentially introducing cross-site scripting vulnerabilities."
The good news is that after you call Dispose on your SPWeb object and get a new one the setting is reset to false.
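Putting it together, the fix looks something like this (a sketch built from the snippet above; groupName and description are placeholders from that example):

```csharp
// The security validation only trips on writes, so enable unsafe
// updates just before mutating the collection.
rootWeb.AllowUnsafeUpdates = true;

SPUser user = rootWeb.Users["SomeUserName"];
rootWeb.SiteGroups.Add(groupName, user, null, description);

// Optional: Dispose()ing the SPWeb resets the flag to false anyway,
// but flipping it back explicitly keeps the unsafe window small.
rootWeb.AllowUnsafeUpdates = false;
```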

Friday, September 21, 2007

Ajax in MOSS via the Update Panel

I'm currently doing a lot of ASP.NET development inside MOSS 2007. On this current project we've been doing most of our RAD outside of SharePoint and then porting our ASP.NET application inside a MOSS instance for integration. Needless to say, like so many developers working with MOSS, I've had an interesting time troubleshooting a litany of errors that seem to come up working inside this new framework.

Today's save comes from Mike Amerlaan, who's figured out a fairly critical hint that's needed to get Update Panels to work inside a MOSS site.

Prior to discovering Mike's fix I would be able to post back (inside an update panel) ONCE, after which all postbacks on the page became non-functional.

Essentially this is because WSS emits submit-wrapping JavaScript that tampers with how your page submits. It does this to ensure that URLs that involve double-byte (read Unicode) characters post back appropriately.

To get your update panels working you're going to have to suppress this wrapper. You can disable it by emitting the following JavaScript:

_spOriginalFormAction = document.forms[0].action; _spSuppressFormOnSubmitWrapper = true;

You can of course do this by using either the ScriptManager or the Page.ClientScript.RegisterStartupScript. I'm pretty sure you'll be able to figure the rest out.
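For example, via Page.ClientScript (a sketch of one way to wire it up; exactly where you register the script is up to you, as long as it runs on every load):

```csharp
// Inside your page or web part: undo WSS's form onsubmit wrapper so
// ASP.NET AJAX partial postbacks keep working after the first one.
protected override void OnLoad(EventArgs e)
{
    base.OnLoad(e);
    const string script =
        "_spOriginalFormAction = document.forms[0].action;" +
        "_spSuppressFormOnSubmitWrapper = true;";
    Page.ClientScript.RegisterStartupScript(this.GetType(), "SuppressSubmitWrapper", script, true);
}
```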

Besides that, you pretty much do what you would expect to do to get UpdatePanels working inside MOSS.

  1. Install the Ajax extensions on the host machine
  2. Migrate the web.config settings that tee up all the Ajax controls/HTTP handlers/HTTP modules
  3. Add a ScriptManager tag to each page (or master page) that you want to use it in.
  4. Add your UpdatePanels around some postback event.
And that's it! Voila, you're done. Now your MOSS site feels a little bit less like a stock MOSS site.

Happy Coding,