Monday, August 18, 2008

Free HTTP Headers From SharePoint

Code You Don't Have to Write

It's always a pleasant bonus when you notice that software is behaving "as it should". The other day I was running Tamper Data on a SharePoint site and I noticed that all the images being pulled from Lists had ETag HTTP headers on them! Sweet. Furthering my delight was the fact that Firefox was sending If-Modified-Since and If-None-Match headers and making use of its cache, saving a trip to the content database.

If you're not familiar with these headers you may want to check out this post on cache related http headers. They can come in incredibly handy in the CMS world or anytime you're persisting a lot of assets (images, stylesheets, media, etc...) to a database.

The original response to the request had the ETag and looked like this:

SharePoint handing out an image from a Library and adding appropriate ETag and Last-Modified HTTP headers.

When the browser next visited the page and tried to pull the image again, it correctly sent If-Modified-Since and If-None-Match headers:

Firefox sending If-Modified-Since and If-None-Match headers to SharePoint.

And finally SharePoint read the headers appropriately and responded with a 304 - NOT MODIFIED, telling the browser to dig the image out of its cache since it was still current. The image was not pulled from the content database.

SharePoint returning a proper 304 - NOT MODIFIED header.
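The conditional GET logic in that exchange is simple enough to sketch. Here's a minimal, language-agnostic version in Python; the function names and shapes are mine (SharePoint's actual handler is .NET code), and an MD5 hash is just one way to derive a stable ETag:

```python
import hashlib


def make_etag(content):
    # A strong ETag derived from the content; any stable fingerprint works.
    return '"%s"' % hashlib.md5(content).hexdigest()


def handle_image_request(request_headers, content, last_modified):
    """Decide between a full 200 response and a cheap 304.

    request_headers: dict of incoming HTTP headers.
    Returns (status, response_headers, body).
    """
    etag = make_etag(content)
    response_headers = {"ETag": etag, "Last-Modified": last_modified}

    # If the browser's cached copy is still current, skip the body entirely;
    # in SharePoint's case that also skips the trip to the content database.
    if (request_headers.get("If-None-Match") == etag or
            request_headers.get("If-Modified-Since") == last_modified):
        return (304, response_headers, b"")

    return (200, response_headers, content)
```

The first request gets a 200 with an ETag; the revisit sends If-None-Match and gets a bodiless 304 back.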

Who Cares?

You might wonder if any of this really matters. It's just an ASP.NET image/resource handler making good use of headers; that's how it's supposed to work, right?

Sure, but I see bad image handlers all the time. In fact, when was the last time you saw an image handler written by a developer that produced ETags and inspected incoming headers to try to make use of the browser cache? Trust me, it's exceedingly rare. Most image handlers are unnecessarily database heavy and make little to no use of the browser cache, never mind giving any thought to Content-Type and Content-Disposition. I'm quite happy that I don't need to stress out about all the images and documents that inevitably get stored in SharePoint libraries.

The Point

I think it's worth mentioning that these headers exist, and that even if you haven't taken the time to enable caching on your SharePoint instance you're already taking advantage of the best form of cache there is: the kind that's both free and close to the client (the browser). All you had to do was throw the content in a SharePoint Library.

My Best,
Tyler Holmes

Friday, August 15, 2008

Locking Down MOSS Application Pages For Anonymous Users

This Doesn't Look Quite Right

If you do many MOSS Publishing Sites for anonymous audiences you may have come across the interesting artifact of Forms pages being visible to anonymous users. For example, on a site that allows anonymous access to the Entire Site, anonymous users will be able to navigate (and may get redirected) to URLs like http://domain/Pages/Forms/AllItems.aspx or http://domain/Documents/Forms/AllItems.aspx. AllItems.aspx could really be any view on the list.

The problem is that this leads to a non-branded experience for the user. One minute they're browsing your sharp looking Master Page and the next they're seeing a really ugly version of your site (also known as stock SharePoint).

There's actually an out-of-the-box fix for this that ships with MOSS. It's called the ViewFormPagesLockdown feature and it's already installed; it just needs to be activated. To activate the feature you use the STSADM utility like below:

stsadm.exe -o activatefeature -url [Site Collection URL] -filename ViewFormPagesLockdown\feature.xml

Should you want to deactivate it you can of course run:

stsadm.exe -o deactivatefeature -url [Site Collection URL] -filename ViewFormPagesLockdown\feature.xml

What Does ViewFormPagesLockdown Actually Do?

There's no easy way to simply hide AllItems.aspx (or similar views) from users if you're running anonymous access on your site. These users run under the Limited Access privilege set, a default set of permissions that you can't change through the UI, which is why we have this Lockdown feature to assist you.

When you activate this feature you change the permissions of the Limited Access privilege set, removing the following permissions: View Application Pages (a List permission) and Use Remote Interfaces (a Site permission). Here's what that privilege set looks like before and after running the lockdown feature. It's from the following MS article.


Limited access — default:

  • List permissions: View Application Pages
  • Site permissions: Browse User Information
  • Site permissions: Use Remote Interfaces
  • Site permissions: Use Client Integration Features
  • Site permissions: Open

Limited access — lockdown mode:

  • Site permissions: Browse User Information
  • Site permissions: Use Client Integration Features
  • Site permissions: Open

What's the result?

Because we've effectively removed the ability to see Application Pages from everyone using Limited Access (i.e. anonymous users), when users end up visiting one of these pages they'll get challenged for a better credential. The result is that they'll either see an NTLM login box or get redirected to a login page (if you're using forms authentication).

Can we do anything else?

Some stakeholders don't really like NTLM popups or login pages on a site that's supposed to be anonymous in the first place. These prompts of course only happen because we just stripped away access to content they'd normally be able to see...but life's far from rational.

Another approach would be to write an HttpModule and intercept requests to Application Pages using a regular expression. At that point you can redirect users to a friendly branded page or simply send them back to the sub site root and let the Welcome Page take over. I'll post such code in a future post.
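The idea sketches easily. The real thing would be a C# HttpModule hooked into the ASP.NET request pipeline, but here's the gist in Python; the regular expression and the redirect target are my own illustrative assumptions, not rules SharePoint dictates:

```python
import re

# Pages under /_layouts or a list's /Forms folder render as "stock SharePoint".
# This pattern is an illustrative assumption, not SharePoint's actual routing.
APPLICATION_PAGE_PATTERN = re.compile(r"/(_layouts|forms)/", re.IGNORECASE)


def intercept(path, is_anonymous, site_root="/"):
    """Return a redirect URL for anonymous requests to application pages,
    or None to let the request through untouched."""
    if is_anonymous and APPLICATION_PAGE_PATTERN.search(path):
        # Send them back to the sub site root; let the Welcome Page take over.
        return site_root
    return None
```

Authenticated users (and any non-application page) pass straight through; only anonymous hits on the matching pages get bounced somewhere branded.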

If you have any other thoughts I'd love to hear them.

My Best,
Tyler Holmes

Wednesday, August 13, 2008

Who Uses SharePoint Anyways?

Anyone Out There?

If you do any SharePoint development there's probably been a moment or two where you've felt like the midnight security guard. You're not well armed, you have very little backup, and no one cares about what you do.

While SharePoint development may currently be thankless, it IS gaining popularity. Not only do slews of intranet deployments keep popping up each week, but MANY Internet-facing sites are now the work of MOSS Publishing Sites. Consider the following well-branded web sites that have come out of the closet with their MOSS implementations.

The list goes on and on. In fact, there's a SharePoint Site with a list dedicated to listing other sites that are running WSS/MOSS (I warn you, though, the site itself is not branded and is about as ugly as SharePoint gets...).

So take heart! Know that you're not alone and that WSS/MOSS implementations are sprouting up like weeds (like MOSS, one might say :-). With every passing month there are more developers to lean on, the community gets bigger, and the safety net grows.

If you know of a good looking SharePoint site post a link in the comments! MOSS is growing everywhere. Hopefully when it gets to your shop you and your dev team are ready.

My Best,
Tyler Holmes

Monday, August 11, 2008

Quirks Mode Strikes Again

This Stuff Never Goes Away

The other day I was helping out a friend with a JavaScript driven control that animated a panel and did some show/hide behavior. The control itself was great: it painted a wonderful user interface and had very few dependencies (just a .js and a .css file).

When it came time to deploy, however, the panel wasn't rendering correctly. We started doing diffs of the .css and .js in production and they were identical to the styles and JavaScript used in development and stage, so why the difference?

Two Words

Quirks Mode.

Way back in the midst of the browser wars, Netscape and Microsoft gave birth to arguably their worst browsers of all time (NN4 and IE4). Not in the sense that these companies had never done worse, but in the sense that they had never strayed so far from a W3C standard. As a result, web developers started doing some pretty funky stuff with CSS to get documents to render in these browsers.

When standards compliance became more important, browser manufacturers had some interesting choices to make. If they moved closer to the W3C standards they risked breaking a lot of sites, but if they kept going down the track they were on, authoring a web site that leveraged any of the growing CSS features would become increasingly frustrating (and it was already pretty bad).

The solution was twofold.

  1. Allow web developers to specify a document type, which would dictate the rules used to render their HTML document. An example of a document type would be:
    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
  2. If there's no document type available, or if the HTML that's served breaks the declared document type, then render in the old way (quirks mode).

Because of the above, quirks mode is slightly different in most browsers, mostly because there's no spec for quirks mode. There's a whole web site dedicated to Quirks Mode. You can also get a fuller story on Quirks Mode history here.

Back To Our Story

The short version of our story is that our web application (SharePoint in this case) was handing out HTML that broke the declared doc type, and because of this the page was rendering in quirks mode.

One of the worst things about Quirks Mode is that you can't always see it coming. If you install the Web Developer extension, Firefox will tell you when a document is rendering in Quirks Mode, but IE is a little harder to catch.

Below is what Firefox will do to alert you to a document rendering in Quirks Mode (if you have the Web Developer extension):

HTML document rendering in Quirks Mode.

I've taken to putting the following JavaScript in a bookmark to check any given page in FireFox/IE.

javascript:var mode=document.compatMode,output;if(mode){if(mode=='BackCompat')output='quirks';else if(mode=='CSS1Compat')output='Standards';}else output='an unknown';alert('The document is being rendered in '+output+' mode.');

The above relies on document.compatMode, a proprietary JavaScript property found in some but not all browsers. Safari, for instance, doesn't have this property.

So How do You Fix It?

When it comes to Quirks Mode there are really only two things you can do to get back into Standards Compliance Mode.

  1. Ensure that there's a doc type at the beginning of the document and that you're familiar with the rules of that doc type.
  2. Ensure that the markup you're building up is compliant with the declared doc type.

When it comes time to troubleshoot and find the delinquent HTML, you can try SharePoint Designer or Visual Studio, both of which will inspect a document and throw warnings when they find tags that break a document type you choose. That being said, I've yet to find a silver bullet for these issues. It's most frustrating when the markup comes from a control or a page section that you don't even have control over!

Well at least we know WHY the document is rendering funny. Maybe that'll help someone sleep at night.

Hope That Helps,

Saturday, August 9, 2008

Terminal Services and Application/Administrative Mode

Something's Broken

The other day I was in the midst of troubleshooting and needed to Remote Desktop into a machine. I ended up getting this angry message:

To log onto this remote computer, you must be granted the Allow log on through Terminal Services right. By default, members of the Remote Desktop Users group have this right. If you are not a member of the Remote Desktop Users group or another group that has this right, or if the Remote Desktop User group does not have this right, you must be granted this right manually.

This was a little odd as I was actually an Administrator on the box. I checked the rights on the machine and sure enough had the privileges I needed to get in, so what now?

Punch Line

I'll jump right to the problem. As it turns out, someone had put Terminal Services into Application mode but hadn't installed any Client Access Licenses. After you do this you get a grace period (120 days on Windows Server 2003) before Terminal Services locks down all connections (except the console connection).

Terminal Services

When you first install Windows 2003, Terminal Services is installed in Remote Administration mode. This mode allows 2 concurrent connections from anyone with sufficient credentials. You can opt to install the "Terminal Server" component in Add/Remove Windows Components, which effectively puts Terminal Services into Application mode. This allows a large number of concurrent connections so long as each user has a Client Access License (Per User) or each connecting device is licensed (Per Device).

Should you put Terminal Services in Application mode and opt not to install any CALs, you have a finite amount of time before Terminal Services locks down and no one is able to get in (besides the console login).

How to Discern the Terminal Services Licensing Mode (Windows 2003)

To check whether Terminal Services is in Administration mode or Application mode, perform the following steps on the machine.

  1. Open the Control Panel. Open up Administrative Tools.
  2. Open up Terminal Services Configuration.
  3. Click on the Server Settings folder and look at the Licensing setting.
  4. If the Licensing key says Remote Desktop for Administration, you're in Administration mode. If it says Per User or Per Device, you're in Application mode.

Terminal Services in Administration mode.

To change the mode, simply double click the Licensing key. There'll be a link to Add/Remove Windows Components where you can install the Terminal Server component (putting Terminal Services into Application mode) or remove it (putting Terminal Services back into Administration mode).

HTH Someone,

Thursday, August 7, 2008

Moving Commerce Server Databases/Websites

Normally I'd RTFM...but where's the manual?

Let me preface this by saying I'm not a pro when it comes to Commerce Server. In fact, I'm writing this because I couldn't find any non-MSDN documentation to speak of when I was trying to accomplish this task. And when it comes to task-based directions, the MSDN is a little lacking.

The task we're talking about here is migrating Commerce Server Applications (or migrating Commerce Server itself) to another machine/farm. Before I hop off the soap box, let me just say that at the time of writing (5 months away from 2009) there has yet to be a SINGLE (note the screaming caps) Commerce Server 2007 book published. That's pretty weak. I can forgive most publishers for not authoring any content; I mean, hey, Commerce Server 2007 is a pretty niche product, right? But how about a little support from Microsoft Press? I'm normally a sizable proponent of MS's work, and if you read this blog you're probably aware that I spend a lot of time in the MS tech stack, but I can't help but think I should get more support when it comes time to pick up this product. Working in a software services firm, I'm constantly trying to get deep in new technologies quickly, and existing literature has always been a huge help. I guess you could say that when it comes to this "new" release of Commerce Server 2007, I sure do wish some of the MVPs or MS staff would step up and author something. All that's available right now are a series of books on Commerce Server 2002 and some pretty non-technical help files.

Making The Move

Before we go into the step by step and the tools, we should first speak to why moving a Commerce Server Application would be a headache (should you choose to do it by hand). As far as dependencies go, the application is likely to have:

  • A series of folders which contain application code (normally in a web site).
  • A series of databases (Admin, marketing, marketing_lists, productcatalog, profiles, transactionconfig, transactions), the last 6 of which are per site.
  • Machine-specific entries in the Admin database that list out all the machines participating in the farm.
  • A configured Csapp.ini file.
  • A series of IIS settings.
  • A bunch of NTFS Permissions.
  • Any other dependencies.

Site Packager

Luckily there's a tool to help you make the move. Commerce Server makes use of the Site Packager which generates .pup files and helps you migrate these configuration and application settings across machines/farms. In fact the Site Packager plays a pretty pivotal role when it comes to doing any kind of Commerce Server 2007 setup. Some of the common tasks you'll use it for include:

  • Setting up Commerce Server Applications.
  • Adding Sites.
  • Adding Applications to Sites.
  • Adding Web Servers to Applications.

You can normally find the Site Packager under Start->Programs->Microsoft Commerce Server 2007->Tools->Site Packager.

Pack It Up

To pack up a site you run the Site Packager. The wizard is pretty self explanatory: you essentially choose the site you want to pack and the file you want to pack it as (choose a .pup extension), then hit Next a couple of times. The only odd part is that you'll be asked to specify .sql files for the Profile database schema and data; you're apparently supposed to generate these on your own using SQL Server Management Studio.

A more detailed walkthrough of the site packaging task can be found here. The below is from the MSDN.

In addition to site resources, applications, and the Internet Information Services (IIS) settings that are required to re-create the Web server configurations, Commerce Server Site Packager also includes the property values from the Administration database. For example, when you package the Profiles resource, all the current property values are also packaged. When you unpack the Profiles resource, the property values are unpacked into the Administration database.

The package includes global resource pointers, but not the global resources themselves. The package recognizes the connection strings that are in the application, but does not contain the actual data in the connection string.


During the application packaging process, Site Packager searches the IIS metabase on your local computer and finds the physical directory that is the root for that application. It then starts at that root directory and packages all the subdirectories in the following section into a new file that has a .pup file name extension. Site Packager preserves certain settings in IIS, such as authorizations and access permissions.

More details on how the site packager works and what it actually packages up (and what it doesn't) can be found here.

Below is a picture of the Site Packager packing up a site.

Commerce Server Site Packager being used to Move/Migrate a Commerce Server 2007 Application.

Unpack It

Now that you have this .pup file you can move it to the destination server, and as long as that server has Commerce Server 2007 installed you can unpack it. Should you choose a Custom Unpack, you'll be able to choose whether to reuse existing IIS web sites, which databases to use (should there be multiple), and some more options.

A more detailed walkthrough of the site unpacking task can be found here.

Good Luck

This will get you 90% of the way there. There will likely be other dependencies you may need to fish out of the GAC or some other place, but ideally a lot of the Commerce Server noise will go away. A good way to test your migration is to target your Commerce Server "site" with the Business User Applications and ensure that some data made it over the wire.

My Absolute Best,

Wednesday, August 6, 2008

An Unhelpful STSADM -o Restore Error

Uh, What Version Should It Be?

The other day I was helping a gentleman from IT troubleshoot this particularly odd error:

Your backup is from a different version of Windows SharePoint Services and cannot be restored to a server running the current version. The backup file should be restored to a server with version '1178817357.0.120615.0' or later.

I thought this was a little weird, since the only similar error I'd seen occurs when you try to restore a backup from a newer version of SharePoint (say 12.0.6219 [SP1]) to an older version (say 12.0.4518 [RTM]). Even then, though, the error gives you a real version of SharePoint; I can't say I've ever heard of version 1178817357.0.120615.0 (although the last 7 digits seem suspicious).

It's also of note that the source server was running 12.0.4518 and the destination server was running 12.0.6219, so there wasn't an incompatibility between SharePoint versions.

As it turns out we just weren't using the commands correctly. The IT Worker who had taken the backup (of a sub site) had run an STSADM -o export:

stsadm -o export -url http://RootSite/SubSite -filename c:\path

The IT Worker (another person) who was trying to restore the sub site was trying to use the STSADM -o restore:

stsadm -o restore -url http://RootSite/SubSite -filename c:\path

These two commands of course don't go together, which is why we were getting the error. Using stsadm -o import migrated the site as expected. It's also good practice to have a sub site already created at the destination URL before you import. It also occurred to me that, given a .bak file, it's kind of hard to tell whether it was created using an stsadm export or a backup.

A quick reminder:

STSADM -o backup/restore are for site collections.

STSADM -o export/import are for sub sites.

Also of note: when running stsadm -o export I'd encourage the use of the -nofilecompression flag. This creates a series of files (as opposed to just one) in the directory you specify with the -filename flag, but runs as much as 60% faster in my experience. The resulting backup is of course larger than it would be without the flag.