Archive for November 2007


Site Columns and Look Up Lists

Okay, people, this is cool.

You all know I'm an old Lotus programmer and we had an @DBLookup function that could go anywhere to grab a list of values to populate a lookup list.  It was way cooler than SharePoint's alleged "Lookup" value type.  But, with site columns, we're getting a little closer.  If you've ever wanted to see a quick and easy solution using site columns, this is it.

What I have is a number of sites, each with a directory that contains names, addresses, and phone numbers, and each item is assigned to one or more categories.  For example, I have categories for "Family", "Acquaintances", "Neighbors", and "Colleagues".

And, since we all have our own site where we keep our own contact lists, I'd like to be able to maintain one lookup list for everybody's contact list categories.

First, I create a list in my home site.  I call it Contact Categories and I add my values as noted above.

Then, at the home level, I click Site Actions | Site Settings | Site Columns and then Create.  I name my column Contact Categories and I make it a Lookup datatype.  I put it in a new group called Home Lookups.  Then I tell it to look up from my Contact Categories list and use the Title.

Now, with the column in the bag, I go to my team site where my personal contact list is stored and I open my contact list and select Settings | List Settings and click on the link at the bottom of the Columns list that says Add from existing site columns.

I select my Home Lookups group and add my Contact Categories column.

Now, when I create a new contact, the edit item form reaches all the way out to my home site and provides values from the category list there.

This will obviously allow you to provide a central location for lookup lists that can now be managed separately from the lists that use the values.




Print My Christmas Card Envelopes

Here's one for all you hot shot SharePoint guys.

You know what a curmudgeon I am, but you may not know that my personal address book still lives in Lotus Notes.  Using the Domino Web server, I can print a completely blank page with just my return address and my recipient's address in a nice format suitable for various envelope sizes.  I print about 100 envelopes on my laser jet in about 20 minutes.  I sign the cards by hand and they all say "Merry Christmas" as opposed to "Happy Holidays."

Does anyone have any idea how this would work "out of the box" with a SharePoint contact list?  This is really the last gate to the ultimate sunset of my Lotus/IBM technology.  For example, is there a "mail merge" in Word that will grab data from such a list?

Thanks and if you want to get on the list, let me know!



Finding the KPI List

I spent a little while looking for this, so maybe you have as well.

The KPI toolset is really popular with the Executive Suite crowd and may well make the difference between go and no-go on your next requisition, so I'd like to be able to get some working and visible even if they are less than robust.

But you need the KPI List and he's a hard guy to track down.  Not only do you have to Enable Enterprise Features in Central Admin Operations, but you also have to Enable Features on Existing Sites.  Then you have to activate the Enterprise features on the site where you want to create the KPI List.

Now, the KPI List is found under Custom Lists on the site's Create page.  If you don't see it there, you've got to work the Enterprise Features like I just described.

So, you'll need a source.  Now, sources can be anything, so I created a Custom List called Sales Call Activity and I gave it these columns:

  • Rep. Name – changed the Title column name to Rep. Name because it wouldn't let me just call it Name.
  • Call Quota
  • Actual Calls
  • Closed Sales
  • Total Sales Amount

Then I added calculated columns to find:

  • Sales Quota 
  • Average Sale
  • Closing Ratio

Then add some data and go back to your KPI List.  Pull down the New menu and select KPI from a SharePoint List.

Do the easy one first.  Name it Closing Ratio.  Provide a description and comments, perhaps describing the indicators' values.

Browse out to find your source data list and view.

Then you've got to work the Number, Percentage, or Calculation options to get the Red, Amber, or Green colors you want.  In this case, select the "Percentage… where" option and specify that the Closing Ratio column is less than some target, say .5, since our closing ratio above was calculated as a percentage.  Then, make it return green when that value is 0 (all reps' closing ratios > 50%) and yellow at 50 (half the reps' closing ratios > 50%).
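That "Percentage… where" rule reads backwards the first time, so here's how I understand it, sketched in Python.  The thresholds match the example above; the function name and the sample data are made up:

```python
# Sketch of the "Percentage ... where" KPI rule described above.
# Green when 0% of reps are below target, yellow up to 50%, red beyond.

def kpi_status(closing_ratios, target=0.5, green_at=0, yellow_at=50):
    """Return the indicator color based on the percentage of reps
    whose closing ratio falls BELOW the target."""
    below = sum(1 for r in closing_ratios if r < target)
    pct_below = 100.0 * below / len(closing_ratios)
    if pct_below <= green_at:      # nobody under 50% -> green
        return "Green"
    if pct_below <= yellow_at:     # up to half under 50% -> yellow
        return "Yellow"
    return "Red"

print(kpi_status([0.6, 0.7, 0.55]))      # all above target -> Green
print(kpi_status([0.6, 0.3, 0.7, 0.4]))  # half below -> Yellow
print(kpi_status([0.2, 0.3, 0.1, 0.6]))  # most below -> Red
```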

Then, I did one I called "Average Sale Targets."  I calculated the average of the average sale values and showed green when that value was over $1,100 and yellow when over $900.  I guess the math is not entirely accurate here because, if one person was way over by a lot and others were under by a tiny bit, the big numbers would be discounted by their lack of dispersal.  Average Average Sale does not equal Team Average Sale.  It would be nice to be able to capture view column totals and such.  In reality, you would probably point your KPI to a spreadsheet stored in some scorekeeper's doc library and let Excel do all the math.
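To see why Average Average Sale isn't Team Average Sale, take two made-up reps, one big-ticket and one high-volume:

```python
# Made-up numbers showing average-of-averages != team average.
reps = [
    {"closed": 2,  "total": 10000.0},   # big-ticket: average sale $5,000
    {"closed": 20, "total": 20000.0},   # high-volume: average sale $1,000
]

# Each rep's average counts equally, regardless of how many sales it covers
avg_of_avgs = sum(r["total"] / r["closed"] for r in reps) / len(reps)

# The team average weights every individual sale equally
team_avg = sum(r["total"] for r in reps) / sum(r["closed"] for r in reps)

print(avg_of_avgs)  # 3000.0
print(team_avg)     # ~1363.64
```

Two reps, same data, and the two "averages" are more than a factor of two apart; that's the dispersal problem in a nutshell.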

I've found that it will help to succumb to a little "view diarrhea," filtering views to expose a single Sales Rep, for example.  Then you can get a KPI to indicate which reps are meeting which quotas.  When you use the KPI Web Part, you can "Show Only Problems" and get the naughty list for Santa.

Also, create views that sort descending on Average Sale, for example, and then limit the item count to 1 to expose, in a web part, the best, or worst, performers.



More on WSS (MOSS) Backup and Recovery

So we are able to create our backup images like we talked about last time.

A few issues remain before our approach can be called "complete." 

  1. What's up with all those files?
  2. What to do with the back-up files as they grow in size and number
  3. How to do a restore.
  4. What to do when a restore fails. 
  5. What about configuration and Central Admin content?
  6. What about your DBAs?
  7. Migrating to a Different Farm.
  8. Gotchas

So, using my best Google searches, I again find that our friends at TechTarget offer some pretty good content.  So let's take these in order:

1.  What's up with all those files?

Remember that, when we ran the backup, we specified a backup folder.  Remember, also, that it was a fully qualified UNC path for which I had created a share called \\MyServer\Backups, and it was this share that needed suitable permissions for the various actors involved.

So in this folder, we'll find a file called spbrtoc.xml and a bunch of folders named using the format spbrxxxx.  The xxxx is the index number of each specific backup operation.
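If you ever need to script against that layout, the newest backup is just the spbrxxxx folder with the highest index.  A quick sketch in Python; my assumptions here are that the xxxx index is hexadecimal and that folder names alone are good enough (spbrtoc.xml is the authoritative table of contents), and the demo uses a throwaway directory standing in for the real share:

```python
import os
import re
import tempfile

def latest_backup_folder(backup_root):
    """Return the spbrxxxx folder with the highest (hex) index, or None."""
    pattern = re.compile(r"^spbr([0-9a-fA-F]{4})$")
    best = None
    for name in os.listdir(backup_root):
        m = pattern.match(name)
        if m and os.path.isdir(os.path.join(backup_root, name)):
            idx = int(m.group(1), 16)
            if best is None or idx > best[0]:
                best = (idx, name)
    return best[1] if best else None

# Demo against a throwaway directory standing in for \\MyServer\Backups
root = tempfile.mkdtemp()
for n in ("spbr0000", "spbr0001", "spbr000a"):
    os.mkdir(os.path.join(root, n))
print(latest_backup_folder(root))  # spbr000a
```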

Inside the backup folder, you'll find some folders with GUIDs for names with a Projects folder, a Config folder, and a RegistryBlob.reg file; let's just call these "mysteries."

In addition, you'll find an spbackup.log and an spbackup.xml file.  The backup log will contain the errors encountered if you're not lucky enough to have the job complete without errors.  The XML file includes metadata details about the backup operation, including the top-level component ID and warning and error counts.

Once you run a restore using a given backup folder, you'll get sprestore.log and sprestore.xml files in the folder as well.

2. What to do with the back-up files as they grow in size and number

The backup folders are the key to your answer here.  SharePoint does not care where you put or copy the folders.  You only need the root folder with the spbrtoc.xml file and all of its component spbrxxxx backup folders.

If I had a maintenance window, I'd do a full backup to a new folder and then intermittent differentials throughout the working days.  Then I'd change folders for the next full backup and diffs.  Then I could slough off full/diff sets whenever I got tired of them taking up my disk space.
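Since SharePoint only cares that each root folder keeps its spbrtoc.xml and its spbrxxxx children together, that housekeeping can be as blunt as deleting the oldest roots.  A sketch in Python; the parent-folder layout and the `prune_backup_sets` name are mine, not anything SharePoint provides:

```python
import os
import shutil
import tempfile

def prune_backup_sets(parent, keep=2):
    """Keep the `keep` newest full+diff set folders under `parent`
    and delete the rest.  Each set folder is one backup root holding
    spbrtoc.xml and its spbrxxxx subfolders, moved as a unit."""
    sets = [os.path.join(parent, d) for d in os.listdir(parent)
            if os.path.isdir(os.path.join(parent, d))]
    sets.sort(key=os.path.getmtime)          # oldest first
    doomed = sets[:-keep] if keep else sets
    for path in doomed:
        shutil.rmtree(path)                  # whole set goes at once
    return [os.path.basename(p) for p in sets[-keep:]]

# Demo with throwaway folders standing in for rotating backup roots
parent = tempfile.mkdtemp()
for i, name in enumerate(["week1", "week2", "week3"]):
    p = os.path.join(parent, name)
    os.mkdir(p)
    os.utime(p, (i, i))                      # force mtime ordering
print(prune_backup_sets(parent, keep=2))     # ['week2', 'week3']
```

The key design point is that deletion happens at the set-folder level, never inside a set, which keeps spbrtoc.xml consistent with the folders around it.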

3. How to do a restore.

Okay, the restore is pretty easy working from Central Admin Operations.  Select a restore directory and check the specific content you'd like to restore and let it fly. 

4. What to do when a restore fails. 

As far as failures are concerned, I presume you'll run into the same permissions difficulties we had with backups.  Once those are resolved, you get the job to start and the page will refresh with updates, but individual segments of the operation may fail with a pointer to the backup log.

In addition, remember that the timer job will have to be deleted manually from the Central Admin Operations Timer Job Definitions page.  Click on the Backup/Restore item and then click Delete.  Otherwise, next time you try one, you'll get this error:

The backup/restore job failed because there is already another job scheduled. Delete the timer job from the Timer Job Definitions page, and then restart the backup/restore job.

For example, I suffered from the fact that my Windows SharePoint Administration service had not been started by default, which makes me go "Hmmm…"   When it crapped out, I got an error in the log that said:

Error: Object Shared Search Index failed in event OnRestore. For more information, see the error log located in the backup directory.

That's nice.  So a little more digging and it told me:

InvalidOperationException: System.InvalidOperationException: This operation uses the SharePoint Administration service (spadmin), which could not be contacted.  If the service is stopped or disabled, start it and try the operation again.

So, in my Control Panel Services tool, I found the service and started it.  I deleted the timer job and kicked off the restore again.

This time, I got an error on the primary "port 80" web app and the SSP.  In the logs, I found this error:

SPException: The specified component exists. You must specify a name that does not exist.

5. What about configuration and Central Admin content?

In the logs, I get this message:

[Farm] The configuration database and central administration content database components cannot be restored.

I think this occurred because I neglected to tell page 2 of the restore to restore to the same farm, and then did not give it new farm information onto which to restore itself.

6. What about your DBAs?

7. Migrating to a Different Farm. 

While the restore file collection can be sourced from a different farm and restored onto a new host, the SMIGRATE tool might be better suited for migrating site collections from one farm to another.

8. Gotchas

One thing is that your site security is embedded in your site, so unless your new restore host can reach the same domain controller, you're going to have trouble getting anything to work.

One thing the community is unanimous about is not meddling with the contents of the backup folders.  I would hesitate to try to fool the XML into thinking that folders exist when they don't, or don't when they do.

Another is that the command-line interface bypasses the Timer Job; you won't get a job to delete if you get a failure.

I'm getting some caching issues when I restore, where my pages are not completely refreshed until I restart my browser or reset IIS.



Microsoft Support: Content Deployment

WSS (MOSS) Backup

Our topic currently is backup and recovery.

I finally decided to spell "backup" without a hyphen, so now I am ready to go.  Here are some good references from our new friends:

  • I found this post from Joel Oleson that provided a lot of information
  • And this one from Scott Jamison, et al., at

I've got my WSS server running so I need this solution.

Of course, WSS 3.0 includes a pretty strong backup tool inside Central Administration Operations.  In the Backup and Restore section, you get a Perform a backup option.

Here you can select the entire farm or any one of your web applications, including Central Admin or Search.  You can select a full or differential backup and then specify a location for the backup files.  The location must exist, and the logged-in user and the SQL Server service account must have access to that location.  It helps to make the target folder a share so you can just map your backup out to \\MyServer\Backups.

Of course, the first time I tried it, it failed with this error:

Access to the path \\MyServer\Backups\spbrtoc.xml is denied.

So, since this is a Timer Job, you have to go to the Timer Job Definitions page and delete the blown-up job.

I found one post that suggested adding the Network Service account to the backup folder with sufficient privileges to create and update the file.

And another that said to run aspnet_regiis -ir in the .NET 2.0 folder.

Then, I came upon this TechNet article which did very little to help.

So, I started over (I hate it when this happens): I created a share on my server, gave Network Service, System, and my App Pool Identity full control of the share, ran it again, and it worked.

So that's great, I now have an excellent backup of my site collection.

Now, I notice that my Share permissions are a bit different from my folder permissions, and it was the Share permissions I needed to set.  I was also able to eliminate Network Service and System and still get it to work.  I'm not surprised about Network Service since I'm not using any such account anywhere, but my SQL Server is running under the Local System account and I thought it might make a fuss.  The only ID I needed was the App Pool Identity.

So now, each time I run a full backup, the resulting folder is 90 MB and, again, I've got something less than 100 documents in the whole thing.

Oddly, if I have my App Pool Identity in my local Administrators group and I give that group full access to the share, I get the access error.  But, if I list the App Pool Identity explicitly, it will run successfully.

I was also able to reduce the share permissions to read and create and it still worked.

So I try the same deal using the STSADM tool.  The command looks like this:

C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\BIN>stsadm.exe -o backup -url http://MyServer/ -directory \\MyServer\Backups -backupmethod full

And this generates a lot of scrolling text in my command window but, in the end, it says it failed with errors.

So I add System back to the share permissions and try again and the output says that the operation has completed successfully.

From all of this, I conclude that, using the CA UI, the logged-in user can trigger the job, which will run using the CA App Pool Identity; if that identity has Read and Create on the share, then the job ought not fail.  On the other hand, using STSADM, the command is run from Windows, which triggers SQL Server, which runs under its own identity; in my case, that was LOCAL SYSTEM and, therefore, I had to add SYSTEM to the share permissions with Read and Create.

Thanks for paying attention.