Convenient Links in SharePoint

Some things I’ll just leave here…

It’s good to be back.

Here are some URLs that might come in handy:

/_layouts/Authenticate.aspx – Opens the login dialog box.
/_layouts/settings.aspx – Opens site settings.
/_layouts/create.aspx – Opens the create page.
/_layouts/15/viewlsts.aspx – Opens Site Contents.
/_layouts/15/listedit.aspx?List=<list guid> – Opens the list settings.
javascript:GoToModern(true) – Exit Classic Experience
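
These are relative paths, so tack them onto whatever web you're working in. For example, against a hypothetical site, Site Contents would be:

https://contoso.sharepoint.com/sites/team/_layouts/15/viewlsts.aspx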

These are cool:

javascript:(function(){document.cookie="splnu=0;domain="+window.location.hostname+";"; location.href=location.href;})(); – Sets a cookie to load pages in the Classic Experience

javascript:(function(){document.cookie="splnu=1;domain="+window.location.hostname+";"; location.href=location.href;})(); – Sets the cookie back so pages load in the Modern Experience again

Many of these come from our new friend Joao at his site, SharePoint.HandsOnTek.net,  here. And he’s got others.

 

It’s 2018

So the site got hacked…

For anyone interested in the gory details, someone figured out how to hack WordPress and, before we could get a fix in place, our whole operation got taken over by a nefarious actor who loaded all our pages with links to his malware sites.

So this site is hosted at HostGator and my package includes some third-party malware monitoring. They caught it, called me in the middle of the night, told me I had a problem and said that if I didn’t fix it, they were going to blacklist me. At first, I didn’t trust them because I had never heard of them except as a throwaway item in my HostGator deal.

So, I called HostGator and, for a small fee, they were able to restore the site from backup. That worked for exactly one day and, as you may have noticed, we were not generating a bunch of updates, so the loss was negligible. Then, apparently, something was not right in the WordPress space and the site was down until the WordPress people pushed out an update today.

So it looks like we’re back in business. A bunch of things worked but some didn’t and we were down for a month or two.

Anyway, if you’re keeping score, we’re satisfied with HostGator and we’re happy with WordPress and whoever that site monitoring company is.

I say all of that to say this: things change. And while we’re still an SPRobot, we need to move on or move out, so we’re going to try to mix in some new stuff. One of our new stuffs is our new friends at the StratusFactory. They got started with MS Azure and do some cool stuff, but even that might not work out for us, so stay tuned.

-robot

That was nuts!

So this was me since my last post:

https://www.facebook.com/109643175745610/videos/vb.109643175745610/1019618048081447/?type=2&theater

And it’s all been MS Dynamics CRM. And now we’ve done it all and turned it over and we’re off to new things. That makes me one happy robot.

And we look at our first assignment and…

This is nuts…

My SP2010 User Profile Service Application Sync database is over 96 gigs.

What I have is this from our new best friend, Paul.

He’s got a great description of the problem and a lot better response from MS support than this robot but he tells us what to do, both now and later.

It’s interesting that there’s a supported unsupported stored proc SQL fix he was given by the vendor to temporarily fix the problem. And as much fun as that would be, by the end of his take, he takes us to the sequel. See what I did there?

The sequel takes us to TechNet where we get the February 2012 update for SP2010. Here, we’re informed that we must already be on SP2010 Service Pack 1 to run the CU.

The 2/12 CU download source is here.

Then, we learn that all the SP2010 and SP2013 updates are here.

We’ll look at this a bit more later

I wanted to be sure we remembered that this is a great InfoPath post using the lists.asmx web service inside an InfoPath form.

Dynamics CRM and its Reporting Services Extensions

Okay, maybe you didn’t know. This robot is now the soon-to-be leading, local expert on large scale Microsoft Dynamics CRM installations.

In this case, we’re deploying many, many “instances” of DCRM for a number of development teams, test teams, validation teams, disaster recovery and production.

DR and PROD were cake… Four DCRM servers pointing to a single SQL Server and the VMWare backup mirrors the volumes in real time accordingly.

However, project managers and account managers, being what they are, got all worked up about cost and money and induced us to agree to install a single SQL Server to support a number (say 1 .. n) of pairs of DEVx and TESTx DCRM environments, where a single SQL Server will run multiple database server instances, allowing each DCRM environment to run on its own server and use its own database instance on the single SQL Server, as shown:

Multiple DCRM Servers Using Multiple Instances on a Single SQL Server


Perfect.

Well, you might think so but, as we’ve learned, perfect is also often unobtainable. It seems our friends at Microsoft, in the DCRM group, have tossed us a little bone called SQL Server Reporting Services Extensions for Dynamics CRM. This is more stuff you can add to SQL Server Reporting Services to make more sophisticated DCRM reports. Still, no problem. We simply add SQL Server Reporting Services to each of the database instances as shown, where SSRS is the little blue piece added at the bottom of the SQL Server database instances:

 

Multiple Instances of SQL Server w\ SSRS Added


And then our little DCRM Extensions for SSRS get added to SSRS, no problem, as shown with the red arrows:

Adding DCRM RS Extensions to Multiple Instances of SSRS


Okay, well, sure… the first one goes in no problem. The second one, however, not so much…

It seems the setup program for DCRM RSE for SSRS notices that it’s already been installed and falls into a “Repair or Delete” routine that will not let us specify the second database server instance for Reporting Services:


DCRM RSE for SSRS Setup Program Bars Installation onto Second Instance of SSRS

So, what do we do? Well, you know me and this robot is all about making new friends. And our new friend, AniMandel from xrmadventures.wordpress.com, gives us a pretty good look at the DCRM RSE for SSRS setup process [HERE]. Yeah, he’s a little off topic but at least he has the screen shots.

Then, we have another new friend, Sean from Blogs.MSDN.com, who dives into some of the DCRM RSE details [HERE]. Sean says the magic words: “each Reporting server.. with the report extensions installed may only host reports for a single CRM 2011.” There you have it; we simply must have a dedicated instance of SQL Server Reporting Services for each instance of DCRM.

So what does this “dedicated instance of SQL Server Reporting Services” look like? Well, we see at TechNet, [HERE], that it’s not really SQL Server; it’s the Install But Do Not Configure option. It’s laid out like this:

 

Multiple Instances of SSRS and DCRM RSE Installed on DCRM Servers Using "Remote" Database


Technically, they call this “installing SSRS to run on a remote DB engine instance.” So we log into our DCRM server and run the SQL Server setup program. And, since it matters now, we recall that SQL Server components on separate servers must be running the same version of SQL Server. So we check the version number on our DB engine instance, the blue cylinder above, and make sure we install the same version of the SSRS bits on our DCRM server. Once we mount the right .iso, we run setup.exe.
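
If you’d rather check that engine version from PowerShell than click around in Management Studio, here’s a minimal sketch; it assumes the SQL Server PowerShell bits (Invoke-Sqlcmd) are available, and the server\instance name is a placeholder:

# Hypothetical server\instance name; swap in your own
Invoke-Sqlcmd -ServerInstance "SQLSRV01\DEV1" -Query "SELECT SERVERPROPERTY('ProductVersion') AS Version, SERVERPROPERTY('ProductLevel') AS ServicePack"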

We run a new “stand-alone” installation, run the rules check, enter the product key, agree to the license terms, download and install the required updates and extract and install the setup program.

Once the setup bits are installed and all the rules have run, you select a “Features” installation and the only feature you select is SSRS – Native mode. Then you’ll run some more rules and add in your service account and password.

Then you’ll get to the Reporting Services Configuration and your only option here will be Install Only. Do what you want with error reporting and run some more rules. You get Ready to Install and let her rip. The installation process runs for a bit and then you get a Success message. Not a bad idea to reboot here.

Now we can configure SSRS and tell it to use the database engine on our remote SQL Server. We run the SSRS Configuration Manager and note how it’s a little different from a typical install. First, it still wants to connect to an “Instance” even though that instance is just SSRS and does not include the DB engine. Then, as you click down the configuration options on the left, you’ll see that the Service account is fine but the Web Service URL is a little different. It says the SSRS “Web Services is (sic) not configured.” We note that the Virtual directory, IP address and so on are all okay but we may have trouble with the TCP port. In this particular case, perhaps by dumb luck, we put the DCRM web apps on port 5558 so port 80 is free. We don’t have any SSL requirements yet so I click Apply.

This yields a positive result so we press on.

The database section is even less comforting; there’s nothing there to work with, so we have to add it all by clicking Change Database.

A configuration wizard opens; we leave the “Create a new..” option checked and click Next.

Now we need to specify a database server. Accordingly, we specify the SSRS DB Instance shown above using the <Server>\<Instance> format. We select the SQL Server Account and give it a suitable account and password. I used the sa account, so shoot me. I click Test Connection and I get a success message. I click Next.

It wants a database name and I’ll have a couple, so I’m calling this one DEV1ReportServer so my other one can be TEST1ReportServer, and I like English so I click Next.

Now it wants a service account and I have one for the local machine, so I’ll use it and click Next. I like this because the DCRM service account is already listed as the “Service Account” and it says “Permission to access this report server will be automatically granted to the account you specify.”

I get a summary and I hold my nose and jump in…

I get a big long list of successes and I click Finish.

I look at the Report Manager URL page.  Blah, blah, blah, not configured… Apply… Success.

I don’t need an execution account at this point, or encryption keys, so I’m going to call it done. All I need now is for the DCRM SRS Extensions setup to run against this DB instance…

We drill into the install package and run SetupSrsConnector.exe, accept the license and forgo the updates.

In the Setup Wizard, on the Specify Configuration Database Server page, we point the extensions at the SQL Server that holds the DCRM configuration database; that’s the DEVx DB Instance in the diagram above.

Then, for the SSRS Instance, we point the wizard at the local SSRS instance we just created.

Do what you want with MS Update and point the install files at the correct location and click Next.

So it runs its “System Checks” and we get a warning and an error. The warning arises from the fact that we created the default organization to run SSRS on the old dedicated instance where we originally wanted to share SSRS. It just means that the RSEs won’t work on that organization. We’ll just delete it and make a new one.

The error comes from the fact that DCRM and SSRS on the DCRM server are using the same service account. It says to go into Services.msc and change the account on the SSRS service and restart the service. I do that, click back and then Next again.

And we couldn’t get past this error without resorting to this: http://inogic.blogspot.com/2012_06_01_archive.html

hth!

-robot

Grow Your PowerShell

We’ve spent some time looking at PowerShell and every time we get better at it.

Some confusion arises from the use of custom functions but we have one method that’s shown to work.

First, we’re working off a set of external sources from some of our new best friends.

We found a discussion at ServerFault.com, here, that provides us with the code to support a specific use case that’s always driving us nuts: proving that a user ID and password are good or bad. This is remarkably handy when working with service accounts that are supported by a bunch of neanderthals in a cage labeled “Windows Support.”

Then we have another discussion, here, that helps us with some PoSh function management stuff.

And, finally, our new best friend, Don, explains a bit here about executing a number of commands on a list of inputs piped in one at a time.

So, here goes…

You can create a function interactively from your PoSh prompt one line at a time.

PS C:\Windows\system32> Function WriteSomethingOnScreen {
>> Write-Host "Something"
>> }
>>
PS C:\Windows\system32>

In that code, note the following:

  • The function amounts to the word “function,” the function name and some code inside {curly braces}.
  • When you start building a function, PoSh figures it out and turns your <path>> prompt into an interactive, “>>” prompt.
  • When you feed the >> prompts a blank line, it jumps out of interactive mode and back into your standard, <path>> prompt.

Now we have a function in memory and we can call it by name at the <path>> prompt:

PS C:\Windows\system32> WriteSomethingOnScreen
Something
PS C:\Windows\system32>

If we kill our PoSh session, the function dies with it.

Windows PowerShell
Copyright (C) 2009 Microsoft Corporation. All rights reserved.
PS C:\Windows\system32> WriteSomethingOnScreen
The term 'WriteSomethingOnScreen' is not recognized as the name of a cmdlet, fu
{Blah,blah, blah}
 + CategoryInfo : ObjectNotFound: (WriteSomethingOnScreen:String)
 [], CommandNotFoundException
 + FullyQualifiedErrorId : CommandNotFoundException
PS C:\Windows\system32>

Yes, we have to create our function all over again from scratch. Since that sucks, we’ll just write our function into a script in NotePad and save it, no problem.

 

01_NotePad

 

Save the script and run it in PoSh:

First thing you’ll note is that, by default, PoSh will block scripts from running:

PS C:\Windows\system32> writesomethingonscreen.ps1
File C:\Windows\SYSTEM32\WriteSomethingOnScreen.ps1 c
e execution of scripts is disabled on this system. Pl
{Blah, Blah, Blah}
 + FullyQualifiedErrorId : RuntimeException
PS C:\Windows\system32>

So run:

PS C:\Windows\system32> Set-ExecutionPolicy Unrestricted

Execution Policy Change
The execution policy helps protect you from scripts that you
{Blah, Blah, Blah}
[Y] Yes [N] No [S] Suspend [?] Help (default is "Y"):
PS C:\Windows\system32>

Now PoSh will run your script.

PS C:\Windows\system32> WriteSomethingOnScreen.ps1
PS C:\Windows\system32>

Note also that the input is NOT case sensitive.

PS C:\Windows\system32> writesomethingonscreen.ps1
PS C:\Windows\system32>
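
Side note: if Unrestricted feels a little too loose for your taste, RemoteSigned is usually enough for scripts you wrote yourself on the local machine:

PS C:\Windows\system32> Set-ExecutionPolicy RemoteSigned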

So our script created our function and we should be able to run it now, right? Wrong. We actually have to call the function from the script to make it work by adding:

WriteSomethingOnScreen

..to the bottom of the script and then running it.

PS C:\Windows\system32> writesomethingonscreen
Something
PS C:\Windows\system32>
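
One more trick worth mentioning, and this is general PowerShell behavior rather than anything from our friends above: if you “dot source” the script (a dot, a space, then the path to the script), the function gets loaded into your session and you can call it by name afterwards, no edits required. Assuming the script sits in the current folder, it looks like this:

PS C:\Windows\system32> . .\WriteSomethingOnScreen.ps1
Something
PS C:\Windows\system32> WriteSomethingOnScreen
Something
PS C:\Windows\system32>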

We’ll look at the list of inputs next time.

This is where we’re headed:

Function Test-Credentials {
    Param($context, $username, $password, $domain)
    # Load the .NET AccountManagement assembly
    Add-Type -AssemblyName System.DirectoryServices.AccountManagement
    # $context is "Machine" or "Domain"; $domain is the machine or domain name
    $ct = [System.DirectoryServices.AccountManagement.ContextType]::$context
    $pc = New-Object System.DirectoryServices.AccountManagement.PrincipalContext($ct, $domain)
    # Hand back a little object with the user name and whether the credentials check out
    New-Object PSObject -Property @{
        UserName = $username;
        IsValid  = $pc.ValidateCredentials($username, $password).ToString()
    }
}

Then, your command looks like this:

test-credentials <{"Machine" | "Domain"}>
                 <User ID>
                 <Password>
                 <{Machine Name | Domain Name}>

Of course, replace the new lines with simple blank spaces.

This does a lot of cool stuff. Once it’s written into the PoSh memory, you can run it like a cmdlet and pass it, with no commas, a context {machine or domain}, a UID, a password and a domain or machine name, and it will tell you if the UID and PW are correct.
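
For example, with a made-up account, password and domain, just to show the shape of the call:

PS C:\Windows\system32> Test-Credentials Domain svc-sync "P@ssw0rd!" CONTOSO

That comes back as a little object with the UserName and an IsValid of True or False.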

We’ll look at it more closely next time.

hth

-robot

Delivering Your Solution

Step One: Getting Started (Again)
Part Two: And Moving Forward 

So we got connected to our farm, loaded Visual Studio and built our first custom SharePoint 2013 visual web part.

From here, there are two ways we can go. First, obviously, we could make the web part more complicated. Sure. We’ve done some of that before. But, second, we need to be able to build our web part into something that might be usable to an audience. Let’s look at this second issue first.

First of all, let’s recall that my VS project was called VisualWebPartProject2 and notice that VS projects, by default, create things inside our \Documents\Visual Studio 2012\Projects\<Project Name>\<Project Name>\bin\Debug folder.

Inside that folder, there’s a VisualWebPartProject2.wsp. This is a genuine SharePoint 2013 solution. To prove it, we go to Site Settings | Solutions and click on the Upload Solution link on the ribbon. We navigate out to our \bin\ folder, select the .wsp and click Open.

Adding the .wsp File


Then, we get a chance to activate our solution by clicking on the Activate link.


Activate the Solution
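
If you’d rather script the upload and activation, the user (sandboxed) solution cmdlets will do the same job from a SharePoint 2013 Management Shell on the farm; the path and site URL below are placeholders:

# Hypothetical path and site URL; point these at your own .wsp and site collection
Add-SPUserSolution -LiteralPath "C:\Temp\VisualWebPartProject2.wsp" -Site "http://sp2013/sites/team"
Install-SPUserSolution -Identity VisualWebPartProject2.wsp -Site "http://sp2013/sites/team"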

Now, we can navigate out to a typical team site, edit a page, click on Web Part on the Insert ribbon and select the VisualWebPartProject2 web part from the Custom web parts category and click on Add. And there it is, Hello World in all its glory.


A Real Text Box in a Custom Web Part

And Moving Forward

Step One: Getting Started (Again)

Okay, we looked at getting connected to our server via the RDP Admin client.

So, we’re presuming we have our domain and SQL Server and SharePoint farm already running, and we’ve simply added a second server so we can have a dedicated front end server to run Visual Studio 2012.

We mount the .iso using Windows now so there’s no need for a separate program for that. We right-click on the Visual Studio .exe and select Run as Administrator.

Agree to the license terms and click Next.

Here, it gives you the opportunity to install some number of components. We check them all and click Install.

Now, there’s two progress bars and instead of a spinning pinwheel, we get little flying dots that back up and then speed off like traffic on the interstate. At least the two status bars are grammatically congruent, one being Acquiring and the other Applying.

Then you get a Setup Successful panel which is nice because it has the exclamation point and we like those a lot!

We click the Launch option and cross our fingers.

So here, the “Program Compatibility Assistant” tells us This program has compatibility issues. We click the Get Help Online option just to see what it will do.

That returns a No solutions found for  Visual Studio 2012 so we click OK, then Run the program without getting help. It doesn’t instill a lot of confidence but, hey, it says it will run.

Sure enough, we get the Choose Default Environment Settings, and we select Visual C# Development Settings, forego the local help documentation and click Start Visual Studio.

Again with the Compatibility Assistant we run the program without getting help.

More progress bars for a couple of minutes… and there’s the Start Page and a balloon suggesting we try the updates. We click the update button for the Visual Studio 2012 Update 2. That brings up a download manager where we click Run.

The update says to close Visual Studio. We do and click Continue, Agree, and Install.

More progress bars and flying dot traffic jams. Again Setup Successful and again, Launch, this time with no Program Compatibility Assistant. That’s cool.

And we have the VS2012 Start Page.

And if we need any greater detail, our new best friend, CannonFodder, has this for us to work through the entire process.

What we don’t have is the SharePoint 2013 templates. Those are here.

As our new best friend, Tim, explains here, we can download it and run the .exe.

As it turns out, Office 2013 is a bit of a prerequisite for the SP 2013 SDK as much of it will fail without it. Now we could figure out how to get the SP templates without Office 2013 but ain’t nobody got time for that so we just install the Office bits and try again.

So then we can turn to our new best friends at SharePoint 2013 Hosting, who detail the web part creation process here. We follow along, creating the project and defining the deployment site, and the template solution gives us an error: “Unable to connect to SharePoint Server.”

This turned out to be an ugly issue with the SharePoint 2013 Distributed Cache Service and the Windows AppFabric caching. So what we did was gather a number of links to external sources that essentially led us to reinstall the AppFabric Caching service on both SP front ends, this time using the latest version 1.1 bits.

Here are some helpful links:

App Fabric Intro

APP Fabric Caching – Automated

App Fabric Permissions

Lead Hosts and Cluster Management

Distributed Cache in SP2013

App Fabric Overview This one might be the best.

So after we got that sorted out, we figured out we had to start the Sandboxed Code Service on our development server. We would click the Start button and Visual Studio would build our web part and install it in the specified site. Once there, we’d have to open a browser and navigate to the specified site, edit the page and add the web part.

It looks like this:

HelloWorld

And there you have it, a custom SharePoint 2013 web part on a development server attached to a multi-server SharePoint farm.
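
One small aside on that Sandboxed Code Service step: you can also start it from a SharePoint 2013 Management Shell instead of Central Administration. A hedged sketch, where the wildcard match on TypeName is an assumption about how the service shows up on your farm:

# Find the Sandboxed Code Service instance on this server and start it
$svc = Get-SPServiceInstance -Server $env:COMPUTERNAME | Where-Object { $_.TypeName -like "*Sandboxed Code*" }
Start-SPServiceInstance -Identity $svc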

hth

-robot

Getting Started (Again)

By now, you guys know that what we do best is get started and what could be better than getting started with SharePoint 2013?

Well, in this case, we actually had minions do all the hard part, so now we just need to connect to it and see what trouble a robot can cause. Then, of course, we’ll try to recreate the whole thing ourselves.

First, we’ll want to get the RDP admin client running so we can log onto all of our servers via Windows. This tool is actually part of the Server 2008 Admin Tool package and you can get it for Windows 7 here:

http://www.microsoft.com/en-us/download/details.aspx?id=7887

You can struggle with the applicability piece but I’ve got Win7 x64 Enterprise so I download and run the .msu file. It thinks it’s just an update and so it just runs and it took several minutes.

Then it’s done. So now you have to go into Control Panel and click on Turn Windows features on or off. Select the Remote Server Administration Tools options and click OK.

Selecting the Remote Desktop Services Feature


This will also take a moment. When it’s done, you can go Start | Run and enter MMC to open a generic console. Select File | Add/Remove Snap-in, select Remote Desktops, click Add and then OK.

Adding the Remote Desktops Snap-in.

You end up with the MMC running the RDC snap-in.

The Remote Desktop Connections Console.


Right click on the Remote Desktops root and select New window from here. This will set your root and now you’re going to want to save your MMC with the RDC snap-in installed so go ahead and be sure to give it a smart, robot-like name. I’ll just leave mine on my desktop.

Finally, you can configure each connection to use particular user names by right-clicking on the root and selecting Add New Connection. Give it an IP address, computer name and user ID. If you check Allow me to save credentials, it will remember your user name but not that 64-character password.

Add a New Connection


Then just click on your new connections and log on.

Open Session and Log On


And you end up with a cute little RDP session in that cute little window.

An Open RDP Session


Then you realize the remote desktop won’t resize so you have to be sure to expand the MMC window BEFORE you make the connection. So right click and select Disconnect. Expand the size of the MMC and start over.

Open Session - Full Screen


Now, just repeat the process for all the servers in your farm. Then you can add all your coworkers’ computers, all your computers at home and all your VMs and all the VMs running on your VMs.

 

PowerShell at the Document Library Level

Well it’s tough being a robot and always waiting for someone to ask you to do something and then you’ve got to figure out how to do it like yesterday.

Case in point: after we migrated 10,000 documents to the customer’s new SharePoint site, they said, “Well, all those are published; we need them unpublished.”

And you know how I hate the whole SharePoint version control, check out, publish routine, but here we are and our answer is, of course, PowerShell.

If you go looking, at first you’ll not find any specific cmdlets pertaining to libraries or documents. So it’s a good thing we have our new best friends to help.

First, here, Salaudeen gives us a great look at extending the SPWeb object down to the folder\library and then manipulating the items in it. Essentially, the routine is:

  1. Get-SPWeb by URL
  2. Set a library variable to SPWeb.GetFolder and use the library name.
  3. Set a files variable to Library.Files
  4. Call Files.Add and pass it the destination path\name, the file contents and an overwrite flag, true or false (sketched below).
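
Here’s a minimal sketch of that routine, just to make the steps concrete. The site URL, library name and file path are placeholders, and it assumes you’re in a SharePoint 2010 Management Shell (or have added the Microsoft.SharePoint.PowerShell snap-in):

# 1. Get the web by URL (placeholder URL)
$web = Get-SPWeb "http://sp2010/sites/docs"

# 2. Grab the library as a folder, by name
$library = $web.GetFolder("Shared Documents")

# 3. The files collection in that library
$files = $library.Files

# 4. Add a file: destination path\name, the file bytes and the overwrite flag
$bytes = [System.IO.File]::ReadAllBytes("C:\Temp\SomeDocument.docx")
$files.Add("Shared Documents/SomeDocument.docx", $bytes, $true) | Out-Null

$web.Dispose()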

That’s all good but there’s more at CodePlex here and CodeProject.com here.

These ought to get you in business pretty quickly.
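
And since the problem that kicked this off was unpublishing everything we migrated, here’s a hedged sketch of that pass too; same placeholder URL and library name as above, and note it only walks the top level of the library, so subfolders would need their own loop:

$web = Get-SPWeb "http://sp2010/sites/docs"
$library = $web.GetFolder("Shared Documents")

foreach ($file in $library.Files)
{
    # Only touch files that are currently published
    if ($file.Level -eq [Microsoft.SharePoint.SPFileLevel]::Published)
    {
        $file.UnPublish("Unpublished by the robot")
    }
}

$web.Dispose()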

-robot