PowerShell Password Generator

Have you ever Googled “PowerShell Password Generator”? Tons of hits, right? And tons of scripts out there that can randomly generate passwords. Tons. Literally, like a truckload of them.
I was in the same boat as you, looking for a really efficient way to create random passwords. But why write a whole script when you can do it in a few lines (or just one line if you really want)?

Add-Type -AssemblyName System.Web
$pw = [System.Web.Security.Membership]::GeneratePassword(30,10)
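And for those of you who really do want the one-liner teased above, just chain the two statements with a semicolon:

```powershell
# Same thing, one line: load System.Web, then generate a 30-character
# password with at least 10 non-alphanumeric characters
Add-Type -AssemblyName System.Web; [System.Web.Security.Membership]::GeneratePassword(30,10)
```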

And looky there, free random passwords as far as the eye can see! A little info about the call: the first argument (30) is the total length of the password, and the second (10) is the minimum number of non-alphanumeric characters to include. For loads more info on this method, take a gander at the official documentation:

Official Microsoft Documentation

(Kudos to all-around awesome PowerShell Scripting Genius Matt Boren for leading me down the right road!)

SP2013/2016 Profile Properties Issue

If you work at a company that still believes a fax machine is a nifty piece of hardware, beware of the Fax property in the User Profile Service in SharePoint 2013/2016. Huh? It’s right there, out of the box.

Says ‘Fax’ and everything.

Well, ok, sorta. For most people, you’re going to want to map that property to something in AD. Ohh, I don’t know, say ‘facsimileTelephoneNumber’. Looks like a great AD property to use to import those all-important Fax numbers into SharePoint. Except you can’t. Our buddies in Redmond decided that the out-of-the-box profile property for Fax is a Single Value property. The AD property ‘facsimileTelephoneNumber’ is a multi-valued property. If you map these two together, you’ll see nasty errors in ULS regarding invalid data type formats. You can’t edit the out-of-the-box property, and you can’t delete it.

This one is pretty easy: just create a new User Profile Property for Fax (call it ‘SuperAdvancedDataTransfer’ if you like), and map that to ‘facsimileTelephoneNumber’.
Then uncheck all of the boxes in the out-of-the-box property so it won’t show up on your users’ profile pages. You’re done…
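If you’d rather script the mapping than click through Central Admin, here’s a rough sketch. Fair warning: this is an untested sketch, it assumes the FIM-based sync service (the AD Import connector uses a different mapping API), it assumes you’ve already created the new property, and the MySite URL and ‘SuperAdvancedDataTransfer’ name are just the made-up examples from above:

```powershell
# Sketch: map a custom profile property to the multi-valued AD attribute
# facsimileTelephoneNumber. Assumes FIM-based sync and that the
# "SuperAdvancedDataTransfer" property already exists.
$site = Get-SPSite "http://mysites.yourcompany.com"   # any site served by the UPA
$context = Get-SPServiceContext $site
$upcm = New-Object Microsoft.Office.Server.UserProfiles.UserProfileConfigManager($context)

# Grab the first sync connection and add the new mapping
$connection = $upcm.ConnectionManager | Select-Object -First 1
$connection.PropertyMapping.AddNewMapping(
    [Microsoft.Office.Server.UserProfiles.ProfileType]::User,
    "SuperAdvancedDataTransfer",      # your new SharePoint profile property
    "facsimileTelephoneNumber")       # the AD attribute
```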

Checking SPWebs for Customizations

Don’t you hate it when your pesky users download SharePoint Designer and then go off and modify their SharePoint sites to fit the business needs? All without telling you, or even inviting you to their site launch parties? What, you don’t? You think that’s cool? Yeah, me too.

You may, at some point, want to know which SPWebs have been modified. Say you decide to move to O365 (your call). Might be a nice thing to have, so if the site migration explodes and showers you with CSS and JavaScript, you’ll know why.

I hate writing actual scripts; in large farms, the memory management is a pain. I like to stick to one-line piped commands, and I have just the thing for you. This command will look at each website in your farm and report back any sites that appear to be customized (just the URL). It creates an output file you can load up in Excel and make all pretty. Or not. Enjoy:

Get-SPSite -Limit All | select -expand AllWebs | %{$url=$null;$url = $_.URL;$m1=$null;$m1 = (($_.MasterUrl).Split("/")[-1]);$_ | Select -expand lists | ?{$_.Title -eq "Master Page Gallery"} | select -expand Items |  ?{$_.Name -eq $m1} |select -expand File | ?{$_.CustomizedPageStatus -ne "Uncustomized"} |%{Add-Content "C:\CustomizedSites.csv" "$URL,$($_.Name),$($_.CustomizedPageStatus)"}}

So, how this thing works is as follows:

1. Get-SPSite -Limit All | Select -expand AllWebs (This one you should know)
2. Get the current active Master Page (if they’ve done things right, they won’t edit the default master page. Yeeeaaahhhhhh… sure they didn’t)
3. Look that file up in the Master Page Gallery and check its CustomizedPageStatus flag
4. If it’s customized, write the URL out to the output file
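If the one-liner makes your eyes cross, here’s the exact same logic spread out for readability (same behavior, nothing new):

```powershell
# Walk every web in the farm and log any with a customized master page
Get-SPSite -Limit All | Select-Object -ExpandProperty AllWebs | ForEach-Object {
    $url = $_.Url
    # File name of the currently active master page (last segment of MasterUrl)
    $masterName = ($_.MasterUrl).Split("/")[-1]
    $_.Lists | Where-Object {$_.Title -eq "Master Page Gallery"} |
        Select-Object -ExpandProperty Items |
        Where-Object {$_.Name -eq $masterName} |
        Select-Object -ExpandProperty File |
        Where-Object {$_.CustomizedPageStatus -ne "Uncustomized"} |
        ForEach-Object {
            Add-Content "C:\CustomizedSites.csv" "$url,$($_.Name),$($_.CustomizedPageStatus)"
        }
}
```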

That’s it…

SharePoint is hit, is down, is dead. And jumps back up!!

For the last few years, people who do on-prem SharePoint support (like me) have been a tad bit worried about what we would do if Microsoft discontinued SharePoint. Come on, don’t tell me you haven’t thought about it. The rumors that SharePoint 2016 was it for on-prem deploys? The lackluster support, not to mention touting O365 at every turn. Right?

Well, looks like the reports of the death of SharePoint are a bit premature. On May 4th, Microsoft unveiled a new roadmap for SharePoint.

From the session, a few of my takeaways are:

Microsoft likes and believes in SharePoint. I mean really believes. And they want you to believe too. So much so that you’ll see the name “SharePoint” now when logging into O365. That small change, which doesn’t seem like a lot, speaks volumes.

Mobile apps make sense. Not sure who the brainchild was that thought this one up, but the SharePoint mobile apps are 2 years late. How many people walk around with a laptop in their hands accessing intranets? It’s about time for an app, and a good one. The early reports are that the app works great, and gives mobile users what they’ve wanted from a SharePoint/mobile experience for years.

Delve and Office Graph are the future. Forget about clunky search engines, these products are touted to present the documents you need before you ask for them. Snooping and trust aside, I hope these components work as expected, it would be a huge win.

DLP is the way. It doesn’t matter how cool your collab product is, those attorneys that question everything are going to want to know how you are going to control what happens inside. Governance used to be the way we did this, but it’s just a word without teeth. Match your governance with a strong DLP process, and you’ve got a system that makes the users happy, and keeps the attorneys out of your cube (and back in the caves where they belong).

Let’s hope the boys and girls in Redmond can pull all of this off. I’m waiting, as is everyone, to get 2016 up and running. It’s been a long time since anyone’s been excited about SharePoint. Now you can be (no really, it’s OK. Go ahead and give a little ‘whoop!’).

Hey look! I’m back, and let’s talk about SharePoint Auditing

Something I ran into with SharePoint 2013/2016 site collection auditing. Everyone loves auditing; it fills the content DBs with such useful information. It’s like Christmas came early for the SQL storage vendors!!

Seriously, I was attempting to use PowerShell to pull Auditing settings from our farm. Who has turned it on, what trimming settings have they selected, and what items are being audited?

Pretty simple stuff:

(Get-SPSite <siteURL>) | select -expand Audit

Returns something like this for a site with no Auditing enabled:

UseAuditFlagCache   AuditFlags   EffectiveAuditMask
-----------------   ----------   ------------------
False               None         None

Great. Now if you find a site with Auditing enabled, that’s where things get really wonky.
If someone has selected just “Opening or downloading documents, viewing items in lists, or viewing item properties”, you get:

UseAuditFlagCache   AuditFlags   EffectiveAuditMask
-----------------   ----------   ------------------
False               View         View

Oooookkkk, not really a bitmask value, but let’s see where this goes.
If someone enables “Editing item”, it looks like this:

UseAuditFlagCache   AuditFlags   EffectiveAuditMask
-----------------   ----------   ------------------
False               Update       Update

And if someone enables “Checking out or checking in items”, it looks like this:

UseAuditFlagCache   AuditFlags   EffectiveAuditMask
-----------------   ----------   ------------------
False               3            3

Well, of course it does. Looks like the helpful folks at Microsoft decided that some Auditing options will return a word, and some will return a raw bitmask value. Super helpful, great, thanks. Awesome.

Fear not, I’m here to help, and help I will. Below is a handy table showing you how to figure some of this mess out. For each option in the Audit Settings section under Site Settings, it shows what you will see from PowerShell, and what the corresponding bitmask value is.

Documents and Items:

PowerShell    Setting in Site Settings
----------    ------------------------
View (4)      Opening or downloading documents, viewing items in lists, or viewing item properties
Update (16)   Editing items
3             Checking out or checking in items
6144          Moving or copying items to another location in the site
520           Deleting or restoring items

Lists, Libraries, and Sites:

PowerShell             Setting in Site Settings
----------             ------------------------
160                    Editing content types and columns
Search (8192)          Searching site content
SecurityChange (256)   Editing users and permissions

So, using some quick bit math, you can finally start to understand, from PowerShell, what auditing options your happy Site Collection Admins have selected. Now the fun part: go figure out how much space Auditing is taking up…

Open, Edit: 20
Open, Edit, Check Out: 23
Open, Edit, Check Out, Moving: 6167
Open, Edit, Check Out, Moving, Delete: 6687

All Lists, Libraries, and Sites: 8608
All On: 15295
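If you’d rather let PowerShell do the bit math for you, here’s a small sketch that decodes a raw audit mask back into its component flags. The values line up with the tables above (as far as I can tell, they come from the SPAuditMaskType enumeration); the $auditFlags variable name is just mine:

```powershell
# Decode a numeric audit mask into its component audit flags
$auditFlags = [ordered]@{
    CheckOut = 1; CheckIn = 2; View = 4; Delete = 8; Update = 16
    ProfileChange = 32; ChildDelete = 64; SchemaChange = 128
    SecurityChange = 256; Undelete = 512; Workflow = 1024
    Copy = 2048; Move = 4096; Search = 8192
}
$mask = 6167   # "Open, Edit, Check Out, Moving" from the list above
$auditFlags.GetEnumerator() | Where-Object { $mask -band $_.Value } | ForEach-Object { $_.Key }
# CheckOut, CheckIn, View, Update, Copy, Move
```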

Hope that helps…

Time Flies…

Sorry it’s been so long since my last post (hey, I’m not an MVP, I have a life and a real job and bills and a dog). News, news, news.

SharePoint 2016 Preview is out, and guess what? No more FIM import. You can use the AD Connector (and you should), or you can connect to an external FIM-based system. I’m taking full credit for this (as I should). For those of you still running or wanting to run on-prem SharePoint 2016, you will no longer know the fear of a corrupt Sync DB, or a messed-up incremental sync that requires a full sync (that won’t start). So, rejoice…

Hybrid SharePoint 2016/O365 looks really good. It would seem that most medium to large companies will probably look at a hybrid solution for SharePoint if they are not already 100% cloud. It makes a lot of sense.

I’ll try to be at this a little more often, but I do have a dog. Really….

AD Connector Reports in SharePoint 2013

Once you start using the AD Connector in SharePoint 2013 (What?! Not use FIM? Are you Mad?!), you’ll probably want to know how the imports are running. So you drop into Central Admin, open up the User Profile Service and click on…. Nothing. That’s right, there’s no feedback in Central Admin on the status of the last Profile Sync run when using the AD Connector.

What’s an Admin to do? (Like you didn’t see this coming)
Spin up the SharePoint 2013 Management Shell on the server where the Profile Service is running and fire off these commands.

Let’s say your import ran in the last 24 hours, and you’d like to know how many accounts were updated successfully. The message in ULS for a successful update looks like this:

User profile record of “domain\account” was changed by “profile service account”

This will get you all of those in the last 24 hours:

$dt = (Get-Date).AddDays(-1)
Get-SPLogEvent -StartTime $dt | ?{$_.Category -eq "User Profiles" -and $_.Message -like "User profile record of*"} | Measure-Object

This will show you the running Import job (if there is one):

$upa = Get-SPServiceApplication | ?{$_.TypeName -eq "User Profile Service Application"}
$upa.service | select -expand RunningJobs

If you’d like to see all of the history for each timer job associated with the User Profile Service:

$upa = Get-SPServiceApplication | ?{$_.TypeName -eq "User Profile Service Application"}
$upa.service.JobHistoryEntries | fl

If you want to see just errors, do something like this:

$upa.service.JobHistoryEntries | ?{$_.Status -eq "Failed"} | select StartTime,ErrorMessage

And if you (for some odd reason) want to see all of the timer jobs associated with the User Profile Service, you can always do this:


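Something like this should do it (a sketch reusing the $upa variable from the snippets above):

```powershell
# List every timer job definition hanging off the User Profile Service
$upa = Get-SPServiceApplication | ?{$_.TypeName -eq "User Profile Service Application"}
$upa.Service.JobDefinitions | Select Name,LastRunTime,Schedule
```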
The cool thing to do would be to set this up as a scheduled task and write the contents to a SharePoint list to make this much easier to review. Yeah yeah, I’ll get to that. Or you can do it… right?

Using PowerShell To Actually Do Good Things?

Yes, actually, believe it or not, you can. One of my harping points about PowerShell is that everyone starts by building scripts to install SharePoint. A good learning exercise, I suppose, but check CodePlex and you’ll understand why I think re-inventing that wheel is pointless. “So, Mr. SharePoint Bartender, what do you use PowerShell for in your SharePoint environments?” Glad you asked.

Let’s say you run a SharePoint installation with a lot of web apps and sites. I mean a lot. If you do, have you ever been asked “Hey, can you give me the top 25 sites in the SharePoint farm? In a format I can understand? And present to management to help out with your raise next year?” (At this point, you should be shaking your head yes, and remembering where you linked to this blog site).

Three parts:
1. A SharePoint list with some custom columns to store the data.
2. On the same site as the list, the SharePoint Chart Web Part, using the list as your datasource (I’m a huge fan of the giant doughnut chart, it certainly makes its presence known).
3. A PowerShell script you can run as a scheduled task that goes and gets this data and populates the list.

What’s that? Sure, I guess I can do that for you as well…

#Load the SharePoint cmdlets if they are missing
if (!(Get-PSSnapin | Where-Object {$_.Name -match "Microsoft.SharePoint.PowerShell"})) {
	Add-PSSnapin Microsoft.SharePoint.PowerShell
}

# This function will execute a Dispose() method on any variables that have one
function Dispose-All {
	Get-Variable -Exclude Runspace | Where-Object {$_.Value -is [System.IDisposable]} |
		ForEach-Object {$_.Value.Dispose()}
}

function Gather-Data {
	# Get all the webapps in the farm, except for the MySites, because no one
	# cares about MySites (you may need to change the URL)
	$sites = (Get-SPWebApplication | Where-Object {$_.URL -notlike "http://mysite*"} |
		Select-Object -ExpandProperty Sites | ForEach-Object {Get-SPSite $_.Url})
	ForEach ($site in $sites) {
		$web = (Get-SPWeb $site.Url)
		$monthUsage = ($web.GetUsageData("url", "lastMonth") | Sort-Object 'Total Hits')
		If ($monthUsage -ne $null) {
			$largeUse = $monthUsage[-1]
			$tHits = ($monthUsage | Measure-Object 'Total Hits' -Sum | Select-Object -ExpandProperty Sum)
			$usageData = @{
				waURL = $site.WebApplication.Url
				URL = $site.Url
				Title = $web.Title
				Day30 = $tHits
				MostActive = $largeUse.Page
				MostActiveCount = $largeUse.'Total Hits'
			}
			New-Object PSObject -Property $usageData
			$monthUsage = $null
			$tHits = $null
		}
		$web.Dispose()
	}
}

function List-Output {
	Param($siteInfo)
	$newItem = $spList.Items.Add()
	$newItem["Web Application"] = "$($siteInfo.waURL)"
	$newItem["Site Collection URL"] = "$($siteInfo.URL)"
	$newItem["Site Title"] = "$($siteInfo.Title)"
	$newItem["Last 30 Days Access Count"] = "$($siteInfo.Day30)"
	$newItem["Most Active File"] = "$($siteInfo.MostActive)"
	$newItem["File Access Count"] = "$($siteInfo.MostActiveCount)"
	$newItem.Update()
}

# Clear out the old report items, then re-grab the list
$spList = (Get-SPWeb -Identity "http://yoursite.you.com/sites/SiteReportSite").Lists["SiteReport"]
for ($i = $spList.ItemCount - 1; $i -ge 0; $i--) {$spList.Items.Delete($i)}
$spList = (Get-SPWeb -Identity "http://yoursite.you.com/sites/SiteReportSite").Lists["SiteReport"]

Gather-Data | Sort-Object MostActiveCount -Descending | Select-Object -First 25 | ForEach-Object {List-Output $_}

# Clean up any lingering SPWeb/SPSite objects
Dispose-All

What this script does is pull the Usage Stats for each site (you know, from the Usage DB that no one uses but that takes up a ton of SQL Server disk space) and drop the top 25 hitters into the SharePoint list, so you can point the Chart Web Part at that list as a datasource.

***** Some assembly required *****
Create a custom list with the following columns using the column type listed:
Web Application (Single line of text)
Site Collection URL (Single line of text)
Site Title (Single line of text)
Last 30 Days Access Count (Number)
Most Active File (Single line of text)
File Access Count (Number)

Again, use the built-in Chart Web Part and point to this list as a datasource. The script should run once a day at least, so the data will always be fresh. Once it’s complete, amaze and impress your friends with your giant doughnut chart showing the top 25 accessed sites.
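If clicking through list settings isn’t your thing, here’s a sketch of building the list from PowerShell instead. The site URL and list name are the same hypothetical ones used in the script above:

```powershell
# Sketch: create the SiteReport list and its columns from PowerShell
$web = Get-SPWeb "http://yoursite.you.com/sites/SiteReportSite"
$listId = $web.Lists.Add("SiteReport", "Top 25 site usage", [Microsoft.SharePoint.SPListTemplateType]::GenericList)
$list = $web.Lists[$listId]
[void]$list.Fields.Add("Web Application", [Microsoft.SharePoint.SPFieldType]::Text, $false)
[void]$list.Fields.Add("Site Collection URL", [Microsoft.SharePoint.SPFieldType]::Text, $false)
[void]$list.Fields.Add("Site Title", [Microsoft.SharePoint.SPFieldType]::Text, $false)
[void]$list.Fields.Add("Last 30 Days Access Count", [Microsoft.SharePoint.SPFieldType]::Number, $false)
[void]$list.Fields.Add("Most Active File", [Microsoft.SharePoint.SPFieldType]::Text, $false)
[void]$list.Fields.Add("File Access Count", [Microsoft.SharePoint.SPFieldType]::Number, $false)
$list.Update()
$web.Dispose()
```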

Searching ULS Logs in Multi-Server Farms

If you have anywhere from 5 to a lot more SharePoint servers in your farm, and you need to locate a specific error in the ULS logs, well, it can be a pain. I’ve written a PowerShell script that will let you specify which column the error is in, when the error happened, and the general text of the error. The script will take your input and search the ULS logs on each server in the farm. The normal cautions about long searches apply, but you should already know that.

The output is 2 files: one is a standard ULS log file that ULSViewer can open; the other is a short file containing just a few fields from each matching ULS entry.

Last thing, please feel free to make fun of my PowerShell scripting abilities. Everyone else I know does…

#Load up the SharePoint cmdlets if they are missing
if (!(Get-PSSnapin | Where-Object {$_.Name -match "Microsoft.SharePoint.PowerShell"}))
{Add-PSSnapin Microsoft.SharePoint.PowerShell}

$FarmInfo = Get-SPFarm
$spLogLocation = (Get-SPDiagnosticConfig | Select-Object -ExpandProperty LogLocation)
$spLogLocation = "\" + ($spLogLocation -replace (":","$"))

$outputFile = "ULSSearchResults.log"
$fulloutputFile = "ULSViewerVersion.log"

if (Test-Path $outputFile) {Remove-Item $outputFile -Force}
if (Test-Path $fulloutputFile) {Remove-Item $fulloutputFile -Force}

$DateCheck = Get-Date
Add-Content $outputFile "ULS Error Search - Started $DateCheck"
Add-Content $outputFile "`r"

Write-Host @"
Choose Which ULS Column To Search In
A:  Correlation ID
B:  Message
C:  Category
D:  EventID
E:  Level
F:  All Areas (Caution, this will take a long time to complete)
"@

$q1 = Read-Host "Column"
$q1 = $q1.ToUpper()

If($q1 -eq "A"){$logcat = "Correlation"}
 elseif($q1 -eq "B"){$logcat = "Message"}
 elseif($q1 -eq "C"){$logcat = "Category"}
 elseif($q1 -eq "D"){$logcat = "EventID"}
 elseif($q1 -eq "E"){$logcat = "Level"}
 elseif($q1 -eq "F"){$logcat = "ALL"}
 else{
	Write-Host "`r"
	Write-Host "Selection Is Outside Of Accepted Values.  Rerun Script And Make A Valid Selection." -ForegroundColor "Yellow"
	exit
 }

Write-Host @"
Choose A TimeFrame To Search In
A:  Last 10 Minutes
B:  Last 30 Minutes
C:  Last 60 Minutes
D:  Last 12 Hours
E:  All Log Files
"@

$q2 = Read-Host "TimeFrame"
$q2 = $q2.ToUpper()

If($q2 -eq "A"){$timcat = "10"}
 elseif($q2 -eq "B"){$timcat = "30"}
 elseif($q2 -eq "C"){$timcat = "60"}
 elseif($q2 -eq "D"){$timcat = "720"}
 elseif($q2 -eq "E"){$timcat = "ALL"}
 else{
	Write-Host "`r"
	Write-Host "Selection Is Outside Of Accepted Values.  Rerun Script And Make A Valid Selection." -ForegroundColor "Yellow"
	exit
 }

If($timcat -eq "ALL"){$tframe = "All Log Files"}
 else{$tframe = $timcat + " Minutes Ago"}

Write-Host @"
Enter or Paste In The Text To Search For
"@

$errMsg = Read-Host "Error"

Add-Content $outputFile @"
*****  Begin ULS Log Search For Error:  $errMsg
Checking In Column: $logcat.  Within The Timeframe Of: $tframe
"@

$timeframe = $null
$FarmServers = $FarmInfo.Servers | Where-Object {$_.Role -ne "Invalid"}
If($timcat -ne "ALL"){$timeframe = (Get-Date).AddMinutes(-$timcat)}
Write-Host "`r"

ForEach($spServer in $FarmServers){
	$logPath = "\\$($spServer.Address)" + $spLogLocation
	Write-Host "Processing Log Files At:  $logPath"
	If($timcat -ne "ALL"){
		If($logcat -ne "ALL"){$logResults = Get-SPLogEvent -StartTime $timeframe -Directory $logPath | Where-Object{$_.$logcat -like "*$errMsg*"}}
		else{$logResults = Get-SPLogEvent -StartTime $timeframe -Directory $logPath | Where-Object{$_ -like "*$errMsg*"}}
	}
	else{
		If($logcat -ne "ALL"){$logResults = Get-SPLogEvent -Directory $logPath | Where-Object{$_.$logcat -like "*$errMsg*"}}
		else{$logResults = Get-SPLogEvent -Directory $logPath | Where-Object{$_ -like "*$errMsg*"}}
	}
	Add-Content $fulloutputFile $logResults
	ForEach($loggedItem in $logResults){
		$shortMessage = $loggedItem.Message
		If($shortMessage -ne $null){
			If($shortMessage.Length -gt 125){$shortMessage = $shortMessage.SubString(0,125)}
			Add-Content $outputFile "$($spServer.Address)`t$($loggedItem.Timestamp)`t$($loggedItem.EventID)`t$($loggedItem.Category)`t$($loggedItem.Level)`t$($loggedItem.Correlation)`t$shortMessage"
		}
	}
}

Write-Host @"
ULS Log Search Complete
Simple output file:  $outputFile
ULSViewer compatible file:  $fulloutputFile
"@