• Can I make changes to the Reporting DB?

    A customer asked me today if he could make changes to the Reporting database in his Project Server 2010 environment.  The short answer is, yes, you can create entities in the Reporting database.  Below is a scrubbed version of the conversation.

    Customer: I have a request from the business to create one or more custom tables in the Project Server Reporting Database and run reports against those tables using <Big Reporting Application> from <Big Application> Company.  Do you have any information about best practices for making changes or what is allowed to be changed in the Project Server 2010 Reporting database? 

    Me: Add, but don’t modify or delete any entities already present, and thoroughly test your additions in a test environment.  Beyond that, the only best practices I would offer relate to designing SQL tables and other entities, and I would reach out to a SQL PFE for those. 

    Customer: Will Microsoft still support us if we create a custom SQL Table in the Project Server Reporting Database?  

    Me: Yes.  Ideally, we are talking about only a few tables.  If we are talking about using <Big Reporting Application> with many, many tables, I would recommend you consider a separate database, which carries its own overhead but gives you less exposure to upgrade risks. 

    Customer: Can Cumulative Updates or SP2 Updates impact the custom tables in the database?

    Me: The risk of an impact on custom tables always exists.  It can be mitigated by thoroughly testing any CUs or SPs in a test environment with the same configuration as production.  Good backups taken prior to an upgrade can also alleviate concerns.  Please see this link for information on adding tables, entities, and other objects to the Reporting database: http://technet.microsoft.com/en-us/library/ff686786.aspx#section5.  I also found this link: http://msdn.microsoft.com/en-US/library/ms504195(v=office.14).aspx#pj14_Programmability_Databases.
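
    If you do add custom tables, one way to keep them clearly separated from the Project Server objects is to create them in their own schema.  Below is a minimal sketch using Invoke-Sqlcmd from the SqlServer PowerShell module; the instance name, database name, schema, and table are my own placeholders, not anything Project Server ships with, so adjust them for your environment and test first.

    ```powershell
    # Sketch only: keep custom objects in a dedicated schema so they are
    # easy to identify, script out before a CU/SP, and re-create afterwards.
    # "custom", "PortfolioMetrics", and the instance/database names below
    # are hypothetical placeholders.
    $sql = @"
    IF NOT EXISTS (SELECT 1 FROM sys.schemas WHERE name = 'custom')
        EXEC('CREATE SCHEMA custom');

    IF OBJECT_ID('custom.PortfolioMetrics', 'U') IS NULL
        CREATE TABLE custom.PortfolioMetrics (
            ProjectUID  uniqueidentifier NOT NULL PRIMARY KEY,
            MetricName  nvarchar(255)    NOT NULL,
            MetricValue decimal(18, 4)   NULL
        );
    "@
    Invoke-Sqlcmd -ServerInstance "SQLSERVER\INSTANCE" -Database "ProjectServer_Reporting" -Query $sql
    ```

    A dedicated schema also makes the pre-upgrade backup story simpler: you can script out everything in that one schema and restore it if a CU or SP ever touches it.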

  • Baselining Best Practices

    The following MPUG article by Kenneth Steiness is a good introduction to the basics of baselining.  A baseline is a snapshot of your project at a point in time, and you can create up to 11 of them.  Baselines provide a benchmark to compare against and are the basis for variances.  A variance shows how far you were, or are, off the original estimate, so you can take corrective action.  This can help you avoid delays, cost overruns, resource issues, and much, much more.

     

    Here's the article: http://www.mpug.com/articles/baselining-best-practices/


    As you go through the article, under the Maintaining the Baseline section, look at how he uses inactive tasks.
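
    As a quick worked example of what a variance gives you (my own illustration, not from the article): if a task was baselined to finish on April 1 but the current schedule shows April 8, the finish variance is 7 days.

    ```powershell
    # My own illustration: a finish variance is just the scheduled finish
    # minus the baseline finish captured in the snapshot.
    $baselineFinish  = Get-Date "2014-04-01"
    $scheduledFinish = Get-Date "2014-04-08"
    $variance = ($scheduledFinish - $baselineFinish).Days
    "Finish variance: $variance day(s)"   # 7 day(s) behind the baseline
    ```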

  • Add AD group to all PWA project sites with Read Only permission using PowerShell

    Until I can figure out how to cross-post between the Project Server PFE Team Blog and my own blog, I will double post.

    My customer wanted to be able to provide his Portfolio Managers with read only access to all project sites in PWA.  Because an AD group for Portfolio Managers already exists, we can use that AD group in the script and add it to all project sites under PWA with Read Only permission.  Credit to Gerald Parish for helping me with the code.

    As always, test this code in your test environment.

    <#
    #################
     Disclaimer
    #################
     This software (or sample code or query) is not supported under any Microsoft standard support program or service.
     The software, sample code, or query is provided AS IS without warranty of any kind.
     Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose.
     The entire risk arising out of the use or performance of the software and documentation remains with you.
     In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the software be liable for any damages whatsoever
     (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss)
     arising out of the use of or inability to use the software or documentation, even if Microsoft has been advised of the possibility of such damages.
    #>
    Exit

    <#
    #################
    Instructions
    #################
    Review the disclaimer above.  Remove or comment out the "Exit" in order to be able to run the script.
    Go to the last line of this PowerShell script file. 
    In that line, change the -site value to the PWA URL. 
    Also, change the -newuser value to be the AD username or group that you want to provide Read Only permissions for.

    For example,...
     AddReadOnlyUsers -site http://projects.contoso.com/pwa -newuser "contoso\Financial Applications Group"
    becomes
     AddReadOnlyUsers -site http://projects.yourdomain.com/pwa -newuser "yourdomain\Financial Applications Group"

    Run this entire script from the SharePoint 2010 Management Shell. 
    The log file will be created in the same directory you run the .ps1 from.
    #>


    Function Local:AddReadOnlyUsers
    {

    [CmdletBinding()]
    Param(
        [parameter(Mandatory=$true)][string]$Site,
        [parameter(Mandatory=$true)][string]$NewUser
     )

    $global:Logfile = ".\" +[Datetime]::Now.ToString("MMddyyyy_hhmmss_tt") + "_User_log.txt"
    try {Start-Transcript -Path $Logfile -ErrorAction Stop } Catch { $Error[0] }

    $permLevel = "Read"
    $SiteCollection = get-spsite $Site
    $AllWebs = $SiteCollection.AllWebs
    foreach ($webSite in $AllWebs) {
       Write-Host -NoNewline "$($webSite.Url) Adding $($NewUser)... "
       $oNewuser = $webSite.EnsureUser($NewUser)  
       if ($oNewuser -ne $null) {
          Write-Host -ForegroundColor Green "Complete"
          Write-Host -NoNewline "Granting $($permLevel) to $($NewUser)... "
          $roleDef = $webSite.RoleDefinitions[$permLevel]
          $RoleAss = New-Object Microsoft.SharePoint.SPRoleAssignment($oNewuser)
          $RoleAss.RoleDefinitionBindings.Add($roleDef)
          $webSite.RoleAssignments.Add($RoleAss)
          $webSite.Update()
          Write-Host "Complete"
       }
       else { Write-Host -ForegroundColor Red "Error"; exit }
       # Dispose each SPWeb after use to release the memory it holds
       $webSite.Dispose()
    }
    $SiteCollection.Dispose()

    Stop-Transcript
    }

    AddReadOnlyUsers -site http://projects.contoso.com/pwa -newuser "contoso\Financial Applications Group"
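
    After the script finishes, one way to spot-check the result is to read the role assignments back from one of the project sites.  This is a sketch that assumes the SharePoint 2010 Management Shell and reuses the sample contoso URL from above; the site name is a hypothetical placeholder.

    ```powershell
    # Spot-check: list each member and its role definitions on one project
    # site to confirm the AD group now has Read.  "SomeProjectSite" is a
    # hypothetical placeholder for one of your actual project site URLs.
    $web = Get-SPWeb "http://projects.contoso.com/pwa/SomeProjectSite"
    foreach ($ra in $web.RoleAssignments) {
        $roles = ($ra.RoleDefinitionBindings | ForEach-Object { $_.Name }) -join ", "
        "{0} : {1}" -f $ra.Member.Name, $roles
    }
    $web.Dispose()
    ```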

  • SPS Farm Report

    Today, I learned about a PowerShell script that is already part of some of the diagnostic tools I use in my day to day work and I just didn't know it was in there.  It's freely available online so I wanted to share.

    You can go to http://spsfarmreport.codeplex.com/ and download the package to get a full report on your SharePoint farm.

    The ReadMe.txt has important information about what will be needed to successfully run the tool.  Because I'm using SharePoint 2013 as a base for my Project Server 2013 installation, I used the PowerShell cmdlet Get-ExecutionPolicy to learn the execution policy was already set to Unrestricted on my test farm.

    I downloaded the package, then moved the o15 files to a specially created folder on my Project Server 2013 test server called spsfarmreport.  The really neat thing about this tool is that you can run it on only one server in the farm and it will gather results from all of them.

    So, I ran .\2013SPSFarmReport.ps1 from a PowerShell command prompt window and this was the output...

    PS C:\spsfarmreport> .\2013SPSFarmReport.ps1
    o15WriteInitialXML
    o15farmConfig
    o15WriteFarmGenSettings
    o15enumServers
    o15writeServers
    o15enumProdVersions
    o15writeProdVersions2
    o15enumFeatures
    o15writeFeatures
    o15enumSolutions
    o15writeSolutions
    o15enumSvcApps
    o15enumSPSearchServiceApps
    o15enumSPSearchService
    o15enumHostControllers
    o15enumSearchActiveTopologies
    o15enumSearchConfigAdminComponents
    o15enumSearchConfigLinkStores
    o15enumSearchConfigCrawlDatabases
    o15enumSearchConfigCrawlRules
    o15enumSearchConfigQuerySiteSettings
    o15enumSearchConfigContentSources
    o15writeServiceApps
    o15enumSPServiceApplicationPools
    o15writeSPServiceApplicationPools
    o15enumSPServiceApplicationProxies
    o15writeSPServiceApplicationProxies
    o15enumSPServiceApplicationProxyGroups
    o15writeSPServiceApplicationProxyGroups
    o15enumWebApps
    o15writeWebApps
    o15writeAAMsnAPs
    o15enumContentDBs
    o15writeContentDBs
    o15enumCDConfig
    o15writeCDConfig
    o15enumHealthReport
    o15writeHealthReport
    o15enumTimerJobs
    o15writeTimerJobs
    o15WriteEndXML
    PS C:\spsfarmreport>

    So where did the output file actually go?  It's not in my special spsfarmreport folder.

    The readme file has the answer: Run the "[Environment]::CurrentDirectory" command to know where the output XML is written to.

    I did this and found my output was being dropped in the system32 directory.

    PS C:\spsfarmreport> [Environment]::CurrentDirectory
    C:\windows\system32
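
    If you would rather have the XML land in the folder you are running from, one workaround (my own, not from the ReadMe) is to sync the .NET process working directory to your PowerShell location before running the script.  Set-Location changes only the PowerShell provider location, not what [Environment]::CurrentDirectory reports.

    ```powershell
    # PowerShell's Set-Location does not update the .NET process working
    # directory, which is what [Environment]::CurrentDirectory reflects.
    # Syncing the two makes the report XML land in your current folder.
    [Environment]::CurrentDirectory = (Get-Location).ProviderPath
    [Environment]::CurrentDirectory   # now matches your shell location
    ```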

    I copied these files to my spsfarmreport folder where the SPSFarmReport.xslt is and was able to open the .xml file using IE.

    My report is pretty boring because I only have one SharePoint Server in this farm, but you get the idea. 

    Here's a screenshot of the output I see.

    Happy hunting!

  • Pretty darned useful - "Merge-SPLogFile" cmdlet

    My favorite ULS logging PowerShell command is described here:  http://technet.microsoft.com/en-us/library/ff607721.aspx.  The Merge-SPLogFile cmdlet pulls trace log entries from all farm computers into a single log file on the local computer, and it works in Project Server 2010 and 2013, too. 

    I can hand this link to customers and they can run the Merge-SPLogFile command from PowerShell.  The result is a single output file they can easily upload for me to review.  The real benefit to this command, as someone who does troubleshooting remotely, is that I get all ULS entries from all the servers combined.  Pretty neat.

    Merge-SPLogFile -Path "c:\mergelog\log.txt" -Overwrite -level High -StartTime "03/25/2014 12:00" -EndTime "03/25/2014 12:59"

    Edit: You can also use this command to pull just the entries related to a particular correlation ID, which is very useful, because the command will pull only those ULS log entries related to the error you are seeing.  Just replace the Correlation below with your own from whatever error message you see.

    Merge-SPLogFile -Path C:\mergelog\log.txt -Correlation 29b5c483-c48b-4ef2-b4b3-f5e29f635d31

    This creates a much smaller output file to search through and you know that ALL of the entries in the log file are related to the particular problem.
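
    The cmdlet also accepts other filters that can be combined with the time window, such as -Area and -Message.  This example is a sketch; check Get-Help Merge-SPLogFile on your farm for the exact parameters your version supports.

    ```powershell
    # Sketch: combine filters to pull only Project Server entries whose
    # message mentions "queue", within a one-hour window.
    Merge-SPLogFile -Path "C:\mergelog\queue.txt" -Overwrite `
        -Area "Project Server" -Message "*queue*" `
        -StartTime "03/25/2014 12:00" -EndTime "03/25/2014 13:00"
    ```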