Thursday, July 19, 2012

Backup and Restore SharePoint 2010 Site Collection with PowerShell

One of the most annoying processes in SharePoint is backing up a site collection.  There is a built-in tool you can use, but it is kinda clunky and can take a while to perform the backup.  I used it once and was not impressed.  PowerShell provides a quick and easy way to do site collection backups.  My favorite part is that you can do the entire backup with one line of code.  One thing to be careful of: you need to use the SharePoint 2010 Management Shell, not the regular Windows PowerShell console.

*** Note that performing the backup will put your site into read-only mode for the duration of the backup.

  1. Click Start
  2. Go to All Programs
  3. Go to Microsoft SharePoint 2010 Products
  4. Open SharePoint 2010 Management Shell
A PowerShell command prompt will appear; adjust the following command to fit the backup for your site.

Backup-SPSite -Identity SiteCollectionURLHere -Path BackupFilePathHere [-Force] [-NoSiteLock] [-UseSqlSnapshot] [-Verbose]

I recommend creating a folder where you can place these backups before starting the backup process so they aren’t just chillin on the C:\ drive of your SharePoint server; just a thought.  Here is a little explanation of the optional parameters shown in brackets [ ]:
  • Force – Include this if you want to overwrite an existing backup with the same name
  • NoSiteLock – Prevents the site from going into read-only mode while the backup is being taken.  A small warning: if someone changes content on the site while the backup is being created, Microsoft says it “might lead to possible data corruption”
  • UseSqlSnapshot – A database snapshot is taken before the backup begins and the backup is done from the snapshot.  The advantage is that changes can be made to the site while the backup process is running without fear of corruption.  The snapshot is deleted automatically when the backup is completed.  You don’t need to specify the -NoSiteLock parameter when using this method
Here is a simple example of what the script may look like if you want to just do a backup:
Backup-SPSite -Identity http://servername/sites/BISite -Path D:\SharePointBackups\BISite\07-19-2012.bak
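If you want to back up every site collection in the farm rather than one at a time, a loop like the following should work.  This is only a sketch; the D:\SharePointBackups folder and the file-naming scheme are made-up examples:

```powershell
# Back up every site collection in the farm to a dated .bak file.
# Assumes the D:\SharePointBackups folder already exists.
$folder = "D:\SharePointBackups"
$date = Get-Date -Format "MM-dd-yyyy"
Get-SPSite -Limit All | ForEach-Object {
    # Build a file name from the site URL, e.g. http_servername_sites_BISite
    $name = $_.Url -replace "[:/]+", "_"
    Backup-SPSite -Identity $_.Url -Path "$folder\$name-$date.bak" -Force
}
```

Remember this still locks each site read-only while its backup runs, unless you add -NoSiteLock or -UseSqlSnapshot as described above.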

To do a site restore, the syntax is almost as easy.  You will need to use the same SharePoint 2010 Management Shell as for the backup.

Restore-SPSite -Identity SiteCollectionURLHere -Path BackupFilePathHere [-DatabaseServer DatabaseServerNameHere] [-DatabaseName ContentDatabaseNameHere] [-HostHeader HostHeaderHere] [-Force] [-GradualDelete] [-Verbose]
  • DatabaseServer – Specifies the server for the content database
  • DatabaseName – Specifies the name of the content database
  • HostHeader – URL of the Web application that will hold the host-named site collection
  • Force – Overwrites the site collection if it already exists
  • GradualDelete – Recommended for site collections over 1 GB in size; existing data is marked as deleted and gradually removed over time by a timer job rather than all at once, to reduce the performance hit of deleting large amounts of data
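For example, restoring the backup taken earlier over the existing site collection might look like this (the URL and path match the backup example above; adjust them for your environment):

```powershell
# Restore the site collection from the backup file, overwriting the existing site.
Restore-SPSite -Identity http://servername/sites/BISite `
               -Path D:\SharePointBackups\BISite\07-19-2012.bak -Force
```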
Hopefully this will save you some time and a headache, as it did me!  As much as I don’t like coding and tend to stay away from PowerShell, I will admit it is a huge time saver and super easy in the SharePoint site collection backup/restore world.

If you still run into backup and restore problems, or get errors related to SharePoint versioning, follow this post: http://sharepointshah.blogspot.in/2012/07/moving-site-from-one-port-to-another.html

You can even auto-schedule the backup of your farm/site. Check this article: http://sharepointshah.blogspot.in/2012/07/schedule-sharepoint-2010-farm-backup.html

Regards,
Viral Shah

Moving site from one port to another using Export & Import Command

Today I faced an issue while restoring a site collection from one server to another: it was giving me an error about different versions of the SharePoint application.

The following commands helped me solve the issue. Run them from the SharePoint 2010 Management Shell.
To access the SharePoint 2010 Management Shell:
Go to Start -> Microsoft SharePoint 2010 Products -> SharePoint 2010 Management Shell.

To export a list:
Export-SPWeb -Identity http://mossapp/picturelibrary -Path E:\viral\Backup\exportsites.cmp


To import the list:
Import-SPWeb -Identity http://mossapp:8008 -Path E:\viral\Backup\exportsites.cmp
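If you also need item-level permissions and version history to survive the move, Export-SPWeb and Import-SPWeb take a couple of extra switches.  A sketch, reusing the same example URLs as above:

```powershell
# Export with all item versions and item-level permissions included.
Export-SPWeb -Identity http://mossapp/picturelibrary `
             -Path E:\viral\Backup\exportsites.cmp `
             -IncludeVersions All -IncludeUserSecurity

# Import, keeping the exported permissions.
Import-SPWeb -Identity http://mossapp:8008 `
             -Path E:\viral\Backup\exportsites.cmp `
             -IncludeUserSecurity
```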

Hope this helps!!

Regards,
Viral Shah

Tuesday, July 10, 2012

Best Practice to Differentiate Accounts While Configuring SharePoint 2010

Hi guys, hope you are doing great.

Rather than using the same account for all the required services, I have tried to differentiate the accounts, and I have mentioned the purpose of each one as well. This improves scalability and reduces the complexity and vulnerability of the overall farm.

  1. SQL Server service account – Runs SQL Server processes.
  2. Setup user account – Runs installation and the SharePoint Products and Technologies Configuration Wizard.
  3. Server farm account / database access account – Configures and manages the server farm; is the application pool identity for the Central Administration website; runs the SharePoint Foundation timer service.
  4. Search service account – Runs the search service.
  5. Content access account – Used to access content sources for crawling. Defaults to the search service account.
  6. Application pool account – Used for running the IIS web applications that host SharePoint site collections.

Monday, July 9, 2012

Understanding Databases Created During Installation

After installation, you will see several databases created in SQL Server that
will need to be added to your SQL Server maintenance plan:

SharePoint Configuration The SharePoint configuration database (config DB) holds all of your server farm configuration data and is akin to the Windows Server system registry. Any server that uses this installation’s config DB is considered a member of the same server farm.

Central Administration content Because the Central Administration Web application is a custom site collection in a dedicated Web application, it has a corresponding content database. Rebuilding this Web  application is not a simple task and should be avoided by correctly backing up the server for future  restoration.

Content database Each Web application has at least one corresponding content database. If you ran the  Farm Configuration Wizard, a Web application was created for you at the URL of your server and it has a corresponding content database.

Business Connectivity Services DB This database is used by Business Connectivity Services (BCS), and by default it will be named Bdc_Service_DB_<GUID>.

SharePoint Foundation Logging Used for logging purposes, it is named WSS_Logging by default.

Secure Store Service DB Provides storage and mapping of credentials.

Search Administration Database Formerly the SSP database, it hosts the Search application  configuration and access control list (ACL) used in crawling content.

Search Property Database Stores crawled properties associated with the crawled data.

Crawl Database Formerly the Search database in SharePoint Server 2007, it hosts the crawled data and  manages the crawling process.

Web Analytics Staging Database Stores raw, unaggregated Fact data.

Web Analytics Reporting Database Stores aggregate data for reporting.

User Profile Database Stores and manages user profile information.

Profile Synchronization Database Stores configuration and staging data for user-profile synchronization.

Social Tagging Database Stores social tagging data along with the associated URL.

State Database Used for storing temporary session state information for SharePoint components.

Word Automation Services Database Supports Word Automation Services in performing Word-related tasks on the server.

Managed Metadata Database Stores managed metadata and content types.

Application Registry Service Database Supports the Application Registry Service.

Regards,
Viral Shah

Thursday, July 5, 2012

What is the SharePoint 2010 "14 hive" directory?

If you are familiar with SharePoint 2007, you must be aware of the "12 hive" directory. In SharePoint 2010, the "12 hive" has been replaced by the "14 hive" directory. In most cases this is the default path for SharePoint files.

Following are some of the folders in the "14 hive" directory:
1) Program Files\Common files\Microsoft Shared\Web Server Extensions\14 -
This directory is the installation directory for core SharePoint Server files.

2) Program Files\Common files\Microsoft Shared\Web Server Extensions\14\ADMISAPI -

This directory contains the soap services for Central Administration. If this directory is altered, remote site creation and other methods exposed in the service will not function correctly.
3) Program Files\Common files\Microsoft Shared\Web Server Extensions\14\CONFIG -
This directory contains files used to extend IIS Web sites with SharePoint Server. If this directory or its contents are altered, Web application provisioning will not function correctly.

4) Program Files\Common files\Microsoft Shared\Web Server Extensions\14\LOGS -

This directory contains setup and run-time tracing logs.
Following are some new folders added in the "14 hive" directory:
1) Program Files\Common files\Microsoft Shared\Web Server Extensions\14\Policy -
2) Program Files\Common files\Microsoft Shared\Web Server Extensions\14\UserCode -
This directory contains files used to support your sandboxed solutions.
3) Program Files\Common files\Microsoft Shared\Web Server Extensions\14\WebClients -
This directory contains files related to the new Client Object Model.
4) Program Files\Common files\Microsoft Shared\Web Server Extensions\14\WebServices -
This directory contains the new WCF (.svc) service files.
 Hope this helps clear up the confusion!
Regards,
Viral Shah

Behind the scenes when we create a new Web Application in SharePoint

When we create a new Web Application in SharePoint, we see a spinning wheel and the text "Processing". It may be interesting to know what events/actions actually occur in the meantime.
The following actions happen behind the scenes:
  • Creates a unique entry in the SharePoint configuration DB for the Web Application and assigns a GUID to that entry.
  • Creates and configures a new site in IIS.
  • Creates and configures a new IIS application pool.
  • Configures authentication protocol and encryption settings.
  • Assigns a default alternate access mapping for the Web Application.
  • Creates the first content database for the Web Application.
  • Associates a search service with the Web Application.
  • Assigns a name to the Web Application that appears in the Web Application list in SharePoint Central Administration.
  • Assigns general settings to the Web Application, such as maximum file upload size and default time zone.
  • Updates the web.config file with entries for SharePoint's custom HTTP Modules and Handlers.
  • Creates virtual directories for SharePoint web services and the SharePoint layouts folder.
After creating a Web Application in SharePoint, the web site is actually not created yet: if you try to access the Web Application using its URL, you will get a "Page cannot be displayed" error. At this point a web application has been created and all the mandatory configuration has been done. The next step is to create a Site Collection using a particular Site Definition; only then is the actual site created, and you will be able to access it using the URL of the Site Collection.
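Both steps can also be done from the SharePoint 2010 Management Shell. This is only a sketch; the names, port, accounts, and template below are made-up examples:

```powershell
# Step 1: create the web application (the work behind the "Processing" wheel).
New-SPWebApplication -Name "Intranet" -Port 80 `
    -ApplicationPool "IntranetAppPool" `
    -ApplicationPoolAccount (Get-SPManagedAccount "DOMAIN\spAppPool") `
    -DatabaseName "WSS_Content_Intranet"

# Step 2: create the root site collection so the URL actually serves a page.
# STS#0 is the Team Site template.
New-SPSite -Url "http://servername" -Name "Intranet Home" `
    -OwnerAlias "DOMAIN\spAdmin" -Template "STS#0"
```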
 Hope you find it interesting...!
Regards,
Viral Shah

Wednesday, July 4, 2012

Schedule SharePoint 2010 farm backup and simple backup

Problem:

I recently setup a new SharePoint environment to host a single site collection. Even though there was not a lot of content on this small farm, the environment needed to be highly available and we needed the ability to lose no more than one hour worth of data in the case of a catastrophic failure. Retaining backups for one week was plenty for our needs. Users of the site were located globally making the site's least utilized time in the evenings (PST) with the biggest window on Saturday.
Based on this and the idea of keeping the process easy and lightweight as possible (i.e. out-of-the-SharePoint-box and running from a single location) I decided to go this route:
  1. Full farm backup on Saturday afternoon
  2. Differential farm backup on Sunday-Friday evenings
  3. Hourly differential backups on content databases
  4. Clean out last week's Full and Differential backups early Monday morning
Problem was, I didn't know how I was going to accomplish this and then fully automate it. Breaking the problem down, I determined I had three steps to make this work:
  1. Run backups via PowerShell
  2. Run these PowerShell scripts from Task Scheduler
  3. Cleanup backups

Solution:
  1. Run backups via PowerShell
    • Using the TechNet documentation for reference I used the following one-liner for Full and Differential farm backup:
              Backup-SPFarm -Directory <backupFolder> -BackupMethod {Full | Differential} [-Verbose]
    • This is what I used for the hourly content db backups:
              Backup-SPFarm -Directory <backupFolder> -BackupMethod Differential -Item <contentDBName>
    • I took these commands and placed them into individual .ps1 files (FullFarmBackup.ps1, DiffFarmBackup.ps1 and HourlyContentDBBackup.ps1) and put them into E:\backupScripts. (If you are looking for simple backup, your search ends here.)
  2. Run these PowerShell backups from Task Scheduler
        Getting Task Scheduler to run a PowerShell script with the SharePoint snap-in was a lesson in trial and error. Ultimately this is what I did to get it working.
    • In all my .ps1 scripts I need to have the SharePoint snap-in loaded. To do this I added the line "Add-PSSnapin Microsoft.SharePoint.PowerShell" at the top:
          Add-PSSnapin Microsoft.SharePoint.PowerShell
          Backup-SPFarm -Directory <backupFolder> -BackupMethod {Full | Differential} [-Verbose]
    • Under the Action tab of the task I created:
      • Program/Script = powershell
      • Add Argument (optional) = E:\backupScripts\FullFarmBackup.ps1
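For reference, the equivalent task can also be created from a command prompt with schtasks.exe. A sketch; the task name, schedule, account, and script path are examples:

```powershell
# Create a weekly task that runs the full farm backup every Saturday at 4 PM.
# /RP * prompts for the run-as account's password when the task is created.
schtasks /Create /TN "SP Full Farm Backup" `
    /TR "powershell -File E:\backupScripts\FullFarmBackup.ps1" `
    /SC WEEKLY /D SAT /ST 16:00 /RU DOMAIN\spFarmAdmin /RP *
```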

  3. Automate the cleanup of the above backups
       
    Because I didn't have infinite disk space and didn't need backups older than one week, I needed a means to clean up old backups. Running a quick Bing search, I ran across a blog post on 'Imperfect IT' that did just this. Here is a copy of what was posted on that site, which I placed into Task Scheduler:

        # Full path to spbrtoc.xml (the SharePoint backup table of contents)
        $spbrtoc = "<backupFolder>\spbrtoc.xml"
        # Days of backups to keep after cleanup.
        $days = 7
        # Import the SharePoint backup report xml file
        [xml]$sp = Get-Content $spbrtoc
        # Find the old backups in spbrtoc.xml
        $old = $sp.SPBackupRestoreHistory.SPHistoryObject |
            Where-Object { $_.SPStartTime -lt ((Get-Date).AddDays(-$days)) }
        if ($null -eq $old) { Write-Host "No reports of backups older than $days days found in spbrtoc.xml.`nspbrtoc.xml isn't changed and no files are removed.`n" ; break }
        # Delete the old backups from the SharePoint backup report xml file
        $old | ForEach-Object { $sp.SPBackupRestoreHistory.RemoveChild($_) }
        # Delete the physical folders in which the old backups were located
        $old | ForEach-Object { Remove-Item $_.SPBackupDirectory -Recurse }
        # Save the new SharePoint backup report xml file
        $sp.Save($spbrtoc)
        Write-Host "Backup(s) entries older than $days days are removed from spbrtoc.xml and hard disk."
I now have 2 scheduled tasks running (Full farm and cleanup) that provide me the ability to fulfill our simple backup strategy. 

Regards,
Viral Shah

Tuesday, July 3, 2012

"The following data source cannot be used because PerformancePoint Services is not configured correctly" error during installation of PerformancePoint Services 2010

If you’re setting up a fresh installation of PerformancePoint Services 2010, you might think that all has gone well, only to see the above error in Dashboard Designer once you try to start creating scorecards and reports. This is actually quite common; it seems there are a few extra steps to do before you can start using PPS 2010.

1) Set up the Unattended Service Account. 

Go to the Central Administration page (in SharePoint). Under Application Management, click Manage Service Applications. Then click PerformancePoint Service Application, then PerformancePoint Service Application Settings. We need to configure the Unattended Service Account. This is the account that PPS will use to connect to the data source (unless you use Per-User Identity, but that’s another story). If you’ve never seen this page before, there are some interesting options there. Once you set this up, hit Apply. It might be successful, or it might give you another error…

2) The Unattended Service Account cannot be set for the service application.
The Secure Store Service key might not have been generated or properly refreshed
after the service was provisioned.

Now I’m not an expert on the Secure Store Service, but PerformancePoint uses it to store the password for the Unattended Service Account. If you get this error, you need to check that a key has been generated for the Secure Store Service. Once again, go to the Central Administration page, and under Application Management, click Manage Service Applications. Now click Secure Store Service, then click Generate New Key. Once that is done, go back and set the Unattended Service Account on the other page, and Dashboard Designer should then work fine.
If you are still having issues, log in with the farm administrator account and try these steps again.

Monday, July 2, 2012

What is the List View Threshold (LVT) feature?

One of the major reasons that this List View Threshold (LVT) feature was created is to protect the server from unintentional load that may either bring it down, or at least cause other users higher latency or failures. Changing this limit (default 5000) is quite simple, but I wouldn't recommend it unless you are positive that it will not negatively affect your system. One valid example of when you might want to do this is if you are using your farm to serve heavily cached content, that only gets updated once a day, and do not want the limit to apply for that. Even in that case, I'd recommend that you test this thoroughly before changing it. There's an awesome white paper out there that describes in full details what effects this has on the server, with a lot of pretty graphs and such to depict the performance implications. Here it is: Designing Large Lists and Maximizing List Performance (http://technet.microsoft.com/en-us/library/ff608068(office.14).aspx).

Also here's a link to the help topic that explains the basic limits and what they mean: http://office2010.microsoft.com/en-us/sharepointserver-help/manage-lists-and-libraries-with-many-items-HA010378155.aspx?redir=0

If you've got your mind set on changing the LVT or another resource throttling setting, here's how to do it:

1- Login to Central Admin
2- Go to Application Management -> Manage Web Applications
3- Pick the Web application for which you want to change the LVT. (If you only have one web app plus the Central Admin one, pick your one web app; changing this for Central Admin does you no good.)
4- In the ribbon above, click General Settings. That will bring down a menu, from which you should pick Resource Throttling
5- Change the LVT (first item in this list) to another value and press OK, but please try to keep it to a reasonable number!
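The same setting can also be changed from the SharePoint 2010 Management Shell; in the object model the LVT is the web application's MaxItemsPerThrottledOperation property. A sketch, with an example URL and value:

```powershell
# Raise the List View Threshold on one web application to 10,000 items.
$webApp = Get-SPWebApplication http://servername
$webApp.MaxItemsPerThrottledOperation = 10000
$webApp.Update()
```

The same caution applies here as in the UI: test thoroughly before raising this in production.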


Following those steps will take you to the page where you can also edit a bunch of other settings. Here's a list of them, and a brief description of what they do and best practices or recommendations on how to set them:

- List View Threshold for Auditors and Administrators: This is by default a "higher limit". Queries that are run by an auditor or administrator that specifically (programmatically) request to override the LVT will be subject to this limit instead. It's 20,000 by default as opposed to the 5,000 for the LVT. I wouldn't raise this past 20,000 for the same reasons of not raising the LVT. If you'd like to read more about how to use this, take a look at this post.

- Object Model Override: If you commonly use custom code on your deployment, and have a need for overriding the LVT to a higher limit, then it may be a good idea to allow the object model override, and give auditor or administrator permissions to the application that will perform the queries. This setting is on by  default, but you may disable it if you do not need it. A good example of when you might want to use this is if you've implemented some code that will perform caching of a larger set of results that are accessed often for, say, several minutes. If you are not planning on caching the content, and are planning on running these queries often, then I wouldn't recommend using this method to get around the LVT as it will adversely affect your server's performance. In short: "tread lightly". If you'd like to read more about how to use this, take a look at this post.

- List View Lookup Threshold: This feature limits the number of joins that a query can perform. By number of joins, I mean the number of Lookup, Person/Group, or Workflow Status fields that are included in the query. So for example, if you have a view that displays 6 lookup columns, and filters on another 3 distinct lookup columns then by default that view won't work, since the List View Lookup Threshold is 8, and the view is attempting to use 9 lookups. I would recommend that you do not increase this number beyond 8, because through thorough testing we've observed that there's a serious non-gradual performance degradation that shows up above 8 joins. Not only does the throughput that the server can handle drop significantly at that point, but the query ends up using a disproportionately large amount of the SQL Server's resources, which negatively affects everybody else using that same database. If you'd like to read more about this, take a look at the "Lookup columns and list views" section of this white paper: http://technet.microsoft.com/en-us/library/ff608068(office.14).aspx

- Daily Time Window for Large Queries: This feature allows you to set a time every day where users can 'go wild'. Some people call it "happy hour", but I really think it would be a very unhappy hour for the server so I avoid that terminology :-). There are a few things that you should carefully consider before deciding what time to set this to:

  • It should be an off-peak hour, or at least a time during which you expect the least load, so as to affect the fewest people. If you pick a time in the middle of the work day for the majority of your users, then even those who are not using the large list may be affected negatively.
  • Try to keep it to a reasonable timeframe such that people can actually use it to fix their lists, rather than bug the farm admin (possibly you!) about it. If, for example, you set it to be "2-3 am", then it's unlikely that the users will be very happy about that. They won't want to wake up at 2 am just to delete this large list they no longer need, so they're more tempted to ask the farm admin to handle it for them.
  • Remember that operations started during the window won't just abort once the window ends. So if your window lasts till 9 am, and at 9 you need the server to be crisp and clear because you get a huge load spike, people who started their list delete at 8:59 may negatively affect that experience.
  • Consider different time zones. This is especially important if your organization or customers (if you're hosting SharePoint for others) are heavily geographically distributed. Setting it to 6 pm may seem like a good idea for your own location, but would not be great in, say, Sydney, Australia.

- List Unique Permissions Threshold: This is the number of unique permissions allowed per list. If you have a folder that you break inheritance on for permissions, and set some permissions for it (and all the items inside it), then that counts as 1 against your List Unique Permissions Threshold. Unlike the LVT and other settings, this threshold is not triggered by viewing the content or performing some other operation on it, but explicitly when changing permissions. If you can afford to, then I would recommend reducing this number. It defaults to 50,000 and that is a lot of unique permissions! Your list is very likely to encounter problems with permissions before it reaches this number, so preemptively tweaking it to what might work in your environment is a good idea.
