Thursday, July 19, 2012

Backup and Restore SharePoint 2010 Site Collection with PowerShell

One of the most annoying processes in SharePoint is backing up a site collection.  There is a built-in tool you can use, but it is kinda clunky and the backup can take a while.  I used it once and was not impressed.  PowerShell provides a quick and easy way to do site collection backups.  My favorite part is that you can do the entire backup with one line of code.  Just a little something to be careful of: you need to use the SharePoint 2010 Management Shell, not the regular Windows PowerShell.

*** Note that running the backup will put your site into read-only mode.

  1. Click Start
  2. Go to All Programs
  3. Go to Microsoft SharePoint 2010 Products
  4. Open SharePoint 2010 Management Shell
A PowerShell command prompt will appear, and you will need to adapt the following command to fit your site:

Backup-SPSite -Identity SiteCollectionURLHere -Path BackupFilePathHere [-Force] [-NoSiteLock] [-UseSqlSnapshot] [-Verbose]

I recommend creating a folder where you can place these backups before starting the backup process so they aren’t just chillin on the C:\ drive of your SharePoint server; just a thought.  Here is a little explanation of the additional parameters shown in square brackets [ ]:
  • Force – Include this if you want to overwrite an existing backup with the same name
  • NoSiteLock – Prevents the site from going into read-only mode while the backup is being taken.  A small warning: if someone changes content on the site while the backup is being created, that, according to Microsoft, “might lead to possible data corruption”
  • UseSqlSnapshot – A database snapshot is taken before the backup begins and the backup is done from the snapshot. The advantage is that changes can be made to the site while the backup process is running without fear of corruption.  The snapshot is deleted automatically when the backup completes.  You don’t need to specify the -NoSiteLock parameter when using this method
Here is a simple example of what the script may look like if you want to just do a backup:
Backup-SPSite -Identity http://servername/sites/BISite -Path D:\SharePointBackups\BISite\07-19-2012.bak
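If your content databases run on an edition of SQL Server that supports database snapshots (Enterprise, for example), the same backup can be taken without locking the site. A hedged sketch, reusing the hypothetical URL and folder from the example above:

```powershell
# Back up the site collection from a SQL snapshot so users can keep editing,
# and overwrite any previous backup file with the same name.
Backup-SPSite -Identity http://servername/sites/BISite `
    -Path D:\SharePointBackups\BISite\07-19-2012.bak `
    -UseSqlSnapshot -Force -Verbose
```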

The syntax for site restores is almost as easy.  You will need to use the same SharePoint 2010 Management Shell as you did for the backup.

Restore-SPSite -Identity SiteCollectionURLHere -Path BackupFilePathHere [-DatabaseServer DatabaseServerNameHere] [-DatabaseName ContentDatabaseNameHere] [-HostHeader HostHeaderHere] [-Force] [-GradualDelete] [-Verbose]
  • DatabaseServer – Specify the server for the content database
  • DatabaseName – Specify the name of the content database
  • HostHeader – URL of the Web application that will hold the host-named site collection
  • Force – Overwrite the site collection if it exists
  • GradualDelete – Recommended for site collections over 1 GB in size.  Existing data is marked as deleted and gradually removed over time by a timer job, rather than all at once, to reduce the performance hit of deleting large amounts of data
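Putting that together, restoring the backup taken earlier might look like this (the URL and path are the hypothetical ones from the backup example; -Force is only needed when the site collection already exists):

```powershell
# Restore the site collection over the top of an existing one,
# deleting the old content gradually to reduce load on SQL Server.
Restore-SPSite -Identity http://servername/sites/BISite `
    -Path D:\SharePointBackups\BISite\07-19-2012.bak `
    -Force -GradualDelete -Verbose
```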
Hopefully this will save you some time and a headache, as it did me!  As much as I don’t like coding and tend to stay away from PowerShell, I will admit it is a huge time saver and super easy for SharePoint site collection backups and restores.

If you still have issues with backup and restore, or get errors related to SharePoint versioning, the next post on moving a site with the Export and Import commands should help.

You can even auto-schedule the backup of your farm or site; the scheduling post further down covers that.


Viral Shah

Moving site from one port to another using Export & Import Command

Today I faced an issue while restoring a site collection from one server to another.  It was giving me an error about different versions of the SharePoint application.

This command helped me solve the issue.  Use it from the SharePoint 2010 Management Shell.
To open the SharePoint 2010 Management Shell:
Go to Start -> Microsoft SharePoint 2010 Products -> SharePoint 2010 Management Shell.

To export the list:
Export-SPWeb -Identity http://mossapp/picturelibrary -Path E:\viral\Backup\exportsites.cmp

To import the list:
Import-SPWeb -Identity http://mossapp:8008 -Path E:\viral\Backup\exportsites.cmp
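By default the export skips version history and item-level permissions. Both cmdlets accept extra switches for that; a hedged sketch using the same hypothetical URLs:

```powershell
# Export all item versions and user security along with the content...
Export-SPWeb -Identity http://mossapp/picturelibrary `
    -Path E:\viral\Backup\exportsites.cmp `
    -IncludeVersions All -IncludeUserSecurity

# ...and bring the security back when importing on the target.
Import-SPWeb -Identity http://mossapp:8008 `
    -Path E:\viral\Backup\exportsites.cmp `
    -IncludeUserSecurity
```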

Hope this helps!!

Viral Shah

Tuesday, July 10, 2012

Best Practice to Differentiate Accounts While Configuring SharePoint 2010

Hi guys, hope you are doing great.

Rather than using the same account for all the required services, I have tried to separate all the accounts.  I have also mentioned the purpose of each account.  This improves scalability and reduces the complexity and vulnerability of the overall farm.

  • SQL Server service account – Runs the SQL Server processes.
  • Setup user account – Runs the installation and the SharePoint Products and Technologies Configuration Wizard.
  • Server farm account / database access account – Configures and manages the server farm; acts as the application pool identity for the Central Administration website; runs the SharePoint Foundation timer service.
  • Search service account – Runs the search service.
  • Content access account – Used to access content sources for crawling.  Defaults to the search service account.
  • Application pool account – Used to run the IIS web applications that host SharePoint site collections.
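Each of these Active Directory accounts can be registered as a managed account, so SharePoint stores (and can optionally auto-change) its password. A minimal sketch, assuming a hypothetical CONTOSO domain and account name:

```powershell
# Register a dedicated application pool account as a managed account...
$cred = Get-Credential CONTOSO\sp_apppool
New-SPManagedAccount -Credential $cred

# ...then reference it when creating an application pool.
New-SPServiceApplicationPool -Name "SharePoint Service Apps" -Account CONTOSO\sp_apppool
```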

Monday, July 9, 2012

Understanding Databases Created During Installation

After installation, you will see several databases that are created in SQL Server and
will need to be added to your SQL Server maintenance plan:

SharePoint Configuration The SharePoint configuration database (config DB) holds all of your server farm configuration data and is akin to the Windows Server system registry. Any server that uses this installation’s config DB is considered a member of the same server farm.

Central Administration content Because the Central Administration Web application is a custom site collection in a dedicated Web application, it has a corresponding content database. Rebuilding this Web application is not a simple task and should be avoided by correctly backing up the server for future restoration.

Content database Each Web application has at least one corresponding content database. If you ran the Farm Configuration Wizard, a Web application was created for you at the URL of your server and it has a corresponding content database.

Business Connectivity Services DB This database is used by Business Connectivity Services (BCS), and by default it will be named Bdc_Service_DB_<GUID>.

SharePoint Foundation Logging Used for logging purposes, it is named WSS_Logging by default.

Secure Store Service DB Provides storage and mapping of credentials.

Search Administration Database Formerly the SSP database, it hosts the Search application  configuration and access control list (ACL) used in crawling content.

Search Property Database Stores crawled properties associated with the crawled data.

Crawl Database Formerly the Search database in SharePoint Server 2007, it hosts the crawled data and  manages the crawling process.

Web Analytics Staging Database Stores raw, unaggregated fact data.

Web Analytics Reporting Database Stores aggregated data for reporting.

User Profile Database Stores and manages user profile information.

Profile Synchronization Database Stores configuration and staging data for user-profile synchronization.

Social Tagging Database Stores social tagging data along with the associated URL.

State Database Used for storing temporary session state information for SharePoint components.

Word Automation Services Database Supports Word Automation Services in performing Word-related tasks on the server.

Managed Metadata Database Stores managed metadata and content

Application Registry Service Database Supports the Application Registry
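You can see every one of these databases, with its current size, straight from the SharePoint 2010 Management Shell; a quick hedged one-liner:

```powershell
# List all SharePoint databases on the farm, largest first, with size in MB.
Get-SPDatabase | Sort-Object DiskSizeRequired -Descending |
    Select-Object Name, @{Name="SizeMB"; Expression={[int]($_.DiskSizeRequired / 1MB)}}
```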

Viral Shah

Thursday, July 5, 2012

What is the SharePoint 2010 "14 hive" directory?

If you are familiar with SharePoint 2007, you will be aware of the "12 hive" directory. In SharePoint 2010, the "12 hive" directory has been replaced by the "14 hive" directory. In most cases this is the default path for SharePoint files.

Following are some of the folders in the "14 hive" directory:
1) Program Files\Common files\Microsoft Shared\Web Server Extensions\14 -
This directory is the installation directory for core SharePoint Server files.

2) Program Files\Common files\Microsoft Shared\Web Server Extensions\14\ADMISAPI -

This directory contains the soap services for Central Administration. If this directory is altered, remote site creation and other methods exposed in the service will not function correctly.
3) Program Files\Common files\Microsoft Shared\Web Server Extensions\14\CONFIG -
This directory contains files used to extend IIS Web sites with SharePoint Server. If this directory or its contents are altered, Web application provisioning will not function correctly.

4) Program Files\Common files\Microsoft Shared\Web Server Extensions\14\LOGS -

This directory contains setup and run-time tracing logs.
Following are some new folders added in the "14 hive" directory:
1) Program Files\Common files\Microsoft Shared\Web Server Extensions\14\Policy -
2) Program Files\Common files\Microsoft Shared\Web Server Extensions\14\UserCode -
This directory contains files used to support your sandboxed solutions.
3) Program Files\Common files\Microsoft Shared\Web Server Extensions\14\WebClients -
This directory contains files related to the new Client Object Model.
4) Program Files\Common files\Microsoft Shared\Web Server Extensions\14\WebServices -
This directory contains the new WCF (.svc) service files.
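You can confirm these folders on your own server; a hedged one-liner assuming the default installation path (written to stay compatible with PowerShell 2.0, which ships with the SP2010 era):

```powershell
# List the top-level folders of the 14 hive.
Get-ChildItem "$env:CommonProgramFiles\Microsoft Shared\Web Server Extensions\14" |
    Where-Object { $_.PSIsContainer } | Select-Object Name
```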
 Hope this helps ease the confusion!
Viral Shah

Behind the scenes when we create a new Web Application in SharePoint

When we create a new Web Application in SharePoint, we see a spinning wheel and the text "Processing". It may be interesting to know what actually happens in the meantime.
The following actions happen behind the scenes:
  • Creates a unique entry in the SharePoint configuration DB for the Web Application and assigns a GUID to that entry.
  • Creates and configures a new site in IIS.
  • Creates and configures a new IIS application pool.
  • Configures authentication protocol and encryption settings.
  • Assigns a default alternate access mapping for the Web Application.
  • Creates the first content database for the Web Application.
  • Associates a search service with the Web Application.
  • Assigns a name to the Web Application that appears in the Web Application list in SharePoint Central Administration.
  • Assigns general settings to the Web Application, such as maximum file upload size and default time zone.
  • Updates the web.config file with entries for the custom HTTP Modules and Handlers for SharePoint.
  • Creates virtual directories for the SharePoint web services and the SharePoint layouts folder.
After creating a Web Application in SharePoint, the web site itself has not been created yet. If you try to access the Web Application using its URL, you will get a "Page cannot be displayed" error. At this point a Web Application exists and all the mandatory configuration has been done; the next step is to create a Site Collection using a particular Site Definition. Only then is the actual site created, and you will be able to access it using the URL of the Site Collection.
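Both steps can also be done from the Management Shell. A hedged sketch with hypothetical names: New-SPWebApplication performs the list of actions above, and New-SPSite creates the site collection that makes the URL browsable.

```powershell
# Create the Web Application (IIS site, app pool, content DB, web.config entries...).
New-SPWebApplication -Name "Intranet" -Port 80 `
    -ApplicationPool "IntranetAppPool" `
    -ApplicationPoolAccount (Get-SPManagedAccount "CONTOSO\sp_apppool")

# Create the root site collection so the URL actually serves a page.
New-SPSite http://servername -OwnerAlias "CONTOSO\admin" `
    -Template "STS#0" -Name "Intranet Home"
```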
 Hope you find it interesting...!
Viral Shah

Wednesday, July 4, 2012

Schedule SharePoint 2010 farm backup and simple backup


I recently setup a new SharePoint environment to host a single site collection. Even though there was not a lot of content on this small farm, the environment needed to be highly available and we needed the ability to lose no more than one hour worth of data in the case of a catastrophic failure. Retaining backups for one week was plenty for our needs. Users of the site were located globally making the site's least utilized time in the evenings (PST) with the biggest window on Saturday.
Based on this, and the idea of keeping the process as easy and lightweight as possible (i.e. out-of-the-SharePoint-box and running from a single location), I decided to go this route:
  1. Full farm backup on Saturday afternoon
  2. Differential farm backup on Sunday-Friday evenings
  3. Hourly differential backups on content databases
  4. Clean out last week's Full and Differential backups early Monday morning
The problem was I didn't know how I was going to accomplish this and then fully automate it. Breaking down this problem, I determined I had three steps to make this work:
  1. Run backups via PowerShell
  2. Run these PowerShell scripts from Task Scheduler
  3. Cleanup backups

  1. Run backups via PowerShell
    • Using the TechNet documentation for reference I used the following one-liner for Full and Differential farm backup:
              Backup-SPFarm -Directory <backupFolder> -BackupMethod {Full | Differential} [-Verbose]
    • This is what I used for the hourly content db backups:
              Backup-SPFarm -Directory <backupFolder> -BackupMethod Differential -Item <contentDBName>
    • I took these commands and placed them into individual .ps1 files (FullFarmBackup.ps1, DiffFarmBackup.ps1 and HourlyContentDBBackup.ps1) and put them into E:\backupScripts. (If you are looking for simple backup, your search ends here.)
  2. Run these PowerShell backups from Task Scheduler
        Getting Task Scheduler to run a PowerShell script with the SharePoint snap-in was a lesson in trial and error. Ultimately this is what I did to get it working.
    • In all my .ps1 scripts I need the SharePoint snap-in loaded. To do this I added the line "Add-PSSnapin Microsoft.SharePoint.PowerShell" at the top.
          Add-PSSnapin Microsoft.SharePoint.PowerShell
          Backup-SPFarm -Directory <backupFolder> -BackupMethod {Full | Differential} [-Verbose]
    • Under the Action tab of the task I created:
      • Program/Script = powershell
      • Add Argument (optional) = E:\backupScripts\FullFarmBackup.ps1
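Creating the task by hand works fine, but the same task can also be registered from a command line with the built-in schtasks.exe (on Server 2008 R2 there is no Register-ScheduledTask cmdlet, so schtasks is the practical route). A hedged sketch; the script path matches this post, while the schedule and run-as account are hypothetical:

```powershell
# Weekly full farm backup, Saturdays at 2 PM, run as a farm admin account
# (/RP with no value prompts for the password when the task is created).
schtasks /Create /TN "SP Full Farm Backup" `
    /TR "powershell -File E:\backupScripts\FullFarmBackup.ps1" `
    /SC WEEKLY /D SAT /ST 14:00 /RU CONTOSO\sp_farm /RP
```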

  3. Automate the cleanup of the above backups
    Because I didn't have infinite disk space and didn't need backups older than one week, I needed a means to clean up old backups. A quick Bing search turned up a blog post on 'Imperfect IT' that did just this. Here is a copy of what was posted on that site, which I placed into Task Scheduler:

        # Location of spbrtoc.xml (the backup table of contents in the backup folder)
        $spbrtoc = "<backupFolder>\spbrtoc.xml"
        # Days of backup that will remain after the cleanup.
        $days = 7
        # Import the SharePoint backup report xml file
        [xml]$sp = gc $spbrtoc
        # Find the old backups in spbrtoc.xml
        $old = $sp.SPBackupRestoreHistory.SPHistoryObject |
        ? { $_.SPStartTime -lt ((get-date).adddays(-$days)) }
        if ($old -eq $Null) { write-host "No reports of backups older than $days days found in spbrtoc.xml.`nspbrtoc.xml isn't changed and no files are removed.`n" ; break}
        # Delete the old backups from the SharePoint backup report xml file
        $old | % { $sp.SPBackupRestoreHistory.RemoveChild($_) }
        # Delete the physical folders in which the old backups were located
        $old | % { Remove-Item $_.SPBackupDirectory -Recurse }
        # Save the new SharePoint backup report xml file
        $sp.Save($spbrtoc)
        Write-host "Backup entries older than $days days were removed from spbrtoc.xml and the hard disk."
I now have 2 scheduled tasks running (Full farm and cleanup) that provide me the ability to fulfill our simple backup strategy. 

Viral Shah

Tuesday, July 3, 2012

"The following data source cannot be used because PerformancePoint Services is not configured correctly" error after installing PerformancePoint Services 2010

If you’re setting up a fresh installation of PerformancePoint Services 2010, you might think that all has gone well, only to see the above error in Dashboard Designer once you try to start creating scorecards and reports. This is actually quite common; there are a few extra steps to do before you can start using PPS 2010.

1) Set up the Unattended Service Account. 

Go to the Central Administration page (in SharePoint). Under Application Management, click Manage Service Applications. Then click the PerformancePoint Service Application, then PerformancePoint Service Application Settings. We need to configure the Unattended Service Account. This is the account that PPS will use to connect to the data source (unless you use Per-User Identity, but that's another story). If you’ve never seen this page before, there are some interesting options there. Once you set this up, hit Apply. It might be successful, or it might give you another error….

2) The Unattended Service Account cannot be set for the service application.
The Secure Store Service key might not have been generated or properly refreshed
after the service was provisioned.

Now I’m not an expert on the Secure Store Service, but PerformancePoint uses it to store the password for the Unattended Service Account. If you get this error, you need to check that a key has been generated for the Secure Store Service. Once again, go to the Central Administration page, and under Application Management, click Manage Service Applications. Now click Secure Store Service. Click Generate New Key. Once that is done, you can go back and set the Unattended Service Account on the other page, and Dashboard Designer should then work fine.
If you are still having issues, log in with the farm administrator account and try these steps again.
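If you prefer the Management Shell, the same key can be generated there. A hedged sketch: the passphrase is a hypothetical value (it must meet complexity requirements), and the filter assumes a default Secure Store Service application proxy name:

```powershell
# Generate a new Secure Store master key from PowerShell instead of Central Admin.
$proxy = Get-SPServiceApplicationProxy |
    Where-Object { $_.TypeName -like "*Secure Store*" }
Update-SPSecureStoreMasterKey -ServiceApplicationProxy $proxy -Passphrase "Pass@Phrase1"
```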

Monday, July 2, 2012

What is the List View Threshold (LVT) feature?

One of the major reasons the List View Threshold (LVT) feature was created is to protect the server from unintentional load that may either bring it down, or at least cause other users higher latency or failures. Changing this limit (default 5,000) is quite simple, but I wouldn't recommend it unless you are positive that it will not negatively affect your system. One valid example of when you might want to do this is if you are using your farm to serve heavily cached content that only gets updated once a day, and do not want the limit to apply to that. Even in that case, I'd recommend that you test this thoroughly before changing it. There's an awesome white paper out there that describes in full detail what effects this has on the server, with a lot of pretty graphs and such to depict the performance implications: Designing Large Lists and Maximizing List Performance.

Also here's a link to the help topic that explains the basic limits and what they mean:

If you've got your mind set on changing the LVT or another resource throttling setting, here's how to do it:

1- Log in to Central Admin
2- Go to Application Management -> Manage Web Applications
3- Pick the Web application for which you want to change the LVT (If you only have 1 web app plus the central admin one, the one you want to pick is the 1 web app; changing this for the central admin does you no good)
4- In the ribbon above, click General Settings. That will bring down a menu, from which you should pick Resource Throttling
5- Change the LVT (first item in this list) to another value and press OK, but please try to keep it to a reasonable number!
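The same setting is exposed through the object model as MaxItemsPerThrottledOperation on the web application, so it can also be changed from the Management Shell. A hedged sketch with a hypothetical URL (the same "think twice" caveat applies):

```powershell
# Raise the List View Threshold for one web application.
$wa = Get-SPWebApplication http://servername
$wa.MaxItemsPerThrottledOperation = 10000
$wa.Update()
```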

Following those steps will take you to the page where you can also edit a bunch of other settings. Here's a list of them, and a brief description of what they do and best practices or recommendations on how to set them:

- List View Threshold for Auditors and Administrators: This is by default a "higher limit". Queries that are run by an auditor or administrator that specifically (programmatically) request to override the LVT will be subject to this limit instead. It's 20,000 by default as opposed to the 5,000 for the LVT. I wouldn't raise this past 20,000 for the same reasons of not raising the LVT. If you'd like to read more about how to use this, take a look at this post.

- Object Model Override: If you commonly use custom code on your deployment, and have a need for overriding the LVT to a higher limit, then it may be a good idea to allow the object model override, and give auditor or administrator permissions to the application that will perform the queries. This setting is on by  default, but you may disable it if you do not need it. A good example of when you might want to use this is if you've implemented some code that will perform caching of a larger set of results that are accessed often for, say, several minutes. If you are not planning on caching the content, and are planning on running these queries often, then I wouldn't recommend using this method to get around the LVT as it will adversely affect your server's performance. In short: "tread lightly". If you'd like to read more about how to use this, take a look at this post.

- List View Lookup Threshold: This feature limits the number of joins that a query can perform. By number of joins, I mean the number of Lookup, Person/Group, or Workflow Status fields that are included in the query. So for example, if you have a view that displays 6 lookup columns, and filters on another 3 distinct lookup columns then by default that view won't work, since the List View Lookup Threshold is 8, and the view is attempting to use 9 lookups. I would recommend that you do not increase this number beyond 8, because through thorough testing we've observed that there's a serious non-gradual performance degradation that shows up above 8 joins. Not only does the throughput that the server can handle drop significantly at that point, but the query ends up using a disproportionately large amount of the SQL Server's resources, which negatively affects everybody else using that same database. If you'd like to read more about this, take a look at the "Lookup columns and list views" section of this white paper:

- Daily Time Window for Large Queries: This feature allows you to set a time every day where users can 'go wild'. Some people call it "happy hour", but I really think it would be a very unhappy hour for the server so I avoid that terminology :-). There are a few things that you should carefully consider before deciding what time to set this to:

It should be an off-peak hour, or at least a time during which you expect the least load, so as to affect the least number of individuals. If you pick the time to be in the middle of the work day for the majority of your users, then even those who are not using the large list may be affected negatively.
Try to keep it to a reasonable timeframe such that people can actually use it to fix their lists, rather than bug the farm admin (possibly you!) about it. If, for example, you set it to be "2-3 am", then it's unlikely that the users will be very happy about that. They won't want to wake up at 2 am just to delete this large list they no longer need, so they're more tempted to ask the farm admin to handle it for them. Remember that operations started during the window won't just abort once the window ends.. So if your window lasts till 9am, and at 9 you need the server to be crisp and clear because you get a huge load spike, people who started their list delete at 8:59 may negatively affect that experience. Consider different time zones. This is especially important if your organization or customers (if you're hosting SharePoint for others) are heavily  geographically distributed. Setting it to 6pm may seem like a good idea for your own location, but would not be great in say, Sydney, Australia.

- List Unique Permissions Threshold: This is the number of unique permissions allowed per list. If you have a folder that you break inheritance on for permissions, and set some permissions for it (and all the items inside it), then that counts as 1 against your List Unique Permissions Threshold. Unlike the LVT and other settings, this threshold is not triggered by viewing the content or performing some other operation on it, but explicitly when changing permissions. If you can afford to, then I would recommend reducing this number. It defaults to 50,000 and that is a lot of unique permissions! Your list is very likely to encounter problems with permissions before it reaches this number, so preemptively tweaking it to what might work in your environment is a good idea.

SharePoint Page Request and ASP.NET Page Request

Before we compare the life cycles of both page requests, let's first understand the application life cycle.

Stages of the ASP.NET application life cycle (IIS 7.0):
  1. Request is made for an application resource: When the integrated pipeline receives a request, the request passes through stages that are common to all requests. These stages are represented by the RequestNotification enumeration. All requests can be configured to take advantage of ASP.NET functionality, because that functionality is encapsulated in managed-code modules that have access to the request pipeline. For example, even though the .htm file-name extension is not explicitly mapped to ASP.NET, a request for an HTML page still invokes ASP.NET modules. This enables you to take advantage of ASP.NET authentication and authorization for all resources.
  2. The unified pipeline receives the first request for the application: When the unified pipeline receives the first request for any resource in an application, an instance of the ApplicationManager class is created, which is the application domain that the request is processed in. Application domains provide isolation between applications for global variables and enable each application to be unloaded separately. In the application domain, an instance of the HostingEnvironment class is created, which provides access to information about the application, such as the name of the folder where the application is stored. During the first request, top-level items in the application are compiled if required, which includes application code in the App_Code folder.
  3. Response objects are created for each request: After the application domain has been created and the HostingEnvironment object has been instantiated, application objects such as HttpContext, HttpRequest, and HttpResponse are created and initialized.

  4. An HttpApplication object is assigned to the request: After all application objects have been initialized, the application is started by creating an instance of the HttpApplication class. If the application has a Global.asax file, ASP.NET instead creates an instance of the Global.asax class that is derived from the HttpApplication class. It then uses the derived class to represent the application.
    Which ASP.NET modules are loaded (such as the SessionStateModule) depends on the managed-code modules that the application inherits from a parent application. It also depends on which modules are configured in the configuration section of the application's Web.config file. Modules are added or removed in the application's Web.config modules element in the system.webServer section.

  5. The request is processed by the HttpApplication pipeline: At this stage the request is processed and various events are raised, such as ValidateRequest, URL mapping, BeginRequest, AuthenticateRequest, and so on. These events are useful for page developers who want to run code when key request pipeline events are raised. They are also useful if you are developing a custom module and you want the module to be invoked for all requests to the pipeline. Custom modules implement the IHttpModule interface. In Integrated mode in IIS 7.0, you must register event handlers in a module's Init method.
Stages of the Sharepoint Page Request:

The stages of a SharePoint page request are handled by IIS in a similar way, except that there are custom handlers for every SharePoint page request.
When you create a web application in sharepoint, WSS configures the IIS website by adding an IIS application map and creating several virtual directories. Windows SharePoint Services also copies a global.asax file and web.config file to the root directory of the hosting IIS Web site.
Because every request targeting a Web application is routed through aspnet_isapi.dll, the request gets fully initialized with ASP.NET context. Furthermore, its processing behavior can be controlled by using a custom HttpApplication object and adding configuration elements to the web.config file.

First, you can see that Windows SharePoint Services configures each Web application with a custom HttpApplication object by using the SPHttpApplication class. Note that this class is deployed in the Windows SharePoint Services system assembly Microsoft.SharePoint.dll.

In addition to including a custom HttpApplication object, the Windows SharePoint Services architecture uses a custom HttpHandler(SPHttpHandler) and a custom HttpModule(SPRequestModule). These two SharePoint-specific components are integrated into the HTTP Request Pipeline for a Web application using standard entries in the web.config file.


      <httpHandlers>
        <remove verb="GET,HEAD,POST" path="*" />
        <add verb="GET,HEAD,POST" path="*"
            type="Microsoft.SharePoint.ApplicationRuntime.SPHttpHandler,..." />
      </httpHandlers>
      <httpModules>
        <clear />
        <add name="SPRequest"
            type="Microsoft.SharePoint.ApplicationRuntime.SPRequestModule,..." />
        <!-- other standard ASP.NET httpModules added back in -->
      </httpModules>


ASP.NET 2.0 introduced a new pluggable component type known as a virtual path provider. The idea behind a virtual path provider is that it abstracts the details of where page files are stored away from the ASP.NET runtime. By creating a custom virtual path provider, a developer can write a custom component that retrieves ASP.NET file types, such as .aspx and .master files, from a remote location, such as a Microsoft SQL Server database.

The Windows SharePoint Services team created a virtual path provider named SPVirtualPathProvider that is integrated into every Web application. The SPVirtualPathProvider class is integrated into the ASP.NET request handling infrastructure by the SPRequestModule. More specifically, the SPRequestModule component contains code to register the SPVirtualPathProvider class with the ASP.NET Framework as it does its work to initialize a Web application.

I Hope you liked the article. Please do provide your feedback in the comment box.

Thank you ! Happy Sharepointing !


w3wp.exe uses too much memory and resources

What is the W3WP.exe process?

The W3WP.exe is the IIS worker process that handles requests and maintains sessions, viewstate, and the cache.  It has been around since IIS 6 on Windows Server 2003, and is still found as part of IIS 7.5 with Server 2008 R2.  On a properly configured and programmed website, the W3WP.exe can usually stay under 100 MB.  If you are running a 64-bit OS it could be a little larger.  The actual size of the W3WP.exe will be determined by how much traffic your website gets and how many pages it serves.  The 100 MB figure is just an estimate from my own experience.  Application pools with multiple websites can easily go over this figure, so don't treat it as some magic number.  But if you are running a website that has relatively low traffic (under 20K impressions daily) and your W3WP.exe is taking up 100+ MB of memory, then you might want to make some changes.
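A quick way to see how big your worker processes actually are is a PowerShell one-liner on the web server; a hedged sketch (the 100 MB yardstick above is a rule of thumb, not a hard limit):

```powershell
# Show each IIS worker process and its working set in MB, largest first.
Get-Process -Name w3wp |
    Sort-Object WorkingSet -Descending |
    Format-Table Id, @{Name="MemoryMB"; Expression={[int]($_.WorkingSet / 1MB)}}
```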
Why do I care if W3WP.exe is large?

Let's say that you have a website running on a shared hosting account.  As your site grows, it will start to experience bottlenecks from the limited resources it has access to on the shared box.  Remember, the average shared server is running around 300 websites.  Your site is constantly fighting other sites for CPU time and resources.  Eventually you outgrow the shared environment and are forced to upgrade to either a VPS or a fully dedicated server.  But what you may not know is that your site could last much longer on the shared box or that small VPS if you were to clean up your code.

Most popular causes of W3WP.exe using too much memory

Page Caching on Dynamic Websites

Some programmers use page caching on their ASP.NET pages to improve response times. Even though this does work, it will quickly eat up all your memory: for page caching to work, you would need to cache each page and every combination of its querystring variables. For membership sites this could mean millions of distinct pages.

A more effective way of using caching is to cache objects that do not change and are used throughout the website, things like your menu bar or the details for an event. This reduces the number of calls to the database server and speeds up your site.

Some may argue that the job of a database server is to cache popular queries, and that there is no difference between retrieving the data from the database server and caching the objects in your web server's memory. This is false for several reasons. First, any time your website retrieves data from the database, it has to open a connection to another process (the MySQL process); even if MySQL is running on the same box as the web server, there is a small performance penalty to establish that connection. And if you are on a shared hosting account, it is very likely that the hosting company has set up a dedicated MySQL server elsewhere in the datacenter, so your web server has to reach across the datacenter to a separate machine, and the latency of that network becomes an issue. This is something I have personally experienced with multiple shared hosting providers, and it is why I highly recommend doing everything you can to reduce calls to the database. Object caching can make a huge difference in page load times, especially when you are able to cache objects that rely on expensive queries. If you want to learn more about how to manage objects in the ASP.NET cache, I wrote a few helper functions that are easily integrated into your websites.
ViewState Enabled on Labels

The ViewState is how ASP.NET keeps track of changes that have been made to Web Controls. ViewState is needed any time you are going to do a postback, like for a submit form. But a common mistake is that programmers don't disable viewstate on Label controls, which are often used as placeholders for long concatenated HTML strings. If you suspect you may be guilty of this, all it takes is a look at the HTML source code in your web browser to confirm it. If your HTML source contains a hidden field like this

<input type="hidden" name="__VIEWSTATE" id="__VIEWSTATE" value="..." />

with a really big value, then you need to disable the viewstate on your Label controls. The ViewState is stored in the W3WP.exe process, and it's easy to see how this process can consume so much memory when you have a view state this large on a huge number of web pages. The solution is to change your asp:Label controls from this
<asp:Label ID="mainmenu" runat="server" />
to this
<asp:Label ID="mainmenu" EnableViewState="false" runat="server" />
Let's face it, there is no reason you need to preserve the ViewState of a Label control. So go through all your pages and add the EnableViewState="false" attribute to all your Label controls.
Long Session Timeouts
Any time a visitor (organic or spider) visits your website, ASP.NET maintains a session with them. Sessions are how the server keeps track of your users and their activity on your website; without sessions, things like postback would not be possible. When ASP.NET uses sessions in the traditional way, it stores a Session ID in a cookie in the client's browser. Every time that client communicates with the server, the cookie value (the Session ID) is passed along with the request, so the server can identify the visitor and handle the request appropriately.

The server then has to keep track of the visitor's activity. Each time a submit form is submitted or a web control is used, the information related to that action is stored with the viewstate in W3WP.exe and tied to the Session ID. This memory is not released until the session expires. The default in ASP.NET is for sessions to expire after about 20 minutes: after 20 minutes of no activity from the visitor, the session is discarded and the memory associated with it is freed.

Now, if you have lengthy submit forms that could take longer than 20 minutes to fill out, you might need a longer than normal session timeout. On my own site I have sessions set to 45 minutes (sometimes I take a while to write an article). But if your submit forms are the quick kind that can be filled out within a few minutes, then 20 minutes is more than enough. If your website doesn't have any submit forms, it might make sense to take your session timeouts down to 5 minutes or less. The shorter your session timeouts, the less memory is needed to track your visitors as they navigate through your website. If you want proof, set your session timeouts to 6 hours, then watch over the next 6 hours as your W3WP.exe grows to a massive size.
I can only imagine what sites like Facebook must deal with, considering their visitors (or should I call them addicts?) do daily surfing marathons.

For a website like this one, where the only public web control is the comment box at the bottom of the page, I could probably get by with a 15-minute session timeout. But my backend, which runs from the same application, needs much longer session timeouts. If I wanted to get very efficient with my memory, I would create a separate application for my backend (control panel) with the longer timeout, allowing me to shorten the timeout on my public application.

So, in summary: lots of submit forms, plus lots of unique visitors, plus a long session timeout equals a very big W3WP.exe. You might not be able to change your submit forms or your visitors, but you can shorten your session timeouts.
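The session timeout lives in web.config. A minimal fragment looks like this; the 10-minute value is only an example, not a recommendation for every site:

```xml
<!-- web.config: timeout is in minutes; the ASP.NET default is 20. -->
<configuration>
  <system.web>
    <sessionState timeout="10" />
  </system.web>
</configuration>
```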
Memory Leaks

Certain objects should be cleared from memory with the Dispose() method. These include things like Bitmap objects. Even though the .NET garbage collector does a good job of taking out the trash, I have found that on heavily loaded systems, if you don't release Bitmaps with the Dispose method, the image file you were working with tends to stay locked, and the object sits in your W3WP.exe taking up space. Even though memory leaks can be a problem, they are usually minor; ViewState is by far the biggest memory hog.
Unclosed Database Connections

If you use helper functions to manage your database connections, be sure you close the connections when finished. Otherwise you waste memory on both the W3WP.exe side and the database server side.
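Here too, a using block is the easy fix, sketched with ADO.NET's SqlConnection (the table name and query are hypothetical). The connection is closed and returned to the pool even when an error occurs, instead of leaking.

```csharp
using System.Data.SqlClient;

public static class Db
{
    public static int CountUsers(string connectionString)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand("SELECT COUNT(*) FROM Users", conn))
        {
            conn.Open();
            return (int)cmd.ExecuteScalar();
            // conn is disposed (and thus closed) on the way out,
            // even if ExecuteScalar throws.
        }
    }
}
```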

Even though advanced programmers might feel some of this is obvious, I found it disturbing that I was unable to find more articles online about something so important. Who knows, maybe if every programmer follows my advice, it could hurt the bottom line of hosting companies whose customers won't be needing that dedicated server just yet. If this helped you, let me know.

