Archive for October, 2009

October 27, 2009

Stormy Skies for Cloud Computing


I don’t know about you, but I’m somewhat confused about all the hype over “cloud computing”. Don’t get me wrong, accessing services on the Internet makes sense, sometimes. What I don’t understand is how many people tout cloud computing as the silver bullet for all of our IT ills; you know, the way virtualization was supposed to solve all of our problems? But that will be a different post. Back to the point. The Web 2.0 Journal has a nice enumeration of the benefits of cloud computing. First I’d like to comment on those benefits.

 

What the vendors tell you

Reduced Cost: “Cloud technology is paid incrementally, saving organizations money.”

My comment: It depends on what cost you’re talking about. Making use of hosted services is a cost-effective way to initially deploy a capability to your organization. However, it’s like renting an apartment vs. buying a house: you never stop paying rent. The service fee is a constant monthly expense that never goes away. Need a second service? There’s another monthly fee. The problem, of course, is that you never actually get ROI with this model. It’s more like a reverse ROI. You start out ahead of the game because you had few start-up costs. But you fall further and further behind, because at some point you’ve paid as much in service fees as it would have cost to do the implementation in house. And you keep on paying. Indefinitely.

Check out my post on getting started with SharePoint on the cheap, where I show how you can get started with SharePoint in house for $1,100 with no recurring costs. That setup will support between 500 and 1,000 users and provide over 100 GB of storage. Comparable hosting plans cost between $55 and $80 per month (more if you exceed the small amount of storage that comes with most plans by using document versions). That means the in-house implementation of SharePoint actually comes out cheaper than the hosted option somewhere between 14 and 20 months in. ROI within 18 months is considered good in the industry.
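If you want to sanity-check that break-even claim, here is a quick sketch using the figures above (the dollar amounts are this post’s assumptions, not anyone’s published pricing):

using System;

class BreakEvenSketch
{
    static void Main()
    {
        // Assumed figures: $1,100 one-time in-house cost vs. $55-$80 per month for hosting
        decimal inHouseCost = 1100m;
        decimal monthsAtHighFee = Math.Ceiling(inHouseCost / 80m); // 14 months
        decimal monthsAtLowFee = Math.Ceiling(inHouseCost / 55m);  // 20 months

        Console.WriteLine("Hosting fees overtake the one-time in-house cost in {0} to {1} months.",
            monthsAtHighFee, monthsAtLowFee);
    }
}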

 

Increased Storage: “Organizations can store more data than on private computer systems.”

My comment: Storage is cheap. In fact, storage is one of the cheapest IT commodities you can buy today. You can get 4 TB of business-class storage for under $6,000. Compare that to a hosted SharePoint service offering with only 1 TB of storage for $4,000 per month. What IT commodity is significantly more expensive than storage? WAN bandwidth. The cloud computing model uses WAN bandwidth like it’s going out of style. See the “What the vendors don’t tell you” section of this post for more on the topic.

 

Highly Automated: “No longer do IT personnel need to worry about keeping software up to date.”

My comment: There is an obsessive compulsive disorder in IT circles called “software updates”. Many administrators think that they need to apply the latest hot fix or service pack the moment it comes out. It’s ok to talk about it. I had this problem too. Do you know what I discovered happens if you don’t apply most of these updates? Absolutely nothing. Servers keep running. In fact, by not applying the latest update you mitigate the risk of that update breaking your application. So, while letting the hosting provider keep the software up to date is nice, it is of questionable business benefit and can lead to misbehaving applications or unexpected downtime.

 

Flexibility: “Cloud computing offers much more flexibility than past computing methods.”

My comment: I have a few thoughts on this one. First, while cloud computing can provide more flexibility, more flexibility is not necessarily better. If an in-house system has enough flexibility, adding more may not offer any business benefit. Second, the most inflexible part of an organization may be the people. Every time something in IT changes, users need to get retrained. That training is time consuming and costly (time = money). Third, has anyone asked why there is a need for so much flexibility? Could it be a symptom of poor planning? Why not deal with the problem’s root cause rather than the symptom?

 

More Mobility: “Employees can access information wherever they are, rather than having to remain at their desks.”

My comment: This is great, but you don’t need to have your service hosted in the cloud. Get an Internet domain name and a DNS service like No-IP Plus ($25 per year). You can make your in-house web application Internet-accessible in less than an hour.

 

Allows IT to Shift Focus: “No longer having to worry about constant server updates and other computing issues, government organizations will be free to concentrate on innovation.”

My comment: See my comment in the “Highly Automated” section for my thoughts on chasing server updates. We have AD, Exchange, SharePoint, SQL Server, and TFS in house, and have no full-time IT support personnel. How do we do this? By keeping our filthy hands off of the servers. Best of all, we don’t need to pay hosting service fees for the privilege of not touching the servers. Not monkeying with existing servers allows us to focus on delivering new capability.

 

Next I’d like to add one more item people often attribute to hosted systems and comment on it: “Increased reliability”. There’s a general feeling that service providers in the cloud can do a better job of ensuring system uptime than other organizations can. In some cases I think this is true, especially when talking about specialized hosting companies. However, many cloud service providers do not fall into that category. Remember the now infamous T-Mobile Sidekick data loss incident? I guess “carrier-grade” doesn’t mean what it used to. There is no guarantee that a service provider will do a better job of ensuring service reliability than you will.

 

Along the same lines as service reliability is “service availability”. While service reliability refers to the service running, service availability refers to users actually being able to get to the service, two completely different things. When the service is running on the same LAN as its users, service availability pretty much equals service reliability: if the service is running, users can most likely get to it. It’s a different story when the service is running in the cloud and users need to access it over the WAN. Let’s face it, Internet connections go down. Speeds fluctuate. Granted, downtime for this reason doesn’t happen often, but when it does, things can be very frustrating.

[Image: system down due to storm]

 

What the vendors don’t tell you

Now I’d like to point out some of cloud computing’s dirty little secrets. Few people in the industry talk about these, and customers don’t find out until after they’ve signed a 1-year contract.

Single sign-on

Single sign-on (SSO) is the concept that a user enters his or her credentials into one system, and those credentials get propagated to all the other systems the user needs to access. For example, a user would log in to his or her desktop computer and be able to access the organization’s email and portal systems without additional logins. For most organizations running Microsoft software in-house, this is a daily reality because all of the organization’s systems run off of the same user directory service, the organization’s Active Directory. But what happens when the organization makes use of a hosted service in the cloud? That service has its own usernames and passwords. Users now have an additional login for the hosted service. Web browsers can help by caching user credentials on the desktop, but other desktop applications don’t do as good a job of that. The issue multiplies as the organization adds services, each requiring its own username and password.

 

WAN vs. LAN bandwidth

How much bandwidth do most organizations have on their LANs? Most have 1,000 Mbps. How much bandwidth do these organizations have on their WANs? Usually less than 10 Mbps. That means most organizations have roughly 100 times the bandwidth on their LANs as on their WANs. That’s important, since the organizations’ users access cloud services through the WAN. Users will perceive even well-implemented cloud services as much slower and less responsive than mediocre in-house services. The cloud service feels slower, but the problem lies with the users’ limited bandwidth in accessing it.

 

WAN vs. LAN reliability

How often does your LAN go down? How often does your WAN go down? Imagine losing access to all of your organization’s services every time the Internet connection goes down. See the “service availability” discussion earlier in this post for more.

 

System integration

One of the best outcomes of software vendors’ increasing adoption of open interfaces and API standards is customers’ ability to make the systems their organizations rely on operate as a single system, rather than as a collection of disparate applications. New business capabilities, like business intelligence, become possible. But what if an organization wanted this capability yet relied on hosted services for some of its systems? Let’s say that an organization has an internal Active Directory and mail system, but makes use of one vendor’s hosted ecommerce service and another vendor’s hosted CRM service. Seems reasonable so far? The organization wants to answer a simple question: how many customers who have purchased from the organization within the past 6 months emailed their sales representative directly after a purchase? This question requires data from the AD, email, ecommerce, and CRM systems. Getting to that data is hard, because the CRM and ecommerce systems must be accessed over a WAN connection. That makes the process of getting at the vast amount of data those systems hold very painful. We can also only hope that there is some way to correlate the various data entities between the systems: orders, customers, email addresses, sales people, etc.

 

Conclusion

There’s a lot of buzz about delivering software as a service (SaaS). I think that SaaS / cloud computing / utility computing will definitely be a good delivery model for some types of computing capabilities. But there are just too many benefits to running services co-located with users for locally running services to go away anytime soon. My colleague Brad Smith came up with a much better alternative to either model: localized software as a service (LSaaS). Briefly put, the LSaaS software delivery model makes use of hosted services just as a pure SaaS model does. However, rather than users accessing a hosted service directly, the LSaaS model places an extension of the hosted service into an organization’s office. Users access the local service extension over the LAN. That service extension may communicate with the hosted service in the cloud on an as-needed basis. See my previous post on how an LSaaS software delivery model can simplify software licensing and copy protection while avoiding the pitfalls of a pure SaaS delivery model. I believe that more and more vendors will start using the LSaaS software delivery model in the next few years. We certainly will.

 

October 21, 2009

The Next Web: Twitter-Facebook AND Twitter-Microsoft deal confirmed


The Next Web reports that deals between Twitter and both Facebook and Microsoft will be announced at the Web 2.0 Summit.

“What’s particularly unique is the deal is with both Microsoft AND Facebook and the company plans to integrate deeply with both company’s products and services, namely Bing and Facebook.com.”

Microsoft has already announced Twitter and Facebook integration with XBox Live. What’s next? Twitter and Facebook integration with SharePoint 2010? Here’s hoping. That would make for some real SharePoint 2010 cloud integration.

October 20, 2009

SharePoint 2010 public beta and general release announced


Microsoft issued a press release announcing both the public beta of SharePoint 2010 and its general release. The public beta will be available in November 2009, along with the public betas of Office 2010, Project 2010 and Visio 2010. The general release of SharePoint 2010 is slated for the first half of 2010.

From the Microsoft press release:

The public betas of Microsoft SharePoint Server 2010, Office 2010, Project 2010 and Visio 2010 will become available in November 2009; more information is available at http://go.microsoft.com/?linkid=9689707.

Microsoft SharePoint Server 2010 will be available in the first half of 2010. More information about Microsoft SharePoint Server 2010 can be found at http://www.microsoft.com/sharepoint.

Remember that SharePoint 2010 will be an x64-only release. That means you will need to install it onto an x64 version of Windows Server.

October 20, 2009

Watch SharePoint Conference 2009 video highlights on demand


Do you wish you were at the Microsoft SharePoint Conference 2009? Me too. Watch selected conference video highlights here:

http://www.mssharepointconference.com/Pages/videohighlights.aspx

 

Tom Rizzo, Senior Director for SharePoint, recaps the news and announcements that Steve Ballmer and Jeff Teper shared with the SharePoint Conference 2009 attendees. He also shares some fun facts that you might not have known about the conference.


Tom Rizzo’s News of the Day

 

Before diving into SharePoint 2010, take a moment to look back at the amazing solutions customers have built on Microsoft Office SharePoint 2007.


SharePoint Conference 2009 Opening Video

 

Hear what our customers are already saying about SharePoint 2010. See what they’re most excited about after trying it out and their plans for future deployment.


SharePoint 2010 Customer Excitement

 

Watch this short video to see SharePoint Conference 2009 come together on location in Las Vegas and learn what it takes to put it all together, such as the number of labor days and the miles of CAT 5e cable.


Time Lapse Footage

 

October 20, 2009

Watch the SharePoint Conference 2009 keynote on demand


Do you wish you were at the Microsoft SharePoint Conference 2009? Me too. But, you can watch the keynote here:

http://www.mssharepointconference.com/Pages/default.aspx

 

The keynote had lots of great content including loads of SharePoint demos such as the new SharePoint UI, Excel Services, Performance Point, FAST Search, and more. See Steve Ballmer, Jeff Teper, and others deliver the first true glimpses of what promises to be the best release of SharePoint ever.

October 20, 2009

Getting paid for your SharePoint software, part 4


Table of contents

Part 1
Part 2
Part 3
Part 4   <— You are here


This is the fourth part of my series on getting paid for your SharePoint software. If you have not already done so, please read part 1, part 2, and part 3.

In part 1, I discussed how you need to be able to do four things to transform a piece of software into a product that you can get paid for:

  1. Distribute the software
  2. Deploy the software
  3. License the software
  4. Copy protect the software

Of these, licensing and copy-protecting the software prove to be the most difficult on the SharePoint platform.

In part 2, I laid out two options that address licensing and copy-protecting SharePoint software:

  1. Licensing and copy-protecting the installer
  2. Licensing and copy-protecting the application assemblies

While the first option is the easiest way to get started, it also provides the least amount of protection and capabilities. The second option is what you would ultimately want to get to, but it can require a significant level of effort to implement, somewhere between several weeks and several months.

In part 3 I showed how we could use a services or cloud computing application architecture to give us additional licensing and copy-protection options beyond what I covered in part 2. Unfortunately, we uncovered new challenges that arise from this architecture, namely:

  1. Customer’s WAN bandwidth limitations
  2. Security and multi-tenancy concerns about the hosted service
  3. User context maintenance and single sign on (SSO)

So does that mean that a services architecture just exchanges one set of problems for another? Not necessarily. In fact, we can see that all of the issues arise from the fact that the service was hosted outside of the customer’s environment and shared among many customers. Let us examine an alternate approach in which the service the web parts access is hosted not on the Internet but on the customer’s network. My colleague Bradley Smith termed this approach Localized Software as a Service (LSaaS).

 

Option 4: Localized Software as a Service (LSaaS)

The LSaaS application architecture leverages the benefits of service-oriented architecture (SOA) and cloud computing application delivery models. However, LSaaS extends “the cloud” into the customer’s data center. Whereas a SaaS software delivery model requires that all service requests traverse the customer’s WAN to the service vendor’s Internet-based service instance, LSaaS instantiates one or more instances of the service in the customer’s datacenter.

The main advantage of LSaaS over traditional SaaS is that LSaaS gains the benefits that SaaS offers in terms of ease of licensing and copy protection, but LSaaS mitigates the issues that arise with a pure SaaS approach, namely:

  • Limited WAN bandwidth
  • Risks of multi-tenancy
  • Loss of user context when invoking the service

For a more detailed description of these potential issues, please see the “All services all the time?” section of part 3 of this series. Let’s see what our licensing and copy protection options look like:

  • Licensing – LSaaS licensing options are very similar to the SaaS options discussed in part 3. Additionally, LSaaS software delivery schemes also allow for service-instance-based licensing models. The idea is that each instantiation of your service running at the customer’s site can deliver a well-defined, quantifiable amount of service capacity. If the customer needs additional capacity, they will require additional instances of the service. Each additional service instance requires the customer to pay an additional fee.
     
    The key to making this licensing model work is that the service instance must deliver a predictable, quantifiable level of service capacity. How you define the capacity depends on what your service does. Popular capacity quantifiers include data storage, processing speed, and processing throughput (a minimal sketch of what such a capacity check might look like appears after this list). Here are some examples of companies using service-instance-based license models to deliver their software as LSaaS:
     
    • Black Blade Associates’ docBlock Ascend (disclaimer: I work for Black Blade)
       
      The docBlock Ascend provides virtual document processing to the SharePoint platform. Black Blade charges a flat fee for each docBlock server appliance. Each appliance has two load-balanced processing modules. For the purposes of licensing, you can think of each processing module as a virtual document service instance. Each processing module (service instance) can process 4 virtual documents per minute. “4 virtual documents per minute” is the quantifiable service capacity per service instance.

      If the customer needs to process more than 8 virtual documents per minute, they must order additional docBlock devices. There are no per-user, per site collection, per SharePoint server, or any other types of additional fees or license complexity. Simple huh?

    • Google Search Appliance

      The Google Search Appliance (GSA) provides enterprise search capability for a variety of content sources. Each GSA is a single service instance and can index 1,000,000 documents, the service capacity. If a customer needs to index more documents, they buy more GSAs.

      The SharePoint connectors for the GSA are free and even come with source code. Why doesn’t Google charge for the connectors too? The more document repositories are connected to a GSA, the faster the customer will exceed the GSA’s service capacity and buy another GSA.

  • Copy-protection – As with the SaaS software distribution option discussed in part 3, copy-protection is no longer needed for the web parts or other components deployed to the customer’s SharePoint farm, as these components are just the service client and should be freely distributable.
     
    However, unlike the SaaS option, the LSaaS service instance you are deploying to the customer’s site does need to be copy protected. The best way to achieve this is to create a service deployment model that ensures that only your service is running on a given operating system instance. That way you can write management code that ensures that everything on the operating system instance, including the operating system components, conforms to a valid instance of your service. Black Blade, Google, and other vendors accomplish this copy protection by distributing their service instances as server hardware appliances.
     
  • Level of effort – As with the SaaS software distribution option discussed in part 3, the level of effort for actually doing the licensing and copy protection is fairly low. However, the effort of architecting or re-architecting an application to use a service model can be high, especially if a high percentage of the application’s codebase makes direct calls into the SharePoint object model. There is also additional effort, beyond what a hosted service requires, in creating an administration and management interface for your service instances. While you can get away with running direct SQL queries and editing raw XML files to configure your hosted service, I would strongly advise creating a more end-user-friendly administration experience for the LSaaS service instances you deploy to customer sites. If you are going the appliance route for your LSaaS service instances, you will need to become familiar with shell scripting, writing unmanaged (C/C++) code, and WMI.
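To make the capacity idea a bit more concrete, here is a minimal sketch of what a capacity-enforcing service instance might look like. The class name and the per-minute figure are purely illustrative assumptions, not how docBlock or the GSA actually implement their limits:

using System;

// Illustrative only: a service instance that enforces a fixed per-minute
// processing capacity, which is the quantity the license fee is tied to.
public class CapacityLimitedServiceInstance
{
    private readonly int documentsPerMinute;
    private int processedThisWindow;
    private DateTime windowStart = DateTime.UtcNow;

    public CapacityLimitedServiceInstance(int documentsPerMinute)
    {
        this.documentsPerMinute = documentsPerMinute;
    }

    public bool TryProcess(string documentId)
    {
        // Start a new one-minute window when the current one expires
        if ((DateTime.UtcNow - windowStart).TotalMinutes >= 1)
        {
            windowStart = DateTime.UtcNow;
            processedThisWindow = 0;
        }

        // At capacity: the caller needs another licensed service instance
        if (processedThisWindow >= documentsPerMinute)
            return false;

        processedThisWindow++;
        // ... actual document processing would go here ...
        return true;
    }
}

A customer who keeps hitting the over-capacity path has, in effect, outgrown one instance and needs to license another.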

Which option is best?

It depends. You weren’t expecting a more definitive answer, were you 🙂 Let me elaborate a bit. If you are just getting started with a product idea, have a product that is mostly user interface, or have limited resources to implement licensing and copy-protection components, I would suggest starting with option 1 (adding licensing and copy protection to the installer only), discussed in part 2 of this series. Once you have a product that is generating some income and you have some more resources to devote to implementing licensing and copy-protection, look at option 2 (adding licensing and copy protection to the installer and the application assemblies), discussed in part 2.

If you have some serious resources you can devote to implementing or revamping your product (say 3-6+ months’ worth) and your product provides value through more than just a nice user interface, i.e. the product has a service layer, you can look at option 3 (delivering the product using a SaaS model), discussed in part 3, or option 4 (delivering the product using an LSaaS model), discussed above. Although option 4 has a somewhat higher level of effort than option 3, both options require roughly the same magnitude of effort to implement. The key question to ask when determining which service model (SaaS or LSaaS) to use to deliver your product is: where does the bulk of the data the service needs to process reside, at the customer’s site or on the Internet? By that I mean which location has the most data, measured in bytes, that the service needs to act upon.

If the bulk of the data, again measured in bytes, resides on the Internet, then you can easily use a SaaS model (option 3) to deliver your software. If the bulk of the data the service needs resides at the customer’s site, then you should use an LSaaS model (option 4) to deploy your service. This is not just an academic exercise; I went through this decision process for each of the products we sell at Black Blade Associates.

 

Conclusion

This concludes my “Getting paid for your SharePoint software” series. My goal in writing this series was to encourage more developers to start thinking about getting real returns on their SharePoint expertise by monetizing their SharePoint software. A platform like SharePoint can only achieve true success if customers can procure additional platform capability as supported products, not just consulting services. With the impending release of SharePoint 2010, this is a great time to start developing sellable software products for the SharePoint platform.

 
October 20, 2009

Visual Studio 2010 Beta 2 has new SharePoint 2010 development tools


Download the Visual Studio 2010 beta and check out the SharePoint 2010 walkthroughs and how to’s here:

http://msdn.microsoft.com/en-us/vstudio/dd441784.aspx

 

Walkthroughs

How To…

 

The Catch

What’s the catch? The SharePoint development tools included in Visual Studio 2010 are exclusively for SharePoint 2010. What can you use to accelerate your SharePoint 2007 development today? Check out WSPBuilder. WSPBuilder is an open-source tool and add-in for Visual Studio 2008 that not only automates a lot of the manual tasks of creating a SharePoint solution package (WSP file), but also comes with lots of Visual Studio templates for SharePoint artifacts, like web parts, features, content types, and so on. There’s a great WSPBuilder walkthrough by Tobias Zimmergren here.

Then, look at SharePoint Solution Installer. SharePoint Solution Installer will take any WSP package and wrap it in a very friendly wizard interface. Next, next, next, and the WSP you created using WSPBuilder is deployed to the entire SharePoint farm.

[Image: InstallerCheck.jpg]

 

October 13, 2009

Impersonation options in SharePoint code


When creating web parts, event receivers, timer jobs and other SharePoint code, I often find that I need to temporarily grant the code more permissions than the current user has. I’ve found that many developers either don’t know what options they have for user impersonation through code or are confused as to the subtle differences between the various options.

Impersonate the application pool identity

WSS V3 introduced the RunWithElevatedPrivileges method of the SPSecurity class. The RunWithElevatedPrivileges method accepts a delegate method with no parameters and impersonates the identity of the IIS application pool serving the current SharePoint request just prior to executing the user-specified delegate method. Once the delegate method has completed execution, the RunWithElevatedPrivileges method reverts the identity back to the current user’s identity. Best of all: you don’t even need to know the username or password of the application pool identity. Here is some sample code that deletes a tasks list while impersonating the application pool identity:

// Capture the site and web IDs before elevating, so the elevated objects
// are created fresh under the application pool identity
Guid siteid = SPContext.Current.Site.ID;
Guid webid = SPContext.Current.Web.ID;

SPSecurity.RunWithElevatedPrivileges(delegate()
{
    using (SPSite site = new SPSite(siteid))
    {
        using (SPWeb web = site.OpenWeb(webid))
        {
            SPList taskList = web.Lists["Tasks"];
            taskList.Delete();
        }
    }
});

There are a few gotchas of which you should be aware when using the RunWithElevatedPrivileges method:

  1. Your code will need the proper level of code access security (CAS) permissions in order to run the RunWithElevatedPrivileges method.
  2. While impersonating the IIS application pool identity will give your code the highest level of permission within the current SharePoint web application, your code will have minimal permissions within other SharePoint web applications. This usually makes the RunWithElevatedPrivileges method a poor choice if you need to access a different SharePoint web application from the one in which the code is currently executing.
  3. According to SharePoint deployment best practices, IIS application pool identities should have minimal permissions on the SharePoint web servers. This means that your code may have minimal access to the underlying web server while executing within the RunWithElevatedPrivileges delegate method. You may not be able to do things like write to the Windows Event logs, work with temporary files, access the registry, or any other web server resource that requires a high privilege level.
  4. This method was implemented in WSS V3, so it will be unavailable in WSS V2 or SharePoint Portal Server 2003.

 

Impersonate the application pool identity, take 2


Update: Thanks to Jonas for correcting me and apologies to Todd for misquoting him.


Todd Bleaker, SharePoint MVP, clued me in to this technique. Normally the .Net Framework System.Security.Principal.WindowsIdentity.Impersonate method accepts a pointer to a Win32 logon token. However, there is a neat trick you can use with the method: if you pass in a zero pointer (System.IntPtr.Zero), the method will allow your code to run as the identity of the IIS application pool serving the current SharePoint request. As with the RunWithElevatedPrivileges method, you do not need a username or password to do the impersonation. However, unlike the RunWithElevatedPrivileges method, you can use this technique with WSS V2 and SharePoint Portal Server 2003. Here is the same code as in the previous example, but altered to use this impersonation method:

WindowsImpersonationContext ctx = null;
try
{
    // Passing IntPtr.Zero impersonates the IIS application pool identity
    ctx = WindowsIdentity.Impersonate(System.IntPtr.Zero);

    using (SPSite site = new SPSite(siteid))
    {
        using (SPWeb web = site.OpenWeb(webid))
        {
            SPList taskList = web.Lists["Tasks"];
            taskList.Delete();
        }
    }
}
finally
{
    // Always revert to the original identity
    if (ctx != null)
        ctx.Undo();
}

This impersonation method has the same caveats as the RunWithElevatedPrivileges method technique, except that this will work with WSS V2 and SharePoint Portal Server 2003:

  1. Your code will need the proper level of code access security (CAS) permissions in order to run the impersonation (WSS_Medium, according to Todd Bleaker’s article).
  2. While impersonating the IIS application pool identity will give your code the highest level of permission within the current SharePoint web application, your code will have minimal permissions within other SharePoint web applications. This usually makes impersonating the application pool identity a poor choice if you need to access a different SharePoint web application from the one in which the code is currently executing.
  3. According to SharePoint deployment best practices, IIS application pool identities should have minimal permissions on the SharePoint web servers. This means that your code may have minimal access to the underlying web server while impersonating the application pool identity. You may not be able to do things like write to the Windows Event logs, work with temporary files, access the registry, or use any other web server resource that requires a high privilege level.

 

Impersonate a named user account

While you can use the System.Security.Principal.WindowsIdentity.Impersonate method to impersonate the Local System account, the real purpose of the method is to allow you to impersonate named user accounts, such as mydomain\jsmith. The .Net Framework has a nice model for handling the impersonation context once you have a logon token but is somewhat vague on the mechanics of actually getting logon tokens. My RunAs in C# article with source code details how to get these elusive logon tokens and execute your code in the generated impersonation context. The article wraps the LogonUser Win32 API. The main benefit to this approach is that you can execute code as any user you want, as long as you know the user’s username and password. This includes doing highly privileged operations within SharePoint, SharePoint farm servers, or any other server on the network. Here is a code sample that executes a file move using user-supplied credentials:

BlackBlade.Utilities.SecurityUtilities.RunAs(delegate()
{
    // Runs under the supplied credentials, so the move can touch network locations
    File.Move("local_file", "network_file_location");
},
"a_username",
"some_password");

What are the catches?

  1. You need to know the username and password of the account you want to impersonate.
  2. Accessing the Win32 API can generate non-intuitive error messages and be tough to debug.
  3. Finding a secure place to store the username and password of the account you’re impersonating can be tough. I recommend looking at the MOSS 2007 SSO infrastructure.

 

Impersonate the Local System Account

Normally the LogonUser Win32 function used to get logon tokens for .Net impersonation requires a domain, username, and password. However, there is a neat trick you can use with the function: if you pass in “NT AUTHORITY” for the domain and “Local System” as the username, it will allow your code to run as the Local System account. You do not need a password. The Development Hole blog has a great article with code detailing how to impersonate built-in service accounts. The Local System account can do pretty much anything on any SharePoint server in the farm. As with the other impersonation methods, you will need to manually revert the impersonation context. Here is some code that uses this technique to write an entry to the local SharePoint server’s Application event log:


using (Impersonation imp = new Impersonation(BuiltinUser.NetworkService))
{
    // Writing to the event log requires privileges most SharePoint users lack
    this.EventLog.WriteEntry("This line would throw an exception for most SharePoint users or even most application pool identities");
}

If this method allows you to impersonate the Local System account and act as a local SharePoint server administrator at will, why would you not always want to use this impersonation method? Here are a couple of reasons:

  1. This method always impersonates the web server’s Local System account, regardless of in which SharePoint web application the code is running. This can make it very difficult to track which web application is accessing the web server’s resources.
  2. In most cases, the Local System account will not have rights to any network resources, including SharePoint content or administrative tasks. While you’ll be able to do anything you want on the local web server, you will not be able to engage in any SharePoint-related activities.

 


October 8, 2009

VMware Workstation networking not working in Windows 7


I just upgraded my Vista x64 computer to Windows 7. When I started VMware Workstation 6.5.3 for the first time after the upgrade and launched a virtual machine, VMware displayed the following error:

The virtual network drivers on the host are incompatible with the installed VMware application. Expected version 5. Please reinstall the product.


The error sounds a lot worse than it actually is. Turns out that all I had to do to get things working was run a repair on VMware Workstation. You will need the VMware Workstation installation program. Do not run the repair option from the Programs and Features section of the Windows 7 Control Panel. If you do, you will receive a message box telling you to run the original VMware Workstation installation software, and the network repair will not complete properly.

The repair process will take a while to complete, about 15 minutes on my computer. The installation software will remove and reinstall all of the virtual drivers, including the networking, on the host computer. You will need to restart your computer after the repair is complete.

Once you restart your host computer, you may get this error message:

Device driver software was not successfully installed.


This is actually referring to the two network drivers VMware installs. I’ve only seen this on one computer so far, but as near as I can tell, there are no adverse effects yet.