Wednesday, December 03, 2008

How to get Hyper-V to read corrupted VHD files

We have lots of projects that are virtualized.  When we finish a project, it is common practice to dehydrate the servers and put them away.  For one of our customers' projects, we followed this process over a year ago.  Lots of things have changed in the past year. First, we moved from one building to a different building.  In the process, we added new servers, cleaned up some old AD domains, changed OSes (Windows 2008 DataCenter... oh yeah!), etc...

Now, trying to hydrate this one project's environment back online with all of the infrastructure changes was a challenge. For this enterprise application we have Oracle, app servers, web servers, test servers, etc.

Out of all of the VHDs that we needed to bring back, only 2 were recognized by Hyper-V! Doing a check on the rest of those VHDs gave me the "file or directory is corrupted and unreadable" message... :(


This is how to check for integrity using Hyper-V: add the VHD file to the IDE controller and then hit the Inspect button.


[Main Instruction]
An error occurred when attempting to retrieve the virtual hard disk "F:\hydrate\Project1-Agent.vhd" on server XXX.

The file or directory is corrupted and unreadable.

At this point, recreating all of the settings/configurations stored on those servers would take days! Some entries in the TechNet forums mentioned that people got it working by re-opening the files in Virtual PC. This is how I got my corrupted VHDs to work:

First, I created a new virtual machine with a new hard disk.  Then I went to Settings | Hard Disk 1 and loaded the corrupted VHD file.


There you will find the Virtual Disk Wizard. Click on it, go through the prompts and select the option to Edit an existing virtual disk.

Then choose to Compact it.  The reason for this is that if you select Convert it to a fixed-size virtual hard disk, the wizard will make the new hard disk EXACTLY the size of the content currently written to it.  Which means that if you allocated 20GB to this disk initially and have used 12GB of it, instead of having a 20GB disk with 8GB free, you end up with a fixed 12GB hard drive with NO SPACE LEFT on it.


On the next screen, I saved the file under a different name as a precaution. Now click Next and wait; it took about an hour for a 28GB file.  Once this process completed, I copied the new file back to our Windows 2008 server, loaded it in Hyper-V, and it was recognized!


Wednesday, November 12, 2008

Windows 2008 DataCenter Mouse Blues

Working with the Windows 2008 DataCenter version, I had to create a couple of virtual machines. Once I remoted into that server and created the virtual machines, I found out that connecting to those VMs while using Remote Desktop causes the VM to not recognize the mouse!

I kept getting this annoying dialog box: Mouse not captured in Remote Desktop session.

Of course, the context help wasn't helpful at all:

The mouse is available in a Remote Desktop session when integration services are installed in the guest operating system. For more information, search on 'integration services' in Help.


If you look at the bottom of the Virtual Machine Connection screen, you see this:


I tried removing the Virtual Machine Additions, adding the Hyper-V Integration Services, etc., etc.  It still did not work. I found a blog post from Maarten van Stam that talks about making the VM re-detect the HAL by adding an extra CPU to the VM.  At this point, I was very aggravated that I didn't have a mouse and kept having to use TAB and SHIFT-F10 all over the place, so I gave it a try.

After adding an extra CPU and rebooting the VM, my mouse works!  WTF?  I removed the extra CPU from the VM and confirmed that the mouse in fact still works.

One more thing: you can also get your mouse working by installing the Hyper-V Management Tools on your Windows Vista machine; here is the MSDN article that talks about it.

Thursday, October 30, 2008

How to configure a MOSS VPC Development image

On the Microsoft TechNet site, there are a couple of articles on how to set up a MOSS environment. Following those steps will give you a good environment.  However, I have found that there is not much control over the naming of the databases.  This is fine, but then you have to come back and do some extra steps to rename them correctly.  These are the steps that I follow to avoid having to come back and do all of that work.

I have already configured a VPC image with Windows Server 2003, SQL Server, Visual Studio 2008, etc.  I have also set it up as a Domain Controller (DC) and created a couple of accounts (SharepointService, SQLService, etc.).

There are instructions on TechNet on how to set up the SQL Server before setting up the MOSS environment: Prepare the database servers. Follow those steps.

Now start the MOSS setup and select the Advanced installation.


I choose to install the Web Front End, since it gives me the most flexibility.


Once this step has finished running, this is a good point to rename the Content DB's.


Do not run the configuration wizard yet.


Uncheck and close this window. Now use the psconfig utility, which is available at this location:

C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\BIN

Parameter               Value
Server                  MossOnlyAW
User                    SharepointService
Password                *********
Config DB name          STS_Config
Admin Content DB name   STS_AdminContent

These are the parameters used to configure the initial databases; the full psconfig command is:

  psconfig -cmd configdb -create -server MossOnlyAW -database STS_Config -user MOSSBSG\SharepointService -password <myPassword> -admincontentdatabase STS_AdminContent

Run this from the command prompt window and you will get the following results:

SharePoint Products and Technologies Configuration Wizard version 12.0.4518.101
Copyright (C) Microsoft Corporation 2005. All rights reserved.

Performing configuration task 1 of 3
Initializing SharePoint Products and Technologies configuration...

Successfully initialized the SharePoint Products and Technologies configuration

Performing configuration task 2 of 3
Creating the configuration database...

Successfully created the configuration database.

Performing configuration task 3 of 3
Finalizing the SharePoint Products and Technologies configuration...

Successfully completed the SharePoint Products and Technologies configuration.

Total number of configuration settings run: 3
Total number of successful configuration settings: 3
Total number of unsuccessful configuration settings: 0
Successfully stopped the configuration of SharePoint Products and Technologies.
Configuration of the SharePoint Products and Technologies has succeeded.

C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\BIN>

Now if you check your SQL Server, you will see the 2 databases that were created; it also went and added the SharepointService account to the valid logins in your SQL Server.


Now go ahead and run the SharePoint Configuration Wizard. It will pick up the databases that you have created; do not disconnect from this server farm:


I like to specify a port number that is easy to remember.


Gotchas:  You might run into an error if the service account does not have enough rights:

Server Error in '/' Application.

The current identity (MOSSBSG\SharepointService) does not have write access to 'c:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files'.

So open up this folder and add the SharepointService account to the list of authorized users, giving the account Modify access.
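If you prefer to script the permission change, something like this should do it from an elevated command prompt. This is just a sketch: icacls ships with Windows Server 2003 SP2 and later, and the account name below is the one from my setup, so substitute your own.

```shell
:: grant the SharePoint service account Modify rights, inherited by files and subfolders
icacls "C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files" /grant "MOSSBSG\SharepointService:(OI)(CI)M"
```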


Close the browser and re-open the Central Administration page. You can now start to configure your MOSS sites.

Friday, October 17, 2008

Certification vs Google Developer


I have seen this thread so many times: those who don't have certifications always bashing those that do. Yes, there are lots of paper-cert developers out there.  Braindump sites have brought down the core value of certifications.  But consider the facts:

Certification Preparation Facts:

  1. To become certified, you need to at least study the preparation guide for the exam.
  2. You will need to read at least one book on the subject.
  3. You would probably take some simulation tests before taking the exam.

 Summon the vast power of certification - Dilbert

A certification WILL NOT get you a job! There are lots of factors that influence the decision to hire someone.  However, having achieved a certification in a new technology goes a long way toward showing that you have the Passion and Commitment to stay on top of the technology.

Yes, writing a blog entry does show that you are into technology.  But having to study to pass a test is a different beast.  A blog is written in your spare time, and there is no commitment as to when you need to blog.

I hate when someone tells me that they don't know something, but they are fast learners and can find it really quickly on Google.  That's great, but when you are presenting in front of a client, they expect you to know EVERYTHING right there.  When hiring a developer, there is a base level of knowledge that they should have. Knowing the basics first, and then knowing how to find information on the internet for everything else, is fine with me.

Basic knowledge like how to use the XmlWriter, what the BackgroundWorker is, creating a delegate, using globalization and resources, or ClickOnce deployment: all of these concepts and more are required on the certification exams for .NET.

When you start getting into the enterprise playing field, certifications are more valuable in terms of knowledge than as chrome on your resume; i.e., knowing what can be done and how it can be done in BizTalk is more important to a client than the actual detailed implementation of it.  This is because every client out there has a different problem that they want you to solve, and they are looking to you for answers.

Once again, certification alone does not qualify ANY developer for a job, but at least it shows that this person has the determination to study and learn the basics of doing his job.  Which one will I hire?  The one that has a black belt in Google searching, or the one that took the extra time to learn something?

Do you have the time to study?  Do you have the determination and commitment?  Do you have the experience? And, most important of all, are you Passionate about Technology?

Certification Pros:

  1. Your team will only move as fast as the slowest developer.  Requiring a certification sets the base from which you can start moving forward; you don't have to guess what level each of your team members is at.
  2. Invest some training $$ in your team; they will feel appreciated [read more...]
  3. Think like an owner! Everyone wants to work for a Gold Certified Partner. Well, getting those Microsoft competencies has some requirements:
    1. Customer references
    2. CERTIFICATION requirements!

If you think that certifications are used only for evaluating a prospective candidate, think again. How does your company benefit from having certified developers?  Can your company truly stand out among the other companies out there without certified people? If you want to be a business partner with some of the big companies out there like Microsoft, IBM, or Sun, guess what?  You need certifications!

The guy that studied and made that extra effort is the one I want on my team, and in my Company [Quick Solutions, Inc.].

Thursday, October 09, 2008

Visual Studio 2008 free eBooks

An offer from Microsoft to download some cool content; check it out:

Programming Microsoft LINQ, by Paolo Pialorsi and Marco Russo (ISBN: 9780735624009)
Introducing Microsoft Silverlight 2, Second Edition, by Laurence Moroney (ISBN: 9780735625280)
Programming Microsoft ASP.NET 3.5, by Dino Esposito (ISBN: 9780735625273)

Thursday, October 02, 2008

DocTools not working

For one of the projects we are working on, we need to make sure our end-user documentation correctly matches the template provided by the customer. I have started playing with one of the products that the Patterns & Practices team at Microsoft is using to generate their documentation. Take a look at their package:

After going through all of the installation and prerequisites, I was ready to try the sample Word document they provide.  Snag!

The first thing I noticed is that the sample scripts look for the Microsoft DocTools in c:\program files\Microsoft DocTools.  Changing those scripts brought to my attention that the install program gives you the ability to specify the path where to install the DocTools.  However, it will always install to the (x86) directory on my Vista x64.

The main error I was getting was:

PS C:\DocToolsDemo> .\HTML.cmd

C:\DocToolsDemo>PowerShell ConvertToHTML.ps1 ESBIntro.docx .\outputHTML """C:\Program Files (x86)\Microsoft DocTools\Document Converter\Formatting\MSDN2\xsl""" convertESBConfig.xml

    Directory: Microsoft.PowerShell.Core\FileSystem::C:\DocToolsDemo

Mode                LastWriteTime     Length Name
----                -------------     ------ ----
d----         10/1/2008   9:10 AM            outputHTML

    Directory: Microsoft.PowerShell.Core\FileSystem::C:\DocToolsDemo\outputHTML

Mode                LastWriteTime     Length Name
----                -------------     ------ ----
d----         10/1/2008   9:10 AM            html
Add-PSSnapin : No Windows PowerShell Snap-ins are available for version 1.
At C:\Program Files (x86)\Microsoft DocTools\Document Converter\ConverterLibrary.ps1:9 char:15
+         Add-PSSnapIn  <<<< ppConverter.Cmdlets

--Splitting: C:\DocToolsDemo\ESBIntro.docx
An Error Ocurred: The term 'Split-Document' is not recognized as a cmdlet,
function, operable program, or script file.Verify the term and try again.

PS C:\DocToolsDemo>

After doing all of the regular stuff (checking PATH, re-installing, starting/stopping the program, googling), I could not get this error to go away.

The problem seems to be the OS I am running.  Since I am running a 64-bit OS, I need to call the 32-bit PowerShell that lives in c:\windows\syswow64, not the native 64-bit one.  This is accomplished by running the one marked with (x86).


Once I opened this one and ran the set-executionpolicy remotesigned command, the script ran, split the Word document successfully, and generated my compiled HTML help file.
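For reference, this is roughly what the fix looks like from a command prompt. A sketch only: the path assumes a default install of PowerShell v1 on an x64 box.

```shell
:: launch the 32-bit (x86) PowerShell that lives under SysWOW64 on a 64-bit OS
%SystemRoot%\SysWOW64\WindowsPowerShell\v1.0\powershell.exe

:: then, inside that 32-bit session:
::   Set-ExecutionPolicy RemoteSigned
::   .\HTML.cmd
```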


One of those simple bugs that can waste a whole hour for you!

Tuesday, September 16, 2008

Moving TFS Build Server from Single instance to Multiple Servers

I have moved our build server from a single-server installation to a 2-server installation.  When putting together the new build server, I used my own domain account, since I did not have access to the TFSService account. My thought was that I could work on setting up the system, and then, once I got the correct userid/password, I could easily swap in that account for mine...


I set up the environment and submitted a build type successfully. I had a little issue with directory permissions, but once I got that worked out, everything was building fine.  Then I got the correct domain account, and I went and changed the Team Build service to run under this domain account.

I got this error:

error MSB4018: The "CreateWorkspaceTask" task failed unexpectedly.

Here is the description for that error:


C:\Program Files\MSBuild\Microsoft\VisualStudio\v8.0\TeamBuild\Microsoft.TeamFoundation.Build.targets(306,5):

error : The working folder c:\TFS_Builds\Builds\Sample\awing_test2\Sources is already in use by another workspace on this computer.
error MSB4018: The "CreateWorkspaceTask" task failed unexpectedly.
error MSB4018: Microsoft.TeamFoundation.VersionControl.Client.WorkingFolderInUseException:
The working folder c:\TFS_Builds\Builds\Sample\awing_test2\Sources is already in use by another workspace on this computer. ---> System.Web.Services.Protocols.SoapException: The working folder c:\TFS_Builds\Builds\Sample\awing_test2\Sources is already in use by another workspace on this computer.

From the Studio client machine, I was testing some of the builds.  Well, part of the script is to create a workspace on my behalf on that machine. Since it was running under my domain account, when I switched the service to run under the TFSService account, it was encountering a duplicate error.


The solution for this error is to remove the workspace entry from the client machine, not the build server machine.  This solved my issue and I was able to submit build types correctly.
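If you want to do the cleanup from the command line, tf.exe can list and delete the stale workspace. This is a hedged sketch: the workspace, server, machine, and account names below are placeholders, and the switches are from the Team Explorer 2005/2008 era.

```shell
:: list the workspaces registered for the account on the client machine (names are placeholders)
tf workspaces /owner:MYDOMAIN\myaccount /computer:MyClientMachine /s:http://tfsserver:8080

:: delete the stale workspace entry (format is workspacename;owner)
tf workspace /delete "MyBuildWorkspace;MYDOMAIN\myaccount" /s:http://tfsserver:8080
```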

Friday, September 05, 2008

BizTalk 2009 has been announced!

Sweet! The new version of BizTalk Server has been announced.  I am so glad they dropped the BizTalk 2006 R3 name and went with the new BizTalk 2009.


Finally, all of the OSLO technologies are starting to show up in the road map...

Read more on the Microsoft Site

Wednesday, August 06, 2008

BizTalk Business Rules Engine Handy functions in .NET

Working with the BizTalk 2006 R2 BRE API, I find myself coming back to this code as a starting base.  These 2 functions get ALL of the versions of the policies or vocabularies.  If you want to work with only the latest, or only the published ones, change the RuleStore.Filter enum to whatever suits your needs.

List of Policies

   // requires: using System.Diagnostics; using Microsoft.RuleEngine;
   public void GetPoliciesList()
   {
       Microsoft.BizTalk.RuleEngineExtensions.RuleSetDeploymentDriver breDriver =
           new Microsoft.BizTalk.RuleEngineExtensions.RuleSetDeploymentDriver();
       Microsoft.RuleEngine.RuleStore breStore = breDriver.GetRuleStore();

       Microsoft.RuleEngine.RuleSetInfoCollection colPolInfo =
           breStore.GetRuleSets(RuleStore.Filter.All);
       foreach (RuleSetInfo pInfo in colPolInfo)
       {
           Trace.WriteLine(string.Format("bts- Info = [{0}].v.{1}.{2}",
               pInfo.Name, pInfo.MajorRevision, pInfo.MinorRevision));

           // get the policy to extract its rules
           Microsoft.RuleEngine.RuleSet pol = breStore.GetRuleSet(pInfo);
           Trace.WriteLine(string.Format("bts- Count = [{0}]", pol.Rules.Count));
       }
   }

This is how to get a list of all policies published in the BRE database.  Once you get all of the RuleSetInfo entries, you can loop through each of them to retrieve the actual rules behind them.  Before getting to the rules, you need to get a RuleSet out of the RuleSetInfo.

List of Vocabularies

   // requires: using System.Diagnostics; using Microsoft.RuleEngine;
   public void GetVocabulariesList()
   {
       Microsoft.BizTalk.RuleEngineExtensions.RuleSetDeploymentDriver breDriver =
           new Microsoft.BizTalk.RuleEngineExtensions.RuleSetDeploymentDriver();
       Microsoft.RuleEngine.SqlRuleStore sqlRuleStore = (SqlRuleStore)breDriver.GetRuleStore();

       Microsoft.RuleEngine.VocabularyInfoCollection colVocInfo =
           sqlRuleStore.GetVocabularies(RuleStore.Filter.All);
       foreach (VocabularyInfo vInfo in colVocInfo)
       {
           Trace.WriteLine(string.Format("bts- vInfo = [{0}].v.{1}.{2}",
               vInfo.Name, vInfo.MajorRevision, vInfo.MinorRevision));

           // get the vocabulary to extract its collection of definitions
           Microsoft.RuleEngine.Vocabulary voc = sqlRuleStore.GetVocabulary(vInfo);
           Trace.WriteLine(string.Format("bts- Count = [{0}]", voc.Definitions.Count));
       }
   }

This is how to get a list of all vocabularies published in the BRE database.  If you need to see how to get to all of the definitions in a particular vocabulary, see my previous post: How to access BRE Vocabularies from .NET.

Notice that there are some subtle differences in how you retrieve each piece of information: for policies you get a RuleStore, while to get the vocabularies you need a SqlRuleStore.

As usual, feedback is always welcome if you use this code.

Tuesday, July 29, 2008

Deploying Business Rules using C# in BizTalk 2006 R2

On one of my last projects, I have been working extensively with the Business Rules Engine in BizTalk. This project contains some complex Federal/State/Company rules. Being a health care management related project, I had to deal with many issues like requirements, changing policies, rules scoped to a certain type, etc.

Delivering this project took some time and effort.  Not only were the requirements *agile*, but trying to keep compliance with SOX laws was a challenge.  I can summarize all of the requirements as these 3:


  1. Policies have to be atomic (independent from changes to other policies).
  2. Versioning of policies, and being able to execute ANY version at any time.
  3. Need to know which facts were used to determine an outcome.

For the main requirement, I created a master rule that determines the outcome; then I created several *supporting* rules that help me determine which of the rules were evaluated. The versioning requirement was already implemented by the BRE in BizTalk.

A sample of this would be something like this:


It quickly became very obvious that I had to deal with lots of policies and vocabularies.  If you have ever tried using the Rules Engine Deployment Wizard, you will see that it only allows you to export a SINGLE policy at a time.  With over 500 policies and over 30 vocabularies, that was not going to work out.

I found the DeployRules.exe application written by Sreedhar Pelluru from Microsoft.  Here is the original article.  I took his program and modified it to meet my needs.  Since I was only testing a set of policies at a time (i.e., Federal policies only), this tool provided me the ability to load only those policies that were relevant to the type I was working on.

I have learned a lot about the BRE API from reading his well-documented code.  With his permission, I have posted his original work and my modifications back to the community.  Yes, it is still a work in progress.

Hope this helps someone out there.

Tuesday, July 08, 2008

Importing BRE Vocabulary with Multiple versions

If you have ever tried merging all of the versions of a single vocabulary into one XML file, so that you can import it in a single task, you will find that even though the Rules Engine Deployment Wizard understands the file format, it has a huge limitation [bug? ;)].  It only imports the last version of a vocabulary into the Rules Engine.


A sample vocabulary with 2 versions.


When exporting this vocabulary, I can't export all versions at once.  I have to export a single version at a time!

However, on the import, you can import a file that can contain multiple versions on it:


Once you export all of the individual files, you can merge them into a single XML file.

The format of the merged exported vocabulary will be something like this:

<brl xmlns="">
    <vocabulary id="9ab458cc-427a-4cea-bb1d-224dd5f96d98" name="CustomerLevels" uri="" description="">
        <version major="1" minor="1" description="" modifiedby="awing" date="2008-07-07T23:22:33.401-04:00"/>
        <vocabularydefinition id="b36b276e-451d-4783-8a06-623823211f85" name="Silver" description="Silver Description">
        ...
    <vocabulary id="2422362a-77c0-4d0f-b2aa-fe6c1fe1f1d7" name="CustomerLevels" uri="" description="">
        <version major="1" minor="0" description="" modifiedby="awing" date="2008-07-07T22:39:37.19-04:00"/>
        <vocabularydefinition id="ba912d07-f96d-49ac-a2c4-e619fcec027e" name="Silver" description="Silver Description">
        ...


As you can see, you can add as many versions to this file as you want.  However, the ReDeployWiz.exe only publishes and imports the latest one.

Trying to figure out why this is the behavior, I used my good old friend Reflector. Pointing Reflector at the Rules Engine Deployment Wizard, I see that there is a call into the RuleSetDeploymentDriver class. Its doImport method calls driver.ImportAndPublishFileRuleStore to import and publish at the same time.



There seems to be no way around this call; it is a limitation of the tool.  ImportAndPublishFileRuleStore seems to work on only a single version at a time.  If you split the versions into their own files, it can handle them. The downside of this is that you need to make multiple calls for an import.  And let's face it, the whole nature of versioning policies and vocabularies becomes very cumbersome when you have over 500 rules with multiple versions in them [yes, my current project has over 500 policies and over 30 vocabularies].

To get around this limitation in the tool, you will have to write your own application to deploy/export all versions of a vocabulary.  To accomplish this, you will need to call the SqlRuleStore class instead of the RuleSetDeploymentDriver.  This class has an Add method with several overloads, one of which allows you to choose whether or not to publish your vocabulary.


Here is my sample code to import ALL versions of a vocabulary in .NET code:

    private static int ImportVocabulary(string filename)
    {
        int result = 0;

        // FileRuleStore - gives access to the BRL (XML) file containing policies and vocabularies
        FileRuleStore fileRuleStore = null;

        // RuleSetDeploymentDriver has the important deployment methods
        Microsoft.BizTalk.RuleEngineExtensions.RuleSetDeploymentDriver dd =
            new Microsoft.BizTalk.RuleEngineExtensions.RuleSetDeploymentDriver();

        // SqlRuleStore - gives access to the rule engine database
        SqlRuleStore sqlRuleStore = (SqlRuleStore)dd.GetRuleStore();

        // get a VocabularyInfoCollection object based on the file
        fileRuleStore = new FileRuleStore(filename);
        VocabularyInfoCollection vocabularyInfoList = fileRuleStore.GetVocabularies(RuleStore.Filter.All);
        foreach (VocabularyInfo vocabularyInfo in vocabularyInfoList)
        {
            string vocabularyNameWithVer = string.Format("{0}.{1}.{2}",
                vocabularyInfo.Name, vocabularyInfo.MajorRevision, vocabularyInfo.MinorRevision);
            ExtractVocabularyNameMajorMinor(vocabularyNameWithVer);
            VocabularyInfo vi = new VocabularyInfo(App.vocabularyName, App.vocabularyMajorVer, App.vocabularyMinorVer);
            Vocabulary oVoc = fileRuleStore.GetVocabulary(vi);
            sqlRuleStore.Add(oVoc, App.publishVocabulary);
        }
        return result;
    }

There, now you are no longer bound to importing a single vocabulary version every time you need to move your rules from DEV to UAT to PROD.

Hope this saves someone lots of time and grief.  Happy BRE'ing... ;)

Friday, June 06, 2008

MOCSDUG: ESB Guidance Toolkit for BizTalk 2006 R2

Last night's 2nd meeting was just as good as the first one.  Richard Broida gave a good overview of some of the key points of the ESB Guidance toolkit, as well as some good background info on having a good architecture base.

There were some BizTalk developers in the room, and some others that were interested in BizTalk. Somehow, the seating arrangement turned out to be all BizTalk developers in the middle section and everyone else out on the sides. ;)

Richard's blog is, and I'm waiting for his slide deck to show up at the MOCSDUG site.

ESB Guidance Toolkit for Biztalk 2006 R2 

I have been interested in the ESB Guidance toolkit, and after tonight's meeting I have decided to install it and try some of their samples.  Of interest to me are the Message Repair block and the Exception Handler block.  I will post my findings on those blocks when I get them running.

Wednesday, May 14, 2008

Distinguished fields of type xs:dateTime not Working on Orchestrations

Writing a spike for a simple program, I ran into this odd behavior when I tried to compile my BizTalk project.


I have created a simple schema, and one of its fields is of type xs:dateTime. Well, when I tried to use this field in an Expression shape, I got this build compiler error:

'System.Xml.XmlDocument' does not contain a definition for 'XXXX'

where XXXX is the field name of the element that I have declared as an xs:dateTime type. I then went and set it up as a distinguished field. Here is a sample generated XML from my test schema:

  <ns0:Customer xmlns:ns0="http://DistingProperty.Test.Customer.v1">

And this is the schema that I have used:


When I try to use the distinguished field inside an Expression shape, notice that I get Visual Studio IntelliSense:


When you try to read this value, it will always complain about the XmlDocument not being able to find the definition for the field that is defined as DateTime.

To get around this, use System.Convert.ToString() instead of the .ToString() method:

  Trace.WriteLine(" bad:[" + msgIN.DOB_datetime.ToString() + "]");
  Trace.WriteLine("good:[" + System.Convert.ToString(msgIN.DOB_datetime) + "]");

Another, lengthier way to get around this issue is to assign the message to an XmlDocument and then use an XmlNamespaceManager to get to the node value instead.

The code on my expression shape looks like this:

  System.Diagnostics.Trace.WriteLine("bts- In here");

  // assign values to person
  xDoc = msgIN;

  xmlnsMgr = new System.Xml.XmlNamespaceManager(xDoc.NameTable);
  xmlnsMgr.AddNamespace("ns0", "http://DistingProperty.Test.Customer.v1");

  xNode = xDoc.SelectSingleNode("/ns0:Customer/DOB_datetime", xmlnsMgr);
  System.Diagnostics.Trace.WriteLine("bts-[" + xNode.OuterXml + "]");

  xNode = xDoc.SelectSingleNode("/ns0:Customer/DOB_date", xmlnsMgr);
  System.Diagnostics.Trace.WriteLine("bts-[" + xNode.OuterXml + "]");

  // this works as expected
  System.Diagnostics.Trace.WriteLine("bts- " + msgIN.FName);


Whichever way you choose, this looks like a limitation in the way XLANG/s in the Expression shape interprets the code.

Monday, May 12, 2008

How to Setup Windows SharePoint Services 3.0 with BizTalk 2006 R2

Today I ran into this error AGAIN! [old post]

Setup is unable to proceed due to the following error(s):
This product requires ASP.NET v2.0 to be set to 'Allow' in the list of Internet Information Services (IIS) Web Server Extensions. If it is not available in the list, re-install ASP.NET v2.0.
Correct the issue(s) listed above and re-run setup.

This time I am installing WSS 3.0 with SP1 on a Windows 2003 SP2 machine.  This is a greenfield installation of BizTalk 2006 R2.

Notice that in my IIS Manager, there is no ASP.NET 2.0 service extension:


I ran the standard command that *everyone* should have memorized by now... ;)

c:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\aspnet_regiis.exe -iru -enable

Now, when I open my IIS Manager, I see the ASP.NET v2.0 web service extension enabled:


Running the setup.exe for WSS works fine now. 

Hint:  Don't forget to select Advanced settings and then select the Web Client configuration.  This is a necessary step if you want to specify the name of the database where the WSS configuration will live.

Thursday, May 08, 2008

Error trying to re-install SQL Server 2005 on a Failover Cluster environment

I had to uninstall the SQL instance that I have on my virtual cluster.  The reason?  I could not get Service Pack 2 to be recognized by my BizTalk configuration.  Apparently, the SP2 that I had installed was 9.00.3027.0 and not 9.00.3042.1.  Monish pointed out that I might have an older version of the Service Pack [he is the only person I know that reads that EULA information...]

 9.0.3027.0 - 12/1/2006 11:17am
 9.0.3042.1 - 9/5/2007 12:09am

I am trying to start the SQL 2005 installation from scratch, and when I run the setup I get this message:


TITLE: Microsoft SQL Server 2005 Setup
There was an unexpected failure during the setup wizard. You may review the setup logs and/or click the help button for more information.

For help, click:

Well, clicking on that link does not provide any more help. I click on Help and I get this other screen:


The last line tells me the event type that failed:


Now, I have the code that causes the installation to fail. What's next?  ;)

Then I found this other technical article on MSDN, 925976, which suggested cleaning up the registry.  I went and cleared all of the registry entries on my SQLNode1, and I still got the same error.  I then followed the same instructions on my SQLNode2.  This still did not allow me to run setup.  So I went one step deeper: instead of removing just the HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL.X\ registry key as they suggested, I removed all hives starting from HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server.
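Deleting the whole Microsoft SQL Server hive by hand is tedious because registry keys cannot be removed while they still have subkeys: you have to delete depth-first, children before parents. The sketch below only shows that ordering, with a plain dict standing in for the registry tree; real code on Windows would use winreg, and all names here are toy stand-ins:

```python
def delete_key_tree(tree: dict, path: list, deleted: list) -> None:
    """Record the depth-first order in which registry keys would have to be
    deleted (children before their parent key)."""
    node = tree
    for part in path:
        node = node[part]
    for child in list(node):
        delete_key_tree(tree, path + [child], deleted)
    # Only now is the key empty and safe to remove.
    deleted.append("\\".join(path))

# A toy stand-in for HKLM\SOFTWARE\Microsoft\Microsoft SQL Server
registry = {"Microsoft SQL Server": {"MSSQL.1": {"Setup": {}}, "90": {}}}
order = []
delete_key_tree(registry, ["Microsoft SQL Server"], order)
print(order)
```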

Success!  Now I am able to run the setup on my primary node, and I *WILL* install the correct Service Pack this time... ;)

Monday, April 21, 2008

Day of .NET - Wilmington, OH


One more year of an awesome event. 

Jeff Blankenburg has started what seems like a great tradition: poker night!  After the appreciation dinner, we were all invited to room 506 for a poker night.  It was amazing that almost everyone went up to the suite just to hang out.  There were just as many people in there watching as playing.

Hopefully, Jeff will remember to bring something *ELSE* for those of us who just want to hang out and don't play poker [Xbox 360 / Wii ;)].  It was a different way of bringing all of the masses together into a room where we could talk.  Thanks, Jeff.

Sessions that I attended: 

  • Mobile with Nino: I thought the topic was very interesting; however, I think Nino was not all there.  Maybe that trip to the MVP Summit was too tiring for him ;)  His presentation was good, but I think he could have done a lot better.  I could not attend his 2nd talk on mobile development, but I heard it was very interesting as well...
  • XML Capabilities in SQL with Jason was great.  Learned a couple of things that I have added to my to-do list of things I want to try when I have some spare time.
  • Soft Skills with Brian.  What can I say, Brian delivered another thought-provoking talk, with good pointers in there.  I am lining up for the swag-monkey position for next time. ;)
  • Reliable Messaging with WCF with James was good.  WCF is one of those topics that is so vast, and James really nailed down the point he was trying to make.  He lived up to his *twitter promise*, and in fact it was a MUCH improved talk from last year.
  • Agile Practices and TFS with the comrade was good.  I like the way he showed an agile implementation using a tool like TFS.

I had a blast at this Day of .NET event.  The new location was awesome.  We all got to eat sitting at a table!! (unlike last year's... ;)

Following last year's tradition, it was Monish's turn to oversleep.  And HE did.  I have never seen a Prius doing more than 70mph (or, for that matter, Monish driving THAT fast... ;)  One thing that Alexei and I learned is that Monish does not know how to avoid things on the road.  On the way to finding a Bob Evans for an early breakfast, he hit a dead skunk!  Even though we smelled the dead skunk and saw the body over 100ft ahead!!

Looking forward to next year's, when it will be Alexei's turn to drive... ;)

Tuesday, April 15, 2008

VPC 2007 not running on Virtual Server 2005 R2

I have been trying to port a VPC 2007 virtual machine to run on our Virtual Server 2005 R2, with no success.  This is the error I am getting:

Virtual Machine
The "Virtual Hardware Standard" (Virtual PC 2007) in the configuration .vmc file for "XXX Server" was not created by Virtual Server. "XXX Server" can start, but some settings may be changed and some settings may not be used.

Other errors that I am getting:

Virtual Server
The virtual machine “XXX Server” could not be started. An unexpected error occurred.

Virtual Machine
"XXX Server" could not be started because a disk-related error occurred.

I am still not sure what the error is.  So I ran Inspect and also the Compact utility on the hard drive, hoping that this might *magically* fix the issue.
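For what it's worth, Inspect works off the 512-byte footer at the end of the .vhd file, so you can sanity-check a disk yourself: per the published VHD format spec, the footer must start with the ASCII cookie "conectix", and the disk type field (a big-endian 32-bit value at offset 60) says whether the disk is fixed (2), dynamic (3), or differencing (4). A minimal sketch (the function name is mine):

```python
import struct

DISK_TYPES = {2: "fixed", 3: "dynamic", 4: "differencing"}

def inspect_footer(footer: bytes) -> str:
    """Validate a 512-byte VHD footer and return its disk type."""
    if len(footer) != 512 or footer[0:8] != b"conectix":
        raise ValueError("footer cookie missing - VHD is likely corrupted")
    (disk_type,) = struct.unpack(">I", footer[60:64])
    return DISK_TYPES.get(disk_type, "unknown")

# Synthetic footer: cookie plus zero padding, with disk type 3 (dynamic).
footer = bytearray(512)
footer[0:8] = b"conectix"
footer[60:64] = struct.pack(">I", 3)
print(inspect_footer(bytes(footer)))  # prints "dynamic"
```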

Virtual Server 2005 R2 Pending Actions

I get the message that the compact succeeded:

Virtual Disk Operation
The virtual hard disk "E:\Virtual Machines\XXXServer\BaseWin2K3 Hard Disk.vhd" was compacted.

However, I still get the unexpected-error message.  At this point, I decided to merge the diff disk with its parent and keep a single file.  Clicking on the Merge virtual hard disk link yielded this:

Virtual Server 2005 R2 Merging Disk

After this is done, I get this message:

The parent virtual hard disk appears to have been modified without using the differencing virtual hard disk located at "E:\Virtual Machines\XXXServer\W2K3 Diff.vhd". Modifying the parent virtual hard disk may result in data corruption. It is strongly recommended that you lock the parent virtual hard disk to prevent this in the future. If you recently changed time zones on your computer, you can safely continue using this virtual hard disk.
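If I understand the VHD format spec correctly, that warning comes from the differencing disk's header, which stores the parent's modification time as seconds since January 1, 2000 (UTC); when the stored value no longer matches the parent file's timestamp, Virtual Server assumes the parent was modified behind the child's back, which is why the message calls out time-zone changes as a benign cause. A small sketch of how such a raw VHD timestamp decodes:

```python
from datetime import datetime, timedelta

# Per the VHD spec, timestamps count seconds from Jan 1, 2000 (UTC).
VHD_EPOCH = datetime(2000, 1, 1)

def vhd_timestamp_to_datetime(seconds: int) -> datetime:
    """Convert a raw VHD timestamp field into a datetime."""
    return VHD_EPOCH + timedelta(seconds=seconds)

print(vhd_timestamp_to_datetime(0))  # prints "2000-01-01 00:00:00"
```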

So I decided to create a new virtual machine.  First step: I deleted the VMC file.  Then I created a new virtual machine with no virtual hard disk defined.

Attach a virtual hard disk later (None)

After that was done, I clicked on the Configuration for that new virtual machine, then clicked on the Virtual Hard Disk properties, and added the virtual hard drive that was merged earlier:

Virtual Server 2005 R2 - add existing VHD

Started the new virtual machine, and PRESTO!! It's alive [muahh, muahh, muahh], and it has all of my latest changes in it.