By Larry Mosley, Senior Escalation Engineer
I was recently troubleshooting a Data Warehouse Load job failure whose root cause was particularly difficult to find, and I want to share what I learned to help others reach a resolution more quickly.
I was seeing an event 33503 error in the Operations Manager event log on the Data Warehouse server:
An error countered while attempting to execute ETL Module:
ETL process type: Load
Batch ID: <batchID>
Module name: LoadOMDWDataMartServiceRequestDim
Message: The given ColumnMapping does not match up with any column in the source or destination.
Stack:
at System.Data.SqlClient.SqlBulkCopy.AnalyzeTargetAndCreateUpdateBulkCommand(BulkCopySimpleResultSet internalResults)
at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternal()
at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServer(Int32 columnCount)
at System.Data.SqlClient.SqlBulkCopy.WriteToServer(IDataReader reader)
at Microsoft.SystemCenter.Warehouse.Utility.SqlBulkOperation.Upsert(String sourceConnStrg, String query, String destConnStrg, String destinationTable, Dictionary`2 mapping, Collection`1 pkCol, DomainUser sourceSecureUser, DomainUser destSecureUser, SqlResourceStore targetStore)
at Microsoft.SystemCenter.Warehouse.Etl.ADOInterface.Upsert(DomainUser sourceConnectionUser, DomainUser destinationConnectionUser)
at Microsoft.SystemCenter.Warehouse.Etl.ADOInterface.Execute(IXPathNavigable config, Watermark wm, DomainUser sourceConnectionUser, DomainUser destinationConnectionUser)
at Microsoft.SystemCenter.Warehouse.Etl.LoadModule.Execute(IXPathNavigable config, Watermark wm, DomainUser sourceConnectionUser, DomainUser destinationConnectionUser, Int32 loadBatchSize)
at Microsoft.SystemCenter.Warehouse.Etl.LoadModule.Execute(IXPathNavigable config, Watermark wm, DomainUser sourceConnectionUser, DomainUser destinationConnectionUser)
at Microsoft.SystemCenter.Etl.ETLModule.OnDataItem(DataItemBase dataItem, DataItemAcknowledgementCallback acknowledgedCallback, Object acknowledgedState, DataItemProcessingCompleteCallback completionCallback, Object completionState)
These errors can also occur for the LoadDWDataMartxxx and LoadCMDWDataMartxxx modules.
The problem is that the mappings we supply to SqlBulkCopy in the Module configuration do not match the case of the columns in the source and destination tables. This happens regardless of whether the databases are case-sensitive, because when the SqlBulkCopy class matches the mappings against the database tables, it does a case-sensitive comparison.
So how do we tell which columns are ‘bad’?
First we need to know which module is failing. Fortunately, the module name is in the ‘Module name:’ field of the event entry. In this case it is LoadOMDWDataMartServiceRequestDim. This load module loads data from DWRepository.dbo.ServiceRequestDim to OMDWDataMart.dbo.ServiceRequestDim.
Next, connect to the database server that hosts the DWStagingAndConfig database. Change to DWStagingAndConfig, and execute this query:
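The original query isn’t reproduced here, but a sketch could look like the following. The etl.Module table name and Configuration column are assumptions about the DWStagingAndConfig schema and may differ in your version; the ModuleConfigDecoded alias matches the output column described next.

```sql
-- Sketch only: table and column names are assumptions about the
-- DWStagingAndConfig schema; substitute the module name from your event.
SELECT ModuleName,
       CONVERT(xml, Configuration) AS ModuleConfigDecoded
FROM   etl.Module
WHERE  ModuleName = 'LoadOMDWDataMartServiceRequestDim';
```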
This will output a row with a column named ‘ModuleConfigDecoded’, which is XML. Click it to open it. While it’s too big to paste here, I encourage you to open it and see what it looks like.
The XML indicates the destination table and the query used to pull the data. Following the SourceQuery element is a Mappings element, which is what we are interested in and where the problem lies.
Now comes the boring part – you’ll need to compare the value of each SourceColumnName attribute to the corresponding column in the source table in DWRepository, looking for differences in case. In this case we are looking at DWRepository.dbo.ServiceRequestDim.
Do the same for each TargetColumnName attribute, comparing against OMDWDataMart.dbo.ServiceRequestDim.
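To make the comparison concrete, here is a trimmed illustration of what a mapping inside the Mappings element might look like. The element name and surrounding structure are assumptions; only the SourceColumnName/TargetColumnName attributes come from the description above.

```xml
<!-- Illustrative sketch only; the real Mappings element is much larger.
     The mapping says 'UserDomain', but the table column was created as
     'Userdomain', so SqlBulkCopy's case-sensitive match fails. -->
<Mappings>
  <ColumnMapping SourceColumnName="UserDomain" TargetColumnName="UserDomain" />
</Mappings>
```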
How did this happen?
This occurs when you change the case of a property ID in a class extension after the original MP has already been imported and synced to the data warehouse.
In this case, we extended the System.WorkItem.ServiceRequest class with some new properties, then changed the case of one of those property IDs in a later MP version.
So, when I first imported and sync’ed the MP, I defined this property:
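The original snippet isn’t reproduced here; based on the rename described below, it would have looked something like this illustrative MP fragment (attributes other than ID are illustrative):

```xml
<!-- Illustrative fragment: the property as first defined (lowercase 'd') -->
<Property ID="Userdomain" Type="string" />
```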
Later I realized I didn’t like the case and changed it to:
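Again an illustrative fragment, showing only the case change to the ID:

```xml
<!-- Illustrative fragment: the same property after the case change -->
<Property ID="UserDomain" Type="string" />
```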
When you import a class that extends an existing class and sync it to the data warehouse, the new property fields are appended to the existing table for that class in the DWRepository and DWDataMart databases. If the fields already exist, they are left as-is so we don’t risk breaking anything that may be using them. Note that we also don’t remove these columns when you remove the MP containing the extension class; doing so would cause data loss and break reports, cubes, etc.
Also, when you import an extension class, the Module configuration defining the mappings is recreated.
In this case, since we extended System.WorkItem.ServiceRequest, the field ‘Userdomain’ was added to DWRepository.dbo.ServiceRequestDim, DWDataMart.dbo.ServiceRequestDim, CMDWDataMart.dbo.ServiceRequestDim, and OMDWDataMart.dbo.ServiceRequestDim.
When I imported the updated MP with the property renamed to ‘UserDomain’, the MP deployment process could see that the field was already present in the tables, so it left it alone. The Module configuration was recreated, but this time with the mapping of ‘UserDomain’.
At this point, when the load job runs, it will fail with the 33503 event.
How do you fix it?
It’s easy to fix – identify the extension class, and edit the MP so that the property IDs defined in it match the case of the columns in the tables.
Identifying the class is probably the hardest part. Hopefully you’ll know which class extended it, but if you don’t, these queries can help you identify the class and MP which defines it.
Connect to DWStagingAndConfig.
This query will help identify the class:
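The original query isn’t shown here; a minimal sketch that produces the Table_Name column described below could look like this, searching the system catalog for columns matching the renamed property (‘UserDomain’ is the example from above; substitute your own property ID):

```sql
-- Sketch only: find tables containing a column that matches the property ID.
SELECT t.name AS Table_Name,
       c.name AS Column_Name
FROM   sys.columns AS c
JOIN   sys.tables  AS t ON t.object_id = c.object_id
WHERE  c.name LIKE '%UserDomain%'
ORDER  BY t.name;
```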
You may get a number of hits, so you’ll have to look at the Table_Name field and see what makes sense. Hint: you can usually ignore any tables matching MT_System$%.
Once you know the table name, this query will get you the ManagementPack and the class, and the MP xml:
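The original query isn’t reproduced here; a sketch under assumptions follows. The ManagedType and ManagementPack table and column names are assumptions about the schema and may differ in your version; the TypeName and MPXML output columns match what is described next.

```sql
-- Sketch only: map the table back to the class and the MP that defines it.
-- Table/column names are assumptions; adjust to your schema version.
SELECT mt.TypeName,
       mp.ManagementPackName,
       CONVERT(xml, mp.MPXML) AS MPXML
FROM   dbo.ManagedType    AS mt
JOIN   dbo.ManagementPack AS mp ON mp.ManagementPackId = mt.ManagementPackId
WHERE  mt.TableName = 'MT_YourExtensionTable';  -- table name found above
```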
You can click the MPXML value to see the actual management pack XML and search it for the ‘TypeName’, which defines the ‘bad’ fields.
Once you know the MP, you can edit it to correct the field names, seal it, import and sync it.