Super Simple BCS WCF Service Implementation

This post describes how to create a very simple Business Connectivity Services (BCS) WCF implementation that can serve as a good starting point for an implementation tuned for use as a search content source with SharePoint 2013.

Download the SimpleService solution and unzip it.

Open Visual Studio 2012 as Administrator (elevation is needed so you can publish to IIS).

Open the solution; your Solution Explorer pane should show the SimpleService project.

Build the solution (F6) to make sure there aren’t any issues.  This project is just a vanilla WCF service so it should “just work.”
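
For context, a BCS model that can be crawled needs the service to expose at least a “finder” method (return all items) and a “specific finder” method (return one item by its identifier). Below is a minimal sketch of what such a contract can look like; the names are illustrative (chosen to line up with the Record.ItemNumber and Record.Title crawled properties used later) and may not match the downloaded SimpleService code.

// Sketch only: a BCS-friendly WCF contract. Names (IRecordService, Record,
// GetRecords, GetRecordById) are assumptions, not necessarily what the
// SimpleService download uses.
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Serialization;
using System.ServiceModel;

[DataContract]
public class Record
{
    [DataMember] public int ItemNumber { get; set; }
    [DataMember] public string Title { get; set; }
}

[ServiceContract]
public interface IRecordService
{
    // "Finder": enumerates every item so the crawler can discover them all.
    [OperationContract]
    List<Record> GetRecords();

    // "SpecificFinder": returns a single item by its identifier.
    [OperationContract]
    Record GetRecordById(int itemNumber);
}

public class RecordService : IRecordService
{
    // A tiny in-memory "data source" so the service has something to return.
    private static readonly List<Record> Data = Enumerable.Range(1, 10)
        .Select(i => new Record { ItemNumber = i, Title = "Record " + i })
        .ToList();

    public List<Record> GetRecords() { return Data; }

    public Record GetRecordById(int itemNumber)
    {
        return Data.FirstOrDefault(r => r.ItemNumber == itemNumber);
    }
}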

Open IIS Manager and create a site (a scripted alternative is sketched after these steps):

  1. Right-click on Sites and choose “Add Website…”
  2. Name your site (e.g. “Simple BCS”)
  3. Pick a path (any parent directory will do; create a new directory for this site)
  4. Choose a port (e.g. 12000)
  5. Click OK
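
If you’d rather script this step, the same site can be created with the Microsoft.Web.Administration API. This is just a sketch using the example values above (site name “Simple BCS”, port 12000, and a made-up physical path), and it needs to run elevated:

// Sketch: create the IIS site from code instead of through IIS Manager.
// Requires a reference to Microsoft.Web.Administration.dll; the physical
// path below is an example, point it at the directory you created.
using Microsoft.Web.Administration;

class CreateSimpleBcsSite
{
    static void Main()
    {
        using (var serverManager = new ServerManager())
        {
            // name, protocol, binding information (IP:port:host header), physical path
            serverManager.Sites.Add("Simple BCS", "http", "*:12000:", @"C:\inetpub\SimpleBCS");
            serverManager.CommitChanges();
        }
    }
}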

Back in Visual Studio, right-click the SimpleService project and choose “Publish…”.  In the “select or import a publish profile” drop-down, choose “<new>”.

Name it “Local IIS” (or anything that you want) and fill in the connection settings:

  • Server: localhost
  • Site name: Simple BCS (or whatever you chose above)
  • URL: http://localhost:12000 (adjust if you chose a different port number)

Click “Validate Connection” to make sure everything is OK.

Click “Publish”.  IE will open with your web service.  Click the SimpleService.svc link; the resulting URL is the one you’ll need when updating the bdcm in the next step.

Back in Visual Studio, right-click SimpleService.bdcm and choose “Open With…”, then pick “XML (Text) Editor”.

There are two locations that need to be updated with the URL from the previous step.  Search for “SimpleService.svc” to find them.  Note that the first one has “?wsdl” appended and the second doesn’t.
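
For reference, in a typical WCF-connector BDC model these two values are the metadata (WSDL) URL and the endpoint address, and the edit ends up looking something like the sketch below. The property names shown (WcfMexDocumentUrl and WcfEndpointAddress) are the usual ones; verify against what you actually find in SimpleService.bdcm, and substitute your own host and port.

<!-- Sketch, assuming the common WCF-connector property names; the first value
     (the WSDL/metadata URL) keeps ?wsdl, the second (the endpoint) does not. -->
<Property Name="WcfMexDocumentUrl" Type="System.String">http://localhost:12000/SimpleService.svc?wsdl</Property>
<Property Name="WcfEndpointAddress" Type="System.String">http://localhost:12000/SimpleService.svc</Property>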

Next you’ll want to update the ACL entries.  Replace all instances of “SEARCHCOE\SVC_SSP_CRAWL” with the account that you’ll be crawling with (by default, this is the “Default content access account” shown on the Search Service Application home page).
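
The entries being replaced look roughly like the sketch below; CONTOSO\svc_crawl stands in for your crawl account, and the exact rights listed in SimpleService.bdcm may differ. Only the Principal value needs to change.

<!-- Sketch of a BDC access control entry; replace the Principal with your crawl account -->
<AccessControlList>
  <AccessControlEntry Principal="CONTOSO\svc_crawl">
    <Right BdcRight="Execute" />
    <Right BdcRight="SelectableInClients" />
  </AccessControlEntry>
</AccessControlList>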

Now you can go to the BCS service application and import your model.  From the BCS home page, choose Import.

For the BDC Model File, browse to SimpleService.bdcm in your project.  Check the “Permissions” box before clicking “Import”.  Ignore the “Limit Filter” warning; it doesn’t apply here.

Before you crawl, you’ll want to register the crawled properties that this content source will emit, as well as some managed properties and the mappings between them.  Assuming you use the “maintain” script, here’s a configuration file that will work:

 <Configuration>
  <configurationSection>
    <ManagedProperties>
      <!-- Valid types: 1=Text, 2=Integer, 3=Boolean, 4=Float, 5=Decimal, 6=Datetime-->
      <!-- This is an example showing what each configurable property means
        <mproperty name="referencemanagedproperty" type="1">
          <property name="RefinementEnabled">1</property>
          <property name="Queryable">1</property>
          <property name="SortableType">1</property>
          <property name="MergeCrawledProperties">1</property>
            Include values from all mapped crawled properties. All multi-valued fields
            (for example, all taxonomy fields) must have this value set to 1.
          <property name="MergeCrawledProperties">0</property>
            Include values from a single crawled property, based on the order specified.
        </mproperty>
      -->
      <mproperty type="2" name="RecordItemNumber">
        <property name="Description">RecordItemNumber</property>
        <property name="FullTextQueriable">False</property>
        <property name="OverrideValueOfHasMultipleValues">True</property>
        <property name="Queryable">True</property>
        <property name="Searchable">False</property>
      </mproperty>
    </ManagedProperties>
    <CrawledProperties>
      <CrawledProperty propertyName="Record.ItemNumber" propertySet="2edeba9a-0fa8-4020-8a8b-30c3cdf34ccd" />
      <CrawledProperty propertyName="Record.Title" propertySet="2edeba9a-0fa8-4020-8a8b-30c3cdf34ccd" />
    </CrawledProperties>
    <CrawledPropertyCategories>
      <category name="Business Data" propset="2edeba9a-0fa8-4020-8a8b-30c3cdf34ccd" MapToContents="1" DiscoverNewProperties="1" />
    </CrawledPropertyCategories>
    <Mappings>
      <mapping ManagedProperty="Title" CrawledProperty="Record.Title" />
      <mapping ManagedProperty="RecordItemNumber" CrawledProperty="Record.ItemNumber" />
    </Mappings>
  </configurationSection>
</Configuration>

Now go back to the Search Service Application and click Content Sources in the left navigation bar.

Create a New Content Source.  Name it whatever you want and choose “Line of Business Data” for Content Source Type.  You should then see SimpleServiceInst appear as an external data source.  Select it and click OK.

Start your crawl!

If everything went well, you should be able to search for your content now.

Some things that haven’t been tested/included:

  • Returning large file streams
  • Multiple/related entities
  • Using it with SP2010