Business Connectivity Services (BCS) is a set of services and features that
connect SharePoint-based solutions to sources of external data. It is
included in SharePoint Foundation 2010, SharePoint Server 2010, and
Office 2010 applications.
The Search Service Application (SSA) allows content to be crawled and indexed, and then lets users retrieve results through search queries.
So what about combining these two nice features: the power of BCS to get data from external data sources, and the power of the SSA to crawl that data, index it, and query it in no time?
Searching your content using the SSA can be configured from Central Administration easily, as described in the MSDN articles Part 1 & Part 2, and this nice article.
In this article I assume that you have gone through all of that hassle before. But what about creating this whole structure in just one click, on feature activation for example?
What about a site-scoped feature that creates a Business Data content source, then creates a search scope associated with that content source, performs a full crawl to get the crawled external fields, and finally maps those fields to managed properties so they are easy to use in advanced search queries?
OK, let's start.
I assume that you have already created an external content type with a LOB system named "LOBSystemName" and a LOB system instance named "LOBSystemInstanceName".
To create the Business Data content source, you need to do the following:
using (SPSite site = new SPSite(SiteURL))
{
SearchContext context = SearchContext.GetContext(site);
Content BSCContent = new Content(context);
ContentSourceCollection BSCContentSourceCollection = BSCContent.ContentSources;
string NewContentSource = "New Content Source Title";
if (BSCContentSourceCollection.Exists(NewContentSource))
{
Console.WriteLine("Content source already exists");
return false;
}
else
{
try
{
BusinessDataContentSource BSCContentSource = (BusinessDataContentSource)BSCContentSourceCollection.Create(typeof(BusinessDataContentSource), NewContentSource);
BSCContentSource.StartAddresses.Add(BusinessDataContentSource.ConstructStartAddress("Default", new Guid("00000000-0000-0000-0000-000000000000"), "LOBSystemName", "LOBSystemInstanceName"));
BSCContentSource.StartFullCrawl();
return true;
}
catch (Exception ex)
{
Console.WriteLine("Failed to create content source");
Console.WriteLine(ex.Message);
throw new Exception("Failed to create content source \n" + ex.Message);
}
}
}
Now that the content source is created, you need to create a search scope associated with the newly created content source. The following method does this task:
/// <summary>
/// Create new Content Source type Search Scope
/// </summary>
/// <param name="site">The newly created site</param>
/// <param name="context">The Search context to create the Search Scope within</param>
/// <param name="ContentSourceName">The Search content source name to be associated to the new Search Scope</param>
static private void CreateBCSSearchScope(SPSite site,SearchContext context, string ContentSourceName )
{
string scopeName = ContentSourceName ;
string displayGroupName = "GTS Scopes";
// remotescopes class retrieves information via search web service so we run this as the search service account
RemoteScopes remoteScopes = new RemoteScopes(SPServiceContext.GetContext(site));
// see if there is an existing scope
Scope scope = (from s
in remoteScopes.GetScopesForSite(new Uri(site.Url)).Cast<Scope>()
where s.Name == scopeName
select s).FirstOrDefault();
// only add if the scope doesn't exist already
if (scope == null)
{
Schema sspSchema = new Schema(context);
ManagedPropertyCollection properties = sspSchema.AllManagedProperties;
scope = remoteScopes.AllScopes.Create(scopeName, "Search Scope for " + scopeName, null, true, "results.aspx", ScopeCompilationType.AlwaysCompile);
scope.Rules.CreatePropertyQueryRule(ScopeRuleFilterBehavior.Include, properties["ContentSource"], ContentSourceName);
}
// see if there is an existing display group
ScopeDisplayGroup displayGroup = (from d
in remoteScopes.GetDisplayGroupsForSite(new Uri(site.Url)).Cast<ScopeDisplayGroup>()
where d.Name == displayGroupName
select d).FirstOrDefault();
// add if the display group doesn't exist
if (displayGroup == null)
displayGroup = remoteScopes.AllDisplayGroups.Create(displayGroupName, "", new Uri(site.Url), true);
// add scope to display group if not already added
if (!displayGroup.Contains(scope))
{
displayGroup.Add(scope);
displayGroup.Default = scope;
displayGroup.Update();
}
// optionally force a scope compilation so this is available immediately
remoteScopes.StartCompilation();
}
OK, recap: content source created, done; search scope created and associated, done. Now we need to perform a full crawl to obtain the crawled properties (the database table fields in our case) so that we can map them to managed metadata properties.
To do this step, we start the crawl and loop until it finishes, then call the method that creates the managed properties. The following code snippet is the key:
BSCContentSource.StartFullCrawl();
Console.WriteLine("Crawling will start in 10 seconds");
Thread.Sleep(10 * 1000);
Console.WriteLine("Crawling started");
do
{
Thread.Sleep(10 * 1000);
Console.WriteLine("Waiting for the content source to finish crawling...");
} while (BSCContentSource.CrawlStatus != CrawlStatus.Idle);
Console.WriteLine("Crawling completed successfully!");
PMFLogger.Instance.LogInformation("Crawling completed successfully!");
//Start creating/mapping the new Metadata Properties
CreateBCSMetadataProperties(context, ModelName);
Now the data is crawled and indexed. OK, let's create the managed properties and map the crawled fields to those properties. In my case the external data source (a SQL database) had the following columns:
ID, Title, StartDate, EndDate, Entity, Type, OrganizationUnit, Owner, Alias and StrategyPlan.
/// <summary>
/// Creates new Business Search Metadata Properties to be used in the search
/// </summary>
/// <param name="context">The Search context to create the Business Search Metadata Properties within</param>
/// <param name="ModelName">The newly created BCS model name to acquire and map columns from</param>
static private void CreateBCSMetadataProperties(SearchContext context,string ModelName)
{
Schema sspSchema = new Schema(context);
ManagedPropertyCollection properties = sspSchema.AllManagedProperties;
//Create the managed properties if they do not exist; otherwise get the existing ones for the mappings
ManagedProperty ectID;
ManagedProperty ectTitle;
ManagedProperty ectStartDate;
ManagedProperty ectEndDate;
ManagedProperty ectEntity;
ManagedProperty ectType;
ManagedProperty ectOrganizationUnit;
ManagedProperty ectOwner;
ManagedProperty ectAlias;
ManagedProperty ectStrategyPlan;
if (!properties.Contains("ectID"))
{
ectID = properties.Create("ectID", ManagedDataType.Text);
ectID.EnabledForScoping = true;
}
else
ectID = properties["ectID"];
if (!properties.Contains("ectTitle"))
{
ectTitle = properties.Create("ectTitle", ManagedDataType.Text);
ectTitle.EnabledForScoping = true;
}
else
ectTitle = properties["ectTitle"];
if (!properties.Contains("ectStartDate"))
{
ectStartDate = properties.Create("ectStartDate", ManagedDataType.DateTime);
ectStartDate.EnabledForScoping = true;
ectStartDate.HasMultipleValues = false;
}
else
ectStartDate = properties["ectStartDate"];
if (!properties.Contains("ectEndDate"))
{
ectEndDate = properties.Create("ectEndDate", ManagedDataType.DateTime);
ectEndDate.EnabledForScoping = true;
ectEndDate.HasMultipleValues = false;
}
else
ectEndDate = properties["ectEndDate"];
if (!properties.Contains("ectEntity"))
{
ectEntity = properties.Create("ectEntity", ManagedDataType.Text);
ectEntity.EnabledForScoping = true;
ectEntity.HasMultipleValues = false;
}
else
ectEntity = properties["ectEntity"];
if (!properties.Contains("ectType"))
{
ectType = properties.Create("ectType", ManagedDataType.Text);
ectType.EnabledForScoping = true;
ectType.HasMultipleValues = false;
}
else
ectType = properties["ectType"];
if (!properties.Contains("ectOrganizationUnit"))
{
ectOrganizationUnit = properties.Create("ectOrganizationUnit", ManagedDataType.Text);
ectOrganizationUnit.EnabledForScoping = true;
ectOrganizationUnit.HasMultipleValues = false;
}
else
ectOrganizationUnit = properties["ectOrganizationUnit"];
if (!properties.Contains("ectOwner"))
{
ectOwner = properties.Create("ectOwner", ManagedDataType.Text);
ectOwner.EnabledForScoping = true;
ectOwner.HasMultipleValues = false;
}
else
ectOwner = properties["ectOwner"];
if (!properties.Contains("ectAlias"))
{
ectAlias = properties.Create("ectAlias", ManagedDataType.Text);
ectAlias.EnabledForScoping = true;
ectAlias.HasMultipleValues = false;
ectAlias.MaxCharactersInPropertyStoreIndex = 450;
}
else
ectAlias = properties["ectAlias"];
if (!properties.Contains("ectStrategyPlan"))
{
ectStrategyPlan = properties.Create("ectStrategyPlan", ManagedDataType.Text);
ectStrategyPlan.EnabledForScoping = true;
ectStrategyPlan.HasMultipleValues = false;
}
else
ectStrategyPlan = properties["ectStrategyPlan"];
//Map the Query Crawled Properties to the Managed Property
MaptoManagedProperty(context, ectID, ModelName + " Items.ID", ManagedDataType.Text);
MaptoManagedProperty(context, ectTitle, ModelName + " Items.Name", ManagedDataType.Text);
MaptoManagedProperty(context, ectStartDate, ModelName + " Items.StartDate", ManagedDataType.DateTime);
MaptoManagedProperty(context, ectEndDate, ModelName + " Items.EndDate", ManagedDataType.DateTime);
MaptoManagedProperty(context, ectEntity, ModelName + " Items.Entity", ManagedDataType.Text);
MaptoManagedProperty(context, ectType, ModelName + " Items.Type", ManagedDataType.Text);
MaptoManagedProperty(context, ectOrganizationUnit, ModelName + " Items.OrganizationUnit", ManagedDataType.Text);
MaptoManagedProperty(context, ectOwner, ModelName + " Items.Owner", ManagedDataType.Text);
MaptoManagedProperty(context, ectAlias, ModelName + " Items.Alias", ManagedDataType.Text);
MaptoManagedProperty(context, ectStrategyPlan, ModelName + " Items.StrategyPlan", ManagedDataType.Text);
}
/// <summary>
/// Maps the external content type columns to a specific Managed Property
/// </summary>
/// <param name="context">The Search context to map the external content type columns within</param>
/// <param name="managedProperty">The managed property to map the column to</param>
/// <param name="crawledPropertyName">The crawled property "Column" Name</param>
/// <param name="DataType"></param>
private static void MaptoManagedProperty(SearchContext context, ManagedProperty managedProperty, string crawledPropertyName, ManagedDataType DataType)
{
SPSecurity.RunWithElevatedPrivileges(() =>
{
Schema schema = new Schema(context);
try
{
Category category = schema.AllCategories["Business Data"];
var crawledProps = category.QueryCrawledProperties(crawledPropertyName, 1, Guid.NewGuid(), String.Empty, true).Cast<CrawledProperty>();
var crawledProp = crawledProps.FirstOrDefault();
if (crawledProp != null)
{
var mappings = managedProperty.GetMappings();
mappings.Add(new Mapping(crawledProp.Propset, crawledProp.Name, crawledProp.VariantType, managedProperty.PID));
managedProperty.SetMappings(mappings);
managedProperty.Update();
}
else
{
Console.WriteLine("Query crawled property " + crawledPropertyName + " was not found - mapping failed.");
}
}
catch (Exception ex)
{
throw new Exception("Failed to map field to crawled property \n" + ex.Message);
}
});
}
OK, you will find some objects here that need to be described:
The "ModelName" object is the external content type model name.
Why is "EnabledForScoping" set to true? Simply to allow the new managed properties to be exposed to search scopes.
Why is "HasMultipleValues" set to false on some properties? If your managed property is a DateTime or an int, the data returned in the search results will be System.DateTime[] instead of the actual crawled value; setting this property to false solves the issue. :)
Why is "MaxCharactersInPropertyStoreIndex" set to 450? This reduces storage requirements for text properties by using a hash for comparison, and it also makes it possible to order search results by this metadata property.
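Once the managed properties are compiled into the index, they can be used from the query object model. Here is a minimal sketch using FullTextSqlQuery; the scope name "MyContentSource" and the filter value are hypothetical placeholders:

```csharp
using System;
using Microsoft.SharePoint;
using Microsoft.Office.Server.Search.Query;

// Hypothetical example: querying the new managed properties.
// "MyContentSource" (the scope name) and 'Finance' are placeholders.
using (SPSite site = new SPSite(SiteURL))
{
    FullTextSqlQuery query = new FullTextSqlQuery(site);
    query.QueryText =
        "SELECT Title, ectStartDate, ectEndDate, ectOwner " +
        "FROM SCOPE() " +
        "WHERE \"scope\" = 'MyContentSource' AND ectEntity = 'Finance'";
    query.ResultTypes = ResultType.RelevantResults;
    query.RowLimit = 50;

    ResultTableCollection results = query.Execute();
    ResultTable relevantResults = results[ResultType.RelevantResults];
    while (relevantResults.Read())
    {
        // Because HasMultipleValues is false, ectStartDate comes back as a
        // single value rather than a System.DateTime[] array.
        Console.WriteLine("{0} starts on {1}",
            relevantResults["Title"], relevantResults["ectStartDate"]);
    }
}
```

Note that the managed properties only return values after the second full crawl has completed, since the mappings are created after the first one.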
The whole code in one block. Have a nice day! :)
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Client;
using Microsoft.SharePoint.Administration;
using Microsoft.Office.Server;
using Microsoft.Office.Server.Search.Administration;
using System.Threading;
namespace BCSModelGeneration
{
public class ContentSourceGenerator
{
/// <summary>
/// Create a Business Connectivity Services LOB SharePoint content source
/// </summary>
/// <param name="InitialCatalog">The RDB connection Initial Catalog; also used as the content source name and the LOB system/instance name</param>
/// <param name="SiteURL">The new created site - Entity</param>
/// <param name="ModelName">The new created Model Name </param>
/// <returns>The status of the LOB SharePoint Content Source creation</returns>
internal static bool CreateBCSContentSource(string InitialCatalog, string SiteURL, string ModelName)
{
using (SPSite site = new SPSite(SiteURL))
{
SearchContext context = SearchContext.GetContext(site);
Content BSCContent = new Content(context);
ContentSourceCollection BSCContentSourceCollection = BSCContent.ContentSources;
string NewContentSource = InitialCatalog;
if (BSCContentSourceCollection.Exists(NewContentSource))
{
Console.WriteLine("Content source already exists");
return false;
}
else
{
try
{
BusinessDataContentSource BSCContentSource = (BusinessDataContentSource)BSCContentSourceCollection.Create(typeof(BusinessDataContentSource), NewContentSource);
BSCContentSource.StartAddresses.Add(BusinessDataContentSource.ConstructStartAddress("Default", new Guid("00000000-0000-0000-0000-000000000000"), InitialCatalog, InitialCatalog));
WeeklySchedule Weekly = CreatWeeklySchedule(context, 2);
BSCContentSource.FullCrawlSchedule = Weekly;
DailySchedule Daily = CreateDailySchedule(context, 2);
BSCContentSource.IncrementalCrawlSchedule = Daily;
BSCContentSource.Update();
BSCContentSource.StartFullCrawl();
Console.WriteLine("Crawling will start in 10 seconds");
Thread.Sleep(10 * 1000);
Console.WriteLine("Crawling started");
//Start creating the new Search Scope
CreateBCSSearchScope(site, context, BSCContentSource.Name);
do
{
Thread.Sleep(10 * 1000);
Console.WriteLine("Waiting for the content source to finish crawling...");
} while (BSCContentSource.CrawlStatus != CrawlStatus.Idle);
Console.WriteLine("Crawling completed successfully!");
//Start creating/mapping the new Metadata Properties
CreateBCSMetadataProperties(context, ModelName);
Console.WriteLine("Content source created");
Console.WriteLine("Starting a new crawl of the content source to populate the newly mapped metadata properties");
BSCContentSource.StartFullCrawl();
return true;
}
catch (Exception ex)
{
Console.WriteLine("Failed to create content source");
Console.WriteLine(ex.Message);
throw new Exception("Failed to create content source \n" + ex.Message);
}
}
}
}
/// <summary>
/// Creates a weekly schedule for the search content source
/// </summary>
/// <param name="context">The Search context to create the schedule within</param>
/// <param name="WeeksInterval">Indicates that the content should be crawled every "WeeksInterval" number of weeks</param>
/// <returns></returns>
private static WeeklySchedule CreatWeeklySchedule(SearchContext context,int WeeksInterval)
{
WeeklySchedule Weekly = new WeeklySchedule(context);
Weekly.BeginDay = DateTime.Now.Day;
Weekly.BeginMonth = DateTime.Now.Month;
Weekly.BeginYear = DateTime.Now.Year;
//Starts at 1:00 AM
Weekly.StartHour = 1;
Weekly.StartMinute = 00;
//Indicates that the content should be crawled every WeeksInterval weeks.
Weekly.WeeksInterval = WeeksInterval;
return Weekly;
}
/// <summary>
/// Creates a daily schedule for the search content source
/// </summary>
/// <param name="context">The Search context to create the schedule within</param>
/// <param name="DaysInterval">Indicates that the content should be crawled every "DaysInterval" number of days.</param>
/// <returns></returns>
private static DailySchedule CreateDailySchedule(SearchContext context,int DaysInterval)
{
DailySchedule Daily = new DailySchedule(context);
Daily.BeginDay = DateTime.Now.Day;
Daily.BeginMonth = DateTime.Now.Month;
Daily.BeginYear = DateTime.Now.Year;
//Starts at 1:00 AM
Daily.StartHour = 1;
Daily.StartMinute = 00;
//Indicates that the content should be crawled every DaysInterval days.
Daily.DaysInterval = DaysInterval;
//To make the daily schedule repeat every hour, uncomment the following:
//Daily.RepeatInterval = 60;
//Daily.RepeatDuration = 1440;
return Daily;
}
/// <summary>
/// Create new Content Source type Search Scope
/// </summary>
/// <param name="site">The newly created site (entity)</param>
/// <param name="context">The Search context to create the Search Scope within</param>
/// <param name="ContentSourceName">The Search content source name to be associated to the new Search Scope</param>
static private void CreateBCSSearchScope(SPSite site,SearchContext context, string ContentSourceName )
{
string scopeName = ContentSourceName ;
string displayGroupName = "GTS Scopes";
// remotescopes class retrieves information via search web service so we run this as the search service account
RemoteScopes remoteScopes = new RemoteScopes(SPServiceContext.GetContext(site));
// see if there is an existing scope
Scope scope = (from s
in remoteScopes.GetScopesForSite(new Uri(site.Url)).Cast<Scope>()
where s.Name == scopeName
select s).FirstOrDefault();
// only add if the scope doesn't exist already
if (scope == null)
{
Schema sspSchema = new Schema(context);
ManagedPropertyCollection properties = sspSchema.AllManagedProperties;
scope = remoteScopes.AllScopes.Create(scopeName, "Search Scope for " + scopeName, null, true, "results.aspx", ScopeCompilationType.AlwaysCompile);
scope.Rules.CreatePropertyQueryRule(ScopeRuleFilterBehavior.Include, properties["ContentSource"], ContentSourceName);
}
// see if there is an existing display group
ScopeDisplayGroup displayGroup = (from d
in remoteScopes.GetDisplayGroupsForSite(new Uri(site.Url)).Cast<ScopeDisplayGroup>()
where d.Name == displayGroupName
select d).FirstOrDefault();
// add if the display group doesn't exist
if (displayGroup == null)
displayGroup = remoteScopes.AllDisplayGroups.Create(displayGroupName, "", new Uri(site.Url), true);
// add scope to display group if not already added
if (!displayGroup.Contains(scope))
{
displayGroup.Add(scope);
displayGroup.Default = scope;
displayGroup.Update();
}
// optionally force a scope compilation so this is available immediately
remoteScopes.StartCompilation();
}
/// <summary>
/// Creates new Business Search Metadata Properties to be used in the search
/// </summary>
/// <param name="context">The Search context to create the Business Search Metadata Properties within</param>
/// <param name="ModelName">The newly created BCS model name to acquire and map columns from</param>
static private void CreateBCSMetadataProperties(SearchContext context,string ModelName)
{
Schema sspSchema = new Schema(context);
ManagedPropertyCollection properties = sspSchema.AllManagedProperties;
//TODO: ADD THE NEW FIELDS ADDED TO THE VIEW AFTER FINALIZATION
//Create the managed properties if they do not exist; otherwise get the existing ones for the mappings
ManagedProperty ectID;
ManagedProperty ectTitle;
ManagedProperty ectStartDate;
ManagedProperty ectEndDate;
ManagedProperty ectEntity;
ManagedProperty ectType;
ManagedProperty ectOrganizationUnit;
ManagedProperty ectOwner;
ManagedProperty ectAlias;
ManagedProperty ectStrategyPlan;
if (!properties.Contains("ectID"))
{
ectID = properties.Create("ectID", ManagedDataType.Text);
ectID.EnabledForScoping = true;
}
else
ectID = properties["ectID"];
if (!properties.Contains("ectTitle"))
{
ectTitle = properties.Create("ectTitle", ManagedDataType.Text);
ectTitle.EnabledForScoping = true;
}
else
ectTitle = properties["ectTitle"];
if (!properties.Contains("ectStartDate"))
{
ectStartDate = properties.Create("ectStartDate", ManagedDataType.DateTime);
ectStartDate.EnabledForScoping = true;
ectStartDate.HasMultipleValues = false;
}
else
ectStartDate = properties["ectStartDate"];
if (!properties.Contains("ectEndDate"))
{
ectEndDate = properties.Create("ectEndDate", ManagedDataType.DateTime);
ectEndDate.EnabledForScoping = true;
ectEndDate.HasMultipleValues = false;
}
else
ectEndDate = properties["ectEndDate"];
if (!properties.Contains("ectEntity"))
{
ectEntity = properties.Create("ectEntity", ManagedDataType.Text);
ectEntity.EnabledForScoping = true;
ectEntity.HasMultipleValues = false;
}
else
ectEntity = properties["ectEntity"];
if (!properties.Contains("ectType"))
{
ectType = properties.Create("ectType", ManagedDataType.Text);
ectType.EnabledForScoping = true;
ectType.HasMultipleValues = false;
}
else
ectType = properties["ectType"];
if (!properties.Contains("ectOrganizationUnit"))
{
ectOrganizationUnit = properties.Create("ectOrganizationUnit", ManagedDataType.Text);
ectOrganizationUnit.EnabledForScoping = true;
ectOrganizationUnit.HasMultipleValues = false;
}
else
ectOrganizationUnit = properties["ectOrganizationUnit"];
if (!properties.Contains("ectOwner"))
{
ectOwner = properties.Create("ectOwner", ManagedDataType.Text);
ectOwner.EnabledForScoping = true;
ectOwner.HasMultipleValues = false;
}
else
ectOwner = properties["ectOwner"];
if (!properties.Contains("ectAlias"))
{
ectAlias = properties.Create("ectAlias", ManagedDataType.Text);
ectAlias.EnabledForScoping = true;
ectAlias.HasMultipleValues = false;
ectAlias.MaxCharactersInPropertyStoreIndex = 450;
}
else
ectAlias = properties["ectAlias"];
if (!properties.Contains("ectStrategyPlan"))
{
ectStrategyPlan = properties.Create("ectStrategyPlan", ManagedDataType.Text);
ectStrategyPlan.EnabledForScoping = true;
ectStrategyPlan.HasMultipleValues = false;
}
else
ectStrategyPlan = properties["ectStrategyPlan"];
//TODO: ADD THE NEW FIELDS ADDED TO THE VIEW AFTER FINALIZATION
//Map the Query Crawled Properties to the Managed Property
MaptoManagedProperty(context, ectID, ModelName + " Items.ID", ManagedDataType.Text);
MaptoManagedProperty(context, ectTitle, ModelName + " Items.Name", ManagedDataType.Text);
MaptoManagedProperty(context, ectStartDate, ModelName + " Items.StartDate", ManagedDataType.DateTime);
MaptoManagedProperty(context, ectEndDate, ModelName + " Items.EndDate", ManagedDataType.DateTime);
MaptoManagedProperty(context, ectEntity, ModelName + " Items.Entity", ManagedDataType.Text);
MaptoManagedProperty(context, ectType, ModelName + " Items.Type", ManagedDataType.Text);
MaptoManagedProperty(context, ectOrganizationUnit, ModelName + " Items.OrganizationUnit", ManagedDataType.Text);
MaptoManagedProperty(context, ectOwner, ModelName + " Items.Owner", ManagedDataType.Text);
MaptoManagedProperty(context, ectAlias, ModelName + " Items.Alias", ManagedDataType.Text);
MaptoManagedProperty(context, ectStrategyPlan, ModelName + " Items.StrategyPlan", ManagedDataType.Text);
}
/// <summary>
/// Maps the external content type columns to a specific Managed Property
/// </summary>
/// <param name="context">The Search context to map the external content type columns within</param>
/// <param name="managedProperty">The managed property to map the column to</param>
/// <param name="crawledPropertyName">The crawled property "Column" Name</param>
/// <param name="DataType"></param>
private static void MaptoManagedProperty(SearchContext context, ManagedProperty managedProperty, string crawledPropertyName, ManagedDataType DataType)
{
SPSecurity.RunWithElevatedPrivileges(() =>
{
Schema schema = new Schema(context);
try
{
Category category = schema.AllCategories["Business Data"];
var crawledProps = category.QueryCrawledProperties(crawledPropertyName, 1, Guid.NewGuid(), String.Empty, true).Cast<CrawledProperty>();
var crawledProp = crawledProps.FirstOrDefault();
if (crawledProp != null)
{
var mappings = managedProperty.GetMappings();
mappings.Add(new Mapping(crawledProp.Propset, crawledProp.Name, crawledProp.VariantType, managedProperty.PID));
managedProperty.SetMappings(mappings);
managedProperty.Update();
}
else
{
Console.WriteLine("Query crawled property " + crawledPropertyName + " was not found - mapping failed.");
}
}
catch (Exception ex)
{
throw new Exception("Failed to map field to crawled property \n" + ex.Message);
}
});
}
}
}
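Since the goal was a one-click setup on feature activation, the generator above could be invoked from a site-scoped feature receiver. A minimal sketch; the receiver class name and the hard-coded catalog/model values are hypothetical and would normally come from feature properties:

```csharp
using System;
using Microsoft.SharePoint;

namespace BCSModelGeneration
{
    // Hypothetical site-scoped feature receiver that kicks off the whole topology.
    public class BCSSearchFeatureReceiver : SPFeatureReceiver
    {
        public override void FeatureActivated(SPFeatureReceiverProperties properties)
        {
            // For a site-scoped feature, Feature.Parent is the SPSite.
            SPSite site = properties.Feature.Parent as SPSite;
            if (site == null)
                return;

            // "MyCatalog" and "MyModel" are placeholders; in a real deployment
            // read them from the feature's properties instead.
            ContentSourceGenerator.CreateBCSContentSource("MyCatalog", site.Url, "MyModel");
        }
    }
}
```

Keep in mind that CreateBCSContentSource blocks while the crawl runs, so activating the feature synchronously can take several minutes; for production you may prefer to move the crawl wait into a timer job.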