Enabling OCR of TIFF images for SharePoint 2013 Search

SharePoint 2013 Enterprise Search has the built-in ability to OCR and index the content of your scanned TIFF images during a crawl (whether they are stored in SharePoint or not). This is a very powerful feature, yet a bit mysterious to configure since the configuration steps have changed from the 2010 version. I’ll outline the steps below:

  1. Using Server Manager, ensure the Windows TIFF iFilter feature is enabled on each crawl server
  2. Open the Local Group Policy Editor and locate the OCR folder beneath Computer Configuration > Administrative Templates.
  3. Edit the policy setting for “Select OCR languages from a code page”.  Choose Enabled and select the appropriate languages.
  4. Open the SharePoint Management Shell (using Run as Administrator) and run the following commands to configure content parsing for TIFF images.
    $ssa = Get-SPEnterpriseSearchServiceApplication
    New-SPEnterpriseSearchFileFormat -SearchApplication $ssa tif "TIFF Image File" "image/tiff"
    New-SPEnterpriseSearchFileFormat -SearchApplication $ssa tiff "TIFF Image File" "image/tiff"
  5. Restart the SharePoint Search Host Controller service.
  6. Open the Search Service Application administration. Under the Crawling navigation item, navigate to File Types and add two new File Types for tif and tiff.
  7. Perform a Full Crawl of your content.

Depending on how many TIFF images are crawled, this full crawl may take considerably longer than your previous crawls. Additional planning may be necessary, such as scoping a Content Source to only the content that should be OCR’d, or adjusting crawl schedules.

 

Texas Road Side History – There’s an App for That


Have you ever wondered what that historical marker says that you just passed while driving down the road? Well, there’s an app for that, if you live in Texas.

Please cut to the chase and give me the links

Texas Road Side History was first thought up by my colleague Derek Martin as one of four applications our Slalom Consulting Dallas office developed during a one-day Windows Phone 7 “lock-in”.

One of our goals was to learn how to make use of Microsoft’s SQL Azure and Azure Web roles. Using SQL Azure, we took advantage of the Geography data type and the STDistance function to query for historical markers by proximity. We next exposed a WCF Web service for executing the proximity query. Lastly, we generated an OData endpoint using WCF Data Services for all of our basic data access needs.
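To give a flavor of the proximity query, here is a rough sketch of what such a WCF operation could look like; the names (MarkerFinderService, FindNearbyMarkers, the Markers table, and the connection string placeholder) are illustrative assumptions rather than the actual Texas Road Side History code.

using System.Collections.Generic;
using System.Data.SqlClient;
using System.ServiceModel;

[ServiceContract]
public class MarkerFinderService
{
    // Returns the marker numbers within radiusMeters of the supplied coordinate,
    // letting SQL Azure's geography type and STDistance do the heavy lifting.
    [OperationContract]
    public List<int> FindNearbyMarkers(double latitude, double longitude, double radiusMeters)
    {
        const string sql = @"
            SELECT MarkerNum
            FROM Markers
            WHERE Location.STDistance(geography::Point(@lat, @lon, 4326)) <= @radius";

        var markerNums = new List<int>();
        using (var connection = new SqlConnection("<SQL Azure connection string>"))
        using (var command = new SqlCommand(sql, connection))
        {
            command.Parameters.AddWithValue("@lat", latitude);
            command.Parameters.AddWithValue("@lon", longitude);
            command.Parameters.AddWithValue("@radius", radiusMeters);

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                    markerNums.Add(reader.GetInt32(0));
            }
        }

        return markerNums;
    }
}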

After several iterations of submissions and rejections to the WP7 Marketplace, Texas Road Side History was finally approved. Microsoft’s approval process was very impressive. Each time we received a rejection, it came with a very thorough report with the reasoning and the exact steps needed in order to make the application comply. It is clear that Microsoft is focused not on the quantity of apps in the Marketplace, but on the quality.

After our “lock-in”, I caught the mobile bug and wanted to see what iOS was all about. Dabbling a bit with Objective-C, I quickly turned to MonoTouch – which left me with just iOS APIs to ramp up on. For some of the details on consuming the WCF and OData services, read my previous two posts here and here.

Submission to the Apple App Store via iTunes Connect was a bit less impressive, albeit with a faster time to market (4 days after submitting). The iTunes Connect site took a bit of getting used to. Fortunately, the app was approved on the first submission, which is unfortunate only in the sense that I cannot report on the quality of an App Store rejection report.

Texas Road Side History has proved to be a well-rounded mobile app for getting our feet wet in the mobility space. Whether you are a history nut or just curious to see what our first mobile app looks like, I hope you’ll give it a spin and let me know what you think.

Consuming an OData Feed using MonoTouch

After having great success consuming a WCF Web service using MonoTouch, as I mentioned in the previous post, I next set off to consume an OData feed, thinking this would be just as easy. And sure enough, it was, until I graduated my wonderful little app from the iPhone Simulator to my actual phone. Once on the phone, the application would still run just fine, but whenever a call was made to asynchronously fetch data from the OData feed – crickets – my user-friendly activity indicators just kept spinning indefinitely.

Here are the steps I took to get this far:

  1. Copied into my MonoTouch project the generated data service reference class from the WP7 project where I previously consumed the OData feed
  2. Referenced the System.Data.Services.Client assembly provided by MonoTouch
  3. Put together the following method to asynchronously retrieve a single entity from the OData feed:
public void GetMarker(int markerNum, Action<Marker> successCallback, Action<Exception> errorCallback)
{
    var serviceUri = new Uri(ODataServiceUrl);
    var dataServiceContext = new MyDataServiceContext(serviceUri);
    var markerUri = new Uri(string.Format("/Marker({0})", markerNum), UriKind.Relative);

    try
    {
        dataServiceContext.BeginExecute<Marker>(markerUri, a =>
        {
            try
            {
                var results = dataServiceContext.EndExecute<Marker>(a);
                var marker = results.Single();
                successCallback(marker);
            }
            catch (Exception e)
            {
                errorCallback(e);
            }
        },
        null);
    }
    catch (Exception e)
    {
        errorCallback(e);
    }
}

Reviewing the above code, you would think it is guaranteed that one of the two callbacks (either successCallback or errorCallback) will always be called. However, launching the Mono Soft Debugger proved otherwise. I placed a breakpoint inside the anonymous delegate passed to the BeginExecute method, and sure enough, it was never reached.

Dumbfounded by what was happening, I rewrote the method to make the call synchronously by replacing the call to BeginExecute with its synchronous counterpart, Execute. It was then that I finally got some visibility into my original issue because the underlying exception was now being caught:

Attempting to JIT compile method '(wrapper managed-to-native) System.Threading.Interlocked:CompareExchange
(System.Exception&,System.Exception,System.Exception)' while running with --aot-only.
at System.Data.Services.Client.BaseAsyncResult.HandleFailure (System.Exception e) [0x00000] in :0
at System.Data.Services.Client.QueryResult.Execute () [0x00000] in :0
at System.Data.Services.Client.DataServiceRequest.Execute[Marker]
(System.Data.Services.Client.DataServiceContext context, System.Data.Services.Client.QueryComponents
queryComponents) [0x00000] in :0
at System.Data.Services.Client.DataServiceContext.Execute[Marker] (System.Uri requestUri) [0x00000] in :0
at Trentacular.Trsh.MarkerService.GetMarkerUsingDataServiceContext (Int32 markerNum, System.Action`1
successCallback, System.Action`1 errorCallback) [0x0002a]

I then found this entry in the MonoTouch troubleshooting documentation, which I believe suggests that I need to explicitly force the AOT compiler to include the Interlocked.CompareExchange method by calling it myself before the call to Execute. I tried that, but the same exception was still thrown, only now from my explicit call to CompareExchange.
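For reference, my attempt at that workaround looked roughly like the following sketch (not verbatim from my project); the idea is that calling the generic overload directly forces the AOT compiler to emit it:

using System;
using System.Threading;

static class AotWorkaround
{
    public static void ForceCompareExchangeInclusion()
    {
        Exception dummy = null;

        // The result is irrelevant; the call exists only so that the AOT compiler
        // emits code for this Interlocked.CompareExchange instantiation.
        Interlocked.CompareExchange<Exception>(ref dummy, null, null);
    }
}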

Running out of options, I ditched the DataServiceContext altogether and rewrote the GetMarker method using a WebClient and LINQ to XML as follows:

static readonly XNamespace atomNS = XNamespace.Get("http://www.w3.org/2005/Atom");
static readonly XNamespace metadataNS = XNamespace.Get("http://schemas.microsoft.com/ado/2007/08/dataservices/metadata");
static readonly XNamespace dataservicesNS = XNamespace.Get("http://schemas.microsoft.com/ado/2007/08/dataservices");

public void GetMarker(int markerNum, Action<Marker> successCallback, Action<Exception> errorCallback)
{
    var webClient = new WebClient();
    webClient.DownloadStringCompleted += delegate(object sender, DownloadStringCompletedEventArgs args)
    {
        try
        {
            if (args.Error != null)
            {
                errorCallback(args.Error);
                return;
            }

            var document = XDocument.Parse(args.Result);
            var root = document.Root;
            var properties = root.Element(atomNS + "content").Element(metadataNS + "properties");

            var marker = new Marker
            {
                MarkerNum = Convert.ToInt32(properties.Element(dataservicesNS + "MarkerNum").Value),
                Address = properties.Element(dataservicesNS + "Address").Value.TrimEnd(),
                YearEstablished = properties.Element(dataservicesNS + "Year").Value,
                MarkerText = properties.Element(dataservicesNS + "MarkerText").Value
            };

            successCallback(marker);
        }
        catch (Exception e)
        {
            errorCallback(e);
        }
    };

    try
    {
        var markerUri = new Uri(string.Format("{0}/Marker({1})", ODataServiceUrl, markerNum), UriKind.Absolute);
        webClient.DownloadStringAsync(markerUri);
    }
    catch (Exception e)
    {
        errorCallback(e);
    }
}

This worked like a charm. It is unfortunate, though, that the DataServiceContext approach doesn’t work. While LINQ to XML isn’t all that bad, I am doing more work than my spoiled .NET developer mentality would like: why traverse XML, deal with XML namespaces, and hard-code node names when all of this can be abstracted away for you?

Knowing the Mono guys are no longer working for Novell, I doubt this will ever get fixed in MonoTouch, but I am hoping that they will produce something far more exceptional with their new venture Xamarin.

iOS and WCF – Better Together Thanks to MonoTouch

I was recently presented with an opportunity to write a mobile application for a local Dallas homeless ministry. Learning what the ministry leader envisioned for the app, combined with already owning a Mac and an iPhone, led me down the iOS development path. After reading a little about Objective-C and memory management, I was quickly losing hope of turning something around before having to go back to work Monday. I guess I’ve been spoiled for too long by automatic garbage collection.

I then learned about MonoTouch, downloaded the trial, got up and running with MonoDevelop, and one evening later had finished my first application. To get my feet wet, I chose to port a Windows Phone 7 application my colleagues at Slalom and I had written called Texas Roadside History (shameless plug: if you use a WP7, search for Roadside History in the Marketplace).

I want to highlight a single HUGE benefit I’ve realized to having used MonoTouch: I get the WCF stack!

The server component of Texas Roadside History is running in Azure – SQL Azure and a Web role providing both a SOAP Web service endpoint and OData endpoints. Having already generated service references (client proxies) for these endpoints in the WP7 version of the app, I simply copied them into my MonoTouch project, referenced the System.ServiceModel assembly, and was back to making asynchronous Web service calls with zero XML parsing.
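As a rough illustration (the proxy and member names below are placeholders, not the actual generated code), the familiar event-based async pattern carries straight over to MonoTouch:

using System;

public class MarkerLookup
{
    public void LoadMarker(int markerNum)
    {
        // MarkerServiceClient stands in for the WP7-generated proxy copied into the MonoTouch project.
        var client = new MarkerServiceClient();
        client.GetMarkerCompleted += (sender, e) =>
        {
            if (e.Error != null)
                Console.WriteLine("Service call failed: " + e.Error.Message);
            else
                Console.WriteLine("Retrieved marker: " + e.Result.Title);
        };
        client.GetMarkerAsync(markerNum);
    }
}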

Deploying SharePoint BDC Models via Features without specifying a SiteUrl

Today I attempted to deploy a BDC model to my client’s SharePoint farm. I used the Visual Studio 2010 SharePoint project template and the Business Data Catalog Model item template to create the model, and I had no issues using the convenient Deploy command from within Visual Studio to deploy the model and make use of it in my virtual development environment. But when I deployed my solution to my client’s farm and attempted to activate the BDC model’s Feature, the activation would fail with the following error:

The default web application could not be determined. Set the SiteUrl property in feature “My BDC Model Feature” to the URL of the desired site and retry activation.
Parameter name: properties

I soon found Frederik Prijck’s blog post, which explained that I am supposed to specify a SiteUrl property on the model; it is used to resolve the Web application, and its mapped BCS service application, that the model gets deployed to. So why was I able to deploy the model in my development environment?

The answer is that in my development environment, I have one and only one Web application that is hosted on port 80. As Frederik explained:

Strange enough this BCS entity will automatically attach an event receiver (Microsoft.Office.SharePoint.ClientExtensions.Deployment.ImportModelReceiver) to the feature. This event receiver has a method named GetDefaultWebApp which will be called when there is no SiteUrl property set in the manifest file. GetDefaultWebApp (as you can see using reflector) will search for a webapplication on port 80 or port 443 (not 100% sure about port 443, but you will understand what i mean i hope).

Frederik then goes on to talk of a dirty solution involving creating an empty Web application on port 80 just to support deploying BDC models without having to hard code a SiteUrl in the model.

The problem for me, though, is that my client’s farm already has a Web application on port 80. Actually, they have eight Web applications using port 80, all with different host headers.

Opening up ILSpy to see exactly what was going on in the GetDefaultWebAppUrl method reveals that there is a little more to the logic. First of all, Web applications on port 80 take precedence over Web applications on port 443. If there is only a single Web application on port 80 or 443, the method returns that Web application’s default zone URL. If there are multiple Web applications, the method attempts to choose the Web application whose default zone host name has the fewest parts when split on the period character; for example, intranet.contoso.com would take precedence over test.intranet.contoso.com. If there is still ambiguity, the method makes one last check to see whether one of the default zone host names begins with ‘www’. If ambiguity remains, the method throws the exception quoted above.
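Roughly speaking, the selection boils down to something like this (my own reconstruction of the logic as seen in ILSpy, not Microsoft’s actual code):

using System;
using System.Collections.Generic;
using System.Linq;

static class DefaultWebAppSelection
{
    // Given the default-zone URIs of the farm's Web applications on ports 80 and 443,
    // approximate which one the ImportModelReceiver would treat as the "default".
    public static Uri PickDefault(IList<Uri> port80Apps, IList<Uri> port443Apps)
    {
        // Port 80 wins over port 443.
        var candidates = port80Apps.Count > 0 ? port80Apps : port443Apps;
        if (candidates.Count == 1)
            return candidates[0];

        // Prefer the host name with the fewest dot-separated parts,
        // e.g. intranet.contoso.com beats test.intranet.contoso.com.
        var fewestParts = candidates
            .GroupBy(u => u.Host.Split('.').Length)
            .OrderBy(g => g.Key)
            .First()
            .ToList();
        if (fewestParts.Count == 1)
            return fewestParts[0];

        // Last chance: a single host name starting with "www".
        var www = fewestParts
            .Where(u => u.Host.StartsWith("www", StringComparison.OrdinalIgnoreCase))
            .ToList();
        if (www.Count == 1)
            return www[0];

        throw new InvalidOperationException("The default web application could not be determined.");
    }
}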

I can’t explain why Microsoft chose to implement this method in such a way, nor why they chose not to document it, but knowing this behavior provides another workaround. Taking a similar approach to Frederik’s, I created a dummy Web application with a host header beginning with ‘www’, created the root Site Collection, and my BDC model now deploys like a charm without a hardcoded SiteUrl.

Windows Phone 7: Custom Tool Error failed to generate code for service reference

Stumbling here because you are having trouble updating a service reference in your Windows Phone 7 application? Are you getting a warning that reads something like this?

Custom tool warning: Cannot import wsdl:portType

Detail: An exception was thrown while running a WSDL import extension: System.ServiceModel.Description.DataContractSerializerMessageContractImporter

Error: Could not load type 'System.Runtime.Serialization.DataContractSet' from assembly 'System.Runtime.Serialization, Version=2.0.5.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e'.

Or this?

Custom tool warning: No endpoints compatible with Silverlight 3 were found. The generated client class will not be usable unless endpoint information is provided via the constructor.

I’ve had this problem several times now, and each time I’m stumped as to why it happens. So I am writing down the cause once and for all so that I don’t keep forgetting. Here goes …

I frequently want to consume OData feeds within my apps. An easy way to do so is to use the Open Data Protocol Client Library available on CodePlex. The library contains the System.Data.Services.Client.dll assembly that you reference in your Windows Phone application project.

This assembly causes trouble when mixed with service reference generation. I’ve seen posts that talk about deselecting the “Reuse Types in Referenced Assemblies” option when generating the reference, but this is not exactly a solution I am happy with. What if I do want to reuse types?

The best workaround I’ve come up with so far is to generate my service references in a separate Windows Phone Class Library project that does not reference the System.Data.Services.Client.dll. I then add a project reference to this class library in the WP7 application project. Finally, I need to copy the system.serviceModel entries generated in the class library’s ServiceReferences.ClientConfig file to the WP7 application’s ServiceReferences.ClientConfig file.

It is probably not a bad idea to put any OData-generated proxies, along with the System.Data.Services.Client.dll reference, in a separate Windows Phone Class Library project as well.

I hope this post saves you some headache as it will for me time and time again.

Using the Windows Phone 7 Bing Maps Control with MVVM Light

I’ve recently adopted MVVM Light as my MVVM framework of choice for both Silverlight and Windows Phone 7 development. Why use frameworks like MVVM Light? First and foremost, to achieve clear separation of concerns between the application UI and the backing data. While this goal seems fairly elementary, pulling this off in practice can be quite challenging due to how Microsoft has designed some of the native controls. The Bing Maps control is one good example.

In my scenario (which I would guess is fairly common), I am data binding multiple locations as push pins on the map. Clicking a push pin should cause the application to navigate to a details page (a responsibility of the view), but the corresponding view model for the details page also needs to be notified of the location in context. This is where things get a little tricky and the MVVM Light EventToCommand behavior and Messenger features come to the rescue.

To begin, I’ve put together a simple map view shown below. Notice the two interaction triggers on the map push pin. The second trigger is the navigate action … straightforward enough. The first trigger is the MVVM Light EventToCommand behavior that invokes a command called ‘SelectCommand’ on the bound location.

<phone:PhoneApplicationPage
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:phone="clr-namespace:Microsoft.Phone.Controls;assembly=Microsoft.Phone"
    xmlns:shell="clr-namespace:Microsoft.Phone.Shell;assembly=Microsoft.Phone"
    xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
    xmlns:i="clr-namespace:System.Windows.Interactivity;assembly=System.Windows.Interactivity"
    xmlns:ic="clr-namespace:Microsoft.Expression.Interactivity.Core;assembly=Microsoft.Expression.Interactions"
    xmlns:Maps="clr-namespace:Microsoft.Phone.Controls.Maps;assembly=Microsoft.Phone.Controls.Maps"
    xmlns:gs="clr-namespace:GalaSoft.MvvmLight.Command;assembly=GalaSoft.MvvmLight.Extras.WP7"
    xmlns:c="clr-namespace:Trentacular.Phone.Converters"
    x:Class="Trentacular.Phone.MainPage"
    FontFamily="{StaticResource PhoneFontFamilyNormal}"
    FontSize="{StaticResource PhoneFontSizeNormal}"
    Foreground="{StaticResource PhoneForegroundBrush}"
    SupportedOrientations="Portrait"
    Orientation="Portrait"
    DataContext="{Binding Main, Source={StaticResource Locator}}">

    <phone:PhoneApplicationPage.Resources>
        <c:GeoCoordinateConverter x:Key="GeoCoordinateConverter" />
    </phone:PhoneApplicationPage.Resources>

    <Grid x:Name="LayoutRoot">
        <Maps:Map ZoomLevel="10" Center="{Binding Path=MapCenter, Mode=TwoWay}">
            <Maps:MapItemsControl ItemsSource="{Binding NearbyLocations}">
                <Maps:MapItemsControl.ItemTemplate>
                    <DataTemplate>
                        <Maps:Pushpin Location="{Binding Converter={StaticResource GeoCoordinateConverter}}">
                            <i:Interaction.Triggers>
                                <i:EventTrigger EventName="MouseLeftButtonDown">
                                    <gs:EventToCommand Command="{Binding Path=SelectCommand}" />
                                    <ic:NavigateToPageAction TargetPage="/Details.xaml"/>
                                </i:EventTrigger>
                            </i:Interaction.Triggers>
                        </Maps:Pushpin>
                    </DataTemplate>
                </Maps:MapItemsControl.ItemTemplate>
            </Maps:MapItemsControl>
        </Maps:Map>
    </Grid>

</phone:PhoneApplicationPage>
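The GeoCoordinateConverter referenced above simply projects the bound item onto a GeoCoordinate for the Pushpin’s Location property. A minimal sketch of such a converter (assuming the bound item implements the ILocation interface that MarkerViewModel implements below) could look like this:

using System;
using System.Device.Location;
using System.Globalization;
using System.Windows.Data;

namespace Trentacular.Phone.Converters
{
    public class GeoCoordinateConverter : IValueConverter
    {
        public object Convert(object value, Type targetType, object parameter, CultureInfo culture)
        {
            // Assumes ILocation exposes Latitude and Longitude (as MarkerViewModel does).
            var location = value as ILocation;
            if (location == null)
                return null;

            return new GeoCoordinate(location.Latitude, location.Longitude);
        }

        public object ConvertBack(object value, Type targetType, object parameter, CultureInfo culture)
        {
            throw new NotImplementedException();
        }
    }
}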

For the sake of brevity, I am going to skip the Main view model, which contains an ObservableCollection of MarkerViewModel objects called NearbyLocations that the MapItemsControl’s ItemsSource property shown above binds to.

Now let’s take a look at the MarkerViewModel class, which serves as the view model for a single location on the map, represented by a push pin. Notice the implementation of the SelectCommand property, which simply fires a PropertyChangedMessage. This allows the details view model, which listens for this particular message, to update its context when the push pin is clicked.

public class MarkerViewModel : ILocation
{
    public Marker Marker { get; set; }

    public double Latitude
    {
        get { return Marker.Latitude; }
    }

    public double Longitude
    {
        get { return Marker.Longitude; }
    }

    private ICommand _selectCommand;
    public ICommand SelectCommand
    {
        get
        {
            if (_selectCommand == null)
            {
                _selectCommand = new RelayCommand(() =>
                {
                    var messenger = Messenger.Default;
                    messenger.Send(
                        new PropertyChangedMessage<Marker>(null, Marker, "SelectedMarker")
                    );
                });
            }
            return _selectCommand;
        }
    }
}

Finally, in the constructor of the Details view model, we register to listen for the PropertyChangedMessage<Marker> message.

public class DetailsViewModel : ViewModelBase
{
    public DetailsViewModel()
    {
        if (IsInDesignMode)
        {
            Marker = new Marker
            {
                ID = 1,
                Title = "Design Mode Marker Title"
            };
        }
        else
        {
            Messenger.Default.Register<PropertyChangedMessage<Marker>>(this,
                message =>
                {
                    DispatcherHelper.CheckBeginInvokeOnUI(() =>
                    {
                        Marker = message.NewValue;
                    });
                });
        }
    }

    public const string MarkerPropertyName = "Marker";
    private Marker _marker;
    public Marker Marker
    {
        get { return _marker; }
        set
        {
            if (_marker == value)
                return;

            var oldValue = _marker;
            _marker = value;

            // Update bindings, no broadcast
            RaisePropertyChanged(MarkerPropertyName);
        }
    }
}

To sum up, I’ve demonstrated two really great features of the MVVM Light framework – the EventToCommand behavior and the Messenger features – and how these features can be used to successfully pull off the MVVM pattern with the Windows Phone 7 Bing Maps control.

SharePoint 2010: Programmatically Retrieve Credentials from the Secure Store Service

SharePoint 2010’s Secure Store Service provides a way to map credentials and delegate access to remote resources. You may already be familiar with its predecessor, the MOSS 2007 Single Sign-on Shared Service. The Secure Store Service integrates seamlessly with Business Connectivity Services (BCS), but it also features an API that custom development projects can take advantage of. This makes the service an attractive option for storing sensitive configuration data such as connection strings, Web service credentials, etc.

The Secure Store Service allows us to create Target Applications, which house sets of credentials. The two main types are Individual and Group: Individual meaning credentials are mapped to individual users, and Group meaning all users share the same set of credentials.

While the raw API isn’t very intuitive, its design was likely intentional (additional security by obfuscation). With a little marshalling help from our interop library friends, we are able to retrieve credentials (provided we have the appropriate permissions on the target application).

To begin, we need to reference a couple of assemblies.

Microsoft.BusinessData.dll

C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\ISAPI\Microsoft.BusinessData.dll

Microsoft.Office.SecureStoreService.dll

C:\Windows\assembly\GAC_MSIL\Microsoft.Office.SecureStoreService\14.0.0.0__71e9bce111e9429c\Microsoft.Office.SecureStoreService.dll

And now for the reason you came to this post … the code

using System.Collections.Generic;
using System.Linq;
using System.Runtime.InteropServices;
using System.Security;
using Microsoft.BusinessData.Infrastructure.SecureStore;
using Microsoft.Office.SecureStoreService.Server;
using Microsoft.SharePoint;

namespace Trentacular.SharePoint.Util
{
    public static class SecureStoreUtils
    {
        public static Dictionary<string, string> GetCredentials(string applicationID)
        {
            var serviceContext = SPServiceContext.Current;
            var secureStoreProvider = new SecureStoreProvider { Context = serviceContext };
            var credentialMap = new Dictionary<string, string>();

            using (var credentials = secureStoreProvider.GetCredentials(applicationID))
            {
                var fields = secureStoreProvider.GetTargetApplicationFields(applicationID);
                for (var i = 0; i < fields.Count; i++)
                {
                    var field = fields[i];
                    var credential = credentials[i];
                    var decryptedCredential = ToClrString(credential.Credential);

                    credentialMap.Add(field.Name, decryptedCredential);
                }
            }

            return credentialMap;
        }

        public static string ToClrString(this SecureString secureString)
        {
            var ptr = Marshal.SecureStringToBSTR(secureString);

            try
            {
                return Marshal.PtrToStringBSTR(ptr);
            }
            finally
            {
                Marshal.FreeBSTR(ptr);
            }
        }
    }
}
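With that in place, retrieving a Group target application’s credentials boils down to something like the following (the target application ID and field names are placeholders for whatever you configured in Central Administration):

// "MyWebServiceCreds", "UserName", and "Password" are placeholder names for a
// Secure Store target application and its fields. GetCredentials must run in a
// context where SPServiceContext.Current is available (e.g., a SharePoint request).
var credentials = SecureStoreUtils.GetCredentials("MyWebServiceCreds");
var userName = credentials["UserName"];
var password = credentials["Password"];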

Linq to Entities Wild-card LIKE Extension Method

To build a wildcard-enabled LINQ query for Entity Framework, you have several methods available from the System.String class that Entity Framework supports and translates into SQL:

  • Contains(string value)
  • StartsWith(string value)
  • EndsWith(string value)

A simple query example might be:

var q = (from c in db.Customers
         where c.CompanyName.Contains(name)
         select c)
        .ToList();

The above example will always search anywhere in CompanyName for a match. But suppose you need to give your users a little more control over the matching by allowing them to supply wildcard characters at either the start or end of the text to match. This means you are left to build your query dynamically based on the presence and location of the wildcard characters.

Well, my first pass at this resulted in a chunk of code that I really never want to write again. I therefore rewrote it using expression trees so that it can be reused in any future query. Here are the resulting extension methods, which you are welcome to reuse:

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.Objects;
using System.Data.Objects.DataClasses;
using System.Linq;
using System.Linq.Expressions;
using System.Reflection;

public static class LinqExtensions
{
    public static IQueryable<TSource> WhereLike<TSource>(
        this IQueryable<TSource> source,
        Expression<Func<TSource, string>> valueSelector,
        string value,
        char wildcard)
    {
        return source.Where(BuildLikeExpression(valueSelector, value, wildcard));
    }

    public static Expression<Func<TElement, bool>> BuildLikeExpression<TElement>(
        Expression<Func<TElement, string>> valueSelector,
        string value,
        char wildcard)
    {
        if (valueSelector == null)
            throw new ArgumentNullException("valueSelector");

        var method = GetLikeMethod(value, wildcard);

        value = value.Trim(wildcard);
        var body = Expression.Call(valueSelector.Body, method, Expression.Constant(value));

        var parameter = valueSelector.Parameters.Single();
        return Expression.Lambda<Func<TElement, bool>>(body, parameter);
    }

    private static MethodInfo GetLikeMethod(string value, char wildcard)
    {
        // Map the wildcard positions to the matching string method:
        // no wildcards or wildcards on both ends -> Contains,
        // trailing wildcard only -> StartsWith, leading wildcard only -> EndsWith.
        var methodName = "Contains";

        var textLength = value.Length;
        value = value.TrimEnd(wildcard);
        if (textLength > value.Length)
        {
            methodName = "StartsWith";
            textLength = value.Length;
        }

        value = value.TrimStart(wildcard);
        if (textLength > value.Length)
        {
            methodName = (methodName == "StartsWith") ? "Contains" : "EndsWith";
            textLength = value.Length;
        }

        var stringType = typeof(string);
        return stringType.GetMethod(methodName, new Type[] { stringType });
    }
}

Usage of the WhereLike extension method is as follows:

var searchTerm = "*Inc";
var q = db.Customers
    .WhereLike(c => c.CompanyName, searchTerm, '*')
    .ToList();

Avoiding SOAP Bloat with JSON Services

In this post I am going to walk through writing and consuming JSON services using ASP.NET, WCF, and jQuery to request the stock price for a company.

Visual Studio 2010 Web Application Project Template Additions

  • jQuery IntelliSense – let Visual Studio write your jQuery for you
  • AJAX-enabled WCF Service – item template that auto-generates the web.config entries for configuring a JSON service to be consumed and proxied by an ASP.NET ScriptManager
  • Targeted web.config files – easy way to manage different service endpoints for different environments

What is JSON?

JavaScript Object Notation – JSON is a subset of the object literal notation of JavaScript. Since JSON is a subset of JavaScript, it can be used in the language with no muss or fuss.

var dog = {color: "grey", name: "Spot", size: 46};

SOAP Bloat

SOAP services are extremely verbose. This verbosity enables us to use tools like Visual Studio’s built-in “Add Service Reference” to auto-generate client proxy classes. The following examples demonstrate the XML used for requesting a stock price and the corresponding response:

Example SOAP Request

<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://www.w3.org/2001/12/soap-envelope" soap:encodingStyle="http://www.w3.org/2001/12/soap-encoding">
  <soap:Body xmlns:m="http://www.example.org/stock">
    <m:GetStockPrice>
      <m:StockName>GOOG</m:StockName>
    </m:GetStockPrice>
  </soap:Body>
</soap:Envelope>

Example SOAP Response

<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://www.w3.org/2001/12/soap-envelope" soap:encodingStyle="http://www.w3.org/2001/12/soap-encoding">
  <soap:Body xmlns:m="http://www.example.org/stock">
    <m:GetStockPriceResponse>
      <m:Price>534.5</m:Price>
    </m:GetStockPriceResponse>
  </soap:Body>
</soap:Envelope>

The above example when formatted as JSON is as follows:

GET Request

ticker=GOOG

JSON Response

{"d":534.5}

JSON Enabling a WCF Service

  1. Using the AJAX-enabled WCF Service item template pretty much does it all. An additional step can be taken to eliminate the need for the web.config additions by specifying the service host factory directly in the service declaration (.svc) file:
    <%@ ServiceHost Language="C#" Debug="true"
    Service="Trentacular.JsonWcfDemo.StockPriceService"
    Factory="System.ServiceModel.Activation.WebScriptServiceHostFactory"
    CodeBehind="StockPriceService.svc.cs" %>
  2. Decorate the operation (service method) with the WebGetAttribute to enable the use of HTTP GET for data retrieval and to return the response as JSON:
    [WebGet(ResponseFormat = WebMessageFormat.Json)]

Our resulting stock price service’s code behind will look like the following:

[ServiceContract(Namespace = JsonWcfDemoNamespace.Value)]
public class StockPriceService
{
    // To use HTTP GET, add [WebGet] attribute. (Default ResponseFormat is WebMessageFormat.Json)
    // To create an operation that returns XML,
    //     add [WebGet(ResponseFormat=WebMessageFormat.Xml)],
    //     and include the following line in the operation body:
    //         WebOperationContext.Current.OutgoingResponse.ContentType = "text/xml";
    [OperationContract]
    [WebGet(ResponseFormat = WebMessageFormat.Json)]
    public StockPrice GetStockPrice(string ticker)
    {
        ...
    }

    // Add more operations here and mark them with [OperationContract]
}

Converting Text to JSON and Back

eval() – invokes the JavaScript compiler. The compiler will correctly parse the text and produce an object structure. The eval function is very fast. However, it can compile and execute any JavaScript program, so there can be security issues.

JSON.parse() – To defend against security issues with the eval function, it is preferred to use a JSON parser. A JSON parser will recognize only JSON text, rejecting all scripts. In browsers that provide native JSON support, JSON parsers are also much faster than eval. It is expected that native JSON support will be included in the next ECMAScript standard.

jQuery’s $.getJSON() Method

jQuery provides the getJSON method for easily making calls to services providing JSON-formatted responses and performing the JSON conversion.  Its signature is as follows:

$.getJSON( url, [ data ], [ callback(data, textStatus) ] )

url – A string containing the URL to which the request is sent.

data – A map or string that is sent to the server with the request.

callback(data, textStatus) – A callback function that is executed if the request succeeds.

Consuming our stock price service using the getJSON method will look like the following:

$.getJSON("StockPriceService.svc/GetStockPrice", { ticker: tickerValue }, function (data, textStatus) {

    if (textStatus != 'success' || !data.d) {
        $("#stockPricePanel")
            .html('<span class="error">Error looking up stock price. Did you enter a valid ticker?</span>');
    } else {
        var stockPrice = data.d;
        var isIncrease = (stockPrice.Delta >= 0);
        var deltaStyle = isIncrease ? 'gain' : 'loss';

        $("#stockPricePanel")
            .html('<span class="price">' + stockPrice.Price + '</span>')
            .append('<span class="delta ' + deltaStyle + '">' + stockPrice.Delta + '</span>')
            .append('<span class="percent ' + deltaStyle + '">(' + stockPrice.PercentChange + '%)</span>');
    }
});

Working with WCF Serialized JSON Dates

One caveat when working with JSON is that WCF serializes DateTimes to JSON in the format:

/Date({milliseconds since 01/01/1970}-{time zone})/
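If you are curious where that format comes from, a quick console sketch using DataContractJsonSerializer (my own illustration, not part of the demo application) shows it:

using System;
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Json;
using System.Text;

[DataContract]
public class Quote
{
    [DataMember]
    public DateTime AsOf { get; set; }
}

public static class JsonDateDemo
{
    public static void Main()
    {
        var serializer = new DataContractJsonSerializer(typeof(Quote));
        using (var stream = new MemoryStream())
        {
            serializer.WriteObject(stream, new Quote { AsOf = DateTime.Now });

            // Prints something like: {"AsOf":"\/Date(1302901234567-0500)\/"}
            Console.WriteLine(Encoding.UTF8.GetString(stream.ToArray()));
        }
    }
}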

Unfortunately, native JSON conversion does not automatically convert the date string to a JavaScript Date object. The following method shows one way to convert a JSON date string into a JavaScript Date object:

function convertJSONToDate(json) {
    return eval(json.replace(/\/Date\((.*?)\)\//gi, "new Date($1)"));
}

Download the Demo Application
