St. Louis Days of .NET 2014

My notes from the 2014 edition of St. Louis Days of .NET.  I was only able to attend the first day of the conference this year.

Front-End Design Patterns: SOLID CSS + JS for Backend Developers

Session Materials:

Use namespaced, unambiguous classes.  For example, use ".product_list_item" instead of ".product_list li", and ".h1" instead of "h1".

No cascading

Limit overriding

CSS Specificity – Specificity is the means by which a browser decides which property values are the most relevant to an element and get to be applied.
    Each CSS rule is assigned a specificity value
    Plot specificity values on a graph where the x-axis represents the line number in the CSS
    Line should be relatively flat, and only trend toward high specificity towards the end of the CSS
    Specificity graph generator:
    Another option of what a graph should look like:
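The specificity values mentioned above can be computed mechanically. Here is a deliberately simplified calculator (a sketch: it handles only ids, classes, and type selectors; real CSS specificity also counts attribute selectors, pseudo-classes, and pseudo-elements):

```typescript
// Toy CSS specificity calculator: returns [ids, classes, types].
// Simplified sketch -- real specificity also counts attribute
// selectors and pseudo-classes/elements.
function specificity(selector: string): [number, number, number] {
  let ids = 0, classes = 0, types = 0;
  for (const part of selector.split(/[\s>+~]+/).filter(Boolean)) {
    // Split a compound selector like "li.item#main" into simple selectors.
    for (const simple of part.match(/([#.]?[^#.]+)/g) ?? []) {
      if (simple.startsWith("#")) ids++;
      else if (simple.startsWith(".")) classes++;
      else types++;
    }
  }
  return [ids, classes, types];
}

// ".product_list li" scores higher than the flat ".product_list_item",
// which is why the namespaced class is easier to override later.
console.log(specificity(".product_list li"));   // [ 0, 1, 1 ]
console.log(specificity(".product_list_item")); // [ 0, 1, 0 ]
```

Plotting the first element (then the second, then the third) per rule, in file order, gives the specificity graph described above.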

Important CSS patterns and concepts
    Revealing Module
    Revealing Prototype
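The Revealing Module pattern keeps state private inside a closure and exposes only a chosen public API. A minimal sketch (the names are arbitrary):

```typescript
// Revealing Module pattern: private state lives in a closure,
// and only the members returned at the end are "revealed".
const counterModule = (() => {
  let count = 0; // private -- not reachable from outside

  function increment(): number {
    return ++count;
  }

  function current(): number {
    return count;
  }

  // Reveal only the public API.
  return { increment, current };
})();

counterModule.increment();
counterModule.increment();
console.log(counterModule.current()); // 2
```

The Revealing Prototype pattern applies the same idea to a constructor function's prototype, so methods are shared across instances.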

Optimizing Your Website’s Performance (End-To-End Diagnostics)

Session Materials:

If your test environment is different from your production environment, look for linear differences in order to estimate the differences between the servers.  For example, if the production server is a quad-core server and the test server is a dual-core server, measure the performance of the test server twice: once with one core active and once with both cores.  The difference between running with one core vs. two cores should allow you to estimate the difference between the dual-core server and the quad-core server.  Obviously, this will not be perfect, but it does provide some baseline for estimating the differences between servers.
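A toy version of that extrapolation, with made-up timings (both the function and the numbers are illustrative only):

```typescript
// Rough linear extrapolation from a dual-core test box to a
// quad-core production box. All numbers are hypothetical.
function estimateQuadCoreTime(oneCoreMs: number, twoCoreMs: number): number {
  // How much did adding the second core help?
  const speedupPerDoubling = oneCoreMs / twoCoreMs;
  // Assume (crudely) the same speedup applies going from 2 to 4 cores.
  return twoCoreMs / speedupPerDoubling;
}

// If a request takes 900ms on one core and 500ms on two cores,
// this estimates roughly 278ms on four cores.
console.log(Math.round(estimateQuadCoreTime(900, 500))); // 278
```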

Different browsers have different limits on how many simultaneous requests can be made to a single domain (varies from 4 to 10).

Simple stuff to look at when optimizing a web site:
    Large images
    Long-running javascript
    Large viewstate

Make sure cache-expiration is set correctly for static content.  This is done in the web.config file.
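On IIS 7 and later, static-content expiration is set with the "clientCache" element. A sketch (the seven-day max-age value is just an example, not from the session):

```xml
<!-- In web.config: cache static content for 7 days.
     The max-age value here is illustrative. -->
<system.webServer>
  <staticContent>
    <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
  </staticContent>
</system.webServer>
```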


Google PageSpeed
    Provides mobile and desktop scores
    Used in Google search rankings!
    Not useful for internal sites
    Similar to YSlow
    Blocked by pages requiring a login

Google Analytics (or similar)
    Useful for investigating daily loads (determine why site is slow at certain times)
    Use to investigate traffic patterns

    Reasonably priced and free options available
    Use to simulate traffic load on your site
    Only tests static html

    More expensive
    Use to simulate traffic load
    Tests everything; not just static content

New Relic
    Internal server monitoring

Hadoop For The SQL Ninja


Hive is a SQL-like query language for Hadoop.
    Originated at Facebook
    Compiles to Map/Reduce jobs
    Queries tables/catalogs defined on top of underlying data stores
    Data stores can be text files, Mongo, etc
    Data stores just need to provide rows and columns of data
    Custom data providers can be created to provide rows/columns of data

Hive is good for:
    Large scale queries
    A variety of formats
    UDF extensibility

Hive is NOT good for:
    Interactive querying
    Small tables

Hive connectivity
    ODBC/JDBC – responsive queries
    Oozie – job-based workflows
    Azure Toolkit/API – now includes Visual Studio integration for viewing/executing queries

Angular for .NET Developers

Session Materials:

AngularJS is a Javascript MVC framework
    Model-View-Controller are all on the client
    Data is exchanged via AJAX calls to REST web services
    Makes use of dependency injection

Benefits of AngularJS
    Unobtrusive Javascript
    Clean HTML
    Limits the need for third party libraries (like jQuery)
    Works well with ASP.NET MVC
    Easy Single-Page Applications (SPA)
    Testing is easy.  Jasmine is the test framework of choice.

HTML attributes provide AngularJS "hooks".  For example, notice the attributes on the elements <html ng-app="AngularApp"> and <input ng-model="" />

Data binding example:

    <input ng-model="name"/>
    <p>Hello {{name}}</p>

    In this example, data entered into the input text box is echoed in the paragraph below the input element.

Making Rich, Interactive, Multi-Platform Applications with SignalR

Session Materials:

Use cases for SignalR
    Any application that involves polling
    Chat applications
    Real-time score updates
    Voting results
    Real-time stock prices

The Smooth Transition to TypeScript


TypeScript provides compile-time errors in Visual Studio.

TypeScript has type-checking
    Optional types on variables and parameters
    Primitive types are number, string, boolean, and any
    The “any” type tells the compiler to treat the variable like Javascript would

Intellisense for TypeScript is very good, and other typical Visual Studio tooling works as well.

TypeScript files compile to javascript (example.ts –> example.js), and the javascript is what gets referenced in your web applications.

TypeScript class definitions become javascript types.

The usual Visual Studio design and compile-time errors are available when working with classes.

A NuGet package exists that provides “jQuery typing files” that enable working with jQuery in TypeScript.

TypeScript supports generics and lambdas.
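A small sketch pulling several of the features above together (optional types, a class, a generic, and a lambda); the names are arbitrary:

```typescript
// A typed class: the compiler checks constructor arguments and members.
class Greeter {
  constructor(private name: string) {}
  greet(): string {
    return `Hello, ${this.name}`;
  }
}

// A generic function taking a lambda as a predicate.
function firstMatch<T>(items: T[], predicate: (item: T) => boolean): T | undefined {
  for (const item of items) {
    if (predicate(item)) return item;
  }
  return undefined;
}

const g = new Greeter("TypeScript");
console.log(g.greet());                         // Hello, TypeScript
console.log(firstMatch([1, 2, 3], n => n > 1)); // 2
```

Compiling this file produces plain Javascript (the class becomes a constructor function and prototype), which is what the browser actually runs.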

IIS URL Rewrite – Directing All Traffic to the Site Root

I recently had a need to redirect all of a web site’s incoming traffic to that site’s home page. 

All of the site’s content had been migrated to a new site, and so now the old site simply needed to let users know what had happened and give them a link to the new site. 

A simple page with some text, a link, and an image was created to replace the home page.  Then, the following IIS URL Rewrite rules were added to direct all incoming traffic to the new home page:

        <rule name="Allow Local Resources" stopProcessing="true">
            <match url=".*" />
            <conditions logicalGrouping="MatchAny" trackAllCaptures="false">
                <add input="{REQUEST_FILENAME}" pattern="image.jpg" />
            </conditions>
            <action type="None" />
        </rule>
        <rule name="RedirectAllToSiteRoot" enabled="true" patternSyntax="ECMAScript" stopProcessing="true">
            <match url="^.+$" negate="false" />
            <conditions logicalGrouping="MatchAny" trackAllCaptures="false" />
            <action type="Redirect" url="/" appendQueryString="false" />
        </rule>

These rules were added inside the <rewrite><rules> element of the "system.webServer" section of the web.config.  The first rule allows the image to be served (without redirecting), and the second directs all other incoming traffic to the site root.

Installing Mediawiki on Debian

Here are the steps to follow to install Mediawiki on Debian.  It is assumed that GNOME is in use on Debian. Installation on Windows or other flavors of Linux may vary.

If you have an existing Debian installation, skip right to the "Installing Mediawiki" section.  Otherwise, start with the "Preparing the Server" section.


The quickest way to get up-and-running, particularly if you are simply evaluating Mediawiki, is to use a pre-installed Debian virtual machine. 

NOTE: Alternately, you can set up Debian from scratch on a computer or in a virtual machine.  That process is beyond the scope of these instructions.  Refer to the Debian website for guidance.

To use a pre-installed virtual machine, follow these steps:

Step 1

Install VirtualBox, if you do not already have it.  It can be downloaded from the VirtualBox website.

Step 2

Download the appropriate pre-installed Debian virtual machine.  For this tutorial, Debian 6.0.6 with GNOME was selected.

Step 3

Set up the virtual machine in VirtualBox.

Step 4

Start the virtual machine and log in.

Step 5

From the System/Administration menu, select Synaptic Package Manager.  You will be required to enter an administrative password.

Step 6

Use the Package Manager to search for "openssl".  If the installed version is between versions 1.0.1 and 1.0.1f (inclusive), it is vulnerable to the Heartbleed bug.  Do the following to upgrade it:

  1. Mark the "openssl" package for installation.
  2. If prompted to install/upgrade packages dependent on openssl, mark all of the additional packages for install/upgrade.
  3. Click the "Apply" button at the top of the Package Manager window.  Click the Apply button in the resulting dialog to confirm the changes.
  4. When prompted, close the dialog that reports that all changes were applied.  If you wish, review the details of the updates before closing the dialog.


Install Mediawiki by following these steps:

Step 1

From the Applications/Accessories menu, select Root Terminal to open a command prompt with administrative privileges.  When prompted, enter the administrative password.

Step 2

Update the list of package sources by typing

apt-get update

Step 3

Install the MySQL database management system by typing

apt-get install mysql-client mysql-common mysql-server

Step 4

When prompted, set a password for the "root" MySQL user.

Step 5

After the MySQL installation completes, open a MySQL command prompt by typing the following (entering the root password when prompted)

mysql -u root -p

Step 6

Create a new MySQL user named "mediawiki" and a password of "mediawiki" by typing

create user 'mediawiki' identified by 'mediawiki';

Step 7

Create a new database named "mediawiki" and grant the new "mediawiki" user rights by typing

create database mediawiki;
grant index, create, select, insert, update, delete, alter, lock tables on mediawiki.* to 'mediawiki'@'localhost' identified by 'mediawiki';
flush privileges;

Step 8

Type "exit" to leave the MySQL command prompt.

Step 9

Install the Apache HTTP server and PHP scripting language, along with all of the necessary add-ons, by typing

apt-get install apache2 libapache2-mod-php5 php5 php5-common php5-cli php5-mysql php5-mcrypt php5-curl php5-gd php5-intl php-pear

Step 10

Install Mediawiki and all extensions by typing

apt-get install mediawiki mediawiki-extensions

This may take a while (30-60 minutes).

Step 11

Open the php.ini file in the gedit text editor by typing

gedit /etc/php5/apache2/php.ini

Make the following modifications…

max_execution_time = 300
max_input_time = 120
memory_limit = 20M

as well as these additions…

When done, close gedit.

Step 12

By default, Mediawiki is installed in /var/lib/mediawiki, rather than the default apache root folder.  Use gedit to open the 000-default configuration file by typing

gedit /etc/apache2/sites-enabled/000-default

Look for "DocumentRoot /var/www" and replace it with "DocumentRoot /var/lib/mediawiki".  Similarly, replace "<Directory /var/www/>" with "<Directory /var/lib/mediawiki/>".  When done, close gedit.

Step 13

Restart the apache service by typing

service apache2 restart

Step 14

Start a web browser and navigate to http://localhost/config

Step 15

Review the page and verify that the environment checks out and that you have been given the message "You can install MediaWiki".

Step 16

Fill out all required information.  For testing in a VM, you probably want to turn off all "E-mail" features.  Use the MySQL database information specified when installing MySQL to complete the "Database config" section.

Step 17

When done, click the "Install MediaWiki!" button.

Step 18

If everything completes successfully, you should see the message "Installation successful!", as well as directions for moving/installing the LocalSettings.php file.

Step 19

Move the LocalSettings.php file as directed.  (For example, move /var/lib/mediawiki/config/LocalSettings.php to /etc/mediawiki.)

Step 20

Navigate to http://localhost/index.php to see the home page of your newly installed Mediawiki instance.



After completing the installation, I had a problem logging in using the administrative account.  I do not know if I made a mistake entering the password during the installation, or if there was a problem with the installation process itself.  So, just in case it is needed, here are the steps to reset the Mediawiki administrator password:

Step 1

Open a Debian command prompt with administrative rights.

Step 2

Navigate to the maintenance folder of the mediawiki installation (i.e.  /var/lib/mediawiki/maintenance).

Step 3

Change the password by typing

php changePassword.php --user=<adminusername> --password=<adminpassword>


If you have installed Mediawiki in a virtual machine, and you want to access it from the host machine, follow these steps:

Step 1

With the virtual machine turned off, use the VirtualBox Manager to open the Settings dialog for the virtual machine.

Step 2

Navigate to the "Network" tab.

Step 3

Adapter 1 should be enabled and attached to "NAT".  Leave this adapter active, as it will allow the virtual machine to access the Internet via the host machine’s network connection.

Step 4

Go to Adapter 2, check the "Enable network adapter" box, and set the value of the "Attached To" dropdown to "Host-only Adapter".  This adapter will allow the virtual machine to be accessed from the host.

Step 5

Start the virtual machine.

Step 6

Hover over the network icon on the status bar and note the IP address assigned to the host-only adapter.

Step 7

From a browser on the host machine, navigate to that IP address.


Similarly, to enable administration of the Mediawiki MySQL database from the host machine, do the following:

Step 1

Within the virtual machine, open a command prompt with administrative rights.

Step 2

Use gedit to open the file /etc/mysql/my.cnf and comment out the line

bind-address = 127.0.0.1

This allows MySQL to accept connections from something other than the local machine.

Step 3

Restart the MySQL server by typing

service mysql restart

Step 4

Open a MySQL command prompt by typing the following (entering the root password when prompted)

mysql -u root -p

Step 5

Enable remote root access to MySQL by typing the following (substituting a real password for 'password')

grant all privileges on *.* to 'root'@'%' identified by 'password';

Installing Ruby on Rails on Windows

Following is a record of the steps that I followed to install Ruby on Rails, as well as a few Ruby add-on packages, on Windows 7.  While this is somewhat of a rough draft (and therefore may not be entirely repeatable), it should at least provide some guidance.

Helpful resources:
    The official Ruby on Rails web site
    RubyInstaller – "The easy way to install Ruby on Windows"
    Instructions for installing the Ruby Development Kit

Installation of Ruby on Rails

Step 1) Go to the RubyInstaller web site, which bills itself as "The easy way to install Ruby on Windows".  The RubyInstaller project provides a Windows installer that includes Ruby, a baseline set of RubyGems, and the full text of "The Book of Ruby".

Step 2) Navigate to the RubyInstaller downloads page, and review the information found there.

Step 3) Determine the appropriate version of Ruby to install.  In my case, I needed to install Ruby in order to evaluate a Ruby on Rails application.  That application used an older version of Rails, so the best version of Ruby was the 1.8.7 release.  The download page can help you determine what version to use.  Otherwise, use your favorite search engine to do the research needed to identify the best version for you.

Step 4) Download and run the RubyInstaller package.  You can choose the folder in which Ruby gets installed.  (For the rest of this tutorial, assume that Ruby is installed in C:\Ruby.)

Step 5) Optional, but recommended.  Download the appropriate Development Kit for the version of Ruby that you selected, and follow the RubyInstaller project's instructions to install it.  You may not need this, but it seems likely that you will.  I tried adding some gems required by the application I was evaluating, and the first thing that happened was an error because "installed build tools" were required… and one way to take care of that error is to install the Development Kit.

     Here are the detailed steps for installing the Development Kit.

     a) Download the Development Kit

     b) Extract it to a permanent location (say C:\RubyDK)

     c) In a command prompt, navigate to the root folder for the Development Kit (for example, C:\RubyDK)

     d) Execute "C:\Ruby\bin\ruby dk.rb init"

     e) Execute "C:\Ruby\bin\ruby dk.rb install"

Step 6) Navigate to C:\Ruby\bin, and install Rails by typing "gem install rails".  To install a particular version of Rails, include the version number with the -v option.  For example, "gem install rails -v 2.3.5".

At this point, you should have everything you need to get started.  To actually do something with Ruby on Rails, continue with the next steps.

Step 7) Create a new Rails application by typing "rails new <application path>"

Step 8) Start the new application by navigating to the application root and typing "rails server"

Step 9) Confirm that the application is running by using a browser to navigate to http://localhost:3000

Adding ImageMagick

ImageMagick is a popular open-source tool for performing operations with image files.  I found it to be a challenge to get it configured to work with Ruby on Windows.

To install/configure ImageMagick with Ruby, the gem (Ruby add-on) that needs to be installed is RMagick.  The process that I followed to install RMagick is summarized below.

In brief, the steps are:

1) Install Ghostscript

2) Install ImageMagick

3) Open a new command window and install the RMagick gem.   Use a command something like:

gem install rmagick --platform=ruby -- --with-opt-lib=C:/ImageMagick-6.7.6-Q16/lib --with-opt-include=C:/ImageMagick-6.7.6-Q16/include

While this worked, it required an older version of ImageMagick.  I used ImageMagick 6.5.6 (found in the rmagick-win32 gem package).

I was working with Ruby 1.8.7 and RMagick 1.8.24.  If you are using a newer version of Ruby and/or the RMagick gem, it may be possible to also use a newer version of ImageMagick.

A Note About Deployment

I found a lot of information on the web that suggests that deploying Ruby on Rails under Windows is painful, at best.  One article suggests an easy way to configure Apache for Ruby… I did not try this method, so I cannot confirm its contents.

St. Louis Day of .NET 2013

This post is long overdue, as the 2013 Day of .NET took place almost two months ago.  I set aside my notes while I waited for presenters to post their session materials online… and then I forgot about it.  So, without further ado, here are my notes from the event:


Session: Entity Framework in the Enterprise

Session Materials:

Getting Started with Entity Framework (EF6 and MVC5)
(EF5 and MVC4)

SQL Server Data Tools 
     Use LocalDB 
     Allows for loading of test data 
     Allows for data to be "reset" to a known state 
     Remember to check the "Target Connection String" in the DB project properties dialog

Entity Framework Power Tools v.4 (Beta)
     Provides reverse engineering of databases into code-first classes, using the Fluent API

Unit Testing
     Entity Framework 6 has support for mocking frameworks
     Allows you to create your own test doubles
     It is recommended to test against a "real" DB for Last Mile test and performance tests

Audit Tracking
     SQL Server Change Data Capture
          Available in SQL Server 2008 and beyond (Enterprise Editions only)
          Uses change tables that mirror structure of tables being tracked
          Populates the change tables by analyzing the transaction log (not via triggers)
     If using EF natively
          Override the "SaveChanges" methods
          Loop through the contents of the "ChangeTracker" collection (saving the details along the way)

Performance Tracking
     Entity Framework 6 includes/allows logging of SQL statements and execution times
     Other useful tools include NLog and Glimpse

Session:  Introduction to MongoDB


     6th most popular database in the world, just behind PostgreSQL and DB2
     There are drivers for many languages, as well as a LINQ provider.
     Data stored as BSON (binary JSON)
     Everything is case-sensitive

     Speed – basic queries are much faster than SQL DBs
     Rich Dynamic Queries – not as limited as other NoSQL DBs
     Easy Replication and Failover
     Automatic Sharding

     No transactions
     No joins
     RAM intensive
     No referential integrity
     "Eventual consistency" – periods of inconsistency usually measured in milliseconds

     MongoDB shell (command line)
     Various GUI tools

     Can query by regular expression
     Can return entire records or specific fields

Object IDs
     Object IDs (auto-generated unique IDs) contain timestamp of record creation.
     Timestamps contained in Object IDs can be retrieved.
     Can define your own IDs, which is useful for sharding
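The timestamp retrieval mentioned above is straightforward: a MongoDB ObjectId is 12 bytes rendered as 24 hex characters, and the first four bytes are the creation time in seconds since the Unix epoch. A sketch (the ObjectId value below is made up):

```typescript
// Extract the creation time embedded in a MongoDB ObjectId.
// The first 4 bytes (8 hex characters) are seconds since the Unix epoch.
function objectIdTimestamp(objectId: string): Date {
  const seconds = parseInt(objectId.substring(0, 8), 16);
  return new Date(seconds * 1000);
}

// Hypothetical ObjectId: 0x529cc0c0 = 1386004672 seconds since the epoch.
const created = objectIdTimestamp("529cc0c0e1f0f1b1c0000001");
console.log(created.toISOString()); // 2013-12-02T17:17:52.000Z
```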

     Can index pretty much any part (or parts) of a record, up to and including the entire record

     If the primary fails, a secondary is auto-elected as the new primary

Session:  Modern Web Diagnostics with A Glimpse into ASP.NET


     Installed via NuGet
     New versions are released approximately every two weeks

     Gives insight into ASP.NET, WebForms, and others
     Gives diagnostics on networks, databases, page lifecycle, viewstate, and more
     Can trace individual users
     Can be enabled/disabled in various ways (cookies, roles, etc)
     Keeps the history of the last 50 requests, so recent requests can be examined after they occur

Platform Support
     Cross browser (latest versions of browsers supported) and cross platform
     Support exists for tracing NHibernate, Entity Framework, MVC, WebForms
     WebAPI support is on the way (not there now)

Session:  Parallelism in .NET

Session Materials:

     More threads means more memory usage and more context switching
     Developers need to find the appropriate balance between the # of threads and resource usage
     Available since .NET 1.0

     Similar to database connection pooling
     Resources are managed much better
     Available since .NET 1.0

Parallel Linq (PLINQ)
     Example: from r in object.AsParallel() select r
     When using this, you must watch out for shared resources, and lock them correctly

Parallel Library
     Provides the parallel For, ForEach, and Invoke methods
     Allows processing to be stopped via the "ParallelLoopState" delegate

Tasks (TPL => Task Parallel Library)
     The most complex option to use, but also the most flexible
     Can be used "as needed"; they are not bound to the loop processing of the Parallel Library
     Allow parallel processes to be stopped
     Necessary for the use of Await/Async

     See the slide deck for the details of how "await" works
     Async methods must return Task or Task<T>
     Await can always be used on a Task, whether it is "async" or not
     When calling an async method, always await it (best practice)

Debugging Support
     When a breakpoint is hit, all running tasks stop
     Several parallel debugging windows are available under the Visual Studio "Debug" menu
          Tasks – shows all running tasks; click a task to go to the currently executing statement
          Parallel Stacks – visual display of running tasks and the call stack; click a task to see the current statement
          Parallel Watch – allows watching a variable in a particular task

Session:  A Deeper Dive Into Xamarin.Android

Presenter:
Session Materials:

     Xamarin Studio (native) – not free
     Xamarin Studio plug-in for Visual Studio – not free

Recommended components for easing cross-platform development:
     Xamarin.Mobile – abstracts your code for location/photos/contacts across all platforms
     Xamarin.Social – similar to the Mobile component, only for social services
     Xamarin.Auth – makes OAuth easier to use

Components for Android
     Google Play Services
     Backward compatibility component (for supporting older versions of Android)


Notes about developing for Android
     Turn on Hardware Acceleration in the application manifest
     Activity (app) lifecycle events reminiscent of ASP.NET page lifecycle events (or Windows 8 app events)
     Lots of XML involved in app creation
     "Layouts" are used to create app UIs.  Reminiscent of XAML.

     Android SDK is more robust and complicated than iOS
     Not as prescriptive in UI/design
     Device fragmentation is a challenge
     Emulators are poor; use a real device for testing
     Platform is more innovative than iOS, but not as polished


Session:  All You Ever Wanted to Know About Hadoop

Presenter: Matt Winkler

Written in Java (runs on the JVM)

Installation Options
     Single computer
          HDInsight (Microsoft’s implementation) can be installed from Web Platform Installer
          Various installation packages
          Azure – multiple nodes running HDInsight can be easily provisioned
          Amazon Cloud Services

MapReduce is the tool for querying data with Hadoop
     White Paper: Data-Intensive Text Processing with MapReduce (
     MapReduce can be thought of as the assembly language for Hadoop.
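To give a feel for the programming model, here is a toy word count, the canonical MapReduce example (illustrative only; real Hadoop jobs are written in Java and run distributed):

```typescript
// Toy MapReduce word count. Map emits (word, 1) pairs;
// the shuffle groups pairs by key; reduce sums each group.
function map(line: string): Array<[string, number]> {
  return line.toLowerCase().split(/\W+/).filter(Boolean)
    .map(w => [w, 1] as [string, number]);
}

function shuffle(pairs: Array<[string, number]>): Map<string, number[]> {
  const groups = new Map<string, number[]>();
  for (const [word, count] of pairs) {
    groups.set(word, [...(groups.get(word) ?? []), count]);
  }
  return groups;
}

function reduce(groups: Map<string, number[]>): Map<string, number> {
  const totals = new Map<string, number>();
  for (const [word, counts] of groups) {
    totals.set(word, counts.reduce((a, b) => a + b, 0));
  }
  return totals;
}

const lines = ["Hadoop runs on the JVM", "Hive compiles to MapReduce", "the JVM"];
const counts = reduce(shuffle(lines.flatMap(map)));
console.log(counts.get("the")); // 2
console.log(counts.get("jvm")); // 2
```

Tools like Hive and Pig generate this map/shuffle/reduce plumbing from higher-level queries, which is why MapReduce is called the "assembly language" of Hadoop.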

Extensions to MapReduce
     Most of these compile down to MapReduce packages

     Hive – SQL-like query language
     Pig – another query platform
     SCALDING – Scala-based query library.  The syntax is LINQ-like.
Other Tools
     SQOOP – used for loading traditional RDBMS data into Hadoop
     STORM – tool for complex event processing
     OOZIE – Workflow Management for Hadoop

Session:  Building A REST API With Node.js and MongoDB

Session Materials:

Useful Node.js packages (similar to NuGet packages in .NET)
     Restify – adds REST capabilities
     Toaster – UI functionality
     Moment – date handling
     MongoDB – MongoDB client tools

WebStorm from JetBrains is a recommended Javascript editor ($49 individual developer license).  A *free* online course on MongoDB is available from MongoDB's education site.

Session:  Starting with Code-First Entity Framework

Session materials:

Create a class that inherits from DbContext… within that class, define the tables to create

Create classes to represent each table

Database is created automatically the first time that it is accessed

Handling DB changes
     1) Update the code
     2) Via attributes, databases can be set to drop/create always, drop/create only when the model changes
     3) Database migrations are another option

Database Migrations
     Package Manager Console can be used to generate classes to handle migrations. 
     Alternately, create a Configuration class in a Migrations folder
     Use the Configuration class with the MigrateDatabaseToLatestVersion class in the SetInitializer method of the Database object.
     Or, if you choose not to trust the auto-migration, generate a TSQL script to perform the migration.
     TSQL scripts can be generated from the Package Manager Console

ExpressProfiler is a simple SQL profiler… find it on CodePlex.

Session:  Introduction to Knockout.js

Session Materials:

What is it?

     JS library for dynamic web-based UI’s
     Applies MVVM to automate data binding


     Declarative bindings
     Dependency Tracking
     Automatic UI Refresh
     Dependency Injection

MVVM (Model-View-ViewModel) Pattern

     Combination of the MVC/MVP patterns
     View – UI and UI Logic, talks with ViewModel and receives notifications from ViewModel
     ViewModel – Presentation Logic, talks with View (data binding and commands (bi-directional), notifications [to View]) and Model (bi-directional)
     Model – Business Logic and Data, talks with ViewModel

Data Binding


     Knockout.js implements the ViewModel

var myViewModel = function() {

      var data = [ { productid: 1, productname: "shoe", productprice: 1.99 } ];   // an array of products

      this.property = ko.observable("value");

      this.products = ko.observableArray(data);

      this.handler = function(data, event) {};
};

ko.applyBindings(new myViewModel());

     "ko" is the global identifier for Knockout


Attributes of HTML elements are bound to the ViewModel properties (also CSS and conditional logic like "foreach" and "if")

     <input data-bind="value: property" />

     <button data-bind="click: handler"></button>

     <tbody data-bind="foreach: products">

          <tr><td><input data-bind="value: productid"></td></tr>


Individual elements can be bound to more than one property (example: "text" bound to one thing, "visible" bound to another)

Session:  Real World Azure – How We Use Azure at Swank HealthCare

Presenter: Brad Tutterow

SQL in an Azure VM vs SQL Azure Database
     VM option does place your database in the cloud
     VM option still requires you do to your own backups/restores/server maintenance
     VM option does not provide for scalability of a "true" cloud DB

     SQL Azure DB is Microsoft’s preference

DB Changes that were needed for SQL Azure Database
     Remove Cross-DB triggers
     Remove file groups in CREATE scripts
     Account for cloud-based SQL being a limited subset of full SQL
          Example: No "USE" statement, so scripts may need update
     Modify backup strategy (no traditional Backup/Restore in the cloud)

Always run two of every Web Role
     Roles are frequently recycled by Azure
     If only one Role exists, your site is down when the Role is recycled
     If two Roles exist, Azure will switch between the two as needed, and not recycle both at the same time

Deployment best practices
     Determine which application settings (web.config) need to be changed at runtime
          Move those settings to Azure settings
          Everything else can stay in the web.config
          Changing web.config in production doesn’t "stick"… Role recycle will wipe the changes
     Create deployment packages
          Role recycle will wipe updates if not deployed via a package
     Make no assumptions about what is available on the server
          You must deploy everything your app needs (all NuGet packages, etc)
          Role recycle produces fresh copy of Windows

Database updates handled via EF Code-First Database Migrations
     Question: How would updates be handled without Code-First, or with some other ORM?

Pain points
     Local Azure emulator is unreliable and inconsistent
     No effective way to do QA on-premise (means more cost for a QA environment in Azure)
     Learning curve (not too bad)
     Azure SDK versioning (keeping everything in sync… updates are quarterly)
     EF migrations and Azure SQL (scripts don’t always work in Azure; need to be edited)

Good things
     Uptime and reliability
     Buy-in from sales/operations/infrastructure
     Enforced best practices for design and deployment
     Pristine/clean production environments
     QA/Prod environments are identical
     No IIS or Windows OS management
     Easy deployments


  • Investigate SQL Server Change Data Capture as a replacement for auditing with triggers.
  • Check out DurandalJS (mentioned in several sessions)
  • Check out Twitter Bootstrap
  • Check out LESS
  • Check out Glimpse
  • Think about what could be done with a large OCR corpus and Hadoop 

A Misconception about OAI-PMH Metadata Formats

OAI-PMH stands for Open Archives Initiative Protocol for Metadata Harvesting. In brief, OAI-PMH is a protocol that enables publishing and harvesting of metadata about objects (such as PDFs or JPGs) in an archive.  In my experience, I have found that one aspect of this protocol is often misunderstood.


OAI-PMH defines six different commands (called verbs). Three of these commands are used to publish 1) information about the archive itself, 2) the metadata formats in which data is published, and 3) the sets of data available in the archive. The remaining three commands are used for the actual publishing/harvesting of data: one command publishes a list of record identifiers, one publishes a list of metadata records, and one publishes a single metadata record. By supplying the appropriate arguments to the commands, it is possible to track changes to an archive since a certain date or between two dates.
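For reference, the six verbs are Identify, ListMetadataFormats, ListSets, ListIdentifiers, ListRecords, and GetRecord, and each request is an ordinary HTTP GET with the verb and its arguments passed as query-string parameters. A sketch of building such a request (the endpoint URL is hypothetical):

```typescript
// Build an OAI-PMH request URL: the verb and its arguments are
// ordinary query-string parameters on the repository endpoint.
function oaiRequest(endpoint: string, verb: string, args: Record<string, string> = {}): string {
  const params = new URLSearchParams({ verb, ...args });
  return `${endpoint}?${params}`;
}

// Harvest Dublin Core records changed since a given date
// (the endpoint is hypothetical).
console.log(oaiRequest("https://archive.example.org/oai", "ListRecords", {
  metadataPrefix: "oai_dc",
  from: "2014-01-01",
}));
// https://archive.example.org/oai?verb=ListRecords&metadataPrefix=oai_dc&from=2014-01-01
```

The "metadataPrefix" argument is where the format choice discussed below comes in: oai_dc requests Dublin Core, while richer formats use other prefixes advertised by ListMetadataFormats.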

Much more information about OAI-PMH can be found on the Open Archives Initiative web site.

The Misconception

Having worked with OAI-PMH quite a bit the last few years, one thing that has consistently surprised me is a significant misconception about the protocol. Namely, more than once I have encountered the belief that the OAI-PMH protocol supports only the Dublin Core metadata format. This is simply not true.

It IS true that the OAI-PMH specification requires metadata to be published in Dublin Core, but it does NOT dictate that metadata be published ONLY in the Dublin Core format.

Specifically, the OAI-PMH specification states "…OAI-PMH supports items with multiple manifestations (formats) of metadata. At a minimum, repositories must be able to return records with metadata expressed in the Dublin Core format, without any qualification. Optionally, a repository may also disseminate other formats of metadata."

The Reality

If one browses the web and examines the various archives publishing with the OAI-PMH protocol, it quickly becomes clear that many archives publish in Dublin Core and also in one or more additional metadata formats. In almost every case, these additional formats provide much richer metadata than is possible with Dublin Core (formats like RDF, METS, and MODS are not uncommon). Here are some examples:

Rice University Digital Scholarship – Dublin Core, RDF, METS, two others
Gateway to Oklahoma History – Dublin Core, RDF, one other
Smithsonian Digital Repository – Dublin Core, Qualified Dublin Core, MODS
Pensoft Publishers – Dublin Core, MODS
PubMed Central – Dublin Core, two formats derived from NLM DTDs created for journal metadata exchange

This, I think, is the point… Dublin Core exists as the lowest common denominator for metadata exchange with OAI-PMH, but most archives should (and do) provide something richer.


The support for multiple metadata formats is the distinction that some potential users and adopters of OAI-PMH miss. I have seen archives proclaim that they support OAI-PMH, only to find that they support only the bare minimum (i.e. only Dublin Core). To them, I say "that’s nice, but given the limitations of Dublin Core, please add support for a richer metadata format". And, I have seen users dismiss OAI-PMH out of hand, complaining about its limited usefulness due to mangled Dublin Core metadata. To them, I say "look closer, and if the archives with which you are working truly only support Dublin Core, demand more".

OAI-PMH metadata formats… more than just Dublin Core.