Generate a Flat File Web Site using RazorEngine and RazorMachine

Razor and ASP.Net

As a normal person you’d probably be happy with how Razor template files are used within MVC; there’s a nice convention for where they live (most likely a Views folder within your project) and you refer to them either by name or sometimes just by convention. What’s that? You have an ActionResult method called “Index”? Then MVC will go and fetch the “Index” view from the folders it expects the cshtml files to live in for you.
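As a rough sketch of that convention (the controller and view names here are hypothetical, assuming a standard MVC project layout):

[csharp]// HomeController.cs (hypothetical)
public class HomeController : Controller
{
    public ActionResult Index()
    {
        // returning View() with no name tells MVC to go looking for
        // Views/Home/Index.cshtml (falling back to Views/Shared/Index.cshtml)
        return View();
    }
}
[/csharp]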

The way this works is fantastic; development can steam ahead without the pain and confusion of all the ways you could get it wrong with WebForms and .aspx files.

Of course, the MS implementation of an MVC framework is in itself a wonderful thing, all but enforcing the separation of concerns that is so easy to ignore in WebForms.

Razor outside of ASP.Net

But what about when you want to dynamically generate html without a process being hosted as a website? One big use case for this is email generation; sure, you could host an MVC web API and have the content generation process constantly call it, but that seems a little inefficient.

RazorEngine and RazorMachine

There are a few solutions to this; you can actually hand roll your own (I might get onto that in a future post) or you can try out some reasonably well known open source solutions like RazorEngine:

A templating engine built on Microsoft’s Razor parsing engine, RazorEngine allows you to use Razor syntax to build dynamic templates:

[csharp]string template = "Hello @Model.Name, welcome to RazorEngine!";
string result = Razor.Parse(template, new { Name = "World" });
[/csharp]

and RazorMachine:

RazorMachine is a robust and easy to use .Net Razor v2/v3 template engine. The master branch uses Razor v3. This implementation supports layouts (masterpages) and a _viewStart construct, just like MVC does. RazorMachine works independently from MVC; it only needs the System.Web.Razor reference. It almost works exactly like ASP.Net MVC

[csharp]var rm = new RazorMachine();
var result =
rm.Execute("Hello @Model.FirstName @Model.LastName", new {FirstName="John", LastName="Smith"});
[/csharp]

There’s a short stackoverflow answer comparing them (and RazorTemplates, another similar OSS solution) too.

Getting stuck in

Create a new project and use some nuget awesomeness

[powershell]Install-Package RazorEngine
Install-Package RazorMachine[/powershell]

Then, if you don’t have them already, add references to:

[csharp]System.Web.Helpers // for Json.Decode
Microsoft.CSharp    // for dynamic types
[/csharp]

If you want to debug this functionality via a console app running from Visual Studio, you may need to uncheck “Enable the Visual Studio hosting process” under Project -> Properties -> Debug.

If you want to run this outside of Visual Studio, you can just run the compiled exe (bin/debug) as admin.

If you’re using a test runner then you might be fine as-is. I can’t actually remember the issue I was having, as I now can’t recreate it, but I think it was around using dynamic models and Json decoding.

RazorEngine

This is the core bit of functionality for a basic use case for RazorEngine:

[csharp]var model = Json.Decode("{\"Description\":\"Hello World\"}");
var template = "<div class=\"helloworld\">@Model.Description</div>";
const string layout = "<html><body>@RenderBody()</body></html>";

template = string.Format("{0}{1}", "@{Layout=\"_layout\";}", template);

using (var service = new TemplateService())
{
    service.GetTemplate(layout, null, "_layout");
    service.GetTemplate(template, model, "template");

    var result = service.Parse(template, model, null, "page");

    Console.Write(result);
    Console.ReadKey();
}
[/csharp]

Your output should be:

[html]<html><body><div class="helloworld">Hello World</div></body></html>
[/html]

Pretty easy, right?

RazorMachine

Here’s the equivalent in RazorMachine:

[csharp]var model = Json.Decode("{\"Description\":\"Hello World\"}");
var template = "<div class=\"helloworld\">@Model.Description</div>";
const string layout = "<html><body>@RenderBody()</body></html>";

var rm = new RazorMachine();
rm.RegisterTemplate("~/shared/_layout.cshtml", layout);

var renderedContent = rm.ExecuteContent(
    string.Format("{0}{1}", "@{Layout=\"_layout\";}", template), model);
var result = renderedContent.Result;

Console.Write(result);
Console.ReadKey();
[/csharp]

Again, same output:

[html]<html><body><div class="helloworld">Hello World</div></body></html>
[/html]

Notice that in both cases you have to pretend that a layout file exists somewhere; in RazorEngine you give it a name:

[csharp]template = string.Format("{0}{1}", "@{Layout=\"_layout\";}", template);[/csharp]

then refer to that name when adding the template:

[csharp]service.GetTemplate(layout, null, "_layout");[/csharp]

In RazorMachine you register the template as a dummy virtual file:

[csharp]rm.RegisterTemplate("~/shared/_layout.cshtml", layout);[/csharp]

then refer back to it as you would normally do within ASP.Net MVC when executing the content:

[csharp]var renderedContent =
rm.ExecuteContent(string.Format("{0}{1}", "@{Layout=\"_layout\";}", template), model);[/csharp]

Differences

I’ve found it easier to process sub-templates (such as @Include) with RazorEngine, as I just recursively scan a file for that keyword and add the corresponding template to the service; look at the ProcessContent and ProcessSubContent methods below:

[csharp]public class RenderHtmlPage
{
    private readonly IContentRepository _contentRepository;
    private readonly IDataRepository _dataRepository;

    public RenderHtmlPage(IContentRepository contentRepository,
        IDataRepository dataRepository)
    {
        _contentRepository = contentRepository;
        _dataRepository = dataRepository;
    }

    public string BuildContentResult(string page, string id)
    {
        using (var service = new TemplateService())
        {
            // get the top level razor template, e.g. "product"
            // equivalent of "product.cshtml"
            var content = GetContent(page);
            var data = GetData(id);

            ProcessContent(content, service, data);
            var result = service.Parse(content, data, null, page);

            return result;
        }
    }

    private void ProcessContent(string content,
        TemplateService service,
        dynamic model)
    {
        // does the string passed in reference a Layout at the start?
        // (note the underscore in the character class, so names like "_layout" match)
        const string layoutPattern = @"@\{Layout = ""([a-zA-Z_]+)"";\}";

        // does the string passed in reference an Include anywhere?
        const string includePattern = @"@Include\(""([a-zA-Z_]+)""\)";

        // recursively process the Layout
        foreach (Match match in Regex.Matches(content, layoutPattern,
            RegexOptions.IgnoreCase))
        {
            ProcessSubContent(service, match, model);
        }

        // recursively process the @Includes
        foreach (Match match in Regex.Matches(content, includePattern,
            RegexOptions.IgnoreCase))
        {
            ProcessSubContent(service, match, model);
        }
    }

    private void ProcessSubContent(TemplateService service,
        Match match,
        dynamic model)
    {
        var subName = match.Groups[1].Value; // got an include/layout match?
        var subContent = GetContent(subName); // go get that template then
        ProcessContent(subContent, service, model); // recursively process it

        service.GetTemplate(subContent, model, subName); // add it to the service
    }

    private string GetContent(string templateToLoad)
    {
        // hit the filesystem, db, API, etc to retrieve the razor template
        return _contentRepository.GetContent(templateToLoad);
    }

    private dynamic GetData(string dataToLoad)
    {
        // hit the filesystem, db, API, etc to return some Json data as the model
        return Json.Decode(_dataRepository.GetData(dataToLoad));
    }
}
[/csharp]

Why is this useful?

I’m not going to go into the details of either RazorMachine or RazorEngine; there’s plenty of documentation up on their respective websites already. I’ve used @Includes in the examples above due to its simplicity; the libraries have differing support for things like @Html.Partial and also can be extended.

Unfortunately, the html helpers (like @Html.Partial) need an HttpContext and have to run inside ASP.Net MVC, which is what I’m trying to avoid for now.

If you pull down my initial teeny solution from github and look at the tests, you’ll notice the content of the template, layout, and model are either strings or come from the filesystem; they’re not tied to the structure of the project or the files within it.

This means we can deploy a rendering process that returns rendered html based on strings being passed to it. Let’s play with this concept a bit more.

Flat File Web Page Generation

Say you wanted to “host” a website directly within a CDN/cache, thus avoiding the hosting via the normal route of servers and related infrastructure. Sure, writing flat html in a text editor is a solution, but what if you wanted to still be able to structure your pages into common modules, write C# code to manage the logic to dynamically combine them, and use Razor syntax and views for defining the front end?

This next section plays on this concept a bit more; we’ll write a small app that accesses a couple of directories – one for Razor files, one for data files – and generates a flat website into a third directory.

I will then expand on this concept over a series of posts, to make something more realistic and potentially useful.

Command Line & FileSystem FTW

I’ve created another repo up on github for this section, but cutting to the chase, here are the guts of the demo console app:

[csharp]const string workingRoot = "../../files";

IContentRepository content =
    new FileSystemContentRepository(workingRoot + "/content");

IDataRepository data =
    new FileSystemDataRepository(workingRoot + "/data");

IUploader uploader =
    new FileSystemUploader(workingRoot + "/output");

var productIds = new[] { "1", "2", "3", "4", "5" };
var renderer = new RenderHtmlPage(content, data);

foreach (var productId in productIds)
{
    var result = renderer.BuildContentResult("product", productId);
    uploader.SaveContentToLocation(result, productId);
}
[/csharp]

The various FileSystemXX implementations either just read or write files from/to the file system. Natch.
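As an idea of how simple these can be, here’s a sketch of what a FileSystemContentRepository might look like (the real implementations are in the repo; the details here are assumed for illustration):

[csharp]public class FileSystemContentRepository : IContentRepository
{
    private readonly string _rootDirectory;

    public FileSystemContentRepository(string rootDirectory)
    {
        _rootDirectory = rootDirectory;
    }

    // e.g. GetContent("product") reads "<root>/product.cshtml" from disc
    public string GetContent(string templateToLoad)
    {
        var path = Path.Combine(_rootDirectory, templateToLoad + ".cshtml");
        return File.ReadAllText(path);
    }
}
[/csharp]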

So what we’ve got here is the RazorEngine implementation I pasted in above, wrapped in a RenderHtmlPage class and called for a number of “productIds”; these happen to exist as json files on disc, e.g. “1.json”.

Each file is being combined with whatever Razor templates are listed in the product cshtml file and its referenced @Includes. The resulting html is then saved back to the file system.
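To give an idea of the shape of the inputs, a hypothetical “product.cshtml” and “1.json” might look something like this (the real files are in the repo; these are assumed for illustration):

[csharp]@{Layout = "_layout";}
<h1>@Model.Name</h1>
@Include("description")
[/csharp]

[csharp]{"Name":"Widget","Description":"A lovely widget"}
[/csharp]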

So with these views in files/content:
[screenshot: razorengine-flat-file-website-views]

And these json files in files/data:
[screenshot: razorengine-flat-file-website-jsondata]

We get these html files generated in files/output:
[screenshot: razorengine-flat-file-website-htmloutput]

Hopefully you can see where this is leading; we can keep Views in one place, get the model data from somewhere else, and have the extremely generic rendering logic in another place.

The Theory

With this initial version we could take an existing ASP.Net MVC website (assuming it didn’t use any html helpers in the views) and process it offline with a known dataset to create a read-only version of the website, ready to serve from a filesystem.

Next Up

I’ll take this concept and run with it across various implementations, gradually ending up on something that might even be useful!

Programming-MuddyFunster!

I recently saw the website programming-m*therf*cker.com (not linking to it as people’s nsfw alerts might go mental with this post) and replied with the comment below. Have a look at the site and the values it extols, then let me know if I’m being fair!

Man, that site annoys me. So long as he’s talking about programming as a hobby, then that’s fine. But all of those things he’s against are nothing to do with managers, and everything to do with devs making high quality software and being professional.

The site also makes me happy, in that if there are people with that same attitude out there putting software into production systems then I’ll always be able to find gainful employment fixing their stuff…

Belated Year in Review: 2013 (and the start of 2014!)

This has been something I’ve wanted to write for a while, but as per usual haven’t really found the time to get stuck in.

As such, here goes: a review of the highlights from my own 2013. I think it was an amazing year; so much happened, and given I’m an optimist I feel it was all incredible.

The Year of 101


I ended 2012 and started 2013 with a promise that I’d learn and blog about one new thing a month, calling it the Year of 101s. I didn’t call it a new year’s resolution, but it might as well have been called that since it lasted about as long as a resolution tends to!

I don’t want to call this a failure – although many would, no doubt – since it not only fired me up for writing blog posts again, but I did learn a bunch of new things over a few months.

My highlights included:

In January …


… getting really stuck into the basics of node.js, with 6 articles and one summary.

January also finally got me into using Github more, and into creating repos to accompany blog posts.

In February …


… I had a crack at monkeying around with making an app for my Smart TV, with a series of posts.

In March …


… I went off-piste. Having planned to cover the Raspberry Pi throughout March, I instead created a load of “Asides” on various subjects.

There were also a few actual posts on the Raspberry Pi, but they only got as far as setting up, a bit more setting up, and XBMC on the Pi, and they were spread across three months!

After that …


… I gave up on the whole new-thing-each-month idea and just tried to get on with learning stuff. Thanks to attending DevOpsDays in March (I think), and after deciding to create and consult a “Life Advisory Board” made up of friends and previous colleagues, I ended up setting myself up as a consultant.

My consulting plan was inspired by DevOpsDays and by reading The Phoenix Project; I wanted to learn enough about the core values of developing a devops mentality within a company that I could become an advisor, helping with value stream mapping, impact mapping, 5 whys, and other retrospective processes, as well as environment automation tooling.

As such, I got stuck into Vagrant and Chef with a short series of Chef for developers posts.

Then came the cloudiness …


… By August I was working on a contract (not related to devops at all, but ya gotta pay the bills, amirite?!) doing some fascinating Azure proofs of concept which I started sharing; the only concept I finished writing up was automated image resizing, but I’ve got two or three more stonking ones in the pipeline!

But, in November …


… I had the extreme honour of being invited to speak at Velocity Conf!!

This was SO exciting! I’d spent MONTHS with my cohort, Dean Hume, working on the presentation, practising it, getting what we felt was a good flow to it. The research process was fascinating to me as I love to learn, and the process of distilling this into a slide or two of easily presented information was very interesting.

We’d make sure to spend a morning a week out of our respective offices, planning the next round of changes over breakfast at Southbank, then practising the whole thing over beers and burgers at Bill’s; these practise sessions always had booze, were always great fun, and were also surprisingly productive!

We were even lucky enough to have a practise run of our session at the inaugural TECHinsight, which was a massive confidence boost; it was our first public presentation together, and my first one at all, ever! Amazing fun!

Then came the big deal: actually attending VConf. Wearing the speaker lanyard was fantastic, as I got to schmooze with the other speakers; obviously I was completely silent when faced with a table of Steve Souders, John Allspaw, Yoav Weiss, and Andy Davies.

It was all utterly incredible.

So what’s happening in 2014?

As amazing as 2013 was for me, 2014 is going to be equally exciting if not more so. Off the back of VConf, Dean and I are working through the process of getting a proposal accepted for a book we’re pitching to O’Reilly, and I’ll be attending the (enable smug mode) invite-only (disable smug mode) conference edgeconf in March.

I’m in my last week of my current Solution Architect contract at Asos, after which I have a two-week hardcore regime of Pluralsight courses to get back into being a developer, as well as several books (Lean Startup, Viral Loop, and Purple Cow), so that I can join an exciting new venture at a start-up with a few passionate devs and gurus to see if we can make something amazing in a few months (or before we all run out of money)!

I’m also taking up the challenge of being a trainer for a .Net training company; I’ll be running the odd course here and there for the foreseeable future on various subjects, and hopefully even writing some of the material for new courses if possible.

In Summary

Kicking off the 101s in late 2012/ early 2013 got me back into blogging, back into learning, back into developing and sharing.

Attending and presenting at TECHinsight and VelocityConf gave me new passions: collaborative writing, presenting, and teaching.

Leaving permanent employment and going solo gave me my freedom; I just took 3 weeks off over Xmas to spend time with my oldest daughter. Going to the museums, aquarium, cinema, and just hanging out together have been worth more than any contract day rate could offer me.

Not to mention I have a hundred things I want to blog about, and a hundred more proofs of concept to dig around in. It’s going to be a very busy and very exciting year…

Velocity Conference EU 2013

The 3 day conference of web performance, operations, and culture wrapped up recently. Having had the honour of presenting a session with my partner in crime Dean Hume, called Getting The LEAST Out Of Your Images, and wandering around with the green underscored “Speaker” lanyard, here’s a brief summary of the event and some of my personal highlights.

Keynotes

First up, all of the keynote videos are over on youtube; there were some really great keynotes, including several from various sections of the BBC. Highlights for me were Addy Osmani’s UnCSS demo, Ilya Grigorik’s BigQuery lightning demo, and the fantastic Code Club session from John Wards.

Presentations

There were a large number of sessions across three streams (web perf, mobile, and devops) covering all manner of topics, from extreme anomaly detection in a vast torrent of data through to optimising animation within a browser.

Some of the stand out sessions for me were:

Making sense of a quarter of a million metrics

Jon Cowie gave a brain melting 90 minute session taking us through how Etsy make sense of all of their monitoring data; given that they graph everything possible, this is no easy task.

Understanding the neurological impact of poor performance

Tammy Everts not only gave us an insight into the poor aeroplane connectivity where she lives, but also into how people react emotionally to a poorly performing website.

Rendering Performance Case Studies

Unfortunately this session clashed with the Etsy metrics one, but from what I heard it sounds like Addy Osmani had one of the best sessions at the whole conference.

High Performance Browser Networking

Another brain-melt session; Ilya gave an incredible insight into the complexities of fine tuning performance when taking into account what HTTP over TCP (and also over 3G) actually does.

Other Resources

All of the slide decks are here, all of the keynotes are here, and there’s even a free online version of Ilya Grigorik’s High Performance Browser Networking book.

Summary

I probably enjoyed the 90 minute tutorial session on Wednesday more than the rest of the conference, but the Thurs and Fri were really jam packed with excellent sessions and impressive keynotes.

I loved speaking there and will certainly be pitching for more such conferences next year!

#velocityconf notes part 3: network performance amazingness

An absolutely brain melting session from Ilya Grigorik, talking about the intricacies of TCP, HTTP (0.9, 1.1, 2.0), the speed of light, how the internet infrastructure works, how mobile network browsing works, how HTTP 1.1 doesn’t support current use cases, and, most fascinating for me, what mobile browsers actually do under the hood.

It’s amazing how an analytics beacon on a webpage or app can zap your entire battery in a matter of hours.

It’s going to take me a few days to decompress this information in my fuzzy brain, so why not check the slides yourself here: http://bit.ly/hpbn-talk

#velocityconf notes part 2

Yoav Weiss did a great session on responsive images and techniques; this scared me a little, as he covered a lot of content that could contradict the talk I’m doing later in the week!


He mentioned some great, pertinent points that I’ll refer back to.

Other things he talked about:

LQIP, which sounds like a reintroduction of the ancient lowsrc attribute.

He also really had a pop at Mobify and their image loading hack script, which he calls Bat-Shit-Loco-Insane™. Nice. As he explained how it works, it felt like the whole room facepalmed.

He also talked about compressive images. Luckily, nothing that completely ruins my stuff… whew. I’ll just have to update my notes a little bit.

Each session I go to gives me ideas for changes to make to our Friday session!

Velocity conf notes

First session over, and my brain is already straining from Jon Cowie’s talk on how Etsy manage to make sense of a quarter of a million metrics to understand anomalies in almost real time.

I also managed to have a full on geek moment when I rocked up to the speaker lounge and parked myself on the same table as Steve Souders and Yoav Weiss whilst they discussed CSS render times and blocking.

Plus I’m so damned happy to be wearing the green emblazoned “Speaker” lanyard!


So, I’m speaking at Velocity Conference EU!


Velocity EU Conference 2013

This week is an amazing one for web performance & operations and culture professionals; Monday & Tuesday is Devopsdays and Weds to Fri is Velocity Conference EU. If you’re concerned with web performance and the devops process, tooling, and culture (and if not, why the heck not?!) then get along (or get your company to get you a ticket) to one or even both events!

This coming Friday 13th November I have the pleasure of co-presenting a session called Getting The LEAST Out Of Your Images with my cohort, Dean Hume, at this year’s Velocity Conference EU!

Velocity Conference is three days of presentations, events, and discussions on Web Performance and Operations & Culture. It’s been going for several years already and sees such big names in the web perf field as John Allspaw, Steve Souders, Ilya Grigorik, Yoav Weiss, and Paul Lewis, as well as well-known faces from the Ops world.

I’ve already chosen most of the sessions I’ll be attending and I’m really looking forward to it.

If you’re attending and aren’t sure where to head on Friday afternoon, I recommend popping into the Palace Suite at 4.15pm to see some slick slides and almost as slick presenters (*ahem*) in our session:

Getting The LEAST Out Of Your Images

Come say hi if you spot me! Let me know what concerns you have with image optimisation on your (or your company’s) site (and buy me an espresso :P) and we’ll have a chat.

