Android browsers, keyboards, and media queries

If you have a site that's intended to be mobile friendly, is reasonably responsive (in a media-query stylee), and has user input areas, you may come across this issue on Android, due to how the keyboard interacts with the screen:

Nice text input area

portrait_text_entry

Oh noes! Screen gone wrong!

confused_orientation_text_entry

It turns out that on Android devices the keyboard doesn't overlay the page, but actually resizes the viewport; this can cause your media queries to kick in, especially if you use "orientation":

[css]
(orientation : landscape)
[/css]

[css]
(orientation : portrait)
[/css]

To fix this, try changing your orientation checks to aspect ratio checks instead:

swap
[css](orientation : landscape)[/css]
for
[css] (min-aspect-ratio: 13/9) [/css]

and
[css](orientation : portrait)[/css]
for
[css](max-aspect-ratio: 13/9)[/css]
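Put together, a minimal sketch of the resulting media query pair (13/9 is just a threshold ratio: a portrait viewport, even when squashed by the keyboard, should typically stay below it):

[css]
/* treat "wide" viewports as landscape */
@media all and (min-aspect-ratio: 13/9) {
    /* landscape styles here */
}

/* everything else is portrait, even with the keyboard up */
@media all and (max-aspect-ratio: 13/9) {
    /* portrait styles here */
}
[/css]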

Ahhh – success!

corrected_text_entry

Reference http://abouthalf.com/development/orientation-media-query-challenges-in-android-browsers/

Mobile website input box outlines: DESTROY!

Something that took me a while to figure out:

chrome_input_focus_border

The orange border around this input box only appears on focus, on mobile devices. It took a while longer to notice that it only happens in mobile Chrome.

I understand that it's a valuable inclusion for small screens, where you'd like to know where your cursor or other input is actually being passed to. But if you want something other than the orange border, or to remove it entirely, trawling through your CSS for any reference to "border" or orange-y hex values can be a pain.

All you need to do is – for the element in question:

[css]
:focus {outline:none;}
[/css]
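Or, if you'd rather keep some visual cue for mobile and keyboard users (removing it outright isn't great for accessibility), a sketch of swapping the default for your own style:

[css]
input:focus {
    outline: none;
    border-color: #3399ff; /* your own, less orange-y, focus indicator */
}
[/css]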

Annnnnd relax.

HTML5 input type “email” validation

The HTML5 "email" input type is great for short-cutting the usual email validation JavaScript mess. Just by defining your input box as type="email" you get validation for free (in modern browsers).

html5_email_type_validation
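For reference, the markup needed is tiny; a minimal sketch (the id is an assumption, matching the scripts below):

[xml]
<form>
    <input type="email" id="email-input" />
    <input type="submit" value="Sign up" />
</form>
[/xml]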

But how do you hook into this? Perhaps change the message or do some other actions at the same time? Check out the “invalid” event:

[javascript]
$("#email-input").on('invalid', function (e) {
    // do stuff
});
[/javascript]

Or if you're old school:

[javascript]
document.getElementById('email-input').addEventListener('invalid', function (e) {
    // do stuff
});
[/javascript]

The event's target exposes a validity property – a ValidityState object with the following properties:

  • badInput
  • customError
  • patternMismatch
  • rangeOverflow
  • rangeUnderflow
  • stepMismatch
  • tooLong
  • typeMismatch
  • valid
  • valueMissing

So you can work with it like so:

[javascript]
$("#email-input").on('invalid', function (e) {
    if (e.target.validity.typeMismatch) {
        alert("enter a valid email address!");
    }
});
[/javascript]

Or you could even override the default message that appears in the tooltip:

[javascript]
$("#email-input").on('invalid', function (e) {
    if (e.target.validity.typeMismatch) {
        e.target.setCustomValidity("enter a valid email address!");
    }
});
[/javascript]
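One gotcha: once you've called setCustomValidity with a non-empty string, the field stays invalid (and keeps showing your message) until you clear it, so reset it whenever the value changes; a sketch:

[javascript]
$("#email-input").on('input', function (e) {
    // an empty string clears the custom error, allowing revalidation
    e.target.setCustomValidity("");
});
[/javascript]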

Clever stuff, eh?

My NuGet

Want your own NuGet repo? Don’t want to pay for MyGet or similar?

Here’s how I’ve done it recently at Mailcloud (over an espresso that didn’t even have time to get cold – it’s that easy)

Setting up a NuGet Server

Creating your own NuGet server could barely be any easier than it is now.

Open Visual Studio -> New Project -> WebSite

visual studio new website

Empty Website

visual studio empty azure website

Open Package Manager/Console -> Install-Package NuGet.Server

install nuget server

It’ll look something like this afterwards:

nuget server installed

Add an API key to appSettings

nuget server add api key
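The relevant entry looks something like the below; the apiKey key name comes from the NuGet.Server package's generated web.config (the GUID is obviously a placeholder):

[xml]
<appSettings>
    <!-- a shared key that anyone pushing/deleting packages must supply -->
    <add key="apiKey" value="4a4f5b2a-xxxx-xxxx-xxxx-xxxxxxxxxxxx" />
</appSettings>
[/xml]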

For the API key: I just grabbed mine from newguid.com:
newguid.com

Publish site

If you're using Azure and you selected "Create remote resources" back at the start when creating the project, you can push this straight out to the newly created website with a right click on the project -> Publish:

publish azure website

Or use PowerShell, or MSBuild with WebDeploy, or FTP it somewhere, or keep it local – your call, buddy!

And that’s the hard part done 🙂

Using it

First visit

nuget server first visit

If you haven’t configured an API key then the first visit page will alert you to this.

Push a package

This is done in the usual manner – don’t forget your API key:

push a package
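Against a self-hosted NuGet.Server instance the push looks something like this (the hostname is an assumption; substitute your own site's URL):

nuget push MyPackage.1.0.0.nupkg <your api key> -Source http://yournugetserver.azurewebsites.net/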

Check the repo

pushed package

Let's reference our shiny new NuGet repo:

Add a new source

Edit your Package Manager settings and add in a new source, using your new repo:

package manager sources

Find your packages!

Now you can open the Package Manager window or console and find your pushed NuGet package:

new source packages

Happy packaging!

Using NuGet at Mailcloud

So what is NuGet, anyway?

Intro

When working with shared functionality across multiple .Net projects and team members, historically your options were limited to something like:

  1. Copy a dll containing the common functionality into your solution
  2. Register the dll into the GAC on whichever machine it needs to run
  3. Reference the project itself within your solution

There are several problems with these, such as:

  • ensuring all environments have the correct version of the dll as well as any dependencies already installed
  • tight dependencies between projects, potentially breaking several when the shared project is updated
  • trust – is this something you’re willing to install into a GAC if it’s from a 3rd party?
  • so many more, much more painful, bad bad things

So how can you get around this pain?

I’m glad you asked.

Treasure! Rubies, Gems, oh my.

ruby

The Ruby language has had this problem solved for many, many years – since around 2004, in fact.

Using the gem command you could install a Ruby package from a central location into your project, along with all of its dependencies, e.g.:

gem install rails --include-dependencies

This one would pull down rails as well as packages that rails itself depended on.

You could search for gems, update your project's gems, remove old versions, and remove the gem from your project entirely; all with minimal friction. No more scouring the internets for information on what to download, where to get it from, how to install it, only to find out you need to repeat this for a dozen other dependent packages!

You use a .gemspec file to define the contents and metadata for your gem before pushing it to a shared repository.

Pe(a)rls of Wisdom

cpan

Even Ruby gems were born from a frustration that the Ruby ecosystem wasn't supported as well as Perl's; Perl had CPAN (the Comprehensive Perl Archive Network) for almost a decade before Ruby gems appeared – it's been up since 1995!

Nubular / Nu

Nubular

If Perl has had CPAN since 1995, and Ruby has had gems since 2004, where is the .Net solution?

I'd spent many a project forgetting where I downloaded PostSharp or RhinoMocks from, and having to repeat the steps of discovery before I could even start development; leaving the IDE to browse online, download, unzip, copy, paste, before referencing within the IDE, finding there were missing dependencies; rinse, repeat.

Around mid-2010 Dru Sellers and gang (including Rob Reynolds aka @ferventcoder) built the fantastic “nu[bular]” project; this was itself a ruby gem, and could only be installed using ruby gems; i.e., to use nu you needed to install ruby and rubygems.


Side note: Rob was no stranger to the concept of .Net gems and has since created the incredible Chocolatey, an apt-get-style package manager for installing applications rather than just referencing packages within your code projects, which I've previously waxed non-lyrical about.

Once installed you were able to pull down and install .Net packages into your projects (again, these were actually just ruby gems). At the time of writing it still exists as a ruby gem and you can see the humble beginnings and subsequent death (or rather, fading away) of the project over on its homepage (this google group).

I used this when I first heard about it and found it to be extremely promising; the idea that you could centralise package management for the .Net ecosystem was an extremely attractive proposition. Unfortunately, at the time I was working at a company where introducing new and exciting (especially open source) things was generally considered Scary™. However, it still had some way to go.

NuPack

NuPack

In October 2010 Nu became Nu v2, at which point it was renamed NuPack. The Epic Trinity of Microsoft awesomeness – namely Scott Guthrie, Scott Hanselman, and Phil Haack – together with Dave Ebbo, David Fowler, and the Nubular team took a mere matter of months to create the first fully open sourced project that was central to an MS product (i.e., Visual Studio), and it was accepted into the ASP.Net open source gallery in October 2010.

It’s referred to as NuPack in the ASP.MVC 3 Beta release notes from Oct 6 2010 but underwent a name change due to a conflict with an existing product, NUPACK from Caltech.

NuGet! (finally)

nuget

There was a vote, and if you look through the issues listed against the project on CodePlex you can see some of the other suggestions.

(Notice how none of the names available in the original vote are “NuGet”..)

Finally we have NuGet! The associated CodePlex work item actually originally proposed "Nugget", but that was changed to NuGet.

Okay already, so what IS NuGet?!

Essentially the same as a gem; an archive with associated metadata in a manifest file (.nuspec for nuget, .gemspec for gems). It’s blindingly simple in concept, but takes a crapload of effort and smarts to get everything working smoothly around that simplicity.

All of the details for creating a package are on the NuGet website.

Using NuGet at Mailcloud

We decided to use MyGet initially to kick off our own private NuGet feed (but will most likely migrate shortly to a self-hosted solution; I mean, look at how easy it is! Install-Package NuGet.Server, deploy, profit!)

The only slight complexity was allowing the private feed’s authentication to be saved with the package restore information; I’ll get on to this shortly as there are a couple of options.

Creating a package

Once you’ve created a project that you’d like to share across other projects, it’s simply a matter of opening a prompt in the directory where your csproj file lives and running:

nuget spec

to create the nuspec file ready for you to configure, which looks like this:

<?xml version="1.0"?>
<package >
  <metadata>
    <id>$id$</id>
    <version>$version$</version>
    <title>$title$</title>
    <authors>$author$</authors>
    <owners>$author$</owners>
    <licenseUrl>http://LICENSE_URL_HERE_OR_DELETE_THIS_LINE</licenseUrl>
    <projectUrl>http://PROJECT_URL_HERE_OR_DELETE_THIS_LINE</projectUrl>
    <iconUrl>http://ICON_URL_HERE_OR_DELETE_THIS_LINE</iconUrl>
    <requireLicenseAcceptance>false</requireLicenseAcceptance>
    <description>$description$</description>
    <releaseNotes>Summary of changes made in this release of the package.</releaseNotes>
    <copyright>Copyright 2014</copyright>
    <tags>Tag1 Tag2</tags>
  </metadata>
</package>

Fill in the blanks and then run:

nuget pack YourProject.csproj

to end up with a .nupkg file in your working directory.

.nupkg

As previously mentioned, this is just an archive. As such you can open it yourself in 7Zip or similar and find something like this:

.nupkg guts

Your compiled dll can be found in the lib dir.

Pushing to your package feed

If you’re using MyGet then you can upload your nupkg via the MyGet website directly into your feed.

If you like the command line – and I do like my command line – then you can use the nuget push command to do this for you:

nuget push MyPackage.1.0.0.nupkg <your api key> -Source https://www.myget.org/F/<your feed name>/api/v2/package

Once this has completed your package will be available at your feed, ready for referencing within your own projects.

Referencing your packages

If you’re using a feed that requires authentication then there are a couple of options.

Edit your NuGet sources (Options -> Package Manager -> Package Sources) and add in your main feed URL, e.g.

http://www.myget.org/F/<your feed name>/

If you do this against a private feed then an attempt to install a package pops up a windows auth prompt:

myget.auth

This will certainly work locally, but you may have problems when using a build server such as TeamCity or Visual Studio Online, due to the non-interactive authentication.

One solution to this is to actually include your password (in plain text – eep!) in your nuget.config file. To do this, right click your solution and select “Enable Package Restore”.

package restore

This will create a .nuget folder in your solution containing the nuget executable, a config file and a targets file. Initially the config file will be pretty bare. If you edit it and add in something similar to the following then your package restore will use the supplied credentials for the defined feeds:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <solution>
    <add key="disableSourceControlIntegration" value="true" />
  </solution>
  <packageSources>
    <clear />
    <add key="nuget.org" value="https://www.nuget.org/api/v2/" />
    <add key="Microsoft and .NET" value="https://www.nuget.org/api/v2/curated-feeds/microsoftdotnet/" />
    <add key="MyFeed" value="https://www.myget.org/F/<feed name>/" />
  </packageSources>
  <disabledPackageSources />
  <packageSourceCredentials>
    <MyFeed>
      <add key="Username" value="myusername" />
      <add key="ClearTextPassword" value="mypassword" />
    </MyFeed>
  </packageSourceCredentials>
</configuration>

So we resupply the package sources (we need to clear them first, else you get duplicates), then add a packageSourceCredentials section with an element matching the name you gave your package source in the section above it.

Alternative Approach

Don’t like plain text passwords? Prefer auth tokens? Course ya do. Who doesn’t? In that case, another option is to use the secondary feed URL MyGet provides instead of the primary one, which contains your auth token (which can be rescinded at any time) and looks like:

https://www.myget.org/F/<your feed name>/auth/<auth token>/

Notice the extra “auth/blah-blah-blah” at the end of this version.

Summary

NuGet as a package manager solution is pretty slick. And the fact that it's open source and can easily be self-hosted internally means it's an obvious solution for managing those shared libraries within your projects, personal or corporate.

Extra References

http://weblogs.asp.net/bsimser/archive/2010/10/06/unicorns-triple-rainbows-package-management-and-lasers.aspx

http://devlicio.us/blogs/rob_reynolds/archive/2010/09/21/the-evolution-of-package-management-for-net.aspx

Migrating from .html to extensionless URLs without losing SEO juice. Eeew.

If you’ve ever had a basic html website (or not so basic site, but with static html pages in it nonetheless) and eventually got around to migrating to an extensionless URL framework (e.g. MVC), then you may have had to think about setting the correct redirects so as to not dilute your SEO rating.

This is my recent adventure in doing just that within an MVC5 site taking over a selection of html pages from an older version of the site.

SE-huh?

As your site is visited and crawled it will build up a score and a reputation among search engines; certain pages may appear on the first page for certain search terms. This tends to be referred to as “search juice”. Ukk.

Make that change

If you change your structure you can do one of the following:

Just use new URLs

Forget about the old ones; users might be annoyed if they’ve bookmarked something (but seriously, who bookmarks anything anymore?..), but also any reputation and score you have with the likes of Google will have to start again from scratch.

This is obviously the easiest option.

Dead as a dodo; gives an HTTP404:

http://mysite.com/aboutme.html

Alive and kickiiiiing:

http://mysite.com/aboutme

Map the old URLs to the new pages

Good for humans, but in the short term you’ll be diluting your SEO score for that resource since it appears to be two separate resources (with the score split between them).

Both return an HTTP200 and display the same content as each other:

http://mysite.com/aboutme.html

http://mysite.com/aboutme

Should you want to do this in MVC, firstly you need to capture the requests for .html files. These aren’t normally handled by .Net since there’s no processing to be done here; you might as well be processing .js files, or .css, or .jpg.

Capturing .html files – The WRONG way

For .Net to capture requests for html I have seen most people use this abomination:

[xml]
<modules runAllManagedModulesForAllRequests="true" />
[/xml]

This will cause every single request to be captured and go through the .Net pipeline, even though there’s most likely nothing for it to do. Waste of processing power, and of everyone’s time.

Capturing .html files – The RIGHT Way

Using this as reference I discovered you can define a handler to match individual patterns:

[xml]
<add name="HtmlFileHandler" path="*.html" verb="GET"
type="System.Web.HandlersTransferRequestHandler"
preCondition="integratedMode,runtimeVersionv4.0" />
[/xml]

Pop that in your <handlers> node (within <system.webServer>) and you'll be capturing just the .html requests.

Mapping them (for either option above)

In your RouteConfig you could add something along the lines of:

[csharp]
routes.MapRoute(
    name: "Html",
    url: "{action}.html",
    defaults: new { controller = "Home", action = "Index" }
);
[/csharp]

This will match a route such as /aboutus.html and send it to the Home controller's AboutUs action – make sure you have a matching action for each page, e.g.:

[csharp]
public ActionResult AboutUs()
{
    return View();
}

public ActionResult Contact()
{
    return View();
}

public ActionResult Help()
{
    return View();
}
[/csharp]

Or just use a catch-all route …

[csharp]
routes.MapRoute(
    name: "Html",
    url: "{page}.html",
    defaults: new { controller = "Home", action = "SendOverThere" }
);
[/csharp]

and the matching catch-all action to just display the view:
[csharp]
public ActionResult SendOverThere(string page)
{
    return View(page); // displays the view with the name <page>.cshtml
}
[/csharp]

Set up permanent redirects from the old URLs to the new ones

CORRECT ANSWER!

To preserve the aforementioned juices, you need to set up a RedirectResult instead of an ActionResult for each page on your controller, returning a permanent (301) redirect, e.g.:

[csharp]
public RedirectResult AboutUs()
{
    return RedirectPermanent("/AboutUs");
}
[/csharp]

Or use that catch-all route …

[csharp]
routes.MapRoute(
    name: "Html",
    url: "{page}.html",
    defaults: new { controller = "Home", action = "SendOverThere" }
);
[/csharp]

… and action:

[csharp]
public RedirectResult SendOverThere(string page)
{
    return RedirectPermanent(page);
}
[/csharp]

Using Attribute Routing in MVC5 you can do something similar and simpler, directly on your controller – no need for a RouteConfig entry:

[csharp]
[Route("{page}.html")]
public RedirectResult HtmlPages(string page)
{
    return RedirectPermanent(page);
}
[/csharp]

Just add the line below into your RouteConfig:

[csharp]
routes.MapMvcAttributeRoutes();
[/csharp]

Having problems?

This routing always drives me crazy; I find it extremely hard to debug routing problems, which is why I'd like to point you towards Phil Haack's RouteDebugger NuGet package and the accompanying article.

RouteDebugger screenshot

or even Glimpse.

Glimpse screenshot

Both of these have the incredible ability to analyse any URL you enter against any routes you’ve set up, telling you which ones hit (if any). This helped me no end when I finally discovered my .html page URL was being captured, but it was looking for an action of “aboutus.html” on the controller “home”.

Worth checking out

Superscribe is an open source project by @roysvork that uses graph-based routing to implement unit-testable, fluent routing, which looks really clever. I'll give that a shot for my next project.

Good luck, whichever route you take!

(see what I did there?..)

Azure WebJobs and Azure Scheduler

Creating functionality in Azure to subscribe to messages on a queue is so 2013. You have to set up a service bus, learn about queues, topics, maybe subscribers and filters, blah, BLAH, BLAH.

Let’s not forget configuring builds and deployments and all that.

Fun though this may be, what if you just want the equivalent of a cronjob or scheduled task kicking off every few hours or days for some long-running or CPU-intensive task?

Scheduled tasks – Old School: IaaS

Well, sure, you could create a VM, log in, and configure cron/scheduled tasks. However, all the cool kids are using webjobs instead these days. Get with the programme, grandpa! (Or something)

So what’s a webjob when it’s at home?

It depends. Each version is the equivalent of a small console app that lives in the ether (i.e., on some company’s server in a rainy field in Ireland), but how it kicks off the main functionality is different.

These webjobs can be deployed as part of a website, or ftp-ed into a specific directory; we’ll get onto this a bit later. Once there, you have an extra dashboard available which gives you a breakdown of the job execution history.

It uses the Microsoft.WindowsAzure.Jobs assemblies and is best installed via NuGet:

Install-Package Microsoft.WindowsAzure.Jobs.Host -Pre

(Notice the “-Pre”: this item is not fully cooked yet. It might need a few more months before you can safely eat it without fear of intestinal infrastructure blowout)

Let's check out the guts of that assembly, shall we?

Jobs - attributes

We’ll get onto these attributes momentarily; essentially, these allow you to decorate a public method for the job host to pick up and execute when the correct event occurs.

Here’s how the method will be called:
host - assembly methods

Notice that all of the other demos around at the moment not only use RunAndBlock, but they don't even use the CancellationToken version. (If you have a long-running process, stopping it becomes a lot easier if you're able to expose a cancellation token to another – perhaps UI – thread.)
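Based on the overloads in that method list, a hedged sketch of the token version might look like this (how you trigger the cancellation is up to you):

[csharp]
static void Main()
{
    var cts = new System.Threading.CancellationTokenSource();

    // wire cts.Cancel() up to whatever should stop the job,
    // e.g. a console key press on another thread
    var host = new JobHost();
    host.RunAndBlock(cts.Token);
}
[/csharp]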

Setting it up

Right now you have limited options for deploying a webjob.

For each option you firstly need to create a new Website. Then click the WebJobs (Preview) tab at the top. Click Add at the bottom.

Now it gets Old Skool.

Zip

Zip the contents of your app’s bin/debug folder, call it WebJob.zip (I don’t actually know if the name matters though).

Enter the name for your job and browse to the zip file to upload.

You can choose from: run continuously, run on a schedule (if you have Azure Scheduler Preview enabled), or run on demand.

There’s a great article on asp.net covering this method.

FTP

Webjobs are automagically picked up via convention from a particular directory structure. As such, you can choose to ftp (or other deployment method, perhaps via WebDeploy within the hosting website itself, a git commit hook, or a build server) the files that would have been in the zip into:

site\wwwroot\App_Data\jobs\{job type}\{job name}
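For example, a triggered job called DailyReport (the name is an assumption; {job type} is either "triggered" or "continuous") would live in:

site\wwwroot\App_Data\jobs\triggered\DailyReport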

What I've discovered doing this is that the scheduled jobs are in fact triggered jobs, which means they're kicked off via an HTTP POST from the Azure Scheduler.

There's a nice intro to this over on Amit Apple's blog.

When to execute

Remember those attributes?

Jobs - attributes

Storage Queue

One version can be configured to monitor a Storage Queue (NOT a Service Bus queue, as I found out after writing an entire application to do this, deploying it, then clicking around the portal for an entire morning, certain I was missing a checkbox somewhere).

By using the attribute [QueueInput] (and optionally [QueueOutput]) your method can be configured to automatically monitor a Storage Queue, and execute when the appropriate queue has something on it.
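A minimal sketch of such a method, assuming a queue named "orders" and a simple string payload (both names are mine, not from the SDK):

[csharp]
public static void ProcessOrder([QueueInput("orders")] string message)
{
    // fires whenever a message appears on the "orders" storage queue
}
[/csharp]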

Blob

A method with the attribute [BlobInput] (and optionally [BlobOutput]) will kick off when a blob storage container has something uploaded into it.

Woah there!
Yep, that's right. Just by using a reference to an assembly and a couple of attributes you can shortcut the entire palaver of configuring Azure connections to a namespace or a container, creating a block blob reference and/or a queue client, etc.; it's just there.

Crazy, huh?

On Demand

When you upload your webjob you are assigned a POST endpoint to that job; this allows you to either click a button in the Azure dashboard to execute the method, or alternatively execute it via an HTTP POST using Basic Auth (automatically configured at the point of upload and available within the WebJobs tab of your website).

You’ll need a [NoAutomaticTrigger] or [Description] attribute on this one.

Scheduled

If you successfully manage to sign up for the Azure Scheduler Preview then you will have an extra option in your Azure menu:

finding scheduler

Where you can even add a new schedule:

setting up scheduler

This isn’t going to add a new WebJob, just a new schedule; however adding a new scheduled webjob will create one of these implicitly.

In terms of attributes, it’s the same as On Demand.

How to execute

Remember those methods?

host - assembly methods

RunAndBlock

These require the Main method of your console app to instantiate a JobHost and "run and block". The job host will find a matching method with key attribute decorations and, depending on the attributes used, will fire the method when certain events occur.

[csharp]
static void Main()
{
    JobHost h = new JobHost();
    h.RunAndBlock();
}

public static void MyAwesomeWebJobMethod(
    [BlobInput("in/{name}")] Stream input,
    [BlobOutput("out/{name}")] Stream output)
{
    // A new cat picture! Resize all the things!
}
[/csharp]

Scott Hanselman has a great example of this.

Call

Using some basic reflection you can look at the class itself (in my case it’s called “Program”) and get a reference to the method you want to call such that each execution just calls that method and stops.

[csharp]
static void Main()
{
    var host = new JobHost();
    host.Call(typeof(Program).GetMethod("MyAwesomeWebJobMethod"));
}

[NoAutomaticTrigger]
public static void MyAwesomeWebJobMethod()
{
    // go find epic cat pictures and send for lulz.
}
[/csharp]

I needed to add in that [NoAutomaticTrigger] attribute, otherwise the webjob would fail completely due to no valid method existing.

In summary

WebJobs are fantastic for those offline, possibly long running tasks that you’d rather not have to worry about implementing in a website or a cloud service worker role.

I plan to use them for various small functions; at Mailcloud we already use a couple to send the sign-ups for the past few days to email and hipchat each morning.

Have a go with the non-scheduled jobs, but if you get the chance to use Azure Scheduler it’s pretty cool!

Good luck!

Using ScriptCS to very easily Base64 encode data in a file

Oh, what’s that? You need me to process a huge file and spew out the data plus its Base64? I’d better write a new .Net proje-OH NO WAIT A MINUTE I DON’T HAVE TO ANY MORE!

ScriptCS is C# from the command line; no project file, no Visual Studio (..no IntelliSense..), but it's perfect for little hacks like these.

First, go and install the Chocolatey package manager:

c:> @powershell -NoProfile -ExecutionPolicy unrestricted -Command "iex ((new-object net.webclient).DownloadString('https://chocolatey.org/install.ps1'))" && SET PATH=%PATH%;%systemdrive%\chocolatey\bin

Then use chocolatey to install scriptcs:

cinst scriptcs

Then open your text editor of choice (I used SublimeText2 for this one) and write some basic C# plus references, save it as e.g. hasher.csx:
[csharp]
using System;
using System.Text;
using System.IO;
using System.Collections.Generic;

var sourceFile = @"C:\Users\rposbo\import.txt";
var destinationFile = @"C:\Users\rposbo\export.txt";

var lines = File.ReadAllLines(sourceFile);
var newOutput = new List<string>();

// append each line's Base64-encoded form to the line itself, CSV-style
foreach (var line in lines)
{
    var encoded = Convert.ToBase64String(Encoding.Default.GetBytes(line));
    newOutput.Add(string.Format("{0},{1}", line, encoded));
}

File.WriteAllLines(destinationFile, newOutput.ToArray());
[/csharp]

Then run that from the command line:

c:\Users\rposbo\> scriptcs hasher.csx

And you’ll end up with your output; minimal fuss and effort.