Node.js @ UKWAUG: MS Cloud Day – Windows Azure Spring Release

The fourth session I attended was the highly energetic and speedy introduction to writing node.js apps and running them on Azure, presented by the author of Simple.Data and Simple.Web, and one of those voices of the developer community with a great JFDI attitude, Mark Rendle (@markrendle).

I’ve just recently got into node.js development myself and have been very much enjoying node, npm, express, stylus, and nib; there is already a fantastic community and an expanse of modules, which can be a bit daunting.

During the session, Mark’s short code examples showed just how simple it can be to get up and running with node, and also how easy it is to deploy to Azure.

A nice comment was that we are on the road to “ecmascript harmony”! And that “Javascript is a great language so long as you ignore the 90% of it which coffeescript doesn’t compile to.”

It was a very fast-paced session; hopefully my notes still make sense though..

What the various aspects of Azure do

  • compute – web, worker, vm
  • websites – .net, node, php
  • storage – blob, tables (distributed nosql, like cassandra), queues
  • sql – sql azure, reporting
  • services – servicebus, caching, acs

What are the Cloud Service types used for

  • web roles – iis, for apps
  • worker – no iis, for running anything

How to peruse the contents of a blob or table

General tips for developing sites for use in Azure

  • keep static content in blob storage
  • websites commit and deploy much faster than the cloud services commit and deploy process
  • azure/iis needs server.js, not app.js (see the sketch below)
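
On that last point, here’s a minimal server.js sketch, assuming the usual node convention of reading the listening port from the PORT environment variable (which Azure’s node hosting sets for you; the greeting text is mine):

[javascript]
// server.js – minimal sketch; under Azure/iisnode the "port" supplied
// via process.env.PORT is actually a named pipe, but the code is the same.
var http = require('http');

http.createServer(function (req, res) {
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('Hello from Azure\n');
}).listen(process.env.PORT || 3000);
[/javascript]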

How to run RavenDB in Azure

  • Spin up a vm and install it!! (this used to be a much trickier process, but the recent Azure update meant that the VM support is mature enough to allow the simpler solution)

Developing node.js

Use JetBrains WebStorm for debugging, or the wonderful online editor Cloud9IDE. Sublime Text 2 is a great editor for simple code requirements, and has great plugins for Javascript support. I also used this for taking all of the seminar notes as it has a simple “zen” zero-distractions interface.

Next up – Hadoop and High Performance Computing

MongoDB @ UKWAUG: MS Cloud Day – Windows Azure Spring Release

My third session was about MongoDB and how you might implement it in Azure, presented by MongoDB’s own Gregor Macadam (@gregormacadam).

I only had limited knowledge of what MongoDB was before this session (a document-based data store, much like CouchDB and other NoSQL variants), so given that this session appeared to be an intro to MongoDB as opposed to MongoDB on Azure, that suited me just fine!

Here are the basic notes I made during Gregor’s talk (although you may as well just go to MongoDB.org and read the intro..):

MongoDB uses sharding for write throughput.
The REST interface uses JSON as the data transport format
Data is saved in BSON structure
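
For instance (an illustrative document of my own, not one from the talk), a JSON document sent to the server ends up persisted as BSON:

[javascript]
// mongo shell sketch: the collection name and fields are made up.
db.products.insert({ name: 'widget', price: 9.99, tags: ['new', 'sale'] });
db.products.findOne({ name: 'widget' }); // round-trips back out as JSON
[/javascript]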

The db structure (usually?) consists of three nodes: a single primary and two replicated secondaries – these are referred to as a Replica Set.
A Replica Set has a single write node with async replication to the other set members; you can read from all of them

The write history (known as the oplog) is in the format "move from state A, to state B" so as to avoid overwriting changed data.

If a write (to the primary) fails, an automatic election determines which of the remaining nodes is the new primary; usually the new primary is the node with the latest data.

It can be configured to write to multiple hosts, but the write won’t return until all writes are completed

An "arbiter" can be the tie breaker in determining the new primary node during election, and we can specify weighting for that process.

"Read" scales with more read nodes, "Write" scales with multiple read/write groups (replica sets) or sharding.

Need config database to define key ranges for sharding etc

MongoS process runs on another node and knows which shard to write your data to.

The updates are released on Windows and Linux at the same time

Within Azure

Data is persisted in blob storage
MongoDB runs in a worker role
A page blob is mounted as an NTFS cloud drive (the data drive?)

MongoS router process is required to load balance access to correct node, not the Azure load balancer; the Azure load balancer can end up sending the write request to a non-primary node.

The OS disk has caching enabled by default; the data disk doesn’t

The code is open source and can be found on github, and issues can be raised on the Mongo Jira site

You can sign up for a free Mongo Monitoring Service on 10gen

The main point I took away from this is that it sounds like you need a large number of Azure VMs to get Mongo running: one for each node, one for each MongoS service, one for an arbiter (maybe more – I didn’t catch all of the details raised by a couple of good questions from the audience).

Although I have a big plan to use NoSQL for the front end of an ecommerce website, I don’t think that MongoDB’s Azure offering is mature enough yet. I’ll be looking into CouchDB and Raven initially and keeping an eye on MongoDB. (Interested in how to get Raven running on Azure? Wait for the next post!)

The slide deck from this session is here

Next up – node.js

IaaS @ UKWAUG: MS Cloud Day – Windows Azure Spring Release

Infrastructure as a Service in Azure

Unfortunately, the earlier network disaster at the conference meant that this session had to be cut short. This is a shame, as the Azure IaaS offering has really matured and I was looking forward to seeing how I could utilise the improved system.

Since it was a short one, the notes I took on what Microsoft’s own Michael Washam (@MWashamMS) talked about are limited. Here goes:

You can upload your own VHDs, which must be fixed disks, not dynamic

Data disks are a virtual HD which can be attached to your VM; each data disk is up to 1TB!

Instance size/# of data disks

  • xs – 1
  • s – 2
  • m – 4
  • l – 8
  • xl – 16

So a single XL instance VM can have 16TB of HA storage attached!

Data disks live in blob storage

Using port forwarding and configuring a Load Balanced Set allows you to set up a cluster/farm.

The load balancer functionality has custom probes; these look for an HTTP 200 on a health check page. The health check page can be custom and run custom logic (e.g., auth/db conn checks) to determine whether to return a 200 status or not.
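
A hypothetical probe target, sketched here in node (the isHealthy check is a stub of my own; a real one would test the db connection, auth, etc.):

[javascript]
var http = require('http');

// Stub for the sketch – replace with real auth/db-connection checks.
function isHealthy() {
    return true;
}

http.createServer(function (req, res) {
    if (req.url === '/healthcheck') {
        // 200 keeps this node in rotation; 503 takes it out.
        res.writeHead(isHealthy() ? 200 : 503);
    } else {
        res.writeHead(404);
    }
    res.end();
}).listen(process.env.PORT || 3000);
[/javascript]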

Availability Sets ensure that not all the VMs in a set go down for routine updates and patches at the same time; i.e., your load balanced endpoint will always have an active VM on the other end.

The Windows Azure Virtual Network allows, as mentioned in the Keynote, a VPN to be set up which can be patched into your on-premises router to act as if it’s on-prem itself.

The VPN can be configured/defined via an xml file. The creation of the VMs and their attached data disks can be scripted using the mature Azure Powershell cmdlet library. Using these together, Michael was able to show how he could run a powershell script to spin up a farm of ten servers from a pre-existing VHD, attach a 1TB data disk to each, and assign them IP addresses within the configured VPN range.

He then downloaded the VPN settings from Azure in a format specific to a possible corporate router to effectively bring that new server farm into the on-premises network.

This automatic setup and configuration was really rushed through and quite tough to follow at times, probably due to the lack of time. There were some tantalising snippets of script on screen momentarily, and then it was all over.

My big take away from this session was the ability to automatically spin up a preconfigured set of servers to create a QA/dev/load test environment, do horrible things to it, get some reports generated, then turn it back off. Wonderful.

:: Michael with the money shot


Next up >> Mongo DB

UKWAUG: MS Cloud Day – Windows Azure Spring Release

Some notes on the recent UK Windows Azure User Group (@ukwaug) Microsoft Cloud Day held in the Vue cinema in Fulham Broadway; a place close to my heart, as Fulham was the first place I rented in London well over a decade ago. The cinema wasn’t there then. Nor was the mall which houses the cinema.

Nor, more significantly, was Azure (pretty sure Microsoft was though..)

To be fair, even though Azure *was* around a couple of years ago, it certainly was not a first class citizen in my tekky box of tricks; more of a "this is pretty cool – think I’ll wait until it’s finished first."

Now it’s pretty much ready; not quite "stick a knife in it, comes out clean" ready, but more "better give it five more minutes" ready.*

The first Azure attempt had a reasonably complex and confusing Silverlight interface that attempted to convey the full capabilities of Azure. If you really gave it your attention then you could develop some really complex and clever solutions via cloud-based computing.
:: Silverlight portal

If you already had a chance to catch the recent Azure keynote then you’ll probably already know that Azure has a wonderful new HTML5 portal (ta-ta, Silverlight – though you can still get that version if you want), WebSites (sort of an extension/simplification of the Web Role cloud service), git support, tfs support, and loads more. I’m not going to list and explain everything new here as I just wanted to cover some of the notes I made during the UKWAUG MS Cloud Day.
:: HTML5 portal

Given that the Spring release has been out a couple of weeks now, I’ve already had a chance to play with what I found to be the more interesting changes, namely: git support, tfs support, websites (sort of lightweight web roles), and support for non-.net languages, such as Node.js.

Keynote

After a short intro from Andy Cross of Elastacloud (and co-founder of UKWAUG) we are presented with the trademark red polo shirt of Scott Guthrie, ready to show off the great new developments in the Spring Azure release.

Interesting points:

  • The Azure SDKs are published on github and they take pull requests, which makes the Azure SDKs actual OSS! Amazing!
  • The azure servers themselves are located close to hydroelectric or wind-powered generators. They are site-managed by 1 person per 15k servers.
  • There is a 99.95% SLA, with a money back guarantee
  • The new release supports node.js, java, php, and python
  • The Azure SDKs run on linux, mac, and windows
  • There is a much simplified pricing calculator
    :: Old pricing calculator
    :: New pricing calculator

VMs

With regards to licensing for software installed on VMs in Azure (such as SQL): you can use the standard licensing from the vendor; volume licensing/SA is also supported.

You can attach a “data drive” to a VM; this is a virtual fixed disk of up to 1TB. You can attach multiple data drives and these can be RAID-ed should you want. The data stored on these is persistent and will survive a restart.

The VMs all have a D drive which is just a temporary swap-file drive; the data on this is temporary and will not survive a restart.

One interesting point was raised by an audience member: why does the Windows Server 2012 VM in Azure have a floppy drive listed in "My Computer"?.. Scott couldn’t answer that one!

All endpoints on a VM are locked down initially and have to be explicitly opened up.

You can create an Availability Set; these are multiple VMs which will be guaranteed to be deployed on separate servers, separate racks, etc. This gives a sort of automatic "High" Availability.

There’s a great blog post by Maarten Balliauw about using Azure VMs to create a cloud-hosted web farm!

Since the VMs support other OSs, which also support these data drives etc., the drives can be formatted with whatever file system is required (EXT3, for example).

Scott then proceeds to use git bash to ssh into his Ubuntu server VM.

Within the new portal there is a "My Images" section under the VM area; you can upload your own VM image or use a capture/snapshot created from within Azure itself.

One thing that was wonderful for a real nerd like me (whose degree was based on Unix) is the Azure cmd line support!

From the command line you can run “azure vm list”, which executes a REST query against your Azure management portal. Adding a “--json” flag returns json-formatted data.

VPN support

From within the portal a VPN can be created by simply selecting New -> Network -> Address Range. This is virtualised so no conflicts exist between customers using the same address range.

It is then possible to download these VPN settings to make the Azure VPN link with your local network, making it work as if it’s just in the next room.

Supports UDP and TCP but not multicast.

There is hypervisor support for porting VMs to and from Azure.

VMs are set up with a persistent drive; these are HA drives accessing data from Azure storage. As such, they are always-on.

The High Availability is implemented with a layer of SSD & caching above the HA triplicate drives, with continuous storage georeplication; this covers for natural disaster (however, the replicas are always within the same geopolitical boundary). This is on automatically, but you can turn it off to save money.

Azure now supports Chef, the open-source systems integration framework!
http://www.opscode.com/chef/

No AV is provided on the VM image by default, but you can install it yourself on the VM.

My big take-away from this session is the plan to utilise VMs to spin up dev/test environments within the VPN and turn them off outside of office hours to save money. This could potentially be covered within a standard MSDN subscription; something to look into.

Websites

Each Azure account has 10 websites for free. These start in a shared location, but you can pay for private.

All Azure accounts include 10 websites and the MSDN subscription includes bandwidth, storage, vms, etc which can be used to enhance the website (i.e., install a SQL backend or something)

Supports git, tfs, ftp, and web deploy

supports incremental deployment; first one is big, subsequent ones are incremental/smaller

The free Azure account already includes 10 websites, 1gb bandwidth, 20gb db, and 1gb storage.

In terms of DNS, you can map CNAMEs to websites, but you would need to use a VM to map A records (ANAMEs).

At this point there was a major network outage, eventually resulting in a dodgy, insanely long RJ45 cable being laid from the projection room at the back all the way to the front of the auditorium! Apparently this cable was actually longer than the max allowed length for a wired ethernet connection, so it only worked on a couple of laptops; neither of which were Scott’s.. Note to self: always have a backup *everything*, even a backup network!

Cloud Services

When creating a website in Visual Studio you can just deploy to azure as a website; the only additions to your file system might be a web.config and a csdef file.

You can convert a website project in VS to a cloud service, which adds cloud service projects to your solution and then deploys as an Azure Cloud Service

It is possible to scale programmatically; currently this is done using the Wasabi library, but it does need to be scripted. This functionality will be added as a feature in future releases.

The Azure Application building blocks for Cloud Services:

  • big data,
  • db,
  • storage,
  • traffic caching,
  • messaging,
  • id (authentication),
  • media,
  • cdn,
  • networking

SQL

  • Reporting services has been added recently for BI
  • SQL Server 2012 denali (?) is a subset of full SQL Server 2012
  • VS2012 can support data projects to target the correct SQL DB type for Azure and can identify incompatible features (remember ye olde SQL Azure’s incompatibilities? Yuck..)
  • SQL is PAYG.
  • It is possible to restore a db to a specific point via the constantly backed up transaction logs
** Unfortunately the earlier network outage during this session meant we had to cut it short here as we were already overrunning quite significantly **

:: Scott Guthrie delivering the Spring Release Azure Keynote


Next up – IaaS, Mongo DB, Node.js & git & Cloud9IDE >>

* yes, I like cooking. So?!

Find of the day: site44.com

Making and hosting a website has never been easier, thanks to the likes of AppHarbor, heroku, and github pages. Now there’s a new contender on the block and it’s possibly even easier to use than the others; if you don’t do dynamic content and just want a static website, check out site44.com

This site takes your dropbox account (if you don’t know what dropbox is, head over to their site and watch the “What is Dropbox” video; essentially it’s distributed online file storage) and uses a folder (which it creates) as the root for your website.

 

site44 homepage

Just follow those steps and you’ll find a new folder appear in your local Dropbox folder:

site44 autogenerated dropbox folder

In there you’ll find a new index.html page with the default content:

site44 autogenerated homepage

Open that file locally, edit it, save it, give Dropbox a second to sync it back up and:

site44 gordons alive

Jeepers.

 

Think I might do a compare and contrast with Appharbor, github pages, and site44 at some point. The world is a clever place.

EC2 WinSCP /var/www file upload

Since I use an Amazon EC2 micro instance to host this blog, and I noticed I had no favicon appearing (which is a bad thing), I thought I might as well make one and pop it up.

I took my usual avatar (Jiraya, the “Pervy Sage” from Naruto) and just let someone else do the hard work for me, uploading it to FavICO.com instead of even bothering to download an app to do it.

Then I opened up WinSCP and used my EC2 ppk to authenticate with my EC2 instance, took that favicon.ico file and tried to upload it to /var/www/html (the default website root if you install wordpress) only to receive the error:

ec2 var www upload permission error

Hmmm.

If you check the permissions on that folder (“stat /var/www” via SSH) you’ll see that it’s owned by the “root” group; since you’re logging on as ec2-user you’re not a member of that group.

Your options, according to the internet, mostly appear to involve changing the owner or permissions of /var/www itself. The solution I found was much easier:

  • Upload the favicon to the ec2-user home directory via WinSCP
  • Move (mv) the favicon from the ec2-user directory to /var/www via SSH using “sudo” to get the necessary permissions

Easy.

Grading a Developer

In the beginning, there was chaos

No doubt you’ve worked in a company where progression within your I.T. department, as a developer, is not easy to explain or measure.

Perhaps there is a skills matrix which defines a “senior developer” as someone who does all the stuff you do “but better”. Or the listing might contain things that are not pertinent to your technical role and have quite obviously been copy-pasted from a generic skills matrix template for some other role in the company, or even perhaps from your manager’s previous company, which might be in a barely related sector.

I only really noticed this once I became a manager (hello, Peter Principle) and was faced with giving feedback to obviously skilled developers while being unable to explain what they needed to do to progress; for me, this was lesson #1 in management: either have good answers to good questions, or find out the answers.

After a lot of internet hunting, a colleague in a similarly frustrating situation and I found a selection of insightful articles – Joel Spolsky on Fog Creek’s developer ladder, SWEBOK, and a Construx whitepaper – on exactly these things: how to grade a developer within your company. These allowed us to develop a similar system of our own and to work out:

  1. how to give complete transparency to each member of your team,
  2. how to collate your team members’ grades into a “team” grading system in order to identify potential weaknesses,
  3. how to adapt the system to specific roles and requirements.

I’ll try my best to give some sort of structure to this article and in doing so hopefully cover each of the points above.

 

From chaos, came the desire for structure

Following are the steps we took in our attempt to achieve a better grading model..

Recognise the problem in the department

This was the easy part, and essentially was covered in my introduction. The classic “how come he’s a Senior Dev when he’s rubbish at X, and I’m great at that and I’m a Mid-level Dev?!” question was enough to realise that the existing system didn’t really fit, since it couldn’t give a good answer.

 

Conduct a survey of all developers to identify what constitutes a developer’s role and responsibilities within the company

This required us to use the I.T. intranet (a Sharepoint portal) to request feedback from every developer in the company (about 35-40 people at the time) on what they believed their job required of them; i.e., the key skills, abilities, and responsibilities that enable them to fulfil their role. This took quite a lot of convincing to get enough responses, given the lack of conviction in the existing system, but we got there.

 

Find a group of peers to review the outputs and group them into similar genres

This involved getting a peer group together (in this case the 5 or 6 tech leads) who were committed to dedicating time each week, regularly meeting to discuss all of the responses and group them into genres. We were very fortunate that the development team as a whole took the process on board and really delivered some quality comments. The actual results of the request for feedback, already grouped into headings (but not deduped), are below:

Standards

  • Adherence to coding standards.
  • Adherence to industry and company standards and policies
  • Follows industry best practice
  • Writing code that consistently meets standards (Dev and Architecture).
  • Writes clean looking code

Refactoring

  • Improve and maintain code quality.
  • Refactor code where necessary
  • Takes time to go back and refactor own code

Deployment

  • Ability to deploy to project environments.
  • Manage the deployment of code to environment

Environment

  • Environment management (deployments scripts and configurations)
  • Manage the setting up of the development environment
  • Has knowledge outside of development (understands how production works, how systems are administered)
  • Have knowledge of the systems and how the different services/processors etc interact and their dependencies.

Merge/Branch

  • Ability to merge code and understanding of the merge/branch process.
  • Can merge code between branches accurately and reliably.
  • Configuration management (TFS)
  • Understands the branching model and how merges work.

Documentation

  • Writes clean looking code
  • Creates clear and concise sets of documentation around projects worked on for technical and non-technical audiences.
  • Documentation – ability to produce clear documentation for both IT and non-IT people.
  • Documentation (comments in code, UML specifications, SharePoint items)
  • Writes clean, self-documenting code that is well structured, simple and documented appropriately.

Debugging

  • Can find the root cause of a bug through the various layers of the application and is able to fix typical issues at all of these layers.

Agile/Scrum

  • Good knowledge of Agile Scrum
  • Understands Agile methodologies in general and knows how we use Scrum.

Company Systems/Domain Knowledge

  • Ability to realistically estimate tasks.
  • Has a good level of understanding of the Company systems, what each system does (high level) and how they interact with each other.
  • Knowledge of processes used by the business users of the system.
  • Has knowledge outside of development (understands how production works, how systems are administered)
  • Have knowledge of the systems and how the different services/processors etc interact and their dependencies.

Coding

  • Coding in C# (3.5/4.0)
  • Functional testing (BDD, planning, development, execution, evaluation)
  • Good knowledge of sql and adherence to sql standards.
  • Good understanding of requirements.
  • Confident developing applications at client, server and database level. i.e. JavaScript, C# and T-SQL.
  • Knowledge about the frequently used parts of the .NET framework
  • Knowledge and practice of TDD.
  • Proficient in multiple languages (to understand different styles and methods of coding)
  • Unit testing (TDD, mocking, coverage assessment)
  • Use of C#.net and vb.net.
  • Use of front end technologies – jquery, json etc.
  • Writes clean looking code

Patterns & Practices/ Design Patterns

  • Design Patterns (the most used or important ones)

Estimation/Planning

  • Estimation of work items
  • Input into planning of how tasks are to be carried out – what technologies can be used.
  • Knowledge of processes used by the business users of the system.

Presentation

  • Presentation (and other knowledge dissemination skills)

Leadership

  • Coaching
  • Interviewing
  • Meeting facilitation
  • Mentoring

Learning

  • Interest in gaining experience in new technologies.
  • Learning and research

Process

  • Interest and involvement in improving the software development process and the company policies

This gave us 16 reasonably distinct areas to focus on, which the group sanitised slightly, ending up with 13 “Knowledge Areas”: Debugging, Standards, Coding, Design Patterns, Documentation, Leadership, Domain Knowledge, R&D, Agile/Scrum, Presentation, Branch/Merge, Environment, and Process.

 

Within each genre, attempt to define distinct levels of ability

Each member of the group took an area each week and attempted to define a few distinct levels of ability; it was easier for some areas than others, and some of the level descriptions were more verbose than others. We had agreed initially to use the levels Beginner, Basic, Good, Great, and Superstar (the latter being an overachiever); however, it became apparent that defining five distinct levels was pretty tricky for some of the areas, so this became Basic, Good, Great (and, where possible, we added in Superstar too).

Here are a couple of example areas with their levels defined:

Documentation

Basic

Updates where necessary

Good

Create support documents

Great

Creates project handover docs, support docs, implementation docs

Superstar

“Technical Author”: takes ownership, adds missing artefacts, “tech lead assistant” (that’s not meant to sound patronising, honest!)

Domain Knowledge

Basic

Knows where to find the implementation of functionality specific to the area being developed within the codebase/database. Understands the structure and separation of the projects within a given solution and where to make necessary changes in order to implement required functionality

Good

Can identify where software artefacts exists within the domain and utilise these in order to reduce code repetition. Can identify which areas of functionality being developed should be reusable artefacts

Great

Uses extensive knowledge of the system under development and other systems it may interact with to determine various potential implementations for a given requirement, defines the impact of each on each affected system, and uses this to propose the best solution(s)

Superstar

Can suggest fundamental changes to the domain model as the requirements for functionality and development evolve, in order to better fit.

 

Entire developer group reviews the outcome; feedback and rewrite where necessary

Once all of these areas have at least 3 levels of ability defined, the entire product is shared with the development team for honest feedback. The ability levels can be altered at this point, but the main aim is to get buy-in and agreement from the majority of the development team.

 

Define “Core” Knowledge Areas

Each company will no doubt have a selection of key abilities for which each developer grade must have at least a minimum rating. Another couple of sessions with the peer group decided what those were for the company:

Junior Dev

  • Standards (Basic)
  • Debugging (Basic)
  • Coding (Basic)

Mid Dev

  • Standards (Good)
  • Debugging (Good)
  • Coding (Good)
  • Design Patterns (Basic)
  • Documentation (Basic)
  • Environment (Basic)

Senior Dev

  • Standards (Great)
  • Debugging (Great)
  • Coding (Great)
  • Design Patterns (Good)
  • Documentation (Good)
  • Environment (Good)

Below is a high level template where the coloured blocks define the core level template for that grade; mauve for junior, green for mid, and peach for senior (please forgive my inability to choose decent colours in Excel).

grading_levels

 

Attempt to define a template for the various levels of developer in the company

Once the Knowledge Areas are defined and the ability levels agreed to, the next step is to try and work out what constitutes the Developer grades within the company. To do this the group took a couple of steps:

Firstly, I unofficially applied the rating system to a couple of my team: one whom I thought of as a classic Senior Dev and one as a classic Mid Dev. Once I had worked through this new review process with them, we had a pretty good idea of which Knowledge Areas, and at what levels, a mid and a senior dev should have within the company.

Meanwhile, within the group we worked out a selection of minimum levels in varying groups of knowledge areas, differing per developer grade. For example,

  • Senior Developer: 4 great or higher, 8 good or higher, 10 basic or higher
  • Mid Developer: 2 great or higher, 4 good or higher, 6 basic or higher
  • Junior Developer: 2 good or higher, 6 basic or higher
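
A quick sketch of how checking a dev’s ratings against such a template might work (the thresholds are the Senior ones from the list above; the function and level names are my own):

[javascript]
var LEVELS = { basic: 1, good: 2, great: 3, superstar: 4 };

// ratings, e.g.: { standards: 'great', debugging: 'good', coding: 'great', ... }
function countAtOrAbove(ratings, level) {
    return Object.keys(ratings).filter(function (area) {
        return LEVELS[ratings[area]] >= LEVELS[level];
    }).length;
}

function meetsSeniorTemplate(ratings) {
    return countAtOrAbove(ratings, 'great') >= 4 &&
           countAtOrAbove(ratings, 'good')  >= 8 &&
           countAtOrAbove(ratings, 'basic') >= 10;
}
[/javascript]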

 

A breakdown of coding ability per language, weighted appropriately for the specific role (i.e., UI dev, regular dev, backend dev, sql dev, etc).

Coding is a tricky one, so we started breaking that right down and making it a little more specific. For example, the devs working on the website, though not UX devs, would certainly require good jQuery and CSS knowledge, but perhaps less WCF and C# multithreading. As such, a role could define a language matrix, such as this:

grading_levels_languages 

In the example above, a dev would be rated on their abilities in C# (weighted at 60% of the total), VB.net (20%), HTML (10%), and jQuery (10%). The example dev scored 3 out of 5 in C# ability, 3 for VB, 2 for HTML, and 2 for jQuery, giving them a weighted score of 2.8. The Knowledge Area “Coding” could therefore have a minimum language score for each ability level; perhaps a Junior would need a minimum of 2, a Mid 3, and a Senior 4.
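
The weighted sum is simple enough to sketch (the weights and scores below are just the ones from the example):

[javascript]
var weights = { csharp: 0.6, vbnet: 0.2, html: 0.1, jquery: 0.1 };
var scores  = { csharp: 3,   vbnet: 3,   html: 2,   jquery: 2 };

// Sum each language's score multiplied by its weighting.
var total = Object.keys(weights).reduce(function (sum, lang) {
    return sum + weights[lang] * scores[lang];
}, 0);

console.log(total.toFixed(1)); // "2.8"
[/javascript]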

 

Review process becomes a case of identifying/checking off each entry in the various levels of each genre.

Once this system is agreed by all members of the team, an annual appraisal becomes a much easier case of looking at the dev’s current charts, noting the areas that they’re not scoring highly on, and creating objectives which allow them to focus on improving those areas in order to progress through the ranks.

Anyone overachieving (superstar status) could be temporarily given a bonus of some type: attending more seminars/webinars/events, or a salary increase somewhere between their current salary and that of the next pay grade (a great idea from the FogCreek article).

 

Any items missing must be taken care of within a set time period

If the dev is not achieving what their job title says, they can be given a deadline by which to raise their rating in the areas where they are deficient. Easy.

 

Transparency of what is required in order to be put forward to promotion

No more “why is he a Senior and I’m not” questions. It’s all out there for everyone in the team to see.

 

Supporting general guidelines for attitude, punctuality etc. cover off talented but unprofessional developers

Add in a general “Code of Conduct” checklist so that you don’t end up with “superstars” who are complete…ly unprofessional. For example:

Professional Conduct

  • Team Player
  • Meets Deadlines
  • Organisational Skills
  • Communication Skills
  • Approachability
  • Leadership Ability
  • Interaction With Team Members
  • Interaction With External Departments (BAs etc.)
  • Attendance
  • Time Keeping
  • Quality of Work

 

Varying levels of detail in the graph can make them specific for the individual (dev-team lead), the team (team lead/line manager), and the department (line manager/head of I.T.)

By overlaying all the charts for all the members in your team you end up with a team chart; useful for deciding which team best fits which new project, which team needs what training, what skills the next interviewee should be tested on in order to help fill a skills gap, etc.

Overlay the various teams’ charts and you end up with a department chart; good for the less tekky but more senior I.T. members.

 

Use the new data to identify potential new roles

We quickly realised that someone with Superstar status in Documentation and general high-mid dev ratings should think about moving into a Tech Author role (which previously didn’t exist). If they excel in Presentation, Estimation, and Agile/Scrum then perhaps focus on becoming a Scrum Master. Awesome at Design Patterns, Coding, Leadership? Become a Tech Lead.

Being able to help people identify not only where they’re weak but also where they’re strong helps to both build a solid foundation and give direction for progression.

 

Conclusion

As far as I’m aware, this process was accepted within the company (I left shortly after its introduction and have since learned that another, completely different, method has been “introduced” from up on high..) and I found it much easier giving appraisals when there was a peer-reviewed system helping define people’s grades.

It also gave my team specific areas in which to define SMART objectives, and there was a general increase in enthusiasm for this process.

None of this would have been possible without the articles mentioned above – the Fog Creek ladder, SWEBOK, and the Construx whitepaper – and the dedication of the Tech Lead team members, determined to make life better for all developers.

jquery cookies

For anyone else having fun working with cookies in jquery who decided to use jquery.cookie as their plugin of choice: if you create a cookie
[javascript]
$.cookie('ro', 'awesome');
[/javascript]
And then try to change it
[javascript]
$.cookie('ro', 'lame');
[/javascript]
you may actually end up with a cookie that contains BOTH values.

Instead, try jquery.cookies (plural!) which does seem to update correctly. Or just write your own plugin; cookie manipulation isn’t that tough!
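
As a flavour of how simple it is, here’s a minimal hand-rolled sketch (plain document.cookie, nothing to do with either plugin’s API) which always pins the path so that an update overwrites rather than duplicates the cookie:

[javascript]
// Set a cookie, always with path=/ so updates hit the same cookie.
function setCookie(name, value, days) {
    var expires = '';
    if (days) {
        var date = new Date();
        date.setTime(date.getTime() + days * 24 * 60 * 60 * 1000);
        expires = '; expires=' + date.toUTCString();
    }
    document.cookie = name + '=' + encodeURIComponent(value) + expires + '; path=/';
}

// Read a cookie back out, or null if it isn't set.
function getCookie(name) {
    var match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
    return match ? decodeURIComponent(match[1]) : null;
}

setCookie('ro', 'awesome');
setCookie('ro', 'lame'); // overwrites – same name, same path
getCookie('ro');         // "lame"
[/javascript]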

Simple.Data “No ADO Provider Found”

Whilst trying to get Simple.Data working with a teeeeny tiny app for work, I couldn’t get past Database.OpenConnection(…); each time would result in a “No ADO Provider Found” exception, which had me baffled.

Google wasn’t helping much as it just threw up the symbolsource copy of Simple.Data’s MEFHelper.cs – i.e., the exception itself.

With a bit more digging I found this post on the Google Groups board which looked hopeful.

Eventually I found out that I needed to NuGet (if I can use that as a verb) Simple.Data into whichever projects referenced the class library that uses Simple.Data.

Odd. Seems a bit unnecessary, but it works now so I’m not complaining.

Scripting the setup of a developer PC, Part 4 of 4 – Installing Custom Stuff, Interesting Things Encountered, and Conclusion

This is the final part of a four-part series on attempting to automate installation and setup of a development PC with a few scripts and some funky tools. If you haven’t already, why not read the introductory post about ninite, the second part about the command line version of WebPI, or perhaps the third instalment about the interesting chocolatey project? Disclaimer: this series was inspired by a blog post from Maarten Balliauw.

Installing Custom Stuff

There are some other applications out there which I need to be able to install via script as well, such as SQL Server 2008 R2 Tools; although WebPI (and, with the Chocolatey beta, chocolatey) can install SQL Tools, you’re actually limited to the express versions (AFAIK), and I need the standard install.

Since I have the ISO for this on the network, I can run VirtualCloneDrive from the command line (after chocolatey installs it) to mount the iso and run the setup application using “vcdmount.exe /l=<drive letter> <iso path>”.

Execute VCDMount with no params to get this helpful dialog for other command line params:

:: VCDMount command line options dialog

So let’s get on with it then:

SQL Server 2008 R2 Tools

It looks like SQL Server has its own command line install options; if I mount the network ISO and pass the parameters “/ACTION=install /IACCEPTSQLSERVERLICENSETERMS /Q /FEATURES=Tools,ADV_SSMS” I should be able to install SQL tools unattended. There is a dependency on Windows Installer 4.5 being installed correctly for this one to work; make sure your WebPI install worked earlier!

 

Visual Studio 2010

It looks like VS2010 has its own command line install options; if I mount the network ISO and pass the parameters “/q /full /norestart” I should be able to install VS2010 unattended. There is an entry for “VS2010SP1Core” in the WebPI xml feeds, and I have tried using that to no avail; see “Interesting Things Encountered” section at the end for a note about WebPI & VS2010.

So the final install script should look like:

@echo off

REM VS2010
"c:\Program Files (x86)\Elaborate Bytes\VirtualCloneDrive\vcdmount.exe" /l=E "Z:\Installation\SetupDevPC\VS2010SP1.ISO" 

E:/Setup/setup.exe /q /full /norestart

REM SQL Tools
"c:\Program Files (x86)\Elaborate Bytes\VirtualCloneDrive\vcdmount.exe" /l=E "Z:\Installation\SetupDevPC\SQLServer2008Tools.ISO"

E:/setup.exe /ACTION=install /IACCEPTSQLSERVERLICENSETERMS /Q /FEATURES=Tools,ADV_SSMS

Something to bear in mind is that this doesn’t work if you haven’t restarted since running the chocolatey powershell script. As such, I’ve edited the chocolatey powershell script to end with:
[powershell]shutdown /r /t 0 /d P:0:0[/powershell]

If all goes well you shouldn’t actually see anything of note; VirtualCloneDrive’s VCDMount mounts each ISO into drive “E” (VCD default install has only one virtual drive defined, in my case that was “E”) and calls the relevant executable with parameters to attempt to force a silent install. VS2010 is completely silent! SQL at least gives a few lines of feedback.

The Bad News

Unfortunately VS2010’s setup.exe doesn’t wait before returning to the script; as such, you would see the call to VS2010’s setup.exe kick off then a few seconds later a call to SQL2008’s setup.exe, which fails since there’s another install already happening.

Again, just as unfortunately, SQL2008 won’t install straight after VS2010 – it demands a restart.

The “Meh” News

My preference is now to install SQL2008 first, since this is a blocking process, then VS2010, then let it restart (remove the “/norestart” flag for VS2010).

Hence the last script is actually:

@echo off

REM SQL Tools
"c:\Program Files (x86)\Elaborate Bytes\VirtualCloneDrive\vcdmount.exe" /l=E "Z:\Installation\SetupDevPC\SQLServer2008Tools.ISO"

E:/setup.exe /ACTION=install /IACCEPTSQLSERVERLICENSETERMS /Q /FEATURES=Tools,ADV_SSMS

REM VS2010
"c:\Program Files (x86)\Elaborate Bytes\VirtualCloneDrive\vcdmount.exe" /l=E "Z:\Installation\SetupDevPC\VS2010SP1.ISO" 

E:/Setup/setup.exe /q /full

Along with the previous powershell script, the beta chocolatey nupkg, and the existing scripts for ninite and webpi and their components, the final directory contents now look like:

281211_autoinstall_iso_dir_contents

The End Result

The Result!

Which brings us FINALLY on to the:

Conclusion

Although it is entirely possible to script the setup of a developer PC without ever seeing a GUI, using the tools I’ve chosen here it seems that it can’t currently be done in a fully automated fashion. Certain products still popped up a confirmation dialog, others required a reboot when I’d specifically suppressed one. Some dependencies were not always resolved correctly.

As such, I hope that you have enjoyed this introduction into my attempt to teach myself some command line WebPI, ninite, chocolatey, and general hackery, and if you have any comments or suggestions please feel free to let me know in the comments or via twitter; I’ve kept various snapshots of my VM I used for this series, so I’ll happily try out any good suggestions!

It would appear that this is a nice set of basic scripts to get a development PC up and running; however, once this has been done it makes much more sense to create an image and use that for future setups. There will be a follow-up post about creating an image of this configured PC so that future developer PCs can use the image instead of having to reinstall everything – that should be a pretty basic one, since it’s nothing new!

Finally, these articles are already out of date! WebPI is now on v4 and the chocolatey “beta” I mentioned is actually now the mainline. No doubt everything else will be out of date in a few more days.

Interesting Things Encountered

  • The webpicmdline tool still raises the odd dialog prompting for a restart (e.g. for MVC3), even with the “suppressreboot” option. Using the really loooong product list I’d specified in one webpi command failed for a lot of the products. After rebooting, webpicmd didn’t automatically pick up and carry on as expected; this is why I’ve cut the initial webpi product list to a small number and done the others via chocolatey.

  • Webpicmdline doesn’t install things in the order you list them, which can be a bit odd; e.g., WindowsInstaller45 attempts to install after .Net 4 and promptly fails. Do it on its own and you’re fine.

  • Chocolatey’s webpi support didn’t initially work; I had to restart before I could install anything. I believe this to be related to the webpi installation of WindowsInstaller45 whose required reboot I had suppressed.

  • VS2010’s “/q /full” setup options are incredibly “q” – nothing appears at all; no command line feedback, no GUI. I had to fire off setup.exe without params just to see the GUI load and show me it was already halfway through the install process! Fantastic.

  • VS2010 exists within the WebPI listing as “VS2010SP1Core” but seems to always fail with an error about needing “VS2010SP1Prerequisite”; this product also exists in the same WebPI feed but was always failing to install via webpicmdline and chocolatey for me. Let me know if you get this working!

The Resulting Scripts

Setup_Step1.cmd

Ninite & WebPI

@echo off

REM Ninite stuff
cmd /C "Z:\Installation\SetupDevPC\Ninite_DevPC_Utils.exe"

REM WebPI stuff
cmd /C "Z:\Installation\SetupDevPC\webpicmdline.exe /AcceptEula /SuppressReboot /Products:PowerShell,PowerShell2,NETFramework20SP2,NETFramework35,NETFramework4"

cmd /C "Z:\Installation\SetupDevPC\webpicmdline.exe /AcceptEula /SuppressReboot /Products:WindowsInstaller31,WindowsInstaller45"

shutdown /r /t 0 /d P:0:0

Setup_Step2.ps1

Chocolatey
[powershell]# Chocolatey
iex ((new-object net.webclient).DownloadString('http://bit.ly/psChocInstall'))

# install applications
cinst virtualclonedrive
cinst sysinternals
cinst msysgit
cinst fiddler
cinst tortoisesvn

# getting the latest build for webpi support: git clone git://github.com/chocolatey/chocolatey.git | cd chocolatey | build | cd _{tab}| cinst chocolatey -source %cd%
# I’ve already done this and the resulting nugetpkg is also saved in the same network directory:
cinst chocolatey -source "Z:\Installation\SetupDevPC\"

# Now I’ve got choc I may as well use it to install a bunch of other stuff from WebPI;
# things that didn’t always work when I put them in the looong list of comma delimited installs
# IIS
cinst IIS7 -source webpi
cinst ASPNET -source webpi
cinst BasicAuthentication -source webpi
cinst DefaultDocument -source webpi
cinst DigestAuthentication -source webpi
cinst DirectoryBrowse -source webpi
cinst HTTPErrors -source webpi
cinst HTTPLogging -source webpi
cinst HTTPRedirection -source webpi
cinst IIS7_ExtensionLessURLs -source webpi
cinst IISManagementConsole -source webpi
cinst IPSecurity -source webpi
cinst ISAPIExtensions -source webpi
cinst ISAPIFilters -source webpi
cinst LoggingTools -source webpi
cinst MetabaseAndIIS6Compatibility -source webpi
cinst NETExtensibility -source webpi
cinst RequestFiltering -source webpi
cinst RequestMonitor -source webpi
cinst StaticContent -source webpi
cinst StaticContentCompression -source webpi
cinst Tracing -source webpi
cinst WindowsAuthentication -source webpi

shutdown /r /t 0 /d P:0:0
[/powershell]

Setup_Step3.cmd

VCDMount

@echo off

REM SQL Tools
"c:\Program Files (x86)\Elaborate Bytes\VirtualCloneDrive\vcdmount.exe" /l=E "Z:\Installation\SetupDevPC\SQLServer2008Tools.ISO"

E:/setup.exe /ACTION=install /IACCEPTSQLSERVERLICENSETERMS /Q /FEATURES=Tools,ADV_SSMS

REM VS2010
"c:\Program Files (x86)\Elaborate Bytes\VirtualCloneDrive\vcdmount.exe" /l=E "Z:\Installation\SetupDevPC\VS2010SP1.ISO"

E:/Setup/setup.exe /q /full

Hope you enjoyed the articles, any feedback is appreciated.