Archive for the 'SSW' Category

Calling Stored Procedures from Entity Framework 6 Code First

Just a quick solution to something I was banging my head against for a while.

If you use the EDMX/Database First way of creating your DbContext, you can import Stored Procedures to your context. The result is something like this:

public virtual ObjectResult<Customer_LoadStatus_Result> Customer_LoadStatus(Nullable<int> customerId)
{
    var customerIdParameter = customerId.HasValue ?
        new ObjectParameter("CustomerId", customerId) :
        new ObjectParameter("CustomerId", typeof(int));
 
    return ((IObjectContextAdapter)this).ObjectContext.ExecuteFunction<Customer_LoadStatus_Result>("Customer_LoadStatus", customerIdParameter);
}

If you try to copy this method to your Code First context, you’re likely to get an error saying something like, “The FunctionImport ‘Customer_LoadStatus’ could not be found in the container ‘CustomerDbContext’.” There’s probably a way to register it in the ModelBuilder, but 15 minutes of web searches came up with nothing (maybe my Google-Fu is weak?).

Long story short, it seems you can’t just copy this to your context if you’re using Code First. Instead, do this:

public virtual Customer_LoadStatus_Result Customer_LoadStatus(int? customerId)
{
    // Unlike ObjectParameter, SqlParameter can't take a Type to represent a missing value,
    // so pass DBNull.Value when no customerId is supplied
    var customerIdParameter = customerId.HasValue ?
        new SqlParameter("CustomerId", customerId.Value) :
        new SqlParameter("CustomerId", DBNull.Value);

    return this.Database.SqlQuery<Customer_LoadStatus_Result>("Customer_LoadStatus @CustomerId",
        customerIdParameter).SingleOrDefault();
}

As a bonus, in my example I’m just returning the object I expect rather than the ObjectResult wrapper you’d get from the generated code.
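The same approach works when the stored procedure returns multiple rows; just materialise the query with ToList() instead of SingleOrDefault(). Here’s a rough sketch (the Customer_LoadHistory procedure and its result class are made up for illustration):

public virtual List<Customer_LoadHistory_Result> Customer_LoadHistory(int customerId)
{
    var customerIdParameter = new SqlParameter("CustomerId", customerId);

    // SqlQuery maps each row of the result set onto the result class by column name
    return this.Database.SqlQuery<Customer_LoadHistory_Result>("Customer_LoadHistory @CustomerId",
        customerIdParameter).ToList();
}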

Release Management for the Enterprise (with videos!)

Team Foundation Server has been a great tool for managing your development projects for some time. But while it excels at source control and work item management, it has always lacked a bit when it comes to managing the deployment cycle of your application.

Too often, deployments rely on Frank from the Infrastructure team studiously following a Word document to make sure every step has been checked off. It’s error-prone to say the least.

There are two great tools that are changing that. Both give you highly repeatable and automated deployments, reducing the risk of an outage.

The first is a new tool from Microsoft called Release Management for Visual Studio 2013. Yes, that’s a long name. The product has been around for a while under a different name (InRelease), but was recently acquired by Microsoft to improve this area of their offering. It’s closely integrated with Team Foundation Server and gives you the ability to control your releases and know exactly who did what. It’s great for enterprises.

If you’re lucky enough to be at Tech Ed New Zealand this year, I’ll be speaking about it on Wednesday afternoon. I’m yet to hear about Tech Ed Australia, but I’ll let you know when I know :).

The other tool is one I’ve been using for a while called Octopus Deploy. It gives many of the same benefits as Release Management, but isn’t as closely tied to the Microsoft infrastructure. That means you can use it to deploy nearly anything from nearly any source (not just TFS).

I’ve spoken about Octopus Deploy a couple of times in the past few months, so rather than go through the features on this blog, I’ll just show you the videos.

For a quick introduction, watch my conversation with another SSW developer and MVP, Danijel Malik, about Octopus Deploy:

I also presented at DDD Melbourne recently about Octopus Deploy, and why it’s a great tool to stop you “deploying like an idiot”:

I hope you enjoy the videos, and I’m looking forward to talking about the other great tool, Release Management, at Tech Ed NZ.

TFS Maintenance – cleaning up after yourself

It’s really important to have a TFS Master as part of your team.  As the TFS Master at SSW (amongst other things), I’m constantly looking out for best practices, improvements we can make, and problems we can avoid. In particular, we like to keep our Team Foundation Server nice and clean, making sure the devs have no abandoned changes or workspaces.

Periodically, I’ll send out an email to anyone who has files they haven’t checked in for a while as per our “Do you have a report to see who has not checked in?” rule. What I often hear back is that the files have been checked out in a workspace that no longer exists. In other words, the files were edited but not checked in, and then the computer was rebuilt or replaced.

So what can you do?

The best option is to remove any workspaces you no longer have access to. There are two ways to do this:

  1. Use the awesome Team Foundation Sidekicks product
  2. Use the Workspaces tool in Visual Studio

 

Option 1 – Using Team Foundation Sidekicks

  1. Download Team Foundation Sidekicks 5.0
  2. Connect to your Team Foundation Server
  3. Use the Workspaces Sidekick to find and remove any workspaces you no longer use

Using Team Foundation Sidekicks to remove Workspaces

 

Option 2 – Using the Workspaces tool in Visual Studio

  1. In Visual Studio, go to File | Source Control | Advanced | Workspaces…
  2. Check the Show remote workspaces checkbox to see workspaces on every computer
  3. Select the workspaces you no longer use and click Remove

Opening TFS Workspaces in Visual Studio

The Workspaces Dialog in Visual Studio
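If you’d rather script the cleanup (for example, to review workspaces across the whole team), here’s a rough sketch using the TFS client object model. It assumes references to Microsoft.TeamFoundation.Client and Microsoft.TeamFoundation.VersionControl.Client, and the collection URL is just a placeholder:

using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.VersionControl.Client;

class WorkspaceCleanup
{
    static void Main()
    {
        // Placeholder URL - point this at your own collection
        var collection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(
            new Uri("http://tfs.example.com:8080/tfs/DefaultCollection"));
        var versionControl = collection.GetService<VersionControlServer>();

        // Find every workspace owned by the current user, on any computer
        foreach (var workspace in versionControl.QueryWorkspaces(null, versionControl.AuthorizedUser, null))
        {
            Console.WriteLine("{0} on {1}", workspace.Name, workspace.Computer);

            // Uncomment to delete a workspace you're sure you no longer need:
            // workspace.Delete();
        }
    }
}

Bear in mind that deleting a workspace also throws away any pending changes in it, so check the list carefully before uncommenting that last call.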

 

Clearing out these abandoned workspaces will let the TFS server know you aren’t using those workspaces any more. Best of all, your TFS Master will be less likely to send out angry emails!

I’ve been awarded a Microsoft MVP in Visual Studio ALM!

A few days ago I received an email from Microsoft informing me that I’d been awarded an MVP for Visual Studio ALM.

While my participation in the dev community has been purely for the enjoyment and love of what I do, this has been a career goal of mine and something I’ve been working towards for a few years.  I’m super excited and humbled to have been awarded alongside many of the best guys and girls in the field.

For those unfamiliar with what this award means, the best information can be found at http://mvp.microsoft.com, but here’s a quick quote from the site:

The Microsoft Most Valuable Professional (MVP) Award is our way of saying thank you to exceptional, independent community leaders who share their passion, technical expertise, and real-world knowledge of Microsoft products with others. … Today, there are more than 4,000 MVPs worldwide. They represent more than 90 countries, speak over 40 languages, answer more than 10 million questions a year, and are awarded in almost 90 Microsoft technologies—reflecting the breadth of our products and our global communities.

Thanks must go to the awesome guys and girls I work with at SSW for their support, as well as to the many enthusiastic developers I interact with as an organiser of the Brisbane .Net User Group and DDD Brisbane.

Here’s to a great 2014!

 

Enterprise MVC Apps course – redux

Last year, Adam Stephensen and I ran a number of “Enterprise MVC” courses.  These courses were hugely popular, not only selling out in Brisbane, Melbourne, and Sydney, but also receiving positive feedback from each session nationwide.

Well, due to popular demand, they’re back, but in a slightly different format – it’s gone from a two-part course to four relatively independent days.  Day 1 will be useful if you’re new to MVC, while the other three days will take your skills to the next level by teaching you how to build enterprise-quality MVC applications.

There are a few reasons for the change, all of them based on feedback from participants.

The volume of important content was the primary reason for the change.  Adam and I really wanted to spend more time on important topics like Inversion of Control, Dependency Injection, logging, and testing.  With a 4 day course, we can really give these topics the attention they deserve.

We also think that getting your hands dirty is really important for understanding a lot of concepts.  By extending the course to 4 days, we can ensure there’s plenty of time for participants to try things rather than just hearing about them.

Finally, we found that attendees came to the course with differing levels of MVC experience.  For some, the introductory subjects were invaluable, but not for everyone.  By adding an extra day for MVC Foundations, we can confidently start at a higher level for the subsequent days.

If you’re writing MVC applications (or want to start) and you want to go beyond the demos so you can write enterprise-quality, maintainable, scalable, and robust applications in MVC, I’d highly recommend attending our new and improved courses.

For more details:

P.S. Telerik has once again jumped on board, offering licenses for some of their awesome products when you register.  Check out the course pages for more info!

Upgrading TFS 2010 to TFS 2012 RC – Done

Over the weekend, we decided to take the plunge and upgrade the SSW TFS 2010 instance to the shiny new TFS 2012 release candidate.

Update: for the latest rules for migration, see http://rules.ssw.com.au/TFS/RulesToBetterTFS2012Migration/Pages/default.aspx

From Adam Cogan‘s Blog:

Initially, there was some debate as to whether we should upgrade what is the busiest server in our environment.  One call to Grant Holliday made all the difference and we quickly reached a consensus.  The upgrade was going ahead and Damian Brady was the guy in the hot seat!

There are three main upgrade paths:

  1. Upgrade from a Basic or Express version of TFS
  2. Upgrade in place
  3. Upgrade and move to new hardware (recommended)

We chose option 3 for a few reasons.  We used the opportunity to move to a SQL Server 2012 instance and change our hardware configuration a bit, but more importantly, option 3 gave us the simplest rollback path: simply turn off the new server and bring the old one back up, then change any DNS settings back.

It was a very straightforward process (with a few coffee breaks):

  • Before you start – getting the VMs ready
  • Preparation steps – stopping the servers and backing up
  • Installing and upgrading TFS 2012
  • Configuring Reporting Services

The initial feedback from the test developers is extremely positive. They can’t wait to start using it!

I want to mention a couple of great resources.  There was a series of blog posts from Tim Elhajj, and of course we could draw on our Rules to Better TFS 2010 Migration and the related blog posts (first, second, third) from Martin Hinshelwood for ideas.

Well done to the TFS team; they’ve really done a great job with this version.

SSW may have a claim to fame here – we might be the first company to deploy TFS 2012 RC to a live environment.  I haven’t seen any blog posts about other companies migrating their live environments, and I’m a TFS MVP – nobody on that list has posted about their migration either.  I’m expecting the punctilious Martin Hinshelwood to send us a nice scotch whisky!

The below is a bit of an epic, but if you’re keen to see how we did it, read on!

More Information

Our existing environment was:

  • Windows 2008 server: TFS 2010, SQL Server 2008
  • Windows 2008 server: SharePoint holding our projects

The new environment is:

  • Windows 2008 R2 server: TFS 2012 RC and a local SharePoint instance
  • Windows 2008 R2 server: SQL Server 2012

Before you start – getting the VMs ready

Thankfully, one of our sysadmins – Daragh – had already set up the servers I’d need, so we had an existing Windows 2008 R2 Server with SQL Server 2012, and a brand new Windows 2008 R2 Server for TFS.  Daragh had also already installed (but not configured) SQL Server Reporting Services and Analysis Services on the TFS box.  TFS no longer supports 32-bit, so it was important we were running 64-bit servers.

Make sure you have administrator access to all the machines you’re going to touch.  It makes it much easier if you’re a domain administrator (I got promoted for the weekend!).

Preparation Steps – stopping the servers and backing up

  1. The first step is obviously to let everyone know that TFS will be offline.
    Have a look at the Rules to Better Networks for guidance on an outage email.
  2. Take the TFS 2010 server offline.
    This is important because we’ll be upgrading from a backup, so any checkins that occur during the process won’t come across!

    1. Use TFSServiceControl quiesce to stop all the agents
    2. Stop the relevant network services
  3. Just to be safe, I also stopped the TFS Application Pools in IIS
  4. Verify that you can’t connect to the TFS server by trying to connect in Visual Studio
  5. At this stage, it’s a good idea to run the awesome DogFood Stats queries from Grant Holliday to get some statistics on your current projects.  Make sure you document these for comparison later.
  6. Backup the TFS and Reporting Server databases.
    We had quite a few relevant databases because there were a few project collections.
  7.  Back up the Reporting Services Encryption Keys.
    We’ll need to use these later to configure our new Reporting Services against the restored databases.
    To back up the encryption key, follow the steps on the MSDN page.
    Note: Make sure you remember the password you use to protect it with – you’ll need it later!
  8. Configure SQL Server Analysis Services to recover on failure.
    On your new TFS Server box:

    1. Go to Start | Administrative Tools | Services
    2. Right-click  SQL Server Analysis Services and choose Properties
    3. Go to the Recovery tab and choose “Restart the Service” for each of the failures.
  9. Move all my database backups  to the new SQL Server 2012 instance.
    This was as simple as copying each of the database backups to the new SQL 2012 server.

Installing and upgrading TFS 2012

  1. Because we’re installing from an ISO rather than a physical CD, we mount the ISO using SlySoft’s Virtual CloneDrive (or your favourite alternative).
  2. Run tfs_server.exe from the installation disk or ISO.
  3. Accept the license terms and click Install Now (because this is an admin operation, you may be given a UAC prompt – obviously, you can accept this)
  4. Watch the progress with excitement and trepidation (and coffee)

    You may get a message asking you to reboot the machine.  If it does appear… restart the machine.

    This made me a little nervous because we were using a virtual machine and a mounted ISO, but everything came back up perfectly and continued from where it left off.
  5. After installing, the Configuration Center will be shown.
    We want to Upgrade from an existing server
  6. We need to restore our TFS databases, so we’ll use the Database Restore Tool presented on the next screen.
  7. Because we’re not using a local database, we’ll change the target server instance so we can connect to our SQL 2012 server.
  8. In the Restore SQL Database screen, choose the folder where you saved your backups.  Note that this will be on your SQL Server if that’s where you put them.
    The tool will find all TFS databases automatically.
    Note: In our case, we want to restore our Reporting Server databases too, so make sure they’re ticked.

    Ensure this is correct, then click Restore.  The tool will restore each of the databases in turn.  When finished, click Close.
  9. The next step asks you to specify a configuration database to upgrade.
    First, we choose our SQL Server 2012 instance, then click List Available Databases
    The tool should find our Tfs_Configuration database automatically.
    We’ll confirm we have a backup, then click Next.
  10. We’re next asked for the service account for our Application Tier
    Network Service is fine for this, so just click Next.
  11. Next up, we’re asked if we want to configure reporting services.  Make sure the checkbox is checked, then click Next.
  12. Make sure the instance and the URLs are correct, and click Next.
    We have a local instance and we kept all the default URLs, so there was nothing to change.
  13. Next up is the TFS Warehouse database
    Because our SQL Server is in a different location, we had to change the instance.  To find the correct database, I tested the connection then listed all available databases. After that, the wizard found the Tfs_Warehouse database automatically.  If everything is correct, click Next.
  14. The Analysis Services are set up on the local TFS machine as well, so nothing needs to change on this screen. It’s a good idea to test using the Test link, however.
    If all is well, click Next.
  15. Next, we’re asked to provide an account that the Reports will run as.  We have a TFSSERVICE account for this specific purpose, so I supplied the credentials and tested successfully, then clicked Next.
  16. Next we’re asked if we want to configure SharePoint for TFS. We do, so we’ll make sure the checkbox is checked and click Next.
  17. We want to use our existing SharePoint (installed with TFS), so we’ll leave the default settings and click Next.
  18. On the next screen, we get a summary of all the options we’ve chosen.
    Go through them carefully to make sure everything is as expected, and click Verify to check.

    Note: At this point I got a System Verification error telling me I’d need to reboot before continuing.
    After rebooting… I had to start the wizard again!
    But I persisted.
    I had to reboot twice to get past this error, but after that it all went smoothly.
  19. Now we’ve got all green ticks, we can click Configure.

    Configuration will take a little while, but when it’s finished, make sure you get all green ticks then click Next.
  20. Now each of the Project Collections will be updated.  This includes a lot of stuff, so it may take some time.
    Grab a coffee. Maybe go for a jog.

    Once you get a big green tick, click Next.
    Note: The Lab management warning told us it couldn’t tear down any labs.  As we didn’t have any set up, this wasn’t an issue.
  21. At this point, you should run Grant Holliday’s DogFoodStats scripts over the new databases to make sure there are no unexpected numbers.
    Don’t be concerned if the numbers aren’t identical; what you’re looking for is big discrepancies (for example zero checkins).
    All of our stats were either identical to the previous value, or were extremely close.

Configuring Reporting Services

  1. Finally, let’s configure our Reporting Services to work with our restored Reporting Server databases.
    On the TFS server, Go to Start | All Programs | Microsoft SQL Server 2012 | Configuration Tools | Reporting Services Configuration Manager
  2. We’ll change the Server Name to our SQL 2012 instance, click Find, then Connect
  3. Navigate to Database on the left-hand menu, then choose Change Database

  4. Select “Choose an existing report server database” then click Next
  5. Make sure we’re pointing at the correct SQL Server instance, and that testing the connection succeeds, then click Next

  6. Choose the Report Server database you restored in step 8, then click Next
  7. Leave the Credentials page with default settings, then click Next
  8. Review the information on the Summary page, then click Next
    When done, click Finish to complete the process
  9. Now we’ll restore our saved encryption key.
    Navigate to the Encryption Keys section, and choose Restore

  10. Locate the encryption key you backed up earlier, enter the password, then click OK.
    Make sure you get a green tick in the Results section
  11. Click on the Web Service URL section and Apply the default settings

    Make sure the Results section shows green ticks.

    Note: If you’re unable to click Apply, you’ve probably already completed this step. Just skip this step.
  12. Click on the Report Manager URL and Apply the default settings here as well.
    Again, check you get green ticks.
    Note: If you’re unable to click Apply, you’ve probably already completed this step. Just skip this step.
  13. Try out your new URLs by clicking on the hyperlinks in the Web Service URL and Report Manager URL sections.
    If all is well, you can Exit from the Reporting Services Configuration Manager.

    Aside: I had an issue here where, when I navigated to the ReportServer URL, I got an error saying “The report server installation is not initialized”.
    My solution was:
  1. Navigate back to the Encryption Keys section, Delete all Encrypted content, then restore the Encryption key again.
  2. Navigate to the Web Service URL, make a change to the URL (and change it back), then reapply the settings.
  3. Do the same to the Report Manager URL.

 After that it started working.

Final Steps

A couple more steps were required before we were up and running.  I worked with our trusty SysAdmin Daniel Hyles to get these items sorted out.

  1. Make a DNS change so tfs.ssw.com.au now points to this server’s IP address.
  2. Make changes to the firewall to allow traffic through to this server.
  3. In our environment, we had TFS listening on a few different ports (including 443 for SSL).  This meant a bit more configuration in IIS, and some installation of certificates.

Testing

After the final steps were completed, testing went as smoothly as you could imagine.  Visual Studio accepted the new server without a hitch, and all our test developers could Get Latest and Check In without an issue.

To Do

There are a few loose ends that we hope to clear up over the next few days.

  1. In TFS Web Access, it appears that there are no members for any of the projects.  Obviously this isn’t causing an issue, and permission to access the Projects and Project Collections must have come across with the data migration (we made sure it wasn’t a free-for-all), but it seems a bit unusual.
  2. We still need to finish setting up the SharePoint instance that will store all our project documents.
  3. Our Build Servers have not yet been upgraded.  This should be a fairly quick task, however until it’s done, none of our builds will work.
    In particular, this affects our gated checkins.  Because they make sure the project can be built before  a checkin is accepted, they’re stopping us from checking in!
    We’ve turned these off for the moment.

Summary

The upgrade process was fairly straightforward, however there are a few suggestions I’ll give you.

If you’re doing it, I highly recommend you have a trusty SysAdmin involved for the final few steps.  A person like Daniel comes into his own on these occasions and knows exactly what changes to make and how to make them efficiently.  You don’t want a programmer doing that work if you can help it.

One thing we noted was that the URLs that were used to access the old TFS server were carried over in the configuration during the upgrade.  While this was fine for us as we were changing DNS entries to keep everything consistent, I can imagine a common final step would be to change these URLs.  It would have been nice for the wizard to prompt us for settings like that.

The feedback has been given to Brian Harry‘s TFS team so they can add some spit and polish for the RTM version.