Wow, what a title, but it does explain what this post covers: debugging a full .NET Framework application (not .NET Core) running in a Windows Server Core container (not Linux), from the comfort of a Windows 10 Pro machine running Visual Studio 2015 and Docker for Windows.

Running the remote tools on a machine and attaching is a pretty straightforward task, but there are a few hurdles in the way when doing it with a Windows Server Core 2016 container. Once it is scripted, though, it is pretty painless.

Firstly, we need a decent Docker image to start from. Luckily, the team at Microsoft have created the microsoft/aspnet image, so let's build up our dockerfile:

FROM microsoft/aspnet
RUN mkdir C:\site

This gets the base image and creates the directory that we are going to use for our web files.

RUN powershell -NoProfile -Command \
    Import-module IISAdministration; \
    New-IISSite -Name "Site" -PhysicalPath C:\site -BindingInformation "*:8000:"

This creates the web site in IIS, binding it to port 8000 and pointing it at the path to our web files.

RUN powershell -NoProfile -Command \
    Install-WindowsFeature -Name Web-Mgmt-Service

RUN powershell -NoProfile -Command \
    New-ItemProperty -Path HKLM:\SOFTWARE\Microsoft\WebManagement\Server -Name EnableRemoteManagement -Value 1 -Force

RUN powershell -NoProfile -Command \
    Get-Service -Name WMSVC

RUN powershell -NoProfile -Command \
    Set-Service WMSVC -startuptype "Automatic"
   
RUN powershell -NoProfile -Command \
    Start-Service -Name WMSVC

RUN powershell -NoProfile -Command \
    net user iisadmin P2ssw0rd /add

RUN powershell -NoProfile -Command \
    net localgroup administrators iisadmin /add

This section is completely optional and all it does is install the web management tools for remote administration. You may want to tweak the container settings such as application pool and start and stop it from the GUI on your Windows 10 machine. If you are not bothered about IIS settings, then you can omit this section.

RUN powershell -NoProfile -Command \
$acl = Get-Acl -Path "C:\inetpub\wwwroot";Set-Acl -Path "C:\site" -AclObject $acl;

This copies the directory permissions set up for the wwwroot directory and assigns them to the site directory, so we don't end up with nasty permission-denied issues.

EXPOSE 8000 8172 4020 4021

We need to expose 8000 for our web application and 8172 for remote IIS administration, while 4020 and 4021 are the ports we need open for the remote debugging tools. The Visual Studio 2015 remote tools use these ports; other versions use different ports, and as yet I cannot find the ports for Visual Studio 2017.

Next, open a command prompt in the directory containing our dockerfile and run docker build:

docker build -t iis-site .

The first build will take a while to run through, but future builds will be quicker as you will already have the layers that make up the final image.

Now we have our image, let's run with it:-

docker run -d -p 94:94 -v "C:\Users\username\Documents\Visual Studio 2015\Projects\WebApplication4\WebApplication4":c:\site --name web1 iis-site

This creates a container, based on our previous iis-site image, with the name web1, and maps a directory on our Windows 10 machine to the site directory we created in the dockerfile.

Let's test this by finding the IP address:-

docker inspect -f "{{ .NetworkSettings.Networks.nat.IPAddress }}" web1

This gives me 172.19.184.110; yours will be different as each container gets its own address. We know we are on port 8000, so open up a browser and browse to your home page:-


Great, now let's get debugging. In a future release of Visual Studio 2017 it will be possible to debug straight into a Windows container, but at the moment this is limited to Linux containers, so we need to do a bit more configuration.

We need to install the Remote Tools for Visual Studio 2015, so download the installer from Microsoft.

Once downloaded, put the .exe in the same directory as your web application on your machine as that is already visible to the container. There are ways to automate the download using Invoke-WebRequest, but I had several issues with resolving the url so it was easier to do it myself.

Create a .bat file with the following and copy that also into the web application directory on your machine:-

rtools_setup_x64.exe /install /quiet

Get the id of the container by using the command:-

docker ps

With the id, lets open an interactive cmd prompt:-

docker exec -it 03a8eb766d5a cmd

Where 03a8eb766d5a is my container identifier.

Run the bat file:-

install-tools.bat

This will take a couple of minutes to run through, but when it is finished you can cd to the C:\Program Files\Microsoft Visual Studio 14.0\Remote Tools directory to take a look.

Exit out of the interactive session and get back to docker and run:-

docker exec -it 03a8eb766d5a "C:\Program Files\Microsoft Visual Studio 14.0\Common7\IDE\Remote Debugger\x64\msvsmon.exe" /nostatus /silent /noauth /anyuser /nosecuritywarn

This will start the remote debugging service, msvsmon, and your cmd prompt will just sit quietly with no cursor.

So now open up the web application in Visual Studio and go to Debug >> Attach to process

debug-1

Click on the Find button and it should find your container, with its ID and IP address showing; simply select it.

debug-2

debug-3

Now we need to make sure we choose Managed (v4.6, v4.5, v4.0) code, or whatever version of the framework you are using. Also make sure 'Show processes from all users' is checked.

Scroll down, choose w3wp and attach. Accept the security warning and then start debugging as you would normally.

debug-4

Ideally the remote tools would be downloaded as part of the build, and msvsmon started automatically on pressing F5 in Visual Studio. If you are happy with your container, you can save it as an image using the command:-

docker commit 03a8eb766d5a iis-debug-tools

Just remember to stop it first.

Happy coding.

Up until late last year I was experimenting with Docker containers on Windows 10 and Server 2016 using the experimental version of Docker. Well, things took a turn for the worse and basically Docker refused to run and kept crashing. Life being as it is, I went on to other things and have only just come back to Docker. Having installed the stable release version (17.03.0-ce-win1 (10296)), I get this when switching from Linux containers to Windows:-

Docker daemon failed with message:
time="2017-03-02T17:10:44Z" level=warning msg="Running experimental build"
time="2017-03-02T17:10:44.990632000Z" level=debug msg="Listener created for HTTP on npipe (//./pipe/docker_engine_windows)"
time="2017-03-02T17:10:44.995133700Z" level=info msg="Windows default isolation mode: hyperv"
time="2017-03-02T17:10:44.995133700Z" level=debug msg="Using default logging driver json-file"
time="2017-03-02T17:10:44.995632900Z" level=debug msg="WindowsGraphDriver InitFilter at C:\\ProgramData\\Docker\\windowsfilter"
time="2017-03-02T17:10:44.995632900Z" level=debug msg="Using graph driver windowsfilter"
time="2017-03-02T17:10:44.995632900Z" level=debug msg="Max Concurrent Downloads: 3"
time="2017-03-02T17:10:44.995632900Z" level=debug msg="Max Concurrent Uploads: 5"
time="2017-03-02T17:10:45.018215100Z" level=info msg="Graph migration to content-addressability took 0.00 seconds"

….LOTS MORE OF THE MESSAGE HERE…

time="2017-03-02T17:10:45.551683600Z" level=debug msg="start clean shutdown of all containers with a 15 seconds timeout..."
Error starting daemon: Error initializing network controller: Error creating default network: HNS failed with error : The parameter is incorrect.


   at Docker.Backend.ContainerEngine.Windows.DoStart(Settings settings)
   at Docker.Backend.ContainerEngine.Windows.Restart(Settings settings)
   at Docker.Core.Pipe.NamedPipeServer.<>c__DisplayClass8_0.<Register>b__0(Object[] parameters)
   at Docker.Core.Pipe.NamedPipeServer.RunAction(String action, Object[] parameters)

To fix this I stopped the MobyLinux VM that was running via Hyper-V Manager and did a factory reset on the Docker daemon.


It did the reset and now runs Windows containers, so if anybody has the same issue, try replicating these steps.

Happy coding

WebUp is a new website monitoring tool that can also be configured for other endpoints such as web API URLs. It is live in the Windows Store as a UWP app, which means it can run on mobile, desktop, tablet and Xbox.


Why UWP?

The Microsoft ecosystem for app development, although way behind both iOS and Android on the mobile side, has huge potential as it includes everybody who runs Windows 10, no matter what device. Here are some of the reasons I chose to publish on the Windows Store.

Auto updating

Any change I make to the app will, once through certification, auto-update on the end user's machine, leaving them with an always up-to-date version with no user input. Gone are the days when, for a new version of an application, you had to go to the site, download it and then install it.

No credit card information collected

WebUp is free for a trial of 7 days, but some of the features are locked out until upgraded to the full version. This payment process is entirely handled by Microsoft and although there are payment methods out there such as Stripe, as a developer I don’t need to lift a finger, just specify the price.

Runs across multiple devices

UWP apps are compiled once, but each architecture is included which means once a package is uploaded to the Windows Store, it is available for any Windows 10 device. WebUp gives a great experience on a tablet sized machine, but some users are running it on an Xbox with a massive display and using it as an information radiator. The mobile version is great when you need to whip out your phone and just check the status of your web sites.

User can multi install

The Windows Store allows a single user to install the app on up to 10 devices giving them the freedom to run it in multiple locations on different devices. WebUp can be installed on Xbox for an information radiator but you can also have it running elsewhere in an organisation with a different subset of http end points showing. A manager can also have it on their mobile device connected to the local Wi-Fi giving them satisfaction that everything is running fine.

Don’t need to worry about account management

With more traditional web-based monitoring solutions, usernames and passwords need to be configured to log into the service. With a UWP Windows app, the installation is linked to the Microsoft account of the user, so no more little yellow notes with credentials stuck to the monitor.

Can be installed inside enterprise and behind corporate firewall

As the app is installed to the device and not hosted by a third party such as SaaS providers, no holes need to be punched through the firewall if all your http end points are internal.

So those are some of the reasons I opted for UWP. It is early days at the moment, but I am already getting some traction and have other features in mind for the app.

Entity Framework code first migrations are really handy, but can be dangerous in the wrong hands as they make changes to the underlying database of your application.

Before the days when a single check-in would run through the CI/CD pipeline, it was normal during the development stage to make constant changes to a database structure as and when the features requested needed it. With the new model of all changes running through the CI/CD pipeline, database changes can be pushed to a test or staging server in no time at all. With this way of working, a few precautions have to be taken to make sure EF migrations don't wreak havoc on the database behind the staging or test servers.

Migrations in Entity Framework

A migration is a change to the schema of a database based on any model changes that have been created in the data layer of the application. If for example there is a User class and a new property is added for membership level, then with migrations enabled, a new column will be created in the database to match the new property.
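As a sketch of that example (the class and property names here are illustrative, not from a real project), the model change that triggers a migration looks like this:

```csharp
public class User
{
    public int Id { get; set; }
    public string Name { get; set; }

    // Newly added property: with migrations enabled, EF will detect this
    // and generate a migration adding a matching column to the Users table.
    public string MembershipLevel { get; set; }
}
```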

But for a database already in existence, and maybe even already in production, you don't want changes to the schema to go directly through to production. So there are a few steps to take; these are:-

  1. Create a copy of the existing database on the local machine
  2. Reverse engineer the database to classes that can be used for Code First migrations
  3. Enable migrations
  4. Create a manual empty migration to tell EF that both model and database are in sync
  5. Disable migrations for all but local changes
  6. Generate a migration script for the dba to implement

Create a copy of the existing database on the local machine

I will leave this step up to you, as some developers prefer to restore a database backup locally, others script the entire database and data and run the script locally, or you may even have a database on another server that you can use.

Reverse engineer the database to classes that can be used for Code First migrations

By using the Entity Framework Reverse POCO Generator Visual Studio plugin, it is possible to point it at the database and generate the DbContext and all the necessary POCO classes.

Right click the project in Visual Studio and choose Add > New Item… from the context menu; you will then get a choice that includes 'EntityFramework Reverse POCO Code First Generator' in the template list.

reverse-engineer-db

By default this will create a context with the name MyDbContext.

Simply change the ConnectionStringName to match the one in your app.config or web.config, then save the file and it will generate the necessary classes in a .cs file under the T4 template.


As this tool uses a T4 template to generate the classes, it is easily configurable, for example to change the context class name.

Enable migrations

To enable migrations, simply go to the 'Package Manager Console' view in Visual Studio. It can be found either by typing in the Quick Launch text box at the top right, or via View > Other Windows > Package Manager Console. In the default project drop-down make sure the correct project is selected, then type:-

Enable-Migrations

This will create a ‘Migrations’ directory within the project which will have a Configuration.cs file inside.

This file houses the main configuration class for the migrations, and it is in here that you can change whether automatic migrations are enabled and whether data loss is acceptable. On a development machine that is probably OK, but you don't want those settings passed through to staging or production, so use a pre-processor directive such as this
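A sketch of what that can look like (assuming the context class is named MyDbContext, as generated earlier):

```csharp
using System.Data.Entity.Migrations;

internal sealed class Configuration : DbMigrationsConfiguration<MyDbContext>
{
    public Configuration()
    {
#if DEBUG
        // Local development only: let EF apply schema changes automatically.
        AutomaticMigrationsEnabled = true;
        AutomaticMigrationDataLossAllowed = true;
#else
        // Any other build configuration: no automatic schema changes.
        AutomaticMigrationsEnabled = false;
#endif
    }
}
```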

In this class, you will notice that it inherits from DbMigrationsConfiguration, typed on the DbContext class we created when we reverse engineered the database. It is this class that is responsible for the model and for keeping track of the migrations.

Going back to the generated context class we need to add references to System.Data.Entity to get access to the functionality to drop and recreate the database and pluralize names. This can be added to the OnModelCreating method like this:-
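A sketch of the idea, again assuming the generated context is called MyDbContext (the method itself already exists in the generated file; the initializer lines are the addition):

```csharp
// Inside the generated context class; requires using System.Data.Entity;
// and using System.Data.Entity.ModelConfiguration.Conventions;
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
#if DEBUG
    // Local only: drop and recreate the database when the model changes.
    Database.SetInitializer(new DropCreateDatabaseIfModelChanges<MyDbContext>());
#else
    // Everywhere else: a null initializer means the schema is never touched.
    Database.SetInitializer<MyDbContext>(null);
#endif
    modelBuilder.Conventions.Remove<PluralizingTableNameConvention>();
}
```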

Setting the initializer to null stops any changes being made to the schema, whereas setting it to DropCreateDatabaseIfModelChanges will do just that.

Create a manual empty migration to tell EF that both model and database are in sync

Now that you have a model that matches the database schema, you can manually add a migration using this command:-

Add-Migration -IgnoreChanges


By giving it a name it will be easier to identify later on in the migrations collection, and this will also generate a class with the Up and Down methods that hold the logic for moving the schema between versions. As this is the initial synced migration, these should be left empty.


Disable migrations for all but local changes

Now that the switches are in place, when the code goes through the CI/CD pipeline, just build it with any configuration other than Debug. For example, in TeamCity I have an MSBuild build step with the parameter /p:Configuration=Release

 

Generate a migration script for the dba to implement

Let's face it, nobody wants to feel the wrath of a DBA after making changes to the production database without first notifying them. There should even be a proper change management process to follow before any changes are pushed to production. This is where the script generation feature of Entity Framework comes in handy.

By using the command:-

Update-Database -Script

You can generate a T-SQL script that shows all the changes needed to bring the database up to date with the model; this script can then be run by the DBA on the staging or production servers. Here I have added a simple class called TestTable in the generated .cs file.
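The added class looks something like this (a minimal illustrative POCO; the property names are assumptions, plus a matching DbSet on the context so EF picks it up):

```csharp
// Added alongside the other POCO classes in the generated .cs file.
public class TestTable
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// And on the context class:
// public DbSet<TestTable> TestTables { get; set; }
```

Running Add-Migration followed by Update-Database -Script then emits the corresponding CREATE TABLE statement for the DBA.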

 

 


Happy coding.

Through the life of this blog in its many guises I have used Blogger, Community Server and WordPress. Now comes the time to move again and I have opted to use MiniBlog; a project created by Mads Kristensen of Visual Studio Web Essentials fame.

About MiniBlog

MiniBlog is written in ASP.NET and C#, and uses the website template in Visual Studio. It has a simple architecture and persists the blog posts in XML format as physical files on the web server drive. There is much more to this platform though, such as:-

  • Windows Live Writer support
  • RSS and ATOM feeds
  • Support for robots.txt and sitemap.xml
  • Much much more…

Why move?

Although I have had a great experience using WordPress over the past few years, I have become more aware of the bloat that is downloaded to the user's browser in the form of JavaScript and CSS. As a developer who strives to optimise web pages for a better UX, this didn't sit right with me.

old-js

This is the old site's JavaScript, showing over kb of data downloaded to the user's browser.

old-css

The CSS is also loaded with the many styles from various plugins used in WordPress.

WordPress is written in PHP and my experience is in .NET; the back end is also MySQL, again a technology I am not 100% experienced with. So to optimise the blog to a level I was happy with would mean either rewriting the entire theme and persistence layer, or moving to a technology stack I have more control over. MiniBlog gives me this. I can create the style and layout I want with Razor and CSS, and tweak the caching layer in the C# code. I can also get gulp up and running with tasks to concatenate and minify the CSS and JavaScript files. And, like other projects I am working on, I can easily create a workflow in TeamCity to build and deploy.

What I had to do

Firstly I had to export all my blog posts from WordPress into a format that MiniBlog can understand. Again, thanks to Mads Kristensen, I could use the MiniBlogFormatter tool to get my posts into the correct format; this tool outputs each post to a separate XML file. Then, going through the current blog, I picked out the posts I wanted to keep, those covering topics and technology that are still current. Then I created a temporary website in IIS to test these posts. It soon became apparent that the directory structure was different, as MiniBlog uses a post directory to store files. I didn't want to lose any links to popular posts, especially those that are referred to from other sites, so I created a URL rewrite rule to redirect users coming in on the old URLs to the new structure.
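As a sketch, the rewrite rule in web.config looked something along these lines (the rule name and the URL pattern are illustrative, assuming old WordPress-style URLs of the form /yyyy/mm/slug):

```xml
<system.webServer>
  <rewrite>
    <rules>
      <rule name="OldPostUrls" stopProcessing="true">
        <!-- Match the old date-based permalinks and capture the slug. -->
        <match url="^\d{4}/\d{2}/(.+)$" />
        <!-- Permanently redirect to MiniBlog's /post/ structure. -->
        <action type="Redirect" url="/post/{R:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

A permanent (301) redirect also tells search engines to update their indexes to the new URLs.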

Optimization

As I have mentioned in a past post, concatenation and minification of JavaScript and CSS can be achieved using gulp. Once this was done, the number of files downloaded was much lower than before.
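For reference, a minimal gulpfile sketch of those tasks (assuming the gulp-concat, gulp-uglify and gulp-clean-css plugins; the file paths are illustrative):

```javascript
const gulp = require('gulp');
const concat = require('gulp-concat');
const uglify = require('gulp-uglify');
const cleanCss = require('gulp-clean-css');

// Concatenate all script files into one bundle and minify it.
gulp.task('scripts', () =>
  gulp.src('js/*.js')
    .pipe(concat('site.min.js'))
    .pipe(uglify())
    .pipe(gulp.dest('dist/js')));

// Do the same for the stylesheets.
gulp.task('styles', () =>
  gulp.src('css/*.css')
    .pipe(concat('site.min.css'))
    .pipe(cleanCss())
    .pipe(gulp.dest('dist/css')));
```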

new-js

Now only 16 files are downloaded as opposed to 46 in the old site.

new-css

Also with the CSS, there are now only 2 requests as opposed to 7.

With the images, I used Paint.Net to decrease the resolution and size for better performance.

Then in IIS, I navigated to the HTTP Response Headers section for the directories that contain the JavaScript and CSS files.

iis 

Clicked on the ‘Set common headers’ link in the top right hand corner.

http-headers

Set the expires date to a date far into the future.

Comments

On WordPress I was getting a lot of spam comments, which Akismet took care of, so I had to find an alternative that would also protect me. I enjoy reading the comments and like to respond, so switching off comments was not an option. I eventually went for Disqus, which has a plugin architecture that was really simple to implement.

So there you have it: this is now the new blog site. It will change in style over the next few weeks as I make tweaks here and there, but with the CI/CD workflow I have in place this is very easy to work with.

What next?

I run all my sites off the same box, which also has both MySQL and SQL Server running on it, so soon I can switch off MySQL, uninstall the PHP processor that is part of IIS and hopefully free up some more resources.

Happy coding