There has been quite a lot written lately about the MEAN stack (MongoDB, Express, Angular and Node), but I am going to describe the architecture I used to create ObsPlanner.com. It comprises both the MEAN stack and ASP.Net, Web.API and SQL Server, as well as a number of third-party components.

About ObsPlanner.com

The ObsPlanner.com website is a web application aimed at the more technical amateur astronomer market. It allows the user to plan in advance which astronomical objects they wish to observe, taking into account obstructions such as buildings, and then optimises the plan around the times the user is at the telescope. Plans can then be downloaded and loaded into telescope-controlling software, which will automate the plan. Future releases will also include a mobile/offline mode and maybe even control the scope itself via Bluetooth. But that is dependent on much more research.

This is a brief overview of the architecture I chose.

ObsPlanner schematic


ASP.Net MVC Razor Views

By using Razor views, I could quickly create all the views for the front end based on layouts. Also, with Razor and MVC I had a security model that could validate the user and redirect them if they were not authenticated or authorized to view that page.

Angular js

Each view has an Angular controller to manage its data binding and any service calls to either the Node or Web.API layers. Using Angular made it possible to create a neatly structured code base while giving the end user a fast UI experience.

ASP.Net MVC & Web.API

With ASP.Net MVC and Web.API I was able to utilize the forms-based security model as well as interface with the Entity Framework layer that manages all the relational data from SQL Server. The services also use token-based authentication, which was very easy to set up using MVC filters and attributes.

Node js & Express

As the site performs a large number of math-based calculations, I wanted to use a language and technology that took this responsibility away from the client machine, as I couldn't guarantee what inconsistencies would be introduced there. So I opted for Node.js, so that previously written JavaScript components could be re-used with little change. It is running on IIS using the IISNode plugin and neatly interfaces with MongoDB using Express.
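To give a flavour of the kind of server-side calculation involved (this is an illustrative sketch, not code from ObsPlanner; the function name is mine), here is the standard altitude formula an observation planner needs, in Node:

```javascript
// Sketch of a server-side astronomical calculation in Node.
// Computes the altitude of an object from observer latitude,
// object declination and hour angle (all in degrees) using:
// sin(alt) = sin(lat)*sin(dec) + cos(lat)*cos(dec)*cos(ha)
const deg2rad = (d) => (d * Math.PI) / 180;
const rad2deg = (r) => (r * 180) / Math.PI;

function altitudeDegrees(latDeg, decDeg, haDeg) {
  const lat = deg2rad(latDeg);
  const dec = deg2rad(decDeg);
  const ha = deg2rad(haDeg);
  const sinAlt =
    Math.sin(lat) * Math.sin(dec) +
    Math.cos(lat) * Math.cos(dec) * Math.cos(ha);
  return rad2deg(Math.asin(sinAlt));
}

// An object on the meridian (hour angle 0) with declination equal
// to the observer's latitude passes through the zenith (~90 deg).
console.log(altitudeDegrees(52, 52, 0));
```

Running this kind of maths on the server means every user gets the same result regardless of their browser's floating-point quirks.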

Entity Framework (EF)

I used a code-first approach to create the entities that are relational in nature, so the SQL Server database could be dropped and re-generated as the model changed. I also used the repository and unit of work patterns, which led to a faster integration with the MVC and Web.API controllers.

SQL Server

This is the data store for the more relational entities, such as the user, location and account data. I was happier choosing this over putting all data in Mongo, as I was familiar with its security model and I wanted to use Entity Framework with the repository and unit of work patterns.

MongoDB

I chose this as part of the back end for a fast, read-only data store that integrates easily and quickly with Node and Express. All the astronomical data is stored in this database, split across multiple collections covering thousands of celestial objects. By using a data store that manages JSON natively, it was very easy to retrieve astronomical data and pass it to the front end in the same format that the Angular controllers work with.
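The "JSON straight through" point can be sketched like this (the document shape with `name` and `magnitude` fields is hypothetical, not ObsPlanner's actual schema): documents come out of MongoDB as plain objects and can be serialized to the Angular front end unchanged.

```javascript
// In-memory stand-in for a Mongo collection of celestial objects.
// A real route would call collection.find({ magnitude: { $lte: limit } });
// here a plain filter shows the same query shape.
const sampleCollection = [
  { name: 'M31', magnitude: 3.4 },
  { name: 'M42', magnitude: 4.0 },
  { name: 'M57', magnitude: 8.8 },
];

function brighterThan(docs, limit) {
  return docs.filter((d) => d.magnitude <= limit);
}

// The result is already JSON-shaped, so it goes to the client as-is.
console.log(JSON.stringify(brighterThan(sampleCollection, 5)));
```

No mapping layer is needed between the store and the client, which is the main attraction over a relational round-trip here.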

Stripe

As ObsPlanner is a multi-tiered application, I wanted an easily integrated payment system without having to worry about banking compliance myself. Stripe has an easy-to-use API layer that can be called both from the front end and from the middle tier written in C#. When a user does opt to move up to the Pro level, none of the card details go through the ObsPlanner system; they go only to Stripe's systems.

SendGrid

There are a number of internal subsystems that need to send messages to registered users. This is handled by SendGrid, which has a simple API and gives access to message statistics such as errors, open rates and so on.

All of this is running on a single instance of Windows Server 2008 R2. Not ideal, I know, but it is all I can afford just now.

So that is a quick run down of the architecture of ObsPlanner.com as of February 2016. It may change in the coming months as more users register and I detect any pain points.

Happy coding


Part 1 – Securing Your Logins With ASP.Net MVC
Part 2 - Securing Web.API Requests With JSON Web Tokens (This post)

In the last post I went over the techniques you can use to secure your ASP.Net MVC logins using salted hashes in the database. This post covers the web service layer and how to secure requests to service calls that are essentially exposed to the big bad web. To show what sort of layering I am discussing, here is a basic example of the various layers I have been using on a number of projects.

Three tier architecture

Once the user has been validated and allowed into the site, all service requests are done on their behalf. To make sure nobody who is not validated gets access to the service calls, we implement JSON Web Tokens, or JWT.

JSON Web Tokens

JSON Web Tokens are a standard, URL-safe way of representing claims, created by the Internet Engineering Task Force (IETF). A token looks like this:

eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9
.
eyJodHRwOi8vc2NoZW1hcy5taWNyb3NvZnQuY29tL3dzLzIwMDgvMDYvaWRlbnRpdHkvY2xhaW1zL3VzZXJkYXRhIjoiSXNWYWxpZCIsImlzcyI6InNlbGYiLCJhdWQiOiJodHRwczovL3d3dy5tb2xhcnMub3JnLnVrIiwiZXhwIjoxNDI1NTY2ODY5LCJuYmYiOjE0MjU1NjMyNjl9
.
5XieXPt8kvbAgBlmB-IclmpaIR_PkcusIUc_tWlcxas

A JWT is split into three sections:
JOSE header - describes the token and the hashing algorithm being used for it.
JWS payload - the main content; can include claim sets, issuer, expiry date, as well as any bespoke data you want to include.
Signature hash - base64url-encoding the header and payload and computing the message authentication code (MAC) over them produces the signature hash.
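To see those sections concretely, here is a small Node sketch (not part of this post's C# code) that splits a token and decodes the first two sections; the header below is the same one as in the sample token above:

```javascript
// Decode the header and payload of a JWT. The two leading sections
// are base64url-encoded JSON; the third is a binary signature hash.
function decodeJwtSections(token) {
  const [header, payload] = token.split('.');
  const fromB64url = (s) =>
    Buffer.from(s.replace(/-/g, '+').replace(/_/g, '/'), 'base64')
      .toString('utf8');
  return {
    header: JSON.parse(fromB64url(header)),
    payload: JSON.parse(fromB64url(payload)),
  };
}

const sample = 'eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJpc3MiOiJzZWxmIn0.sig';
console.log(decodeJwtSections(sample).header);
// { typ: 'JWT', alg: 'HS256' }
```

Note that the header and payload are only encoded, not encrypted: anyone can read them, and it is the signature that stops them being tampered with.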

Creating JSON Web Tokens in .Net

Going back to the web project: in the constructor of each controller, create a private field that will store our token string.

The code to generate the token uses the System.IdentityModel.Tokens.Jwt namespace, which you may need to add extra references for via NuGet packages. The call to Authorization.GetBytes() is a method on a class we use in a business object that sits in the web service layer; all it does is turn a string into a byte array.

Here we just store the web token in the ViewBag for rendering on each view. The reason we do this is that we don't want to run into any cross-domain issues, as our web and web service layers are running on different machines on different URLs. Now, in the Angular code that calls into the service layer, we extract that token and append it to the call as a parameter.

Consuming JSON Web Tokens

In the web service layer we intercept each call by overriding the OnAuthorization method inside AuthorizeApi.cs within App_Start. If the caller has a correct and valid token, they proceed to get the data from the API call; if not, they are sent a 403 Forbidden response.

References:-
JSON Web Token (JWT) - OAuth Working Group

Part 1 – Securing Your Logins With ASP.Net MVC (This post)
Part 2 - Securing Web.API Requests With JSON Web Tokens

An architectural pattern that is becoming more popular is using ASP.Net MVC with a Web.API layer servicing the web front end via Angular.js or a similar technology: a kind of hybrid SPA with all the benefits that ASP.Net brings to the table. This is a two-part primer running through what I do to secure logins to MVC applications. In part two I will expand on this post to cover how to secure the Web.API layer, utilizing the security built into ASP.Net.

If you ever go to a website and cannot remember your password, you will most likely request a password reminder. If you get sent your current password in plain text, that is bad news: it means the website is storing passwords in plain text, and if it gets hacked the attackers will have access to those passwords. Given that people tend to reuse the same password across sites, that could compromise other sites you use too. It is really important to salt and hash your passwords before storing them in the database. By doing this, you can do a string comparison against the hash rather than the actual password. Here I will go through the process in code.

As usual you will have a login screen asking for username (or email address) and password. I won't go into the MVC/Razor side here, just the important code.

Take in the two form values
The LookupUser method on the SecurityService is where the magic happens. This method looks up the user from the database via a UserRepository and appends the salt to the password the user has provided. I explain what salts and hashes are a little later on; for now, know that a salt is just a random string representation of a passkey. This combination of password and salt is then passed into the GetPasswordHashAndSalt method of the PasswordHash class.

The GetPasswordHashAndSalt method reads the string into a byte array, hashes it using SHA256, then returns a string representation back to the calling method. This is the hash of the salted password, which should be equal to the value in the database. On line 19 of the SecurityService class, the repository does another database look-up to get the User that matches both the email address and the hash value.

OK, so how do we get those hashes and salts in the database in the first place? When a new user account is set up, you need to generate a random salt. You then store the usual user details in the database along with the salt and the hashAndSalt values in place of the password. By generating a new salt each time an account is created, you minimize the risk that a hacker will get the salt and regenerate the passwords from the hashAndSalt value.

Now back to the login POST method on the controller. Once the user has been authenticated, you need to create a cookie for ASP.Net forms authentication to work. First create a ticket that stores information such as the logged-in user, where LoggedInUser is the valid User object we got from the database earlier. To check for a valid ticket throughout the site, you can decorate each action method with [Authorize] filter attributes, or you could do the whole site and just have [AllowAnonymous] attributes on the login controller actions.
To do this for the whole site, first add a new AuthorizeAttribute to the FilterConfig.cs file inside App_Start. Then add a handler to the Application_AuthenticateRequest method in the global.asax.cs file. This method will check every incoming request to see if it has a valid FormsAuthentication ticket; if it doesn't, it will redirect the user to the default location specified in the web.config file.

Unless you use third-party tools, management of IIS Express logs is pretty limited. For example, it is not possible to log to a SQL Server database for later analysis. You can, however, import the current log files into SQL Server, and here I will show you how I go about it.

First you need to create a table in SQL Server; I am going to start with a temporary table containing all the fields that the current version of IIS Express 8.0 uses. Before you import, you need to strip out all comment lines, such as those beginning with the # symbol. To do this, download the Microsoft PrepLog tool; it is an old tool, but still useful for this purpose. Run PrepLog against a log file on the command line, then do a bulk insert to get the output into SQL Server.

Now create a table to import the data into. The only difference here is the addition of an id column; I want an id column because I will be querying the table from Entity Framework, and for that you need a primary key. Finally, use a SELECT query to move the data into the final table.

Happy coding.

How many times have you been working on a web page only for the browser to display old cached data? It's a pain to have to remember to press the CTRL + F5 key combination, not to mention the injury you are doing to your fingers in the process. There is a way to disable the browser cache in each of the main browsers; here I will quickly show you what I do.

Internet Explorer

Click on the gear icon
ie_1
Choose Internet options
ie_2
Go to the Browsing history section in the General tab and press the settings button
ie_3
Select the Caches and databases tab
ie_4
Then clear the check box for 'Allow website caches and databases'. Come back out by clicking the OK button all the way back to the web page view.

Chrome

Go to the F12 developer tools, then click on the gear icon on the right-hand side of the developer tools window
chrome_1
Check the 'Disable cache (while DevTools is open) under the General section
chrome_2
The cache is re-enabled when you come back out of the developer tools.

Firefox

Enter about:config in the address bar
firefox_1
Click through the button warning you about making changes, then locate the network.http.use-cache entry
firefox_2
Double click the entry and it will change from true to false
firefox_3