Sunday, April 3, 2011

Cookieless Session State in ASP.NET without nasty URLs

Some of you have probably heard about the EU proposal that plans to end the internet as we know it on May 25th 2011. If you haven’t heard of it, David Naylor has made a nice little example of its consequences here. In essence, most sites that use cookies will have to ask visitors to opt in for every single cookie before using it. I’m very much in favor of online privacy – yet it seems to me that this is a very poorly-thought-through directive. First of all, most cookies serve one of two purposes:

  • Help web sites recognize visitors in order to provide them with the best possible service. Much like when I walk into my local barbershop and the barber recognizes me and knows exactly how I prefer him to cut my hair (the little I have left after reading crazy directives) – and which subjects I want to small talk about.
  • Visitor tracking in order to produce statistics the site owners can use to improve the web site. Again – it’s not all that different from when a grocery store owner thinks “wow – 10 customers this last week have asked me for low-fat milk. Perhaps I should start to carry that product here”.

I have no problem with either of the above scenarios – they fall under what I call good service, and they help enhance my online experience.
Another problem is that I generally dislike it when legal matters get in the way of the best technical solution to a problem. Laws should describe the concept of what they are outlawing – not specific technical architectures such as cookies… But before I digress any further into political territory, I’ll get right back on track.

Many ASP.NET developers rely on the Session State mechanism to store user-relevant data within a visit – data that can improve the user experience, for instance with personalization, prefilled forms, and so on. Unfortunately, Session state relies on a unique session key stored in a local cookie in order to identify the same visitor throughout a visit. It does come with a built-in switch to stop using cookies – but unfortunately the result looks rather ugly: it changes all the URLs on the site to contain a Guid and tracks the visitor using that Guid. I, for one, am rather fond of clean, pretty, friendly URLs – so that’s no good.

So – I started thinking… Many years ago I worked for a company that built a statistics tool. It was pretty unobtrusive and we didn’t use cookies. Instead we just tracked the source IP – and checked for repeated requests with a 10-minute time-out. Sure, it wasn’t bullet-proof, but it actually worked surprisingly well. And in those cases where it didn’t work? Well – it was just one statistical entry out of many. It’s not like we used it to authorize access to the nuclear football, right?!

Now, I thought that if we combine all the static information we get in the HTTP request – IP, Accept-Language, Accept types, User-Agent and so on – smash it all together and take a fingerprint of it, we might end up with something that can almost be used as a session id. Consider: what are the odds that two different visitors using the exact same configuration, coming from the exact same IP, will hit your site within the 20-minute default time-out?
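To make the idea concrete before the actual ASP.NET code, here is a minimal sketch of the fingerprint calculation in Python. The header names and sample values are just illustrative assumptions for the example, not part of any real implementation:

```python
import hashlib

def get_unique_fingerprint(request_headers: dict, remote_ip: str) -> str:
    """Combine the static parts of an HTTP request and hash them
    into a pseudo session id (a sketch of the idea, nothing more)."""
    source = ";".join([
        request_headers.get("Accept", ""),
        request_headers.get("Accept-Language", ""),
        remote_ip,
        request_headers.get("User-Agent", ""),
    ])
    # MD5 gives a short, fixed-length fingerprint of the combined string
    return hashlib.md5(source.encode("utf-8")).hexdigest().upper()

# Two requests with identical headers and IP map to the same "session"
headers = {"Accept": "text/html", "Accept-Language": "en-US,da",
           "User-Agent": "Mozilla/5.0"}
print(get_unique_fingerprint(headers, "192.0.2.1"))
```

The same request data always produces the same fingerprint, while a different IP (or a different browser configuration) produces a different one – which is exactly the property a session id needs.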
Of course, it turns out I wasn’t the first to think this thought. In fact, the clever people at the Electronic Frontier Foundation (EFF) have for some time been running a little example site (Panopticlick) that calculates those exact odds – just to prove that privacy online isn’t solved by simply outlawing cookies.

So – I decided to put the thought into code. The code consists of two parts. The first part is an extension method for the HttpRequest class, GetUniqueFingerprint(), which returns an MD5 hash fingerprint.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Text;
using System.Security.Cryptography;

namespace AllanTech.NoCookie
{
    public static class NoCookies
    {
        private static string GetMd5Sum(string s)
        {
            // Encode the string as UTF-16 bytes and hash them with MD5
            Encoder enc = System.Text.Encoding.Unicode.GetEncoder();
            byte[] text = new byte[s.Length * 2];
            enc.GetBytes(s.ToCharArray(), 0, s.Length, text, 0, true);

            MD5 md5 = new MD5CryptoServiceProvider();
            byte[] result = md5.ComputeHash(text);

            // Convert the hash to an upper-case hex string
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < result.Length; i++)
            {
                sb.Append(result[i].ToString("X2"));
            }
            return sb.ToString();
        }

        public static string GetUniqueFingerprint(this HttpRequest Request)
        {
            // Combine the static parts of the request into one string...
            string source =
                string.Join(",", Request.AcceptTypes) + ";" +
                string.Join(",", Request.UserLanguages) + ";" +
                Request.UserHostAddress + ";" +
                Request.UserAgent;
            // ...and fingerprint it
            return GetMd5Sum(source);
        }
    }
}

The second part is a replacement for the default ASP.NET SessionIDManager. This is the mechanism that uniquely identifies the visitor – either by a cookie or by the URL – and by replacing it we can make it use our new GetUniqueFingerprint method instead. It’s really simple – just implement ISessionIDManager and you’re good to go:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.SessionState;

namespace AllanTech.NoCookie
{
    public class CookielessIDManager : ISessionIDManager
    {
        public CookielessIDManager() { }

        #region ISessionIDManager Members

        public string CreateSessionID(HttpContext context)
        {
            // The "new" session id is simply the request fingerprint
            return context.Request.GetUniqueFingerprint();
        }

        public string GetSessionID(HttpContext context)
        {
            // Same fingerprint on every request from the same visitor
            return context.Request.GetUniqueFingerprint();
        }

        public void Initialize()
        {
        }

        public bool InitializeRequest(HttpContext context, bool suppressAutoDetectRedirect, out bool supportSessionIDReissue)
        {
            supportSessionIDReissue = true;
            return context.Response.IsRequestBeingRedirected;
        }

        public void RemoveSessionID(HttpContext context)
        {
        }

        public void SaveSessionID(HttpContext context, string id, out bool redirected, out bool cookieAdded)
        {
            // No cookie or redirect is needed – the id is derived from the request itself
            redirected = false;
            cookieAdded = false;
        }

        public bool Validate(string id)
        {
            return true;
        }

        #endregion
    }
}




Finally, all I have to do is change the configuration (web.config) to use my CookielessIDManager instead of the default:

<sessionState mode="InProc" sessionIDManagerType="AllanTech.NoCookie.CookielessIDManager,AllanTech.NoCookie" … />



Enjoy a site with one less cookie!

Monday, August 2, 2010

A simple, little web load tool

There are many ways of doing performance testing of web applications. In the good ol’ days I remember starting up Microsoft’s Application Center Test (ACT) and recording some VBScripts that could later be executed. Nowadays ACT is a lot sexier – it comes with Visual Studio 2010, but unfortunately only in the Ultimate edition. I tried to persuade my wife to spend the $11,000 on the Ultimate edition – but she failed to see why this was more important than buying her a car.

Another good option is to use WebLoad. It’s a neat tool – and even if you buy it (to actually get a compiled, running version instead of the do-it-yourself open-source edition) it still comes at a fairly decent price point. I recently played around with it – and it does solve a lot of your performance-testing needs – but it’s a bit too much overkill for my need (which is essentially to find out how many requests/s a web site can handle). I also didn’t like that it hijacked all my browsers and forced them to go through a proxy (in order for it to record what was going on) – and then failed to reset the proxy selection afterwards.

In the end I decided to spend the 30 minutes it would take to write a simple little performance tester of my own – one that does exactly what I want it to.

I came up with AWebLoadTesting, a compact and ultra-simple console app. It takes an input file – essentially a text file with a list of URLs for each visitor to go through during the test – and an output filename, where it will put a CSV file with saved statistics – and that’s about it. If you need to, you can also specify a hostname to run the test against – and even a custom UserAgent for the requests.

[Screenshot: AWebLoadTesting console output]

When it starts you have 0 visitors active. Then, by pressing “+” you can add visitors one at a time – and by pressing “1” or “5” you can add chunks of 10 or 50 visitors at a time. Each visitor is started in its own thread and will continuously go through the URLs from the input file again and again.

“u” updates your view, “r” resets the counters, “s” saves the current data to the output file, “-” removes a visitor, and of course “q” quits.

You’ll constantly be presented with the measured numbers: time measured (s), requests/s, visitor count, max load time, average load time and min load time. On top of that, it will show you a prioritized list of which URLs are the slowest to return. That’s it.
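For reference, the reported numbers are straightforward to derive from raw per-request timings. Here is a rough sketch of the same style of statistics (in Python for brevity – the tool’s actual source isn’t shown here, so the function and field names are my own):

```python
def summarize(load_times_ms, elapsed_s):
    """Compute the kind of statistics the tool reports:
    requests/s plus max, average and min load time (a sketch)."""
    n = len(load_times_ms)
    return {
        "requests_per_s": n / elapsed_s,   # total requests over wall-clock time
        "max_ms": max(load_times_ms),
        "avg_ms": sum(load_times_ms) / n,
        "min_ms": min(load_times_ms),
    }

# Four requests completed in 2 seconds of measured time
stats = summarize([12, 45, 30, 9], elapsed_s=2.0)
print(stats)
```

The “slowest URLs” list is then just the same per-URL averages sorted in descending order.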

The screenshot above is a test against a local EPiServer CMS 6.0 web site on my laptop, running with ASP.NET caching turned on (Set cache-expiration to 1h in episerver.config, site settings).

Download the binary here and the entire project here. Use AS-IS, LGPL 2.0, Quick&Dirty.

Monday, May 17, 2010

And a (non-virtual) role change

May 1st I arrived back in Denmark after spending a year in the US, assisting with assembling and training the GREAT team that we have there now, as well as working with some truly skilled and passionate partners (you know who you are). I must say it’s been a great learning experience as well as a very exciting time – both for EPiServer and for me personally.

Now that I’m back in the old world again, it seemed like a good time to try a new angle on producing great software – and as luck would have it, I was offered the chance to try on the shoes of a product manager. Even though I’ve always had a deep passion for coding, I’d love a chance to really influence the future of creating great web sites in a way that only a product manager at EPiServer can.

I believe that the most important job for any software product company is to create software that solves real problems that people in their markets have. This is the key factor that, more than anything, should drive both the development and sales process – solving real problems for real people. And of course solving those problems in a carefully designed and planned manner, so the solution adapts to the users’ needs and skills – and not the other way around. Too many times have I seen examples of technology and features in various products (in all industries) that are there for no other reason than adding a feature – not solving any real problem. Flashy as some of it may seem, it’s still essentially useless. The consequence: development time that could have been spent solving problems is wasted, and users are confused by features that don’t make sense.

Luckily, EPiServer’s history shows that we have been very successful in solving real problems. And I believe that’s why so many web sites, editors, developers and marketing people use our entire product portfolio as their platform of choice today. But of course we can still do even better – especially with YOUR help. I want to learn how you use EPiServer CMS. And even if you don’t use it – tell me how you manage your online content and which problems we could solve for you.

Wednesday, May 5, 2010

Yo, Halo Reach Beta Peeps

A long time ago I wrote about shotcodes, and for that purpose I even put a shotcode on the web site linking to this web site. It would finally seem like I now get my 15 seconds of fame, since Halo Reach has supposedly used a graphic with some similarity to my shotcode in the game (see the computer terminal to the right here). After Zoidberg25 “cracked” this in the Bungie forums, I’ve gotten a certain number of visitors looking through my blog for hidden clues – or maybe even an ARG.

Although I appreciate all traffic and every single visitor to my blog is very welcome, I feel that I should probably come clean. I can deny any and all rumors that this blog is part of an intricate scheme to hide secret game codes or Easter eggs. Or maybe not. Feel free to read through every single post and comment – look for hidden codes and clues (remember that the classical substitution cipher is always a popular way to hide secret stuff in plain view). While you are trying to crack this one, feel free to click the links and read what my sponsors have to say. And if the ads generate enough revenue I might even buy that silly game of yours and see what all the fuss is about (if I actually manage to clean the dust off my Xbox 360).