Thursday, March 19, 2009

Making Witty Work With a SOCKS Proxy

I needed a decent Twitter client that supported a SOCKS proxy, but was unable to find one (if you know of one, please post a comment).  While there were some decent clients with proxy settings, none of them supported SOCKS, so I decided to take Witty and modify it instead.

Witty uses the HttpWebRequest class, which does not appear to support SOCKS proxies (again, if this is wrong, please post a comment).  This meant replacing all usages of that class with a custom implementation that I decided to call SocksHttpWebRequest (code to follow).  I built it on top of ProxySocket, a free .NET API that supports SOCKS 4 and 5 messaging.

The changes to Witty itself were minimal.  The TwitterLib.TwitterNet class is where all of the over-the-wire communication is handled, with the CreateTwitterRequest() method being the starting point for the modifications.

// private HttpWebRequest CreateTwitterRequest(string Uri)
private WebRequest CreateTwitterRequest(string Uri)
{
    // Create the web request
    // HttpWebRequest request = WebRequest.Create(Uri);
    var request = SocksHttpWebRequest.Create(Uri);

    // rest of method unchanged
    ...
}

From there it’s just a matter of changing the consuming methods to expect a WebRequest object back from CreateTwitterRequest().  For most of the methods, this meant the following changes:

// HttpWebRequest request = CreateTwitterRequest(requestURL);
var request = CreateTwitterRequest(pageRequestUrl);

// using (HttpWebResponse response = request.GetResponse() as HttpWebResponse)
using (var response = request.GetResponse())
{
    // some code that utilizes the response
    ...
}

Ah, if only the var keyword had been embraced…

A few of the consuming methods also had the following line of code in them:

request.ServicePoint.Expect100Continue = false;

Since ServicePoint is not a property of the WebRequest class and thus does not apply to SocksHttpWebRequest, I simply commented it out.
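If you'd rather keep the Expect100Continue tweak for code paths where a plain HttpWebRequest is still in play, one option (an untested sketch, not part of my actual change) is to guard the line with a type check instead of commenting it out:

```csharp
// Only applies when the factory actually handed back an HttpWebRequest;
// a SocksHttpWebRequest simply skips this block.
var httpRequest = request as HttpWebRequest;
if (httpRequest != null)
{
    httpRequest.ServicePoint.Expect100Continue = false;
}
```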

I should note that the SocksHttpWebRequest and SocksHttpWebResponse classes are not fully featured.  That is to say, most of the virtual members that WebRequest and WebResponse expose throw a NotImplementedException, and I only provided overrides for those that were needed to make Witty run properly.  However, much of the functionality is there, and they are certainly a better starting point than what I had to work with.

And now, as promised, here’s the code:


using System;
using System.Collections.Specialized;
using System.IO;
using System.Net;
using System.Net.Sockets;
using System.Text;
using Org.Mentalis.Network.ProxySocket;

namespace Ditrans
{
    public class SocksHttpWebRequest : WebRequest
    {
        #region Member Variables

        private readonly Uri _requestUri;
        private WebHeaderCollection _requestHeaders;
        private string _method;
        private SocksHttpWebResponse _response;
        private string _requestMessage;
        private byte[] _requestContentBuffer;

        // darn MS for making everything internal (yeah, I'm talking about you, System.Net.KnownHttpVerb)
        static readonly StringCollection validHttpVerbs =
            new StringCollection { "GET", "HEAD", "POST", "PUT", "DELETE", "TRACE", "OPTIONS" };

        #endregion

        #region Constructor

        private SocksHttpWebRequest(Uri requestUri)
        {
            _requestUri = requestUri;
        }

        #endregion

        #region WebRequest Members

        public override WebResponse GetResponse()
        {
            if (Proxy == null)
            {
                throw new InvalidOperationException("Proxy property cannot be null.");
            }
            if (String.IsNullOrEmpty(Method))
            {
                throw new InvalidOperationException("Method has not been set.");
            }

            if (RequestSubmitted)
            {
                return _response;
            }
            _response = InternalGetResponse();
            RequestSubmitted = true;
            return _response;
        }

        public override Uri RequestUri
        {
            get { return _requestUri; }
        }

        public override IWebProxy Proxy { get; set; }

        public override WebHeaderCollection Headers
        {
            get
            {
                if (_requestHeaders == null)
                {
                    _requestHeaders = new WebHeaderCollection();
                }
                return _requestHeaders;
            }
            set
            {
                if (RequestSubmitted)
                {
                    throw new InvalidOperationException("This operation cannot be performed after the request has been submitted.");
                }
                _requestHeaders = value;
            }
        }

        public bool RequestSubmitted { get; private set; }

        public override string Method
        {
            get { return _method ?? "GET"; }
            set
            {
                if (validHttpVerbs.Contains(value))
                {
                    _method = value;
                }
                else
                {
                    throw new ArgumentOutOfRangeException("value", string.Format("'{0}' is not a known HTTP verb.", value));
                }
            }
        }

        public override long ContentLength { get; set; }

        public override string ContentType { get; set; }

        public override Stream GetRequestStream()
        {
            if (RequestSubmitted)
            {
                throw new InvalidOperationException("This operation cannot be performed after the request has been submitted.");
            }

            // ContentLength must be set before the stream is requested so the
            // buffer can be sized correctly.
            if (_requestContentBuffer == null)
            {
                _requestContentBuffer = new byte[ContentLength];
            }
            else if (_requestContentBuffer.Length != ContentLength)
            {
                Array.Resize(ref _requestContentBuffer, (int) ContentLength);
            }
            return new MemoryStream(_requestContentBuffer);
        }

        #endregion

        #region Methods

        public static new WebRequest Create(string requestUri)
        {
            return new SocksHttpWebRequest(new Uri(requestUri));
        }

        public static new WebRequest Create(Uri requestUri)
        {
            return new SocksHttpWebRequest(requestUri);
        }

        private string BuildHttpRequestMessage()
        {
            if (RequestSubmitted)
            {
                throw new InvalidOperationException("This operation cannot be performed after the request has been submitted.");
            }

            var message = new StringBuilder();
            message.AppendFormat("{0} {1} HTTP/1.0\r\nHost: {2}\r\n", Method, RequestUri.PathAndQuery, RequestUri.Host);

            // add the headers
            foreach (var key in Headers.Keys)
            {
                message.AppendFormat("{0}: {1}\r\n", key, Headers[key.ToString()]);
            }

            if (!string.IsNullOrEmpty(ContentType))
            {
                message.AppendFormat("Content-Type: {0}\r\n", ContentType);
            }
            if (ContentLength > 0)
            {
                message.AppendFormat("Content-Length: {0}\r\n", ContentLength);
            }

            // a blank line indicates the end of the headers
            message.Append("\r\n");

            // add content
            if (_requestContentBuffer != null && _requestContentBuffer.Length > 0)
            {
                using (var stream = new MemoryStream(_requestContentBuffer, false))
                using (var reader = new StreamReader(stream))
                {
                    message.Append(reader.ReadToEnd());
                }
            }

            return message.ToString();
        }

        private SocksHttpWebResponse InternalGetResponse()
        {
            var response = new StringBuilder();
            using (var socksConnection =
                new ProxySocket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp))
            {
                var proxyUri = Proxy.GetProxy(RequestUri);
                var ipAddress = GetProxyIpAddress(proxyUri);
                socksConnection.ProxyEndPoint = new IPEndPoint(ipAddress, proxyUri.Port);
                socksConnection.ProxyType = ProxyTypes.Socks5;

                // open the connection, using the port from the request URI rather than assuming 80
                socksConnection.Connect(RequestUri.Host, RequestUri.Port);
                // send an HTTP request
                socksConnection.Send(Encoding.ASCII.GetBytes(RequestMessage));
                // read the HTTP reply
                var buffer = new byte[1024];
                var bytesReceived = socksConnection.Receive(buffer);
                while (bytesReceived > 0)
                {
                    response.Append(Encoding.ASCII.GetString(buffer, 0, bytesReceived));
                    bytesReceived = socksConnection.Receive(buffer);
                }
            }
            return new SocksHttpWebResponse(response.ToString());
        }

        private static IPAddress GetProxyIpAddress(Uri proxyUri)
        {
            IPAddress ipAddress;
            if (!IPAddress.TryParse(proxyUri.Host, out ipAddress))
            {
                try
                {
                    return Dns.GetHostEntry(proxyUri.Host).AddressList[0];
                }
                catch (Exception e)
                {
                    throw new InvalidOperationException(
                        string.Format("Unable to resolve proxy hostname '{0}' to a valid IP address.", proxyUri.Host), e);
                }
            }
            return ipAddress;
        }

        #endregion

        #region Properties

        public string RequestMessage
        {
            get
            {
                if (string.IsNullOrEmpty(_requestMessage))
                {
                    _requestMessage = BuildHttpRequestMessage();
                }
                return _requestMessage;
            }
        }

        #endregion
    }
}


using System;
using System.IO;
using System.Net;
using System.Text;

namespace Ditrans
{
    public class SocksHttpWebResponse : WebResponse
    {
        #region Member Variables

        private WebHeaderCollection _httpResponseHeaders;
        private string _responseContent;

        #endregion

        #region Constructors

        public SocksHttpWebResponse(string httpResponseMessage)
        {
            SetHeadersAndResponseContent(httpResponseMessage);
        }

        #endregion

        #region WebResponse Members

        public override Stream GetResponseStream()
        {
            return ResponseContent.Length == 0
                ? Stream.Null
                : new MemoryStream(Encoding.UTF8.GetBytes(ResponseContent));
        }

        public override void Close() { /* the base implementation throws an exception */ }

        public override WebHeaderCollection Headers
        {
            get
            {
                if (_httpResponseHeaders == null)
                {
                    _httpResponseHeaders = new WebHeaderCollection();
                }
                return _httpResponseHeaders;
            }
        }

        public override long ContentLength
        {
            get { return ResponseContent.Length; }
            set { throw new NotSupportedException(); }
        }

        #endregion

        #region Methods

        private void SetHeadersAndResponseContent(string responseMessage)
        {
            // the HTTP headers end at the first blank line
            var indexOfFirstBlankLine = responseMessage.IndexOf("\r\n\r\n");

            var headers = responseMessage.Substring(0, indexOfFirstBlankLine);
            var headerValues = headers.Split(new[] { "\r\n" }, StringSplitOptions.RemoveEmptyEntries);
            // skip the first line in the header block since it is the HTTP status line
            for (int i = 1; i < headerValues.Length; i++)
            {
                // split on the first colon only, so header values that contain
                // colons (e.g. the Date header) remain intact
                var headerEntry = headerValues[i].Split(new[] { ':' }, 2);
                Headers.Add(headerEntry[0], headerEntry[1].Trim());
            }

            ResponseContent = responseMessage.Substring(indexOfFirstBlankLine + 4);
        }

        #endregion

        #region Properties

        private string ResponseContent
        {
            get { return _responseContent ?? string.Empty; }
            set { _responseContent = value; }
        }

        #endregion
    }
}
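For completeness, here is roughly how the pieces fit together once dropped into a project. The proxy address and URL below are placeholders for illustration, so adjust them to your own setup:

```csharp
// Hypothetical usage: a SOCKS5 proxy listening locally on port 1080.
var request = SocksHttpWebRequest.Create("http://twitter.com/statuses/public_timeline.xml");
request.Proxy = new WebProxy("127.0.0.1", 1080);

using (var response = request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    Console.WriteLine(reader.ReadToEnd());
}
```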

Using SyntaxHighlighter 2.0 on Blogger

Updated 4/8/2009: Had originally posted wrong version of FindTagsByName() function.

Guogang Hu made a nice post a couple of years ago demonstrating how to use SyntaxHighlighter with Blogger, but since then the world has moved on.  While that post is still entirely relevant if you plan to use SyntaxHighlighter 1.5, a slight modification is required to the JavaScript Guogang supplied if you want to use version 2.0.

The key functionality in Guogang’s script is the removal of the <br> tags in the code blocks that are automatically inserted by the Blogger rendering engine in place of all line breaks. The script looks for the code blocks and then removes the extraneous <br> tags. However, due to the change in the way that code styling is indicated to SyntaxHighlighter 2.0, the script’s approach to locating the code blocks must be modified.

<!-- SyntaxHighlighter code styling changes -->
<!-- 1.5 method -->
<pre name="code" class="c-sharp">
... some code here ...
</pre>

<!-- 2.0 method -->
<pre class="brush: c#">
... some code here ...
</pre>

Whereas Guogang’s script looks for tags whose name attribute is “code,” we now need to look for tags whose class attribute begins with “brush:”.  And thus I give you the full, modified script:

</div></div> <!-- end outer-wrapper -->

<script type="text/javascript">
//<![CDATA[
function FindTagsByName(container, className, tagName)
{
    var elements = document.getElementsByTagName(tagName);
    for (var i = 0; i < elements.length; i++)
    {
        var tagClassName = elements[i].className;
        if (tagClassName != null && tagClassName.search(className) == 0)
        {
            container.push(elements[i]);
        }
    }
}

var elements = [];
FindTagsByName(elements, "brush:", "pre");
FindTagsByName(elements, "brush:", "textarea");

for (var i = 0; i < elements.length; i++)
{
    if (elements[i].nodeName.toUpperCase() == "TEXTAREA") {
        var childNode = elements[i].childNodes[0];
        var newNode = document.createTextNode(childNode.nodeValue.replace(/<br\s*\/?>/gi, '\n'));
        elements[i].replaceChild(newNode, childNode);
    }
    else if (elements[i].nodeName.toUpperCase() == "PRE") {
        var brs = elements[i].getElementsByTagName("br");
        for (var j = 0, brLength = brs.length; j < brLength; j++)
        {
            var newNode = document.createTextNode("\n");
            elements[i].replaceChild(newNode, brs[0]);
        }
    }
}
SyntaxHighlighter.all();
//]]>
</script>

Sunday, January 20, 2008

Pseudo Multiple Inheritance in C# 3.0

I used to think that extension methods were nothing more than a LINQ facilitator, to be avoided in day-to-day class design.  I had visions of IntelliSense drop-downs filled with hundreds of utility methods for the String type or, worse yet, object.  I could see junior-level programmers becoming dependent on these methods, only to be disappointed when they couldn't find them after moving on to a new project.

And then I came across Bill Wagner's post on MSDN that explains how to Create Mixins with Interfaces and Extension Methods.  It got me thinking about this new feature of C# 3.0 some more.  If you haven't read his article yet, stop now and do so before continuing.

The points Bill makes are great, but I think he danced around what could be the biggest benefit of extension methods: pseudo multiple inheritance in C# (well, sort of).  If you take Bill's method of encapsulation, and apply it to multiple interfaces, you end up with discrete collections of inheritable functionality that can be mixed and matched like Legos.

I'll demonstrate this possibility with everyone's favorite eating utensil: the spork.  I'll be using Bill's "minimalist interface paired with a mixin" approach.



public interface IFork { }

public static class IForkMixIns
{
    public static void SpearFood(this IFork fork)
    {
        Console.WriteLine("IFork: Got it!");
    }
}

public interface ISpoon { }

public static class ISpoonMixIns
{
    public static void ScoopFood(this ISpoon spoon)
    {
        Console.WriteLine("ISpoon: Got it!");
    }
}

public class Spork : IFork, ISpoon { }

Now an instance of Spork will have both the SpearFood() and ScoopFood() behaviors.

This approach even allows for pseudo method overriding (i.e. overriding one of the mixed-in methods).  All I have to do is implement a method with the same signature on the inheriting class--no override or new keyword required.  As an example, I'll warn spork users about the tines when they attempt to use it as a spoon (just in case they were expecting a smooth edge).



public class Spork : IFork, ISpoon
{
    public void ScoopFood()
    {
        ISpoon spoon = this;
        spoon.ScoopFood();
        Console.WriteLine("Spork: Watch out for the tines!");
    }
}

Of course this isn't a true override; it's more like using the new keyword.  Casting a Spork to an ISpoon will still result in the definition of ISpoonMixIns.ScoopFood() being used.
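To make the dispatch difference concrete, here is a quick sketch (the output comments assume the Spork definition above):

```csharp
var spork = new Spork();

// Compile-time type is Spork, so the class's own method wins:
spork.ScoopFood();            // ISpoon: Got it!  then  Spork: Watch out for the tines!

// Cast to the interface and the extension method is chosen instead:
((ISpoon)spork).ScoopFood();  // ISpoon: Got it!
```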

And that brings me to the obvious limitation of this strategy.  The title of this post includes the phrase "pseudo multiple inheritance" for a reason.  Since true method overriding is not supported, opportunities to take advantage of the polymorphic capabilities of the concrete classes are more limited.

Wednesday, October 31, 2007

PowerShell Script for AES Key Generation

I have to constantly generate AES keys for the numerous SSO requests that we receive from our clients.  The keys are used for message level security, and they're really the biggest headache we have when it comes to setting up SSO for a new client.  Everything after that is a breeze (a simple database entry).

I used to use one of the unit tests that exercises our cryptography code for this task.  I would set a break point where the AES algorithm was instantiated and then inspect the value of the Key property.  However, a short PowerShell function has now made this much easier.

function GenerateAesKey() {
    $algorithm = [System.Security.Cryptography.SymmetricAlgorithm]::Create("Rijndael")
    $keybytes = $algorithm.get_Key()
    $key = [System.Convert]::ToBase64String($keybytes)
    Write-Output $key
}
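One refinement worth considering (an untested sketch on my part): Rijndael defaults to a 256-bit key in .NET, so if a client asks for 128- or 192-bit AES, set KeySize before reading the key:

```powershell
function GenerateAesKey([int] $keySizeBits = 256) {
    $algorithm = [System.Security.Cryptography.SymmetricAlgorithm]::Create("Rijndael")
    # Must be set before the Key property is first read.
    $algorithm.KeySize = $keySizeBits
    [System.Convert]::ToBase64String($algorithm.Key)
}
```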

Friday, October 19, 2007

The Functional Programming Renaissance

At this point, I don't think it's a stretch on my part to say that most "experts" in the computing industry accept that we have just about reached the speed limit enforced by the inherent physical limitations of modern processor architecture.  I could link to any number of news and magazine articles that say as much, but corroboration of factual statements found in blogs is an exercise for the reader.  In any event, the design modification du jour in the chip industry is clearly increasing the number of processor cores, rather than the old school method of advancing clock frequency.

What all of this means for developers is that there are no more free lunches when it comes to the performance of their applications.  The automatic performance gains that came with processor upgrades are a thing of the past since processor speed will remain largely static.  Therefore, in order to make our applications scream, we will have to consciously work at making them take advantage of the multiple computing cores available.  However, parallel computing is an area of computer science that many programmers have no experience in.

Thankfully, the computing industry is already hard at work trying to make sure that the transition to parallel computing won't feel like a step backwards.  On the .NET side of the house, Parallel LINQ (PLINQ) and the Task Parallel Library (TPL) are currently under development to help make our lives easier.  While these frameworks are not necessarily hard to integrate into existing code and coding habits, they still require extra effort from developers, who have to be aware of the issues involved (e.g. exception handling and list ordering, to name a couple).  In short, they feel more like duct tape applied to existing technologies in order to make developers feel more comfortable.  While I can certainly appreciate the sentiment, I think the long-term solution is going to be much more dramatic.
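As a taste of what PLINQ is aiming for (the API was still in flux at the time of writing, so treat this as a sketch rather than final syntax), parallelizing a query is meant to be as simple as opting in:

```csharp
// Opt a LINQ query into parallel execution. Note that result ordering is
// not guaranteed unless explicitly requested -- one of the issues
// developers still need to be aware of.
var evenSquares = Enumerable.Range(0, 1000000)
    .AsParallel()
    .Where(n => n % 2 == 0)
    .Select(n => (long)n * n)
    .ToArray();
```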

(Re-)Enter functional programming.

Functional programming has the inherent ability to be broken into discrete units of work that can be shuffled from processor to processor.  This creates a much-needed abstraction layer around the details required to facilitate parallel programming, and leaves developers free to worry about the details of their designs.  Since functional programming is already a part of most developers' lives (via SQL and, very soon, LINQ), it won't be entirely foreign.  And of course, developers always love learning new technologies anyhow.

I know that I'm not the only person to recognize functional programming as the potential wave of the future.  Microsoft is expected to integrate F# into a future version of Visual Studio (not Orcas).

Friday, August 10, 2007

Bogus NUnit Error: 'Could not load file or assembly nunit.core'

After adding a set of unit tests to an existing test project, NUnit started throwing this exception on our build server that, of course, wasn't being thrown locally.  I was perplexed, since the changes amounted to a new test fixture, a couple of new references in the test project, and an app.config change.  How could any of those things cause NUnit not to find... itself?

I knew the error message was probably not indicative of the real problem, and my suspicion was confirmed by the first search result returned by Google.  Basically the problem boils down to an error in the app.config file.  In my case it was because the config section I had added was defined in my (customized) machine.config file, but not in the copy that was on the build server.
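As a hypothetical illustration (the section and type names below are invented), the fix is to declare the custom section in the app.config itself rather than relying on every machine.config being identical:

```xml
<configuration>
  <configSections>
    <!-- Declared locally, so the build server doesn't need a customized machine.config -->
    <section name="ssoSettings"
             type="MyCompany.Configuration.SsoSettingsSection, MyCompany.Configuration" />
  </configSections>
  <ssoSettings keyStore="database" />
</configuration>
```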

Thursday, August 9, 2007

Embedding an Intermediate Certificate Into Your SSL Certificate

Based on the number of forum postings and blog entries that I have run across, VeriSign's expired intermediate certificate continues to be a problem for many people, so I didn't feel too bad when it started causing trouble for us recently.

When you get a cert from VeriSign, they don't actually sign it directly with their root CA cert.  Instead, they use an intermediate cert that in turn has been signed by the root cert.  This is all well and good since it allows them to limit exposure of their CA cert, while their customers still get the "security" they are looking for.

The crux of the issue is validation of the cert chain.  Every cert in the chain has to be valid in order for the SSL cert itself to be considered valid.  The problem is that there are still a lot of applications (mainly browsers) out there whose cert stores still have expired copies of VeriSign's intermediate cert, even though a renewed version has been available for quite a while.

Since SSL certs are most commonly used by web servers, the solution is usually just to make sure the server has the renewed cert in its cert store.  VeriSign provides clear directions on how to do this for all of the major web servers.  However, if the SSL cert is being used by a web server they don't cover, or in a third-party tool's web administration console, things can get more complicated.  In those cases, it may be easier to embed the intermediate cert directly into the SSL cert.

You can do this with OpenSSL by issuing the following command:

openssl pkcs12 -export -in <VeriSignIssuedCert> -out <NewSSLCert> -inkey <KeyUsedToCreateCSRRequest> -certfile <VeriSignIntermediateCert>

Embedding the intermediate cert means the client application no longer needs its own local copy, since the intermediate is carried inside the SSL cert itself.  Better yet, it also means you don't have to remember to install it in the cert store of every new web server that you create.
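If you want to confirm the intermediate actually made it into the bundle, you can round-trip the whole procedure with throwaway self-signed certs standing in for the VeriSign-issued ones (all file names here are illustrative):

```shell
# Stand-ins for the issued cert/key and the intermediate cert.
openssl req -x509 -newkey rsa:2048 -nodes -keyout server.key -out server.crt -days 1 -subj "/CN=example.test"
openssl req -x509 -newkey rsa:2048 -nodes -keyout inter.key -out inter.crt -days 1 -subj "/CN=Example Intermediate"

# Same shape as the command above, with an export password added for scripting.
openssl pkcs12 -export -in server.crt -inkey server.key -certfile inter.crt -passout pass:demo -out bundle.p12

# Listing the bundle should show two certificates.
openssl pkcs12 -in bundle.p12 -nokeys -passin pass:demo | grep -c "BEGIN CERTIFICATE"
```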