With the release of .NET 4.0, concurrent collections finally arrived – including the magic System.Collections.Concurrent.BlockingCollection<T>, which I had been eagerly waiting for.

My scenario:
I needed a queue that cannot be dequeued from before it is explicitly marked as runnable (or not). I also needed a switch for “adding is now completed”, which works as a safety gate when the buffer runs empty – so that the consumer keeps waiting for completion even if there are currently no items queued.

My thought:
I’d use System.Collections.Concurrent.BlockingCollection<T> with System.Collections.Concurrent.ConcurrentQueue<T> … until I read this MSDN-spec:

If the collection is empty, this method immediately returns false.

This killed requirement 1 (explicitly marked for start) and requirement 2 (explicitly marked for end).

I wondered a bit about using CompleteAdding … but then there was the MSDN spec again:

After a collection has been marked as complete for adding, adding to the collection is not permitted and attempts to remove from the collection will not wait when the collection is empty.

Well …

public sealed class SynchronizedQueue<T>
	where T : class
{
	private const int STATE_NOTCOMPLETED = 0;
	private const int STATE_COMPLETED = 1;

	private readonly System.Collections.Concurrent.ConcurrentQueue<T> _queue = new System.Collections.Concurrent.ConcurrentQueue<T>();
	private readonly System.Threading.AutoResetEvent _queueAutoResetEvent = new System.Threading.AutoResetEvent(false);
	private readonly System.Threading.AutoResetEvent _runableAutoResetEvent = new System.Threading.AutoResetEvent(false);

	private bool? _runable;
	private int _state;

	public bool Runable
	{
		get
		{
			while (!this._runable.HasValue)
			{
				this._runableAutoResetEvent.WaitOne();
			}
			return this._runable.Value;
		}
		set
		{
			this._runable = value;
			this._runableAutoResetEvent.Set();
		}
	}

	public void Enqueue(T item)
	{
		if (item == null)
		{
			throw new System.ArgumentNullException("item");
		}

		this._queue.Enqueue(item);
		this._queueAutoResetEvent.Set();
	}

	public void CompleteAdding()
	{
		var state = System.Threading.Interlocked.CompareExchange(ref this._state,
		                                                         STATE_COMPLETED,
		                                                         STATE_NOTCOMPLETED);
		if (state == STATE_NOTCOMPLETED)
		{
			this._queueAutoResetEvent.Set();
		}
	}

	public bool TryDequeue(out T item)
	{
		if (!this.Runable)
		{
			item = null;
			return false;
		}

		while (!System.Linq.Enumerable.Any(this._queue))
		{
			if (this._state == STATE_COMPLETED)
			{
				item = null;
				return false;
			}

			this._queueAutoResetEvent.WaitOne();
		}

		return this._queue.TryDequeue(out item);
	}
}
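
A minimal usage sketch (the producer/consumer wiring here is just illustrative, not part of the original class):

var queue = new SynchronizedQueue<string>();

// consumer: TryDequeue blocks until the queue is marked runnable,
// then drains items until CompleteAdding() has been called and the queue is empty
var consumer = System.Threading.Tasks.Task.Factory.StartNew(() =>
{
	string item;
	while (queue.TryDequeue(out item))
	{
		System.Console.WriteLine(item);
	}
});

// producer
queue.Enqueue("first");
queue.Enqueue("second");
queue.Runable = true;    // gate open - the consumer may start dequeuing now
queue.Enqueue("third");
queue.CompleteAdding();  // the consumer drains the remaining items and exits

consumer.Wait();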

Licensed under WTFPL


Plugging log4javascript, JSON, ServiceStack (and log4net) together

I really loooove log4javascript. It’s simple, powerful and fun to work with.

But imagine the need to log from the client to the server… Well … Ready?

First, there’s a custom appender:

log4javascript.JsonAppender = function (url) {
	var isSupported = true;
	var successCallback = function (data, textStatus, jqXHR) { return; };
	if (!url) {
		isSupported = false;
	}
	this.setSuccessCallback = function (successCallbackParam) {
		successCallback = successCallbackParam;
	};
	this.append = function (loggingEvent) {
		if (!isSupported) {
			return;
		}
		$.post(url, {
			'logger': loggingEvent.logger.name,
			'timestamp': loggingEvent.timeStampInMilliseconds,
			'level': loggingEvent.level.name,
			'url': window.location.href,
			'message': loggingEvent.getCombinedMessages(),
			'exception': loggingEvent.getThrowableStrRep()
		}, successCallback, 'json');
	};
};
log4javascript.JsonAppender.prototype = new log4javascript.Appender();
log4javascript.JsonAppender.prototype.toString = function () {
	return 'JsonAppender';
};

You see $.post, which is jQuery’s ajax implementation – but you can use any other framework you like.

The next thing to implement is a method which returns a logger instance (combined with the newly created appender):

function getClientToServerLogger(url) {
	var logger = log4javascript.getLogger('clientToServerLogger');
	var jsonAppender = new log4javascript.JsonAppender(url);
	logger.addAppender(jsonAppender);
	return logger;
}

The logger and appender are stitched together – what’s missing now is the server-side counterpart. Here’s a ServiceStack implementation:

[Route("/log", "POST")]
public sealed class LoggingEvent : ServiceStack.ServiceHost.IReturnVoid
{
	public string Logger { get; set; }
	public long TimeStamp { get; set; }
	public LogLevel Level { get; set; }
	public string Url { get; set; }
	public string Message { get; set; }
	public string Exception { get; set; }
}
public partial class CustomService : ServiceStack.ServiceHost.IPostVoid<LoggingEvent>
{
	public void Post(LoggingEvent request)
	{
		if (request == null)
		{
			throw new System.ArgumentNullException("request");
		}

		var url = request.Url;
		var message = request.Message;
		var exception = request.Exception;
		var complexMessage = string.Format("url='{0}',message='{1}',exception='{2}'",
										   url,
										   message,
										   exception);

		switch (request.Level)
		{
			case LogLevel.Fatal:
				// TODO log appropriately
				break;
			case LogLevel.Info:
				// TODO log appropriately
				break;
			case LogLevel.Error:
				// TODO log appropriately
				break;
			case LogLevel.Warn:
				// TODO log appropriately
				break;
			case LogLevel.Debug:
				// TODO log appropriately
				break;
		}
	}
}
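
The LogLevel enum used by the request DTO isn’t shown in the original post; a minimal sketch that matches the level names log4javascript sends (assuming the JSON serializer maps the posted level name onto the enum):

// Hypothetical enum - matches log4javascript's level names (TRACE, DEBUG, INFO, WARN, ERROR, FATAL).
public enum LogLevel
{
	Trace,
	Debug,
	Info,
	Warn,
	Error,
	Fatal
}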

ServiceStack is used for this example, but you can use any other webservice solution (like WCF, old-school ASMX, …). ServiceStack is quite easy to use, supports the full range of formats (which other solutions lack) and is configuration-free – a pure joy to work with (I even prefer ServiceStack over any newer solution from Microsoft).

The server side is ready to log’n'roll – but the usage is still missing:

var clientToServerLogger = getClientToServerLogger('<%= ServiceStack.WebHost.Endpoints.AppHostBase.Instance.GetUrl<LoggingEvent>() %>');
clientToServerLogger.fatal(message);

The GetUrl extension methods used in the markup above look like this:

public static string GetUrl<TRequest>(this ServiceStack.WebHost.Endpoints.AppHostBase appHostBase)
{
	return GetUrl<TRequest>(appHostBase,
							"POST");
}

public static string GetUrl<TRequest>(this ServiceStack.WebHost.Endpoints.AppHostBase appHostBase,
									  string httpVerb)
{
	var requestType = typeof (TRequest);

	return appHostBase.GetUrl(requestType,
							  httpVerb);
}

public static string GetUrl(this ServiceStack.WebHost.Endpoints.AppHostBase appHostBase,
							Type requestType,
							string httpVerb)
{
	var endpointHostConfig = appHostBase.Config;
	var serviceStackHandlerFactoryPath = endpointHostConfig.ServiceStackHandlerFactoryPath;

	var serviceRoutes = appHostBase.Routes as ServiceStack.ServiceHost.ServiceRoutes;
	if (serviceRoutes == null)
	{
		throw new System.NotSupportedException("Property Routes of AppHostBase is not of type ServiceStack.ServiceHost.ServiceRoutes");
	}

	var restPaths = serviceRoutes.RestPaths;
	var restPath = restPaths.Where(arg => arg.RequestType == requestType)
	                        .Where(arg => arg.AllowedVerbs.Contains(httpVerb))
	                        .FirstOrDefault();
	if (restPath == null)
	{
		return null;
	}

	var httpContext = System.Web.HttpContext.Current;
	var httpRequest = httpContext.Request;
	var path = restPath.Path;
	var relativePath = string.Concat(serviceStackHandlerFactoryPath,
									 path); // bad, i know, but combining with 2 virtual paths ...
	var appRelativePath = httpRequest.TransformToAppRelativePath(relativePath);
	var absolutePath = System.Web.VirtualPathUtility.ToAbsolute(appRelativePath);

	return absolutePath;
}

And the TransformToAppRelativePath helper used above:

public static string TransformToAppRelativePath(this System.Web.HttpRequest httpRequest,
                                                string relativePath)
{
	string appRelativePath;
	if (System.Web.VirtualPathUtility.IsAbsolute(relativePath))
	{
		appRelativePath = string.Concat("~",
										relativePath);
	}
	else
	{
		appRelativePath = string.Concat("~/",
										relativePath);
	}
	return appRelativePath;
}

The GetUrl extension method is designed for a ServiceStack AppHostBase implementation which is hosted in IIS and lives in the same application as the page that calls the extension method.

Proxy-Handler in C#

Sometimes you need some sort of proxy – e.g. when working with HTTP JSONP (cross-domain) from an HTTPS website, where the plain HTTP request would otherwise get blocked.

To achieve this, you can implement a generic HttpHandler like this:

public sealed class Proxy : System.Web.IHttpHandler
{
	public const string Source = "~/Proxy.ashx";

	public bool IsReusable
	{
		get
		{
			return false;
		}
	}

	public void ProcessRequest(System.Web.HttpContext context)
	{
		var httpRequest = context.Request;
		var httpResponse = context.Response;
		var uri = System.Uri.UnescapeDataString(httpRequest.QueryString.ToString());
		if (string.IsNullOrEmpty(uri))
		{
			httpResponse.StatusCode = (int) System.Net.HttpStatusCode.Forbidden;
			httpResponse.End();
			return;
		}

		var webRequest = (System.Net.HttpWebRequest) System.Net.HttpWebRequest.Create(uri);
		webRequest.Method = httpRequest.HttpMethod;
		webRequest.ContentType = httpRequest.ContentType;
		webRequest.Accept = string.Join(", ",
										httpRequest.AcceptTypes);
		webRequest.Referer = httpRequest.UrlReferrer.NullSafe(arg => arg.ToString());
		webRequest.UserAgent = httpRequest.UserAgent;

		if (webRequest.SupportsCookieContainer)
		{
			webRequest.CookieContainer = new System.Net.CookieContainer(httpRequest.Cookies.Count);
			foreach (var key in httpRequest.Cookies.AllKeys)
			{
				var httpCookie = httpRequest.Cookies.Get(key);
				var cookie = new System.Net.Cookie
				{
					Domain = webRequest.RequestUri.Host,
					Expires = httpCookie.Expires,
					HttpOnly = httpCookie.HttpOnly,
					Name = httpCookie.Name,
					Path = httpCookie.Path,
					Secure = httpCookie.Secure,
					Value = httpCookie.Value
				};
				webRequest.CookieContainer.Add(cookie);
			}
		}

		foreach (var key in httpRequest.Headers.AllKeys.Where(arg => arg.StartsWith("X-")))
		{
			var value = httpRequest.Headers.Get(key);
			webRequest.Headers.Set(key,
								   value);
		}

		if (httpRequest.HttpMethod == "POST")
		{
			webRequest.ContentLength = httpRequest.ContentLength;
			var inputStream = httpRequest.GetBufferedInputStream();
			var outputStream = webRequest.GetRequestStream();
			outputStream.Write(inputStream);
		}

		System.Net.WebResponse webResponse;
		try
		{
			webResponse = webRequest.GetResponse();
		}
		catch (System.Exception exception)
		{
			httpResponse.StatusCode = (int) System.Net.HttpStatusCode.InternalServerError;
			httpResponse.Write(exception);
			httpResponse.End();
			return;
		}
		if (webResponse == null)
		{
			httpResponse.End();
			return;
		}

		using (webResponse)
		{
			httpResponse.ContentType = webResponse.ContentType;
			using (var webResponseStream = webResponse.GetResponseStream())
			{
				httpResponse.Write(webResponseStream);
			}
		}
	}
}

The following extension methods are needed to get this code to compile:

public static void Write(this HttpResponse httpResponse,
						 Stream stream)
{
	httpResponse.OutputStream.Write(stream);
}
public static void Write(this Stream outputStream,
						 Stream inputStream,
						 int bufferSize = 64)
{
	var buffer = new byte[bufferSize];
	int read;
	while ((read = inputStream.Read(buffer,
									0,
									buffer.Length)) > 0)
	{
		outputStream.Write(buffer,
						   0,
						   read);
	}
}
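
The NullSafe extension used for the Referer header above is also not included in the post; a plausible sketch:

public static TResult NullSafe<TSource, TResult>(this TSource source,
                                                 System.Func<TSource, TResult> selector)
	where TSource : class
	where TResult : class
{
	// hypothetical helper: returns null instead of throwing when the source is null
	if (source == null)
	{
		return null;
	}
	return selector(source);
}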

And use it like this:

$.ajaxSetup({
	'beforeSend': function (jqXHR, settings) {
		if (settings.crossDomain) {
			settings.url = '<%= this.ResolveUrl(Proxy.Source) %>?' + encodeURIComponent(settings.url);
			settings.crossDomain = false;
		}
	}
});

Licensed under WTFPL


Unhandled Exception in a Quartz.NET job

Consider a job whose Execute method might throw:

public sealed class FooJob : Quartz.IJob
{
	public void Execute(Quartz.IJobExecutionContext context)
	{
		// some freaky calls, some might fail, but we dunno, so there's no try/catch
	}
}

Usually this is no problem, as we add an event handler to AppDomain.CurrentDomain.UnhandledException like this:

AppDomain.CurrentDomain.UnhandledException += (sender, eventArgs) =>
{
	var exception = eventArgs.ExceptionObject as System.Exception;
	// TODO log exception somewhere
};

There are tons of suggestions like this out there, and it worked for a long time …

Unfortunately (or rather obviously) there’s a breaking change in Quartz.NET, and exceptions thrown inside a job no longer reach that handler – but you don’t need to give up hope, there’s a solution: a job listener.

internal sealed class ExceptionOccuredJobListener : Quartz.IJobListener
{
	internal static readonly ExceptionOccuredJobListener Instance = new ExceptionOccuredJobListener();

	private ExceptionOccuredJobListener() {}

	public void JobToBeExecuted(Quartz.IJobExecutionContext context) {}

	public void JobExecutionVetoed(Quartz.IJobExecutionContext context)
	{
		var message = string.Format("Job {0} vetoed",
									context.JobDetail.JobType);

		// TODO log veto?!
	}

	public void JobWasExecuted(Quartz.IJobExecutionContext context,
							   Quartz.JobExecutionException jobException)
	{
		if (jobException != null)
		{
			var exception = jobException.GetBaseException();
			var message = string.Format("unhandeled exception occured in {0}",
										context.JobDetail.JobType);
			// TODO log exception
		}
	}

	public string Name
	{
		get
		{
			return "ExceptionOccuredJobListener";
		}
	}
}

This gets added to your Quartz.IScheduler instance like this:

scheduler.ListenerManager.AddJobListener(ExceptionOccuredJobListener.Instance);
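
For completeness, a wiring sketch using the Quartz.NET 2.x fluent builders (the job identity and the schedule are just placeholders):

var scheduler = new Quartz.Impl.StdSchedulerFactory().GetScheduler();
scheduler.ListenerManager.AddJobListener(ExceptionOccuredJobListener.Instance);

var job = Quartz.JobBuilder.Create<FooJob>()
                           .WithIdentity("fooJob")
                           .Build();
var trigger = Quartz.TriggerBuilder.Create()
                                   .StartNow()
                                   .WithSimpleSchedule(x => x.WithIntervalInMinutes(5)
                                                             .RepeatForever())
                                   .Build();

scheduler.ScheduleJob(job, trigger);
scheduler.Start();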

Licensed under WTFPL

Creating a Lucene.Net.Analysis.PerFieldAnalyzerWrapper with Flucene

I am using Flucene for defining the mappings of fixed-property classes (because it’s really *fluent*).

Flucene has the ability to set a Lucene.Net.Analysis.Analyzer per field, like:

this.Map(arg => arg.Field1).Analyzer.Standard();
this.Map(arg => arg.Field2).Analyzer.Keyword();

I ran into the problem that even if you specify the analyzer on the field, the Lucene.Net.Index.IndexWriter uses another one (the one injected in its ctor) – so, frankly, setting a custom analyzer in Flucene is useless …

To fix this, I’ve investigated the source and can provide you with a custom implementation of Lucene.Net.Odm.FluentMappingsService:

public class CustomMappingService : Lucene.Net.Odm.FluentMappingsService
{
    private readonly Lucene.Net.Util.Version _luceneVersion;

    public CustomMappingService(System.Reflection.Assembly assembly,
                                Lucene.Net.Util.Version luceneVersion)
        : base(assembly)
    {
        this._luceneVersion = luceneVersion;
    }

    public CustomMappingService(System.Collections.Generic.IEnumerable<System.Reflection.Assembly> assemblies,
                                Lucene.Net.Util.Version luceneVersion)
        : base(assemblies)
    {
        this._luceneVersion = luceneVersion;
    }

    public CustomMappingService(System.Type type,
                                Lucene.Net.Util.Version luceneVersion)
        : base(type)
    {
        this._luceneVersion = luceneVersion;
    }

    public CustomMappingService(System.Collections.Generic.IEnumerable<System.Type> types,
                                Lucene.Net.Util.Version luceneVersion)
        : base(types)
    {
        this._luceneVersion = luceneVersion;
    }

    public Lucene.Net.Util.Version LuceneVersion
    {
        get
        {
            return this._luceneVersion;
        }
    }

    public Lucene.Net.Odm.Mapping.DocumentMapping<T> GetDocumentMapping<T>()
    {
        var type = typeof (T);
        var mapping = this.GetDocumentMapping(type);
        var documentMapping = mapping as Lucene.Net.Odm.Mapping.DocumentMapping<T>;

        return documentMapping;
    }

    private object GetDocumentMapping(System.Type type)
    {
        object mapping;
        this.Mappings.TryGetValue(type,
                                  out mapping);
        return mapping;
    }

    public Lucene.Net.Analysis.PerFieldAnalyzerWrapper GetPerFieldAnalyzerWrapper<T>()
    {
        var standardAnalyzer = this.CreateAnalyzer(typeof (Lucene.Net.Analysis.Standard.StandardAnalyzer));
        var perFieldAnalyzerWrapper = this.GetPerFieldAnalyzerWrapper<T>(standardAnalyzer);

        return perFieldAnalyzerWrapper;
    }

    public Lucene.Net.Analysis.PerFieldAnalyzerWrapper GetPerFieldAnalyzerWrapper<T>(Lucene.Net.Analysis.Analyzer defaultAnalyzer)
    {
        var documentMapping = this.GetDocumentMapping<T>();
        if (documentMapping == null)
        {
            return null;
        }

        var perFieldAnalyzerWrapper = new Lucene.Net.Analysis.PerFieldAnalyzerWrapper(defaultAnalyzer);

        this.AddDocumentToPerFieldAnalyzerWrapper(perFieldAnalyzerWrapper,
                                                  documentMapping);

        return perFieldAnalyzerWrapper;
    }

    private void AddDocumentToPerFieldAnalyzerWrapperViaReflection(Lucene.Net.Analysis.PerFieldAnalyzerWrapper perFieldAnalyzerWrapper,
                                                                   System.Type type,
                                                                   string prefix = null)
    {
        var documentMapping = this.GetDocumentMapping(type);

        var addDocumentToPerFieldAnalyzerWrapperMethodInfo = this.GetType()
                                                                 .GetMethod("AddDocumentToPerFieldAnalyzerWrapper",
                                                                            System.Reflection.BindingFlags.IgnoreCase | System.Reflection.BindingFlags.Instance | System.Reflection.BindingFlags.NonPublic)
                                                                 .MakeGenericMethod(type);
        addDocumentToPerFieldAnalyzerWrapperMethodInfo.Invoke(this,
                                                              new[]
                                                              {
                                                                  perFieldAnalyzerWrapper, documentMapping, prefix
                                                              });
    }

    private void AddDocumentToPerFieldAnalyzerWrapper<T>(Lucene.Net.Analysis.PerFieldAnalyzerWrapper perFieldAnalyzerWrapper,
                                                         Lucene.Net.Odm.Mapping.DocumentMapping<T> documentMapping,
                                                         string prefix = null)
    {
        foreach (var fieldMapping in documentMapping.Fields)
        {
            this.AddFieldToPerFieldAnalyzerWrapper(perFieldAnalyzerWrapper,
                                                   fieldMapping,
                                                   prefix);
        }
        foreach (var embeddedMapping in documentMapping.Embedded)
        {
            var embeddedPrefix = embeddedMapping.Prefix;
            var newPrefix = string.Concat(prefix,
                                          embeddedPrefix);
            var member = embeddedMapping.Member;
            var memberType = member.MemberType;

            this.AddDocumentToPerFieldAnalyzerWrapperViaReflection(perFieldAnalyzerWrapper,
                                                                   memberType,
                                                                   newPrefix);
        }
    }

    private void AddFieldToPerFieldAnalyzerWrapper(Lucene.Net.Analysis.PerFieldAnalyzerWrapper perFieldAnalyzerWrapper,
                                                   Lucene.Net.Odm.Mapping.FieldMapping fieldMapping,
                                                   string prefix = null)
    {
        var index = fieldMapping.Index;
        if (!string.Equals(index.ToString(),
                           Lucene.Net.Documents.Field.Index.ANALYZED.ToString()))
        {
            // TODO *check* if there's any other condition... I fear that there is...
            return;
        }

        var fieldName = string.Concat(prefix,
                                      fieldMapping.FieldName);
        var analyzerType = fieldMapping.AnalyzerType;
        var analyzer = this.CreateAnalyzer(analyzerType);

        perFieldAnalyzerWrapper.AddAnalyzer(fieldName,
                                            analyzer);
    }

    protected virtual Lucene.Net.Analysis.Analyzer CreateAnalyzer(System.Type type)
    {
        if (type == typeof (Lucene.Net.Analysis.Standard.StandardAnalyzer))
        {
            // Lucene.Net.Analysis.Standard.StandardAnalyzer needs some special ctor
            return new Lucene.Net.Analysis.Standard.StandardAnalyzer(this.LuceneVersion);
        }

        var analyzer = (Lucene.Net.Analysis.Analyzer) System.Activator.CreateInstance(type);

        return analyzer;
    }
}
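
A usage sketch of how the wrapper could then be handed to the IndexWriter (FooDocument and the index path are placeholders; the ctor shown is the Lucene.Net 3.x one, and the Lucene version is just an example):

    var luceneVersion = Lucene.Net.Util.Version.LUCENE_30;
    var mappingService = new CustomMappingService(typeof (FooDocument),
                                                  luceneVersion);
    var analyzer = mappingService.GetPerFieldAnalyzerWrapper<FooDocument>();

    using (var directory = Lucene.Net.Store.FSDirectory.Open(new System.IO.DirectoryInfo(@"C:\temp\index")))
    using (var indexWriter = new Lucene.Net.Index.IndexWriter(directory,
                                                              analyzer,
                                                              Lucene.Net.Index.IndexWriter.MaxFieldLength.UNLIMITED))
    {
        // add documents here - the per-field analyzers from the Flucene mappings are now actually used
    }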

Licensed under WTFPL

How can I record audio within a Silverlight 4 application?

My use-case: Record audio from a phone-call
Technology to use: Silverlight 4 for client, WCF/C# 3.5 for server
Hardware: ACS Recording Jack

I will not post the full code here – rather a description that I replied with to a guy who asked me for some details, as posted @ StackOverflow.

Converting PCM to WAV can be done with pure C# – example
The problem here is that you have to record a complete PCM-stream, because the total length is included in the WAV-header:
bwOutput.Write((uint)rawData.Length);
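
To make this concrete, here is a minimal sketch of such a header-writing routine for raw 16-bit PCM data (my own illustration, not code from the linked example):

// Hypothetical helper: writes the canonical 44-byte RIFF/WAVE header followed by the PCM data.
// The data length has to be known up front - which is why a complete PCM stream is required.
internal static void WriteWavFile(System.IO.Stream output, byte[] rawData, int sampleRate, short bitsPerSample, short channels)
{
	var blockAlign = (short) (channels * (bitsPerSample / 8));
	var byteRate = sampleRate * blockAlign;

	using (var writer = new System.IO.BinaryWriter(output))
	{
		writer.Write(System.Text.Encoding.ASCII.GetBytes("RIFF"));
		writer.Write((uint) (36 + rawData.Length)); // total RIFF chunk size
		writer.Write(System.Text.Encoding.ASCII.GetBytes("WAVE"));
		writer.Write(System.Text.Encoding.ASCII.GetBytes("fmt "));
		writer.Write(16u);                          // size of the fmt chunk
		writer.Write((short) 1);                    // PCM
		writer.Write(channels);
		writer.Write(sampleRate);
		writer.Write(byteRate);
		writer.Write(blockAlign);
		writer.Write(bitsPerSample);
		writer.Write(System.Text.Encoding.ASCII.GetBytes("data"));
		writer.Write((uint) rawData.Length);        // <-- the total length mentioned above
		writer.Write(rawData);
	}
}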

Yeti is not a pure “managed code” solution – rather it’s a wrapper which uses P/Invoke (see the lame64 implementation and lame86 implementation for more details).

I am recording an audio stream in Silverlight – pure PCM. Then I downloaded g711audio and ported it to a Silverlight assembly (which is quite simple: just create a new Silverlight assembly project and add all the files as linked items to the new solution) – actually I’ve uploaded my solution to github (just compile it and you can use it on the server and client – maybe you need to do another port to a Windows-Phone-specific assembly).
With G.711 I can compress my PCM stream in realtime to 50%. After recording is finished, I use a WCF service which takes a byte array to upload the compressed PCM stream (not really the best solution here, because I cannot use streaming as I need additional information – such as identifier, fileName, …), then decompress it on the server and use NAudio to convert the PCM stream to an mp3-compatible wav stream. I record in the following format: mono, 8,000 samples per second and 16 bits per sample – which gives me a compressed data rate of about 64 kbit/s – but for the mp3 conversion I need at least mono, 16,000 samples per second and 16 bits per sample (do not try to use 8 bits per sample, because the result is very bad).
Then I use Yeti to convert the wav file to an mp3 file.
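
The upload service itself is not part of this post; a minimal sketch of such a contract (all names here are made up):

using System.ServiceModel;

[ServiceContract]
public interface IAudioUploadService
{
	// takes the complete compressed PCM stream as a byte array, plus the
	// additional information mentioned above (identifier, fileName, ...)
	[OperationContract]
	void UploadCompressedPcm(string identifier,
	                         string fileName,
	                         byte[] compressedPcm);
}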

Another solution, which I evaluated as not really practicable in my scenario:
Whenever OnSamples() is hit in the audioSink, you could upload the (compressed) chunk to the server with a WCF service – important: use an iterator variable (aka i++) here to identify the order of the chunks. When recording is completed, you could invoke a CompleteSession on the server, which merges all the chunks and then does the stuff from above.

Critical here: I was not able (for technical reasons) to convert to mp3 on the client, therefore my scenario depends on a server. So I’ve assigned quite a bunch of tasks to the server which would be horrifying on the client (performance-wise).

If you need more information on how to use System.IO.Stream with WCF, there’s an MSDN entry available.
For our German-speaking friends there’s a nice blog entry from matze :)

Here is how I do the resampling and conversion:

using System;
using System.IO;
using NAudio.Wave;
using Yeti.Lame;
using Yeti.MMedia.Mp3;

internal static class AudioUtils
{
	private const uint Mp3SampleRate = 24;
	private const int Mp3Rate = 16000;
	private const int Mp3Bits = 16;
	private const int BufferSize = 4096;

	internal static bool ResampleWavFile(string wavFullPath, string resampledWavFullPath)
	{
		FileStream sourceFileStream;
		// open wav-file here (wavFullPath)

		using (sourceFileStream)
		{
			using (var sourceWaveFileReader = new WaveFileReader(sourceFileStream))
			{
				var sourceWaveFormat = sourceWaveFileReader.WaveFormat;
				var targetWaveFormat = new WaveFormat(Mp3Rate, Mp3Bits, sourceWaveFormat.Channels);

				if (sourceWaveFormat == targetWaveFormat)
				{
					return false;
				}

				using (var sourceWaveFormatConversionStream = new WaveFormatConversionStream(targetWaveFormat, sourceWaveFileReader))
				{
					using (var sourceBlockAlignReductionStream = new BlockAlignReductionStream(sourceWaveFormatConversionStream))
					{
						FileStream targetFileStream;
						// create wav-file here (resampledWavFullPath)

						using (targetFileStream)
						{
							using (var targetWaveFileWriter = new WaveFileWriter(targetFileStream, targetWaveFormat))
							{
								var buffer = new byte[BufferSize];
								int read;
								while ((read = sourceBlockAlignReductionStream.Read(buffer, 0, buffer.Length)) > 0)
								{
									targetWaveFileWriter.Write(buffer, 0, read);
								}
							}
						}
					}
				}
			}
		}

		return true;
	}

	internal static bool TryConvertWavFileToMp3File(string wavFullPath, string mp3FullPath)
	{
		FileStream wavReadFileStream;
		// open wav-file here (wavFullPath - actually resampledWavFullPath from above)

		using (wavReadFileStream)
		{
			var wavWaveStream = new WaveLib.WaveStream(wavReadFileStream);
			var waveFormat = wavWaveStream.Format;
			var beConfig = new BE_CONFIG(waveFormat, Mp3SampleRate);
			var mp3WriterConfig = new Mp3WriterConfig(waveFormat, beConfig);

			FileStream mp3WriteFileStream;
			// open mp3-file here (mp3FullPath)

			using (mp3WriteFileStream)
			{
				Mp3Writer mp3Writer;
				try
				{
					mp3Writer = new Mp3Writer(mp3WriteFileStream, mp3WriterConfig);
				}
				catch (Exception ex)
				{
					// TODO log the exception
					return false;
				}
				try
				{
					var buffer = new byte[mp3Writer.OptimalBufferSize];
					int read;
					while ((read = wavReadFileStream.Read(buffer, 0, buffer.Length)) > 0)
					{
						mp3Writer.Write(buffer, 0, read);
					}
				}
				catch (Exception ex)
				{
					// TODO log the exception
				}
				mp3Writer.Close();
			}
		}

		return true;
	}
}
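
A possible call sequence (the paths are placeholders):

// ResampleWavFile returns false when the source already has the target format,
// in which case the original wav file can be fed to the mp3 conversion directly.
var resampled = AudioUtils.ResampleWavFile(@"C:\temp\recording.wav",
                                           @"C:\temp\recording.resampled.wav");
var wavForMp3Conversion = resampled
	? @"C:\temp\recording.resampled.wav"
	: @"C:\temp\recording.wav";
AudioUtils.TryConvertWavFileToMp3File(wavForMp3Conversion,
                                      @"C:\temp\recording.mp3");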

Actually you do not need to write files for everything – you could also work with System.IO.MemoryStream (which would be faster, as you do not have any significant I/O latency). The reason why I prefer files is that I need a lot more debug information at this stage – and disk space ain’t cost-a-lot …

Update 2012-08-28

Finally I had some time to do some research for Silverlight 5:

  • FTP: You can now do the upload via FTP, e.g. with sharpLightFtp
  • P/Invoke: I have not started any research on this yet

Licensed under WTFPL

Bing Maps and jQuery

Recently I’ve implemented Bing Maps for our CRM solution and found this interesting blog entry.

Actually, there’s a much easier, more readable and enhanced way of querying Bing Maps for coordinates. I’ve included links to the original documentation to give an idea of the parameters.

Nevertheless I’d like to quote a hint from the original blog entry on getting a JSONP response:

The critical part of this url is the addition of the “&jsonp=?” to the end of the url, this allow jquery to create a proxy allowing your code to make a cross domain call to a different url than the one the page is running on.

Find a Location by Query example:

$.getJSON('http://dev.virtualearth.net/REST/v1/Locations?jsonp=?', {
	'query': '1 Microsoft way, Redmond WA 98052',
	'includeNeighborhood': '1',
	'key': 'your key'
}, function (data) {
	if (!data) {
		// something went terribly wrong
		return;
	}
	if (data.statusCode !== 200) {
		// some failure in getting coordinates
		return;
	}
	if (!data.resourceSets[0].estimatedTotal) {
		// no results for query
		return;
	}

	var location = data.resourceSets[0].resources[0].point.coordinates;
	var latitude = location[0];
	var longitude = location[1];
	// TODO
});

Find a Location by Address example:

$.getJSON('http://dev.virtualearth.net/REST/v1/Locations?jsonp=?', {
	'adminDistrict': 'WA',
	'locality': 'Seattle',
	'postalCode': '98178',
	'addressLine': '1 Microsoft Way',
	'countryRegion': 'AU',
	'includeNeighborhood': '1',
	'key': 'your key'
}, function (data) {
	if (!data) {
		// something went terribly wrong
		return;
	}
	if (data.statusCode !== 200) {
		// some failure in getting coordinates
		return;
	}
	if (!data.resourceSets[0].estimatedTotal) {
		// no results for query
		return;
	}

	var location = data.resourceSets[0].resources[0].point.coordinates;
	var latitude = location[0];
	var longitude = location[1];
	// TODO
});

Licensed under WTFPL