Tags: .net-4.0, console, spark-view-engine

Spark in a Console Application Targeting .NET 4.0


Has anyone successfully gotten Spark to work in a .NET 4.0 console application for compiling templates to HTML? Unfortunately I am getting the following error:

Unhandled Exception: Spark.Compiler.CompilerException: Dynamic view compilation failed.
(0,0): error CS1703: An assembly with the same identity 'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' has already been imported. Try removing one of the duplicate references.

When I target .NET 3.5 everything works fine; however, I specifically wish to target 4.0. Has anyone solved this problem? Some old threads on the Spark mailing list suggest I may just have to edit a line in the source and recompile, but I hope that is a last resort.

EDIT:

    static void Main(string[] args)
    {
        // Need all three arguments: template path, model string, output file name
        if (args.Length >= 3)
        {
            var templatePath = Path.Combine(Environment.CurrentDirectory, args[0]);
            var templateName = Path.GetFileName(templatePath);
            var templateDirPath = Path.GetDirectoryName(templatePath);
            var viewFolder = new FileSystemViewFolder(templateDirPath);

            var sparkEngine = new SparkViewEngine
            {
                DefaultPageBaseType = typeof(SparkView).FullName,
                ViewFolder = viewFolder.Append(new SubViewFolder(viewFolder, "Shared")),
            };

            var descriptor = new SparkViewDescriptor().AddTemplate(templateName);
            var view = sparkEngine.CreateInstance(descriptor) as SparkView;

            view.Model = args[1];

            using (var writer = new StreamWriter(new FileStream(args[2], FileMode.Create), Encoding.UTF8))
            {
                view.RenderView(writer);
            }
        }
        else
        {
            Console.WriteLine(">>> error - missing arguments:\n\tSparkCompiler.exe [templatepath] [modelstring] [outputname]");
        }
    }

Solution

  • I didn't consider it a last resort. I changed Line #60 of src\Spark\Compiler\BatchCompiler.cs to

    var providerOptions = new Dictionary<string, string> { { "CompilerVersion", "v4.0" } };

    it was originally

    var providerOptions = new Dictionary<string, string> { { "CompilerVersion", "v3.5" } };

    After a recompile and referencing the new Spark.dll, everything worked like a charm. Er, um, I was able to proceed to the next exception.
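
For context, that `providerOptions` dictionary is what CodeDOM uses to pick which C# compiler to invoke. Below is a minimal standalone sketch of the same mechanism, not Spark's actual `BatchCompiler` code; only the `providerOptions` line comes from the source above, the rest is illustrative:

```csharp
using System;
using System.CodeDom.Compiler;
using System.Collections.Generic;
using Microsoft.CSharp;

class CompilerVersionDemo
{
    static void Main()
    {
        // Same option as the patched line 60 of BatchCompiler.cs: select the v4.0 C# compiler.
        var providerOptions = new Dictionary<string, string> { { "CompilerVersion", "v4.0" } };
        var provider = new CSharpCodeProvider(providerOptions);

        var parameters = new CompilerParameters { GenerateInMemory = true };
        parameters.ReferencedAssemblies.Add("System.dll");

        // With "v3.5" here while the host process (and the assemblies Spark references)
        // are .NET 4.0, mscorlib can end up referenced with two identities,
        // which is presumably what triggers the CS1703 error in the question.
        var results = provider.CompileAssemblyFromSource(parameters,
            "public class Probe { public static int Answer() { return 42; } }");

        Console.WriteLine(results.Errors.HasErrors ? "compile failed" : "compiled OK");
    }
}
```

Because Spark constructs its code provider with this dictionary directly, the `CompilerVersion` set in app.config's `<system.codedom>` section does not take effect, which is why recompiling Spark was necessary.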