All of the Service Fabric examples depict single-solution setups. This seems to run counter to the philosophy of microservices, where you want complete dependency isolation between your services. While you can follow this pattern manually, the more common practice is to enforce it by giving each service its own repository and solution/project.
How do you manage and deploy Service Fabric services, and enforce service contracts (ServiceInterfaces), using multiple solutions (in multiple Git repositories)?
E.g.
Service Fabric Solution
  App1 - Customers
    - Service1 [Carts] From Other Solution
    - Service2 [LocationInfo] From Other Solution
    - Service3 [REST WebAPI for Admin] From Other Solution
  App2 - Products
    - Service4 [Status] From Other Solution
    - Service5 [Firmware] From Other Solution
    - Service6 [Event Stream] From Other Solution
External Solution
  - Service1
External Solution
  - Service2
External Solution
  - Service3
External Solution
  - Service4
External Solution
  - Service5
External Solution
  - Service6
1) As a developer, I want to check out and build all the current versions of apps/services. I want to fire up my Service Fabric project that manages all the manifests, and deploy it to my local dev cluster. I want to enforce the same service interfaces between solutions. I don't understand how you'd do this, because the application is external to the services.
2) As a DevOps team, I want to automate pulling down the apps, building them and deploying to Azure.
How do we "enforce" isolation via separate solutions, yet make it easy to pull the services together and deploy them into the cluster, and also easy to build pipelines that deploy to DEV, QA, and PROD clusters, each configured uniquely?
What is the workflow/process/project structure to enable this? Is it even possible?
Yep, it's possible - I've done something along these lines before. These are the thoughts that spring to mind immediately...
In each Service Fabric solution, have a "public" project containing just the interfaces that you want to expose from the services in that application. The output of this project can be packaged as a NuGet package and pushed to a private feed. You could call it the "interfaces" project, I guess, but you don't have to expose every interface; any you consider internal to your application can be defined in a separate, unexposed project.
Other solutions that want to reference the services exposed by another application just have to pull down the relevant NuGet package to get a reference to the service interfaces.
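As a rough sketch, that public project might contain nothing more than remoting interfaces along these lines (IMyService and GetCartSummaryAsync are placeholder names; the only real requirement is that a remoting interface derives from IService):

using System.Threading.Tasks;
using Microsoft.ServiceFabric.Services.Remoting;

// Lives in the "public"/interfaces project that gets packed into the NuGet package.
// Deriving from IService is what lets ServiceProxy build a remoting proxy for it.
public interface IMyService : IService
{
    Task<string> GetCartSummaryAsync(string customerId);
}

You can then pack that project (with dotnet pack or nuget pack) and push the result to your private feed.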
Now this isn't without problems:
Hope that helps.
EDIT:
An example of registering a service interface in a DI module (Autofac style)...
This would be the DI module you expose from the public NuGet package:
using System;
using Autofac;
using Microsoft.ServiceFabric.Services.Remoting.Client;

public class MyAppModule : Module
{
    protected override void Load(ContainerBuilder builder)
    {
        // Register a remoting proxy for the service this application exposes, so
        // consumers can depend on IMyService without knowing the fabric address.
        builder.Register(component => ServiceProxy.Create<IMyService>(new Uri("fabric:/App/MyService")))
            .As<IMyService>();

        // Other services...
    }
}
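The nice thing about shipping the registration module in the same package as the interfaces is that knowledge of the fabric:/ addresses stays in the solution that owns the services; consuming solutions only ever deal with the interfaces.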
And in the Program.cs of your consuming application, you'd include something like this:
using System;
using System.Diagnostics;
using System.Fabric;
using System.Threading;
using Autofac;
using Microsoft.ServiceFabric.Services.Runtime;

public static class Program
{
    public static void Main()
    {
        try
        {
            var container = ConfigureServiceContainer();

            // Let the container build the service instance so its dependencies
            // (including the proxies registered in MyAppModule) are injected.
            ServiceRuntime.RegisterServiceAsync(
                "MyConsumingServiceType",
                context => container.Resolve<MyConsumingService>(
                    new TypedParameter(typeof(StatefulServiceContext), context))).GetAwaiter().GetResult();

            ServiceEventSource.Current.ServiceTypeRegistered(Process.GetCurrentProcess().Id, typeof(MyConsumingService).Name);

            // Keep the host process alive.
            Thread.Sleep(Timeout.Infinite);
        }
        catch (Exception e)
        {
            ServiceEventSource.Current.ServiceHostInitializationFailed(e.ToString());
            throw;
        }
    }

    private static IContainer ConfigureServiceContainer()
    {
        var containerBuilder = new ContainerBuilder();

        // Other registrations...
        containerBuilder.RegisterModule<MyAppModule>();

        return containerBuilder.Build();
    }
}
Of course, this approach will only work if you aren't partitioning your services...
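If you do partition a service, the registration would also need to choose a partition key (or register a factory that picks one per call). A minimal sketch, reusing the hypothetical IMyService and assuming an Int64-range partitioned service with a hard-coded key purely for illustration:

using System;
using Autofac;
using Microsoft.ServiceFabric.Services.Client;
using Microsoft.ServiceFabric.Services.Remoting.Client;

public class MyPartitionedAppModule : Module
{
    protected override void Load(ContainerBuilder builder)
    {
        // ServiceProxy.Create accepts a ServicePartitionKey for partitioned services;
        // a fixed key of 0 is used here only to keep the example short.
        builder.Register(component => ServiceProxy.Create<IMyService>(
                new Uri("fabric:/App/MyPartitionedService"),
                new ServicePartitionKey(0)))
            .As<IMyService>();
    }
}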