I'm writing some specs2 integration tests for my spray.io project that uses DynamoDB. I'm using sbt-dynamodb to load a local DynamoDB into the environment, and I use the following pattern to load my tables before the tests are run:
trait DynamoDBSpec extends SpecificationLike {
  val config = ConfigFactory.load()
  val client = new AmazonDynamoDBClient(new DefaultAWSCredentialsProviderChain())

  lazy val db = {
    client.setEndpoint(config.getString("campaigns.db.endpoint"))
    new DynamoDB(client)
  }

  override def map(fs: => Fragments): Fragments =
    Step(beforeAll) ^ fs ^ Step(afterAll)

  protected def beforeAll() = {
    // load my tables
  }

  protected def afterAll() = {
    // delete my tables
  }
}
Then any test class can simply extend DynamoDBSpec and the tables will be created. It all works fine until I extend DynamoDBSpec from more than one test class, at which point it throws a ResourceInUseException: 'Cannot create preexisting table'. The reason is that the specs execute in parallel, so they all try to create the tables at the same time.
I tried to overcome this by running the tests in sequential mode, but beforeAll and afterAll are still executed in parallel.
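For reference, by "sequential mode" I mean the usual sbt setting (and/or the specs2 sequential argument inside each spec); a sketch of the sbt side:

// build.sbt -- run test classes one at a time instead of in parallel
parallelExecution in Test := false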
Ideally, I think it would be better to create the tables once before the entire suite runs, rather than on each Spec class invocation, and then tear them down after the entire suite completes. Does anyone know how to accomplish that?
There are two ways to achieve this.
The first is to use a singleton object to synchronize the creation of your database:
object Database {
  lazy val config = ConfigFactory.load()

  lazy val client =
    new AmazonDynamoDBClient(new DefaultAWSCredentialsProviderChain())

  // this will only be done once in the same JVM
  lazy val db = {
    client.setEndpoint(config.getString("campaigns.db.endpoint"))
    val database = new DynamoDB(client)
    // drop previous tables if any
    // and create new tables
    database.create...
    database
  }
}
// BeforeAll is a new trait in specs2 3.x
trait DynamoDBSpec extends SpecificationLike with BeforeAll {
  // load my tables
  def beforeAll = Database.db
}
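Any spec that needs the tables can then just mix in the trait; the first one to touch Database.db pays the setup cost and all the others reuse the same tables. A minimal sketch (the spec name, table name and assertion are only illustrative):

// hypothetical spec relying on the shared Database object
class CampaignsDbSpec extends Specification with DynamoDBSpec { def is = s2"""
  the campaigns table exists after setup $tableExists
  """

  def tableExists = {
    // "campaigns" is an assumed table name -- use whatever beforeAll creates
    Database.db.getTable("campaigns").describe().getTableName === "campaigns"
  }
}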
As you can see, in this model we don't remove the tables when a specification finishes (because we don't know whether all the other specifications have been executed yet); we only remove them when we re-run the specifications. This can actually be a good thing, because it helps you investigate failures if there are any.
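If you want a concrete shape for the "drop previous tables and create new ones" step, here is a minimal sketch using the plain AWS SDK document API; the helper name, table name, key schema and throughput values are assumptions for illustration, not something this pattern prescribes:

object Tables {
  import com.amazonaws.services.dynamodbv2.document.{DynamoDB, Table}
  import com.amazonaws.services.dynamodbv2.model._

  def recreate(db: DynamoDB, name: String): Table = {
    // drop the table left over from a previous run, if there is one
    try { val old = db.getTable(name); old.delete(); old.waitForDelete() }
    catch { case _: ResourceNotFoundException => () }

    // create it fresh; the key schema and throughput are illustrative only
    val table = db.createTable(new CreateTableRequest()
      .withTableName(name)
      .withKeySchema(new KeySchemaElement("id", KeyType.HASH))
      .withAttributeDefinitions(new AttributeDefinition("id", ScalarAttributeType.S))
      .withProvisionedThroughput(new ProvisionedThroughput(5L, 5L)))
    table.waitForActive()
    table
  }
}

Database.db would then call something like Tables.recreate(database, "campaigns") for each table it owns, in place of the database.create... placeholder above.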
The other way to synchronize specifications at a global level, and to properly clean up at the end, is to use specification links.
With specs2 3.3 you can create dependencies between specifications with links. This means that you can define a "Suite" specification which is going to: create the database, link to all the relevant database specifications, and then delete the database.
For example:
import org.specs2._
import specification._
import core.Fragments
import runner.SpecificationsFinder

// run this specification with `all` to execute
// all linked specifications
class Database extends Specification { def is =
  "All database specifications".title ^ br ^
  link(new Create).hide ^
  Fragments.foreach(specs)(s => link(s) ^ br) ^
  link(new Delete).hide

  def specs = specifications(pattern = ".*Db.*")
}
// start the database with this specification
class Create extends Specification { def is = xonly ^
  step("create database".pp)
}

// stop the database with this specification
class Delete extends Specification { def is = xonly ^
  step("delete database".pp)
}
// an example of a specification using the database;
// it will be invoked by the "Database" spec because
// its name matches ".*Db.*"
class Db1Spec extends Specification { def is = s2"""
  test $db
  """
  def db = { println("use the database - 1"); ok }
}

class Db2Spec extends Specification { def is = s2"""
  test $db
  """
  def db = { println("use the database - 2"); ok }
}
When you run:
sbt> test-only *Database* -- all
You should see a trace like this:
create database
use the database - 1
use the database - 2
delete database