It's a very strange problem. I have a simple class which decodes a base64 string and returns the part before the colon:
import scala.util.{Success, Try}
import org.apache.commons.codec.binary.Base64

class IdDecoder {
  def decode(token: String): Option[String] = {
    if (token.isEmpty)
      None
    else
      Try(new String(Base64.decodeBase64(token.getBytes)).split(":")(0)) match {
        case Success(id) => Some(id)
        case _           => None
      }
  }
}
And I define a method which decodes the string:
object StrangeToken {
  def main(args: Array[String]): Unit = {
    decode()
  }

  def decode() = {
    val token = "InternalServerError"
    val Some(id) = (new IdDecoder).decode(token)
    println("### StrangeToken's id len:" + id.length)
    id.toCharArray.foreach(c => println(c.toInt))
    id
  }
}
When I run it in sbt's console or in IDEA or in production, the result is:
### StrangeToken's id len:15
34
123
94
65533
118
65533
73
65533
65533
122
65533
43
0
0
0
But when I run it in specs2, as:
"id decoder" should {
"get decoded string whose length is 15" in {
val id = StrangeToken.decode()
id.length must be equalTo 15
}
}
This test fails, and the result is:
### StrangeToken's id len:14
34
123
94
198
118
8226
73
205
212
122
177
43
198
228
I'm not sure why the result is different under specs2.
I happen to get 14 in my sbt console, where my sbt script specifies -Dfile.encoding=UTF8.

Your new String(bytes) uses the platform's default encoding. You can supply a charset explicitly to the constructor.
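A minimal sketch of the charset dependence, assuming nothing beyond the JDK: the byte 0xC6 is a valid character in ISO-8859-1 but an incomplete sequence in UTF-8, which is exactly the kind of difference that turns 198 into 65533 between the two runs above.

```scala
import java.nio.charset.StandardCharsets

object CharsetDemo {
  def main(args: Array[String]): Unit = {
    // 0xC6 is 'Æ' (code point 198) in ISO-8859-1, but on its own it is an
    // incomplete multi-byte sequence in UTF-8, so UTF-8 decoding substitutes
    // U+FFFD (65533), the Unicode replacement character.
    val bytes = Array(0xC6.toByte)

    val latin1 = new String(bytes, StandardCharsets.ISO_8859_1)
    val utf8   = new String(bytes, StandardCharsets.UTF_8)

    println(latin1.head.toInt) // 198
    println(utf8.head.toInt)   // 65533

    // new String(bytes) with no charset uses whatever default the JVM was
    // started with (-Dfile.encoding), so its result varies per environment.
  }
}
```

Passing a fixed charset to both getBytes and the String constructor makes the decoder's output independent of how the JVM was launched.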
Second guess: you have different versions of the commons-codec library in the Test configuration and elsewhere.
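One way to rule this out is to force a single version across configurations; a hypothetical build.sbt fragment (the version number 1.9 is an assumption, substitute whichever your main code targets):

```scala
// build.sbt -- pin commons-codec so Compile and Test resolve the same version
dependencyOverrides += "commons-codec" % "commons-codec" % "1.9"
```

You can also inspect what each configuration actually resolved with `show Test/dependencyClasspath` (or `show test:dependencyClasspath` on older sbt) and compare the commons-codec entries.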
When I bump the version back to 1.1, I also get 15.

To be honest, that was going to be my second guess originally. There is a lot of history in the commons-codec project, so incompatibilities and behavior changes are not surprising. Presumably the behavior that yields 14 is the more conforming one.

It's also not surprising for a "common" dependency to appear twice in a dependency tree.