The Python code below was taken as a basis. Errors occur in each loop, a lot of errors. Please help fix them:
Code:
var t = readLine()!
var s = readLine()!
var len_s = s.count
var t_lis = Set(t)
let character: [Character] = Array(s)
var c_s: [Character: Int] = Dictionary(uniqueKeysWithValues: zip(character, Array(repeating: 1, count: character.count)))
let character2: [Character] = Array(t_lis)
var c_t: [Character: Int] = Dictionary(uniqueKeysWithValues: zip(character2, Array(repeating: 1, count: character2.count)))
var c_res = [String: String]()
var summ = 0
for e in c_s {
    c_res[e] = [c_s[e], min( c_s[e], c_t[e] )]
    summ += c_res[e][1]
}
for i in 0..<((t.count-s.count)+1) {
    if summ == len_s-1 {
        print(i)
        break
    }
    for j in c_res {
        if t[i] = c_res[j] {
            if c_res[t[i]][1] > 0 {
                c_res[t[i]][1] -= 1
                summ -= 1
            }
        }
    }
    for l in c_res {
        if (i+len_s < t.count && t[i+len_s]) = c_res {
            if c_res[ t[i+len_s] ][1] < c_res[ t[i+len_s] ][0] {
                c_res[ t[i+len_s] ][1] += 1
                summ += 1
            }
        }
    }
}
For reference, here is the original Python code that OP linked to:
from collections import Counter

t = input('t = ')
s = input('s = ')
len_s = len(s)
t_lis = list(t)
c_s = Counter(s)
c_t = Counter(t_lis[:len_s])
c_res = dict()
summ = 0
for e in c_s:
    c_res[e] = [c_s[e], min( c_s[e], c_t[e] )]
    summ += c_res[e][1]
for i in range( len(t)-len(s)+1 ):
    if summ == len_s-1:
        print(i)
        break
    if t[i] in c_res:
        if c_res[t[i]][1] > 0:
            c_res[t[i]][1] -= 1
            summ -= 1
    if i+len_s < len(t) and t[i+len_s] in c_res:
        if c_res[ t[i+len_s] ][1] < c_res[ t[i+len_s] ][0]:
            c_res[ t[i+len_s] ][1] += 1
            summ += 1
else:
    print(-1)
First I want to mention that the Python code that was linked to is pretty bad. By that, I mean that nothing is clearly named, so it's totally obtuse as to what it's trying to accomplish. I'm sure it would be clearer if I spoke Russian, or whatever the language on that page is, but it's not one of the languages I speak. I know Python programming has a different culture around it than Swift programming, since Python is often written for ad hoc solutions, but this code really should be refactored, with portions extracted into well-named functions. That would make it a lot more readable, and it might have helped you in your translation into Swift. I won't attempt those refactorings here, but once the errors are fixed, if you want to use this in any kind of production environment, you really should clean it up.
As you acknowledge, you have a lot of errors. You ask what the errors mean, but presumably you want to know how to fix the problems, so I'll address both. The errors start on this line:
c_res[e] = [c_s[e], min( c_s[e], c_t[e] )]
The first error is Cannot assign value of type '[Any]' to subscript of type 'String'. This means you are building an array containing elements of type Any and trying to assign it to c_res[e]. c_res is a Dictionary with keys of type String and values of type String. So assuming e were a String (which it isn't; more on that in a sec), then c_res[e] would have the type of the value, a String.
The natural question is why the right-hand side is an array of Any. It comes down to this: the definition of the array isn't legal, and the compiler is choking on it (basically, a by-product of other errors). The reason is that min expects all of its parameters to be of a single type that conforms to the Comparable protocol, but c_s[e] and c_t[e] are illegal... and that's because c_s and c_t are both Dictionary<Character, Int>, so they expect an index of type Character, but e isn't a Character. It's a tuple, (Character, Int). The reason is found on the preceding line:
for e in c_s{
Since c_s is Dictionary<Character, Int>, its elements are tuples containing a Character and an Int. That might be surprising for a Python programmer new to Swift. To iterate over just the keys, you have to specify that's what you want, so let's correct that:
for e in c_s.keys {
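As a side note, here's a standalone sketch of that difference; the counts dictionary is just an illustration, not part of the solution:

```swift
let counts: [Character: Int] = ["a": 2, "b": 1]

// Iterating the dictionary itself yields (key, value) tuples.
for element in counts {
    print(element.key, element.value)  // e.g. "a 2" (order is not guaranteed)
}

// The .keys (and .values) views iterate just one component.
for key in counts.keys {
    print(key)  // key is a Character
}
```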
With that fixed, the previous errors go away, but a new problem is exposed. When you index into a Dictionary in Swift, you get an optional value, because it might be nil if there is no value stored for that key, so it needs to be unwrapped. If you're sure that neither c_s[e] nor c_t[e] will be nil, you could force-unwrap them like this:
c_res[e] = [c_s[e]!, min( c_s[e]!, c_t[e]! )]
But are you sure? It's certainly not obvious that they must be. So we need to handle the optional, either with optional binding, or with nil-coalescing to provide a default value if it is nil.
for (e, csValue) in c_s {
    let ctValue = c_t[e] ?? Int.max
    c_res[e] = [csValue, min(csValue, ctValue)]
    summ += c_res[e][1]
}
Note that we've gone back to iterating over c_s instead of c_s.keys, but now we're using tuple binding to assign the key to e and the value to csValue. This avoids optional handling for c_s elements. For the value from c_t[e], I use nil-coalescing to default it to the maximum integer; that way, if c_t[e] is nil, min will still return csValue, which seems to be the intent.
But again we have exposed another problem. Now the compiler complains that we can't assign Array<Int> to c_res[e], which is expected to be a String... In Swift, a String is not an Array<Int>. I'm not sure why c_res is defined to have values of type String when the code puts arrays of Int in it... so let's redefine c_res.
var c_res = [String: [Int]]()
var summ = 0
for (e, csValue) in c_s {
    let ctValue = c_t[e] ?? Int.max
    c_res[e] = [csValue, min(csValue, ctValue)]
    summ += c_res[e][1]
}
To paraphrase a Nirvana lyric, "Hey! Wait! There is a new complaint!" Specifically, e is of type Character, but c_res is Dictionary<String, Array<Int>>, so let's just make c_res a Dictionary<Character, Array<Int>> instead.
var c_res = [Character: [Int]]()
var summ = 0
for (e, csValue) in c_s {
    let ctValue = c_t[e] ?? Int.max
    c_res[e] = [csValue, min(csValue, ctValue)]
    summ += c_res[e][1]
}
Yay! Now we've resolved all the errors on the line we started with... but there's now one on the next line: Value of optional type '[Int]?' must be unwrapped to refer to member 'subscript' of wrapped base type '[Int]'. Again, this is because when we index into a Dictionary, the value for our key might not exist. But we just computed the value we want to add to summ in our call to min, so let's save that off and reuse it here.
var c_res = [Character: [Int]]()
var summ = 0
for (e, csValue) in c_s {
    let ctValue = c_t[e] ?? Int.max
    let minC = min(csValue, ctValue)
    c_res[e] = [csValue, minC]
    summ += minC
}
Now we finally have no errors in the first loop. The remaining errors are in the nested loops.
For starters, the code uses = to test for equality. As with all languages in the C family, Swift uses == as the equality operator. That change, which is needed in a couple of places, is pretty straightforward, so I won't show that iteration. Once those are fixed, we get one of my favorite (not) errors in Swift, Type of expression is ambiguous without more context, on this line:
if t[i] == c_res[j] {
These ambiguity errors can mean one of a few things. The main cause is that elements of the expression match several definitions, and the compiler has no way to figure out which one should be used; that flavor is often accompanied by references to the possible matches. It also seems to happen when multiple type-check failures combine in a way the compiler can't turn into a clearer error. I think that's the version happening here. The source of this problem goes back to the outer loop
for i in 0..<((t.count-s.count)+1) {
which makes the loop variable, i, be of type Int, combined with using i to index into t, which is a String. The problem is that you can't index into a String with an Int. You have to use String.Index. The reason comes down to String consisting of Unicode characters and using UTF-8 internally, which means its characters have different lengths. Indexing with an Int the same way you would for an element of an Array would require O(n) work, but subscripting a String is expected to be O(1). String.Index solves this by having String methods like index(after:) compute indices from other indices. Basically, indexing into a String is kind of a pain, so in most cases Swift programmers do something else, usually relying on the many methods String supports for manipulating it. As I'm writing this, I haven't yet worked out what the code is supposed to be doing, which makes it hard to figure out which String methods might be helpful here, so let's just convert t to [Character]; then we can use integers to index into it:
var t = [Character](readLine()!)
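(For completeness, here's what direct indexing with String.Index looks like. This is a standalone illustration, not part of the solution:

```swift
let word = "hello"

// Subscripting a String requires a String.Index, not an Int.
let first = word[word.startIndex]                           // "h"
let second = word[word.index(after: word.startIndex)]       // "e"
let third = word[word.index(word.startIndex, offsetBy: 2)]  // "l"
print(first, second, third)
```

You can see why converting to [Character] is often the simpler route when a loop counter is involved.)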
That still gives an ambiguous expression error though, so I looked at the equivalent line in the Python code. This revealed a logic error in translation. Here's the Python:
if t[i] in c_res:
    if c_res[t[i]][1] > 0:
        c_res[t[i]][1] -= 1
        summ -= 1
There is no loop. It looks like the loop was introduced to mimic the check to see if t[i] is in c_res, which is one way to do it, but it was done incorrectly. Swift has a way to do that more succinctly:
if c_res.keys.contains(t[i]) {
    if c_res[t[i]][1] > 0 {
        c_res[t[i]][1] -= 1
        summ -= 1
    }
}
But we can use optional binding to clean that up further:
let tChar = t[i]
if let cResValue = c_res[tChar] {
    if cResValue[1] > 0 {
        c_res[tChar][1] -= 1
        summ -= 1
    }
}
But again we have the problem of indexing into a Dictionary returning an optional that needs unwrapping, on the line
c_res[tChar][1] -= 1
Fortunately, we just ensured that c_res[tChar] exists when we bound it to cResValue, and the only reason we need to index into it again is that we need to update the dictionary's value... this is a good use of a force-unwrap:
let tChar = t[i]
if let cResValue = c_res[tChar] {
    if cResValue[1] > 0 {
        c_res[tChar]![1] -= 1
        summ -= 1
    }
}
The last loop also seems to be the result of testing for existence in c_res, and the loop variable isn't even used. Here's the original Python:
if i+len_s < len(t) and t[i+len_s] in c_res:
    if c_res[ t[i+len_s] ][1] < c_res[ t[i+len_s] ][0]:
        c_res[ t[i+len_s] ][1] += 1
        summ += 1
We can use optional binding here combined with the comma-if syntax, and another force-unwrap. One correction to the straight translation: the bounds check has to come before the subscript, because evaluating t[i + len_s] when i + len_s equals t.count would trap at runtime. Putting the subscript after the count check in the comma-if takes care of that:
if i + len_s < t.count, let cResValue = c_res[t[i + len_s]] {
    if cResValue[1] < cResValue[0] {
        c_res[t[i + len_s]]![1] += 1
        summ += 1
    }
}
Note that this subscripts t directly rather than reusing tChar; that keeps the possibly out-of-range access safely behind the bounds check.
Now it all compiles. It definitely needs refactoring, but here it is altogether:
import Foundation

var t = [Character](readLine()!)
var s = readLine()!
var len_s = s.count
var t_lis = Set(t)
let character: [Character] = Array(s)
var c_s: [Character: Int] = Dictionary(uniqueKeysWithValues: zip(character, Array(repeating: 1, count: character.count)))
let character2: [Character] = Array(t_lis)
var c_t: [Character: Int] = Dictionary(uniqueKeysWithValues: zip(character2, Array(repeating: 1, count: character2.count)))
var c_res = [Character: [Int]]()
var summ = 0
for (e, csValue) in c_s {
    let ctValue = c_t[e] ?? Int.max
    let minC = min(csValue, ctValue)
    c_res[e] = [csValue, minC]
    summ += minC
}
for i in 0..<((t.count-s.count)+1) {
    if summ == len_s-1 {
        print(i)
        break
    }
    let tChar = t[i]
    if let cResValue = c_res[tChar] {
        if cResValue[1] > 0 {
            c_res[tChar]![1] -= 1
            summ -= 1
        }
    }
    if i + len_s < t.count, let cResValue = c_res[t[i + len_s]] {
        if cResValue[1] < cResValue[0] {
            c_res[t[i + len_s]]![1] += 1
            summ += 1
        }
    }
}
If all of this makes you wonder why anyone would use such a picky language, there are two things to consider. The first is that you don't get so many errors when writing code originally in Swift, or even when translating from another strongly typed language. Converting from a dynamically typed language like Python is a problem because, apart from other subtle differences, you also have to pin down its very flexible view of data to some concrete type, and how to do that isn't always obvious. The other thing is that strongly typed languages let you catch huge classes of bugs early, because the type system won't even let them compile.