
DOWNCASTING IN SWIFT


I have an issue using Swift. I have a class:

class ClassA : UIView{

    override public init(frame: CGRect) {
        super.init(frame: frame)
    }
    
    required public init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
    }

    init(from uiview: UIView) {
        super.init(frame: .zero)
    }

}

Then I have another class PageViewer

class PageViewer: UIView{

    var values : [ClassA]?
    var location: Int = 0

    public func test(){
        for i in 0..<values!.count {
            location = i
            values![i] = self.subviews[location] as! ClassA
        }
    }
}

Problem: I'm trying to add UIViews as subviews to the PageViewer, but I'm getting a runtime exception on the line `values![i] = self.subviews[location] as! ClassA` saying "Could not cast value of type 'UIView' to 'MyTestApp.ClassA'". I want to know if there is a possible workaround for this.
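
For context, this is a sketch (assuming the `ClassA` and `PageViewer` definitions above) of how the crash is triggered: the forced downcast `as! ClassA` fails as soon as `subviews` contains a plain `UIView`, because a `UIView` is not an instance of the `ClassA` subclass.

    import UIKit

    let pager = PageViewer()

    // A plain UIView in the hierarchy makes the later `as! ClassA` crash:
    // the runtime type is UIView, which cannot be downcast to ClassA.
    pager.addSubview(UIView())

    // A ClassA instance would downcast fine, since ClassA is a UIView.
    pager.addSubview(ClassA(frame: .zero))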


Solution

  • You can always cast up (every `ClassA` is a `UIView`), but only sometimes cast down (a `UIView` is not necessarily a `ClassA`). Your loop asserts that every subview is a `ClassA`, so the forced downcast crashes as soon as one of them isn't. Depending on what you are trying to do, there may be a more elegant way of doing it, but if you really need a workaround, then:

    self.subviews.forEach { subView in
        if subView is ClassA {
            values?.append(subView as! ClassA)
        }
    }
    

    That should work, because the `is` check validates that the subview really is a `ClassA` before the forced downcast, so the cast can no longer fail at runtime.
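
The same filter-and-downcast pattern can be written more compactly with `compactMap`, which keeps only the elements whose conditional cast (`as?`) succeeds. Here is a minimal, UIKit-free sketch; `SomeBase` and `SomeSub` are hypothetical stand-ins for `UIView` and `ClassA`:

```swift
// Hypothetical stand-ins for UIView / ClassA.
class SomeBase {}
class SomeSub: SomeBase {}

let mixed: [SomeBase] = [SomeBase(), SomeSub(), SomeBase()]

// `as?` returns nil when the downcast fails, and compactMap drops the nils,
// so only the genuine SomeSub instances survive.
let subs = mixed.compactMap { $0 as? SomeSub }
print(subs.count)  // 1
```

Applied to the question, `values = self.subviews.compactMap { $0 as? ClassA }` replaces the whole loop and never crashes, since no forced cast is involved.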