I understand how structs, classes, and protocols work at a basic level. I have a rather common situation:
I need generic value types with operators, and they really must copy on assignment. These types have a complex structure, and I would like to be able to specialise by subclassing; otherwise there will be duplicated code everywhere, which is poor programming.
I have tried protocols and extensions, but because the protocol wasn't generic I was unable to define the (generic) operators I wanted. If I use classes, I don't get copy on assignment.
Today's example: I have Matrix, and SquareMatrix under it with square-matrix-specific functions. There are operators, and the matrices can be populated by anything conforming to my ring protocol. I tried defining almost all of the functionality in a protocol with an associated type, plus an extension.
Edit: I am really wondering what I should be coding. In the matrix situation I need to be able to pass a square matrix anywhere another matrix is accepted, so is subclassing the only option? Maybe I'm wrong. The main issue is that when I write a function which works with the internal values, I have to know the generic type argument to do anything useful. For example, when defining addition I have to create a new matrix and declare its generic type, but where do I get that from when all I know is that something conforms to a (non-generic) protocol? Its real type is generic, but despite the protocol having this associated type, I have no way of getting it out.
Solution, thanks to Alexander Momchliov: essentially, more work was needed to move the code fully into the protocol extension and to use Self for all the relevant types. Inside the extension the compiler was happy about what the generic types were.
The code was private, so I'm sorry I was unable to paste any of it into this question. Thanks for your patience and help.
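For illustration only, here is a minimal sketch of the pattern described in the edit above. It is not the original (private) code; names like Ring, MatrixType, and Matrix are placeholders. The operator lives in the protocol extension, and Self stands in for the concrete generic type, which is what lets the compiler work out what to construct:

struct-sketch.swift:

    // Hypothetical ring: types that can be added and multiplied.
    protocol Ring {
        static func + (lhs: Self, rhs: Self) -> Self
        static func * (lhs: Self, rhs: Self) -> Self
        static var zero: Self { get }
    }

    // Int already provides +, * and zero, so the conformance is empty.
    extension Int: Ring {}

    protocol MatrixType {
        associatedtype Element: Ring
        var rows: Int { get }
        var columns: Int { get }
        subscript(row: Int, column: Int) -> Element { get set }
        init(rows: Int, columns: Int)
        static func + (lhs: Self, rhs: Self) -> Self
    }

    extension MatrixType {
        // Written once against the protocol; `Self` resolves to the concrete
        // conforming type, so the result is constructed with the right
        // generic argument and copies on assignment like any value type.
        static func + (lhs: Self, rhs: Self) -> Self {
            precondition(lhs.rows == rhs.rows && lhs.columns == rhs.columns)
            var result = Self(rows: lhs.rows, columns: lhs.columns)
            for r in 0..<lhs.rows {
                for c in 0..<lhs.columns {
                    result[r, c] = lhs[r, c] + rhs[r, c]
                }
            }
            return result
        }
    }

    // A value type: assignment copies. A SquareMatrix struct could conform
    // to the same protocol and pick up the operator for free.
    struct Matrix<Element: Ring>: MatrixType {
        let rows: Int
        let columns: Int
        private var storage: [Element]

        init(rows: Int, columns: Int) {
            self.rows = rows
            self.columns = columns
            storage = Array(repeating: Element.zero, count: rows * columns)
        }

        subscript(row: Int, column: Int) -> Element {
            get { storage[row * columns + column] }
            set { storage[row * columns + column] = newValue }
        }
    }

    var m = Matrix<Int>(rows: 2, columns: 2)
    m[0, 0] = 1
    let sum = m + m     // uses the operator from the protocol extension
    print(sum[0, 0])    // 2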
Struct inheritance/polymorphism wouldn't be possible for at least 2 reasons (that I can think of).
Structs are stored and moved around by value. This requires the compiler to know, at compile time, the exact size of the struct, so that it knows how many bytes to copy starting from the address of a struct instance.
Suppose there was a struct A, and a struct B that inherits from A. Whenever the compiler sees a variable of type A, it has no way to be sure whether the runtime type will really be an A, or whether a B was used instead. If B added new stored properties that A didn't have, then B's size would be different (bigger) than A's. The compiler would be unable to determine the runtime type, and therefore the size, of these structs.
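To make the size point concrete, here's a small illustrative comparison (the struct and class names are made up): struct sizes are baked in at compile time, whereas a class-typed variable is always just a reference, so the static and runtime types are free to differ.

    // Structs occupy exactly their stored properties, known at compile time.
    struct Vector2 { var x: Double, y: Double }
    struct Vector3 { var x: Double, y: Double, z: Double }

    print(MemoryLayout<Vector2>.size) // 16
    print(MemoryLayout<Vector3>.size) // 24

    // Classes are moved around as references, so a Base-typed variable is
    // always pointer-sized no matter which subclass it actually refers to.
    class Base { var x = 0.0 }
    class Derived: Base { var y = 0.0 } // adds storage; reference size unchanged
    let object: Base = Derived()
    print(MemoryLayout<Base>.size)      // 8 on a 64-bit platform
    print(type(of: object))             // Derived — runtime type differs from static type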
Polymorphism would require a function table. A function table would be stored as a static member of the struct type. But to access this static member, every struct instance would need an instance member which encodes the type of the instance. This is usually called the "isa" pointer (as in, this instance is an A). That would be 8 bytes of overhead (on 64-bit systems) for every instance. Considering Int, Bool, Double, and many other common types are all implemented as structs, this would be an unacceptable amount of overhead. Just think: a Bool is a one-byte value, which would need 8 bytes of overhead. That's 11% efficiency!
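To put rough numbers on that (illustrative; sizes are for a typical 64-bit platform):

    // Small value types carry no per-instance type metadata today:
    print(MemoryLayout<Bool>.size)    // 1
    print(MemoryLayout<Int>.size)     // 8
    print(MemoryLayout<Double>.size)  // 8

    // If every Bool also had to store an 8-byte "isa" pointer, an instance
    // would occupy 1 + 8 = 9 bytes, so only 1/9 ≈ 11% of the storage
    // would be the actual value.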
For these reasons, protocols play a huge part in Swift: they let you introduce inheritance-like behaviour without these issues.
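As a small sketch of what that looks like in practice (names are made up): a protocol plus an extension gives structs shared behaviour without a base class, and without giving up value semantics.

    protocol Describable {
        var name: String { get set }
    }

    extension Describable {
        // Shared default implementation, no base class required.
        func describe() -> String { "I am \(name)" }
    }

    struct Robot: Describable { var name: String }

    var a = Robot(name: "R2")
    var b = a            // value semantics: b is an independent copy
    b.name = "C3PO"
    print(a.describe())  // "I am R2"
    print(b.describe())  // "I am C3PO"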