Both concepts allow new data types to be created. The only difference I can see is that in functional languages, one can perform pattern matching on algebraic data types, and there is no comparably succinct feature in OO languages. Is this an accurate statement?
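For concreteness, here is a minimal Haskell sketch of what I mean by pattern matching on an algebraic data type; the `Shape` type and `area` function are invented for illustration, with each alternative of the type handled by one equation:

```haskell
-- A small algebraic data type with two alternatives.
data Shape
  = Circle Double            -- radius
  | Rectangle Double Double  -- width, height

-- Pattern matching: one equation per constructor.
area :: Shape -> Double
area (Circle r)      = pi * r * r
area (Rectangle w h) = w * h
```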
I can see three major differences between algebraic data types and OO-style classes, not counting (im)mutability because that varies.
One thing I deliberately left out of that list was subtyping. While the vast majority of OO languages let you subclass any (non-final, non-sealed, currently accessible) class, and the vast majority of ML-family functional languages do not allow subtyping of algebraic data types, neither restriction is essential: a hypothetical OO (or at least OO-like) language could forbid inheritance entirely, and it is likewise possible to produce subtyping and supertyping for algebraic data types. For a limited example of the latter, see this page on O'Haskell, which has since been succeeded by Timber. A sketch of the gap such extensions address follows.
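To make the contrast concrete, here is a purely illustrative Haskell sketch (invented types, and not the O'Haskell mechanism itself) of why plain ADTs lack subtyping: a value of a "narrower" variant type cannot be used where the "wider" type is expected without an explicit, hand-written conversion.

```haskell
-- A wider type covering several kinds of events.
data Event = KeyPress Char | MouseClick Int Int | Resize Int Int

-- A separate, narrower type covering only the mouse-related case.
data MouseEvent = Click Int Int

-- In plain Haskell there is no subtype relationship between these types;
-- the connection must be spelled out as an explicit injection function.
toEvent :: MouseEvent -> Event
toEvent (Click x y) = MouseClick x y
```

Extensions like O'Haskell's datatype subtyping were meant to let the narrower type be used directly where the wider one is expected, without the explicit injection.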