In my implementation of a hash table, my hash function simply takes the item I pass, calls hashCode (inherited from the Object class) on it, and takes the result modulo the size of the internal array. This internal array is an array of LinkedLists. Now, if my LinkedLists become too long (and my efficiency slips from O(1) toward O(n)), I figured it would make sense to simply grow the array. But that's where my problem lies: as I said, I hash the items I pass and take the result modulo the size of the array (which has just changed). If I were to continue, wouldn't the hashes point to different indices in the array, thus losing the ability to refer to items in my hash table? How could I solve this?
You need the actual hash values for each of the items so that you can put them into the correct hash chain in the resized table. (Otherwise, as you observed, the items are liable to end up on the wrong chain and to not be locatable as a result.)
There are two ways to deal with this:
You could simply recalculate the hash value for each item as you add it to the new table.
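That first approach can be sketched roughly like this (class and method names are illustrative, not from your code; note `Math.floorMod` is used instead of `%` so a negative `hashCode` still yields a valid index):

```java
import java.util.LinkedList;

// Hypothetical sketch of approach 1: when the table grows, walk every
// chain and recompute each item's bucket index against the NEW capacity.
public class RehashSketch {
    static LinkedList<Object>[] resize(LinkedList<Object>[] old) {
        @SuppressWarnings("unchecked")
        LinkedList<Object>[] bigger = new LinkedList[old.length * 2];
        for (int i = 0; i < bigger.length; i++) {
            bigger[i] = new LinkedList<>();
        }
        for (LinkedList<Object> chain : old) {
            for (Object item : chain) {
                // hashCode() is called again here, on every resize.
                int index = Math.floorMod(item.hashCode(), bigger.length);
                bigger[index].add(item);
            }
        }
        return bigger;
    }
}
```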
You could keep a copy of the original hash value for each item in the hash chain. This is what the standard Java HashMap implementation does ... at least in the versions I've looked at.
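The second approach amounts to storing the hash in each chain node, roughly as `java.util.HashMap` caches it in its internal `Node.hash` field. A minimal sketch (names are illustrative):

```java
// Hypothetical sketch of approach 2: each chain node caches the hash
// computed at insertion time, so a resize never calls hashCode() again.
public class CachedHashNode {
    final int hash;       // computed once, when the item is inserted
    final Object key;
    CachedHashNode next;  // next node in the same bucket's chain

    CachedHashNode(Object key, CachedHashNode next) {
        this.hash = key.hashCode(); // potentially expensive; runs only here
        this.key = key;
        this.next = next;
    }

    // During a resize, reuse the stored hash to find the new bucket.
    int bucketFor(int newCapacity) {
        return Math.floorMod(hash, newCapacity);
    }
}
```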
(The latter is a time-for-space trade-off that can pay off handsomely if your items have an expensive hashCode method. However, amortized over the lifetime of the hash table, this optimization does not change the "big O" complexity of any of the public API methods ... assuming that your table resizing is exponential; e.g. you roughly double the table size each time.)