Ok so this is not a homework question obviously, but here is the thing: the bubble sort algorithm is said to be O(n^2), Ω(n). However, if we plot its running time as a graph (average case) and try to lower-bound it, a more accurate lower bound would be Ω(n^2). Yet contextually we see that Ω(n) is correct. So why doesn't lower-bounding the algorithm's runtime work here?
I think you're mixing up concepts:
Lower bound vs upper bound: Ω(f(n)) is a lower bound, and O(f(n)) is an upper bound.
Best vs worst case: Unless otherwise stated, both Ω(f(n)) and O(f(n)) refer to the worst case running time, as a function of the input size.
For bubble sort, the worst-case input of size n is guaranteed to take quadratic time, so bubble sort is O(n^2) and Ω(n^2).
The reason bubble sort "is said to be Ω(n)" is that a lot of people mess this up: they use Ω(n) to describe the best case (an already-sorted input, with the usual early-exit check), which is a different statement from a lower bound on the worst case.
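To make the best-case/worst-case distinction concrete, here is a minimal sketch of bubble sort with the common early-exit flag (the function name and details are my own illustration, not something from the question):

```python
def bubble_sort(a):
    """Sort a list; a minimal sketch with the early-exit optimization."""
    a = list(a)
    n = len(a)
    for i in range(n - 1):
        swapped = False
        # Each pass bubbles the largest remaining element to the end.
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        # Best case: an already-sorted input does one pass (n - 1 comparisons)
        # and stops here -- this is where the "Ω(n)" (best-case) figure comes from.
        if not swapped:
            break
    return a

# Worst case: a reverse-sorted input never exits early and performs
# n(n-1)/2 comparisons, i.e. quadratic time.
print(bubble_sort([5, 4, 3, 2, 1]))  # [1, 2, 3, 4, 5]
print(bubble_sort([1, 2, 3, 4, 5]))  # already sorted: finishes after one pass
```

So the linear figure only describes the best case; the worst case of this same procedure is still quadratic.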
Remember that Ω(f(n)) is the set of functions that are asymptotically bounded from below by f(n), and when you see "Algorithm X is Ω(f(n))", read it as "The worst-case running time of Algorithm X is in Ω(f(n))".
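For completeness, the standard textbook definition of Ω (a general fact, not taken from this answer) can be written as:

```latex
% f is bounded from below by g, up to a constant factor, for all large enough n
f(n) \in \Omega(g(n))
  \iff
  \exists\, c > 0,\ \exists\, n_0 \ge 0 :\quad
  f(n) \ge c \cdot g(n) \ \text{ for all } n \ge n_0
```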
(Note, however, that Dmitry's answer is also correct: because Ω is a lower bound, Ω(1), Ω(log n), Ω(n), Ω(n log n), and Ω(n^2) all apply.)