Today I started learning Simulink, and I think I can wrap my head around it. However, I came across the definition of the "Memory Block":
Definition: The Memory block holds and delays its input by one major integration time step.
I don't really understand what that means. I know that converting from continuous to discrete time is very useful (and, from what I understand, a "zero-order hold" is the way to do this), but the purpose of a Memory block as defined above is rather confusing to me.
Can someone explain in layman's terms what it does?
If operating discretely and set to "inherit sample time", it works very similarly to the (IMO) superior Unit Delay block:
Input...: 6, 4, 8, 3, 9, 1, 0, 0, 0...
Output: 0, 6, 4, 8, 3, 9, 1, 0, 0...
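To make that sequence concrete, here is a minimal Python sketch (not Simulink code; the function name and the initial condition of 0 are just illustrative) of what a unit delay does to a stream of samples:

```python
# Minimal sketch: a discrete unit delay outputs the previous input sample,
# starting from an initial condition of 0.

def unit_delay(inputs, initial_condition=0):
    """Return the input sequence delayed by one sample."""
    outputs = []
    previous = initial_condition
    for u in inputs:
        outputs.append(previous)  # output what was stored on the last step
        previous = u              # store the current input for the next step
    return outputs

print(unit_delay([6, 4, 8, 3, 9, 1, 0, 0, 0]))
# -> [0, 6, 4, 8, 3, 9, 1, 0, 0]
```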
If using continuous time, it delays its input by one "integration step", whose length depends on the solver you choose (and, with a variable-step solver, can change from step to step).
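As a rough illustration (plain Python with made-up step sizes, not an actual Simulink solver), the snippet below shows why the delay is not a fixed number of seconds: the block always returns the input from the previous major step, and the step size itself can vary.

```python
# Sketch with assumed (hypothetical) solver step sizes: the "memory" output
# is the input from the previous major step, so the delay in seconds equals
# whatever the last step size happened to be.

import math

def simulate_memory_block(step_sizes):
    t = 0.0
    stored = 0.0               # the block's initial condition
    for dt in step_sizes:
        u = math.sin(t)        # some continuous input signal
        y = stored             # output = input from the previous major step
        print(f"t={t:.3f}s  input={u:+.3f}  output={y:+.3f}")
        stored = u
        t += dt

simulate_memory_block([0.1, 0.1, 0.05, 0.2, 0.025])  # hypothetical step sizes
```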
Unit delays are common in discrete systems, e.g. in a FIR filter.
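A FIR filter can be thought of as a chain of unit delays (a tapped delay line) whose outputs are scaled and summed. A rough Python sketch, using a simple 3-tap moving average as the assumed example:

```python
# Sketch: FIR filtering as a tapped delay line. Each slot of delay_line
# plays the role of one unit delay block.

def fir_filter(inputs, coefficients):
    delay_line = [0.0] * len(coefficients)   # one slot per tap
    outputs = []
    for u in inputs:
        delay_line = [u] + delay_line[:-1]   # shift: each delay passes on its stored value
        outputs.append(sum(c * x for c, x in zip(coefficients, delay_line)))
    return outputs

# 3-tap moving average of the example sequence above
print(fir_filter([6, 4, 8, 3, 9, 1, 0, 0, 0], [1/3, 1/3, 1/3]))
```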