I have been learning about programming languages and there is one question which bothers me all the time.
For example, let's say I programmed something that allows me to push a button every 5 seconds.
How does the computer understand the waiting part (allow the button press, wait 5 seconds, then allow it again)?
I already know that higher-level programming languages are first compiled into machine code so that the computer can run them. But if we take assembler, for instance, which is very close to machine code, just human readable, there is no instruction for waiting.
The waiting example is just one example; there are many more things where I don't understand how the computer understands them ;)
For short delays on simple CPUs (like microcontrollers) with a known, fixed clock frequency, no multitasking, and a simple one-instruction-per-clock-cycle design, you can wait in asm with a "delay loop". Here's the Arduino source (for AVR microcontrollers) for an implementation:
https://github.com/arduino/ArduinoCore-avr/blob/master/cores/arduino/wiring.c#L120
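To make the idea concrete, here's a minimal sketch of such a delay loop in C (the linked `wiring.c` does this far more carefully). It assumes a 16 MHz clock and roughly 4 CPU cycles per loop iteration, so it's illustrative rather than cycle-exact, and `delay_approx_ms` is just a name I made up:

```c
/* Rough sketch of a calibrated busy-wait ("delay loop").
 * Assumptions: 16 MHz clock, ~4 cycles per inner loop iteration,
 * compiled with gcc/avr-gcc. Not cycle-exact, just the principle. */
#include <stdint.h>

#define F_CPU 16000000UL              /* assumed clock frequency in Hz */

static void delay_approx_ms(uint16_t ms)
{
    for (uint16_t i = 0; i < ms; i++) {
        /* iterations per millisecond = F_CPU / 1000 / cycles_per_iteration */
        for (uint32_t n = 0; n < F_CPU / 1000UL / 4UL; n++) {
            __asm__ __volatile__("nop");  /* stops the compiler from optimizing the loop away */
        }
    }
}
```

The only thing "waiting" here is the CPU executing thousands of useless instructions whose total execution time you have calculated in advance from the clock frequency.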
As you can see in that source, the behavior depends on the clock frequency of the CPU. You wouldn't normally loop for 5 seconds, though (that's a long time to burn power). Computers normally have timer and clock chips that can be programmed to raise an interrupt at a specific time, so you can put the CPU to sleep and have it woken up on the next interrupt if there's nothing else to do. Delay loops are good (on microcontrollers) for very short delays, too short to sleep for or even to program a timer for.
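Here's a rough sketch of that timer-interrupt approach for your 5-second example, assuming an ATmega328P at 16 MHz (the chip on many Arduino boards) and avr-libc. Timer1 is configured to fire a compare-match interrupt once per second, and the main loop sleeps in between:

```c
/* Sketch of interrupt-driven waiting, assuming ATmega328P @ 16 MHz + avr-libc.
 * Timer1 runs in CTC mode with a /1024 prescaler: 16 MHz / 1024 = 15625 ticks
 * per second, so counting 0..15624 gives one interrupt per second. */
#include <avr/io.h>
#include <avr/interrupt.h>
#include <avr/sleep.h>

volatile uint8_t seconds = 0;

ISR(TIMER1_COMPA_vect)                /* hardware calls this once per second */
{
    seconds++;
}

int main(void)
{
    TCCR1A = 0;
    TCCR1B = (1 << WGM12) | (1 << CS12) | (1 << CS10); /* CTC mode, clk/1024 */
    OCR1A  = 15624;                   /* compare value for a 1 s period */
    TIMSK1 = (1 << OCIE1A);           /* enable the compare-match interrupt */
    sei();                            /* globally enable interrupts */

    for (;;) {
        sleep_mode();                 /* CPU idles until the next interrupt */
        if (seconds >= 5) {
            seconds = 0;
            /* 5 seconds have passed: allow the button press here */
        }
    }
}
```

The key difference from the delay loop: the waiting is done by a separate hardware counter, and the CPU does nothing (or something else) until the hardware interrupts it.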
You might want to get a little microcontroller board (not necessarily Arduino) to play around with. There you have way less "bloat" from an operating system or libraries, and you're much closer to the hardware.