I am having a bit of a problem understanding the benefits of FPGAs for parallel processing. Everybody says they are parallel, but it looks to me like they are not truly parallel. Consider this example:
I have a data signal coming in on some pins at 1 bit per clock cycle. The FPGA receives this data and, since the data is already inside the chip, it can start processing right away. But that is serial processing, not parallel. If the FPGA instead waits for the data to accumulate and then processes it in parallel, we could call that truly parallel, but what is the benefit of waiting for the data to pile up? We just lose time: for example, if we wait for 8 bits, we lose 7 cycles. So where is the benefit of the parallelism of FPGAs? I just don't get it.
It would be parallel if the data were arriving in parallel, like on the old DB-25 parallel port connector. However, that technology became obsolete because parallel ports cannot support high speeds. Today's USB is serial, Ethernet is serial... so where is the parallelism?
The parallelism comes in when data arrives in chunks, the chunks arrive faster than a single processing unit can handle them, and each chunk can be processed independently. Rather than having to slow down the data sender, an FPGA lets you add more processing "blocks" so that the overall processing keeps up.
Example: you receive data (serially or in parallel, it doesn't matter) at 1 MB/s in 50 kB chunks, so 20 chunks arrive per second, but your algorithm only allows 1 chunk to be processed per second. In an FPGA, you can wire up the "receiver" to distribute the chunks across 20 "processors", so your sender can still send at full speed and your receiver sees far less overall lag.
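To make that concrete, here is a minimal SystemVerilog sketch of the idea, assuming a simple round-robin dispatch scheme. The module and signal names, the `LATENCY` placeholder, and the dummy per-chunk "algorithm" are all made up for illustration, not taken from any real design. A generate loop stamps out 20 copies of a processing engine, and since each copy is its own piece of hardware, all the chunks handed out in a given second are being worked on at the same time:

```systemverilog
// Illustrative sketch only: a round-robin dispatcher hands each incoming
// chunk to one of N_PROC identical processing engines. Because every engine
// is its own piece of hardware, all of them run simultaneously.

// Stand-in for the per-chunk algorithm: it just waits LATENCY cycles and
// then pulses `done`, purely so the sketch is self-contained and simulatable.
// A real engine would of course also take the chunk data itself.
module chunk_processor #(
    parameter int LATENCY = 1000
)(
    input  logic clk,
    input  logic rst,
    input  logic start,
    output logic done
);
    int unsigned cnt;
    logic        busy;

    always_ff @(posedge clk) begin
        done <= 1'b0;                       // `done` is a single-cycle pulse
        if (rst) begin
            busy <= 1'b0;
            cnt  <= 0;
        end else if (start && !busy) begin  // accept a new chunk
            busy <= 1'b1;
            cnt  <= 0;
        end else if (busy) begin
            cnt <= cnt + 1;
            if (cnt == LATENCY - 1) begin   // "processing" finished
                busy <= 1'b0;
                done <= 1'b1;
            end
        end
    end
endmodule

module chunk_farm #(
    parameter int N_PROC = 20   // 20 chunks/s arrive, each engine does 1 chunk/s
)(
    input  logic              clk,
    input  logic              rst,
    input  logic              chunk_valid,  // pulses once per fully received chunk
    output logic [N_PROC-1:0] result_valid  // one done flag per engine
);
    // Round-robin pointer: selects which engine receives the next chunk.
    logic [$clog2(N_PROC)-1:0] sel;

    always_ff @(posedge clk) begin
        if (rst)              sel <= '0;
        else if (chunk_valid) sel <= (sel == N_PROC - 1) ? '0 : sel + 1'b1;
    end

    // Stamp out N_PROC copies of the engine; each copy is separate hardware,
    // so all of them process their chunks in parallel.
    for (genvar i = 0; i < N_PROC; i++) begin : g_engine
        chunk_processor u_proc (
            .clk   (clk),
            .rst   (rst),
            .start (chunk_valid && (sel == i)),
            .done  (result_valid[i])
        );
    end
endmodule
```

The dispatcher here is just a counter and a compare; in a real design you would typically add a FIFO and some backpressure so a chunk is never handed to an engine that is still busy. The key point is the generate loop: on a CPU, 20 "workers" time-share one core, while on the FPGA they are 20 physically separate circuits all running in the same clock cycle.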