I have written a testbench for a program in Verilog. The weird problem is that the simulator shows an input completely different from the one I provided, so the output is affected as well. Why is this happening? I am testing the code in Xilinx. Here is the testbench.
My input is 1010101 but the simulator is showing 0110101.
module HamDecoderTop;
    // Inputs
    reg clk;
    reg rst;
    reg [6:0] hword;
    // Outputs
    wire [3:0] data;

    HammingDecoder uut (
        .clk(clk),
        .rst(rst),
        .hword(hword),
        .data(data)
    );

    initial begin
        // Initialize Inputs
        clk = 0;
        rst = 0;
        #1 rst = 1;
        #10 hword = 1010101;
    end

    always
        #2 clk = ~clk;
endmodule
Here is what the simulator is showing.
You need to prefix your hword literal with 7'b if you want it to be interpreted as a seven-bit binary value. By default an unsized, unbased literal is interpreted as the decimal value 1,010,101, which in binary is 11110110100110110101. Since hword is only 7 bits wide, the assignment keeps just the lowest 7 bits of that value, 0110101, which is exactly what you're seeing in the simulator.
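Concretely, only the stimulus assignment in your initial block needs to change; a minimal sketch (the rest of the testbench stays as you wrote it):

```verilog
initial begin
    // Initialize Inputs
    clk = 0;
    rst = 0;
    #1 rst = 1;
    // 7'b marks the literal as a sized binary value, so hword
    // receives the bit pattern 1010101 instead of the low 7 bits
    // of the 32-bit decimal number 1,010,101.
    #10 hword = 7'b1010101;
end
```

The same applies to any other pattern you drive onto hword: use a sized, based literal (7'b..., or 7'h55 for hex) so the simulator never has to truncate a wider decimal value.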