I have an unconstrained (dynamically sized) std_logic_vector generic, as suggested here: https://stackoverflow.com/a/28513417/12978575
entity lfsr is
    generic (
        INIT: std_logic_vector
    );
    [...]
end entity;
Later, I want to access the bits of INIT individually in a way where order matters. How can I ensure that the range of INIT is always descending, i.e. downto (for example 3 downto 0 for a 4-bit value), and never ascending, i.e. to (for example 0 to 3)?
You can provide a deterministic descending range with an object alias:
library ieee;
use ieee.std_logic_1164.all;

entity lfsr is
    generic (
        INIT: std_logic_vector
    );
    -- In the entity declarative part:
    alias DINIT: std_logic_vector(INIT'LENGTH - 1 downto 0) is INIT;
end entity lfsr;
Use DINIT in the architecture where you'd have previously used INIT.
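As an illustration (the state and feedback signals below are hypothetical, not part of the question), the alias gives every index a fixed meaning, so tap positions can be written without worrying about the caller's range direction:

architecture rtl of lfsr is
    alias DINIT: std_logic_vector(INIT'LENGTH - 1 downto 0) is INIT;
    -- Seed the (hypothetical) shift register state from the normalized alias:
    signal state: std_logic_vector(DINIT'range) := DINIT;
    signal feedback: std_logic;
begin
    -- state(state'high) is always the leftmost (MSB) and state(0) the
    -- rightmost (LSB), whatever direction INIT was actually passed with.
    feedback <= state(state'high) xor state(0);
end architecture;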
You could instead alias INIT in the architecture declarative part, or in any declarative region within the scope of the generic (e.g. a process statement, block statement, generate statement, or subprogram body):
library ieee;
use ieee.std_logic_1164.all;

entity lfsr is
    generic (
        INIT: std_logic_vector
    );
end entity lfsr;

architecture foo of lfsr is
    alias DINIT: std_logic_vector(INIT'LENGTH - 1 downto 0) is INIT;
begin
end architecture;
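To see why this is robust, consider a hypothetical instantiation (not from the question) that deliberately passes an ascending-range constant. Array elements are associated left to right, so inside the instance DINIT'range is still 3 downto 0 and DINIT(3) is the leftmost bit of the actual:

library ieee;
use ieee.std_logic_1164.all;

entity lfsr_tb is
end entity;

architecture sim of lfsr_tb is
    -- Deliberately ascending range; the alias inside lfsr normalizes it.
    constant seed: std_logic_vector(0 to 3) := "1011";
begin
    dut: entity work.lfsr generic map (INIT => seed);
end architecture;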