I'm sending data to an A/D converter and I need the command data to be delayed by at least 50 ns from clk_19khz. Here is what I have so far. How do I insert a delay of 50 ns, which is a requirement of the A/D, between clk_19khz and my first Dout bit to the A/D? I'm using a Xilinx FPGA. Thanks for the help!
library IEEE;
use IEEE.STD_LOGIC_1164.ALL;
-- Uncomment the following library declaration if using
-- arithmetic functions with Signed or Unsigned values
--use IEEE.NUMERIC_STD.ALL;
-- Uncomment the following library declaration if instantiating
-- any Xilinx primitives in this code.
--library UNISIM;
--use UNISIM.VComponents.all;
entity PSOL is
    Port ( clk       : in  std_logic;
           clk_19khz : out std_logic;
           Dout      : out std_logic);
end PSOL;
architecture Behavioral of PSOL is
    signal temp   : std_logic := '0';                -- toggled every 1302 clk cycles -> ~19.2 kHz
    signal count  : integer range 0 to 1301 := 0;
    signal temp2  : std_logic;
    signal dcount : integer range 0 to 11 := 0;      -- counts the data bits of the word
    signal start  : std_logic := '1';                -- indicates the start of the word/command
    signal parity : std_logic := '1';                -- used to verify the data sent
    signal stop   : std_logic := '0';                -- indicates when the word/command has been sent
    --signal chip_select : bit := '1';               -- active low
begin
    process (clk)
    begin
        if (clk'event and clk = '1') then
            if (count = 1301) then
                temp  <= not temp;
                count <= 0;
            else
                count <= count + 1;
            end if;
        end if;
    end process;

    clk_19khz <= temp;
    temp2     <= temp;
    process (temp2)
    begin
        if (temp2'event and temp2 = '0') then
            dcount <= dcount + 1;
            parity <= '1';
            stop   <= '0';
            start  <= '1';
            if (dcount < 12 and start = '1' and stop = '0') then
                case dcount is
                    when 1  => Dout <= start;   -- need 50 ns delay before this
                    when 2  => Dout <= '0';
                    when 3  => Dout <= '1';
                    when 4  => Dout <= '0';
                    when 5  => Dout <= '1';
                    when 6  => Dout <= '0';
                    when 7  => Dout <= '0';
                    when 8  => Dout <= '1';
                    when 9  => Dout <= '1';
                    when 10 => Dout <= parity;
                    when 11 => Dout <= '0';
                    when others => null;
                end case;
            end if;
        end if;
        --dcount <= 0;
        --start <= '1';
    end process;
end Behavioral;
Your clock (50 MHz) has a period of 20 ns, so you need to wait at least 3 clock pulses (3 × 20 ns = 60 ns ≥ 50 ns). A modulo-3 counter can count off that delay.
Declarations (note that the unsigned type requires use IEEE.NUMERIC_STD.ALL;):
signal delay_en : std_logic;
signal delay_us : unsigned(1 downto 0) := (others => '0');
signal delay_ov : std_logic;
Usage:
process(clk)
begin
    if rising_edge(clk) then
        if (delay_en = '1') then
            delay_us <= delay_us + 1;
        else
            delay_us <= (others => '0');
        end if;
    end if;
end process;

delay_ov <= '1' when (delay_us = 2) else '0';
Your current implementation needs to assert delay_en while it is waiting for the timespan. Once the delay has elapsed, the counter emits the signal delay_ov (ov = overflow), which your logic can use to move on in the algorithm. Your code should then deassert delay_en, which clears the counter back to 0.
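For illustration, here is one way such a handshake could be wired up. This is only a rough sketch: it assumes everything runs in the 50 MHz clk domain (the falling edge of the ~19 kHz temp signal is detected synchronously instead of being used as a clock), and the state names WAIT_EDGE/WAIT_DELAY, the temp_d and bit_idx signals and the WORD constant are made-up placeholders, not part of the original design. If used, it would replace the existing process clocked on temp2, otherwise Dout would have two drivers.

-- Declarations (architecture declarative part). Sketch only; WORD is a
-- placeholder and must be replaced by the real 11-bit command word.
type state_t is (WAIT_EDGE, WAIT_DELAY);
signal state   : state_t := WAIT_EDGE;
signal temp_d  : std_logic := '0';                 -- delayed copy of temp for edge detection
signal bit_idx : integer range 0 to 10 := 0;
constant WORD  : std_logic_vector(0 to 10) := "10101001101";  -- placeholder word, sent index 0 first

-- Process (architecture body), running in the 50 MHz clk domain.
process(clk)
begin
    if rising_edge(clk) then
        temp_d <= temp;
        case state is
            when WAIT_EDGE =>
                -- falling edge of the ~19 kHz clock, seen in the clk domain
                if (temp_d = '1' and temp = '0') then
                    delay_en <= '1';               -- start the >= 50 ns counter
                    state    <= WAIT_DELAY;
                end if;
            when WAIT_DELAY =>
                if (delay_ov = '1') then           -- required delay has elapsed (>= 60 ns)
                    delay_en <= '0';               -- deassert, which clears the counter
                    Dout     <= WORD(bit_idx);     -- drive the next data bit only after the delay
                    if (bit_idx = 10) then
                        bit_idx <= 0;              -- word finished, start over
                    else
                        bit_idx <= bit_idx + 1;
                    end if;
                    state <= WAIT_EDGE;            -- wait for the next 19 kHz edge
                end if;
        end case;
    end if;
end process;

The key point is simply that delay_en is asserted right after the 19 kHz edge and held until delay_ov comes back, so the first transition on Dout can never occur earlier than the counted 60 ns after that edge.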