I’m struggling to set up timing constraints for an ADC SPI interface.
There is a lot of information in several forums, but I did not find a solution matching my case. I’m having trouble specifying correct timing constraints for an SPI interface on an FPGA. I have done quite a bit of research on this point, but I can’t reach the correct solution.
The design is a simple ADC master interface working at 25 MHz, the same frequency as the system clock. First of all, I gated the output SCLK with CS, as I have no other option (same frequency):
    SCLK <= not CLK when CS_N = '0' else '0';
I would rather use a clock enable and drive a registered clock, but that would limit the maximum output frequency to half of what I need.
    p_sclk: process(CLK, RESET_N)
    begin
        if RESET_N = '0' then
            SCLK <= '0';
        elsif (CLK'event and CLK = '1') then
            if CS_N = '1' then
                SCLK <= '0';
            else
                SCLK <= not CLK;
            end if;
        end if;
    end process p_sclk;
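For comparison, the clock-enable alternative I mentioned would look roughly like this (untested sketch): a toggle register whose output can only reach half the system frequency, which is exactly why I discarded it.

    -- sketch only: SCLK toggles once per CLK cycle, so f(SCLK) = f(CLK)/2
    p_sclk_en: process(CLK, RESET_N)
    begin
        if RESET_N = '0' then
            SCLK <= '0';
        elsif rising_edge(CLK) then
            if CS_N = '1' then
                SCLK <= '0';
            else
                SCLK <= not SCLK;  -- toggle: output frequency is CLK/2
            end if;
        end if;
    end process p_sclk_en;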
In this case I think the gated clock is correct, but it introduces a delay that must be taken into account in the output/input delay constraints, doesn’t it?
By doing this I obtain an SCLK that is inverted with respect to the system clock. This is quite good, because the ADC sets the data on its SCLK rising edge (system clock falling edge) and I register it on the system clock rising edge, so I have half a cycle during which the data is stable.
[Figure: SPI clocks and data]
My first approach to the timing constraints is the following:

    # internally generated clock from the 100 MHz input
create_generated_clock -name {system} -divide_by 4 -source [ get_ports { FPGA_CLK } ] [ get_pins { i_main_clock_gen/s_clk_25mhz_i/Q } ]
#ADC CLK
create_generated_clock -name {serial} -invert -divide_by 1 -source [ get_pins { i_main_clock_gen/s_clk_25mhz_i/Q } ] [ get_ports { ADC_CLK } ]
    set_input_delay -clock {serial} -min [expr $PCB_dly_min_SDO + $ADC_Tco_min_SDO - $PCB_dly_max_SCK] [get_ports {ADC_DATA}]
    set_input_delay -clock {serial} -max [expr $PCB_dly_max_SDO + $ADC_Tco_max_SDO - $PCB_dly_min_SCK] [get_ports {ADC_DATA}]
    set_output_delay -clock {serial} -max [expr $PCB_dly_max_CS + $ADC_Tsu_CS - $PCB_dly_min_SCK] [get_ports {ADC_CS}]
    set_output_delay -clock {serial} -min [expr $PCB_dly_min_CS - $ADC_Th_CS - $PCB_dly_max_SCK] [get_ports {ADC_CS}]
The next diagram shows the TCO max and min of the ADC:
[Figure: Timing diagram]
The problem is that the data is being registered by the system clock, so I suppose the constraints should be something like:
    Tmax_setup = T/2 - TCO_max - T_delay
    Tmin_setup = T/2 - TCO_min - T_delay

where:

T/2 is the half period, 20 ns.
TCO_max/TCO_min are from the ADC datasheet, 10 ns / 2.5 ns.
T_delay is the delay introduced from the s_clk_25mhz_i/Q pin to the ADC_CLK pin; it depends on place and route.
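Just to illustrate with my numbers (assuming, purely as an example, T_delay = 2 ns — a made-up value, since the real one is only known after place and route):

    Tmax_setup = 20 ns - 10 ns  - 2 ns = 8 ns
    Tmin_setup = 20 ns - 2.5 ns - 2 ns = 15.5 ns

So in the worst case there would still be 8 ns of margin at the capturing register, if my equations are right.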
What about hold times? How can I translate this into set_input_delay? Should I just set the constraint relative to the system clock instead of the serial clock?
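For instance, something like this is what I have in mind (untested sketch referencing the system clock; the delay variables are the same placeholders as above, and -clock_fall is meant to account for the data being launched on the system clock falling edge):

    # sketch: input delay relative to the falling edge of the system clock,
    # since the ADC launches data on SCLK rising = system clock falling
    set_input_delay -clock {system} -clock_fall -max [expr $PCB_dly_max_SCK + $ADC_Tco_max_SDO + $PCB_dly_max_SDO] [get_ports {ADC_DATA}]
    set_input_delay -clock {system} -clock_fall -min [expr $PCB_dly_min_SCK + $ADC_Tco_min_SDO + $PCB_dly_min_SDO] [get_ports {ADC_DATA}]

I’m not sure whether this is correct or whether the serial-clock version is the right way.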
What am I missing? I’m quite confused.
I hope you can clarify this a little bit for me.
Thanks in advance
Regards
Carlos Alonso