Tags: matlab, filter, signal-processing, convolution, lowpass-filter

Applying a low-pass filter


I want to simulate an interpolator in MATLAB using upsampling followed by a low-pass filter. First, I have upsampled my signal by inserting zeros:

[Figure: upsampled signal]

Now I want to apply a low-pass filter in order to interpolate. I have designed the following filter:

[Figure: filter design]

The filter cutoff is at exactly 1/8 of the normalized frequency because I need to downsample afterwards (it's a specific exercise to upsample, interpolate, and downsample in this particular order).
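
For reference, a filter with this cutoff could be designed, for example, with fir1 (a sketch only; the order of 64 is an arbitrary choice and this fir1 design merely stands in for the filter designed above):

    lpf5mhz = fir1(64, 0.125);  % linear-phase FIR low-pass, cutoff at 0.125 of the
                                % Nyquist frequency after upsampling (5 MHz at 80 MHz)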

However, when I apply this filter to my data using filter(myfilter, data), the following signal is generated:

[Figure: filtered signal]

I really don't know what is happening to my signal, because in theory an interpolated signal should appear. This is the first time I'm working with filters in MATLAB; until now we only had the theory and assumed ideal filters and analytical solutions.

Can someone give me an indication of what might be wrong? I use the following code:

clear all; close all;

% Set parameters
fs=10e6;   % original sample rate
N=10;      % time range = N/fs seconds
c=3/fs;    % Gaussian pulse width
k=3;
M=8;       % upsampling factor

% Set time vector and signal
t=0:fs^-1:N*fs^-1;
x=exp(-(t.^2)./(2.*c.^2)); % Gaussian signal

% Upsample by factor M: place the original samples at every M-th position, zeros elsewhere
tu=0:(fs*M)^-1:N*fs^-1;
xu=zeros(1,size(tu,2));
sample_range=1:M:size(xu,2);
for i=1:size(x,2)
    xu(sample_range(i))=x(i);
end

%% Direct Method

% lpf5mhz is the low-pass filter designed above (not defined in this script)
xf=filter(lpf5mhz,xu);

Solution

  • As suggested by hotpaw2's answer, the low-pass filter needs some time to ramp up to the input signal values. This is particularly obvious with signals with sharp steps such as yours (the signal implicitly includes a large step at the first sample, since past samples are assumed to be zero by the filter call). Also, with your design parameters the delay of the filter is greater than the maximum time shown on your output plot (1e-6 s), so the output remains very small over the range shown.
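
    As a quick check of that delay (a sketch, assuming lpf5mhz is a vector of linear-phase FIR coefficients, as in the filter(lpf5mhz,1,xu) call further down; its actual length depends on your design):

    fu = M*fs;                                 % sample rate after upsampling (80 MHz)
    group_delay = (length(lpf5mhz)-1)/2/fu;    % delay in seconds of a linear-phase FIR
    fprintf('filter delay = %g s\n', group_delay);
    % If this exceeds the 1e-6 s plotted, the ramp-up simply happens off-screen.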

    To illustrate the point, we can take a look at the filtered output with smaller filter lengths (and correspondingly smaller delays), using filters generated with fir1(length,0.125):

    [Figure: filtered outputs for smaller filter lengths]
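
    A sketch of how such a comparison could be generated (the filter orders below are illustrative, not necessarily the ones used for the plots above):

    figure; hold on;
    for n = [16 64 256]
        b = fir1(n, 0.125);        % linear-phase low-pass FIR, cutoff 0.125*Nyquist
        plot(tu, filter(b, 1, xu));
    end
    legend('order 16', 'order 64', 'order 256');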

    Given a signal with a smoother transition, such as a Gaussian pulse that has been sufficiently time-delayed:

    delay = 10/fs;
    x=exp(-((t-delay).^2)./(2.*c.^2)); % Gaussian signal
    

    the filter can better ramp up to the signal value:

    [Figure: filtered output for the time-delayed Gaussian pulse]

    The next thing you may notice is that the filtered output has 1/Mth the amplitude of the unfiltered signal (only 1 out of every M samples of the upsampled input is non-zero). To get an interpolated signal with an amplitude similar to the unfiltered signal, you have to scale the filter output with:

    xf=M*filter(lpf5mhz,1,xu);
    

    [Figure: filtered output scaled by M]

    Finally, the signal is delayed by the filtering operation, so for comparison purposes you may want to plot a time-shifted version with:

    filter_delay = (1/(M*fs))*(length(lpf5mhz)-1)/2;   % group delay of the linear-phase FIR
    plot(tu-filter_delay, xf);
    

    [Figure: delay-compensated filtered output]
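
    Putting the pieces together, a minimal self-contained sketch (using a fir1 design of arbitrary order 64 as a stand-in for lpf5mhz) could look like:

    fs = 10e6; N = 10; c = 3/fs; M = 8;
    delay = 10/fs;

    t  = 0:1/fs:N/fs;
    x  = exp(-((t-delay).^2)./(2*c^2));       % time-delayed Gaussian pulse

    tu = 0:1/(M*fs):N/fs;
    xu = zeros(size(tu));
    xu(1:M:end) = x;                          % upsample by inserting zeros

    b  = fir1(64, 0.125);                     % low-pass FIR, 5 MHz cutoff at the 80 MHz rate
    xf = M*filter(b, 1, xu);                  % filter and compensate the 1/M amplitude loss

    filter_delay = (length(b)-1)/2/(M*fs);    % group delay of the linear-phase FIR
    plot(t, x, 'o', tu-filter_delay, xf);     % compare original samples and interpolated output
    legend('original samples', 'interpolated');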