A signal Xa(t) is assumed to be bandlimited to 30 kHz. It is desired to filter this signal with an ideal bandpass filter whose passband extends up to 4 kHz, using a system for processing analog signals composed of a digital filter sandwiched between an ideal A/D converter and an ideal D/A converter, both operating with sampling period T. 1. Determine the Nyquist sampling frequency (in kHz) for the input signal. 2. Find the largest sampling period (in s) for which the overall system of A/D converter, digital filter, and D/A converter realizes the desired bandpass filter.


Answer:



1) 60 kHz

2) Tmax = (1/34000) s

Explanation:

1) Determine the Nyquist sampling frequency, (in kHz), for the input signal.

Fs = 2 · Fmax

Fmax = 30 kHz (since Xa(t) is bandlimited to 30 kHz)

∴ Nyquist sampling frequency Fs = 2 × 30 kHz = 60 kHz
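As a quick numeric check of part 1 (a sketch; the variable names are my own, not from the problem):

```python
# Nyquist rate for a signal bandlimited to 30 kHz:
# the sampling frequency must be at least twice the highest frequency.
f_max = 30e3           # signal bandwidth in Hz (30 kHz)
f_nyquist = 2 * f_max  # Nyquist sampling frequency in Hz

print(f_nyquist / 1e3, "kHz")  # 60.0 kHz
```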

2) Find the largest sampling period (in s).

Nyquist sampling period = 1 / Fs  = ( 1 / 60000 ) s

Here some aliasing of the input signal is acceptable: since the digital filter passes nothing above its 4 kHz upper cutoff, we may sample below the Nyquist rate as long as the lowest aliased frequency stays above that cutoff. In digital (radian) frequency this gives the relationship below:

2π − 2π · T · (30 kHz)  ≥  2π · T · (4 kHz)

⇒ 1 ≥ T · (34 kHz)

∴ T ≤ [tex]\frac{1}{34\ \text{kHz}}[/tex]

largest sampling period: Tmax = (1/34000) s
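The same condition can be checked numerically (a sketch; the condition Fs − 30 kHz ≥ 4 kHz is the linear-frequency form of the inequality above, and the variable names are my own):

```python
# Part 2: after sampling at rate Fs, the aliased images begin at
# Fs - 30 kHz. They must stay above the filter's 4 kHz upper cutoff:
#   Fs - 30 kHz >= 4 kHz  =>  Fs >= 34 kHz
f_max = 30e3            # signal bandwidth in Hz
f_cut = 4e3             # upper cutoff of the desired bandpass filter in Hz
fs_min = f_max + f_cut  # minimum allowable sampling frequency: 34 kHz
t_max = 1.0 / fs_min    # largest sampling period in seconds

print(fs_min, "Hz")  # 34000.0 Hz
print(t_max, "s")    # about 2.94e-05 s, i.e. 1/34000 s
```

Note that this Tmax is larger than the Nyquist period 1/60000 s: undersampling is permitted precisely because the digital filter discards the band where aliasing lands.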
