Discrete-time signals, energy and power of discrete signals, signal symmetries, block diagrams, classification, convolution.
$$x(n)=\sin\left(2\pi\frac{1}{8}n\right)\approx\\\{\dots~-1~-0.7~\underline{0}~0.7~1~0.7~\dots\}$$
The underline indicates `n=0`.
Average of the three most recent values of `x(n)`:
$$y(n)=\frac{1}{3}x(n)+\frac{1}{3}x(n-1)+\frac{1}{3}x(n-2)$$
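This moving average can be sketched in a few lines of numpy. The input is assumed to be the sinusoid from above, and `x(n) = 0` for `n < 0` is assumed so the filter is well-defined at the start:

```python
import numpy as np

# Hypothetical input: the sinusoid x(n) = sin(2*pi*n/8) from above, n = 0..15
n = np.arange(16)
x = np.sin(2 * np.pi * n / 8)

# y(n) = (x(n) + x(n-1) + x(n-2)) / 3, assuming x(n) = 0 for n < 0
y = np.zeros_like(x)
for i in range(len(x)):
    window = x[max(0, i - 2):i + 1]  # up to the three most recent samples
    y[i] = window.sum() / 3          # always divide by 3, even at the edges
print(np.round(y, 3))
```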
A stable system decaying at a rate of 0.9:
$$y(n)=0.9y(n-1)+x(n)$$
An unstable system diverging at a rate of 1.1:
$$y(n)=1.1y(n-1)+x(n)$$
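Simulating both recursions with a unit-impulse input makes the difference visible: the `a = 0.9` output decays toward zero while the `a = 1.1` output grows without bound. A minimal sketch (the function name `first_order` is my own):

```python
import numpy as np

def first_order(a, x):
    """Simulate y(n) = a*y(n-1) + x(n) with y(-1) = 0."""
    y = np.zeros(len(x))
    prev = 0.0
    for i, xi in enumerate(x):
        prev = a * prev + xi
        y[i] = prev
    return y

x = np.zeros(50)
x[0] = 1.0                        # unit impulse input
y_stable = first_order(0.9, x)    # output: 0.9**n, decays toward zero
y_unstable = first_order(1.1, x)  # output: 1.1**n, grows without bound
print(y_stable[-1], y_unstable[-1])
```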
The unit impulse (the discrete-time Kronecker delta) is used for impulses.
$$\delta(n)=\begin{cases}1&\text{if }n=0\\0&\text{otherwise}\end{cases}$$
The unit step function (Heaviside):
$$u(n)=\begin{cases}1&\text{if }n\geq0\\0&\text{otherwise}\end{cases}$$
These two functions can be combined to make other signals.
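For instance, the step is the running sum of the impulse, and a difference of shifted steps gives a rectangular pulse. A small sketch (the pulse length 4 is an arbitrary choice):

```python
import numpy as np

n = np.arange(-5, 10)
delta = (n == 0).astype(float)  # unit impulse
u = (n >= 0).astype(float)      # unit step

# A rectangular pulse of length 4 as a difference of shifted steps:
# rect(n) = u(n) - u(n-4)
rect = (n >= 0).astype(float) - (n >= 4).astype(float)
print(rect)
```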
The energy of a signal is defined as:
$$E\equiv\sum_{n=-\infty}^{\infty}|x(n)|^2$$
The average power of a signal is defined as:
$$P\equiv\lim\limits_{N\to\infty}\frac{1}{2N+1}\sum\limits_{n=-N}^{N}|x(n)|^2$$
Even if a signal has infinite energy, its average power may still be finite.
We can also express the energy as:
$$E\equiv\lim\limits_{N\to\infty}E_N$$
where
$$E_N\equiv\sum_{n=-N}^N|x(n)|^2$$
$$\text{energy signal}\iff 0 < E < \infty,\quad P=0$$
$$\text{power signal}\iff 0 < P < \infty,\quad E=\infty$$
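Two standard examples illustrate the classification: the decaying exponential `0.9^n u(n)` is an energy signal with `E = 1/(1 - 0.81)`, while the unit step is a power signal with `P = 1/2`. A numerical sketch using truncated sums (the truncation length `N` is an arbitrary choice):

```python
import numpy as np

N = 10_000
n = np.arange(0, N + 1)

# Energy signal: x(n) = 0.9**n * u(n), geometric series gives E = 1/(1 - 0.81)
x = 0.9 ** n
E = np.sum(np.abs(x) ** 2)

# Power signal: the unit step has E = infinity but average power P = 1/2,
# since half the samples in n = -N..N are 1
u = np.ones(2 * N + 1)
u[:N] = 0.0                               # u(n) for n = -N..N
P = np.sum(np.abs(u) ** 2) / (2 * N + 1)  # tends to 1/2 as N grows
print(E, P)
```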
A signal is symmetric or even if `x(n) = x(-n)`.
A signal is antisymmetric or odd if `x(n) = -x(-n)`.
Any signal `x(n)` can be written as a sum of an odd and even signal.
$$
\begin{split}
x(n) &= x_e(n) + x_o(n) \\
x_e(n) &= \frac{1}{2}(x(n)+x(-n)) \\
x_o(n) &= \frac{1}{2}(x(n) - x(-n))
\end{split}
$$
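The decomposition above can be verified numerically for a finite signal indexed symmetrically around `n = 0`, where `x(-n)` is just the reversed array. A sketch with a hypothetical signal (the function name `even_odd` is my own):

```python
import numpy as np

def even_odd(x):
    """Split x (odd length, indexed n = -L..L) into even and odd parts."""
    xr = x[::-1]             # x(-n)
    xe = 0.5 * (x + xr)      # even part
    xo = 0.5 * (x - xr)      # odd part
    return xe, xo

x = np.array([1.0, 2.0, 0.0, -1.0, 3.0])  # hypothetical signal, n = -2..2
xe, xo = even_odd(x)
print(xe, xo)
```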
For block diagrams see lecture slides or chapter 2.2.2 in the book.
Systems are classified by their properties: e.g. linear or nonlinear, time-invariant or time-variant, causal or non-causal, stable or unstable.
This course focuses on linear time-invariant (LTI) systems.
$$\delta(n) \rightarrow \boxed{\text{LTI system}} \rightarrow h(n)$$
The impulse response `h(n)` of an LTI system is the output signal when the input signal is a unit impulse `\delta(n)` at time `n=0`.
An arbitrary signal can be written as the sum of delayed impulses:
$$x(n) = \sum_{k=-\infty}^\infty x(k)\delta(n-k)$$
Since LTI systems are linear and time-invariant, the output `y(n)` will be the sum of scaled, shifted impulse responses:
$$y(n) = x(n)*h(n) = \sum_{k=-\infty}^\infty x(k)h(n-k)$$
Unit impulse is the identity element of convolution:
$$x(n)*\delta(n) = \sum_{k=-\infty}^\infty x(k)\delta(n-k) = x(n)$$
Convolution fulfills the commutative, associative, and distributive laws.
If we have two finite-length signals `x(n)` and `h(n)` with lengths `M` and `N`, then the convolution \(y(n) = x(n) * h(n)\) will have length `M+N-1`.
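Both the length rule and the identity property can be checked with `np.convolve`, which computes exactly the finite convolution sum above (the signal values are arbitrary examples):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])  # length M = 3
h = np.array([1.0, 0.5])       # length N = 2
y = np.convolve(x, h)          # full convolution, length M + N - 1 = 4
print(y, len(y))

# Convolving with the unit impulse returns the signal unchanged
delta = np.array([1.0])
print(np.convolve(x, delta))
```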
An LTI system is causal if and only if its impulse response satisfies `h(n) = 0` for all `n < 0`.
If we take the example of a recursive system:
$$y(n) = ay(n-1)+x(n)$$
For `n >= 0` we can rewrite it in terms of the initial condition `y(-1)`:
$$y(n) = a^{n+1}y(-1) + \sum_{k=0}^n a^kx(n-k)$$
Constant-coefficient difference equations describe LTI systems.
If the system is relaxed at `n=0`, i.e. `y(-1)=0`, we get the zero-state response:
$$y_{zs}(n) = \sum_{k=0}^na^kx(n-k)$$
If the input signal is zero, `x(n) = 0`, we get the zero-input response (natural response):
$$y_{zi}(n) = a^{n+1}y(-1)$$
The output signal is the sum: `y(n) = y_{zi}(n) + y_{zs}(n)`.
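The superposition of the zero-input and zero-state responses can be checked by direct simulation; since the recursion is linear in both `y(-1)` and `x(n)`, the sum matches the full response exactly. A sketch with arbitrary values for `a`, `y(-1)`, and the input (the function name `recursive` is my own):

```python
import numpy as np

def recursive(a, x, y_init):
    """Simulate y(n) = a*y(n-1) + x(n) for n >= 0 with y(-1) = y_init."""
    y = np.zeros(len(x))
    prev = y_init
    for i, xi in enumerate(x):
        prev = a * prev + xi
        y[i] = prev
    return y

a, y_init = 0.5, 2.0
x = np.array([1.0, 0.0, 3.0, -1.0])
n = np.arange(len(x))

y_total = recursive(a, x, y_init)     # full response
y_zi = a ** (n + 1) * y_init          # zero-input response a^(n+1) * y(-1)
y_zs = recursive(a, x, 0.0)           # zero-state response
print(y_total, y_zi + y_zs)
```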