I am trying to simulate DPCM in order to understand how it works. I am trying to implement the simplest version of DPCM which has a system scheme like in figure 1.
Sorry for the quality of the picture but this was the best I could get. Let me do some explaining of the system.
System should do a DPCM modulation like in the figure, encode it, send it via a binary symmetric channel, decode it and demodulate it.
Here is what I tried:
fs = 15000;
input_data = getaudiodata(recObj);
input_data = input_data/max(abs(input_data));
quant_steps = linspace(-14/16, 14/16, 15);
quant_vals = linspace(-15/16, 15/16, 16);
epsilon = 0.3;
delayed = 0;
for ii = 1:length(input_data)
y = input_data(ii) - delayed;
[quants(ii), y_hat] = quantiz(y, quant_steps, quant_vals);
delayed = y_hat + delayed;
end
binary_sequence = de2bi(quants, 'left-msb');
binary_sequence = binary_sequence';
binary_sequence = binary_sequence(:)';
%BSC
received_sequence = binary_sequence;
for ii = 1:length(input_data)
if rand < epsilon
received_sequence(ii) = mod((received_sequence(ii) + 1), 2);
end
end
%Demodulator
decoded = zeros(1, length(received_sequence)/4);
demodulated = zeros(1, length(received_sequence)/4);
reconstructed_signal = zeros(1, length(received_sequence)/4);
delayed = 0;
for ii = 1:length(decoded)
decoded(ii) = bi2de(received_sequence((ii - 1) * 4 + 1 : ii*4), 'left-msb');
demodulated(ii) = decoded(ii);
reconstructed_signal(ii) = (demodulated(ii) * 1/8 - 1) + delayed;
delayed = reconstructed_signal(ii);
end
MSE = (1/length(reconstructed_signal)) * sum((reconstructed_signal' - input_data).^2);
With this code, unfortunately, reconstructed signal increases constantly.
Waiting for your help.
As always,
Have a nice day!