Tags: matlab, signal-processing, entropy

Matlab: Confusion regarding unit of entropy to use in an example


Figure 1 (Tent Map Prediction). Hypothesis plot; y axis: mean entropy, x axis: bits.


This question is a continuation of a previous one: Matlab : Plot of entropy vs digitized code length

I want to calculate the entropy of a random variable that is the discretized (0/1) version of a continuous random variable x. The random variable denotes the state of a nonlinear dynamical system called the Tent Map. Iterating the Tent Map yields a time series of length N.
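(For concreteness, the two branches of the map used below, 2x and 2(1 - x), combine into the one-liner f(x) = 2*min(x, 1 - x) on [0,1]. A minimal sketch in MATLAB:)

f = @(x) 2 * min(x, 1 - x);  % tent map on [0,1]
x = rand;                    % random initial condition
x = f(x);                    % one iteration of the map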

The code should exit as soon as the entropy of the discretized time series becomes equal to the entropy of the dynamical system. It is known theoretically that the entropy of the system is H = log_e(2) = ln(2) ≈ 0.69 (the map stretches distances by a factor of 2 everywhere, so it generates ln 2 nats, i.e. one bit, of information per iteration). The objective of the code is to find the number of iterations, j, needed to produce the same entropy as the entropy of the system, H.

Problem 1: When I calculate the entropy of the binary time series, which is the information message, should I do it in the same base as H? Or should I convert the value of H to bits, because the information message is in 0/1? The two choices give different results, i.e., different values of j.
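(For reference, the unit conversion in MATLAB, where log is the natural logarithm: an entropy computed with log2, as for a 0/1 message, is already in bits, so it should be compared against H expressed in bits.)

H_nats = log(2);           % system entropy in nats, approx. 0.6931
H_bits = H_nats / log(2);  % H_nats / ln(2) = log2(2) = 1 bit
% A binary-message entropy computed with log2 is in bits,
% so compare it against H_bits = 1, not against 0.69 nats.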

Problem 2: It can happen that the probability of 0's or 1's becomes zero, in which case the corresponding entropy term is undefined (0*log2(0) evaluates to NaN in MATLAB). To prevent this, I thought of putting in a check using if-else. But the check

if entropy(:,j) == NaN
    entropy(:,j) = 0;
end

does not seem to be working. I shall be grateful for ideas and help to solve this problem. Thank you.
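(For reference, a minimal illustration of why the equality test can never fire: by floating-point rules, NaN does not compare equal to anything, including itself, so MATLAB provides isnan for this check.)

x = 0 * log2(0)  % 0 * (-Inf) evaluates to NaN
x == NaN         % always false: NaN compares equal to nothing
isnan(x)         % true: the correct test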

UPDATE: I implemented the suggestions and answers to correct the code, but my earlier logic for solving the problem was not proper. In the revised code, I want to calculate the entropy of time series whose lengths are 2, 8, 16, 32, and 64 bits. For each code length, the entropy is calculated, and the calculation is repeated N times, each repetition starting from a different initial condition of the dynamical system. This approach is adopted to check at which code length the entropy becomes 1. The plot of entropy vs bits should rise from zero, gradually approach 1, and then saturate, remaining constant for all the remaining bits. I am unable to get this curve (Figure 1). I shall appreciate help in correcting where I am going wrong.

clear all

H = 1;  % in bits
Bits = [2,8,16,32,64];
threshold = 0.5;
N = 100;  % number of runs of the experiment

for r = 1:length(Bits)

    t = Bits(r)

    for Runs = 1:N
        x(1) = rand;

        % Iterating the Tent Map
        for j = 2:t
            if x(j - 1) < 0.5
                x(j) = 2 * x(j - 1);
            else
                x(j) = 2 * (1 - x(j - 1));
            end % if
        end

        % Binarizing the output of the Tent Map
        s  = (x >= threshold);
        p1 = sum(s == 1) / length(s);  % probability of 1's
        p0 = 1 - p1;                   % probability of 0's

        entropy(t) = -p1 * log2(p1) - (1 - p1) * log2(1 - p1);  % entropy in bits

        if isnan(entropy(t))
            entropy(t) = 0;
        end

        %disp(abs(lambda-H))

    end

    Entropy_Run(Runs) = entropy(t)
end
Entropy_Bits(r) = mean(Entropy_Run)
plot(Bits,Entropy_Bits)
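(One structural issue is visible in the listing above: Entropy_Run(Runs) is assigned after the Runs loop closes, and Entropy_Bits(r) after the r loop closes, so only the last run and the last bit count survive. A minimal sketch of the intended loop placement, keeping the question's variables and writing the two map branches as 2*min(x, 1-x):)

for r = 1:length(Bits)
    t = Bits(r);
    for Runs = 1:N
        x = zeros(1, t);
        x(1) = rand;
        for j = 2:t
            x(j) = 2 * min(x(j-1), 1 - x(j-1));  % tent map iteration
        end
        p1 = mean(x >= threshold);
        h  = -p1*log2(p1) - (1-p1)*log2(1-p1);
        if isnan(h), h = 0; end
        Entropy_Run(Runs) = h;                   % inside the Runs loop
    end
    Entropy_Bits(r) = mean(Entropy_Run);         % inside the r loop
end
plot(Bits, Entropy_Bits)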

Solution

  • First you just wrap the computation in a function:

    function [mean_entropy, bits] = compute_entropy(bits, blocks, threshold, replicate)
    
        if replicate
            disp('Replication is ON');
        else
            disp('Replication is OFF');
        end
    
        %%
        % Populate the vector of random initial conditions
        if replicate
            seed = 849;
            rng(seed);
        else
            rng('default');
        end
    
        rs = rand(blocks, 1);  % one random initial condition per block
    
    
        %%
        % Preallocate per-trial entropy: one column of bit entropies per block
        trial_entropy = zeros(length(bits), blocks);
    
        for r = 1:length(rs)
    
            bit_entropy = zeros(length(bits), 1); % H
    
            % Traverse bit trials
            for b = 1:length(bits) % N

                n = bits(b);            % time-series length for this trial
                tent_map = zeros(n, 1); % preallocate for memory management

                % Initialize from this block's random initial condition
                tent_map(1) = rs(r);

                for j = 2:n % j is the iterator, n is the series length
    
                    if tent_map(j - 1) < threshold
                        tent_map(j) = 2 * tent_map(j - 1);
                    else
                        tent_map(j) = 2 * (1 - tent_map(j - 1));
                    end % if
                end
    
                % Binarize the output of the Tent Map
                s  = (tent_map >= threshold);
                p1 = sum(s) / length(s);  % probability of 1's
                %p0 = 1 - p1;             % probability of 0's

                bit_entropy(b) = -p1 * log2(p1) - (1 - p1) * log2(1 - p1);  % entropy in bits
    
                if isnan(bit_entropy(b))
                    bit_entropy(b) = 0;
                end
    
                %disp(abs(lambda-h))
    
            end
    
            trial_entropy(:, r) = bit_entropy;
    
            disp('Trial Statistics')
            data = get_summary(bit_entropy);
            disp('Mean')
            disp(data.mean);
            disp('SD')
            disp(data.sd);
    
        end
    
        % TO DO: compute the mean for each BIT index in trial_entropy,
        % i.e. average across blocks (columns): mean(trial_entropy, 2)
        mean_entropy = 0;
    
        disp('Overall Statistics')
        data = get_summary(trial_entropy);
        disp('Mean')
        disp(data.mean);
        disp('SD')
        disp(data.sd);
    
        % This is the wrong mean: get_summary averages each column
        % (over bits within a trial), not each bit across trials.
        mean_entropy = data.mean;
    
        function summary = get_summary(entropy)
            summary = struct('mean', mean(entropy), 'sd', std(entropy));
        end
    end
    

    and then you just have to call it from a script:

    % Entropy Script
    clear all
    
    %% Settings
    replicate = false;  % use true for debugging/replication only
    %H = 1;  %in bits
    Bits = 2.^(1:6);
    Threshold = 0.5;
    %Tolerance = 0.001;
    Blocks = 100;  %Number of runs of the experiment
    
    %% Run
    [mean_entropy, bits] = compute_entropy(Bits, Blocks, Threshold, replicate);
    
    %What we want
    %plot(bits, mean_entropy);
    
    %What we have
    plot(1:length(mean_entropy), mean_entropy);
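    As a closing note, a minimal sketch of the remaining TO DO inside compute_entropy, assuming trial_entropy is the length(bits)-by-blocks matrix preallocated above: averaging across trials (dimension 2) yields one mean per bit count, after which the intended plot works.

    % Inside compute_entropy, replacing the placeholder mean:
    mean_entropy = mean(trial_entropy, 2);  % length(bits)-by-1: one mean per bit count

    % The intended plot in the script then becomes:
    % plot(bits, mean_entropy);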