TensorFlow implementation of "Hyper-Graph-Network Decoders for Block Codes"
A more comprehensive version will be uploaded upon acceptance.

Prerequisites: Python 2.7, TensorFlow 1.8.0, NumPy, SciPy, Matplotlib

Block code matrices configuration:
Place the BCH(63,51) parity-check and generator matrices in the ECC_MATRIX folder. Note that you must configure the paths to the parity-check matrix and the generator matrix in lines 18-19 of the script.
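Once the matrices are loaded (e.g. with np.loadtxt), it is worth verifying that they are consistent with each other. The BCH(63,51) matrices themselves are not reproduced here, so this sketch uses the small Hamming(7,4) code as a stand-in to illustrate the check; the file names and shapes for your own matrices will differ.

```python
import numpy as np

# Stand-in matrices: Hamming(7,4) instead of BCH(63,51), purely to
# illustrate the sanity check you can run after loading H and G.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])   # parity-check matrix, (n-k) x n
G = np.array([[1, 1, 1, 0, 0, 0, 0],
              [1, 0, 0, 1, 1, 0, 0],
              [0, 1, 0, 1, 0, 1, 0],
              [1, 1, 0, 1, 0, 0, 1]])   # generator matrix, k x n

# Every row of G is a codeword, so G @ H^T must vanish over GF(2).
assert (G.dot(H.T) % 2 == 0).all()
print("H and G are consistent")
```

Running the same assertion on the BCH(63,51) matrices you place in ECC_MATRIX catches a mismatched or mis-transposed file before training starts.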

To Train:
Run Hyper-Graph-Network.py

For each training run, a folder is created with the following files:
1. Learning_curve.png - figure of the learning curve
2. results.txt - results with BER and FER values for each epoch
3. weights folder - folder for the checkpoints
4. script.py - copy of the running file, for later investigation
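The BER and FER values written to results.txt are the standard bit and frame error rates. A minimal sketch of how they are typically computed from hard-decision outputs (function and variable names here are illustrative, not taken from the repository):

```python
import numpy as np

def ber_fer(transmitted, decoded):
    """transmitted, decoded: (num_words, n) arrays of 0/1 bits."""
    bit_errors = (transmitted != decoded)
    ber = bit_errors.mean()              # fraction of wrong bits overall
    fer = bit_errors.any(axis=1).mean()  # fraction of frames with >= 1 error
    return ber, fer

tx = np.zeros((4, 8), dtype=int)   # e.g. four all-zero codewords
rx = tx.copy()
rx[0, 3] = 1                       # one bit error in one frame
print(ber_fer(tx, rx))             # -> (0.03125, 0.25)
```

One flipped bit out of 32 gives BER = 1/32; one corrupted frame out of 4 gives FER = 1/4.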

Hyperparameters:
start_snr - start value of SNR (in dB)
step - step value of SNR (in dB)
stop_snr - stop value of SNR (in dB)
load_weights - whether to load weights from a checkpoint
weights_path - path to the checkpoint (a .ckpt file)
num_hidden_layers - number of decoding iterations of the method
batch_size - should be YOUR_VALUE*length_of_snr_vec
numOfWordSim_train - how many codewords per SNR; should be YOUR_VALUE of the batch_size
batches_for_val_per_snr - how many batches per SNR in the evaluation step
multiloss - 1 means that a cross-entropy loss term is added after each check layer
batch_in_epoch - how many batches per epoch
num_of_batch - total number of batches
learning_rate - learning rate
train_on_zero_word - True to train only on the all-zero codeword
test_on_zero_word - False to evaluate on random valid codewords (not only the all-zero codeword)
n_hidden_1 - how many neurons in the first layer of network 'g'
n_hidden_2 - how many neurons in the second layer of network 'g'
sf_n_hidden_1 - how many neurons in the first layer of network 'f'
sf_n_hidden_2 - how many neurons in the second layer of network 'f'
sf_n_hidden_3 - how many neurons in the third layer of network 'f'
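The SNR-related hyperparameters above are coupled: each batch mixes codewords from every point on the SNR grid, which is why batch_size is a multiple of the grid length. A hedged sketch of those relationships (the numeric values below are illustrative placeholders, not the repository defaults):

```python
import numpy as np

# Illustrative values -- the relationships, not the numbers, matter.
start_snr, step, stop_snr = 1.0, 1.0, 6.0
snr_db = np.arange(start_snr, stop_snr + step, step)  # SNR grid in dB

numOfWordSim_train = 20                               # codewords per SNR
batch_size = numOfWordSim_train * len(snr_db)         # one batch spans the grid

# For a rate R = k/n code on an AWGN channel, the per-SNR noise std
# follows from Eb/N0 (assuming BPSK with unit-energy symbols):
R = 51.0 / 63.0                                       # BCH(63,51)
sigma = np.sqrt(1.0 / (2.0 * R * 10.0 ** (snr_db / 10.0)))

print(batch_size)   # -> 120
```

With a 6-point SNR grid and 20 codewords per SNR, each batch holds 120 codewords; changing the grid length changes batch_size accordingly.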


