Code referenced in the blog on Signal Processing and AI at https://dsplog.com
This repository organizes the code snippets referenced in the articles on https://dsplog.com.
Read the detailed article: Gradients for multiclass classification with Softmax
- Derivative of softmax: implementation of the softmax gradient calculation
- Gradients for cross-entropy loss: computing gradients for the cross-entropy loss function
- Training multi-class classification: complete training pipeline for multi-class classification
- Label smoothing implementation: the label smoothing technique for improving classification
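As a quick companion to the softmax-gradient notebooks, here is a minimal NumPy sketch (not taken from the articles) of the key identity: for a one-hot target, the gradient of the cross-entropy loss with respect to the logits reduces to `softmax(z) - y`. The function names are illustrative, not the repository's.

```python
import numpy as np

def softmax(z):
    # Subtract the max for numerical stability before exponentiating
    e = np.exp(z - np.max(z))
    return e / e.sum()

def cross_entropy_grad(z, y_onehot):
    # Gradient of cross-entropy w.r.t. the logits z simplifies
    # to softmax(z) - y for a one-hot target y
    return softmax(z) - y_onehot

z = np.array([2.0, 1.0, 0.1])   # logits for 3 classes
y = np.array([1.0, 0.0, 0.0])   # one-hot label: class 0
p = softmax(z)                   # probabilities, sum to 1
g = cross_entropy_grad(z, y)     # negative for the true class, positive elsewhere
```

Note how the gradient entries sum to zero: pushing the true-class logit up necessarily pushes the others down.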
Read the detailed article: Bit Error Rate (BER) for BPSK modulation
- bit_error_rate_vs_snr.ipynb: Analysis of BER vs SNR for BPSK
- bpsk_unequal_source_probabilities.ipynb: BPSK with non-uniform source probabilities
- bpsk_with_noise.ipynb: BPSK modulation with AWGN noise
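The flavor of these notebooks can be sketched in a few lines: simulate BPSK over AWGN and compare the measured BER against the theoretical curve 0.5·erfc(√(Eb/N0)). This is an illustrative standalone snippet, not code from the notebooks themselves.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def bpsk_ber_sim(ebno_db, n_bits=200_000):
    # Map bits {0,1} -> symbols {-1,+1}, add AWGN, hard-decide at threshold 0
    ebno = 10 ** (ebno_db / 10)
    bits = rng.integers(0, 2, n_bits)
    symbols = 2 * bits - 1
    noise = rng.normal(0.0, np.sqrt(1 / (2 * ebno)), n_bits)
    decisions = (symbols + noise > 0).astype(int)
    return np.mean(decisions != bits)

def bpsk_ber_theory(ebno_db):
    # Theoretical BER for BPSK over AWGN: 0.5 * erfc(sqrt(Eb/N0))
    ebno = 10 ** (ebno_db / 10)
    return 0.5 * math.erfc(math.sqrt(ebno))

for snr_db in (0, 4, 8):
    print(snr_db, bpsk_ber_sim(snr_db), bpsk_ber_theory(snr_db))
```

With a few hundred thousand bits the simulated points sit right on the theoretical waterfall curve, which is what the BER-vs-SNR notebook plots.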
Read the detailed article: Gradients for Binary Classification
- gradients_binary_cross_entropy_loss.ipynb: Computing gradients for binary cross-entropy
- sigmoid_and_derivative.ipynb: Sigmoid function and its derivative
- training_loop_binary_classification.ipynb: Complete training loop implementation
- training_probit.ipynb: Training with probit regression
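The core results these notebooks work through can be sketched as follows (an illustrative snippet, not the notebooks' code): the sigmoid's derivative is σ(z)(1 − σ(z)), and the gradient of binary cross-entropy with respect to the logit collapses to σ(z) − y.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_derivative(z):
    # d/dz sigmoid(z) = sigmoid(z) * (1 - sigmoid(z))
    s = sigmoid(z)
    return s * (1 - s)

def bce_grad(z, y):
    # Gradient of binary cross-entropy w.r.t. the logit z
    # simplifies to sigmoid(z) - y
    return sigmoid(z) - y

# Sanity check against a finite difference, as one of the notebooks does
z0 = 0.3
fd = (sigmoid(z0 + 1e-6) - sigmoid(z0 - 1e-6)) / 2e-6
print(sigmoid_derivative(z0), fd)
```

The finite-difference check is the same idea the analytic-vs-numerical comparison notebooks use: if the two numbers disagree, the hand-derived gradient is wrong.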
Read the detailed article: Gradients for Linear Regression
- gradients_absolute_function.ipynb: Gradients for the absolute error function
- gradients_analytic_vs_finite_difference_vs_pytorch.ipynb: Comparison of gradient computation methods
- linear_regression.ipynb: Basic linear regression implementation
- linear_regression_mean_abs_error.ipynb: Linear regression with MAE loss
- linear_regression_pytorch.ipynb: PyTorch implementation of linear regression
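A minimal sketch of the basic linear-regression pipeline (synthetic data and hyperparameters are assumptions for illustration, not taken from the notebooks): fit y ≈ wx + b by gradient descent using the analytic gradients of the mean-squared-error loss.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data around the line y = 3x + 2 (illustrative toy example)
x = rng.uniform(-1, 1, 100)
y = 3 * x + 2 + rng.normal(0, 0.1, 100)

w, b = 0.0, 0.0
lr = 0.1
for _ in range(500):
    err = w * x + b - y
    # Analytic gradients of the MSE loss w.r.t. w and b
    grad_w = 2 * np.mean(err * x)
    grad_b = 2 * np.mean(err)
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # should land close to the true slope 3 and intercept 2
```

Swapping the MSE gradients for the sign-based gradients of mean absolute error gives the MAE variant covered in the corresponding notebook.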
Feel free to explore the notebooks and use the code as a reference for your projects!