This is the coursework section. Here you’ll find my collection of notes, articles, and reflections on topics in Machine Learning (ML), Artificial Intelligence (AI), Computer Vision (CV), and DevOps. These resources document my learning journey, especially during my PhD, and are meant to serve as a study guide for anyone delving into these fields as well.
Computer Vision
I will be documenting my progress in HS2024 - Computer Vision, a course offered by Universität Bern, which covers key concepts in CV. My notes dive into various topics that form the backbone of the field. These notes will hopefully form a comprehensive resource for anyone interested in the technical foundations and applications of computer vision.
Image Formation
- Projection Models: Explanation of camera models and the mathematics behind projecting 3D objects onto 2D images. It covers the essentials of camera calibration, perspective projection, and the pinhole camera model.
- Building a real camera: Covers lens cameras, depth of field, lens flaws, the Bayer array, demosaicing, and the concept of color moiré.
- Event-Based Cameras
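The perspective projection and pinhole model mentioned above can be sketched in a few lines. The intrinsic matrix below uses illustrative values (focal length and principal point), not parameters from the course notes:

```python
import numpy as np

# Pinhole-camera sketch: project 3D camera-frame points onto the image plane.
# f, cx, cy are made-up intrinsics for illustration only.
f, cx, cy = 800.0, 320.0, 240.0
K = np.array([[f, 0.0, cx],
              [0.0, f, cy],
              [0.0, 0.0, 1.0]])

def project(K, points_3d):
    """Project Nx3 camera-frame points to Nx2 pixel coordinates."""
    p = points_3d @ K.T          # apply the intrinsic matrix
    return p[:, :2] / p[:, 2:3]  # perspective divide by depth

pts = np.array([[0.0, 0.0, 2.0],    # a point on the optical axis
                [0.5, -0.25, 2.0]])
print(project(K, pts))  # → [[320. 240.], [520. 140.]]
```

A point on the optical axis lands exactly on the principal point, which is a quick sanity check for the calibration conventions.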
Image processing
- Image filtering: Image denoising, convolution, Gaussian kernels…
- Edge detection: Noise, Gaussian filters, separability of a convolution, and partial-derivative filters for edge detection.
- Locating and describing interest points: Locating interest points at different scales, the cornerness function, the Harris detector, Difference of Gaussians…
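The separability of the Gaussian mentioned in the edge-detection notes can be illustrated directly: a 2D Gaussian kernel is the outer product of two 1D Gaussians, so filtering decomposes into two cheap 1D passes. The sigma and radius below are arbitrary illustrative choices:

```python
import numpy as np

# Build a normalized 1D Gaussian kernel (sigma and radius are illustrative).
def gaussian_1d(sigma, radius):
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-x**2 / (2 * sigma**2))
    return g / g.sum()  # normalize so the kernel sums to 1

g = gaussian_1d(sigma=1.0, radius=2)
kernel_2d = np.outer(g, g)  # separability: G(x, y) = g(x) g(y)

# Filter rows then columns with the 1D kernel; with zero padding this
# matches a single 2D convolution with kernel_2d.
img = np.random.default_rng(0).random((6, 6))
rows = np.apply_along_axis(np.convolve, 1, img, g, mode="same")
both = np.apply_along_axis(np.convolve, 0, rows, g, mode="same")
print(kernel_2d.sum())  # ~1.0, since each 1D factor is normalized
```

The payoff is cost: two 1D passes take O(k) work per pixel instead of O(k²) for a direct 2D convolution with a k×k kernel.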
Preprocessing
- Fitting: Fitting points to a line using Total Least Squares and the Random Sample Consensus (RANSAC) method.
- Image Alignment: Robust methods for aligning images by detecting and matching distinctive features, using transformations such as affine and homography.
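The RANSAC idea from the fitting notes can be sketched for 2D line fitting: repeatedly fit a line to a random minimal sample and keep the model with the most inliers. The iteration count and inlier threshold below are illustrative choices, not values from the course:

```python
import numpy as np

# Minimal RANSAC sketch for fitting a line y = a*x + b to noisy points.
def ransac_line(pts, n_iters=100, thresh=0.1, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    best_inliers = np.zeros(len(pts), dtype=bool)
    best_model = (0.0, 0.0)
    for _ in range(n_iters):
        i, j = rng.choice(len(pts), size=2, replace=False)
        (x1, y1), (x2, y2) = pts[i], pts[j]
        if np.isclose(x1, x2):
            continue  # skip degenerate (vertical) samples
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        residuals = np.abs(pts[:, 1] - (a * pts[:, 0] + b))
        inliers = residuals < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (a, b)
    return best_model, best_inliers

# 20 points on y = 2x + 1 plus two gross outliers.
x = np.linspace(0, 1, 20)
pts = np.vstack([np.column_stack([x, 2 * x + 1]),
                 [[0.5, 10.0], [0.2, -5.0]]])
(a, b), inliers = ransac_line(pts)
print(a, b, inliers.sum())  # recovers a≈2, b≈1 with 20 inliers
```

Unlike least squares, the outliers never drag the estimate, because they simply fail the inlier test for any model fitted through the true points.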
Machine Learning
I will be documenting my progress in HS2024 - Machine Learning, a course offered by Universität Bern, which covers key concepts in Machine Learning. Most of these notes follow the CS229 course from Stanford, with extended explanations and notes.
Supervised Learning
- Linear Regression & Probabilistic Interpretation
- Classification
- Generalized Linear Models
- Generative Learning Algorithms: Gaussian Discriminant Analysis model & Naive Bayes Classifier
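The linear regression setup in the list above has the classic closed-form solution via the normal equations, θ = (XᵀX)⁻¹Xᵀy, which a few lines of NumPy make concrete (the data here is made up for illustration):

```python
import numpy as np

# Linear regression via the normal equations: solve (X^T X) theta = X^T y.
X = np.array([[1.0, 0.0],   # first column is the bias feature x0 = 1
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])  # generated exactly from y = 1 + 2x

theta = np.linalg.solve(X.T @ X, X.T @ y)
print(theta)  # → [1. 2.], the intercept and slope that generated y
```

Under the probabilistic interpretation, the same θ is the maximum-likelihood estimate when the targets carry i.i.d. Gaussian noise, which is exactly the argument CS229 walks through.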
Docker
During my work at UniBe, I came to realize the importance of DevOps skills in modern software development. While machine learning and algorithms are critical components of my work, understanding DevOps has proven to be just as essential. For me, the true excitement of Computer Science lies in the ability to build complete solutions from the ground up and then deploy them for others to use. This is why I’m diving into Docker through this fantastic course on Udemy.
Course outline:
Applied Optimization
I will be documenting my progress in HS2024 - Applied Optimization, a course offered by Universität Bern. The course is an applied introduction covering a broad range of practically important topics, such as: mathematical modeling of real-world problems, the theory of convexity, Lagrange duality, and algorithms for unconstrained and constrained optimization with inequalities (e.g. gradient descent, Newton’s method, trust-region methods, active set approaches, interior point methods, …).
Hopefully by the end of the course I will be able to:
- Understand which classes of optimization problems are easy/hard to solve.
- Model or re-formulate problems in a way that they become easier (e.g. convex).
- Understand the fundamental ideas behind unconstrained, constrained and mixed-integer optimization.
- Implement and use various optimization algorithms (programming exercises are in C++).
- Understand and tune the parameters and output statistics that are exposed by optimization packages.
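Gradient descent, the first algorithm named above, can be sketched on a convex quadratic. The course exercises are in C++, so this Python snippet is only an illustration, and the matrix, vector, and step size are arbitrary choices:

```python
import numpy as np

# Gradient descent on the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer is the solution of A x = b.
A = np.array([[3.0, 0.5],
              [0.5, 1.0]])  # symmetric positive definite
b = np.array([1.0, -2.0])

x = np.zeros(2)
step = 0.2  # fixed step size, small enough for this A to converge
for _ in range(200):
    grad = A @ x - b   # gradient of f at x
    x = x - step * grad

print(x)  # approaches the minimizer A^{-1} b
```

Convexity is what makes this problem "easy" in the sense of the first learning goal: any fixed step below 2/λmax(A) converges to the global minimizer, with no risk of getting stuck in a local one.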
Course outline:
TMAP
A significant portion of my PhD work revolves around TMAP, a versatile tool in my research. To deepen my understanding of this tool, I’ve dedicated some time to exploring its concepts and implementation. The key topics involved, such as Locality-Sensitive Hashing (LSH) and Min-Hashing, are critical components of the larger framework.
I’ve written a detailed blog post that offers a foundational introduction to TMAP and breaks down its core mechanisms. Below are links to some of the key concepts discussed within the post:
- TMAP Overview: This post provides a comprehensive introduction to TMAP, explaining its role, functionality, and how it fits into the broader landscape of data mapping and visualization.
- Min-Hashing: A dive into Min-Hashing, explaining how it efficiently estimates the similarity between datasets using hashed subsets of the data.
- LSH Forest: An overview of Locality-Sensitive Hashing (LSH) and how the LSH Forest algorithm is used to approximate nearest neighbors in high-dimensional spaces.
- K-NN Graph: A breakdown of K-Nearest Neighbor (K-NN) graphs and how they are applied in data clustering and classification.
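The Min-Hashing idea behind TMAP can be sketched in plain Python: the fraction of hash functions on which two sets' minima agree is an unbiased estimate of their Jaccard similarity. The number of hash seeds (128) is an illustrative choice, not TMAP's actual configuration:

```python
import random

# MinHash sketch: one "permutation" per seed; hash each item together with
# the seed and keep the minimum as that permutation's signature entry.
def minhash_signature(items, seeds):
    return [min(hash((seed, x)) for x in items) for seed in seeds]

random.seed(0)
seeds = [random.getrandbits(32) for _ in range(128)]

a = set(range(0, 100))
b = set(range(50, 150))  # true Jaccard similarity: 50 / 150 = 1/3

sig_a = minhash_signature(a, seeds)
sig_b = minhash_signature(b, seeds)
estimate = sum(x == y for x, y in zip(sig_a, sig_b)) / len(seeds)
print(estimate)  # statistically close to 1/3
```

The point is compression: comparing two fixed-length signatures approximates a set comparison without touching the original data, which is what lets LSH Forest index millions of items.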
More posts on related concepts and applications will be added soon as I continue to expand my research.
As I progress through my PhD and research projects, I will continue to update this space with more notes, tutorials, and insights. Keep an eye out for new additions, and feel free to reach out if you’d like to discuss any of these topics in more detail!