Structure and Interpretation of Computer Programs
👤 Multi-Armed Bandit Algorithms Library (MAB) 👮
CS 61A: Structure and Interpretation of Computer Programs, Fall 2022, UC Berkeley
Implementations of basic concepts covered under the Reinforcement Learning umbrella. This project is a collection of assignments from CS747: Foundations of Intelligent and Learning Agents (Autumn 2017) at IIT Bombay.
Author's implementation of the paper Correlated Age-of-Information Bandits.
Multi-armed bandit algorithms implemented with TensorFlow, with 11 policies
Thompson Sampling for Bandits using UCB policy
Code and templates for ML algorithms created, modified, and optimized in Python and R.
Foundations Of Intelligent Learning Agents (FILA) Assignments
On Upper-Confidence Bound Policies for Non-Stationary Bandit Problems
AI for the game "Connect Four". Available on PyPI.
We implemented Monte Carlo Tree Search (MCTS) from scratch and successfully applied it to the game of Tic-Tac-Toe.
Repository for the course project done as part of the CS-747 (Foundations of Intelligent & Learning Agents) course at IIT Bombay in Autumn 2022.
Thompson is a Python package for evaluating the multi-armed bandit problem. In addition to Thompson sampling, the Upper Confidence Bound (UCB) algorithm and a random baseline policy are also implemented.
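As a rough illustration of the kind of policy such a package evaluates, here is a minimal Thompson Sampling loop for a Bernoulli bandit with Beta posteriors. This is a generic sketch under assumed arm probabilities; the names used are made up for the example and are not the API of the package described above.

```python
# Minimal Thompson Sampling sketch for a Bernoulli bandit (illustrative only;
# not the API of any repository listed on this page).
import random

ARM_PROBS = [0.2, 0.5, 0.7]  # hypothetical Bernoulli reward probabilities

def thompson_sampling(horizon):
    n = len(ARM_PROBS)
    successes = [1] * n  # Beta(1, 1) uniform prior for each arm
    failures = [1] * n
    total_reward = 0.0
    for _ in range(horizon):
        # Draw a plausible mean for each arm from its posterior,
        # then play the arm with the highest sample.
        samples = [random.betavariate(successes[a], failures[a]) for a in range(n)]
        arm = max(range(n), key=lambda a: samples[a])
        reward = 1.0 if random.random() < ARM_PROBS[arm] else 0.0
        if reward:
            successes[arm] += 1
        else:
            failures[arm] += 1
        total_reward += reward
    return total_reward

if __name__ == "__main__":
    print("Thompson Sampling total reward:", thompson_sampling(10000))
```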
Python package for the Unity Cloud Build API
My programs during CS747 (Foundations of Intelligent and Learning Agents) Autumn 2021-22
Multi-Armed Bandits implementation using the Jester dataset
R.I.T project
Implementation of the Multi-Armed Bandit (MAB) algorithms UCB and Epsilon-Greedy. MAB is a class of problems in reinforcement learning where an agent learns to choose actions from a set of arms, each associated with an unknown reward distribution. UCB and Epsilon-Greedy are popular algorithms for solving MAB problems.
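For readers unfamiliar with these two policies, here is a minimal, self-contained sketch of UCB1 and Epsilon-Greedy on a simulated Bernoulli bandit. The arm probabilities and helper names are assumptions made for the example, not code from the repository described above.

```python
# Minimal UCB1 and Epsilon-Greedy sketch on a simulated Bernoulli bandit
# (illustrative only; not taken from any repository listed on this page).
import math
import random

ARM_PROBS = [0.2, 0.5, 0.7]  # hypothetical Bernoulli reward probabilities

def pull(arm):
    """Sample a 0/1 reward from the chosen arm."""
    return 1.0 if random.random() < ARM_PROBS[arm] else 0.0

def ucb1(horizon):
    """UCB1: play each arm once, then pick the arm maximizing
    empirical mean + sqrt(2 ln t / n_a)."""
    n = len(ARM_PROBS)
    counts, means, total = [0] * n, [0.0] * n, 0.0
    for t in range(1, horizon + 1):
        if t <= n:                                   # initial round-robin
            arm = t - 1
        else:
            arm = max(range(n), key=lambda a: means[a]
                      + math.sqrt(2 * math.log(t) / counts[a]))
        r = pull(arm)
        counts[arm] += 1
        means[arm] += (r - means[arm]) / counts[arm]  # incremental mean update
        total += r
    return total

def epsilon_greedy(horizon, eps=0.1):
    """Epsilon-greedy: explore a random arm with probability eps,
    otherwise exploit the arm with the best empirical mean."""
    n = len(ARM_PROBS)
    counts, means, total = [0] * n, [0.0] * n, 0.0
    for _ in range(horizon):
        if random.random() < eps:
            arm = random.randrange(n)
        else:
            arm = max(range(n), key=lambda a: means[a])
        r = pull(arm)
        counts[arm] += 1
        means[arm] += (r - means[arm]) / counts[arm]
        total += r
    return total

if __name__ == "__main__":
    print("UCB1 total reward:", ucb1(10000))
    print("Epsilon-greedy total reward:", epsilon_greedy(10000))
```

The key contrast the sketch shows: UCB1 directs exploration toward arms whose confidence bound is still high, while Epsilon-Greedy explores uniformly at a fixed rate regardless of how much each arm has already been tried.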