Parallel-Computer-Architecture-and-Programming-Models

Welcome to the Parallel-Computer-Architecture-and-Programming-Models repository! This repository serves as a central hub for mini-projects covering different parallel computer architecture and programming model techniques. Each project is housed in its own GitHub repository, and you can find the links to these projects below.

Parallel Computing and Computer Architecture

Explore projects focused on parallel computing and computer architecture:

  1. Introduction to DASK: This repository contains a Jupyter notebook that provides an introduction to using Dask for parallel computing in Python. The notebook demonstrates how to implement and utilize various Dask libraries for data processing and machine learning tasks (see the Dask sketch after this list).
  2. Introduction to NUMBA: This repository contains a Jupyter notebook that provides an introduction to using Numba for just-in-time (JIT) compilation to optimize Python code. The notebook demonstrates how to use Numba to improve performance, understand its compilation modes, identify its limitations, and vectorize code (see the Numba sketch after this list).
  3. Introduction to PySpark: This repository contains a Jupyter notebook that provides an introduction to using PySpark for machine learning. The notebook demonstrates how to explore and visualize datasets, and implement machine learning models using PySpark.
  4. Introduction to RAY: This repository contains a Jupyter notebook that provides an introduction to using Ray for distributed computing in Python. The notebook demonstrates how to load data, train models, and tune hyperparameters using Ray (see the Ray sketch after this list).
  5. Distributed training using TensorFlow: This repository contains a Jupyter notebook that provides an introduction to using TensorFlow for distributed training. The notebook demonstrates different parallelization strategies, custom training loops, and the implementation of these strategies with Keras for distributed model training (see the tf.distribute sketch after this list).
  6. Introduction to RAPIDS: This repository contains a Jupyter notebook that provides an introduction to using RAPIDS for GPU-accelerated data processing and machine learning in Python. The notebook demonstrates how to load, simulate, and split data, convert data formats, and train models using RAPIDS.
  7. Introduction to PySpark and MLLib: This repository contains a Jupyter notebook that provides an introduction to using PySpark and MLLib for data processing and machine learning. The notebook demonstrates how to interact with Spark using Python, understand Spark DataFrames, and implement linear regression using PySpark.
  8. Introduction to OpenMP: This repository contains a Jupyter notebook that provides an introduction to using OpenMP for parallelization in Python. The notebook demonstrates how to implement multiprocessing using OpenMP, along with the necessary concepts and examples.
  9. Parallel programming with MPI: This repository contains a Jupyter Notebook that demonstrates the implementation of standard message-passing algorithms using MPI (Message Passing Interface). It aims to help you understand the basics of point-to-point communication, blocking and non-blocking communication, and collective communication, along with their impact on program performance (see the mpi4py sketch after this list).
  10. Oops in Python: This repository contains a Jupyter Notebook that demonstrates the general structure of classes in Object-Oriented Programming (OOP) using Python. It aims to help you learn how to build your own classes, specialized to your needs.
  11. Time Complexity Analysis: This repository contains a Jupyter Notebook that demonstrates the concept of computational complexity, focusing on time complexity and Big-O notation. It aims to help you understand these fundamental concepts and determine the time complexity of given algorithms.
  12. Monitoring resources using Psutil: This repository contains a Jupyter Notebook that demonstrates how to monitor various resources of your device using the psutil package in Python. This includes monitoring CPU, GPU, memory, disks, network, and sensors. Additionally, it explores the multiprocessing package to evaluate the advantages of parallelism in resource monitoring (see the psutil sketch after this list).
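
The sketches below are minimal, hypothetical illustrations of a few of the techniques highlighted above; none of the code is taken from the linked notebooks, and function names and workloads are illustrative only.

A minimal Dask sketch, assuming the parallel-computing focus described in the first project: dask.delayed builds a lazy task graph from independent Python calls, which is executed in parallel on compute().

```python
from dask import delayed

def slow_square(x):
    # Stand-in for an expensive, independent computation.
    return x * x

# Build a lazy task graph; nothing runs until compute() is called.
lazy_results = [delayed(slow_square)(i) for i in range(8)]
total = delayed(sum)(lazy_results)

# Dask schedules the tasks across local workers and returns the result.
print(total.compute())
```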
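
A minimal Numba sketch, assuming the JIT-compilation focus described above: @njit compiles the numeric loop to machine code on the first call (nopython mode).

```python
import numpy as np
from numba import njit

@njit
def array_sum(a):
    # Explicit loop that Numba compiles to fast machine code.
    total = 0.0
    for i in range(a.shape[0]):
        total += a[i]
    return total

x = np.random.rand(1_000_000)
print(array_sum(x))  # first call pays the compilation cost; later calls are fast
```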
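
A minimal Ray sketch, assuming the distributed-task usage described above: @ray.remote turns an ordinary function into a task that the Ray runtime executes in parallel.

```python
import ray

ray.init()  # start a local Ray runtime

@ray.remote
def square(x):
    return x * x

# Launch tasks in parallel; ray.get blocks until all results are ready.
futures = [square.remote(i) for i in range(8)]
print(ray.get(futures))

ray.shutdown()
```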
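
A minimal tf.distribute sketch, assuming a synchronous data-parallel setup like the strategies described above: MirroredStrategy replicates a Keras model across the available GPUs (falling back to CPU), and the random data is for illustration only.

```python
import numpy as np
import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()  # synchronous data parallelism

with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(10,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")

# Dummy data, for illustration only.
x = np.random.rand(256, 10).astype("float32")
y = np.random.rand(256, 1).astype("float32")
model.fit(x, y, epochs=2, batch_size=32)
```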
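
A minimal mpi4py sketch, assuming the point-to-point and collective patterns described above; run it under an MPI launcher, e.g. mpiexec -n 4 python mpi_demo.py (the file name is illustrative).

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# Point-to-point: rank 0 sends a Python object to rank 1 (blocking send/recv).
if rank == 0 and size > 1:
    comm.send({"greeting": "hello from rank 0"}, dest=1, tag=11)
elif rank == 1:
    msg = comm.recv(source=0, tag=11)
    print("rank 1 received:", msg)

# Collective: every rank contributes its rank number; the sum arrives on rank 0.
total = comm.reduce(rank, op=MPI.SUM, root=0)
if rank == 0:
    print("sum of ranks:", total)
```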
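
A minimal psutil sketch, assuming the resource-monitoring focus described above: it samples CPU, memory, and disk usage (GPU monitoring requires additional tooling and is omitted here).

```python
import psutil

# Per-core CPU utilization sampled over one second.
print("CPU usage per core (%):", psutil.cpu_percent(interval=1, percpu=True))

mem = psutil.virtual_memory()
print(f"Memory: {mem.percent}% used of {mem.total / 1e9:.1f} GB")

disk = psutil.disk_usage("/")
print(f"Disk: {disk.percent}% used")
```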

Contributing

I welcome contributions! If you have a mini-project you'd like to add or improvements to suggest, please fork the repository and create a pull request. For major changes, please open an issue first to discuss what you would like to change.

  1. Fork the repository.
  2. Create a new branch (git checkout -b feature-branch).
  3. Make your changes.
  4. Commit your changes (git commit -m 'Add some feature').
  5. Push to the branch (git push origin feature-branch).
  6. Open a pull request.

Feel free to contribute by adding your own mini-projects to the list!

If you have a mini-project that you'd like to share, please follow the guidelines in CONTRIBUTING.md.

Code of Conduct

Please adhere to our Code of Conduct in all your interactions with the project.

License

This project is licensed under the MIT License.

Contact

For questions or inquiries, feel free to contact me on LinkedIn.

About Me:

I’m a seasoned Data Scientist and founder of TowardsMachineLearning.Org. I've worked with various machine learning, NLP, and cutting-edge deep learning frameworks to solve numerous business problems.
