THE ML ENGINEER 🤖
Issue #92
If you would like to suggest articles, ideas, papers, libraries, jobs, events or provide feedback, just hit reply or send us an email at a@ethical.institute! We have received a lot of great suggestions in the past; thank you very much to everyone for your support!
Iterative.ai CEO and DVC author Dmitry Petrov joins the Software Engineering Daily podcast to talk about the importance of version control in data science, as well as current trends, challenges and solutions.
Applied researchers at Facebook have put together an interesting and accessible series on differential privacy. In this article, they dive into some of the core concepts in privacy-preserving ML.
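As a quick, hand-rolled illustration of one of those core concepts, here is a minimal sketch of the Laplace mechanism, which releases a statistic with noise scaled to the query's sensitivity divided by the privacy budget ε. This is our own toy example rather than code from the series, and the numbers are made up:

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Release a noisy statistic satisfying epsilon-differential privacy.

    Noise is drawn from a Laplace distribution with scale = sensitivity / epsilon:
    a smaller epsilon (stronger privacy) means more noise.
    """
    scale = sensitivity / epsilon
    return true_value + np.random.laplace(loc=0.0, scale=scale)

# Example: privately release the number of users in a dataset.
# A counting query changes by at most 1 when one person is added or removed,
# so its sensitivity is 1.
true_count = 1042
noisy_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
print(f"noisy count: {noisy_count:.1f}")
```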
Deep learning can benefit massively from GPU computing. As when choosing a new CPU or RAM, it is important to understand the trade-offs across different GPU options, especially when it comes to specialised processing. This comprehensive post offers a deep dive into the core concepts and best practices to consider when selecting GPU hardware for deep learning.
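If you want to see what your current hardware offers before weighing up an upgrade, a small PyTorch snippet (assuming torch with CUDA support is installed) can report the properties the post discusses, such as memory, compute capability and multiprocessor count. This is an illustrative sketch, not code from the post:

```python
import torch

# Inspect the GPUs visible to PyTorch and print the properties that matter
# most when sizing deep learning workloads: memory, compute capability
# (which determines Tensor Core support) and streaming multiprocessor count.
if not torch.cuda.is_available():
    print("No CUDA-capable GPU detected")
else:
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(f"GPU {i}: {props.name}")
        print(f"  total memory:       {props.total_memory / 1024**3:.1f} GiB")
        print(f"  compute capability: {props.major}.{props.minor}")
        print(f"  multiprocessors:    {props.multi_processor_count}")
```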
A Deep Learning Engineer at Pathmind joins the Data Exchange podcast to discuss key applications of reinforcement learning to simulations. In this session, Max provides insight into how reinforcement learning is used at Pathmind.
Machine learning systems introduce new complexities where traditional monitoring approaches may fall short. This article provides insight into the gaps these tools leave when applied to machine learning, as well as some of the key areas where "explainable" monitoring solutions can add value.
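As one concrete example of a check that infrastructure dashboards alone will not give you, here is a minimal sketch of a feature drift alert based on a two-sample Kolmogorov-Smirnov test. The library choice, threshold and simulated data are our own illustrative assumptions, not taken from the article:

```python
import numpy as np
from scipy.stats import ks_2samp

def feature_drift_alert(train_values, live_values, p_threshold=0.01):
    """Flag drift when a two-sample KS test rejects the hypothesis that
    live feature values come from the training distribution."""
    result = ks_2samp(train_values, live_values)
    return result.pvalue < p_threshold, result.statistic, result.pvalue

# Simulated example: live traffic shifts away from the training distribution.
rng = np.random.default_rng(0)
train = rng.normal(loc=0.0, scale=1.0, size=5000)
live = rng.normal(loc=0.4, scale=1.2, size=1000)

drifted, stat, p = feature_drift_alert(train, live)
print(f"drift detected: {drifted} (KS statistic={stat:.3f}, p={p:.2e})")
```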
The topic for this week's featured production machine learning libraries is Privacy Preserving ML. We are currently looking for more libraries to add - if you know of any that are not listed, please let us know or feel free to open a PR. The four featured libraries this week are:
- Google's Differential Privacy - This is a C++ library of ε-differentially private algorithms, which can be used to produce aggregate statistics over numeric data sets containing private or sensitive information.
- Intel Homomorphic Encryption Backend - The Intel HE transformer for nGraph is a Homomorphic Encryption (HE) backend to the Intel nGraph Compiler, Intel's graph compiler for Artificial Neural Networks.
- Microsoft SEAL - Microsoft SEAL is an easy-to-use open-source (MIT licensed) homomorphic encryption library developed by the Cryptography Research group at Microsoft.
- PySyft - A Python library for secure, private deep learning. PySyft decouples private data from model training, using Multi-Party Computation (MPC) within PyTorch; a toy sketch of the secret-sharing idea behind MPC follows this list.
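To make the Multi-Party Computation idea behind libraries like PySyft a little more concrete, here is a toy sketch of additive secret sharing in plain Python. It uses no library APIs and is purely illustrative: each party holds shares that individually look random, yet together the parties can compute a sum without revealing their inputs:

```python
import secrets

PRIME = 2**61 - 1  # all arithmetic is done modulo a large prime

def share(secret: int, n_parties: int) -> list[int]:
    """Split a secret into additive shares that sum to the secret mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Recombine shares; any strict subset of them looks uniformly random."""
    return sum(shares) % PRIME

# Two parties jointly compute the sum of their private salaries without
# revealing them: each splits their value into shares, the parties exchange
# one share each, then each adds the shares it holds locally.
alice_salary, bob_salary = 52_000, 61_000
alice_shares = share(alice_salary, 2)
bob_shares = share(bob_salary, 2)

# Party 1 holds alice_shares[0] and bob_shares[0]; party 2 holds the rest.
party1_sum = (alice_shares[0] + bob_shares[0]) % PRIME
party2_sum = (alice_shares[1] + bob_shares[1]) % PRIME

print(reconstruct([party1_sum, party2_sum]))  # 113000, neither salary revealed
```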
© 2018 The Institute for Ethical AI & Machine Learning