Commit ad03fa9

Author: MichaelHirn
docs/readme: add leaf book to readme

1 parent af0d7e6

6 files changed: +127, -180 lines changed


README.md (+54, -101)
@@ -2,44 +2,40 @@
 
 ## Introduction
 
-Leaf is a Machine Intelligence Framework engineered by software developers, not
-scientists. It was inspired by the brilliant people behind TensorFlow, Torch,
-Caffe, Rust and numerous research papers and brings modularity, performance and
-portability to deep learning. Leaf is lean and tries to introduce minimal
+Leaf is an open Machine Learning Framework for hackers to build classical, deep
+or hybrid machine learning applications. It was inspired by the brilliant people
+behind TensorFlow, Torch, Caffe, Rust and numerous research papers and brings
+modularity, performance and portability to deep learning.
+
+Leaf has one of the simplest APIs, is lean and tries to introduce minimal
 technical debt to your stack.
 
-Leaf is a few months old, but thanks to its architecture and Rust, it is already one of
-the fastest Machine Intelligence Frameworks in the world.
+Leaf is a few months old, but thanks to its architecture and Rust, it is already
+one of the fastest Machine Intelligence Frameworks available.
 
 <div align="center">
 <img src="http://autumnai.com/images/autumn_leaf_benchmarks_alexnet.png"><br><br>
 </div>
 
 > See more Deep Neural Networks benchmarks on [Deep Learning Benchmarks][deep-learning-benchmarks-website].
 
-Leaf is portable. Run it on CPUs, GPUs, FPGAs on machines with an OS or on
+Leaf is portable. Run it on CPUs, GPUs, and FPGAs, on machines with an OS, or on
 machines without one. Run it with OpenCL or CUDA. Credit goes to
 [Collenchyma][collenchyma] and Rust.
 
 Leaf is part of the [Autumn][autumn] Machine Intelligence Platform, which is
-working on making AI algorithms 100x more computational efficient. It seeks to bring
-real-time, offline AI to smartphones and embedded devices.
+working on making AI algorithms 100x more computationally efficient.
 
 We see Leaf as the core of constructing high-performance machine intelligence
 applications. Leaf's design makes it easy to publish independent modules to make
 e.g. deep reinforcement learning, visualization and monitoring, network
 distribution, [automated preprocessing][cuticula] or scaleable production
 deployment easily accessible for everyone.
 
-For more info, refer to
-* the [Leaf examples][leaf-examples],
-* the [Leaf Documentation][documentation],
-* the [Autumn Website][autumn] or
-* the [Q&A](#qa)
-
 [caffe]: https://github.com/BVLC/caffe
 [rust]: https://www.rust-lang.org/
 [autumn]: http://autumnai.com
+[leaf-book]: http://autumnai.com/leaf/book
 [tensorflow]: https://github.com/tensorflow/tensorflow
 [benchmarks]: #benchmarks
 [leaf-examples]: #examples
@@ -52,22 +48,42 @@ For more info, refer to
 
 ## Getting Started
 
-If you are new to Rust you can install it as detailed [here][rust_download].
-We also recommend taking a look at the [official Getting Started Guide][rust_getting_started].
+### Documentation
+
+To learn how to build classical, deep or hybrid machine learning applications with Leaf, check out the [Leaf - Machine Learning for Hackers][leaf-book] book.
+
+For additional information see the [Rust API Documentation][documentation] or the [Autumn Website][autumn].
+
+Or start by running the **Leaf examples**.
 
-If you're using Cargo, just add Leaf to your `Cargo.toml`:
+We provide a [Leaf examples repository][leaf-examples], where we and
+others publish executable machine learning models built with Leaf. It features
+a CLI for easy usage and has a detailed guide in the [project
+README.md][leaf-examples].
+
+Leaf comes with an examples directory as well, which features popular neural
+networks (e.g. Alexnet, Overfeat, VGG). To run them on your machine, just follow
+the install guide, clone this repository and then run
+
+```bash
+# The examples currently require CUDA support.
+cargo run --release --no-default-features --features cuda --example benchmarks alexnet
+```
+
+[leaf-examples]: https://github.com/autumnai/leaf-examples
+
+### Installation
+
+> Leaf is built in [Rust][rust]. If you are new to Rust you can install it as detailed [here][rust_download].
+We also recommend taking a look at the [official Rust - Getting Started Guide][rust_getting_started].
+
+To start building a machine learning application (Rust only for now; wrappers are welcome), if you are using Cargo, just add Leaf to your `Cargo.toml`:
 
 ```toml
 [dependencies]
 leaf = "0.2.0"
 ```
 
-If you're using [Cargo Edit][cargo-edit], you can
-call:
-
-```bash
-cargo add leaf
-```
 [rust_download]: https://www.rust-lang.org/downloads.html
 [rust_getting_started]: https://doc.rust-lang.org/book/getting-started.html
 [cargo-edit]: https://github.com/killercup/cargo-edit
@@ -88,24 +104,24 @@ opencl = ["leaf/opencl"]
 
 > More information on the use of feature flags in Leaf can be found in [FEATURE-FLAGS.md](./FEATURE-FLAGS.md)
 
+### Contributing
 
-## Examples
+If you want to start hacking on Leaf (e.g.
+[adding a new `Layer`](http://autumnai.com/leaf/book/create-new-layer.html))
+you should start by forking and cloning the repository.
 
-We are providing a [Leaf examples repository][leaf-examples], where we and
-others publish executable machine learning models build with Leaf. It features
-a CLI for easy usage and has a detailed guide in the [project
-README.md][leaf-examples].
+We have more instructions to help you get started in the [CONTRIBUTING.md][contributing].
 
-Leaf comes with an examples directory as well, which features popular neural
-networks (e.g. Alexnet, Overfeat, VGG). To run them on your machine, just follow
-the install guide, clone this repoistory and then run
+We also have a near real-time collaboration culture, which happens
+here on Github and on the [Leaf Gitter Channel][gitter-leaf].
 
-```bash
-# The examples currently require CUDA support.
-cargo run --release --no-default-features --features cuda --example benchmarks alexnet
-```
+> Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as below, without any additional terms or conditions.
 
-[leaf-examples]: https://github.com/autumnai/leaf-examples
+[contributing]: CONTRIBUTING.md
+[gitter-leaf]: https://gitter.im/autumnai/leaf
+[mj]: https://twitter.com/mjhirn
+[hobofan]: https://twitter.com/hobofan
+[irc]: https://chat.mibbit.com/?server=irc.mozilla.org&channel=%23rust-machine-learning
 
 ## Ecosystem / Extensions
 
@@ -120,84 +136,21 @@ and extensible as possible. More helpful crates you can use with Leaf:
 
 ## Support / Contact
 
-- With a bit of luck, you can find us online on the #rust-machine-learing IRC at irc.mozilla.org,
+- With a bit of luck, you can find us online on the #rust-machine-learning IRC at irc.mozilla.org,
 - but we are always approachable on [Gitter/Leaf][gitter-leaf]
 - For bugs and feature request, you can create a [Github issue][leaf-issue]
 - For more private matters, send us email straight to our inbox: developers@autumnai.com
 - Refer to [Autumn][autumn] for more information
 
 [leaf-issue]: https://github.com/autumnai/leaf/issues
 
-## Contributing
-
-Want to contribute? Awesome! We have [instructions to help you get started][contributing].
-
-Leaf has a near real-time collaboration culture, and it happens here on Github and
-on the [Leaf Gitter Channel][gitter-leaf].
-
-Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0
-license, shall be dual licensed as below, without any additional terms or
-conditions.
-
-[contributing]: CONTRIBUTING.md
-[gitter-leaf]: https://gitter.im/autumnai/leaf
-[mj]: https://twitter.com/mjhirn
-[hobofan]: https://twitter.com/hobofan
-[irc]: https://chat.mibbit.com/?server=irc.mozilla.org&channel=%23rust-machine-learning
-
 ## Changelog
 
 You can find the release history at the [CHANGELOG.md][changelog]. We are using [Clog][clog], the Rust tool for auto-generating CHANGELOG files.
 
 [changelog]: CHANGELOG.md
 [Clog]: https://github.com/clog-tool/clog-cli
 
-## Q&A
-
-#### _Why Rust?_
-
-Hardware has just recently become strong enough to support real-world
-usage of machine intelligence e.g. super-human image recognition, self-driving
-cars, etc. To take advantage of the computational power of the underlying
-hardware, from GPUs to clusters, you need a low-level language that allows for
-control of memory. But to make machine intelligence widely accessible you want
-to have a high-level, comfortable abstraction over the underlying hardware.
-
-Rust allows us to cross this chasm.
-Rust promises performance like C/C++ but with safe memory-control. For now we
-can use C Rust wrappers for performant libraries. But in the future Rust
-rewritten libraries will have the advantage of zero-cost safe memory control,
-that will make large, parallel learning networks over CPUs and GPUs more
-feasible and more reliable to develop. The development of these future libraries
-is already under way e.g. [Glium][glium].
-
-On the usability side, Rust offers a trait-system that makes it easy for
-researchers and hobbyists alike to extend and work with Leaf as if it were
-written in a higher-level language such as Ruby, Python, or Java.
-
-#### _Who can use Leaf?_
-
-We develop Leaf under the MIT open source license, which, paired with the easy
-access and performance, makes Leaf a first-choice option for researchers and
-developers alike.
-
-#### _Why did you open source Leaf?_
-
-We believe strongly in machine intelligence and think that it will have a major
-impact on future innovations, products and our society. At Autumn, we experienced
-a lack of common and well engineered tools for machine learning and therefore
-started to create a modular toolbox for machine learning in Rust. We hope that,
-by making our work open source, we will speed up research and development of
-production-ready applications and make that work easier as well.
-
-#### _Who is Autumn?_
-
-Autumn is a startup working on automated decision making. Autumn was started by
-two developers, MJ and Max. The startup is located in Berlin and recently
-received a pre-seed investment from Axel Springer and Plug&Play.
-
-[glium]: https://github.com/tomaka/glium
-
 ## License
 
 Licensed under either of
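The README's feature-flag hunk above carries the context line `opencl = ["leaf/opencl"]`, and the examples are run with `--no-default-features --features cuda`. A downstream crate can re-export Leaf's backend features in the same way. This is a sketch only: the `native` feature name is an assumption, and FEATURE-FLAGS.md in the Leaf repository is the authoritative list.

```toml
# Sketch of a downstream Cargo.toml re-exporting Leaf's backend features.
# Feature names other than `cuda`/`opencl` are assumptions; see FEATURE-FLAGS.md.
[dependencies]
leaf = { version = "0.2.0", default-features = false }

[features]
default = ["native"]
native  = ["leaf/native"]
cuda    = ["leaf/cuda"]
opencl  = ["leaf/opencl"]
```

With this layout, `cargo build --no-default-features --features cuda` selects the CUDA backend for both the downstream crate and Leaf itself.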

doc/book/index.html (+1, -1)

@@ -108,7 +108,7 @@ <h1>Leaf - Machine Learning for Hackers</h1>
 classical, stochastic or hybrids, and solvers for executing and optimizing the
 model.</p>
 <p>This is already the entire API for machine learning with Leaf. To learn how
-this is possible and how to build machine learning applications, refer to
+this is possible and how to build machine learning applications, refer to chapters
 <a href="./layers.html">2. Layers</a> and <a href="./solvers.html">3. Solvers</a>. Enjoy!</p>
 <h2>Benefits+</h2>
 <p>Leaf was built with three concepts in mind: accessibility/simplicity,

doc/book/layer-lifecycle.html (+29, -32)

@@ -68,26 +68,24 @@ <h1 class="menu-title"></h1>
 
 <div id="content" class="content">
 <h1>Layer Lifecycle</h1>
-<p>In <a href="./layers.html">2. Layers</a> we have already seen a little bit about how to
-construct a <code>Layer</code> from a <code>LayerConfig</code>. In this chapter, we take
-a closer look at what happens inside Leaf when initializing a <code>Layer</code> when
-running the <code>.forward</code> of a <code>Layer</code> and when running the <code>.backward</code>. In the
-next chapter <a href="./building-networks.html">2.2 Create a Network</a> we then
-apply our knowledge to construct deep networks via the container layer.</p>
-<p>Initialization (<code>::from_config</code>), <code>.forward</code> and <code>.backward</code> are the three most
-important methods of a <code>Layer</code> and describe basically the entire API. Let's
-take a closer look at what happens inside Leaf, when these methods are called.</p>
+<p>In chapter <a href="./layers.html">2. Layers</a> we saw how to
+construct a simple <code>Layer</code> from a <code>LayerConfig</code>. In this chapter, we take
+a closer look at what happens inside Leaf when initializing a <code>Layer</code> and when running its
+<code>.forward</code> and <code>.backward</code> methods. In the next chapter <a href="./building-networks.html">2.2 Create a Network</a> we
+apply our knowledge to construct deep networks with the container layer.</p>
+<p>The most important methods of a <code>Layer</code> are initialization (<code>::from_config</code>), <code>.forward</code> and <code>.backward</code>.
+They basically describe the entire API, so let's take a closer look at what happens inside Leaf when these methods are called.</p>
 <h3>Initialization</h3>
-<p>A layer is constructed from a <code>LayerConfig</code> via the <code>Layer::from_config</code>
+<p>A layer is constructed from a <code>LayerConfig</code> with the <code>Layer::from_config</code>
 method, which returns a fully initialized <code>Layer</code>.</p>
 <pre><code class="language-rust">let mut sigmoid: Layer = Layer::from_config(backend.clone(), &amp;LayerConfig::new(&quot;sigmoid&quot;, LayerType::Sigmoid))
 let mut alexnet: Layer = Layer::from_config(backend.clone(), &amp;LayerConfig::new(&quot;alexnet&quot;, LayerType::Sequential(cfg)))
 </code></pre>
 <p>In the example above, the first layer has a Sigmoid worker
-(<code>LayerType::Sigmoid</code>). The second layer has a Sequential worker.
-Although both <code>Layer::from_config</code> methods, return a <code>Layer</code>, the behavior of
-the <code>Layer</code> depends on the <code>LayerConfig</code> it was constructed with. The
-<code>Layer::from_config</code> calls internally the <code>worker_from_config</code> method, which
+(<code>LayerType::Sigmoid</code>) and the second layer has a Sequential worker.
+Although both <code>::from_config</code> methods return a <code>Layer</code>, the behavior of
+that <code>Layer</code> depends on the <code>LayerConfig</code> it was constructed with. The
+<code>Layer::from_config</code> internally calls the <code>worker_from_config</code> method, which
 constructs the specific worker defined by the <code>LayerConfig</code>.</p>
 <pre><code class="language-rust">fn worker_from_config(backend: Rc&lt;B&gt;, config: &amp;LayerConfig) -&gt; Box&lt;ILayer&lt;B&gt;&gt; {
 match config.layer_type.clone() {

@@ -99,35 +97,34 @@ <h3>Initialization</h3>
 }
 }
 </code></pre>
-<p>The layer specific <code>::from_config</code> (if available or needed) then takes care of
+<p>The layer-specific <code>::from_config</code> (if available or needed) then takes care of
 initializing the worker struct, allocating memory for weights and so on.</p>
-<p>In case the worker layer is a container layer, its <code>::from_config</code> takes
+<p>If the worker is a container layer, its <code>::from_config</code> takes
 care of initializing all the <code>LayerConfig</code>s it contains (which were added via its
-<code>.add_layer</code> method) and connecting them in
-the order they were provided to the <code>LayerConfig</code> of the container.</p>
-<p>Every <code>.forward</code> or <code>.backward</code> call that is now made to the returned <code>Layer</code> is
-sent to the worker.</p>
+<code>.add_layer</code> method) and connecting them in the order they were provided.</p>
+<p>Every <code>.forward</code> or <code>.backward</code> call that is made on the returned <code>Layer</code> is
+run by the internal worker.</p>
 <h3>Forward</h3>
-<p>The <code>forward</code> method of a <code>Layer</code> sends the input through the constructed
+<p>The <code>forward</code> method of a <code>Layer</code> threads the input through the constructed
 network and returns the output of the network's final layer.</p>
 <p>The <code>.forward</code> method does three things:</p>
 <ol>
 <li>Reshape the input data if necessary</li>
-<li>Sync the input/weights to the device were the computation happens. This step
-removes the worker layer from the obligation to care about memory synchronization.</li>
-<li>Call the <code>forward</code> method of the worker layer.</li>
+<li>Sync the input/weights to the device where the computation happens. This step
+removes the need for the worker layer to care about memory synchronization.</li>
+<li>Call the <code>forward</code> method of the internal worker layer.</li>
 </ol>
-<p>In case, the worker layer is a container layer, the <code>.forward</code> method of the
-container layer takes care of calling the <code>.forward</code> methods of its managed
+<p>If the worker layer is a container layer, the <code>.forward</code> method
+takes care of calling the <code>.forward</code> methods of its managed
 layers in the right order.</p>
 <h3>Backward</h3>
-<p>The <code>.backward</code> of a <code>Layer</code> works quite similar to its <code>.forward</code>. Although it
-does not need to reshape the input. The <code>.backward</code> computes
-the gradient with respect to the input and the gradient w.r.t. the parameters but
-only returns the gradient w.r.t the input as only that is needed to compute the
+<p>The <code>.backward</code> method of a <code>Layer</code> works similarly to <code>.forward</code>, apart from
+not needing to reshape the input. The <code>.backward</code> method computes
+the gradient with respect to the input as well as the gradient w.r.t. the parameters. However,
+the method only returns the input gradient because that is all that is needed to compute the
 gradient of the entire network via the chain rule.</p>
-<p>In case the worker layer is a container layer, the <code>.backward</code> method of the
-container layer takes care of calling the <code>.backward_input</code> and
+<p>If the worker layer is a container layer, the <code>.backward</code> method
+takes care of calling the <code>.backward_input</code> and
 <code>.backward_parameter</code> methods of its managed layers in the right order.</p>
 
 </div>
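The lifecycle this chapter describes (a `LayerConfig` selects a worker, and the returned `Layer` delegates `.forward`/`.backward` to it) can be sketched as a standalone program. These are not Leaf's real types: it is a simplified illustration of the dispatch pattern, omitting the backend parameter, memory synchronization, and input reshaping that Leaf's actual `Layer` handles.

```rust
// Standalone sketch (NOT Leaf's real API): a Layer whose behavior is
// chosen from its config, mirroring how Layer::from_config dispatches
// to a worker via worker_from_config.

enum LayerType {
    Sigmoid,
    ReLU,
}

struct LayerConfig {
    name: String,
    layer_type: LayerType,
}

impl LayerConfig {
    fn new(name: &str, layer_type: LayerType) -> Self {
        LayerConfig { name: name.to_string(), layer_type }
    }
}

// The worker trait: each concrete layer implements the actual math.
trait ILayer {
    fn forward(&self, input: &[f32]) -> Vec<f32>;
    // Returns only the input gradient; the chain rule needs nothing more.
    fn backward(&self, input: &[f32], out_grad: &[f32]) -> Vec<f32>;
}

struct Sigmoid;
impl ILayer for Sigmoid {
    fn forward(&self, input: &[f32]) -> Vec<f32> {
        input.iter().map(|x| 1.0 / (1.0 + (-x).exp())).collect()
    }
    fn backward(&self, input: &[f32], out_grad: &[f32]) -> Vec<f32> {
        input.iter().zip(out_grad).map(|(x, g)| {
            let s = 1.0 / (1.0 + (-x).exp());
            g * s * (1.0 - s) // d/dx sigmoid(x) = s * (1 - s)
        }).collect()
    }
}

struct ReLU;
impl ILayer for ReLU {
    fn forward(&self, input: &[f32]) -> Vec<f32> {
        input.iter().map(|x| x.max(0.0)).collect()
    }
    fn backward(&self, input: &[f32], out_grad: &[f32]) -> Vec<f32> {
        input.iter().zip(out_grad)
            .map(|(x, g)| if *x > 0.0 { *g } else { 0.0 })
            .collect()
    }
}

struct Layer {
    name: String,
    worker: Box<dyn ILayer>,
}

impl Layer {
    // Mirrors Layer::from_config: pick the worker from the config.
    fn from_config(config: &LayerConfig) -> Layer {
        let worker: Box<dyn ILayer> = match config.layer_type {
            LayerType::Sigmoid => Box::new(Sigmoid),
            LayerType::ReLU => Box::new(ReLU),
        };
        Layer { name: config.name.clone(), worker }
    }
    // Every call on the returned Layer is run by the internal worker.
    fn forward(&self, input: &[f32]) -> Vec<f32> {
        self.worker.forward(input)
    }
    fn backward(&self, input: &[f32], out_grad: &[f32]) -> Vec<f32> {
        self.worker.backward(input, out_grad)
    }
}

fn main() {
    let sigmoid = Layer::from_config(&LayerConfig::new("sigmoid", LayerType::Sigmoid));
    let out = sigmoid.forward(&[0.0]);
    assert!((out[0] - 0.5).abs() < 1e-6);
    let grad = sigmoid.backward(&[0.0], &[1.0]);
    assert!((grad[0] - 0.25).abs() < 1e-6);
    println!("{}: forward(0) = {}, backward(0) = {}", sigmoid.name, out[0], grad[0]);
}
```

The point of the sketch is the indirection: both constructions return the same `Layer` type, and only the boxed worker decides what `.forward` and `.backward` compute.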

doc/book/layers.html (+6, -6)

@@ -69,7 +69,7 @@ <h1 class="menu-title"></h1>
 <div id="content" class="content">
 <h1>Layers</h1>
 <h3>What is a Layer?</h3>
-<p><a href="./deep-learning-glossary.html#Layer">Layers</a> are the highest-level and only building
+<p><a href="./deep-learning-glossary.html#Layer">Layers</a> are the only building
 blocks in Leaf. As we will see later on, everything is a layer. Even when
 we construct <a href="./deep-learning-glossary.html#Network">networks</a>, we are still just
 working with layers composed of smalle layers. This makes the API clean and expressive.</p>

@@ -157,15 +157,15 @@ <h4>Container Layers</h4>
 can be found at
 <a href="https://github.com/autumnai/leaf/tree/master/src/layers/container">src/layers/container</a>.</p>
 <h3>Why Layers?</h3>
-<p>The benefit of using a layer-based design approach is, that it allows for a very expressive
+<p>The benefit of using a layer-based design approach is that it allows for a very expressive
 setup that can represent, as far as we know, any machine learning algorithm.
 That makes Leaf a framework, that can be used to construct practical machine
 learning applications that combine different paradigms.</p>
 <p>Other machine learning frameworks take a symbolic instead of a layered approach.
-For Leaf, we decided against it, as we found it easier for developers to handle
-layers, than mathematical expressions. More complex algorithms like LSTMs are
-also harder to replicate in a symbolic framework than with layered ones. We
-believe that Leafs layer approach strikes a great balance between,
+For Leaf we decided against it, as we found it easier for developers to work with
+layers than mathematical expressions. More complex algorithms like LSTMs are
+also harder to replicate in a symbolic framework. We
+believe that Leaf's layer approach strikes a great balance between
 expressiveness, usability and performance.</p>
 
 </div>
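The container behaviour described in the book chapters above (a container runs its managed layers' `.forward` passes in order and their backward passes in reverse, threading only input gradients along) can be sketched with a toy sequential container. Again, these are not Leaf's real types; this is a minimal, self-contained illustration of the ordering and of the chain rule carried by input gradients alone.

```rust
// Toy sequential container (NOT Leaf's real Sequential): forward in
// order, backward in reverse, passing only input gradients along.

// A layer here is just a pair of closures: forward, and backward
// (mapping (input, output-gradient) -> input-gradient).
struct SimpleLayer {
    forward: Box<dyn Fn(&[f32]) -> Vec<f32>>,
    backward: Box<dyn Fn(&[f32], &[f32]) -> Vec<f32>>,
}

struct Sequential {
    layers: Vec<SimpleLayer>,
}

impl Sequential {
    // Call each managed layer in the order it was added, remembering
    // every intermediate input for the backward pass.
    fn forward(&self, input: &[f32]) -> (Vec<f32>, Vec<Vec<f32>>) {
        let mut inputs = Vec::new();
        let mut current = input.to_vec();
        for layer in &self.layers {
            inputs.push(current.clone());
            current = (layer.forward)(&current);
        }
        (current, inputs)
    }
    // Walk the layers in reverse, threading only the input gradient
    // through -- that is all the chain rule needs.
    fn backward(&self, inputs: &[Vec<f32>], out_grad: &[f32]) -> Vec<f32> {
        let mut grad = out_grad.to_vec();
        for (layer, input) in self.layers.iter().zip(inputs).rev() {
            grad = (layer.backward)(input, &grad);
        }
        grad
    }
}

fn main() {
    // Two scaling layers: f(x) = 3 * (2 * x), so df/dx = 6.
    let double = SimpleLayer {
        forward: Box::new(|x: &[f32]| x.iter().map(|v| 2.0 * v).collect::<Vec<f32>>()),
        backward: Box::new(|_i: &[f32], g: &[f32]| g.iter().map(|v| 2.0 * v).collect::<Vec<f32>>()),
    };
    let triple = SimpleLayer {
        forward: Box::new(|x: &[f32]| x.iter().map(|v| 3.0 * v).collect::<Vec<f32>>()),
        backward: Box::new(|_i: &[f32], g: &[f32]| g.iter().map(|v| 3.0 * v).collect::<Vec<f32>>()),
    };
    let net = Sequential { layers: vec![double, triple] };
    let (out, cached) = net.forward(&[1.0]);
    assert_eq!(out, vec![6.0]);
    let grad = net.backward(&cached, &[1.0]);
    assert_eq!(grad, vec![6.0]);
    println!("output = {:?}, input gradient = {:?}", out, grad);
}
```

Leaf's real container additionally splits the backward pass into `.backward_input` and `.backward_parameter`, since parameter gradients are consumed by the solver rather than returned.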

doc/book/leaf.html (+1, -1)

@@ -109,7 +109,7 @@ <h1>Leaf - Machine Learning for Hackers</h1>
 classical, stochastic or hybrids, and solvers for executing and optimizing the
 model.</p>
 <p>This is already the entire API for machine learning with Leaf. To learn how
-this is possible and how to build machine learning applications, refer to
+this is possible and how to build machine learning applications, refer to chapters
 <a href="./layers.html">2. Layers</a> and <a href="./solvers.html">3. Solvers</a>. Enjoy!</p>
 <h2>Benefits+</h2>
 <p>Leaf was built with three concepts in mind: accessibility/simplicity,
