Jul 15 2018
Myprotein is a fairly large European sports nutrition brand. Some time ago, they also entered
the powdered meal replacement market with Whole Fuel, which is “inspired” by Huel.
They also offer meal replacement bars, which I will be reviewing as well.
Whole Fuel
Whole Fuel’s nutrition profile is a bit odd: 3 servings (100g each) provide you with
100% of your daily nutrients, but only 1300 calories. The idea is that you consume
additional calorie-rich supplements such as protein shakes on top. I usually have about
2 shakes of Whole Fuel per day, plus some other foods and/or protein shakes. But I’ve also had
days where I only consumed 3 shakes of Whole Fuel and didn’t feel hungry, so it could be a good
option for weight loss.
When mixed, Whole Fuel is extremely thick. You are supposed to mix 3 scoops with 350-500ml of water;
I use 500ml of water with only 2 scoops and it’s already as thick as pudding. I would recommend
3 scoops with at least 700ml of water, but unfortunately my shaker bottle does not hold that much.
I use a Promixx 2.0 vortex mixer, which creates a smooth and enjoyable texture. Shaking by hand
results in lots of unpleasant chunks, so I’d advise using an electric mixer to prepare Whole Fuel.

Whole Fuel comes in three flavours, and all of them contain loads of sweeteners.
- Vanilla Raspberry: Too sweet. You can taste the raspberry but not the vanilla. This one is my favourite.
- Vanilla: Too sweet. Doesn’t taste like vanilla at all, I don’t like it.
- Chocolate: Less sweet, still too much though. Doesn’t taste like chocolate, but it’s alright.
The best thing about Whole Fuel is its price, even though Myprotein has an opaque pricing scheme where
they set high list prices and then run discounts at all times. You can pick up a 5kg bag for as low as 40 Euros,
which provides 10 full days of nutrition at 2000 calories per day, meaning it costs 4 Euros per day.
That’s even cheaper than the already quite affordable Jimmy Joy.
Overall, there are lots of good things about Whole Fuel. High quality ingredients, great texture,
great price. But they need to work on their flavouring and reduce the sweetness.
Meal Replacement Bars
Myprotein’s meal replacement bars are high in nutrients (at least 15% of your daily nutrients per bar,
though not well balanced) but quite low in calories (227 per bar). They also contain 20g of protein and
7.6g of sugar. I don’t think they have enough calories to deserve the name “meal replacement bar”,
but they still keep me satiated for 1-2 hours. Again, a good option for weight loss or people with an active lifestyle.
These bars are really handy. I usually don’t eat breakfast, but now I always grab a bar when I
leave the house in the morning and eat it on the go.

There are again three flavours here:
- Salted Caramel: Delicious.
- Chocolate Fudge: Good, but a bit dry.
- Mocha: If you like coffee, you might like these. I do not.
You’ll always want to drink a bit of water along with these bars.
Compared to similar bars, they’re also fairly affordable. I’ve gotten them for as low as 14 Euros
per pack of 12, or about 9 Euros per day if you eat nothing but these bars (not recommended).
Other Products
They occasionally give you free products with your order, so I thought I’d quickly mention these:
- Impact Whey Protein Vanilla: You can actually taste the vanilla and it’s not too sweet either - this is
how Whole Fuel should taste.
- Baked Cookie: You can eat raw sugar and it will be less sweet than this cookie. I don’t know what they were thinking.
Most of their snacks are ridiculously expensive and of questionable nutritional value.
Like a small bar of chocolate with added protein for 4 Euros.
Conclusion
Overall I’m quite happy with Myprotein’s products. They are tailored towards people who exercise
a lot and take protein supplements, but they can also be used to lose weight. My only complaints are
the bad flavouring of Whole Fuel and Myprotein’s opaque pricing scheme in general.
Nov 28 2017
Runtime is a new player in the meal replacement market
that Soylent kickstarted a few years ago. I have tried many of the
European brands and will compare them to Runtime. Full disclosure: I was
provided with a sample pack for free.
I had never heard of Runtime before, so I checked out their website.
They offer powdered food and energy drinks, and primarily target gamers. I found
the site very appealing, definitely one of the better ones out there.
Since the energy drink is just that and not a complete meal, I will only talk
about the powdered food titled “Next Level Meal”. It comes in four different
flavours: Original, Coconut, Strawberry and Chocolate. Unfortunately, Chocolate was not
included in the mix pack I received.

Each bag is a single meal, which I like because I often make a
mess with the larger bags. The price is €3.50 per bag, or about €2.50 with a
subscription. At 600kcal per bag, Runtime is among the more expensive meal
replacements.
Each bag requires you to add 300ml of water; after a bit of shaking the powder
dissolves pretty well. The texture is still a bit gritty, but I didn’t mind it.
For me, this is a definite improvement over Queal’s texture.
The original flavour is easily my favourite: it’s quite sweet with a bit of
vanilla. The coconut flavour is similarly sweet with a hint of coconut.
I couldn’t stand the taste of the strawberry, but that applies to Jimmy Joy’s
strawberry as well; it’s just not for me. Overall I’d say the flavouring is
top notch and comparable to Queal.

There is a bit of an alarming trend though, and that is the sugar content. A meal
of Jimmy Joy contains 10g of sugar, while a meal of Runtime contains almost 18g.
I imagine this is part of the reason why it tastes so good, but it’s still
something to keep in mind.
To conclude this review: I like Runtime a lot. It has the best texture of all
powdered foods I’ve tried, and the flavouring is great as well. The price is a bit
higher than I’d like, but from what I’ve been told, we might see it decrease soon.
Oct 23 2016
If there is one thing the TypeScript people should improve, it’s documentation.
I just went through a small odyssey trying to convert a project to TypeScript
2.0 and the new way of getting definition files. Hint: look at how the Rust
project does documentation.
Updating TypeScript
Go into your package.json, set TypeScript to version 2.something and run npm install. Done.
Try to compile your project; it probably won’t compile, because the compiler is a bit more strict.
The error messages, however, should give you an idea of what needs to be changed, so go and fix
your code. Visual Studio Code uses the globally installed TypeScript version; update that one as
well by running npm install -g typescript - yes, install, not update, don’t ask why. I make this
mistake every. single. time.
Use @types
You can now install type definitions via npm; typings/tsd
are no longer needed. Installation goes like this:
npm install --save @types/react
All DefinitelyTyped definitions are available, so you might as well do this now.
After installing all typings, remove the reference path to the old definitions,
try to build and observe how TypeScript cannot resolve a single module. First,
we have to tell TypeScript that we’re using node modules:
// tsconfig.json
{
  "compilerOptions": {
    "moduleResolution": "node"
  }
}
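For illustration, with node resolution enabled and the @types/react package from above installed,
a plain npm import should now resolve again (a minimal sketch; the file name is just an example):
// src/app.ts
// Resolves against node_modules the way Node itself would, picking up
// the type definitions from @types/react.
import * as React from "react";

export const greeting = React.createElement("h1", null, "Hello, TypeScript 2.0");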
Now, you might actually be able to compile. Unless you’re using any global
non-standard functions like require or core-js shims. Remember that you
had to explicitly load typings using the reference path before? This is no
longer necessary, but it also means TypeScript has no idea what typings are
available. When you import something, its typings are loaded automatically, but if
something should always be loaded, you need to configure that:
// tsconfig.json
{
  "compilerOptions": {
    "types": [
      "node", "mocha", "core-js"
    ]
  }
}
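As a rough illustration of what that buys you, a test file along these lines should now type-check
without any reference paths, because the node and mocha typings are loaded globally (the file name
and test are made up for this example):
// test/smoke.ts
// describe/it come from the ambient mocha typings, process from the node
// typings - no imports or reference paths are needed for them.
import * as assert from "assert";

describe("ambient typings", () => {
  it("sees node globals", () => {
    assert.ok(typeof process.cwd() === "string");
  });
});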
Done. Now your project should work as usual, with one less tool required.
This wasn’t actually hard, was it? It still took me around an hour to figure
out, which could’ve been prevented by a simple mention of these things in the
release announcement or elsewhere (search the title of this post; there is
zero documentation about this).
May 14 2016
I’ve been running a home server for a few years, but my upload is just too poor to
do anything serious with it, so I got myself a cheap dedicated server. I installed
FreeBSD, because I wanted to try bhyve, its new-ish hypervisor.
The default “frontend” to bhyve is quite complex, so I used vm-bhyve
instead, which can definitely compete with Docker in ease of use.
So let’s install it. It is in ports, but the prebuilt package is usually outdated, so
make sure you install from source.
# if you don't have the ports tree yet
portsnap fetch extract
cd /usr/ports/sysutils/vm-bhyve && make install clean
If you plan to run anything other than FreeBSD, you’ll also need grub2-bhyve:
cd /usr/ports/sysutils/grub2-bhyve && make install clean
Some initial config:
mkdir /var/vm
zfs create -o mountpoint=/var/vm zroot/vm
echo 'vm_enable="YES"' >> /etc/rc.conf
echo 'vm_dir="zfs:zroot/vm"' >> /etc/rc.conf
vm init
cp /usr/local/share/examples/vm-bhyve/* /var/vm/.templates/
This is enough to be able to launch VMs, but we want networking as well.
echo 'net.inet.ip.forwarding=1' >> /etc/sysctl.conf
echo 'pf_enable="YES"' >> /etc/rc.conf
vm switch create public
vm switch add public em0
vm switch nat public on
pkg install dnsmasq
echo 'dnsmasq_enable="YES"' >> /etc/rc.conf
mv /usr/local/etc/dnsmasq.conf.bhyve /usr/local/etc/dnsmasq.conf
service dnsmasq start
vm-bhyve will add an include line to /etc/pf.conf; you might
have to move it up a bit (check with pfctl -f /etc/pf.conf).
Now, we need an ISO, which vm-bhyve can download for us:
vm iso ftp://ftp.freebsd.org/pub/FreeBSD/releases/amd64/amd64/ISO-IMAGES/10.3/FreeBSD-10.3-RELEASE-amd64-disc1.iso
If you want to download ISOs manually, just put them in /var/vm/.iso/.
Let’s launch a VM:
vm create -t freebsd-zpool -s 50G freebsd1
vm install freebsd1 FreeBSD-10.3-RELEASE-amd64-disc1.iso
vm console freebsd1
Now, just go through the installer as usual. Easy!
Next step: figure out how to assign IPv6 addresses to VMs.
Hopefully not too hard.
May 10 2016
or: How to implement XML parsing in just 500 lines of Rust.
A weekly blog about my progress on CloudFM, an offline-first, multi-backend music player.
Not the best start for a series like this, but last week my SSD died. Then I wasted
an entire evening trying to install openSUSE Tumbleweed (something something SecureBoot).
Bottom line: I did some stuff, but not even close to what I wanted to achieve.
What’s new
hyperdav now has all the required functionality.
I’m not particularly proud of the code; especially the response parsing using
xml-rs is extremely verbose, even though about 90% of the body is ignored anyway.
Maybe real XML support in serde will happen one day.
WebDAV indexing is now implemented. This change broke some parts of the app, since
the URI format has changed to now always include the backend id.
All components are now dockerized. I want to do some form of automated deployment
soon-ish. Not because it makes sense right now, but because playing with ops stuff
is fun.
What’s next
In case my notebook decides to explode tomorrow, let’s set the goals a bit lighter:
- Make the web app usable for everyday listening - same as last week
- Implement the UI to add ownCloud/Box.com backends, which will be
stored as webdav behind the scenes
May 3 2016
Over the last few months, I’ve been working on a next-generation music player
for the age of “servers connected to the internet”, also known as the “cloud”.
Because I am bad at naming things, I called it CloudFM.
You don’t actually need to click the link, because there’s nothing to see there.
I’m mainly putting this out there because I want to regularly share progress
I’ve made. But let’s start with what I’ve done so far.
Why I’m doing this
Long story short, I loved Rdio, then Rdio shut down. Turns out the alternatives
aren’t as good as Rdio and a lot of them even require Flash (2016??).
I switched to Subsonic, which works ok, but Rdio was just so much better in so
many ways. So I’m building my own thing instead.
CloudFM is a music player that integrates a plethora of cloud services
(think Spotify, YouTube, Dropbox, SoundCloud and more) into a single player
interface. Since mobile internet is expensive and not always available, I want
to make it as offline-capable as possible. And nowadays you have so much
storage space on your phone, while the tiny SSD in your notebook
is constantly running out of space - CloudFM will let you store your music
on your phone and listen to it on your desktop.
Micro-services written in Rust
To be honest, I did not intend to use a micro-service architecture from the start.
I actually wrote a monolithic server first, until I realized: I’m going to
need a lot of this code in different places. For example, indexing code will
have to run on the server side, but also as part of a GUI desktop app. That is
why I turned my server into a library that compiles to a couple of binaries:
- indexd: The indexing daemon; it indexes music from various online services (and local files).
- proxyd: Give it a track ID and it will respond with an audio file, no matter where
it is stored. In the future, it will also do things like on-the-fly re-encoding
of files, and more.
- manager: A desktop app to index and serve local files. It will probably use the
excellent GTK bindings for Rust. Or maybe the Qt Quick bindings, because
GTK isn’t actually that great on platforms other than Linux. Ideally, both.
Web app written in TypeScript/React/Redux
Initially, I started writing it in Elm. It was an interesting experiment,
and there are a lot of things I like about the language, but the pros didn’t
quite outweigh the cons. The short version: the language has a few shortcomings
even in its domain (web apps), the ecosystem is rather small, and integrating
existing JavaScript libraries and APIs is a lot of work.
Searching for an alternative, I decided to use TypeScript first. I treat it
as a very powerful linter: if your code passes the linter (compiles), it’s
very likely correct. Less edit-tab-reload-tab-edit, more instant feedback
through your editor. While the type system is not as good as Rust’s, and Redux
is not very TypeScript-friendly, I do not regret it at all, partially because of
the awesome TypeScript integration in Visual Studio Code.
Choosing a front-end framework was a really straightforward process: I knew
I wanted a native mobile app, because hybrid apps just aren’t that great. And
since we basically require PouchDB for offline availability of the database,
React and React Native are pretty much the only viable options. Together with
Redux, we’ve got a pretty nice, Elm-like stack with a big ecosystem.
What works
It’s been a bit over a week since I started rewriting the web app, and
here’s what it looks like:

This is just a very early prototype; expect things to change, a lot. A lot
of the features you’d expect from any player aren’t there yet. The design
will also definitely change in the future.
What’s not visible on the screenshot is that the music is not just local, but
also from Jamendo. In the future, a lot more backends will follow.
What’s next
My goals for this week:
- Continue work on my WebDAV client for Rust and then implement the WebDAV backend
- Make the web app usable for everyday listening
- Start working on a MusicBrainz client
- Read Design for Hackers
Mar 28 2016
I wrote a Jamendo API client in Rust today. And it was easy.
Yes, Rust is quite hard to learn. But after you’ve grasped the concepts it’s
built around, like ownership and lifetimes, it all makes sense. Unlike
JavaScript, which never made any sense.
Mar 26 2016
I’ve written quite a lot of JavaScript and Node.js code over the last few years.
Unlike many, I think it’s a great language and server platform. However, the
lack of static types does regularly introduce errors that are hard to find.
That’s why I decided to use TypeScript for a new
project.
The website describes it as a superset of JavaScript, with typings being
optional. I was quite surprised to find out this is not true, at all. I started
by installing TypeScript, a webpack loader, and react with npm. Now, we should
be able to do this:
import * as React from "react";
ERROR in ./src/main.ts
(2,24): error TS2307: Cannot find module 'react'.
Uh, what? I just installed it, why can’t the compiler find it? After doing a bit
of reading (the official docs are severely lacking in that regard, I have to
say), I found out that typings are actually not optional. There are two
solutions:
- Install the require typings and load modules with require() (a sketch of this follows below).
This works because the typings for require define it as returning any, therefore
disabling all type checks for it.
- Install the typings for react. This is the recommended approach; however,
the typings have to be created by hand, so they don’t exist for all modules.
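Here’s roughly what the first option looks like in practice, a sketch assuming the ambient
require typings are installed:
// main.ts
// The require typings make require() return any, so this compiles - but React
// is typed as any and nothing about it is checked.
const React = require("react");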
How do you install typings, you might ask. There is a project with the same
name that provides a package manager for them, similar to npm. Install it
via npm install -g typings, then you can install type definitions using
typings install react --save --ambient. --save stores them in the typings.json,
which is like your package.json but for typings. --ambient is required for
modules that export to the global namespace - I don’t yet know why it’s
required for react, only that it won’t work without it.
After you’ve installed them, you need to add one special line to the top of
your code:
/// <reference path='../typings/browser.d.ts'/>
The path is relative to your source file and tells TypeScript to load
the definitions that you’ve installed via typings. Note that browser.d.ts
is for browser projects; if you target Node.js, use main.d.ts instead.
Initially, I also had issues with duplicate definitions. This is because
TypeScript, by default, loads all .ts files. What we usually want is just a
single entry file that includes the rest of our code. To fix this, we need to
create a tsconfig.json in our project root:
{
  "files": [
    "src/main.ts"
  ]
}
Now, finally, we are able to use React from TypeScript. I feel like there is a
lot that needs to be improved here, starting with the compiler error message.
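To make that concrete, a minimal entry file along these lines now compiles (a sketch; it assumes
react-dom and its typings are installed the same way):
// src/main.ts
/// <reference path='../typings/browser.d.ts'/>
import * as React from "react";
import * as ReactDOM from "react-dom";

ReactDOM.render(
  React.createElement("h1", null, "Hello from TypeScript"),
  document.body
);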
Now, what if there are no typings for the module you’d like to use? As I
mentioned earlier, you can use require() to completely bypass type checks,
by installing the typings for it and then doing:
import h = require('react-hyperscript-helpers');
Unfortunately, this is not possible with ES6 modules, so we don’t get the
ability to destructure during import. I think that creating typings is what you
should be doing instead; that’s why you’re using TypeScript in the first
place, right?
The Handbook tells you how to do it; here’s just a quick example, my react-hyperscript-helpers.d.ts:
declare namespace ReactHyperscriptHelpers {
  function h1(text: string): any;
}

declare module "react-hyperscript-helpers" {
  export = ReactHyperscriptHelpers;
}
As you can see, it defines a single function h1 that takes a string and
returns any. Now, we can do this:
/// <reference path='./react-hyperscript-helpers.d.ts'/>
import * as ReactDOM from "react-dom";
import { h1 } from "react-hyperscript-helpers";

ReactDOM.render(
  h1('Hello, World!'),
  document.body
);
I don’t think it ever took me longer to get a Hello World up and running.
Microsoft, please make this stuff easier to get started with.
Mar 22 2016
Nix is a package manager that works a bit differently: it allows you to install
any version of any package alongside each other. Since you can’t, for example,
have multiple python executables in your PATH at the same time, nix-shell is used
to give you an environment with all the dependencies you need.
Let’s say you have a Rust project. You create a default.nix in your project directory:
with import <nixpkgs> { };

rustPlatform.buildRustPackage rec {
  name = "my-project-${version}";
  version = "0.1";
  src = ./.;
  buildInputs = [ openssl pkgconfig ];
  depsSha256 = "160ar8jfzhhrg5rk3rjq3sc5mmrakysynrpr4nfgqkbq952il2zk";
}
This defines a Rust package in the current directory, with openssl as a
dependency. Note that buildInputs only lists native dependencies; your
crates are specified in your Cargo.toml as usual.
To build a package out of this, you can run nix-build default.nix
(or just nix-build .). However, this will always build the project from a
clean state, which we don’t really want during development. So instead, we do
nix-shell ., which puts us in a new shell that not only has openssl and
pkgconfig, but also all dependencies of rustPlatform, like rustc and cargo.
Now, what if we need a database? Well, we’d have to install that through the
usual channels - right? Wrong! This is where things get really interesting: Nix
has packages for pretty much all databases, and nix-shell allows us to run
custom commands when we enter a shell. This property is called shellHook:
rustPlatform.buildRustPackage rec {
  name = "my-project-${version}";
  # ...
  shellHook = ''
    ${couchdb}/bin/couchdb -a couchdb.ini &
  '';
}
This would start CouchDB every time we enter our development
environment. And if you’re still using Make to run your build commands, consider
specifying them in your shellHook instead:
shellHook = ''
  function ci {
    cargo build
    cargo test
  }
'';
You can of course use Nix on your continuous integration platform, like Travis,
by setting the script to:
nix-shell default.nix --command ci
By using Nix, the environment on Travis is exactly the same as the one you use
locally. No longer will you have issues because Travis hasn’t updated their
sqlite version in the last 5 years.
Mar 18 2016
This year’s hotness: Hugo. Being the web hipster that I am,
of course I switched. Not that I didn’t have a good reason: I had already
written two or three posts with Middleman, so it felt really old and used.
On a serious note, Middleman does feel a bit limiting when you build more
complex sites with it. But maybe using static site generators for anything other
than simple blogs and documentation is just a bad idea. It’s certainly a lot
better than Jekyll and its awful Liquid templating syntax.