Friday, December 23, 2016

A bit of lambda calculus

Lately I have been messing around with functional programming. I find it extremely pure and scalable.
But I wanted to get back to the core of it and suddenly I found myself in the deep forests of lambda calculus. This article is going to cover what I learned the past few weeks. Let's jump in!


Lambda calculus syntax only has three kinds of terms:
  • variables: x
  • lambda abstractions: λx.t (e.g. the identity function λx.x)
  • applications: (t s)
And that's it. Boom.
You have everything at this point. You have a Turing-complete language.
And that simplicity is one of the most remarkable things about this system.
Unlike other languages, it has a single primitive element: the function. No integers, numbers, strings, characters, booleans, etc. Only functions.
But with this simple concept you are able to express any real-world, general-purpose program. (That's Turing-completeness in a nutshell, stated very informally.)
At this point you are going to ask: how can I implement primitive types such as booleans or numbers with functions? - And that's a very good question.

Church encoding

Don't be afraid of this term, it won't hurt.
Church encoding simply means representing data and operators in the lambda calculus.
From another perspective: Alonzo Church, the man who invented the lambda calculus (and who had a major impact on modern logic and the theory of computation), was not a lunatic when he decided on the core rules of the language. He wanted a general language for an investigation into the foundations of mathematics, so he knew this language should support things like adding two numbers. And - come closer, let me whisper something - it can.
To understand this we have to go a bit abstract.

You are thinking right now: "how could we possibly get the primitive 1 out of only function expressions?". The answer is: we won't.
We are going to go abstract. We are going to call a certain expression 1.
And with this abstraction we can create a rule set where e.g. addition makes sense, so e.g. 1 + 1 = 2. Elementary school flashback, isn't it?
But first, let's look at the most basic primitive element of any programming language: The Booleans. (As I wrote it down it feels like an 80's TV show.)
Think about booleans as a choice.
Let's encode a boolean as a higher-order function of two arguments: if it's true it returns the first argument, and if it's false it returns the second.
true := (λx.(λy.x))
false := (λx.(λy.y))
It makes more sense if we define the basic logical operators as well:
and := (λp.(λq. p q p))
or := (λp.(λq. p p q))
Let's try a simple operation: (and true false):
(and true false) = (λp.(λq. p q p)) (λx.(λy.x)) (λx.(λy.y)) = true false true = (λx.(λy.x)) false true = false = (λx.(λy.y))
BOOM. true AND false = false. Magic.
What is really important here is that the result is a function expression as well.
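If the λ notation feels too abstract, here is a minimal sketch of the same encoding in JavaScript (the names TRUE, FALSE, AND, OR and toBool are just my own illustrative names, not part of the calculus):

// Church booleans as plain JavaScript functions.
const TRUE  = x => y => x; // λx.λy.x - always picks the first argument
const FALSE = x => y => y; // λx.λy.y - always picks the second argument

// and := λp.λq. p q p,   or := λp.λq. p p q
const AND = p => q => p(q)(p);
const OR  = p => q => p(p)(q);

// Helper (outside the calculus) to peek at the result as a JS boolean.
const toBool = b => b(true)(false);

console.log(toBool(AND(TRUE)(FALSE))); // false
console.log(toBool(OR(TRUE)(FALSE)));  // true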

OK. Now we can do some simple logic operations. How about numbers? Addition? Multiplication? ..?
We can do that too! But remember, we only have functions as building blocks, so we are going to mimic numbers with functions (or rather, function compositions).
Let's pretend that the character "1" is not a number but just a character - which is the case in lambda calculus anyway, since we have no primitive data types other than functions.
With this let's define numbers as functions.
The key concept will be the following: if an expression applies a function f to an argument n times, we treat that expression as the representation of the number n.
E.g.: f(f(f(x))) / or λf. λx. f (f (f x)) / means the number 3. And λf. λx. x means the number 0.
0 := (λf. λx. x)
1 := (λf. λx. f x)
2 := (λf. λx. f (f x))
3 := (λf. λx. f (f (f x)))
...
n := (λf. λx. f (f (... (f x) ...)))   - with f applied n times

With this abstraction now we can think about numbers as function expressions.
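To make this a bit more tangible, here is the same idea sketched in JavaScript (ZERO, ONE, TWO, THREE and toNumber are my own illustrative names):

// A Church numeral n takes a function f and a value x, and applies f to x exactly n times.
const ZERO  = f => x => x;            // λf.λx. x
const ONE   = f => x => f(x);         // λf.λx. f x
const TWO   = f => x => f(f(x));      // λf.λx. f (f x)
const THREE = f => x => f(f(f(x)));   // λf.λx. f (f (f x))

// Helper (outside the calculus) to read a numeral back as a JS number:
// apply "add one" to 0 as many times as the numeral dictates.
const toNumber = n => n(k => k + 1)(0);

console.log(toNumber(ZERO));  // 0
console.log(toNumber(THREE)); // 3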
As we did with the logical operations in the previous example, let's create a simple operation on numbers: increment. This is the good ol' fashioned i++; statement. Let's call it succ (as in successor).
succ := (λn. λf. λx. f (n f x))
Let's succ 0 for the sake of simplicity:
succ 0 := (λn. λf. λx. f (n f x))(λf. λx. x)
Basically what this means is: apply f on top of a value that already has f applied to it n times, so in the result f is applied n+1 times. Easy.
succ 0 = (λn. λf. λx. f (n f x)) (λf. λx. x) = (λf. λx. f ((λf_. λx_. x_) f x)) = (λf. λx. f ((λx_. x_) x)) = (λf. λx. f x)
And the result is 1! That's it!
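Translated into the JavaScript sketch above (the names are again mine), succ looks like this:

// succ := λn.λf.λx. f (n f x)
// Apply f once more on top of the n applications the numeral already performs.
const SUCC = n => f => x => f(n(f)(x));

const ZERO = f => x => x;
const toNumber = n => n(k => k + 1)(0);

console.log(toNumber(SUCC(ZERO)));             // 1
console.log(toNumber(SUCC(SUCC(SUCC(ZERO))))); // 3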
You can imagine by now - and believe me - there is a way to "simulate" all the other mathematical operations such as addition, multiplication, exponentiation, etc. If you're interested, you can find out more here.
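Just to give a taste, here is a sketch of the standard Church definitions of addition (plus := λm.λn.λf.λx. m f (n f x)) and multiplication (mult := λm.λn.λf. m (n f)) in the same JavaScript style; the names are again my own:

// plus: apply f n times, then m more times on top of that.
const PLUS = m => n => f => x => m(f)(n(f)(x));

// mult: "apply f n times" is itself applied m times.
const MULT = m => n => f => m(n(f));

const TWO   = f => x => f(f(x));
const THREE = f => x => f(f(f(x)));
const toNumber = n => n(k => k + 1)(0);

console.log(toNumber(PLUS(TWO)(THREE))); // 5
console.log(toNumber(MULT(TWO)(THREE))); // 6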

This "thinking about natural numbers as they were functions" concept is called Church numerals. I skipped this term on purpose: I didn't want to scare you with formal mathematical words. But that's it, now you know what Church encoding and Church numerals are. Congratulations!

Fun fact: the system above only works on natural numbers, since if we tried to go below 0 it would just give us 0 again. But we can of course build a system where negative numbers are also part of the game; I'm just not going to cover that topic in this very article. :)

Here is a very good lambda calculator if you would like to play around with lambda expressions: http://www.cburch.com/lambda/

In summary: I find it extremely interesting and fascinating that we can think about data, operations and almost anything else as composition and representation of functions. Nothing else is needed, just plain, simple functions.

Happy Functional Holidays!

Wednesday, September 28, 2016

Game of life with every pixel on a canvas!

Hello everyone!

It's been a long time since I posted, so here is a new dirty marble of mine: the game of life using every pixel on a canvas!
My first thought was: "this is brilliant!", but eventually it turned out it eats a lot of memory. - But still, it was worth the 2 hours I spent on it!

You can learn more about the "game of life" here: https://en.wikipedia.org/wiki/Conway%27s_Game_of_Life .
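For anyone curious what "every pixel is a cell" can look like in code, here is a rough, illustrative sketch (not the original pen) that assumes a <canvas> element is on the page and uses ImageData:

// Rough sketch: one Game of Life cell per canvas pixel, drawn via ImageData.
const canvas = document.querySelector('canvas');
const ctx = canvas.getContext('2d');
const W = canvas.width, H = canvas.height;

// Current generation: 1 = alive, 0 = dead, randomly seeded.
let cells = new Uint8Array(W * H).map(() => (Math.random() < 0.5 ? 1 : 0));

function step() {
  const next = new Uint8Array(W * H);
  const img = ctx.createImageData(W, H);
  for (let y = 0; y < H; y++) {
    for (let x = 0; x < W; x++) {
      // Count the 8 neighbours, wrapping around at the edges.
      let n = 0;
      for (let dy = -1; dy <= 1; dy++) {
        for (let dx = -1; dx <= 1; dx++) {
          if (dx === 0 && dy === 0) continue;
          n += cells[((y + dy + H) % H) * W + ((x + dx + W) % W)];
        }
      }
      const i = y * W + x;
      next[i] = cells[i] ? (n === 2 || n === 3 ? 1 : 0) : (n === 3 ? 1 : 0);
      // Paint the pixel: black for alive, white for dead.
      const p = i * 4;
      const v = next[i] ? 0 : 255;
      img.data[p] = img.data[p + 1] = img.data[p + 2] = v;
      img.data[p + 3] = 255;
    }
  }
  cells = next;
  ctx.putImageData(img, 0, 0);
  requestAnimationFrame(step);
}
step();

Keeping a full extra generation per frame plus an ImageData buffer is exactly why this gets memory-hungry on big canvases.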


Wednesday, July 6, 2016

Demystifying co

I always wondered why generator functions were included in the JS language, and I still don't fully see where their power lies on their own. However, one of the key reasons I think they are a needed step in the evolution of the language is the use case covered in this very article.
I am not going to talk about generator functions or Promises or the Fetch API in this article, but you can find more information in the links I attached.
The following code snippet is basically what the co package does: it lets you write asynchronous calls in a synchronous style. (The key difference is that co can handle more than just Promises - e.g. thunks - but here we only work with Promises.)
Enough with the chit-chat, let's see some code!
The first thing you should notice is that the code has a synchronous feel even though it does something asynchronous. Witchcraft!
Let's see what is happening here (click on "Next" and "Prev" to follow the steps of the control flow)



See the Pen Demistifying co by Adam Nagy (@nagyadam2092) on CodePen.
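For reference, this is roughly the core idea in a handful of lines: a minimal, Promise-only sketch of a co-like runner (not the actual co source, which also handles thunks, arrays and more, and forwards errors back into the generator):

// Minimal co-like runner: drives a generator that yields Promises.
// (Simplified: rejections are not thrown back into the generator,
// they just reject the outer Promise.)
function run(genFn) {
  const gen = genFn();
  return new Promise((resolve, reject) => {
    function next(value) {
      const result = gen.next(value);
      if (result.done) return resolve(result.value);
      // Whatever was yielded is assumed to be a Promise here.
      Promise.resolve(result.value).then(next, reject);
    }
    next();
  });
}

// Usage: asynchronous steps written in a synchronous style.
run(function* () {
  const one = yield Promise.resolve(1);
  const two = yield Promise.resolve(one + 1);
  return two;
}).then(result => console.log(result)); // 2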
I think this whole topic is really about moving on to the next generation of asynchronous call handling, async-await, and generators are a needed step towards being able to build that kind of behaviour.
BTW it's fun to have a look at what an ES6 transpiler does with this code snippet - go on and try it here: https://babeljs.io/repl/
In summary, I think we achieved a good (not great) way of handling asynchronous calls with a "synchronous-like" feeling.

Friday, June 10, 2016

JavaScript and AngularJS basics

This week I had a presentation in Frankfurt for my LH Systems colleagues about JS and Angular in general.
I really had fun over there since they were such a nice audience. Thanks guys!
Here are the links for the presentations:

Friday, February 26, 2016

Minimalistic Angular minesweeper

It's been a long time, just wanted to show this little guy to you. Nothing special, nothing fancy, just a basic minesweeper.