04/1/18

# Iterating with Dates and Time in Julia

`Julia` has good documentation on dealing with dates and times; however, it is often in the context of constructing `Date` and `Time` objects. In this post, I focus on the ability to iterate over dates and times. This is useful in countless applications.

We start off by capturing this moment and moving ahead into the future.

```
julia> this_moment = now()
2018-04-01T23:13:33.437
```

In one hour that will be

```
julia> this_moment + Dates.Hour(1)
2018-04-02T00:13:33.437
```

Notice that `Julia` was clever enough to properly recognize that, exactly one hour later, we will be in another day. Thanks to multiple dispatch, the `DateTime` type supports `TimeType`–`Period` arithmetic out of the box.
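
Since the original timestamps depend on when `now()` runs, here is a self-contained sketch of the same carrying behaviour with a fixed `DateTime` (assuming a modern Julia where the `Dates` stdlib must be loaded explicitly; on 0.6 it lived in `Base`):

```julia
using Dates

t = DateTime(2018, 4, 1, 23, 13, 33)

# Adding one hour carries over into the next day
@assert t + Dates.Hour(1) == DateTime(2018, 4, 2, 0, 13, 33)

# Periods of different types compose freely
@assert t + Dates.Day(1) + Dates.Minute(47) == DateTime(2018, 4, 3, 0, 0, 33)

# Month arithmetic clamps to the last valid day of the month
@assert Date(2018, 1, 31) + Dates.Month(1) == Date(2018, 2, 28)
```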

You can then write a nice `for` loop that does something every four hours for the next two days.

```
julia> for t = this_moment:Dates.Hour(4):this_moment+Dates.Day(2)
           println(t)
           # or something special with that time
       end
2018-04-01T23:13:33.437
2018-04-02T03:13:33.437
2018-04-02T07:13:33.437
2018-04-02T11:13:33.437
2018-04-02T15:13:33.437
2018-04-02T19:13:33.437
2018-04-02T23:13:33.437
2018-04-03T03:13:33.437
2018-04-03T07:13:33.437
2018-04-03T11:13:33.437
2018-04-03T15:13:33.437
2018-04-03T19:13:33.437
2018-04-03T23:13:33.437
```

Often we are not interested in full dates. For example, we might be reading a video file with `VideoIO.jl` and want to grab a frame every 5 seconds. Here we can deal with the simpler `Time` type.

```
julia> video_start = Dates.Time(0, 5, 20)
00:05:20
```

Here we are interested in starting 5 minutes and 20 seconds into the video.
Now we can make a nice loop from start to finish.

```
for t = video_start:Dates.Second(5):video_start+Dates.Hour(2)
    h  = Dates.Hour(t).value
    m  = Dates.Minute(t).value
    s  = Dates.Second(t).value
    ms = Dates.Millisecond(t).value
    # Do something interesting with ffmpeg seek on the video
end
```

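
The component values extracted above are typically reassembled into a seek timestamp. As a sketch, here is a hypothetical `seek_string` helper (my own name, not part of `VideoIO.jl`) that formats a `Time` in the `HH:MM:SS.mmm` form that `ffmpeg`'s `-ss` option accepts:

```julia
using Dates
using Printf

# Hypothetical helper: format a Time as "HH:MM:SS.mmm",
# the form accepted by ffmpeg's -ss seek option.
function seek_string(t::Dates.Time)
    @sprintf("%02d:%02d:%02d.%03d",
             Dates.hour(t), Dates.minute(t),
             Dates.second(t), Dates.millisecond(t))
end

seek_string(Dates.Time(0, 5, 20))  # "00:05:20.000"
```
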
02/9/18

# When Julia is faster than C, digging deeper

In my earlier post I showed an example where `Julia` is significantly faster than `c`. I got an insightful response pointing out that the comparison hinged on the quality of the random number generator.

So I decided to dig deeper. The standard `c` `rand()` is simply not that good. So instead I searched for the fastest Mersenne Twister implementation out there, SFMT, downloaded the latest code, and compiled it in the fastest way for my architecture.

```
/* euler2.c */
#include <stdio.h>      /* printf, NULL */
#include <stdlib.h>     /* srand, rand */
#include "SFMT.h"       /* fast Mersenne Twister */

sfmt_t sfmt;

double r2()
{
    return sfmt_genrand_res53(&sfmt);
}

double euler(long int n)
{
    long int m = 0;
    long int i;
    for (i = 0; i < n; i++) {
        double the_sum = 0;
        while (1) {
            m++;
            the_sum += r2();
            if (the_sum > 1.0) break;
        }
    }
    return (double)m / (double)n;
}

int main()
{
    sfmt_init_gen_rand(&sfmt, 123456);
    printf("Euler : %2.5f\n", euler(1000000000));
    return 0;
}
```

I had to compile with a whole bunch of flags, which I gleaned from `SFMT`'s `Makefile`, to get faster performance.

`gcc -O3 -finline-functions -fomit-frame-pointer -DNDEBUG -fno-strict-aliasing --param max-inline-insns-single=1800  -Wall -std=c99 -msse2 -DHAVE_SSE2 -DSFMT_MEXP=1279 -ISFMT-src-1.5.1 -o eulerfast SFMT.c euler2.c`

And after all that trouble we got the run time down to 18 seconds. Still slower than `Julia`'s 16 seconds.

```
$ time ./eulerfast
Euler : 2.71824

real    0m18.075s
user    0m18.085s
sys     0m0.001s
```

Probably we could do a bit better with more tweaks, and perhaps exceed `Julia`'s performance with some effort. But at that point, I got tired of pushing this further. The thing I love about `Julia` is how well engineered and hassle-free it is. The performance you get out of it, with so little effort, is quite phenomenal. And for basic technical computing needs, like random number generation, you don't have to dig hard for a better library. The “batteries included” choices in `Julia`'s standard library are pretty good.
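
The point generalizes: seeding and reproducing the stdlib generator takes one line (a sketch assuming a modern Julia with the `Random` stdlib; on 0.6 the equivalent was `srand(123456)`):

```julia
using Random

# An explicitly seeded Mersenne Twister (Julia's classic default RNG;
# recent versions default to Xoshiro, but MersenneTwister is still available)
rng = MersenneTwister(123456)
x = rand(rng, 3)
@assert all(0.0 .<= x .< 1.0)

# The same seed reproduces the same stream
@assert rand(MersenneTwister(42), 5) == rand(MersenneTwister(42), 5)
```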

02/8/18

# When Julia is faster than C

On e-day, I came across a cool tweet from Fermat's Library: keep drawing uniform random numbers on $[0,1]$ until their running sum exceeds 1; on average you will need $e$ draws.
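
The quantity in question — the expected number $E[N]$ of uniform draws on $[0,1]$ needed for the running sum to exceed 1 — is exactly $e$. A quick sketch of why: the probability that $n$ draws are not yet enough is

$$P(N > n) = P(U_1 + \cdots + U_n \le 1) = \frac{1}{n!},$$

the volume of the standard $n$-simplex, so

$$E[N] = \sum_{n=0}^{\infty} P(N > n) = \sum_{n=0}^{\infty} \frac{1}{n!} = e.$$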

So I spent a few minutes coding this into `Julia`

```
function euler(n)
    m = 0
    for i = 1:n
        the_sum = 0.0
        while true
            m += 1
            the_sum += rand()
            (the_sum > 1.0) && break
        end
    end
    m/n
end
```

Timing this on my machine, I got

```
julia> @time euler(1000000000)
 15.959913 seconds (5 allocations: 176 bytes)
2.718219862
```

That is a little under 16 seconds.

I tried a `c` implementation

```
#include <stdio.h>      /* printf, NULL */
#include <stdlib.h>     /* srand, rand */
#include <time.h>       /* time */

double r2()
{
    return (double)rand() / (double)((unsigned)RAND_MAX + 1);
}

double euler(long int n)
{
    long int m = 0;
    long int i;
    for (i = 0; i < n; i++) {
        double the_sum = 0;
        while (1) {
            m++;
            the_sum += r2();
            if (the_sum > 1.0) break;
        }
    }
    return (double)m / (double)n;
}

int main()
{
    printf("Euler : %2.5f\n", euler(1000000000));
    return 0;
}
```

and compiling with either `gcc`

`gcc  -Ofast euler.c`

or `clang`

`clang  -Ofast euler.c`

gave a timing twice as long

```
$ time ./a.out
Euler : 2.71829

real    0m36.213s
user    0m36.238s
sys     0m0.004s
```

For the curious, I am using this version of `Julia`:

```
julia> versioninfo()
Julia Version 0.6.3-pre.0
Commit 93168a6 (2017-12-18 07:11 UTC)
Platform Info:
  OS: Linux (x86_64-linux-gnu)
  CPU: Intel(R) Core(TM) i7-4770HQ CPU @ 2.20GHz
  WORD_SIZE: 64
  BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Haswell)
  LAPACK: libopenblas64_
  LIBM: libopenlibm
  LLVM: libLLVM-3.9.1 (ORCJIT, haswell)
```

Now, one should not put too much emphasis on such micro-benchmarks. However, I found this a very curious example where a high-level language like `Julia` can be twice as fast as `c`. The `Julia` language authors must be doing some amazing mojo.

01/11/18

# Factorizations to get a very special number

Mathematical recreation is fun. I came across a nice Mind Your Decisions video and thought it would be cool to use `Julia` to find that special number. Have a look at the video before we continue.

1. `using Primes`
2. ` `
3. `function getBigestFactors{T}(a::T,b::T)`
4. `    c=T()`
5. `    for k in keys(a)`
6. `        c[k]=max(a[k],b[k]) `
7. `    end`
8. `    for k in keys(b)`
9. `        c[k]=max(a[k],b[k]) `
10. `    end`
11. `    c`
12. `end`
13. ` `
14. `superfactors=reduce(getBigestFactors,map(factor,BigInt.(1:100))) `
15. ` `
16. `n=prodfactors(superfactors)`

`Julia` can be amazingly expressive. In line 14 we are getting the prime factors of all the numbers from 1 to 100 using the `map` function. We use the type `BigInt` since we will be generating pretty large numbers; the standard `Int` would simply overflow. We then use `reduce` to merge the factorizations, keeping the largest exponent of each prime across the numbers from 1 to 100.

If you are slightly confused, let's take it a bit slower. First we generate a one-dimensional array of `BigInt`s from 1 to 100.

`BigInt.(1:100)`

This makes use of `Julia`'s very powerful dot broadcasting syntax. We then `map` that array into an array of `Primes.Factorization{BigInt}` through the `factor` function in the `Primes` package. The `Primes.Factorization` type is a subtype of `Associative`. I learned that from reading the implementation.
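
To see those two steps in isolation, here is a small sketch on a shorter range:

```julia
using Primes

# Dot broadcasting converts the whole range element-wise
v = BigInt.(1:5)
@assert eltype(v) == BigInt

# map then factors every element
f = map(factor, v)
@assert f[4][2] == 2   # 4 = 2^2, so the exponent of the prime 2 is 2
@assert f[5][5] == 1   # 5 = 5^1
```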

The cool thing about `Associative` types is that they can be very conveniently accessed. Let's have a closer look.

```
a = factor(24)
println(a)
```

yields

`Primes.Factorization(2=>3,3=>1)`

That is $2^3 \cdot 3$

You can then do the following

`print(a[2])`

yielding

`3`

This means that the number 24 has a 2 raised to the third power as a factor.

Or, be more adventurous and try

`print(a[5])`

which will print

`0`

This makes perfect sense, as $5^0=1$ and `1` is always a factor of any number. We will always get a 0 exponent for any prime that is not a factor of a given number. The `Primes.jl` package authors have made great use of `Julia`'s very powerful type system.

To solve this problem we need to look at the prime factors of all the numbers from 1 to 100 and find the highest exponent of each of those prime factors. To do that we implement the `getBigestFactors` function, which does that for any two prime factorizations. We plug that into `reduce` and et voilà!

The superfactors in line 14 are

`2^6 ⋅ 3^4 ⋅ 5^2 ⋅ 7^2 ⋅ 11 ⋅ 13 ⋅ 17 ⋅ 19 ⋅ 23 ⋅ 29 ⋅ 31 ⋅ 37 ⋅ 41 ⋅ 43 ⋅ 47 ⋅ 53 ⋅ 59 ⋅ 61 ⋅ 67 ⋅ 71 ⋅ 73 ⋅ 79 ⋅ 83 ⋅ 89 ⋅ 97`

Finally we multiply them to get our special number at line 16

`n=69720375229712477164533808935312303556800`
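
Taking the largest exponent of every prime up to 100 is precisely how the least common multiple of $1,\dots,100$ is built, so Base's `lcm` offers an independent check (a sketch of my own, not part of the original code):

```julia
# lcm of 1..100, folded pairwise over BigInts to avoid overflow
n = reduce(lcm, big(1):big(100))
@assert n == big"69720375229712477164533808935312303556800"
```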

Are we done yet?

As the saying goes “there is more than one way to skin a cat”. Some are more elegant than others.
Here is a more elegant and computationally more efficient way.

1. `using Primes`
2. `function getAllFactors{T}(n::T)`
3. `    p_factors=filter(isprime,T.(1:n))`
4. `    exponents=[floor(T,log(p,n)) for p in p_factors]`
5. `    zip(p_factors,exponents) |> Dict |> Primes.Factorization`
6. `end`

Here we just get the prime numbers in the range from 1 to `n` (line 3). Using those primes we then get the highest exponents that will not yield a number outside the range (line 4). Finally, we package everything as a `Primes.Factorization` (line 5).

In `Julia` REPL we get

```
julia> getAllFactors(big"100")
2^6 ⋅ 3^4 ⋅ 5^2 ⋅ 7^2 ⋅ 11 ⋅ 13 ⋅ 17 ⋅ 19 ⋅ 23 ⋅ 29 ⋅ 31 ⋅ 37 ⋅ 41 ⋅ 43 ⋅ 47 ⋅ 53 ⋅ 59 ⋅ 61 ⋅ 67 ⋅ 71 ⋅ 73 ⋅ 79 ⋅ 83 ⋅ 89 ⋅ 97

julia> prodfactors(ans)
69720375229712477164533808935312303556800
```

At that point, you might ask: why stop at 100? `Julia` is lightning fast and powerful. Just for fun, we can do the same for the numbers from 1 to 1000.

```
julia> getAllFactors(big"1000")
2^9 ⋅ 3^6 ⋅ 5^4 ⋅ 7^3 ⋅ 11^2 ⋅ 13^2 ⋅ 17^2 ⋅ 19^2 ⋅ 23^2 ⋅ 29^2 ⋅ 31^2 ⋅ 37 ⋅ 41 ⋅ 43 ⋅ 47 ⋅ 53 ⋅ 59 ⋅ 61 ⋅ 67 ⋅ 71 ⋅ 73 ⋅ 79 ⋅ 83 ⋅ 89 ⋅ 97 ⋅ 101 ⋅ 103 ⋅ 107 ⋅ 109 ⋅ 113 ⋅ 127 ⋅ 131 ⋅ 137 ⋅ 139 ⋅ 149 ⋅ 151 ⋅ 157 ⋅ 163 ⋅ 167 ⋅ 173 ⋅ 179 ⋅ 181 ⋅ 191 ⋅ 193 ⋅ 197 ⋅ 199 ⋅ 211 ⋅ 223 ⋅ 227 ⋅ 229 ⋅ 233 ⋅ 239 ⋅ 241 ⋅ 251 ⋅ 257 ⋅ 263 ⋅ 269 ⋅ 271 ⋅ 277 ⋅ 281 ⋅ 283 ⋅ 293 ⋅ 307 ⋅ 311 ⋅ 313 ⋅ 317 ⋅ 331 ⋅ 337 ⋅ 347 ⋅ 349 ⋅ 353 ⋅ 359 ⋅ 367 ⋅ 373 ⋅ 379 ⋅ 383 ⋅ 389 ⋅ 397 ⋅ 401 ⋅ 409 ⋅ 419 ⋅ 421 ⋅ 431 ⋅ 433 ⋅ 439 ⋅ 443 ⋅ 449 ⋅ 457 ⋅ 461 ⋅ 463 ⋅ 467 ⋅ 479 ⋅ 487 ⋅ 491 ⋅ 499 ⋅ 503 ⋅ 509 ⋅ 521 ⋅ 523 ⋅ 541 ⋅ 547 ⋅ 557 ⋅ 563 ⋅ 569 ⋅ 571 ⋅ 577 ⋅ 587 ⋅ 593 ⋅ 599 ⋅ 601 ⋅ 607 ⋅ 613 ⋅ 617 ⋅ 619 ⋅ 631 ⋅ 641 ⋅ 643 ⋅ 647 ⋅ 653 ⋅ 659 ⋅ 661 ⋅ 673 ⋅ 677 ⋅ 683 ⋅ 691 ⋅ 701 ⋅ 709 ⋅ 719 ⋅ 727 ⋅ 733 ⋅ 739 ⋅ 743 ⋅ 751 ⋅ 757 ⋅ 761 ⋅ 769 ⋅ 773 ⋅ 787 ⋅ 797 ⋅ 809 ⋅ 811 ⋅ 821 ⋅ 823 ⋅ 827 ⋅ 829 ⋅ 839 ⋅ 853 ⋅ 857 ⋅ 859 ⋅ 863 ⋅ 877 ⋅ 881 ⋅ 883 ⋅ 887 ⋅ 907 ⋅ 911 ⋅ 919 ⋅ 929 ⋅ 937 ⋅ 941 ⋅ 947 ⋅ 953 ⋅ 967 ⋅ 971 ⋅ 977 ⋅ 983 ⋅ 991 ⋅ 997

julia> prodfactors(ans)
7128865274665093053166384155714272920668358861885893040452001991154324087581111499476444151913871586911717817019575256512980264067621009251465871004305131072686268143200196609974862745937188343705015434452523739745298963145674982128236956232823794011068809262317708861979540791247754558049326475737829923352751796735248042463638051137034331214781746850878453485678021888075373249921995672056932029099390891687487672697950931603520000
```

Now we get a super special number that is 433 digits long. Let's finish with a `Julia` one-liner to get the number of digits:

```
julia> getAllFactors(big"1000") |> prodfactors |> digits |> length
433
```

09/29/16

# Making your own persistent USB bootable Ubuntu Distro: the easy way

I had a recent Ubuntu install with customizations that I wanted to put on a USB stick, boot into on another machine, and be able to modify persistently by installing new packages, editing files, etc.

## PinguyBuilder to the rescue

So after a number of false starts, I stumbled upon PinguyBuilder. I followed the excellent instructions at maketecheasier, with the slight modification of checking the PinguyBuilder site for the latest version. I used the backup option to bring in all the tweaks in my home directory, as well as my non-`deb` packages. My Ubuntu was a derivative of the standard Xubuntu 16.04 distro.

## Moving the ISO to the USB

The generated `iso` file could then be installed with `usb-creator-gtk` or, if you are working on Windows, Rufus. At this point I could nicely boot from my USB stick. I could not, however, write anything persistently to the USB. If I created a file and then rebooted, it would be gone!

## Making the USB persistent

First I needed to create a `casper-rw` file that would be identified by the boot loader as space to write on. I was happy with 2GB, so this is what I did, adapting the instructions in a StackExchange post:

I created an empty 2GB file

`dd if=/dev/zero of=casper-rw bs=1M count=2048`

You can change `count` to whatever size of persistent storage you want. I then had to format that space into something Linux would be able to read

`mkfs.ext4 -F casper-rw`

The next step was to change the `/boot/grub/grub.cfg` file on the USB to have the following menu entry at the beginning.

```menuentry "Xubuntu in persistent mode" {
linux   /casper/vmlinuz  file=/cdrom/preseed/custom.seed boot=casper persistent iso-scan/filename=\${iso_path} quiet splash --
initrd  /casper/initrd.gz
}```

Finally, I modified `/isolinux/isolinux.cfg` to include the extra label

```
label persistent
menu label live - boot the Live System
kernel /casper/vmlinuz
append  file=/cdrom/preseed/custom.seed boot=casper persistent initrd=/casper/initrd.gz quiet splash --
```

et voilà!

07/11/15

# On terrorism and the hate that follows

I worry less about the damage to life and property from acts of terror than about the waves of hate that soon follow. The damage done by hate is always much worse. Hate brings out the worst in humans. It locks the mind in an inescapable focus on violence. All noble ideas and worthy aspirations fall by the wayside as survival becomes the primary preoccupation. When that happens, I sense that the terrorists have already won.

07/11/15

# Hatred Is More Dangerous than Terrorism

I find myself less worried about the death and destruction caused by terrorist acts than about the waves of hatred that follow them. The damage and loss wrought by the hatred these acts generate is far more dangerous.
Hatred sickens hearts and brings out the ugliest in people.
Hatred imprisons the mind in an endless loop of fixation on violence.
All noble ideas and honorable aspirations collapse when survival becomes the primary preoccupation.
When that happens, I feel that terrorism has achieved its goal.

07/11/15

# Herodotus on Decision Making

A decision was wise, even though it led to disastrous consequences, if the evidence at hand indicated it was the best one to make; and a decision was foolish, even though it led to the happiest possible consequences, if it was unreasonable to expect those consequences.

03/7/14

# Nepomuk Pains

Recently I ran into a nasty situation on the Kubuntu desktop where a host of crucial KDE components ceased to work. Chief amongst them were:

1. No instances of Dolphin would start
2. The kickoff launcher would not work
3. The KDE activity manager was crashing.

I suspected that Nepomuk might be the culprit, and I was right. I stopped it by running

`qdbus org.kde.NepomukServer /nepomukserver quit`

I then wiped out its data by running

`rm -r .kde/share/apps/nepomuk/`

Finally I got it started again by running

`nepomukserver`

…and my problems were solved.

While searching for a solution to this problem, I found people suggesting that one wipe the `~/.kde` folder and start afresh. That sounded too drastic: a great deal of my customizations would have been lost in the process.

01/6/14

# The gods they make

The gods they make capture their fears, their ignorance, and their desire to control.
Their gods share their grotesque outlook, yet they fully capture their aspirations.
Aspirations of dominance, control, and subjugation of others.
Aspirations of meek and mean-spirited human “beings”.
The wretches are blind to human potential!