A benchmark of Thai word tokenizers written in various programming languages
The original post was published at https://veer66.wordpress.com/2017/01/19/benchmark-thai-word-tokenizers/ on 2017-01-19.
I wondered about the speed of programs written in different languages. For example, is a program written in Kotlin and run on the JVM slower than one written in Go? Although several benchmarks already exist, this one may still matter, at least to me, because Thai word tokenization is a real task of mine.
So @iporsut and I wrote some programs in different programming languages and tried to optimize them.
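For context, a minimal sketch of what such a tokenizer does, assuming a simple dictionary-based longest-matching strategy; the actual benchmarked programs are the ones linked as "source" below and may use different algorithms (the sample dictionary here is made up for illustration):

```rust
use std::collections::HashSet;

// Greedy longest-matching tokenizer sketch: at each position, take the
// longest dictionary word that matches, falling back to one character.
// This is an illustrative assumption, not the benchmarked implementation.
fn tokenize(text: &str, dict: &HashSet<&str>) -> Vec<String> {
    let chars: Vec<char> = text.chars().collect();
    let mut tokens = Vec::new();
    let mut i = 0;
    while i < chars.len() {
        let mut best = 1; // fall back to a single character if nothing matches
        for len in (1..=chars.len() - i).rev() {
            let candidate: String = chars[i..i + len].iter().collect();
            if dict.contains(candidate.as_str()) {
                best = len;
                break;
            }
        }
        tokens.push(chars[i..i + best].iter().collect());
        i += best;
    }
    tokens
}

fn main() {
    // Tiny sample dictionary: สวัสดี ("hello") and ครับ (polite particle).
    let dict: HashSet<&str> = ["สวัสดี", "ครับ"].iter().copied().collect();
    let tokens = tokenize("สวัสดีครับ", &dict);
    println!("{:?}", tokens); // ["สวัสดี", "ครับ"]
}
```

Thai is written without spaces between words, so the dictionary lookup is what makes the task both necessary and CPU-heavy, which is why it serves well as a benchmark workload.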
I conducted the experiment on my laptop, which has an Intel® Core™ i3-4030U CPU @ 1.90 GHz × 4, over a 20 MB Thai text corpus.
| Lang | 1 (seconds) | 2 (seconds) | 3 (seconds) | AVG (seconds) | Source |
|------|-------------|-------------|-------------|---------------|--------|
| Rust Nightly 2017-01-08 | | | | | source |
| Kotlin 1.0.6 + Clojure 1.8.0 + OpenJDK 1.8 | | | | | source, source |
| Clojure 1.8.0 + Kotlin 1.0.6 + OpenJDK 1.8 | | | | | source |
@iporsut has already written multicore versions, so I may conduct another experiment next month.
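The multicore idea can be sketched as follows, under the assumption that the corpus splits into independently tokenizable chunks (lines here); the `tokenize` stand-in below just splits on spaces, and @iporsut's actual multicore versions may be organized quite differently:

```rust
use std::thread;

// Stand-in tokenizer for the sketch: a real Thai tokenizer would do
// dictionary matching instead of whitespace splitting.
fn tokenize(chunk: &str) -> Vec<String> {
    chunk.split_whitespace().map(str::to_string).collect()
}

// Tokenize each line of the corpus on its own scoped thread, then
// join the handles in order so the output order is deterministic.
fn tokenize_parallel(corpus: &str) -> Vec<String> {
    thread::scope(|s| {
        let handles: Vec<_> = corpus
            .lines()
            .map(|line| s.spawn(move || tokenize(line)))
            .collect();
        handles
            .into_iter()
            .flat_map(|h| h.join().unwrap())
            .collect()
    })
}

fn main() {
    let corpus = "a b\nc d";
    println!("{:?}", tokenize_parallel(corpus)); // ["a", "b", "c", "d"]
}
```

A production version would spawn roughly one worker per core and hand each a large slice of the corpus rather than one thread per line, but the chunk-and-join shape is the same.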