
I first encountered Julia at StrangeLoop a few years ago. I haven't spent much time with it, but I intend to look into Julia more. Do you think there is a possibility that numerical code that would have been written in Fortran in the past might now be coded, or recoded, in Julia? Do Julia macros affect performance in a noticeable way?

Geoff

On Wed, Apr 15, 2020, at 14:05, Michael Bukatin wrote:
I have been looking at Julia and its ecosystem over the last few months, and it has been a very interesting experience. The language has full-strength Lisp macros and full-strength multiple dispatch (so it is a full Lisp), while the user-facing syntax is not Lisp-like:
https://docs.julialang.org/en/v1/manual/metaprogramming/
So it's both a Lisp and a non-Lisp.
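For example, here is a minimal sketch of both features (these toy definitions are only for illustration, they are not from any library):

    # A macro receives its argument as an unevaluated expression (Expr),
    # which it can inspect and transform, exactly as in Lisp.
    macro show_expr(e)
        println("expression: ", e)   # runs at macro-expansion time
        return esc(e)                # splice the original code back in
    end

    # Multiple dispatch: the method is chosen by the types of *all* arguments.
    combine(a::Number, b::Number) = a + b
    combine(a::String, b::String) = a * b   # "*" concatenates strings in Julia

    @show_expr combine(1, 2)    # prints the expression, then returns 3
    combine("foo", "bar")       # returns "foobar"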
Generally speaking, the people who create Julia are consistently trying to "have their cake and eat it too", along multiple dimensions. Another axis is that the language is more flexible than Python, yet as fast as C. This is achieved via very tasteful language design (the compiler is a normal, competent LLVM-based compiler without miracles; it does not play any special role in this combination of expressiveness and speed).
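To illustrate (this is my own toy example, not from the Julia docs): a function written as loosely as in Python still gets compiled to specialized machine code for each concrete argument type it is called with:

    # A generic, untyped function, written as loosely as in Python...
    function mysum(xs)
        s = zero(eltype(xs))
        for x in xs
            s += x
        end
        return s
    end

    # ...but the compiler generates a separate specialized native method
    # for each concrete argument type:
    mysum(rand(10^6))    # specialized for Vector{Float64}
    mysum(1:100)         # specialized for UnitRange{Int64}

    # @code_native mysum(rand(10)) shows the generated machine code.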
I have also found Julia open-source software on GitHub unusually readable and easy to understand (it also tends to be very compact).
The reason I was looking at Julia is that I have been working on an unusually flexible class of machine learning problems (a class of neural machines based on processing complicated structured data streams and on using "flexible tensors" with tree-shaped indices; one can do much more with these neural machines than with traditional neural nets).
Even the most flexible Python frameworks, such as PyTorch, are too rigid for this class of problems, because they are oriented towards fixed multidimensional arrays ("tensors").
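To give a rough idea of what I mean by "flexible tensors" (the representation below is only a simplified illustrative sketch): think of a tree of dictionaries with numeric leaves, with elementwise operations defined recursively via multiple dispatch:

    # Sketch: a "flexible tensor" as a tree of Dicts with numeric leaves.
    tmap(f, x::Number) = f(x)
    tmap(f, x::Dict)   = Dict(k => tmap(f, v) for (k, v) in x)

    tzip(f, x::Number, y::Number) = f(x, y)
    tzip(f, x::Dict,   y::Dict)   = Dict(k => tzip(f, x[k], y[k]) for k in keys(x))

    v = Dict("a" => 1.0, "b" => Dict("c" => 2.0, "d" => 3.0))
    w = tmap(x -> 2x, v)    # scale every leaf
    u = tzip(+, v, w)       # add two trees leafwise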
In this sense, the Julia ecosystem seems to offer a perfect fit: the Flux machine learning framework, which is specifically oriented towards maximal flexibility and away from fixed "tensors", while still being focused on high performance:
https://github.com/FluxML/Flux.jl
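For instance, gradients in Flux flow through ordinary Julia code, not only through fixed tensor operations. A minimal sketch using the current Flux API (details such as the field name W may differ across Flux versions):

    using Flux

    model = Dense(3, 2)               # a small linear layer
    x = rand(Float32, 3)

    loss() = sum(abs2, model(x))      # any Julia function of the parameters
    ps = Flux.params(model)           # implicit parameter collection
    gs = Flux.gradient(loss, ps)      # gradients computed by Zygote

    gs[model.W]                       # gradient w.r.t. the weight matrix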
So far I have mostly been reading other people's code and doing small-scale explorations of my own (creating publicly available notes in the process): https://github.com/anhinga/2020-julia-drafts
I think that what I am trying to do with Julia Flux should be doable single-handedly (the tools seem to be that good), but I also hope to find collaborators (a small team would be able to move really fast with this).
- Mishka
On Tue, 14 Apr 2020, Jonathan Godbout wrote:
Hey Everyone, I hope you're doing well and staying safe. Sorry for the long wait between messages. As Didier just said, the ELS will be online, yay!
How's everyone doing? Fare, how's the startup? I miss the details.
Jon
-- Geoffrey S. Knauth | http://knauth.org/gsk