Description
Dyon supports a short For loop for a counter that starts at 0 and is incremented until it is greater than or equal to a pre-evaluated end expression:
for i len(list) { println(list[i]) }
This For loop is approximately 2.9x faster when running on the AST than the equivalent traditional For loop:
n := len(list)
for i := 0; i < n; i += 1 { println(list[i]) }
Inferring range by indexing
The expression len(list) can be inferred from the index list[i] in the body:
for i { println(list[i]) }
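For example, a minimal sketch where the list name and contents are only illustrative:
list := ["a", "b", "c"]
for i { println(list[i]) } // range inferred from list[i]; prints each element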
When nesting loops this way, the indices must be used in the same order as they are declared, for example:
for i, j, k {
    println(list[i][j][k]) // i, j, k must be used in the same order
}
However, as long as an index's range does not depend on the previous indices, you can use them in any order:
sum i, j {
    list[i] - list[j]
}
Specifying a range
With an explicit start index and an exclusive end index:
for i [2, len(list)) { println(list[i]) }
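For instance, with a five-element list (illustrative values), the half-open range visits indices 2, 3 and 4:
list := [10, 20, 30, 40, 50]
for i [2, len(list)) { println(list[i]) } // prints 30, 40 and 50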
Packed loops
When nesting loops of the same kind, you can pack them together by separating the indices with ",":
for i, j, k {
    println(list[i][j][k])
}
You can also write ranges in the packed version, for example for i n, j [i+1, n) { ... }, as sketched below.
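As a sketch, the packed range form can visit every unordered pair of indices once; the list below is only illustrative:
list := [1, 2, 3]
n := len(list)
for i n, j [i+1, n) {
    // visits each index pair (i, j) with i < j exactly once
    println(list[i] + list[j])
}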
Examples
Computing output for a neural network:
fn run__tensor_input(tensor: [[[f64]]], input: [f64]) -> {
    input := input
    // Feed the input through each layer i of the network.
    for i {
        // Replace the input with the activations of layer i:
        // each neuron j computes the sigmoid of its weighted sum.
        input = sift j {
            sigmoid(∑ k {
                tensor[i][j][k] * input[k]
            })
        }
    }
    return clone(input)
}
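The example above calls a sigmoid function that is not shown here; a minimal sketch, assuming Dyon's exp intrinsic, could be:
fn sigmoid(x: f64) -> f64 {
    return 1.0 / (1.0 + exp(-x))
}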
Computing the energy of a system of N physical bodies:
fn energy(bodies: [{}]) -> f64 {
    n := len(bodies)
    return ∑ i n {
        bodies[i].vel · bodies[i].vel * bodies[i].mass / 2.0 -
        bodies[i].mass * ∑ j [i+1, n) {
            bodies[j].mass / |bodies[i].pos - bodies[j].pos|
        }
    }
}
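A hypothetical call could look like the following, assuming each body is an object with pos and vel as 4D vectors and mass as a number (the values are made up):
bodies := [
    {pos: (0, 0, 0), vel: (0, 0, 0), mass: 1.0},
    {pos: (1, 0, 0), vel: (0, 1, 0), mass: 0.5}
]
println(energy(bodies))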
Setting all weights in a neural network to random values:
fn randomize__tensor(mut tensor: [[[f64]]]) {
    for i, j, k {
        tensor[i][j][k] = random()
    }
}
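A hypothetical call, passing the tensor mutably (the shape below is arbitrary):
tensor := [[[0.0, 0.0], [0.0, 0.0]], [[0.0, 0.0], [0.0, 0.0]]]
randomize__tensor(mut tensor)
// tensor now holds random weights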
Motivation
This feature is designed to:
- Reduce typing
- Reduce bugs
- Improve readability of code