Yet another prime DALL-E 2 prompt:
brilliant cloud of knowledge anaphors, books, tools suspended in a chaotic virtual reality space in the style of tristan eaton, victo ngai, artgerm, rhads, ross draws, cinematic evening golden hour light, wide anamorphic shot
Two life hacks:
- Being really fucking good at something can get you very, very far -- with no other tricks.
- Introduce variance into your life. A high-variance life with good downside protection == more luck.
I think a profound and underrated property of whatever AGI we'll create is that, at least in the bootstrap phase, it will have been built atop the human experience: our writing, our visions, our soundscapes. It will be a beacon that shines our existence into the distance of time.
The billion-dollar idea:
I think large generative models can become much more controllable/predictable with the right interfaces. A generative model is essentially a large database + a really effective search algorithm over it, so the right interface is a good search/navigation interface over its latent space.
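A minimal numpy sketch of the idea (the latent vectors here are random stand-ins for real model embeddings, and nearest/interpolate are just illustrative names): nearest-neighbor lookup is the "search" half, and interpolating between two latents is the simplest "navigation" primitive:
import numpy as np

# Stand-in latent "database": in practice these would be embeddings
# produced by the generative model, not random vectors.
rng = np.random.default_rng(0)
latents = rng.normal(size=(10_000, 512))  # 10k items, 512-dim latents

def nearest(query, k=5):
    # "Search": cosine-similarity nearest neighbors over the latent database.
    q = query / np.linalg.norm(query)
    db = latents / np.linalg.norm(latents, axis=1, keepdims=True)
    return np.argsort(-(db @ q))[:k]

def interpolate(a, b, steps=8):
    # "Navigation": walk in a straight line between two latents to
    # explore the outputs that sit between them.
    ts = np.linspace(0, 1, steps)[:, None]
    return (1 - ts) * latents[a] + ts * latents[b]

print(nearest(latents[42]))        # item 42 is its own nearest neighbor
print(interpolate(42, 137).shape)  # (8, 512)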
Sometimes I feel bottlenecked by I/O (how much I'm reading/writing) and sometimes by data (knowing what to do), but right now I'm feeling severely bottlenecked by compute (just being able to execute on even 10% of the things I want to try prototyping/researching/executing).
I think that's a good thing? But man, an extra brain would be really useful right now! Too many thoughts, not nearly enough brain cells.
The world is adrift between two worldviews:
- To engineer scarcity into everything
- To engineer scarcity out of everything
I think the latter is a much more optimistic mission -- abundance over efficiency.
Downloaded OpenWebText today for some from-scratch language model (pre)training experiments! It's a bunch of small .xz files that unzip to more small .xz files, so I ended up writing a little script to automate all the folder-creating and unzipping, with a nice in-place-updating progress meter in the terminal:
std := import('std')
str := import('str')
fs := import('fs')
fmt := import('fmt')
debug := import('debug')
xzFiles := fs.listFiles('.') |> std.filter(fn(f) f.name |> str.endsWith?('.xz'))
xzFiles |> with std.each() fn(f, i) {
	name := f.name
	dirname := name |> str.trimEnd('_data.xz')
	print('\x1b[0F\x1b[2K\x1b[0G') // erase previous line
	fmt.format('Unzipping {{0}}/{{1}} {{2}}', i, len(xzFiles), f.name) |> print()
	mkdir(dirname) // assume infallible
	evt := exec('tar', ['-xf', name, '-C', dirname], '')
	if evt.status != 0 -> {
		fmt.printf('Error: {{0}}', evt.stderr)
		exit(evt.status)
	}
}

Somewhere between 1B and 5B parameters, transformer-based language models go from interesting to intelligent to insightful. Currently training a 3B model after having worked for a while with a sub-1B one (t5-3b / t5-large) -- the difference is palpable.
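For a rough sense of the scale gap, a quick sketch using the HuggingFace transformers library (assuming torch + transformers are installed; parameter counts are approximate, and t5-3b alone is ~11 GB of weights to download):
# Rough parameter-count comparison of the two checkpoints mentioned above.
from transformers import AutoModelForSeq2SeqLM

for name in ["t5-large", "t5-3b"]:
    model = AutoModelForSeq2SeqLM.from_pretrained(name)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{name}: {n_params / 1e9:.2f}B parameters")  # roughly 0.7B vs. 2.9B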
A good DALL-E 2 prompt, I promise:
Soft, warm-glow holographic reality: a cloud of small lines of neatly organized luminous text filling the space around him like speech bubbles, connecting alternate possibilities in words, floating around a student's head as he stands thinking with hands extended out in a busy but cozy candlelit workshop. Wide shot on Hasselblad Mark II, photographed from behind. Firefly swarm vibes.