Linus's stream

An epiphany I had while preparing for a Metamuse podcast recording and reading through an old Hacker News thread on building my own software ecosystem -- none of this is really about productivity. It's pretty difficult to make the case, even if I can build these things quickly, that my time is not better spent elsewhere.

I think more importantly, building your own tools and software is about changing your relationship with the software that runs your life. Maybe I don't get more work output per hour of time invested, but I trust my tools more, it feels more ergonomic, and there's an intangible benefit to a deeper, more durable relationship I can have with the tools that I have my hands on for so many hours of the day.

As of this week, Oak is at a stage where what I consider "basic" features of the language and surrounding tools are done. These include:

  1. The oak interpreter (obviously)
  2. Syntax highlighting in my editor of choice (Vim)
  3. Automatic code formatting (oak fmt)
  4. Compilation and bundling to single-file programs, especially to JavaScript/the web platform
  5. Basic standard libraries, including a Markdown renderer and date/time utilities

This puts Oak at toolchain feature parity with where I left off on Ink, and makes me comfortable finally building on Oak, rather than simply working on Oak the language and toolchain.

Since I've gotten to this point, I've found myself keeping a terminal tab with an Oak program and a repl open, and tinkering and playing with it from time to time. Mostly writing programs that don't do anything special, like:

std := import('std')

Message := 'Hello, Mac!'
len(Message) |> std.range() |> with std.each() fn(n) {
	std.println(Message |> std.slice(0, n + 1))
}

Nonetheless, I enjoy it and it occasionally leads to interesting hacks. It's making me think about whether being able to play with a tool is a vital aspect of a good tool. Play is where a lot of discovery and divergent thinking happens, and where a tool can really come to feel right in your hands.

I've spent a bunch of the last weekend and some of this week working on oak build --web -- the Oak language toolchain feature that lets an entire Oak program (across multiple files) be cross-compiled into JavaScript, to run in browsers or on Node.js/Deno. I've done this once before for Ink with September, but there are a few improvements in the way I'm doing oak build.

  1. Most obviously, oak build is a command built entirely into the interpreter binary. Even though it's written in Oak (and therefore self-hosted), the whole thing is baked into the oak executable. This means no need to clone a separate repository / project like September. It also means it gets tested with the language's standard library tests, and that I can assume every language user has it.
  2. Speaking of tests... oak build outputs are continuously integrated against the entire Oak standard library test suite, which is something that wasn't possible with September because...
  3. oak build can take a single entry point program file and recursively follow top-level static imports to figure out which other files need to be included in the compiled bundle (including standard libraries for the JS output). No more passing multiple files to september translate. (There's a rough sketch of this import-following idea just after this list.)
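
To make (3) concrete, here's a rough sketch in Go of what recursive import-following might look like. This is not the actual oak build implementation -- that code is self-hosted in Oak -- and the regexp, file names, and helper names here are all hypothetical, but the shape of the idea is the same: start at the entry file and keep following relative imports until the set of files stops growing.

package main

import (
	"fmt"
	"os"
	"path/filepath"
	"regexp"
)

// importRegexp is a simplified, hypothetical matcher for top-level static
// imports of the form: name := import('./path'). Standard library imports
// like import('std') are resolved separately, so they're ignored here.
var importRegexp = regexp.MustCompile(`import\('(\./[^']+)'\)`)

// collect recursively follows relative static imports starting from entry,
// accumulating every file that would need to go into the bundle.
func collect(entry string, seen map[string]bool) error {
	if seen[entry] {
		return nil
	}
	seen[entry] = true

	src, err := os.ReadFile(entry)
	if err != nil {
		return err
	}
	for _, match := range importRegexp.FindAllStringSubmatch(string(src), -1) {
		// Resolve the imported path relative to the importing file.
		dep := filepath.Join(filepath.Dir(entry), match[1]+".oak")
		if err := collect(dep, seen); err != nil {
			return err
		}
	}
	return nil
}

func main() {
	// "main.oak" is a made-up entry point for illustration.
	seen := map[string]bool{}
	if err := collect("main.oak", seen); err != nil {
		fmt.Println("error:", err)
		return
	}
	for file := range seen {
		fmt.Println(file)
	}
}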

As with many other parts of Oak, I'm really appreciating the opportunity to make architectural decisions with experience "from the field," designing with much more foresight than on my first attempt.

Specifically, I'm pretty proud of the fact that oak build's current architecture lets all of the tokenizer, parser, static analyzer, bundler, and some of the code generator share code between compilation targets (Oak and JS), yielding a much more maintainable codebase.
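
To illustrate that shape (and only to illustrate -- oak build itself is written in Oak, and the type and function names below are mine, not the real ones), the design amounts to a single shared front end feeding per-target code generators behind one small interface:

package main

import "fmt"

// Node stands in for a parsed syntax tree node; in this design the tree
// representation is shared by every compilation target.
type Node struct {
	Kind     string
	Children []Node
}

// parse stands in for the shared tokenizer, parser, and static analyzer.
func parse(src string) Node {
	return Node{Kind: "program"}
}

// CodeGenerator is the only target-specific piece: one implementation emits
// formatted Oak, another emits a JavaScript bundle.
type CodeGenerator interface {
	Generate(program Node) string
}

type oakTarget struct{}

func (oakTarget) Generate(program Node) string { return "/* formatted Oak */" }

type jsTarget struct{}

func (jsTarget) Generate(program Node) string { return "/* bundled JavaScript */" }

// build runs the shared front end once, then defers to whichever target
// the user asked for.
func build(src string, target CodeGenerator) string {
	return target.Generate(parse(src))
}

func main() {
	fmt.Println(build("std := import('std')", jsTarget{}))
}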

Knowledge tooling is mostly reified notation, and everything else in service of that notation.

Spent a good half hour reading, re-reading, and thinking about The Immortal by Jorge Luis Borges. As with other works of Borges I've read, I'm taken by his eloquence and depth, and by the layers and connections in the text. Absolutely one I'll be coming back to.

So much of the current blockchain scene seems to harbor both the promises and the risks of the early Web. Decentralization, accessibility, transparency... The Web, like crypto, promised all this and more. And it delivered! But then the world caught up, and as much as the world has been transformed by the Web, the fundamental power structures of society are only marginally improved today from where they were pre-Web. I suspect that in the long term, say in 25 years, crypto will be in a similar place -- technologically capable of those high ideals, but practically limited by social structures and large players.

... which is not a bad thing! It's only natural that long-term progress in society occur in shorter cycles of optimism and realism, unbundling and bundling. But I think it's a good perspective to take as an observer of the next tech boom.

Today I learned that many (most?) ways of notating dates, especially those that inherit conventions from ISO 8601, use what's called a proleptic Gregorian calendar, which simply ignores the fact that different calendars were adopted at different times and projects the current Gregorian calendar infinitely into the past. It's what Oak's datetime library does, for example, but it also seems to be common in RDBMSes and other languages.
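
Go's time package works the same way (its documentation notes that calendrical calculations always assume a Gregorian calendar), which makes the behavior easy to see. When the Gregorian calendar was first adopted in 1582, the day after October 4 was October 15; a proleptic calendar pretends the skipped days existed:

package main

import (
	"fmt"
	"time"
)

func main() {
	// A proleptic Gregorian calendar extends today's rules backward, ignoring
	// the 1582 switchover, so the "skipped" days exist and date arithmetic
	// stays uniform.
	d := time.Date(1582, time.October, 4, 0, 0, 0, 0, time.UTC)
	fmt.Println(d.AddDate(0, 0, 1).Format("2006-01-02")) // 1582-10-05, not 1582-10-15
}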

Once in a while, I come across a bit of short, readable, well-documented and well-explained source code that's delightful to read and teaches me something new. Today, I found one such example in Go's sync package, for sync.Once. Here's a snippet, though you should read the whole (very short) file:

func (o *Once) Do(f func()) {
	// Note: Here is an incorrect implementation of Do:
	//
	//	if atomic.CompareAndSwapUint32(&o.done, 0, 1) {
	//		f()
	//	}
	//
	// Do guarantees that when it returns, f has finished.
	// This implementation would not implement that guarantee:
	// given two simultaneous calls, the winner of the cas would
	// call f, and the second would return immediately, without
	// waiting for the first's call to f to complete.
	// This is why the slow path falls back to a mutex, and why
	// the atomic.StoreUint32 must be delayed until after f returns.

	if atomic.LoadUint32(&o.done) == 0 {
		// Outlined slow-path to allow inlining of the fast-path.
		o.doSlow(f)
	}
}

Quick lessons about the Go compiler's inlining rules, how to use atomic intrinsics, and all in clean, well-explained code. Lovely.
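
For what it's worth, the guarantee that comment describes -- every caller returns only after f has finished -- is what makes the usual lazy-initialization pattern safe. A minimal usage sketch (the config example and names are mine, not from the package docs):

package main

import (
	"fmt"
	"sync"
)

var (
	once   sync.Once
	config map[string]string
)

// getConfig can be called from any number of goroutines; the initializer
// runs exactly once, and every caller returns only after it has finished.
func getConfig() map[string]string {
	once.Do(func() {
		fmt.Println("loading config")
		config = map[string]string{"env": "dev"}
	})
	return config
}

func main() {
	var wg sync.WaitGroup
	for i := 0; i < 5; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			_ = getConfig()
		}()
	}
	wg.Wait()
	fmt.Println(getConfig()["env"])
}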

"Organizing your world's information" > "Organizing the world's information"

Whatever disrupts Google is going to offer not only a smarter way to find information, but a way to find your information, more contextually relevant, from your life, rather than from just the public sphere. Every search should be a personal search.

The two models of web browsers

It seems that there are two possible mental models for thinking about what a web browser is.

The first is where browsers are static places, and web pages come to them. It's a communication model of browsers. The second is where content and web pages/applications are static places in the metaverse of the Web, and browsers transport their users to these web page "places". This is a collaboration/co-habiting model of browsers.

I think the second is strictly superior. It allows browsers to be richer mediums that leverage humans' innate sense of place and space to help us navigate and collaborate more effectively, rather than forcing us to think of browsers as simple communication tools.