Email me facts to contribute! Or suggestions on what to link to for the facts I have below!

Here's what I have so far (vaguely in order of increasing obscurity):

- Lambda calculus is just a helpful syntax for writing proofs
- Parallelism is not concurrency
- People have been saying "Single-core scaling/Moore's law is over, the future is parallel programming" since at least the 1980s.
- Functional programming has been the future since at least 1978.
- C's `void` should be called `unit`, but at least they kept the same syntax for applying `f` to `void`/`unit`: `f()`
- Partial evaluation gives you compilers for free
- Exception handlers are just dynamically scoped continuations
- All IO is monadic
- Dynamic languages are static languages
- Logic, programs, and category theory are the computational trinity
- Inheritance was invented to support intrusive lists, and because it simplified GC implementation compared to composition
- call/cc is the computational interpretation of the law of excluded middle.
- Fork is the law of excluded middle, and all continuations are delimited
- Some surprising interpretation of coinduction/codata?
- Some surprising interpretation of effect systems/effect polymorphism?
- Some surprising interpretation of modal types?
- Some surprising interpretation of substructural type systems?
- Network packets/encryption are just values of an extensible type
- Microkernels are an abstraction inversion
- You can do GC without tags
- Tracing GC and refcounting are the same thing
- You can make a compacting memory allocator without moving pointers.
- Primitive datatypes (e.g. product and sum types) are actually highly sophisticated universe-polymorphic type theory constructs
- You can pretty print functions without their source code.
- All computable functions are continuous
- Function equality is decidable (at some types)
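To make the partial-evaluation fact concrete, here is a toy sketch in Python (every name in it is invented for illustration): an interpreter for a tiny expression language, and a specializer that partially evaluates the interpreter with respect to a fixed program, leaving a residual closure that only awaits the runtime input. The residual closure *is* the "compiled" program, which is the flavor of the first Futamura projection.

```python
# Tiny expression language: ("lit", n), ("var",), ("add", e1, e2).
def interpret(expr, x):
    """Plain interpreter: walks the tree on every call."""
    tag = expr[0]
    if tag == "lit":
        return expr[1]
    if tag == "var":
        return x
    if tag == "add":
        return interpret(expr[1], x) + interpret(expr[2], x)
    raise ValueError(f"unknown tag {tag!r}")

def specialize(expr):
    """Partially evaluate `interpret` at a known `expr`: all tree
    dispatch happens now, yielding a residual program (a closure)
    that only takes the dynamic input x."""
    tag = expr[0]
    if tag == "lit":
        n = expr[1]
        return lambda x: n
    if tag == "var":
        return lambda x: x
    if tag == "add":
        f, g = specialize(expr[1]), specialize(expr[2])
        return lambda x: f(x) + g(x)
    raise ValueError(f"unknown tag {tag!r}")

prog = ("add", ("lit", 2), ("var",))
compiled = specialize(prog)          # "compiled" once...
assert compiled(5) == interpret(prog, 5) == 7   # ...agrees with the interpreter
```

The full Futamura projections go further (specializing the specializer itself gives a compiler generator), but the basic move is exactly this: mix static program with dynamic input.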
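The continuity fact can be glimpsed in a few lines of Python (a sketch, not a proof): a computable function that terminates on an infinite stream can only have inspected a finite prefix of it, so its output is determined by that prefix. That is topological continuity in the domain-theoretic sense.

```python
from itertools import count, islice

def head_sum(stream):
    """Total on any stream with at least three elements: the answer
    depends only on a finite prefix, never on the infinite tail."""
    a, b, c = islice(stream, 3)
    return a + b + c

# Works fine on an infinite stream, because only a prefix is consumed.
result = head_sum(count(0))   # inspects 0, 1, 2 and stops
assert result == 3
```

A hypothetical "discontinuous" function, such as one returning whether an infinite stream contains a 7, would have to inspect unboundedly many elements and so could not be computable as a total function.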
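And the last fact above, sketched in Python (names are mine, for illustration): extensional function equality is decidable whenever the domain is finite and enumerable and equality is decidable at the codomain, because you can simply check every point.

```python
def fn_eq(f, g, domain):
    """Decide extensional equality f == g by exhaustively checking
    every point of a finite domain."""
    return all(f(x) == g(x) for x in domain)

BOOL = (False, True)

neg1 = lambda b: not b
neg2 = lambda b: b == False   # a different program, the same function

assert fn_eq(neg1, neg2, BOOL)            # equal at Bool -> Bool
assert not fn_eq(neg1, lambda b: b, BOOL) # distinguishable on some input
```

At infinite types like `Nat -> Nat` this breaks down, which is why function equality is only decidable *at some types*.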