
When I was an undergraduate at Indiana University, Professor Olaf Sporns mentioned to me a tantalizing paper connecting information theory to the study of consciousness. After attempting several readings I still didn't understand it, but I knew it was awesome.

When I applied to grad school in Computation and Neural Systems under Christof Koch, I asked to focus explicitly on the connection between neural complexity and consciousness, especially per the Information Integration Theory of Consciousness (IIT).

When I began grad school, the paper Integrated Information in Discrete Dynamical Systems, hereafter referred to as ϕ-2008, had just come out. I was a computer-science person, and I was assigned the task of making ϕ-2008 "scale up" and then computing it for simple neural systems like the 302-neuron brain of the model organism C. elegans.

I figured that if I was going to spend all this time making ϕ-2008 scale, I might as well understand, at a more fundamental level, what it actually measures. Unfortunately, the deeper I got into the fundamentals of ϕ-2008, the more dissatisfied I became with it. There was strange behavior where simple systems would yield a surprisingly large ϕ (sometimes larger than the total entropy of the system!). Finally, with a heavy heart, I concluded that instead of computing ϕ-2008 for larger systems, I needed to revise the measure itself.

This led down a loooong road to define a measure of "synergistic mutual information", which was built out of a measure of "intersection information". Unfortunately, by the end of my PhD a fully acceptable intersection information hadn't yet been found, but we could bound it increasingly tightly. At the end of my work I published my revised ϕ measure, which I denoted ψ. I consider my ψ measure to be a much more principled and better-behaved version of ϕ-2008.

Six fine academic papers on IIT and synergy/irreducibility.

- Quantifying synergistic mutual information
- Intersection Information based on Common Randomness
- Quantifying Redundant Information in Predicting a Target Random Variable
- Synergy, Redundancy and Common Information
- Irreducibility is Minimum Synergy Among Parts (this one got me an Erdős number of 3 via Jonathan Harel!)
- A Principled Infotheoretic φ-like Measure

**A few letters explaining (and sometimes criticizing) details of IIT**

- Letter to John Searle (in response to Searle's book review).
- Two letters to Scott Aaronson.

Finally, the PDF of [my thesis] and a full list of publications.

I co-authored (i.e., did several parts of the math) the paper Is Consciousness Computable? Quantifying Integrated Information Using Algorithmic Information Theory. I stand behind the math, but I'm a bit skeptical of the interpretation. Ironically, this paper received much more attention than any other scientific paper I've worked on.

I was a peer reviewer for several academic and popular papers on Integrated Information Theory. The only one that doesn't have anonymous/sealed reviewers is the Scholarpedia article on Integrated Information Theory.

During research I would sometimes encounter obscure information-theory tidbits (e.g., the Dual Total Correlation) that were too little known in the academic community. A bunch of these I added to Wikipedia. I wrote (in optimized C++) the fastest implementation of ϕ-2008 that has ever existed.
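For readers curious about the Dual Total Correlation mentioned above: it measures the "shared" information in a joint distribution as the joint entropy minus each variable's entropy conditioned on all the others. A minimal sketch (function and variable names are my own, not from any published implementation) for a discrete joint distribution given as a NumPy array:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def dual_total_correlation(joint):
    """Dual total correlation D(X1..Xn) = H(X) - sum_i H(X_i | X_rest).

    Since H(X_i | X_rest) = H(X) - H(X_rest), this simplifies to
    sum_i H(X_rest_i) - (n - 1) * H(X), which is what we compute here.
    joint: n-dimensional array of joint probabilities (sums to 1).
    """
    n = joint.ndim
    h_joint = entropy(joint)
    # Entropy of each "all variables except i" marginal.
    h_rests = sum(entropy(joint.sum(axis=i)) for i in range(n))
    return h_rests - (n - 1) * h_joint

# Two perfectly correlated fair bits: D equals their mutual information, 1 bit.
correlated = np.array([[0.5, 0.0], [0.0, 0.5]])
print(dual_total_correlation(correlated))  # -> 1.0

# Two independent fair bits carry no shared information: D = 0.
independent = np.full((2, 2), 0.25)
print(dual_total_correlation(independent))  # -> 0.0
```

For two variables the dual total correlation reduces to the ordinary mutual information; the measure becomes distinct from total correlation only for three or more variables.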