A Personal Journey

Getting all nostalgic.

As my last semester comes to a close, I've found myself in a reflective mood in the brief spaces between the job search, the money-making, and the research wrap-up.

[Photo: Lillian Fryberger]

In particular, I've been thinking about my maternal grandmother a lot. She was pretty awesome. She lived longer than her siblings, the father of her children, her children (my mother and my aunt), and both of their husbands. An extremely kind, hard-working, gentle person, she was at the same time tough and stubborn and resilient. She was, and remains, an inspiration.

I wrote a bit about her in a little reflection piece for one of my classes last year. You can read it here or below. It doesn't do her justice. 

But it was a great experience, trying to sort out what I was learning in that class through the lens of family history. I couldn't help but think of those who raised me as I made my way through the history of computing and semiotics, from Shannon's information theory to extended cognition. Yes, I am a nerd.


Family History

My grandmother Lillian was born in 1918, a handful of years after C. S. Peirce’s death and Alan Turing’s birth. She lived to see the internet, and taught herself HTML code so she could embed MIDIs of her favorite old songs in the body of emails she’d send me. Because she couldn’t see very well, she worked from a WebTV attached to the large screen of her television set. Needless to say, she was amazing.

Before I was born, she worked at Bell and AT&T as a switchboard operator, establishing connections between people by manually moving electrical cords and switches. Claude Shannon’s information theory with its bits, Harry Nyquist’s ideas about digitization and bandwidth, and much more grew from the telephone, which itself built on telegraphy and other inventions before it. The switchboards operated much like the early computers, which required people to manually move parts of room-sized machines to make calculations. Eventually, human-written binary code, electrical signals, and other innovations would replace those mechanical actions, paving the way for the input-output machines Turing modeled to become interactive computing systems built on software better described by other models.

Of course, my grandmother and I first started communicating before I knew language, let alone software. I had no concept of abstractions or the alphabet or other signs and symbols. But as a member of the symbolic species, I had in me a hidden capacity to map meaning, and gradually the syntax and semantics fell into place. I moved from primitive reactions to hunger, cold, and the like to using tools to play and eat. My grasp of icons, indexes, and symbols built up into an understanding, and eventually a verbalization, of the symbolic conventions that English speakers apply. My acquisition of language, or perhaps just the ability to create artifacts, unlocked a capacity to store memory externally and build knowledge.

In the late 1980s, I extended those cognitive processes to computing systems thanks to my dad, who worked for Hewlett-Packard. He was an electrical engineer by trade, trained in the Navy, and went from working on radar oscilloscopes to computer scopes, from punch cards to PalmPilots, from huge pre-network machines to Oracle databases. He brought new HP equipment home to learn how it worked so he could fix it, which meant that as a kid I got to explore computers in the living room and not just at school. I got lost in the DOS prompt, traveled the Oregon Trail, and played my favorite game.

If Alan Kay’s vision had been fully implemented, I might’ve been learning code along with natural languages in elementary school. I might’ve been programming and learning by doing—taking my expanding symbolic capabilities and using them to conduct experiments with my computer as my teacher. Instead, I played Math Blaster and memorized multiplication tables.

But I shouldn’t be greedy. I have inherited a great deal. I’ve moved from holding multiplication tables in my head, to offloading my memories with a pen in notebooks, to exclusively using software on a laptop to store what I want to remember from class. And that software does more than just represent the written word; it is an interface to other symbolic systems as well. I can embed videos and audio into the files, or draw with a tool that still looks like a paintbrush but behaves in an entirely different, digital way. If I need the internet, I simply move to another layer thanks to Kay’s graphical user interfaces, windows, and more.

The concepts we’ve learned are helping me better understand not just the human condition but my own family’s experience. I’ve come to learn that the cathode ray tubes used in old televisions were integral to the creation of technology that would lead to my grandmother’s WebTV, and to many other, more successful computing systems. And that the HTML code my grandmother wrote consisted of symbols that both meant something to her, thanks to a complex meaning-making process, and could be read by computing devices that execute actions.

And there’s so much more in store. We’ve seen human cognition coupled with cars, but not the cognitive offloading that would accompany ubiquitous driverless vehicles. And we’ve seen HTML and hyperlinks and mice, but not widespread use of augmented reality lenses, wearable technology, and other versions of Douglas Engelbart’s vision of extending human intellect.

The curtain is slowly being pulled back on the meaning and complexity of this legacy and possibility. And the whole way, individual humans have been at the center, building on things that came before and finding new ways to expand their symbolic-cognitive processes.