### MDW #6

#### by cjcoughlin

After spending a third of my life in the field of Information Technology, I’ve come to realize I’m very jealous of the people who can effortlessly occupy the spaces of both computer science and design. I’ve met many programmers, user interface engineers, database admins and graphic designers; all have been talented in their respective disciplines, but I always envied those who could both design and program efficiently, often at the same time. Unfortunately, I never became much of a serious programmer because math was never a strong subject for me. Needless to say, this assignment about type theory was a tough one to wrap my head around, even with that bit of programming background. I’ll begin with a short explanation of who invented it and what it means with regard to how applications work.

Type theory is a mathematical and logical system put forth by the philosopher and mathematician Bertrand Russell. Russell was a highly esteemed intellectual and prolific author and is considered one of the pioneers of early computer science. He was a quintessential jack-of-all-trades, dabbling in various fields and becoming renowned for his insights into the building blocks of nature and civilization through math and logic. In addition to his pursuits in mathematics and philosophy, Russell was also a historian and conscientious objector, eventually spending time in prison for his public displays of pacifism during the First World War. His contributions to logic include Russell’s paradox, which helped motivate his introduction of type theory:

> Russell’s paradox is based on examples like this: Consider a group of barbers who shave only those men who do not shave themselves. Suppose there is a barber in this collection who does not shave himself; then by the definition of the collection, he must shave himself. But no barber in the collection can shave himself. (If so, he would be a man who shaves men who shave themselves.)
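The contradiction can be made concrete with a tiny sketch (my own hypothetical model, not anything from Russell): we encode the barber’s rule as a predicate and check whether either answer to “does the barber shave himself?” is consistent with it.

```python
# The barber's rule: he shaves exactly those men who do NOT shave
# themselves. Applied to the barber himself, this says
#   "the barber shaves himself" <=> "the barber does not shave himself".
# We test both possible truth values for "the barber shaves himself".
def consistent(barber_shaves_himself: bool) -> bool:
    # The rule, applied to the barber himself.
    return barber_shaves_himself == (not barber_shaves_himself)

# Neither True nor False satisfies the rule, so the list is empty:
print([v for v in (True, False) if consistent(v)])  # → []
```

No assignment of truth values works, which is exactly the paradox: the barber (or the “set of all sets that do not contain themselves”) simply cannot exist under the stated rule.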

Russell’s type theory consists of “any of a class of formal systems, some of which can serve as alternatives to set theory as a foundation for all mathematics.” Since computational language could be described on a very basic level as a series of math problems involving words and numbers, Russell helped set the stage for how future machines would process information and, maybe most importantly, how whole classes of errors could be caught before a program ever runs. Eliminating these errors allows the machine to run efficiently and return valid data; after all, what good is a computational machine if it isn’t correct? One of the most intensive and time-consuming parts of writing code is making sure it works the way it should. Nothing is more frustrating than spending hours working on a component for a website or program, only to run it and immediately see errors and watch the app crash. Type theory helped streamline the error-checking process, making it possible for some programming languages to check types as code is written or compiled, which drastically cuts down on development time and makes life easier for programmers.
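As a modern illustration (my own sketch, using Python’s optional type hints rather than anything Russell himself wrote): the annotations below describe what kinds of values a function accepts and returns, so a static checker such as mypy can flag a mismatch before the program ever runs, instead of letting it crash at runtime.

```python
# Type annotations say this function takes two floats and returns one.
def area(width: float, height: float) -> float:
    """Return the area of a rectangle."""
    return width * height

print(area(3.0, 4.0))  # → 12.0

# A static type checker would reject the call below *before* the
# program runs, instead of failing mid-execution:
# area("3", 4.0)  # error: argument 1 has incompatible type "str"
```

This is the payoff the post describes: the mistake is caught while you are still writing the code, not hours later when the running application falls over.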

As I said, this is a simple way to look at what turned out to be a complex idea, but I think most computer users can appreciate the structure behind what we now take for granted when we run an application or play a game. Russell and his peers helped shape logic problems into manageable solutions decades before the rise of personal computing, which is incredible in and of itself. Type theory may make my brain hurt, but I am definitely happy it exists.