Jacob Robinson's Blog, page 20

September 27, 2021

Intersection of Technology and Art

Video games are an interesting subject because they are a business, a technology, and an art all at once. I think tales from the video game world give us an understanding of what technologists don’t understand about art — and what artists don’t understand about technology.

There are, traditionally, five mediums of art: art (duh), music, writing, film, and games. You can technically include other mediums such as cooking and fashion on here, but they’re usually lumped into that first catch-all term. 

The evolution of art is interesting. The first four mediums on this list are indisputably creative endeavors. Certainly you use intermediaries such as publishers and marketers to get from hobby to profession, but the publishers and marketers themselves don’t necessarily influence the art. They influence where the money goes, but not the art. Businessmen have tried to place themselves on the throne of the creative for these mediums, but it has never worked out.

Video games, the newest medium, are… different. You could make video games an art, or you could make them a business. Or you can make them an art and a business. A game like Final Fantasy VII is indisputably art, but a mobile game like Gardenscapes lacks many of the artistic criteria that a game like Final Fantasy VII has. Likewise, there are elements of World of Warcraft that make it an artistic endeavor, but at the end of the day they’re trying to sell you a $15 a month subscription.

Games are also a technology. Art, music, writing, and film are for the most part “technology-complete”. Sure, you can try a gimmick such as making a “4-D” movie, but this doesn’t necessarily add to the experience. A film is a film. Games, on the other hand, went from being played on a physical board, to a personal computer, to a handheld, to your phone, and now to AR/VR. Breakthrough advances in physics, AI, and economics are constantly being applied to video games. Games are not tech complete — they constantly get better as more research is done.

I imagine most people who read this blog are already at the intersection of technology and art, and thus already know where I’m going with this. You can run a games company as a startup, or you can run it as an artistic studio, but you have to understand both. Technologists fail to understand games as art, and artists fail to understand games as technology.

An example of the first case is MMOs. New massively-multiplayer online games often come out touting grand aspirations: user-generated content, incredible new technologies, a never-before-seen experience. And then they flop. They flop because, beyond these grand new achievements, the games feel empty. You can only have so many randomly generated quests and habit-inducing hooks before players begin to catch on. The reason World of Warcraft has been successful for so many years — and the reason Final Fantasy XIV now overtakes it — is that they struck the right balance of art and technology. It should come as no surprise that the narrative quality of these games directly correlated with their subscription numbers. Even if people don’t pay attention to the story, they’re still drawn in by it.

The opposite of this is referred to derogatorily in the gaming world as “interactive movies” or “walking simulators”. Some artists just want to tell a story, or worse: make something that just looks neat. When there’s no game to the video game, people begin to wonder why it’s a game at all. What makes games unique among art mediums is their interactivity, and interactivity does not mean simply walking around a canvas or occasionally pressing a button. It means being fully immersed in the work. Games like Journey and Rez manage to walk this line correctly by providing an artistic experience that is enabled by interactivity. A good heuristic test is whether you could produce essentially the same experience by just making a movie. If the answer is yes, you failed. If the answer is no, you succeeded.

There are some studios that have proven to be the exception — studios that masterfully weave this intersection together. Valve Software began as a group of programmers and scientists, yet embedded their technology into the intricate narratives of Half-Life and Portal. Kojima Productions was primarily composed of creative thinkers, but when they created the Metal Gear Solid series they built a complex and exciting gameplay system that took hours to reach competence and years to master. Those who succeed best in games are people who have a solid understanding of both the left hemisphere and the right. I wonder where else this rule might apply.

Published on September 27, 2021 16:21

September 20, 2021

Mind’s Eye and Creative Application

There have been some fantastic breakthroughs in the research of imagination over the past thirty or so years. In this post, I want to go over some of these breakthroughs, as well as one subject in particular that has piqued my interest.

For many years, we assumed everyone had something referred to as ‘the mind’s eye’ — an ability to visualize an object within your mind, even though that object may not be there in reality. To an extent, this is true — most people do have a mind’s eye. However, what we didn’t realize was that the strength of the mind’s eye varies greatly from person to person. For example, in some people the word “apple” conjures a fuzzy impression of a red object with a green stem. For others, a very realistic rendering of a specific type of apple comes almost instantly. We have even discovered people who do not have a mind’s eye at all — a condition called aphantasia — which appears to have significant cognitive repercussions.

There appear to be two factors which determine a person’s imaginative ability: intensity and variety. Intensity is what I described in the previous example: a fuzzy image versus clear detail. Variety has to do with which senses you can imagine. Virtually all people with a mind’s eye are able to imagine sight, but fewer are able to imagine sound, touch, smell, and taste.

It is also worth noting that memory does not appear to be correlated with imagination. This seems counterintuitive at first — you’d think that if you can imagine all the details of an apple perfectly, you must be remembering it perfectly too. But just because you are imagining a full render of an apple does not mean that the render is accurate to the real apple. It just means you can imagine it very well. A great example of this is the artist Chuck Close, who has issues with imaginative ability yet has a photographic memory.

There are still two major mysteries around the mind’s eye. The first concerns the distribution of ability (if we were to put everyone on a chart, what would the distribution of imaginative ability look like?) and the second concerns its limits (how good can you really get?). I am likely on the upper end of this distribution, given that I can imagine all five senses with high accuracy. But the people above me can imagine even abstractions, getting sensory information out of things like complex mathematical problems. We now believe that geniuses such as Albert Einstein or Nikola Tesla may have had this ability, which would help explain why they were able to grasp such complicated subjects with ease.

This brings me to my personal interest in imagination: the creative application. I am primarily a fiction writer, and I credit a large chunk of my ability to my mind’s eye. I’ve noticed that there are things I do as part of the imaginative process that other people find incomprehensible; perhaps the most notable is active music listening.

As of writing this post, I have only known two or three people (including myself) who have experienced active music listening. If the following sounds familiar to you, feel free to write in the comments:

Active music listening involves the surrendering of oneself to the music. It is usually done while walking around, particularly in a small space (a larger area seems to lower the quality). Nothing else can be done — just walking and listening. It works particularly well for music without lyrics: think soundtracks rather than pop songs. Still, it’s certainly possible to get it with pop songs as well.

When you get your mind into this space, something incredible happens. Entire scenes, roughly 10 to 30 seconds in length, appear in your consciousness. Two aspects of these scenes are special. The first is that they are, to an extent, randomly generated. They aren’t really things you’ve developed beforehand — they’re things the music develops. They aren’t tied to particular songs, either, though sometimes you end up associating a song with a particular scene. The other aspect is that these scenes are vivid. Like, very vivid. Virtually every detail is there, and it’s very easy to run with them afterwards.

I would say, out of all the story ideas I’ve developed, 80% come from active listening, 15% come from dreams, and the remaining 5% come from other processes. It’s a very powerful tool, and it’s why I’m surprised I haven’t heard of it anywhere else. Because of this, I have a few questions for potential readers of this post:

Have you heard of this concept before? If so, do you have any resources/links that people have written about it?

Have you done active music listening before? If not, did you try it after reading this post? If you did try it, were you successful in obtaining the results I do, or did you come up with nothing?

What strange imaginative experiences/abilities do you have that you don’t seem to see written about anywhere else?
Published on September 20, 2021 16:12

September 13, 2021

Leverage and Wealth: A Simplification

I think the idea of the “Age of Leverage” is one that is incredibly important to know. Unfortunately, most people who talk about it use very abstract and pseudo-intellectual terms. I wanted to write this post to explain what leverage is, and why living in the Age of Leverage is so important.

Traditionally, work is linear. That is, with every input, you get an output in return. If I write a blog post, I get one blog post in return. If I manufacture a car, I get one car in return.

This system is fine, but what if we could change it? What if we could get 60 outputs for every input?

This idea of expanding the number of outputs per input is called leverage. The best way to introduce leverage is through the context in which it was originally defined: financial leverage.

Let’s say we take a thousand of our own dollars and put it into a stock. Let’s then say the stock we chose goes up 25 percent. Impressive! Let’s see our total return:

1,000*1.25 = 1,250

It looks like we’ve made a total of 250 bucks in profit. Pretty nice stuff. But what if we used leverage?

In financial terms, leverage, a loan, and debt all refer to the same thing: you’re borrowing money with the intention of paying it back in the future. But why would you want to do this? Well, let’s say that in addition to our own thousand dollars, we borrow an additional sixty thousand as leverage. In other words, our 1-to-1 ratio now becomes the 1-to-60 we talked about before. Once again, let’s assume our stock goes up 25 percent:

(1,000 + 60,000) * 1.25 = 76,250
76,250 – 60,000 = 16,250

After paying back the sixty-thousand-dollar loan, we’re left with 16,250. Our original 250 dollars of profit has become 15,250! This is the power of leverage.
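The arithmetic above reduces to one small formula. Here’s a minimal sketch in Python (the function name `leveraged_equity` is my own label, not standard finance terminology) comparing the unleveraged and leveraged outcomes, including the downside case:

```python
def leveraged_equity(own, borrowed, rate):
    """Money left over after the whole position moves by `rate`
    and the borrowed amount is repaid."""
    return (own + borrowed) * (1 + rate) - borrowed

# Unleveraged: $1,000 of our own money at +25%
print(leveraged_equity(1_000, 0, 0.25))        # 1250.0  (profit: $250)

# Leveraged: $1,000 own + $60,000 borrowed at +25%
print(leveraged_equity(1_000, 60_000, 0.25))   # 16250.0 (profit: $15,250)

# The downside: the same leveraged position at -25%
print(leveraged_equity(1_000, 60_000, -0.25))  # -14250.0 (we now owe money)
```

Note how the borrowed amount multiplies both the gain and the loss: leverage scales the outcome, not just the upside.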

Of course, financial leverage isn’t all sunshine and roses. Markets are volatile, and it’s entirely possible you’ll lose 25 percent instead of gaining it. That’s a pretty bad situation to be in, considering you still have to pay off the debt. But what if we could apply the idea of leverage to less risky work?

After the internet revolution, we could. Let’s say I make a blog post. When it comes out, it might get just 10 views. Not great. However, as time goes on, it may grow to 100, or 1,000, or 10,000 views. And yet, the content never changed. I didn’t write more posts to get more views — the views came from the same amount of input. Just like our previous example.

Where does wealth come in? Well, imagine these singular inputs create enough output to generate an audience of, let’s say, 1,000. Say we decide to make a course based on what we’ve written (that’s pretty popular nowadays, right?) and price it at $250. If our entire audience buys it, that generates $250,000 in revenue. It may not be realistic that everyone purchases the product, but that’s not the point. The point is that we turned one input (the course) into potentially hundreds of thousands of dollars.

Hopefully at this point you see why people are calling this the Age of Leverage. This power to create with very few linear costs allows us, in theory, to expand our profits beyond what could have ever been possible. And the great thing is that this isn’t limited to corporations or the 1% — you could do it, too.

Published on September 13, 2021 16:33

September 6, 2021

Do this, but…

Experts within a field tend to agree with each other on best principles. Sometimes, however, there are exceptions. So when experts disagree, how do you figure out which side to choose?

The example I’ll use in this post is cold-emailing, because it’s one of the more hilariously trivial examples and also the one that inspired this post. There tend to be two equally-sized factions in the cold-email war. One, which we’ll call the pro-emailers, believes that it is always a good opportunity to cold email someone, giving them a genuine note and being sure to always follow up to show that you care. The other side, the anti-emailers, believes that cold emailing ultimately leads to more harm than good: no matter the intention, most of the time you will simply annoy a relatively important person.

Now, there is nuance to this. Anti-emailers probably imagine a cold-emailer as an annoying spambot, and would give more lenience to someone with a personal touch. Pro-emailers, likewise, will agree that a warm email is better than a cold one. But for the most part it seems like these really are two different ideologies, and worse, two ideologies that refuse to interact with one another, making direct comparison more difficult.

With cold emails, and in other cases like it, there are two routes you can take. The first is by creating a natural filter. If you cold email everyone, you’re going to filter out the anti-emailers who will never respond to your messages. Of course, if you’re cold-emailing, you’re probably a pro-emailer and wouldn’t like to talk to an anti-emailer anyway. An anti-emailer might share and laugh at your email over Twitter (as they are prone to do), but this won’t matter because they’re sharing it with other anti-emailers. 

The second is to merge the two strategies together. This isn’t always possible for diametrically-opposed conflicts, but it is possible in the case of cold emailing. For example, you can cold email but not follow up; the follow-up seems to be the main pain point for anti-emailers anyway. A similar example is blaming, which I brought up in Faulty Mental Models. One camp says that you should always take the blame, no excuses. Another says that taking the blame is ultimately harmful, not helpful, particularly for larger mistakes. So the merger of these two ideas is to take the blame in small cases, and in large cases to sweep the blame under the rug (thus ensuring no one gets hurt).

With these two ideas in mind, hopefully you see that even in expert conflict there’s a way to figure things out for yourself. Besides, most of the time what experts say won’t apply to your life anyway.

Published on September 06, 2021 08:56

August 30, 2021

Buying Immortality

Is donor naming an actual valid way of obtaining immortality, or is it mostly just a farce?

In Immortality I made the argument that it is rational to strive towards having yourself known past death. Sure, it is true that Alexander the Great and his mule driver were brought to the same place by death — but the difference is that we don’t know the mule driver’s name. And while the economic benefit of having your name known past your death is small, it does still provide some relief on your deathbed to know that you’ll be remembered long after you’re gone.

So, with that out of the way, we can talk about ways to obtain immortality. The obvious one is doing great things. But could you… well, buy it? Many people’s names are emblazoned on museums, science centers, universities, and parks all over the world. Did they achieve their immortality?

I would argue not. Immortality is much like love or happiness — it may be bought temporarily, but it doesn’t last. And with immortality in particular, not lasting is a pretty major problem.

The issue goes as follows: there is a difference between knowing a name and knowing a person. Most people couldn’t pinpoint a single thing that John Harvard or Leland Stanford actually did. Ask the same question about Julius Caesar or Martin Luther King and the response changes dramatically. And while it’s true many of these famous men (and women) named things after themselves, if we just knew Alexander the Great through all the cities named Alexandria we probably wouldn’t think much of him. It is a fickle, surface-level immortality. 

Fickle immortality is no immortality, just like fickle love is no love. Because of this, I would argue that the only way to achieve immortality is by doing great things — perhaps not conquering countries (not a great look in 2021), but making great works of art, fighting for change, and living by your principles. Do that, and immortality will come to you naturally.

Published on August 30, 2021 07:27

August 23, 2021

My 5 Tips Towards Reading Books

Like most of the things I do in my life, my approach to reading books is often seen as counterintuitive and sometimes bizarre. Still, if you have trouble building a reading habit and have tried all the other solutions out there, taking a few tips from my playbook may help. Let’s dive in.

Use Book Diversification (aka the 50/50 Rule)

Typically when you see pundits talk about their favorite books online, they go in one of two directions: a big focus on non-fiction (typically businessmen and thought leaders) or a big focus on fiction (typically creatives and artists). This should give you the hint that there’s a big gap waiting to be exploited by people who read both. I personally use the 50/50 rule — whatever I read, 50% of it must be fiction, and 50% of it must be non-fiction. Within fiction, certain parts must be poetry, prose, short stories; within non-fiction, certain parts must be business, science, biography, etc. This makes sure I’m getting a good helping of everything that’s out there. Remember, specialization is for insects.

Start with the Classics, Work Up from There

But what do you read? Well, I prefer to start with the classics — books that are highly recommended across the field by people who specialize in that branch. For fiction, this is easy — there’s an entire genre of fiction just called classics! For non-fiction, I use aggregators like Read This Twice to find books that are recommended by a wide variety of folks. I typically don’t trust individual recommendations, since those are usually driven by individual taste, though if I recognize that the person and I have a lot in common in book taste, I may pay closer attention.

Now, here’s a little secret I’ll let you in on: a lot of classics you won’t like. Just because a book is a classic doesn’t automatically mean that everyone likes it — it just means that a lot of people do. What focusing on classics does is filter out a lot of the not-so-good books, giving you a better chance of enjoying what you read, but not a perfect one. This filtering process is what some people refer to as the Lindy effect, though I believe it isn’t tied solely to time in existence. If you aren’t vibing with a book, feel free to drop it!

The Book Should Make You Pay Attention

Now, here’s a controversial one. It is the belief of most “high class” readers that — particularly when it comes to classics — you should be paying close attention to what is being written. I spurn this idea. I believe instead that a book should make you pay attention, to justify its own worth. If you’re reading a book and begin to doze off, that’s not your fault — it’s the writer’s fault for failing to keep you interested.

But when you doze off, you fail to retain so much information! What will you do about that information you lost?

This is why we read widely. If I had a nickel for every time I read a concept in one book, didn’t get it, then read it in another and suddenly understood, I’d have enough to buy my entire Goodreads backlog. What you’ll notice over time in reading is that the majority of books talk about the same few concepts, over and over again. These concepts are called mental models. If you read enough, you’ll find that you get most of these ideas naturally — and you’ll get them from the writers that drive your attention the most.

Highlighting is the Best Form of Book Notes

There are a lot of theses in the world of book note-taking. Some people say to always have something nearby where you can take close notes on everything you read. If I did this, I would tear my hair out and never end up reading anything. Others say to never take notes and just read naturally. I don’t agree with this either, because there are a lot of connections that never get made if you don’t catalog them for later. So what I’ve developed is something in the middle — the best of both worlds, both simple and smart. The handy-dandy method goes as follows:

1. I read a book. Duh.

2. While reading, something catches my eye: a particularly poignant sentence or paragraph. I get out a highlighter, highlight it, and stick in a book marker for later.

3. After I’ve finished the book, I take all the sentences I’ve highlighted and add them to a list. I call this list the “distillation” of the book.

I experiment a lot with how I do things like reading and working, but this highlighting method has stayed with me for three years now and I have no intention of changing it. It is shocking how effective it is, and especially so given the second step — and the last tip.

Use a Knowledge Base

A knowledge base — or a zettelkasten, if you really want to sound like you know what you’re talking about — is a personal database that accounts for all the information you’ve learned over time. This practice has been especially popular as of late, and I’m surprised it took this long to catch on. Once I get a piece of knowledge into distillation form, I split the distillations off into their respective “topics” in my knowledge base. From there, I use the sentences to build an actual article for that topic. For example, I may read a book on neuroscience. Most of the distillations from that book will go into the neuroscience topic in my knowledge base. I then re-read the sentences, and use the information to build out my own interpretation of the scope of neuroscience. It is a very handy way of making connections in what you read and then returning to those connections later. For many years I just used Google Docs for my knowledge base, but I’ve recently switched over to Obsidian due to its nice interface and linking features.

Anyway, that’s a short summary of my reading process. Unlike other processes I have, this one is pretty much set in stone, so I figured it was worth writing an article on. I’ll see you in the next post.

Published on August 23, 2021 07:16

August 16, 2021

I Am Not Your Friend

There’s something about the internet that’s changed our relationships with others. In particular, it’s changed our relationship with the “celebrity”: an individual who is omnipresent, whom we feel we can relate to. But how has this relationship changed, and why?

(Quick note: This post serves somewhat as a follow up to The Crazies, Part I and Part II. Reading these is not required, but recommended.)

We’ve already talked about how mental illness can extend via the internet. And the idea of mental illness creating parasocial relationships with celebrity figures is nothing new. Every well-known celebrity has had a case of a schizophrenic fan who mistook the television screen and newspaper tabloids for reality.

Except now, things are different.

The change was subtle. Celebrities of old were able to reduce the development of parasociality by creating a veneer of “fake”. Celebrities didn’t seem like people; they seemed like characters, objects in a distant reality. This shielding came with its own disadvantages — the rise of the paparazzi is probably the most notable — but the vast majority of us could understand that figures like Martha Stewart and Oprah aren’t our friends. They are beings unlike us. They are separate.

The truth, of course, is that this is not the reality. Martha Stewart and Oprah are people just like us, who live their own lives and have their own relationships and problems, not too different from ours. But this separateness is what prevents us from seeing them as part of our lives — which they are not. It is a fake truth that upholds the real truth.

The internet changed this dynamic. Musicians and TV stars were replaced by YouTubers and streamers, and the veneer began to fade away. People began to watch creators with only 100,000 followers to their name, who made content not from a lavish studio set but from a living room or home office. These people no longer felt like objects; they felt like us. And while the traditional idea of seeing others as celebrities faded away (most online content creators would scoff at being called a celebrity), the parasociality began to embed itself even deeper into the audience’s consciousness.

Between roughly 2013 and 2016, there was a big outpouring of cash investment into the world of online content creation. Content creators began to sign with large publishing agencies, which gave them access to better equipment and studio spaces. And while these creators assumed that their audiences would love the added production quality, something strange happened.

They didn’t. In fact, they hated it.

What was happening was that these creators were converting themselves, within the public consciousness, from someone who is like us into someone who is like them. In other words, they were becoming celebrities — traditional celebrities, whom we can see as objects rather than people. But it was too late; we already saw them as friends.

The scariest thing about parasociality is that it doesn’t exist only within the crazies. It exists within all of us. When you read the title of this post, “I Am Not Your Friend”, you probably had an emotional reaction to it. It was light, and it was brief, but it was still there. And the reason is that, while you don’t know me (and may not even have read one of my posts before), there is a brain trigger that connects this blog to a set of relationships. I am not a character, but rather a person — a person you do not know and do not see, but whom you can tell is a person, because I am not a celebrity. If you didn’t have any reaction to the title, then the opposite is true: you see this blog not as the creation of a person, but as a blog which exists within a space, that space being your computer screen.

Humans are social creatures. We build relationships easily. That can help us, but if we’re not careful, it can hurt us just as well.

Published on August 16, 2021 06:54

August 9, 2021

A Dialogue on Mindset

A young, college-aged boy was strumming along on a guitar on the corner of an urban street, a frustrated look on his face. An older man sat down next to him. He kept quiet for a bit, but then began to speak.

“Are you having a problem, young man?”

The boy, rather annoyed with the man’s sudden intrusion, wanted to get rid of him as fast as possible. “No, I’m fine.”

He looked at the boy’s guitar. It was worn from use. “I used to be a musician, you know.”

The boy stopped playing. “Oh, really?”

“Sure did. For many years. You want to be a musician, too?”

The boy looked down at the guitar. He shook his head. “No, I decided… I don’t think it’s a good idea.”

“And why is that?”

“Well, I’ve tried for a while. I had a band I played with for about two years, but we never got anywhere. Our music never broke above 10 plays no matter what we did. I’m starting to think what my mom said makes more sense. Just give up on the music thing, keep it as a hobby, and go get a real job, I guess.”

“Is music a hobby to you?”

“Well… no, I guess. I would rather it wouldn’t be. But I don’t think I have much of a choice here.”

“And what makes you think you don’t have a choice?”

“Because it’s not my call to make. There’s so many musicians out there, you know — and only a couple of listeners, and most people just listen to the top artists. If everyone’s just listening to other artists, then there’s pretty much no point to doing it.”

“And is that why you make music? So that other people will listen?”

The boy felt surprised by the man’s rather silly question. “I mean, why else would you make music?”

“You could make music because you enjoy making music.”

“Doesn’t really pay the bills, does it?”

“Do you need it to pay the bills?”

Now the boy was plain confused. “If I’m making music to make music, but I still have to work a job… well that’s just a hobby, isn’t it?”

“It’s a hobby if you just practice and never give. What I’m saying is that you keep publishing the music, same as you did before, but then still have a job that pays the bills.”

The boy vigorously shook his head. “I don’t have time for that.”

“Oh?”

“Yes, I definitely don’t have time for that. Come on — I’ll be studying for classes, having a part-time job, seeing my friends, and making music? I would love to, but something tells me there isn’t enough time in the day for that.” 

“Well, have you tried it before?”

The boy looked up. “Doing all that stuff? No, not really.”

“Then how do you know you don’t have time for it?”

“Well, I know how much time studying is. I know how much time work is. I know how much time making music is. All those combined, I can tell you it’s too long.”

“You know how long each of the three takes separately. But how do you know the three combined are too much?”

At this point, the boy was beginning to lose his patience. “What’s the point you’re trying to get to, anyway?”

“When people have a passion for something, they make do. Things change when they go after their own happiness, but most people are too locked in to even try and see if it’s possible.”

The boy considered this. “Alright, alright. I’ll try it, after my finals. I’ll see if it works.”

The man looked at him skeptically. “If you don’t do it now, will you ever do it?”

“Oh, come on. I’m not going to try some experimental routine while I’ve got important things to handle.”

“Well, are things ever not important? What about the week afterwards, will you say you’re too tired to try it? Or that you have another obligation? When does it end?”

The boy shook his head. “You’re really pushing me on this, aren’t you?”

A car came up to the curb. The man got up. “I can tell you’re annoyed, so I’ll give you one last thing. Good it is to listen to the brain, and good it is to listen to the heart. But better to listen to both at once. People fall because they think the two conflict, that they cannot possibly live in coexistence. But you’d be surprised. Follow the heart, and let the brain open the path.”

The man left to go to the car. The boy watched the car drive off, then went back to strumming his guitar. 

Published on August 09, 2021 16:08

August 2, 2021

A New Model for Teaching Humanities

The humanities are a complicated subject. They are much vaguer than the sciences, relying more on notions of what things ought to be than on how things actually are. Because of this, they’ve always been a sore spot in traditional modes of learning. Is there a way to improve how we teach them?

The traditional, institutional examination practices revolve around four categories: multiple choice, problem sets, essays, and projects. Multiple choice and problem sets can work for STEM-related subjects because the goal there is to ingrain material that is already known to be true. For the same reason, they don’t work well for the humanities. This leaves us with essays and projects.

Now, this isn’t anything super revolutionary. Essays and projects have long been the foundation of most humanities classes. But I still feel these examinations haven’t been great, and the problem mostly revolves around the content.

When you write an essay, you’re writing an essay on something you’ve read. If you’re doing a project, it’s a project on something you’ve been assigned. Once again, this logic works perfectly well if we’re talking STEM. But in the humanities, that isn’t really the point. Humanities knowledge doesn’t revolve around whether you “got” The Great Gatsby or whether you can recite Bertrand Russell’s most acclaimed work. It instead revolves around the more ephemeral concepts those works explore: love, loss, society, the individual, and so on. These are the real learning principles of these subjects.

So, how do you test these concepts? Well, we just take a more roundabout method:

1. Take a set of works on a given theme. For literature, works on the death of a loved one. For philosophy, some takes on aesthetics. For history, a few (preferably conflicting) examinations of an event.
2. Have the student read and analyze these works on their own. Don’t bother testing them on their knowledge at this point — it will come out later.
3. At the end of the module, have the student create an essay or project presenting their own examination. Tracing back to the examples: have the student write about the psychology of facing the death of a loved one, about their own views on aesthetics, or about what they believe was the fundamental progression of that historical event. They can cite other authors, but preferably the work is wholly their own ideas.
4. Grade based on cohesion. Does it logically follow? Is the student using the opinions of the other authors without addressing them (a sign they didn’t actually read the content)? Is it reasonably complex? Is it reasonably simple?

When we test people in STEM, it’s to see if they comprehend a subject. If a student claims to know about particle physics, you can give them a problem set of questions related to particle physics, and they should be able to answer. The difference is that, with particle physics, there is a correct answer. There is no correct answer in literature, or art, or philosophy, or music, etc. etc. You can only test a student based on how well they can do it themselves. 

It also becomes hard to test because it’s subjective. We are naturally biased towards what we think is a good take, and what we think is a bad take. Humanities material is, almost exclusively, made up of takes. For this there’s no great answer. But at least for now we can get students to think about these things for themselves.

Published on August 02, 2021 16:50

July 26, 2021

The Experimental Method of Learning

I’ve always had trouble with the traditional methods of learning: classroom approaches, study groups, and intensive memorization never really did much for me. But now I feel like I’m beginning to settle into a method that may work for me, and I’m interested in sharing it with all of you.

I call this strategy the experimental method of learning. That might be strange coming from a man who abhors doing research or experiments of any kind, but stay with me here. 

I don’t think this method will work for *every* skill — things that are more theoretical, like quantum physics, and skills that need advanced equipment, like medicine, might not work here. You could modify the strategy slightly to get it to work, but I haven’t tried. What I do know is that this strategy works for language learning, creative skills (art, music, writing), business skills (finance, marketing), and probably most forms of engineering or construction. You’ll see why when I explain.

The example I’ll be using throughout is computer programming, a skill I’d wanted to learn for a long time and spent years figuring out before I finally found a groove. The main pain point I had with programming was that the lessons you’d get on something like Codecademy or FreeCodeCamp were so starkly different from reality. On these websites, you’d get a module like “Python Lists”, the little guide would tell you to add something to the coding area, and you’d do it, submit your progress, and get it either right or wrong. You repeat this ad nauseam 100+ times and all of a sudden you’re a certified Python beginner. Seems simple enough.

Then you try doing the same thing in an actual coding environment. You get on VS Code and make a Hello World program. You click run and nothing happens. You check the debugger.

ERROR – PYTHON NOT FOUND

Oh, I guess that makes sense. Python 3 doesn’t just grow on trees, you know. Better install it.

So you go to the Python site and install it on your computer. Reopen VS Code, try the whole thing again.

ERROR – PYTHON NOT FOUND

You sure about that, smart-ass? It’s right there, right in the folder it was installed to. Well, as it turns out, that’s not quite the folder VS Code reads Python from. You find out you have better luck installing the VS Code Python plug-in. You try one more time.

ERROR – PACKAGE [Pick your poison] IS NOT FOUND

Alright, you know where I’m going with this. The reality is that programming is probably 1% syntax (what Codecademy teaches you) and 99% installing libraries, learning APIs, and trying to figure out how the hell to connect these paths back to your IDE. But you would have never known that if you had followed the course formula.
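For what it’s worth, that “Python not found” dance usually comes down to PATH. Here’s a minimal diagnostic sketch, assuming a Unix-like shell (the exact error text above is from my session, not something these commands reproduce):

```shell
# Check whether any Python interpreter is actually on PATH.
command -v python3 || echo "python3 is not on PATH"

# If it is, confirm which version would actually run.
python3 --version 2>/dev/null || echo "no runnable python3 found"

# Note: VS Code picks its interpreter separately from your shell PATH.
# The plug-in's "Python: Select Interpreter" command points it at the
# install you actually want.
```
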

So, how do you get around this? Enter the experimental method:

1. First, learn the basics. For programming, it’s the syntax of the language you’re trying to use. For music, it’s basic theory (scales and chords and such). For art, it’s understanding deconstruction, lighting, proportions, etc.
2. Second, experiment. Figure out something you want to make with your skill — for me, it was a simple console-based trading game. Read up online on how to use the component parts, then put those components together to create your project. This is most of the work in learning.
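To make step two concrete, here’s a minimal sketch of the kind of console trading game I mean. The goods, prices, and function names are all invented for illustration; a real version would wrap these functions in an input loop:

```python
import random

# Invented starting price table for the sketch.
PRICES = {"grain": 10, "iron": 25, "silk": 60}

def fluctuate(prices, rng):
    """Return a new price table with each price nudged by up to +/-20%."""
    return {g: max(1, round(p * rng.uniform(0.8, 1.2))) for g, p in prices.items()}

def buy(cash, inventory, good, qty, prices):
    """Buy qty of good; mutate inventory and return remaining cash."""
    cost = prices[good] * qty
    if cost > cash:
        raise ValueError("not enough cash")
    inventory[good] = inventory.get(good, 0) + qty
    return cash - cost

def sell(cash, inventory, good, qty, prices):
    """Sell qty of good; mutate inventory and return new cash total."""
    if inventory.get(good, 0) < qty:
        raise ValueError("not enough stock")
    inventory[good] -= qty
    return cash + prices[good] * qty

if __name__ == "__main__":
    rng = random.Random()
    cash, inventory = 100, {}
    cash = buy(cash, inventory, "grain", 5, PRICES)   # 5 * 10 = 50 spent
    prices = fluctuate(PRICES, rng)                   # market moves
    cash = sell(cash, inventory, "grain", 5, prices)  # sell at the new price
    print(f"ending cash: {cash}")
```

Even a toy like this forces you through the real work: installing an interpreter, wiring up your editor, and debugging your own logic rather than filling in a course’s blank.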

Experimenting helps you understand that learning is often not as straightforward as it looks. It is challenging, for sure — I probably spent hours trying to get basic Python libraries to run in VS Code — but the learning outcomes are more effective, and at the same time you’re learning by doing things you want to do.

If you’re having trouble figuring out a good learning paradigm for yourself, I would suggest trying this one out. It might be the one you’re looking for.

Published on July 26, 2021 16:03