Jacob Robinson’s Blog
November 21, 2022
Trends in Mass Murder Techniques
While doing research for my upcoming post on infamous mass shooter Adam Lanza, I noticed something particularly bizarre: mass murder techniques seem to follow historical trends. In this post I go over these historical periods, as well as what might be the reason behind them.
1910s – 1960s: Assassination
Okay, I suppose this isn’t a mass murder technique, but it falls in line with the same sort of idea. In the first half of the 20th century — and, some can argue, most of the time before it — assassination was the way to go. In particular, assassinating a particularly important person to make a statement. This era begins with the assassination of Franz Ferdinand (a politically motivated attack which started one of the biggest wars of our time) and ends with JFK (more a case of mental instability, but shocking nevertheless).
There are a few reasons why assassination might have been the most popular form of shock murder tactic at this point. The first is that the available forms of murder were still not very… well, mass-like. The average person did not have access to bombs, automatic guns, and so on. They had at worst a knife and at best a standard, old-fashioned rifle. Another potential reason (which segues well into the rest of our list) is the state of the news circuit before mass media existed. Radio and TV were just becoming things in this era, and beforehand everything was based on mail delivered via horse-drawn carriage. So, if you wanted to make a statement, the best way wasn’t killing a few randos — it was killing someone important. That way, the news got around quickly even in the old days of snail mail.
1960s – 1980s: Serial Killing
Serial killers, really, have always existed. From Jack the Ripper to Albert Fish, serial killing has been the way to go for the particularly demented. Yet, in the span of three or so decades, the majority of the world’s most famous serial killers hatched their plots. Why?
This, of course, lines up with what I mentioned in the assassination section — mass media. The news circuit is always looking for something particularly evil, disturbing, or racy, and serial killing fits that bill perfectly. So when news circuits got bigger, they began reporting on these crimes much more often. I would say it’s during this time that the “copycat” nature of these mass crimes really got started, thus really driving the sequence of “murder trends”.
There’s also the fact that, following this same logic, mass media made what were once “cheap deaths” more valuable. It’s a lot easier to kill a few random people living in the suburbs than a crowned prince, and if you’re just looking for notoriety then it’s a fine substitution. Perhaps it’s for this reason that, for the first time in history, the assassination was displaced.
1990s – 2000s: Terrorism
From here, things began to go rather quickly.
We were just beginning to get used to the new serial killer persona when, in the early 90s, a new threat began to emerge from religious zealots, whether fringe rebel groups in the Middle East or cultists in Japan. Terrorism, just as a concept, is interesting. Unlike serial killing — which, to be fair, always existed — terrorism is a relatively new idea, and one which in many ways relies entirely on the mass media built up over the previous decades.
The entire idea behind terrorism goes as follows: you are a small group with a fringe belief. You know you are small, outmanned, and don’t live too deeply in people’s heads. You need a way to make your cause known, to instill fear into others, while at the same time dealing with your issues domestically. So, what better way to go about it than to scare a couple of yuppies in a far away place and have them knocking on the door, charging in and wreaking chaos like a bull in a china shop?
Terrorism came out of a few key technological innovations. The first was the invention of the internet, which expanded mass media’s scope even further — for the first time in history, people in America began to really care about what was going on in a small, far-off country like Kuwait. The second was that high-powered weapons like bombs and assault rifles became cheap enough to produce that even small bands of fighters could use them effectively. There’s also the fact that the world became more connected in other ways, such as by airplane — a development which generated perhaps the most notorious terrorist event of all time.
2010s – Present: Mass Shootings
Finally, we reach the modern day. Assassinations rarely occur (and when they do, they are in developing countries and usually just “attempts”). While serial killers still hold a special place as a fictional bogeyman, they for the most part do not come up in the news. And terrorism seems to have fallen out of vogue. So, what has come to replace these methods of mass murder? Well, shootings, of course.
Mass shootings are interesting, because they are in some ways an amalgamation of all the previous trends. They are political like terrorism, they are deranged like serial killing, and they use the same toolkit as assassination. They are also uniquely a problem of the modern day: much of a mass shooting’s potential relies on media spread (where, typically, higher casualties = longer time in the media circuit) as well as the availability of high-powered weapons to the average citizen of a nation.
Many people believe that mass shootings are a uniquely American problem. In pure practicality, they are not — mass shootings are far more frequent and deadly in places like Africa, Southeast Asia, etc. However, in terms of the media circus, it is indeed an American problem. Like terrorism before it, mass shootings are about a small group of people trying to make a statement. That statement ends up feeding into political charges, which throw the nation into uproar. The only difference is that this time it all happens domestically, whereas with terrorism it happened in a far away place and we could do a pretty good job of ignoring it and hoping it would all go away (which, in fact, turned out to be a pretty good strategy). The question of gun rights is unlike the question of Islamic extremism, or capital punishment, or political assassination before it. It is, perhaps, a bit more subjective — something that tugs more at the roots of the American creed than the previous threats did. But this is beginning to get away from the topic of this blog post — let’s go back and round things up.
Is there really a trend?
I have been hinting at this throughout the article — that perhaps my title is not truthful, and there is not really a trend in mass murder in statistical terms. Indeed, I think if you actually counted the number of assassinations, serial killers, terrorist attacks, etc., you would not see any real pattern in the data. But I do believe there is a pattern when it comes to journalistic trends.
This is all rather interesting. We would like to think we do not choose our horrific tragedies the same way we pick our clothes, but the media does seem to pick up macabre events like fashion choices. In a way, journalism and politics feed off each other — gun violence leads to discussion of gun rights, which leads to gun violence coverage, which leads to more gun violence, and so on.
I do not claim to know why these trends exist, nor how to stop them. But I do hope this observation provides some food for thought for anyone interested in exploring these problems.
The post Trends in Mass Murder Techniques appeared first on Jacob Robinson.
November 14, 2022
What Decides Good Art?

Art has always been, of course, a little subjective. But there has always been debate as to where the subjectivity ends and the objectivity begins. Could we figure out what makes an artwork objectively good?
First of all, let’s get some theory out of the way. Most of you already know this, but there is heavy variation in the ratings of any given artwork. In other words, for a single movie you probably get a lot of people rating it one star, a lot of people rating it five stars, and a lot of people somewhere in between. This is pretty different from other “products” like cellphones or cars — usually those are either good or bad. The only other place we see this is taste (so food, drink, etc.).
But what’s interesting is that it’s not entirely variable, and we can tell that’s true because ratings are not uniform. You can look at the IMDb Top 250 and Bottom 250 to see this for yourself. So the question becomes… what factors contribute to some works being “objectively” better than others?
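To make that point concrete, here is a toy sketch (with made-up numbers, not real rating data) contrasting a hypothetical polarizing film with a hypothetical cellphone: the film’s ratings are all over the place while the phone’s cluster tightly, yet both averages still carry a signal.

```python
# Toy sketch with made-up 1-5 star ratings, not real data.
from statistics import mean, stdev

film_ratings = [1, 1, 2, 5, 5, 5, 1, 4, 5, 2]   # polarizing art: huge spread
phone_ratings = [4, 5, 4, 4, 5, 4, 4, 5, 4, 4]  # consumer product: tight cluster

# The film's ratings vary far more, but its mean is still informative.
print(mean(film_ratings), round(stdev(film_ratings), 2))    # 3.1 1.85
print(mean(phone_ratings), round(stdev(phone_ratings), 2))  # 4.3 0.48
```

Even with the huge spread, a widely loved film and a widely panned film would still land at different means — which is exactly the sense in which ratings aren’t uniform.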
Effort
Effort is, of course, the big thing that comes to mind. And this certainly explains away the bottom of the list — most of the “worst” works are simply ones that barely any effort was put into. So if you’ve put a lot of blood, sweat, and tears into a project, you’re already off to a good start.
Still, it’s not a satisfying answer if we want to explain the top bracket results. There are plenty of works out there that a lot of work was put into, and they ended up… mediocre. So we have to keep looking.
Uniqueness
When you look at the stuff that is high effort but still on the merely “okay” end of the spectrum, it’s usually because it’s trite and has been done before. And so it makes sense that the best content is both a) high effort and b) innovative or otherwise unique in some way. When we compare the dataset against these credentials, we get a lot closer to the art universally deemed good. And, sure enough, most traditional definitions of “objective quality” are just some combination of effort and uniqueness. But I think we can go further on two other fronts.
Skill?
Hard work is one thing. It is usually correlated with high quality, but not necessarily the causal factor behind it. In reality, those with low skill and high effort can end up making purely mediocre work, while those with high skill and low effort can make something decently good. And, of course, high skill and high effort make for the best overall content. So I think skill is involved here as well, but I also think it is distinct enough from effort to fit into its own category.
Luck??
So if skill connects with the effort side, luck connects with uniqueness. Regardless of skill and effort levels, unique content can suffer some very adverse consequences depending on where it lands in history. Take Heaven’s Gate, for instance, a film which was once treated as a self-indulgent disaster and has since been reconsidered as an epic masterpiece. For the inverse, look at The Birth of a Nation, a film once treated as an epic masterpiece and since reconsidered as… blatantly racist. You, of course, cannot tell how your work is going to be treated decades after it is released. You never know if people are going to “get it”. You might see bits and pieces here and there — for example, Heaven’s Gate had fans at launch, and there were people back in the 1910s condemning The Birth of a Nation — but you never know when that group will tip one way or another.
So then, what defines objectively good art? A mix of effort and uniqueness for sure, but also a bit of skill and perhaps a dash of luck mixed in.
November 7, 2022
Cyberpunk in 2022

Cyberpunk (the franchise, not the genre) is shaping up to be one of the more interesting AAA stories of recent years. I wanted to write a brief about my own experiences with the series, its most recent successes (and failures), and where things might go from here.
An important note of context to start off: I am writing this post hot on the heels of Cyberpunk: Edgerunners, the first real entry beyond 2077 in the Cyberpunk “cinematic universe” and a resounding all-around success. By the time this article comes out, that hype will likely have died down (this time I purposely wanted to avoid it, for reasons I’ll elaborate on later). The negative side effect is that more information may come out between when I write this and when it gets published. If there’s any key info I’m missing here, it’s because it wasn’t around when I wrote it!
Personal Backstory
I have a (somewhat) interesting history with CD Projekt Red (CDPR for the rest of this post), the developers of Cyberpunk as most of us know it today. Knowing it can help explain why I feel some of the ways I do.
My first CDPR game was The Witcher 2, back on release. I never finished it, but I remember it being a big deal — the first big game from CDPR, really, as the first game in the trilogy was a lot more niche. It was one of the more solid WRPGs of that year, striking a good balance between being accessible to the average RPG player and keeping the difficulty that good old Eastern European RPGs were known for. Of course, the game came out in early 2011. Skyrim came out in November. That settled that.
When The Witcher 3 got announced, it built a lot on the success of 2. For starters, it would be CDPR’s first open world game (the previous ones were nonlinear but not open world). Second, it would be the end of the trilogy — a trilogy where most people only played the second game, but the end nonetheless. It was a game on my radar, but this time it was going up against both Bloodborne and MGSV. At the time, it didn’t seem like it had much of a chance.
Except this time, the comparison was wrong. The Witcher 3 did very well, arguably better than the other two games combined. For MGSV it made sense, given the game’s lukewarm reception… but the success over Bloodborne was genuinely impressive. What happened?
Well, for starters, CDPR earned itself a sterling reputation in the years between 2 and 3. Considered an indie darling (despite being AAA in terms of size), the company garnered a large, diehard fanbase. And, like most large diehard fanbases, they were dedicated to seeing the game succeed.
The Witcher 3, perhaps unsurprisingly given this fact, was released to massive fanfare. I, of course, got curious myself. I bought the game relatively early in its release, for the PS4 (I didn’t have a good enough PC at the time). My initial impressions were, well, not very positive. The game was much bigger, sure — but it mostly consisted of the points of interest already common in pretty much every major open world game. Combine that with a series of pretty bad glitches and bugs, some straight up breaking important questlines. In fact, for a brief period it was so bad that they considered taking the PS4 version of the game down entirely.
Sound familiar?
To say it was as bad as Cyberpunk’s launch would be an overstatement. But if one was clever enough, they could see the writing on the wall: the larger the project CDPR was working on, the more prone it was to bugs. And Cyberpunk promised to be its biggest project yet.
Of course, my complaints about Witcher 3 were drowned out by its overwhelming praise, and CDPR went from being an indie darling to being the indie darling — one that could do no wrong and was focused on giving players what they wanted. So when fans heard that Cyberpunk was on the horizon, they rejoiced as if it were the second coming.
The path to Cyberpunk was a long one. Technically the project was announced before Witcher 3, but at that point it was only a single concept video and a few pieces of art. It remained that way for many years, talked about with only the occasional “When’s Cyberpunk?” post, until finally it was unveiled only a few years before its scheduled release.
When gameplay footage was finally revealed, it actually did pique my interest right from the start. The game seemed to be going for a Fallout/Deus Ex-inspired gameplay system, but with a truly open world. But I wasn’t nearly as excited as… some people. I knew at the end of the day it was the Witcher 3 devs, and I could be getting the same mediocre experience as on my last round trip with them. By this point I had beaten Witcher 3 all the way through, and while I did learn to appreciate it more than on my first attempt, it still did not strike me as the “game of the decade” that CDPR fans espoused it as. In secret — perhaps out of spite — I hoped that Cyberpunk would fail.
There was already some unease pre-release. Reviews for the game came out shockingly late (usually they hit a few weeks in advance; this time it was a few days). The reviews were surprisingly vague, with some even mentioning that they only got to play a small portion of the full experience. Some CDPR antagonists like myself began to smile, though most people kept optimistic.
Then came the launch.
Perhaps there has never been a greater 24-hour shitstorm than the initial release of Cyberpunk onto the masses. It became very clear right from the get-go that the game was bugged to hell and back. For PS4 players, the game was straight up unplayable — causing PlayStation to make the infamous choice of pulling the game from its store entirely. Poor performance, audio desync, AI issues — it was like a theme park of messed up nonsense.
Of course, as the news came in, my secret anti-CDPR bias began to reveal itself. Eager as I was to see the fanbase get their just deserts, I made an — on the surface — rather strange decision. You see, in the time after release, rumors abounded that refunds for the game would be free-flowing and easy to get, much easier than for a traditional release. So, wanting to be in the eye of the storm first hand, I decided to buy the game. Full price. On Xbox One. Yeah, I don’t really understand why I did it either. But it was this bizarre decision on my part that leads us to the rest of this blog post.
When I booted up the game and made my character, I immediately started to laugh at the issues. My frames dropped to 15 FPS in the intro area, a few items were inaccessible, NPCs stood still and did nothing. And so, I continued to play to see what else was wrong. Then I played again. And played some more. And more. Soon my friends began to ask why I was playing the buggy piece of shit so damn much. It was around this point I realized something odd — despite all its flaws, despite all its issues, I enjoyed Cyberpunk more than Witcher 3. In fact, I enjoyed it a lot.
The next few months I spent trying to decipher what it was that pulled me into the game so damn much, all the while getting ever closer to beating it. And at this point, I feel like I’ve come up with a pretty good answer.
Why Cyberpunk Was Better
Let me start off by stating the obvious: I am not trying to excuse the state of the Cyberpunk launch. In fact, it was objectively pretty awful. Worst of all, most of its issues were likely due to investor and management decisions that then trickled down to affect the creatives themselves. Cyberpunk, quite plainly, bit off an order of magnitude more than it could chew. Some of it was obvious — for example, infinitely branching story paths were never going to happen in a million years (just like with every other game that’s promised them). Other parts of the game you can tell fell apart due to scope creep: there are so many strange details of zero import, the map is about 3x too large, and there are a ton of rumored modes that apparently got cut (multiplayer? VR??). All in all it was a simultaneously hilarious and depressing disaster.
So, like I said — the game on launch was bad. There’s no doubt about it. What I’m trying to say here instead is why I enjoyed it despite all those flaws.
I think the biggest thing, especially in comparison with The Witcher, is the world. The world building and storytelling in Cyberpunk is the one thing I have seen nobody but the deepest contrarians shit on, and for good reason.
Cyberpunk’s world is… interesting, in a good way. The best way to explain this is to think about all the other notable fictional worlds out there: Lord of the Rings, Harry Potter, Warcraft, Star Wars, etc. These worlds are, of course, filled with danger. The life expectancy in them, I imagine, is not particularly high. Yet despite this, they are worlds people wish they could live in. People daydream about being a space trader on the Outer Rim, or having a little hobbit-home, or going to Hogwarts.
Cyberpunk, on the other hand, is a world you would never, ever, want to live in.
A world where sex and violence floods the streets, where Type A investment banking crazies rip themselves apart in corporations, where economic inequality is disastrously high, where the only way to become a hero is to kill a bunch of people and then yourself. There is no occupation you would want to be in Cyberpunk. Working in a corporation? Get your life torn apart by your manager. A ripperdoc? Get kidnapped by a gang. A mercenary? Enjoy the 10 minute long lifespan and cyberpsychosis, buddy.
But here’s the thing: in the context of a videogame, it all tends to make a lot of sense. In most modern military FPS games, you kill enough people to break Geneva Conventions several times over. But in Cyberpunk, that’s the rule. Everybody kills that many people! Everybody makes the same crazy decisions! Everybody ignores traffic signs and drives five times over the speed limit!
This is what’s wild and interesting about Night City. In other more reasonable worlds you usually have to control yourself to be immersed. In Cyberpunk, that lack of self control is the immersion. Contrast this with Witcher, which is more or less generic high fantasy. Yes, there is violence and boobies and sex all the same, but it’s no different from what you might get out of a Game of Thrones or Black Company. Cyberpunk’s world, by contrast, felt genuinely exciting.
And of course, beyond the world, there are those little moments. The initial heist in Konpeki Plaza. The absolute insanity that is the Peralez questline. All the companions who, genuinely, feel like companions. And, of course, the ending — I had gotten the good ending on my first playthrough by simply playing the character, and while it felt a little too focused towards certain companions it overall really felt like a complete experience.
So, I came out of Cyberpunk with a bittersweet feeling. On one hand I genuinely enjoyed the game, on the other I — in good faith — could not recommend it to anybody. I needed something else, some sort of excuse to get people interested, something that kept all the good elements of Cyberpunk but put it in a new light, preferably with the assistance of a team that was much more competent than CDPR.
Here comes Edgerunners.
The Edgerunners Angle
Cyberpunk: Edgerunners is the first step in the so-called “Cyberpunk cinematic universe”: a short anime miniseries written by the Cyberpunk team and produced by Trigger (the animation studio behind Kill la Kill, founded by key staff from Gurren Lagann), with music by Silent Hill’s Akira Yamaoka. It was announced relatively soon after the launch of 2077, having been produced mostly in tandem with it.
As soon as I saw the announcement of Edgerunners, I knew this’d be it. Often you get those rare PB&J matchups, like Nier Automata — the combination of Yoko Taro’s writing and Platinum Games’ gameplay — that just feel right. I knew the story was there in Cyberpunk, and I knew Trigger’s art and animation was some of the best I’ve ever seen. But I also knew that 2077 was a disaster and Trigger couldn’t write an engaging plot to save their lives. It felt like a natural combo.
Of course, no one really paid attention during the leadup to Edgerunners. Virtually all the promotional material looked very good, but since everyone was caught up in the disastrous launch and its damage control, the last thing anyone was interested in was a Cyberpunk anime. Hell, even in the weeks before the show launched, all people could talk about were Trigger’s other in-development projects. There was no interest.
Part of me was genuinely concerned. I had become a Cyberpunk believer, and while I was still skeptical of CDPR, I wanted people to become invested in the same world I was. It seemed, for all intents and purposes, that things were going to go out in a puff of smoke — that the show was going to be pretty good, but relegated to an underground hit rather than something more substantive. That wouldn’t have been enough to save Cyberpunk. The series where everything goes out with a bang was instead going to go out with a whimper.
But a few interesting things happened. First of all, the show was good. Very good. Considering what I mentioned before, not entirely surprising. But the second thing — the surprising one — was that it was a huge hit.
Over the span of a week, people became obsessed with Edgerunners. The show’s hashtag trended 24/7 for five days. Tens of thousands of pieces of fan art were created. Most notably, 2077 received enough sales during this time to pass the 20 million lifetime sales mark — making it one of the greatest post-launch sales bounce-backs of any videogame.
So, Edgerunners was a massive, worldwide success. I was happy. CDPR, I presume, was happy. Trigger was happy, though perhaps a little anxious about the fact that a third-party Western title was 10x more popular than any of their original IPs. The question was no longer if Edgerunners would help bring back the Cyberpunk franchise. The question was, instead… would it be enough?
CDPR Under The Trigger
If you were to look under the Edgerunners hashtag during those five days, and filter out all the Rebecca and Lucy fan art, you’d notice an interesting pattern. There was a lot of praise for the show, and a lot of praise for Trigger, but rarely any mention of CDPR. Which was strange given that, beyond being the originators of the franchise, they had a sizable impact on the production of the show itself (all the writing, as I mentioned, was done on the CDPR side). This bias revealed itself fully when a very particular tweet reared its head in the midst of the fervor.
I can no longer find the tweet now that it’s been a while since all this occurred, but I’m sure you can find the evidence with enough digging. It was one of those simple “Cyberpunk is good? Always has been” sorts of things. And while that statement is debatable, the response was certainly vitriolic. Filled with long lists of the launch’s faults, responses saying “you’re a AAA sympathizer”, etc., it became clear that people were certainly still mad at CDPR.
To be fair, Trigger deserves the credit it gets. They were the ones who ultimately executed on CDPR’s vision, who created all the character concepts that fans so adore, and their growth through this project is really something to celebrate. In a lot of ways, Edgerunners feels like Trigger “growing up”: a studio which traditionally focused on (mostly empty) spectacles of action and color now had a framework for bringing a complex story and philosophy into the mix, all while keeping the same spectacle. But they did, for the most part, get handed that framework by CDPR. Really, it’s the collaboration that should be praised first and foremost — not just between two studios with vastly different focuses, but ones in vastly different places. It’s Poland and Japan, for god’s sake! You’ve got language, cultural, and geographic barriers all in one bundle! And they still pulled it off in the end.
But, as usual, I’m not here to make arguments. The simple fact is that Edgerunners was not enough to turn CDPR back into that darling game developer they once were. So, what’s the next step?
Where Do We Go From Here?
Well, the first natural place to go is the game itself. By the time I finished 2077, it was barely on version 1.1 — the Edgerunners update brought it to 1.6. So I, of course, decided to play it again. More specifically, I decided to buy it again, since I hadn’t brought my Xbox to the new apartment and heard the PC version was a whole lot better anyway.
I’m not entirely certain what was a bug fix in the updates and what just naturally runs better on PC versus console. But in my first 10 hours of playing, I immediately noticed a multitude of fixes. Some were sad, like the removal of the weird “rail grinding” glitch that gave you super speed if you sprinted on a fence. But most of them were sighs of relief, like improved audio syncing and fixed model spawning and placement in cutscenes. The loading was also much, much faster — though this might be more of a hardware difference. Overall, 1.6 feels like I’m playing a real game. A Bethesda game, maybe, but a real game nonetheless.
Yet no matter how much CDPR fixes 2077, I don’t think the series’ future lies in that game. I also don’t think it lies in expansion packs like the upcoming Phantom Liberty. Rather, it lies in things similar to Edgerunners — perhaps not a cinematic universe per se, but a series of new, smaller games or stories which take place in the same world.
Unfortunately, as of this point, it doesn’t look like there’s too much steam left in Cyberpunk. As I write this, CDPR just released its “forward-looking” statement on new products in the pipeline. It consists of about 10 new Witcher projects and a single new Cyberpunk game. Looks like the bad guys won this time, boys. But who knows… maybe the future will change again, just like it did with Edgerunners.
October 31, 2022
My Life Principles
A while ago, I kept a running list of principles to guide my life. I recently rediscovered it, and it turns out it aged better than I thought it would. So here it is, open-sourced, for the world to see. Feel free to use it (or not use it) as you wish.
Craft
Work hard. Take everything that you do and use it as a chance to enhance your main goals. Just make sure that it’s fun for you.
Pay close attention to the craft. Focus on creativity and the art form rather than conforming to what is popular in the industry.
Don’t get easily distracted. Temptation is the devil. Remember your goals. Remember why you’re doing all this. It’s a long game.
Keep organized. Get rid of old experiments that don’t work; have everything in its right place.
Focus on wider goals. Don’t spend too much time focusing on school and work when you know that it isn’t really the point.
Generalization over specialization. Specialization is for insects. People are complex, abstract-thinking beings that are meant to have many varying interests. Work around this.
Work fast and break things. If you constantly wait until the right time, until it’s perfect, then it’s never going to be made. Just put it out there. Let things fail.
Be, at heart, a creative. Just doing tech is boring. Just being all business is boring. Find the art in all of it. It will help guide your competitive advantage.
Make the culture, don’t follow it. Build the hierarchy upon which people speak and do things. Develop the ruleset; do not follow someone else’s. If someone does not want to follow, let them go.
Be intensely curious. Follow the things you’re interested in (see below). Constantly be exploring for new ideas, new things to build, etc.
Strictly follow interests. Don’t bother diving into things because they’re “good for your career”. You’ll be able to find a way to build regardless of your interests. You just need to make sure you explore widely.
Mindfulness
Be patient. Life is a marathon, not a sprint. Good things may not come for you for many years. But if you’re consistent, the question doesn’t become if — it becomes when.
Be transparent and calm when things get heated. In a fight, your natural response is to get aggressive (fight) or whimper (flight). Move away from these responses. Instead, keep your cool and smile during a fight. People won’t know how to fight it.
Go at life actively, not passively. Do not slouch off or mind-wander. Be in the present; you do your best work there.
Force yourself to stave off loafing, hunger, and sleep when needed. See Hesse’s Siddhartha. I can think, I can wait, I can fast.
Never take life all too seriously. When it boils down to it, this whole thing is just a game. Try to enjoy it.
Kindness
Always be expressively grateful. People always like it. Just make sure to do it for things you really are grateful about.
Use time wisely but kindly. Family and close friends get it for free. Everything else needs to pay.
Play the hero. It’s dangerous, it’s reckless, it’s necessary. Put your values and the values of others ahead of yourself. You’ll be surprised how far it takes you.
Live without fear, and be a beacon to those who have it. The shoulder to cry on, so to speak. The one person in the room who is fearless. That is heroism.
Take responsibility. Even if it hurts you. Leaving it out in the sun just makes the whole thing worse.
Show love whenever possible. People to an extent live off love, and will show you love themselves if you give it to them. The rule of reciprocity.
Help the weak. Part of being a hero, part of showing love. Help those who can’t help themselves. It may pay you dividends in the future.
Charisma
Master confidence. If you have more confidence than anyone in the room, they’ll follow what you do. No matter how dumb it looks.
Be articulate. Articulation is key to getting people to understand your ideas and making them think you know what you’re talking about.
Speak like a leader. Gather the weight of the room. Lead people to action, if they cannot do it themselves.
Be eccentric yet self-aware. Being eccentric allows you to tap into culture creation. Being self-aware hedges against social status risk.
Be seductive yet loyal. Gather people to your side, but remember the ones that really count.
Speak first. Speaking second is for followers.
Be unafraid to speak up. Most of the reason people speak second is because they’re afraid. Build up a tolerance for fear.
Do not speak out against others, but be honest. Use the praise-criticism-praise paradigm. Make sure people always know the truth; just deliver it softly.
Don’t be an asshole, but let yourself be disliked. Perhaps the most important lesson in this entire list. You are much better off in life steering the people and places you go to towards your beliefs than bending your beliefs to attract people and places.
Do not involve yourself in other people’s drama. It isn’t worth your time. It might be fun to watch, but there are better things to do. Same reason that reality TV is bad for you.
Don’t worry about people — just about the physics. Similar reasoning to the points above. Don’t try to please people, regardless of the position. You’d be surprised how much better it sets you up in the long term.
Use questions and clarifications to understand what other people are saying. The key to getting someone to your side is understanding their starting perspective very, very well.
Power
Be strategic. Think about the long game. See the whole playing field. Think outside the box. You always have some sort of opponent — most of the time it’s yourself.
Care more about the long term. Like I mentioned earlier, life is a marathon, not a sprint.
Do not be easily influenced. Whether it be the pleasures of life or a sleek sales deal, chances are what you’ve been doing for the past few years is better than whatever they have to sell.
Keep controversial opinions hidden. Of course, say them if staying quiet would break one of the previously mentioned principles. But don’t just spout things that are going to make people upset.
The post My Life Principles appeared first on Jacob Robinson.
October 24, 2022
Some Brief Notes on Programming
After having tried and failed to learn how to program for many years, I’ve finally found the learning path that works best for me. I’d like to share that path just in case anyone else out there is feeling the same way.
A note before we start: I am by no means a programming expert. I’ve done some programming (obviously) and I feel pretty comfortable with it now, but I am a far cry from a PhD in Computer Science or a trained programming professional. Rather, the point of this article is to describe programming in a way that made sense to me, and might make sense to others. If you are already a programming expert, or are well on your way to learning programming, you will likely get nothing out of this article. If you want to learn programming but are stuck and have no idea how to start, there might be something for you here.
Another thing worth noting is that most of these ideas are not my own. Since I’m an amateur, the last thing you probably want to hear is me waxing about “programming best principles” or some-such nonsense. Rather, this article is made up of bits and pieces I have heard from experts, compiled into a single cohesive piece. With that out of the way, let’s get started.
Practical

Our first section, Practical, covers how to use programming to get things done. It won’t mention the more esoteric, academic mumbo-jumbo. I think it’s more important to learn how to do stuff first and figure out how it works later.
The idea behind a program is similar to a factory: you take a set of information (input) and shape it into something else (output). A website takes site content and turns it into something readable and accessible via the internet. The Uber app takes all the drivers and riders out there and finds a way for them to come together into a marketplace. A videogame takes assets and compiles them into a living, breathing world.
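To make that input-to-output idea concrete, here’s a tiny sketch in Python (a toy example of my own, not from any real app), where the “factory” is just a function that cleans up a messy title:

```python
# A program is a pipeline: input -> transformation -> output.
def make_headline(raw_title):
    """Input: messy site content. Output: something readable."""
    return raw_title.strip().title()

print(make_headline("  the curse of the conqueror "))
# -> "The Curse Of The Conqueror"
```

The function name and the cleanup steps are invented for illustration; the point is only the shape of the thing: data goes in, transformed data comes out.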
The way that we convert inputs to outputs is via algorithms, the meat and potatoes of programming. Algorithms tell the program what to do with an input in order to make it an output. Usually, algorithms are made up of the following actions: conditionals, loops, and mathematical logic. Okay, really those first two are also mathematical logic, but let me keep things separated to make it easier to understand.
Conditionals are if-then statements. If I get coffee, then I go to work, etc. etc. You can make conditionals very, very complicated. Same goes for loops, which repeat an action while a condition holds (i.e., keep doing something until the conditional becomes false). Mathematical logic, in the sense I’m using it here, is being able to add, subtract, multiply, and divide.
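Here’s a minimal Python sketch (an invented example, with made-up prices) that uses all three building blocks at once: a loop, some arithmetic, and a conditional:

```python
# A tiny algorithm: total up an order, then apply a bulk discount.
prices = [3.50, 4.25, 3.50]   # input

total = 0.0
for price in prices:          # loop: repeat an action for each item
    total += price            # mathematical logic: addition

if total > 10:                # conditional: if-then
    total *= 0.9              # 10% discount

print(total)
```

That’s really all an algorithm is: a recipe built out of these small steps, applied to some input to produce an output.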
An algorithm can take many forms. A single algorithm can take up an entire file, or just be a little chunk of a file (called a “function”). Large amounts of algorithms can also be simply imported into a file, to save you a ton of work from reinventing the wheel. These imports are usually referred to as packages. A big spoiler alert for programming is that 95% of the work is just installing and debugging said packages.
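As a sketch of both ideas in Python: here the averaging logic lives in a function, and the `statistics` module (which ships with Python, standing in for the third-party packages mentioned above) does the actual work so we don’t reinvent the wheel:

```python
import statistics  # an import: someone else's algorithms, ready to use

def average_price(prices):
    """A function: a small, named, reusable chunk of algorithm."""
    return statistics.mean(prices)

print(average_price([3.50, 4.25, 3.50]))  # -> 3.75
```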
Of course, another huge part of this is the inputs and outputs themselves. These are referred to generally as data. Just like with algorithms, there are a lot of pieces that make up data: datatypes, variables, and arrays/dictionaries, to name a few. Datatypes are, admittedly, types of data (a string is a word, an integer is a number, a pointer is… well, you’ll learn it when you’re older). Variables are how we store data, either for use as an input/output, or as an in-between while we’re still using it in our algorithm. And arrays/dictionaries (known more generally as data structures) are ways to store a lot of data in one easily accessible place.
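In Python, those pieces look like this (the names and numbers are made up for illustration):

```python
# Datatypes, variables, and data structures.
name = "Jacob"                   # a variable holding a string (a word)
year = 2022                      # an integer (a whole number)

posts = ["Routine", "Tools"]     # a list (array): ordered storage
word_counts = {"Routine": 800}   # a dictionary: key -> value lookup

posts.append("Fiction")          # data structures grow as you need them
word_counts["Tools"] = 1200

print(posts)                     # -> ['Routine', 'Tools', 'Fiction']
print(word_counts["Tools"])      # -> 1200
```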
So we now know how programming languages work. There is, however, the challenge of learning a specific programming language. The bad news about programming languages is that there are a million of them out there. The good news is that, unlike real languages, there aren’t a whole lot of ways the fundamentals can change from one language to another. Virtually every programming language is made up of the pieces I have described to you. Sure, there can be different focuses in a language — for example, some languages are functional while others are object-oriented — but for our practical purposes, that really doesn’t make much of a difference. Once you learn one language really well, the process becomes easier for each subsequent language.
Mastery

As I have already told you, I am not a programming “master”. In fact, I will never be a programming master, because my use of programming extends solely to making the things I need to make, and no further. But if you are interested in being a programming master, I did not want to leave you out. So here are some pieces of advice I’ve heard will take you down that path:
The first is that the fundamentals which I mentioned above just aren’t going to cut it. On a practical level, we’re focused on getting things done — on a mastery level, we’re focused on getting things done effectively and efficiently. Hack solutions are no longer a priority. And because of that, we have to dive deeper into exactly what’s getting these algorithms and data to run.
Code is, of course, instructions for a computer. But how does a computer work? How do all those circuits, those ones and zeroes, those computer parts all translate into what you’re seeing on the screen? In order to be masters, we have to take away the surface-level approach and look at things more from a bottom-up perspective. Some electrical engineering knowledge is of course involved, though just to the extent that we know precisely how each piece of the computer gets us to where we need to go. When we have that info, we can find little “cheats” in the system that allow our programs to be better and better.
Take the pointers I had mentioned above, for instance. Pointers are things that “point” to a specific piece of memory in which a variable resides. This isn’t really something you’d have to worry about at a practical level but is something that’s important in mastery.
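Python hides raw pointers from you, but you can glimpse the underlying idea through its object references: two variable names can refer to the same piece of memory, so a change made through one is visible through the other. A rough sketch:

```python
a = [1, 2, 3]
b = a            # no copy is made: b refers to the same object as a
b.append(4)      # mutate the list through one name...

print(a)         # -> [1, 2, 3, 4]: ...and the other name sees it
print(a is b)    # -> True: both names point at one object in memory
```

In languages like C, this pointing is explicit and you manage it yourself, which is exactly the kind of bottom-up detail the mastery level cares about.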
It is also worth noting that, at this level, any old algorithm isn’t going to cut it. As it turns out, some algorithms are much faster than others! People have dedicated their entire lives to figuring out the fastest algorithms for a given task. At a practical level, you can probably just search “best algorithm for X” on Google and get 90% of the way there. Another shortcut that doesn’t count at the mastery level.
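As a classic illustration (my own example, not from the notes above): finding a value in a sorted list can take one look per element with a straightforward scan, but only about twenty looks for a million elements with binary search, which halves the remaining range at every step:

```python
data = list(range(1_000_000))  # a sorted list of a million numbers

def linear_search(xs, target):
    """Scan left to right; worst case touches every element."""
    steps = 0
    for x in xs:
        steps += 1
        if x == target:
            break
    return steps

def binary_search(xs, target):
    """Halve the remaining range each step; about log2(n) looks."""
    lo, hi, steps = 0, len(xs), 0
    while lo < hi:
        steps += 1
        mid = (lo + hi) // 2
        if xs[mid] == target:
            break
        if xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return steps

print(linear_search(data, 999_999))  # a million looks
print(binary_search(data, 999_999))  # about twenty looks
```

Both find the answer; the difference only starts to matter as the data gets big, which is exactly the mastery-level concern.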
A big reason why these shortcuts don’t cut it at a higher level is that master programmers aren’t dealing with the small tools and toys that practical programmers are, but rather massive algorithms churning through massive amounts of data, where speed and organization are everything. In other words, programming mastery matters a lot if you’re working at a company like Meta, but maybe a little less if you’re just trying to develop a custom todo app for your own productivity quirks.
…
So, I’ve reached the end of my notes. Hopefully this advice isn’t too controversial — there aren’t really any hot takes I intended to put here. Based on this information, you might be wondering what next steps you can take. This is actually easier if you’re going down the “Mastery” route than the “Practical” one — you’ll need to learn a lot of theory, so you’ll mostly be heads-down reading textbooks, research papers, etc. on how all this stuff works. For practical programmers, the best next step is to try to build something you need. There are plenty of “programming challenge” websites out there that give you ideas to build things like a todo list app or a calculator, but the problem is that most people don’t need todo lists or calculators, because those already exist in abundance. If you don’t need it, you’re less motivated to make it. So if you need motivation to learn how to build something, it’s better to start from the angle of creating something brand new and/or tailored to your own specific needs.
Games are always a good place to start. A lot of people have ideas for games they’d like to play but that don’t exist, and game development is a good way to learn a lot of practical programming skills rather quickly. The only downside to games is that making one (especially in the 3D space) is hard even for master programmers, so depending on the game you envision it may be easier to start with a simple app that might help you out. Either way, I hope you enjoyed this short list of recommendations!
The post Some Brief Notes on Programming appeared first on Jacob Robinson.
October 17, 2022
The Curse of The Conqueror

1956’s The Conqueror was a flop, and for good reason. But then, one by one, those involved began to die off. Was it a coincidence, an incident gone wrong, or something more… sinister?
Chances are you’ve never seen The Conqueror. You’re likely better off for it. The film, in itself, isn’t anything to write home about. Produced by the reclusive megalomaniac Howard Hughes (a story in himself), the film is a western-style epic starring John Wayne as… Genghis Khan. More specifically, Genghis Khan as a white man with a slightly ethnic-looking moustache. No, really, I can’t get over how little they tried with this outfit. Check this out:

Now, if this post was about the actual content of the movie, it wouldn’t be very long. I’ve never seen The Conqueror myself, nor have any strong wish to do so. Rather, this is the story about what happened after the film’s critical failure.
The year is 1960, four years after the release of The Conqueror. Pedro Armendariz, a Mexican actor who — you guessed it — played a Mongol chief, began complaining about shortness of breath and persistent pain in his abdominal region, particularly in his hips. A trip to his doctor, as well as many trips thereafter, revealed the grim news: he had developed kidney cancer.
But something didn’t add up. Armendariz, at the time, was well outside the normal age of a cancer diagnosis. He lived a relatively healthy life for the day and age, regularly exercising and staying away from cigarettes. His doctors were initially optimistic about his treatment. They worked to give him the most up-to-date cancer treatment of the time in order to facilitate a speedy recovery. But something happened: the treatment didn’t work. In fact, it seemed as though the cancer only got worse. In June 1963, Armendariz’s doctors informed him that his kidney cancer had become terminal. A few days later, he shot himself dead in his hospital room.
Armendariz wasn’t the only case, however. Just five months prior, The Conqueror’s director Dick Powell died from cancer. Then John Wayne, then costars Susan Hayward and Agnes Moorehead, then more members of cast, crew, and production. All in all, 91 members of the team behind The Conqueror would end up developing cancer — almost half of the entire film crew.
As the years went on, the remaining crew began to panic. What had caused this, and who was next? they wondered, both among themselves and to the media tabloids. Little did they realize, the man behind it all — Howard Hughes — knew something they didn’t.
In the years before production on The Conqueror officially started, a team including Hughes and Powell began scouting for locations in America that would best fit the look of Mongolia’s desert steppes. One location that immediately came to mind was Utah; more specifically, the Escalante Desert near St. George. The land was flat and barren, yet cool enough during the afternoon and evenings to resemble the Mongolian climate. After other locations were eliminated, and the cast and crew acquired, the team made their way down to Escalante to begin the several weeks of filming under the brutal Utah sun.
A few days into the filming, Powell received a strange letter. It was unmarked, and seemed to have been left directly at the door of his trailer. Upon opening the envelope and reading its contents, he learned that the letter was sent by an anonymous scientist working for the National Security department stationed in Nevada. The scientist shared some grim news: the location where they had decided to film nearly the entirety of The Conqueror was 137 miles downwind of the largest nuclear testing site in America.
The next morning, Powell shared the letter with Hughes. It was always rumored that the bulk of the above-ground tests done by the United States military took place in Utah, but the government usually kept tight-lipped on where and when such tests occurred. Hughes, having close Air Force connections from his days as a pilot, asked around to see if anyone could verify the scientist’s claims.
A few weeks later, they got word back — in the form of an official National Security representative who came right up to Hughes’ trailer. Yes, the representative told him — the stories were true. The Escalante Desert did have its own nuclear testing program. But, he assured Hughes, much research had been done on the so-called “downwind effects” of nuclear radiation and it was deemed safe enough for all those who were living down in the valley, including those filming The Conqueror. Not only that, but nuclear tests in the area had stopped for some years. You are fine, the representative told him repeatedly. You have nothing to worry about.
And so, Hughes forgot all about the incident — ignoring the fact that the representative’s concession proved the authenticity of the scientist — and filming on one of the worst films ever made continued. Perhaps Hughes would have forgotten about the whole thing, even the film itself. But then the bodies started rolling in.
No one knows why Hughes didn’t mention the letter and the representative when all this was happening. The story about the nuclear tests was only found out after the military declassified the information, and Hughes’ knowledge was revealed via documents left after his death. Perhaps he was trying to save face. Perhaps some part of him hoped that, despite what he knew, it wasn’t true. That the cancer cases were just a coincidence.
And, certainly, many people still think they are. To this day the effects of downwind radiation are unproven, and many have been quick to point out that the cancer rate of The Conqueror’s crew does line up relatively well with cancer rates as a whole during the decade. Take John Wayne, for instance — did he really get cancer from the Escalante bombings, or did it have more to do with the fact that he smoked six packs of cigarettes a day? That, of course, doesn’t explain cases like Armendariz’s, but it does lend credence to the idea that the spooky scientist letter may very well have been a red herring.
Of course, Hughes himself never got over it. His mental health already in decline, the curse of The Conqueror weighed heavily on him. Late in life he would obsess over the film, wondering what could have gone differently — not just for the film’s quality, but also for the lives of the crew involved. They say that, in his final days, he would lock himself in his private theater and watch The Conqueror over, and over, and over again…
…
The Magical Reality series contains fictionalized elements in true stories for the purposes of narrative and suspense. Articles under this series should not be used in serious discussion of the events they depict.
The post The Curse of The Conqueror appeared first on Jacob Robinson.
October 10, 2022
My Daily Routine

In continuation of my “easy win” posts in order to catch up for the new year, I thought it might be interesting to some to see what my daily schedule is, and how it might compare to the average person.
Most days I wake up between 5 and 5:30. It’s a time range and not a definitive time because I use the Sleep Cycle app, which wakes you up at the best time within a 30 minute interval.
Many people’s first reaction to me waking up at 5 in the morning every day is that I must really have a lot of discipline. Not true. You see, the problem with waking up at a normal human hour is that everyone else wakes up at that time too, which means right from the get-go people are bugging you with messages, calls, and other such stuff. So, you wake up a little bit earlier than them so you can have your moment of peace. But then they wise up to you waking earlier, and they wake earlier themselves. Now all of a sudden you’ve got a nuclear arms race where whoever can wake up the earliest and escape contact with other people wins.
For me, 5 o’clock was a good time where I could reasonably wake up and feel alive but at the same time most others would not dare touch. So that’s my logic on that.
From there, I always start with a shower, then move into a coffee and some reading. Some of you may recall I used to take my showers cold. I stopped doing this, not because the technique stopped working but because I realized a rather awkward practical issue: cold water doesn’t rinse away body wash as well, so I would always leave showers still soapy. If anyone knows a good solution to this, let me know!
As for the coffee and reading part, it’s pretty straightforward. I brew a coffee, and the entire time I’m drinking the coffee I dedicate to reading. The reading stops when the coffee ends. As for what the reading entails, it’s usually whatever (physical) books I’m currently working through. I have a very short attention span, so I’m usually reading 5 to 10 books at a time — depending on the number of bookmarks I have. That way, if I feel myself dozing off, I just move to the next book in the rotation.
My working day starts around 7:30 to 8:30. I work in 50/10 pomodoros with an hour for lunch as the long break. That lunch break marks the switch between product (writing, course development, other fun stuff) and marketing (boring things I must do to survive). So 4 pomos for product, 4 pomos for marketing, totaling around 8 hours of work. As for which one goes first, product or marketing, it depends on where (I think) most of the distractions of the day are going to be. Product tends to require deep focus work, whereas marketing is pretty mindless — in other words, you can pick it up after being distracted without too much trouble. So if I see a lot of boring meetings set in the morning, or relatives visiting in the afternoon, I’ll set marketing for the first or second block, respectively.
Now, after I finish work for the day, I go into my night routine. Based on what my morning routine was, I’m sure many of you are excited to know what my night routine is like. Here are the basics:
Just kidding, I lied. I have no night routine.
I don’t know how you people do it, the second I get off work the whole experience is a blur until I go to bed for the night. I mean, I’m doing something in that time — whether it be working on an experimental project, talking with friends and family, or just relaxing and playing some videogames — but there is no routine. So no, I can’t really help you here. I can at least tell you when I usually go to bed, which is 10 — making up about 7 hours of sleep.
Anyway, that’s it for my daily routine. Like I said, short post, just trying to catch up in the scheduling. But the next few should be pretty interesting. I’ll see you then!
The post My Daily Routine appeared first on Jacob Robinson.
October 3, 2022
The Productivity Tool World Tour 2022
After some time spent on behemoth essays (those articles which Shall Not Be Named), I’ve gotta catch up on my scheduled posts. So the next few posts will be easy quickies on more basic, surface-level topics. And what better way to start than with another Productivity Tool World Tour.
Okay, I guess there has never officially been a Productivity Tool World Tour (PTWT for short). When I first started the newsletter, one of the benefits of joining was that you received a “bonus” post called Top Ten Tools I Use On A Daily Basis, perhaps the most clickbait and hivemind-sounding post I have ever released. In the short time frame that it was a reward, I must have rewritten the post two or three times, because my “tools I use on a daily basis” kept changing — I kept finding new tools, replacing old ones, then replacing the new tools with the old ones which, it turned out, I liked more anyway. This is where the idea of the PTWT first came into play.
This time it’s official, and the rules are a little bit different: I’m simply going over every tool I’ve used, and writing a short review on it. I’ll leave it to you, the reader, to decide which one sounds the most interesting to you. Sounds good? Let’s get into it.
Todo Lists

Habitica [Used premium] – The old glory for me. I think I have used Habitica the most out of any productivity app, and it still works the best for my setup. I’ve talked about how “life gamification” can help you become more productive in the past, and being able to quantize the amount of work I’m doing and how that relates to what sort of breaks/goofing off I can do allows me to balance things more easily.
Todoist [Used premium] – I’ve used Todoist a decent amount as well, and I do think it is the best “pure” todo list app. But it is missing the “Tasks → Rewards” concept that I mentioned with Habitica. For some people this won’t be a big deal and if you do just need a tool to organize all the shit you have to do then I do think Todoist is the best way to go.
Pomofocus – Okay, so the focus here is technically the Pomodoro timer and not the todo list. But I do think Pomofocus punches above its weight as an all-around good tool. The Pomodoro timer is of course probably the best productivity invention of all time, and Pomofocus brings it up a notch with a lot of nice added features and functionality.
Microsoft Todo – Microsoft Todo is rebranded Wunderlist, which a long time ago used to be considered the best Todo list app on the market. Honestly Microsoft didn’t really do that much to make it worse in any way, I think people just got thrown off by the Microsoft name, plus the fact that Microsoft doesn’t really pay too much attention to it anymore.
Notetaking

Evernote [Used premium] – A lot of hiveminders seem to insist that “Evernote is dead” and honestly I have no idea what the fuck they’re talking about. I don’t personally use it but every normal person I’ve come across seems to use either it or OneNote. I’ve never met another Obsidian or Notion or (lord forbid) Roam Research user in my entire life, just Evernote and OneNote users. And this has stayed stable throughout time. Anyway, the app itself: It’s a pretty standard modern notetaker. You can clip web content, you can drag images in, you can draw stuff, etc. etc. It’s pretty neat, though can get messy with all the weird formats it takes in.
OneNote [Used premium] – OneNote is Evernote but made by Microsoft. There is no difference beyond the fact that I think the drawing features are a little bit better in OneNote than in Evernote. Full disclosure I have only used OneNote for school and work, never for personal use.
Simplenote – It’s Simplenote. It’s a notebook, and it’s simple. Though after they pushed out a backlinking feature once that became trendy, it might not be the simplest anymore — but the idea still applies. If you just want something with the lowest amount of clutter possible that can also sync across all your devices, try this.
Roam Research [Used premium] – I used Roam Research for a very limited time, and missed most of the “Roam cult” stuff. I will say that backlinks really changed the game when it comes to notetaking and I’m absolutely shocked that it took until, what, 2019? for a software developer to realize that a personal wiki approach to notetaking was very, very smart. But hey, we finally got there, and now everybody does it.
Obsidian – Speaking of backlinking (or I guess people just call it “zettelkasten” now), Obsidian I think is the best when it comes to this. Unfortunately at the time I used this it was local only, which was a major drawback for me — though it looks like now they’re adding cloud saves in. Time to check it out once more, perhaps?
Notion – Notion, Notion, Notion. Could it go down as the greatest productivity tool of all time? Honestly, it has a pretty good chance. At this point the only reason I’m not running literally everything out of the tool is simply out of fear of putting all my eggs in one basket, in the slim chance that Notion shuts down. And there have been some occasional hiccups and downtimes with the service where I have gotten a glimpse of what a future without Notion might be. It isn’t pretty. But if you are daring enough, you can probably ignore the entirety of this article and just make Notion your todo list, notetaker, bookmarker, and cloud storage all in one.
Workflowy – Workflowy is a very popular tool among select Hivemind elite, and honestly I kind of get it. It’s a todo list and notetaker in one, and it has a very simple and intuitive premise. Just a very long, very deep bulleted list.
Bookmarking

Readwise [Used premium] – There was a time in my life where daily email reminders of stuff I highlighted would be a phenomenal sell, but unfortunately I figured out a solution to that problem long before I ever used Readwise. Using Readwise now, it does have some interesting features but the regular recall is still the best tool it has. If you don’t have a solution for it already, I highly suggest Readwise in that case.
Mymind [Used premium] – I already spoke a lot about Mymind in my recent review of it, so I won’t bore you with more of the same. But it really does feel like the next generation in bookmarking.
Instapaper [Used premium] – Yes, Instapaper is cool, and if you’re using it with Readwise there’s a lot of nice integrations between the two. But now, MyMind does both. Once again, it’s a matter of personal preference.
Storage

Google Drive [Used premium] – Google Drive is my cloud storage of choice, though “choice” is a bit of a misnomer here. OneDrive is pretty jank and doesn’t have the size requirements I need, and MEGA just straight up won’t accept my credit card. So I’m stuck with Google. Though it works out in the end, since most of my writing is still done in Google Docs (even though this too is beginning to be taken over by Notion…)
OneDrive – Like I mentioned, OneDrive isn’t really that great. Well, it’s good if you use Windows pretty much exclusively — but if you use an Apple tablet and an Android phone (two kinds of devices that, mind you, Microsoft doesn’t make anymore) then the quality of the engineering begins to deteriorate a bit. Other than that, it comes down to the fact that Google offers terabytes and OneDrive doesn’t.
MEGA – MEGA is, ironically, probably my preferred choice of storage. But there were some key blockers for me beyond just the credit card issues (which, according to many United States users, are a common problem). The transfer cap is a bit annoying, and there are filetypes it can’t read that GDrive can — but overall it’s the cheapest price for the most storage, and they don’t spy on you like Google and Microsoft probably do.
Anyway, that wraps up my quick reviews on all the productivity tools I can think of so far. Like I said, these next few posts are going to be light on thinking and more on just general content. Still, hope you enjoyed!
The post The Productivity Tool World Tour 2022 appeared first on Jacob Robinson.
September 26, 2022
A Guide to Fiction Writing
I’m now at the point where people ask me more about my tips on writing than for job referrals (thank God), but I’ve noticed that I do not yet have a single “source of truth” on the subject besides my mostly tongue-in-cheek 10 Tips on Writing. So here is an attempt at a truly expansive guide.
The first thing to note is that there is not really any meta course, technique, or book that is going to help you out too much here. The best attempt at such that I’ve read is The Hero with a Thousand Faces by Joseph Campbell, but even then that doesn’t tell the whole story. The most important thing to know about fiction is that, at its fundamental level, it is about taking an idea in your mind and packaging it such that someone else can get as close as possible to seeing the same idea in their own mind. It will never be perfect, but the best writers get pretty damn close.
This is best described with an example. Let’s take a short story that I’ve written, Grey Area. Take the opening scene with Fox taking the call from Serah in his apartment. Visually, I can see the scene. I know what Fox looks like. I know how he shambles around the apartment, phone at his ear. I know how upset Serah sounds through the phone speaker. But you don’t. So, I have to describe things to you, and hopefully you have the same sort of emotional connection to everything that I do. I’ve gotta be tactful about it, though — I can’t just explain everything one-to-one with what I see as plainly as possible, or else your eyes are going to glaze over with boredom. So I’ve got to leave some holes for you to fill in yourself, and when I describe things it should be poetic enough that it’s easy for you to consume and naturally get the same emotional feeling that I’m going for.
Obviously, that’s a lot harder in practice than in theory, and each writer has their own strategy to try and crack it. The route I personally chose is through dialog — my stories are always very dialog heavy because, when it comes to a person saying something, you can write very plainly what they say and have it still make a lot of sense. But you don’t have to follow this strategy specifically.
In order to get inspiration — both for what to write and how to write it — reading other fiction books is valid, but I actually wouldn’t recommend it as your first stop. Good books will give you the most specific advice in terms of, for example, what makes a good written description or piece of prose. I actually keep an internal document of pieces of writing that I think do a good job of describing both hard detail and emotion. Some examples:
“Flayed glasseyed sheep hung from their haunches, sheepsnouts bloodypapered sniveling nosejam on sawdust.” [Ulysses, James Joyce]

“Edith moved into the apartment as if it were an enemy to be conquered. Though unused to physical labor, she scraped away most of the paint from the floors and walls and scrubbed at the dirt she imagined secreted everywhere; her hands blistered and her face became strained, with dark hollows beneath her eyes.” [Stoner, John Williams]

“He was unaware of my touch, of my face a foot above him, as he bent the tree-top grasses down to his nibbling teeth. I was like a galaxy to him, too big to be seen. I could have picked him up, but it seemed wrong to separate him now from the surface he would never leave until he died.” [The Peregrine, J. A. Baker]

But, like I mentioned, I would not recommend books as the first go-to source. Instead, I would recommend movies and music. The movie recommendation is rather straightforward — a film director can get more across than a writer because they are visually showing you, via actors and props, what is inside their head. And so it is wise to watch a lot of movies, see what the good ones focus on in terms of detail, and then try to emulate that. Going back to the Grey Area example: the scene where Anabelle goes to stay at Fox’s apartment for the night has been done in relatively similar fashion in many films. So describing it was easier, because I knew the cues from those scenes I had watched before.
Music is also special, in that it’s the purest form of art: just a series of sounds and noises that together elicit a certain emotional response. As I mentioned, all writing really is, at bottom, trying to get other people to experience the same series of emotional responses to something as you do. Even the more intellectual writers have some sort of emotional response to the topics or themes they’re writing about, and they try their best to place their own emotions onto you.
For a writer, then, music serves as a sort of litmus test. Take a piece of music you feel a strong emotional response to, and a story that is meant to elicit that same emotional response. Listen to the music, then read the story. Does it hit the same way? Obviously the music is always going to be stronger because it’s so raw, but you can get pretty close to the same response with writing if you’re clever. If the writing doesn’t hit at all, however, you know there’s some rewriting that needs to be done.
So, those are the two forms of media that I’d recommend. Videogames can also work, though they are a bit of a special case (there are a lot of cheap tricks games can get away with, because the interactive element already makes them feel engaging). As for specific recommendations across these media, once again I don’t think my own suggestions would be of much help. Instead, I suggest this: look at the classics, across literature, film, and music. Digest a couple, and in particular pick out the ones you either really liked or really didn’t like. From there, try to find patterns in what you liked, and separate patterns in what you didn’t. That way you have a good list of things to steal from, and a good list of things to avoid.
Anyway, that’s about it. I’m sure I’ll think of more tips outside of this as time goes on, but if that ever comes up I’ll just edit this post and add to it. If you have anything that works for you, feel free to post it in the comments.
The post A Guide to Fiction Writing appeared first on Jacob Robinson.
September 19, 2022
Is Science Perfect?

Many people seem to believe that science is an end-all be-all when it comes to discerning fact from fiction. But is our current theory of the experimental method really that powerful, or does it come with its own imperfections?
As you can imagine, science — more specifically the experimental or “scientific” method — comes with its own circle of competency. Science works very well for things built into the physical world, like physics and chemistry and biology, where we can experience everything with our own five senses. But when we start getting into the abstract world, things begin to falter.
Take psychology, for instance. Psychology has few physical elements (beyond sensation and perception), and so it is based largely on the abstract world of the individual, and how their mind might adapt and change over time. Just as a thought exercise, let’s say the Milgram experiment became known to the entire world’s population. Well, if the whole world knows people can be tricked into obedience by authority figures, a good chunk of the population will make the conscious decision not to be fooled. Therefore, if you were to then run the same experiment, you would likely find the results to no longer be statistically significant.
This is a key conundrum behind the replication crisis: old tricks that used to work on people no longer do, forcing us to rewrite much of our psychological knowledge. Some scientific diehards say the fault lies with psychologists’ shoddy use of statistics, but the more likely culprit is that the scientific model as we know it just doesn’t work in this case. A ball won’t suddenly wise up and stop accelerating at 9.80665 m/s², but humans can. So tight control of experimental variables and rigorous statistics no longer suffice in this realm.
The same can be said of adjacent fields such as economics. There is no truly “scientific” study of economics, because any rule of economics can simply go out of fashion. Chemical properties do not follow fashion trends, but quantitative trading strategies certainly do. That’s not to say all of our knowledge here is useless, or that studying these subjects is a waste of time. In fact, quite the opposite! When your entire model is broken, there’s a lot of interesting work to be done in fixing it. But first you have to realize that science isn’t perfect.
The post Is Science Perfect? appeared first on Jacob Robinson.