Learning a made-up language, II

Following up on yesterday’s post: one could also ask whether GPT-3 can compose in a made-up human language. For instance, you could take English and modify it a bit:


Henderson’s native language is Gravinic, and when he speaks English, it sounds like this:


“I to the store did go and some cabbage did buy. Now I some dinner am making and I some cake will later eat. I you am not very much liking. I for you to go away am politely asking.”


In other words, the new rules are that verbs go to the end of the sentence, every verb takes an auxiliary verb indicating tense, and adverbs go between the auxiliary and the main verb. Is this short text enough for GPT-3 to locate context in a language that isn't one of the languages in its training set, but is rather "English with modifications inferred from the prompt"?
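For concreteness, the reordering rules can be sketched as a toy clause generator. This is purely illustrative (it is not how GPT-3 or the prompt works, and the clause structure is a hypothetical simplification): given a subject, a complement, a tense auxiliary, a main verb, and an optional adverb, Gravinic-English order is subject, complement, auxiliary, adverb, verb.

```python
def gravinic(subject, complement, aux, verb, adverb=None):
    """Reorder an English clause by the made-up 'Gravinic' rules:
    the verb goes last, every verb takes a tense auxiliary, and any
    adverb sits between the auxiliary and the main verb."""
    parts = [subject, complement, aux]
    if adverb:
        parts.append(adverb)
    parts.append(verb)
    return " ".join(parts)

# Reproduces clauses from the prompt above:
print(gravinic("I", "to the store", "did", "go"))
print(gravinic("I", "you", "am", "liking", "not very much"))
```

The first call prints "I to the store did go" and the second "I you am not very much liking", matching the sample text.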

Not yet:


Write more dialogue where Henderson tells you about his day.


GPT-3: “I then went to the park and did some reading. The sun was shining and it made me feel so relaxed. I saw some birds flying about, and it was quite a sight. After that, I went to the beach and enjoyed the sound of the waves. I even went for a swim! Finally, I returned home and watched some television before going to bed.”



It’s totally possible that more prompting would make this work, though – I didn’t have the patience to manually type in a bunch more Gravinic-inflected English. I’m not actually sure what kind of context GPT-3 is placing itself in with this response.

Published on February 20, 2023 17:07


Jordan Ellenberg's Blog
