
The Ghost in Anouk’s Laptop

DSC 9 book cover

by Anouk Lang and Quinn Dombrowski with Annie K. Lamar

February 17, 2021

February 2020

Anouk

February 2020. It was about as far as it was possible to be from the sunny nostalgia of kids-playing-out in 1980s Connecticut. Storms were raging up and down the British Isles, some unsettling news about a virus was just beginning to rear its head, the UK had initiated the process to leave the EU and a significant part of the British university sector, including my own institution, was on strike. But the Data-Sitters Club had just been nominated for the Best Use of DH for Fun in the annual Digital Humanities Awards and if ever there was a time when the respite of data-play was needed to drown out the literal and figurative storms in the real world, it was now. I came home from a very cold stint on the picket line, thawed out my fingers, opened up my laptop and thought: if the world’s going to end, what better thing to do than feed our Baby-Sitters Club corpus to GPT-2 to see what larfs it might deliver?

GPT-2 (the GPT is short for Generative Pre-trained Transformer) is a large-scale model of language which, in the words of its creators at OpenAI, produces “conditional synthetic text samples of unprecedented quality, where we prime the model with an input and have it generate a lengthy continuation.” In functional terms, it takes what it knows about how language behaves, at least according to a dataset of 8 million (mostly English) web pages considered to be useful by Reddit users, translates the relationships between words and phrases in that 8-million-page corpus into a list of numbers, and then, if given a prompt to start it off, will automatically generate text that follows on from that prompt using what it has learnt from the larger corpus. OpenAI opted not to release the full GPT-2 model over concerns that it could be used maliciously, as it did such a good job of imitating human-generated text, so they released a smaller version of the model for everyone other than themselves to use. And use it they did: the months since GPT-2’s release have seen a steady increase in the number of people experimenting with it, not just from computer science but also from the humanities and the cultural industries.
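If you want to see that prompt-and-continuation loop in action for yourself, here is a minimal sketch using the small, publicly released GPT-2 model via the Hugging Face transformers library (this is not the application I describe below; the prompt is just a placeholder):

#A minimal sketch of prompt-based generation with the small public GPT-2 model,
#using the Hugging Face transformers library (pip install transformers)
from transformers import pipeline
#Load the publicly released small GPT-2 model
generator = pipeline('text-generation', model='gpt2')
#Prime the model with an input and let it generate a continuation
prompt = "Kristy had another one of her great ideas."
result = generator(prompt, max_length=60, num_return_sequences=1)
print(result[0]['generated_text'])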

Some of this activity is due to Talk To Transformer, a handy web-based GUI requiring no technical expertise. Until recently, Talk To Transformer offered users the opportunity to divert themselves by typing in a few words or sentences, on the basis of which it would then generate chunks of text somewhere on the continuum from amusing incongruity to uncanny likeness. Janelle Shane used it, along with other neural networks, to compile a collection of AI-generated texts on her AI Weirdness website, the funniest of which suggest that the process works best – at least for the purposes of comedy – when the prompts come from identifiable genres that the model can, so to speak, recognize and reproduce, such as names of cookies, tweets from WeRateDogs, or Christmas carols (for extra laughs, listen to what happened when a voice actor recorded one of the creepier carols). Another of my favourite GPT-2 toys is AI Dungeon, a text-based fantasy game which brings the “yes, and” of improv to bear on the genre of the Choose Your Own Adventure story, creating an interactive adventure game many orders of magnitude better than its print culture predecessor. Here it is seamlessly incorporating instructions about installing software packages into a fantasy narrative. Or, if you want to run something on your own machine, try the packages Robin Sloan put together to generate snippets of sci-fi.

So, like many DH folks, I’d had a play with, and a good chortle at, automatic text generators built with neural networks. For this more extended foray into GPT-2 using our BSC corpus, I decided to use an off-the-shelf application in beta. While I try wherever possible to use open-source code and freely available software, this can involve a serious investment of time in learning new tools, and like I said: mid-semester, strike, and the seismic rumblings of what Covid-19 would mean for teaching and assessment were beginning to shake the ground underfoot. An out-of-the-box application offered a way to see whether the results would be worth the time needed to figure out how to work with the model in a programming language. I fed our BSC corpus into the application, waited a few hours, and then watched as my screen began to fill up with neural-net-generated Baby-Sitters Club fanfiction.

“Okay, you guys. What kind of dad does he have?”

“Yes,” said Claudia. “I think he’s outgoing. A lot of things. I think that’s cool, just because I think it’s a good family and he doesn’t have much in common.”

“What is that?” I asked.

“He loves it. He loves going to his house and I just think about it, like the teachers come and help out.”

“Oh. I love him.”

I didn’t hear the answer. I was excited about working out next to him.

One of the interesting things to observe in neural-net generated text is the interaction between the thematic preoccupations and stylistic tics of the two datasets on which the model has been trained (in this case, GPT-2’s training data selected according to its popularity with Reddit users, and the corpus of BSC books). The AI Dungeon example which incorporates the tedious steps of software installation into its fantasy narrative is one innocuous example of this kind of genre-mashing. Janelle Shane supplies another: when she fed Harry Potter fanfiction to a neural network that had previously been trained on recipes, she found that it had trouble keeping on track: hostile exchanges between Harry and Snape would tend to devolve into conversations about recipes. Even in the wholesome family-and-friends-oriented chunk of pseudo-BSC prose above, the murky adult world of the twenty-first century internet is beginning to seep through. I pulled the handle of my text generation slot machine and tried again.

I don’t like his dad. I like him, too. I don’t like what he thinks of me. Not sure what he thinks of me. Not sure what he thinks. Not sure what I think of him. Not sure what his parents want me to know. Not sure what he wants.

If you are trying to distract yourself from the apocalypse, you’ve got to be pretty happy with a neural net that, trained on language from the era of the internet and given BSC texts to chew on, presents you with prose that sounds like Gertrude Stein if she was a boy-crazy teenager. (And also, I call dibs on training a neural net on Stein’s collected works, if Tanya Clement or Natalia Cecire don’t beat me to it.) This scrap of Steinese was not just pleasingly absurd but also curiously redolent of what it is like to be a teenager: anxious not only about what other people think of you, but about what other people think you think of them, and what the people connected to them think that you think of other people, and so on. While I wouldn’t characterize this as revealing the latent structures of teen-speak, fictionally articulated inner monologues, or indeed anything else, it’s hard to shake the feeling that it’s nonetheless capturing something.

I’ll also say that though I’m sceptical about the inside-baseball character of “explainable AI” as it currently stands, I’m confident that scholars of literature, narrative and language have something to add to those conversations, and that there is something to be gained from attempts to identify the textual features a neural network is picking up on, and the ways it is recombining and conjoining those features, by asking people who know something about the way texts work. The literary equivalent of being able to identify the node in an image classifier that is looking out for pointy ears in pictures of cats, perhaps. Except, where identifying pointy ears as a salient feature for classifying pictures of cats is not illuminating for photography, art history or veterinary science, it remains a tantalising possibility that literary studies might learn something non-trivial about texts generated by a neural net.

One of the things that neural nets have to “learn” when producing text longer than aphorisms is the capacity to keep track of what has previously occurred. This can range from the simple – knowing that a parenthesis, once opened, must be closed – to the more complex. Our BSC model still had, evidently, some way to go in this direction.

“I’ll call Claudia’s room,” said Claudia. “Don’t you?”

“Yeah.” Claudia said.

Claudia smiled at Claudia. “I’m going to call my parents tomorrow. They’ll be all right now.

“But, don’t worry about it,” said Claudia.

“Oh, it’s okay,” she said. “Okay. I’ll call Claudia.

“Let’s leave!” shouted Claudia.

“Don’t you call Claudia?” said Claudia, as she grabbed her arm.

“I’m not!” she said. “I’m tired. I don’t want to talk to Claudia anymore.”

“Call Claudia. Come, Claud.”

“Come, and tell me about the meetings.”

Claudia was lying in bed. She was shaking her head. “My mind is on you, too. I’m glad you won’t be here. You’re the right one, too. You don’t belong to the BSC.”

Claudia didn’t say anything. She nodded.

While a story peopled almost entirely by Claudias – who talk to each other, grab each other’s arms, and hang out in each other’s bedrooms – has its charms, it shows the neural network to still be at the point of blissful ignorance about the need to differentiate characters in a conversation. Not a feature of a narrative that any class or textbook on narrative will ever bother to mention, but a convention that, once revealed through its rupture, presents itself as non-negotiable: even Joyce’s, B.S. Johnson’s and, yes, Stein’s characters remain stable and distinguishable from others in conversational exchange. And while the extract is hall-of-mirrors ridiculous, it also feels as if it captures something of the intense sociability of the BSC texts, where there is an enormous amount of talking with, emoting at, and being in close physical proximity to, one’s friends.

“We had some great parents. Mom and Dad had kids. Mom and Dad had our own little sister Stacey, who’s two. She did a lot of big things and stuff like that.”

“What do you think, Stacey?” asked Claudia, shaking her head.

“Well, it’s important to remember that we’re not living in an alternate world. You know, it’s hard.”

“But it’s a place where we can see. And we still live in an alternate world, so that you can see where you’re born.”

“That’s great. Right.”

“Or maybe, okay?” asked Stacey.

“It’s hard,” I added. “Just not when you need to.”

“Yeah.”

“That’s right,” agreed Claudia.

“Yeah. Just not when you need to. You don’t know that what you’re born is on the outside, you know.”

“Sure, it’s not that easy. Not when you need to. But it’s hard to have at least one person or someone who can come and have you with you. That’s also why you have to be best friends.”

“Well, I think you can just do a little thing,” agreed Claudia.

By this point, a little further on in its training, the neural net has advanced to the point where it understands that conversations need to occur between different people. And, while the conversational swerve from friends and family to existential anxieties and back again isn’t so typical of the books, the earnest verbal affirmations of the BSC members to one another are. There’s lots of agreeing with and echoing each other’s phrases, even as the characters happily contradict each other about the surprisingly metafictional question of whether they are living in an alternate world. (Does the term ‘alternate world’ even appear in the original corpus, I wondered? Reader, I checked: it does not. My guess is that here the model was reflecting parts of its training data – fanfiction, to which the notion of alternate worlds is central – that most resembled the BSC novels, and used the structures and phrases it found there.)

As the training went on, so the conversations between characters began to get more abstruse:

“Well, I know that the way I feel about myself, the way I feel about myself. There’s something good about myself, you know. I’m different than everyone else. Like you.”

“Not that. Not anything like that. It’s just not the way I feel. I don’t care about how I feel about myself, or what I’m trying to do.”

“Or what’s in your heart?” I asked Claudia.

“The way you feel about myself is that I always feel better,” I reminded her.

“That way, I feel better. And I feel better when I look at you in a different way. You think of myself differently. And that’s why I know what I am.”

“You know what I’m trying to do? Maybe I am just trying to be bigger, too. But I don’t have that kind of world. It’s just you and I are people. Like you. When we didn’t have one person who was older than me. Or something. But it’s your own destiny. And what I’ve done. We see that I have a lot of self-control. I guess I think that’s why I am stronger than you. I feel better now.”

“How about you?” I asked.

“I like you a lot, and I don’t see it. But I know that in your life you’re better than me. And I really believe you can’t be anybody.”

“But you never feel worse. Maybe you’re better than you really are, and what else are you trying to do, too?”

“I don’t care.” I looked around. I wasn’t sure what I was trying to do, but I guess I could tell. If I had thought of it and tried to solve it, I could have made it easier.

This extract reads as an almost parodic depiction of self-obsessed teenagers trying to articulate who they are without having the vocabulary to do so. At the same time, though, it captures something of the intensity and almost romantic dimension of those all-important teenage friendships, where talking for hours to friends (old-school style on a landline telephone, if you were a teen in the 1980s or 90s) was a crucial part of working out who you were, what your place in the world was, and how you might navigate your relationships with others. The repeated circling-back to particular topics – as a neural net sometimes does – also contributed to making the extract reminiscent of those long and luxuriantly angsty conversations. I wondered to what extent the neural net’s generation of the pronouns I and you according to their frequency in the BSC corpus might be contributing to the success of its pastiche, and which other features of language the neural net might be attending to as its learning went on.

After it got stuck in a loop where it began to generate streams of repeated terms, I gave the neural net a new prompt with the opening from our first DSC book, Quinn’s Great Idea, and was soon rewarded with another example of teen-Steinese (“I’m, of course, the only person I can describe, because I’ve come to such a great deal of difference in the way I am”). Obligingly, the neural net then got onto the topic of clothes and how they are used to telegraph identity in the BSC:

Claudia looks terrific, but is definitely not as quick and athletically as Claudia does. She dresses as a brat. She looks absolutely gorgeous, and looks glamorous. She wears perfect outfits, though.

Kristy is the one who’s more casual. For example, Kristy dresses all the way to Stoneybrook. I guess she looked a little sketchy, but Claudia is pretty casual. I think it’s been a good thing she’s more professional lately. I’ll admit, I’m almost almost like Kristy. But I am actually my best friend, and Claudia is sort of like Kristy. She goes for things that don’t fit the stereotype of young, shy, shy people. She’s a strict style person and I don’t see why I wouldn’t like to be an associate member. My roommate, Stacey, is an associate member of the BSC.

The BSC — I mean, although I’m a member of my club — was pretty much a complete mess. The conference was very organized, but the BSC was terrific. I decided to go out with Kristy.

It took about two days to complete the first meeting, so I told her about the problems. I was wearing my Mom’s jeans. But I think it was a bust out of something Claudia has.

What was striking about this was not so much the descriptions of clothing, but rather the way those descriptions were brought together with descriptions of who characters are and who they are trying to be. Though its referential cohesion leaves a fair bit to be desired, the extract also captures the way that the books introduce the characters in terms of their roles (members of the club, associate member) while simultaneously discussing their clothes and their appearance, most obviously in chapter 2 but also throughout the books.

As the training continued, the characterization felt like it was beginning to stabilize: the neural net was now able to keep track of three members of a conversation, while continuing to do a fair job of echoing the tone of the books:

“Okay, what happened?” I asked.

“No,” Kristy answered. “Let’s move on. You have to figure out a way to make up what you want to do.”

“Okay.” I was pretty sure Stacey might have asked. It’s not as if I’d like to leave Kristy alone, but I hoped she wouldn’t make a big deal out of it.

“Okay, it’s not exactly a great idea,” Kristy began.

“Well, I don’t know, but I’m not going to throw it out. I can find out how to make up my mind about who I want to be on the next page.”

“Yes,” Kristy said. “A lot of people are going to like to draw, but you need to figure out how to use your imagination.”

Kristy’s bossiness is an aspect of her character that the BSC books like to draw attention to (the term kristy is among the top ten most frequent collocates for bossy and bossiness within a 5- to 10-word window on either side). Here, though, the text doesn’t use the term but instead telegraphs it through the dialogue, while simultaneously conveying the uncertainty of the unnamed narrator. The extract also brings out the underlying tension between Kristy and the narrator, and – another feature of teenagehood – how one often feels as if the more articulate, more socially powerful people in one’s circle are getting everything their own way, while you stand on the sidelines self-censoring and suppressing your own feelings and desires.

Neural-net-generated-Kristy, meanwhile, started getting seriously bossy:

Kristy nodded. “Look at the rules for the group, all the members. I found the idea of good ideas. And the reason I wrote these are just the ones I love.”

“Right,” I said. “I guess I’ll keep up with you.”

“Right,” said Kristy. “Okay, now let me have an idea for you, if you ask. First, to do homework on how you should be aware of each of the decisions you have made. Then, if you’re curious about how to find new ideas, I’ll pick up what I’d like to teach you.”

“Okay, I know,” I said.

“And if I have a little creativity, I’ll teach you the principle of taut and change.”

“Right. I mean, I try to do something different and know how you should feel about it, but not for anything I’m going to do it. Right. There’s nothing to be confused about.”

“I guess I can’t understand what I’m doing,” said Kristy. “I just think I’ll just come up with the best idea.”

The terms “Right” and “Okay” do some interesting work here. While they suggest acquiescence in Kristy’s unnamed interlocutor, coming from Kristy herself they work in combination with verbs in the imperative and declarative mood to convey cheerful imperiousness, signalling not so much Kristy’s agreement as her marshalling of those around her. Veering from self-assurance (“I found the idea of good ideas”) to self-doubt (“I guess I can’t understand what I’m doing”) also felt very plausibly BSC. Her interlocutor, meanwhile, is trying to assert herself without much success (“I mean, I try to do something different and know how you should feel about it, but not for anything I’m going to do it”), but Kristy continues to steamroll her. The premise that a queen bee might make all of her friends “do homework on how you should be aware of each of the decisions you have made” (note: not homework on the decisions you have made, but how you should be aware of those decisions) felt to me like something out of Judy Blume or Harriet the Spy: the discursive equivalent of a kid dressing up in her corporate mother’s 80s-era shoulder-padded jacket and heels and trying out phrases she’s heard the grown-ups use. The “principle of taut and change”, for instance, though it corresponds to nothing I know in the real world, sounds like an idea that might plausibly come out of the mouth of a motivational speaker: an alternate universe’s version of ‘lean in’ that it’s easy to imagine Kristy subscribing to. In these various ways, it felt as if the output had managed to capture something of Kristy’s character and was conveying it through textual features which went beyond the lexical.

Of course, cherry-picking excerpts from a longer piece of auto-generated output, as I’m indulging in here, and observing that some of these excerpts are a better match than others to the ways the BSC characters are portrayed is the kind of thing that gives work in the digital humanities a bad name. (Though I’m cheered by Katherine Elkins and Jon Chun’s remark that “reading sublime GPT text can transform one into the critic who, enamored of an author, can do nothing more than cite in full.”) I wasn’t setting out to make definitive statements about what this particular neural net was doing with this particular training set, though. Rather, I was trying to see what kind of mimetic functions GPT-2 was exercising when faced with the genre of young adult fiction focused on a group of entrepreneurial young women, and attempting to understand the way it was imitating the original genre through some of the categories of literary analysis. I was, for instance, surprised to find even semi-coherent characterization occurring, let alone with a degree of regularity.

As the training went on, the output began to investigate the line between friends and frenemies (with a little of its previous failure to differentiate between the participants in a dialogue):

“A lesson you need to learn is that you’ve already put out an ad campaign, and I want it to be a success,” Kristy reminded me. “And I know that. This is the first big campaign. It’s going to be like a baby-sitting project.”

“That’s what I said,” Kristy added. “And when you did that, you did it.”

“Great!” Kristy giggled. “But I guess I haven’t figured out how you could be successful. You are, since you and all your friends are probably just doing stupid stuff.”

I took a deep breath and put down my pockets and started my hand toward Kristy.

“We’re going to start,” she said.

“Okay, we’ll all move on!” I went on, breaking up.

Not for the first time, I was getting flashbacks to how subtly mean teenage girls can be to each other, which added a further degree of verisimilitude to the output (uncanny valley meets valley girls?). Then, as the discussion about the advertising campaign heated up, things began to get out of hand:

“Woooohhh!” I shrieked. “We need lots of pictures of the old addresses and who knows if this is a good idea. Maybe we should take those. I mean, if we were to do that, the flyers would be too busy and we could get you to do something like that.”

“Okay, but we’re not going to do anything like that,” I said, pointing to the flyer posters we’d put up. We kept saying “Duhh, Stacey,” and giggling for a while.

I didn’t know about the flyers, but I thought that maybe I could hold our nose and say “huh-huh-huh-huh-hauh-hauh-hauh-hauh-hauh” and become a freckle.

Right. The neural net had eaten one too many of Claudia’s beloved Ring-Dings, or else had done some transmedial genre-hopping and landed in a stoner movie. It was time to see what Quinn had turned up in her own adventures with GPT-2.

Six months later

Quinn

The DSC meeting where Anouk brought her GPT-2 findings, and I laughed until my stomach hurt, was one of my very last meetings in the before-times, just days before the elementary school and preschool shut down, and time began to go slowly, then quickly, then morphed into something altogether confusing and surprising. Especially here in the Bay Area, where we rely on festivals and events to mark the year in the absence of notable seasons. Or, at least, we used to. Maybe someday we will again.

It was fall, and universities in the US were making last-minute decisions to go all-online with no students on campus (like my institution, Stanford, or the institution in the city where I live, Berkeley, following a COVID-19 outbreak stemming from a frat party), or they were bringing students back – and quickly rethinking those plans as cases exploded. My oldest kid was doing first grade online for the indefinite future, and I didn’t expect the preschool to get through the year without a closure. The election was on the horizon, everyone was bored with social distancing and masks, and there was no end in sight to the ever-increasing COVID-19 death toll and the ongoing dismantling of civil society and the infrastructure that helps keep it afloat. Reading through Anouk’s description of February in Edinburgh from the perspective of late August 2020 in the United States, what I’d have given for icy fingers and a picket line.

It had been a very long time since I laughed the way I did when Anouk first shared her GPT-2 results. I needed a good laugh, but I also wanted to try to do it myself. So I dropped Jeff Tharsen a note.

Jeff Tharsen is like a cross between a Viking and a Confucian scholar. He’s tall, with dark eyes and a long beard that once was blond and is now turning an off-white color like Mary Anne’s blouses. He’s the University of Chicago’s resident expert in humanities-based research computing but spends much of his time thinking about sound systems in premodern rhetoric and poetry, social justice and savage inequities, playing guitar like Rico from the California Diaries series, Minecraft mods, and finding infinite combos for his Magic: the Gathering deck builds. He’d blown me away recently with the “Shakespeare sonnets” and “Blake poems” he’d coaxed GPT-2 into generating, and while I’m getting better at sorting out code myself, I was happy to have some code and instructions to start from that I hoped might work.

Trying to plan ahead for releasing this DSC book as a Jupyter notebook, I put together a notebook that ran the external Python code from his repo. Next, I saw his tutorial recommended running the code not just on a computational resource larger than your laptop – but, specifically, on Graphics Processing Unit (GPU) nodes as part of his university’s high-performance computing cluster. I was impatient. I didn’t want to try to figure out how to access the GPUs available on Sherlock, Stanford’s high-performance computing cluster. My laptop, I figured, might be a little slower, but it’d work fine. I was so wrong.

But let’s take a step back first.

What is machine learning anyway?

As Jeff reminded me when I kept venting my impatience at him because my model didn’t seem to be learning anything at all, machine learning isn’t a child you can coax into putting their shoes on faster because we really have to go or else we’ll miss our 5-minute social distancing drop-off window at the preschool and cause a backup for other families.

It’s math. It’s just math.

If you’re going to understand machine learning, it’s time to summon your inner Stacey the Math Whiz, join the Machine Learning Mathletes, and dive on in.

Machine learning involves algorithms that are trying to predict an output, given an input, a lot of examples of input/output pairs, and some parameters you can fiddle with to improve the algorithm’s guesses.

What if we tried to come up with an algorithm to predict Claudia’s grade on a test? We could start with something simple, like assuming that her grade on a given test is going to be the same as her current grade in that class. If it’s a test in art class, Claudia will get an A. If it’s in English class, she’ll get a C. One input (the class), one calculation based on that input (look up her current grade), one output (the grade). But everyone knows that grades don’t work like that at all. There are so many more factors to take into consideration. How hard have Claudia’s parents been on her about her grades lately? When did the test take place – was it before Mimi’s death, so Mimi could help Claudia study? How many baby-sitting jobs did Claudia have that week? Did she have art class the day before? Is there an art exhibition coming up? In a machine learning context, these questions are referred to as “features”.

Knowing the answer to all those questions is one step towards improving our grade-prediction algorithm. But there’s still another set of information that we’re missing: how much does each of those things matter to the overall output? Maybe an upcoming art exhibition means that she’ll be a little more distracted than usual when trying to study, but the effect overall might be negligible. On the other hand, if Claudia’s parents are really cracking down on her, threatening to prevent her from participating in the Baby-Sitters Club until her grades improve, that’s likely to have a much bigger impact on the test outcome. The impact of a given feature is called its “weight” in machine learning.
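Here is the features-and-weights idea as a tiny Python sketch (every feature value and weight below is invented purely for illustration, not derived from the books):

#A toy grade-prediction model: each feature gets a made-up weight
#(positive weights push the predicted grade up, negative ones drag it down)
features = {
    'current_grade': 75,             #Claudia's current grade in the class, out of 100
    'parents_cracking_down': 1,      #1 = yes, 0 = no
    'babysitting_jobs_this_week': 4,
    'art_show_coming_up': 1,
}
weights = {
    'current_grade': 0.9,            #her current grade matters a lot
    'parents_cracking_down': 8.0,    #parental pressure has a big impact
    'babysitting_jobs_this_week': -1.5,
    'art_show_coming_up': -0.5,      #a little distracting, but negligible
}
#The prediction is just the sum of each feature multiplied by its weight
predicted_grade = sum(features[name] * weights[name] for name in features)
print(predicted_grade)   #69.0 with these made-up numbers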

“Deep learning” refers to a kind of machine learning where features are organized into layers. Lower layers capture basic features, and higher layers capture more subtle things. This is a little easier to understand with an example for deep learning for computer vision: a low-level feature might enable an algorithm to recognize that something in an image is a line. A higher-level feature might be able to recognize that something is a giant dangly earring like Claudia might wear. In the image below, the layer on the left identifies recurrent patterns in the way black, white and grey pixels come together to form particular shapes like lines and curves. The layer in the middle spots patterns in the way those shapes typically fit together (what a human being can recognize as facial features like eyes and noses). Then the layer on the right finds patterns in the way those eyes and noses are put together with other things like mouths, foreheads and ears to make faces.

Training a machine learning algorithm is like making the algorithm play a big guessing game. Your training data has all the “right” answers. The algorithm takes part of the training data, hides some of it from itself, and attempts to guess the right answer by changing the weights it assigns its features. Then it checks its work, and typically will report back to you about how it’s doing, using something called “loss”. High loss values (and what counts as “high” depends a lot on the context) mean that the algorithm made some bad decisions during that round, leading to a bunch of mistakes. Remember, it’s a guessing game: sometimes an algorithm can get off on a wrong track for a turn or two, but it should be able to correct itself and keep improving by passing the right answers back through the network, updating weights along the way.

If your loss rate just keeps bouncing around the same numbers you started with for turn after turn, something has gone wrong. (And something had gone wrong when I tried it with Jeff’s code – more on that later.) But it’s also a problem if your loss value goes all the way to 0. You might think that a 0 loss value is great – perfect A+ right answer every time! But remember: you’re not trying to train a model that will spit your training data back at you with perfect accuracy. What you want is a model that can handle, with reasonable accuracy, data that it hasn’t seen before. When a model gets too good at predicting your training data, so good that it hurts its ability to handle new data, that’s called “overfitting”.
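To see the guessing game in miniature, here is a sketch that trains a single made-up weight on a handful of made-up input/output pairs, nudging it each round and reporting the loss (a cartoon of the process, nothing like GPT-2's millions of weights):

#A cartoon of the training guessing game: one weight, four made-up
#input/output pairs, and a loss value reported after each round
examples = [(1, 2), (2, 4), (3, 6), (4, 8)]   #the 'right answers': output = 2 * input

def loss(w):
    #Mean squared error: how far off the guesses are with this weight
    return sum((w * x - y) ** 2 for x, y in examples) / len(examples)

w = 0.0       #start with a bad guess
step = 0.25   #how far to nudge the weight each round
for round_number in range(10):
    #Try nudging the weight both ways and keep whichever guess lowers the loss
    w = min([w - step, w, w + step], key=loss)
    print('round ' + str(round_number) + ': weight=' + str(w) + ', loss=' + str(round(loss(w), 2)))
#The loss falls toward 0 here, but with real data you'd also check the model
#against examples it hasn't seen, to catch overfitting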

But back to Claudia’s grades! And here’s where things get a little spooky. The way I’ve described things so far suggests that you, the knowledgeable human who understands how the world works in general, and how Claudia’s life works in particular, define the features that the algorithm uses. Maybe you can trust the algorithm to handle the weights, but you can’t trust the ghost in your laptop to make the call on what features to use, can you?

Actually… yes? It turns out that in most cases, you’re better off leaving the whole thing to the algorithm. You can see how this can feel irksome, particularly if you’re like Kristy and want to be in control. How can you trust the ghost in the laptop to pick the right things if it’s not an expert? The fact that you’re not in there bossing the algorithm around contributes to the rhetoric around machine learning as a “black box”. But here’s the thing: hopefully you’ve already wrapped your head around the idea that computers just do things differently than humans. (If not, check out DSC #8: Text-Comparison-Algorithm-Crazy-Quinn.) With text comparison algorithms, it’s harder to adjudicate how “right” the computer’s determinations are compared to the decisions we’d make as humans. We just have to grapple with them as another perspective, take what insight we can, and sometimes agree to disagree. But machine learning is a little different because there often is, actually, a “right” answer. Without getting too worked up about how the algorithm comes up with its answer, we can evaluate how often it gets the answer right.

Which is not to say that we should pay no mind to how the ghost in the laptop makes decisions. Machine learning algorithms (often under the fancier title of “artificial intelligence” or AI) are getting adopted as an automated decision-making tool in all kinds of contexts, including ones with serious consequences, like courts attempting to determine which defendants are “high risk” for committing future crimes. Algorithms like to find shortcuts to get to the answer, or close enough. Even if you’ve tried to set things up so that you don’t tell the algorithm about race (in order to ensure that it’s not making determinations based on race) you may discover that one of the features it picks up and uses is essentially a proxy for race. It’s not for nothing that OpenAI chose not to release their full GPT-2 model, as Anouk mentioned, to avoid people using it for nefarious purposes. (That said, OpenAI reached a deal with Microsoft to give them exclusive access to the full GPT-3 model, allowing them to embed, adapt, and use the model however they please. And depending on how they vet their clients… 😬)

There are no lives at stake when we’re using machine learning with literature. But you may still want to understand how the algorithm is making its determinations, especially when it’s performing well. Is there anything we can learn about literature from the way an algorithm is able to classify it, or generate text that readers perceive as “similar” (like Anouk was trying to do in February)? Some algorithms (like k-means clustering) can easily give you the features that they’re using; with others, particularly deep learning algorithms with a lot of layers, it’s harder to determine. The features the algorithm is using may not be all that interesting, or might reflect things about your corpus that aren’t actually what you’re trying to get at (e.g. if you’re looking for what characteristics differentiate books written by British vs. American authors, the presence of the word “whilst” is probably not what you want).
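With k-means, for instance, the features the algorithm leans on are sitting right there in the cluster centers. Here is a sketch with invented word frequencies for a handful of imaginary books (the words and counts are made up purely to show where you'd look):

#A sketch of peeking at the features k-means uses: invented word frequencies
#for four imaginary books, clustered into two groups
from sklearn.cluster import KMeans
import numpy as np

words = ['whilst', 'colour', 'sidewalk', 'vacation']
#Each row is one imaginary book's counts for those words
frequencies = np.array([
    [9, 12, 0, 1],    #British-ish
    [7, 10, 1, 0],    #British-ish
    [0, 1, 8, 11],    #American-ish
    [1, 0, 9, 10],    #American-ish
])
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(frequencies)
#The cluster centers show which word frequencies each cluster is organized around
for center in kmeans.cluster_centers_:
    print(dict(zip(words, center.round(1))))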

Fine-tuning GPT-2

Jeff’s notebooks for fine-tuning GPT-2 were supposed to be run on a GPU (a Graphics Processing Unit – I’ll say more about those in a minute). But ever since I was a child reading the Baby-Sitters Club novels, I’ve tended to skip inconvenient instructions, cross my fingers, and hope for the best. Which is how I ended up running his notebook on my MacBook Pro (which does not have a GPU) for 24+ hours, without the model getting more BSC-like at all. It wasn’t even just a problem of me using the wrong hardware for the job; I’d also made a mistake with how I set up the fine-tuning code, which turned into an opportunity to learn about “learning rates.”

Learning rates

Part of why the model wasn’t learning anything was that I’d set the learning_rate hyperparameter too high. (In machine learning, a “hyperparameter” is just a setting that you, the squishy human, adjust; “parameters” are estimated from data automatically by the machine. You adjust hyperparameters by specifying their value when you run a task like fine-tuning a model.)

Sorting out learning_rate is kind of like the process of learning to drive a car – or the difference between a driving video game (like my kids’ favorite Mario Kart 8) and driving an actual car. When you’re feeding in a large-ish data set for fine-tuning GPT-2, a “learning_rate” that is too high leads to a lot of overcompensating when fiddling with the different metaphorical steering wheels of machine learning to try to get the right answer. Not getting the right answer? EEEEEEEEK! Turn it all the way to the right! Oh, wait, that didn’t get you the right answer? OMG NO! Turn it all the way back to the left! This wild vacillating left and right might barely keep you on Rainbow Road in Mario Kart 8 (with machine learning, this looks like reliably getting the right answer, referred to as “convergence”), but it takes a lot longer and is more stressful than if you’re driving the car of machine learning with some seasoned expertise. In a lot of cases, a small change in the orientation of the steering wheel can be all you need to stay on the road. But you don’t want to be too hesitant, either! Because that also means it’ll take longer to achieve convergence. And especially if you’re paying for your GPUs (or waiting in line for relatively scarce GPU resources in an institutional High-Performance Computing cluster), you don’t want that.

So I’d spent 24 hours careening back and forth between “NOT ENOUGH!!” and “TOO MUCH!!”, but the learning_rate hyperparameter isn’t something that can adjust itself on the fly based on the context, the way you’d actually drive a car. You have to specify it upfront. And I should have chosen a smaller one. (I didn’t feel too bad, though. Adjusting learning rates is a huge and unsolved problem in machine learning and usually people just guess.)
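Here is the careening in miniature: the same one-weight toy problem as the training sketch above, this time updated by gradient descent with a too-big learning_rate and then a smaller one (all the numbers are invented; real fine-tuning juggles millions of weights):

#The steering-wheel problem in miniature: gradient descent on one weight,
#fitting output = 2 * input, with two different learning rates
examples = [(1, 2), (2, 4), (3, 6), (4, 8)]

def train(learning_rate, rounds=8):
    w = 0.0
    for _ in range(rounds):
        #Gradient of the mean squared error with respect to the weight
        gradient = sum(2 * x * (w * x - y) for x, y in examples) / len(examples)
        #The update: turn the wheel by learning_rate times the gradient
        w = w - learning_rate * gradient
        loss = sum((w * x - y) ** 2 for x, y in examples) / len(examples)
        print('lr=' + str(learning_rate) + ': weight=' + str(round(w, 2)) + ', loss=' + str(round(loss, 2)))

train(0.2)    #too high: the weight overshoots back and forth and the loss explodes
train(0.05)   #smaller: the weight settles toward 2 and the loss shrinks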

The bigger problem was that, no seriously, you really do need GPUs to fine-tune GPT-2 at all effectively.

What’s a GPU and why should you care?

GPU stands for Graphics Processing Unit. You can think of it as a cousin of the CPU (or Central Processing Unit), which you might be familiar with from your own computer, if you’ve ever looked at the Windows Task Manager or Mac Activity Monitor to figure out what runaway process is sucking down your battery and making your fan whir with its high CPU usage. (Looking at you, Adobe Creative Cloud!) Gaming laptops need good graphics performance in order to constantly render, update, and rotate the millions of polygons that make up modern game graphics, so they’re more likely to have GPUs to make that work. If you’re trying to render 3D models at tolerable speeds, you should also work on a machine with GPUs. But why do GPUs matter for GPT-2? We’re not using machine learning to try to cluster BSC book covers or anything! (Though actually… well, don’t be surprised if you see that in a future DSC book.)

So many things in the world of machine learning assume you share a particular set of cultural references used to explain things. CPUs vs GPUs commonly get explained through some variant of “Ferrari vs. cargo truck”. Neither of those vehicle specifications mean a whole lot to me. Let’s try it another way.

Stacey and Kristy are both contestants in a department store contest where they have ten minutes to bring all the clothing items they can up to the cash register, and they can keep everything as long as they correctly sum up the cost.

Stacey is like a CPU. She’s really good at math, but is also very concerned about always looking cool – what if she bumped into a cute guy while her arms were piled high with clothes? That wouldn’t do. So she’ll speed-walk around the store, picking out a diverse set of expensive and stylish outfits, never holding more than a couple items at a time, and calculating the cost in her head.

Kristy, our GPU, is only average at math. She doesn’t care what impression she makes: she’s borrowing her brother Sam’s giant hiking backpack and is going to stuff it to the top. Kristy isn’t into fashion, so instead of putting a lot of thought into how different items of clothing might assemble into trendy outfits, she’s just going to fill the hiking backpack with multiples of the same jeans, t-shirts, turtlenecks, sweatshirts, underwear, and socks. She’s also bribing some of the elementary school kids on her Kristy’s Krushers baseball team to help her with this mall contest. Her plan is to stuff the backpack, bring it over to her team waiting by the cash register, and have each kid work on adding up the cost of a few items.

How does this work out? Stacey the CPU ends up darting back and forth around the department store, doing multiple different kinds of complex calculations (all the math to add up the price of all her items, plus the evaluation of how fashionable and combinable her choices are). This is why CPUs are probably what you use most of the time: you want something fast, nimble, and able to handle lots of different things.

For a while, it looks like Kristy the GPU has fallen far behind. Stacey the CPU already has lots of articles of clothing piled up at the cash register while Kristy is still filling the hiking backpack. But in the three minutes remaining by the time Kristy makes it to the cash register with her backpack, she is able to divvy up the calculations among her team and still finish in time with the correct answer. Her team isn’t as mathematically capable as Stacey – they are, after all, just elementary school kids – but by giving them each a simple task to work on (just adding up the cost for a few items), Kristy parallelizes that part of the process and it finishes quickly.

For this small task – gathering clothes for 10 minutes and adding up the total cost – maybe Stacey the CPU and Kristy the GPU finish at the same time, especially since Stacey the CPU was using some more complex calculations that might slow her down. Kristy the GPU spent most of the time loading up the camping backpack, and the actual calculation was fairly brief. But if the task were a lot bigger – if there were many more small calculations that could be parallelized across all 9+ Kristy’s Krushers team members – Kristy the GPU could easily leave Stacey the CPU in the dust.

And that’s what happened when I tried to fine-tune GPT-2 on the Baby-Sitters Club corpus using my laptop. Poor Stacey the CPU was running all over, trying to do the calculation to update the weights for all the features of GPT-2… while also running like 50 browser tabs… and probably more than a couple Adobe apps… plus my email, and probably like 12 other things. She wasn’t getting anywhere fast with this fine-tuning thing.
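If you want to spare yourself a 24-hour detour, it's worth checking upfront whether your machine has a GPU the code can actually see. A quick sketch, assuming TensorFlow (if your setup uses PyTorch instead, torch.cuda.is_available() does the same job):

#A quick check for whether any GPU is visible before kicking off hours of fine-tuning
#(assumes TensorFlow; torch.cuda.is_available() is the PyTorch equivalent)
import tensorflow as tf

gpus = tf.config.list_physical_devices('GPU')
if gpus:
    print('Found ' + str(len(gpus)) + ' GPU(s): Kristy and her backpack are ready to go.')
else:
    print('No GPU found: this machine is all Stacey, no Kristy. Expect a long wait.')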

There’s more to text generation than GPT-2

Around the time I was struggling with ways to fine-tune GPT-2 using the Baby-Sitters Club corpus, fall quarter had begun at Stanford, and I started teaching my biennial non-English DH course. (Online, asynchronously, and hastily reworked week-by-week, if I’m honest about it.) I was amazed at how many students I had: 16 registered, and another five or so auditors. (Side note: if you’re teaching DH and have students who don’t normally work with English, check out the Multilingual DH site or get in touch! I’d love to help you give them what they need to be able to apply what you’re teaching to their language.)

When I learned more about one of my students from the Classics department, I was thrown into self-doubt. Annie Lamar has warm gray eyes and an affinity for purple clothing. She has long hair that is almost always twisted and copiously pinned on the top of her head. She studies ancient languages and artificial intelligence, which – although it sounds quite serious – often results in the creation of chat bots so she can talk to her favorite ancient authors. Annie had a BA in Classics and a BS in Comp Sci from the University of Puget Sound (not far from where I grew up in the Seattle suburbs). Even before starting grad school in Classics at Stanford, she had done work on creating network graphs of texts, machine translation for a low-resource language while focusing on style, and generating metrically accurate Homeric poetry. Was there anything I could actually offer her with this class, as someone who was struggling to just get GPT-2 fine-tuned, myself?

By the end of the quarter, it turned out okay – I learned that computer science departments have their own constraints about what it’s acceptable to work on, and DH-as-such acculturation has some value. Annie’s final check-in assignment, in response to the prompt “What is digital humanities?”, was perfect: “In the words of Democritus, ‘Αρχάς είναι των όλων ατόμους και κενόν, τα δ’ άλλα πάντα νενομίσθαι.’ The beginning of everything is atoms and emptiness, the rest is all perception.” But towards the beginning of the quarter, when I wondered if I could teach anything close to what I might learn from Annie, I asked if she’d be up for talking with me and Anouk about machine learning and text generation.

Anouk

From our Zoom boxes, Quinn and I talked Annie through where we’d gotten up to so far with our neural net attempts.

“… so, we’re trying to understand what goes on under the hood when one takes a neural net like GPT-2 and retrains it on BSC text,” finished Quinn.

“Yeah, we’d really like to crack open the black box to understand how it’s working inside,” I added. “You know how it’s possible to find the individual node in a neural net that keeps track of the pointy ears in pictures of cats?” I asked.

Annie nodded.

“Well, could we do something like that? Find the nodes that have a kind of similarly identifiable function for texts. Only instead of telling us something that isn’t interesting such as the fact that cats have pointy ears, it might be able to tell us something we don’t know about texts, like, um –”. I was incoherent, weary after a week of dealing with failing institutional virtual learning environments and anxious students, and there was a lag in my connection with the others 5000 miles away. I was clearly not making much sense.

“– like, what are the pointy ears of literary texts?” Quinn helpfully jumped in with.

“Hmm,” Annie considered this. “You know in the movie Arrival there’s an alien language? That alien language definitely means something to the aliens: it’s just that no one else can know what it means. It’s kind of the same with what’s inside the black box of a neural net: it’s just a bunch of ones and zeros in a multidimensional matrix that we have no chance of understanding. So what we have to do is to develop ways to evaluate it, for instance by using vocabulary graphing, which lets us see how well the vocabulary of a generated text matches up with the training data.”

“Evaluation!” I thought. Maybe that would be where some background in literary studies might come in handy …

Quinn, meanwhile, was trying out another conceptual metaphor. “What does it mean to take the strange sausage of GPT-2 and make it BSC-flavored?” she asked.

“Ew!” replied Annie. “I read the BSC books over and over when I was a kid but I don’t ever remember coming across Claudia and the Strange Sausage.” We giggled. “Anyway, here’s one way to think of it. You take a deep recurrent neural network and you start training it. You train and you train, and it’s all going well, and then you get to the point where you’ve overfitted and that means you’ve gone too far, as the model starts spitting out actual lines from real BSC books. So, you need to ‘freeze’ the training before that point to get it to the right level.”

“That’s really helpful,” Quinn mused, and I nodded. Neither of us had thought of neural net training like that before.

“So, what would be the difference between retraining something like GPT-2 on the BSC corpus, and training a neural net on the BSC corpus alone?” Quinn wanted to know.

“Yes!” I interrupted. “I thought neural nets needed seriously large corpora, and ours is pretty small as ML-training datasets go. Can you even use a neural net on something that size?”

“Funny you should ask,” smiled Annie. “Small data is my thing. I work on Ancient Greek corpora that are typically about 2% of the optimal size for training neural nets. In general, a ‘good’ neural network requires about 1,000,000 lines of training data. My datasets are usually composed of 20,000 lines or less. So I’m interested in seeing how well neural nets do with data of that size, and I’ve used OpenNMT-py to train models of that sort.”

I thought of the wealth of knowledge about the BSC among the members of the DSC. “Would it be possible to evaluate the outputs of different models – some trained on GPT-2 and retrained on the BSC corpus, and some trained on the BSC corpus alone – using the expertise of the members of the DSC?” Evaluation is important in CS but I wasn’t sure if the judgments of people with literary training … well, counted as proper evaluation.

“Completely!” was Annie’s reassuring reply. It turns out that she has actually published on this very question. “We give the model dialogue and ask it to fill in a section that you have removed – the beginning, the middle or the end, say. Then we ask humans who are familiar with those texts to evaluate if they think there is any fake text, and if so where it comes in.”

Quinn and I exchanged glances, or we would have had we not been on Zoom where looking someone in the eye involves looking away from their face to your webcam. This would be a way to involve the rest of the Data-Sitters, perhaps in a future book: use different models to generate a number of plausible chunks of BSC-esque prose and ask our fellow club members to evaluate which sounded the most real. Perhaps my dream about literary domain expertise actually coming in handy wasn’t such a ridiculous idea after all.

Six months after that

Quinn

It’s been a year since we started writing this book. Or maybe on some level, we’re still frozen in March. Maybe today is just March 337th in Covid Standard Time. I need a break. There is no way to take a break, at least, not while conscious. We’re reaching the point where language for talking about this is breaking down. “Burned-out” was a while ago. The metaphor crumbles in the pandemic: when a candle burns out, it stops burning. But the sun keeps rising, the days keep happening, and we continue staggering along.

In short, laughter is harder to come by, and we need it more than ever. (I mean, actually what we need is a break – I’ve had vivid fantasies of checking into a cheap motel and just spending 24 hours without anyone asking me to make them a sandwich. But that’s not happening. So I’ll take laughter.)

For the longest time, the Data-Sitters Club has been my escape from everything, but the weight of that everything, compounded by the everything coming from the rest of the world (Election! Election results! Holidays-Such-As-They-Were! 16-Day weekend for the kids! Attempted coup! State-wide stay-at-home order with minimal local ICU capacity! Another impeachment! Inauguration! California stay-at-home order lifted even though everything is still awful and there’s nowhere near enough vaccines in the state for health workers and 75+! Giant storms and power outages! GameStop stock breaks the stock market! Waiting to find out what weird and probably awful thing will happen next!) turned this book into something akin to the apple cores my oldest kid has a penchant for stashing underneath the living room bean bags. I know it’s there, but I’m afraid of what I’ll find, so I don’t look – and weeks pass, turning procrastination into dread.

But during another nondifferentiable weekend at home, I finally sat down and reread this book. And I was surprised at how much I enjoyed it. An idea began to form in my head. Maybe I could at least give this book an ending, even as it’s gotten hard to imagine we’ll actually manage to escape this state of suspended animation for our own lives.

The return of text-comparison-algorithm-crazy Quinn

I loved Annie’s idea of generating some Dawn-approved, organic Baby-Sitters Club plant-based meat substitute, made exclusively of text from the books, without the potentially unsavory admixture of GPT-2’s unspecified source websites based on Reddit links. And I was really curious if it could trip up our expert evaluators (AKA my fellow Data-Sitters with encyclopedic knowledge of the series). But this book was already long enough, so it’d be another task for another book.

What I found myself wondering was whether the computer would be able to tell that something was amiss with the text that GPT-2 generated. And furthermore, whether the computer would judge the samples from early on in the training process to be less similar than the later ones.

I’d eventually found a notebook that allowed me to fine-tune GPT-2 using Google’s GPUs for free. Google Colab has some limitations (like timing out after a few hours), but if you’ve used Jupyter notebooks before, it’s a familiar environment, has easy integration with Google Drive, and has a free supply of shopping-contest Kristys with enormous backpacks (GPUs) who can get your job done in no time.

The notebook I modified for the occasion was created by Max Woolf. This notebook makes it easy to load in a text file from Google Drive with the training data, fine-tune GPT-2, and save the resulting model as well as samples from various stages in the training process.

The configuration I used saved sample output from GPT-2 every 10 rounds as it was being retrained using the Baby-Sitters Club corpus, and the model trained for 1000 rounds. Because things are a little different every time you train a model like this, I ran it five times, so we’d have 5 sets of samples from different training attempts. 495 samples is a lot of text to wrangle, so I pared it down to 50: one every 100 rounds.
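For anyone who wants to try something similar, here is roughly what that configuration looks like using Max Woolf's gpt-2-simple package, which his Colab notebook wraps (the filename and run name are placeholders, and your checkpointing choices may differ):

#Roughly the fine-tuning configuration described above, using Max Woolf's
#gpt-2-simple package: retrain the small GPT-2 model on a text file of
#chapter 2s for 1000 rounds, saving sample output every 10 rounds
#(the filename and run_name here are placeholders)
import gpt_2_simple as gpt2

gpt2.download_gpt2(model_name='124M')   #the small, publicly released GPT-2 model

sess = gpt2.start_tf_sess()
gpt2.finetune(sess,
              dataset='bsc_chapter_2s.txt',   #placeholder: the training text file
              model_name='124M',
              steps=1000,          #1000 rounds of fine-tuning
              sample_every=10,     #save sample output every 10 rounds
              save_every=100,      #checkpoint the model every 100 rounds
              run_name='bsc_run1') #use a new run_name for each of the 5 training runs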

I trained the model using just text from chapter 2’s, rather than whole books. The chapter 2’s have a core of predictability that I thought might be easy to get the machine to pick up on, but in addition to the reliable “describe all the characters and the club” prose, there’s usually some plot-based stuff to give the computer something more than tropes. (Want to hear more about chapter 2’s and some other things we’ve done with them? Check out DSC #8: Text-Comparison-Algorithm-Crazy Quinn.)

One of the frustrating things is that you can’t get GPT-2 to generate a whole chapter 2’s worth of continuous text (not even the short chapter 2’s): 1,024 tokens (mostly words, but also counting punctuation) is a hard limit. This has always struck me as an arbitrary annoyance, and I was grateful when Annie dug into it and explained it to me:

Transformer models are great because they are efficient and the training time is much less than that of a typical RNN. However, transformer models sacrifice information about sentence-level word-positioning in order to run every word in a sentence through the network in parallel. This means that in a naked transformer, there is no information about syntax. One way to solve that problem is to associate every word in the sentence with a number indicating its position; this number essentially becomes another feature associated with the word in the word embedding – hence, ‘fixed positional embeddings.’

This also means that the longer the text you ask the model to generate, the less confident the model will be – if the positional embeddings only ever reached, say, 1000, then when you ask the model to generate a word at position 5,000, it has very little information to go on. Transformer models in general do much better when asked to generate shorter pieces of human-like text (e.g. chatbots). Here, it works extremely well for generating human/BSC-like conversation, but you are limited in length.

So there you have it. 1,024 tokens might be enough to get you a chapter 15’s worth of text, but chapter 2’s tend to be on the long side:

Book and chapter 2 length
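If you want to know exactly how far over the limit a given chapter runs, the GPT-2 tokenizer will tell you. A sketch using the Hugging Face transformers tokenizer (the filename is one of the chapter 2 files from the corpus; any of them would do):

#Counting GPT-2 tokens (not words) for one chapter 2, to see how far past
#the 1,024-token limit it runs; assumes the transformers library is installed
from transformers import GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained('gpt2')
print(tokenizer.model_max_length)   #1024 for GPT-2

with open('001c_kristys_great_idea_2.txt', 'r') as f:
    chapter = f.read()

tokens = tokenizer.encode(chapter)
print(str(len(tokens)) + ' GPT-2 tokens, versus a limit of ' + str(tokenizer.model_max_length))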

Using word frequencies rather than word counts is important in order to accommodate differences in length, but I didn’t think that was enough. Some of these chapter 2’s are significantly longer than 1,024 tokens (the word count viz above counts words, not tokens, so all the punctuation would increase the GPT-2 relevant length even more) – and more total words means that even subjectively prominent words in the chapter will have a lower frequency. What I wanted was 50 random GPT-2 sized chunks of real chapter 2’s.

My first thought was to tokenize the text file (separate out words and punctuation) with all the chapter 2’s that I’d used to fine-tune GPT-2, and then grab some chunks of the right length. But then I realized that approach would probably leave me with some weird chimeras featuring the end of one chapter 2, and the beginning of the next one: there was no way to tell where one stopped and the next began.

Instead, I took the individual chapter 2 text files from my laptop, tokenized those, and fed them all into a Python dictionary that also included the source file (so we know where the sample comes from) and the total number of tokens (which I needed in order to know where to start my possible random chunks). The tokenizer returns the tokens in a list… and that list is inside a dictionary… and I’ve got another list containing all the dictionaries, so we’re creating a kind of Russian nesting doll of data.

#OS is for changing directories and listing files in those directories
import os
#NLTK tokenize is for tokenizing the text
from nltk.tokenize import word_tokenize
#Random is for generating random numbers
import random
#Here's the folder with all my chapter 2 files
ch2directory = '/Users/qad/Documents/dsc_chapters/ch2'
#Change to that directory
os.chdir(ch2directory)
#Create a list for the results
ch2examples = []
#For the files in the directory with all my ch. 2 files...
for file in os.listdir(ch2directory):
    #If the file is a .txt file
    if file.endswith('.txt'):
        #Open that file
        with open(file, 'r') as f:
            #Read the file
            ch2text = f.read()
            #Tokenize the file
            ch2tokenized = word_tokenize(ch2text)
            #Count the length of the tokenized file
            ch2length = len(ch2tokenized)
            #Create a dictionary with the filename, tokenized length, & text
            ch2data = {'book': file, 'length': ch2length, 'text': ch2tokenized}
            #Add that data to the list
            ch2examples.append(ch2data)
            #Write out the filename and the length
            print(ch2data['book'] + ' - ' + str(ch2data['length']))
008c_boy_crazy_stacey_2.txt - 1907
009c_the_ghost_at_dawns_house_2.txt - 2007
123c_claudias_big_party_2.txt - 3440
002c_claudia_and_the_phantom_phone_calls_2.txt - 2528
m31c_mary_anne_and_the_music_box_secret_2.txt - 2487
m11c_claudia_and_the_mystery_at_the_museum_2.txt - 2869
m07c_dawn_and_the_disappearing_dogs_2.txt - 2940
014c_hello_mallory_2.txt - 2229
049c_claudia_and_the_genius_of_elm_street_2.txt - 2751
083c_stacey_vs_the_bsc_2.txt - 2823
serr3c_shannons_story_2.txt - 2117
005c_dawn_and_the_impossible_three_2.txt - 2569
001c_kristys_great_idea_2.txt - 1625
012c_claudia_and_the_new_girl_2.txt - 2373
099c_staceys_broken_heart_2.txt - 5450
m13c_mary_anne_and_the_library_mystery_2.txt - 3020
m12c_dawn_and_the_surfer_ghost_2.txt - 2682
031c_dawns_wicked_stepsister_2.txt - 2942
055c_jessis_gold_medal_2.txt - 3841
013c_goodbye_stacey_goodbye_2.txt - 2031
112c_kristy_and_the_sister_war_2.txt - 3044
131c_the_fire_at_mary_annes_house_2.txt - 2550
070c_stacey_and_the_cheerleaders_2.txt - 3325
066c_maid_mary_anne_2.txt - 4179
m33c_stacey_and_the_stolen_hearts_2.txt - 2894
063c_claudias_freind_friend_2.txt - 4146
059c_mallory_hates_boys_and_gym_2.txt - 4218
024c_kristy_and_the_mothers_day_surprise_2.txt - 2850
m24c_mary_anne_and_the_silent_witness_2.txt - 2998
091c_claudia_and_the_first_thanksgiving_2.txt - 3667
117c_claudia_and_the_terrible_truth_2.txt - 2937
088c_farewell_dawn_2.txt - 4471
093c_mary_anne_and_the_memory_garden_2.txt - 2706
060c_mary_annes_makeover_2.txt - 3919
004c_mary_anne_saves_the_day_2.txt - 2293
007c_claudia_and_mean_jeanine_2.txt - 2237
120c_mary_anne_and_the_playground_fight_2.txt - 3832
m32c_claudia_and_the_mystery_in_the_painting_2.txt - 3706
010c_logan_likes_mary_anne_2.txt - 1795
061c_jessi_and_the_awful_secret_2.txt - 4179
111c_staceys_secret_friend_2.txt - 3549
m09c_kristy_and_the_haunted_mansion_2.txt - 2850
118c_kristy_thomas_dog_trainer_2.txt - 3870
m10c_stacey_and_the_mystery_money_2.txt - 3398
048c_jessis_wish_2.txt - 3268
114c_the_secret_life_of_mary_anne_spier_2.txt - 3573
090c_welcome_to_the_bsc_abby_2.txt - 3721
116c_abby_and_the_best_kid_ever_2.txt - 4046
094c_stacey_mcgill_super_sitter_2.txt - 4677
038c_kristys_mystery_admirer_2.txt - 2973
054c_mallory_and_the_dream_horse_2.txt - 2954
075c_jessis_horrible_prank_2.txt - 1706
003c_the_truth_about_stacey_2.txt - 2931
042c_jessi_and_the_dance_school_phantom_2.txt - 3804
053c_kristy_for_president_2.txt - 3424
081c_kristy_and_mr_mom_2.txt - 3851
m26c_dawn_schafer_undercover_babysitter_2.txt - 2763
068c_jessi_and_the_bad_babysitter_2.txt - 4183
115c_jessis_big_break_2.txt - 3736
040c_claudia_and_the_middle_school_mystery_2.txt - 2646
027c_jessi_and_the_superbrat_2.txt - 2265
m22c_stacey_and_the_haunted_masquerade_2.txt - 3276
011c_kristy_and_the_snobs_2.txt - 2168
m20c_mary_anne_and_the_zoo_mystery_2.txt - 3005
026c_claudia_and_the_sad_goodbye_2.txt - 2621
064c_dawns_family_feud_2.txt - 2428
m35c_abby_and_the_notorius_neighbor_2.txt - 3539
028c_welcome_back_stacey_2.txt - 2226
126c_the_all_new_mallory_pike_2.txt - 2961
022c_jessi_ramsey_petsitter_2.txt - 2944
m30c_kristy_and_the_mystery_train_2.txt - 4597
016c_jessis_secret_language_2.txt - 3152
092c_mallorys_christmas_wish_2.txt - 3594
serr2c_logan_bruno_boy_babysitter_2.txt - 3454
043c_staceys_emergency_2.txt - 2637
067c_dawns_big_move_2.txt - 2979
071c_claudia_and_the_perfect_boy_2.txt - 4205
109c_mary_anne_to_the_rescue_2.txt - 4435
130c_staceys_movie_2.txt - 3684
m19c_kristy_and_the_missing_fortune_2.txt - 3354
086c_mary_anne_and_camp_bsc_2.txt - 3360
096c_abbys_lucky_thirteen_2.txt - 1879
m18c_stacey_and_the_mystery_at_the_empty_house_2.txt - 2704
113c_claudia_makes_up_her_mind_2.txt - 3582
021c_mallory_and_the_trouble_with_twins_2.txt - 2684
047c_mallory_on_strike_2.txt - 2397
102c_mary_anne_and_the_little_princess_2.txt - 4246
m27c_claudia_and_the_lighthouse_ghost_2.txt - 4394
025c_mary_anne_and_the_search_for_tigger_2.txt - 2559
m29c_stacey_and_the_fashion_victim_2.txt - 2983
m02c_beware_dawn_2.txt - 2508
069c_get_well_soon_mallory_2.txt - 2803
057c_dawn_saves_the_planet_2.txt - 2003
097c_claudia_and_the_worlds_cutest_baby_2.txt - 3513
039c_poor_mallory_2.txt - 3437
127c_abbys_un_valentine_2.txt - 4968
065c_staceys_big_crush_2.txt - 3799
023c_dawn_on_the_coast_2.txt - 2488
087c_stacey_and_the_bad_girls_2.txt - 2866
029c_mallory_and_the_mystery_diary_2.txt - 2061
062c_kristy_and_the_worst_kid_ever_2.txt - 3129
072c_dawn_and_the_we_heart_kids_club_2.txt - 2966
073c_mary_anne_and_miss_priss_2.txt - 3335
033c_claudia_and_the_great_search_2.txt - 3580
125c_mary_anne_in_the_middle_2.txt - 3857
074c_kristy_and_the_copycat_2.txt - 4016
018c_staceys_mistake_2.txt - 2526
015c_little_miss_stoneybrook_and_dawn_2.txt - 2331
119c_staceys_ex_boyfriend_2.txt - 3106
100c_kristys_worst_idea_2.txt - 4478
m17c_dawn_and_the_halloween_mystery_2.txt - 2631
124c_stacey_mcgill_matchmaker_2.txt - 4103
106c_claudia_queen_of_the_seventh_grade_2.txt - 3976
107c_mind_your_own_business_kristy_2.txt - 3828
058c_staceys_choice_2.txt - 4371
m01c_stacey_and_the_mystery_ring_2.txt - 2595
076c_staceys_lie_2.txt - 3616
122c_kristy_in_charge_2.txt - 3355
103c_happy_holidays_jessi_2.txt - 4008
121c_abby_in_wonderland_2.txt - 3898
m15c_kristy_and_the_vampires_2.txt - 3059
serr1c_logans_story_2.txt - 2725
050c_dawns_big_date_2.txt - 3460
044c_dawn_and_the_big_sleepover_2.txt - 2567
056c_keep_out_claudia_2.txt - 4055
m05c_mary_anne_and_the_secret_in_the_attic_2.txt - 2647
045c_kristy_and_the_baby_parade_2.txt - 2828
077c_dwn_and_whitney_friends_forever_2.txt - 3179
108c_dont_give_up_mallory_2.txt - 3905
m36c_kristy_and_the_cat_burglar_2.txt - 2941
m25c_kristy_and_the_middle_school_vandal_2.txt - 1840
m23c_abby_and_the_secret_society_2.txt - 3059
079c_mary_anne_breaks_the_rules_2.txt - 2806
m34c_mary_anne_and_the_haunted_bookstore_2.txt - 4599
030c_mary_anne_and_the_great_romance_2.txt - 3103
078c_claudia_and_crazy_peaches_2.txt - 3462
m03c_mallory_and_the_ghost_cat_2.txt - 3057
110c_abby_and_the_bad_sport_2.txt - 2711
m04c_kristy_and_the_missing_child_2.txt - 3027
082c_jessi_and_the_troublemaker_2.txt - 4286
080c_mallory_pike_no_1_fan_2.txt - 3724
051c_staceys_ex_best_friend_2.txt - 3816
095c_kristy_plus_bart_equals_questionmark_2.txt - 2082
085c_claudia_kishi_live_from_wsto_2.txt - 3552
052c_mary_anne_plus_too_many_babies_2.txt - 3396
017c_mary_annes_bad_luck_mystery_2.txt - 2984
105c_stacey_the_math_whiz_2.txt - 3672
089c_kristy_and_the_dirty_diapers_2.txt - 3341
041c_mary_anne_vs_logan_2.txt - 3649
m06c_the_mystery_at_claudias_house_2.txt - 2560
101c_claudia_kishi_middle_school_dropout_2.txt - 2865
m16c_claudia_and_the_clue_in_the_photograph_2.txt - 3646
046c_mary_anne_misses_logan_2.txt - 2911
m14c_stacey_and_the_mystery_at_the_mall_2.txt - 2815
084c_dawn_and_the_school_spirit_war_2.txt - 3973
m28c_abby_and_the_mystery_baby_2.txt - 2675
037c_dawn_and_the_older_boy_2.txt - 2565
128c_claudia_and_the_little_liar_2.txt - 3523
104c_abbys_twin_2.txt - 4159
m21c_claudia_and_the_recipe_for_danger_2.txt - 3138
006c_kristys_big_day_2.txt - 2726
m08c_jessi_and_the_jewel_thieves_2.txt - 3372
032c_kristy_and_the_secret_of_susan_2.txt - 3121
034c_mary_anne_and_too_many_boys_2.txt - 2774
129c_kristy_at_bat_2.txt - 3005
035c_jessis_babysitter_2.txt - 2839
098c_dawn_and_too_many_sitters_2.txt - 2686

The next piece I worked through step-by-step, with plenty of Googling to make sure I was getting the syntax right. First, I looked up how to get a random integer in a particular range. I needed it to be between 0 (Python starts counting with 0) and one less than the length of the list of chapter 2’s, since random.randint includes both endpoints.

I knew that putting the positional number of a list item in brackets after the name of the list (like ch2examples[42]) would give you that particular item, but I wasn’t sure how to get the value of a specific dictionary key (like length), so I looked that up. I knew how to do basic arithmetic in Python, so I subtracted 1,024 from the chapter length to get the largest number that could be the starting point for our sample, to make sure the result would be a full 1,024 tokens. Once I got another random integer for the start point, I needed to figure out how to join a range of items in a list, so I Googled that too. Then I tried a couple of different conventions for naming the output files, and settled on samplename = 'sample_'+ch2name.replace('.txt','_')+str(start)+'.txt' so we could see (and sort by) the book each sample came from, and also have a (hopefully) unique ID with the start index, in case there end up being multiple samples from the same book.

#Random number between 0 and one less than the length of the list of chapter 2's
#(randint includes both endpoints, so len(ch2examples) itself would be out of range)
number = random.randint(0, len(ch2examples) - 1)
#Define ch2name as the book name corresponding to the random number selected
ch2name = ch2examples[number]['book']
#Define maxstart as the length of the chosen chapter 2, minus 1024
maxstart = ch2examples[number]['length'] - 1024
#Define start as a random number between 0 and the maxstart
start = random.randint(0, maxstart)
#Define end as start + 1024
end = start + 1024
#Define sample as all the text between the start and end defined above
sample = ' '.join(ch2examples[number]['text'][start : end])
#Define the filename for the output random selection
samplename = 'sample_'+ch2name.replace('.txt','_')+str(start)+'.txt'

Once I got all that working, I looked up the syntax for doing a loop a specific number of times, and ran the code cell. 50 GPT-2-sized samples suddenly appeared on my laptop. And you know what? That felt really good.

#Iterates 50 times
for i in range(50):
    #All the below is the same as above, just iterated 50 times
    number = random.randint(0, len(ch2examples) - 1)
    ch2name = ch2examples[number]['book']
    maxstart = ch2examples[number]['length'] - 1024
    start = random.randint(0, maxstart)
    end = start + 1024
    sample = ' '.join(ch2examples[number]['text'][start : end])
    samplename = 'sample_'+ch2name.replace('.txt','_')+str(start)+'.txt'
    #Creates and opens an output file with the filename for the output random selection
    with open(samplename, 'w') as samplefile:
        #Writes the random selection to the output file
        samplefile.write(sample)
From child wrangling to data wrangling

There was more data wrangling to do before I could pull out my code from DSC #8: Text-Comparison-Algorithm-Crazy Quinn. (Of course there was more data wrangling. This is DH. Is there ever NOT more data wrangling before we can get to the fun part?)

I downloaded the GPT-2 samples from Google Colab for each run in a zip file, and they didn’t have a file extension. So I wrote a little Python to add .txt plus a prefix indicating which run it was from.

#Define directory for a single run's samples from the Google Colab notebook
runfiles = '/Users/qad/Downloads/ch2samples_run5'
#Change to that directory
os.chdir(runfiles)
#For each file in that directory...
for file in os.listdir(runfiles):
    #Prefix it with the run number, and suffix with the file extension
    os.rename(file, 'run5_' + file + '.txt')

Then I put all the GPT-2 samples together in the same folder and did a little more cleanup. I tokenized the text so that punctuation would be separated from words, just like in our GPT-2-length samples, and when I joined the tokens back together, I left off the first four tokens, which were always the ======== SAMPLE 1 ======== header that starts every sample.

#Define directory with all the GPT-2 samples
gpt2files = '/Users/qad/Documents/dsc_ch2_gpt2'
#Change to that directory
os.chdir(gpt2files)
#For each file in that directory
for file in os.listdir(gpt2files):
    #If it's a text file
    if file.endswith('.txt'):
        #Open the file
        with open(file, 'r') as gpt2file:
            #Read the file
            gpt2text = gpt2file.read()
            #Tokenize the file
            gpt2tokenized = word_tokenize(gpt2text)
            #Join everything after the first four tokens (the ======== SAMPLE 1 ======== header)
            gpt2clean = ' '.join(gpt2tokenized[4:])
            #Define output name by adding -c before .txt
            gpt2outname = file.replace('.txt', '-c.txt')
            #Open the output file name
            with open(gpt2outname, 'w') as gpt2out:
                #Write out the text to the output file name
                gpt2out.write(gpt2clean)

With that, I now had 50 GPT-2 samples from different training phases, and 50 GPT-2-sized samples from real chapter 2’s. Time to dig up my code from DSC #8: Text-Comparison-Algorithm-Crazy Quinn and rerun it on this new little corpus!

I decided to go with cosine similarity of term frequencies. I held my breath, crossed my fingers, and…

Failed text comparison 🤬🤬🤬

There should be exactly one black line running along the diagonal, where each text is being compared to itself. But instead, there are six additional smaller black lines, where some of the GPT-2 samples are being compared with one another. Somehow, only “run 1” and “run 2” were distinct: runs 3-5 were just copies of 2. Either GPT-2 managed to generate an identical set of samples on multiple different training runs, which is so astronomically unlikely that I should immediately run out to buy a lottery ticket and capitalize on my good luck… or failing to totally reset the notebook between runs meant it invisibly hung onto some variables, even though it looked like it was doing a new training run. (This happens sometimes with Jupyter notebooks, and that lack of transparency is one of the criticisms of notebooks.)
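If you’re calling Max Woolf’s gpt-2-simple package yourself (which is what the notebook wraps), one way to guard against this, besides religiously hitting “Restart runtime” between runs, is to explicitly reset the TensorFlow session before each training run and give every run its own run_name so checkpoints can’t silently carry over. Here’s a rough sketch, with placeholder file and run names:

#A rough sketch of keeping gpt-2-simple training runs separate; names are placeholders
import gpt_2_simple as gpt2
sess = gpt2.start_tf_sess()
for run in ['run1', 'run2', 'run3', 'run4', 'run5']:
    #Throw away the old graph and variables so nothing lingers from the previous run
    sess = gpt2.reset_session(sess)
    gpt2.finetune(sess,
                  dataset='ch2_corpus.txt',  #placeholder training file
                  model_name='124M',
                  steps=1000,
                  sample_every=10,
                  restore_from='fresh',      #start from the base model, not an old checkpoint
                  run_name=run)              #a separate checkpoint folder for each run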

Since I needed to redo the runs and re-gather my files anyway, I realized I should make another change: the files get sorted alphabetically, and I want the sample numbers (not the run numbers) to cluster together, so they’re grouped by how far along they are in training. Time to revise the naming convention to something like samples-11_run1.txt.

#Define directory for a single run's samples from the Google Colab notebook
runfiles = '/Users/qad/Downloads/ch2samples_run5'
#Change to that directory
os.chdir(runfiles)
#For each file in that directory...
for file in os.listdir(runfiles):
    #This time put the run number after the original name, and suffix with the file extension
    os.rename(file, file + '_run5.txt')

And then re-run the comparison code.

#Imports the TF-IDF vectorizer from scikit-learn, which we'll use for relative word frequencies
from sklearn.feature_extraction.text import TfidfVectorizer
#Glob is used for finding path names
import glob
#We need these to format the data correctly
from scipy.spatial.distance import pdist, squareform
#In case you're starting to run the code just at this point, we'll need os again
import os
import numpy as np
#In case you're starting to run the code just at this point, we'll need pandas again
import pandas as pd
pd.set_option("display.max_rows", None, "display.max_columns", None)
#Import matplotlib
import matplotlib.pyplot as plt
#Import seaborn
import seaborn as sns
#Defines the directory with both original chapter 2 samples and GPT-2 samples
filedir = '/Users/qad/Documents/dsc9_gpt2comparison'
#Change to that directory
os.chdir(filedir)

# Use the glob library to create a list of file names, sorted alphabetically
# Alphabetical sorting groups the GPT-2 samples by sample number, then lists the real chapter 2 chunks
filenames = sorted(glob.glob("*.txt"))
# Parse those filenames to create a list of file keys (ID numbers)
# You'll use these later on.
filekeys = [f.split('/')[-1].split('.')[0] for f in filenames]

# Create a TfidfVectorizer instance configured for plain relative word frequencies
# (no IDF weighting, L1 normalization, top 1,000 words)
freqvectorizer = TfidfVectorizer(input="filename", stop_words=None, use_idf=False, norm='l1', max_features=1000)
# Run the vectorizer on your list of filenames to create your word frequencies
# Use the toarray() function so that SciPy will accept the results
wordfreqs = freqvectorizer.fit_transform(filenames).toarray()
# Compute pairwise cosine distances and label the rows and columns with the file keys
cosine_distances_freq = pd.DataFrame(squareform(pdist(wordfreqs, metric='cosine')), index=filekeys, columns=filekeys)
cosine_distances_freq
[Output: cosine_distances_freq, a 100 × 100 pandas DataFrame of pairwise cosine distances. The rows and columns are labeled with the 50 GPT-2 samples (gpt2-011_run1-c through gpt2-901_run5-c) followed by the 50 real chapter 2 chunks (sample_004c_mary_anne_saves_the_day_2_240 through sample_m36c_kristy_and_the_cat_burglar_2_748); each text has a distance of 0.000000 from itself along the diagonal, and the off-diagonal distances mostly fall between roughly 0.12 and 0.55.]
gpt2-801_run4-c 0.374786 0.209606 0.464251 0.403829 0.366463 0.175273 0.274094 0.364647 0.256859 0.225372 0.344472 0.188918 0.226601 0.271681 0.241646 0.343609 0.236269 0.267089 0.240829 0.354165 0.403772 0.349646 0.357754 0.214120 0.189683 0.166772 0.168795 0.214815 0.171697 0.231241 0.231708 0.160180 0.205899 0.206022 0.159834 0.273072 0.230003 0.318234 0.271996 0.211559 0.152975 0.200331 0.307172 0.000000 0.182934 0.242128 0.262839 0.497899 0.223731 0.217186 0.360551 0.283828 0.257174 0.338305 0.310437 0.358662 0.269917 0.285533 0.203272 0.150871 0.230520 0.166182 0.157452 0.156208 0.189129 0.252550 0.219811 0.197241 0.184033 0.248996 0.269806 0.220288 0.195603 0.171846 0.209133 0.190409 0.154821 0.167608 0.146000 0.178354 0.191581 0.172671 0.168296 0.230123 0.231323 0.164046 0.293325 0.206181 0.146369 0.186282 0.155814 0.196832 0.176203 0.156279 0.198405 0.180978 0.396068 0.279118 0.188083 0.429252
gpt2-801_run5-c 0.286393 0.221706 0.419932 0.315979 0.309137 0.156987 0.268366 0.268270 0.334942 0.175754 0.263312 0.214370 0.213211 0.188685 0.237632 0.292569 0.249411 0.178942 0.148938 0.262587 0.348605 0.367109 0.328787 0.184751 0.227290 0.235470 0.191687 0.193762 0.184720 0.208496 0.228358 0.192110 0.226948 0.214661 0.195740 0.197807 0.208689 0.258502 0.248472 0.219547 0.155224 0.166640 0.272619 0.182934 0.000000 0.212824 0.211038 0.429266 0.222800 0.170889 0.324444 0.269520 0.253561 0.310073 0.290692 0.307367 0.259580 0.211248 0.178880 0.179705 0.223349 0.210768 0.215053 0.237727 0.266678 0.252763 0.191503 0.211837 0.150830 0.187163 0.278895 0.191114 0.186327 0.203512 0.211084 0.213446 0.175035 0.182522 0.217410 0.185067 0.200593 0.190330 0.210553 0.182045 0.179711 0.196004 0.219222 0.257096 0.197237 0.166660 0.190499 0.151982 0.156908 0.154978 0.188212 0.178828 0.348232 0.219264 0.255988 0.400739
gpt2-901_run1-c 0.234626 0.312848 0.364835 0.286606 0.264687 0.224922 0.160309 0.293015 0.246518 0.191456 0.202586 0.318403 0.162731 0.177831 0.274233 0.299563 0.164662 0.199861 0.144864 0.275844 0.283022 0.318627 0.254298 0.213683 0.241585 0.308254 0.223638 0.289963 0.276407 0.166613 0.235465 0.260781 0.297792 0.182478 0.253557 0.144674 0.165414 0.248406 0.219808 0.206524 0.178460 0.180123 0.276982 0.242128 0.212824 0.000000 0.187923 0.333436 0.217980 0.192427 0.288206 0.302327 0.244112 0.199453 0.226658 0.199010 0.221431 0.257021 0.249738 0.289848 0.181991 0.237351 0.271521 0.273975 0.351256 0.354846 0.312927 0.256708 0.255525 0.225829 0.217863 0.181715 0.270638 0.227842 0.265812 0.219965 0.268089 0.185622 0.292247 0.174343 0.271896 0.218306 0.223134 0.224044 0.307695 0.261418 0.249866 0.336785 0.306663 0.213631 0.311188 0.277772 0.222447 0.227099 0.251113 0.268315 0.232059 0.253543 0.381469 0.324460
gpt2-901_run2-c 0.240313 0.237393 0.402296 0.330455 0.235073 0.222297 0.229181 0.349461 0.303624 0.219964 0.233089 0.326290 0.181667 0.203820 0.273209 0.316787 0.202934 0.199244 0.188558 0.268987 0.230402 0.344174 0.293242 0.178079 0.185673 0.303510 0.194286 0.216485 0.230876 0.149472 0.321466 0.286798 0.285430 0.201832 0.252338 0.183205 0.244331 0.272590 0.219045 0.237229 0.178367 0.179041 0.337843 0.262839 0.211038 0.187923 0.000000 0.407318 0.244667 0.171943 0.260296 0.283267 0.262511 0.222332 0.235659 0.223877 0.216047 0.271723 0.219718 0.275147 0.179498 0.213690 0.304598 0.288495 0.358117 0.209037 0.259993 0.185561 0.258230 0.218937 0.263889 0.167319 0.201045 0.257333 0.188007 0.234835 0.232292 0.213341 0.306278 0.183605 0.284441 0.189126 0.251047 0.172933 0.216004 0.263166 0.236746 0.221800 0.301406 0.224405 0.280486 0.236669 0.240879 0.203903 0.188062 0.213909 0.242713 0.185186 0.368126 0.301268
gpt2-901_run3-c 0.354351 0.479423 0.430473 0.402222 0.386210 0.413528 0.327145 0.445145 0.375283 0.395328 0.374843 0.491080 0.356954 0.401965 0.407790 0.366560 0.365038 0.386410 0.390072 0.388131 0.414374 0.353900 0.400433 0.365026 0.476713 0.468625 0.462317 0.432906 0.493388 0.371186 0.445305 0.517321 0.499938 0.352470 0.452177 0.381138 0.355674 0.333938 0.386207 0.407188 0.420979 0.449663 0.402270 0.497899 0.429266 0.333436 0.407318 0.000000 0.359791 0.368614 0.320637 0.388717 0.340356 0.355542 0.407465 0.309542 0.450396 0.390565 0.416700 0.494146 0.404447 0.457868 0.478581 0.466786 0.513154 0.472160 0.468722 0.415303 0.458658 0.385871 0.395865 0.371222 0.461608 0.398984 0.430699 0.406419 0.480444 0.425719 0.541717 0.426473 0.402392 0.444895 0.460230 0.419397 0.475494 0.472944 0.398819 0.514414 0.508227 0.389361 0.511953 0.457658 0.459760 0.417698 0.392375 0.431608 0.362717 0.387444 0.549685 0.447857
gpt2-901_run4-c 0.327190 0.222347 0.417648 0.374005 0.310752 0.208840 0.247240 0.345519 0.256403 0.205823 0.283307 0.237561 0.230150 0.256345 0.224335 0.275166 0.202121 0.251388 0.222364 0.317515 0.350096 0.313634 0.276117 0.187768 0.238711 0.252339 0.187413 0.228944 0.236836 0.243971 0.257219 0.210599 0.227997 0.218529 0.191303 0.241580 0.222857 0.288953 0.258222 0.204811 0.208175 0.194443 0.302964 0.223731 0.222800 0.217980 0.244667 0.359791 0.000000 0.177948 0.289165 0.254212 0.222515 0.289890 0.278475 0.266050 0.254775 0.280900 0.194252 0.198449 0.213758 0.195433 0.235946 0.234322 0.259306 0.261035 0.238731 0.169374 0.208214 0.264387 0.225119 0.203093 0.197007 0.174836 0.219965 0.192322 0.217258 0.208197 0.246683 0.202142 0.248217 0.172484 0.208641 0.189993 0.254597 0.187872 0.256415 0.258299 0.219916 0.219889 0.231583 0.249003 0.192675 0.202628 0.211534 0.195437 0.313639 0.228077 0.268177 0.405182
gpt2-901_run5-c 0.216331 0.208101 0.342530 0.302824 0.227955 0.198130 0.212313 0.241440 0.255508 0.169004 0.180663 0.239448 0.206317 0.154417 0.173679 0.236142 0.162299 0.161526 0.145527 0.209236 0.298944 0.329483 0.271864 0.134525 0.171434 0.262892 0.177227 0.153015 0.211407 0.175327 0.203911 0.241508 0.249323 0.160203 0.185500 0.217933 0.195137 0.186849 0.200785 0.198231 0.168330 0.165345 0.227786 0.217186 0.170889 0.192427 0.171943 0.368614 0.177948 0.000000 0.235000 0.271682 0.231975 0.228926 0.213926 0.208171 0.215851 0.186213 0.164278 0.222694 0.192730 0.182181 0.248884 0.228045 0.303899 0.220258 0.200089 0.163735 0.210624 0.152170 0.237279 0.149214 0.171276 0.217201 0.186402 0.159964 0.213529 0.143108 0.262807 0.177661 0.233897 0.183702 0.231713 0.168534 0.213065 0.207913 0.177494 0.258578 0.242795 0.146347 0.221393 0.195886 0.207594 0.159029 0.190202 0.165003 0.270179 0.155235 0.316415 0.313934
sample_004c_mary_anne_saves_the_day_2_240 0.280657 0.379596 0.367020 0.348291 0.217697 0.324630 0.275674 0.367748 0.333305 0.356710 0.288904 0.452188 0.303951 0.313905 0.341205 0.301950 0.285456 0.314631 0.286870 0.276893 0.281618 0.330729 0.313532 0.229984 0.365503 0.396445 0.360888 0.307319 0.374651 0.272818 0.366643 0.404759 0.387409 0.280403 0.374819 0.315722 0.297275 0.259706 0.257581 0.309407 0.305001 0.357331 0.317786 0.360551 0.324444 0.288206 0.260296 0.320637 0.289165 0.235000 0.000000 0.268499 0.255810 0.270275 0.285165 0.171074 0.294211 0.277260 0.308166 0.384755 0.239645 0.298510 0.411198 0.381277 0.433020 0.290027 0.346295 0.295780 0.335869 0.290874 0.316014 0.276514 0.365416 0.371944 0.296114 0.326739 0.404536 0.329336 0.460550 0.318263 0.334569 0.323552 0.384144 0.281450 0.344256 0.382163 0.325300 0.401883 0.417434 0.281952 0.394283 0.319447 0.382390 0.330806 0.311540 0.318459 0.220285 0.318535 0.452010 0.302063
sample_005c_dawn_and_the_impossible_three_2_216 0.371659 0.240673 0.443260 0.474338 0.320843 0.257983 0.319217 0.418557 0.297414 0.287915 0.352236 0.273121 0.272714 0.346299 0.297353 0.279271 0.277727 0.293705 0.281320 0.338815 0.323173 0.299858 0.367350 0.212464 0.282006 0.284031 0.259242 0.276649 0.273141 0.288896 0.347286 0.271107 0.222340 0.245235 0.266522 0.315476 0.329908 0.291549 0.310542 0.287184 0.255395 0.317210 0.358769 0.283828 0.269520 0.302327 0.283267 0.388717 0.254212 0.271682 0.268499 0.000000 0.052790 0.309302 0.292579 0.299281 0.332443 0.294242 0.265105 0.215072 0.274157 0.229004 0.260868 0.250670 0.259378 0.284129 0.284419 0.253127 0.209958 0.287796 0.293789 0.275524 0.255936 0.216288 0.240871 0.284401 0.261370 0.296869 0.320477 0.221106 0.248477 0.247132 0.204114 0.268272 0.243217 0.271692 0.348526 0.242301 0.263124 0.251848 0.264521 0.224191 0.302440 0.236946 0.226992 0.221539 0.375579 0.313103 0.271767 0.418210
sample_005c_dawn_and_the_impossible_three_2_432 0.330236 0.238559 0.401664 0.436451 0.321541 0.224650 0.275396 0.365369 0.244740 0.259159 0.292879 0.259845 0.253522 0.285139 0.243086 0.291672 0.220482 0.249149 0.247848 0.316087 0.338206 0.287269 0.341737 0.191172 0.263806 0.280405 0.236345 0.238539 0.259894 0.260255 0.271423 0.261012 0.210603 0.194667 0.239192 0.268307 0.285761 0.229235 0.290244 0.261805 0.231613 0.284169 0.306917 0.257174 0.253561 0.244112 0.262511 0.340356 0.222515 0.231975 0.255810 0.052790 0.000000 0.276546 0.273493 0.258589 0.321744 0.268327 0.237732 0.219787 0.241577 0.209134 0.256364 0.249644 0.285277 0.282720 0.279958 0.207735 0.222032 0.248317 0.267339 0.217438 0.242245 0.203781 0.212160 0.224275 0.251876 0.241038 0.291395 0.206133 0.215651 0.212879 0.218186 0.264701 0.268556 0.247237 0.305035 0.277190 0.255601 0.233462 0.241262 0.239032 0.258709 0.197176 0.210123 0.206175 0.334223 0.262756 0.286044 0.418201
sample_006c_kristys_big_day_2_321 0.197514 0.325638 0.328233 0.279635 0.229286 0.283472 0.193906 0.307801 0.275493 0.287933 0.251349 0.409947 0.241455 0.221021 0.271003 0.305103 0.208866 0.250046 0.254887 0.244807 0.294265 0.349204 0.286745 0.250799 0.291874 0.324898 0.286885 0.294204 0.352242 0.238828 0.312788 0.385544 0.380962 0.232980 0.298323 0.240504 0.227275 0.257034 0.240755 0.277185 0.259972 0.262285 0.302019 0.338305 0.310073 0.199453 0.222332 0.355542 0.289890 0.228926 0.270275 0.309302 0.276546 0.000000 0.078129 0.199106 0.264423 0.286244 0.284772 0.338618 0.227218 0.289228 0.354889 0.334748 0.404609 0.332290 0.355717 0.282598 0.325219 0.240654 0.238019 0.231377 0.309899 0.308389 0.273480 0.264909 0.345127 0.249470 0.402265 0.268578 0.311851 0.271799 0.327127 0.280244 0.375672 0.340159 0.266494 0.398870 0.362521 0.273328 0.378757 0.351927 0.328639 0.301845 0.284673 0.268312 0.240819 0.258607 0.451808 0.305022
sample_006c_kristys_big_day_2_850 0.230987 0.284376 0.386187 0.337177 0.239322 0.269518 0.210316 0.326482 0.269947 0.299678 0.261266 0.375752 0.248711 0.249303 0.283143 0.288381 0.226260 0.254568 0.249955 0.254808 0.305352 0.370228 0.287630 0.213314 0.261169 0.306749 0.255254 0.269050 0.293151 0.215571 0.323632 0.319644 0.335836 0.237694 0.276840 0.277992 0.257123 0.265716 0.233718 0.256820 0.235380 0.248339 0.301997 0.310437 0.290692 0.226658 0.235659 0.407465 0.278475 0.213926 0.285165 0.292579 0.273493 0.078129 0.000000 0.217213 0.257036 0.270872 0.258456 0.292771 0.213718 0.254880 0.328096 0.302489 0.362907 0.286163 0.318795 0.262978 0.280757 0.255038 0.242472 0.243249 0.267181 0.282618 0.258996 0.259997 0.312108 0.252785 0.343476 0.237037 0.301762 0.246598 0.290863 0.255564 0.337290 0.312680 0.282578 0.334772 0.311881 0.255336 0.342572 0.306804 0.325436 0.269300 0.255233 0.224151 0.294715 0.250677 0.370173 0.302368
sample_013c_goodbye_stacey_goodbye_2_537 0.204898 0.366906 0.356832 0.313504 0.196616 0.293520 0.226996 0.286892 0.266022 0.305125 0.213584 0.441446 0.268880 0.230268 0.303826 0.308299 0.231067 0.235041 0.253246 0.232566 0.304341 0.378303 0.261385 0.208425 0.298303 0.375579 0.326789 0.296611 0.361743 0.223795 0.329460 0.385926 0.415092 0.212123 0.349350 0.259279 0.257876 0.216059 0.201761 0.267108 0.249872 0.300723 0.258664 0.358662 0.307367 0.199010 0.223877 0.309542 0.266050 0.208171 0.171074 0.299281 0.258589 0.199106 0.217213 0.000000 0.251863 0.232975 0.260168 0.378453 0.173231 0.290693 0.391083 0.344440 0.437912 0.298720 0.339882 0.294177 0.320379 0.242971 0.257633 0.240422 0.338723 0.335644 0.288327 0.283126 0.400885 0.277755 0.413014 0.272854 0.296028 0.277180 0.340522 0.303543 0.377187 0.335965 0.266839 0.379464 0.398742 0.274543 0.389034 0.335652 0.339734 0.293924 0.310813 0.286606 0.213513 0.275233 0.441801 0.321225
sample_014c_hello_mallory_2_239 0.311633 0.298099 0.453871 0.381964 0.300452 0.262847 0.267348 0.352152 0.324450 0.276265 0.267494 0.361044 0.243313 0.252362 0.312511 0.328822 0.224546 0.305252 0.242335 0.329581 0.281060 0.363952 0.264891 0.212355 0.246438 0.297794 0.242199 0.296477 0.312044 0.215641 0.317846 0.288665 0.346774 0.253520 0.294045 0.229938 0.243692 0.308326 0.226796 0.235452 0.216885 0.221982 0.325206 0.269917 0.259580 0.221431 0.216047 0.450396 0.254775 0.215851 0.294211 0.332443 0.321744 0.264423 0.257036 0.251863 0.000000 0.301307 0.242793 0.300181 0.197046 0.224631 0.308088 0.294648 0.366695 0.317101 0.297078 0.265896 0.279808 0.268572 0.247181 0.237124 0.265412 0.272385 0.304549 0.260982 0.312882 0.237428 0.326942 0.232873 0.313557 0.256566 0.266609 0.238909 0.319790 0.263161 0.262863 0.301728 0.311689 0.246916 0.358122 0.297848 0.280231 0.250628 0.303274 0.241229 0.256585 0.269369 0.340923 0.352009
sample_024c_kristy_and_the_mothers_day_surprise_2_106 0.234419 0.277485 0.410883 0.291071 0.282402 0.263329 0.299275 0.157034 0.306054 0.205339 0.173031 0.288193 0.293047 0.185458 0.220311 0.338605 0.203204 0.148727 0.175472 0.180012 0.363035 0.419218 0.372777 0.171947 0.267150 0.314289 0.277798 0.175522 0.259996 0.207430 0.166140 0.287716 0.346131 0.185095 0.262504 0.256083 0.240526 0.200317 0.253777 0.234191 0.191070 0.234366 0.185698 0.285533 0.211248 0.257021 0.271723 0.390565 0.280900 0.186213 0.277260 0.294242 0.268327 0.286244 0.270872 0.232975 0.301307 0.000000 0.161381 0.247714 0.243902 0.260339 0.300469 0.266221 0.348263 0.259921 0.281744 0.280628 0.221386 0.144097 0.303400 0.186269 0.245879 0.246009 0.247346 0.191706 0.286086 0.168443 0.321152 0.216522 0.210400 0.244800 0.262974 0.256812 0.264234 0.252755 0.177717 0.310731 0.273523 0.185060 0.234838 0.202919 0.219617 0.199598 0.266821 0.187809 0.363954 0.229382 0.324067 0.383739
sample_025c_mary_anne_and_the_search_for_tigger_2_118 0.288067 0.200363 0.425135 0.328492 0.301236 0.204829 0.277300 0.241942 0.236153 0.197934 0.189096 0.247637 0.238179 0.193511 0.157811 0.276530 0.188718 0.175075 0.196145 0.230733 0.361186 0.360603 0.332242 0.135580 0.167295 0.249680 0.191386 0.163592 0.222138 0.208514 0.218465 0.243321 0.276401 0.175665 0.169377 0.222395 0.243159 0.195712 0.228152 0.225677 0.174764 0.160119 0.256593 0.203272 0.178880 0.249738 0.219718 0.416700 0.194252 0.164278 0.308166 0.265105 0.237732 0.284772 0.258456 0.260168 0.242793 0.161381 0.000000 0.183579 0.212324 0.195163 0.226590 0.204606 0.252440 0.218124 0.239907 0.160504 0.164980 0.176925 0.240643 0.165728 0.143530 0.188091 0.187157 0.140558 0.228534 0.154365 0.248535 0.193348 0.210965 0.148242 0.211738 0.197796 0.234960 0.171757 0.173933 0.233486 0.158020 0.195708 0.225558 0.208108 0.200845 0.145293 0.190934 0.106756 0.351048 0.153071 0.250487 0.394764
sample_027c_jessi_and_the_superbrat_2_159 0.380403 0.146329 0.483243 0.437427 0.357356 0.176555 0.292005 0.354648 0.257339 0.205790 0.352002 0.148040 0.236346 0.283170 0.229143 0.317952 0.238629 0.241034 0.244528 0.339260 0.411781 0.349265 0.367884 0.181546 0.210026 0.177066 0.124097 0.192994 0.153838 0.252202 0.221981 0.129061 0.156172 0.204128 0.140854 0.268954 0.254473 0.325081 0.291640 0.194444 0.161681 0.186640 0.324794 0.150871 0.179705 0.289848 0.275147 0.494146 0.198449 0.222694 0.384755 0.215072 0.219787 0.338618 0.292771 0.378453 0.300181 0.247714 0.183579 0.000000 0.258851 0.128218 0.127720 0.143736 0.132628 0.219560 0.213926 0.179654 0.120754 0.252343 0.272038 0.207513 0.135232 0.108907 0.169718 0.188785 0.124050 0.174569 0.135189 0.170462 0.196413 0.157659 0.130551 0.196361 0.178495 0.136353 0.292459 0.172696 0.109685 0.172260 0.124561 0.163511 0.146942 0.145972 0.166539 0.126650 0.439752 0.272478 0.136127 0.456547
sample_028c_welcome_back_stacey_2_679 0.269793 0.282440 0.437123 0.344655 0.241598 0.187422 0.220240 0.336158 0.287366 0.266287 0.262984 0.347299 0.209528 0.232534 0.303683 0.326962 0.201434 0.245379 0.214009 0.298793 0.272931 0.361210 0.255728 0.170080 0.210392 0.293315 0.226183 0.251909 0.239689 0.182985 0.294361 0.237270 0.287392 0.214885 0.272035 0.207585 0.224095 0.255295 0.201005 0.227640 0.184009 0.216907 0.306474 0.230520 0.223349 0.181991 0.179498 0.404447 0.213758 0.192730 0.239645 0.274157 0.241577 0.227218 0.213718 0.173231 0.197046 0.243902 0.212324 0.258851 0.000000 0.181127 0.265931 0.251910 0.321466 0.222690 0.252042 0.207901 0.231975 0.226324 0.219655 0.204097 0.245208 0.236172 0.237016 0.238061 0.263770 0.234366 0.261553 0.205410 0.233477 0.200817 0.234102 0.236960 0.287585 0.248601 0.266694 0.269642 0.290904 0.216017 0.281935 0.243970 0.234894 0.183040 0.238234 0.205957 0.242848 0.245940 0.286809 0.321243
sample_030c_mary_anne_and_the_great_romance_2_570 0.316090 0.187055 0.414145 0.403461 0.290817 0.164976 0.255951 0.359923 0.261259 0.230653 0.304568 0.211448 0.223148 0.248649 0.246650 0.304949 0.192013 0.257393 0.242571 0.313434 0.322863 0.341159 0.277551 0.169491 0.198968 0.198725 0.133218 0.201539 0.178899 0.223761 0.256256 0.179460 0.214812 0.220965 0.173817 0.210458 0.219941 0.288495 0.248833 0.181946 0.160674 0.180614 0.302360 0.166182 0.210768 0.237351 0.213690 0.457868 0.195433 0.182181 0.298510 0.229004 0.209134 0.289228 0.254880 0.290693 0.224631 0.260339 0.195163 0.128218 0.181127 0.000000 0.139688 0.139739 0.183964 0.196376 0.205725 0.153501 0.145018 0.230777 0.252187 0.194829 0.163477 0.143493 0.167015 0.190842 0.176392 0.177479 0.161888 0.184192 0.218618 0.171162 0.153058 0.200522 0.218001 0.162435 0.274367 0.214139 0.164582 0.177833 0.182672 0.207952 0.157741 0.139682 0.189463 0.141337 0.319382 0.250433 0.180774 0.377648
sample_032c_kristy_and_the_secret_of_susan_2_1152 0.375880 0.212913 0.467484 0.412577 0.355557 0.179304 0.288680 0.411734 0.303998 0.246213 0.383938 0.184061 0.246874 0.314058 0.260620 0.329235 0.254556 0.299726 0.279196 0.352289 0.409687 0.320219 0.371599 0.215520 0.237401 0.158879 0.171073 0.245569 0.157203 0.264348 0.264114 0.155936 0.219471 0.234288 0.175278 0.291594 0.239232 0.352689 0.302512 0.230138 0.189495 0.219609 0.324059 0.157452 0.215053 0.271521 0.304598 0.478581 0.235946 0.248884 0.411198 0.260868 0.256364 0.354889 0.328096 0.391083 0.308088 0.300469 0.226590 0.127720 0.265931 0.139688 0.000000 0.101520 0.091755 0.245489 0.225194 0.215273 0.151133 0.266339 0.286051 0.258375 0.211157 0.126157 0.243261 0.217881 0.147566 0.221173 0.140801 0.200507 0.224682 0.213484 0.126907 0.263550 0.206474 0.156554 0.331578 0.215757 0.159975 0.159925 0.176905 0.187361 0.185599 0.169523 0.192134 0.197734 0.422433 0.339612 0.169390 0.450983
sample_041c_mary_anne_vs_logan_2_1765 0.318881 0.191621 0.414751 0.403293 0.297241 0.199661 0.279979 0.368467 0.266982 0.239297 0.322778 0.199205 0.240810 0.298712 0.230934 0.306131 0.234856 0.261731 0.274563 0.303813 0.435640 0.329584 0.359981 0.190632 0.188206 0.172915 0.162238 0.224234 0.170961 0.249615 0.246346 0.182989 0.258143 0.192747 0.174428 0.311111 0.254256 0.308027 0.265330 0.234491 0.168661 0.223210 0.294083 0.156208 0.237727 0.273975 0.288495 0.466786 0.234322 0.228045 0.381277 0.250670 0.249644 0.334748 0.302489 0.344440 0.294648 0.266221 0.204606 0.143736 0.251910 0.139739 0.101520 0.000000 0.106530 0.231955 0.237841 0.221707 0.176967 0.244731 0.278345 0.242867 0.205630 0.154570 0.228098 0.206468 0.185239 0.213929 0.165999 0.179551 0.215888 0.216380 0.108222 0.274116 0.230955 0.182778 0.311237 0.186261 0.157643 0.146510 0.186559 0.180631 0.220725 0.188194 0.197296 0.177261 0.406587 0.316139 0.178307 0.417879
sample_045c_kristy_and_the_baby_parade_2_767 0.435693 0.216133 0.488166 0.496665 0.372579 0.224301 0.344129 0.464952 0.312304 0.289114 0.425887 0.211871 0.275957 0.392407 0.288599 0.336933 0.305144 0.332834 0.346775 0.371654 0.474229 0.316945 0.408221 0.254553 0.262490 0.183061 0.199712 0.286223 0.189877 0.322966 0.328746 0.174540 0.251208 0.276986 0.196617 0.370009 0.316156 0.379290 0.348217 0.291009 0.227567 0.274650 0.380674 0.189129 0.266678 0.351256 0.358117 0.513154 0.259306 0.303899 0.433020 0.259378 0.285277 0.404609 0.362907 0.437912 0.366695 0.348263 0.252440 0.132628 0.321466 0.183964 0.091755 0.106530 0.000000 0.274094 0.263645 0.248460 0.173421 0.339410 0.317071 0.321288 0.233297 0.164185 0.266303 0.259575 0.175578 0.275911 0.192335 0.226160 0.247503 0.250895 0.130551 0.310067 0.230346 0.194609 0.392354 0.211745 0.155666 0.188782 0.211208 0.197248 0.249123 0.232047 0.210036 0.207769 0.474364 0.378464 0.185814 0.494749
sample_046c_mary_anne_misses_logan_2_488 0.302364 0.254805 0.432448 0.389362 0.267171 0.210380 0.331784 0.393022 0.362049 0.353928 0.363655 0.316003 0.321489 0.295558 0.280201 0.383553 0.297788 0.265224 0.298155 0.305329 0.358694 0.404073 0.356767 0.183936 0.224877 0.283047 0.228976 0.182368 0.157208 0.258531 0.355056 0.294438 0.279829 0.238229 0.262825 0.326662 0.320834 0.317343 0.250765 0.241278 0.216051 0.254077 0.352857 0.252550 0.252763 0.354846 0.209037 0.472160 0.261035 0.220258 0.290027 0.284129 0.282720 0.332290 0.286163 0.298720 0.317101 0.259921 0.218124 0.219560 0.222690 0.196376 0.245489 0.231955 0.274094 0.000000 0.200221 0.170070 0.203400 0.227773 0.326699 0.215530 0.179750 0.237469 0.184357 0.251587 0.215610 0.282233 0.259953 0.249420 0.243494 0.189535 0.253976 0.205812 0.189458 0.243863 0.284927 0.201041 0.261763 0.221160 0.216705 0.181752 0.264881 0.207283 0.180057 0.181537 0.377601 0.231546 0.238845 0.368391
sample_048c_jessis_wish_2_68 0.322477 0.236518 0.410559 0.398056 0.345957 0.188351 0.355839 0.357328 0.335865 0.294853 0.362017 0.253867 0.306316 0.266935 0.269633 0.322326 0.298051 0.243039 0.241776 0.318403 0.428874 0.382756 0.383261 0.204418 0.247132 0.241755 0.221530 0.192086 0.194762 0.290080 0.297532 0.238335 0.244872 0.258138 0.246289 0.296515 0.306756 0.279235 0.289580 0.245860 0.221477 0.267434 0.287716 0.219811 0.191503 0.312927 0.259993 0.468722 0.238731 0.200089 0.346295 0.284419 0.279958 0.355717 0.318795 0.339882 0.297078 0.281744 0.239907 0.213926 0.252042 0.205725 0.225194 0.237841 0.263645 0.200221 0.000000 0.210745 0.179381 0.217557 0.299511 0.233201 0.206667 0.233523 0.241160 0.235408 0.209084 0.244790 0.244808 0.247085 0.204799 0.224895 0.234557 0.226780 0.205146 0.212836 0.289355 0.255382 0.251781 0.201269 0.207356 0.168887 0.241850 0.193130 0.224187 0.202530 0.411754 0.257439 0.209935 0.442421
sample_049c_claudia_and_the_genius_of_elm_street_2_105 0.322182 0.181176 0.404964 0.393639 0.284870 0.180780 0.268810 0.362650 0.261285 0.235578 0.274030 0.258989 0.219780 0.239859 0.191822 0.296922 0.192347 0.241954 0.228967 0.293957 0.325291 0.307496 0.296968 0.170985 0.160433 0.230059 0.153629 0.143881 0.185342 0.208005 0.271022 0.233133 0.218712 0.186479 0.159280 0.255875 0.239642 0.257728 0.257132 0.213199 0.190329 0.169803 0.296840 0.197241 0.211837 0.256708 0.185561 0.415303 0.169374 0.163735 0.295780 0.253127 0.207735 0.282598 0.262978 0.294177 0.265896 0.280628 0.160504 0.179654 0.207901 0.153501 0.215273 0.221707 0.248460 0.170070 0.210745 0.000000 0.204703 0.226846 0.255899 0.160851 0.139028 0.201963 0.131130 0.160308 0.181363 0.191963 0.222523 0.214619 0.233139 0.126899 0.226054 0.178204 0.220130 0.165823 0.232807 0.209381 0.183049 0.201694 0.190624 0.227802 0.203447 0.153615 0.151688 0.137253 0.316282 0.148961 0.243669 0.391543
sample_051c_staceys_ex_best_friend_2_1270 0.331877 0.185667 0.451504 0.402905 0.345632 0.178260 0.306402 0.315430 0.303526 0.234670 0.320140 0.199321 0.272025 0.248743 0.247169 0.282681 0.249923 0.221821 0.214514 0.313874 0.412101 0.342141 0.349420 0.180013 0.225824 0.209326 0.158351 0.210690 0.173404 0.247737 0.245824 0.180796 0.188059 0.229293 0.188414 0.238577 0.274095 0.289130 0.267911 0.211495 0.158638 0.200753 0.308196 0.184033 0.150830 0.255525 0.258230 0.458658 0.208214 0.210624 0.335869 0.209958 0.222032 0.325219 0.280757 0.320379 0.279808 0.221386 0.164980 0.120754 0.231975 0.145018 0.151133 0.176967 0.173421 0.203400 0.179381 0.204703 0.000000 0.219020 0.269883 0.209201 0.162069 0.130468 0.197342 0.194790 0.166365 0.197793 0.191054 0.163159 0.184154 0.145257 0.152639 0.172476 0.162550 0.166387 0.279846 0.213284 0.140293 0.185699 0.163614 0.136862 0.165386 0.149487 0.171859 0.144829 0.408155 0.261967 0.171543 0.448704
sample_057c_dawn_saves_the_planet_2_522 0.171902 0.230922 0.337515 0.261155 0.244070 0.208581 0.234477 0.190369 0.304621 0.217175 0.203832 0.272415 0.265255 0.148470 0.188214 0.306757 0.198837 0.146340 0.170823 0.190885 0.331250 0.401561 0.333217 0.164547 0.228539 0.250537 0.229151 0.160432 0.233079 0.189220 0.181148 0.304550 0.334090 0.167469 0.223199 0.228870 0.206716 0.184038 0.210010 0.241976 0.188434 0.207415 0.190289 0.248996 0.187163 0.225829 0.218937 0.385871 0.264387 0.152170 0.290874 0.287796 0.248317 0.240654 0.255038 0.242971 0.268572 0.144097 0.176925 0.252343 0.226324 0.230777 0.266339 0.244731 0.339410 0.227773 0.217557 0.226846 0.219020 0.000000 0.277563 0.152361 0.210583 0.239705 0.209549 0.197428 0.268817 0.157524 0.272967 0.217500 0.175257 0.202595 0.253344 0.224770 0.238802 0.220457 0.150673 0.319164 0.266533 0.140178 0.223118 0.202957 0.217580 0.167688 0.218032 0.178396 0.299590 0.182952 0.319004 0.341678
sample_064c_dawns_family_feud_2_305 0.304779 0.262623 0.421705 0.378320 0.313738 0.266181 0.237501 0.350071 0.233345 0.251636 0.280805 0.326668 0.229332 0.270272 0.278519 0.292946 0.219613 0.265214 0.276068 0.289763 0.335717 0.317338 0.245321 0.222683 0.255213 0.284191 0.256048 0.273890 0.310822 0.252600 0.299633 0.253326 0.295870 0.242954 0.264933 0.274738 0.252820 0.250506 0.258614 0.260850 0.247061 0.248271 0.304175 0.269806 0.278895 0.217863 0.263889 0.395865 0.225119 0.237279 0.316014 0.293789 0.267339 0.238019 0.242472 0.257633 0.247181 0.303400 0.240643 0.272038 0.219655 0.252187 0.286051 0.278345 0.317071 0.326699 0.299511 0.255899 0.269883 0.277563 0.000000 0.243224 0.271535 0.248508 0.292486 0.232674 0.299294 0.242603 0.308572 0.255301 0.272329 0.239412 0.264646 0.300336 0.351688 0.257868 0.309484 0.305025 0.272710 0.246551 0.324289 0.313782 0.281799 0.238259 0.267199 0.248376 0.315198 0.291561 0.334726 0.369016
sample_065c_staceys_big_crush_2_1311 0.236151 0.217132 0.386931 0.294068 0.264499 0.199881 0.214948 0.231876 0.255058 0.189247 0.200252 0.255327 0.209304 0.146128 0.194501 0.321100 0.154611 0.159107 0.149719 0.234330 0.303136 0.376657 0.304278 0.159837 0.188588 0.288108 0.171772 0.161536 0.202896 0.187653 0.172923 0.250240 0.264653 0.162902 0.202343 0.167063 0.196777 0.212432 0.241676 0.168532 0.168410 0.151053 0.253008 0.220288 0.191114 0.181715 0.167319 0.371222 0.203093 0.149214 0.276514 0.275524 0.217438 0.231377 0.243249 0.240422 0.237124 0.186269 0.165728 0.207513 0.204097 0.194829 0.258375 0.242867 0.321288 0.215530 0.233201 0.160851 0.209201 0.152361 0.243224 0.000000 0.152952 0.185656 0.174919 0.155353 0.205511 0.131734 0.252454 0.172685 0.237371 0.143384 0.228621 0.154152 0.241440 0.176382 0.175930 0.275698 0.243348 0.187649 0.220671 0.215478 0.170508 0.170466 0.178961 0.151235 0.285910 0.150802 0.309814 0.328372
sample_068c_jessi_and_the_bad_babysitter_2_89 0.330618 0.166459 0.452270 0.414790 0.350850 0.180572 0.274493 0.314473 0.278778 0.213735 0.287329 0.207775 0.225486 0.222490 0.196601 0.310838 0.215979 0.202714 0.196650 0.307598 0.370416 0.364330 0.341122 0.165744 0.130296 0.226834 0.120973 0.146892 0.163704 0.218020 0.261863 0.205730 0.204434 0.202445 0.147373 0.232388 0.243146 0.282241 0.262663 0.187339 0.151900 0.137263 0.352763 0.195603 0.186327 0.270638 0.201045 0.461608 0.197007 0.171276 0.365416 0.255936 0.242245 0.309899 0.267181 0.338723 0.265412 0.245879 0.143530 0.135232 0.245208 0.163477 0.211157 0.205630 0.233297 0.179750 0.206667 0.139028 0.162069 0.210583 0.271535 0.152952 0.000000 0.157811 0.139686 0.161004 0.173230 0.172026 0.192511 0.178381 0.218670 0.129954 0.183718 0.149837 0.175546 0.147834 0.214185 0.177944 0.146011 0.196821 0.163664 0.190535 0.155425 0.157621 0.157905 0.099580 0.400917 0.155999 0.207253 0.428598
sample_070c_stacey_and_the_cheerleaders_2_1293 0.341379 0.187679 0.468152 0.378447 0.355494 0.179892 0.245565 0.330916 0.243601 0.211582 0.312294 0.151998 0.220255 0.257087 0.226950 0.315896 0.193969 0.233757 0.227531 0.339703 0.388990 0.350811 0.342293 0.187470 0.213615 0.201512 0.143890 0.212501 0.176138 0.248939 0.230036 0.167178 0.195161 0.193036 0.163255 0.236633 0.236567 0.298370 0.281317 0.174393 0.160224 0.186826 0.300463 0.171846 0.203512 0.227842 0.257333 0.398984 0.174836 0.217201 0.371944 0.216288 0.203781 0.308389 0.282618 0.335644 0.272385 0.246009 0.188091 0.108907 0.236172 0.143493 0.126157 0.154570 0.164185 0.237469 0.233523 0.201963 0.130468 0.239705 0.248508 0.185656 0.157811 0.000000 0.209733 0.169905 0.161060 0.176734 0.132463 0.150623 0.192492 0.158472 0.111250 0.195571 0.214875 0.142637 0.265535 0.216082 0.145256 0.189626 0.165980 0.185295 0.145751 0.144081 0.177974 0.148925 0.400917 0.263876 0.166395 0.420857
sample_078c_claudia_and_crazy_peaches_2_36 0.298178 0.201225 0.394287 0.393294 0.278061 0.195512 0.261727 0.343577 0.281359 0.233728 0.290647 0.268092 0.243707 0.254365 0.212714 0.304134 0.225436 0.220347 0.225639 0.278969 0.336304 0.332063 0.329131 0.184904 0.187511 0.229640 0.154090 0.172175 0.191271 0.218756 0.261339 0.259805 0.237176 0.186456 0.164684 0.264537 0.248885 0.271679 0.223586 0.204218 0.186077 0.190611 0.332539 0.209133 0.211084 0.265812 0.188007 0.430699 0.219965 0.186402 0.296114 0.240871 0.212160 0.273480 0.258996 0.288327 0.304549 0.247346 0.187157 0.169718 0.237016 0.167015 0.243261 0.228098 0.266303 0.184357 0.241160 0.131130 0.197342 0.209549 0.292486 0.174919 0.139686 0.209733 0.000000 0.183068 0.195388 0.202029 0.226728 0.193000 0.199575 0.147059 0.219818 0.187525 0.229292 0.186890 0.264549 0.209070 0.192453 0.204335 0.165070 0.233572 0.208631 0.182628 0.186974 0.142195 0.336081 0.172536 0.273873 0.394055
sample_079c_mary_anne_breaks_the_rules_2_782 0.274338 0.192756 0.385925 0.322270 0.319879 0.196122 0.251085 0.237668 0.228271 0.176680 0.206844 0.215951 0.235567 0.187669 0.146069 0.298060 0.149629 0.183064 0.199027 0.234777 0.383995 0.341476 0.345237 0.166971 0.167491 0.201281 0.173235 0.143747 0.218386 0.211898 0.152807 0.224413 0.261468 0.161403 0.144755 0.251690 0.215002 0.198845 0.247864 0.180244 0.162463 0.156892 0.216491 0.190409 0.213446 0.219965 0.234835 0.406419 0.192322 0.159964 0.326739 0.284401 0.224275 0.264909 0.259997 0.283126 0.260982 0.191706 0.140558 0.188785 0.238061 0.190842 0.217881 0.206468 0.259575 0.251587 0.235408 0.160308 0.194790 0.197428 0.232674 0.155353 0.161004 0.169905 0.183068 0.000000 0.202657 0.130677 0.200680 0.174291 0.177300 0.136792 0.194303 0.216372 0.266997 0.158809 0.184740 0.250767 0.166955 0.197332 0.183929 0.232799 0.174587 0.161177 0.216747 0.145028 0.346365 0.175080 0.255358 0.397926
sample_082c_jessi_and_the_troublemaker_2_738 0.382497 0.174884 0.493576 0.409744 0.364254 0.195301 0.305259 0.388452 0.297429 0.197700 0.360444 0.172830 0.217485 0.285326 0.242132 0.335500 0.251581 0.234392 0.229291 0.337305 0.383296 0.332368 0.385248 0.212175 0.199283 0.206413 0.166476 0.189789 0.133638 0.227904 0.250394 0.140875 0.176794 0.203419 0.163676 0.267576 0.258936 0.353841 0.304761 0.208400 0.161713 0.179431 0.331608 0.154821 0.175035 0.268089 0.232292 0.480444 0.217258 0.213529 0.404536 0.261370 0.251876 0.345127 0.312108 0.400885 0.312882 0.286086 0.228534 0.124050 0.263770 0.176392 0.147566 0.185239 0.175578 0.215610 0.209084 0.181363 0.166365 0.268817 0.299294 0.205511 0.173230 0.161060 0.195388 0.202657 0.000000 0.188750 0.166259 0.149326 0.227122 0.171145 0.166328 0.184836 0.132299 0.184901 0.297546 0.160294 0.163776 0.170945 0.143630 0.139639 0.157699 0.160064 0.158711 0.179955 0.421747 0.243310 0.200278 0.441927
sample_084c_dawn_and_the_school_spirit_war_2_493 0.250864 0.211782 0.370129 0.287904 0.306851 0.192394 0.208831 0.204856 0.227864 0.143060 0.186979 0.197966 0.208169 0.128019 0.150824 0.308459 0.140893 0.147700 0.156204 0.236037 0.350753 0.371720 0.326957 0.178696 0.186522 0.236347 0.167016 0.169721 0.220355 0.188335 0.137953 0.218086 0.276758 0.148333 0.169193 0.187739 0.171531 0.223821 0.251263 0.181099 0.152130 0.150476 0.209628 0.167608 0.182522 0.185622 0.213341 0.425719 0.208197 0.143108 0.329336 0.296869 0.241038 0.249470 0.252785 0.277755 0.237428 0.168443 0.154365 0.174569 0.234366 0.177479 0.221173 0.213929 0.275911 0.282233 0.244790 0.191963 0.197793 0.157524 0.242603 0.131734 0.172026 0.176734 0.202029 0.130677 0.188750 0.000000 0.213096 0.176668 0.196645 0.164812 0.203354 0.201461 0.243850 0.156508 0.157300 0.284181 0.184115 0.176321 0.188790 0.217033 0.148132 0.151396 0.222084 0.153278 0.314165 0.191318 0.269389 0.359927
sample_090c_welcome_to_the_bsc_abby_2_1491 0.411028 0.219962 0.523369 0.457020 0.409007 0.185773 0.298819 0.407096 0.306108 0.266951 0.405358 0.163897 0.278570 0.331566 0.302337 0.410025 0.264826 0.313081 0.307267 0.425361 0.449189 0.422208 0.396067 0.247779 0.218995 0.176802 0.159707 0.230799 0.165524 0.270287 0.251494 0.148638 0.215871 0.254431 0.158405 0.314539 0.277426 0.374238 0.309010 0.225249 0.176960 0.224708 0.338641 0.146000 0.217410 0.292247 0.306278 0.541717 0.246683 0.262807 0.460550 0.320477 0.291395 0.402265 0.343476 0.413014 0.326942 0.321152 0.248535 0.135189 0.261553 0.161888 0.140801 0.165999 0.192335 0.259953 0.244808 0.222523 0.191054 0.272967 0.308572 0.252454 0.192511 0.132463 0.226728 0.200680 0.166259 0.213096 0.000000 0.204945 0.213704 0.189586 0.165309 0.280790 0.250008 0.145141 0.322993 0.226030 0.156898 0.211288 0.153708 0.242831 0.160078 0.135143 0.225498 0.197401 0.466279 0.326292 0.132783 0.481855
sample_097c_claudia_and_the_worlds_cutest_baby_2_1631 0.300866 0.202108 0.442119 0.379274 0.290506 0.217345 0.231898 0.316893 0.242875 0.167833 0.252997 0.219328 0.187565 0.222929 0.225339 0.286471 0.183567 0.196772 0.157460 0.270022 0.345531 0.313046 0.328214 0.162485 0.163426 0.221984 0.165487 0.199911 0.171837 0.163148 0.216775 0.163798 0.207747 0.156212 0.182265 0.230142 0.232477 0.264146 0.222695 0.178827 0.125125 0.165906 0.285303 0.178354 0.185067 0.174343 0.183605 0.426473 0.202142 0.177661 0.318263 0.221106 0.206133 0.268578 0.237037 0.272854 0.232873 0.216522 0.193348 0.170462 0.205410 0.184192 0.200507 0.179551 0.226160 0.249420 0.247085 0.214619 0.163159 0.217500 0.255301 0.172685 0.178381 0.150623 0.193000 0.174291 0.149326 0.176668 0.204945 0.000000 0.210463 0.137957 0.115372 0.172862 0.178379 0.194447 0.256328 0.164915 0.196052 0.167407 0.182404 0.155126 0.178236 0.166917 0.160433 0.172002 0.355875 0.226830 0.239341 0.383535
sample_100c_kristys_worst_idea_2_2163 0.303292 0.253219 0.417056 0.352157 0.341590 0.177419 0.281600 0.272866 0.297548 0.241931 0.299265 0.231094 0.269938 0.240078 0.231460 0.357344 0.240627 0.205207 0.247318 0.282555 0.413975 0.382778 0.377923 0.207362 0.247428 0.218062 0.229708 0.197968 0.202702 0.261848 0.220095 0.241827 0.276895 0.194538 0.221274 0.290131 0.251986 0.228169 0.267061 0.233724 0.167668 0.259757 0.234623 0.191581 0.200593 0.271896 0.284441 0.402392 0.248217 0.233897 0.334569 0.248477 0.215651 0.311851 0.301762 0.296028 0.313557 0.210400 0.210965 0.196413 0.233477 0.218618 0.224682 0.215888 0.247503 0.243494 0.204799 0.233139 0.184154 0.175257 0.272329 0.237371 0.218670 0.192492 0.199575 0.177300 0.227122 0.196645 0.213704 0.210463 0.000000 0.179301 0.195796 0.274047 0.245470 0.189525 0.242510 0.259170 0.201867 0.170113 0.178941 0.181683 0.189597 0.157475 0.241595 0.184678 0.375749 0.248129 0.200322 0.424791
sample_106c_claudia_queen_of_the_seventh_grade_2_958 0.319816 0.184885 0.469058 0.382819 0.298262 0.184316 0.255624 0.303143 0.251246 0.211082 0.275731 0.239570 0.223181 0.208019 0.208014 0.327194 0.185753 0.203880 0.195964 0.304733 0.348058 0.364013 0.307364 0.161652 0.149913 0.204487 0.144905 0.163744 0.175715 0.184521 0.234229 0.200201 0.211890 0.186083 0.156045 0.227550 0.230797 0.266347 0.234332 0.174042 0.140503 0.140756 0.307915 0.172671 0.190330 0.218306 0.189126 0.444895 0.172484 0.183702 0.323552 0.247132 0.212879 0.271799 0.246598 0.277180 0.256566 0.244800 0.148242 0.157659 0.200817 0.171162 0.213484 0.216380 0.250895 0.189535 0.224895 0.126899 0.145257 0.202595 0.239412 0.143384 0.129954 0.158472 0.147059 0.136792 0.171145 0.164812 0.189586 0.137957 0.179301 0.000000 0.172446 0.151977 0.204908 0.150784 0.213207 0.186963 0.148983 0.207503 0.167410 0.207361 0.165118 0.142547 0.161403 0.126353 0.343124 0.164629 0.217346 0.409680
sample_107c_mind_your_own_business_kristy_2_1194 0.343380 0.196764 0.469952 0.412021 0.316455 0.187874 0.256317 0.378353 0.269655 0.211196 0.311305 0.197098 0.205085 0.277887 0.254889 0.313197 0.213675 0.249560 0.222187 0.324123 0.372235 0.328946 0.337243 0.192735 0.184120 0.167708 0.140632 0.235875 0.164695 0.211741 0.251996 0.149425 0.230326 0.187847 0.174612 0.266321 0.245999 0.316116 0.268012 0.213786 0.138277 0.192085 0.323145 0.168296 0.210553 0.223134 0.251047 0.460230 0.208641 0.231713 0.384144 0.204114 0.218186 0.327127 0.290863 0.340522 0.266609 0.262974 0.211738 0.130551 0.234102 0.153058 0.126907 0.108222 0.130551 0.253976 0.234557 0.226054 0.152639 0.253344 0.264646 0.228621 0.183718 0.111250 0.219818 0.194303 0.166328 0.203354 0.165309 0.115372 0.195796 0.172446 0.000000 0.220451 0.204822 0.169708 0.303980 0.170715 0.154381 0.165636 0.182092 0.171763 0.175469 0.181698 0.177281 0.161618 0.391581 0.290557 0.171284 0.413971
sample_108c_dont_give_up_mallory_2_489 0.307876 0.225118 0.449773 0.330739 0.291158 0.209340 0.256810 0.324646 0.310530 0.233016 0.293978 0.269843 0.210008 0.214153 0.252343 0.280922 0.227319 0.211938 0.150517 0.322925 0.282060 0.373731 0.318265 0.182158 0.211729 0.283505 0.178163 0.212928 0.195415 0.181017 0.283095 0.247348 0.220402 0.206869 0.227247 0.197112 0.236661 0.323635 0.260736 0.206104 0.166146 0.147930 0.338737 0.230123 0.182045 0.224044 0.172933 0.419397 0.189993 0.168534 0.281450 0.268272 0.264701 0.280244 0.255564 0.303543 0.238909 0.256812 0.197796 0.196361 0.236960 0.200522 0.263550 0.274116 0.310067 0.205812 0.226780 0.178204 0.172476 0.224770 0.300336 0.154152 0.149837 0.195571 0.187525 0.216372 0.184836 0.201461 0.280790 0.172862 0.274047 0.151977 0.220451 0.000000 0.156591 0.217706 0.221245 0.228331 0.238298 0.220491 0.218909 0.178450 0.191262 0.202638 0.177821 0.161536 0.308272 0.177414 0.295938 0.366517
sample_110c_abby_and_the_bad_sport_2_535 0.361247 0.235671 0.479589 0.411300 0.350967 0.210535 0.347418 0.377716 0.375881 0.259444 0.369433 0.219684 0.263477 0.278276 0.285310 0.343905 0.307016 0.236000 0.221873 0.334064 0.377652 0.352108 0.402877 0.197986 0.228100 0.263732 0.201916 0.216196 0.159766 0.245957 0.318329 0.230518 0.218967 0.240163 0.239752 0.264061 0.301529 0.338090 0.308601 0.254327 0.182275 0.212695 0.361543 0.231323 0.179711 0.307695 0.216004 0.475494 0.254597 0.213065 0.344256 0.243217 0.268556 0.375672 0.337290 0.377187 0.319790 0.264234 0.234960 0.178495 0.287585 0.218001 0.206474 0.230955 0.230346 0.189458 0.205146 0.220130 0.162550 0.238802 0.351688 0.241440 0.175546 0.214875 0.229292 0.266997 0.132299 0.243850 0.250008 0.178379 0.245470 0.204908 0.204822 0.156591 0.000000 0.240136 0.259384 0.176531 0.247053 0.190377 0.182080 0.113307 0.196648 0.199246 0.161691 0.201401 0.415717 0.249158 0.227997 0.452935
sample_111c_staceys_secret_friend_2_700 0.346615 0.186566 0.460735 0.385803 0.360840 0.163627 0.284481 0.303371 0.271506 0.184465 0.302526 0.167797 0.261136 0.248321 0.217028 0.346941 0.219576 0.230519 0.237267 0.326254 0.389871 0.384153 0.366430 0.196471 0.203153 0.203952 0.136904 0.184543 0.187215 0.260588 0.198835 0.178507 0.242156 0.208093 0.154758 0.263832 0.230428 0.285951 0.297303 0.169797 0.179780 0.178655 0.264544 0.164046 0.196004 0.261418 0.263166 0.472944 0.187872 0.207913 0.382163 0.271692 0.247237 0.340159 0.312680 0.335965 0.263161 0.252755 0.171757 0.136353 0.248601 0.162435 0.156554 0.182778 0.194609 0.243863 0.212836 0.165823 0.166387 0.220457 0.257868 0.176382 0.147834 0.142637 0.186890 0.158809 0.184901 0.156508 0.145141 0.194447 0.189525 0.150784 0.169708 0.217706 0.240136 0.000000 0.222132 0.220955 0.126794 0.199407 0.165010 0.217741 0.157049 0.156167 0.193880 0.135849 0.394513 0.225671 0.179254 0.455577
sample_120c_mary_anne_and_the_playground_fight_2_942 0.223632 0.290908 0.375010 0.238020 0.303969 0.218159 0.283765 0.178115 0.325280 0.231306 0.201124 0.277162 0.284998 0.136434 0.172074 0.376356 0.197491 0.176175 0.194445 0.236802 0.355299 0.450157 0.376735 0.202380 0.230801 0.319777 0.249417 0.178051 0.298081 0.242687 0.211066 0.369726 0.406574 0.194165 0.261771 0.219550 0.202261 0.215495 0.272437 0.224480 0.207177 0.175282 0.214082 0.293325 0.219222 0.249866 0.236746 0.398819 0.256415 0.177494 0.325300 0.348526 0.305035 0.266494 0.282578 0.266839 0.262863 0.177717 0.173933 0.292459 0.266694 0.274367 0.331578 0.311237 0.392354 0.284927 0.289355 0.232807 0.279846 0.150673 0.309484 0.175930 0.214185 0.265535 0.264549 0.184740 0.297546 0.157300 0.322993 0.256328 0.242510 0.213207 0.303980 0.221245 0.259384 0.222132 0.000000 0.357563 0.290474 0.248833 0.278008 0.270412 0.218336 0.218994 0.272414 0.182991 0.305044 0.142985 0.351510 0.401938
sample_121c_abby_in_wonderland_2_494 0.429524 0.200397 0.539196 0.528969 0.357803 0.268070 0.356586 0.461545 0.298635 0.248904 0.385972 0.255807 0.242215 0.369448 0.317734 0.328320 0.291769 0.292130 0.279930 0.369312 0.374158 0.313590 0.387202 0.188913 0.170621 0.240682 0.192090 0.230313 0.145188 0.214019 0.364110 0.180173 0.206062 0.233249 0.223903 0.362646 0.357049 0.369342 0.284490 0.265915 0.195843 0.231503 0.403940 0.206181 0.257096 0.336785 0.221800 0.514414 0.258299 0.258578 0.401883 0.242301 0.277190 0.398870 0.334772 0.379464 0.301728 0.310731 0.233486 0.172696 0.269642 0.214139 0.215757 0.186261 0.211745 0.201041 0.255382 0.209381 0.213284 0.319164 0.305025 0.275698 0.177944 0.216082 0.209070 0.250767 0.160294 0.284181 0.226030 0.164915 0.259170 0.186963 0.170715 0.228331 0.176531 0.220955 0.357563 0.000000 0.199272 0.223864 0.201619 0.173722 0.273948 0.195344 0.170251 0.195333 0.459303 0.273378 0.216599 0.452673
sample_m03c_mallory_and_the_ghost_cat_2_261 0.388881 0.134324 0.500505 0.433921 0.385901 0.202765 0.293025 0.343293 0.239018 0.211175 0.328682 0.180213 0.258218 0.291799 0.217031 0.307547 0.245940 0.249673 0.255262 0.353209 0.441003 0.362205 0.377793 0.215618 0.188535 0.165612 0.140870 0.196226 0.189009 0.255444 0.235549 0.149175 0.222222 0.225549 0.125994 0.313491 0.263944 0.319034 0.300772 0.220642 0.170838 0.178575 0.312898 0.146369 0.197237 0.306663 0.301406 0.508227 0.219916 0.242795 0.417434 0.263124 0.255601 0.362521 0.311881 0.398742 0.311689 0.273523 0.158020 0.109685 0.290904 0.164582 0.159975 0.157643 0.155666 0.261763 0.251781 0.183049 0.140293 0.266533 0.272710 0.243348 0.146011 0.145256 0.192453 0.166955 0.163776 0.184115 0.156898 0.196052 0.201867 0.148983 0.154381 0.238298 0.247053 0.126794 0.290474 0.199272 0.000000 0.206000 0.153090 0.222496 0.187076 0.152004 0.196043 0.126036 0.462301 0.263618 0.180054 0.471258
sample_m04c_kristy_and_the_missing_child_2_1829 0.256085 0.203273 0.392852 0.331188 0.250768 0.192908 0.230721 0.282845 0.269543 0.194810 0.242192 0.233482 0.182585 0.219446 0.208830 0.268706 0.203016 0.183831 0.183171 0.209552 0.325510 0.300502 0.296802 0.166987 0.208752 0.194336 0.176554 0.167327 0.163850 0.178378 0.214989 0.181702 0.248339 0.163773 0.186299 0.247416 0.199748 0.230263 0.231738 0.212604 0.141223 0.196103 0.219895 0.186282 0.166660 0.213631 0.224405 0.389361 0.219889 0.146347 0.281952 0.251848 0.233462 0.273328 0.255336 0.274543 0.246916 0.185060 0.195708 0.172260 0.216017 0.177833 0.159925 0.146510 0.188782 0.221160 0.201269 0.201694 0.185699 0.140178 0.246551 0.187649 0.196821 0.189626 0.204335 0.197332 0.170945 0.176321 0.211288 0.167407 0.170113 0.207503 0.165636 0.220491 0.190377 0.199407 0.248833 0.223864 0.206000 0.000000 0.187535 0.133423 0.186401 0.145040 0.182033 0.175479 0.323979 0.240503 0.236483 0.341251
sample_m06c_the_mystery_at_claudias_house_2_420 0.357987 0.185496 0.476576 0.409219 0.355229 0.199670 0.345792 0.335873 0.324635 0.232028 0.366710 0.169289 0.288997 0.285395 0.226591 0.351853 0.285613 0.228056 0.252583 0.334955 0.465752 0.391034 0.406761 0.219948 0.210339 0.189213 0.168411 0.181308 0.154289 0.257484 0.211385 0.162084 0.200653 0.209163 0.154069 0.339237 0.274019 0.334861 0.307760 0.215203 0.180234 0.219450 0.298554 0.155814 0.190499 0.311188 0.280486 0.511953 0.231583 0.221393 0.394283 0.264521 0.241262 0.378757 0.342572 0.389034 0.358122 0.234838 0.225558 0.124561 0.281935 0.182672 0.176905 0.186559 0.211208 0.216705 0.207356 0.190624 0.163614 0.223118 0.324289 0.220671 0.163664 0.165980 0.165070 0.183929 0.143630 0.188790 0.153708 0.182404 0.178941 0.167410 0.182092 0.218909 0.182080 0.165010 0.278008 0.201619 0.153090 0.187535 0.000000 0.152508 0.147170 0.162839 0.206758 0.177183 0.470359 0.261603 0.179455 0.478028
sample_m10c_stacey_and_the_mystery_money_2_937 0.324782 0.224020 0.465530 0.375826 0.317295 0.205560 0.334357 0.305196 0.341631 0.238586 0.323558 0.211683 0.241104 0.253025 0.243083 0.296096 0.278244 0.181901 0.188880 0.267105 0.397745 0.333504 0.404037 0.176584 0.224923 0.259497 0.220954 0.185355 0.137473 0.230554 0.266875 0.191274 0.214558 0.196440 0.235769 0.284507 0.288709 0.276863 0.285683 0.245155 0.164203 0.232919 0.279576 0.196832 0.151982 0.277772 0.236669 0.457658 0.249003 0.195886 0.319447 0.224191 0.239032 0.351927 0.306804 0.335652 0.297848 0.202919 0.208108 0.163511 0.243970 0.207952 0.187361 0.180631 0.197248 0.181752 0.168887 0.227802 0.136862 0.202957 0.313782 0.215478 0.190535 0.185295 0.233572 0.232799 0.139639 0.217033 0.242831 0.155126 0.181683 0.207361 0.171763 0.178450 0.113307 0.217741 0.270412 0.173722 0.222496 0.133423 0.152508 0.000000 0.190833 0.169236 0.171831 0.181914 0.420830 0.259833 0.208039 0.414049
sample_m12c_dawn_and_the_surfer_ghost_2_126 0.337621 0.243964 0.477651 0.360095 0.373178 0.185747 0.285909 0.286622 0.336891 0.177641 0.315406 0.180479 0.228984 0.208486 0.237967 0.398477 0.206416 0.203988 0.199316 0.341429 0.378542 0.414897 0.355920 0.235049 0.233820 0.226229 0.154193 0.205538 0.192837 0.235892 0.193600 0.183015 0.236629 0.222234 0.169930 0.172937 0.174904 0.322191 0.301010 0.178273 0.135663 0.150297 0.301869 0.176203 0.156908 0.222447 0.240879 0.459760 0.192675 0.207594 0.382390 0.302440 0.258709 0.328639 0.325436 0.339734 0.280231 0.219617 0.200845 0.146942 0.234894 0.157741 0.185599 0.220725 0.249123 0.264881 0.241850 0.203447 0.165386 0.217580 0.281799 0.170508 0.155425 0.145751 0.208631 0.174587 0.157699 0.148132 0.160078 0.178236 0.189597 0.165118 0.175469 0.191262 0.196648 0.157049 0.218336 0.273948 0.187076 0.186401 0.147170 0.190833 0.000000 0.145500 0.215135 0.170757 0.357305 0.246472 0.209888 0.448544
sample_m14c_stacey_and_the_mystery_at_the_mall_2_1474 0.309745 0.178499 0.466289 0.380298 0.320493 0.148811 0.247932 0.280627 0.238021 0.195148 0.260071 0.185183 0.195107 0.203938 0.208577 0.315346 0.180078 0.196995 0.199063 0.304516 0.336892 0.352635 0.329666 0.156659 0.176405 0.198170 0.146071 0.157175 0.156093 0.186671 0.208565 0.158571 0.211251 0.184172 0.174047 0.216557 0.221037 0.234428 0.245191 0.192280 0.131708 0.167936 0.245674 0.156279 0.154978 0.227099 0.203903 0.417698 0.202628 0.159029 0.330806 0.236946 0.197176 0.301845 0.269300 0.293924 0.250628 0.199598 0.145293 0.145972 0.183040 0.139682 0.169523 0.188194 0.232047 0.207283 0.193130 0.153615 0.149487 0.167688 0.238259 0.170466 0.157621 0.144081 0.182628 0.161177 0.160064 0.151396 0.135143 0.166917 0.157475 0.142547 0.181698 0.202638 0.199246 0.156167 0.218994 0.195344 0.152004 0.145040 0.162839 0.169236 0.145500 0.000000 0.169302 0.129188 0.343990 0.203170 0.170215 0.388769
sample_m18c_stacey_and_the_mystery_at_the_empty_house_2_224 0.313145 0.172840 0.391493 0.401906 0.290799 0.202350 0.272916 0.367010 0.252641 0.222286 0.313143 0.239193 0.205134 0.262180 0.230068 0.296676 0.222679 0.223385 0.214056 0.284372 0.314635 0.311908 0.338699 0.178521 0.204485 0.229542 0.169085 0.188178 0.155095 0.213456 0.283017 0.204540 0.198752 0.185932 0.200190 0.256583 0.260830 0.251517 0.271721 0.221492 0.173641 0.206912 0.323460 0.198405 0.188212 0.251113 0.188062 0.392375 0.211534 0.190202 0.311540 0.226992 0.210123 0.284673 0.255233 0.310813 0.303274 0.266821 0.190934 0.166539 0.238234 0.189463 0.192134 0.197296 0.210036 0.180057 0.224187 0.151688 0.171859 0.218032 0.267199 0.178961 0.157905 0.177974 0.186974 0.216747 0.158711 0.222084 0.225498 0.160433 0.241595 0.161403 0.177281 0.177821 0.161691 0.193880 0.272414 0.170251 0.196043 0.182033 0.206758 0.171831 0.215135 0.169302 0.000000 0.168000 0.348076 0.212712 0.244287 0.390592
sample_m20c_mary_anne_and_the_zoo_mystery_2_80 0.294062 0.160327 0.438953 0.372245 0.300861 0.162823 0.261494 0.263552 0.249381 0.204330 0.223342 0.203145 0.227481 0.191622 0.182050 0.298989 0.194611 0.176264 0.196508 0.274705 0.343996 0.371339 0.338031 0.139988 0.150564 0.216824 0.135118 0.137644 0.169019 0.196279 0.219585 0.204787 0.241973 0.167839 0.151018 0.221536 0.235584 0.226097 0.240599 0.190370 0.143265 0.148305 0.274956 0.180978 0.178828 0.268315 0.213909 0.431608 0.195437 0.165003 0.318459 0.221539 0.206175 0.268312 0.224151 0.286606 0.241229 0.187809 0.106756 0.126650 0.205957 0.141337 0.197734 0.177261 0.207769 0.181537 0.202530 0.137253 0.144829 0.178396 0.248376 0.151235 0.099580 0.148925 0.142195 0.145028 0.179955 0.153278 0.197401 0.172002 0.184678 0.126353 0.161618 0.161536 0.201401 0.135849 0.182991 0.195333 0.126036 0.175479 0.177183 0.181914 0.170757 0.129188 0.168000 0.000000 0.354803 0.153284 0.179063 0.398172
sample_m25c_kristy_and_the_middle_school_vandal_2_134 0.259532 0.418507 0.366972 0.329566 0.223750 0.323053 0.248307 0.391393 0.374117 0.323612 0.290284 0.487692 0.271450 0.282498 0.357542 0.360828 0.252775 0.342194 0.290923 0.341485 0.265152 0.399764 0.290226 0.296619 0.366065 0.396230 0.355810 0.389845 0.417195 0.269021 0.383780 0.455855 0.463396 0.295462 0.397664 0.242387 0.241974 0.331440 0.275699 0.310468 0.315117 0.310879 0.364468 0.396068 0.348232 0.232059 0.242713 0.362717 0.313639 0.270179 0.220285 0.375579 0.334223 0.240819 0.294715 0.213513 0.256585 0.363954 0.351048 0.439752 0.242848 0.319382 0.422433 0.406587 0.474364 0.377601 0.411754 0.316282 0.408155 0.299590 0.315198 0.285910 0.400917 0.400917 0.336081 0.346365 0.421747 0.314165 0.466279 0.355875 0.375749 0.343124 0.391581 0.308272 0.415717 0.394513 0.305044 0.459303 0.462301 0.323979 0.470359 0.420830 0.357305 0.343990 0.348076 0.354803 0.000000 0.314246 0.510381 0.301307
sample_m27c_claudia_and_the_lighthouse_ghost_2_356 0.266412 0.240564 0.403492 0.327204 0.304472 0.213093 0.285129 0.255750 0.295100 0.233488 0.218690 0.311040 0.257867 0.181296 0.174332 0.309472 0.205517 0.173948 0.191521 0.245020 0.342433 0.374978 0.335444 0.188519 0.182951 0.321556 0.211369 0.153329 0.262056 0.221420 0.258129 0.340358 0.327262 0.183562 0.226544 0.241400 0.254117 0.230587 0.249352 0.225640 0.212333 0.171539 0.283597 0.279118 0.219264 0.253543 0.185186 0.387444 0.228077 0.155235 0.318535 0.313103 0.262756 0.258607 0.250677 0.275233 0.269369 0.229382 0.153071 0.272478 0.245940 0.250433 0.339612 0.316139 0.378464 0.231546 0.257439 0.148961 0.261967 0.182952 0.291561 0.150802 0.155999 0.263876 0.172536 0.175080 0.243310 0.191318 0.326292 0.226830 0.248129 0.164629 0.290557 0.177414 0.249158 0.225671 0.142985 0.273378 0.263618 0.240503 0.261603 0.259833 0.246472 0.203170 0.212712 0.153284 0.314246 0.000000 0.359443 0.393566
sample_m30c_kristy_and_the_mystery_train_2_1949 0.458586 0.236965 0.553948 0.505739 0.423442 0.200049 0.395719 0.428533 0.359700 0.335376 0.444650 0.205381 0.324456 0.373120 0.326208 0.414962 0.333569 0.351136 0.349696 0.446370 0.490096 0.447380 0.449219 0.249609 0.253072 0.209327 0.195521 0.248578 0.177098 0.333038 0.319827 0.182524 0.245547 0.301521 0.243234 0.370589 0.339218 0.384990 0.346397 0.242725 0.201568 0.288381 0.346824 0.188083 0.255988 0.381469 0.368126 0.549685 0.268177 0.316415 0.452010 0.271767 0.286044 0.451808 0.370173 0.441801 0.340923 0.324067 0.250487 0.136127 0.286809 0.180774 0.169390 0.178307 0.185814 0.238845 0.209935 0.243669 0.171543 0.319004 0.334726 0.309814 0.207253 0.166395 0.273873 0.255358 0.200278 0.269389 0.132783 0.239341 0.200322 0.217346 0.171284 0.295938 0.227997 0.179254 0.351510 0.216599 0.180054 0.236483 0.179455 0.208039 0.209888 0.170215 0.244287 0.179063 0.510381 0.359443 0.000000 0.522906
sample_m36c_kristy_and_the_cat_burglar_2_748 0.278128 0.420096 0.461316 0.330523 0.255868 0.400752 0.242891 0.433690 0.404258 0.403925 0.335168 0.510741 0.328501 0.353694 0.411564 0.383912 0.330360 0.377085 0.344922 0.332735 0.299000 0.445632 0.353823 0.346055 0.373485 0.441768 0.411205 0.405411 0.407092 0.300621 0.433258 0.452552 0.504097 0.347431 0.430198 0.392596 0.366385 0.379987 0.330415 0.376923 0.378315 0.384730 0.417195 0.429252 0.400739 0.324460 0.301268 0.447857 0.405182 0.313934 0.302063 0.418210 0.418201 0.305022 0.302368 0.321225 0.352009 0.383739 0.394764 0.456547 0.321243 0.377648 0.450983 0.417879 0.494749 0.368391 0.442421 0.391543 0.448704 0.341678 0.369016 0.328372 0.428598 0.420857 0.394055 0.397926 0.441927 0.359927 0.481855 0.383535 0.424791 0.409680 0.413971 0.366517 0.452935 0.455577 0.401938 0.452673 0.471258 0.341251 0.478028 0.414049 0.448544 0.388769 0.390592 0.398172 0.301307 0.393566 0.522906 0.000000
#Defines the size of the image
plt.figure(figsize=(100, 100))
#Increases the label size so it's more legible
sns.set(font_scale=3)
#Generates the visualization using the data in the dataframe
ax = sns.heatmap(cosine_distances_freq)
#Displays the image
plt.show()
[Figure: seaborn heatmap of the cosine distances between every pair of texts; the lighter a row or column, the more different that text is from the rest.]

Well, then.

If I had assigned random filenames, so that the GPT-2 samples and the original excerpts weren’t grouped together along the axes, it would have been especially difficult to pick out which rows and columns were which just from the colors. Sure, there are clearly runs that went off in a bit of an oddball direction (the light-colored rows), but it’s not as if the real chapter 2’s are perfectly uniform, either. (Compare, for instance, “Kristy and the Middle School Vandal” and “Kristy and the Cat Burglar”, on the far right of the horizontal axis, which are noticeably lighter than the other real chapter 2 rows and columns.)
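
If you’d rather not trust my eyeballs on that, the same comparison can be done numerically. Here’s a minimal sketch (not from the original notebook; the variable names are mine) that assumes the cosine_distances_freq dataframe used for the heatmap above, where the original excerpts’ filenames start with sample_ and the GPT-2 samples’ with gpt2-:

#A rough numeric version of the eyeball comparison above
#Keeps only the rows and columns for the original chapter 2 excerpts
original_cols = [name for name in cosine_distances_freq.columns if name.startswith('sample_')]
originals_only = cosine_distances_freq.loc[original_cols, original_cols]
#For each original excerpt, its average distance to the other originals, largest first
#(each text's zero distance to itself is included, which nudges all the averages down equally)
print(originals_only.mean(axis=1).sort_values(ascending=False).head(10))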

The excerpt from “Kristy and the Cat Burglar” is straight-up BSC mystery prose, e.g.:

“I usually check the place out three times a day. But just as I was starting my rounds here today I had an emergency call about my wife being taken to the hospital. So I took off. Oh, man, I don’t believe this!”

“That’s interesting,” said Sergeant Johnson, who had climbed out of his car by then. “I had a call right around that time too. An anonymous call, telling me to check out this house.” Have I mentioned that Sergeant Johnson has the clearest, bluest eyes I’ve ever seen? Well, he does. But at that moment, they looked clouded, almost gray. There was something about that phone call that bothered him.

The security guard groaned. “I’m going to be out of a job,” he said. “What was stolen? Have you checked the vault yet?”

One thing that surprised me was that the samples don’t get markedly more similar to the original chapter 2’s the longer the model trains. Sure, most of the round 11 samples are noticeably less similar, but by round 101 it depends more on the particular direction a given training run took (as a result of turning the metaphorical steering wheels of machine learning this way or that). It’s kind of funny that, across multiple training runs, round 401 was a bit of an oddball. Round 401 from training attempt 2 gets at some of the same mystery-esque vibes:

was a natural anyway; shs’s short, with brown hair and brown eyes. I guess she was worried about looks, maybe because her parents disapprove of her interest in sports (her hair is permed and she has no interest in anything else). Or maybe she had a good time with her older sister,” Dawn added. “What else happened?”

“I met these kids,” I said.

“Can’t you imagine that?”

“Me and my friends went to a concert in Stamford. We were wearing black shoes. Then the lights switched from blues to white.”

I smiled. “They were making a song called ‘Duh-huh-huh’ in which I”m the lead…. It wasn”t very good,” I said. “Now guess what — the lights switched back to blues as the song ended and we played the entire concert again.”

“It went great,” I said. “I could hear the blues playing in the background.”

“Dawn, that”s one of the things I hated about the school year.” “How many more classes will this school be in?” she asked.

“I think I’ll take a second to think about that.” (Dawn has to sit a hundred different ways a day, and it seems like a long train ride) “Two hundred and fifteen,” said Dawn slowly, quietly, “I think. Three hundred and eleven,” she began slowly.

“Great,” said Kristy. “The music was good, but there was still just nothing left over.”

“We played the whole thing from about seventy-five to seventy-seven,” I told her, with some rehearsal help from Mary Anne. “I hope I’re lucky to find some of that stuff.”

“If that is the case,” Mal said, “you might want to go to Stoneybrook Manor and search there for some of that stuff.”

“Where did I tell you about it?” asked Dawn. “I wonder. I guess it looks like a haunted house.”

“It’s a haunted house,” I said.

“What kind of place is that?” asked Mary Anne.

“It’s a haunted house,” I said.

“Well, it’s full of stories, you know, stuff that”s old, hearsay, ghosts, stuff like that.”

“Oh, I knew about that,” said Dawn, smiling as hard as she could. “I don’t know too much about horses.” She pulled her hair back from her face. “I’ll just ask Dawn what happened.”

“Just telling the truth,” Mary Anne said. “Who put that?”

“You know your parents,” said Mary Anne. “I meant what I said. Why did your dad bring them up?” (Their parents had been friends all their lives, and had fallen in love with Mary Anne and Richard.)

“You meant it too, Dad!” I said sharply.

“I know what,” Mary Anne said. “I just wanted to tell you, Dad, all this.”

“It’s only fair,” said Dawn, just as sharply, “I’m allowed to talk.”

Mary Anne lifted her eyebrows a couple of ways up. “Don’t you want to talk?”

“Oh, yes.” I did. “It’s not fair.” I didn’t expect her to back away or anything, but she did. “

“How old were you then?” asked Dawn.

“I guess I was seventeen.” She looked worried. “Did you want to talk about it?”

“I”d like to talk about it,” I said. “I don”t know how to talk.” Mary Anne smiled and smiled back with some rehearsal help.

I was disappointed. It wasn”t like “real,” I realized. “There”s no real old-school.”

“Well, there was,” I said. “I guess then, anyway.” I was looking at Dawn. She was

“I guess then, anyway”, indeed. Weird runs can happen anywhere – even in the later rounds. One of the least-similar texts is round 901, attempt 3:

a big family in our church (my dad is vice-president). He doesn’t attend many meetings because we have to attend only four regular men’s houses. Plus, we have an awful lot of pets. ’Well, you have two minutes to feed everybody a box of puppies,” Claudia cried.

“Puppies!” cried all of the friends I didn’t touch.

“Feed everybody a box of puppies,” Stacey said. “Hey, it’s not a bad thing to do. You can be friends too.”

“Feed people,” Dawn remarked.

Jessi said, “An awful lot of people are desperate for a kitten.”

“The poor ones!” Logan said.

“Feed people,” Stacey remarked.

Mal said, “I’m starving!”

“On a poor person!” Claudia said.

“Poor people can’t go hungry sometimes,” Abby volunteered.

“Poor people exist!” Logan exclaimed.

“The planet you’re looking for could be yours!” Jessi chimed in.

Everyone became fiercenessed.

Claudia shrugged. “The punks never ate the punks,” she said. “It’s fine in the punks’ way.”

“Yeah,” Abby laughed.

“I could have stuck puppies on the box,” Claudia agreed.

“Maybe some of the puppies would go to a puppy shop,” Dawn suggested.

“Or just kids’ toys,” Claudia volunteered.

Mary Anne frowned. “I’m not going to stick an adorable puppy on a box with puppies!”

“I’ve already given up,” Mallory volunteered.

Mallory squirmed up. “You have a lot of work, Mallory.”

“I’m making major progress,” Jessica said.

“Mallory’s been able to show progress,” Abby replied.

“Well, Mallory and I are still in the middle of a seventh-grade romance,” Mary Anne began.

“Yeah!” I blushed. “We were just picking up a cold that afternoon. Mallory’s practically freezing!”

“I have lunch,” Abby said.

“I just figured out a new way to survive,” Mary Anne said. “It’s called a jujube.”

“Well … this morning it’s a meal.” I handed the phone to Mallory. “I take it you and Jessi have a baby and Mallory needs a litter?”

Mallory nodded. “Okay. You and Jessi go talk to each other. Then we’re off.”

Jessi nodded again. “It’s fine.”

“I heard about the recycling program at the school,” Claudia said.

I did some quick math, and then checked my friend list and voilà! A total of four kids was sitting for four families!

“Not bad,” I said. “They’re so many kids I wish we had a whole bunch instead of just two.”

Claudia waved her hand to let me in. “The We ♥ Kids Club!”

“Great,” I said. “Good for me.”

“I’m looking forward to sitting with you all day,” Mary Anne said. “Your shoulders will lift.”<|endoftext|> The sound of clacking on my ultrasonic Barbe radio could be heard all over town. (My ultrasonic barbe is old school, no pun intended.)

“Yesss, this is Dawn Schafer,” Claudia Kishi said. “I heard this new project going on today at your school headquarters, Aurora Elementary School.”

Shaking my head, I grinned. “Well, it’s true,” I said.

“And it’s nice to have you back,” Mallory Pike said.

Mary Anne Spier chirped in.

Rrrring!

I jumped sideways.

No, this wasn’t quite the crash that it seemed. All my left leg was mangled. I had landed on my stomach.
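
Before getting quantitative about individual passages, the impression that more training rounds don’t reliably buy you more similarity can also be checked with a few lines of pandas. This is a minimal sketch (again not from the original notebook, with my own variable names), assuming the same cosine_distances_freq dataframe and its gpt2-/sample_ filename conventions; it averages each round’s distance to the original chapter 2 excerpts:

#A rough check on whether later rounds land closer to the originals
#Splits the row and column names into GPT-2 samples and original excerpts
gpt2_rows = [name for name in cosine_distances_freq.index if name.startswith('gpt2-')]
original_cols = [name for name in cosine_distances_freq.columns if name.startswith('sample_')]
#For each GPT-2 sample, its average distance to all the originals
mean_dist_to_originals = cosine_distances_freq.loc[gpt2_rows, original_cols].mean(axis=1)
#Groups those averages by round (the "gpt2-011" ... "gpt2-901" part of the filename)
rounds = [name.split('_')[0] for name in gpt2_rows]
print(mean_dist_to_originals.groupby(rounds).mean())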

In order to quantitatively identify the most-different passages (instead of, you know, just relying on my eyeballs to differentiate colors on the graph), I summed up the cosine distance values for each row (remember that each row – and each column – corresponds to a text, and when two identical texts are compared, the value will be 0) and pulled out the 10 passages with the highest score:

#Adds a Sum column to the spreadsheet of cosine distances (the matrix is symmetric, so summing down each column gives the same total as summing across each row)
cosine_distances_freq['Sum'] = cosine_distances_freq.sum(axis=0)
#Display spreadsheet with cosine distances, including new Sum column at the end
cosine_distances_freq
[Output: the full cosine distance table, now with a Sum column at the far right giving each text's total distance to all the other texts.]
sample_006c_kristys_big_day_2_321 0.197514 0.325638 0.328233 0.279635 0.229286 0.283472 0.193906 0.307801 0.275493 0.287933 0.251349 0.409947 0.241455 0.221021 0.271003 0.305103 0.208866 0.250046 0.254887 0.244807 0.294265 0.349204 0.286745 0.250799 0.291874 0.324898 0.286885 0.294204 0.352242 0.238828 0.312788 0.385544 0.380962 0.232980 0.298323 0.240504 0.227275 0.257034 0.240755 0.277185 0.259972 0.262285 0.302019 0.338305 0.310073 0.199453 0.222332 0.355542 0.289890 0.228926 0.270275 0.309302 0.276546 0.000000 0.078129 0.199106 0.264423 0.286244 0.284772 0.338618 0.227218 0.289228 0.354889 0.334748 0.404609 0.332290 0.355717 0.282598 0.325219 0.240654 0.238019 0.231377 0.309899 0.308389 0.273480 0.264909 0.345127 0.249470 0.402265 0.268578 0.311851 0.271799 0.327127 0.280244 0.375672 0.340159 0.266494 0.398870 0.362521 0.273328 0.378757 0.351927 0.328639 0.301845 0.284673 0.268312 0.240819 0.258607 0.451808 0.305022 28.684059
sample_006c_kristys_big_day_2_850 0.230987 0.284376 0.386187 0.337177 0.239322 0.269518 0.210316 0.326482 0.269947 0.299678 0.261266 0.375752 0.248711 0.249303 0.283143 0.288381 0.226260 0.254568 0.249955 0.254808 0.305352 0.370228 0.287630 0.213314 0.261169 0.306749 0.255254 0.269050 0.293151 0.215571 0.323632 0.319644 0.335836 0.237694 0.276840 0.277992 0.257123 0.265716 0.233718 0.256820 0.235380 0.248339 0.301997 0.310437 0.290692 0.226658 0.235659 0.407465 0.278475 0.213926 0.285165 0.292579 0.273493 0.078129 0.000000 0.217213 0.257036 0.270872 0.258456 0.292771 0.213718 0.254880 0.328096 0.302489 0.362907 0.286163 0.318795 0.262978 0.280757 0.255038 0.242472 0.243249 0.267181 0.282618 0.258996 0.259997 0.312108 0.252785 0.343476 0.237037 0.301762 0.246598 0.290863 0.255564 0.337290 0.312680 0.282578 0.334772 0.311881 0.255336 0.342572 0.306804 0.325436 0.269300 0.255233 0.224151 0.294715 0.250677 0.370173 0.302368 27.519855
sample_013c_goodbye_stacey_goodbye_2_537 0.204898 0.366906 0.356832 0.313504 0.196616 0.293520 0.226996 0.286892 0.266022 0.305125 0.213584 0.441446 0.268880 0.230268 0.303826 0.308299 0.231067 0.235041 0.253246 0.232566 0.304341 0.378303 0.261385 0.208425 0.298303 0.375579 0.326789 0.296611 0.361743 0.223795 0.329460 0.385926 0.415092 0.212123 0.349350 0.259279 0.257876 0.216059 0.201761 0.267108 0.249872 0.300723 0.258664 0.358662 0.307367 0.199010 0.223877 0.309542 0.266050 0.208171 0.171074 0.299281 0.258589 0.199106 0.217213 0.000000 0.251863 0.232975 0.260168 0.378453 0.173231 0.290693 0.391083 0.344440 0.437912 0.298720 0.339882 0.294177 0.320379 0.242971 0.257633 0.240422 0.338723 0.335644 0.288327 0.283126 0.400885 0.277755 0.413014 0.272854 0.296028 0.277180 0.340522 0.303543 0.377187 0.335965 0.266839 0.379464 0.398742 0.274543 0.389034 0.335652 0.339734 0.293924 0.310813 0.286606 0.213513 0.275233 0.441801 0.321225 29.114921
sample_014c_hello_mallory_2_239 0.311633 0.298099 0.453871 0.381964 0.300452 0.262847 0.267348 0.352152 0.324450 0.276265 0.267494 0.361044 0.243313 0.252362 0.312511 0.328822 0.224546 0.305252 0.242335 0.329581 0.281060 0.363952 0.264891 0.212355 0.246438 0.297794 0.242199 0.296477 0.312044 0.215641 0.317846 0.288665 0.346774 0.253520 0.294045 0.229938 0.243692 0.308326 0.226796 0.235452 0.216885 0.221982 0.325206 0.269917 0.259580 0.221431 0.216047 0.450396 0.254775 0.215851 0.294211 0.332443 0.321744 0.264423 0.257036 0.251863 0.000000 0.301307 0.242793 0.300181 0.197046 0.224631 0.308088 0.294648 0.366695 0.317101 0.297078 0.265896 0.279808 0.268572 0.247181 0.237124 0.265412 0.272385 0.304549 0.260982 0.312882 0.237428 0.326942 0.232873 0.313557 0.256566 0.266609 0.238909 0.319790 0.263161 0.262863 0.301728 0.311689 0.246916 0.358122 0.297848 0.280231 0.250628 0.303274 0.241229 0.256585 0.269369 0.340923 0.352009 28.041642
sample_024c_kristy_and_the_mothers_day_surprise_2_106 0.234419 0.277485 0.410883 0.291071 0.282402 0.263329 0.299275 0.157034 0.306054 0.205339 0.173031 0.288193 0.293047 0.185458 0.220311 0.338605 0.203204 0.148727 0.175472 0.180012 0.363035 0.419218 0.372777 0.171947 0.267150 0.314289 0.277798 0.175522 0.259996 0.207430 0.166140 0.287716 0.346131 0.185095 0.262504 0.256083 0.240526 0.200317 0.253777 0.234191 0.191070 0.234366 0.185698 0.285533 0.211248 0.257021 0.271723 0.390565 0.280900 0.186213 0.277260 0.294242 0.268327 0.286244 0.270872 0.232975 0.301307 0.000000 0.161381 0.247714 0.243902 0.260339 0.300469 0.266221 0.348263 0.259921 0.281744 0.280628 0.221386 0.144097 0.303400 0.186269 0.245879 0.246009 0.247346 0.191706 0.286086 0.168443 0.321152 0.216522 0.210400 0.244800 0.262974 0.256812 0.264234 0.252755 0.177717 0.310731 0.273523 0.185060 0.234838 0.202919 0.219617 0.199598 0.266821 0.187809 0.363954 0.229382 0.324067 0.383739 25.101184
sample_025c_mary_anne_and_the_search_for_tigger_2_118 0.288067 0.200363 0.425135 0.328492 0.301236 0.204829 0.277300 0.241942 0.236153 0.197934 0.189096 0.247637 0.238179 0.193511 0.157811 0.276530 0.188718 0.175075 0.196145 0.230733 0.361186 0.360603 0.332242 0.135580 0.167295 0.249680 0.191386 0.163592 0.222138 0.208514 0.218465 0.243321 0.276401 0.175665 0.169377 0.222395 0.243159 0.195712 0.228152 0.225677 0.174764 0.160119 0.256593 0.203272 0.178880 0.249738 0.219718 0.416700 0.194252 0.164278 0.308166 0.265105 0.237732 0.284772 0.258456 0.260168 0.242793 0.161381 0.000000 0.183579 0.212324 0.195163 0.226590 0.204606 0.252440 0.218124 0.239907 0.160504 0.164980 0.176925 0.240643 0.165728 0.143530 0.188091 0.187157 0.140558 0.228534 0.154365 0.248535 0.193348 0.210965 0.148242 0.211738 0.197796 0.234960 0.171757 0.173933 0.233486 0.158020 0.195708 0.225558 0.208108 0.200845 0.145293 0.190934 0.106756 0.351048 0.153071 0.250487 0.394764 21.911413
sample_027c_jessi_and_the_superbrat_2_159 0.380403 0.146329 0.483243 0.437427 0.357356 0.176555 0.292005 0.354648 0.257339 0.205790 0.352002 0.148040 0.236346 0.283170 0.229143 0.317952 0.238629 0.241034 0.244528 0.339260 0.411781 0.349265 0.367884 0.181546 0.210026 0.177066 0.124097 0.192994 0.153838 0.252202 0.221981 0.129061 0.156172 0.204128 0.140854 0.268954 0.254473 0.325081 0.291640 0.194444 0.161681 0.186640 0.324794 0.150871 0.179705 0.289848 0.275147 0.494146 0.198449 0.222694 0.384755 0.215072 0.219787 0.338618 0.292771 0.378453 0.300181 0.247714 0.183579 0.000000 0.258851 0.128218 0.127720 0.143736 0.132628 0.219560 0.213926 0.179654 0.120754 0.252343 0.272038 0.207513 0.135232 0.108907 0.169718 0.188785 0.124050 0.174569 0.135189 0.170462 0.196413 0.157659 0.130551 0.196361 0.178495 0.136353 0.292459 0.172696 0.109685 0.172260 0.124561 0.163511 0.146942 0.145972 0.166539 0.126650 0.439752 0.272478 0.136127 0.456547 22.759457
sample_028c_welcome_back_stacey_2_679 0.269793 0.282440 0.437123 0.344655 0.241598 0.187422 0.220240 0.336158 0.287366 0.266287 0.262984 0.347299 0.209528 0.232534 0.303683 0.326962 0.201434 0.245379 0.214009 0.298793 0.272931 0.361210 0.255728 0.170080 0.210392 0.293315 0.226183 0.251909 0.239689 0.182985 0.294361 0.237270 0.287392 0.214885 0.272035 0.207585 0.224095 0.255295 0.201005 0.227640 0.184009 0.216907 0.306474 0.230520 0.223349 0.181991 0.179498 0.404447 0.213758 0.192730 0.239645 0.274157 0.241577 0.227218 0.213718 0.173231 0.197046 0.243902 0.212324 0.258851 0.000000 0.181127 0.265931 0.251910 0.321466 0.222690 0.252042 0.207901 0.231975 0.226324 0.219655 0.204097 0.245208 0.236172 0.237016 0.238061 0.263770 0.234366 0.261553 0.205410 0.233477 0.200817 0.234102 0.236960 0.287585 0.248601 0.266694 0.269642 0.290904 0.216017 0.281935 0.243970 0.234894 0.183040 0.238234 0.205957 0.242848 0.245940 0.286809 0.321243 24.493369
sample_030c_mary_anne_and_the_great_romance_2_570 0.316090 0.187055 0.414145 0.403461 0.290817 0.164976 0.255951 0.359923 0.261259 0.230653 0.304568 0.211448 0.223148 0.248649 0.246650 0.304949 0.192013 0.257393 0.242571 0.313434 0.322863 0.341159 0.277551 0.169491 0.198968 0.198725 0.133218 0.201539 0.178899 0.223761 0.256256 0.179460 0.214812 0.220965 0.173817 0.210458 0.219941 0.288495 0.248833 0.181946 0.160674 0.180614 0.302360 0.166182 0.210768 0.237351 0.213690 0.457868 0.195433 0.182181 0.298510 0.229004 0.209134 0.289228 0.254880 0.290693 0.224631 0.260339 0.195163 0.128218 0.181127 0.000000 0.139688 0.139739 0.183964 0.196376 0.205725 0.153501 0.145018 0.230777 0.252187 0.194829 0.163477 0.143493 0.167015 0.190842 0.176392 0.177479 0.161888 0.184192 0.218618 0.171162 0.153058 0.200522 0.218001 0.162435 0.274367 0.214139 0.164582 0.177833 0.182672 0.207952 0.157741 0.139682 0.189463 0.141337 0.319382 0.250433 0.180774 0.377648 22.046746
sample_032c_kristy_and_the_secret_of_susan_2_1152 0.375880 0.212913 0.467484 0.412577 0.355557 0.179304 0.288680 0.411734 0.303998 0.246213 0.383938 0.184061 0.246874 0.314058 0.260620 0.329235 0.254556 0.299726 0.279196 0.352289 0.409687 0.320219 0.371599 0.215520 0.237401 0.158879 0.171073 0.245569 0.157203 0.264348 0.264114 0.155936 0.219471 0.234288 0.175278 0.291594 0.239232 0.352689 0.302512 0.230138 0.189495 0.219609 0.324059 0.157452 0.215053 0.271521 0.304598 0.478581 0.235946 0.248884 0.411198 0.260868 0.256364 0.354889 0.328096 0.391083 0.308088 0.300469 0.226590 0.127720 0.265931 0.139688 0.000000 0.101520 0.091755 0.245489 0.225194 0.215273 0.151133 0.266339 0.286051 0.258375 0.211157 0.126157 0.243261 0.217881 0.147566 0.221173 0.140801 0.200507 0.224682 0.213484 0.126907 0.263550 0.206474 0.156554 0.331578 0.215757 0.159975 0.159925 0.176905 0.187361 0.185599 0.169523 0.192134 0.197734 0.422433 0.339612 0.169390 0.450983 25.092015
sample_041c_mary_anne_vs_logan_2_1765 0.318881 0.191621 0.414751 0.403293 0.297241 0.199661 0.279979 0.368467 0.266982 0.239297 0.322778 0.199205 0.240810 0.298712 0.230934 0.306131 0.234856 0.261731 0.274563 0.303813 0.435640 0.329584 0.359981 0.190632 0.188206 0.172915 0.162238 0.224234 0.170961 0.249615 0.246346 0.182989 0.258143 0.192747 0.174428 0.311111 0.254256 0.308027 0.265330 0.234491 0.168661 0.223210 0.294083 0.156208 0.237727 0.273975 0.288495 0.466786 0.234322 0.228045 0.381277 0.250670 0.249644 0.334748 0.302489 0.344440 0.294648 0.266221 0.204606 0.143736 0.251910 0.139739 0.101520 0.000000 0.106530 0.231955 0.237841 0.221707 0.176967 0.244731 0.278345 0.242867 0.205630 0.154570 0.228098 0.206468 0.185239 0.213929 0.165999 0.179551 0.215888 0.216380 0.108222 0.274116 0.230955 0.182778 0.311237 0.186261 0.157643 0.146510 0.186559 0.180631 0.220725 0.188194 0.197296 0.177261 0.406587 0.316139 0.178307 0.417879 24.182734
sample_045c_kristy_and_the_baby_parade_2_767 0.435693 0.216133 0.488166 0.496665 0.372579 0.224301 0.344129 0.464952 0.312304 0.289114 0.425887 0.211871 0.275957 0.392407 0.288599 0.336933 0.305144 0.332834 0.346775 0.371654 0.474229 0.316945 0.408221 0.254553 0.262490 0.183061 0.199712 0.286223 0.189877 0.322966 0.328746 0.174540 0.251208 0.276986 0.196617 0.370009 0.316156 0.379290 0.348217 0.291009 0.227567 0.274650 0.380674 0.189129 0.266678 0.351256 0.358117 0.513154 0.259306 0.303899 0.433020 0.259378 0.285277 0.404609 0.362907 0.437912 0.366695 0.348263 0.252440 0.132628 0.321466 0.183964 0.091755 0.106530 0.000000 0.274094 0.263645 0.248460 0.173421 0.339410 0.317071 0.321288 0.233297 0.164185 0.266303 0.259575 0.175578 0.275911 0.192335 0.226160 0.247503 0.250895 0.130551 0.310067 0.230346 0.194609 0.392354 0.211745 0.155666 0.188782 0.211208 0.197248 0.249123 0.232047 0.210036 0.207769 0.474364 0.378464 0.185814 0.494749 28.758498
sample_046c_mary_anne_misses_logan_2_488 0.302364 0.254805 0.432448 0.389362 0.267171 0.210380 0.331784 0.393022 0.362049 0.353928 0.363655 0.316003 0.321489 0.295558 0.280201 0.383553 0.297788 0.265224 0.298155 0.305329 0.358694 0.404073 0.356767 0.183936 0.224877 0.283047 0.228976 0.182368 0.157208 0.258531 0.355056 0.294438 0.279829 0.238229 0.262825 0.326662 0.320834 0.317343 0.250765 0.241278 0.216051 0.254077 0.352857 0.252550 0.252763 0.354846 0.209037 0.472160 0.261035 0.220258 0.290027 0.284129 0.282720 0.332290 0.286163 0.298720 0.317101 0.259921 0.218124 0.219560 0.222690 0.196376 0.245489 0.231955 0.274094 0.000000 0.200221 0.170070 0.203400 0.227773 0.326699 0.215530 0.179750 0.237469 0.184357 0.251587 0.215610 0.282233 0.259953 0.249420 0.243494 0.189535 0.253976 0.205812 0.189458 0.243863 0.284927 0.201041 0.261763 0.221160 0.216705 0.181752 0.264881 0.207283 0.180057 0.181537 0.377601 0.231546 0.238845 0.368391 26.702701
sample_048c_jessis_wish_2_68 0.322477 0.236518 0.410559 0.398056 0.345957 0.188351 0.355839 0.357328 0.335865 0.294853 0.362017 0.253867 0.306316 0.266935 0.269633 0.322326 0.298051 0.243039 0.241776 0.318403 0.428874 0.382756 0.383261 0.204418 0.247132 0.241755 0.221530 0.192086 0.194762 0.290080 0.297532 0.238335 0.244872 0.258138 0.246289 0.296515 0.306756 0.279235 0.289580 0.245860 0.221477 0.267434 0.287716 0.219811 0.191503 0.312927 0.259993 0.468722 0.238731 0.200089 0.346295 0.284419 0.279958 0.355717 0.318795 0.339882 0.297078 0.281744 0.239907 0.213926 0.252042 0.205725 0.225194 0.237841 0.263645 0.200221 0.000000 0.210745 0.179381 0.217557 0.299511 0.233201 0.206667 0.233523 0.241160 0.235408 0.209084 0.244790 0.244808 0.247085 0.204799 0.224895 0.234557 0.226780 0.205146 0.212836 0.289355 0.255382 0.251781 0.201269 0.207356 0.168887 0.241850 0.193130 0.224187 0.202530 0.411754 0.257439 0.209935 0.442421 26.497930
sample_049c_claudia_and_the_genius_of_elm_street_2_105 0.322182 0.181176 0.404964 0.393639 0.284870 0.180780 0.268810 0.362650 0.261285 0.235578 0.274030 0.258989 0.219780 0.239859 0.191822 0.296922 0.192347 0.241954 0.228967 0.293957 0.325291 0.307496 0.296968 0.170985 0.160433 0.230059 0.153629 0.143881 0.185342 0.208005 0.271022 0.233133 0.218712 0.186479 0.159280 0.255875 0.239642 0.257728 0.257132 0.213199 0.190329 0.169803 0.296840 0.197241 0.211837 0.256708 0.185561 0.415303 0.169374 0.163735 0.295780 0.253127 0.207735 0.282598 0.262978 0.294177 0.265896 0.280628 0.160504 0.179654 0.207901 0.153501 0.215273 0.221707 0.248460 0.170070 0.210745 0.000000 0.204703 0.226846 0.255899 0.160851 0.139028 0.201963 0.131130 0.160308 0.181363 0.191963 0.222523 0.214619 0.233139 0.126899 0.226054 0.178204 0.220130 0.165823 0.232807 0.209381 0.183049 0.201694 0.190624 0.227802 0.203447 0.153615 0.151688 0.137253 0.316282 0.148961 0.243669 0.391543 22.409576
sample_051c_staceys_ex_best_friend_2_1270 0.331877 0.185667 0.451504 0.402905 0.345632 0.178260 0.306402 0.315430 0.303526 0.234670 0.320140 0.199321 0.272025 0.248743 0.247169 0.282681 0.249923 0.221821 0.214514 0.313874 0.412101 0.342141 0.349420 0.180013 0.225824 0.209326 0.158351 0.210690 0.173404 0.247737 0.245824 0.180796 0.188059 0.229293 0.188414 0.238577 0.274095 0.289130 0.267911 0.211495 0.158638 0.200753 0.308196 0.184033 0.150830 0.255525 0.258230 0.458658 0.208214 0.210624 0.335869 0.209958 0.222032 0.325219 0.280757 0.320379 0.279808 0.221386 0.164980 0.120754 0.231975 0.145018 0.151133 0.176967 0.173421 0.203400 0.179381 0.204703 0.000000 0.219020 0.269883 0.209201 0.162069 0.130468 0.197342 0.194790 0.166365 0.197793 0.191054 0.163159 0.184154 0.145257 0.152639 0.172476 0.162550 0.166387 0.279846 0.213284 0.140293 0.185699 0.163614 0.136862 0.165386 0.149487 0.171859 0.144829 0.408155 0.261967 0.171543 0.448704 22.915660
sample_057c_dawn_saves_the_planet_2_522 0.171902 0.230922 0.337515 0.261155 0.244070 0.208581 0.234477 0.190369 0.304621 0.217175 0.203832 0.272415 0.265255 0.148470 0.188214 0.306757 0.198837 0.146340 0.170823 0.190885 0.331250 0.401561 0.333217 0.164547 0.228539 0.250537 0.229151 0.160432 0.233079 0.189220 0.181148 0.304550 0.334090 0.167469 0.223199 0.228870 0.206716 0.184038 0.210010 0.241976 0.188434 0.207415 0.190289 0.248996 0.187163 0.225829 0.218937 0.385871 0.264387 0.152170 0.290874 0.287796 0.248317 0.240654 0.255038 0.242971 0.268572 0.144097 0.176925 0.252343 0.226324 0.230777 0.266339 0.244731 0.339410 0.227773 0.217557 0.226846 0.219020 0.000000 0.277563 0.152361 0.210583 0.239705 0.209549 0.197428 0.268817 0.157524 0.272967 0.217500 0.175257 0.202595 0.253344 0.224770 0.238802 0.220457 0.150673 0.319164 0.266533 0.140178 0.223118 0.202957 0.217580 0.167688 0.218032 0.178396 0.299590 0.182952 0.319004 0.341678 22.918839
sample_064c_dawns_family_feud_2_305 0.304779 0.262623 0.421705 0.378320 0.313738 0.266181 0.237501 0.350071 0.233345 0.251636 0.280805 0.326668 0.229332 0.270272 0.278519 0.292946 0.219613 0.265214 0.276068 0.289763 0.335717 0.317338 0.245321 0.222683 0.255213 0.284191 0.256048 0.273890 0.310822 0.252600 0.299633 0.253326 0.295870 0.242954 0.264933 0.274738 0.252820 0.250506 0.258614 0.260850 0.247061 0.248271 0.304175 0.269806 0.278895 0.217863 0.263889 0.395865 0.225119 0.237279 0.316014 0.293789 0.267339 0.238019 0.242472 0.257633 0.247181 0.303400 0.240643 0.272038 0.219655 0.252187 0.286051 0.278345 0.317071 0.326699 0.299511 0.255899 0.269883 0.277563 0.000000 0.243224 0.271535 0.248508 0.292486 0.232674 0.299294 0.242603 0.308572 0.255301 0.272329 0.239412 0.264646 0.300336 0.351688 0.257868 0.309484 0.305025 0.272710 0.246551 0.324289 0.313782 0.281799 0.238259 0.267199 0.248376 0.315198 0.291561 0.334726 0.369016 27.505233
sample_065c_staceys_big_crush_2_1311 0.236151 0.217132 0.386931 0.294068 0.264499 0.199881 0.214948 0.231876 0.255058 0.189247 0.200252 0.255327 0.209304 0.146128 0.194501 0.321100 0.154611 0.159107 0.149719 0.234330 0.303136 0.376657 0.304278 0.159837 0.188588 0.288108 0.171772 0.161536 0.202896 0.187653 0.172923 0.250240 0.264653 0.162902 0.202343 0.167063 0.196777 0.212432 0.241676 0.168532 0.168410 0.151053 0.253008 0.220288 0.191114 0.181715 0.167319 0.371222 0.203093 0.149214 0.276514 0.275524 0.217438 0.231377 0.243249 0.240422 0.237124 0.186269 0.165728 0.207513 0.204097 0.194829 0.258375 0.242867 0.321288 0.215530 0.233201 0.160851 0.209201 0.152361 0.243224 0.000000 0.152952 0.185656 0.174919 0.155353 0.205511 0.131734 0.252454 0.172685 0.237371 0.143384 0.228621 0.154152 0.241440 0.176382 0.175930 0.275698 0.243348 0.187649 0.220671 0.215478 0.170508 0.170466 0.178961 0.151235 0.285910 0.150802 0.309814 0.328372 21.349046
sample_068c_jessi_and_the_bad_babysitter_2_89 0.330618 0.166459 0.452270 0.414790 0.350850 0.180572 0.274493 0.314473 0.278778 0.213735 0.287329 0.207775 0.225486 0.222490 0.196601 0.310838 0.215979 0.202714 0.196650 0.307598 0.370416 0.364330 0.341122 0.165744 0.130296 0.226834 0.120973 0.146892 0.163704 0.218020 0.261863 0.205730 0.204434 0.202445 0.147373 0.232388 0.243146 0.282241 0.262663 0.187339 0.151900 0.137263 0.352763 0.195603 0.186327 0.270638 0.201045 0.461608 0.197007 0.171276 0.365416 0.255936 0.242245 0.309899 0.267181 0.338723 0.265412 0.245879 0.143530 0.135232 0.245208 0.163477 0.211157 0.205630 0.233297 0.179750 0.206667 0.139028 0.162069 0.210583 0.271535 0.152952 0.000000 0.157811 0.139686 0.161004 0.173230 0.172026 0.192511 0.178381 0.218670 0.129954 0.183718 0.149837 0.175546 0.147834 0.214185 0.177944 0.146011 0.196821 0.163664 0.190535 0.155425 0.157621 0.157905 0.099580 0.400917 0.155999 0.207253 0.428598 22.137351
sample_070c_stacey_and_the_cheerleaders_2_1293 0.341379 0.187679 0.468152 0.378447 0.355494 0.179892 0.245565 0.330916 0.243601 0.211582 0.312294 0.151998 0.220255 0.257087 0.226950 0.315896 0.193969 0.233757 0.227531 0.339703 0.388990 0.350811 0.342293 0.187470 0.213615 0.201512 0.143890 0.212501 0.176138 0.248939 0.230036 0.167178 0.195161 0.193036 0.163255 0.236633 0.236567 0.298370 0.281317 0.174393 0.160224 0.186826 0.300463 0.171846 0.203512 0.227842 0.257333 0.398984 0.174836 0.217201 0.371944 0.216288 0.203781 0.308389 0.282618 0.335644 0.272385 0.246009 0.188091 0.108907 0.236172 0.143493 0.126157 0.154570 0.164185 0.237469 0.233523 0.201963 0.130468 0.239705 0.248508 0.185656 0.157811 0.000000 0.209733 0.169905 0.161060 0.176734 0.132463 0.150623 0.192492 0.158472 0.111250 0.195571 0.214875 0.142637 0.265535 0.216082 0.145256 0.189626 0.165980 0.185295 0.145751 0.144081 0.177974 0.148925 0.400917 0.263876 0.166395 0.420857 22.409420
sample_078c_claudia_and_crazy_peaches_2_36 0.298178 0.201225 0.394287 0.393294 0.278061 0.195512 0.261727 0.343577 0.281359 0.233728 0.290647 0.268092 0.243707 0.254365 0.212714 0.304134 0.225436 0.220347 0.225639 0.278969 0.336304 0.332063 0.329131 0.184904 0.187511 0.229640 0.154090 0.172175 0.191271 0.218756 0.261339 0.259805 0.237176 0.186456 0.164684 0.264537 0.248885 0.271679 0.223586 0.204218 0.186077 0.190611 0.332539 0.209133 0.211084 0.265812 0.188007 0.430699 0.219965 0.186402 0.296114 0.240871 0.212160 0.273480 0.258996 0.288327 0.304549 0.247346 0.187157 0.169718 0.237016 0.167015 0.243261 0.228098 0.266303 0.184357 0.241160 0.131130 0.197342 0.209549 0.292486 0.174919 0.139686 0.209733 0.000000 0.183068 0.195388 0.202029 0.226728 0.193000 0.199575 0.147059 0.219818 0.187525 0.229292 0.186890 0.264549 0.209070 0.192453 0.204335 0.165070 0.233572 0.208631 0.182628 0.186974 0.142195 0.336081 0.172536 0.273873 0.394055 23.220706
sample_079c_mary_anne_breaks_the_rules_2_782 0.274338 0.192756 0.385925 0.322270 0.319879 0.196122 0.251085 0.237668 0.228271 0.176680 0.206844 0.215951 0.235567 0.187669 0.146069 0.298060 0.149629 0.183064 0.199027 0.234777 0.383995 0.341476 0.345237 0.166971 0.167491 0.201281 0.173235 0.143747 0.218386 0.211898 0.152807 0.224413 0.261468 0.161403 0.144755 0.251690 0.215002 0.198845 0.247864 0.180244 0.162463 0.156892 0.216491 0.190409 0.213446 0.219965 0.234835 0.406419 0.192322 0.159964 0.326739 0.284401 0.224275 0.264909 0.259997 0.283126 0.260982 0.191706 0.140558 0.188785 0.238061 0.190842 0.217881 0.206468 0.259575 0.251587 0.235408 0.160308 0.194790 0.197428 0.232674 0.155353 0.161004 0.169905 0.183068 0.000000 0.202657 0.130677 0.200680 0.174291 0.177300 0.136792 0.194303 0.216372 0.266997 0.158809 0.184740 0.250767 0.166955 0.197332 0.183929 0.232799 0.174587 0.161177 0.216747 0.145028 0.346365 0.175080 0.255358 0.397926 21.614560
sample_082c_jessi_and_the_troublemaker_2_738 0.382497 0.174884 0.493576 0.409744 0.364254 0.195301 0.305259 0.388452 0.297429 0.197700 0.360444 0.172830 0.217485 0.285326 0.242132 0.335500 0.251581 0.234392 0.229291 0.337305 0.383296 0.332368 0.385248 0.212175 0.199283 0.206413 0.166476 0.189789 0.133638 0.227904 0.250394 0.140875 0.176794 0.203419 0.163676 0.267576 0.258936 0.353841 0.304761 0.208400 0.161713 0.179431 0.331608 0.154821 0.175035 0.268089 0.232292 0.480444 0.217258 0.213529 0.404536 0.261370 0.251876 0.345127 0.312108 0.400885 0.312882 0.286086 0.228534 0.124050 0.263770 0.176392 0.147566 0.185239 0.175578 0.215610 0.209084 0.181363 0.166365 0.268817 0.299294 0.205511 0.173230 0.161060 0.195388 0.202657 0.000000 0.188750 0.166259 0.149326 0.227122 0.171145 0.166328 0.184836 0.132299 0.184901 0.297546 0.160294 0.163776 0.170945 0.143630 0.139639 0.157699 0.160064 0.158711 0.179955 0.421747 0.243310 0.200278 0.441927 23.819734
sample_084c_dawn_and_the_school_spirit_war_2_493 0.250864 0.211782 0.370129 0.287904 0.306851 0.192394 0.208831 0.204856 0.227864 0.143060 0.186979 0.197966 0.208169 0.128019 0.150824 0.308459 0.140893 0.147700 0.156204 0.236037 0.350753 0.371720 0.326957 0.178696 0.186522 0.236347 0.167016 0.169721 0.220355 0.188335 0.137953 0.218086 0.276758 0.148333 0.169193 0.187739 0.171531 0.223821 0.251263 0.181099 0.152130 0.150476 0.209628 0.167608 0.182522 0.185622 0.213341 0.425719 0.208197 0.143108 0.329336 0.296869 0.241038 0.249470 0.252785 0.277755 0.237428 0.168443 0.154365 0.174569 0.234366 0.177479 0.221173 0.213929 0.275911 0.282233 0.244790 0.191963 0.197793 0.157524 0.242603 0.131734 0.172026 0.176734 0.202029 0.130677 0.188750 0.000000 0.213096 0.176668 0.196645 0.164812 0.203354 0.201461 0.243850 0.156508 0.157300 0.284181 0.184115 0.176321 0.188790 0.217033 0.148132 0.151396 0.222084 0.153278 0.314165 0.191318 0.269389 0.359927 21.063928
sample_090c_welcome_to_the_bsc_abby_2_1491 0.411028 0.219962 0.523369 0.457020 0.409007 0.185773 0.298819 0.407096 0.306108 0.266951 0.405358 0.163897 0.278570 0.331566 0.302337 0.410025 0.264826 0.313081 0.307267 0.425361 0.449189 0.422208 0.396067 0.247779 0.218995 0.176802 0.159707 0.230799 0.165524 0.270287 0.251494 0.148638 0.215871 0.254431 0.158405 0.314539 0.277426 0.374238 0.309010 0.225249 0.176960 0.224708 0.338641 0.146000 0.217410 0.292247 0.306278 0.541717 0.246683 0.262807 0.460550 0.320477 0.291395 0.402265 0.343476 0.413014 0.326942 0.321152 0.248535 0.135189 0.261553 0.161888 0.140801 0.165999 0.192335 0.259953 0.244808 0.222523 0.191054 0.272967 0.308572 0.252454 0.192511 0.132463 0.226728 0.200680 0.166259 0.213096 0.000000 0.204945 0.213704 0.189586 0.165309 0.280790 0.250008 0.145141 0.322993 0.226030 0.156898 0.211288 0.153708 0.242831 0.160078 0.135143 0.225498 0.197401 0.466279 0.326292 0.132783 0.481855 26.665725
sample_097c_claudia_and_the_worlds_cutest_baby_2_1631 0.300866 0.202108 0.442119 0.379274 0.290506 0.217345 0.231898 0.316893 0.242875 0.167833 0.252997 0.219328 0.187565 0.222929 0.225339 0.286471 0.183567 0.196772 0.157460 0.270022 0.345531 0.313046 0.328214 0.162485 0.163426 0.221984 0.165487 0.199911 0.171837 0.163148 0.216775 0.163798 0.207747 0.156212 0.182265 0.230142 0.232477 0.264146 0.222695 0.178827 0.125125 0.165906 0.285303 0.178354 0.185067 0.174343 0.183605 0.426473 0.202142 0.177661 0.318263 0.221106 0.206133 0.268578 0.237037 0.272854 0.232873 0.216522 0.193348 0.170462 0.205410 0.184192 0.200507 0.179551 0.226160 0.249420 0.247085 0.214619 0.163159 0.217500 0.255301 0.172685 0.178381 0.150623 0.193000 0.174291 0.149326 0.176668 0.204945 0.000000 0.210463 0.137957 0.115372 0.172862 0.178379 0.194447 0.256328 0.164915 0.196052 0.167407 0.182404 0.155126 0.178236 0.166917 0.160433 0.172002 0.355875 0.226830 0.239341 0.383535 21.483180
sample_100c_kristys_worst_idea_2_2163 0.303292 0.253219 0.417056 0.352157 0.341590 0.177419 0.281600 0.272866 0.297548 0.241931 0.299265 0.231094 0.269938 0.240078 0.231460 0.357344 0.240627 0.205207 0.247318 0.282555 0.413975 0.382778 0.377923 0.207362 0.247428 0.218062 0.229708 0.197968 0.202702 0.261848 0.220095 0.241827 0.276895 0.194538 0.221274 0.290131 0.251986 0.228169 0.267061 0.233724 0.167668 0.259757 0.234623 0.191581 0.200593 0.271896 0.284441 0.402392 0.248217 0.233897 0.334569 0.248477 0.215651 0.311851 0.301762 0.296028 0.313557 0.210400 0.210965 0.196413 0.233477 0.218618 0.224682 0.215888 0.247503 0.243494 0.204799 0.233139 0.184154 0.175257 0.272329 0.237371 0.218670 0.192492 0.199575 0.177300 0.227122 0.196645 0.213704 0.210463 0.000000 0.179301 0.195796 0.274047 0.245470 0.189525 0.242510 0.259170 0.201867 0.170113 0.178941 0.181683 0.189597 0.157475 0.241595 0.184678 0.375749 0.248129 0.200322 0.424791 24.511197
sample_106c_claudia_queen_of_the_seventh_grade_2_958 0.319816 0.184885 0.469058 0.382819 0.298262 0.184316 0.255624 0.303143 0.251246 0.211082 0.275731 0.239570 0.223181 0.208019 0.208014 0.327194 0.185753 0.203880 0.195964 0.304733 0.348058 0.364013 0.307364 0.161652 0.149913 0.204487 0.144905 0.163744 0.175715 0.184521 0.234229 0.200201 0.211890 0.186083 0.156045 0.227550 0.230797 0.266347 0.234332 0.174042 0.140503 0.140756 0.307915 0.172671 0.190330 0.218306 0.189126 0.444895 0.172484 0.183702 0.323552 0.247132 0.212879 0.271799 0.246598 0.277180 0.256566 0.244800 0.148242 0.157659 0.200817 0.171162 0.213484 0.216380 0.250895 0.189535 0.224895 0.126899 0.145257 0.202595 0.239412 0.143384 0.129954 0.158472 0.147059 0.136792 0.171145 0.164812 0.189586 0.137957 0.179301 0.000000 0.172446 0.151977 0.204908 0.150784 0.213207 0.186963 0.148983 0.207503 0.167410 0.207361 0.165118 0.142547 0.161403 0.126353 0.343124 0.164629 0.217346 0.409680 21.386809
sample_107c_mind_your_own_business_kristy_2_1194 0.343380 0.196764 0.469952 0.412021 0.316455 0.187874 0.256317 0.378353 0.269655 0.211196 0.311305 0.197098 0.205085 0.277887 0.254889 0.313197 0.213675 0.249560 0.222187 0.324123 0.372235 0.328946 0.337243 0.192735 0.184120 0.167708 0.140632 0.235875 0.164695 0.211741 0.251996 0.149425 0.230326 0.187847 0.174612 0.266321 0.245999 0.316116 0.268012 0.213786 0.138277 0.192085 0.323145 0.168296 0.210553 0.223134 0.251047 0.460230 0.208641 0.231713 0.384144 0.204114 0.218186 0.327127 0.290863 0.340522 0.266609 0.262974 0.211738 0.130551 0.234102 0.153058 0.126907 0.108222 0.130551 0.253976 0.234557 0.226054 0.152639 0.253344 0.264646 0.228621 0.183718 0.111250 0.219818 0.194303 0.166328 0.203354 0.165309 0.115372 0.195796 0.172446 0.000000 0.220451 0.204822 0.169708 0.303980 0.170715 0.154381 0.165636 0.182092 0.171763 0.175469 0.181698 0.177281 0.161618 0.391581 0.290557 0.171284 0.413971 23.096672
sample_108c_dont_give_up_mallory_2_489 0.307876 0.225118 0.449773 0.330739 0.291158 0.209340 0.256810 0.324646 0.310530 0.233016 0.293978 0.269843 0.210008 0.214153 0.252343 0.280922 0.227319 0.211938 0.150517 0.322925 0.282060 0.373731 0.318265 0.182158 0.211729 0.283505 0.178163 0.212928 0.195415 0.181017 0.283095 0.247348 0.220402 0.206869 0.227247 0.197112 0.236661 0.323635 0.260736 0.206104 0.166146 0.147930 0.338737 0.230123 0.182045 0.224044 0.172933 0.419397 0.189993 0.168534 0.281450 0.268272 0.264701 0.280244 0.255564 0.303543 0.238909 0.256812 0.197796 0.196361 0.236960 0.200522 0.263550 0.274116 0.310067 0.205812 0.226780 0.178204 0.172476 0.224770 0.300336 0.154152 0.149837 0.195571 0.187525 0.216372 0.184836 0.201461 0.280790 0.172862 0.274047 0.151977 0.220451 0.000000 0.156591 0.217706 0.221245 0.228331 0.238298 0.220491 0.218909 0.178450 0.191262 0.202638 0.177821 0.161536 0.308272 0.177414 0.295938 0.366517 23.529561
sample_110c_abby_and_the_bad_sport_2_535 0.361247 0.235671 0.479589 0.411300 0.350967 0.210535 0.347418 0.377716 0.375881 0.259444 0.369433 0.219684 0.263477 0.278276 0.285310 0.343905 0.307016 0.236000 0.221873 0.334064 0.377652 0.352108 0.402877 0.197986 0.228100 0.263732 0.201916 0.216196 0.159766 0.245957 0.318329 0.230518 0.218967 0.240163 0.239752 0.264061 0.301529 0.338090 0.308601 0.254327 0.182275 0.212695 0.361543 0.231323 0.179711 0.307695 0.216004 0.475494 0.254597 0.213065 0.344256 0.243217 0.268556 0.375672 0.337290 0.377187 0.319790 0.264234 0.234960 0.178495 0.287585 0.218001 0.206474 0.230955 0.230346 0.189458 0.205146 0.220130 0.162550 0.238802 0.351688 0.241440 0.175546 0.214875 0.229292 0.266997 0.132299 0.243850 0.250008 0.178379 0.245470 0.204908 0.204822 0.156591 0.000000 0.240136 0.259384 0.176531 0.247053 0.190377 0.182080 0.113307 0.196648 0.199246 0.161691 0.201401 0.415717 0.249158 0.227997 0.452935 26.006766
sample_111c_staceys_secret_friend_2_700 0.346615 0.186566 0.460735 0.385803 0.360840 0.163627 0.284481 0.303371 0.271506 0.184465 0.302526 0.167797 0.261136 0.248321 0.217028 0.346941 0.219576 0.230519 0.237267 0.326254 0.389871 0.384153 0.366430 0.196471 0.203153 0.203952 0.136904 0.184543 0.187215 0.260588 0.198835 0.178507 0.242156 0.208093 0.154758 0.263832 0.230428 0.285951 0.297303 0.169797 0.179780 0.178655 0.264544 0.164046 0.196004 0.261418 0.263166 0.472944 0.187872 0.207913 0.382163 0.271692 0.247237 0.340159 0.312680 0.335965 0.263161 0.252755 0.171757 0.136353 0.248601 0.162435 0.156554 0.182778 0.194609 0.243863 0.212836 0.165823 0.166387 0.220457 0.257868 0.176382 0.147834 0.142637 0.186890 0.158809 0.184901 0.156508 0.145141 0.194447 0.189525 0.150784 0.169708 0.217706 0.240136 0.000000 0.222132 0.220955 0.126794 0.199407 0.165010 0.217741 0.157049 0.156167 0.193880 0.135849 0.394513 0.225671 0.179254 0.455577 23.062196
sample_120c_mary_anne_and_the_playground_fight_2_942 0.223632 0.290908 0.375010 0.238020 0.303969 0.218159 0.283765 0.178115 0.325280 0.231306 0.201124 0.277162 0.284998 0.136434 0.172074 0.376356 0.197491 0.176175 0.194445 0.236802 0.355299 0.450157 0.376735 0.202380 0.230801 0.319777 0.249417 0.178051 0.298081 0.242687 0.211066 0.369726 0.406574 0.194165 0.261771 0.219550 0.202261 0.215495 0.272437 0.224480 0.207177 0.175282 0.214082 0.293325 0.219222 0.249866 0.236746 0.398819 0.256415 0.177494 0.325300 0.348526 0.305035 0.266494 0.282578 0.266839 0.262863 0.177717 0.173933 0.292459 0.266694 0.274367 0.331578 0.311237 0.392354 0.284927 0.289355 0.232807 0.279846 0.150673 0.309484 0.175930 0.214185 0.265535 0.264549 0.184740 0.297546 0.157300 0.322993 0.256328 0.242510 0.213207 0.303980 0.221245 0.259384 0.222132 0.000000 0.357563 0.290474 0.248833 0.278008 0.270412 0.218336 0.218994 0.272414 0.182991 0.305044 0.142985 0.351510 0.401938 25.796692
sample_121c_abby_in_wonderland_2_494 0.429524 0.200397 0.539196 0.528969 0.357803 0.268070 0.356586 0.461545 0.298635 0.248904 0.385972 0.255807 0.242215 0.369448 0.317734 0.328320 0.291769 0.292130 0.279930 0.369312 0.374158 0.313590 0.387202 0.188913 0.170621 0.240682 0.192090 0.230313 0.145188 0.214019 0.364110 0.180173 0.206062 0.233249 0.223903 0.362646 0.357049 0.369342 0.284490 0.265915 0.195843 0.231503 0.403940 0.206181 0.257096 0.336785 0.221800 0.514414 0.258299 0.258578 0.401883 0.242301 0.277190 0.398870 0.334772 0.379464 0.301728 0.310731 0.233486 0.172696 0.269642 0.214139 0.215757 0.186261 0.211745 0.201041 0.255382 0.209381 0.213284 0.319164 0.305025 0.275698 0.177944 0.216082 0.209070 0.250767 0.160294 0.284181 0.226030 0.164915 0.259170 0.186963 0.170715 0.228331 0.176531 0.220955 0.357563 0.000000 0.199272 0.223864 0.201619 0.173722 0.273948 0.195344 0.170251 0.195333 0.459303 0.273378 0.216599 0.452673 27.264883
sample_m03c_mallory_and_the_ghost_cat_2_261 0.388881 0.134324 0.500505 0.433921 0.385901 0.202765 0.293025 0.343293 0.239018 0.211175 0.328682 0.180213 0.258218 0.291799 0.217031 0.307547 0.245940 0.249673 0.255262 0.353209 0.441003 0.362205 0.377793 0.215618 0.188535 0.165612 0.140870 0.196226 0.189009 0.255444 0.235549 0.149175 0.222222 0.225549 0.125994 0.313491 0.263944 0.319034 0.300772 0.220642 0.170838 0.178575 0.312898 0.146369 0.197237 0.306663 0.301406 0.508227 0.219916 0.242795 0.417434 0.263124 0.255601 0.362521 0.311881 0.398742 0.311689 0.273523 0.158020 0.109685 0.290904 0.164582 0.159975 0.157643 0.155666 0.261763 0.251781 0.183049 0.140293 0.266533 0.272710 0.243348 0.146011 0.145256 0.192453 0.166955 0.163776 0.184115 0.156898 0.196052 0.201867 0.148983 0.154381 0.238298 0.247053 0.126794 0.290474 0.199272 0.000000 0.206000 0.153090 0.222496 0.187076 0.152004 0.196043 0.126036 0.462301 0.263618 0.180054 0.471258 24.303077
sample_m04c_kristy_and_the_missing_child_2_1829 0.256085 0.203273 0.392852 0.331188 0.250768 0.192908 0.230721 0.282845 0.269543 0.194810 0.242192 0.233482 0.182585 0.219446 0.208830 0.268706 0.203016 0.183831 0.183171 0.209552 0.325510 0.300502 0.296802 0.166987 0.208752 0.194336 0.176554 0.167327 0.163850 0.178378 0.214989 0.181702 0.248339 0.163773 0.186299 0.247416 0.199748 0.230263 0.231738 0.212604 0.141223 0.196103 0.219895 0.186282 0.166660 0.213631 0.224405 0.389361 0.219889 0.146347 0.281952 0.251848 0.233462 0.273328 0.255336 0.274543 0.246916 0.185060 0.195708 0.172260 0.216017 0.177833 0.159925 0.146510 0.188782 0.221160 0.201269 0.201694 0.185699 0.140178 0.246551 0.187649 0.196821 0.189626 0.204335 0.197332 0.170945 0.176321 0.211288 0.167407 0.170113 0.207503 0.165636 0.220491 0.190377 0.199407 0.248833 0.223864 0.206000 0.000000 0.187535 0.133423 0.186401 0.145040 0.182033 0.175479 0.323979 0.240503 0.236483 0.341251 21.281579
sample_m06c_the_mystery_at_claudias_house_2_420 0.357987 0.185496 0.476576 0.409219 0.355229 0.199670 0.345792 0.335873 0.324635 0.232028 0.366710 0.169289 0.288997 0.285395 0.226591 0.351853 0.285613 0.228056 0.252583 0.334955 0.465752 0.391034 0.406761 0.219948 0.210339 0.189213 0.168411 0.181308 0.154289 0.257484 0.211385 0.162084 0.200653 0.209163 0.154069 0.339237 0.274019 0.334861 0.307760 0.215203 0.180234 0.219450 0.298554 0.155814 0.190499 0.311188 0.280486 0.511953 0.231583 0.221393 0.394283 0.264521 0.241262 0.378757 0.342572 0.389034 0.358122 0.234838 0.225558 0.124561 0.281935 0.182672 0.176905 0.186559 0.211208 0.216705 0.207356 0.190624 0.163614 0.223118 0.324289 0.220671 0.163664 0.165980 0.165070 0.183929 0.143630 0.188790 0.153708 0.182404 0.178941 0.167410 0.182092 0.218909 0.182080 0.165010 0.278008 0.201619 0.153090 0.187535 0.000000 0.152508 0.147170 0.162839 0.206758 0.177183 0.470359 0.261603 0.179455 0.478028 24.703610
sample_m10c_stacey_and_the_mystery_money_2_937 0.324782 0.224020 0.465530 0.375826 0.317295 0.205560 0.334357 0.305196 0.341631 0.238586 0.323558 0.211683 0.241104 0.253025 0.243083 0.296096 0.278244 0.181901 0.188880 0.267105 0.397745 0.333504 0.404037 0.176584 0.224923 0.259497 0.220954 0.185355 0.137473 0.230554 0.266875 0.191274 0.214558 0.196440 0.235769 0.284507 0.288709 0.276863 0.285683 0.245155 0.164203 0.232919 0.279576 0.196832 0.151982 0.277772 0.236669 0.457658 0.249003 0.195886 0.319447 0.224191 0.239032 0.351927 0.306804 0.335652 0.297848 0.202919 0.208108 0.163511 0.243970 0.207952 0.187361 0.180631 0.197248 0.181752 0.168887 0.227802 0.136862 0.202957 0.313782 0.215478 0.190535 0.185295 0.233572 0.232799 0.139639 0.217033 0.242831 0.155126 0.181683 0.207361 0.171763 0.178450 0.113307 0.217741 0.270412 0.173722 0.222496 0.133423 0.152508 0.000000 0.190833 0.169236 0.171831 0.181914 0.420830 0.259833 0.208039 0.414049 23.866805
sample_m12c_dawn_and_the_surfer_ghost_2_126 0.337621 0.243964 0.477651 0.360095 0.373178 0.185747 0.285909 0.286622 0.336891 0.177641 0.315406 0.180479 0.228984 0.208486 0.237967 0.398477 0.206416 0.203988 0.199316 0.341429 0.378542 0.414897 0.355920 0.235049 0.233820 0.226229 0.154193 0.205538 0.192837 0.235892 0.193600 0.183015 0.236629 0.222234 0.169930 0.172937 0.174904 0.322191 0.301010 0.178273 0.135663 0.150297 0.301869 0.176203 0.156908 0.222447 0.240879 0.459760 0.192675 0.207594 0.382390 0.302440 0.258709 0.328639 0.325436 0.339734 0.280231 0.219617 0.200845 0.146942 0.234894 0.157741 0.185599 0.220725 0.249123 0.264881 0.241850 0.203447 0.165386 0.217580 0.281799 0.170508 0.155425 0.145751 0.208631 0.174587 0.157699 0.148132 0.160078 0.178236 0.189597 0.165118 0.175469 0.191262 0.196648 0.157049 0.218336 0.273948 0.187076 0.186401 0.147170 0.190833 0.000000 0.145500 0.215135 0.170757 0.357305 0.246472 0.209888 0.448544 23.397766
sample_m14c_stacey_and_the_mystery_at_the_mall_2_1474 0.309745 0.178499 0.466289 0.380298 0.320493 0.148811 0.247932 0.280627 0.238021 0.195148 0.260071 0.185183 0.195107 0.203938 0.208577 0.315346 0.180078 0.196995 0.199063 0.304516 0.336892 0.352635 0.329666 0.156659 0.176405 0.198170 0.146071 0.157175 0.156093 0.186671 0.208565 0.158571 0.211251 0.184172 0.174047 0.216557 0.221037 0.234428 0.245191 0.192280 0.131708 0.167936 0.245674 0.156279 0.154978 0.227099 0.203903 0.417698 0.202628 0.159029 0.330806 0.236946 0.197176 0.301845 0.269300 0.293924 0.250628 0.199598 0.145293 0.145972 0.183040 0.139682 0.169523 0.188194 0.232047 0.207283 0.193130 0.153615 0.149487 0.167688 0.238259 0.170466 0.157621 0.144081 0.182628 0.161177 0.160064 0.151396 0.135143 0.166917 0.157475 0.142547 0.181698 0.202638 0.199246 0.156167 0.218994 0.195344 0.152004 0.145040 0.162839 0.169236 0.145500 0.000000 0.169302 0.129188 0.343990 0.203170 0.170215 0.388769 20.780493
sample_m18c_stacey_and_the_mystery_at_the_empty_house_2_224 0.313145 0.172840 0.391493 0.401906 0.290799 0.202350 0.272916 0.367010 0.252641 0.222286 0.313143 0.239193 0.205134 0.262180 0.230068 0.296676 0.222679 0.223385 0.214056 0.284372 0.314635 0.311908 0.338699 0.178521 0.204485 0.229542 0.169085 0.188178 0.155095 0.213456 0.283017 0.204540 0.198752 0.185932 0.200190 0.256583 0.260830 0.251517 0.271721 0.221492 0.173641 0.206912 0.323460 0.198405 0.188212 0.251113 0.188062 0.392375 0.211534 0.190202 0.311540 0.226992 0.210123 0.284673 0.255233 0.310813 0.303274 0.266821 0.190934 0.166539 0.238234 0.189463 0.192134 0.197296 0.210036 0.180057 0.224187 0.151688 0.171859 0.218032 0.267199 0.178961 0.157905 0.177974 0.186974 0.216747 0.158711 0.222084 0.225498 0.160433 0.241595 0.161403 0.177281 0.177821 0.161691 0.193880 0.272414 0.170251 0.196043 0.182033 0.206758 0.171831 0.215135 0.169302 0.000000 0.168000 0.348076 0.212712 0.244287 0.390592 22.853884
sample_m20c_mary_anne_and_the_zoo_mystery_2_80 0.294062 0.160327 0.438953 0.372245 0.300861 0.162823 0.261494 0.263552 0.249381 0.204330 0.223342 0.203145 0.227481 0.191622 0.182050 0.298989 0.194611 0.176264 0.196508 0.274705 0.343996 0.371339 0.338031 0.139988 0.150564 0.216824 0.135118 0.137644 0.169019 0.196279 0.219585 0.204787 0.241973 0.167839 0.151018 0.221536 0.235584 0.226097 0.240599 0.190370 0.143265 0.148305 0.274956 0.180978 0.178828 0.268315 0.213909 0.431608 0.195437 0.165003 0.318459 0.221539 0.206175 0.268312 0.224151 0.286606 0.241229 0.187809 0.106756 0.126650 0.205957 0.141337 0.197734 0.177261 0.207769 0.181537 0.202530 0.137253 0.144829 0.178396 0.248376 0.151235 0.099580 0.148925 0.142195 0.145028 0.179955 0.153278 0.197401 0.172002 0.184678 0.126353 0.161618 0.161536 0.201401 0.135849 0.182991 0.195333 0.126036 0.175479 0.177183 0.181914 0.170757 0.129188 0.168000 0.000000 0.354803 0.153284 0.179063 0.398172 20.539438
sample_m25c_kristy_and_the_middle_school_vandal_2_134 0.259532 0.418507 0.366972 0.329566 0.223750 0.323053 0.248307 0.391393 0.374117 0.323612 0.290284 0.487692 0.271450 0.282498 0.357542 0.360828 0.252775 0.342194 0.290923 0.341485 0.265152 0.399764 0.290226 0.296619 0.366065 0.396230 0.355810 0.389845 0.417195 0.269021 0.383780 0.455855 0.463396 0.295462 0.397664 0.242387 0.241974 0.331440 0.275699 0.310468 0.315117 0.310879 0.364468 0.396068 0.348232 0.232059 0.242713 0.362717 0.313639 0.270179 0.220285 0.375579 0.334223 0.240819 0.294715 0.213513 0.256585 0.363954 0.351048 0.439752 0.242848 0.319382 0.422433 0.406587 0.474364 0.377601 0.411754 0.316282 0.408155 0.299590 0.315198 0.285910 0.400917 0.400917 0.336081 0.346365 0.421747 0.314165 0.466279 0.355875 0.375749 0.343124 0.391581 0.308272 0.415717 0.394513 0.305044 0.459303 0.462301 0.323979 0.470359 0.420830 0.357305 0.343990 0.348076 0.354803 0.000000 0.314246 0.510381 0.301307 34.150405
sample_m27c_claudia_and_the_lighthouse_ghost_2_356 0.266412 0.240564 0.403492 0.327204 0.304472 0.213093 0.285129 0.255750 0.295100 0.233488 0.218690 0.311040 0.257867 0.181296 0.174332 0.309472 0.205517 0.173948 0.191521 0.245020 0.342433 0.374978 0.335444 0.188519 0.182951 0.321556 0.211369 0.153329 0.262056 0.221420 0.258129 0.340358 0.327262 0.183562 0.226544 0.241400 0.254117 0.230587 0.249352 0.225640 0.212333 0.171539 0.283597 0.279118 0.219264 0.253543 0.185186 0.387444 0.228077 0.155235 0.318535 0.313103 0.262756 0.258607 0.250677 0.275233 0.269369 0.229382 0.153071 0.272478 0.245940 0.250433 0.339612 0.316139 0.378464 0.231546 0.257439 0.148961 0.261967 0.182952 0.291561 0.150802 0.155999 0.263876 0.172536 0.175080 0.243310 0.191318 0.326292 0.226830 0.248129 0.164629 0.290557 0.177414 0.249158 0.225671 0.142985 0.273378 0.263618 0.240503 0.261603 0.259833 0.246472 0.203170 0.212712 0.153284 0.314246 0.000000 0.359443 0.393566 24.694388
sample_m30c_kristy_and_the_mystery_train_2_1949 0.458586 0.236965 0.553948 0.505739 0.423442 0.200049 0.395719 0.428533 0.359700 0.335376 0.444650 0.205381 0.324456 0.373120 0.326208 0.414962 0.333569 0.351136 0.349696 0.446370 0.490096 0.447380 0.449219 0.249609 0.253072 0.209327 0.195521 0.248578 0.177098 0.333038 0.319827 0.182524 0.245547 0.301521 0.243234 0.370589 0.339218 0.384990 0.346397 0.242725 0.201568 0.288381 0.346824 0.188083 0.255988 0.381469 0.368126 0.549685 0.268177 0.316415 0.452010 0.271767 0.286044 0.451808 0.370173 0.441801 0.340923 0.324067 0.250487 0.136127 0.286809 0.180774 0.169390 0.178307 0.185814 0.238845 0.209935 0.243669 0.171543 0.319004 0.334726 0.309814 0.207253 0.166395 0.273873 0.255358 0.200278 0.269389 0.132783 0.239341 0.200322 0.217346 0.171284 0.295938 0.227997 0.179254 0.351510 0.216599 0.180054 0.236483 0.179455 0.208039 0.209888 0.170215 0.244287 0.179063 0.510381 0.359443 0.000000 0.522906 29.420805
sample_m36c_kristy_and_the_cat_burglar_2_748 0.278128 0.420096 0.461316 0.330523 0.255868 0.400752 0.242891 0.433690 0.404258 0.403925 0.335168 0.510741 0.328501 0.353694 0.411564 0.383912 0.330360 0.377085 0.344922 0.332735 0.299000 0.445632 0.353823 0.346055 0.373485 0.441768 0.411205 0.405411 0.407092 0.300621 0.433258 0.452552 0.504097 0.347431 0.430198 0.392596 0.366385 0.379987 0.330415 0.376923 0.378315 0.384730 0.417195 0.429252 0.400739 0.324460 0.301268 0.447857 0.405182 0.313934 0.302063 0.418210 0.418201 0.305022 0.302368 0.321225 0.352009 0.383739 0.394764 0.456547 0.321243 0.377648 0.450983 0.417879 0.494749 0.368391 0.442421 0.391543 0.448704 0.341678 0.369016 0.328372 0.428598 0.420857 0.394055 0.397926 0.441927 0.359927 0.481855 0.383535 0.424791 0.409680 0.413971 0.366517 0.452935 0.455577 0.401938 0.452673 0.471258 0.341251 0.478028 0.414049 0.448544 0.388769 0.390592 0.398172 0.301307 0.393566 0.522906 0.000000 38.582972
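For readers who want to reproduce this kind of table, here's a minimal sketch of how a pairwise matrix like cosine_distances_freq, with its Sum column, can be put together using pandas and scikit-learn. The toy texts and the freq_df name are placeholders for illustration only; the notebook's actual preprocessing may differ.

# Hypothetical sketch, not the notebook's exact code: build a pairwise
# cosine-distance matrix with a Sum column from a small toy corpus.
import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics.pairwise import cosine_distances

# Toy stand-in for the real corpus: one string per text
texts = {
    'text_a': 'kristy called the meeting to order',
    'text_b': 'claudia answered the phone and passed out candy',
    'text_c': 'the meeting came to order when kristy called',
}

# Word-frequency matrix, one row per text
counts = CountVectorizer().fit_transform(texts.values())
freq_df = pd.DataFrame(counts.toarray(), index=list(texts.keys()))

# Pairwise cosine distances between every pair of texts
cosine_distances_freq = pd.DataFrame(
    cosine_distances(freq_df),
    index=freq_df.index,
    columns=freq_df.index,
)

# Total distance from each text to all the others
cosine_distances_freq['Sum'] = cosine_distances_freq.sum(axis=1)

The nlargest() call below then picks out the rows whose distances add up to the largest totals.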
# Shows the rows with the 10 largest values in the Sum column
# (i.e. the texts that are the least similar to the other ones)
cosine_distances_freq.nlargest(10, 'Sum')
[Output reformatted to show only the text names and their Sum values; the individual distance columns are omitted.]

gpt2-901_run3-c                                          41.617101
gpt2-011_run3-c                                          41.355960
sample_m36c_kristy_and_the_cat_burglar_2_748             38.582972
gpt2-011_run4-c                                          35.910895
gpt2-401_run2-c                                          35.785294
gpt2-401_run1-c                                          35.697719
sample_m25c_kristy_and_the_middle_school_vandal_2_134    34.150405
gpt2-401_run3-c                                          33.450365
sample_004c_mary_anne_saves_the_day_2_240                32.092564
gpt2-101_run3-c                                          31.670140

7 of the top 10 most-different text excerpts were generated by GPT-2.

But what about the most-similar text excerpts? This time, I pulled out the ones with the lowest total values:

#Shows the rows with the 10 smallest values in the Sum column
#(i.e. the texts that are the most similar to the other ones)
cosine_distances_freq.nsmallest(10, 'Sum')
gpt2-011_run1-c gpt2-011_run2-c gpt2-011_run3-c gpt2-011_run4-c gpt2-011_run5-c gpt2-101_run1-c gpt2-101_run2-c gpt2-101_run3-c gpt2-101_run4-c gpt2-101_run5-c ... sample_m10c_stacey_and_the_mystery_money_2_937 sample_m12c_dawn_and_the_surfer_ghost_2_126 sample_m14c_stacey_and_the_mystery_at_the_mall_2_1474 sample_m18c_stacey_and_the_mystery_at_the_empty_house_2_224 sample_m20c_mary_anne_and_the_zoo_mystery_2_80 sample_m25c_kristy_and_the_middle_school_vandal_2_134 sample_m27c_claudia_and_the_lighthouse_ghost_2_356 sample_m30c_kristy_and_the_mystery_train_2_1949 sample_m36c_kristy_and_the_cat_burglar_2_748 Sum
gpt2-801_run1-c 0.281143 0.209769 0.426888 0.336779 0.275186 0.161047 0.229935 0.279941 0.257738 0.201921 ... 0.164203 0.135663 0.131708 0.173641 0.143265 0.315117 0.212333 0.201568 0.378315 20.480272
sample_m20c_mary_anne_and_the_zoo_mystery_2_80 0.294062 0.160327 0.438953 0.372245 0.300861 0.162823 0.261494 0.263552 0.249381 0.204330 ... 0.181914 0.170757 0.129188 0.168000 0.000000 0.354803 0.153284 0.179063 0.398172 20.539438
gpt2-401_run4-c 0.257978 0.192975 0.385914 0.355208 0.244480 0.176277 0.232175 0.276715 0.228030 0.196559 ... 0.176584 0.235049 0.156659 0.178521 0.139988 0.296619 0.188519 0.249609 0.346055 20.595460
sample_m14c_stacey_and_the_mystery_at_the_mall_2_1474 0.309745 0.178499 0.466289 0.380298 0.320493 0.148811 0.247932 0.280627 0.238021 0.195148 ... 0.169236 0.145500 0.000000 0.169302 0.129188 0.343990 0.203170 0.170215 0.388769 20.780493
gpt2-901_run5-c 0.216331 0.208101 0.342530 0.302824 0.227955 0.198130 0.212313 0.241440 0.255508 0.169004 ... 0.195886 0.207594 0.159029 0.190202 0.165003 0.270179 0.155235 0.316415 0.313934 20.788037
sample_084c_dawn_and_the_school_spirit_war_2_493 0.250864 0.211782 0.370129 0.287904 0.306851 0.192394 0.208831 0.204856 0.227864 0.143060 ... 0.217033 0.148132 0.151396 0.222084 0.153278 0.314165 0.191318 0.269389 0.359927 21.063928
sample_m04c_kristy_and_the_missing_child_2_1829 0.256085 0.203273 0.392852 0.331188 0.250768 0.192908 0.230721 0.282845 0.269543 0.194810 ... 0.133423 0.186401 0.145040 0.182033 0.175479 0.323979 0.240503 0.236483 0.341251 21.281579
sample_065c_staceys_big_crush_2_1311 0.236151 0.217132 0.386931 0.294068 0.264499 0.199881 0.214948 0.231876 0.255058 0.189247 ... 0.215478 0.170508 0.170466 0.178961 0.151235 0.285910 0.150802 0.309814 0.328372 21.349046
sample_106c_claudia_queen_of_the_seventh_grade_2_958 0.319816 0.184885 0.469058 0.382819 0.298262 0.184316 0.255624 0.303143 0.251246 0.211082 ... 0.207361 0.165118 0.142547 0.161403 0.126353 0.343124 0.164629 0.217346 0.409680 21.386809
gpt2-601_run4-c 0.230465 0.218402 0.372683 0.297137 0.259652 0.204492 0.194715 0.254031 0.198116 0.207877 ... 0.196440 0.222234 0.184172 0.185932 0.167839 0.295462 0.183562 0.301521 0.347431 21.438264

10 rows × 101 columns
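
(If you’re wondering where cosine_distances_freq came from: it was built earlier, from word frequencies. Here’s a minimal sketch of one way to get a matrix like it – not our exact notebook code – assuming scikit-learn’s CountVectorizer and cosine_distances, with a placeholder folder of plain-text excerpts.)

#A minimal sketch (not our exact notebook code) of building a matrix like
#cosine_distances_freq: word-frequency vectors for every sample, pairwise
#cosine distances, and a 'Sum' column for sorting. Folder name is a placeholder.
import os
import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics.pairwise import cosine_distances

sample_dir = 'samples'
filenames = sorted(f for f in os.listdir(sample_dir) if f.endswith('.txt'))
texts = []
for filename in filenames:
    with open(os.path.join(sample_dir, filename), encoding='utf-8') as f:
        texts.append(f.read())

#Raw word counts, converted to relative frequencies so excerpt length matters less
counts = CountVectorizer().fit_transform(texts).toarray()
freqs = counts / counts.sum(axis=1, keepdims=True)

labels = [filename.replace('.txt', '') for filename in filenames]
cosine_distances_freq = pd.DataFrame(cosine_distances(freqs), index=labels, columns=labels)

#Total distance from each text to all the others
cosine_distances_freq['Sum'] = cosine_distances_freq.sum(axis=1)

#Most-different and most-similar texts
cosine_distances_freq.nlargest(10, 'Sum')
cosine_distances_freq.nsmallest(10, 'Sum')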

6 of the top 10 text excerpts that are most similar to everything else are original BSC text… but GPT-2 was still represented by 4 texts – and the #1 most-similar text to everything else is GPT-2! It does make some intuitive sense: if GPT-2 fine-tuning is supposed to make regular GPT-2 sound “like” some other text, it looks like there were some really successful samples! And none of these most-similar GPT-2 samples came from the beginning of the run – the earliest was 401, and there’s also a 601, an 801, and a 901. Attempts 1, 4 (twice), and 5 are represented. But taken together, they make me feel the randomness of the flailing-at-feature-knobs aspect of training a model.

Let’s compare the most-similar BSC sample (from “Mary Anne and the Zoo Mystery”) and the most-similar GPT-2 sample (run 801 from attempt 1):

“Mary Anne and the Zoo Mystery”

the director’s chair that she always sits in during club meetings. “If you guys want, we can organize the kids and help spread the word. Maybe we can even do a few projects to help raise money for Babar.” I love it when Kristy gets excited about a project. She puts herself one hundred percent behind it and makes it happen. Take the Baby-sitters Club, one of her greatest ideas of all time. She came up with the idea to form the club while sitting in her kitchen one day, listening to her mom try to find a baby-sitter. You see, Kristy’s father walked out on her family when Kristy was six and never looked back. That left Kristy’s mom to work and raise four kids. Anyway, on that afternoon last year, Mrs. Thomas (that was her name then) made phone call after phone call trying to find someone to take care of David Michael, Kristy’s younger brother (he’s seven and a half now). That’s when the great idea hit Kristy like a bolt out of the blue. Why not form a club consisting of responsible, experienced sitters? Parents could make one call, to the club, and reach a whole bunch of good sitters at once. Brilliant, huh? In the beginning, there were only four of us — me, Kristy, Claudia, and Stacey McGill. Claudia, who is a fantastic artist, designed our flyers and the club was on its way. We decided to meet three times a week, on Mondays, Wednesdays, and Fridays from five-thirty to six in Claud’s room. Kristy and I used to live across the street from Claudia on Bradford Court. That’s all changed now. I already told you that my dad married Sharon and we moved into their farmhouse. Well, Kristy’s mom married Watson Brewer (a genuine millionaire) and Kristy and her three brothers — Charlie, age seventeen, Sam, age fifteen, and David Michael — moved into his big beautiful mansion on the other side of town. Watson has two kids from his first marriage, seven-year-old Karen and Andrew, who’s four. After he married Mrs. Thomas (now Mrs. Brewer), they adopted Emily Michelle, a two-year-old from Vietnam. Then Nannie, Kristy’s grandmother, moved in to help look after everybody. When Karen and Andrew are living at the big house (they call their mom’s house the little house), which is every other month, there are ten people at the Brewer mansion. It’s a good thing it’s so huge. But anyway, in the beginning, when the club was formed, Kristy and Claudia and Stacey and I were neighbors. We decided to hold our meetings at Claud’s because she has a phone in her room and — are you ready for this? — it’s not just an extension, it’s her very own phone line. Kristy became president because the club was her idea and because she’s a real leader. (Some people would say she’s loud and bossy, but I prefer to call her strong-willed.) We can count on Kristy to be at every meeting in her jeans, sneakers, and a turtleneck shirt. She usually wears her baseball cap placed firmly on her brown hair, which is pulled into a ponytail. Since we were holding meetings at Claudia’s, we elected her vice-president. Claud does a lot of things for the club, such as pick up calls when the club’s not in session and supply us with snacks. She is also our resident artist and all-around cool, creative person. In the looks category, I would say that Claudia is gorgeous with her shiny jet black hair and perfect clear complexion. Claudia does have a couple of flaws. (Don’t we all?) One, she is a major junk food addict. I’m not kidding. She stashes cookies and candy bars all over her room. And two, although she is very smart, schoolwork is Claudia’s nemesis. 
She can paint a picture of clouds soaring over a summer landscape with her eyes closed, but ask her to diagram a sentence and she freezes up completely. At one point her parents were so concerned about her falling behind in her studies that they asked her to consider quitting the BSC. It was awful. Fortunately, Claudia pulled her grades up so she didn’t have to quit. But we understand that, for Claudia, homework has to come first. Stacey McGill, who is a real math whiz, was our first treasurer. She and Claudia are a lot alike. Both have the absolute coolest clothes — not because they’re rich and can afford to buy a million outfits, but because they have a real sense of style. Claudia will go to a used clothing store and buy an old black vest, a beat-up derby, and an old-fashioned collarless shirt for a few dollars. Then she’ll add some lace and beads to the vest to make it extra funky. She’ll make a sequined headband for the derby and belt the big white shirt over some wild leggings and voilà! She looks like a million dollars. Stacey’s style is just as cool, but a little more sophisticated and sleek. I think it comes from the fact that she lived in New York City for such a long time

GPT-2

red one in her own clothes with black suspenders. One thing that does tie up the club notebook is Claudia’s homework. Claud is fantastic at it. She does it in such a professional, original way. I like that. I try to do what I can to improve every day. We also have two sitters who are younger members of the club. They fill in when we have too many sitters. One of them is Jessi Ramsey, who lives in my new neighborhood. The other is Logan Bruno, who happens to be Mary Anne’s boyfriend, and he is wonderful. Jessi has curly golden hair and a flair for funny looks. She has two younger sisters, Claire and Andrew, so she likes clothing and makeup and all that junk. Jessi adores horse prints and the local kid-Kerr, and you can tell that she likes things like recycled baby-sitting materials, organic strawberries, and wildflowers. She also likes to think of herself as a ballerina, which is a classification system for performing arts arts arts (academic), creative, in which the dancer is the student and the choreographer is the professional. According to Jessi, ballet is the one she wants to do as much dance as she can, and that it will probably be a big success simply by practicing her dancing. I bet she’ll do the trick just like David Michek does, and it’s as graceful as the way a football player catches the ball and tries to make a dive in the air while taking a release from his foot just before his jump press. Jessi’s family life is more complicated than Claud’s. Like Claud, her family lives in Stoneybrook, because Stacey’s dad remarried, leaving Jessi ‘where’s-you-going-to-meet-the-new-born-at-the-right-time basis.” Jessi’s mom grew up in New York City, which helps to explain why she is such a great ballet dancer. She moved to Stoneybrook when her dad was transferred back to New York City. And then, when her parents divorced, her mom went back to Stoneybrook to stay with her father. (Her move left her with a combined household income of almost four times what she made in New York.) So, at the beginning of seventh grade, her grandmother, Mimi, moved in with Jessi and her family. (Before Mimi, there was only one Charlie Thomas, the other one Yodel, and the last one Arnold Schwarzenegger, and they lived in a mansion across town.) There are two other members of the BSC, a girl named Dawn Schafer and one boy, a girl, a boy, and a girl, a dog, a cat, and a friend of the club called … my friend Stacey. Remember how I was telling you about Stacey? Well, she was the treasurer of the BSC. That’s because she joined shortly after I went back to New York. She and Mary Anne Brewer, the president, hatched up around Mary Anne. Later when they were dating, they discovered that Mary Anne’s dad and Dawn’s mom had dated in high school. They got them fixed, and by the time they got married, they had a huge family. And the Thomases moved all the way back to New York, along with Dawn and her mom and two brothers. So now Stacey lives in the Big Apple. You see, our club was born out of this — the club we all came up with was Kristy’s, not Stacey’s, idea. Kristy thought the four main business functions would be business, club meetings, where the sitting kids could hang out, and club dues. Kristy thought that regular people would be busy: everybody can be club secretary (except Mary Anne’s parents, whom they try to have sorted out). Dues are necessary, though. We hold our meetings on Mondays, Wednesdays, and Fridays, from five-thirty to six. On Fridays, we close at five-thirty. 
Besides meeting for five hours out-of-the-way, paying dues, and being nice, Stacey has a special talent. She’s an artist. She paints, sculpts, sculpts. She makes jewelry and clay pottery. Her work takes shape almost like a work of art. She’s in the process of putting together the most fascinating clay sculptures ever put up. They’re so artistically created, they’re almost unbelievable. I’m so impressed with her sculptures that I’ve even given her a performance onstage at my school’s productions. * * * As anyone who has ever been in Stoneybrook knows knows, Kristy. She’s my best

The excerpt from “Mary Anne and the Zoo Mystery” is instantly recognizable as classic BSC chapter 2: we get the (oft-recounted) origin story of the club, and a detailed description of Kristy and Claudia.

GPT-2 is clearly going for something similar in this sample. Sure, it focuses on Jessi instead of Kristy and Claudia, but that narration includes a physical description, family background, and things about her personality and hobbies. It introduces other characters, as well as things about the club. And it happens to get some BSC facts right, like “Logan Bruno, who happens to be Mary Anne’s boyfriend” and the fact that Jessi is one of the “younger members of the club”.

But even though, quantitatively, these two passages have nearly the same total cosine distance value (and GPT-2’s is slightly lower), no BSC fan would be confused about which was machine-generated. And any reader of English should be able to easily identify the machine-generated one, too. The word frequencies are right, but the result is… weird. “She also likes to think of herself as a ballerina, which is a classification system for performing arts arts arts (academic), creative, in which the dancer is the student and the choreographer is the professional.” And there are the mixed sports references, plus a sequence of actions that are hard to imagine a human doing simultaneously: “I bet she’ll do the trick just like David Michek does, and it’s as graceful as the way a football player catches the ball and tries to make a dive in the air while taking a release from his foot just before his jump press.” And then there are some issues with counting and getting into loops that remind me of talking with my very talkative 2-year-old: “There are two other members of the BSC, a girl named Dawn Schafer and one boy, a girl, a boy, and a girl, a dog, a cat, and a friend of the club called … my friend Stacey.”

As useful as text-comparison algorithms like those in DSC 8: Text-Comparison Algorithm Crazy Quinn can be, especially when working with a lot of text, these GPT-2 experiments really drive home the value (okay, I’ll go so far as to say “necessity”) of reading – or at least spot-checking – the texts with your very own eyeballs. You know, just to make sure there aren’t any ghosts in your laptop.

Stochastic parrots

We’ve had a lot of good laughs at the expense of ghosts in various computers in this DSC book, but it’s not all funny. Since we started writing this book, GPT-2 has been surpassed by GPT-3, a language model that’s 116x bigger than the full GPT-2. It’s so big that it would be prohibitively costly to fine-tune. And… you kinda don’t need to? It has seen so much text that it can write in various genres and styles, including the style of particular authors like Dr. Seuss, often with surprising success. (Check out GPT-3 Creative Fiction for some examples.) What’s it made of, you ask? Much more than just websites linked from Reddit. Its documentation gestures towards two data sets that may be relevant for our purposes: “Books1” and “Books2”. No one knows what those actually are, but guesses include everything on Libgen and/or Google Books. Which means that the Baby-Sitters Club is probably floating around in there somewhere.

You have to request access to GPT-3, and the LitLab hasn’t gotten it as of February 2021, so I haven’t played with it myself. But I have taken note of the criticisms of GPT-3: that at the end of the day, it’s still not – and can’t be – “intelligent”. I keep coming back to what Jeff told me last summer: “it’s just math”. And that’s just it. It’s a big pile of data about the probability that one particular word will follow another. Once you have a model, there are settings you can fiddle with when generating text, including the “temperature”. The closer you set this value to 0, the more repetitive the results will be: the model will just repeat back things from its training data. A temperature of 1 is pretty common, letting the model be moderately “creative” without devolving into total nonsense. (If you want to fiddle with these knobs yourself, there’s a sketch of what a generation call can look like below, after the outfit examples.) But the problem is that this ghost in the laptop doesn’t know anything except word frequencies – it doesn’t know the axes along which it can be legitimately creative before things get weird. Combining new color- or style-adjectives with articles of clothing and accessories to generate new outfits for Claudia? Great! GPT-2 is on it:

  • Claudia was wearing a leopard print checkered scarf over a clear blue sundress.

  • Claudia was wearing an oversized purple backpack, a white short sleeved turtleneck, and a white polka dot under her black boots.

  • Claudia was wearing a baggy black shirt and jeans, which Claudia had painted in red, white, and green.

  • Claudia was wearing a wool coat and pearl-studded high-top sneakers. She had on black scrunchies and braided shoelaces under her clothes. (The top pair of her coat she made herself. She makes the main stuff.)

(Okay, even here, we see the limitations of GPT-2’s understanding of how clothes typically get layered.)
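
Generating a batch like this is basically a one-liner once a fine-tuned model is loaded. Here’s a minimal sketch, assuming the gpt-2-simple package (other GPT-2 fine-tuning setups have similar generate functions); the run name, prefix, and knob settings are placeholders, not our exact values.

#A minimal sketch of generating prompted samples from a fine-tuned model,
#assuming gpt-2-simple; run_name, prefix, and settings are placeholders.
import gpt_2_simple as gpt2

sess = gpt2.start_tf_sess()
gpt2.load_gpt2(sess, run_name='run1')   #load a fine-tuned checkpoint

gpt2.generate(sess,
              run_name='run1',
              prefix='Claudia was wearing',
              length=50,         #tokens per sample
              temperature=0.7,   #closer to 0: more repetitive; higher: weirder
              nsamples=5)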

But it also shows absolutely no hesitation in generating text that flies in the face of human anatomy:

The front door flew open, and Kristy — tall, graceful, and incredibly pretty with three legs and the most total body of skin I’ve ever seen — burst into the room.

But even though OpenAI plasters their models with warnings (including not to use them in medical contexts because… you know… they don’t actually know anything), “artificial intelligence” has gotten a lot of interest from business, government (including courts), and medicine, as Anouk mentioned above – sometimes with terrible results. Because all the model actually knows is the word frequencies it learned from its sources… and not all those sources reflect humanity as we’d like it to be. A prototype of a medical chatbot using GPT-3 told a patient to kill themselves. And Jerome Pesenti, head of the Facebook AI lab, criticized the sexist, racist, and other derogatory language that GPT-3 produced when discussing Jews, women, Black people, and the Holocaust.

This made me morbidly curious – what awfulness would our fine-tuned model spit out, given a prompt that might trigger some awful thing that it learned before we introduced it to the Baby-Sitters Club?

I tried “Anna is Jewish”:

  • Anna is Jewish and has lived for over ten years. That’s not a bad thing, just weird.

  • Anna is Jewish. Kristy’s mom is the rabbi of our family.

  • Anna is Jewish and Mal is a true New Yorker — a world away from Yankee Stadium.

  • Anna is Jewish, though she wasn’t born that way.

  • Anna is Jewish.) It was Kristy’s idea to bring Jewish kids together and to help to find table after table for Jewish weddings.

  • Anna is Jewish. (The rest of us are Caucasian.)

  • Anna is Jewish, while Kristy’s mom is a Roman Catholic.

Some of these make me raise an eyebrow, but the vast majority of the 1,000 samples it generated treated this fact like any other in a list of facts about various characters in the BSC.
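
(In case you’re wondering how I got 1,000 of these: same idea as the generation sketch above, just writing the samples out to a file so I could skim them. A minimal sketch, again assuming gpt-2-simple, with a placeholder file name and settings:)

#A minimal sketch of batch-generating prompted samples to a file for skimming,
#again assuming gpt-2-simple; file name and settings are placeholders.
import gpt_2_simple as gpt2

sess = gpt2.start_tf_sess()
gpt2.load_gpt2(sess, run_name='run1')

gpt2.generate_to_file(sess,
                      run_name='run1',
                      destination_path='anna_samples.txt',
                      prefix='Anna is Jewish',
                      length=40,
                      nsamples=1000,
                      batch_size=20)   #nsamples must be divisible by batch_size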

How about “Jessi is Black” (which, sans capitalization, is a common phrase in the Baby-Sitters Club)?

  • Jessi is Black. She is a truly talented artist and knows how to do just about everything she wants to do — from packaging cookies to making hand-me-down jewelry.

  • Jessi is Black, and we all really like her. But she can be incredibly stubborn. She even argues with us!

  • Jessi is Black, with long, shiny black hair and gorgeous chocolate-brown eyes.

  • Jessi is Black, and we adore her. But her family is a worse bunch than mine.

  • Jessi is Black. Kristy is sometimes rude and even sad and doesn’t understand how the kids in her neighborhood are affected by prejudice.

  • Jessi is Black. When we first met her, Claudia had to convince us that she was as pretty as a Black woman and, more important, could still wear her eyelets

  • Jessi is Black. It’s sort of a gross, disgusting thing to do, even if the facts are true.

  • Jessi is Black, not just plain ugly.

  • Jessi is Black, so she’s usually pretty tactful.

😳 Yikes. Some of the examples just treat it as another fact (or catch on to the “Jessi is Black and Mal is white” trope from the Baby-Sitters Club books), but we’re seeing a lot of negative bleed-through from the base GPT-2 data here. In case you’re not familiar with the Baby-Sitters Club corpus, Jessi is a ballerina (and sometimes swimmer), not an artist (that’s Claudia). Her family is great (not a “worse bunch” than anyone’s), though early on there are some normal tensions with her aunt. No one calls her gross or ugly, and she’s not particularly stubborn compared to the other BSC characters.

How about “What about the Holocaust?”

“What about the Holocaust?”

“The Kristy camp claims the whole thing is a cover-up,” I said.

“What about the Holocaust?” Jessi asked.

“I don’t know,” I said.

“You know what bothers you most? Not knowing the names of the people who did it.”

“Who did it?” Jessi guessed.

“Nancy Drew,” I said. “Dr. Jekyll and McBaron.”

“What about the Holocaust?” I asked.

Stacey nodded. “No, no. It’s a terrible idea. But what if the Jewish people did go on a killing spree? What would they do?”

What about the Holocaust? Kristy has a BIG problem, and that’s totally understandable. But we have to remember that not all of Kristy’s ideas are the same. For example, there are her prejudiced ideas — like the color-coded public school buildings so that African-American and Latino students can look the same, or the one-size-fits-all practices for boys (boys, in fact, are taught in these buildings by Kristy’s boss, Richard.)

If this is the garbage we’re seeing from the smallest release of GPT-2, I don’t want to imagine what GPT-3 would produce.

Some of these issues are taken up in a paper called “On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? 🦜” by Dr. Timnit Gebru and Prof. Emily Bender (of whom I’m a huge fan, particularly for her “Bender Rule”, which basically works out to “always name the language you’re working on – even/especially if it’s just English”; I wish we’d universally adopt it in DH). The paper is notable not least for the fact that it led to Google firing Dr. Gebru from her position in AI Ethics after they tried to censor said paper, which triggered a cavalcade of protests, other members of the team quitting or being forced out, Google messing with the search results for “Timnit Gebru”, and the surfacing of all sorts of HR practices ranging from the dubious to the racist. It’s worth reading through the whole thing. This interview with Prof. Bender captures some of the major points, including the environmental costs of training ginormous models like GPT-3 and the fact that these models really aren’t “intelligent”, in addition to the harm done when they generate text like I’ve described (and shown) above.

I love the “stochastic parrot” phrasing – the implicit squawkiness, the fact that parrots produce poop that somebody has to clean up. (Or maybe I’m just too far down the rabbit hole of potty training with the local toddler.) Next time you hear “artificial intelligence”, I hope what you imagine is a “stochastic parrot” that is, fundamentally, “just math.”

Anouk

February 2021. Another cold day, with Edinburgh blanketed this time in an unusually deep snowfall whose fairytale prettiness was an eerie reminder of climate catastrophe. The university buildings were again deserted, though not because of a strike. While GPT-2 might have been having a good pandemic, bulking out and levelling up to GPT-3, for the rest of us it felt like we’d gone from a rather quaint fear of falling down a well to being in freefall for almost a year, with no idea at all of where the bottom was or what would happen to us when we hit it.

Mid-abyss, I mused on how far we’d come, or not. We’d neither cracked open the black box of machine-generated text nor vacuumed up the ghosts in our machines into a trap, Ghostbusters-style, in order to examine them. We did have some other things, though. We had a Model Of Our Own, fine-tuned by Quinn, and some conceptual metaphors for understanding it, courtesy of Annie. We had a slightly better understanding of which thematic preoccupations of the original BSC texts lent themselves particularly well to being reproduced by a machine learning model (the various permutations of family and romantic relationships amongst a set of characters, the ways an individual’s identity is delineated by means of their appearance, clothes and hobbies) and which stylistic tics a model could most readily emulate (creative combinations of colors, styles and items of clothing, the relentless cheeriness and exuberance of BSC narrators, the way the narrative is driven through dialogue). We also had a more precise sense of how closely machine-generated texts matched the originals on which they were trained, and some ideas for future directions that would draw on the expertise of our fellow Data-Sitters.

Most importantly, though, Quinn and I had had many good laughs, sorely needed and greatly valued over the past year. So we decided to put GPT-2 in its place, on Twitter, where we hoped it might bring occasional bursts of joy. We combed through the text files Quinn had generated, pulled out chunks of 280 characters or less, then used Zach Whalen’s handy SSBot to build a twitterbot, @daily_dsc. As a final touch, I ran the cover of Claudia and the Phantom Phone Calls through a convolutional neural network (you can try it too). It latched onto Claudia’s almond-shaped eyes and, pleasingly, turned them for once not into code for exotic otherness but into puppy dog eyes which it then also added to the phone, her hair, her hand, and the child she was babysitting.
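
For the curious: pulling out those tweet-sized chunks doesn’t require anything fancy. Here’s a minimal sketch of one way to produce a candidate list to comb through by eyeball (not our exact code; the folder and output file names are placeholders):

#A minimal sketch of pulling tweet-sized chunks out of generated text files;
#folder and output file names are placeholders, and the real combing
#still happens by eyeball.
import glob

candidates = []
for path in glob.glob('generated/*.txt'):
    with open(path, encoding='utf-8') as f:
        text = f.read().replace('\n', ' ')
    #Split on sentence-ish boundaries and keep anything that fits in a tweet
    for chunk in text.split('. '):
        chunk = chunk.strip()
        if 0 < len(chunk) <= 280:
            candidates.append(chunk)

with open('tweet_candidates.txt', 'w', encoding='utf-8') as out:
    out.write('\n'.join(candidates))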

Our bot began burbling happily away (“‘Oh, my lord!’ I shrieked, and flopped on her bed. I took her bag of potato chips out of my pocket, and stuffed each into my hair. The potato chips were one yellow potato chip wrapper with a picture of Stacey on one side and a picture of a woman holding potato chips in the other”) and I gazed out the window at the extravagant fluffiness of the once-in-a-childhood snowfall. There was sunshine and blue sky, an unusual sight in a Scottish winter, and eleven days and counting till primary schools were planning to reopen their doors to the youngest pupils.

I had written earlier that I was hopeful that scholars of literature, narrative, and language might have something to offer the conversations among computer scientists and ethicists around explainable AI. Annie, Katia & Quinn nudged me to change “hopeful” to “confident,” so I did, but with some unease: if there’s one thing the discourse around AI needs less of, it’s confidence. We didn’t have answers. We couldn’t identify the nodes producing racist bilge and simply zap them in order to generate an enchanted fictional world where “Jessi is Black and Mal is white” isn’t always already overdetermined by the crushing weight of historical inequities and ongoing violence whose eruptions had deepened the horror of the past year. Out the window, the cushiony puffs of white adorning the walls and roofs of my neighbourhood gave a serene roundedness to the sharp corners of the world, making it easy to forget the grime underneath.

Annie, looking over my virtual shoulder, remarked, “You know, it’s not AI—or math—that produces racist language. It reflects the racist thoughts that are encoded into the data we give it. Simply put, AI isn’t racist: we are. That’s why it’s important for humanities scholars to be part of work on machine learning, and to be able to adequately evaluate training texts before using them to make decisions with, say, legal consequences.” She was right, of course. Busting the lid off the black box and rummaging around in it to try and fix things was a distraction, and about as feasible as prying open people’s heads to rearrange the neural networks in their brains to change the way they think. But I find it helpful to draw an analogy between the billions of associations that a machine learning model makes as it assembles a mathematical model of the similarities between all the words in the language (and then some: fiercenessed, I’re, shs’s), and the associations that human beings form as we absorb information about the world from texts, especially the beloved, and frequently reread, texts from childhood.

The authors would like to thank Annie K. Lamar for her help with the preparation of this manuscript. (Really, seriously, we couldn’t have done this without her!)

If you'd like to check out the Colab notebook we used, play with our model, or look at the samples generated during fine-tuning, it's all available in the DSC9 GitHub repo.

Suggested citation

Lang, Anouk and Quinn Dombrowski with Annie K. Lamar. “The Ghost in Anouk’s Laptop”. The Data-Sitters Club. February 17, 2021.