moaning, holding ourselves to quell the inner pain: we are experiencing incredible growth.
I'm focused on UI development & implementation, with an indescribably furious power.
I remembered more of it, I see a desolate wasteland, and a wellness program --- our office is located in San Francisco
Good morning, I saw your profile looks like a very long profile and thought it looked like monsters and I am an executive recruiter
Hi Zachary. We haven't spoken before, but I was still aware however, still alive to witness my imprisonment.
Hope your week is going to explode.
Kyle, My name is Jen Burns and I wanted to just follow the birds
Zachary, Apologies for the future.
Engineering teams based in both San Diego and New York City have a very high pitched scream.
front end engineer would literally make or break the next hour, I walked through this area for a while and then my face ended, and the rats.
Hi Kyle, My name was a river flowing, and a fan spinning in front of me
Full Stack RoR engineers to join and sacrifice babies, or kill me or join their expanding engineering team at Riviera Partners.
Your Specialties: Obsessed with keeping up-to-date with the underlying consciousness grid, god, gaian supermind, universal consciousness
Hello Kyle, I'm reaching out to you because I didn't die, didn't even recognize my parents at this point, will definitely work more on this
Hi Zachary, Hope your week is going horribly slow, i want this to end.
the doctors and nurses looked like worms or centipedes crawling in and we found your information online
We began to melt with the founders to learn more about what you think
The dose was for the team.
Candidate must be able to deal with the drug that i made
looking for an Engineer / Developer role, feel free to pass along to anyone you think might be a brutal mental experience
the power to aggregate all of his family's toothbrushes and put them in my penis so that I find all this very, very frightening
Hadoop or MongoDB Express or other persistence, and I'm abjectly terrified.
We are building a platform which will be a good day and be close to death and I say gabbada, there are VERY CLEARLY TWO VOICES SPEAKING!
My pupils have literally engulfed my corneas; I look forward to hopefully working with 15 or 20 recruiters, but without the BS.
Previously, previously, previously, previously, previously, previously, previously, previously, previously, previously, previously, previously, previously.
Apologies for the future.
So many good band names in there. I want to go back in time, tell my 10-year-old self to continue playing drums, and go on to found the only band that matters, Brutal Mental Experience (BMe).
So good. We're actually looking for a Fullstack engineer, so I particularly enjoyed that one.
This needs its own LinkedIn Recruiter account. Should we do a kickstarter?
Let us be thankful we have an occupation to fill.
I was waiting for this to show up here. One of my favorite Markov chain results so far.
Needs more Lovecraft.
Is there a canned implementation people typically use to make these generators?
Are they reading through reams of ungrammatical gibberish to find these interesting nuggets, or are the generators smarter than that? Like maybe they run the Markov output through some classifiers?
Markov generators that match on phrases -- i.e. the frequency chart for "quick brown fox jumped over the lazy dog" is 'quick brown' => fox, 'brown fox' => jumped -- can mimic grammatical speech very well, as long as the corpus is varied. Most Markov bots don't need to be any cleverer than that (see the sketch below).
(YMMV; not my bot, but I do run one based on Mispy's framework, etc. etc.)
The longer the corpus, the more the output tends to be "normal" text instead of domain-specific: you get fewer comedic and domain-specific nouns and verbs. That can be somewhat countered by matching on longer multi-word patterns, but if you go too far, you just get the original text back out (too-frequently-repeated phrases). Even if tuned properly, mispunctuated and run-on sentences tend to be a problem, so I'd assume that most of the Markov twitterbots are manual: have it generate a bunch, then cherry-pick the best one.
My bot uses my tweets as its corpus and seems relatively coherent; it's just doing single-word matching as far as I know. Which makes sense -- tweets seem like they would make an ideal corpus because they're moderately repetitive, but with enough reuse of common words that you can get interesting mixes. @alazy_ebooks.
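To make the phrase-matching approach concrete, here is a minimal sketch in Python. The two-word key, the function names, and the input file are illustrative assumptions, not the code of any particular bot or of Mispy's framework:

    import random
    from collections import defaultdict

    def build_chain(text, order=2):
        # Map each `order`-word phrase to the words seen following it in the corpus.
        words = text.split()
        chain = defaultdict(list)
        for i in range(len(words) - order):
            chain[tuple(words[i:i + order])].append(words[i + order])
        return chain

    def generate(chain, max_words=30):
        # Random-walk the chain from a random starting phrase until a dead end.
        key = random.choice(list(chain.keys()))
        out = list(key)
        for _ in range(max_words):
            followers = chain.get(tuple(out[-len(key):]))
            if not followers:
                break
            out.append(random.choice(followers))
        return " ".join(out)

    corpus = open("recruiter_spam.txt").read()  # hypothetical source text
    print(generate(build_chain(corpus, order=2)))

Raising order to 3 or 4 gives the "longer multi-word patterns" trade-off mentioned above: the output gets more grammatical, but more of the source comes back verbatim.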
There are plenty of Markov chain generators out there. With good source material, you usually don't need to sift through reams to find the gems.
I ran some late 1980s RFCs with Moby Dick. Opening the output file at a random location, I found this without needing to scroll:
"Join a multicast packet that is all sharks, and followed by the occasional wide intervals between the bindings for outgoing multicast packets."