Sentry

This is a sci-fi story by the legendary American writer Fredric Brown. Do yourself a favour and go buy this AMAZING collection of his very short stories. Or find some for free here.
Yeah I know, I’ve got no Amazon affiliation :/

He was wet and muddy and hungry and cold and he was fifty thousand light years from home.

A strange blue sun gave light, and gravity, twice what he was used to, made every movement difficult. But in tens of thousands of years this part of war hadn’t changed. The flyboys were fine with their sleek spaceships and their fancy weapons. When the chips were down, though, it was still the foot soldier, the infantry, that had to take the ground and hold it, foot by bloody foot. Like this damned planet of a star he’d never heard of until they’d landed him there. And now it was sacred ground because the aliens were there too.

The aliens, the only other intelligent race in the Galaxy…cruel, hideous and repulsive monsters. Contact had been made with them near the centre of the Galaxy, after the slow, difficult colonization of a dozen thousand planets; and it had been war at sight; they’d shot without even trying to negotiate, or to make peace. Now, planet by bitter planet, it was being fought out. He was wet and muddy and hungry and cold, and the day was raw with a high wind that hurt his eyes. But the aliens were trying to infiltrate and every sentry post was vital. He stayed alert, gun ready. Fifty thousand light-years from home, fighting on a strange world and wondering if he’d ever live to see home again.

And then he saw one of them crawling toward him. He drew a bead and fired. The alien made that strange horrible sound they all make, then lay still. He shuddered at the sound and sight of the alien lying there. One ought to be able to get used to them after a while, but he’d never been able to. Such repulsive creatures they were, with only two arms and two legs, ghastly white skins and no scales.

Artificial Intelligence and SEO: signals and probability

Someone asked a couple of questions about this post, and I realised I find it very hard to express my knowledge and opinions about Artificial Intelligence and its relationship with web marketing. This immediately triggers two thoughts: on the one hand, I understand the topic far less than I would like to; on the other, it is a field in which most people don’t have a strong understanding either. Probably this is because most experts in the field consider the marketing industry trivial (I know I would if I were them), and marketers prove them right by not knowing anything about it, nor caring to know. Which is a mistake in itself.

Let me add something on top of this specific point: artificial intelligence, and machine learning in particular, is going to have a strong, strong impact on our job, as it will on every single job. Even better, it ALREADY DOES. Have you ever wondered how featured snippets are generated? Do you think your developer could implement that? They could not. Every day, we talk about billions and trillions of pages, queries, and detected user intents. Only a machine can handle that amount of information and put it in order, and this, this is what fascinates me. This is the only reason why it still makes sense to have an organic traffic strategy, 10 years or so after SEO died. Because bear in mind: if you don’t approach and work with AI, then it is dead.
A whole book might be written on the correlation between the verb I just used, “to think”, and a machine. Is the search engine really thinking? It’s a topic for which neither I nor anyone else could possibly have all the answers, ranging from philosophy to advanced engineering: what is it “to think”? While I do think I can provide an opinion on this, as valuable as anyone else’s, I reckon this is not the place to do so: indeed, it’s irrelevant. Whether what G does is actually “thinking”, or it’s just imitating us, mirroring what it sees like a parrot, it is the final result that matters in this specific argument. G engineers do not have all the answers (and more often than not lie about it) when it comes to understanding why the SERP (the Search Engine Results Page) looks the way it does: JohnMu doesn’t know, Larry Page most definitely doesn’t know. Matt Cutts never knew.
Believing that anyone at Google knows why the search engine behaves a certain way in a given moment is as irrational as believing that the IBM team that engineered Deep Blue is able to defeat Garry Kasparov in a game of chess.
They cannot.

Somehow, building upon millions and millions of matches and moves and “observing” real-life champions, Deep Blue learned. It learned from experience, which is what we do as kids, isn’t it? Regardless, it put information together and used it, learned to use it properly in new, unexpected situations: that’s exactly what we do as little kids, and what Google’s search engine does now on a daily basis.

The AI learning process is well explained in many a TED Talk by better people than yrstruly. But let me try to wrap it up as best I can. A computer’s speed and memory are both better than ours, better by orders of magnitude. What it’s (still) worse than us at is recognising connections between dots. Our brains are extremely talented at recognising patterns, which is pretty much what I poorly try to explain with my pen example: you might have never seen this specific object, but you’ve seen dozens of similar ones. It’s got these and those characteristics. The environment you find it in is thus and thus. All these bits and pieces make you recognise it as a pen, even though it might be something else: putting all of them together, it’s likely a pen.
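To make the pen example concrete, here’s a toy sketch. Everything in it — the features, the weights, the scores — is invented for illustration only: each observed characteristic adds a bit of evidence, and enough evidence makes “pen” the likely answer.

```python
# Toy feature-based recogniser for the pen example.
# Features and weights are invented purely for illustration.
FEATURES_OF_A_PEN = {
    "long_and_thin": 3,   # typical pen shape
    "has_a_tip": 3,       # something to write with
    "fits_in_hand": 2,
    "seen_on_a_desk": 2,  # the environment counts as evidence too
}

def pen_score(observed):
    """Sum the weights of the characteristics we actually observed (max 10)."""
    return sum(w for f, w in FEATURES_OF_A_PEN.items() if f in observed)

# We've never seen this exact object, but three familiar clues line up:
score = pen_score({"long_and_thin", "has_a_tip", "seen_on_a_desk"})
print(score)  # 8 out of 10: putting the pieces together, it's likely a pen
```

No single clue is decisive here; it’s the accumulation of bits and pieces that tips the balance, which is the whole point of the example.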

Using signals

So how does Google know that Wikipedia’s information is to be shown on top of a SERP? It’s because of thousands of instances in which:
1. people searched for “X”, came back to the SERP, and searched for “X wiki”
2. people linked from their site to the Wiki page dedicated to “X”
3. websites were created in which the sole textual content was copypasted from Wiki

There are many more “political” reasons, but you get the point: G used signals. With a statistically relevant amount of inputs (signals), it’s able to recognise how likely something is to answer the user’s needs.
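A minimal sketch of the idea above — signals as weighted votes for a page. The signal names, their weights, and the URLs are all made up for this example; a real ranking system obviously works from vastly more inputs.

```python
from collections import Counter

# Hypothetical signal weights, invented for this sketch.
SIGNAL_WEIGHTS = {
    "refined_query": 2,   # user came back to the SERP and searched "X wiki"
    "inbound_link": 3,    # another site linked to the page
    "content_copied": 1,  # a site copypasted the page's text
}

def rank_pages(observed_signals):
    """observed_signals: list of (page, signal) pairs; returns pages by score."""
    scores = Counter()
    for page, signal in observed_signals:
        scores[page] += SIGNAL_WEIGHTS.get(signal, 0)
    return [page for page, _ in scores.most_common()]

signals = [
    ("wikipedia.org/wiki/X", "inbound_link"),
    ("wikipedia.org/wiki/X", "refined_query"),
    ("wikipedia.org/wiki/X", "content_copied"),
    ("randomblog.example/x", "inbound_link"),
]
print(rank_pages(signals))  # wikipedia first: 6 points against 3
```

With a statistically relevant number of such observations, the noisy individual signals average out into a fairly stable ranking.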

Probability

Relevantly, an AI is not merely based on an I/O system. My machine at home goes by on/off, Y/N, positive/negative. It’s got no grey areas. A Deep Learning system does the same, but at such a magnitude that it’s able to assign probability. For a traditional computer, either X is a pen or it’s not. For AI, X is likely to be a pen: in the absence of a better option, it is a pen.
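The contrast can be sketched in a few lines — all the labels, features, and scores below are invented for illustration. The hard rule answers yes or no; the probabilistic version spreads its belief across candidates and picks the most likely one.

```python
def hard_classifier(is_long, has_tip):
    # Traditional I/O logic: pen or not a pen, no grey area.
    return "pen" if (is_long and has_tip) else "not a pen"

def soft_classifier(features):
    # Probabilistic version: score every candidate, normalise to probabilities.
    scores = {
        "pen":    2 * features.get("long", 0) + 3 * features.get("tip", 0),
        "pencil": 2 * features.get("long", 0) + 1 * features.get("tip", 0),
        "ruler":  2 * features.get("long", 0),
    }
    total = sum(scores.values()) or 1  # avoid dividing by zero
    return {label: s / total for label, s in scores.items()}

probs = soft_classifier({"long": 1, "tip": 1})
print(max(probs, key=probs.get))  # "pen" wins, but never with total certainty
```

Note that “pen” wins with probability 0.5 here, while “pencil” and “ruler” keep non-zero shares — that residual uncertainty is exactly what the hard classifier throws away.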

If we suggest that it is not, in fact, a pen, an AI-powered system is able to suggest another option (whatever was second-most-likely). It’s also going to learn that the likelihood of that specific combination of characteristics belonging to a pen is lower than it previously believed, and will keep this in mind the next time. Every input teaches it something.
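A hypothetical continuation of the same idea: told that “pen” was wrong, the system falls back to the runner-up and renormalises what’s left. The numbers are invented, and this crude drop-and-renormalise step stands in for the much subtler weight updates a real system would make.

```python
# Initial beliefs from the previous guess (numbers invented for illustration).
beliefs = {"pen": 0.5, "pencil": 0.3, "ruler": 0.2}

def on_correction(beliefs, wrong_label):
    """Drop the rejected label, renormalise, and return the new best guess."""
    remaining = {k: v for k, v in beliefs.items() if k != wrong_label}
    total = sum(remaining.values())
    remaining = {k: v / total for k, v in remaining.items()}
    best = max(remaining, key=remaining.get)
    return best, remaining

guess, beliefs = on_correction(beliefs, "pen")
print(guess)  # "pencil": the second-most-likely option steps forward
```

Every correction reshapes the distribution, which is the sense in which each input teaches the system something.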

Finally, let me say that I’m fully aware that I do not have a complete grasp on the topic: any inputs are very much appreciated because I believe this is something web marketers should talk about.

Soundtrack’s pretty easy to choose this time: Fear Inoculum by Tool