This AI makes neologisms by portmanteau-ing the English language

Yesterday a smart person named Thomas Dimson, who formerly wrote “the algorithm” at Instagram, launched a site that uses OpenAI’s infamous GPT-2 text generator, a Transformer-based natural language processing (NLP) model, to generate new English words, define them, and use them in a sentence.
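The site’s actual trick is a fine-tuned GPT-2 model, but the “portmanteau” idea itself is easy to sketch. Below is a toy Python function (purely illustrative, not the site’s method) that blends two real words at their longest overlap, falling back to gluing the front half of one to the back half of the other:

```python
# Toy sketch of the "portmanteau" idea. NOT how This Word Does Not Exist
# works -- the site uses a fine-tuned GPT-2 model to invent and define
# words. This just blends two existing words mechanically.
def portmanteau(a: str, b: str) -> str:
    """Blend word `a` into word `b` at their longest overlap, if any."""
    # Look for the longest suffix of `a` that is also a prefix of `b`.
    for size in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:size]):
            return a + b[size:]
    # No overlap: front half of `a` plus back half of `b`.
    return a[: len(a) // 2 + 1] + b[len(b) // 2 :]

print(portmanteau("channel", "elephant"))  # -> "channelephant"
print(portmanteau("smoke", "fog"))         # -> "smoog"
```

A neural model goes further, of course: it also invents a plausible definition and example sentence, which no string-blending rule can do.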

It’s called This Word Does Not Exist, and it has so far created plenty of gems.

A disclaimer at the bottom of the site reads: “Words are not reviewed and may reflect bias in the training set.”

You can also write your own neologism and the AI will define it for you. It’s a fun diversion, but does it have any use? Probably not in this form. Still, it speaks both to how AI may be used in the fun-and-games side of life and to how it may ultimately shape the foundations of how we communicate.

Why it’s hot:

It’s fun to participate in the creation of something new (without having to work too hard), and language is the perfect playground for experimentation.

As AI becomes more influential in our daily lives, it’s interesting (and perhaps a little disturbing) to imagine the ways in which it may take part in creating the very words we use to communicate. What else might AI give us that we have heretofore considered to be the exclusive domain of humans?

Source: TheNextWeb

Emojis Meet Hieroglyphs: If King Tut Could Text

An exhibition at the Israel Museum in Jerusalem, “Emoglyphs: Picture-Writing From Hieroglyphs to the Emoji,” highlights the seemingly obvious, but also complicated, relationship between the iconic communication system from antiquity and the lingua franca of the cyber age.

A visual and linguistic exercise in time travel, “Emoglyphs” juxtaposes the once indecipherable pictogram writing of ancient Egypt, which first developed about 5,000 years ago, with the more accessible and universal usage of pictograms that originated in Japan in the late 1990s.

“It was always hard to explain how to read hieroglyphics,” said Shirly Ben Dor Evian, an Egyptologist and the show’s curator. “In recent years it’s become easier to explain because people are writing with pictures. So I started looking into emoji.”

The first thing she noticed, she said, was that some emojis look like hieroglyphs.

A chart at the entrance of the exhibition pairs a column of hieroglyphs with a column of emojis. The similarities are uncanny: There’s no need for translation.

The Egyptian depiction of a slender, generic dog closely resembles the emoji of a prancing canine in profile. A duck (often used as a generic for a winged creature in ancient Egyptian) reappears thousands of years later as an almost identical, left-facing emoji duck. And the “emoji man dancing” strikes a similar pose to a hieroglyph of a dancing man, one arm raised and with little but a purple disco suit and a loin cloth from 3,000 years ago to distinguish between them.

The exhibition, in a small gallery in the museum’s Bronfman Archaeology Wing, has more than 60 ancient Egyptian artifacts on display; most are from the museum’s collection, and many are on view for the first time. Visitors can quiz themselves on their understanding of emojis and their newly acquired knowledge of hieroglyphs on interactive screens. Data on the differing interpretations of some emojis will be gathered as part of a survey.

The two systems may have common features, but there are also deep and complex differences.

Hieroglyphics was a complete written language, and while even an illiterate person could recognize and understand some basic symbols, the scribes worked according to strict rules and had to be highly skilled. Ancient Egyptian inscriptions eventually morphed into the dry efficiency of the first alphabet of around 20 characters, which could be more easily taught and executed, leading to an explosion in communications.

“What’s happened now,” said Ms. Ben Dor Evian, who has a hieroglyph app on her cellphone, “is that it is easier to click on an emoji than to write a whole word.”

Emojis often serve as emotional shorthand — think smiley blowing a heart kiss to soften a message or send love, or a winking face to signal sarcasm — filling an expressive void that text messages may fail to convey.

In ancient Egyptian writing and art, the image of a scarab, or dung beetle, expressed a whole concept of the afterlife and rebirth and was used in inscriptions as the verb “to become.”

With the beginnings of research into the field of emoji, Egyptologists, cognitive linguists and communication experts have started debating the similarities between the two communication systems and what sets them apart.

Some have hailed emoji as a new language. One enthusiast produced a crowdsourced and crowd-funded emoji version of Herman Melville’s classic “Moby-Dick” titled “Emoji Dick.” In 2015, Oxford Dictionaries chose the “face with tears of joy” emoji as its word of the year, saying it best represented “the ethos, mood and preoccupations” of the period.

But Chaim Noy, a professor in the school of communications at Bar Ilan University near Tel Aviv who teaches a course on emoji “because it attracts students,” considers it simplistic and populist to speak of emoji as a language, viewing it as a kind of body-language supplement to text.

An expert in museum studies as well, Professor Noy said there was nevertheless “drama” in the exhibition, which runs through Oct. 12, juxtaposing the high culture of the museum and ancient Egypt against the bottom-up, lowbrow culture of emoji.

“It’s a bit provocative, it brushes off the tired, dusty image,” he said.

Source: NY Times

Why It’s Hot:

We focus a ton on emojis in our social monitor and linguistic analyses. It’s interesting to see emojis juxtaposed with language in a way that both shows their staying power as well as highlights their limitations.

 

Headphones that translate 40 languages

Designed to work with the Google Pixel 2 smartphones, the Pixel Buds wireless earphones can act as a universal translator, letting you hold conversations across 40 languages.

Speak one language into the earphones, and the smartphone will translate it and speak it aloud in the other language using the Google Translate app.
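Conceptually the flow is a three-stage pipeline: speech recognition, translation, and text-to-speech. Here is a stubbed Python sketch of that pipeline (hypothetical names throughout; the real product calls Google’s speech and Translate services, which are not used here):

```python
# Hypothetical sketch of the Pixel Buds translation flow. All three
# stages are stubbed -- the real product uses Google's speech
# recognition, Translate, and text-to-speech services.
from dataclasses import dataclass

@dataclass
class Utterance:
    text: str
    lang: str

# Toy phrase table standing in for the Google Translate service.
PHRASES = {
    ("hello", "en", "es"): "hola",
    ("thank you", "en", "es"): "gracias",
}

def translate(u: Utterance, target_lang: str) -> Utterance:
    translated = PHRASES.get((u.text, u.lang, target_lang), u.text)
    return Utterance(translated, target_lang)

def converse(spoken_text: str, source: str, target: str) -> str:
    # 1. Earbuds capture speech; phone transcribes it (stubbed as text).
    heard = Utterance(spoken_text, source)
    # 2. Phone translates via the Translate service (stubbed table).
    out = translate(heard, target)
    # 3. Phone speaks the translation aloud (stubbed as returned text).
    return out.text

print(converse("hello", "en", "es"))  # -> "hola"
```

The point of the sketch is the division of labor: the earphones only capture and play audio, while all the heavy lifting happens on the phone.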

 

Source

Why it’s hot: Language might no longer be a barrier to moving around the world. When will technology help us translate any language on the fly? Maybe even animal languages?


This new AR app lets you point and translate

Thanks to the recent release of Apple’s ARKit, front-end developer Frances Ng has created a point-and-translate app. That’s right: simply point your phone at the item you want to translate, and if the item is recognized, the associated language options will display.



Whether you’re looking to learn a new language from the comfort of your own home or searching for a helping hand in a foreign land, the app is a great example of the kind of practical utility that’s still too rare in AR.

The UX magic comes from combining Apple’s ARKit with an existing database of about 1,000 learned objects that Ng ported into the app. Ng says her app is just a demo and she has no immediate plans to take it to market. What’s remarkable is that while companies like Microsoft spent many years and dollars mastering object recognition, Ng was able to build her app in a weekend, simply because it builds on so much past work that’s now freely available and baked into platforms like Apple’s.
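The core loop is simple once object recognition is handed to you by the platform: get a label for whatever the camera sees, then look up translations to overlay. A toy Python sketch of that assumed flow (not Ng’s actual code; both the recognizer and the label database are stubs):

```python
# Toy sketch of the point-and-translate flow. Assumed design, not
# Ng's actual implementation: the vision model is stubbed, and a tiny
# dict stands in for the ~1,000-object database she ported in.
from typing import Optional

LABELS_DB = {
    "cup": {"es": "taza", "fr": "tasse", "ja": "カップ"},
    "chair": {"es": "silla", "fr": "chaise", "ja": "椅子"},
}

def recognize(frame: str) -> Optional[str]:
    # Stub for the object-recognition model: pretend the camera frame
    # is already reduced to a label string.
    return frame if frame in LABELS_DB else None

def point_and_translate(frame: str, lang: str) -> str:
    label = recognize(frame)
    if label is None:
        return "object not recognized"
    # In the real app this string would be rendered as an AR overlay.
    return f"{label} -> {LABELS_DB[label].get(lang, label)}"

print(point_and_translate("cup", "es"))  # -> "cup -> taza"
```

The design point: the hard part (recognition) is commodity platform code, so the app itself is little more than a dictionary lookup and an overlay.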

Source: https://www.fastcodesign.com/90136943/want-to-learn-a-new-language-with-this-ar-app-just-point-and-tap

Why It’s Hot

  • With the new AR development kits, we should see a run of useful apps like this (along with, surely, plenty of non-useful ones!)
  • It’s also an example of how more and more development is being done on top of robust platforms, increasing speed to market and standardizing patterns so consumers can orient quickly