What the eclipse can teach us about science

While people across the country are racing to find safety glasses and make last-minute travel arrangements, scientists are making other preparations. These are just some of the theories and natural phenomena that will be put to the test with the help of this week’s solar eclipse:

Einstein’s theory of relativity

Einstein’s 1915 theory of general relativity says that massive objects should warp the shape of space itself by a noticeable amount. Something as massive as the Sun should bend the light from the stars behind it, making them look as if they’ve moved over a teeny bit. Normally those background stars are lost in the Sun’s glare, but during totality the Moon blocks that glare, so astronomers can photograph the star field and measure the shift, much as Arthur Eddington did during the 1919 eclipse.
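For a sense of scale, the standard general-relativity prediction for light grazing the Sun’s edge is a deflection of roughly 1.75 arcseconds. Here’s a quick back-of-the-envelope check of that number (a rough sketch using textbook constants, not anything from the article):

```python
# Deflection of starlight grazing the Sun's limb, per general relativity:
# angle = 4 * G * M / (c^2 * R)

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30    # solar mass, kg
c = 2.998e8         # speed of light, m/s
R_sun = 6.957e8     # solar radius, m (closest approach of the light ray)

deflection_rad = 4 * G * M_sun / (c**2 * R_sun)
deflection_arcsec = deflection_rad * 206265   # radians -> arcseconds

print(f"Predicted deflection: {deflection_arcsec:.2f} arcseconds")
# ~1.75 arcseconds: tiny, but measurable once the Moon blocks the Sun's glare
```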

 

Learning more about the sun’s corona

The Sun’s corona, “the bright, high-energy plasma blasting off the Sun’s surface,” is the source of space weather, as well as energetic particles that “can cause wild auroras, harm satellites, or potentially even swap votes in voting machines should they hit electronics in the right place.” A special telescope called a coronagraph, which blocks out most of the Sun for the viewer, is normally used to observe the corona, but an eclipse lets scientists capture more precise images of the corona itself. Scientists will be observing it from the ground as well as from high-altitude balloons.

Plant and animal behavior

The sudden darkness in a plant or animal’s habitat gives scientists a chance to study how they react. Many plants and animals behave differently in the run-up to or in the wake of natural disasters; will they show any unusual behavior during the eclipse?

The effect on weather in different climate zones

“The eclipse will be passing over several different ecosystems, including forests, farmland, and prairies.” Professional scientists, as well as citizen scientists, are preparing to record the temperature throughout the eclipse in St. Louis and the surrounding area.
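For anyone curious to try the temperature experiment themselves, the core of it is just a timestamped log taken at regular intervals before, during, and after totality. Here’s a minimal sketch in Python, assuming a hypothetical read_temperature() function standing in for whatever thermometer or sensor you actually have:

```python
import csv
import time
from datetime import datetime

def read_temperature():
    """Placeholder: swap in a reading from your own thermometer or sensor."""
    raise NotImplementedError

# Take one reading per minute for about three hours, so the temperature dip
# around totality (and the recovery afterward) shows up clearly in the data.
with open("eclipse_temperatures.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "temperature_c"])
    for _ in range(180):
        writer.writerow([datetime.now().isoformat(), read_temperature()])
        f.flush()
        time.sleep(60)
```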

Why it’s hot

There’s still much about the world we have yet to learn, and a natural phenomenon like the eclipse gives us a rare vantage point from which to measure and observe it. It’s something to be excited about that isn’t horrible!

To see other experiments, and to learn how to set up your own for the eclipse, see the full article at Gizmodo.

Augmented reality without glasses

Artificial lens diagram via techcrunch.com

Six months ago, Omega Ophthalmics ran a small trial with seven patients outside the US. Their goal was to test for adverse effects of a surgery similar to the lens replacement that often accompanies cataract removal. The difference? Rather than replacing the cloudy lens with a standard artificial lens, surgeons implanted a lens that could be used for augmented reality, interactive sensors, or drug delivery.

Why it’s hot

Although widespread adoption of this technology is unlikely in the near future, scientists, entrepreneurs, and venture capitalists hope there is a market for such implants in an aging population that wants to stay independent for longer. If this small trial proves successful, it may pave the way for larger trials that test additional possibilities and risks.

Learn more at TechCrunch.com

Flawless live translation technology might be closer than we think

iTranslate has launched a new app, called Converse, that seeks to speed up live translation and make the process more human and enjoyable. The simple interface is designed to be used without having to stare at your screen. With no clunky interface to grapple with, and over 40 languages automatically recognized by the system, anyone can quickly start a translated conversation and make themselves, and others, better understood.

Why it’s hot

Converse brings translation to users in a way that is human-focused rather than technology-focused. While other companies have attempted live translation, setting up and accessing those programs is usually too complicated to do in the heat of the moment. Because the software is more accessible, people are more likely to actually use it.

Why it’s maybe not

Because this app is so new, there are still some kinks to work out. Language and localization are notoriously difficult (think early Siri trying to understand Scots), and some of the translation algorithms just aren’t quite there yet. This seems to be the universal story for translation programs, including iTranslate’s own eponymous app.

Read more at https://www.theverge.com/2017/8/3/16076084/itranslate-app-real-time-translation

Figma adds collaboration to the product design process

While some digital product designers are still using Illustrator or Photoshop to design interfaces for screens, a battle has begun over the next generation of product design tools. Sketch has captured a large portion of the market as a tool that is easy to adopt and master, but other vendors have their sights set on bigger and brighter futures.

Figma is an interface design tool that combines product design functionality with feedback, collaboration, and prototyping. It is built especially for digital product designers, taking into consideration device constraints, accessibility, and even production needs.

Why it’s hot

Rather than switching file types and relying on email, Slack, or annotations, imagine being able to work side-by-side with other members of your team – designers and non-designers alike – in one file to get a project production-ready. Additionally, Figma works across devices and operating systems, meaning that everyone can use the device they want and still have access to the same files. Think Office 365 or Google Docs, but for product design.

Showing code from designed screens [via figma.com]

Adobe and other large vendors are trying to implement similar collaborative capabilities in XD and other programs, but Figma has the added benefit of starting from scratch, without legacy product dependencies. It will be exciting to see how the company uses that momentum going forward.

Learn more at https://www.figma.com