Optimus Ride, a leading self-driving vehicle technology company, announced the launch of New York state’s first self-driving vehicle program, located at the Brooklyn Navy Yard (the Yard), a 300-acre industrial park with more than 400 manufacturing businesses and 10,000 employees onsite. Beginning tomorrow, six autonomous vehicles will transport passengers between the NYC Ferry stop at Dock 72 and the Yard’s Cumberland Gate at Flushing Avenue, a vital connection point in a truly multi-modal commute for thousands of riders and a first-of-its-kind commercial autonomous driving system.
Beginning in August, a total of six Optimus Ride autonomous vehicles will transport an expected 500 passengers per day, or more than 16,000 passengers per month. Initially, a safety driver and a software operator will ride in each vehicle while it is in operation. The system will run on a continuous loop between the dock and the Cumberland Gate from 7:00 a.m. to 10:30 p.m. on weekdays; on weekends it will run between the dock and Building 77 during those same hours.
Why it’s hot: To test-drive and validate self-driving vehicle technology, Optimus Ride found a deployment that fits seamlessly into people’s everyday lives and adds real value to them.
Starship Technologies, an autonomous delivery startup created in 2014 by two Skype co-founders, has been in public testing mode in 20 countries around the world since 2015. Now the company says it is ready for its first “major commercial rollout”.
Employees of Intuit in Mountain View, California, will be able to order breakfast, lunch, and coffee from their staff cafeteria and have it delivered to any point on the company’s Silicon Valley campus by one of Starship’s six-wheeled autonomous robots.
“You place your order, it’s one click, then you drop a pin where you want the robot to meet you,” says Starship co-founder Janus Friis. “We’ve seen huge demand for breakfast. For some reason people just don’t want to wait – they want to go straight to work and avoid the queue in the early hours of the day.”
Starship is now on the lookout for other campuses across western Europe and the US where it can deploy the robots.
Why it’s hot: This is another step toward a world of autonomous cars and drone-delivered Amazon packages. Talk about a seamless customer experience!
Cruise, General Motors’ autonomous vehicle unit, plans to mass-produce a self-driving car without a steering wheel or pedals by 2019. GM says it has submitted a safety report, along with an application asking regulators to approve the design, its fourth-generation model.
Along with omitting driver-centric features, Cruise’s next-generation car is designed with ride-hailing passengers in mind. Riders will be able to summon one via a mobile app and adjust settings like interior temperature. It has no controls whatsoever, not even buttons to push; it treats you entirely as a passenger, no matter where you sit. The car can even open and shut its doors on its own.
The company describes it as “the first production-ready vehicle built from the start to operate safely on its own, with no driver, steering wheel, pedals or manual controls.”
Why It’s Hot:
Though 2019 is only a year away, just seven states currently allow driverless cars, and Cruise’s home state of California is still in the process of passing a bill to permit them. The company has also applied for the necessary exemptions from federal regulations. If it gets the green light, Cruise will be the first company to put a mass-produced autonomous vehicle on U.S. streets.
The wave of magical CES 2018 innovations has begun to roll in, and among those already announced is Nuance Communications’ “Dragon Drive” – an (extremely) artificially intelligent assistant for your car.
“By combining conversational artificial intelligence with a number of nonverbal cues, Dragon Drive helps you talk to your car as though you were talking to a person. For example, the AI platform now boasts gaze detection, which allows drivers to get information about and interact with objects and places outside of the car simply by looking at them and asking Dragon Drive for details. If you drive past a restaurant, you can simply focus your gaze on said establishment and say, ‘Call that restaurant,’ or ‘How is that restaurant rated?’ and Dragon Drive provides a ‘meaningful, human-like response.’”
Moreover, the platform enables better communication with a whole host of virtual assistants, including smart home devices and other popular AI platforms. In this way, Dragon Drive claims, drivers will be able to manage a range of tasks from their cars, whether that’s setting the home heating system or transferring money between bank accounts.
Dragon Drive’s AI integration applies not only to external factors but also to components within the car. For instance, if you ask the platform to find parking, Dragon Drive will take into account whether your windshield wipers are on to determine whether it ought to direct you to a covered parking area to avoid the rain. And if you tell Dragon Drive you’re cold, the system will automatically adjust the car’s climate (but only in your zone, keeping other passengers comfortable).
Why It’s Hot:
Putting aside the question of how many AI assistants we might have in our connected future, what was really interesting to see was the integration of voice and eye-tracking biometrics. Using your voice as your key (and to personalize settings for you and your passengers), having the car remind you of memories tied to locations you’re passing, and identifying stores, buildings, restaurants, and other landmarks along your route with just a gaze: it’s amazing to think what the future holds as all the technologies we’ve only just seen emerging in recent years converge.
The Tesla Model 3 has been billed as a groundbreaking car. And in one respect, it is: It doesn’t have an instrument cluster.
Although it is unusual to place the most important displays and controls on the left side of the screen instead of the center or right, keep in mind that the screen sits in the center of the car, to the driver’s right. A large speedometer at the top left of the screen turns red if you are speeding. Below that is a graphic of the car; when parked, you can open the hood, trunk, and charging door from it. The navigation and music-selection screens work much as you would expect in any other infotainment system, tablet, or smartphone.
Why It’s Hot
It’s one of the more significant updates to car dashboard U.I. in a long time – it will be interesting to hear the usability feedback now that the cars are being delivered. It also marks a more aggressive step towards autonomous cars.
A company called Wheelys has created Moby Mart, a 24/7, on-demand, self-driving, drone- and digital-assistant-serviced, all-electric, environmentally friendly, grab-and-go, digital-payment-only convenience store, currently autonomously piloting the streets of Shanghai. Or, as the company puts it on its website, “It is the store that comes to you, instead of you coming to the store.”
Why it’s hot:
It’s interesting to think about what the world could look like when a number of often-separate technologies come together. This may be a primitive first attempt at imagining it, but it points to the ultimate convenience of combining technologies each individually aimed at making life easier: anything could be delivered to you wherever you are, without direct human assistance.
In collaboration with Oxbotica, the city of London is testing a self-driving shuttle that will ferry roughly 100 people along a short route, in an effort to show everyday people that they can safely ride in and share space with autonomous vehicles.
The model being tested is designed to provide safe transportation in an environment that presents multiple obstacles (such as pedestrians crossing its path). The goal is to find out how passengers react to the experience of being transported by a computer-run system; the team also hopes to gain insights on interior design and functionality.
Why it’s hot:
Not everyone can afford access to new technology, but as it becomes more ubiquitous, everyone needs to become familiar and comfortable with it in order for it to be widely accepted and implemented.
As advertisers and marketers, we often face the challenge of getting consumers to adopt new technologies, and we have to find ways to familiarize people with changing tech (like the cardboard VR glasses).