Tag archives: SDK

Banuba raises $7M to supercharge any app or device with the ability to really see you

Walking into the office of Viktor Prokopenya — which overlooks a central London park — you would perhaps be forgiven for missing the significance of this unassuming location, just south of Victoria Station in London. While giant firms battle globally to make augmented reality a “real industry,” this jovial businessman from Belarus is poised to launch a revolutionary new technology for just this space. This is the kind of technology some of the biggest companies in the world are snapping up right now, and yet, scuttling off to make me a coffee in the kitchen is someone who could be sitting on just such a company.
Regardless of whether its immediate future is obvious or not, AR has a future if the amount of investment pouring into the space is anything to go by.
In 2016, AR and VR attracted $2.3 billion worth of investment (a 300 percent jump from 2015), and the market is expected to reach $108 billion by 2021 — 25 percent of which will be aimed at the AR sector. But, according to numerous forecasts, AR will overtake VR in 5-10 years.
Apple is clearly making headway in its AR developments: it recently acquired AR lens company Akonia Holographics, and with this month’s release of iOS 12 it enables developers to fully utilize ARKit 2, no doubt prompting the release of a new wave of camera-centric apps. This year Sequoia Capital China and SoftBank invested $50 million in AR camera app Snow. Samsung recently introduced its version of the AR cloud and a partnership with Wacom that turns Samsung’s S-Pen into an augmented reality magic wand.
The IBM/Unity partnership allows developers to integrate into their Unity applications Watson cloud services such as visual recognition, speech to text and more.
So there is no question that AR is becoming increasingly important, given the sheer amount of funding and M&A activity.

Joining the field is Prokopenya’s “Banuba” project. Although you can download a Snapchat-like app called “Banuba” from the App Store right now, underlying it is a suite of tools; Prokopenya is the founding investor, and he is working closely with the founding team of AI/AR experts behind it to realize a very big vision.
The key to Banuba’s pitch is the idea that its technology could equip not only apps but even hardware devices with “vision.” This is a perfect marriage of both AI and AR. What if, for instance, Amazon’s Alexa couldn’t just hear you? What if it could see you and interpret your facial expressions or perhaps even your mood? That’s the tantalizing strategy at the heart of this growing company.
Better known for its consumer apps, which have been effectively testing their concepts in the consumer field for the last year, Banuba is about to move heavily into the world of developer tools with the release of its new Banuba 3.0 mobile SDK. (Available to download now in the App Store for iOS devices and Google Play Store for Android.) It’s also now secured a further $7 million in funding from Larnabel Ventures, the fund of Russian entrepreneur Said Gutseriev, and Prokopenya’s VP Capital.
This move will take its total funding to $12 million. In the world of AR, this is like a Romulan warbird de-cloaking in a scene from Star Trek.
Banuba hopes that its SDK will enable brands and apps to utilize 3D Face AR inside their own apps, meaning users can benefit from cutting-edge face motion tracking, facial analysis, skin smoothing and tone adjustment. Banuba’s SDK also enables app developers to utilize background subtraction, which is similar to the “green screen” technology regularly used in movies and TV shows, enabling end-users to create a range of AR scenarios. Thus, like magic, you can remove those unsightly office surroundings and place yourself on a beach in the Bahamas…
Because Banuba’s technology equips devices with “vision,” meaning they can “see” human faces in 3D and extract meaningful subject analysis based on neural networks, including age and gender, it can do things that other apps just cannot do. It can even monitor your heart rate via spectral analysis of the time-varying color tones in your face.
It has already been incorporated into an app called Facemetrix, which can track a child’s eyes to ascertain whether they are reading something on a phone or tablet or not. Thanks to this technology, it is possible to not just “track” a person’s gaze, but also to control a smartphone’s function with a gaze. To that end, the SDK can detect micro-movements of the eye with subpixel accuracy in real time, and also detects certain points of the eye. The idea behind this is to “Gamify education,” rewarding a child with games and entertainment apps if the Facemetrix app has duly checked that they really did read the e-book they told their parents they’d read.
If that makes you think of a parallel with a certain Black Mirror episode where a young girl is prevented from seeing certain things via a brain implant, then you wouldn’t be a million miles away. At least this is a more benign version…
Banuba’s SDK also includes “Avatar AR,” empowering developers to get creative with digital communication by giving users the ability to interact with — and create personalized — avatars using any iOS or Android device. Prokopenya says: “We are in the midst of a critical transformation between our existing smartphones and future of AR devices, such as advanced glasses and lenses. Camera-centric apps have never been more important because of this.” He says that while developers using ARKit and ARCore are able to build experiences primarily for top-of-the-range smartphones, Banuba’s SDK can work on even low-range smartphones.
After all, why should users of Apple’s iPhone X be the only people to enjoy Animoji?
Banuba is also likely to take advantage of the news that Facebook recently announced it was testing AR ads in its newsfeed, following trials for businesses to show off products within Messenger.
Banuba’s technology won’t simply be for fun apps, however. Within two years, the company has filed 25 patent applications with the U.S. patent office, and six of those were processed in record time compared with the average. Its R&D center, staffed by 50 people and based in Minsk, is focused on developing a portfolio of technologies.
Interestingly, Belarus has become famous for AI and facial recognition technologies.
For instance, cast your mind back to early 2016, when Facebook bought Masquerade, a Minsk-based developer of a video filter app, MSQRD, which at one point was one of the most popular apps in the App Store. And in 2017, another Belarusian company, AIMatter, was acquired by Google, only months after raising $2 million. It too took an SDK approach, releasing a platform for real-time photo and video editing on mobile, dubbed Fabby. This was built upon a neural network-based AI platform. But Prokopenya has much bolder plans for Banuba.
In early 2017, he and Banuba launched a “technology-for-equity” program to enroll app developers and publishers across the world. This signed up Inventain, another startup from Belarus, to develop AR-based mobile games.
Prokopenya says the technologies associated with AR will be “leveraged by virtually every kind of app. Any app can recognize its user through the camera: male or female, age, ethnicity, level of stress, etc.” He says the app could then respond to the user in any number of ways. Literally, your apps could be watching you.
So, for instance, a fitness app could see how much weight you’d lost just by using the Banuba SDK to look at your face. Games apps could personalize the game based on what it knows about your face, such as reading your facial cues.
Back in his London office, overlooking a small park, Prokopenya waxes lyrical about the “incredible concentration of diversity, energy and opportunity” of London. “Living in London is fantastic,” he says. “The only thing I am upset about, however, is the uncertainty surrounding Brexit and what it might mean for business in the U.K. in the future.”
London may be great (and will always be), but sitting on his desk is a laptop with direct links back to Minsk, a place where the facial recognition technologies of the future are only now just emerging.


Coinbase acquires Distributed Systems to build ‘Login with Coinbase’

Coinbase wants to be Facebook Connect for crypto. The blockchain giant plans to develop “Login with Coinbase” or a similar identity platform for decentralized app developers to make it much easier for users to sign up and connect their crypto wallets. To fuel that platform, today Coinbase announced it has acquired Distributed Systems, a startup founded in 2015 that was building an identity standard for dApps called the Clear Protocol.
The five-person Distributed Systems team and its technology will join Coinbase. Three of the team members will work with Coinbase’s Toshi decentralized mobile browser team, while CEO Nikhil Srinivasan and his co-founder Alex Kern are forming the new decentralized identity team that will work on the Login with Coinbase product. They’ll be building it atop the “know your customer” anti-money laundering data Coinbase has on its 20 million customers. Srinivasan tells me the goal is to figure out “How can we allow that really rich identity data to enable a new class of applications?”

Distributed Systems had raised a $1.7 million seed round last year led by Floodgate and was considering raising a $4 million to $8 million round this summer. But Srinivasan says, “No one really understood what we’re building,” and it wanted a partner with KYC data. It began talking to Coinbase Ventures about an investment, but after they saw Distributed Systems’ progress and vision, “they quickly tried to move to find a way to acquire us.”
Distributed Systems began to hold acquisition talks with multiple major players in the blockchain space, and the CEO tells me it was deciding between going to “Facebook, or Robinhood, or Binance, or Coinbase,” having been in formal talks with at least one of the first three. Of Coinbase the CEO said, they “were able to convince us they were making big bets, weaving identity across their products.” The financial terms of the deal weren’t disclosed.

Coinbase’s plan for rolling out the Login with Coinbase-style platform is an SDK that other apps could integrate, though that won’t necessarily be the feature’s name. That mimics the way Facebook colonized the web with its SDK and login buttons that splashed its brand in front of tons of new and existing users. This turned Facebook into a fundamental identity utility beyond its social network.
Developers eager to improve conversions on their signup flow could turn to Coinbase instead of requiring users to set up whole new accounts and deal with crypto-specific headaches of complicated keys and procedures for connecting their wallet to make payments. One prominent dApp developer told me yesterday that forcing users to set up the MetaMask browser extension for identity was the part of their signup flow where they’re losing the most people.
This morning Coinbase CEO Brian Armstrong confirmed these plans to work on an identity SDK. When Coinbase investor Garry Tan of Initialized Capital wrote that “The main issue preventing dApp adoption is lack of native SDK so you can just download a mobile app and a clean fiat to crypto in one clean UX. Still have to download a browser plugin and transfer Eth to Metamask for now Too much friction,” Armstrong replied “On it :)”


In effect, Coinbase and Distributed Systems could build a safer version of identity than we get offline. As soon as you give your Social Security number to someone or it gets stolen, it can be used anywhere without your consent, and that leads to identity theft. Coinbase wants to build a vision of identity where you can connect to decentralized apps while retaining control. “Decentralized identity will let you prove that you own an identity, or that you have a relationship with the Social Security Administration, without making a copy of that identity,” writes Coinbase’s PM for identity B. Byrne, who’ll oversee Srinivasan’s new decentralized identity team. “If you stretch your imagination a little further, you can imagine this applying to your photos, social media posts, and maybe one day your passport too.”
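The “prove a relationship without handing over the secret” idea can be made concrete with a toy attestation scheme. Everything here is an assumption for illustration (the key, the claim format, the function names); real decentralized-identity systems use asymmetric signatures and standards such as verifiable credentials, whereas this stdlib-only sketch uses an HMAC, so the verifier must hold the issuer’s key (imagine it standing in for a public key).

```python
import hmac
import hashlib

ISSUER_KEY = b"issuer-secret-key"  # hypothetical signing key held by the issuer

def issue_attestation(user_id: str, claim: str) -> str:
    """The issuer (e.g. an agency) signs (user_id, claim).

    The underlying sensitive record (such as an SSN) never leaves the
    issuer; only this tag is shared with apps.
    """
    message = f"{user_id}|{claim}".encode()
    return hmac.new(ISSUER_KEY, message, hashlib.sha256).hexdigest()

def verify_attestation(user_id: str, claim: str, tag: str) -> bool:
    """An app checks the attestation without ever seeing the secret record."""
    expected = issue_attestation(user_id, claim)
    return hmac.compare_digest(expected, tag)

tag = issue_attestation("user-123", "has-ssa-record")
print(verify_attestation("user-123", "has-ssa-record", tag))       # True
print(verify_attestation("user-123", "has-ssa-record", "0" * 64))  # False
```

The point of the sketch is the data flow, not the cryptography: the relying app learns only that the claim was vouched for, which is exactly the property a Social Security number lacks once copied.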
Considering Distributed Systems and Coinbase are following the Facebook playbook, they may soon have competition from the social network. It has spun up its own blockchain team, and an identity and single sign-on platform for dApps is one of the products I think Facebook is most likely to build. But given Coinbase’s strong reputation in the blockchain industry and its massive head start in terms of registered crypto users, today’s acquisition positions it well to be how we connect our offline identity with the rising decentralized economy.



App developer contest for the 2net platform

Qualcomm Life, a subsidiary of Qualcomm, has announced the availability of a software development kit (SDK) for its 2net platform.

YogiPlay Debuts “YogiMeter,” An Educator-Based Rating System For Children’s Learning Apps


YogiPlay, a Menlo Park-based company from husband-and-wife team Cedric and Michal Selling, is attempting to tackle the critical problem of surfacing appropriate, trusted, and carefully vetted educational apps for children. The company recently raised $1 million in VC funding from DN Capital and Richmond Park Partners, in order to develop a system for rating apps for kids, specifically targeting the ages 2 through 8. Today, YogiPlay is announcing the results of those efforts with YogiMeter, its new app rating system designed to help parents find learning apps for children that have been vetted by a team of educational experts.

The Sellings are both Stanford-trained engineers, with Cedric hailing from Aruba Wireless and Michal an early Google employee. But they’re also parents who grew frustrated with how difficult it was to find quality learning apps for children. Cedric says they were inspired to start the company after watching their two-year-old daughter interacting with mobile apps, and realized what powerful learning tools they could be.

In June, the company hired Dr. Jim Gray, who previously served as the Director of Learning at LeapFrog, where he had been in charge of all curricula for the previous seven or so years. At YogiPlay, he led the development of the YogiMeter system, which has been designed to assess the engagement levels and educational qualities of mobile apps.

“It’s using the same principles I’ve been using all along from my knowledge of child development and interactive media,” he says of YogiMeter. “I’ve structured in a way with some very specific ways to look at how and why kids would be engaged, and if they’re engaged, how and why they might learn.” He also vetted this rubric with other colleagues not associated with YogiPlay to get their feedback and input.

While there are a few startups working to rank and review mobile apps, like KinderTown, for instance (which also vets apps with educators), Dr. Gray says that he believes the YogiMeter system uses a more developmental approach, with techniques familiar to those working in child development and education. “The others are not as rigorous, research-based, structured and consistent,” he says, describing YogiMeter’s competition.

The system he developed ranks and analyzes apps in two main areas — engagement and educational quality. For determining an app’s engagement, it looks at things like user interactions, user experience, intrinsically motivated engagement, extrinsically motivated engagement and socially motivated engagement. And to analyze the app’s ability to teach, it looks at whether the app will actually engage the child in learning, as it proposes to do, and whether that learning is deep, authentic, personalized, differentiated, and whether or not parents can track the child’s progress throughout. On that last front, it should be noted that YogiPlay also offers mobile developers an SDK which allows them to integrate parental communication tools within their iOS or Android app. Fewer than twenty developers on iOS and Android are using this SDK in their apps today.
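To make the two-part rubric concrete, here is a hypothetical sketch of how per-dimension ratings could be aggregated into a single score. The dimension names mirror those described above, but the 1-5 scale, the equal weighting of the two areas, and the function name are all invented for illustration; YogiPlay has not published its actual scoring formula.

```python
# Dimensions named in the article's description of the rubric.
ENGAGEMENT_DIMS = [
    "user_interactions", "user_experience",
    "intrinsic_motivation", "extrinsic_motivation", "social_motivation",
]
EDUCATION_DIMS = [
    "engages_in_learning", "deep", "authentic",
    "personalized", "differentiated", "parent_progress_tracking",
]

def yogi_score(ratings: dict) -> float:
    """Average each area (1-5 per dimension), then weight the areas equally."""
    engagement = sum(ratings[d] for d in ENGAGEMENT_DIMS) / len(ENGAGEMENT_DIMS)
    education = sum(ratings[d] for d in EDUCATION_DIMS) / len(EDUCATION_DIMS)
    return round((engagement + education) / 2, 2)

# A sample app rated 4 on every engagement dimension, 3 on every
# educational dimension.
sample_app = {d: 4 for d in ENGAGEMENT_DIMS} | {d: 3 for d in EDUCATION_DIMS}
print(yogi_score(sample_app))  # -> 3.5
```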

The YogiMeter rankings will soon be featured visually in the company’s online parent-facing app discovery center and personalized dashboard as well as within its standalone mobile apps, all of which are undergoing massive redesigns right now.  (Hence, no screenshots are appropriate here). YogiPlay’s mobile app is available on Android currently, but has been held up in Apple’s approval queue for around three months. However, since the apps are HTML5-based, it seems that even if it was rejected from iOS, the company could easily make it accessible on mobile devices.

To date, YogiPlay has rated over 600 apps with YogiMeter, and is adding new apps to the system daily. The company has also seen some 45,000 parents register on the site, providing their email and their child’s age and other details. Although parents can’t see the YogiMeter rating just yet on YogiPlay.com, all the apps have now been vetted through the system.



Building Smarter Apps


A new breed of mobile applications is coming. These new apps will not only “sense” the world around you, using the smartphone’s sensors like the compass, GPS, and accelerometer – they’ll also be able to combine that data with a history of your actions to intelligently determine your likes, interests and favorites. This understanding of the world, or “ambient discovery” if you will, could then be piped into any app to make it smarter, whether it’s a social app for finding friends, a Siri-like personal assistant, a fitness app, a mobile game, or anything else.

This, at least, is the promise from the Palo Alto-based startup, Alohar Mobile, which recently introduced new SDKs for mobile app developers interested in experimenting with the possibilities of smarter apps.

Alohar Mobile, newly emerged from stealth mode, was founded by Sam Liang, a former platform architect of Google’s location server platform. You may know him as the guy who put the “blue dot” location service in tens of thousands of mobile apps, including the default Map app on iPhone and Android, as well as in the Facebook check-in, Foursquare and Yelp. He also architected Google Latitude.

Liang started the company with Stanford alumni, Larry Wang and Alvin Lau, and they’ve now raised $2 million in funding from notable angel investors, including David Cheriton, the first investor in Google, Fortinet founder and CEO Ken Xie, and Tim Draper of Draper Fisher Jurvetson.

As for what, exactly, Alohar is providing – that’s a bit more complicated. It’s not just developing a smarter Siri, although that description is sure to catch more readers’ attention than something like “mobile development platform,” for example. While a smarter Siri-like app could be the product of Alohar’s work, it is not the work itself.

Lau describes the technology as an “ambient sensing platform.”

Um, say what?

“We’ve developed technology that sits on a smartphone that analyzes data coming from all the different sensors on your phone – for example, GPS and Wi-Fi – but a lot of companies do that, that’s nothing special. But we also gather data from the accelerometer, the compass, the gyroscope,” explains Lau. “It helps us to determine a person’s exact location.”

What that means is that apps using Alohar’s technology can precisely determine where someone is because of the way data is combined. For example, an app relying on GPS alone may know that you’re somewhere near a Starbucks, but can’t really tell if you’re there or in an adjacent store. Alohar-enabled apps, however, could detect things like the rate at which you’re moving (60 MPH? You’re probably driving down the road past a Starbucks), the direction you’re headed (moving towards the building slowly? You’re probably walking into the Starbucks), the network you’re connected to (ATTWIFI? You’re probably inside the Starbucks), and even time of day (8:30 AM? You’re probably at the Starbucks on the ground level of that skyscraper, not the nightclub on the top floor).

None of the data is used in isolation, but is instead parsed by advanced algorithms to make sense of your actions and movements. The algorithms give the app higher or lower probabilities to different types of places.

These algorithms can also take into account what you’ve done in the past and use that to help weight the data appropriately. For example, if you’ve visited that Starbucks several times over the past couple of weeks, but have never visited the bagel shop next door, the algorithm knows that you’re probably at the Starbucks.
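The Starbucks example above amounts to multi-signal probabilistic inference: each observation shifts the odds between candidate places, and visit history acts as a prior. Alohar’s real algorithms are not public, so the following is a toy naive-Bayes-style sketch in which every signal name, likelihood value and prior is invented purely to illustrate the fusion idea.

```python
CANDIDATES = ["starbucks", "bagel_shop", "nightclub"]

# Likelihood of observing each signal given each place (made-up numbers).
LIKELIHOODS = {
    "walking_slowly_toward_entrance": {"starbucks": 0.6, "bagel_shop": 0.5, "nightclub": 0.2},
    "wifi_ATTWIFI":                   {"starbucks": 0.8, "bagel_shop": 0.1, "nightclub": 0.1},
    "time_8_30_am":                   {"starbucks": 0.7, "bagel_shop": 0.6, "nightclub": 0.05},
}

# Prior from visit history: frequent Starbucks visits, never the bagel shop.
PRIOR = {"starbucks": 0.6, "bagel_shop": 0.1, "nightclub": 0.3}

def infer_place(signals: list) -> dict:
    """Naive-Bayes-style fusion: multiply the prior by each signal's
    likelihood, then normalize into a posterior over candidate places."""
    scores = dict(PRIOR)
    for s in signals:
        for place in CANDIDATES:
            scores[place] *= LIKELIHOODS[s][place]
    total = sum(scores.values())
    return {place: score / total for place, score in scores.items()}

posterior = infer_place(
    ["walking_slowly_toward_entrance", "wifi_ATTWIFI", "time_8_30_am"]
)
best = max(posterior, key=posterior.get)
print(best)  # -> starbucks
```

No single signal is decisive on its own; it is the product of weak evidence, anchored by history, that pushes one candidate well ahead of the others.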

Alohar’s technology has been packaged into an SDK for mobile developers, which allows them to create new apps or enhance existing ones. The company has also released a sample app into the App Store called PlaceMe, which is an interesting product on its own. The app tracks and records your movements, producing a virtual trail you can later pull up online. A bit creepy, perhaps, but the company says it would be handy for Alzheimer’s patients to have installed.

But while PlaceMe is a fun experiment, the focus for the company is more so on the tech behind it. Some mobile app makers are already working on integrations, but Alohar can’t reveal who just yet, only give general descriptions. “Developers who are using [the SDK] are in the categories of dating, fitness and health apps that want to track your exercise and make recommendations, and shopping apps that make suggestions based on your location and your likes and favorites,” says Lau.

He also mentioned that some check-in apps were experimenting with auto check-ins and the reduced battery consumption the tech enables. Plus, two of the twelve “ambient location” startups that were hot during this year’s SXSW have begun to implement the technology, too.

But it’s still early days for Alohar. The Android SDK came out in March and the iOS version arrived just this month. Both are in beta. So far, around 65 developers are evaluating or integrating the technology, Lau says.

And yet, almost any app that uses location services could benefit from the more precise targeting the tech offers, assuming everything works as advertised. More than that, the tech could enable a whole new kind of experience for developers to build on top of – one where users don’t have to do so much manual labor to explain to apps where they are and what they’re doing and what they want to do.

It’s yet another step towards engineering the serendipitous discovery of the world around us, via our mobile devices. It’s the underpinnings that could breathe intelligence into our apps, which could then make them, at best, more useful, more engaging, and ultimately, more loved…or, at worst, more creepy, more intrusive, more stalker-ish.

How developers choose to implement the technology, and the level of control they give to users over that data’s use and storage, could raise a whole new series of questions about data privacy, even as Apple, Google, developers and the government are still figuring out what to do about the concerns we already have now – those that come from more basic actions, like accessing the address book or storing GPS data.

But with the fast pace of technology, sometimes you have to weigh the good against the bad and choose to move forward or get left behind. On that scale, the possibility of developing more intelligent apps – not to mention ones that can reduce battery drain – is a more exciting and promising step than the potential for abuse, real as it may be, from unscrupulous developers or the government (at least in less authoritarian regimes like the U.S.). You may not agree. That’s fine. But sometimes the laws have to catch up with the world, and in the mobile ecosystem, this is clearly going to be the case for years to come.

Below is a demo of Alohar in action, doing things like automatically ordering an ambulance for you. “I’ve fallen and I can’t get up”? Yes, your phone will know.

Image credit: Ryan Orr

