Tag Archives: Siri Shortcuts

Google Assistant iOS update lets you say ‘Hey Siri, OK Google’

Apple probably didn’t intend to let competitors take advantage of Siri Shortcuts this way, but you can now launch Google Assistant on your iPhone by saying “Hey Siri, OK Google.”
But don’t expect a flawless experience — it takes multiple steps. After updating the Google Assistant app on iOS, you need to open the app to set up a new Siri Shortcut for Google Assistant.
As the name suggests, Siri Shortcuts lets you record custom phrases to launch specific apps or features. For instance, you can create Siri Shortcuts to play your favorite playlist, launch directions to a specific place, text someone and more. If you want to chain multiple actions together, you can even build complex multi-step workflows using Apple’s Shortcuts app.
By default, Google suggests the phrase “OK Google.” You can also choose something else, such as “Hey Google.” After setting that up, you can summon Siri and use this custom phrase to launch Google’s app.
You may need to unlock your iPhone or iPad to let iOS open the app. The Google Assistant app then automatically listens for your query, but you need to pause and wait for the app to appear before speaking.
This is quite a cumbersome workaround and I’m not sure many people are going to use it. But the fact that “Hey Siri, OK Google” exists is still very funny.
On another note, Google Assistant is still the worst when it comes to your privacy. The app pushes you to enable “Web & App Activity,” the infamous all-encompassing privacy destroyer. If you activate that setting, Google will collect your search history, your Chrome browsing history, your location, your credit card purchases and more.
It’s a great example of dark pattern design. If you haven’t enabled Web & App Activity, there’s a flashy blue banner at the bottom of the app that tells you that you can “unlock more Assistant features.”

When you tap it, you get a cute little animated drawing to distract you from the text. There’s only one button, which says “More.” If you tap it, the “More” button becomes “Turn on.” Many people are not even going to see the “No thanks” option on the bottom left.
It’s a classic persuasion method. If somebody asks you multiple questions and you say yes every time, you’ll tend to say yes to the last question even if you don’t agree with it. You tapped on “Get started” and “More” so you want to tap on the same button one more time. If you say no, Google asks you one more time if you’re 100 percent sure.
So make sure you read everything and you understand that you’re making a privacy trade-off by using Google Assistant.


Apple needs a feature like Google’s Call Screen

Google just one-upped Apple in a significant way by addressing a problem that’s plaguing U.S. cellphone owners: spam calls. The company’s new Pixel 3 flagship Android smartphone is the first to introduce a new call screening feature that leverages the built-in Google Assistant. The screening service transcribes the caller’s request in real time, letting you decide whether or not to pick up, and gives you a way to respond.
Despite the numerous leaks about Google’s new hardware, Call Screen and the launch of Duplex for restaurant reservations were big surprises coming from Google’s hardware event yesterday.
Arguably, they’re even more important developments than fancy new camera features – even if Group Selfie and Top Shot are cool additions to Google’s new phone.
Apple has nothing like this call screening feature, only third-party call blocking apps – which are also available on Android, of course.
Siri today simply isn’t capable of answering phones on your behalf, politely asking the caller what they want, and transcribing their response instantly. It needs to catch up, and fast.
Half of calls will be spam in 2019
Call Screen, based on Google’s Duplex technology, is a big step for our smart devices. One where we’re not just querying our Assistant for help with various tasks, or to learn the day’s news and weather, but one where the phone’s assistant is helping with real-world problems.
In addition to calling restaurants to inquire about tables, Assistant will now help save us from the increasing barrage of spam calls.
This is a massive problem that every smartphone owner can relate to, and one the larger mobile industry has so far failed to solve.
By some industry estimates, nearly half of all cellphone calls next year will come from scammers. And their tactics have gotten much worse in recent months.
They now often trick people by claiming to be the IRS, a bank, government representatives, and more. They pretend you’re in some sort of legal trouble. They say someone has stolen your bank card. They claim you owe taxes. Plus, they often use phone number spoofing tricks to make their calls appear local in order to get recipients to pick up.
The national Do-Not-Call registry hasn’t solved the problem. And despite large FCC fines, the epidemic continues.
A.I. handles the spammers
In the absence of an industry-wide solution, Google has turned to A.I.
The system has been designed to sound more natural, stepping in to do the sorts of tasks we don’t want to do – like calling for bookings, or screening our calls by first asking “who is this, please?”
With Call Screen, as Google explained yesterday, Pixel device owners will be able to tap a button when a call comes in to send it to the new service. Google Assistant will answer the call for you, saying: “Hi, the person you’re calling is using a screening service from Google, and will get a copy of this conversation. Go ahead and say your name and why you’re calling.”
The caller’s response is then transcribed in real-time on your screen.
These transcripts aren’t currently being saved, but Google says they could be stored in your Call History in the future.
To handle the caller, you can tap a variety of buttons to continue or end the conversation. Based on the demo and support documentation, these include things like: “Who is this?,” “I’ll call you back,” “Tell me more,” “I can’t understand,” or “Is it urgent?”
You can also use the Assistant to say things like, “Please remove the number from your contact list. Thanks and goodbye,” the demo showed, after the recipient hit the “Report as spam” button.
While Google’s own Google Voice technology has been able to screen incoming calls, this involved little more than asking for the caller’s name. Call Screen is next-level stuff, to put it mildly.
And it’s all taking place on the device, using A.I. – it doesn’t need to use your Wi-Fi connection or your mobile data, Google says.

As Call Screen is adopted at scale, Google will have effectively built out its own database of scammers. It could then feasibly block spam calls or telemarketers on your behalf as an OS-level feature at some point in the future.
“You’ll never have to talk to another telemarketer,” said Google PM Liza Ma at the event yesterday, followed by cheers and applause – one of the few times the audience even clapped during this otherwise low-key press conference.
Google has the better A.I. phone
The news of Call Screen, and of Duplex more broadly, is another shot fired across Apple’s bow.
Smartphone hardware is basically good enough, and has been for some time. Apple and Google’s modern smartphones take great photos, too. New developments on the camera front matter more to photography enthusiasts than to the average user. The phones are fine. The cameras are fine. So what else can the phones do?
The next battle for smartphones is going to be about A.I. technology.
Apple is aware that’s the case.
In June, the company introduced what we called its “A.I. phone” – an iPhone infused with Siri smarts to personalize the device and better assist users. It allows users to create A.I.-powered workflows to automate tasks, to speak with Siri more naturally with commands they invent, and to let apps make suggestions instead of sending interruptive notifications.

But much of Siri’s capabilities still involve manual tweaking on users’ parts.
You record custom Siri voice commands to control apps (and then have to remember what your Siri catchphrase is in order to use them). Workflows have to be pinned together in a separate Siri Shortcuts app that’s over the heads of anyone but power users.
These are great features for iPhone owners, to be sure, but they don’t exactly apply A.I. technology in a seamless, automatic way. They’re Apple’s first steps toward making A.I. a bigger part of what it means to use an iPhone.
Call Screen, meanwhile, is a use case for A.I. that doesn’t require a ton of user education or manual labor. Even if you didn’t know it existed, pushing a “screen call” button when the phone rings is fairly straightforward stuff.
And it’s not going to be just a Pixel 3 feature.
According to Google, Pixel 3 owners in the U.S. are just getting it first. It will also roll out to older Pixel devices next month (in English). Presumably, however, it will come to Android itself in time, once these early tests wrap.
After all, if the mobile OS battle is going to be over A.I. going forward, there’s no reason to keep A.I. advancements tied to only Google’s own hardware devices.


iOS 12 is all about making your phone work better

The pace of iOS innovation has been so intense that even Apple couldn’t keep up. In some ways, iOS 11’s defining feature was that it was packed with bugs: autocorrect glitches, messages arriving out of order and a Calculator app that didn’t calculate properly. iOS 12 is a nice change of pace.
“For iOS 12, we’re doubling down on performance,” Apple’s SVP of Software Engineering Craig Federighi said at WWDC.
While there are a few interesting new features, iOS 12 isn’t a splashy release like the ones that were released over the past few years. It doesn’t change the way you use an iPad and it doesn’t open up apps with new hooks across the board.
It’s clear that all the low-hanging fruit has been addressed. Now, Apple is mostly adding new frameworks for specific categories of apps instead of releasing major platform changes that affect all third-party apps.
And for the rest, it’s all about refinements, bug fixes and optimizations. Apple released the first public beta of iOS 12 today. I played a bit with early beta versions of iOS 12, so here’s what you should be looking for.
Operating system changes
Let’s start with the updates at the operating system level. iOS 12 should be faster than iOS 11, including on older devices.
You know that feeling of instant regret when you update your old iPhone or iPad to a new version of iOS? Everything suddenly seems much slower. Apple wants to reverse this trend and make iOS 12 faster even on older devices like the iPhone 5s or the iPad mini 2.
Apps should launch faster, the keyboard should appear more quickly, the camera should be more responsive and more. It’s hard to judge that with a beta version of iOS 12, so we’ll have to revisit that claim in September.
Other than that, there is another major theme in iOS 12 — making you look at your phone less often. This goal is reflected in three new features — Screen Time, better notifications and a more granular Do Not Disturb mode.
Screen Time is a brand new feature that lets you see how much time you’ve wasted scrolling through feeds. You’ll get weekly reports, and parents can set up app limits that sync across all of your iOS devices.
Do Not Disturb is now more granular: you can set it for an hour, until the end of an event or until you leave a location. Many people avoided the feature because they would forget to turn it back off.
As for notifications, they are now grouped by default. In my experience, it takes a while to get used to it, but it’s a big improvement for noisy apps. You can also swipe on a notification to disable notifications from a specific app or turn them into silent notifications. You’ll feel more in control of your iPhone instead of feeling like your iPhone is controlling you.
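On the developer side, iOS groups related alerts by app, and an app can subdivide them further by stamping notifications with the same thread identifier. Here’s a minimal sketch using the UserNotifications framework; the chat-app names and identifiers are hypothetical examples:

```swift
import UserNotifications

// Build a local notification for a hypothetical chat app.
let content = UNMutableNotificationContent()
content.title = "Jane"
content.body = "See you at 6?"
// Notifications that share a threadIdentifier collapse into a
// single group on the iOS 12 lock screen and Notification Center.
content.threadIdentifier = "conversation-42"

let request = UNNotificationRequest(identifier: UUID().uuidString,
                                    content: content,
                                    trigger: nil) // nil trigger delivers immediately
UNUserNotificationCenter.current().add(request)
```

Grouping by conversation rather than by app is exactly what makes noisy messaging apps tolerable under the new default.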

App updates
Apple didn’t stop at those system-level improvements; it also updated several of its own apps. Let’s look at the most notable ones.
You can finally ditch Skype for good as FaceTime now supports group conversations — at least if all your friends are using iPhones. This feature alone will definitely increase iPhone stickiness, just like the fact that you can’t participate in iMessage conversations on Android.
Speaking of Messages, most iPhone users won’t see a difference this year, as Apple focused on the iPhone X. In addition to new Animoji, you can now create your own avatar using Memoji. I have to say that I really like Snap’s Bitmoji, so I’m quite excited to use it. The only issue is that it feels like a one-way conversation if you’re not messaging someone who is using an iPhone X. It’s the kind of feature that will start to make sense in a few years, when everybody has Face ID on their iPhone.
Four other Apple apps got an update. Stocks and Apple News received some design improvements. Voice Memos will now store your memos in iCloud and sync them with your iPad and Mac without using iTunes (finally). Lastly, iBooks is now called Apple Books, and it now looks more like the updated App Store.
Apple’s two bets
With iOS 12, Apple is pursuing its big bet on augmented reality and starting something new with Siri. Those platform changes could resonate well with developers and users or could become a distraction for everyone.
Apple’s augmented reality SDK is getting a major update. With ARKit 2, developers can create apps that share the same augmented reality world between multiple users. You can imagine multiplayer games and shareable worlds. Apple also worked on improving the overall performance of the framework.
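The sharing piece relies on ARKit 2’s new ARWorldMap object, which a session can capture and serialize for a peer. A rough sketch, assuming the peer-to-peer transport (e.g. MultipeerConnectivity) and the function names are supplied by your app:

```swift
import ARKit

// Capture the current session's world map and archive it so it can
// be sent to another player's device.
func sendWorldMap(from session: ARSession, using send: @escaping (Data) -> Void) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else { return } // mapping may not be ready yet
        if let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true) {
            send(data) // deliver over your peer-to-peer channel
        }
    }
}

// On the receiving device, relocalize into the shared world.
func receiveWorldMap(_ data: Data, into session: ARSession) {
    guard let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                            from: data) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```

Once both devices share the same world map, anchors placed by one player appear in the same physical spot for the other.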
But does it really matter? It feels like many geeks like you, TechCrunch readers, tested ARKit apps after the release of iOS 11. But there hasn’t been a mainstream hit so far. It’s still unclear if people actually want to use their iOS device to power an augmented reality experience.

And the second big thing is Siri Shortcuts. After Apple acquired Workflow, the automation app for iOS, many people wondered what it would mean for automation fans. The good news is that Apple is completely embracing Workflow with a set of features.
App developers can now configure Shortcuts to let users add a restaurant booking, a favorite Deliveroo order or a favorite sports team to Siri. On paper, it’s quite powerful and limited at the same time. It sounds like bookmarks for Siri.
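Under the hood, the simplest way for an app to expose such a bookmark is to donate an NSUserActivity flagged for prediction. A minimal sketch; the activity type and invocation phrase are made-up examples:

```swift
import Foundation
import Intents

// Donate an activity so the user can attach a Siri phrase to it
// (via Settings > Siri & Search, or an in-app "Add to Siri" button).
let activity = NSUserActivity(activityType: "com.example.app.order-lunch")
activity.title = "Order my usual lunch"
activity.isEligibleForSearch = true
activity.isEligibleForPrediction = true            // new in iOS 12
activity.suggestedInvocationPhrase = "Lunch time"  // phrase Siri proposes to the user
// Assigning it to the visible view controller performs the donation:
// viewController.userActivity = activity
```

Each donation is effectively one bookmark; richer parameterized interactions require building a custom intent on top of SiriKit.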
Most users will stop at suggested shortcuts. But power users will be able to configure multi-step workflows in the new Shortcuts app. It’s just like Workflow, but with a new name and new home automation features.
This is great news if you’re a power user, but I wonder if Shortcuts will find a mainstream audience. I couldn’t test those features, as they’re not yet available in the beta. Maybe Shortcuts will be added with iOS 12.1 or 12.2.
There are many small refinements in iOS 12 that I haven’t listed here. For instance, Portrait Mode has been improved and the Photos app is getting better at showing you personalized recommendations. Or if you have an iPhone X, you’ll be able to add a second face to unlock your phone.
iOS 12 looks especially promising if you consider your iPhone as infrastructure. Many people want a device that is as reliable as possible. And iOS 12 should stand out on this front.

Apple just released the first iOS 12 beta to everyone
