Tag archive: Google Play

Google starts pulling unvetted Android apps that access call logs and SMS messages

Google is removing apps from Google Play that request permission to access call logs and SMS text message data but haven’t been manually vetted by Google staff.
The search and mobile giant said the move is part of an effort to cut down on the number of apps that have access to sensitive calling and texting data.
Google said in October that Android apps will no longer be allowed to use the legacy permissions as part of a wider push for developers to use newer, more secure and privacy-minded APIs. Many apps request access to call logs and texting data to verify two-factor authentication codes, for social sharing, or to replace the phone dialer. But Google acknowledged that this level of access can be, and has been, abused by developers who misuse the permissions to gather sensitive data — or mishandle it altogether.
“Our new policy is designed to ensure that apps asking for these permissions need full and ongoing access to the sensitive data in order to accomplish the app’s primary use case, and that users will understand why this data would be required for the app to function,” wrote Paul Bankhead, Google’s director of product management for Google Play.
Any developer wanting to retain the ability to ask a user’s permission for calling and texting data has to fill out a permissions declaration.
Google will review the app and why it needs to retain access, and will weigh several considerations, including why the developer is requesting access, the user benefit of the feature that’s requesting access and the risks associated with having access to call and texting data.
Bankhead conceded that under the new policy, some use cases will “no longer be allowed,” rendering some apps obsolete.
So far, tens of thousands of developers have already submitted new versions of their apps that remove the need for call and texting permissions, Google said, or have submitted a permissions declaration.
Developers with a submitted declaration have until March 9 to receive approval or remove the permissions. In the meantime, Google has a full list of permitted use cases for the call log and text message permissions, as well as alternatives.
The last two years alone have seen several high-profile cases of Android apps or other services leaking or exposing call and text data. In late 2017, popular Android keyboard ai.type exposed a massive database of 31 million users, including 374 million phone numbers.



Overview of the best space games for Android in 2019

We present a thorough selection of space- and science-fiction-themed games for phones. Space games for Android are scattered across different sections of Google Play, so this overview will help you get oriented and download a shooter you like for free.

WhatsApp has an encrypted child porn problem

WhatsApp chat groups are being used to spread illegal child pornography, cloaked by the app’s end-to-end encryption. Without the necessary number of human moderators, the disturbing content is slipping by WhatsApp’s automated systems. A report from two Israeli NGOs reviewed by TechCrunch details how third-party apps for discovering WhatsApp groups include “Adult” sections that offer invite links to join rings of users trading images of child exploitation. TechCrunch has reviewed materials showing many of these groups are currently active.
TechCrunch’s investigation shows that Facebook could do more to police WhatsApp and remove this kind of content. Even without technical solutions that would require a weakening of encryption, WhatsApp’s moderators should have been able to find these groups and put a stop to them. Groups with names like “child porn only no adv” and “child porn xvideos” found on the group discovery app “Group Links For Whats” by Lisa Studio don’t even attempt to hide their nature. And a screenshot provided by anti-exploitation startup AntiToxin reveals active WhatsApp groups with names like “Children ” or “videos cp” — a known abbreviation for ‘child pornography’.
A screenshot from today of active child exploitation groups on WhatsApp. Phone numbers and photos redacted. Provided by AntiToxin.
Better manual investigation of these group discovery apps and WhatsApp itself should have immediately led these groups to be deleted and their members banned. While Facebook doubled its moderation staff from 10,000 to 20,000 in 2018 to crack down on election interference, bullying and other policy violations, that staff does not moderate WhatsApp content. With just 300 employees, WhatsApp runs semi-independently, and the company confirms it handles its own moderation efforts. That’s proving inadequate for policing a 1.5 billion-user community.
The findings from the NGOs Screen Savers and Netivei Reshe were written about today by Financial Times, but TechCrunch is publishing the full report, their translated letter to Facebook, translated emails with Facebook, their police report, plus the names of child pornography groups on WhatsApp and group discovery apps listed above. A startup called AntiToxin Technologies that researches the topic has backed up the report, providing the screenshot above and saying it’s identified more than 1,300 videos and photographs of minors involved in sexual acts on WhatsApp groups. Given that Tumblr’s app was recently temporarily removed from the Apple App Store for allegedly harboring child pornography, we’ve asked Apple if it will temporarily suspend WhatsApp, but have not heard back. 

Uncovering a nightmare
In July 2018, the NGOs became aware of the issue after a man reported to one of their hotlines that he’d seen hardcore pornography on WhatsApp. In October, they spent 20 days cataloging more than 10 of the child pornography groups, their content and the apps that allow people to find them.
The NGOs began contacting Facebook’s head of Policy, Jordana Cutler, starting September 4th. They requested a meeting four times to discuss their findings. Cutler asked for email evidence but did not agree to a meeting, instead following Israeli law enforcement’s guidance to instruct researchers to contact the authorities. The NGOs reported their findings to Israeli police but declined to provide Facebook with their research. WhatsApp only received their report and the screenshot of active child pornography groups today from TechCrunch.
Listings from a group discovery app of child exploitation groups on WhatsApp. URLs and photos have been redacted.
WhatsApp tells me it’s now investigating the groups visible from the research we provided. A Facebook spokesperson tells TechCrunch, “Keeping people safe on Facebook is fundamental to the work of our teams around the world. We offered to work together with police in Israel to launch an investigation to stop this abuse.” A statement from the Israeli Police’s head of the Child Online Protection Bureau, Meir Hayoun, notes that: “In past meetings with Jordana, I instructed her to always tell anyone who wanted to report any pedophile content to contact the Israeli police to report a complaint.”
A WhatsApp spokesperson tells me that while legal adult pornography is allowed on WhatsApp, it banned 130,000 accounts in a recent 10-day period for violating its policies against child exploitation. In a statement, WhatsApp wrote that:

WhatsApp has a zero-tolerance policy around child sexual abuse. We deploy our most advanced technology, including artificial intelligence, to scan profile photos and images in reported content, and actively ban accounts suspected of sharing this vile content. We also respond to law enforcement requests around the world and immediately report abuse to the National Center for Missing and Exploited Children. Sadly, because both app stores and communications services are being misused to spread abusive content, technology companies must work together to stop it.

But it’s that over-reliance on technology and subsequent under-staffing that seems to have allowed the problem to fester. AntiToxin’s CEO Zohar Levkovitz tells me, “Can it be argued that Facebook has unwittingly growth-hacked pedophilia? Yes. As parents and tech executives we cannot remain complacent to that.”

Automated moderation doesn’t cut it
WhatsApp introduced an invite link feature for groups in late 2016, making it much easier to discover and join groups without knowing any members. Competitors like Telegram had benefited as engagement in their public group chats rose. WhatsApp likely saw group invite links as an opportunity for growth, but didn’t allocate enough resources to monitor groups of strangers assembling around different topics. Apps sprang up to allow people to browse different groups by category. Some usage of these apps is legitimate, as people seek communities to discuss sports or entertainment. But many of these apps now feature “Adult” sections that can include invite links to both legal pornography-sharing groups as well as illegal child exploitation content.
A WhatsApp spokesperson tells me that it scans all unencrypted information on its network — basically anything outside of chat threads themselves — including user profile photos, group profile photos and group information. It seeks to match content against the PhotoDNA banks of indexed child pornography that many tech companies use to identify previously reported inappropriate imagery. If it finds a match, that account, or that group and all of its members, receive a lifetime ban from WhatsApp.
A WhatsApp group discovery app’s listings of child exploitation groups on WhatsApp
If imagery doesn’t match the database but is suspected of showing child exploitation, it’s manually reviewed. If found to be illegal, WhatsApp bans the accounts and/or groups, prevents it from being uploaded in the future and reports the content and accounts to the National Center for Missing and Exploited Children. The one example group reported to WhatsApp by Financial Times was already flagged for human review by its automated system, and was then banned along with all 256 members.
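The two-stage flow described above — hash the unencrypted image, ban on a match against a bank of known imagery, otherwise queue suspicious content for human review — can be sketched as below. This is a hypothetical illustration only: PhotoDNA is a proprietary perceptual hash, so SHA-256 stands in for it here, and the function names and action strings are invented for the example.

```python
import hashlib


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash such as PhotoDNA (which this is not)."""
    return hashlib.sha256(image_bytes).hexdigest()


def moderate(image_bytes: bytes, hash_bank: set[str]) -> str:
    """Return the action for an uploaded, unencrypted image.

    hash_bank holds hashes of previously reported imagery.
    """
    if image_hash(image_bytes) in hash_bank:
        # Known, previously reported imagery: automatic lifetime ban
        # for the account or group, per the policy described above.
        return "ban_account_and_group"
    # No match: a real system would route suspected content to human
    # reviewers rather than approve it automatically.
    return "queue_for_manual_review"


known_bad = {image_hash(b"previously-reported-image")}
print(moderate(b"previously-reported-image", known_bad))  # ban_account_and_group
print(moderate(b"new-image", known_bad))                  # queue_for_manual_review
```

Note that a real perceptual hash tolerates re-encoding and cropping, which a cryptographic hash like SHA-256 does not — that robustness is precisely why systems like PhotoDNA are used for this task.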
To discourage abuse, WhatsApp says it limits groups to 256 members and purposefully does not provide a search function for people or groups within its app. It does not encourage the publication of group invite links and the vast majority of groups have six or fewer members. It’s already working with Google and Apple to enforce its terms of service against apps like the child exploitation group discovery apps that abuse WhatsApp. Those kind of groups already can’t be found in Apple’s App Store, but remain available on Google Play. We’ve contacted Google Play to ask how it addresses illegal content discovery apps and whether Group Links For Whats by Lisa Studio will remain available, and will update if we hear back. [Update 3pm PT: Google has not provided a comment but the Group Links For Whats app by Lisa Studio has been removed from Google Play. That’s a step in the right direction.]
But the larger question is: if WhatsApp was already aware of these group discovery apps, why wasn’t it using them to track down and ban groups that violate its policies? A spokesperson claimed that group names with “CP” or other indicators of child exploitation are some of the signals it uses to hunt these groups, and that names in group discovery apps don’t necessarily correlate to the group names on WhatsApp. But TechCrunch then provided a screenshot showing active groups within WhatsApp as of this morning, with names like “Children ” or “videos cp”. That shows that WhatsApp’s automated systems and lean staff are not enough to prevent the spread of illegal imagery.
The situation also raises questions about the trade-offs of encryption as some governments like Australia seek to prevent its usage by messaging apps. The technology can protect free speech, improve the safety of political dissidents and prevent censorship by both governments and tech platforms. However, it also can make detecting crime more difficult, exacerbating the harm caused to victims.
WhatsApp’s spokesperson tells me that it stands behind strong end-to-end encryption that protects conversations with loved ones, doctors and more. They said there are plenty of good reasons for end-to-end encryption and it will continue to support it. Changing that in any way, even to aid catching those that exploit children, would require a significant change to the privacy guarantees it’s given users. They suggested that on-device scanning for illegal content would have to be implemented by phone makers to prevent its spread without hampering encryption.
But for now, WhatsApp needs more human moderators willing to use proactive and unscalable manual investigation to address its child pornography problem. With Facebook earning billions in profit per quarter and staffing up its own moderation ranks, there’s no reason WhatsApp’s supposed autonomy should prevent it from applying adequate resources to the issue. WhatsApp sought to grow through big public groups, but failed to implement the necessary precautions to ensure they didn’t become havens for child exploitation. Tech companies like WhatsApp need to stop assuming cheap and efficient technological solutions are sufficient. If they want to make money off huge user bases, they must be willing to pay to protect and police them.


A look at the Android Market (aka Google Play) on its 10th Anniversary

Google Play has generated more than twice the downloads of the iOS App Store, reaching a 70 percent share of worldwide downloads in 2017, according to a new report from App Annie, released in conjunction with the 10th anniversary of the Android Market, now called Google Play. The report also examined the state of Google Play’s marketplace and the habits of Android users.
It found that, despite the large share of downloads, Google Play only accounted for 34 percent of worldwide consumer spend on apps, compared with 66 percent on the iOS App Store in 2017 — a figure that’s stayed relatively consistent for years.

Those numbers are consistent with the narrative that’s been told about the two app marketplaces for some time, as well. That is, Google has the sheer download numbers, thanks to the wide distribution of its devices — including its reach into emerging markets, thanks to low-cost smartphones. But Apple’s ecosystem is the one making more money from apps.
App Annie also found that the APAC (Asia-Pacific) region accounts for more than half of Google Play consumer spending. Japan was the largest market of all time on this front, topping the charts with $25.1 billion spent on apps and in-app purchases. It was followed by the U.S. ($19.3 billion) and South Korea ($11.2 billion).
The firm attributed some of Google Play’s success in Japan to carrier billing. This has allowed consumer spending to increase in markets like South Korea, Taiwan, Thailand and Singapore, as well, it said.
As to what consumers are spending their money on? Games, of course.
The report found that games accounted for 41 percent of downloads, but 88 percent of spend.

Outside of games, in-app subscriptions have contributed to revenue growth.
Non-game apps reached $2.7 billion in consumer spend last year, with 4 out of the top 5 apps offering a subscription model. The No. 1 app, LINE, was the exception. It was followed by subscription apps Tinder, Pandora, Netflix and HBO NOW.
In addition, App Annie examined the app usage patterns of Android users, and found they tend to have a lot of apps installed. In several markets, including the U.S. and Japan, Android users had more than 60 apps installed on their phones, and they used more than 30 apps every month.
Australia, the U.S. and South Korea led the way here, with users’ phones holding 100 or more apps.

The report also looked at the most popular games and apps of all time by both downloads and consumer spend. There weren’t many surprises on these lists, with apps like those made by Facebook dominating the top apps by downloads list, and subscription services dominating top apps by spend.

App Annie also noted Google Play has seen the release of nearly 10 million apps since its launch in 2008. Not all these remain, of course — by today’s count, there are just over 2.8 million apps live on Google Play.
The full report is available here.


TikTok adds video reactions to its newly-merged app

Just about a month after the merger of the short-form video apps Musical.ly and TikTok, the app is introducing a new social feature, allowing users to post their reactions to the videos that they watch.
Instead of text comments, these reactions take the form of videos that are essentially superimposed on top of existing clips. The idea of a reaction video should be familiar to anyone who’s spent some time on YouTube, but TikTok is incorporating the concept in a way that looks pretty seamless.
To post a reaction, users just need to choose the React option in the Share menu for a given video. The app will then record your audio and video as the clip plays. You can also decide where on the screen you want your reaction video to appear.
If you don’t recognize the TikTok name, that’s probably because the app only launched in the United States at the beginning of August, but it’s been available in China for a couple of years.

Back in 2017, Bytedance — the Chinese company behind TikTok as well as news aggregator Toutiao — acquired Musical.ly for around $1 billion. It eventually merged the two apps to combine their audiences and features; Musical.ly users were moved over with their existing videos and settings.
The company says Reactions will be available in the updated app on Google Play and the Apple App Store over the next day or two.
