Tag Archives: FBI

FaceApp gets federal attention as Sen. Schumer raises alarm on data use

It’s been hard to get away from FaceApp over the last few days, whether it’s your friends posting weird selfies using the app’s aging and other filters, or the brief furor over its apparent (but not actual) circumvention of permissions on iPhones. Now even the Senate is getting in on the fun: Sen. Chuck Schumer (D-NY) has asked the FBI and the FTC to look into the app’s data handling practices.
“I write today to express my concerns regarding FaceApp,” he writes in a letter sent to FBI Director Christopher Wray and FTC Chairman Joseph Simons. I’ve excerpted his main concerns below:
In order to operate the application, users must provide the company full and irrevocable access to their personal photos and data. According to its privacy policy, users grant FaceApp license to use or publish content shared with the application, including their username or even their real name, without notifying them or providing compensation.
Furthermore, it is unclear how long FaceApp retains a user’s data or how a user may ensure their data is deleted after usage. These forms of “dark patterns,” which manifest in opaque disclosures and broader user authorizations, can be misleading to consumers and may even constitute deceptive trade practices. Thus, I have serious concerns regarding both the protection of the data that is being aggregated as well as whether users are aware of who may have access to it.
In particular, FaceApp’s location in Russia raises questions regarding how and when the company provides access to the data of U.S. citizens to third parties, including potentially foreign governments.
For the cave-dwellers among you (and among whom I normally would proudly count myself) FaceApp is a selfie app that uses AI-esque techniques to apply various changes to faces, making them look older or younger, adding accessories, and, infamously, changing their race. That didn’t go over so well.
There’s been a surge in popularity over the last week, but it was also noticed that the app seemed to be able to access your photos whether you said it could or not. It turns out that this is actually a normal capability of iOS, but it was being deployed here in a somewhat sneaky manner and not as intended. And arguably it was a mistake on Apple’s part to let this method of selecting a single photo override the “never” preference for photo access that a user had set.
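For the technically curious, here is a minimal Swift sketch of the iOS behavior in question (a general illustration, not FaceApp’s actual code): the system picker shows the library on the app’s behalf and hands back only the single image the user taps, which is why iOS presents it without consulting the app’s photo-library permission.

```swift
import UIKit

// Minimal sketch of the single-photo selection path. Because the system
// picker returns only the one image the user chooses, iOS presents it
// without any photo-library authorization prompt or check.
class SelfieViewController: UIViewController,
    UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    func presentPicker() {
        let picker = UIImagePickerController()
        picker.sourceType = .photoLibrary  // no PHPhotoLibrary permission required
        picker.delegate = self
        present(picker, animated: true)
    }

    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        // The app receives just this one image; it never gains broader
        // access to the user's photo library.
        if let image = info[.originalImage] as? UIImage {
            // Hand the image off for processing (uploading, filtering, etc.)
            _ = image
        }
        picker.dismiss(animated: true)
    }
}
```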
Fortunately the Senator’s team is not worried about this, or even about the unfounded (we checked) concerns that FaceApp was secretly sending your data off in the background. It isn’t. But it very much does send your data to Russia when you tell it to give you an old face, or a hipster face, or whatever, because the computers that do the actual photo manipulation are located there — these filters are being applied in the cloud, not directly on your phone.
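As a rough sketch of that architecture, with a hypothetical endpoint invented for illustration (FaceApp’s actual API is not public), the round trip looks something like this:

```swift
import Foundation

// Sketch of cloud-side filtering: the full-resolution selfie is uploaded,
// the neural-network filter runs on the server, and the edited image comes
// back. The URL and field names are assumptions for illustration only.
func applyFilter(to photo: Data, named filter: String) async throws -> Data {
    var request = URLRequest(url: URL(string: "https://filters.example.test/apply?name=\(filter)")!)
    request.httpMethod = "POST"
    request.setValue("image/jpeg", forHTTPHeaderField: "Content-Type")
    request.httpBody = photo  // the original photo leaves the device here
    let (edited, _) = try await URLSession.shared.data(for: request)
    return edited  // the device only ever receives the finished result
}
```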
His concerns are over the lack of transparency: user data is being sent out to servers who knows where, to be kept for who knows how long, and sold to who knows whom. Fortunately the obliging FaceApp managed to answer most of these questions before the Senator’s letter was ever posted.

FaceApp responds to privacy concerns 

The answers to his questions, should we choose to believe them, are that user data is not in fact sent to Russia, the company doesn’t track users and usually can’t, doesn’t sell data to third parties, and deletes “most” photos within 48 hours.
Although the “dark patterns” of which the Senator speaks are indeed an issue, and although it would have been much better if FaceApp had said up front what it does with your data, this is hardly an attempt by a Russian adversary to build up a database of U.S. citizens.
While it is good to see Congress engaging with digital privacy, asking the FBI and FTC to look into a single app seems unproductive when that app is not doing much that a hundred others, American and otherwise, have been doing for years. Cloud-based processing and storage of user data is commonplace — though usually disclosed a little better.
Certainly as Sen. Schumer suggests, the FTC should make sure that “there are adequate safeguards in place to protect the privacy of Americans…and if not, that the public be made aware of the risks associated with the use of this application or others similar to it.” But this seems the wrong nail to hang that on. We see surreptitious slurping of contact lists, deceptive deletion promises, third-party sharing of poorly anonymized data, and other bad practices in apps and services all the time — if the federal government wants to intervene, let’s have it. But let’s have a law or a regulation, not a strongly worded letter written after the fact.


FBI reportedly overestimated inaccessible encrypted phones by thousands

The FBI seems to have been caught fibbing again on the topic of encrypted phones. FBI Director Christopher Wray estimated in December that the Bureau had almost 7,800 phones from 2017 alone that investigators were unable to access. The real number is likely less than a quarter of that, The Washington Post reports.
Internal records cited by sources put the actual number of encrypted phones at perhaps 1,200, though it may be as many as 2,000, and the FBI told the paper in a statement that its “initial assessment is that programming errors resulted in significant over-counting of mobile devices reported.” Supposedly, the use of three separate databases to track the phones led to devices being counted multiple times.
Such a mistake would be so elementary that it’s hard to conceive of how it would be possible. These aren’t court notes, memos or unimportant random pieces of evidence; they’re physical devices with serial numbers and names attached. The idea that no one thought to check for duplicates before giving a number to the director for testimony in Congress suggests either conspiracy or gross incompetence.
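For a sense of how elementary the missing check is, consider this hypothetical sketch (the record type and field are invented for illustration): once each device is keyed by serial number, counting unique phones across overlapping databases is a one-line set union.

```swift
// Hypothetical illustration: de-duplicating device records by serial
// number before reporting a count. A phone that appears in all three
// databases is counted exactly once.
struct DeviceRecord: Hashable {
    let serialNumber: String
}

func uniqueDeviceCount(across databases: [[DeviceRecord]]) -> Int {
    // Flatten the databases and union them into a Set, which collapses
    // duplicates via the Hashable serial number.
    Set(databases.joined()).count
}
```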

Inquiry finds FBI sued Apple to unlock phone without considering all options

The latter seems more likely after a report by the Office of the Inspector General that found the FBI had failed to utilize its own resources to access locked phones, instead suing Apple and then hastily withdrawing the case when its basis (a locked phone from a terror attack) was removed. It seems to have chosen to downplay or ignore its own capabilities in order to pursue the narrative that widespread encryption is dangerous without a backdoor for law enforcement.
An audit is underway at the Bureau to figure out just how many phones it actually has that it can’t access, and hopefully how this all happened.
It is unmistakably among the FBI’s goals to emphasize the problem of devices being fully encrypted and inaccessible to authorities, a trend known as “going dark.” That much it has said publicly, and it is a serious problem for law enforcement. But it seems equally unmistakable that the Bureau is happy to be sloppy, deceptive or both in its advancement of a tailored narrative.


LocationSmart didn’t just sell mobile phone locations, it leaked them

What’s worse than companies selling the real-time locations of cell phones wholesale? Failing to take security precautions that prevent people from abusing the service. LocationSmart did both, as numerous sources indicated this week.
The company figures into the recent hack of Securus, a company in the lucrative business of prison inmate communication; LocationSmart was the partner that allowed Securus to provide mobile device locations in real time to law enforcement and others. There are perfectly good reasons and methods for establishing customer location, but this isn’t one of them.
Police, the FBI and the like are supposed to go directly to carriers for this kind of information. But paperwork is such a hassle! If carriers let LocationSmart, a separate company, access that data, and LocationSmart sells it to someone else (Securus), and that someone else sells it to law enforcement, much less paperwork is required! That’s what Securus told Senator Ron Wyden (D-OR) it was doing: acting as a middleman between the government and carriers, with help from LocationSmart.
LocationSmart’s service appears to locate phones by which towers they have recently connected to, returning a location within seconds that can be accurate to within a few hundred feet. To prove the service worked, the company (until recently) provided a free trial on its site: a prospective customer could put in a phone number and, once that number replied yes to a consent text, the location would be returned.
It worked quite well, but it’s now offline, because in its excitement to demonstrate the ability to locate a given phone, the company apparently forgot to secure the API by which it did so, Brian Krebs reports.
Krebs heard from CMU security researcher Robert Xiao, who had found that LocationSmart “failed to perform basic checks to prevent anonymous and unauthorized queries.” And not through some hardcore hackery — just by poking around.
“I stumbled upon this almost by accident, and it wasn’t terribly hard to do. This is something anyone could discover with minimal effort,” he told Krebs. Xiao posted the technical details here.
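To illustrate the class of flaw Xiao describes, a consent check enforced by the demo web page but not by the API behind it, here is a hypothetical sketch in Swift. The host, path, and parameter are invented; this is not LocationSmart’s real endpoint.

```swift
import Foundation

// Hypothetical sketch of an unauthenticated location query. Note what is
// absent: no API key, no session token, and no proof that the consent
// text was ever sent or answered. The endpoint is invented for illustration.
func queryLocation(of phoneNumber: String) async throws -> Data {
    var components = URLComponents(string: "https://api.example-locator.test/v1/locate")!
    components.queryItems = [URLQueryItem(name: "number", value: phoneNumber)]
    // If the server only enforces consent in its web form, a direct
    // request like this one skips the check entirely.
    let (data, _) = try await URLSession.shared.data(from: components.url!)
    return data  // approximate latitude/longitude for the phone
}
```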
Krebs and Xiao verified that this back door into the API worked by testing it with some known parties, and when they informed LocationSmart, the company’s CEO said it would investigate.
This is enough of an issue on its own. But it also calls into question what the wireless companies say about their own policies on location sharing. When Krebs contacted the four major U.S. carriers, all of them said they require customer consent or a law enforcement request.
Yet using LocationSmart’s tool, phones could be located without user consent on all four of those carriers. These two things can’t both be true. Of course, one was just demonstrated and documented, while the other is an assurance from an industry infamous for deception and bad privacy policy.
There are three options that I can think of:
  • LocationSmart has a way of finding location via towers that does not require authorization from the carriers in question. This seems unlikely for technical and business reasons; the company also listed the carriers and other companies on its front page as partners, though their logos have since been removed.
  • LocationSmart has a sort of skeleton key to carrier info; its requests might be assumed to be legit because it has law enforcement clients or the like. This is more likely, but it contradicts the carriers’ claim that they require consent or some kind of law enforcement justification.
  • Carriers don’t actually check on a case-by-case basis whether a request has consent; they may foist that duty off on the ones doing the requesting, like LocationSmart (which does ask for consent in the official demo). But if carriers don’t ask for consent and third parties don’t either, and neither keeps the other accountable, the requirement for consent may as well not exist.
None of these is particularly heartening. But no one expected anything good to come out of a poorly secured API that let anyone request the approximate location of anyone’s phone. I’ve asked LocationSmart for comment on how the issue was possible (and also Krebs for a bit of extra data that might shed light on this).
It’s worth mentioning that LocationSmart is not the only business that does this, just the one implicated today in this security failure and in the shady practices of Securus.


Inquiry finds FBI sued Apple to unlock phone without considering all options

The Office of the Inspector General has issued its report on the circumstances surrounding the FBI’s 2016 lawsuit attempting to force Apple to unlock an iPhone as part of a criminal investigation. While it stops short of saying the FBI was untruthful in its justification for going to court, the report is unsparing in its account of the bureaucracy and clashing political motives that ultimately undermined that justification.
The official narrative, briefly summarized, is that the FBI wanted to get into a locked iPhone allegedly used in the San Bernardino attack in late 2015. Then-director Comey explained on February 9 that the Bureau did not have the capability to unlock the phone, and that as Apple was refusing to help voluntarily, a lawsuit would be filed compelling it to assist.
But then, a month later, a miracle occurred: a third party had come forward with a working method to unlock the phone, and the lawsuit would not be necessary after all.
Though this mooted the court proceedings, which were dropped, it only delayed the inevitable and escalating battle between tech and law enforcement — specifically the “going dark” problem of pervasive encryption. Privacy advocates saw the suit as a transparent (but abortive) attempt to set a precedent greatly expanding the extent to which tech companies would be required to help law enforcement. Apple of course fought tooth and nail.
In 2016 the OIG was contacted by Amy Hess, a former FBI Executive Assistant Director, who basically said that the process wasn’t nearly so clean as the Bureau made it out to be. In the course of its inquiries the Inspector General did find that to be the case: although the FBI’s claims were not technically inaccurate or misleading, they also proved simply to be incorrect — and it is implied that they may have been allowed to be incorrect in order to further the “going dark” narrative.
The full report is quite readable (if you can mentally juggle the numerous acronyms), but the findings are essentially as follows.
Although Comey stated on February 9 that the FBI did not have the capability to unlock the phone and would seek legal remedy, the inquiry found that the Bureau had not exhausted all the avenues available to it, including some rather obvious ones.
Comey at a hearing in 2017
For instance, one senior engineer was tasked with asking trusted vendors if they had anything that could help — two days after Comey already said the FBI had no options left. Not only that, but there was official friction over whether classified tools generally reserved for national security purposes should be considered for this lesser, though obviously serious, criminal case.
In the first case, it turned out that yes, a vendor did have a solution “90 percent” done, and was happy to finish it up over the next month. How could the director have said that the FBI didn’t have the resources to do this, when it had not even asked its usual outside sources for help?
In the second, it’s still unclear whether there in fact exist classified tools that could have been brought to bear on the device in question. Testimony is conflicting on this point, with some officials saying that there was a “line in the sand” drawn between classified and unclassified tools, and another saying it was just a matter of preference. Regardless, those involved were less than forthcoming even within the Bureau, and even internal leadership was left wondering if there were solutions they hadn’t considered.
Hess, who brought the initial complaint to the OIG, was primarily concerned not that there was confusion in the ranks — it’s a huge organization and communication can be difficult — but that the search for a solution was deliberately allowed to fail in order that the case could act as a precedent advantageous to the FBI and other law enforcement agencies. Comey was known to be very concerned with the “going dark” issue and would likely have pursued such a case with vigor.
So the court case, Hess implied, was the real goal, and the meetings early in 2016 were formalities, nothing more than a paper trail to back up Comey’s statements. When a solution was actually found, because an engineer had taken initiative to ask around, officials hoping for a win in court were dismayed:
She became concerned that the CEAU Chief did not seem to want to find a technical solution, and that perhaps he knew of a solution but remained silent in order to pursue his own agenda of obtaining a favorable court ruling against Apple. According to EAD Hess, the problem with the Farook iPhone encryption was the “poster child” case for the Going Dark challenge.
The CEAU Chief told the OIG that, after the outside vendor came forward, he became frustrated that the case against Apple could no longer go forward, and he vented his frustration to the ROU Chief. He acknowledged that during this conversation between the two, he expressed disappointment that the ROU Chief had engaged an outside vendor to assist with the Farook iPhone, asking the ROU Chief, “Why did you do that for?”
While this doesn’t really imply a pattern of deception, it does suggest a willingness and ability on the part of FBI leadership to manipulate the situation to its advantage. A judge saying the likes of Apple must do everything possible to unlock an iPhone, and all forward ramifications of that, would be a tremendous coup for the Bureau and a major blow to user privacy.
The OIG ultimately recommends that the FBI “improve communication and coordination” so that this type of thing doesn’t happen (and it is reportedly doing so). Ironically, if the FBI had communicated to itself a bit better, the court case likely would have continued under pretenses that only its own leadership would know were false.


Leaked Photo Purportedly Shows iPhone 5’s Insides: A5 Processor, Bigger Display And Battery


Being the pedigreed gadget geeks that we are, most of us here at TechCrunch are always up for a good gadget tear-down. We’ve seen everything from phones to tablets to friggin’ FBI tracking devices get splayed apart, and we loved every second of it.

So when a tear-down gives a glimpse inside of what very much looks to be the iPhone 5? Yes, please.

Leaked on Chinese microblogging site Weibo (if Twitter and Facebook had a baby that only spoke Chinese, that’d be Weibo), this little glimpse into what’s being claimed as the innards of an iPhone 5 shows at least three things of note:

  • A5 processor (same CPU used in the iPad 2)
  • A slightly bigger/stronger battery (4.2 V, 1430 mAh versus 3.7 V, 1420 mAh)
  • The LCD backpanel (the blue bit just barely visible on the left side) definitely appears to extend from edge to edge, as rumors have long suggested the display would do.

If you’re having a hard time telling heads from tails in the picture, note that the components on the right would actually flip over (see the attached cable) and rest on top of those visible on the left. Also, some components appear to be missing — most notably, there’s no antenna, nor any body/caging to hold it all in place.

What do you think? Is this one the real deal, or just an inside look at a rather detailed clone?



