Anyone using any digital device needs to wise up to a looming threat as Meta attempts to exploit the European Union’s continued assault against the Apple ecosystem to launch what seems to be an open season on your privacy.
What seems to be happening is a tyrannical sequence of events:
- As part of its Digital Markets Act (DMA) enforcement, the European Commission has published a document demanding that Apple change numerous aspects of iOS so third-party developers can use technologies that are currently available only to Apple. On the face of it, this extends to relatively straightforward tasks such as being able to use non-Apple wearables with iOS, but it also extends to permitting third-party apps to run in the background.
- Apple responded to Europe’s latest craziness with its own statement, slamming the Commission’s application of the Digital Markets Act as “becoming personal.” It makes a strenuous and entirely reasonable argument that the changes the Commission demands will make every Apple user less secure, placing all our data at risk.
But the emerging threat may be something far worse.
- Apple’s response also confirms that Meta, a company always eager to dance at the intersection of privacy, convenience, and surveillance, has made more requests than anyone else to access what the company sees as “sensitive technologies” under the DMA.
What does that mean?
I’ll let Apple explain: “If Apple were to have to grant all of these requests, Facebook, Instagram, and WhatsApp could enable Meta to read on a user’s device all of their messages and emails, see every phone call they make or receive, track every app that they use, scan all of their photos, look at their files and calendar events, log all of their passwords, and more. This is data that Apple itself has chosen not to access in order to provide the strongest possible protection to users.”
Think about that.
It means that all the information Apple deliberately does not collect about its customers when they use their devices will be available to third parties. In effect, it would open up your entire digital life to third-party operators such as Meta, solely to meet the demands of an unconstrained European neoliberal fantasy that interoperability at this scale will nurture growth.
The wrong kind of growth
It will nurture some growth, mainly by nurturing vast instability across the digital experience. It will nurture an explosion in mass surveillance, initially in the name of “convenience” and “service,” but — once that information is accessible in this way — also at the hands of ad surveillance firms, state actors, foreign countries, and criminals.
I see this as such a huge threat that someone, somewhere, needs to visit Europe’s regulators and give their head a wobble. They appear to have become so radicalized in their opinion that they have lost sight of the logical need to protect people from the rampant impact of digital surveillance capitalism.
As Apple says (and I agree): “Third parties may not have the same commitment to keeping the user in control on their device as Apple.”
Looking around, who does?
Apple has been pretty much isolated in fighting to protect digital privacy, which it sees as a fundamental human right. Other big tech players have been slow to support Apple’s positions on some of this — though even the FBI, which has long wanted Apple to create highly insecure back doors in its devices, now seems to agree that some encryption is required to protect communication.
Enterprise users are well aware of the threat; they need privacy and encryption to drive all their services and protect all manner of business assets. They know that information matters and that keeping it safe in a digital age demands protections. Those protections are seemingly undermined by Europe’s naivety. But if you extend this just a little further, and think of the potential for AI services, then you must also think about the information those services use.
Machine intelligence
Your personal information — or information about you held by others — also suddenly becomes data that third-party firms grow greedy for. So, in the case of Europe’s demands, that seems to mean that all the personal information your device knows about you, which even Apple does not know and does not need in order to make its own AI systems work, could be opened up to serve the commercial interests of firms like Meta. All that data may become fuel for the AI mill.
Meta even wants access to your private communications, Apple warns.
I’m not at all clear why Meta wants, needs, or even deserves such access.
“The General Data Protection Regulation (GDPR), which Apple has always supported, set a strong set of privacy rules for all companies to comply with,” Apple warned. “The DMA was not intended to provide a way around the rules. But the end result could be that companies like Meta — which has been fined by regulators time and again for privacy violations — gains unfettered access to users’ devices and their most personal data. If Apple is forced to allow access to sensitive technologies that it has no ability to protect, the security risks would be substantial and virtually impossible to mitigate.”
Is that what you want?
I don’t.
It could get worse
Look, it really is like this: It doesn’t matter one iota if Apple’s stance on privacy and user security also helps it build its business; what matters is that that stance is the appropriate position to take. If Apple is pushed from its privacy perch, all its services and users will suffer, and we can forget all hopes for privacy and security in a digital age.
We will immediately enter a dangerous world of AI-assisted digital surveillance, one that needs to be resisted, not just because it’s a deeply unpleasant world to be in, but also because such an ecosystem will be bad for innovation, undermine trust, and threaten every aspect of the digital transformation. Every platform will be forced to open up, and all your data are belong to us, as somebody, somewhere might say.
Economically, politically, and personally, that is a very bad outcome for you, for me, for business, and even for Europe. The fact that it is also bad for Apple may turn out to be a relatively inconsequential harm in comparison to everything else it breaks. The European Commission really, really must think again, particularly as the nature of its demands appears to fly directly against the restrictions Europe also put in place with GDPR.
Please pull back from this deeply dangerous diktat.
You can follow me on social media! Join me on BlueSky, LinkedIn, Mastodon, and MeWe.