Turns out people become okay with cameras every few inches observing their every action as long as you say "it's for gestures," even though the data stream will inevitably make it back to corpo for training (and voyeurism[0] by the other parties that'll be in the room).
Same excuse for the VR headsets. "Oh, it has a red LED that lights up when recording!" Meanwhile, the thing has 30 other IR cams running in non-stop loops, consuming the environment from power-on until death.
[0]: https://www.reuters.com/article/world/uk/nsa-staff-used-spy-...
Currently I see Apple as safer than, say, Google or Microsoft, but not as the privacy bastion it claims to be.
It's opt-in, and the bolded option is "Ask App Not to Track", so I'm really not sure what the issue is here.
Years ago, Apple's Weather app sourced its data from The Weather Channel. That meant three tracking options regarding your location:
- Always share - You get real-time weather alerts, very useful in some seasons
- Share while using - You get current weather, but lose real-time alerts
- Do not share - Might as well uninstall the app
Then Apple made Apple Weather, which collects weather data from multiple sources and is supposedly safer to share real-time location with, since Apple won't share it with anyone. Before this, The Weather Channel had the real-time location of millions of people worldwide, and all Apple had for privacy was that prompt.
This is the kind of stack re-engineering I'm talking about, the kind that makes privacy a real proposition, but it needs to be applied deeper in the stack to really make a difference.
Unless you're some sort of globetrotter going to a new city every week, the app is quite usable just by adding your city.
>Before this, The Weather Channel had the real-time location of millions worldwide
Are you sure Apple wasn't proxying the traffic through their servers?
edit: for instance, the Stocks app very prominently shows that its data is from Yahoo, but if you check "most contacted domains" in the App Privacy Report, they're all Apple domains. It doesn't contact Yahoo at all.
In fact, that could have already been the case and you would not have known it.
Wouldn't the bigger issue be that they can abuse the same thing to grab the camera and/or microphone from your phone? Probably more useful than AirPods too, given that a phone's always on, unlike AirPods.
Using this to detect gestures does seem very cool, however. Seems like a fascinating engineering challenge.
Don't know, sounds totally useless, like most in-air gesture interfaces.
Finer volume control. Different gestures for fast forward vs next track. Additional gestures e.g. for liking the current track.
Doesn't sound useless to me at all.
Also, the new model can recognize when you fall asleep and stop the media. I think it works, but I'm not sure how quickly it detects sleep.
(I don't really want to be wearing them while asleep, but my body sometimes has other plans.)
It's the only reason I upgraded to ANC on the AirPods 4: I don't really like ANC, but I want the smart case.
If they add cameras to them, regardless of the implementation, I'm pretty sure that's not only against every gym policy, but may be an actual criminal offense in certain states.
So I guess what I am saying is: this could be an anti-feature for certain people, or get people into trouble for continuing a preexisting habit.
Is having the camera on illegal in and of itself, or only when you're actively recording?
You're essentially setting the scene where people cannot know whether they're being recorded while a camera is always pointed at them. That's a problem, not least because law enforcement will need to investigate whether the law was broken on each complaint.
"But internal processing!" isn't quite the slam dunk defense you seem to think it is. It won't work like that in the real world, people being recorded while changing won't care about that distinction.
Right, because there are very few plausible justifications for why someone would be aiming their phone in a changing room. The same doesn't apply to AirPods.
>You're essentially setting the scene where people cannot know whether they're being recorded while a camera is always pointed at them. That's a problem, not least because law enforcement will need to investigate whether the law was broken on each complaint.
Is there any indication that you'll be able to get a video feed from AirPods? If not, and you basically have to jailbreak them to do surreptitious recording, what makes them any more notable than all the other spy cameras you can buy today, which are more or less tolerated by the authorities?
>"But internal processing!" isn't quite the slam dunk defense you seem to think it is. It won't work like that in the real world, people being recorded while changing won't care about that distinction.
Do you think it's some outrageous breach of privacy that Face ID (and similar face-scanning technologies) is constantly on the lookout for faces, including in bathrooms?
It doesn't today because they don't have cameras on them, and it won't tomorrow if they do. People will definitely need to justify (maybe legally) why they're pointing the camera on their AirPods at people changing.
> Is there any indication that you'll be able to get a video feed from AirPods?
You're just repeating the same "internal processing" point, which I've already pointed out isn't a legal or practical difference in the real world.
You're not answering my question. Do you or do you not think that Face ID phones face legal obstacles because they also have cameras that randomly turn on for "internal processing"? If Face ID phones get a pass, why don't you think AirPods would get a pass?
So it's more of an issue in that case; and the laws I am talking about don't have an "IR camera" exception.
UI should be consistent; it allows users to build muscle memory. This "hide stuff until you're 20 cm away" approach is the antithesis of that (and of all good design in general).
That said, as someone who does pottery (messy hands), wears gloves/hats (stuff in the way), and has relatively poor fine motor control, I guess I welcome any solution that doesn't mean getting clay or cold air in my hair/ear.
The battery consumption and latency of the IR cameras will be interesting, though. Too sensitive, and you'll eat up your battery; not sensitive enough, and UX suffers.
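For a rough sense of that tradeoff, here's a back-of-envelope sketch in Python. Every number in it is an assumption for illustration (not an Apple spec): a duty-cycled sensor that wakes more often reacts faster but burns more average power.

```python
# Back-of-envelope sketch: all numbers are assumptions, not Apple specs.
# A gesture sensor that samples N times per second trades worst-case
# detection latency against average power draw.

def average_power_mw(sample_hz: float,
                     active_mw: float = 40.0,   # assumed draw while sampling
                     active_ms: float = 5.0,    # assumed time per sample
                     idle_mw: float = 0.5) -> float:
    """Average power of a duty-cycled sensor."""
    duty = min(1.0, sample_hz * active_ms / 1000.0)
    return duty * active_mw + (1.0 - duty) * idle_mw

for hz in (2, 10, 30):
    latency_ms = 1000.0 / hz  # worst case before a gesture is even sampled
    print(f"{hz:>2} Hz -> ~{average_power_mw(hz):.1f} mW avg, "
          f"up to {latency_ms:.0f} ms before the sensor notices anything")
```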
Meta showed that it was possible to do that from cameras mounted on glasses.
However, the power required to do that is quite high (30-60 mW to detect, more to do the pose extraction), so I suspect it's just hand recognition.
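Rough math on that figure (the battery capacity below is my assumption for illustration, not a published spec) shows why continuous detection would be painful:

```python
# How much runtime 30-60 mW of continuous detection would cost, assuming an
# earbud battery on the order of 0.2 Wh (an assumption, not an Apple spec).

battery_wh = 0.2
for detect_mw in (30, 60):
    hours = battery_wh / (detect_mw / 1000.0)
    print(f"{detect_mw} mW continuous -> battery gone in ~{hours:.1f} h "
          "from detection alone, before audio, ANC, or Bluetooth")
```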
Seems like a negative tradeoff.
https://japandaily.jp/why-you-cant-turn-off-the-camera-shutt...
> Japan’s requirement for an audible camera shutter sound isn’t just a quirky design decision — it’s a deliberate policy meant to prevent secret photography.
That depends on what "taking a photo or video" means. If it only covers making a recording, then it won't apply to AirPods. The same applies to Face ID, for instance; I doubt Japanese iPhones make a shutter sound every time you pull out your phone, even though it's obviously using the camera.
I suppose the "camera" could be as simple as an optical flow sensor [1] commonly used in mice and quadcopters, placed behind the black plastic so there would be no visible lens [2]. (A rough sketch of the idea is below the links.)
0. https://web.ece.ucsb.edu/~ymostofi/WiFiReadingThroughWall
1. https://ieeexplore.ieee.org/document/10164626
2. https://www.schneier.com/blog/archives/2012/07/camera-transp...
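Here's roughly what that could look like, purely hypothetically: an optical-flow sensor reports per-frame (dx, dy) motion, and a tiny classifier turns accumulated motion into swipe gestures. The interface and thresholds below are invented for illustration.

```python
# Hypothetical sketch: turning per-frame optical-flow deltas (dx, dy), as a
# mouse-style optical flow sensor reports them, into simple swipe gestures.
# The sensor interface and thresholds are invented for illustration.

from typing import Iterable, Optional

def classify_swipe(deltas: Iterable[tuple[float, float]],
                   threshold: float = 30.0) -> Optional[str]:
    """Accumulate motion over a short window and call it a swipe if it's big enough."""
    total_dx = sum(dx for dx, _ in deltas)
    total_dy = sum(dy for _, dy in deltas)
    if max(abs(total_dx), abs(total_dy)) < threshold:
        return None  # too little motion: no gesture
    if abs(total_dx) >= abs(total_dy):
        return "swipe_forward" if total_dx > 0 else "swipe_back"
    return "swipe_up" if total_dy > 0 else "swipe_down"

# e.g. a hand moving steadily across the sensor's field of view:
print(classify_swipe([(6.0, 0.5)] * 10))  # -> swipe_forward
```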
We don't want no fucking infra cameras for "better hand gestures and enhanced spatial audio experience".
https://www.macrumors.com/2024/06/30/new-airpods-to-feature-...
> The IR cameras can detect environmental image changes, facilitating a broader range of gestures to improve user interaction. For example, if a user watches a video using Apple Vision Pro and the new AirPods, and turns their head to look in a specific direction, the sound source in that direction can be "emphasized to enhance the spatial audio/computing experience."
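Purely as a guess at what "emphasized" might mean in practice, here's a minimal sketch: per-source gain falls off with the angle between your head yaw and the source direction. The function, angles, and falloff curve are all invented for illustration, not anything Apple has described.

```python
# Hypothetical sketch of "emphasize the source you're facing": gain is highest
# when head yaw points at a source and fades toward min_gain when facing away.
# All parameters are invented for illustration.

import math

def source_gain(head_yaw_deg: float, source_dir_deg: float,
                min_gain: float = 0.3) -> float:
    """Full gain facing the source, fading toward min_gain when facing away."""
    diff = abs((head_yaw_deg - source_dir_deg + 180) % 360 - 180)  # 0..180 deg
    focus = (1 + math.cos(math.radians(diff))) / 2                 # 1 facing, 0 behind
    return min_gain + (1 - min_gain) * focus

# Head turned 10 degrees right; sources at 0 (front) and 90 (right side):
print(f"front: {source_gain(10, 0):.2f}, side: {source_gain(10, 90):.2f}")
```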
I wonder if the signal could be integrated into AR glasses or headset to provide a wider FOV to the wearer.
Geez, if only the Apple Vision had some kind of gyroscope and accelerometer so it could detect head motion without relying on external hardware...
I had an APV for a while, controlling it with just gestures was sweet. If they're looking to bring those kinds of gestures to other Apple devices via AirPods (i.e. beyond just bringing more gestures to the AirPods), I'm intrigued.