
This statement has a number of issues.

First, you have to trust Apple that the indicator _really_ can't be disabled. You also have to trust that there isn't a vulnerability Apple is unaware of that could allow rolling the camera without the light coming on. This has happened in the past [0], and there are known Apple products that are vulnerable, yet the statement never mentions this, leading you to believe it's impossible.

Second, once the camera light is on, the data has already been captured. The light merely told you about it; it didn't prevent it. A plastic cover or a piece of tape does prevent it, even if your laptop's security is compromised.

Third, in a world where remote conferences are more and more common, a lot of software doesn't do a very good job of letting you know when it's about to enable your camera. You might click a link to an all-hands conference to listen in while you're changing, only to have the software helpfully enable the camera and broadcast you to the rest of the company. I believe in big conferences the organizer may sometimes control other people's cameras as well. You can easily imagine a scenario where the organizer misclicks and enables the camera for the wrong person instead of the scheduled presenter.

Camera covers solve all of those problems.

[0]: https://jscholarship.library.jhu.edu/handle/1774.2/36569



> First, you have to trust Apple that the indicator _really_ can't be disabled. You also have to trust that there isn't a vulnerability Apple is not aware about that could allow rolling the camera without the light coming on.

I have made this point several times throughout this thread, so I apologize for repeating myself:

Every laptop Apple has manufactured in the last ten years has an LED connected to the same circuit which powers up the camera. You cannot send power to the camera without also sending power to the LED, which will in turn cause the LED to light up. Unless the LED is broken, in which case you will know because it will never light up.

If you manage to find a vulnerability in this system, I don't think I even mind, because you've also broken physics and very possibly found a way to generate unlimited electricity forever.


> Every laptop Apple has manufactured in the last ten years has an LED connected to the same circuit which powers up the camera.

At the very least, I need a citation or an official statement, because clearly this has not always been the case [1]:

> We describe how to disable the LED on a class of Apple internal iSight webcams used in some versions of MacBook laptops and iMac desktops.

> [..] our investigation of the iSight revealed that it is designed around a microprocessor and a separate image sensor with an indicator LED sitting between them such that whenever the image sensor is transmitting images to the microcontroller, a hardware interlock illuminates the LED. We show how to reprogram the microcontroller with arbitrary, new firmware. This in turn enables us to reconfigure the image sensor, allowing us to bypass the hardware interlock and disable the LED.

> [..] iSight webcam [was] found in previous generation Apple products including the iMac G5 and early Intel-based iMacs, MacBooks, and MacBook Pros until roughly 2008

Whatever reason Apple had to design the camera system this way back in 2008 is probably still a valid reason (cost, hardware simplicity, spatial constraints, etc.). It means Apple and others have incentives to build camera systems that are easier to compromise. That's enough for me to worry.

[1] https://jscholarship.library.jhu.edu/bitstream/handle/1774.2...


I find it odd that you counter the 'in the last 10 years' statement with a finding from 2008.


"Whatever reason Apple had to design the camera system this way back in 2008, is probably still a valid reason"

reading comprehension is an important skill to develop


Skepticism is healthy.


So is trust. You can’t possibly audit the source of every piece of software that touches your life, even if all of it were open source. Hell, things like Heartbleed or Shellshock sat in OpenSSL / Bash for 5-10 years.


Trust isn't healthy, when you're worried about cybersecurity. The economics of trust work differently in real life than at web scale.


Heh. Heartbleed is my go-to example why we should be skeptical of the security of all software.


If you don't trust an open-source program due to the possibility of e.g. Heartbleed, then it's only reasonable to trust closed-source software (e.g. the majority of macOS, including in all likelihood the parts of it controlling the camera) even less.


Then we agree.


"If you manage to find a vulnerability in this system, I don't think I even mind, "

Taking a quick photo doesn't illuminate the LED for long enough for a human to reliably notice.

"because you've also broken physics "

Or merely recognized the role of the human element. A shame, Apple was once good at this.


What does it look like if an app takes a photo once a second, powering the camera on and off in between?


What if only the camera is allowed to shut itself down, and it never responds to such requests if they happen within the first second after it has been powered on?


That requires software. Software can be reprogrammed.


Newer Macs also require the light to stay on for at least a few seconds when the camera activates.


That sounds like the LED is controlled by logic, instead of being in the power loop.


You could connect the LED to a capacitor, which would be charged while the camera is connected to power and would discharge afterwards, keeping the LED lit for a few more seconds after power is disconnected.

But yeah, I wouldn't trust an LED, because I can't reverse engineer the circuit that's in my particular device.
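For a sense of scale, here's a rough sketch of how long such a capacitor could keep an LED lit after power is cut. All component values are made up for illustration; nobody in this thread has the real schematics:

```python
import math

# Hypothetical values - not Apple's actual circuit.
V0 = 5.0      # supply voltage while the camera is powered (volts)
V_off = 2.0   # LED forward voltage; below this the LED is effectively dark
R = 2000.0    # assumed effective discharge resistance, LED + resistor (ohms)
C = 1000e-6   # assumed capacitance (farads)

# Standard RC discharge: V(t) = V0 * exp(-t / (R*C)).
# Solve for the time at which the capacitor voltage drops to V_off.
t_dark = -R * C * math.log(V_off / V0)
print(f"LED stays lit for roughly {t_dark:.2f} s after power-off")
```

With these numbers the LED lingers for just under two seconds; the real hold time would depend entirely on the actual component values.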


Who enforces that requirement?


That is exactly what OP meant: you need to

A) trust that in your actual laptop in front of you, this mechanism is correctly implemented.

B) trust that you are not in the vulnerable window where your LED has broken and you have not noticed yet (especially if you rarely use your camera).


Regarding B: I'm wondering whether the camera or the LED typically breaks first, and with what relative probability.


It is extremely unlikely (in human terms, not possible) that an LED would break in such a way that it still conducts power but does not emit light.

Of course, the LED could be installed in parallel on the circuit, but I read the OP's statement as saying that is not the case.


"Let's devise a way to purposefully burn out the LED in such a way that current can still go through." We have to keep in mind that not everything has to be an accident :) I have no idea how likely that is to be possible, though; you've got a point.


Why did you read it as not being the case? I read the opposite: that if the LED broke, you'd notice, because your camera would be on and the light would be off.

Not that the camera never turns on.


I would bet on the LED being in parallel - otherwise sending more current through it will burn more power, generate more heat, and likely cause it to wear out faster.

I think the point though is that it's not software controlled in any way: powering on the camera lights up the LED, and there's no way to bypass that with only software. Or at least that's the claim.
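A quick sanity check of the parent's wear argument, using assumed, datasheet-ballpark figures rather than anything measured from a Mac: a strictly series LED would have to carry the camera module's entire supply current.

```python
# Assumed figures - typical ballpark, not measured from any Apple hardware.
camera_current_ma = 180.0  # plausible draw of a webcam module (mA)
led_rated_ma = 20.0        # common continuous rating of an indicator LED (mA)

# Ratio of the current a series LED would carry to what it is rated for.
overload = camera_current_ma / led_rated_ma
print(f"A series LED would run at {overload:.0f}x its rated current")
```

Under these assumptions a purely series hookup would run the LED far past its rating, which is why a parallel tap (or a driver circuit) is the more plausible design.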


The problem is that the average user has no way to verify this, and the light doesn't prevent the camera from turning on anyway; it merely notifies you that the camera is on.

A manual, physical barrier, especially an aftermarket one, solves those issues. Personally, I use electrical tape.


> ”the light doesn't prevent the camera from turning on, it merely notifies you that the camera is on.”

No, but the OS prevents the camera from turning on without permission from the user.

(There have been bugs/compromises to this in the past, but at the browser level - you still had to give camera permission to the browser)

Besides, if you’ve got some compromised or surreptitious software on your MacBook trying to secretly take photos, you probably have much bigger security problems to worry about than just what it can see through the camera.


Without even worrying about malware: several times I've clicked on some Skype-like program while trying to make a voice call, only to find it transmitting ugly video of me.


Valid point. I have to admit I'm a bit "video call vain" as well. I like to make sure my hair doesn't look too crazy and my room doesn't look too messy before getting on zoom/skype/etc. But one thing the last few months has taught me is that many of my friends/colleagues/family really don't care about these things!


Yeah, I'm not even worried about malware, I'm more worried about joining a call with video on by mistake using a legit program.

I mean, the thing is that I never want to use video on calls, basically. Waste of bandwidth and no worry about broadcasting the wrong thing by mistake.


You’re focused on the wrong part of the chain here. As the camera system is only as strong as its weakest link, if Apple indeed made such a circuit connected to the LED (and I fully trust you on that), then the weakest link is elsewhere: company-provided laptops are often altered prior to being given to an employee. I know of companies who install software to track messages etc. What’s to say the same companies don’t alter the circuit board to modify the LED behavior?

There is a risk/reward/effort trade-off to look at: putting on a small piece of tape is low risk / low effort / high reward (if your company actually alters laptops).


If they take the effort to alter the circuit, they might as well place a camera somewhere else, listen to all your network traffic, install a (hardware) keylogger, and what not.

I think you are taking this too far.

People who fear to be tracked buy a laptop in a random store and don't use a provided one.


What about company laptops, where you're much more likely to be targeted based on your job, not your personality?

Snowden already showed us the depths that governments will go to to compromise their targets with hardware swaps and worse. And that was already 7 years ago. They're even better at it now.


> What’s to say the same companies don’t alter the circuit board

The realities of modifying hardware. Is it possible? Sure. Is a company going to do it routinely at scale? Highly unlikely, because unlike software modifications, this would be pretty expensive. Are you aware of any companies that routinely do _hardware_ modifications on employee MacBooks?


Not aware of any that do that for laptops, but I personally know two that do it for phones. They have a collection of devices (phones) ready to go, so it’s not as unscalable as I initially thought, because they re-use the devices.

So I’m assuming that if some do it for phones, there must be some doing it for laptops.

Again, it’s all about probabilities. 1/ What’s the likelihood of the company doing that? Close to none. 2/ What would be the severity of the issue if they were doing it to me? Very high. 3/ What’s the effort required to prevent it? Very little.

This ratio ultimately tells us what to do.


Why do I even have to trust Apple and physics here? Why can't Apple just provide a physical lid for the camera to disable it? Why even take that chance?


Because a physical lid is ugly and inelegant.


Dell business laptops have very elegant camera shutters with a barely visible slide.

But obviously this is beyond Apple.


So do their flagship Precision series laptops. Dell knows how to cater to their professional user base.


I happen to have (collecting dust somewhere...) one of those old Firewire webcams Apple made. It has a physical shutter you can open and close by rotating the front. It's about as beautiful and elegant as it gets.

If Apple still cared about beautiful and elegant products, it could surely find a way to incorporate a miniaturized version of that in a MacBook.


My Thinkpad comes with a physical lid that works perfectly and is in no way inelegant.


My ThinkPad comes with a butt-ugly lid which works perfectly, and I don't care about the ugliness.


Tape it is then.


I trust you absolutely and totally on this. Why wouldn't I?

But if you had your entire net worth riding on it, would you trust yourself to be infallibly correct, or would you trust something along the lines of a post-it note to be completely sure? You know, if your life depended on it, on every single possible model of Apple laptop, in all circumstances imaginable? (Do we include your laptop being intercepted and altered by a hostile agent? Because we know that happens too...)


Devil's advocate: If our threat model includes your laptop being tampered with by an evil maid competent enough to imperceptibly modify the camera LED circuit, couldn't they just install a separate camera elsewhere (maybe in one of the speakers)?


Or they put you to sleep in a way where you have no memory of it and place the bug inside your body. You can go on like that forever. So should you leave your doors unlocked, because you can't ever be 'safe'? Obviously not.

There's a scale from dead easy to more difficult to very difficult. The easier it is to get you, the bigger the problem. Cheap and easy to prevent - well, why wouldn't you? It's asymmetric.

Wouldn't you feel hilariously stupid if someone modified your camera circuit while intercepting your laptop, and you hadn't actually stuck a post-it over it to thwart their dastardly plans?

The point here is that making the kinds of claims made here about LEDs and camera circuits is really, really easy when telling other people what is not a risk. When you carry that risk yourself - i.e., across all possible models and other threat vectors - suddenly you should not be so sure anymore. A physical cover is better, easier, cheaper, and basically infallible for what it is advertised to do. Asymmetric payoffs are worth noting. A genuine, plausible risk scenario is all you need to take a /trivial/ mitigation step.

Apple making trivial mitigation steps harder is really, really, really stupid. In fact, beyond merely stupid, it's unwittingly and incompetently user-hostile. (Unless you think their design process has been infiltrated by the NSA or something, which I guess is at least possible, but I think it unlikely in the face of utterly incompetent idiocy - which Apple does display from time to time.)


Please, share the circuit for the LED. Taking a picture takes 4 ms - a human eye would not even register the LED turning on.

> I don't think I even mind, because you've also broken physics and very possibly found a way to generate unlimited electricity forever.

Unless there is an option to send a higher voltage to the camera (control the VRM) and increase the current through the LED to wear it out quickly, for instance. The statement is incredibly condescending, especially given there's no link to actual schematics.
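The 4 ms timing point can be made concrete. Both figures below are rough assumptions: a ~4 ms single-frame capture (per the comment above) and the commonly cited ~13 ms ballpark below which a brief flash can fail to register consciously.

```python
# Rough, assumed numbers - real exposure times and perception thresholds vary.
capture_ms = 4.0                # time to grab one frame, per the parent comment
perception_threshold_ms = 13.0  # approximate minimum for a flash to register

margin = perception_threshold_ms / capture_ms
print(f"A single capture is ~{margin:.2f}x shorter than the perception threshold")
print(capture_ms < perception_threshold_ms)  # the blink could go unnoticed
```

This is exactly why a hardware-enforced minimum on-time for the LED matters: it stretches a millisecond blink into something a human can actually see.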


Can one power up the camera for only a few milliseconds? Enough to take a still photo, without the LED getting bright enough for a human to notice?


In the past, this was exactly how it was done. Here's an article on the FBI doing it with an Apple webcam six years ago [0].

It should be noted that this was after Apple went to the effort of adding a hardware delay to try to force the LED to turn on first - but it was still worked around.

[0] https://www.smh.com.au/technology/researchers-activate-apple...


It doesn't have to be milliseconds. Most people don't sit and stare at the camera all day. I have multiple displays, and the camera turning on for a second while I am looking at another display - or even at the laptop's own display, but concentrating on something on the screen, especially the lower part - would slip my attention very easily. Or I might notice something off in my peripheral vision, but that is very inexact, and by the time I turn my head to bring it into the part of my visual field with good resolution, it could already be gone. Of course, the cover has none of these problems. If it's covered, it's covered.


No, because the LED doesn't need nearly as much power as the camera. And it takes the camera more than a few milliseconds to start working.


> much power as the camera

You mean current, right? And it'd have a current-limiting circuit - at least a resistor, but likely a whole soft-start circuit? Unless you share the schematics, it's an empty argument.


This used to be a thing, where you’d flash the LED briefly and hope the user didn’t notice. But new Macs prevent this by having a minimum duration the light will remain on.


FWIW, I tried the command-line utility isightcapture on my 2019 MacBook: the LED turned on for 4-5 seconds and I got a dialog asking if I wanted to allow access to the camera. So this seems to be true. I still have a camera cover, though.


Hm, seems like the LED has some logic to it and isn't just connected directly to the camera....


> You cannot send power to the camera without also sending power to the LED, which will in turn cause the LED to light up. Unless the LED is broken, in which case you will know because it will never light up.

You make multiple assumptions here

1) You assume that during the time between the LED breaking and the user noticing, there was not a single attack or blunder that caused the camera to turn on and record/capture something unintended.

2) You assume that the LED breaks deterministically. The LED can break randomly. Maybe it lights up when nothing is being recorded resulting in a false positive. The user has no way of differentiating between a false positive and a true positive which can result in unintended captures.

3) Similarly, the LED can break in a way where it sometimes doesn't light up even though power is being sent to it while the camera is on, resulting in a false negative. Again, the user has no way of differentiating between a false negative and a true negative.


> If you manage to find a vulnerability in this system, I don't think I even mind, because you've also broken physics and very possibly found a way to generate unlimited electricity forever

Or your LED might be broken.


If the LED is broken, so is the camera.


The LED circuit is likely wired in parallel with the camera. It could fail and the camera would still get power.


Regardless of how foolproof the design is, I don’t have the ability to assess it to ensure it’s doing what I’ve been told it’s doing.

I can definitely assess a shutter.


> Every laptop Apple has manufactured in the last ten years has an LED connected to the same circuit which powers up the camera. You cannot send power to the camera without also sending power to the LED, which will in turn cause the LED to light up. Unless the LED is broken, in which case you will know because it will never light up.

In order to accept this argument, I need to trust that you, an internet rando I know nothing about, are telling the truth AND that it will remain true for all future Apple models. No matter how confident you are in your assessment of current Apple hardware, you can't in good faith argue that they will not change course in the future for whatever reason.

Also, again, they already messed this up once in the past. It's not hard to imagine that they will do it again some time in the future, or are already doing so.


Yet this is only one of the points mentioned above...


This is admittedly a bit of a movie plot threat, but could an evil maid attack rewire this, then later malware takes advantage of the rewiring?

IMHO layers of security are good.

On my end I worry about the risks of constantly just leaving my Mac, which has filevault enabled, simply protected by a screensaver. Is that less secure than if I put it to sleep? And presumably turning it off completely is safest?

How do I make informed choices about how much "locking" to do when I step away?

These are all things I think about reading an article like this, and I'd love to hear other's thoughts.


Yet Dell and some other laptop manufacturers have started to include a physical privacy slider right in the hardware. Considering Apple's stance on privacy, I hope they consider this at some point.


Why not then put it on the iPhone too? Considering Apple's stance on aesthetics and non-moving parts, that seems unlikely.


> Every laptop Apple has manufactured in the last ten years has an LED connected to the same circuit which powers up the camera

It would be very helpful if you could provide some evidence of that


You already found one yourself: if the LED is broken, you will notice at some point when it is too late.


I think the third point is an especially strong argument for having the cover. It gives you a second, physical layer of security rather than just a button you might automatically click away, and it gives you the opportunity to join the call first and then decide whether to actually share (e.g. if the setting is more formal than expected or if the other side isn't sharing).

Relying on the light coming on after the fact is a much weaker protection: the user may have opted in to always letting an app use the camera, and among the 15 or so different UIs that conference apps have, the user might miss that one actually turns the camera on unless a button is clicked.

Now, if Apple were to release an OS-level protection that automatically pulls up a screen showing what is shared from the view of the camera and asks for approval, that would improve the situation.


> Now, if Apple were to release an OS-level protection that automatically pulls up a screen showing what is shared from the view of the camera and asks for approval, that would improve the situation.

That doesn’t help for the attack that many people are using here, which is if the software on your machine is compromised.


I wasn’t referring to that concern; that’s a different discussion - in a corporate setting, policies might help avoid it by limiting what can be installed.

Where they don’t help is accidentally activating the camera and only realizing it afterwards. That’s a UX issue with so many different video-conferencing technologies that all behave slightly differently.


Yes, I think that’s a legitimate concern. I can’t say I haven’t started videoconferencing software with a thumb over the camera ;)


I had to switch on my camera for a video call for the first time yesterday. (I can usually get around it and do audio only)

Removed the piece of electrical tape I had over the camera to find the image completely blurred by the glue residue. Good to know in case it ever falls off and I don't have tape to replace it immediately.


I use gaffer's tape. Sufficiently (for my standards) unlikely to fall off, and the better quality stuff is designed not to leave residue.


Many high-end camera stores sell Gaffers tape. It is used extensively in the film industry exactly because it almost always leaves no residue. And what it does leave (in my experience) comes off very easily.


This is why I use a smaller piece of normal clear tape UNDER the outer layer of tape to protect the lens from the glue.


A small piece of paper works well also.


Just to illustrate the third point:

Our company's video conferencing software has multiple "modes" for a conference call, which the moderator can configure. Hardly anyone ever changed the mode, but at one point, while trying to configure something unrelated, I ended up switching modes in the middle of a conference call. To my horror, the software immediately turned on everyone's camera.

Luckily, the strain of streaming 40 video feeds to all 40 participants pretty much locked up the call, but for a brief moment I was able to enable roughly 40 cameras belonging to people who were just sitting in their houses, who knows how dressed or what was going on behind them (I tried not to look).

I'm pretty sure we can apply Hanlon's razor here and assume it was just an innocent bug: it's not hard to see how a programmer might have overlooked the default settings applied on a mode change during some completely unrelated refactor. But whatever the case, as long as video conferencing remains lucrative, vendors will continue to pack features into the software, and as long as they keep adding features, they will continue to create additional edge cases that trigger incidents like this.
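A back-of-the-envelope sketch of why 40 simultaneous cameras can lock up a call: in a naive peer-to-peer mesh, every participant sends their feed to every other participant, so traffic grows quadratically. The per-stream bitrate here is an assumption; real conferencing software adapts quality dynamically.

```python
# Assumed bitrate - illustrative only.
participants = 40
stream_mbps = 1.5  # one video feed, Mbps

# In a full mesh, each person uploads to (n - 1) peers
# and downloads (n - 1) feeds.
upload_per_person = (participants - 1) * stream_mbps
total_streams = participants * (participants - 1)

print(f"Upload per person: {upload_per_person:.1f} Mbps")
print(f"Streams in flight: {total_streams}")
```

A server that forwards media (SFU-style) cuts each person's upload back to a single stream, which is one reason vendors centralize media routing rather than meshing.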


> First, you have to trust Apple that the indicator _really_ can't be disabled.

I'm not disagreeing with your other points, nor am I saying this isn't the case, nor am I a fan of any Apple products... However, designing circuits to implement such functionality is quite common [0].

[0]: https://en.m.wikipedia.org/wiki/Series_and_parallel_circuits


If the circuit only powers on when the camera is on, how does it have always-on True Tone? Is it a different sensor, or is Apple just full of shit?


True Tone is a different sensor (which isn't a camera).


I wonder if the True Tone sensor (which is most likely similar to a 1-pixel "camera") could be used to determine whether the user is near the laptop, so the camera is only turned on when you think they're not paying attention.


Different sensor.


Fourth, lights are not accessible to blind persons, but switches or covers potentially can be.


This is one of the most solid points. Remember Tim Apple saying, "When we work on making our devices accessible by the blind, I don't consider the bloody ROI"?


Not to mention the CIA had an exploit for Samsung TVs that recorded audio even when the TV seemed to be turned off [1]. It is better to have peace of mind by blocking the camera physically.

[1] https://www.zdnet.com/article/how-cia-mi5-hacked-your-smart-...


Well yes but those are Samsung TVs.


> Kenneth White, a security researcher and cryptographer, told The Intercept that smart TVs are a "historically pretty easy target," and that there is "zero chance" that the CIA targeted only Samsung.


I agree; the official statement is comparing apples and pears. An LED gives you no control.

The real alternative to a cover would be a physical on/off switch next to the camera that physically connects/disconnects the camera behind the hood.


I also thought of another point recently. Many companies hand out Apple laptops with pre-installed software to their employees. It's not entirely unrealistic to imagine that some of that pre-installed software is intended to spy on the employees - to make sure they don't use company hardware for anything weird - and occasionally takes screenshots and camera shots.

In this kind of situation, you'll see the LED and know what's happening, but you'd much prefer to have had a physical cover on your webcam.


Any point like "but the LED is directly fed from the power line" is moot.

(1) With a mechanical shutter, the state of blocking can be trivially and reliably inspected. With electrical control, this is not the case.

(2) A mechanical shutter works by making it physically impossible for the camera to see anything, powered or not. With any electrical control, this is not the case.

So, either you use a mechanical shutter, or all bets are off.


I don’t understand the argument that you trust Apple enough to use their OS but not enough to believe their green light.


You have to use some operating system at the end of the day, and vulnerabilities can impact any of them. You might as well trust Apple as much as the next company, since all of them are liable for millions of lines of code, and it's likely they all presently have undiscovered vulnerabilities.


Why trust a company at all when you don't have to? Linux, the BSDs, etc.


They probably have a lot of vulnerabilities as well.


If someone isn't getting paid, what incentive do they have to patch vulnerabilities in a timely manner?


Not sure what you are asking. As far as I can tell, hackers hack like fish swim. We just do, for a multitude of reasons.

For many people, me included, being given a bunch of money to do something - if I bit, not that I would nowadays - could actually lead to a loss of motivation.

As for vulnerabilities, many open source projects respond to them much more quickly than industry does - pride in their work being one motivation.


Pride can be fleeting. There are certainly advantages and disadvantages to depending on security patches from a paid organization versus from volunteers. Regardless of which OS you use, you will need one that regularly patches its vulnerabilities.


So the base Apple builds their OS on is a huge security vulnerability? Or does it suddenly become secure when apple is involved?


They trust the physics, simplicity and verifiability of a cover more than they trust the invulnerability of Apple's black-box software & hardware implementation.


There are ways to get around other forms of trust - for example, by being pseudonymous. There is no way around video, unless you happen to wear a mask.


I used to work at Discord. There's a reason virtually every Trust and Safety (the division that handles online harassment cases) Associate has a physical covering over their webcam.

Regardless of how bad actors are accessing these photos, it's eminently obvious that people are having webcam photos taken of them without their knowledge.


The article also goes on to say users can use something thin like paper as an alternative, or simply remove it before closing the lid.


> First, you have to trust Apple that the indicator _really_ can't be disabled.

Or anyone with a basic knowledge of electronics who could verify it.


On the exact laptop you are using right this minute? Multiple versions of each Mac model exist, making general advice useless.


For me, the second argument is already strong enough to settle the discussion.



