Hacker News | new | past | comments | ask | show | jobs | submit | login

Sometimes there is an engineering solution to a social problem. Rather than blurring faces, software can now isolate them, downscale them, and then upscale them so that they look like real people but don't actually exist. The photo doesn't look censored, but the consent issue goes away. Then you just have to identify the faces that you want to keep. The same could be done with the brands and logos that randomly litter photos and create other legal issues.
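A minimal sketch of the "downscale then upscale" step, assuming the face region has already been located by a detector. This version only averages pixels in small blocks, destroying identifying detail (a naive upscale); producing a plausible-but-nonexistent face, as described above, would require a generative model on top. The function name and image representation here are illustrative, not from any particular library:

```python
def anonymize_region(img, x, y, w, h, block=4):
    """Downscale-then-upscale a rectangular region of a grayscale image,
    given as a list of rows of ints 0-255. Each block x block cell is
    replaced by its average value, discarding identifying detail.
    Mutates and returns img."""
    for by in range(y, y + h, block):
        for bx in range(x, x + w, block):
            cells = [(r, c)
                     for r in range(by, min(by + block, y + h))
                     for c in range(bx, min(bx + block, x + w))]
            avg = sum(img[r][c] for r, c in cells) // len(cells)
            for r, c in cells:   # naive "upscale": paint the average back
                img[r][c] = avg
    return img
```

In a real pipeline, the averaged region would instead be fed to a face-synthesis model so the output looks photographic rather than pixelated.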


>> The photo doesn't look censored but the consent issue goes away.

No, it doesn't. What post-processing you do is up to you, but that doesn't change how people feel about you taking the picture in the first place. They have no idea whether you're going to blur them, enhance them, throw the picture away, or what. Like the author said, it's legal and you don't need consent. The real trick is to not be creepy about it, but that's not universally defined.

I was at the marina one day and saw a dog chasing a stick in the water. Lots of splashing about. I stopped and took a picture with my big honking camera. I was careful to get just the dog and not the woman who was throwing the stick, but neither she nor her husband could have known exactly what I got a shot of. Nobody said a word, and I moved on thinking how odd it could seem to do that. A friendly comment about what a great subject the dog is might have helped calm any discomfort they may have had, but I didn't get the impression it was an issue. Was it to them? I dunno.


I know that's the direction we're heading, but do you have an example that's available now and realistic?


I don't think that's actually a solution, just a way of making the same underlying problem harder to spot.

In general, I am extremely distrustful of this as an idea:

> Sometimes there is an engineering solution to a social problem.



