> As recently seen with Intel, there seems to be a trend where developers will do this pointless client-side decryption. When the client has the key, it’s strange that anyone would think that would be secure.
I live and work in India. Yesterday, as part of a VAPT (vulnerability assessment and penetration testing) audit by a third-party auditor, the auditors "recommended" that we do exactly this. I wonder if this directive comes from some outdated cyber-security guidelines that get passed around here? Not entirely sure.
When I asked them how I'd deliver the secret to the client for client-side encryption/decryption without that key being accessible to anyone who could already MITM our HTTPS-only API calls, the guy basically couldn't understand the question and fumbled around in Burp Suite, pointing exasperatedly at how he could see the JSON body in POST requests.
Most of the security people we've met here are, from what I can tell, really clueless. Internally, we call these guys "burp babies" (worse than "script kiddies"): they just seem to know how to follow cookie-cutter instructions for using Burp Suite.
I am a pretty cookie-cutter developer myself. We just build glorified CRUD apps, and I have tried to convince the engineering director hundreds of times that "there is no use in encrypting and decrypting localStorage with a key that's sitting right inside the client code." Yet they keep insisting on it in the code-quality checklist.
My guess: he’s avoiding political risk. If something goes wrong, it’s better to say “it was encrypted, but they got the keys” than to defend why the data wasn’t encrypted at all.
It’s semantics in terms of the actual difference to an attacker, but a world of difference when explaining to executives.
I guess they think it provides some kind of security by obscurity... maybe it wards off lazy beginner hackers.