Facebook would like to listen to what you listen to or watch

Last week, Facebook announced a new feature for status updates. If switched on, it identifies the song or TV program playing nearby through the microphone of the mobile device. It then proposes to share this information with your community (along with a free 30-second sample of the song or a synopsis of the TV program).


This is a new example of the use of audio fingerprinting. By default, the feature is switched off. Furthermore, the user decides when and with whom to share the information. Thus, in theory, there are no associated privacy issues. The user remains in control.

Facebook claims that it will not share the information if you do not want it to. Unfortunately, Facebook does not specify whether it will still collect the information for its own profiling even if the user declines to share it with friends.

As I am paranoid, and as there is no free lunch… Personally, I do not care, as I do not have a Facebook account. Will you use it?

CataCrypt 2014

In the wake of the catastrophic Heartbleed bug, this new IEEE workshop will be interesting. CATACRYPT stands for “catastrophic events related to cryptography and security with their possible solutions.”

The main point is: many cryptographic protocols rely on the security of a single cryptographic algorithm (e.g., RSA), and we do not know the exact security of RSA (Ron Rivest included). What if somebody finds a clever and fast factoring algorithm? It is indeed only a hypothesis, but we know of several instances of possible progress. A new fast algorithm is a possible catastrophe if not handled properly. And there are other problems with hash functions, elliptic curves, and so on. Think also about the recent Heartbleed bug (April 2014, see http://en.wikipedia.org/wiki/Heartbleed): the discovery came very late, and we were close to a catastrophic situation.

So we are thinking about a regular workshop, named CATACRYPT, about these possible problems and their solutions. It includes problems with cryptographic algorithms, protocols, PKI, DRM, TLS/SSL, smart cards, RSA dongles, MIFARE, and so on. Quantum computing, resilience, and agility are also on the program.

The birth of CATACRYPT is not opportunistic. Its founder, Jean-Jacques Quisquater, had launched the idea last year. Its announcement following Heartbleed is pure coincidence.

The paper submission deadline is 2 June 2014.   Hurry up…

The conference’s site is http://catacrypt.net/

Snowcrash, Snowden, Snow on trust

This Prezi presentation describes how perceived trust in the Internet has eroded over time, with a clear acceleration last year. This is a personal (provocative) vision. The companies cited are just (non-exhaustive) examples, each representing a category (a type of flagship).

[prezi id="http://prezi.com/9jxg2wbcdm4z/snow-crash-et-al/" width=670]

Apple’s security vulnerability

On February 21, Apple announced that all recent versions of its operating systems (since 2012) had a critical vulnerability in their SSL implementation. The vulnerability was due to a coding bug in the SSL certificate verification routine. The problematic piece of code follows.

static OSStatus
SSLVerifySignedServerKeyExchange(SSLContext *ctx, bool isRsa, SSLBuffer signedParams,
                                 uint8_t *signature, UInt16 signatureLen)
{
    OSStatus err;

    if ((err = SSLHashSHA1.update(&hashCtx, &serverRandom)) != 0)
        goto fail;
    if ((err = SSLHashSHA1.update(&hashCtx, &signedParams)) != 0)
        goto fail;
        goto fail;
    if ((err = SSLHashSHA1.final(&hashCtx, &hashOut)) != 0)
        goto fail;

fail:
    SSLFreeBuffer(&signedHashes);
    SSLFreeBuffer(&hashCtx);
    return err;
}

The second, unguarded “goto fail” should not be in the source code. Because no if controls it, it is equivalent to “if condition then goto fail else goto fail”: in all cases, execution jumps to fail, with err still holding the success value of the previous check. It bypasses all subsequent tests of the SSL exchange, including the final signature verification. Thus, an attacker could easily use a forged certificate that would be accepted as valid. She could then mount a man-in-the-middle attack.

This code is part of Apple’s public source code. The flaw points to inadequate programming practices. Usually, we expect at least two things in software engineering:

  1. Unit tests: they should have covered all the branches of the if conditions and would have detected this error.
  2. Non-regression tests: if the error was introduced later (the SSL code has existed for a long time), they should have spotted it.

These practices do not seem to be in place. This is a major issue for code that handles security. Security requires the best coding practices. Bugs can easily turn into security vulnerabilities.
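To illustrate why simple branch-coverage tests would have caught the bug, here is a self-contained toy sketch (this is not Apple’s actual code; the function and helper names are invented for illustration). The buggy variant reproduces the unguarded duplicated goto; the fixed variant uses mandatory braces, a coding rule that makes such a stray statement either harmless or immediately visible.

```c
#include <stdbool.h>

/* Toy stand-in for one verification step: returns 0 on success. */
static int check_step(bool ok) { return ok ? 0 : -1; }

/* Reproduces the bug pattern: the second "goto fail" is not guarded
 * by any if, so it always executes while err still holds 0 (success).
 * The final signature check is never reached. */
static int verify_buggy(bool signature_ok) {
    int err;
    if ((err = check_step(true)) != 0)
        goto fail;
        goto fail;                              /* always taken, err == 0 */
    if ((err = check_step(signature_ok)) != 0)  /* unreachable */
        goto fail;
fail:
    return err;  /* 0 even when the signature is bad */
}

/* Same logic with mandatory braces: the final check is reached. */
static int verify_fixed(bool signature_ok) {
    int err;
    if ((err = check_step(true)) != 0) {
        goto fail;
    }
    if ((err = check_step(signature_ok)) != 0) {
        goto fail;
    }
fail:
    return err;
}
```

A unit test asserting that verification with a bad signature returns an error fails immediately against the buggy version: exactly the branch coverage that was missing.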

 

Amid the buzz, I have even seen conspiracy theories claiming that the bug was introduced by the NSA. I would expect the NSA to use stealthier techniques than such an obvious back door. Furthermore, if I were a bad guy who wanted to introduce a back door, I would design one that only I could exploit. This one is available to everybody.

Lesson: secure software requires the best software practices. And a derivative of law 6: the programmer is the weakest link.

Bring Your Own Cloud

In 2013, the Cloud Security Alliance released “The Notorious Nine” threats for cloud computing. A few months later, I have the feeling that the most important threat is missing: “Bring Your Own Cloud” (BYOC).

BYOC is when an employee uses a cloud-based service for business purposes without the blessing of his company. This clearly puts the company at risk. The employee may bypass all the company’s security policies, as well as the fences the company put up to protect its IP and infrastructure.

BYOC is easy to do, and unfortunately it is awfully convenient.

  • You just need to enroll in a free SaaS service to use it immediately. It is sometimes faster than asking for the same service from the IT team. How many of your employees have opened an account at Dropbox, Box, GitHub, or some other cloud sharing service? How much of your sensitive information is already out in the cloud? The employee will most probably not check whether the system is secure. The default settings are not necessarily the ones you would choose. Of course, the employee will not have read the SLA.
  • You just need to use the company credit card to open an account with an IaaS or PaaS provider. This is clearly faster than asking the IT team to install a bunch of servers in the DMZ. But how secure will those servers be?

The fast and free/cheap enrollment of cloud services makes them extremely attractive to employees. And employees do not act maliciously; they will always have strong rationales for their actions.

But it can easily become a nightmare for the company when things go wrong, especially if the employee registered with his or her personal email rather than the company’s email. In that case, the company will have a hard time taking control of these accounts.

What can we do? The cloud is inevitable, so we must anticipate the movement. A few actions:

  • Provide a company-blessed solution in the cloud for the types of services your employees will need. This solution can be fine-tuned to meet the security requirements you expect. The account will be in the name of the company, and thus manageable. Premium services often offer better security features, such as authentication against your Active Directory, logging, metering…
  • Update your security policy to make it mandatory to use only the company-blessed solutions.
  • Educate your employees so that they are aware of the risks of BYOC.
  • Listen to their needs and offer an attractive list of company-blessed services.
  • Make it convenient to enroll in the company-blessed services.

 

Do you share this concern? What would you recommend?

A graphical password solution: PixelPin

Graphical passwords are an alternative to the usual textual passwords. They use an image as the main support, and image-based interactions, such as pointing to positions in the picture, as the entry mode. They can be convenient on touch screens, are more difficult for robots mimicking human behavior, and are claimed to offer better memory resilience.

Since the early 1990s, the literature in this field has been rather extensive. Technicolor has published several papers on the topic (search for Maetz and Eluard). But we rarely see a product that implements such a solution.

The UK-based company PixelPin offers such a solution. It is based on Blonder’s seminal patent (5,559,961). When registering, you select an image as a support and four points in the image in a given order. When answering the challenge, you have to select the same four points in the initial order. To limit the risk of shoulder surfing, the required positioning precision is rather fine (at least on a computer). After five failed attempts, the account is locked for 15 minutes. Resetting sends a reset token to the email address used to register.
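As a rough sketch of how such a challenge could be checked (all names, coordinates, and the tolerance value here are my own assumptions, not PixelPin’s implementation; a real system should also quantize the points and compare salted hashes rather than store raw coordinates):

```c
#include <stdbool.h>

typedef struct { double x, y; } Point;

/* Accept the challenge only if each click lands within `tol` pixels of
 * the corresponding registered point, in the registered order. */
static bool verify_click_sequence(const Point *registered,
                                  const Point *clicks,
                                  int n, double tol) {
    for (int i = 0; i < n; i++) {
        double dx = clicks[i].x - registered[i].x;
        double dy = clicks[i].y - registered[i].y;
        if (dx * dx + dy * dy > tol * tol)
            return false;  /* wrong position, or right points in wrong order */
    }
    return true;
}
```

A tighter tolerance gives more distinguishable positions (a larger key space) but hurts usability, which is the trade-off between shoulder-surfing resistance and positioning precision mentioned above.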

To increase memory resilience and to ease positioning, you should select a picture with clearly identifiable salient points, else you will quickly be locked out. Of course, using too-obvious salient points reduces the space of “keys” an attacker has to explore.

The main issue is the network effect such a solution needs. It will be effective only if the sites using it are common and often visited, else your memory will fade. Unfortunately, I did not find many sites using PixelPin. The startup launched at the beginning of last year.

The NSA spies on us: what a surprise!

I will start this new year (for which I wish you all the best) with some ranting. Since the Snowden story started, I have never commented. Now I will, a little, as I am starting to be upset by all this hypocrisy. Snowden shed some light on the behavior and skillset of the NSA. This is interesting. But what is not acceptable is that the media seem to be surprised. WE HAVE KNOWN IT FOR YEARS.

 

The NSA spies on our personal electronic communications! We have known it for years. Echelon was known in the 90s. The new systems are just a natural evolution toward new communication means and enhanced computing capacities. It was even known that the scope was larger than military/political actions. The NSA published patents on the semantic analysis of natural speech; the purpose was obvious. I remember an initiative that asked people to generate random emails containing gibberish together with some alleged trigger keywords (such as terrorism, NSA, …) that were supposed to attract the NSA’s scrutiny. The aim was to try to flood the system.

 

The NSA is studying advanced techniques such as quantum computing to crack ciphers! I would expect any serious government to have its black chamber studying this topic. Once more, it is known that the NSA may have some advance over the academic/public domain in this field. In 1974, the US banking industry asked IBM to design a commercial cipher to protect electronic banking transactions. With the help of the NBS (the forerunner of NIST), IBM designed the famous DES. At the end of the 80s, the academic world discovered a new, devastating technique: differential cryptanalysis. In 1991, Eli Biham and Adi Shamir demonstrated that, surprisingly, DES was immune to this “unknown” attack (which was not the case for many other ciphers). In 1994, Don Coppersmith, who was part of the DES design team, revealed that DES had been designed to resist differential cryptanalysis. The NSA already knew of differential cryptanalysis in 1974 but kept this knowledge secret, as it gave a competitive edge to US secret agencies.

Secret services do not play fair democratic games! This is why they are called secret services. Hollywood has told us about that often enough, as has John le Carré.

 

So please, let us stop this hypocritical surprise: we knew about it (though not the details).

 

E. Biham and A. Shamir, “Differential cryptanalysis of DES-like cryptosystems,” Journal of Cryptology, vol. 4, Jan. 1991, pp. 3–72. Available at http://link.springer.com/article/10.1007/BF00630563.

D. Coppersmith, “The Data Encryption Standard (DES) and its strength against attacks,” IBM Journal of Research and Development, vol. 38, 1994, pp. 243–250.