
Is Clubhouse safe, and should CISOs stop its use?

With more concerns being raised over the privacy and security of social media app Clubhouse, we consider whether security teams should restrict or stop employees from using it


Security professionals are considering questions over the privacy and wider security of hot new social network Clubhouse following a leak of private audio chats by a user who found a way to beat the system and stream them beyond the confines of the service.

The audio-only app launched last year but has flown under the radar until recently, largely because it remains invite only, which means its user base has been restricted to those “lucky” enough to be beckoned beyond the velvet rope. It is also currently only available for Apple iOS devices, limiting its spread.

Since January 2021, however, it has become more prominent after leading tech influencers, including Elon Musk, began to promote their use of it.

It also gained traction in China, where for a time it was able to operate beyond the reach of the Great Firewall, which meant users based there could freely discuss topics such as China’s abuse of its Uighur minority, the Hong Kong pro-democracy movement, or Taiwanese independence – although Beijing is now wise to this and has banned it.

Besides the – now locked down – flaw that enabled audio data from Clubhouse to be made public, there are a number of other issues with the app that have security observers in a spin.

For example, creating an account on the service cajoles the user to upload their device address books, which means that if someone you know has joined Clubhouse, the service has probably hoovered up your contact details without your consent to make further recommendations and encourage more connections and sign-ups.

Because of this, many users have reported having connections highlighted that they may prefer to have kept secret – in some instances including drug dealers and therapists, or even abusive or harassing exes.
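To make that mechanism concrete, the minimal Python sketch below models how any service that ingests uploaded address books can surface connections between people, including people who never signed up or consented to being linked. The data structures, field names and sample numbers are illustrative assumptions, not Clubhouse’s actual implementation.

```python
# Illustrative sketch only: a simplified model of how a social app that
# ingests address books can infer connections between users. All names,
# fields and numbers are hypothetical, not Clubhouse's real design.

from dataclasses import dataclass, field

@dataclass
class User:
    user_id: str
    phone: str
    uploaded_contacts: set[str] = field(default_factory=set)  # phone numbers

def suggest_connections(users: list[User]) -> dict[str, set[str]]:
    """Map each user to the user_ids of people whose phone numbers appear
    in their uploaded address book, whether or not those people consented."""
    by_phone = {u.phone: u.user_id for u in users}
    suggestions: dict[str, set[str]] = {}
    for u in users:
        matches = {by_phone[p] for p in u.uploaded_contacts if p in by_phone}
        matches.discard(u.user_id)
        suggestions[u.user_id] = matches
    return suggestions

if __name__ == "__main__":
    alice = User("alice", "+441111111111", {"+442222222222", "+443333333333"})
    bob = User("bob", "+442222222222", {"+441111111111"})
    print(suggest_connections([alice, bob]))
    # {'alice': {'bob'}, 'bob': {'alice'}}
```

The same join, run across millions of uploaded address books, is what turns one user’s sign-up into a recommendation about someone else’s relationships.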

A further concern for many has been the location of elements of the service’s back-end infrastructure, which is owned by a company called Agora.

Why is this an issue? An investigation earlier in February by Stanford University’s Internet Observatory (SIO) holds the answer. Agora, despite the prominent Silicon Valley address displayed on its website, is a Shanghai-founded startup, and in the course of its probe, Stanford’s team found that it is likely to have access to Clubhouse users’ raw audio.

Chinese connection

On at least one occasion, the SIO said it saw room metadata being relayed to servers it believed were hosted in China, and audio data to servers managed by Chinese entities. Also, because user Clubhouse ID numbers and chatroom IDs are transmitted in plaintext, it could be possible to connect Clubhouse IDs with user profiles.
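The risk from plaintext identifiers is easy to illustrate. In the hedged Python sketch below, an on-path observer who can read captured traffic extracts a user ID and joins it against publicly visible profile data; the payload format, field names and regular expression are hypothetical stand-ins, not Agora’s or Clubhouse’s real wire format.

```python
# Illustrative sketch only: why IDs sent in plaintext matter. The payload
# and profile data below are invented for the example.

import re

# Pretend this is a payload lifted from unencrypted network traffic.
captured_payload = "channel=room-20481;uid=7731;codec=opus"

# Pretend this was scraped from public profile pages.
public_profiles = {"7731": {"name": "Jane Doe", "handle": "@janedoe"}}

match = re.search(r"uid=(\d+)", captured_payload)
if match:
    uid = match.group(1)
    profile = public_profiles.get(uid)
    if profile:
        print(f"Plaintext uid {uid} links this room's traffic to {profile['handle']}")
```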

In the absence of end-to-end encryption – which the SIO does not believe Clubhouse has implemented – audio data could be intercepted, transcribed and stored by Agora, which, because of its location, is subject to Chinese surveillance laws giving Beijing access to that data if it wants it in the course of a national security or criminal investigation. It is the same legal regime that counted against Huawei in the UK.

The SIO is at pains to point out that Agora has said it does not store user audio or metadata except for network monitoring and billing purposes, and it is important to note that if this is true, Beijing cannot legally request Clubhouse user data from Agora because it doesn’t have that data.

However, this does not exclude the possibility of misrepresentation by Agora, or that Chinese intelligence agencies could access the network of their own accord. If metadata is indeed being transmitted into China, the Chinese government can probably collect it without accessing Agora’s network, said the researchers.

Also, said the team, China almost certainly cannot access Clubhouse data direct from Clubhouse – save by hacking into its systems – so the risk to the average Clubhouse user is likely to be no higher than it would be on Twitter, for example.

In a statement following the SIO’s disclosure, Clubhouse said it was rolling out changes to add further encryption and blocks to stop its clients from ever transmitting pings to servers in China, and would engage external cyber security experts to review and validate these changes.

Shortcomings exposed

With Clubhouse having found a ready home among the kind of businesspeople who enjoy posting motivational screeds to LinkedIn about their early morning jogs and skincare regimes, it is highly likely that the app is already present on the devices, and therefore the networks, of many organisations.

And with multiple major cyber security and privacy issues being highlighted within the space of a month, it seems clear to many observers that, as of the time of writing, Clubhouse is in a similar position to where Zoom was at the beginning of the Covid-19 pandemic – that is to say, experiencing explosive growth that has laid bare some security shortcomings in its design and development.

Jeremy Turner, head of threat intelligence at Coalition, a US-based cyber insurance and risk specialist, was among those who took this view. He said: “The Clubhouse breach puts a spotlight on a common problem for technology startups – the benefits of technology are often the prime focus or motivating factor for both developers and users, which can be short-sighted.

“When a technology’s value is so significant and adoption so swift, the risks come as an afterthought. Startups should be cautious of moving faster than they can keep up with security and privacy considerations.

“When developers push new technology into the hands of early adopters, the risks are easy to ignore or think of as a problem for tomorrow, when in reality they should develop data security measures as thoroughly as they develop new user experiences. Early-stage development risks always seem to be over the horizon – until they’re not.”

Ray Walsh, digital privacy expert at ProPrivacy, added: “It seems that Clubhouse has ongoing issues that seriously bring into question the platform’s ability to provide privacy and data security for its users. It is vital that anybody who uses Clubhouse is aware of the potential that their conversations are being recorded.

“The onus is now on Clubhouse to provide security for its app in such a way that this threat is completely put to bed. Until security can be proven by the app’s developer, it is best to assume that this is a completely open and public forum for interaction, where what is said could potentially be accessed and heard by anybody.”

GDPR not considered

Martin Jartelius, chief security officer at Outpost24, took a similar view, pointing out that Clubhouse has already fallen foul of the German data protection authorities for possible breaches of the General Data Protection Regulation (GDPR). Indeed, he said, for a time, the app’s terms and conditions made no mention of the GDPR.

“Personally, I have not spent an ounce of effort looking into the platform and how it works,” he said, “but I think we can all agree that there is room for doubt that for a company that forgot to address GDPR in its terms and conditions, technical security would have been first in mind when building it.

“Use the platform for what it is – a space to meet and chat in a likely insecure and far from privacy-focused environment – and if you are comfortable with that, you won’t be disappointed. If privacy is your thing, this for sure is not the platform to choose for your interactions.”

Acceptable use

OneLogin global data protection officer Niamh Muldoon warned that even without the recent spate of incidents, using an app like Clubhouse on a corporate device could violate workplace policy.

“The framework that most organisations have put in place around privacy is governed by the employment contract and company policies such as acceptable use,” she said. “This is a dual contractual commitment where employees give permission to the organisation to have their data, and employers commit to honouring privacy obligations for use of this data, as well as protecting it as it is processed and stored in their systems.

“The privacy exposure concern is if the recordings are shared without permission. This could result in a privacy breach, as in accidental and/or deliberate disclosure, along with subject access requests [SARs] being raised for investigation. Remember, privacy is about using data solely for the purpose for which it was collected, so organisations cannot monitor Clubhouse chatrooms to protect their company and associated information assets.”

Muldoon added: “Therefore, I do believe this is a tool that organisations should give careful consideration to, before allowing it onto their company endpoints. This is particularly true following the recent breach of this platform. I would urge a decision to restrict the use of this application until the investigation report has been shared and remediation actions implemented.”

Keith Geraghty, solutions architect at Edgescan, said it is better to be cautious when it comes to all messaging apps on corporate devices – not just Clubhouse.

He urged CISOs to either restrict or call a halt to the – likely unofficial – use of Clubhouse, unless of course they are happy to take the risk of their conversations being made public.

“The pandemic incentivised the adoption of new ways to communicate virtually, which meant an increase in new platforms and an effort to innovate existing ones,” said Geraghty. “I would encourage people to do the research and adopt a different service for which their company can show a track record of due diligence and a security-first approach.

“This is not just in response to the Clubhouse recording issue, but a general security best practice – personal messaging platforms, but also other applications and websites, are a security concern when accessed from work devices.”

Lookout security engineer Burak Agca said the Clubhouse story serves as a timely reminder of social media best practice for any individual, and a warning to be careful about what data is shared on any channel.

“We now seamlessly transition between our work and personal lives on a mobile device, increasing the risk of users inadvertently sharing corporate information on social media, even if it is with another co-worker,” he said.

Education a better defence?

However, while in some cases it clearly makes sense for certain platforms, apps or websites to be blocked at the endpoint, Javvad Malik, KnowBe4 security awareness advocate, said this risked creating an extended game of whack-a-mole, in which security teams are forever creating new rules and exceptions.

“A good security strategy blocks high-risk sites, but also educates users in the dangers and privacy implications of using others,” he said. “In doing so, people will make better risk decisions and either stay away from a risky site, or use it in a way that doesn’t compromise them personally or the organisation.”
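For teams that do decide to block, the hedged Python sketch below shows the kind of coarse hostname allow/deny check a web proxy or DNS filter typically applies at the network edge; the domain suffixes listed are assumptions for illustration only and would need to track the service’s real, changing endpoints.

```python
# Illustrative sketch only: a simple hostname blocklist check of the sort a
# proxy or DNS filter might apply. The suffixes below are assumed examples,
# not a verified list of the platform's endpoints.

BLOCKED_SUFFIXES = ("joinclubhouse.com", "agora.io")  # assumed, for illustration

def is_blocked(hostname: str) -> bool:
    """Return True if the requested hostname falls under a blocked domain."""
    hostname = hostname.lower().rstrip(".")
    return any(hostname == s or hostname.endswith("." + s) for s in BLOCKED_SUFFIXES)

for host in ("www.joinclubhouse.com", "example.org"):
    print(host, "-> blocked" if is_blocked(host) else "-> allowed")
```

As Malik’s comment suggests, rules like this only cover the sites a team has already thought of, which is why awareness training is usually paired with them.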
