Security Stop Press : Beware Fake, AI-Generated Investment Scams On Facebook

A recent BBC investigation has highlighted how fraudsters are using fake, AI-generated scam stories, often with bogus celebrity endorsements, as paid-for Facebook adverts that link through to fake investment scheme pages (cloaking scams).

It’s been reported that the scammers beat Facebook’s automated detection systems by first creating an ad that links through to a harmless page; once the ad has been approved, they then introduce a redirect to a malicious page.

Under the Online Safety Act, online services will be required to assess the risk of their users being harmed by illegal content on their platforms. The advice is to always research, check, and verify celebrity endorsements and investment legitimacy, consult professionals, and report suspicious ads to protect yourself from fraudulent schemes.

An Apple Byte : Instagram and Facebook Ads ‘Apple Tax’

Meta has announced that it will be passing on Apple’s 30 per cent service charge (often referred to as the “Apple tax”) to advertisers who pay to boost posts on Facebook and Instagram through the iOS app.

This move is a response to Apple’s in-app purchase fees, which apply to digital transactions within apps on the iOS platform (announced in the updated App Store guidelines back in 2022). Advertisers wanting to avoid the additional 30 per cent fee can do so by boosting their posts from the web instead, via desktop or mobile browsers.

Meta says it is “required to either comply with Apple’s guidelines, or remove boosted posts from our apps” and that, “we do not want to remove the ability to boost posts, as this would hurt small businesses by making the feature less discoverable and potentially deprive them of a valuable way to promote their business.” 

Apple has reportedly responded (a statement in MacRumors), saying that it has “always required that purchases of digital goods and services within apps must use In-App Purchase,” and that because boosting a post “is a digital service — so of course In-App Purchase is required”.

Meta’s introduction of the Apple tax for advertisers on iOS apps highlights its conflict with Apple over the control and monetisation of digital ad space. The move, aimed at challenging Apple’s App Store policies, could also make advertising more costly and complicated for small businesses.

Tech Insight : New Privacy Features For Facebook and Instagram

Meta has announced the start of a roll-out of default end-to-end encryption for all personal chats and calls via Messenger and Facebook, with a view to making them more private and secure.

Extra Layer Of Security and Privacy 

Meta says that although end-to-end encryption has been an optional feature since 2016, making it the default has “taken years to deliver” but will provide an extra layer of security. Meta highlights the benefits of default end-to-end encryption, saying that “messages and calls with friends and family are protected from the moment they leave your device to the moment they reach the receiver’s device” and that “nobody, including Meta, can see what’s sent or said, unless you choose to report a message to us.”
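
The end-to-end principle can be illustrated with a toy sketch: the relay server only ever handles ciphertext, and only the two endpoints hold the key. (This is a deliberately simplified XOR one-time pad for illustration; Meta’s actual implementation uses the Signal protocol and its own Labyrinth protocol.)

```python
import os

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy one-time-pad XOR cipher -- illustration only, NOT the Signal protocol.
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

# Alice and Bob hold the key; the relay server never sees it.
message = b"See you at 7"
key = os.urandom(len(message))

ciphertext = encrypt(key, message)          # all the server ever relays
assert decrypt(key, ciphertext) == message  # Bob recovers it with the key
```

The point the sketch makes is structural: whatever sits between the two devices (Meta’s servers included) only sees the ciphertext, so it cannot read the content without a key it never possesses.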

Default end-to-end encryption will roll out to Facebook Messenger first and then to Instagram later, once the Messenger upgrade is complete.

Not Just Security and Privacy 

Meta is also keen to highlight the other benefits of its new default version of end-to-end encryption for users which include additional functionality, such as the ability to edit messages, higher media quality, and disappearing messages. For example:

– Users can edit messages that may have been sent too soon, or that they’d simply like to change, for up to 15 minutes after the messages have been sent.

– Disappearing messages on Messenger will now last for 24 hours after being sent, and Meta says it’s improving the interface to make it easier to tell when ‘disappearing messages’ is turned on.

– To protect privacy and reduce the pressure users may feel to respond to messages immediately, Meta’s new read receipt control allows users to decide whether they want others to see when they’ve read their messages.


Considering that Facebook Messenger has approximately 1 billion users worldwide, the roll-out could take months.

Why Has It Taken So Long To Introduce? 

Meta says it’s taken so long (7 years) to introduce because its engineers, cryptographers, designers, policy experts and product managers have had to rebuild Messenger features from the ground up using the Signal protocol and Meta’s own Labyrinth protocol.

Also, Meta had intended to introduce default end-to-end encryption back in 2022 but had to delay its launch over concerns that it could prevent Meta detecting child abuse on its platform.

Other Messaging Apps Already Have It 

Other messaging apps that have already introduced default end-to-end encryption include Meta-owned WhatsApp (in 2016) and Signal Foundation’s Signal messaging service, which has also been upgraded to guard against future encryption-breaking attacks (as far as is realistically possible), e.g. encryption-cracking by quantum computers.


There are several issues involved with the introduction of end-to-end encryption in messaging apps. For example:

– Governments have long wanted to force tech companies to introduce ‘back doors’ to their apps using the argument that they need to monitor content for criminal activity and dangerous behaviour, including terrorism, child sexual abuse and grooming, hate speech, criminal gang communications, and more. Unfortunately, creating a ‘back door’ destroys privacy, leaves users open to other risks (e.g. hackers) and reduces trust between users and the app owners.

– Legal pressure has also been attempted via legislation such as the UK’s Online Safety Act. The UK government wanted the ability to securely scan encrypted messages sent on Signal and WhatsApp as part of the law, but has admitted that this can’t happen because the technology to do so doesn’t exist (yet).

There are many compelling arguments for having (default) end-to-end encryption in messaging apps, such as:

– Consumer protection, i.e. it safeguards financial information during online banking and shopping, preventing unauthorised access and misuse.

– Business security, e.g. when used in WhatsApp and VPNs, encryption protects sensitive corporate data, ensuring data privacy and reducing cybercrime risks.

– Safe Communication in conflict zones (as highlighted by Ukraine). For example, encryption can facilitate secure, reliable communication in war-torn areas, aiding in broadcasting appeals, organising relief, combating disinformation, and protecting individuals from surveillance and tracking by hostile forces.

– Ensuring the safety of journalists and activists, particularly in environments with censorship or oppressive regimes, by keeping information channels secure and private.

However, for most people using Facebook’s Messenger app, encryption is simply more of a general reassurance.

What Does This Mean For Your Business?

For Meta, the roll-out of default end-to-end encryption for Facebook and Instagram has been a bit of a slog and a long time coming. However, its introduction to bring FB Messenger in line with Meta’s popular WhatsApp essentially enhances user privacy and security and helps Facebook to claw its way back a little towards positioning itself as a company that’s a strong(er) advocate for digital safety.

For UK businesses, this move offers enhanced protection for sensitive data and communication, aligning with growing demands for cyber security and providing some peace of mind. However, the move presents further challenges and frustration for law enforcement and the UK government, potentially complicating efforts to monitor criminal activities and enforce regulations like the Online Safety Act. Overall, the initiative could be said to underscore a broader trend towards prioritising user privacy and security in the digital landscape, as well as being another way for tech giants like Meta to compete with other apps like Signal. It’s also a way for Meta to demonstrate that it won’t be forced into bowing to government pressure that could destroy the integrity and competitiveness of its products and negatively affect user trust in its brand (which has taken a battering in recent years).

Tech News : 20 NHS Trusts Shared Personal Data With Facebook

An Observer investigation has reported uncovering evidence that 20 NHS Trusts have been collecting data about patients’ medical conditions and sharing it with Facebook.

Using A Covert Tracking Tool 

The newspaper’s investigation found that, over several years, the trusts have been using the Meta Pixel analytics tool to collect patient browsing data on their websites. The kind of data collected includes page views, buttons clicked, and keywords searched. This data can be matched with IP addresses and Facebook accounts to identify individuals and reveal their personal medical details.

Sharing this collected personal data with Facebook’s parent company Meta, albeit unknowingly, without the consent of NHS Trust website users is a breach of privacy rights and, therefore, potentially illegal under data protection law (GDPR).

Meta Pixel 

The Meta Pixel analytics tool is a piece of code that enables website owners to track visitor activity on their website, identify Facebook and Instagram users, and see how those users interacted with the site’s content. This information can then be used for targeted advertising.
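
To see why page-level tracking on a health website is so sensitive, consider the kind of event such a tracker can report. The sketch below is purely illustrative (the field names and URL are hypothetical, not Meta’s real schema), but it shows how innocuous-looking fields combine to reveal medical detail:

```python
def build_tracking_event(page_url: str, element_clicked: str,
                         search_terms: str, client_ip: str) -> dict:
    # Hypothetical payload -- field names are illustrative, not Meta's schema.
    return {
        "event": "PageView",
        "url": page_url,              # e.g. a condition-specific NHS page
        "clicked": element_clicked,   # buttons such as "Book an appointment"
        "search": search_terms,       # keywords typed into the site search
        "ip": client_ip,              # can be matched to a logged-in account
    }

event = build_tracking_event(
    "https://example-trust.nhs.uk/services/diabetes-care",
    "book-appointment",
    "insulin clinic",
    "203.0.113.7",
)
# The URL path alone reveals the medical context of the visit.
assert "diabetes" in event["url"]
```

No single field here looks like a medical record, but the URL, the clicked button, and the search keywords together describe a patient’s condition, and the IP address lets it be tied to a person.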

17 Have Now Removed It 

It’s been reported that since the details of the newspaper’s investigation were made public, 17 of the 20 NHS trusts identified as using the Meta Pixel tool have now removed it from their websites, with 8 of those trusts issuing an apology.

The UK’s Information Commissioner’s Office (ICO) is now reported to have begun an investigation into the trusts’ activities.


Under the UK Data Protection Act 2018 and the EU General Data Protection Regulation (GDPR), organisations processing personal data must obtain lawful grounds for processing, which typically includes obtaining user consent. Personal data is any information that can directly or indirectly identify an individual.

An NHS trust using an analytics tool like Meta Pixel on its website to collect and share personal data without obtaining user consent could therefore be acting illegally, and both the NHS trust and the analytics tool provider (Meta) have responsibilities under data protection laws.

The GDPR and the UK Data Protection Act require organisations to provide transparent information to individuals about the collection and use of their personal data, including the purposes of processing and any third parties with whom the data is shared. Individuals must be given the opportunity to provide informed consent before their personal data is collected, unless another lawful basis for processing applies.
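
In practice, a consent-first approach means tracking code fires only after the user has explicitly opted in, with the absence of a recorded choice treated as refusal. A minimal sketch (the function and consent-purpose names are hypothetical):

```python
def should_fire_analytics(consent: dict) -> bool:
    # Fire third-party tracking only if the user explicitly opted in to the
    # "marketing" purpose; no recorded choice is treated as a refusal.
    return consent.get("marketing") is True

assert should_fire_analytics({"marketing": True}) is True
assert should_fire_analytics({"marketing": False}) is False
assert should_fire_analytics({}) is False  # no consent recorded -> no tracking
```

The design choice worth noting is the default: consent must be an affirmative `True`, never inferred from silence, pre-ticked boxes, or a missing record.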

What Does This Mean For Your Business? 

The recent revelation that 20 NHS Trusts have been collecting and sharing personal data with Facebook through the use of the Meta Pixel analytics tool raises important lessons for businesses regarding their data protection practices. The Trusts’ actions, conducted without user consent, appear to represent a breach of privacy rights and potentially violate data protection laws, including the UK Data Protection Act 2018 and GDPR.

The Meta Pixel analytics tool, although widely used as an advertising effectiveness measurement tool, can have unintended consequences when it comes to personal data, such as medical data, and data privacy. The amount of information shared through this tool is often underestimated, and the implications for the NHS trusts could be severe. As several online commentators have pointed out, the trusts may have known little about how the Meta Pixel tool works and, therefore, collected and shared user data unwittingly; however, ignorance is unlikely to stand up as an excuse.

It is, of course, encouraging that in response to the investigation, 17 out of the 20 identified NHS Trusts have at least removed the Meta Pixel tool from their websites, with some going on to issue apologies. To avoid similar privacy breaches and maintain the trust of customers, businesses should take immediate action.

Examples of how businesses could ensure data protection compliance for their websites and any tools used include establishing a cross-functional data protection team with members from legal, technology, and marketing, backed by senior management. They could also conduct a thorough analysis of all data collected and transferred by their websites and apps, identify the data necessary for their operations, and ensure that legal grounds (such as consent) are in place for collecting and processing that data. For most smaller businesses, it’s a case of staying on top of data protection matters, checking what any tools are collecting, and keeping the importance of consent top-of-mind.
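
As a starting point for that analysis, even a simple scan of a site’s page source for known third-party tracker hosts can surface tools a business didn’t know were embedded. A minimal sketch (the host list is illustrative, not exhaustive; `connect.facebook.net` is the host the Meta Pixel loads its script from):

```python
# Known third-party tracker script hosts to audit for (illustrative list).
TRACKER_HOSTS = [
    "connect.facebook.net",      # Meta Pixel loader
    "www.googletagmanager.com",  # Google Tag Manager
]

def find_trackers(page_html: str) -> list[str]:
    """Return the known tracker hosts referenced anywhere in the page source."""
    return [host for host in TRACKER_HOSTS if host in page_html]

html = '<script async src="https://connect.facebook.net/en_GB/fbevents.js"></script>'
assert find_trackers(html) == ["connect.facebook.net"]
```

A scan like this is only a first pass: it catches script tags in the served HTML, but a full audit should also inspect scripts injected at runtime (e.g. via a tag manager) using the browser’s network tools.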

The implications for Meta of the newspaper’s report and the impending ICO investigation are significant as well. The incident highlights the need for greater transparency and understanding of the tools and services offered by companies like Meta, especially when it comes to sensitive topics and personal data. Privacy concerns arise when information from browsing habits is shared with social media platforms. Meta must address these concerns and ensure that the data collected through its tools is handled in accordance with data protection laws and user consent.

Overall, this case emphasises the importance of data protection compliance, informed consent, and transparency in the handling of personal data. Businesses must prioritise privacy and data security to maintain customer trust and avoid legal consequences.